September 11, 2024

This post is part two of a three-part series on using Heroku Managed Data products from within a Salesforce Function. In part one, we focused on Salesforce Functions with Heroku Postgres. In part two, we'll explore Salesforce Functions with Heroku Data for Redis. Finally, in part three, we'll cover Salesforce Functions and Apache Kafka on Heroku.

Introduction to Core Concepts

What Is a Salesforce Function?

A Salesforce Function is a custom piece of code used to extend your Salesforce apps or processes. The custom code can leverage the language and libraries you choose while being run in the secure environment of your Salesforce instance.

For example, you could leverage a JavaScript library to calculate and cache a value based on a triggered process within Salesforce. If you are new to Functions in general, check out “Get to Know Salesforce Functions” to learn what they are and how they work.

What Is Heroku Data for Redis?

Heroku Data for Redis is a Redis key-value datastore that’s fully managed for you by Heroku. That means Heroku takes care of concerns like security, backups, and maintenance. All you need to do is use it. Because Heroku is part of Salesforce, access and security are much simpler. The Heroku Dev Center documentation is a great place to find more details on Heroku Data for Redis.

Examples of Salesforce Functions + Heroku Data for Redis

Redis is commonly used for ephemeral data that you want quick access to. Examples include cached values, a queue of tasks to be performed by workers, session or state data for a process, or users visiting a website. While Redis can persist data to disk, it is primarily used as an “in-memory” datastore. Let’s review several use cases to give you a better idea of how Salesforce Functions and Redis can fit together.

Use Case #1: Store State Between Function Runs

There may be times when a process has multiple stages, with each stage requiring a function run. When one function runs, you want to capture the state of that run so it can be used by the next function that runs.

An example of this might be a price quoting process that requires some backend calculations at each stage. Different people or teams might perform the steps in the process. It’s possible they don’t even all belong within a single Salesforce Org. However, the function that runs at each stage needs to know about the previous outcome.
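To make this concrete, here is a minimal sketch of saving and loading stage state. The quote ID, stage names, and key format are illustrative, and an in-memory object stands in for the Redis client (it mirrors the async get/set shape of node-redis); in a real Function you would connect with the node-redis client shown later in this post.

```javascript
// Illustrative in-memory stand-in for a Redis client (same async get/set shape as node-redis).
const redis = {
  store: new Map(),
  async set(key, value) { this.store.set(key, value); },
  async get(key) { return this.store.get(key) ?? null; }
};

// Persist the outcome of one stage so a later function run can read it.
async function saveStageState(quoteId, stage, state) {
  await redis.set(`quote:${quoteId}:${stage}`, JSON.stringify(state));
}

// Load a previous stage's outcome at the start of the next run.
async function loadStageState(quoteId, stage) {
  const raw = await redis.get(`quote:${quoteId}:${stage}`);
  return raw ? JSON.parse(raw) : null;
}

// Example: the pricing stage stores a subtotal; the next stage picks it up.
(async () => {
  await saveStageState("Q-1001", "pricing", { subtotal: 4200 });
  const previous = await loadStageState("Q-1001", "pricing");
  console.log(previous.subtotal); // 4200
})();
```

Serializing the state as JSON keeps the stored value a plain string, which is what Redis expects, and lets each function run rehydrate the structure it needs.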

Salesforce Org and Heroku

Use Case #2: Managing a Queue for Worker Processes

This use case is concerned with flexibility around background jobs. Because applications built on Salesforce run on a multitenant architecture, Salesforce places restrictions on CPU and memory usage for applications. Long-running processes are often out of bounds and restricted.

So how might you run a long or heavy process for your Salesforce Org? The answer is Salesforce Functions. You can wire up your function to gather the data needed and insert it into Redis. Then, your Heroku worker processes can retrieve the data and perform the tasks.
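As a sketch of that flow, the Function can push jobs onto a Redis list and a worker can pop them off the other end, giving first-in, first-out processing. The queue name and payload are illustrative, and an in-memory object stands in for the Redis client (its lPush/rPop methods mirror node-redis v4 naming):

```javascript
// Illustrative in-memory stand-in for a Redis list (lPush/rPop, as named in node-redis v4).
const redis = {
  lists: new Map(),
  async lPush(key, value) {
    const list = this.lists.get(key) ?? [];
    list.unshift(value); // newest item goes to the head
    this.lists.set(key, list);
  },
  async rPop(key) {
    const list = this.lists.get(key) ?? [];
    return list.pop() ?? null; // oldest item comes off the tail
  }
};

// In the Salesforce Function: gather the data and enqueue a job.
async function enqueueJob(payload) {
  await redis.lPush("jobs", JSON.stringify(payload));
}

// In the Heroku worker process: pull the oldest job and perform it.
async function workOneJob() {
  const raw = await redis.rPop("jobs");
  return raw ? JSON.parse(raw) : null;
}

(async () => {
  await enqueueJob({ recordId: "a0123", task: "recalculate" });
  const job = await workOneJob();
  console.log(job.task); // "recalculate"
})();
```

Pushing on one end of the list and popping from the other is the classic Redis work-queue pattern; a real worker would typically block on the queue (for example with a blocking pop) instead of polling.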

Heroku worker processes

Use Case #3: Cache the Results of Expensive Operations

In this final use case, let’s assume that you have an expensive query or calculation. The result doesn’t change often, but the report that needs the result runs frequently. For example, perhaps we want to match some criteria across a large number of records that seldom change. We can use a Salesforce Function to do the work and Redis to store the result. Subsequent executions of the function can simply grab the cached result.

Salesforce Function

How Do I Get Started?

To get started, you’ll need to have a few pieces in place, both on the Salesforce Functions side and the Heroku side.

  • Prerequisites
  • Getting started with Salesforce Functions

Accessing Heroku Data for Redis From a Salesforce Function

Once you have covered the prerequisites and created your project, you can run the following commands to create a Function with Heroku Data for Redis access.

To create the new JavaScript Function, run the following command:

$ sf generate function -n yourfunction -l javascript

That will give you a /functions folder with a Node.js application template.

Connecting to Your Redis Instance

Your function code can use the dotenv package to specify the Redis URL as an environment variable and the node-redis package as a Redis client. Connecting to Redis might look something like this:

import "dotenv/config";
import { createClient } from "redis";

async function redisConnect() {
  // Create a client from the REDIS_URL environment variable, with TLS enabled.
  const redis = createClient({
    url: process.env.REDIS_URL,
    socket: {
      tls: true,
      rejectUnauthorized: false
    }
  });
  await redis.connect();
  return redis;
}
For local execution, using process.env and dotenv assumes that you have a .env file that specifies your REDIS_URL.
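For example, a minimal .env file might look like the following. The host, port, and password here are placeholders; the real value comes from the REDIS_URL config var that Heroku sets for your add-on.

REDIS_URL="rediss://:yourpassword@your-redis-host.example.com:6380"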

Store Data in Redis

The actual body of your Salesforce Function will involve performing some calculations or data fetches, followed by storing the result in Redis. An example may look like this:

export default async function (event, context) {
  const redis = await redisConnect();
  const CACHE_KEY = `my_cache_key`;
  const CACHE_TTL_SECONDS = 86400;

  // Check Redis for cached value
  let cached_value = await redis.get(CACHE_KEY);

  if (cached_value) {
    return { result: cached_value };
  } else {
    // Perform some calculation
    const calculated_value = await perform_long_running_computation();

    // Store in Redis with a TTL, only if the key does not already exist
    await redis.set(CACHE_KEY, calculated_value, {
      EX: CACHE_TTL_SECONDS,
      NX: true
    });
    // Return result
    return { result: calculated_value };
  }
}

Test Your Salesforce Function Locally

To test your Function locally, first start it by running the following command:

$ sf run function start

Then, you can invoke the Function with a payload from another terminal:

$ sf run function -l http://localhost:8080 -p '{"payloadID": "info"}'

For more information on running Functions locally, see this guide.

Associate Your Salesforce Function and Your Heroku Environment

After verifying locally that our Function runs as expected, we can associate our Salesforce Function with a compute environment. (See this documentation for more details about creating a compute environment and deploying a function.)

Now, associate your functions and Heroku environments by adding your Heroku user as a collaborator to your function’s compute environment:

$ sf env compute collaborator add --heroku-user [email protected]

The environments can now share Heroku data. Next, you’ll need the name of the compute environment so that you can attach the data store to it.
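If you’re not sure of the name, the Salesforce CLI can list your environments, which include your compute environments (check sf env --help in your CLI version for the exact subcommands):

$ sf env list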

Finally, attach the data store.

$ heroku addons:connect <your-heroku-redis> --app <your-compute-environment-name>

Here are some additional resources that may be helpful as you begin implementing your Salesforce Function and accessing Heroku Data for Redis:

Conclusion

And just like that, you’re up and running with a Salesforce Function connecting to Heroku Data for Redis!

Salesforce Functions offer the flexibility and freedom to work within your Salesforce application to access Heroku data, whether that data is in Postgres, Redis, or even Kafka. In this second part of our series, we’ve touched on using Salesforce Functions to work with Heroku Data for Redis. While this is a fairly high-level overview, you should be able to see the potential of combining these two features. In the final post of this series, we’ll integrate with Apache Kafka on Heroku.