
This post is part two of a three-part series on using Heroku Managed Data products from within a Salesforce Function. In part one, we focused on Salesforce Functions with Heroku Postgres. In part two, we'll explore Salesforce Functions with Heroku Data for Redis. Finally, in part three, we'll cover Salesforce Functions and Apache Kafka on Heroku.
Introduction to Core Concepts
What Is a Salesforce Function?
A Salesforce Function is a custom piece of code used to extend your Salesforce apps or processes. The custom code can leverage the language and libraries you choose while being run in the secure environment of your Salesforce instance.
For example, you could leverage a JavaScript library to calculate and cache a value based on a triggered process within Salesforce. If you are new to Functions in general, check out “Get to Know Salesforce Functions” to learn what they are and how they work.
What Is Heroku Data for Redis?
Heroku Data for Redis is a Redis key-value datastore that is fully managed for you by Heroku. That means Heroku takes care of concerns like security, backups, and maintenance; all you have to do is use it. Because Heroku is part of Salesforce, access and security are much simpler. The Heroku Dev Center documentation is a great place to find more details on Heroku Data for Redis.
Examples of Salesforce Functions + Heroku Data for Redis
Redis is commonly used for ephemeral data that you want quick access to. Examples include cached values, a queue of tasks to be performed by workers, session or state data for a process, or users visiting a website. While Redis can persist data to disk, it is primarily used as an “in-memory” datastore. Let’s review several use cases to give you a better idea of how Salesforce Functions and Redis can fit together.
Use Case #1: Store State Between Function Runs
There may be times when a process has multiple stages, with each stage requiring a function run. When one function finishes, you want to capture its state so that it can be used by the next function that runs.
An example of this might be a price quoting process that requires some backend calculations at each stage. Different people or teams might perform the steps in the process. It’s possible they don’t even all belong within a single Salesforce Org. However, the function that runs at each stage needs to know about the previous outcome.
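To make the pattern concrete, here is a minimal sketch. The helper names, the key scheme, and the in-memory stand-in for the node-redis client are all illustrative (the stand-in lets the snippet run without a Redis server); against a real connection you would pass in the client returned by `createClient`.

```javascript
// Sketch: persisting stage state between function runs.
// The key scheme and helper names are illustrative, not from the official docs.
const keyFor = (quoteId, stage) => `quote:${quoteId}:stage:${stage}`;

async function saveStageState(redis, quoteId, stage, state) {
  // Serialize the stage's outcome; the TTL lets abandoned quotes expire.
  await redis.set(keyFor(quoteId, stage), JSON.stringify(state), { EX: 3600 });
}

async function loadStageState(redis, quoteId, stage) {
  const raw = await redis.get(keyFor(quoteId, stage));
  return raw ? JSON.parse(raw) : null;
}

// In-memory stand-in for the node-redis client, so the sketch runs without a server.
const fakeRedis = {
  store: new Map(),
  async set(k, v, _opts) { this.store.set(k, v); },
  async get(k) { return this.store.has(k) ? this.store.get(k) : null; },
};

(async () => {
  await saveStageState(fakeRedis, "Q-001", 1, { subtotal: 4200 });
  const state = await loadStageState(fakeRedis, "Q-001", 1);
  console.log(state.subtotal); // 4200
})();
```

Because the state is stored under a shared key rather than inside any one Org, the next function in the process can pick it up regardless of where it runs.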
Use Case #2: Managing a Queue for Worker Processes
This use case is concerned with flexibility around background jobs. Because applications built on Salesforce run on a multitenant architecture, Salesforce places restrictions on CPU and memory usage for applications. Long-running processes are often out of bounds and restricted.
How, then, can you run a long or heavy job for your Salesforce Org? The answer is Salesforce Functions. You can wire up your function to gather the data needed and insert it into Redis. Then, your Heroku worker processes can retrieve the data and perform the tasks.
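The queue itself can be a plain Redis list: the function pushes jobs on one end, and workers pop them off the other. The sketch below assumes node-redis v4 method names (`lPush`, `rPop`); the queue name, job shape, and the in-memory stand-in client are illustrative so the snippet runs without a server.

```javascript
// Sketch: a Salesforce Function enqueues a job; a Heroku worker drains the queue.
// Queue name and job shape are illustrative.
const QUEUE = "jobs:exports";

async function enqueueJob(redis, job) {
  // Push onto the head of the list; workers pop from the tail (FIFO).
  await redis.lPush(QUEUE, JSON.stringify(job));
}

async function takeJob(redis) {
  // A real worker would use a blocking pop (brPop) to wait for new work.
  const raw = await redis.rPop(QUEUE);
  return raw ? JSON.parse(raw) : null;
}

// In-memory stand-in for the node-redis client, so the sketch runs without a server.
const fakeRedis = {
  lists: new Map(),
  async lPush(k, v) {
    if (!this.lists.has(k)) this.lists.set(k, []);
    this.lists.get(k).unshift(v);
  },
  async rPop(k) {
    const l = this.lists.get(k);
    return l && l.length ? l.pop() : null;
  },
};

(async () => {
  await enqueueJob(fakeRedis, { type: "export", recordId: "rec-123" });
  const job = await takeJob(fakeRedis);
  console.log(job.type); // export
})();
```

The heavy lifting then happens in a Heroku worker dyno, outside the limits that apply inside your Org.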
Use Case #3: Cache the Results of Expensive Operations
In this final use case, let's assume that you have an expensive query or calculation. The result doesn't change often, but the report that needs the result runs frequently. For example, maybe we want to match some criteria across numerous records that seldom change. We can use a Salesforce Function to do the work and Redis to store the result. Subsequent executions of the function can simply grab the cached result.
How Do I Get Started?
To get started, you'll need to have a few pieces in place, on both the Salesforce Functions side and the Heroku side.
- Prerequisites
- Getting started with Salesforce Functions
Accessing Heroku Data for Redis From a Salesforce Function
Once you have covered the prerequisites and created your project, you can run the following commands to create a Function with Heroku Data for Redis access.
To create the new JavaScript Function, run the following command:
$ sf generate function -n yourfunction -l javascript
That will give you a /functions folder with a Node.js application template.
Connecting to Your Redis Instance
Your function code can use the dotenv package for specifying the Redis URL as an environment variable and the node-redis package as a Redis client. Connecting to Redis might look something like this:
import "dotenv/config";
import { createClient } from "redis";

async function redisConnect() {
  const redis = createClient({
    url: process.env.REDIS_URL,
    socket: {
      tls: true,
      rejectUnauthorized: false
    }
  });
  await redis.connect();
  return redis;
}
For local execution, using process.env and dotenv assumes that you have a .env file that specifies your REDIS_URL.
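For local runs, a minimal .env file might look like the following. The value shown is a made-up placeholder, not a real connection string; copy the actual value from your Heroku app (for example, with heroku config:get REDIS_URL), and never commit the file to version control.

```
REDIS_URL=rediss://:examplepassword@example-host.compute-1.amazonaws.com:6379
```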
Store Data in Redis
The actual body of your Salesforce Function will involve performing some calculations or data fetches, followed by storing the result in Redis. An example may look like this:
export default async function (event, context) {
  const redis = await redisConnect();
  const CACHE_KEY = `my_cache_key`;
  const CACHE_TTL_SECONDS = 86400;

  // Check Redis for cached value
  let cached_value = await redis.get(CACHE_KEY);
  if (cached_value) {
    return { result: cached_value };
  } else {
    // Perform some calculation
    const calculated_value = await perform_long_running_computation();

    // Store in Redis with a TTL; NX only sets the key if it does not already exist
    await redis.set(CACHE_KEY, calculated_value, {
      EX: CACHE_TTL_SECONDS,
      NX: true
    });

    // Return result
    return { result: calculated_value };
  }
}
Test Your Salesforce Function Locally
To test your Function locally, you first run the following command:
$ sf run function start
Then, you can invoke the Function with a payload from another terminal:
$ sf run function -l http://localhost:8080 -p '{"payloadID": "info"}'
For more information on running Functions locally, see this guide.
Associate Your Salesforce Function and Your Heroku Environment
After verifying locally that our Function runs as expected, we can associate our Salesforce Function with a compute environment. (See this documentation for more details about creating a compute environment and deploying a function.)
Now, associate your functions and Heroku environments by adding your Heroku user as a collaborator to your function's compute environment:
$ sf env compute collaborator add --heroku-user [email protected]
The environments can now share Heroku data. Next, you will need the name of the compute environment in order to attach the data store to it.
Finally, attach the data store.
$ heroku addons:attach <your-heroku-redis> --app <your-compute-environment-name>
Here are some additional resources that may be helpful as you begin implementing your Salesforce Function and accessing Heroku Data for Redis:
Conclusion
And just like that, you're up and running with a Salesforce Function connecting to Heroku Data for Redis!
Salesforce Functions offer the flexibility and freedom to work within your Salesforce application while accessing Heroku data, whether that data is in Postgres, Redis, or even Kafka. In this second part of our series, we've touched on using Salesforce Functions to work with Heroku Data for Redis. While this is a fairly high-level overview, you should be able to see the potential of combining these two solutions. In the final post in this series, we'll integrate with Apache Kafka on Heroku.