Build Custom Functions with the Code Component
The code component allows you to execute product- or industry-specific code within an integration. This page outlines when and how to use the code component.
Why use the code component?
You will likely have integration logic that can't be solved using the standard components Prismatic provides. The portion of your integrations that are specific to your product or industry can be accomplished using code component steps, or custom components.
When to build a custom component instead
Code component steps should be succinct and integration-specific. If the code you write could be reused in other integrations, or if the code is complex enough that it would benefit from unit tests, etc., you should build a custom component instead.
Adding a code step to an integration
Within the integration designer, add a step to your integration. Select the Code component, Code Block action.
You will be presented with a new code step in your integration, and you can click the "Edit" button to open the code editor:
Code structure
The code component provides a stub function by default. Let's examine the structure of the code:
module.exports = async ({ logger, configVars }, stepResults) => {
return { data: null };
};
The code component requires you to export an asynchronous function.
The default code uses arrow function notation to create and export an async function.
Code component parameters
This function is provided a few parameters:
- The first positional parameter is an object comprised of several properties (an example that uses some of them follows this list):
  - logger allows you to write out log lines.
  - configVars lets you access config variables (including connections).
  - instanceState, crossFlowState, integrationState and executionState give you access to persisted state.
  - stepId is the ID of the current step being executed.
  - executionId is the ID of the current execution.
  - webhookUrls contains the URLs of the running instance's sibling flows.
  - webhookApiKeys contains the API keys of the running instance's sibling flows.
  - invokeUrl is the URL that was used to invoke the integration.
  - customer is an object containing the id, name, and externalId of the customer the instance is assigned to.
  - user is an object containing the id, name, email (their ID), and externalId of the customer user whose user-level config was used for this execution. This only applies to instances with User Level Configuration.
  - integration is an object containing the id, name, and versionSequenceId of the integration the instance was created from.
  - instance is an object containing the id and name of the running instance.
  - flow is an object containing the id and name of the running flow.
- The second positional parameter, stepResults, is an object that contains output from previous steps of the integration.
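For example, a code step can read some of this execution metadata and write it to the logs. This is a minimal sketch that uses only properties from the list above; the logged values will depend on the instance that is running:

module.exports = async (
  { logger, customer, instance, flow, executionId },
  stepResults,
) => {
  // Log which customer, instance, and flow this execution belongs to
  logger.info(
    `Running flow '${flow.name}' of instance '${instance.name}' ` +
      `for customer '${customer.name}' (execution ${executionId})`,
  );
  return { data: null };
};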
Logging
context.logger is an object that can be used for logging and debugging. context.logger has four functions: debug, info, warn, and error.
For example:
module.exports = async (context, stepResults) => {
context.logger.info("Things are going great");
context.logger.warn("Now less great...");
};
// or
module.exports = async ({ logger }, stepResults) => {
logger.info("Hello World");
};
Note: Log lines are truncated after 4096 characters. If you need longer log lines, consider streaming logs to an external log service.
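As an example of that, a code step could forward a long payload to an external logging service rather than logging it directly. This is only a sketch: the endpoint and token below are hypothetical placeholders, and the request is sent with the built-in fetch API (covered later on this page):

module.exports = async ({ logger }, stepResults) => {
  const longPayload = JSON.stringify(stepResults);
  // Hypothetical log-ingestion endpoint and token; replace with your own service
  await fetch("https://logs.example.com/ingest", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer my-log-service-token",
    },
    body: JSON.stringify({ message: longPayload }),
  });
  logger.info("Forwarded full payload to an external log service");
  return { data: null };
};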
Config variables
context.configVars provides the code component with access to all config variables, including connections, associated with the integration.
If you have a config variable named Acme ERP Base URL, for example, you could reference that config variable in a code step with context.configVars["Config Variable Name"] syntax:
module.exports = async ({ configVars }, stepResults) => {
const fuelApiUrl = `${configVars["Acme ERP Base URL"]}/fuel`;
// ...
};
Connections
Connections are a special type of config variable.
You can access the contents of a connection the same way that you would any other config variable.
In this example, suppose you have a connection config variable named Acme ERP Connection that contains two fields, tenantId and apiKey:
module.exports = async ({ logger, configVars }, stepResults) => {
const {
fields: { tenantId, apiKey },
} = configVars["Acme ERP Connection"];
const result = await doAThing({ tenantId, apiKey });
return { data: result };
};
Referencing previous step outputs
Most steps of an integration return some sort of value. An HTTP - GET action, for example, might return a JSON payload from a REST API. An Amazon S3 - Get Object action will return a binary file pulled from S3.
The code component can reference those outputs through the stepResults parameter. stepResults is an object that contains results from all previous steps.
For example, if you have an HTTP - GET step named Fetch Users List that pulls down an array of users from https://jsonplaceholder.typicode.com/users, you can generate an array of email addresses with this code:
module.exports = async (context, stepResults) => {
const userArray = stepResults.fetchUsersList.results;
const emailArray = userArray.map((user) => user.email);
return { data: emailArray };
};
Many components return objects that have multiple keys, so you can reference stepResults.myStepName.results.someKey.
It's rare for a component to return serialized JSON, so there's rarely a need to JSON.parse() results from a previous step.
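For example, if a previous step named Fetch Current Weather (a hypothetical step name) returned an object with a temperature key, a code step could read that key like this:

module.exports = async (context, stepResults) => {
  // "Fetch Current Weather" is referenced as fetchCurrentWeather (see below)
  const { temperature } = stepResults.fetchCurrentWeather.results;
  return { data: temperature };
};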
Previous step names as variables
Since names of steps can include spaces and other non-JavaScript-friendly characters, the alphanumeric characters of step names are converted to camelCase. So, a step named Download JSON from API would be referenced as downloadJsonFromApi. A rough sketch of that conversion is shown below.
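As an approximation of the conversion (a sketch, not necessarily the exact implementation Prismatic uses), non-alphanumeric characters are dropped and the remaining words are camelCased:

// Approximate the step name -> referenceable name conversion
const toReferenceableName = (stepName) =>
  stepName
    .split(/[^a-zA-Z0-9]+/) // split on non-alphanumeric characters
    .filter(Boolean)
    .map((word, index) =>
      index === 0
        ? word.toLowerCase()
        : word.charAt(0).toUpperCase() + word.slice(1).toLowerCase(),
    )
    .join("");

console.log(toReferenceableName("Download JSON from API")); // "downloadJsonFromApi"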
Referencing integration trigger payload data
The integration trigger is simply another step that can have a unique name.
Suppose an integration is triggered by a webhook, the trigger is named My Integration Trigger, and the webhook is provided a payload body.data of {"exampleKey": "exampleValue"}.
That exampleKey would be accessible using stepResults.myIntegrationTrigger like so:
module.exports = async ({ logger }, stepResults) => {
const exampleKey =
stepResults.myIntegrationTrigger.results.body.data.exampleKey;
logger.info(`Received '${exampleKey}'`);
};
Using JavaScript destructuring, you can instead write this:
module.exports = async (
{ logger },
{
myIntegrationTrigger: {
results: {
body: {
data: { exampleKey },
},
},
},
},
) => {
logger.info(`Received '${exampleKey}'`);
};
Notice the logged message in the testing drawer
Persisted data in a code step
Like a custom component, a code step can save and reference persisted state at the flow (instanceState), cross-flow (crossFlowState), integration (integrationState), or execution (executionState) level.
To save a value bar as key foo at the executionState level:
module.exports = async ({ logger, configVars }, stepResults) => {
return {
data: null,
executionState: { foo: "bar" },
};
};
To load the value of the key foo at the execution level, you can reference your function's first parameter's executionState property:
module.exports = async ({ executionState }, stepResults) => {
const myvalue = executionState["foo"];
return { data: `My value is ${myvalue}` };
};
Code component return values
The code component can optionally return a value for use by a subsequent step. The return value can be an object, string, integer, etc., and will retain its type as the value is passed to the next step.
The return value is specified using the data key in the return object.
In this example we return a string with value "https://ipinfo.io/ip":
module.exports = async (context, stepResults) => {
return { data: "https://ipinfo.io/ip" };
};
The output can be used as input for the next step by referencing codeComponentStepName.results.
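For instance, if the code step above were named Get IP URL (a hypothetical name), a later code step could read its output like this:

module.exports = async (context, stepResults) => {
  // The earlier code step "Get IP URL" is referenced as getIpUrl
  const url = stepResults.getIpUrl.results;
  return { data: `The URL returned by the previous step was ${url}` };
};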
To see an example of returning more complex data structures, and a good example use case for a code component, see the Using a Code Component to Transform Data quickstart.
Returning binary data from a code step
Sometimes you'll want your code component to return binary data (like a rendered image or PDF).
To do that, return an object with a data property of type Buffer (a file buffer) and a contentType property of type String that contains the file's MIME type:
module.exports = async (context, stepResults) => {
// ...
const fileBuffer = SomePdfLibrary.generatePdf();
return {
data: fileBuffer,
contentType: "application/pdf",
};
};
To see an example use case for returning binary data from a code component, check out our Generate a PDF with a Code Component.
For More Information: Using a Code Component to Transform Data
Making HTTP calls from a code step
The fetch API is baked into the code component.
To make an HTTP call to an API, you can use the fetch function:
module.exports = async (context, stepResults) => {
const options = {
method: "POST",
headers: {
Accept: "application/json",
"Content-Type": "application/json",
Authorization: "Bearer abc-123",
},
body: JSON.stringify({ foo: "bar", baz: 123 }),
};
const response = await fetch("https://postman-echo.com/post", options);
const jsonData = await response.json();
return { data: jsonData };
};
Adding dependencies to a code step
If your code component depends on node modules from npm, dependencies will be dynamically imported from the UNPKG CDN.
For example, if your code component reads:
const lodash = require("lodash@4.17.21/lodash.js");
module.exports = async (context, stepResults) => {
const mergedData = lodash.merge(
{ cpp: "12" },
{ cpp: "23" },
{ java: "23" },
{ python: "35" },
);
return { data: mergedData };
};
Then lodash version 4.17.21 will be imported as a dependency.
You should specify known-working versions of npm packages for your code component:
const lodash = require("lodash@2.4.2");
const { PDFDocument } = require("pdf-lib@1.17.1/dist/pdf-lib.js");
You can require any file from npm using package[@version][/file] syntax.
Note that with the lodash import above, no file was specified. If no file is specified, the main file defined in the npm package's package.json is imported.
An explicit path was called out for the pdf-lib import because the pdf-lib package defaults to importing an index file that itself requires other files, while dist/pdf-lib.js is a completely independent file that can be imported on its own.
Dynamic imports may or may not work, depending on how the dependency maintainer compiled their package.
All JavaScript code must be compiled into a single file to be dynamically imported.
If the package has its own dependencies that are not compiled in, or if the file you reference has its own require() statements, you may see errors in your code.
You might also see errors if the UNPKG CDN is unavailable for any amount of time.
If you need external dependencies, we strongly recommend using a code step for prototyping, but building a custom component for production use. Custom components have their dependencies compiled in, and are not dependent on the uptime of an external CDN.
Requiring built-in NodeJS modules
You can also require built-in NodeJS modules, like crypto or path.
If the library you specify is built in to NodeJS, the client will not reach out to the UNPKG CDN, and will instead use the built-in module.
const crypto = require("crypto");
module.exports = async () => {
const { publicKey, privateKey } = crypto.generateKeyPairSync("rsa", {
modulusLength: 4096,
publicKeyEncoding: {
type: "spki",
format: "pem",
},
privateKeyEncoding: {
type: "pkcs8",
format: "pem",
cipher: "aes-256-cbc",
passphrase: "top secret",
},
});
return {
data: {
publicKey,
privateKey,
},
};
};
Using Spectral utility functions in a code step
Prismatic's SDK, @prismatic-io/spectral, is automatically imported into each code block as spectral.
You can reference any utility functions that Spectral declares.
For example, if you need to cast a string like "NO" to a boolean, you can do this:
module.exports = async ({ configVars }, stepResults) => {
  const doAThing = spectral.util.types.toBool(configVars["Do a Thing?"]);
  if (doAThing) {
    return { data: "Did a thing" };
  } else {
    return { data: "Didn't do a thing" };
  }
};
A list of all util type functions is available in the Spectral SDK.