# Prismatic Docs > Prismatic is the integration platform for B2B SaaS companies, enabling them to build and manage integrations with ease. These docs provide comprehensive guides, tutorials, and API references to help developers create low-code and code-native integrations using Prismatic's platform. ## Documentation ### Prismatic Docs ###### Start Building ###### Select a tutorial to get started [What is Prismatic?](https://prismatic.io/docs/intro/what-is-prismatic.md) [Learn about Prismatic, and how an embedded iPaaS can help you solve your integration needs.](https://prismatic.io/docs/intro/what-is-prismatic.md) [Build an Integration](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) [Build a simple integration that fetches data from an API and sends results to Slack using Prismatic's low-code integration builder.](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) [Develop a Connector](https://prismatic.io/docs/custom-connectors/get-started/setup.md) [Build a custom connector that you can use in your integrations using Prismatic's component SDK.](https://prismatic.io/docs/custom-connectors/get-started/setup.md) [Embed Marketplace](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md) [Add an integration marketplace to your app using Prismatic's embedded SDK, so your customers can enable integrations for themselves.](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md) ###### Explore Docs ###### [Build](https://prismatic.io/docs/integrations.md) * [Build with the low-code designer](https://prismatic.io/docs/integrations/low-code-integration-designer.md) * [Build with code-native](https://prismatic.io/docs/integrations/code-native.md) * [Manage how your integrations are triggered](https://prismatic.io/docs/integrations/triggers.md) ###### [Deploy](https://prismatic.io/docs/embed.md) * [Design intuitive config wizards](https://prismatic.io/docs/integrations/config-wizard.md) * [Embed the integration marketplace](https://prismatic.io/docs/embed/marketplace.md) * [Embed the workflow builder](https://prismatic.io/docs/embed/workflow-builder.md) ###### [Manage](https://prismatic.io/docs/configure-prismatic.md) * [Manage customers](https://prismatic.io/docs/customers.md) * [Monitor Integrations](https://prismatic.io/docs/monitor-instances.md) --- ### SDK Upgrade Guides #### [πŸ“„οΈ Spectral 2.x Upgrade Guide](https://prismatic.io/docs/spectral/spectral-2-upgrade-guide.md) [Upgrade your custom component from Spectral 1.x to 2.x](https://prismatic.io/docs/spectral/spectral-2-upgrade-guide.md) --- ### Prismatic Changelog OCTOBER 24, 2025 ##### Embedded Connections Management Screen[​](#embedded-connections-management-screen "Direct link to Embedded Connections Management Screen") You can now embed a dedicated connections management screen in your application using the new `prismatic.showConnections()` function. This screen provides your customers with a centralized location to view and manage all of their reusable [customer-activated connections](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/customer-activated.md) for both marketplace integrations and custom workflows. 
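Below is a minimal sketch of mounting this screen with the embedded SDK. The `selector` option and the JWT handling shown here are assumptions based on how the other embedded screens are mounted; see the linked docs for the exact `showConnections()` signature.

```typescript
import prismatic from "@prismatic-io/embedded";

// Sketch only: render the connections management screen inside your app.
// Assumes showConnections() accepts a CSS selector like the other show* screens.
prismatic.init();

async function renderConnectionsScreen(embeddedUserJwt: string) {
  // Authenticate the embedded user with a JWT signed by your backend.
  await prismatic.authenticate({ token: embeddedUserJwt });

  // Mount the connections management screen into a container element.
  prismatic.showConnections({
    selector: "#prismatic-connections",
  });
}
```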
**What's new:** * **Unified connections view**: Display all reusable connections in one place with the `prismatic.showConnections()` function * **Connection details**: Customers can click on any connection to view which instances and workflows use it * **Full management capabilities**: Edit or delete connections directly from the embedded screen * **Seamless experience**: Consistent connection management whether customers are working with marketplace integrations or custom workflows **Benefits:** * Simplified connection management for customers who use the same credentials across multiple integrations * Better visibility into where connections are being used * Reduced need for custom connection management UI in your application This feature requires `@prismatic-io/embedded` version `4.2.0` or later. See the [embedding additional screens documentation](https://prismatic.io/docs/embed/additional-screens.md#showing-the-connection-screen) for implementation details. OCTOBER 17, 2025 ##### New Component - October 2025[​](#new-component---october-2025 "Direct link to New Component - October 2025") We are pleased to announce the following component has been recently added to the Prismatic library: * [Okta](https://prismatic.io/docs/components/okta-management-api.md) - Okta is an identity and access management platform that provides secure authentication and authorization services. Use the component to manage users, groups, applications, and access policies. OCTOBER 15, 2025 ##### Reusable Customer-Activated Connections[​](#reusable-customer-activated-connections "Direct link to Reusable Customer-Activated Connections") Customer-activated connections just got easier to use. Your customers can now save their credentials once and reuse them across multiple marketplace integrations and custom workflows, eliminating the need to re-enter credentials they've already provided. **What's new:** * **Save and reuse credentials**: When customers configure a marketplace integration or build a workflow, they can save their connection credentials and select from existing saved connections in future configurations * **Cross-environment reusability**: Connections created in the embedded workflow builder can be reused in marketplace integrations, and vice versa * **Unified experience**: Customers see a consistent connection management experience whether they're configuring an integration or building a custom workflow * **Test credential support**: Organizations can configure test credentials to preview the end-customer experience during development **Benefits:** * Faster integration activation for customers who use the same third-party services across multiple integrations * Reduced friction in the configuration experience * Better alignment between marketplace and embedded workflow builder experiences To enable this feature for your integrations, assign a [customer-activated connection](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/customer-activated.md) to your marketplace integration and ensure you're using `@prismatic-io/embedded` version `4.2.0` or later. OCTOBER 14, 2025 ##### New MCP UI in Integration Designer[​](#new-mcp-ui-in-integration-designer "Direct link to New MCP UI in Integration Designer") We've added a new MCP tab to the integration designer that makes it easier to configure and manage Model Context Protocol (MCP) connections for your agent flows. 
**Highlights:** * View and copy the custom MCP endpoint URL for connecting AI agents to a specific integration's agent flows * Enable or disable individual flows as MCP tools with simple toggle controls * See at a glance which flows are available as tools for AI agents like Claude, Cursor, or custom applications * Streamlined workflow for configuring which agent flows should be exposed through the MCP flow server Check out the [MCP flow server documentation](https://prismatic.io/docs/ai/model-context-protocol.md) and [connecting AI agents guide](https://prismatic.io/docs/ai/connect-ai-agent.md) for more information. OCTOBER 06, 2025 ##### Prismatic Event Webhooks[​](#prismatic-event-webhooks "Direct link to Prismatic Event Webhooks") Prismatic event webhooks provide a powerful way to receive real-time notifications about events happening in your Prismatic account. With webhooks, you can be informed when customers create workflows using the embedded workflow builder, simplify your integration billing, and stay informed about important changes to customers, integrations, instances, and more. For example, you can say "When an `instance.deployed` or `instance.deleted` event occurs, let me know at `https://`." Key features of Prismatic event webhooks include: * **Real-time Notifications**: Receive immediate updates when events occur, without polling the Prismatic API. * **Multiple Event Types**: You can create granular webhooks that subscribe to over 30 unique event types. * **Security**: Use HMAC signatures to verify the authenticity of incoming webhook requests. To get started with Prismatic event webhooks, check out the [webhooks documentation](https://prismatic.io/docs/webhooks.md). SEPTEMBER 30, 2025 ##### Code-Native Development Updates[​](#code-native-development-updates "Direct link to Code-Native Development Updates") Today we're releasing updates that significantly improve the code-native development experience on Prismatic. Build, test, and deploy integrations entirely in code with AI assistance, streamlined component management, and enhanced TypeScript support. **Prism MCP Server** * Enable AI coding assistants (Claude, Cursor) to understand Prismatic's SDK and generate integration flows, components, and test configurations using natural language **VS Code Extension** * Run test executions with detailed logs, pull existing integrations from your Prismatic instance, configure instances through the embedded config wizard, and manage authentication, all without leaving your IDE **Improved Component Manifest Generation** * Generate component manifests with a single command `npx cni-component-manifest ` and use components without installing npm packages **Enhanced Component Referencing** * Full TypeScript support with real-time validation for defining connections, data sources, and triggers in configuration wizards Check out the [blog post](http://prismatic.io/blog/announcing-code-native-development-updates) for additional info and quick-start demo videos. SEPTEMBER 30, 2025 ##### Singleton Executions for Scheduled Flows[​](#singleton-executions-for-scheduled-flows "Direct link to Singleton Executions for Scheduled Flows") Prismatic now supports singleton executions for flows that run on a schedule. This means that if a scheduled flow is still running when the next execution is triggered, the new execution will be skipped.
**Highlights:** * Enable singleton executions in the trigger's configuration * Prevent overlapping executions for [scheduled flows](https://prismatic.io/docs/integrations/triggers/schedule.md) and flows that use a [polling trigger](https://prismatic.io/docs/integrations/triggers/app-events.md#app-event-triggers-with-polling) * Avoid duplicate processing, race conditions, and troubleshooting headaches Check out the documentation [here](https://prismatic.io/docs/integrations/triggers/schedule.md#ensuring-singleton-executions-for-scheduled-flows). SEPTEMBER 09, 2025 ##### New Feature - Support visibility for Workflows[​](#new-feature---support-visibility-for-workflows "Direct link to New Feature - Support visibility for Workflows") Organizations can now gain valuable insights into customer-built workflows through the Prismatic admin UI. **Highlights:** * See the number of customer-built workflows separately from organization-built integrations * View and filter Instances pages, logs, and execution history by their instance type (integration or workflow) * Gain insights on workflow creation and enablement separately from integration instances on customer utilization pages SEPTEMBER 09, 2025 ##### New Feature - Workflow Templates for Embedded Workflow Builder[​](#new-feature---workflow-templates-for-embedded-workflow-builder "Direct link to New Feature - Workflow Templates for Embedded Workflow Builder") You can now create Workflow Templates for the Embedded Workflow Builder and publish them to your organization's workflow template library. **Highlights:** * New Workflow Templates option available under the Build section of the Prismatic sidebar * Create and edit templates directly in the Workflow Builder low-code canvas * Save drafts separately from published templates SEPTEMBER 08, 2025 ##### New Feature - FIFO Queues[​](#new-feature---fifo-queues "Direct link to New Feature - FIFO Queues") Prismatic now supports first-in, first-out (FIFO) execution for webhook-triggered flows. Events are processed strictly in order, without needing any external queues or workarounds. **Highlights:** * Flow-level toggle to enable FIFO * One execution at a time while subsequent events queue in order * Built-in event deduplication using an optional ID field * Error handling ensures failed executions don't block the queue Check out the documentation [here](https://prismatic.io/docs/integrations/triggers/fifo-queue.md). SEPTEMBER 05, 2025 ##### New Component - September 2025[​](#new-component---september-2025 "Direct link to New Component - September 2025") We are pleased to announce the following component has been recently added to the Prismatic library: * [Guru](https://prismatic.io/docs/components/guru.md) - Guru is a knowledge management platform that brings company information to your team where they work. Use the component to manage cards, collections, boards, and knowledge sharing workflows. AUGUST 15, 2025 ##### New Component - August 2025[​](#new-component---august-2025 "Direct link to New Component - August 2025") We are pleased to announce the following component has been recently added to the Prismatic library: * [Azure Cosmos DB](https://prismatic.io/docs/components/azure-cosmos-db.md) - Azure Cosmos DB is a Microsoft database service designed for handling various applications. Use the component to manage databases, collections, and documents.
AUGUST 01, 2025 ##### Theming Update: Improved Coverage and Customization[​](#theming-update-improved-coverage-and-customization "Direct link to Theming Update: Improved Coverage and Customization") We've expanded theme support across both the embedded and org-facing UI to deliver more consistent and brand-aligned experiences. What's new: You can now define a Neutral value in your theme settings, which dynamically generates the full neutral palette. Many UI elements (including inputs, dropdowns, hover states, filter menus, and more) now reference theme tokens for styling. Previously unthemable components have been updated to use a theme value that you can control for better alignment with your brand. Where theme improvements apply: * Embedded Workflow Builder * Embedded Designer * Configuration Wizard * Marketplace * Dashboard * Integration Listing Pages These improvements enhance visual consistency, boost polish and professionalism, and help your product feel even more cohesive to end users. JULY 17, 2025 ##### Embedded Workflow Builder[​](#embedded-workflow-builder "Direct link to Embedded Workflow Builder") We're excited to announce the next evolution of our embedded workflow builder (formerly embedded designer)! Based on extensive customer feedback, we've reimagined the experience to make customer self-service integrations simpler and more powerful. Key improvements: * **Streamlined workflow building:** Configure connections and data sources as part of setting up step actions, eliminating the need for separate configuration wizards. * **One-click publishing:** Enable workflows instantly without additional marketplace configuration or instance setup steps. * **Error Management:** All errors consolidated within a single Status menu for faster troubleshooting. * **Safe exploration:** New read-only view allows examining workflows without risking accidental edits. * **Consolidated inputs:** Simplified input system consolidates Text, Config Variable, Reference, and Template inputs into a single, flexible type. * **Native integration experience:** Seamless embedding with customizable branding, terminology, and styling. Existing customers using the previous embedded designer can continue using that functionality while having the option to upgrade to the new embedded workflow builder at no additional charge. Read more about the embedded workflow builder in our [blog post](http://prismatic.io/blog/embedded-workflow-builder-is-now-generally-available/). JUNE 05, 2025 ##### MCP Flow Server and Flow Invocation Schema[​](#mcp-flow-server-and-flow-invocation-schema "Direct link to MCP Flow Server and Flow Invocation Schema") We recently announced [several](https://prismatic.io/blog/prismatic-ai-next-evolution/) AI-related initiatives at Prismatic. Today, we're excited to release [Flow Invocation Schema](https://prismatic.io/docs/ai/flow-invocation-schema.md) which gives your AI agents the context necessary to invoke your Prismatic workflows. Along with Flow Invocation Schema, we're also releasing a [Model Context Protocol (MCP)](https://prismatic.io/docs/ai/model-context-protocol.md) flow server which exposes your agent flows as tools for your AI agents to use. 
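As a rough sketch of what "agent flows as tools" looks like from the client side, the snippet below uses the open-source MCP TypeScript SDK to connect to a flow server endpoint and list the tools it exposes. The endpoint URL and the bearer-token auth shown are placeholders, not Prismatic's actual values; consult the MCP flow server and connecting-AI-agents docs for the real connection details.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Placeholder URL; the real flow server endpoint is shown in the MCP tab of your integration.
const FLOW_SERVER_URL = new URL("https://example.prismatic.io/mcp");

async function listFlowTools(accessToken: string) {
  // Auth mechanism is an assumption here; check Prismatic's docs for what the server expects.
  const transport = new StreamableHTTPClientTransport(FLOW_SERVER_URL, {
    requestInit: { headers: { Authorization: `Bearer ${accessToken}` } },
  });

  const client = new Client({ name: "flow-tool-explorer", version: "1.0.0" });
  await client.connect(transport);

  // Each enabled agent flow should appear as an MCP tool.
  const { tools } = await client.listTools();
  for (const tool of tools) {
    console.log(`${tool.name}: ${tool.description ?? ""}`);
  }

  await client.close();
}
```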
JUNE 04, 2025 ##### New Components - June 2025[​](#new-components---june-2025 "Direct link to New Components - June 2025") We are pleased to announce the following AI components have been recently added to the Prismatic library: * [Anthropic](https://prismatic.io/docs/components/anthropic.md) - Anthropic is an artificial intelligence research company that provides various AI systems and large language models (LLMs). * [Google Gemini](https://prismatic.io/docs/components/google-gemini.md) - Google Gemini is a family of advanced multimodal AI models developed by Google DeepMind. * [DeepSeek](https://prismatic.io/docs/components/deepseek.md) - DeepSeek is an AI developer of large language models (LLMs) focused on providing high-performance models. MAY 19, 2025 ##### New Component - May 2025[​](#new-component---may-2025 "Direct link to New Component - May 2025") We are pleased to announce the following component has been recently added to the Prismatic library: * [Hibob](https://prismatic.io/docs/components/hibob.md) - Hibob is a cloud-based HR platform that provides tools for managing employee data, payroll, benefits, and performance. MAY 13, 2025 ##### Low-code / code-native converter and testing code-native from the CLI[​](#low-code--code-native-converter-and-testing-code-native-from-the-cli "Direct link to Low-code / code-native converter and testing code-native from the CLI") Two significant improvements to code-native integrations are now out of beta: 1. You can now convert a low-code integration to code-native. This is ideal if you've built a proof-of-concept in the low-code designer, but would like the flexibility to build the rest of the integration in pure TypeScript. Check out the documentation [here](https://prismatic.io/docs/integrations/code-native/get-started/convert-low-code-code-native.md). 2. As a developer, you probably want to remain in your IDE as you build and test your integration. With the `prism integrations:flows:test` command, you can now invoke a test of your code-native integration's flow from your terminal. Read more [here](https://prismatic.io/docs/integrations/code-native/testing.md#testing-a-code-native-integration-from-the-cli). APRIL 17, 2025 ##### Templated connection inputs[​](#templated-connection-inputs "Direct link to Templated connection inputs") If your users need to enter the same information in several connection input fields - for example, if they need to enter their third-party custom domain in an OAuth authorization URL input, a token input, and an API base URL input - [templated connection inputs](https://prismatic.io/docs/custom-connectors/connections.md#templating-connection-inputs) can help. Our custom connector SDK has been updated so you can now prompt your customer to enter a single value, and other inputs for that connection can be generated automatically using that value. ![Screenshot of templating connection inputs](/docs/img/changelog/templating-connection-inputs.png) Additionally, `comments` you provide for inputs on your connections now support markdown, so you can bold or otherwise highlight important instructions for your customers. The built-in [Shopify](https://prismatic.io/docs/components/shopify.md) connector has been updated to use this new templated connection input logic, as Shopify issues unique OAuth 2.0 endpoints for each Shopify store.
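To illustrate the idea, here is a hedged sketch of a custom connector connection in which a single subdomain input templates a hidden base URL input. The `templateValue` field and `{{#subdomain}}` placeholder syntax are assumptions based on the description above; the templating docs linked above define the exact property names and syntax.

```typescript
import { connection } from "@prismatic-io/spectral";

// Sketch: the customer enters a subdomain and API key once; the base URL is derived.
export const acmeApiKey = connection({
  key: "acmeApiKey",
  label: "Acme API Key",
  inputs: {
    subdomain: {
      label: "Acme Subdomain",
      placeholder: "my-company",
      type: "string",
      required: true,
    },
    baseUrl: {
      label: "API Base URL",
      type: "string",
      required: true,
      shown: false,
      // Assumed templating syntax: derive this input from the subdomain entered above.
      templateValue: "https://{{#subdomain}}.acme.example.com/api",
    },
    apiKey: {
      label: "API Key",
      type: "password",
      required: true,
      // Input comments now support markdown, e.g. bolding important instructions.
      comments: "You can find your API key under **Settings > API** in Acme.",
    },
  },
});
```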
APRIL 17, 2025 ##### New Components - April 2025[​](#new-components---april-2025 "Direct link to New Components - April 2025") We are pleased to announce the following components have been recently added to the Prismatic library: * [Goto Webinar](https://prismatic.io/docs/components/gotowebinar.md) - Goto Webinar is a platform for hosting, managing, and attending live or pre-recorded webinars. This component allows you to schedule, manage, and subscribe to webinars, registrants, attendees, and more. * [Oracle Database](https://prismatic.io/docs/components/oracledb.md) - Oracle Database is a popular relational database system. This component allows you to query an Oracle database. MARCH 31, 2025 ##### Resetting JSON Forms on Input Change[​](#resetting-json-forms-on-input-change "Direct link to Resetting JSON Forms on Input Change") You can now configure your JSON Forms config variables to reset data to defaults if inputs to the config variable change. This is useful if your JSON Form generates default data that is derived from other config variables on previous pages, and the defaults should change if the user selects different values on previous config pages. Read about resetting JSON Forms data [here](https://prismatic.io/docs/integrations/data-sources/json-forms.md#detecting-changes-to-inputs-and-overriding-default-data). MARCH 24, 2025 ##### Recursive Flow Trigger[​](#recursive-flow-trigger "Direct link to Recursive Flow Trigger") Flows can run for up to 15 minutes. But, sometimes you have more than 15 minutes of work to do. Maybe you have 100,000 records to import when an instance is deployed, and you know that it'll take you 4 hours to process them. The [Recursive Flow](https://prismatic.io/docs/components/recursive-flow.md) component helps you chain a series of executions together, so you can process a set of data for more than 15 minutes. See the [Processing Data with Recursive Flows](https://prismatic.io/docs/integrations/common-patterns/processing-data-recursive-flows.md) article for examples of how you can process large sets of data across several executions. MARCH 21, 2025 ##### New Component - March 2025[​](#new-component---march-2025 "Direct link to New Component - March 2025") We are pleased to announce the following component has been recently added to the Prismatic library: * [WhatsApp](https://prismatic.io/docs/components/whatsapp.md) - WhatsApp is a messaging app that allows users to send texts, make voice and video calls, and share media. MARCH 11, 2025 ##### Debug mode[​](#debug-mode "Direct link to Debug mode") You can now enable [debug mode](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode) in the integration designer or for a specific instance, which gives you more insight into time and memory metrics of your running flow. When debug is enabled, a log line is emitted after each step that details how long the step took and how much memory was consumed. Additionally, a `debug` object has been added to the [`context`](https://prismatic.io/docs/custom-connectors/actions.md#the-context-parameter) parameter so that you can optionally emit debug lines, measure how long certain tasks take within a code block or custom action, or measure how much memory is consumed by portions of your custom code when debug mode is enabled. Read more [here](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode). Prismatic's public components will start transitioning to using debug mode rather than action-specific debug toggles. 
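As an illustration, a custom action might gate extra diagnostics on debug mode like the sketch below. The shape of `context.debug` (an `enabled` flag) is an assumption based on the description above; the debug mode docs define the actual object, and the record-fetching helper is a hypothetical placeholder.

```typescript
import { action } from "@prismatic-io/spectral";

// Hypothetical stand-in for a real API call, so the sketch is self-contained.
const fetchRecordsSomehow = async (): Promise<unknown[]> => [];

const syncRecords = action({
  display: {
    label: "Sync Records",
    description: "Fetch records and log timing details when debug mode is on",
  },
  inputs: {},
  perform: async (context) => {
    const started = Date.now();

    const records = await fetchRecordsSomehow();

    // Assumed flag: older Spectral typings may not declare `debug` on the context.
    const debugEnabled = Boolean(
      (context as { debug?: { enabled?: boolean } }).debug?.enabled
    );
    if (debugEnabled) {
      context.logger.debug(
        `Fetched ${records.length} records in ${Date.now() - started}ms`
      );
    }

    return { data: records };
  },
});
```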
MARCH 06, 2025 ##### Introducing Additional Connection Types[​](#introducing-additional-connection-types "Direct link to Introducing Additional Connection Types") We've enhanced our connection capabilities to give you better control over authentication across your integrations. What's new: * **Updated**: [Organization-activated customer connections](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-customer.md) capabilities have been expanded to include support for connections that use OAuth 2.0 Client Credentials. This improves your organization's ability to establish a connection on behalf of your customers. * **New**: [Organization-activated global connections](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-global.md) allow your organization to configure and establish a central live connection that's invisible to customers at deploy time and works across multiple instances. * **New**: [Customer-activated connections](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/customer-activated.md) create a centralized system for managing connections that customers enable themselves. * **Updated**: General UI/UX improvements. Head to the **Connections** tab in your organization's settings to set up a new integration-agnostic connection, and manage your connections and test credentials from there. These enhancements enable you to deliver a frictionless, customer-friendly authentication experience tailored to your integration needs, while also making connections easier to manage and reuse across multiple integrations. JANUARY 29, 2025 ##### Hiding the initial config wizard page[​](#hiding-the-initial-config-wizard-page "Direct link to Hiding the initial config wizard page") You can now *opt in* to remove the first page of the config wizard, creating a smoother activation process for your customers. Opt in by upgrading the [embedded SDK](https://www.npmjs.com/package/@prismatic-io/embedded) to version 2.12.0 and setting the `configurationWizard.mode` to `"streamlined"`. Read [docs](https://prismatic.io/docs/embed/marketplace.md#configuration-wizard-customization) for more details. This is completely optional and won't affect your existing implementations. DECEMBER 17, 2024 ##### Product update: Native cross-flow calling[​](#product-update-native-cross-flow-calling "Direct link to Product update: Native cross-flow calling") We've simplified how flows work together in Prismatic. Now you can directly reference and call other flows within the same integration with a simple cross-flow component. This makes it more intuitive to build integrations that need to coordinate between multiple flows. ![Add a cross-flow trigger to an integration](/docs/img/changelog/cross-flow-triggers.png) Key improvements: > 🖥️ Directly call other flows through a simple interface > > 🔃 Clear visibility of flow relationships in both testing and execution > > ⏱️ Real-time monitoring of called flow status Perfect for breaking down complex processes into manageable pieces or creating reusable logic across your integrations. *Want to get started?
Check out our [docs](https://prismatic.io/docs/integrations/triggers/cross-flow.md).* DECEMBER 13, 2024 ##### New Components - December 2024[​](#new-components---december-2024 "Direct link to New Components - December 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Active Directory](https://prismatic.io/docs/components/ldap.md) - Active Directory for LDAP (Lightweight Directory Access Protocol) is a protocol for accessing and managing directory information. This component provides tools for operations such as authentication, querying, and managing directory entries. * [TeamViewer](https://prismatic.io/docs/components/teamviewer.md) - TeamViewer is support software that allows users to connect and control devices remotely for troubleshooting, collaboration, and management purposes. NOVEMBER 21, 2024 ##### Polling Triggers[​](#polling-triggers "Direct link to Polling Triggers") Many apps offer [webhooks](https://prismatic.io/docs/integrations/triggers/webhook.md) which notify you when something changes in their system. Not all apps support webhooks, though. When an app doesn't offer webhooks, you need to poll their API periodically for new data. Today, we're releasing **Polling Triggers** - triggers that poll for new records and start an execution if new data is available to process. We've added polling triggers to our [Dropbox](https://prismatic.io/docs/components/dropbox.md) and [Google Drive](https://prismatic.io/docs/components/google-drive.md) components, with more on the way! You can read more about polling triggers [here](https://prismatic.io/docs/integrations/triggers/app-events.md#app-event-triggers-with-polling), or [write your own](https://prismatic.io/docs/custom-connectors/triggers.md#app-event-polling-triggers). NOVEMBER 15, 2024 ##### New Components - November 2024[​](#new-components---november-2024 "Direct link to New Components - November 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Toast](https://prismatic.io/docs/components/toast.md) - Toast is a cloud-based point-of-sale system designed specifically for the restaurant industry, offering tools for order management, payments, and business insights. * [Tenable Vulnerability Management](https://prismatic.io/docs/components/tenable-vulnerability-management.md) - Tenable Vulnerability Management is a leading security solution that identifies, evaluates, and prioritizes vulnerabilities to reduce risk and enhance cybersecurity. * [PDQ](https://prismatic.io/docs/components/pdq.md) - PDQ provides a suite of management tools to automate software deployment, manage patches, and track inventory across a company's networks. * [Freshservice](https://prismatic.io/docs/components/freshservice.md) - Freshservice is a cloud based IT service management software that streamlines IT operations, automates workflows, and improves service delivery for organizations. * [Azure Event Grid](https://prismatic.io/docs/components/azure-event-grid.md) - Microsoft Event Grid is used to build data pipelines, integrate applications, and create event-driven serverless solutions with a fully managed publish-subscribe messaging service. 
NOVEMBER 05, 2024 ##### Requiring Components in Embedded Builder[​](#requiring-components-in-embedded-builder "Direct link to Requiring Components in Embedded Builder") If your customers build integrations for themselves within your app using the [embedded builder](https://prismatic.io/docs/embed/workflow-builder.md), you may want to require that your customers' integrations include particular components (like your custom components). You can now require that your customers include certain components in their integration before they are able to publish. Read more in [docs](https://prismatic.io/docs/embed/workflow-builder/designer.md#requiring-components). OCTOBER 22, 2024 ##### Control Names of Branded Elements[​](#control-names-of-branded-elements "Direct link to Control Names of Branded Elements") We call the place where your customers go to enable integrations the "Marketplace", and we call a set of flows with a config wizard an "Integration". But your company may use other terms like "Solution" or "Workflow" to describe these concepts. You can now control the names of branded elements (like "Marketplace" or "Integration") and rename them to the names that you use internally. ![Renamed marketplace in the embedded app](/docs/img/changelog/renamed-marketplace.png) Changes you make will be reflected in the embedded marketplace and the embedded workflow builder. Read more about controlling the names of branded elements in [docs](https://prismatic.io/docs/embed/theming.md#renaming-integration-and-marketplace). OCTOBER 10, 2024 ##### New Components - October 2024[​](#new-components---october-2024 "Direct link to New Components - October 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Frontify](https://prismatic.io/docs/components/frontify.md) - Frontify is a comprehensive brand management platform that enables organizations to create, manage, and distribute brand assets, guidelines, and digital content across teams and channels, streamlining brand consistency and collaboration. * [PagerDuty](https://prismatic.io/docs/components/pagerduty.md) - PagerDuty is an industry-leading incident management tool. Use this component to create and manage incidents and events. SEPTEMBER 25, 2024 ##### Organization-Activated Connections[​](#organization-activated-connections "Direct link to Organization-Activated Connections") If you rely on connections that are customer-specific, and the same connection is used in multiple integrations, [organization-activated connections](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-customer.md) can help. Organization-activated connections let you define a connection once for a customer, and then use that connection in multiple instances that are deployed for that customer. This is especially handy when interacting with your own app in integrations. Each of your customers likely has a unique API key for your API. You can set an API key for each of your customers once, and when an instance is deployed to a customer, that instance will use the API key you set for that customer.
SEPTEMBER 13, 2024 ##### New Components - September 2024[​](#new-components---september-2024 "Direct link to New Components - September 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Ramp](https://prismatic.io/docs/components/ramp.md) - Ramp is a spend management platform focused on automating accounts payable and procurement processes. * [Microsoft Entra ID (Formerly Azure Active Directory)](https://prismatic.io/docs/components/ms-entra-id.md) - Microsoft Entra ID (Formerly Azure Active Directory) is a cloud-based identity and access management service from Microsoft that helps employees sign in and access resources. * [Gorgias](https://prismatic.io/docs/components/gorgias.md) - Gorgias is a customer support platform designed to help e-commerce businesses manage customer inquiries and support tickets efficiently. * [SAP Business One](https://prismatic.io/docs/components/sap-business-one.md) - SAP Business One is an integrated enterprise resource planning (ERP) solution designed for organizations to manage their entire operations. AUGUST 05, 2024 ##### New Components - August 2024[​](#new-components---august-2024 "Direct link to New Components - August 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Klaviyo](https://prismatic.io/docs/components/klaviyo.md) - Klaviyo is a cloud based email marketing solution that enables e-commerce businesses to create, send, and analyze email and SMS campaigns. * [Bill](https://prismatic.io/docs/components/bill.md) - Bill is a leading provider of cloud-based software that simplifies and automates back-office financial operations for small and midsize businesses. JULY 31, 2024 ##### Write-Only Connection Inputs[​](#write-only-connection-inputs "Direct link to Write-Only Connection Inputs") In some situations, it can be helpful to make a connection input **write-only** (e.g. a user can enter an API key, password, etc, but cannot retrieve it later). You can now mark any connection input as **write-only**. Read more about write-only connection inputs in our [docs](https://prismatic.io/docs/integrations/config-wizard/config-variables.md#write-only-connection-inputs). JULY 17, 2024 ##### Reference Existing Actions in Code-Native[​](#reference-existing-actions-in-code-native "Direct link to Reference Existing Actions in Code-Native") You can now reference existing components' actions in a code-native integration's flow. This allows you to leverage Prismatic's existing [component library](https://prismatic.io/docs/components.md) while working in your favorite IDE with TypeScript and other tools you love. Read more about invoking actions in code-native in our [docs](https://prismatic.io/docs/integrations/code-native/existing-components.md). JULY 08, 2024 ##### New Components - July 2024[​](#new-components---july-2024 "Direct link to New Components - July 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Karbon](https://prismatic.io/docs/components/karbon.md) - Karbon is a collaborative practice management platform for accounting firms. * [Workday (Beta)](https://prismatic.io/docs/components/workday.md) - Workday HCM is a single, cloud-based solution for workforce planning, talent management, and payroll processes. 
* [Duro PLM](https://prismatic.io/docs/components/duro-plm.md) - Duro PLM is a platform designed to intuitively centralize part data, manage change orders, and connect to the rest of your tech stack. JULY 08, 2024 ##### Start a New Integration From a Template[​](#start-a-new-integration-from-a-template "Direct link to Start a New Integration From a Template") When you create a new integration, you can now start from a variety of Prismatic-built integration templates. The integration templates demonstrate common integration patterns across our most popular connectors. ![Configure a new integration](/docs/img/changelog/integration-templates.png) JUNE 20, 2024 ##### GitHub Actions for Components and Integrations[​](#github-actions-for-components-and-integrations "Direct link to GitHub Actions for Components and Integrations") You can now use Prismatic-provided GitHub Actions to automate the deployment of your components and integrations to Prismatic. This is handy if you have multiple Prismatic tenants and you've designated one for development and the other(s) for production. Read more about how to use the [Prismatic GitHub Actions](https://prismatic.io/docs/api/github-actions.md) in our docs. JUNE 14, 2024 ##### New Components - June 2024[​](#new-components---june-2024 "Direct link to New Components - June 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Sage 200](https://prismatic.io/docs/components/sage-200.md) - Sage 200 is an online business management solution designed to help businesses manage their finances, customers, and business insight in one place. * [Bynder](https://prismatic.io/docs/components/bynder.md) - Bynder is a leading digital asset management platform that allows users to easily create, find, and use content, such as documents, graphics, and videos. JUNE 11, 2024 ##### In-app Docs[​](#in-app-docs "Direct link to In-app Docs") You can now search Prismatic's docs from within the web app. This makes it easier to find the information you need while building integrations. To open up the docs search, click "Help" on the top right of the Prismatic web app and search for the topic you're interested in. ![In-app docs](/docs/img/changelog/in-app-docs.png) MAY 21, 2024 ##### On-Prem Agent[​](#on-prem-agent "Direct link to On-Prem Agent") The on-prem agent allows you to connect your integrations to resources that are on private networks or behind firewalls. This is handy if you want to integrate with a database, file system, or other resource that is not accessible from the public internet. Read more about how the on-prem agent works in [docs](https://prismatic.io/docs/integrations/connections/on-prem-agent.md). MAY 14, 2024 ##### New Components - May 2024[​](#new-components---may-2024 "Direct link to New Components - May 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Microsoft Intune](https://prismatic.io/docs/components/ms-intune.md) - Microsoft Intune is a cloud-based service that focuses on device management and application management. Use the Microsoft Intune component to manage users, devices, and applications. * [ServiceDesk Plus](https://prismatic.io/docs/components/servicedesk-plus.md) - ServiceDesk Plus is comprehensive service desk software that offers a suite of IT service management, IT asset management, CMDB, and more. Use the ServiceDesk Plus component to efficiently manage Assets and Configuration Items for integrations.
* [Databricks](https://prismatic.io/docs/components/databricks.md) - Databricks is an analytics and artificial intelligence platform where you can build, scale and govern data and AI, including generative AI and other machine learning models. Manage compute, workflow jobs, ML models, SQL queries and more within a Databricks workspace. * [Zendesk Knowledge](https://prismatic.io/docs/components/zendesk.md) - Zendesk Knowledge is a comprehensive solution that allows you to manage content such as articles, sections, categories, topics, and posts, as well as subscriptions and attachments in the Help Center of Zendesk. The Knowledge and Help Center actions have been added to our existing Zendesk Component. MAY 03, 2024 ##### New Component - April 2024[​](#new-component---april-2024 "Direct link to New Component - April 2024") We are pleased to announce the following component has been recently added to the Prismatic library: * [Contentful](https://prismatic.io/docs/components/contentful.md) - Contentful is a content management system (CMS) that allows developers to manage and deliver content across multiple platforms and devices. APRIL 15, 2024 ##### Code-Native Integrations - Reference components[​](#code-native-integrations---reference-components "Direct link to Code-Native Integrations - Reference components") You can now reference triggers, connections, and data sources of existing components when building code-native integrations. Both public and private components can be used in your code-native integrations. By leveraging an existing component, you can save time and effort by reusing existing functionality. To reference existing components you will need to upgrade to Prism 6.0.0 and Spectral 8.1.0. Prism 6.0.0 requires Node 18 or higher. For those of you that have already started building code-native integrations, note that type definitions now use strings rather than enums to minimize the number of imports required. Check out the [code-native documentation](https://prismatic.io/docs/integrations/code-native.md) and the video below to learn more. MARCH 29, 2024 ##### New Components - March 2024[​](#new-components---march-2024 "Direct link to New Components - March 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [ServiceTitan](https://prismatic.io/docs/components/servicetitan.md) - ServiceTitan is a comprehensive field service management solution that helps businesses manage their operations, workforce, and customer service. * [Yoti Sign](https://prismatic.io/docs/components/yoti-sign.md) - Yoti Sign is a digital identity and e-signature solution that allows users to verify their identity and sign documents electronically and securely. MARCH 07, 2024 ##### Code-Native Integrations[​](#code-native-integrations "Direct link to Code-Native Integrations") We've introduced a new option for building integrations called code-native! This is the perfect building experience for integration builders who prefer to write code rather than use a low-code designer. Code-native highlights: * Build integrations completely within your IDE rather than in the low-code designer. * Define triggers, connections, integration logic, and the customer-facing configuration experience all in code. * Define integration logic however you see fit rather than leveraging predefined logic steps. * More easily incorporate integrations into your existing CI/CD process and code repositories. 
Check out [this blog post](https://prismatic.io/blog/introducing-code-native-integrations/) as well as the video below for more information about the benefits of code-native integrations. Our [code-native documentation](https://prismatic.io/docs/integrations/code-native.md) is an excellent place to start when you're ready to build your first code-native integration.
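For a sense of scale, a single-flow code-native integration can be as small as the sketch below. The flow body and names are illustrative only, not taken from the docs; real integrations would also reference connections, config variables, and existing component actions.

```typescript
import { flow, integration } from "@prismatic-io/spectral";

// A flow with the default webhook trigger: it receives a payload and echoes it back.
const importRecords = flow({
  name: "Import Records",
  stableKey: "import-records",
  description: "Receive a webhook payload and return it as the flow result",
  onExecution: async (context, params) => {
    // The webhook payload that started this execution is available on the trigger results.
    const payload = params.onTrigger.results.body.data;
    context.logger.info(`Received payload: ${JSON.stringify(payload)}`);
    return { data: payload };
  },
});

export default integration({
  name: "Example Code-Native Integration",
  description: "A single-flow integration defined entirely in TypeScript",
  iconPath: "icon.png",
  flows: [importRecords],
});
```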
MARCH 01, 2024 ##### New Components - February 2024[​](#new-components---february-2024 "Direct link to New Components - February 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [ArcGIS](https://prismatic.io/docs/components/arcgis.md) - Esri ArcGIS is an online geographic information system providing and maintaining detailed information and tools for maps and locations. * [Aspose](https://prismatic.io/docs/components/aspose.md) - Aspose is a robust file manipulation service that can manage various document and image file formats. Use the Aspose component to create, edit, process, and convert file formats from several languages and platforms. FEBRUARY 05, 2024 ##### Extended i18n Support[​](#extended-i18n-support "Direct link to Extended i18n Support") You can now provide translations for dynamic phrases in your embedded marketplace. That means that you can translate things like the names of your integrations, config variables, flows, steps, config wizard page titles, and more. Update to the latest version of `@prismatic-io/embedded` and check out our [docs](https://prismatic.io/docs/embed/translations-and-internationalization.md) to take advantage of this new feature. JANUARY 26, 2024 ##### New Components - January 2024[​](#new-components---january-2024 "Direct link to New Components - January 2024") We are pleased to announce the following components have been recently added to the Prismatic library: * [Adobe Acrobat Sign](https://prismatic.io/docs/components/adobe-acrobat-sign.md) - Adobe Acrobat Sign is an e-signature management solution. Use the Adobe Acrobat Sign component to send, sign, track, and manage the signature process. * [DocuSign](https://prismatic.io/docs/components/docusign.md) - DocuSign provides intuitive solutions for sending and collecting signatures on documents. Use the DocuSign component to manage signature collection and document distribution. JANUARY 26, 2024 ##### New Components - December 2023[​](#new-components---december-2023 "Direct link to New Components - December 2023") We are pleased to announce the following component has been recently added to the Prismatic library: * [Sage Intacct](https://prismatic.io/docs/components/sage-intacct.md) - Industry-leading financial accounting software system with a broad set of functionalities for businesses across a number of different verticals. Use the Sage Intacct component to manage Invoices, Payments, Vendors, and more. DECEMBER 19, 2023 ##### New Integration Designer UX[​](#new-integration-designer-ux "Direct link to New Integration Designer UX") We've built a new integration designer experience! Many of our customers have already switched to the new designer, and we're hearing great things. The new designer makes it faster and easier to build integrations in our low-code environment. Here are the highlights: 1. Streamlined designer canvas that makes better use of space 2. Panning and zooming so you can move around quickly 3. A more intuitive experience for adding and configuring integration steps 4. ...and more enhancements coming soon! [New Designer UX](https://player.vimeo.com/video/894294812) To try out the new experience, just toggle the Designer Beta setting you'll see at the top of your integration designer screen. (Note that any of your customers using the embedded designer won't see the beta version at this time.) We love it, and we hope you will too!
DECEMBER 07, 2023 ##### Labels and Categories in Integration YAML[​](#labels-and-categories-in-integration-yaml "Direct link to Labels and Categories in Integration YAML") An integration's labels and category are now included in its YAML definition when it is exported as a YAML file. DECEMBER 04, 2023 ##### New Components - November 2023[​](#new-components---november-2023 "Direct link to New Components - November 2023") We are pleased to announce the following components have been recently added to the Prismatic library: * [Expensify](https://prismatic.io/docs/components/Expensify.md) - Programmatically download expense report data for analysis or insertion into your accounting package, provision accounts for new hires, and much more. * [Calendly](https://prismatic.io/docs/components/calendly.md) - Manage the scheduling of events and attendee availability, and retrieve pertinent data on users and attendees. * [Confluence](https://prismatic.io/docs/components/confluence.md) - Manage spaces, pages, and content properties on your Confluence workspaces. * [Qlik](https://prismatic.io/docs/components/qlik.md) - Manage your Data Sets, Assets, and Apps. NOVEMBER 29, 2023 ##### Connection templates[​](#connection-templates "Direct link to Connection templates") [Connection templates](https://prismatic.io/docs/integrations/connections/integration-specific.md#connection-templates) allow you to pre-populate input values for common connections used by you and your customers. For example, you can create an OAuth 2.0 connection template for Salesforce that pre-populates your OAuth client ID and client secret. When you (or your customers in the embedded designer) create a Salesforce connection, you can select the Salesforce connection template and will not need to set up a client ID and secret. OCTOBER 27, 2023 ##### New Components - October 2023[​](#new-components---october-2023 "Direct link to New Components - October 2023") We are pleased to announce the following components have been recently added to the Prismatic library: * [Paylocity](https://prismatic.io/docs/components/paylocity.md) - Performs tasks related to workforce management, payroll, and other HR functions. SEPTEMBER 29, 2023 ##### New Components - September 2023[​](#new-components---september-2023 "Direct link to New Components - September 2023") We are pleased to announce the following components have been recently added to the Prismatic library: * [Azure OpenAI Service](https://prismatic.io/docs/components/azure-openai-service.md) - Performs OpenAI functions using Microsoft Azure's OpenAI service models. * [Adobe I/O Events](https://prismatic.io/docs/components/adobe-io-events.md) - Facilitates triggering on changes to content and data on Adobe's Experience Platform, or when predefined rules or thresholds have been met. * [ShipStation](https://prismatic.io/docs/components/shipstation.md) - This component allows you to list, create, update, and delete orders and shipments in your ShipStation account. * [Segment](https://prismatic.io/docs/components/segment.md) - Manage the Sources, Warehouses, and Destinations of your Segment account. * [Amazon Seller Central](https://prismatic.io/docs/components/amazon-seller-central.md) - Manage your catalog, orders, and shipping information of the managed stores on your Amazon Seller account.
SEPTEMBER 26, 2023 ##### New Trigger Events[​](#new-trigger-events "Direct link to New Trigger Events") Custom triggers can now handle the following events: * `onInstanceDeploy` - when an instance is deployed, all triggers with an `onInstanceDeploy` function will execute that function. These functions are handy for setting up webhooks in third-party apps, or for updating your own API to let your team know that a customer has deployed an instance. * `onInstanceDelete` - when an instance is deleted, all triggers with an `onInstanceDelete` function will execute that function. These functions are handy for cleaning up configuration in third-party apps, or for updating your own API to let your team know that a customer has deleted an instance. Read more about these new events in our [docs](https://prismatic.io/docs/custom-connectors/triggers.md#instance-deploy-and-delete-events-for-triggers). SEPTEMBER 25, 2023 ##### Disabling Log and Step Result Retention[​](#disabling-log-and-step-result-retention "Direct link to Disabling Log and Step Result Retention") For compliance reasons, your organization may need to disable the storage of logs and step results. You can now disable the storage of logs and step results on a per-instance basis. For more information, see [docs](https://prismatic.io/docs/monitor-instances/logging.md#disabling-logs-and-step-results). SEPTEMBER 20, 2023 ##### Custom Fonts in Embedded[​](#custom-fonts-in-embedded "Direct link to Custom Fonts in Embedded") You can now use custom fonts in your embedded marketplace and designer. This is handy if you want to match the fonts in your embedded marketplace to the fonts in your app. Prismatic currently supports any font available in the [Google Fonts](https://fonts.google.com/) catalog. Read more about [custom fonts](https://prismatic.io/docs/embed/theming.md#using-a-custom-font) in our docs. SEPTEMBER 05, 2023 ##### Customer-Scoped Custom Components[​](#customer-scoped-custom-components "Direct link to Customer-Scoped Custom Components") Customer users can now build and deploy their own custom components, and can use those custom components in integrations they build in your embedded marketplace. This is handy if your customers need to build custom components to interact with their own internal systems, or if they need to build custom components that interact with third-party apps that you don't already support. Read more about customer-scoped custom components in the [Embedded Designer](https://prismatic.io/docs/custom-connectors.md#customer-users-and-custom-components) article. SEPTEMBER 01, 2023 ##### New Components - September 1, 2023[​](#new-components---september-1-2023 "Direct link to New Components - September 1, 2023") We are pleased to announce the following components have been recently added to the Prismatic library: * [BigCommerce](https://prismatic.io/docs/components/bigcommerce.md) - Manage your Products, Brands, Categories, and more. * [Domo](https://prismatic.io/docs/components/domo.md) - Manage your Projects, Streams, and various other actions within your business's data sets. * [Gong](https://prismatic.io/docs/components/gong.md) - Manage your calls, users, and libraries to best collect insights from customer interactions. * [Google Cloud Pub/Sub](https://prismatic.io/docs/components/google-cloud-pub-sub.md) - Subscribe to topics and configure push notifications for your various Google integrations. * [Google Docs](https://prismatic.io/docs/components/google-docs.md) - Manage and share documents from your Google cloud.
* [Mixpanel](https://prismatic.io/docs/components/mixpanel.md) - Manage your custom reports and measure user engagement with collected data. * [Sage HR](https://prismatic.io/docs/components/sage-hr.md) - Manage employees, teams, projects, and more in this robust Human Resources solution. * [ShipBob](https://prismatic.io/docs/components/shipbob.md) - Manage orders and shipments, and generate labels with this fulfillment services solution. * [Square](https://prismatic.io/docs/components/square.md) - Manage your total point of sale system including payments, refunds, and inventory. * [Zendesk Sell](https://prismatic.io/docs/components/zendesk-sell.md) - Manage your sales force automation including leads, orders, and deals. AUGUST 15, 2023 ##### Integration Runner Moving to NodeJS 18.x[​](#integration-runner-moving-to-nodejs-18x "Direct link to Integration Runner Moving to NodeJS 18.x") **Platform Announcement**: We are pleased to announce that we will be transitioning our integration runner from NodeJS 14.x to 18.x beginning on Monday, August 21st. Running on the latest LTS version of NodeJS has huge advantages, including performance improvements, continued support and security patches, and a baked-in [fetch API](https://developer.mozilla.org/en-US/docs/Web/API/fetch). This change will not require any action on your part. Your flows and components will transition seamlessly to their new environment. We will let you know once the upgrade is complete, so you can take advantage of any new features NodeJS 18.x has to offer. AUGUST 10, 2023 ##### Embedded designer improvements[​](#embedded-designer-improvements "Direct link to Embedded designer improvements") Two new features were added to embedded designer: 1. You can now issue a `prismatic.showDashboard()` call to give users a holistic view of their integrations, instances, executions, logs, and more. See [docs](https://prismatic.io/docs/embed/additional-screens.md#showing-the-customer-dashboard) for more information. ![Open the customer dashboard in embedded](/docs/img/changelog/show-dashboard.png) 2. You can now filter the list of components that are available to your customers. See [docs](https://prismatic.io/docs/embed/workflow-builder/designer.md#filtering-components) for more information. AUGUST 09, 2023 ##### Embedded Designer[​](#embedded-designer "Direct link to Embedded Designer") Your customers may want to build integrations for themselves between your product and the other apps and services they use. Embedded designer allows your customers to log in to your app and build integrations using Prismatic's integration designer. You can provision your customers with access to private components that you have built, and can provide your customers with "templates" that they can use as a starting point for building their own integrations. [Embedded Designer Announcement](https://player.vimeo.com/video/852820250) Read more about the embedded designer on the [Embedding Integration Designer](https://prismatic.io/docs/embed/workflow-builder.md) docs page. If you're interested in using embedded designer, please [contact support](mailto:support@prismatic.io) to discuss enabling embedded designer for your organization. ##### New Embedded SDK[​](#new-embedded-sdk "Direct link to New Embedded SDK") Along with embedded designer comes a new embedded SDK.
If you are currently using [@prismatic-io/marketplace](https://www.npmjs.com/package/@prismatic-io/marketplace) for embedded marketplace, swap it out for the new [@prismatic-io/embedded](https://www.npmjs.com/package/@prismatic-io/embedded) NodeJS package to begin using the embedded designer features. `@prismatic-io/marketplace` will continue to work, but `@prismatic-io/embedded` offers a superset of its features, and new features and enhancements will only be added to the new embedded SDK. ##### Embedded SDK v2.0.0[​](#embedded-sdk-v200 "Direct link to Embedded SDK v2.0.0") The newest version of the embedded SDK changes the Marketplace default of `screenConfiguration.marketplace.configuration` to `allow-details` instead of `always-show-details`. See [docs](https://prismatic.io/docs/embed/marketplace.md#integration-configuration-detail-screen) for information on the integration configuration detail screen options. JULY 26, 2023 ##### Linking Execution Replays[​](#linking-execution-replays "Direct link to Linking Execution Replays") [Replays](https://prismatic.io/docs/monitor-instances/retry-and-replay.md) of executions are now linked to the original execution. This makes it easier to query for failed executions, and replay only those executions that don't have a subsequent replay that succeeded. [Linking Execution Replays](https://player.vimeo.com/video/848852783) See [docs](https://prismatic.io/docs/monitor-instances/retry-and-replay.md) for more information, and the [Examples repo](https://github.com/prismatic-io/examples/tree/main/api/replay-failed-executions) on GitHub for the script mentioned in the above video. JULY 25, 2023 ##### New Components - July 2023[​](#new-components---july-2023 "Direct link to New Components - July 2023") Some exciting new components landed this month: * [Google Content Shopping](https://prismatic.io/docs/components/google-content-shopping.md) - Update Google Content Shopping feeds, products, and more * [Azure Service Bus](https://prismatic.io/docs/components/azureServiceBus.md) - Manage Azure Service Bus queues and topics * [SAP S4/HANA](https://prismatic.io/docs/components/sapS4Hana.md) - Maintain ERP records in SAP S4/HANA * [Google BigQuery](https://prismatic.io/docs/components/google-cloud-bigquery.md) - Manage Google BigQuery datasets, tables, and more JUNE 30, 2023 ##### New Components - June 2023[​](#new-components---june-2023 "Direct link to New Components - June 2023") This month we released several new components to help you build integrations for your customers: * [Algolia](https://prismatic.io/docs/components/algolia.md) - Update Algolia indexes and records and perform search operations. * [ClickUp](https://prismatic.io/docs/components/click-up.md) - Manage ClickUp users, projects, and teams within your customers' ClickUp workspaces. * [Greenhouse](https://prismatic.io/docs/components/greenhouse.md) - Manage Greenhouse jobs, candidates, applications, and more. * [Postmark](https://prismatic.io/docs/components/postmark.md) - Send and receive emails using Postmark's email API. * [SMTP](https://prismatic.io/docs/components/smtp.md) - Send emails using SMTP. * [Snowflake](https://prismatic.io/docs/components/snowflake.md) - Manage Snowflake databases, warehouses, schemas, tables, and more.
JUNE 21, 2023 ##### Hide the Instance Details Configuration Page[​](#hide-the-instance-details-configuration-page "Direct link to Hide the Instance Details Configuration Page") You now have options for configuring how and whether a marketplace user can access the integration configuration details screen. This includes the ability to prevent a marketplace user from accessing the instance configuration details screen. This is useful if your customers should not access the **Test**, **Executions**, **Monitors** or **Logs** functionality. Check out our [docs](https://prismatic.io/docs/embed/marketplace.md#integration-configuration-detail-screen) for more information. MAY 30, 2023 ##### New Components - May 2023[​](#new-components---may-2023 "Direct link to New Components - May 2023") This month we released a new component to connect to [Adobe Analytics](https://prismatic.io/docs/components/adobe-analytics.md), so your customers can manage companies, report suites, metrics and more. MAY 16, 2023 ##### Multiple Instances of Integrations in Embedded Marketplace[​](#multiple-instances-of-integrations-in-embedded-marketplace "Direct link to Multiple Instances of Integrations in Embedded Marketplace") Your customers can now enable multiple instances of an integration through embedded marketplace. This allows your customers to set up several copies of an integration for themselves, each with different configurations. Check out our [docs](https://prismatic.io/docs/embed/marketplace.md#multiple-instances-of-one-integration-in-marketplace) for more information. MAY 15, 2023 ##### Embedded Config Variable Settings[​](#embedded-config-variable-settings "Direct link to Embedded Config Variable Settings") Granular config variable and connection input settings allow your org to better control who can see and set integration configuration values. Read more in [docs](https://prismatic.io/docs/integrations/config-wizard/config-variables.md#config-variable-visibility). ##### Endpoint Security Settings[​](#endpoint-security-settings "Direct link to Endpoint Security Settings") New endpoint security settings allow your org to configure which endpoints should have API keys and who should set them (org vs customer). Read more in [docs](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#securing-endpoints-with-api-keys). APRIL 28, 2023 ##### New Components - April 2023[​](#new-components---april-2023 "Direct link to New Components - April 2023") A couple of new components are available! A full catalog is available [here](https://prismatic.io/docs/components.md). This past month, we added: * [Gusto](https://prismatic.io/docs/components/gusto.md) - Manage payroll, benefits, and human resources within Gusto * [Notion](https://prismatic.io/docs/components/notion.md) - Manage Notion pages, databases, and users Additionally, the [Airtable](https://prismatic.io/docs/components/airtable.md) component was updated to handle OAuth connections, as API key connections are being deprecated early next year. APRIL 18, 2023 ##### Advanced Embedded Marketplace Filtering[​](#advanced-embedded-marketplace-filtering "Direct link to Advanced Embedded Marketplace Filtering") You can now use logical operators like `and`, `or`, `startsWith`, `notEqual` and more to filter the integrations in your embedded marketplace.
For example, if you would like to show all integrations that have a category "ERP" and label "paid", and would also like your Dropbox and Slack integrations to be displayed, a filter could look like:

```
[
  BooleanOperator.or,
  [
    BooleanOperator.and,
    [TermOperator.equal, "category", "ERP"],
    [TermOperator.in, "labels", "paid"],
  ],
  [TermOperator.equal, "name", "Dropbox"],
  [TermOperator.equal, "name", "Slack"],
];
```

Read more about advanced filters in the [docs](https://prismatic.io/docs/embed/marketplace.md#advanced-integration-filters). MARCH 30, 2023 ##### New Components - March 2023[​](#new-components---march-2023 "Direct link to New Components - March 2023") Our list of built-in components continues to grow. A full catalog is available [here](https://prismatic.io/docs/components.md). This past month, we added: * [Microsoft Graph API](https://prismatic.io/docs/components/ms-graph-api.md) - Interact with the Microsoft Graph API * [OpenAI](https://prismatic.io/docs/components/openai.md) - Interact with OpenAI models, including ChatGPT and DALLΒ·E Additionally, * [Microsoft Outlook](https://prismatic.io/docs/components/ms-outlook.md) - Added actions to interact with mailboxes, folders and email messages FEBRUARY 22, 2023 ##### New Components - February 2023[​](#new-components---february-2023 "Direct link to New Components - February 2023") We added a few additional components in February: * [Arena PLM](https://prismatic.io/docs/components/arena-plm.md) - Interact with items and resources in Arena PLM * [Google Analytics - GA4](https://prismatic.io/docs/components/google-analytics-ga4.md) - Manage Google Analytics GA4 accounts and data * [HTML Utils](https://prismatic.io/docs/components/html-utils.md) - Helpful HTML-related functions for building HTML documents and HTML-based emails. JANUARY 26, 2023 ##### New Components - January 2023[​](#new-components---january-2023 "Direct link to New Components - January 2023") Our list of built-in components continues to grow. A full catalog is available [here](https://prismatic.io/docs/components.md). This past month, we added: * [HTML Utils](https://prismatic.io/docs/components/html-utils.md) - Helpful HTML-related functions for building HTML documents and HTML-based emails * [JSON Forms](https://prismatic.io/docs/components/jsonforms.md) - Create powerful custom forms for the configuration wizard * [Marketo](https://prismatic.io/docs/components/marketo.md) - Manage Marketo records * [MessagePack](https://prismatic.io/docs/components/messagepack.md) - Efficiently serialize or deserialize data into a JSON-like format * [Microsoft Outlook](https://prismatic.io/docs/components/ms-outlook.md) - Read and manage Microsoft Outlook calendars * [NetSuite](https://prismatic.io/docs/components/netsuite.md) - Manage NetSuite records DECEMBER 29, 2022 ##### JSON Forms Config Variables[​](#json-forms-config-variables "Direct link to JSON Forms Config Variables") We've integrated [JSON Forms](https://jsonforms.io/) into our configuration wizard to give you more control over your users' integration configuration experience. You can build a static JSON form using the built-in [JSON Forms](https://prismatic.io/docs/components/jsonforms.md) component, or create dynamic configuration experiences by adding JSON Forms [data sources](https://prismatic.io/docs/custom-connectors/data-sources.md) to your custom components.
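For a sense of what a dynamic configuration experience looks like in code, here is a minimal, illustrative sketch of a JSON Forms data source built with `@prismatic-io/spectral`. The form fields and user list are hypothetical, and it assumes the SDK's `dataSource` helper with the `"jsonForm"` data source type; see the data sources docs for the exact types your SDK version provides:

```
import { dataSource } from "@prismatic-io/spectral";

// Illustrative data source that renders a dropdown of users in the config wizard.
// In a real component you would fetch these records from the third-party API.
export const defaultUserForm = dataSource({
  display: {
    label: "Default User Form",
    description: "Choose a default user for created records",
  },
  dataSourceType: "jsonForm",
  inputs: {},
  perform: async (context) => {
    const users = [
      { id: "1", name: "Alice Example" },
      { id: "2", name: "Bob Example" },
    ];
    return {
      result: {
        schema: {
          type: "object",
          properties: {
            defaultUser: {
              type: "string",
              oneOf: users.map((u) => ({ const: u.id, title: u.name })),
            },
          },
        },
        uiSchema: {
          type: "VerticalLayout",
          elements: [{ type: "Control", scope: "#/properties/defaultUser" }],
        },
      },
    };
  },
});
```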
![](/docs/img/changelog/jsonforms.png) JSON Forms in the configuration wizard DECEMBER 14, 2022 ##### Internationalization (i18n) Support[​](#internationalization-i18n-support "Direct link to Internationalization (i18n) Support") Not all of your customers speak English. You can now offer translations for embedded marketplace, so your customers can enable integrations in their native language. See [Embedding Marketplace](https://prismatic.io/docs/embed/translations-and-internationalization.md) for information on how to offer i18n support to your customers. NOVEMBER 29, 2022 ##### New Components - October / November 2022[​](#new-components---october--november-2022 "Direct link to New Components - October / November 2022") Our component team has been busy! A full catalog is available [here](https://prismatic.io/docs/components.md). In October and November we added: * [GraphQL](https://prismatic.io/docs/components/graphql.md) - Make GraphQL requests (queries and mutations) to a GraphQL-based API * [Microsoft Bing Ads](https://prismatic.io/docs/components/ms-bing-ads.md) - Manage Microsoft Bing Ad Customer Services * [Microsoft Bot Framework](https://prismatic.io/docs/components/ms-bot-framework.md) - Manage conversational interactions across platforms using Microsoft Bot Framework * [Microsoft Outlook](https://prismatic.io/docs/components/ms-outlook.md) - Read and manage Microsoft Outlook calendars * [Odoo](https://prismatic.io/docs/components/odoo.md) - Manage records in an Odoo database * [Zoho](https://prismatic.io/docs/components/zoho.md) - Manage records, users, and more in your Zoho CRM and Books apps NOVEMBER 09, 2022 ##### User Level Configuration[​](#user-level-configuration "Direct link to User Level Configuration") We're proud to introduce a new instance configuration option - **User-Level Configuration** (ULC). **What is ULC**? ULC helps when multiple users of a single customer all need an integration. It allows you to configure a single instance of an integration for a customer, but collect configuration from multiple users and execute using user-specific configuration. **Why use ULC**? ULC is handy if your integration requires user-specific config variables and credentials. For example, suppose your app needs to write data to several users' private Dropbox folders. With ULC, you can collect connection information for several users within a customer, and integrate with each of their individual Dropbox accounts. **How does ULC work**? At a high level, a single instance of an integration is deployed to a customer, and is configured with some customer-wide config variables. Individual users within the customer, then, go through a ULC config wizard and supply user-specific credentials and config variables. When the instance runs, it pulls in user-specific configuration depending on some rules you set. Read more about ULC in our [docs](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md). NOVEMBER 03, 2022 ##### Persisting Data for Integrations[​](#persisting-data-for-integrations "Direct link to Persisting Data for Integrations") You can now persist data between instances of the same integration. This is handy if your customers need to share some state, or if you need to persist a customer mapping for [preprocess flows](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md). 
See the [Persist Data](https://prismatic.io/docs/components/persist-data.md) docs, or write your own component that [persists integration data](https://prismatic.io/docs/custom-connectors/actions.md#execution-instance-and-cross-flow-state). OCTOBER 04, 2022 ##### Source Code for Prism and Marketplace[​](#source-code-for-prism-and-marketplace "Direct link to Source Code for Prism and Marketplace") Source code for the `@prismatic-io/prism` CLI tool and the embedded marketplace library, `@prismatic-io/marketplace`, has been added to public repositories on GitHub. `prism` wraps the [Prismatic API](https://prismatic.io/docs/api.md) and provides users a way to perform CRUD (create, read, update, delete) operations on a variety of Prismatic resources (components, integrations, instances, customers, etc.) from the command line. Having it publicly available provides a great reference for developers looking to wrap the Prismatic API themselves. The two projects join the custom component SDK, `@prismatic-io/spectral`, which was already publicly available: * `@prismatic-io/prism` * `@prismatic-io/marketplace` * `@prismatic-io/spectral` * `@prismatic-io/examples` SEPTEMBER 28, 2022 ##### New Components - September 2022[​](#new-components---september-2022 "Direct link to New Components - September 2022") Our list of built-in components continues to grow. A full catalog is available [here](https://prismatic.io/docs/components.md). This past month, we added: * [Fluent Commerce](https://prismatic.io/docs/components/fluent-commerce.md) - Manage orders within Fluent Commerce * [Google Analytics](https://prismatic.io/docs/components/google-analytics.md) - Manage Google Analytics accounts * [Gmail](https://prismatic.io/docs/components/google-gmail.md) - Fetch, read and manage messages in Gmail * [UUID](https://prismatic.io/docs/components/uuid.md) - Generate UUIDs and GUIDs SEPTEMBER 19, 2022 ##### Re-imagined Instance Configuration Wizard[​](#re-imagined-instance-configuration-wizard "Direct link to Re-imagined Instance Configuration Wizard") We've overhauled the way deploy-time configuration management works, making it much more flexible and dynamic. It's a wizard now, so you can split complex configuration across pages in a way that's intuitive to your customers. Any existing configuration pages will simply become part of a one-page wizard, so everything will continue to work as-is. Read more about the configuration wizard designer on the [Config Wizard](https://prismatic.io/docs/integrations/config-wizard.md) docs page. ![](/docs/img/changelog/old-instance-config.png) Embedded Instance Configuration Experience (Before)
![](/docs/img/changelog/new-instance-config.png) Embedded Instance Configuration Experience (After) SEPTEMBER 19, 2022 ##### UI Redesign[​](#ui-redesign "Direct link to UI Redesign") You've probably noticed that the Prismatic UI has gotten a facelift! Based on everything we've learned over the last couple of years, we've improved the UI to feel better and be more intuitive. ![](/docs/img/changelog/old-ui.png) UI (Before)
![](/docs/img/changelog/new-ui.png) UI (After) JULY 29, 2022 ##### New Components - July 2022[​](#new-components---july-2022 "Direct link to New Components - July 2022") This month we added a new utility component for zipping and unzipping files: * [Zip](https://prismatic.io/docs/components/zip.md) - Provides utility methods for working with zip files JUNE 29, 2022 ##### Access Instance Metadata from an Action[​](#access-instance-metadata-from-an-action "Direct link to Access Instance Metadata from an Action") You can now access additional information about the currently running execution from your custom component, including: * The name and ID of the running instance * The name, ID and external ID of the customer the instance is deployed to * Webhook URLs for all flows of the running instance This is handy if you need to know information about the current run context, or if you're building actions that configure or delete webhooks in a third-party app. To access the new `context` properties, update your custom component's `@prismatic-io/spectral` dependency to version `6.6.0`. Read more about the expanded [context parameter](https://prismatic.io/docs/custom-connectors/actions.md#the-context-parameter). JUNE 28, 2022 ##### New Components - June 2022[​](#new-components---june-2022 "Direct link to New Components - June 2022") Our list of built-in components continues to grow! A full catalog is available [here](https://prismatic.io/docs/components.md). This past month, we added: * [GitHub](https://prismatic.io/docs/components/github.md) - Manage users, repositories, licenses, and more on GitHub * [Hash](https://prismatic.io/docs/components/hash.md) - Compute hashes of strings using common hash functions * [Pipedrive](https://prismatic.io/docs/components/pipedrive.md) - Manage leads, companies, activities, and more on the Pipedrive platform * [Rippling](https://prismatic.io/docs/components/rippling.md) - Rippling makes it easy to manage your company's Payroll, Benefits, HR, and IT - all in one, modern platform JUNE 06, 2022 ##### Additional Control Over Marketplace UI[​](#additional-control-over-marketplace-ui "Direct link to Additional Control Over Marketplace UI") You now have more control over the UI elements that appear to your customers in your embedded marketplace. If you would like to hide the **Back to Marketplace** link, or the **Test**, **Executions**, **Logs**, or **Monitors** tabs on an instance configuration screen, [you can](https://prismatic.io/docs/embed/marketplace.md#hiding-ui-elements-in-marketplace)! Bump your `@prismatic-io/marketplace` version to `3.1.0`, and add a `screenConfiguration` code block to your marketplace. MAY 24, 2022 ##### New Components - May 2022[​](#new-components---may-2022 "Direct link to New Components - May 2022") We have five new components this month (including a component for Prismatic itself - how meta!). * [IMAP](https://prismatic.io/docs/components/imap.md) - Fetch and manage email via IMAP * [Intercom](https://prismatic.io/docs/components/intercom.md) - Manage companies, contacts and tags on the Intercom platform * [Microsoft Sharepoint](https://prismatic.io/docs/components/ms-sharepoint.md) - Interact with sites, drives, and items within Microsoft Sharepoint * [Pretty Good Privacy (PGP)](https://prismatic.io/docs/components/pgp.md) - Create and translate encrypted messages * [Prismatic](https://prismatic.io/docs/components/prismatic.md) - Interact with the Prismatic API to manage customers, integrations, instances, etc.
A full catalog is available [here](https://prismatic.io/docs/components.md). MAY 17, 2022 ##### Improvements to Custom Component Development[​](#improvements-to-custom-component-development "Direct link to Improvements to Custom Component Development") We've made several improvements to the custom component development experience. To highlight a few: * You can now [`clean`](https://prismatic.io/docs/custom-connectors/actions.md#cleaning-inputs) your reusable inputs, which helps ensure type safety and catches problems with inputs before they reach the `perform` function. * You can now add a [global error handler](https://prismatic.io/docs/custom-connectors/error-handling.md#global-error-handlers) to your component, which helps you capture and display more informative errors if they're thrown. * The improved [testing harness](https://prismatic.io/docs/spectral/spectral-6-upgrade-guide.md#new---spectral-testing-harness) gives you more flexibility when unit testing your actions and triggers. * The `prism` CLI tool can now fetch existing integration connections (including OAuth 2.0 access tokens) and store them in environment variables, so you can [use them for unit testing](https://prismatic.io/docs/cli/prism.md#componentsdevrun). Update to the latest [`@prismatic-io/spectral`](https://www.npmjs.com/package/@prismatic-io/spectral) 6.x version to take advantage of these new features! MAY 04, 2022 ##### Cross-Flow State Storage[​](#cross-flow-state-storage "Direct link to Cross-Flow State Storage") You can now store and load data across flows of an instance. One flow can save state, and another flow can load that saved state. Check out our [Persist Data](https://prismatic.io/docs/components/persist-data.md) component docs for the new "Cross Flow" actions, and see our [docs](https://prismatic.io/docs/custom-connectors/actions.md#execution-instance-and-cross-flow-state) to build state storage into your custom components. MAY 04, 2022 ##### Step-Level Error Handling[​](#step-level-error-handling "Direct link to Step-Level Error Handling") Sometimes a step in an integration throws an error. This can be caused by a variety of external factors - temporary network connectivity issues, brief third-party API outages, etc. You can now configure how the integration runner handles errors on each step. You can choose to stop the instance execution (that's the current default behavior), you can ignore the error and continue the run, or you can choose to wait and retry the step at a later time. Read more in our [docs](https://prismatic.io/docs/integrations/low-code-integration-designer/error-handling.md). MAY 03, 2022 ##### Instance Remove Trigger[​](#instance-remove-trigger "Direct link to Instance Remove Trigger") A new management trigger - [Instance Remove](https://prismatic.io/docs/components/management-triggers.md#instance-remove) - has been added to the [Management Triggers](https://prismatic.io/docs/components/management-triggers.md) component. Flows that use the instance remove trigger are run when an instance is deleted. This new trigger is handy for cleaning up configuration created by the integration. For example, you can remove webhook configuration in third-party apps, or update your own API so your team knows that a customer removed an integration. MAY 02, 2022 ##### Set Config Variables from Marketplace[​](#set-config-variables-from-marketplace "Direct link to Set Config Variables from Marketplace") You can now set values for configuration variables from your app within your embedded marketplace.
This is helpful if you know some information about your customer (their API key, a special endpoint they use, data mapping configuration, etc), and would like to set a config variable value so they don't need to. Read more about [Dynamically Setting Config Variables in Marketplace](https://prismatic.io/docs/embed/marketplace.md#dynamically-setting-config-variables-in-marketplace) in our docs. APRIL 28, 2022 ##### New Components - April 2022[​](#new-components---april-2022 "Direct link to New Components - April 2022") We added some new components to [our catalog](https://prismatic.io/docs/components.md) in April. This past month, we added: * [Facebook Marketing](https://prismatic.io/docs/components/facebook-marketing.md) - Interact with ads and ad sets in your Facebook Marketing account * [Google Ads](https://prismatic.io/docs/components/google-ads.md) - Manage Google Ad campaigns * [WooCommerce](https://prismatic.io/docs/components/woo-commerce.md) - Easily manage your customers, orders, and products in your WooCommerce platform APRIL 28, 2022 ##### Sending Data Through URL Path[​](#sending-data-through-url-path "Direct link to Sending Data Through URL Path") Some popular SaaS applications append URL paths to the webhooks that they're configured to use. So, given a webhook endpoint `https://hooks.prismatic.io/trigger/EXAMPLE==` they might send data to `https://hooks.prismatic.io/trigger/EXAMPLE==/order/created`. You can now send data to webhook triggers four ways: * Request body * Request headers * URL parameters * URL path (added) Check out our [Sending data to webhook triggers](https://prismatic.io/docs/integrations/triggers/webhook/sending-data.md) article for more information. APRIL 26, 2022 ##### Improved Shared Endpoint Configuration[​](#improved-shared-endpoint-configuration "Direct link to Improved Shared Endpoint Configuration") Instance-specific endpoints (meaning all flows in an instance share one webhook URL) and shared endpoints (meaning all instances of an integration share one webhook URL) are now easier to configure, test and troubleshoot. Check out our [Endpoint Configuration](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md) article for details. APRIL 19, 2022 ##### Using the GET HTTP Verb to Invoke Instances[​](#using-the-get-http-verb-to-invoke-instances "Direct link to Using the GET HTTP Verb to Invoke Instances") Instance webhook triggers can now be invoked using the GET HTTP verb in addition to the POST verb. Some third-party apps (notably [Dropbox](https://prismatic.io/docs/components/dropbox.md) among others) verify that a webhook endpoint is ready to receive requests with a GET request. They then send webhook payloads with POST requests. This change was made to support the initial verification GET requests. MARCH 29, 2022 ##### New Components - March 2022[​](#new-components---march-2022 "Direct link to New Components - March 2022") Our list of built-in components continues to grow. A full catalog is available [here](https://prismatic.io/docs/components.md). 
This past month, we added: * [Collection Tools](https://prismatic.io/docs/components/collection-tools.md) - Perform common operations on collections * [Microsoft OneDrive](https://prismatic.io/docs/components/ms-onedrive.md) - Interact with files and drives inside Microsoft OneDrive MARCH 24, 2022 ##### Cloning Flows[​](#cloning-flows "Direct link to Cloning Flows") If you need to add a flow that is similar to another flow you've already built, it's helpful to be able to **clone** (make a copy of) a flow. You can now clone an existing flow from the flow menu in the integration designer. ![Clone integration flow in Prismatic app](/docs/img/changelog/clone-flow.png) For more information, see our [Building Integrations](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md#cloning-a-flow) article. MARCH 21, 2022 ##### Filter Marketplace Integrations[​](#filter-marketplace-integrations "Direct link to Filter Marketplace Integrations") You can now filter the integrations that you show in your embedded marketplace by [category](https://prismatic.io/docs/integrations/low-code-integration-designer.md#categorizing-integrations) or [label](https://prismatic.io/docs/integrations/low-code-integration-designer.md#assigning-labels-to-an-integration). This gives you the flexibility to show specific types of integrations to specific users or customers. To get started with filtering your embedded marketplace, update your [@prismatic-io/marketplace](https://www.npmjs.com/package/@prismatic-io/marketplace) package to version 1.1.2, and add a `filters` attribute to your `prismatic.showMarketplace()` invocation - see [docs](https://prismatic.io/docs/embed/marketplace.md#filtering-integrations) for details. MARCH 16, 2022 ##### Labels for Customers, Integrations and Instances[​](#labels-for-customers-integrations-and-instances "Direct link to Labels for Customers, Integrations and Instances") You can now assign labels to your [customers](https://prismatic.io/docs/customers/managing-customers.md#customer-labels), [integrations](https://prismatic.io/docs/integrations/low-code-integration-designer.md#assigning-labels-to-an-integration) and [instances](https://prismatic.io/docs/instances/deploying.md). This helps you keep your Prismatic account organized so you can find what you need quickly. ![Assign labels to customers, integrations and instances in Prismatic app](/docs/img/changelog/labels.png) FEBRUARY 28, 2022 ##### Improvements to Embedded Theming[​](#improvements-to-embedded-theming "Direct link to Improvements to Embedded Theming") You can now create custom themes for your embedded marketplace for both dark and light mode users of your application. Check out our [embedded marketplace docs](https://prismatic.io/docs/embed/theming.md) for information on how to theme your embedded marketplace to match your app's dark and light mode look-and-feel. FEBRUARY 23, 2022 ##### New Components - February 2022[​](#new-components---february-2022 "Direct link to New Components - February 2022") We have a couple new components this month! A full catalog is available [here](https://prismatic.io/docs/components.md). 
This past month, we added: * [Math](https://prismatic.io/docs/components/math.md) - Perform common math operations on numbers or lists of numbers * [Microsoft Dynamics 365](https://prismatic.io/docs/components/ms-dynamics.md) - Query, create, update or delete Microsoft Dynamics 365 API records JANUARY 21, 2022 ##### New Components - January 2022[​](#new-components---january-2022 "Direct link to New Components - January 2022") We have some new components to show off this month: * [QuickBooks Time](https://prismatic.io/docs/components/quickbooks-time.md) - Manage employee time tracking within Intuit QuickBooks Time * [Sage](https://prismatic.io/docs/components/sage.md) - Manage contacts and other records connected to your Sage account * [SOAP](https://prismatic.io/docs/components/soap.md) - Easily interact with SOAP-based APIs JANUARY 20, 2022 ##### Integrate Faster with SOAP APIs[​](#integrate-faster-with-soap-apis "Direct link to Integrate Faster with SOAP APIs") It's now easier to integrate with SOAP-based APIs. For quick one-off calls, you can use our built-in [SOAP component](https://prismatic.io/docs/components/soap.md) to fetch WSDL definitions and make requests to an API's SOAP methods. For more complex SOAP APIs, you can leverage our custom component SDK to wrap SOAP methods into a series of component actions. JANUARY 05, 2022 ##### Simpler, More Flexible Authentication[​](#simpler-more-flexible-authentication "Direct link to Simpler, More Flexible Authentication") We've revamped the way that components connect to third-party apps and services. The new concept is called **connections**, and they make authentication *simpler*, more *flexible*, and *easier to support*. We'll be updating our built-in components to use connections in the coming weeks, and credentials will eventually be phased out in favor of connections (but don't worry - credentials won't be sunset immediately!). For a full run-down of how connections improve integration development, support, and customer self-deployment, check out our [blog announcement](https://prismatic.io/docs/blog/simpler-more-flexible-authentication). Here's a quick summary: * Component developers have *more flexibility* when declaring what information their components need to connect to a third party. They can define [custom connections](https://prismatic.io/docs/custom-connectors/connections.md) that include any number of fields, like username, password, API key, tenant ID, endpoint URL or other fields that are unique to the service they're integrating with (see the sketch after this list). * The [OAuth 2.0 flow](https://prismatic.io/docs/integrations/connections/oauth2.md) got much *simpler* and cleaner for both integration builders and customers who deploy the integration - customers just see a single button to click when they need to authenticate with OAuth. * Authorization got *simpler* in general - connections live within the integration designer or a deployed instance. You don't need to create credentials from the organization or customer settings pages, nor juggle credential types. Components know what connections they're compatible with, and can only be paired with those connection config variables. * Connections are *easier to support*. You can now configure alert monitors to [notify you](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md#alerting-on-connection-errors) when connections (OAuth or otherwise) expire or fail to authenticate in an integration.
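To make that first bullet concrete, here is a minimal sketch of a custom connection defined with `@prismatic-io/spectral`. The connection key, labels, and field names are illustrative for a hypothetical "Acme" service, and the exact input options may vary by SDK version:

```
import { connection } from "@prismatic-io/spectral";

// Illustrative connection with fields unique to a hypothetical Acme service
export const acmeApiKey = connection({
  key: "acmeApiKey",
  label: "Acme API Key",
  comments: "Authenticate requests to the Acme API",
  inputs: {
    tenantId: {
      label: "Tenant ID",
      type: "string",
      required: true,
      comments: "Your Acme tenant identifier",
    },
    endpointUrl: {
      label: "Endpoint URL",
      type: "string",
      required: true,
      default: "https://api.acme.example.com",
    },
    apiKey: {
      label: "API Key",
      type: "password",
      required: true,
    },
  },
});
```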
DECEMBER 22, 2021 ##### New Components - December 2021[​](#new-components---december-2021 "Direct link to New Components - December 2021") We created new components for three popular SaaS apps this month: * [BambooHR](https://prismatic.io/docs/components/bamboohr.md) - Keep track of employees' HR needs * [Xero](https://prismatic.io/docs/components/xero.md) - Create and manage invoices, items, accounts, payments and more objects within a Xero account * [Zoom](https://prismatic.io/docs/components/zoom.md) - Manage Zoom users, meetings and webinars A full catalog of all of our components is available [here](https://prismatic.io/docs/components.md). DECEMBER 14, 2021 ##### Looping and Pagination[​](#looping-and-pagination "Direct link to Looping and Pagination") The [loop component](https://prismatic.io/docs/components/loop.md) has been improved to facilitate easily looping over a paginated API. Many third-party APIs limit the number of records you can fetch at once, and let you load a batch (page) of records at a time. You can now more easily loop over paged records that you fetch from an external API, and you can break out of a loop whenever you've paged over all available records. Check out our [quickstart](https://prismatic.io/docs/integrations/common-patterns/loop-over-paginated-api.md) for a tutorial on how to loop over pages of records in an integration. NOVEMBER 24, 2021 ##### New Components - November 2021[​](#new-components---november-2021 "Direct link to New Components - November 2021") We have a bunch of new built-in components this month. A full catalog is available [here](https://prismatic.io/docs/components.md). This past month, we added: * [Asana](https://prismatic.io/docs/components/asana.md) - Manage users, projects, and teams in your Asana workspace * [Monday](https://prismatic.io/docs/components/monday.md) - Manage boards, items, and columns inside your Monday account * [Microsoft Project](https://prismatic.io/docs/components/ms-project.md) - Make queries to reporting data from a Project Web App instance * [New Relic](https://prismatic.io/docs/components/new-relic.md) - Easily manage metrics, logs, and events * [Tableau](https://prismatic.io/docs/components/tableau.md) - Manage projects and workbooks in your Tableau site * [Zendesk](https://prismatic.io/docs/components/zendesk.md) - Manage Tickets and users in Zendesk NOVEMBER 16, 2021 ##### Stream Logs to External Logging Services[​](#stream-logs-to-external-logging-services "Direct link to Stream Logs to External Logging Services") Customers on enterprise plans can now stream logs and metrics to external logging services (like DataDog or New Relic). This is useful, since you likely already use a logging service to collect logs from your various applications. Now, your integration logs can live alongside the rest of your applications' logs. Read more on our [logging article](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-externally.md). NOVEMBER 10, 2021 ##### New Step Input: Expressions[​](#new-step-input-expressions "Direct link to New Step Input: Expressions") You can now reference multiple config variables, step results, and static strings for step input using templated [inputs](https://prismatic.io/docs/integrations/low-code-integration-designer/passing-data-between-steps.md#template-inputs). This lets you concatenate config variables, text, and step results together, without needing an additional step to do the concatenation. 
It's helpful for dynamically generating URLs, queries, messages, and more. ![Expressions for step inputs in Prismatic app](/docs/img/changelog/expression-input.png) **Update:** As of 2022-05-25, "Expression inputs" have been renamed "Template inputs". OCTOBER 20, 2021 ##### New Components - October 2021[​](#new-components---october-2021 "Direct link to New Components - October 2021") Our list of built-in components continues to grow. A full catalog is available [here](https://prismatic.io/docs/components.md). This past month, we added: * [AWS Glue](https://prismatic.io/docs/components/aws-glue.md) - Perform data transformation through AWS Glue * [AWS Lambda](https://prismatic.io/docs/components/aws-lambda.md) - Manage and invoke AWS Lambdas * [CSV](https://prismatic.io/docs/components/csv.md) - Build and parse CSV files to and from JavaScript arrays * [Firebase](https://prismatic.io/docs/components/firebase.md) - Create, read, update, and delete documents in a Firebase Cloud Firestore database collection * [Google Calendar](https://prismatic.io/docs/components/google-calendar.md) - Manage calendars and events in Google Calendar * [Hubspot](https://prismatic.io/docs/components/hubspot.md) - Manage objects and associations in the Hubspot CRM platform * [Jira](https://prismatic.io/docs/components/atlassian-jira.md) - Manage Jira issues, comments, projects and users * [Mailchimp](https://prismatic.io/docs/components/mailchimp.md) - Interact with email campaign lists and e-commerce resources * [Microsoft Excel](https://prismatic.io/docs/components/ms-excel.md) - Parse and build xlsx files (spreadsheets) * [Microsoft Teams](https://prismatic.io/docs/components/ms-teams.md) - Manage the teams, groups, channels, and messages associated with your Microsoft Teams account * [Redis](https://prismatic.io/docs/components/redis.md) - Manage items in a Redis database OCTOBER 18, 2021 ##### Write Your Own Triggers[​](#write-your-own-triggers "Direct link to Write Your Own Triggers") The vast majority of integrations are triggered in one of two ways: they either run on a schedule (e.g., "At 15 minutes past each hour") or they're invoked by an HTTP request to a [webhook](https://prismatic.io/docs/integrations/triggers/webhook.md). Not all apps and services that you integrate with are the same, though, and some require additional functionality or validation. For example, Salesforce [outbound messages](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_om_outboundmessaging_understanding.htm) (webhooks) require a special XML-formatted acknowledgement (ACK) response to a webhook request, and Amazon's Simple Notification Service (SNS) requires that integrations send an HTTP POST request to AWS to [confirm an SNS subscription](https://docs.aws.amazon.com/sns/latest/dg/SendMessageToHttp.prepare.html). With those considerations in mind, we've extended our [custom component SDK](https://www.npmjs.com/package/@prismatic-io/spectral) to allow you to write your own triggers for your components. Your triggers can handle things like: * Replying to webhook requests with custom responses * Validating webhook headers and payload data * Transforming and processing XML, CSV, or proprietary data formats so the rest of your integration can easily reference data that comes in If you've [written your own actions](https://prismatic.io/docs/custom-connectors/actions.md), writing a trigger will feel very familiar.
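As a rough illustration, here is a hedged sketch of a trigger that acknowledges webhook requests with a custom XML response - the kind of behavior Salesforce outbound messages expect. It assumes the `trigger` helper from `@prismatic-io/spectral`, and the response shape shown here is simplified; consult the triggers docs for the exact types in your SDK version:

```
import { trigger } from "@prismatic-io/spectral";

// Sketch: reply to incoming webhook requests with an XML acknowledgement,
// then pass the payload along to the rest of the flow.
export const ackTrigger = trigger({
  display: {
    label: "Webhook with XML ACK",
    description: "Receive a webhook request and reply with an XML acknowledgement",
  },
  inputs: {},
  synchronousResponseSupport: "valid",
  scheduleSupport: "invalid",
  perform: async (context, payload, params) => {
    return {
      payload,
      response: {
        statusCode: 200,
        contentType: "application/xml",
        body: "<ack>received</ack>",
      },
    };
  },
});
```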
Check out our docs on [writing triggers](https://prismatic.io/docs/custom-connectors/triggers.md) to get started. ##### Configurable Webhook Triggers[​](#configurable-webhook-triggers "Direct link to Configurable Webhook Triggers") The general webhook trigger is now more configurable. You can now specify the HTTP code, headers, response type, and response body that the webhook trigger returns to a webhook caller. This helps you handle APIs that require custom responses, and allows you to redirect webhook callers as needed. ![Configure webhook trigger in Prismatic app](/docs/img/changelog/webhook-trigger-responses.png) SEPTEMBER 16, 2021 ##### Improved Embedded Marketplace Experience[​](#improved-embedded-marketplace-experience "Direct link to Improved Embedded Marketplace Experience") We've significantly enhanced Prismatic's embedded marketplace experience, enabling you to provide your customers with a seamless, native integration experience with minimal engineering effort. You can now embed Prismatic's integration marketplace into your application with just a few lines of code. You can choose to display the sleek marketplace UX directly within your application or as a popover, and apply [custom theming](https://prismatic.io/docs/embed/theming.md) to make your integration marketplace look native to your application. The embedded marketplace showcases your integration offerings and allows customers to self-activate and configure the integrations they need. As before, you can specify which integrations appear in your marketplace, which ones can be self-activated, and define each integration's configuration screen. Your customers do not need to juggle another set of credentials to access your embedded marketplace. Instead, you can sign JSON web tokens (JWTs) for your users, which can be used to automatically authenticate them for the marketplace. Check out our [docs](https://prismatic.io/docs/embed/marketplace.md) to get started. SEPTEMBER 14, 2021 ##### Configurable Webhook Endpoints[​](#configurable-webhook-endpoints "Direct link to Configurable Webhook Endpoints") You now have more control over how webhook endpoints are configured for deployed instances. You already had the option to create a webhook endpoint for each flow of each deployed instance (**Instance and Flow-Specific**). You can now select two other configuration options: * **Instance-Specific**: Create a single webhook endpoint for each instance. Identify which of the instance's flows should run based on data in the webhook request. * **Shared**: Create a single webhook that is shared by all customers who have a particular integration. Route the webhook request to a flow in a specific customer's instance based on data in the webhook request. Both of these additional configuration options allow you to route webhook requests to a particular customer and flow based on the data that comes in to the webhook. If the data that comes in needs additional processing, or if you need to look up a flow's name or customer's ID, you can assign one of your integration's flows to be a **Preprocess Flow** - a flow that's run when a webhook is invoked and aids in making sure the request gets to the right place. For more information, check out our [webhook docs](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md).
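To sketch what a preprocess flow might do, here is an illustrative code step that pulls routing information out of a shared webhook's request body. The trigger step name (`integrationTrigger`) and the payload fields (`accountId`, `eventType`) are hypothetical - adapt them to your own webhook's shape, then point your endpoint configuration at the returned values:

```
// Illustrative preprocess flow step for a shared webhook endpoint.
// Assumes the flow's trigger step is named "integrationTrigger".
module.exports = async ({ logger }, stepResults) => {
  const body = stepResults.integrationTrigger.results.body.data;
  const payload = typeof body === "string" ? JSON.parse(body) : body;

  logger.info(`Routing webhook for account ${payload.accountId}`);

  return {
    data: {
      externalCustomerId: payload.accountId, // maps to a customer's external ID
      flowName: payload.eventType, // maps to the flow that should run
    },
  };
};
```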
SEPTEMBER 01, 2021 ##### Granular Permissions for Third Parties[​](#granular-permissions-for-third-parties "Direct link to Granular Permissions for Third Parties") You can now invite third-party vendors to log in to Prismatic with limited access to your integrations, custom components, and customers. This is helpful if you need to collaborate on a new integration with a third-party vendor. You can grant them **view** or **edit** access to a particular integration or set of custom components, which allows them to test the integration against their app or service. You can debug and iterate faster on integration and custom component development, and can have one central place to view logs and test runs. You can also view logs of each test a third-party vendor performs to give you a sense of how their side of the integration development is progressing. Permissions are granular - third-party users only see what they've been given permissions to see. So, if you're integrating with two competing companies, or even with one of your competitors, they are not given insight into the other integrations, custom components, or customers you have in your Prismatic account. Read more about the third-party user role in our [docs](https://prismatic.io/docs/configure-prismatic/organization-users.md#third-party-users). AUGUST 25, 2021 ##### Integration Categories and Icons[​](#integration-categories-and-icons "Direct link to Integration Categories and Icons") You can now assign a category and icon to each of your integrations. This helps your team manage and filter integrations and improves the way you present them to customers. ![Assign integration to category in Prismatic app](/docs/img/changelog/assign-category.png) ![Filter integrations by category in Prismatic app](/docs/img/changelog/filter-by-category.png) Check out our docs on [categorizing integrations](https://prismatic.io/docs/integrations/low-code-integration-designer.md#categorizing-integrations) and [assigning an icon](https://prismatic.io/docs/integrations/low-code-integration-designer.md#assigning-an-icon-to-an-integration) for more info. AUGUST 24, 2021 ##### Multi-Flow Integrations[​](#multi-flow-integrations "Direct link to Multi-Flow Integrations") Prismatic now provides full support for multi-flow integrations! Integrations can now include multiple **flows**. (A flow is a trigger and a series of steps.) This enables you to provide your customers with a complex third-party integration that performs multiple related tasks, but is packaged and deployed as a single integration. For example, you might integrate with an ERP that sends a variety of data via webhooks to your application (a webhook when inventory is updated, a webhook when customer info is updated, and so on). Rather than constructing integrations with complex branches or assembling multiple integrations, you can now create a single integration with flows that handle each type of webhook payload. You would create a flow to handle inventory updates, another flow to handle customer updates, and deploy all of those flows together as a single instance to a customer. Each flow has its own trigger (so it gets its own webhook URL), and flows are tested and run independently of one another. When customers or customer-facing teams deploy a multi-flow integration, they configure and deploy all of the flows at once using a single configuration screen. Your existing integrations will continue to operate as expected. Please note: 1. 
The **Selected Test Run** dropdown has been moved to the input reference selector. When configuring an input for a step, you can select which test run to preview outputs for within the **Reference** tab. ![Select test run in Prismatic app](/docs/img/changelog/selected-test-run.png) 2. Prismatic's CLI, Prism, has been updated to version 3.0.0 to account for this change. To install the latest Prism, run: ``` npm install --global @prismatic-io/prism ``` 3. The YAML that defines integrations has been updated. Check out our [building integrations](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md) page for more information. ##### Deploy-Time Triggers[​](#deploy-time-triggers "Direct link to Deploy-Time Triggers") You can now configure integration flows with triggers that are invoked when an instance is deployed to a customer. This is helpful if you have a set of "initialization" tasks that need to be completed *once* to set up a customer's instance. A deploy-time flow could enable features in a third-party app, set up third-party users or permissions, create a directory structure in a file storage system, or even set up webhooks in a third-party application to point to the instance's other flows. Check out our [deploy trigger](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger) docs for more info. AUGUST 16, 2021 ##### New Components - August 2021[​](#new-components---august-2021 "Direct link to New Components - August 2021") We've continued to expand our built-in component offering. A full catalog is available [here](https://prismatic.io/docs/components.md). This month, we added: * [Airtable](https://prismatic.io/docs/components/airtable.md) - List, create, delete, and update records in an Airtable Base * [Amazon SES](https://prismatic.io/docs/components/aws-ses.md) - Send email through Amazon's Simple Email Service (SES) * [Google Drive](https://prismatic.io/docs/components/google-drive.md) - Manage files that are stored in a Google Drive account * [Google Sheets](https://prismatic.io/docs/components/google-sheets.md) - Create, read and modify spreadsheets in a Google Drive account * [Mongo DB](https://prismatic.io/docs/components/mongo.md) - Create, read, update and delete documents inside a NoSQL MongoDB collection * [Microsoft Power BI](https://prismatic.io/docs/components/ms-power-bi.md) - Interact with datasets and data schemas within Microsoft's data visualization and business analytics service * [MySQL](https://prismatic.io/docs/components/mysql.md) - Query and manage data in a MySQL database * [Shopify](https://prismatic.io/docs/components/shopify.md) - Interact with Shopify's Access Service API * [Stripe](https://prismatic.io/docs/components/stripe.md) - Interact with Stripe's payment platform The Shopify and Stripe components were both generated from OpenAPI definitions using Prism's [component generator tool](https://prismatic.io/docs/custom-connectors/initializing.md#custom-connectors-from-wsdls-or-openapi-specs). AUGUST 05, 2021 ##### Per-Action Authorization in Components[​](#per-action-authorization-in-components "Direct link to Per-Action Authorization in Components") You can now configure authorization settings per *action* (as opposed to per *component*). This is helpful if you are building a component with multiple actions and only some of your actions require authorization. 
Read about how to upgrade your component to use per-action authorization on our [Spectral 3.x Upgrade Guide](https://prismatic.io/docs/spectral/spectral-3-upgrade-guide.md). JULY 14, 2021 ##### New Components - July 2021[​](#new-components---july-2021 "Direct link to New Components - July 2021") Several new components have been added to our [catalog](https://prismatic.io/docs/components.md) of built-in components: * [Amazon DynamoDB](https://prismatic.io/docs/components/aws-dynamodb.md) - Create, update, fetch, or delete items in an Amazon DynamoDB database * [Amazon SNS](https://prismatic.io/docs/components/aws-sns.md) - Manage subscriptions, topics, and messages within Amazon SNS * [Amazon SQS](https://prismatic.io/docs/components/aws-sqs.md) - Send, receive and manage messages within an Amazon SQS queue * [AMQP](https://prismatic.io/docs/components/amqp.md) - Send and receive messages on an AMQP-based message broker * [Apache Kafka](https://prismatic.io/docs/components/kafka.md) - Publish messages to an Apache Kafka event stream * [Customer.io](https://prismatic.io/docs/components/customer-io.md) - Manage customers on the Customer.io platform * [Microsoft SQL Server](https://prismatic.io/docs/components/ms-sql-server.md) - Query and manage data in a Microsoft SQL Server database * [MQTT](https://prismatic.io/docs/components/mqtt.md) - Send and receive messages on an MQTT-based queue * [PostgreSQL](https://prismatic.io/docs/components/postgres.md) - Query and manage data in a PostgreSQL database JULY 08, 2021 ##### Spectral 2.x Released[​](#spectral-2x-released "Direct link to Spectral 2.x Released") Prismatic's custom component TypeScript library, `@prismatic-io/spectral`, has been expanded and updated to improve the developer experience for [building custom components](https://prismatic.io/docs/custom-connectors.md). Updated syntax for creating components, actions, and inputs helps to catch common errors at compile time (rather than runtime), and new utility functions help to guarantee that you pass the correct variable types to third-party SDKs and APIs. For info on upgrading an existing 1.x custom component to 2.x, see our [Upgrade Guide](https://prismatic.io/docs/spectral/spectral-2-upgrade-guide.md). You can dive into the Spectral code on [GitHub](https://github.com/prismatic-io/spectral). JUNE 16, 2021 ##### Enhanced Versioning for Components, Integrations, and Instances[​](#enhanced-versioning-for-components-integrations-and-instances "Direct link to Enhanced Versioning for Components, Integrations, and Instances") Versioning has been improved for components, integrations, and instances to give you more fine-grained control over exactly what code is deployed to customers. **Components** are now assigned an integer version that increments each time the component is published. If a custom component is at "version 3" and you publish a new component definition, that new definition gets "version 4". This allows you to update or extend components without unintentionally impacting existing integrations that use them, ensuring your integrations remain stable. Integration builders can then update the component versions used in their integrations, or roll back to a previous version when desired, and will be notified when newer versions of components are available.
Read more about [Versioning of Components](https://prismatic.io/docs/custom-connectors/publishing.md#component-versioning) and [Choosing Components Versions in Integrations](https://prismatic.io/docs/integrations/low-code-integration-designer/steps.md#choosing-component-versions). **Integration** versioning has been improved, giving you more control over what versions of integrations you deploy to customers. When you publish new changes to an integration, similar to components, your integration is assigned a new version number. Then, when you deploy an instance to a customer, you can choose which version of the integration to use. That means you can have some customers on version 1, and others on version 2 as needed, giving you control over which customers have what, and allowing you to test a new integration version with a small subset of your customer base before deploying it broadly. Rolling back an instance deployment is a breeze - if you deploy a new version of an integration to a customer and something seems off, you can easily roll back your instance to a known working version of the integration with just a couple of clicks. As always, updating customers' instances can be scripted, so you don't need to manually deploy a new version of an integration to each customer. Read more about [Publishing an Integration](https://prismatic.io/docs/integrations/low-code-integration-designer.md#publishing-an-integration). MAY 17, 2021 ##### Generate Custom Components From API Specs[​](#generate-custom-components-from-api-specs "Direct link to Generate Custom Components From API Specs") APIs often have hundreds of unique endpoints that you can interact with. With the release of [Prism](https://prismatic.io/docs/cli.md) version 1.0.8, you can now generate a custom component from a WSDL or OpenAPI file. That means you can have a custom component for a third-party service with hundreds of actions with a single CLI command. Read more on our [Writing Custom Components](https://prismatic.io/docs/custom-connectors/initializing.md#custom-connectors-from-wsdls-or-openapi-specs) article. APRIL 27, 2021 ##### Customer Self-Service[​](#customer-self-service "Direct link to Customer Self-Service") It's now easier for your customers to manage instances of integrations that have been deployed to them. Customer users with **admin** permissions can update config variables and credentials that are associated with their instances. So, if their config or credentials for a third-party service change, they can log in and make the change without needing your help. For more information on customer user roles and permissions, see the [users article](https://prismatic.io/docs/customers/customer-users.md). ##### Custom Theming[​](#custom-theming "Direct link to Custom Theming") Organizations with an enterprise plan can now create a custom theme for the Prismatic web application. This takes Prismatic's white-label capabilities to the next level by allowing you to customize the color scheme and other UI elements to match your branding. Once you apply a custom theme, it will be displayed for both your team members and customers. For more information, see our [custom theming docs](https://prismatic.io/docs/embed/theming.md). 
MARCH 30, 2021 ##### Configure Instances to Run on a Per-Customer Schedule[​](#configure-instances-to-run-on-a-per-customer-schedule "Direct link to Configure Instances to Run on a Per-Customer Schedule") It's now much easier to configure instances of your integrations to run on a unique schedule for each of your customers. For example, Customer A could be set up to run the integration each day at 4:00PM, while Customer B could be set up to run the integration hourly, depending on their needs. ![Configure instances to run on customer schedule via Prismatic app](/docs/img/changelog/schedule-config-variable.png) For more information, check out our [integrations article](https://prismatic.io/docs/integrations/triggers/schedule.md). MARCH 25, 2021 ##### Intuitive Instance Deployment[​](#intuitive-instance-deployment "Direct link to Intuitive Instance Deployment") Significant improvements have been made to credentials, integration configuration and instance deployment. Integration builders now have the ability to create an easy-to-use configuration page for customer-facing teams. Builders can define config variable names, give hints as to what sort of data is expected, add headers, etc., giving their customer-facing teams an intuitive experience when it comes to deploying an integration. ![Integration configuration and config variables via Prismatic app](/docs/img/changelog/intuitive-instance-deployment.webp) This ultimately makes for easier and faster deployment of integrations, without the need for developer intervention. Read more about setting up config variables on our [integrations](https://prismatic.io/docs/integrations/config-wizard/config-variables.md) article, and about the new instance configuration experience on the [instances](https://prismatic.io/docs/instances/deploying.md) article. MARCH 12, 2021 ##### Persisting Instance State[​](#persisting-instance-state "Direct link to Persisting Instance State") Small amounts of data (state) can now be stored between instance executions. This is handy if you want to save some information about one instance execution to use later in a subsequent execution. Prismatic handles several common state persistence scenarios for you through the new [Persist Data](https://prismatic.io/docs/components/persist-data.md) and [Process Data](https://prismatic.io/docs/components/process-data.md) components. Check out the [Integrations](https://prismatic.io/docs/integrations/persist-data.md) article to learn how to leverage state persistence in your integrations, or read the [Writing Custom Components](https://prismatic.io/docs/custom-connectors/actions.md#execution-instance-and-cross-flow-state) article to incorporate state persistence into your custom components. MARCH 04, 2021 ##### Terraform Provider[​](#terraform-provider "Direct link to Terraform Provider") You can now publish Prismatic integrations and custom components using the Prismatic Terraform Provider. This helps you incorporate Prismatic into your existing CI/CD pipeline, and push changes to integrations and custom components automatically when pull requests are approved. JANUARY 21, 2021 ##### Retry and Replay[​](#retry-and-replay "Direct link to Retry and Replay") Organizations with a professional or enterprise plan can now configure instances to [automatically retry](https://prismatic.io/docs/monitor-instances/retry-and-replay/automatic-retry.md) if an execution fails. 
You can control how many times an instance attempts to run with the same input, and how long it should wait between failed attempts. If you have an integration that relies on a flaky third-party API, for example, this minimizes interruptions for both your customers and your team. You can also [replay](https://prismatic.io/docs/monitor-instances/retry-and-replay/replaying-failed-executions.md) - manually retry - a specific failed execution of an instance. ##### Invoking Instances Synchronously[​](#invoking-instances-synchronously "Direct link to Invoking Instances Synchronously") You can now choose to invoke your instances [synchronously or asynchronously](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md). When you invoke an instance synchronously, your request stays open until the instance completes, and results from the instance's run are returned as an HTTP response. --- ### Setting up a Repo for Prismatic When building integrations and custom connectors (also called *custom components*) with Prismatic, it's important to organize your repository in a way that promotes code reuse, maintainability, and efficient CI/CD workflows. This guide will walk you through setting up a repository structure that supports both [code-native integrations](https://prismatic.io/docs/integrations/code-native.md) and [custom connectors](https://prismatic.io/docs/custom-connectors.md). Click [here](https://vimeo.com/1129318603) to watch a webinar on this topic. #### Example repository[​](#example-repository "Direct link to Example repository") For a complete working example, check out our [example project structure repository](https://github.com/prismatic-io/example-project-structure) on GitHub. This repository demonstrates best practices for organizing your Prismatic projects and includes CI/CD automation using GitHub Actions. #### Recommended project structure[​](#recommended-project-structure "Direct link to Recommended project structure") A well-organized Prismatic repository typically includes the following directories:

```
my-prismatic-project/
β”œβ”€β”€ .github/
β”‚   └── workflows/          # CI/CD pipelines for automation
β”‚       β”œβ”€β”€ components.yml
β”‚       └── integrations.yml
β”œβ”€β”€ components              # Custom components for low-code integrations
β”‚   β”œβ”€β”€ acme
β”‚   └── todoist
β”œβ”€β”€ integrations            # Code-native integrations
β”‚   β”œβ”€β”€ slack
β”‚   └── todoist
β”œβ”€β”€ shared-libs             # Shared libraries for CNI + Components
β”‚   β”œβ”€β”€ acme
β”‚   └── todoist
└── package.json
```

##### Components directory[​](#components-directory "Direct link to Components directory") The `components/` directory contains your custom connectors - reusable building blocks that can be used across multiple integrations. Each component should be in its own subdirectory with its own `package.json`, source code, and tests.

```
components/
β”œβ”€β”€ acme-crm/
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ index.ts
β”‚   β”‚   β”œβ”€β”€ actions.ts
β”‚   β”‚   └── connections.ts
β”‚   β”œβ”€β”€ package.json
β”‚   └── tsconfig.json
└── todoist/
    β”œβ”€β”€ src/
    β”‚   └── index.ts
    └── package.json
```

Components are published to your Prismatic tenant and can then be used in both low-code and code-native integrations. ##### Integrations directory[​](#integrations-directory "Direct link to Integrations directory") The `integrations/` directory contains your [code-native integrations](https://prismatic.io/docs/integrations/code-native.md) - complete integration solutions built entirely in TypeScript.
Like components, each integration should have its own subdirectory with its dependencies and configuration. Each integration should be initialized using the Prismatic CLI tool by running `prism integrations:init`.

```
integrations/
└── slack/
    β”œβ”€β”€ src/
    β”‚   β”œβ”€β”€ index.ts
    β”‚   β”œβ”€β”€ flows.ts
    β”‚   └── configPages.ts
    β”œβ”€β”€ package.json
    └── tsconfig.json
```

##### Shared libraries directory[​](#shared-libraries-directory "Direct link to Shared libraries directory") The `shared-libs/` directory contains reusable TypeScript packages that can be shared across both components and integrations. This promotes code reuse and keeps your codebase DRY (Don't Repeat Yourself). Using shared libraries offers several advantages: 1. **Faster iterations**: Updates to shared code immediately benefit all dependent projects 2. **Local code visibility**: All code remains in your repository for easier navigation and debugging 3. **Code reusability**: Common logic (API clients, utilities, types) is centralized in one location

```
shared-libs/
β”œβ”€β”€ acme-client/
β”‚   β”œβ”€β”€ src/
β”‚   β”‚   β”œβ”€β”€ index.ts
β”‚   β”‚   └── types.ts
β”‚   β”œβ”€β”€ package.json
β”‚   └── tsconfig.json
└── common-utils/
    β”œβ”€β”€ src/
    β”‚   └── index.ts
    └── package.json
```

Shared libraries can be referenced in your components and integrations as local dependencies in their `package.json` files: components/acme-crm/package.json

```
{
  "dependencies": {
    "@prismatic-io/spectral": "^9.0.0",
    "acme-client": "file:../../shared-libs/acme-client"
  }
}
```

Why use shared libraries? When building both custom components and code-native integrations that interact with the same external APIs, you have two options for sharing code: 1. Abstract common logic into shared libraries 2. Publish a custom component and install the component's [manifest](https://prismatic.io/docs/integrations/code-native/existing-components.md#adding-component-manifests-to-your-code-native-project) into your code-native project. Using shared libraries is often the better choice because it allows for faster iterations and easier debugging. When you update a shared library, all components and integrations that depend on it immediately benefit from the changes without needing to republish components. #### Publishing components and integrations[​](#publishing-components-and-integrations "Direct link to Publishing components and integrations") ##### Publishing from the command line[​](#publishing-from-the-command-line "Direct link to Publishing from the command line") You can manually publish components and integrations using the Prism CLI:

```
# Publish a component
cd components/my-component
npm run build
prism components:publish

# Publish a code-native integration
cd integrations/my-integration
npm run build
export INTEGRATION_ID=$(prism integrations:import)
prism integrations:publish ${INTEGRATION_ID}
```

##### Publishing in a CI/CD pipeline[​](#publishing-in-a-cicd-pipeline "Direct link to Publishing in a CI/CD pipeline") For automated publishing, you can integrate the Prism CLI into your CI/CD pipeline. The CLI supports authentication via refresh tokens, making it easy to automate deployments. If you're using **GitHub Actions**, Prismatic provides pre-built actions that make publishing even easier.
See our [GitHub Actions guide](https://prismatic.io/docs/api/github-actions.md) for detailed instructions on: * Setting up authentication with GitHub secrets * Publishing components automatically when code changes * Publishing integrations automatically when code changes * Ensuring components are published before integrations that depend on them * Linking component and integration versions to pull requests For other CI/CD systems (GitLab CI, Jenkins, CircleCI, Azure DevOps, etc.), you can use the Prism CLI directly. See [Publishing components in a CI/CD pipeline](https://prismatic.io/docs/custom-connectors/publishing.md#publishing-components-in-a-cicd-pipeline) for details. #### Managing multiple environments[​](#managing-multiple-environments "Direct link to Managing multiple environments") If you have multiple Prismatic tenants (for example, a development environment and production environments in different regions), you can manage them in your CI/CD pipeline by: 1. Creating separate refresh tokens for each environment 2. Storing them as secrets in your CI/CD system (e.g., `PRISM_REFRESH_TOKEN_DEV`, `PRISM_REFRESH_TOKEN_PROD`) 3. Storing the Prismatic URL for each environment as variables (e.g., `PRISMATIC_URL_DEV`, `PRISMATIC_URL_PROD`) 4. Creating separate workflow jobs or branches for each environment See the [Example Project Structure](https://github.com/prismatic-io/example-project-structure) repo for a complete example of publishing to multiple regions. The example repo leverages GitHub Actions' [Environments](https://docs.github.com/en/actions/deployment/targeting-different-environments/using-environments-for-deployment) feature to manage secrets for different Prismatic tenants. #### Best practices[​](#best-practices "Direct link to Best practices") ##### Use version control[​](#use-version-control "Direct link to Use version control") Always commit your component and integration source code to version control (Git). This allows you to track changes, collaborate with team members, and roll back if needed. ##### Organize by domain[​](#organize-by-domain "Direct link to Organize by domain") If you have lots of custom components, group related components and integrations together. For example, if you have multiple components related to your CRM system, consider placing them in a `components/crm/` subdirectory. ##### Document your code[​](#document-your-code "Direct link to Document your code") Add README files to your components and integrations explaining: * What the component or integration does * How to install dependencies * How to build and test locally * Any configuration required ##### Test in a dev environment before publishing[​](#test-in-a-dev-environment-before-publishing "Direct link to Test in a dev environment before publishing") When possible, test your components and integrations in a development Prismatic tenant before publishing to production. This helps catch issues early and ensures a smoother deployment process. ##### Leverage monorepo tools[​](#leverage-monorepo-tools "Direct link to Leverage monorepo tools") For larger projects with many components and integrations, consider using monorepo tools like: * [npm workspaces](https://docs.npmjs.com/cli/v7/using-npm/workspaces) * [yarn workspaces](https://yarnpkg.com/features/workspaces) * [pnpm workspaces](https://pnpm.io/workspaces) * [bun workspaces](https://bun.com/docs/install/workspaces) These tools make it easier to manage dependencies, run scripts across multiple packages, and optimize build times. 
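If you adopt npm workspaces, the root `package.json` can declare each directory as a workspace so a single `npm install` at the repository root installs dependencies for every component, integration, and shared library. The following is a minimal sketch assuming the directory layout shown earlier in this guide; adjust the workspace globs to match your own repository:

```
{
  "name": "my-prismatic-project",
  "private": true,
  "workspaces": [
    "components/*",
    "integrations/*",
    "shared-libs/*"
  ]
}
```

With workspaces enabled, npm links local packages together, so shared libraries can typically be referenced by package name rather than `file:` paths, and scripts can be run across every package with commands like `npm run build --workspaces`.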
#### Next steps[​](#next-steps "Direct link to Next steps") Now that you have your repository set up, you're ready to start building: * [Build your first code-native integration](https://prismatic.io/docs/integrations/code-native/get-started/first-integration.md) * [Write a custom component](https://prismatic.io/docs/custom-connectors.md) * [Set up GitHub Actions for automated publishing](https://prismatic.io/docs/api/github-actions.md) * [Explore the Prism CLI](https://prismatic.io/docs/cli.md) --- ### Prism CLI Overview The Prismatic CLI tool enables programmatic interaction with the Prismatic API, allowing you to build, deploy, and support integrations from the command line. The CLI tool is built on the [Prismatic API](https://prismatic.io/docs/api.md), so any action that can be completed through the web application or API can also be completed through the CLI tool. #### Installing the CLI tool[​](#installing-the-cli-tool "Direct link to Installing the CLI tool") [![Prism NPM version](https://badge.fury.io/js/@prismatic-io%2Fprism.svg)](https://www.npmjs.com/package/@prismatic-io/prism) Prismatic's CLI tool, `prism`, is available on npm and can be installed using `npm` or `yarn`:

```
npm install -g @prismatic-io/prism
# OR
yarn global add @prismatic-io/prism
```

Prism's source code is available on [GitHub](https://github.com/prismatic-io/prism) and serves as an excellent example of how to wrap the [Prismatic API](https://prismatic.io/docs/api.md). #### Authenticating with the CLI tool[​](#authenticating-with-the-cli-tool "Direct link to Authenticating with the CLI tool") Once `prism` has been installed, log in by running:

```
prism login
```

This will open a web browser for you to authenticate with your Prismatic credentials. Once you authenticate, your CLI tool will store an authentication token for subsequent `prism` commands. To verify that you are logged in, run `prism me` to view information about your user.

```
prism me
Name: Alex Cooper
Email: alexander.cooper@progix.io
Organization: Progix Software
Endpoint URL: https://app.prismatic.io
```

To view the authentication token that your CLI tool uses, run `prism me:token`.

```
prism me:token
eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCIsImtpZCI6Ik5lVV9aYzFNdFRrSE93bXB1T2ZlUCJ9.eyJodHRwczovL3ByaXNtYXRpYy5pby9lbWFpbCI6InRlc3QudXNlckBlbWFpbC5jb20iLCJodHRwczovL3ByaXNtYXRpYy5pby9sYXN0X2xvZ2luIjoiMzAyMS0wMS0wMVQwMDowMDowMC4wMDFaIiwiaXNzIjoiaHR0cHM6Ly9wcmlzbWF0aWMtaW8udXMuYXV0aDAuY29tLyIsImF1ZCI6WyJodHRwczovL3ByaXNtYXRpYy5pby9hcGkiLCJodHRwczovL3ByaXNtYXRpYy1pby51cy5hdXRoMC5jb20vdXNlcmluZm8iXSwiaWF0IjoxNjEyMjA4NjQyLCJleHAiOjE2MTIyOTUwNDIsInNjb3BlIjoib3BlbmlkIHByb2ZpbGUgZW1haWwgb2ZmbGluZV9hY2Nlc3MifQ.iKQWx95vUWTxF62O3-mZFqHPgfapH7TQjsy-BunqWWDJrhk88byJpJQYy__hJE779qAahkEtZD914zgpZ8UnjGW0i_PUcCf5nZsDJBR-jfTEARCLmeVYge3Hy40BAFzj3eCcCouDFqxMNaD3oeXSjfizO9Cy_P-XKEkDdIOJ-rk
```

To clear your token from memory and log out, run `prism logout`. Logging in to other regions By default, your data is stored in the US commercial region and `prism` authenticates against that region. If your plan includes additional regions or private cloud, you will need to configure `prism` to point to that region. See [Integrations in Multiple Regions](https://prismatic.io/docs/configure-prismatic/deployment-regions.md#logging-in-to-additional-regions-with-prism). #### Autocomplete in Prism[​](#autocomplete-in-prism "Direct link to Autocomplete in Prism") Enable autocomplete in `prism` by running `prism autocomplete` and then follow the displayed instructions.
Instructions differ depending on which shell you use (bash, zsh, etc.). #### Running CLI commands and getting help[​](#running-cli-commands-and-getting-help "Direct link to Running CLI commands and getting help") All Prismatic CLI commands generally follow the form `prism COMMAND`. For example, you can run `prism customers:list` to list all customers, or `prism integrations:create` to create an integration. A complete list of `prism` commands can be found on the [Prismatic CLI Command Reference](https://prismatic.io/docs/cli/prism.md) page. Running `prism --help` will also list top-level commands that you can execute. ``` $ prism --help Build, deploy, and support integrations in Prismatic from the comfort of your command line VERSION @prismatic-io/prism/7.6.4 darwin-arm64 node-v22.11.0 USAGE $ prism [COMMAND] TOPICS alerts Manage Alerting resources components Manage, create, and publish Components customers Manage Customers executions Fetch results of Instance executions or Integration test runs instances Manage Instances integrations Manage and import Integrations logs List Log Severities for use by Alert Triggers me Print your user profile information on-prem-resources Delete an On-Premise Resource organization Manage your Organization translations Generate Dynamic Phrases for Embedded Marketplace COMMANDS autocomplete Display autocomplete installation instructions. help Display help for prism. login Log in to your Prismatic account logout Log out of your Prismatic account me Print your user profile information ``` To view subcommands of top-level commands, run `prism COMMAND --help`. For example, to see customer management options, run: ``` $ prism customers --help Manage Customers USAGE $ prism customers:COMMAND TOPICS customers:users Manage Customer Users COMMANDS customers:create Create a new Customer customers:delete Delete a Customer customers:list List your Customers customers:update Update a Customer ``` For a list of all required arguments for a command, run `prism COMMAND:SUBCOMMAND --help`. For example, to view the required arguments for creating a customer, run: ``` Create a new Customer USAGE $ prism customers:create -n [--print-requests] [--quiet] [-d ] [-e ] [-l ] FLAGS -d, --description= longer description of the customer -e, --externalId= external ID of the customer from your system -l, --label=... a label to apply to the customer -n, --name= (required) short name of the new customer GLOBAL FLAGS --print-requests Print all GraphQL requests that are issued --quiet Reduce helpful notes and text DESCRIPTION Create a new Customer EXAMPLES Apply multiple labels to a customer $ prism customers:create --name "Widgets Inc" --externalId "abc-123" --label "Prod Customers" --label "Beta \ Testers" ``` --- ### Bash Scripting with Prism #### Using the Prismatic CLI in Bash scripts[​](#using-the-prismatic-cli-in-bash-scripts "Direct link to Using the Prismatic CLI in Bash scripts") Multiple `prism` commands can be combined to manage Prismatic resources. For example, to create an instance you need the integration ID and the customer ID for deployment. You can use `customers:list`, `integrations:list`, and `instances:create` commands together to create a new instance. 
``` # Get the Customer ID CUSTOMER_ID=$( prism customers:list \ --columns id \ --filter 'Name=^FTL Rockets$' \ --no-header) # Get the Integration ID INTEGRATION_ID=$( prism integrations:list \ --columns id \ --filter 'name=^Acme$' \ --no-header) # Get the integration's latest version ID VERSION_ID=$( prism integrations:versions ${INTEGRATION_ID} \ --columns id \ --latest-available \ --no-header) # Create the instance prism instances:create \ --customer ${CUSTOMER_ID} \ --integration ${VERSION_ID} \ --name 'Acme ERP' \ --description 'Sync data with Acme ERP' ``` #### Headless prism usage for CI/CD pipelines[​](#headless-prism-usage-for-cicd-pipelines "Direct link to Headless prism usage for CI/CD pipelines") To use `prism` on a headless (no GUI) server for CI/CD or scripting purposes, you must log in on a system with a web browser and then transfer your "refresh token" to the headless system. Refresh tokens do *not* expire and are used to generate short-lived access tokens for Prismatic's API. After logging into `prism` with `prism login`, retrieve your refresh token with `prism me:token --type refresh`. Note your token and the API endpoint you are currently using (view this with `prism me`). Now, on your headless CI/CD system, set an environment variable `PRISM_REFRESH_TOKEN` with the retrieved value: ``` export PRISM_REFRESH_TOKEN=my-refresh-token ``` If you're working with [regions](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md) other than the default US commercial region, you can also specify an endpoint: ``` export PRISM_REFRESH_TOKEN=my-refresh-token export PRISMATIC_URL=https://app.eu-west-1.prismatic.io ``` If your app uses a white-labeled domain (like integrations.my-company.com), you can use that endpoint instead for `PRISMATIC_URL`. **Note**: For PowerShell on Windows, you can set an environment variable using this syntax: ``` $ENV:PRISMATIC_URL="https://app.eu-west-1.prismatic.io" ``` Use GitHub Actions If you use GitHub, consider using Prismatic's integration and component [GitHub actions](https://prismatic.io/docs/api/github-actions.md). --- ### Custom Connector and Code-Native Integration Testing with Prism You can test both [code-native integrations](https://prismatic.io/docs/integrations/code-native.md) and [custom connectors](https://prismatic.io/docs/custom-connectors.md) using the Prism CLI tool. To test a code-native integration's flow from the command line, use `prism integrations:flows:test`. See [Testing Code-Native Integrations](https://prismatic.io/docs/integrations/code-native/testing.md#testing-a-code-native-integration-from-the-cli) for more information. To test a custom connector, use `prism components:dev:test`. See [Unit Testing Custom Connectors](https://prismatic.io/docs/custom-connectors/unit-testing.md#testing-components-from-the-cli) for more information. --- ### Listing Resources with Prism All types of Prismatic resources (customers, components, integrations, instances, actions, etc.) have `:list` subcommands. By default, list commands display basic information about the resource, such as name and description, but additional information like resource ID can be displayed. You can optionally select exactly which attributes of resources you want to list, filter the results, and format the results as CSV, JSON, or YAML. 
``` prism components:list Label Public Description Version Category ─────────────────────── ────── ───────────────────────────────────────────────────────────────────────────────────────────── ─────── ────────────────────── Acme ERP false Interact with Acme ERP's inventory and customer systems 2 null Airtable true Manage items (records) in an Airtable Base 2 Data Platforms Amazon DynamoDB true Create, update, fetch, or delete items in an Amazon (AWS) DynamoDB database 5 Data Platforms Amazon S3 true Manage files within an Amazon (AWS) S3 bucket 34 Data Platforms Amazon SES true Send Emails through Amazon (AWS) SES 5 Application Connectors Amazon SNS true Manage subscriptions, topics, and messages within Amazon (AWS) SNS 7 Data Platforms Amazon SQS true Send, receive and manage messages within an Amazon (AWS) SQS queue 10 Data Platforms AMQP true Send and receive messages on an AMQP-based message broker 5 Data Platforms ``` ##### Listing resource IDs[​](#listing-resource-ids "Direct link to Listing resource IDs") All Prismatic resources have unique IDs. IDs are not displayed by default through `list` subcommands, but can optionally be displayed with the `--extended` flag. For example, to display IDs for components, run: ``` prism components:list --extended Id Key Label Public Description Version Category ──────────────────────────────────────────────────────────────── ─────────────────────── ─────────────────────── ────── ───────────────────────────────────────────────────────────────────────────────────────────── ─────── ────────────────────── Q29tcG9uZW50OjI3ZWM4ODlmLTI1ODUtNDFiMy05MDdlLWI2YWExNTg5ZGNhNA== acmeerp Acme ERP false Interact with Acme ERP's inventory and customer systems 2 null Q29tcG9uZW50OmVkMjcwNmExLThiMTEtNDI0YS05MjM0LTgzZjU4NDBmNzA3NQ== airtable Airtable true Manage items (records) in an Airtable Base 2 Data Platforms Q29tcG9uZW50Ojg3NzE3YThhLTFiODktNDY5My1hYmZlLWRjY2VkMjMxM2RlZg== aws-dynamodb Amazon DynamoDB true Create, update, fetch, or delete items in an Amazon (AWS) DynamoDB database 5 Data Platforms Q29tcG9uZW50OjE3NmRjYWU3LWEzMzktNDQ2NC1iYmJkLTU4ODllNzdmOWJjYQ== aws-s3 Amazon S3 true Manage files within an Amazon (AWS) S3 bucket 34 Data Platforms Q29tcG9uZW50Ojg3NjlhODE1LTY1OTEtNDliZC1hMGQ5LTNhMWNlYjUxZmZkYQ== aws-ses Amazon SES true Send Emails through Amazon (AWS) SES 5 Application Connectors Q29tcG9uZW50OjNkMzFkYjYxLWFlYzItNDRjZS05NGNkLTVhZWJjMjIxNjlhZg== aws-sns Amazon SNS true Manage subscriptions, topics, and messages within Amazon (AWS) SNS 7 Data Platforms Q29tcG9uZW50OmQ5ZmJkYzViLTFhMGUtNDRlMS1hNDcxLTNjMWE0NzFhYzAwNQ== aws-sqs Amazon SQS true Send, receive and manage messages within an Amazon (AWS) SQS queue 10 Data Platforms Q29tcG9uZW50OmFmYjNlMTNmLTg0NDctNGJmMC05MWIyLTAxNGQ1OTliYThkYg== amqp AMQP true Send and receive messages on an AMQP-based message broker 5 Data Platforms ``` ##### Configuring columns of a list to display[​](#configuring-columns-of-a-list-to-display "Direct link to Configuring columns of a list to display") You can optionally choose which resource attributes to display using the `--columns` flag. 
For example, to retrieve the Key, Label, and ID of all components, run: ``` prism components:list --columns "key,label,id" Key Label Id ──────────── ─────────────── ──────────────────────────────────────────────────────────────── acmeerp Acme ERP Q29tcG9uZW50OjI3ZWM4ODlmLTI1ODUtNDFiMy05MDdlLWI2YWExNTg5ZGNhNA== airtable Airtable Q29tcG9uZW50OmVkMjcwNmExLThiMTEtNDI0YS05MjM0LTgzZjU4NDBmNzA3NQ== aws-dynamodb Amazon DynamoDB Q29tcG9uZW50Ojg3NzE3YThhLTFiODktNDY5My1hYmZlLWRjY2VkMjMxM2RlZg== aws-s3 Amazon S3 Q29tcG9uZW50OjE3NmRjYWU3LWEzMzktNDQ2NC1iYmJkLTU4ODllNzdmOWJjYQ== aws-ses Amazon SES Q29tcG9uZW50Ojg3NjlhODE1LTY1OTEtNDliZC1hMGQ5LTNhMWNlYjUxZmZkYQ== aws-sns Amazon SNS Q29tcG9uZW50OjNkMzFkYjYxLWFlYzItNDRjZS05NGNkLTVhZWJjMjIxNjlhZg== aws-sqs Amazon SQS Q29tcG9uZW50OmQ5ZmJkYzViLTFhMGUtNDRlMS1hNDcxLTNjMWE0NzFhYzAwNQ== amqp AMQP Q29tcG9uZW50OmFmYjNlMTNmLTg0NDctNGJmMC05MWIyLTAxNGQ1OTliYThkYg== ``` ##### Filtering list output[​](#filtering-list-output "Direct link to Filtering list output") You can filter the output that a `:list` subcommand displays using the `--filter` flag. For example, to show only the component with the key "aws-s3", run: ``` prism components:list --filter 'key=^aws-s3$' Label Public Description Version Category ───────── ────── ───────────────────────────────────────────── ─────── ────────────── Amazon S3 true Manage files within an Amazon (AWS) S3 bucket 34 Data Platforms ``` Filter uses regex The `--filter` flag uses [regex](https://regexr.com/) pattern matching, hence the "start of string" `^` character and "end of string" `$` character. In a bash script, you can combine the `--filter` flag with the `--columns` and `--no-header` flags to retrieve the ID of a specific resource: ``` AWS_S3_COMPONENT_ID=$(prism components:list --filter 'key=aws-s3' --no-header --columns id) echo ${AWS_S3_COMPONENT_ID} Q29tcG9uZW50OjJlMDcyMGU4LTFjNTUtNDY1Ni04NzY0LTI1N2RmZDVhNTE3Mw== ``` ##### Formatting list output[​](#formatting-list-output "Direct link to Formatting list output") Lists can be optionally formatted as CSV, JSON, or YAML using the `--output` flag. This flag can be combined with the `--columns` and `--filter` flags as well. For example, to retrieve the ID and key of all components in CSV format, run: ``` prism components:list --output csv --columns "id,key" Id,Key Q29tcG9uZW50OjNiODQ1NGVkLTE5MjEtNGYxNS04MDhmLTBlZjkxNDEzNGRhZA==,airtable Q29tcG9uZW50OjE5YWYzMzQzLTU2OWQtNDY0Yy1iNTAwLWUzM2RhNjg3YmQxYQ==,aws-dynamodb Q29tcG9uZW50OjQ1ZGVkMzEyLTE2ZmUtNGY0Mi04OWVlLWZhOTIzNTQ0ZDEyYQ==,aws-s3 Q29tcG9uZW50OmEyNjRjMTVkLThjM2QtNGY0Yi1hNDNkLWEzYzMzZjgxZGY0MQ==,aws-ses Q29tcG9uZW50OjRmNTM5MWVkLWE3ZDEtNDljZi1hNjViLTE4ZGNmMTRmNGJlMA==,aws-sns Q29tcG9uZW50OmQ0YjhmOTllLWU3YTYtNDUxMS04YWIxLWNkOGQ1M2QyNDJiZg==,aws-sqs ``` --- ### Prismatic CLI Command Reference The Prismatic CLI tool allows you to interact with the Prismatic API programmatically so you can build, deploy, and support integrations from the command line. This page lists the subcommands of `prism` that you can invoke. For an introduction on using the Prismatic CLI tool, see the CLI [usage page](https://prismatic.io/docs/cli.md). 
*** #### Alerts CLI Commands[​](#alerts-cli-commands "Direct link to Alerts CLI Commands") ##### alerts:events:list[​](#alertseventslist "Direct link to alerts:events:list") List Alert Events for an Alert Monitor ``` prism alerts:events:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `alertMonitorId` | | ID of an alert monitor | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### alerts:groups:create[​](#alertsgroupscreate "Direct link to alerts:groups:create") Create an Alert Group ``` prism alerts:groups:create [--print-requests] [--quiet] --name [--users ] [--webhooks ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | -------------------------------------------------- | -------- | | `--name` | `-n` | name of the group to be created | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--users` | `-u` | JSON-formatted list of Prismatic user IDs to alert | false | | `--webhooks` | `-w` | JSON-formatted list of Alert Webhook IDs to alert | false | ``` # Create an group for "DevOps" prism alerts:groups:create \ --name DevOps \ --users "[\"$(prism organization:users:list \ --columns id \ --filter 'Name=John Doe' \ --no-header)\"]" ``` ##### alerts:groups:delete[​](#alertsgroupsdelete "Direct link to alerts:groups:delete") Delete an Alert Group ``` prism alerts:groups:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `group` | | ID of the group to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### alerts:groups:list[​](#alertsgroupslist "Direct link to alerts:groups:list") List Alert Groups in your Organization ``` prism alerts:groups:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | 
| `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ``` # Fetch the ID and Name of all alert groups in JSON format, sorted descending by name prism alerts:groups:list --columns "id,name" --output json --sort -name ``` ##### alerts:monitors:clear[​](#alertsmonitorsclear "Direct link to alerts:monitors:clear") Clear an Alert Monitor ``` prism alerts:monitors:clear [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `monitor` | | ID of the monitor to clear | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### alerts:monitors:create[​](#alertsmonitorscreate "Direct link to alerts:monitors:create") Create an Alert Monitor by attaching an Alert Trigger and a set of users and webhooks to an Instance ``` prism alerts:monitors:create [--print-requests] [--quiet] --name --instance --triggers [--duration ] [--log-severity ] [--groups ] [--users ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ---------------------------------------------------------------------------- | -------- | | `--duration` | `-d` | greatest time allowed (in seconds) for time-based triggers | false | | `--groups` | `-g` | JSON-formatted list of group IDs to alert | false | | `--instance` | `-i` | ID of the instance to monitor | true | | `--log-severity` | `-s` | greatest log level (debug, info, warn, error) allowed for log-based triggers | false | | `--name` | `-n` | name of the alert monitor to be created | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--triggers` | `-t` | JSON-formatted list of trigger IDs that should trigger this monitor | true | | `--users` | `-u` | JSON-formatted list of Prismatic user IDs to alert | false | Alert Monitors and Alert Groups While individual users and webhooks can be tied to alert monitors, it is recommended that you create [alert groups](#alertsgroupscreate) and attach alert groups to alert monitors. This helps in the case that you need to add a user to a set of monitors: it's simpler to edit a single alert group than to edit dozens of alert monitors. ``` # Create an alert monitor for an instance named "My Instance" # and alert the "DevOps" group in the event that an # instance execution takes longer than 10 seconds.
prism alerts:monitors:create \ --name "Alert Devops of slow execution" \ --instance $(prism instances:list \ --columns id \ --filter 'name=^My Instance$' \ --no-header) \ --triggers "[\"$(prism alerts:triggers:list \ --columns id \ --filter 'name=^Execution Duration Matched or Exceeded$' \ --no-header)\"]" \ --duration 10 \ --groups "[\"$(prism alerts:groups:list \ --columns id \ --filter 'name=^DevOps$' \ --no-header)\"]" ``` ##### alerts:monitors:delete[​](#alertsmonitorsdelete "Direct link to alerts:monitors:delete") Delete an Alert Monitor ``` prism alerts:monitors:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `monitor` | | ID of the monitor to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### alerts:monitors:list[​](#alertsmonitorslist "Direct link to alerts:monitors:list") List Alert Monitors for Customer Instances ``` prism alerts:monitors:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### alerts:triggers:list[​](#alertstriggerslist "Direct link to alerts:triggers:list") List Alert Triggers ``` prism alerts:triggers:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### alerts:webhooks:create[​](#alertswebhookscreate "Direct link to alerts:webhooks:create") Create an Alert Webhook ``` prism alerts:webhooks:create [--print-requests] [--quiet] --name --url [--headers ] --payloadTemplate ``` | Flag | Shorthand | Description | Required | | ------------------- | --------- | 
------------------------------------------------------------------------------------ | -------- | | `--headers` | `-h` | JSON-formatted object of key/value pairs to include in the request header | false | | `--name` | `-n` | name of the webhook to be created | true | | `--payloadTemplate` | `-p` | template string that will be used as the request body, see documentation for details | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--url` | `-u` | URL that will receive a POST request for an alert | true | ##### alerts:webhooks:delete[​](#alertswebhooksdelete "Direct link to alerts:webhooks:delete") Delete an Alert Webhook ``` prism alerts:webhooks:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `webhook` | | ID of the webhook to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### alerts:webhooks:list[​](#alertswebhookslist "Direct link to alerts:webhooks:list") List Alert Webhooks ``` prism alerts:webhooks:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | *** #### Components CLI Commands[​](#components-cli-commands "Direct link to Components CLI Commands") ##### components:actions:list[​](#componentsactionslist "Direct link to components:actions:list") List Actions that Components implement ``` prism components:actions:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--public] [--private] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | --------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | | `componentKey` | | The key of the component to show actions for (e.g. 
'salesforce') | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--private` | | Show actions for the private component with the given key. Use this flag when you have a private component with the same key as a public component. | false | | `--public` | | Show actions for the public component with the given key. Use this flag when you have a private component with the same key as a public component. | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ``` # Get the ID of the GET action of the HTTP component by action key prism components:actions:list --columns id --filter 'key=^httpGet$' --no-header http # Get actions related to the SFTP component prism components:actions:list sftp ``` ##### components:data-sources:list[​](#componentsdata-sourceslist "Direct link to components:data-sources:list") List Data Sources that Components implement ``` prism components:data-sources:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--public] [--private] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | | `componentKey` | | The key of the component to show data sources for (e.g. 'salesforce') | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--private` | | Show data sources for the private component with the given key. Use this flag when you have a private component with the same key as a public component. | false | | `--public` | | Show data sources for the public component with the given key. Use this flag when you have a private component with the same key as a public component. 
| false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ``` # Get data sources related to the Salesforce component prism components:data-sources:list salesforce ``` ##### components:delete[​](#componentsdelete "Direct link to components:delete") Delete a Component ``` prism components:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `component` | | ID of the component to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### components:dev:run[​](#componentsdevrun "Direct link to components:dev:run") Fetch an integration's active connection and execute a CLI command with that connection's fields as an environment variable. ``` prism components:dev:run -i -c -- /command/to/run ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------------------------- | -------- | | `--connectionKey` | `-c` | Key of the connection config variable to fetch meta/state for | true | | `--instanceId` | | Instance ID. | false | | `--integrationId` | `-i` | Integration ID | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | After specifying an integration ID and connection config variable name, this command executes a CLI command with that connection's fields saved as an environment variable named `PRISMATIC_CONNECTION_VALUE`. ###### components:dev:run Examples[​](#componentsdevrun-examples "Direct link to components:dev:run Examples") To simply print an integration's basic auth config variable named "My Connection" and pipe the resulting JSON to jq, run: ``` $ prism components:dev:run --integrationId SW50ZWexample --connectionKey "My Connection" -- printenv PRISMATIC_CONNECTION_VALUE | jq ``` If one of your integrations has an authenticated OAuth 2.0 config variable "Slack Connection", you could run your component's unit tests with that environment variable: ``` $ prism components:dev:run -i SW50ZWexample -c "Slack Connection" -- yarn run test ``` If you would like to fetch a connection from an instance deployed to one of your customers, specify the `--instanceId` flag instead: ``` $ prism components:dev:run --instanceId SW50ZWexample -c "Slack Connection" -- yarn run test ``` ##### components:dev:test[​](#componentsdevtest "Direct link to components:dev:test") Run an action of a component within a test integration in the integration runner ``` prism components:dev:test [--print-requests] [--quiet] [--envPath ] [--[no-]build] [--output-file ] [--print-results] [--clean-up] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------------------------------------- | -------- | | `--[no-]build` | `-b` | Build the component prior to testing | false | | `--clean-up` | | Clean up the integration and temporary component after running the action | false | | `--envPath` | `-e` | Path to dotenv file to load for supplying testing values | false | | `--output-file` | `-o` | Output the results of the action to a specified file | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--print-results` | | Print the results of the action to stdout | false | | `--quiet` |
| Reduce helpful notes and text | false | ##### components:init[​](#componentsinit "Direct link to components:init") Initialize a new Component ``` prism components:init [--wsdl-path ] [--open-api-path ] [--verbose] ``` | Flag | Shorthand | Description | Required | | ----------------- | --------- | --------------------------------------------------------------------------------------- | -------- | | `name` | | Name of the new component to create (alphanumeric characters, hyphens, and underscores) | true | | `--open-api-path` | | The path to an OpenAPI Specification file (JSON or YAML) used to generate a Component | false | | `--verbose` | | Output more verbose logging from Component generation | false | | `--wsdl-path` | | Path to the WSDL definition file used to generate a Component | false | ``` # Initialize a new component directory for a component named "send-customer-invoices" prism components:init send-customer-invoices ``` By providing a path to a local WSDL definition file, Prism is able to generate a component that translates the API methods available into Component Actions. First run the init command and provide a path to the WSDL file. ``` prism components:init --wsdl-path ./example.wsdl Example WSDL # Next navigate into the created directory and install/build the Component with NPM or Yarn yarn install && yarn build # Finally publish your WSDL Component prism components:publish ``` ##### components:init:component[​](#componentsinitcomponent "Direct link to components:init:component") Initialize a new Component ``` prism components:init:component [--name ] [--description ] ``` | Flag | Shorthand | Description | Required | | --------------- | --------- | ----------------------------- | -------- | | `--description` | `-d` | Description for the component | false | | `--name` | `-n` | Name of the component | false | ##### components:list[​](#componentslist "Direct link to components:list") List available Components ``` prism components:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--showAllVersions] [--search ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | ---------------------------------------------------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--search` | `-s` | Search components by label first, then by key (case insensitive) | false | | `--showAllVersions` | `-a` | If specified this command returns all versions of all components rather than only the latest version | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### components:publish[​](#componentspublish "Direct link to components:publish") Publish a Component to Prismatic ``` prism components:publish [--print-requests] [--quiet] [--comment ] [--[no-]confirm] [--[no-]check-signature] [--skip-on-signature-match] 
[--customer ] [--commitHash ] [--commitUrl ] [--repoUrl ] [--pullRequestUrl ] ``` | Flag | Shorthand | Description | Required | | --------------------------- | --------- | --------------------------------------------------------------------------- | -------- | | `--[no-]check-signature` | | Check signature of existing component and confirm publish if matched | false | | `--comment` | `-c` | Comment about changes in this Publish | false | | `--commitHash` | | Commit hash corresponding to the component version being published | false | | `--commitUrl` | | URL to the commit details for this component version | false | | `--[no-]confirm` | | Interactively confirm publish | false | | `--customer` | | ID of customer with which to associate the component | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--pullRequestUrl` | | URL to the pull request that modified this component version | false | | `--quiet` | | Reduce helpful notes and text | false | | `--repoUrl` | | URL to the repository containing the component definition | false | | `--skip-on-signature-match` | | Skips component publish if the new signature matches the existing signature | false | Building Components See [writing custom connectors](https://prismatic.io/docs/custom-connectors.md) for information on building components with `webpack`. ``` # Build and publish a component npx webpack prism components:publish ``` ##### components:signature[​](#componentssignature "Direct link to components:signature") Generate a Component signature ``` prism components:signature [--print-requests] [--quiet] [--skip-signature-verify] ``` | Flag | Shorthand | Description | Required | | ------------------------- | --------- | ----------------------------------------------------------------------------------------------------------------------------------- | -------- | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--skip-signature-verify` | | This consistently returns a signature, regardless of whether the corresponding component has been published to the platform or not. | false | ##### components:triggers:list[​](#componentstriggerslist "Direct link to components:triggers:list") List Triggers that Components implement ``` prism components:triggers:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--public] [--private] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | --------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | | `componentKey` | | The key of the component to show triggers for (e.g. 'salesforce') | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--private` | | Show actions for the private component with the given key. 
Use this flag when you have a private component with the same key as a public component. | false | | `--public` | | Show actions for the public component with the given key. Use this flag when you have a private component with the same key as a public component. | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ``` # Get the ID of the Webhook trigger of the Webhook Triggers component by key prism components:triggers:list --columns id --filter 'key=^webhook$' --no-header webhook-triggers # Get triggers related to the Management Triggers component prism components:triggers:list management-triggers ``` *** #### Customers CLI Commands[​](#customers-cli-commands "Direct link to Customers CLI Commands") ##### customers:create[​](#customerscreate "Direct link to customers:create") Create a new Customer ``` prism customers:create [--print-requests] [--quiet] --name [--description ] [--externalId ] [--label ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | -------------------------------------------- | -------- | | `--description` | `-d` | longer description of the customer | false | | `--externalId` | `-e` | external ID of the customer from your system | false | | `--label` | `-l` | a label to apply to the customer | false | | `--name` | `-n` | short name of the new customer | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ###### customers:create Examples[​](#customerscreate-examples "Direct link to customers:create Examples") Apply multiple labels to a customer ``` prism customers:create --name "Widgets Inc" --externalId "abc-123" --label "Prod Customers" --label "Beta Testers" ``` ##### customers:delete[​](#customersdelete "Direct link to customers:delete") Delete a Customer ``` prism customers:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `customer` | | ID of the customer to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### customers:list[​](#customerslist "Direct link to customers:list") List your Customers ``` prism customers:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### customers:update[​](#customersupdate "Direct link to customers:update") Update a Customer ``` prism customers:update 
[--print-requests] [--quiet] [--name ] [--description ] [--externalId ] [--label ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | -------------------------------------------- | -------- | | `customer` | | ID of a customer | true | | `--description` | `-d` | description of the customer | false | | `--externalId` | `-e` | external ID of the customer from your system | false | | `--label` | `-l` | a label to apply to the customer | false | | `--name` | `-n` | name of the customer | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ###### customers:update Examples[​](#customersupdate-examples "Direct link to customers:update Examples") Apply multiple labels to a customer (note: previously set labels will be overwritten) ``` prism customers:update Q3VzdG9tZXI6MmUzZDllOTUtMWIyMy00N2FjLTk3MjUtMzU1OTA2YzgyZWZj --label "Prod Customers" --label "Beta Testers" ``` ##### customers:users:create[​](#customersuserscreate "Direct link to customers:users:create") Create a User for the specified Customer ``` prism customers:users:create [--print-requests] [--quiet] --email --role --customer [--name ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ----------------------------------------------- | -------- | | `--customer` | `-c` | ID of the customer this user is associated with | true | | `--email` | `-e` | email address | true | | `--name` | `-n` | name of the new user | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--role` | `-r` | ID of the role to assign the user | true | ``` # Add a new 'Member' user for customer 'My First Customer' prism customers:users:create \ --email 'bar@email.com' \ --name 'Thomas Bar' \ --customer $(prism customers:list \ --columns id \ --no-header \ --filter 'name=^My First Customer$') \ --role $(prism customers:users:roles \ --columns id \ --no-header \ --filter 'name=^Member$') ``` ##### customers:users:delete[​](#customersusersdelete "Direct link to customers:users:delete") Delete a Customer User ``` prism customers:users:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `user` | | ID of the user to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### customers:users:list[​](#customersuserslist "Direct link to customers:users:list") List Customer Users ``` prism customers:users:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `customer` | | ID of the customer | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly 
format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### customers:users:roles[​](#customersusersroles "Direct link to customers:users:roles") List Roles you can grant to Customer Users ``` prism customers:users:roles [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | Customer User Roles Click [here](https://prismatic.io/docs/customers/customer-users.md#customer-user-roles) for descriptions of roles that can be assigned to customer users ##### customers:users:update[​](#customersusersupdate "Direct link to customers:users:update") Update a User ``` prism customers:users:update [--print-requests] [--quiet] [--name ] [--phone ] [--dark-mode ] [--dark-mode-os-sync ] ``` | Flag | Shorthand | Description | Required | | --------------------- | --------- | ---------------------------------------------- | -------- | | `user` | | ID of a user | true | | `--dark-mode` | `-d` | whether the user should have dark mode enabled | false | | `--dark-mode-os-sync` | `-o` | whether dark mode should sync with OS settings | false | | `--name` | `-n` | name of the user | false | | `--phone` | `-p` | phone number of the user | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | *** #### Executions CLI Commands[​](#executions-cli-commands "Direct link to Executions CLI Commands") ##### executions:step-result:get[​](#executionsstep-resultget "Direct link to executions:step-result:get") Gets the Result of a specified Step in an Instance Execution ``` prism executions:step-result:get [--print-requests] [--quiet] --executionId --stepName [--outputPath ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ---------------------------------------------------------------------------- | -------- | | `--executionId` | `-e` | ID of an Execution | true | | `--outputPath` | `-p` | Output result to a file. Output will be printed to stdout if this is omitted | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--stepName` | `-s` | Name of an Integration Step | true | This command can be used to pull down step results for both integration tests and instance executions. 
For example, you can run a test of a flow of an integration and then pull down specific step results like this: ``` $ prism integrations:flows:test ${FLOW_ID} Execution ID: SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6MWFkZTYwMGQtMjg2Ni00ZTljLWI2N2EtYmUxNzgwOWY4ODI4 $ prism executions:step-result:get \ --executionId SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6MWFkZTYwMGQtMjg2Ni00ZTljLWI2N2EtYmUxNzgwOWY4ODI4 \ --stepName "Fetch Invoice Info" {"invoiceId":"3EB14053-D836-4BDC-8C8F-5E52A2F01C29","customerId":"E2B76A06-5FB6-4CF8-B296-B4000485471E","total":124.61} ``` *** #### Instances CLI Commands[​](#instances-cli-commands "Direct link to Instances CLI Commands") ##### instances:config-vars:list[​](#instancesconfig-varslist "Direct link to instances:config-vars:list") List Config Variables used on an Instance ``` prism instances:config-vars:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `instance` | | ID of an instance | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### instances:create[​](#instancescreate "Direct link to instances:create") Create an Instance ``` prism instances:create [--print-requests] [--quiet] --name --integration --customer [--description ] [--config-vars ] [--label ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | --------------------------------------------------------------------------------- | -------- | | `--config-vars` | `-v` | config variables to bind to steps of your instance | false | | `--customer` | `-c` | ID of customer to deploy to | true | | `--description` | `-d` | longer description of the instance | false | | `--integration` | `-i` | ID of the integration or a specific integration version ID this is an instance of | true | | `--label` | `-l` | a label or set of labels to apply to the instance | false | | `--name` | `-n` | name of your new instance. | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ``` # Find the ID of the integration you want to deploy INTEGRATION_ID=$(prism integrations:list --columns id --no-header --filter 'name=Acme Inc') # Find the version ID of the latest available published version VERSION_ID=$(prism integrations:versions ${INTEGRATION_ID} --latest-available --columns id --no-header) # Connection config variables must be escaped CREDENTIALS='[{\"name\":\"username\",\"type\":\"value\",\"value\":\"my.username\"},{\"name\":\"password\",\"type\":\"value\",\"value\":\"Pa$$W0Rd\"}]' # Create an instance given a specific $VERSION_ID, # a customer $CUSTOMER_ID and required config variables # ("My Endpoint", "Do Thing?" 
and "Acme Basic Auth"): prism instances:create \ --name 'Acme Inc' \ --description 'Acme Inc instance for Smith Rocket Co' \ --integration ${VERSION_ID} \ --customer ${CUSTOMER_ID} \ --config-vars "[ { \"key\": \"My Endpoint\", \"value\": \"https://example.com/api\" }, { \"key\": \"Do Thing?\", \"value\": \"true\" }, { \"key\": \"Acme Basic Auth\", \"values\": \"${CREDENTIALS}\" } ]" \ --label 'Production' \ --label 'Paid' ``` ##### instances:delete[​](#instancesdelete "Direct link to instances:delete") Delete an Instance ``` prism instances:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `instance` | | ID of the instance to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### instances:deploy[​](#instancesdeploy "Direct link to instances:deploy") Deploy an Instance ``` prism instances:deploy [--print-requests] [--quiet] [--force] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | -------------------------------------------------------------------------------------- | -------- | | `instance` | | ID of an instance | true | | `--force` | `-f` | Force deployment even when there are certain conditions that would normally prevent it | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### instances:disable[​](#instancesdisable "Direct link to instances:disable") Disable an Instance ``` prism instances:disable [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `instance` | | ID of an instance | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### instances:enable[​](#instancesenable "Direct link to instances:enable") Enable an Instance ``` prism instances:enable [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `instance` | | ID of an instance | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### instances:flow-configs:list[​](#instancesflow-configslist "Direct link to instances:flow-configs:list") List Instance Flow Configs ``` prism instances:flow-configs:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `instance` | | ID of an Instance | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all 
GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### instances:flow-configs:test[​](#instancesflow-configstest "Direct link to instances:flow-configs:test") Test a Flow Config of an Instance ``` prism instances:flow-configs:test [--print-requests] [--quiet] [--extended] [--columns ] [--tail] [--payload ] [--contentType ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------------------------ | -------- | | `flowConfig` | | ID of a Flow Config to test | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--contentType` | `-c` | Optional content-type for the test payload | false | | `--extended` | `-x` | show extra columns | false | | `--payload` | `-p` | Optional JSON-formatted data payload to submit with the test | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--tail` | `-t` | Tail logs of the flow config test run | false | ##### instances:list[​](#instanceslist "Direct link to instances:list") List Instances ``` prism instances:list [--print-requests] [--quiet] [--customer ] [--integration ] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--customer` | `-c` | ID of a customer | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--integration` | `-i` | ID of an integration | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### instances:update[​](#instancesupdate "Direct link to instances:update") Update an Instance ``` prism instances:update [--print-requests] [--quiet] [--name ] [--description ] [--version ] [--deploy] [--label ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------------- | -------- | | `instance` | | ID of an instance | true | | `--deploy` | | Deploy the instance after updating | false | | `--description` | `-d` | Description for the instance | false | | `--label` | `-l` | a label or set of labels to apply to the instance | false | | `--name` | `-n` | Name of the instance | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--version` | `-v` | ID of integration version | false | *** #### Integrations CLI Commands[​](#integrations-cli-commands "Direct link to Integrations CLI Commands") ##### integrations:available[​](#integrationsavailable "Direct link to integrations:available") Mark an Integration version as available or unavailable ``` prism 
integrations:available [--print-requests] [--quiet] --[no-]available ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `integration` | | ID of an integration version | true | | `--[no-]available` | `-a` | Version is available or unavailable | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### integrations:convert[​](#integrationsconvert "Direct link to integrations:convert") Convert a Low-Code Integration's YAML file into a Code Native Integration ``` prism integrations:convert --yamlFile [--folder ] [--registryPrefix ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ----------------------------------------------------------------------------------------------- | -------- | | `--folder` | `-f` | Optional: Folder name to install the integration into (kebab-cased integration name by default) | false | | `--registryPrefix` | `-r` | Optional: Your custom NPM registry prefix | false | | `--yamlFile` | `-y` | Filepath to a Low-Code Integration's YAML | true | ##### integrations:create[​](#integrationscreate "Direct link to integrations:create") Create an Integration ``` prism integrations:create [--print-requests] [--quiet] --name --description [--customer ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------------------ | -------- | | `--customer` | `-c` | ID of customer with which to associate the integration | false | | `--description` | `-d` | longer description of the integration | true | | `--name` | `-n` | name of the integration to create | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### integrations:delete[​](#integrationsdelete "Direct link to integrations:delete") Delete an Integration ``` prism integrations:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `integration` | | ID of the integration to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### integrations:export[​](#integrationsexport "Direct link to integrations:export") Export an integration to YAML definition ``` prism integrations:export [--print-requests] [--quiet] [--latest-components] [--version ] ``` | Flag | Shorthand | Description | Required | | --------------------- | --------- | -------------------------------------------------------------- | -------- | | `integration` | | ID of an integration to export | true | | `--latest-components` | `-l` | Use the latest available version of each Component upon import | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--version` | `-v` | Define the definition version to export. 
| false | ##### integrations:flows:list[​](#integrationsflowslist "Direct link to integrations:flows:list") List Integration Flows ``` prism integrations:flows:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `integration` | | ID of an Integration | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### integrations:flows:test[​](#integrationsflowstest "Direct link to integrations:flows:test") Run a test execution of a flow ``` prism integrations:flows:test [--print-requests] [--quiet] [--flow-url ] [--integration-id ] [--payload ] [--payload-content-type ] [--sync] [--tail-results] [--tail-logs] [--cni-auto-end] [--timeout ] [--result-file ] [--jsonl] [--debug] [--apiKey ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | | `--apiKey` | | Optional API key for flows with secured endpoints. | false | | `--cni-auto-end` | | Automatically stop polling activity once an CNI flow execution completes. Some logs & results may not be returned this way. DOES NOT WORK FOR LOW-CODE FLOWS. | false | | `--debug` | | Enables debug mode on the test execution. | false | | `--flow-url` | `-u` | Invocation URL of the flow to run. | false | | `--integration-id` | `-i` | ID of the integration containing the flow to test. | false | | `--jsonl` | | Optionally format the step and tail results output into JSON Lines. | false | | `--payload` | `-p` | Optional file containing a payload to run the flow with. | false | | `--payload-content-type` | `-c` | Optional Content-Type for the test payload. | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--result-file` | `-r` | Optional file to append tailed execution result data to. Results are saved into JSON Lines. | false | | `--sync` | | Forces the flow to run synchronously. | false | | `--tail-logs` | | Tail logs from the test execution until user interrupt or timeout. | false | | `--tail-results` | | Tail step results from the test execution until user interrupt or timeout. | false | | `--timeout` | | Optionally set a timeout (in seconds) to stop tail activity. Compatible with both low-code and CNI flows. 
| false | ``` # Test an integration flow with a payload file and tail the logs and step results prism integrations:flows:test \ -p=some_payload_file.xml -c=application/xml --tail-logs --tail-results ``` ##### integrations:fork[​](#integrationsfork "Direct link to integrations:fork") Fork an Integration ``` prism integrations:fork [--print-requests] [--quiet] --name --description ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | -------------------------------------------- | -------- | | `parent` | | ID of the Integration to fork | true | | `--description` | `-d` | longer description of the forked integration | true | | `--name` | `-n` | name of the forked integration | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### integrations:import[​](#integrationsimport "Direct link to integrations:import") Import an Integration using a YAML definition file or a Code Native Integration ``` prism integrations:import [--print-requests] [--quiet] [--path ] [--integrationId ] [--icon-path ] [--open] [--replace] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ----------------------------------------------------------------------------------------------------------------------- | -------- | | `--icon-path` | | If supplied, the path to the PNG icon for the integration. Not applicable for Code Native Integrations. | false | | `--integrationId` | `-i` | The ID of the integration being imported | false | | `--open` | `-o` | If supplied, open the Designer for the imported integration | false | | `--path` | `-p` | If supplied, the path to the YAML definition of the integration to import. Not applicable for Code Native Integrations. | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--replace` | `-r` | If supplied, allows replacing an existing integration regardless of code-native status. Requires integrationId. 
| false | ##### integrations:init[​](#integrationsinit "Direct link to integrations:init") Initialize a new Code Native Integration ``` prism integrations:init [--clean] ``` | Flag | Shorthand | Description | Required | | --------- | --------- | ----------------------------------------------------------------------------------------- | -------- | | `name` | | Name of the new integration to create (alphanumeric characters, hyphens, and underscores) | true | | `--clean` | | Generate clean scaffold without example code | false | ``` # Initialize a new directory for a Code Native Integration named "acme-integration" prism integrations:init acme-integration # Next navigate into the created directory and install/build the Integration with NPM or Yarn yarn install && yarn build # Finally import your Code Native Integration prism integrations:import ``` ##### integrations:list[​](#integrationslist "Direct link to integrations:list") List Integrations ``` prism integrations:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--showAllVersions] [--customer ] [--org-only] [--search ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | ------------------------------------------------------------------------------------------------------ | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--customer` | `-c` | If specified this command returns only integrations that are available to the specified customer ID | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--org-only` | `-o` | If specified this command returns only org integrations | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--search` | `-s` | If specified, search for integrations by name (case insensitive). 
| false | | `--showAllVersions` | `-a` | If specified this command returns all versions of all integrations rather than only the latest version | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### integrations:marketplace[​](#integrationsmarketplace "Direct link to integrations:marketplace") Make a version of an Integration available in the Marketplace ``` prism integrations:marketplace [--print-requests] [--quiet] --[no-]available [--[no-]deployable] [--[no-]allow-multiple-instances] --overview ``` | Flag | Shorthand | Description | Required | | --------------------------------- | --------- | ----------------------------------------------------------------------------------------------------------- | -------- | | `integration` | | ID of an integration version to make marketplace available | true | | `--[no-]allow-multiple-instances` | `-m` | Allow a customer to deploy multiple instances of this integration | false | | `--[no-]available` | `-a` | Mark this Integration version available in the marketplace | true | | `--[no-]deployable` | `-d` | Mark this Integration version as deployable in the marketplace; does not apply if not also marked available | false | | `--overview` | `-o` | Overview to describe the purpose of the integration | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### integrations:open[​](#integrationsopen "Direct link to integrations:open") Open the Designer for the specified Integration ``` prism integrations:open ``` | Flag | Shorthand | Description | Required | | --------------- | --------- | ----------------------------- | -------- | | `integrationId` | | ID of the integration to open | true | ##### integrations:publish[​](#integrationspublish "Direct link to integrations:publish") Publish a version of an Integration for use in Instances ``` prism integrations:publish [--print-requests] [--quiet] [--comment ] [--commitHash ] [--commitUrl ] [--repoUrl ] [--pullRequestUrl ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | -------------------------------------------------------------------- | -------- | | `integration` | | ID of an integration to publish | true | | `--comment` | `-c` | comment about changes in this publication | false | | `--commitHash` | | Commit hash corresponding to the integration version being published | false | | `--commitUrl` | | URL to the commit details corresponding to this integration version | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--pullRequestUrl` | | URL to the pull request that modified this integration version | false | | `--quiet` | | Reduce helpful notes and text | false | | `--repoUrl` | | URL to the repository containing the definition for this integration | false | ##### integrations:set-debug[​](#integrationsset-debug "Direct link to integrations:set-debug") Set debug mode on or off for an integration's test instance. ``` prism integrations:set-debug [--print-requests] [--quiet] [--integration-id ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------------------------------------------------ | -------- | | `debug` | | Boolean value to set whether globalDebug should be enabled for the given integration | true | | `--integration-id` | `-i` | ID of the integration containing the flow to test. 
| false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### integrations:update[​](#integrationsupdate "Direct link to integrations:update") Update an Integration's name or description ``` prism integrations:update [--print-requests] [--quiet] [--name ] [--description ] [--customer ] [--test-config-vars ] ``` | Flag | Shorthand | Description | Required | | -------------------- | --------- | ------------------------------------------------------ | -------- | | `integration` | | ID of an integration | true | | `--customer` | `-c` | ID of customer with which to associate the integration | false | | `--description` | `-d` | new description to give the integration | false | | `--name` | `-n` | new name to give the integration | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--test-config-vars` | | JSON-formatted config variables to be used for testing | false | ##### integrations:versions[​](#integrationsversions "Direct link to integrations:versions") List Integration versions ``` prism integrations:versions [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--latest-available] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `integration` | | ID of an integration | true | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--latest-available` | `-l` | Show only the latest available version | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | *** #### Login CLI Commands[​](#login-cli-commands "Direct link to Login CLI Commands") ##### login[​](#login "Direct link to login") Log in to your Prismatic account ``` prism login [--force] [--url] ``` | Flag | Shorthand | Description | Required | | --------- | --------- | --------------------------------------------------------------- | -------- | | `--force` | `-f` | re-authenticate, even if you are already logged in | false | | `--url` | `-u` | returns a challenge url without automatically opening a browser | false | *** #### Logout CLI Commands[​](#logout-cli-commands "Direct link to Logout CLI Commands") ##### logout[​](#logout "Direct link to logout") Log out of your Prismatic account ``` prism logout [--browser] ``` | Flag | Shorthand | Description | Required | | ----------- | --------- | ------------------------------------------------------ | -------- | | `--browser` | `-b` | additionally log out of your default browser's session | false | *** #### Logs CLI Commands[​](#logs-cli-commands "Direct link to Logs CLI Commands") ##### logs:severities:list[​](#logsseveritieslist "Direct link to logs:severities:list") List Log Severities for use by Alert Triggers ``` 
prism logs:severities:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | *** #### Me CLI Commands[​](#me-cli-commands "Direct link to Me CLI Commands") ##### me[​](#me "Direct link to me") Print your user profile information ``` prism me ``` ##### me:token[​](#metoken "Direct link to me:token") Print your authorization tokens ``` prism me:token [--type {access,refresh}] ``` | Flag | Shorthand | Description | Required | | ----------------------- | --------- | ------------------------- | -------- | | `--type access,refresh` | `-t` | Which token type to print | false | ##### me:token:revoke[​](#metokenrevoke "Direct link to me:token:revoke") Revoke all refresh tokens for your user ``` prism me:token:revoke ``` *** #### On-prem-resources CLI Commands[​](#on-prem-resources-cli-commands "Direct link to On-prem-resources CLI Commands") ##### on-prem-resources:delete[​](#on-prem-resourcesdelete "Direct link to on-prem-resources:delete") Delete an On-Premise Resource ``` prism on-prem-resources:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `resource` | | ID of the On-Premise Resource to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### on-prem-resources:list[​](#on-prem-resourceslist "Direct link to on-prem-resources:list") List On-Premise Resources ``` prism on-prem-resources:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--customer ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | ----------------------------------------------------------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--customer` | `-c` | If specified this command returns only On-Premise Resources that are available to the specified customer ID | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued 
| false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### on-prem-resources:registration-jwt[​](#on-prem-resourcesregistration-jwt "Direct link to on-prem-resources:registration-jwt") Create a JWT that may be used to register an On-Premise Resource. ``` prism on-prem-resources:registration-jwt [--print-requests] [--quiet] [--customerId ] [--orgOnly] [--resourceId ] [--rotate] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | -------------------------------------------------------------------------------------------- | -------- | | `--customerId` | `-c` | The ID of the customer for which to create the JWT. Only valid for Organization users. | false | | `--orgOnly` | | Register a Resource available to Organization users only. Only valid for Organization users. | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--resourceId` | `-r` | An optional ID of an existing On-Premise Resource for which to generate a new JWT. | false | | `--rotate` | | Invalidate all JWTs for the On-Premise Resource and get a new JWT. | false | *** #### Organization CLI Commands[​](#organization-cli-commands "Direct link to Organization CLI Commands") ##### organization:connections:list[​](#organizationconnectionslist "Direct link to organization:connections:list") List all integration-agnostic connections available to the organization ``` prism organization:connections:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] [--managed-by {org,customer}] ``` | Flag | Shorthand | Description | Required | | --------------------------- | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--managed-by org,customer` | | Filter connections by management type | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### organization:signing-keys:delete[​](#organizationsigning-keysdelete "Direct link to organization:signing-keys:delete") Delete an embedded marketplace signing key ``` prism organization:signing-keys:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `signingKeyId` | | ID of the signing key to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### organization:signing-keys:generate[​](#organizationsigning-keysgenerate "Direct link to organization:signing-keys:generate") Generate an embedded marketplace signing key ``` prism organization:signing-keys:generate [--print-requests] [--quiet] ``` | Flag | Shorthand | 
Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | `organization:signing-keys:generate` will generate a new embedded marketplace signing key. The RSA public key is saved in Prismatic, and the private key is returned and immediately removed from Prismatic. Once the private key is returned, it cannot be retrieved again. ##### organization:signing-keys:import[​](#organizationsigning-keysimport "Direct link to organization:signing-keys:import") Import an RSA public key for use with embedded marketplace ``` prism organization:signing-keys:import [--print-requests] [--quiet] --public-key-file ``` | Flag | Shorthand | Description | Required | | ------------------- | --------- | ------------------------------------------ | -------- | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--public-key-file` | `-p` | public key file | true | | `--quiet` | | Reduce helpful notes and text | false | `organization:signing-keys:import` allows you to import your own RSA public key. You can use `openssl` to generate a new RSA key pair: First, run `openssl genrsa -out my-private-key.pem 4096` to generate an RSA private key. Next, run `openssl rsa -in my-private-key.pem -pubout > my-public-key.pub` to generate the associated RSA public key. It should look something like this: ``` -----BEGIN PUBLIC KEY----- EXAMPLE -----END PUBLIC KEY----- ``` Finally, import the *public* key: `prism organization:signing-keys:import -p my-public-key.pub` ##### organization:signing-keys:list[​](#organizationsigning-keyslist "Direct link to organization:signing-keys:list") List embedded signing keys for embedded marketplace ``` prism organization:signing-keys:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### organization:update[​](#organizationupdate "Direct link to organization:update") Update your Organization ``` prism organization:update [--print-requests] [--quiet] [--name ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `--name` | `-n` | name of the organization | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### organization:updateAvatarUrl[​](#organizationupdateavatarurl "Direct link to organization:updateAvatarUrl") Update your Organization Avatar URL ``` prism
organization:updateAvatarUrl [--print-requests] [--quiet] --organizationId [--avatarUrl ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `--avatarUrl` | `-n` | Url of the organization avatar | false | | `--organizationId` | | ID of an organization | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### organization:users:create[​](#organizationuserscreate "Direct link to organization:users:create") Create a User for your Organization ``` prism organization:users:create [--print-requests] [--quiet] [--name ] --email --role ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `--email` | `-e` | email address of the user | true | | `--name` | `-n` | name of the user | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--role` | `-r` | role the user should assume | true | Organization Roles See the [users](https://prismatic.io/docs/configure-prismatic/organization-users.md) page for information on roles that users can assume ``` # Create an organization user for Susan and grant her the 'Integrator' role prism organization:users:create \ --email 'foo@email.com' \ --name 'Susan Foo' \ --role $(prism organization:users:roles \ --columns id \ --no-header \ --filter 'name=^Integrator$') ``` ##### organization:users:delete[​](#organizationusersdelete "Direct link to organization:users:delete") Delete an Organization User ``` prism organization:users:delete [--print-requests] [--quiet] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------ | -------- | | `user` | | ID of the user to delete | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### organization:users:list[​](#organizationuserslist "Direct link to organization:users:list") List Users of your Organization ``` prism organization:users:list [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] [--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### organization:users:roles[​](#organizationusersroles "Direct link to organization:users:roles") List Roles you can grant to other users in your Organization ``` prism organization:users:roles [--print-requests] [--quiet] [--columns ] [--csv] [--extended] [--filter ] [--no-header] 
[--no-truncate] [--output {csv,json,yaml}] [--sort ] ``` | Flag | Shorthand | Description | Required | | ------------------------ | --------- | -------------------------------------------------------- | -------- | | `--columns` | | only show provided columns (comma-separated) | false | | `--csv` | | output is csv format \[alias: --output=csv] | false | | `--extended` | `-x` | show extra columns | false | | `--filter` | | filter property by partial string matching, ex: name=foo | false | | `--no-header` | | hide table header from output | false | | `--no-truncate` | | do not truncate output to fit screen | false | | `--output csv,json,yaml` | | output in a more machine friendly format | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--sort` | | property to sort by (prepend '-' for descending) | false | ##### organization:users:update[​](#organizationusersupdate "Direct link to organization:users:update") Update a User ``` prism organization:users:update [--print-requests] [--quiet] [--name ] [--phone ] [--dark-mode ] [--dark-mode-os-sync ] ``` | Flag | Shorthand | Description | Required | | --------------------- | --------- | ---------------------------------------------- | -------- | | `user` | | ID of a user | true | | `--dark-mode` | `-d` | whether the user should have dark mode enabled | false | | `--dark-mode-os-sync` | `-o` | whether dark mode should sync with OS settings | false | | `--name` | `-n` | name of the user | false | | `--phone` | `-p` | phone number of the user | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | *** #### Translations CLI Commands[​](#translations-cli-commands "Direct link to Translations CLI Commands") ##### translations:list[​](#translationslist "Direct link to translations:list") Generate Dynamic Phrases for Embedded Marketplace ``` prism translations:list [--print-requests] [--quiet] [--output-file ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ---------------------------------------------------- | -------- | | `--output-file` | `-o` | Output the results of the action to a specified file | false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | *** #### Workflows CLI Commands[​](#workflows-cli-commands "Direct link to Workflows CLI Commands") ##### workflows:export[​](#workflowsexport "Direct link to workflows:export") Export an embedded workflow or workflow template YAML definition ``` prism workflows:export [--print-requests] [--quiet] [--[no-]latest-components] ``` | Flag | Shorthand | Description | Required | | -------------------------- | --------- | --------------------------------------------------------------------------------- | -------- | | `workflow` | | ID of the workflow to export | true | | `--[no-]latest-components` | `-l` | Use the latest available version of each component upon import. Defaults to true. 
| false | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | ##### workflows:import[​](#workflowsimport "Direct link to workflows:import") Import an embedded workflow or workflow template YAML definition ``` prism workflows:import [--print-requests] [--quiet] --path [--workflow ] [--customer ] ``` | Flag | Shorthand | Description | Required | | ------------------ | --------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -------- | | `--customer` | `-c` | The ID of the customer to associate with the imported workflow. This will overwrite the existing workflow. If omitted, the workflow will be imported as a template. | false | | `--path` | `-p` | The path to the YAML definition of the workflow to import | true | | `--print-requests` | | Print all GraphQL requests that are issued | false | | `--quiet` | | Reduce helpful notes and text | false | | `--workflow` | `-w` | The ID of the workflow being imported. If omitted, a new workflow will be created. | false | --- ### Troubleshooting and FAQ Misconfiguration of [WSL](https://docs.microsoft.com/en-us/windows/wsl/) or [Node.js](https://nodejs.org/) can result in unexpected behaviors for a Node-based package like `prism`. Here are some common problems with their respective solutions: #### Error: spawn cmd.exe ENOENT[​](#error-spawn-cmdexe-enoent "Direct link to Error: spawn cmd.exe ENOENT") When you run `prism login` from within Windows Subsystem for Linux (WSL), you might encounter `Error: spawn cmd.exe ENOENT`. This is an issue with the WSL distribution's `PATH` environment variable. Ensure that your `PATH` environment variable contains `System32`, which is where `cmd.exe` is located. Typically, you can run something like this: ``` export PATH=$PATH:/mnt/c/Windows/System32 ``` #### Error: spawn xdg-open ENOENT[​](#error-spawn-xdg-open-enoent "Direct link to Error: spawn xdg-open ENOENT") If you run `prism login` in a headless Linux environment (a Linux environment without a desktop or web browser, such as an Ubuntu server or Docker container), `prism` will be unable to open a GUI web browser to authenticate you. You might encounter an error that reads `Error: spawn xdg-open ENOENT` or `Error: Exited with code 3`. You will need to authenticate on a computer with a desktop environment and web browser, then set the refresh token you receive as an environment variable on the headless server. See [Headless prism Usage for CI/CD Pipelines](https://prismatic.io/docs/cli/bash-scripting.md#headless-prism-usage-for-cicd-pipelines). #### prism: command not found[​](#prism-command-not-found "Direct link to prism: command not found") If you have followed the [installation instructions](https://prismatic.io/docs/cli.md#installing-the-cli-tool) to install `prism`, but then encounter `prism: command not found` on Linux or MacOS, or `'prism' is not recognized as an internal or external command, operable program or batch file` on Windows when you run `prism`, you likely don't have your NodeJS `PATH` configured correctly. Ensure that your `PATH` environment variable contains the `bin/` directory of your NodeJS installation. --- ### Account Overview To clarify terminology, Prismatic's customers (i.e. *you*) are referred to as **organizations**, while your customers are referred to as **customers** throughout the documentation. 
#### Creating your organization[​](#creating-your-organization "Direct link to Creating your organization") To create an organization within Prismatic, first [sign up](https://prismatic.io/docs/free-trial). For large enterprises with multiple distinct divisions, consider creating a separate organization for each division. note If your company has already created an organization, request that your organization's administrator create a user account for you instead. Registering with Prismatic again will create a new, separate organization. #### Editing your organization[​](#editing-your-organization "Direct link to Editing your organization") * Web App * CLI * API To edit your organization's settings, click the **Settings** link on the left-hand sidebar. From this screen, you can manage team members, alerting, and subscription settings. You can also view your organization's utilization of Prismatic resources. To change your organization's logo, navigate to the **Theme** tab. Logos are cropped and resized to 512 x 512 pixels and must be image files. Transparent, square PNG images typically yield the best results. ![Change logo for org in Prismatic app](/docs/img/configure-prismatic/change-org-name.png) To rename your organization, use the `prism organization:update` subcommand: ``` prism organization:update --name "New Organization Name" ``` To update your organization's name programmatically, use the [updateOrganization](https://prismatic.io/docs/api/schema/mutation/updateOrganization.md) mutation: ``` mutation { updateOrganization(input: { name: "New Organization Name" }) { organization { id } } } ``` #### Deleting your organization[​](#deleting-your-organization "Direct link to Deleting your organization") You can delete your organization from the **Subscription** tab within the organization **Settings**. Note that deleting your organization is permanent. --- ### Custom Domains Feature Availability The custom domains feature is available to customers on specific pricing plans. Refer to your pricing plan or contract, or contact the Prismatic support team for more information. You can configure the Prismatic platform to operate through a custom domain that you control. For example, you might want to white-label Prismatic so your team members and customers access `https://integrations.your-company.com` instead of [https://app.prismatic.io](https://app.prismatic.io/). To configure a custom domain, please contact our [support team](mailto:support@prismatic.io), and specify the subdomain (like `https://integrations.your-company.com`) you would like to use. We'll create an [SOA record](https://www.cloudflare.com/learning/dns/dns-records/dns-soa-record/) for that subdomain along with several A and CNAME records for web app access, webhook invocation, and OAuth 2.0 callbacks: * `https://integrations.your-company.com` - web app and embedded URL * `https://hooks.integrations.your-company.com` - webhooks * `https://oauth2.integrations.your-company.com` - OAuth 2.0 callback Once those records are ready, you'll need to create an [NS record](https://www.cloudflare.com/learning/dns/dns-records/dns-ns-record/) pointing to a list of nameservers we'll provide. *Note:* Some DNS providers represent nameserver lists as separate NS records (one server per record), while others use single NS records with multiple values. Once your domain is white-labeled, your OAuth2 applications can use `https://oauth2.integrations.your-company.com/callback` as a callback URL. 
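To confirm the NS delegation is in place once you have added the record, a quick DNS lookup should return the nameservers Prismatic provided. The sketch below uses the placeholder subdomain from this example; substitute your own subdomain:

```
# Check which nameservers the white-label subdomain delegates to
# (integrations.your-company.com is the placeholder subdomain used above)
dig NS integrations.your-company.com +short
```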
Webhook and OAuth 2.0 URLs are derived from the URL you visit Once a subdomain is configured, you can access your tenant through either the custom domain or the default Prismatic domain. Custom domains are not tenant-specific. If you have production and development tenants, both tenants can use the custom domain you establish. * If you access or embed Prismatic via the default Prismatic domain (`https://app.prismatic.io`), you will see webhook and OAuth URLs with the Prismatic domain. * If you access or embed Prismatic via your custom domain like `https://integrations.your-company.com`, you will see webhook and OAuth URLs with your custom domain. --- ### Deployment Regions #### Why use multiple regions?[​](#why-use-multiple-regions "Direct link to Why use multiple regions?") There are several reasons an organization might choose to host Prismatic integrations across multiple regions: * GDPR compliance requirements, ensuring EU-based customer data remains in the EU and US-based customer data remains in the US * CJIS or ITAR compliance requirements necessitating customer data storage in GovCloud * Low-latency requirements between their application and Prismatic, necessitating hosting integrations in Australia or the EU * Private cloud hosting to fulfill specific customer requirements Regardless of the requirement, when hosting integrations across multiple Prismatic stacks, you must maintain component and integration synchronization between regions. #### Logging in to additional regions[​](#logging-in-to-additional-regions "Direct link to Logging in to additional regions") Prismatic's public regions are accessible through the following URLs:

| Stack                | App URL                                 |
| -------------------- | --------------------------------------- |
| US Commercial (Ohio) | https://app.prismatic.io                |
| US GovCloud          | https://app.us-gov-west-1.prismatic.io  |
| Europe (Ireland)     | https://app.eu-west-1.prismatic.io      |
| Europe (London)      | https://app.eu-west-2.prismatic.io      |
| Canada (Central)     | https://app.ca-central-1.prismatic.io   |
| Australia (Sydney)   | https://app.ap-southeast-2.prismatic.io |

#### Prismatic IP allowlist (whitelist)[​](#prismatic-ip-allowlist-whitelist "Direct link to Prismatic IP allowlist (whitelist)") If your integration connects to an external application or service that restricts connections based on IP address, you can add the following relevant IPs to your allowlist.

| Stack                | App URL                         | IP Addresses                     |
| -------------------- | ------------------------------- | -------------------------------- |
| US Commercial (Ohio) | app.prismatic.io                | `3.132.205.204`, `3.139.185.169` |
| US GovCloud          | app.us-gov-west-1.prismatic.io  | `15.200.86.230`, `15.205.78.158` |
| Australia (Sydney)   | app.ap-southeast-2.prismatic.io | `52.65.181.77`, `54.252.173.54`  |
| Canada (Central)     | app.ca-central-1.prismatic.io   | `35.182.143.99`, `3.99.22.244`   |
| Europe (Ireland)     | app.eu-west-1.prismatic.io      | `54.78.26.19`, `54.246.201.85`   |
| Europe (London)      | app.eu-west-2.prismatic.io      | `18.132.171.185`, `18.134.91.207` |
If your integrations are hosted in a private cloud, please contact Prismatic support for the appropriate allowlist. **Note**: Integrations in Prismatic **send** data to third-party applications and services from these IP addresses. Invoking integrations in Prismatic uses name-based routing (e.g., hooks.prismatic.io) and IPs are subject to change. #### Private regions[​](#private-regions "Direct link to Private regions") If your contract includes a private cloud deployment, Prismatic's DevOps team will deploy a Prismatic stack in your AWS account within your chosen region. Contact Prismatic [support](mailto:support@prismatic.io) for your endpoint URL and IP allowlist. ##### Access to additional regions[​](#access-to-additional-regions "Direct link to Access to additional regions") By default, new Prismatic accounts are provisioned in the US Commercial (Ohio) region, [app.prismatic.io](https://app.prismatic.io). Access to additional regions can be enabled by Prismatic for enterprise customers whose contracts include additional regions. Users need to be added to each region User authentication spans all regions, allowing you to use the same email and password to log in to each region. However, user data is not shared across regions. Once your organization has been enabled by Prismatic support in an additional region, you must invite your team members to the new region for them to access your tenant. ##### Logging in to additional regions with Prism[​](#logging-in-to-additional-regions-with-prism "Direct link to Logging in to additional regions with Prism") The [prism](https://prismatic.io/docs/cli/prism.md) CLI tool interacts with the US Commercial region by default. To interact with an additional region, set a `PRISMATIC_URL` environment variable with the target region's endpoint: Log in to US and then EU stacks from a Unix-based terminal

```
$ prism login
Press any key to open prismatic.io in your default browser: Login complete!
$ prism me
Name: John Doe
Email: john.doe@example.com
Organization: Example Corp - US Region
Endpoint URL: https://app.prismatic.io

$ export PRISMATIC_URL=https://app.eu-west-1.prismatic.io
$ prism login
Press any key to open prismatic.io in your default browser: Login complete!
$ prism me
Name: John Doe
Email: john.doe@example.com
Organization: Example Corp - EU Region
Endpoint URL: https://app.eu-west-1.prismatic.io
```

**Note**: For PowerShell on Windows, you can set an environment variable using this syntax: ``` $ENV:PRISMATIC_URL="https://app.eu-west-1.prismatic.io" ``` For `prism` usage on headless systems (like a CI/CD pipeline or automated script), you will need to generate a refresh token. See docs on [Headless prism Usage](https://prismatic.io/docs/cli/bash-scripting.md#headless-prism-usage-for-cicd-pipelines). --- ### Integrations in Multiple Regions #### Syncing integrations between regions[​](#syncing-integrations-between-regions "Direct link to Syncing integrations between regions") An integration can be represented as a YAML definition that specifies the flows, steps, configuration, and config variables comprising the integration. Integrations can be replicated between regions by downloading an integration's YAML definition from one region and importing the YAML file into another region.
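Putting the region-switching environment variables together with the export and import commands described in the subsections below, a cross-region copy can be sketched roughly like this (the integration ID and refresh token are placeholders, and this assumes you have generated a refresh token for the target region):

```
# Export from the default US Commercial region, pinning steps to the latest
# published component versions so version numbers translate across regions
prism integrations:export SW50Z...your-integration-id --latest-components > integration.yaml

# Import into the EU (Ireland) region using that region's credentials
PRISMATIC_URL=https://app.eu-west-1.prismatic.io \
PRISM_REFRESH_TOKEN=your-eu-refresh-token \
prism integrations:import --path ./integration.yaml
```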
##### Exporting an integration's YAML definition[​](#exporting-an-integrations-yaml-definition "Direct link to Exporting an integration's YAML definition") To access an integration's YAML definition from the integration builder, click the **Integration Details** gear icon on the left side of the integration designer and select **Save to file**. To access an integration's YAML definition using `prism`, identify the integration's ID by running `prism integrations:list --extended`, or by copying the `SW5.....` portion of your integration's URL when it's open. Then, run `prism integrations:export`: Save an integration's YAML definition to my-file.yaml ``` prism integrations:export SW50ZWdyYXRpb246YmE0NGU5NmQtYWMzOS00MDMxLTg4MmUtMWQyNzA5ZjY5MDg0 > my-file.yaml ``` Component Versions Across Regions Component versions may not be consistent across regions. For example, you might publish an "Acme Inc" component 50 times during testing on the US Commercial stack, but only twice on your Europe tenant. In this case, "v50" in the US stack would correspond to "v2" in the Europe stack. A YAML file specifying "Acme Inc v50" would be meaningless to the Europe stack. To avoid version mismatches, export your YAML definition using the **Save to file with latest component versions** button in the UI, or by adding the `--latest-components` flag to your `prism` command: ``` prism integrations:export --latest-components SW50Z... ``` When you import your integration in another region, the latest versions of each component will be used. ##### Importing an integration's YAML definition[​](#importing-an-integrations-yaml-definition "Direct link to Importing an integration's YAML definition") Once you have the integration's YAML definition, you can import the integration through the UI or using `prism`. In the integration designer, open the **Integration details** modal again, then select **Import**. Using prism, run `prism integrations:import --path ./my-file.yaml` to import your YAML file. **Troubleshooting**: If you encounter errors, ensure that the custom components used in your source stack are also deployed to your destination stack. You may have an older version of your custom component that lacks some actions, inputs, etc. ##### Renaming flows and config variables in YAML definitions[​](#renaming-flows-and-config-variables-in-yaml-definitions "Direct link to Renaming flows and config variables in YAML definitions") Use caution when modifying flow names or config variable names in YAML If you modify the name of a flow or config variable in the YAML definition, you should include a `renameAttributes` section in your YAML. Your integration consists of flows, steps, config variables, and additional metadata, and can be represented as a YAML file. When a YAML file is imported, the Prismatic API attempts to match flows and config variables in the YAML with flows and config variables in the existing integration. If a match is found, the existing config variable or flow will remain. If no match is found, a new config variable or flow will be created and the old flow or config variable will be removed. This creates problems when you want to rename flows or config variables in your YAML definition: * If you rename a **flow**, the Prismatic API will not be able to match the flow in the YAML with a flow in the existing integration, and will create a new flow and delete the old one. Your integration will function identically, but the flow's ID will be different. 
Instances of your integration will be assigned new webhook URLs for the renamed flow when the instance is updated to the latest integration version. Different webhook URLs can cause issues if you already have webhooks configured using the old flow's URL. * If you rename a **config variable**, the Prismatic API will not be able to match the config variable in the YAML with the config variable in the existing integration, and will create a new config variable and delete the old one. This will cause issues if you have already configured instances of your integration. When a customer updates their instance, their old config variables will be removed and new ones will need to be reconfigured (this includes OAuth 2.0 connections - the customer user will need to reauthenticate their connection). To ensure that the Prismatic API matches the flow or config variable in the YAML with the flow or config variable in the existing integration, you should include a `renameAttributes` section in your YAML. This section includes `requiredConfigVars` and `flows` sections, which specify the old and new names of config variables and flows, respectively.

```
configPages:
  - elements:
      - type: configVar
        value: SFDC Connection
      - type: configVar
        value: SFDC Record Type
    name: Page 1
    userLevelConfigured: false
definitionVersion: 7
endpointType: flow_specific
flows:
  - description: ""
    endpointSecurityType: customer_optional
    isSynchronous: false
    name: SFDC Data Import
    steps: []
name: SFDC Integration
requiredConfigVars:
  - key: SFDC Record Type
    description: ""
    dataType: string
    orgOnly: false
    defaultValue: ""
  - key: SFDC Connection
    description: ""
    dataType: string
    orgOnly: false
    defaultValue: ""
renameAttributes:
  requiredConfigVars:
    - newName: SFDC Connection
      oldName: Salesforce Connection
    - newName: SFDC Record Type
      oldName: Salesforce Record Type
  flows:
    - newName: SFDC Data Import
      oldName: Salesforce Data Import
```

In the above example, we renamed the flow `Salesforce Data Import` to `SFDC Data Import`, and renamed the config variable `Salesforce Connection` to `SFDC Connection` and `Salesforce Record Type` to `SFDC Record Type`. With this block included, the Prismatic API will know to match the old flow and config variable names with the new flow and config variable names, and will not replace flows or config variables with new ones. #### Syncing components between regions[​](#syncing-components-between-regions "Direct link to Syncing components between regions") When you publish a component, it is published to a single region. To publish a built component to multiple regions, you must log in to each region and run `prism components:publish`. Alternatively, you can store a [refresh token](https://prismatic.io/docs/cli.md#authenticating-with-the-cli-tool) for each region. Then, you can deploy the component to any desired region: Publish a component to multiple regions

```
# Publish to default US commercial region
prism components:publish

# Publish to EU with inline environment variables
PRISM_REFRESH_TOKEN=your-eu-refresh-token PRISMATIC_URL=https://app.eu-west-1.prismatic.io prism components:publish

# Publish to GovCloud by exporting environment variables
export PRISM_REFRESH_TOKEN=your-govcloud-refresh-token
export PRISMATIC_URL=https://app.us-gov-west-1.prismatic.io
prism components:publish
```

Note that component versions increment independently in each region.
If you've published your custom component 50 times to US Commercial and 10 versions to the Sydney region, the same component code will be versioned as `v50` and `v10` respectively. #### Building multi-region deployment into a CI/CD pipeline[​](#building-multi-region-deployment-into-a-cicd-pipeline "Direct link to Building multi-region deployment into a CI/CD pipeline") To develop integrations and custom components in one tenant and automatically deploy them to another, you can store your custom component code and integration YAML in a version control system (like a git repository), and configure a CI/CD pipeline to automatically deploy new versions of components and integrations when code passes QA and is merged to your production branch. Since integrations depend on components, [build and publish](https://prismatic.io/docs/custom-connectors/publishing.md) your components first, then update your integrations. See [above](#syncing-components-between-regions) for information on publishing components across multiple regions. Once components are published, you can use the same `prism` authentication you used for components to [import integrations](#importing-an-integrations-yaml-definition). The `prism integrations:import` command will return an **integration ID** (SW50Z....). Using that integration ID, you can **publish** a new version of your integration: ``` prism integrations:publish SW50Z... --comment "My publication comment" ``` The `integrations:publish` command will return an **integration version ID** (SW50Z...). You can use that ID with the [`updateInstance`](https://prismatic.io/docs/api/schema/mutation/updateInstance.md) GraphQL mutation to update deployed instances to the latest published version of your integration. --- ### Organization Team Members Organization users are team members employed by *your* company. They are responsible for building, deploying, and supporting integrations for your customers, and can be granted permissions based on their assigned role. #### Organization team member roles[​](#organization-team-member-roles "Direct link to Organization team member roles") [What are Team Member Roles?](https://player.vimeo.com/video/502283103) Organization users can be assigned a variety of *roles*: * An organization **owner** is a super-user who can manage all aspects of an organization (users, customers, integrations, billing, etc.). * An organization **admin** has all the permissions of an owner, except the ability to make changes to the organization itself and manage billing. This role is typically granted to user management teams (like your IT team). * An organization **integrator** can manage customers, integrations, and instances. Most developers, DevOps, and implementation technicians will have this role. * An organization **guest** is a read-only user who can view information about customer instances but cannot modify anything. This is suitable for support technicians who need to view logs but should not modify instance configurations. * An organization **customer manager** has limited permissions and can manage customers but cannot view or manage their instances. This is appropriate for support users who should not have access to customer instance configurations. * An organization **third-party** user is used when you are integrating with a third-party app or service and would like to grant limited access to a user from that third-party to specific integrations, components, or customers. 
The **third-party** role is described in more detail [below](#third-party-users). | | Owner | Admin | Integrator | Guest | Customer Manager | Third-Party | | ------------------------- | ----- | ----- | ---------- | ----- | ---------------- | ----------- | | View Customers | x | x | x | x | x | ? | | View Customer Users | x | x | x | x | x | | | View Customer Instances | x | x | x | x | | | | View Alert Monitors | x | x | x | x | | | | Manage Customers | x | x | x | | x | | | Manage Customer Users | x | x | x | | x | | | Manage Components | x | x | x | | | ? | | Manage Instances | x | x | x | | | ? | | Manage Integrations | x | x | x | | | ? | | Manage Organization Users | x | x | | | | | | Configure Embedded Themes | x | x | | | | | | Manage Embedded Settings | x | x | | | | | | Configure Log Streaming | x | x | | | | | | Manage Organization | x | | | | | | | Manage Billing | x | | | | | | #### Managing organization users[​](#managing-organization-users "Direct link to Managing organization users") Only organization users with **admin** or **owner** roles can manage organization users. To manage organization users in the web app, click **Settings** on the left-hand sidebar, and select the **Team Members** tab. ##### Listing organization users[​](#listing-organization-users "Direct link to Listing organization users") * Web App * CLI * API Organization users are listed under the **Team Members** tab. You can filter users by typing a name into the search bar at the top of the page. You can also filter by email address by clicking the **Filter** link to the right of the search bar. ![List of org users in Prismatic app](/docs/img/configure-prismatic/organization-users/list-org-users.png) Users can be listed via CLI using the `organization:users:list` subcommand. List all organization users ``` prism organization:users:list Name Email ──────────────── ────────────────────────── James Patton james.patton@progix.io Samantha Johnson samantha.johnson@progix.io Ed Davis edward.davis@progix.io Kristin Henry kristin.henry@progix.io Alex Cooper alexander.cooper@progix.io ``` List users by querying the `users` field on [organization](https://prismatic.io/docs/api/schema/query/organization.md): ``` query { organization { users { nodes { id name email } } } } ``` ##### Adding organization users[​](#adding-organization-users "Direct link to Adding organization users") * Web App * CLI * API From the **Team Members** tab, click the **+ Add team member** button in the upper-right. Select an appropriate role for the new user (see above for permissions), and provide a name and email address for the user. ![Add team member in Prismatic app](/docs/img/configure-prismatic/organization-users/add-org-user.png) Management of organization users is performed through the `prism organization:users` subcommands. You can find the role IDs you are allowed to grant using `organization:users:roles`. 
Add 'Susan Smith' as a new 'integrator' ``` ROLE_ID=$(prism organization:users:roles \ --columns id \ --no-header \ --filter 'name=Integrator') prism organization:users:create \ --email 'susan.smith@progix.io' \ --name 'Susan Smith' \ --role ${ROLE_ID} ``` To create an organization user, you will need the ID of the role you want to assign: ``` query listOrganizationRoles { authenticatedUser { grantableRoles(roleType: ORGANIZATION) { id name description } } } ``` Once you have the role ID, use the [createOrganizationUser](https://prismatic.io/docs/api/schema/mutation/createOrganizationUser.md) mutation to create a new organization user: ``` mutation { createOrganizationUser( input: { name: "Susan Smith" email: "susan.smith@progix.io" role: "Um9sZTpmYzE0ODIwNC1mZmQxLTQxMWUtYmRlYS1iNmFmYzM4YmViOGE=" } ) { user { id } } } ``` After creating the new user, they will receive a confirmation email with a link to set up their profile and password. ##### Changing an organization user's role, name, avatar, or phone number[​](#changing-an-organization-users-role-name-avatar-or-phone-number "Direct link to Changing an organization user's role, name, avatar, or phone number") From the **Team Members** tab, click the name of a user. You can change the user's role, name, phone number, or avatar under the **Details** tab. After making changes, click **Save** to apply your updates. ![Edit team member in Prismatic app](/docs/img/configure-prismatic/organization-users/edit-org-user.png) ##### Deleting organization users[​](#deleting-organization-users "Direct link to Deleting organization users") * Web App * CLI * API From the **Team Members** tab, click the name of a user. Select the **Details** tab within that user's page and click the **Delete user** button at the bottom of the page. Enter the **Confirmation text** and click **Remove user** to confirm removal. Delete user 'Susan Smith' ``` USER_ID=$(prism organization:users:list \ --columns id \ --no-header \ --filter 'email=susan.smith@progix.io') prism organization:users:delete ${USER_ID} ``` To delete an organization team member, use the [deleteUser](https://prismatic.io/docs/api/schema/mutation/deleteUser.md) mutation: ``` mutation { deleteUser( input: { id: "VXNlcjpiMmNmNmY5MS1iMjljLTRlODUtOTc1My04NWE0NGM2ZDE2YzE=" } ) { user { id } } } ``` #### Third-party users[​](#third-party-users "Direct link to Third-party users") It is often necessary to involve people from third-party vendors as you build, test, and debug your integrations. Granting third-party vendors the ability to view and test specific integrations and components accelerates development and ensures all stakeholders remain aligned regarding development progress and data flow between systems. ##### Creating third-party users[​](#creating-third-party-users "Direct link to Creating third-party users") Organization users with **admin** or **owner** permissions can create new organization-level users with the **third-party** role. This role is highly restricted - by default, **third-party** users can only edit their own profile information and view built-in components. They cannot view information about your custom components, integrations, or customers. Once created, you can grant additional permissions to allow interaction with specific resources. You can create a third-party user as you would any other organization-level user: click **Settings** on the left-hand sidebar, then click **+ Add team member**. Assign the new user the **Third-Party** role. 
![Add third-party team member in Prismatic app](/docs/img/configure-prismatic/organization-users/add-third-party-user.png) ##### Granular access for third-party users[​](#granular-access-for-third-party-users "Direct link to Granular access for third-party users") To grant access to specific resources, like integrations, custom components or customers, click **Settings** on the left-hand sidebar and then **Team Members**. Select the third-party user you would like to grant access to, and click into the **Granular Access** tab. ![Set granular access for third-party team member in Prismatic app](/docs/img/configure-prismatic/organization-users/granular-access-tab.png) From here, you can grant the user access to specific **integrations**, **components**, or **customers** by clicking the **+ Add permission** button on the top-right. ![Add specific permissions to third-party team member in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-add-permission-dialogue.png) ##### Integration access[​](#integration-access "Direct link to Integration access") The most common use case for third-party users is to allow a third-party vendor to view, and possibly edit and test an integration. That way, they can test invoking an integration in Prismatic from their third-party service and can verify that the data the integration receives is in the format you agreed upon. Giving integration access to a third-party vendor also allows you to see what sort of attempts are being made on their end to make sure the integration works. You can view logs of each test a third-party vendor performs to give you a sense of how their side of the integration development is progressing, and if and when you jump on calls with your mutual customer and the third party, you can test and debug issues quickly (rather than relying on email chains that drag on for weeks). To grant a third-party vendor access to a specific integration, select **Integration** from the **+ Add permission** dialog, then search for and select the integration you want to give permissions for. ![Select integration for third-party permissions in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-select-integration.png) On the next screen select the types of permissions you would like to grant for that integration. If you would like the third-party user to be able to see the integration in their Integrations list view, select **View Integration**. If you would like the third-party user to be able to edit the integration, select **Edit Integration**. ![Choose integration permissions for third-party team member in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-select-integration-permissions.png) The third party user will then be able to see the integration that they've been granted permission to see, but all other integrations will remain hidden from them. This is handy if you are integrating with multiple competing vendors - the third-party vendors cannot see one another's integrations (or even know they exist). ![Single integration in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-only-one-integration.png) You must also grant access to relevant custom components If you give a third-party user access to an integration that uses custom components, you must also grant them access to those custom components. 
##### Component access[​](#component-access "Direct link to Component access") Similar to integrations, you can grant third-party users access to specific custom components. By default, third-party users have access to Prismatic built-in public components, but you may not want third-party vendors to see all of the custom components you've published (especially if you integrate with several competing vendors). To grant a third-party user access to a custom component, select **Component** after opening the **+ Add permission** dialog, and then search for and select the component you would like to grant access to. ![Select component for third-party permissions in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-select-component.png) You can grant a variety of component-related permissions to a third-party user. If they are assisting in the development of the custom component, they will need the **Edit Component** permission. Otherwise, to use the component in an integration they will just need the **View Component** permission. ![Choose component permissions for third-party team member in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-select-component-permissions.png) Custom components that are not granted to a user are not visible. This is once again handy if you are integrating with several competing companies, or your own competitors - their users will not be able to see what other custom components you've published. ##### Customer access[​](#customer-access "Direct link to Customer access") You can grant a third-party user access to a specific customer. This is handy if you and another vendor share a customer in common, and are working on an integration together for that customer. To grant permissions to a specific customer, select **Customer** after clicking **+ PERMISSION** and then select the customer you'd like to grant permissions for. ![Select customer for third-party permissions in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-select-customer.png) Next, select the permissions on this customer you would like to grant. There are a variety of options, each with a description below them. You can elect to let the third-party user view or manage the customer, the customer's users, and the instances deployed to the customer. read-only access grants viewing of customer credentials Note that if you grant the **View Customer** permission on a customer to a third-party user, that user can view the customer's saved credentials. ![Choose component permissions for third-party team member in Prismatic app](/docs/img/configure-prismatic/organization-users/third-party-select-customer-permissions.png) Permissions are scoped to a specific customer. That way, if you are developing an integration with a competing software vendor they will not be able to view information about the other customers in your system. --- ### Single Sign-On (SSO) Feature Availability The single sign-on feature is available to customers on specific pricing plans. Refer to your pricing plan or contract, or contact the Prismatic support team for more information. You can configure authentication for your team members through your existing identity provider. If your team members authenticate to other applications through Active Directory, Active Directory Federation Services (ADFS), or Lightweight Directory Access Protocol (LDAP), we can configure your organization to authenticate to Prismatic using the same method. 
When **single sign-on** (SSO) is enabled, team members will be redirected to your identity provider from the login screen if they enter an email address matching your domain. Otherwise, they will be prompted for a password using standard authentication. ![Diagram showing paths and logic for SSO (single sign on)](/docs/img/configure-prismatic/single-sign-on/sso.png) If you are interested in implementing Single Sign-On for Prismatic, please contact our [support](mailto:support@prismatic.io) team to configure and enable SSO for your organization. --- ### User Settings #### Managing your own user profile[​](#managing-your-own-user-profile "Direct link to Managing your own user profile") You can update your name, password, phone number, avatar image, and light/dark mode preferences. Begin by clicking your user avatar at the top-right of the screen, then select the **User settings** link. All user profile preferences are available in the **Password** and **Details** tabs. ##### Updating your password[​](#updating-your-password "Direct link to Updating your password") After clicking your avatar at the top-right of the screen and selecting **User settings**, open the **Password** tab. Enter your current password, then select a new password. Your password must contain: * At least 8 characters * At least one uppercase letter * At least one lowercase letter * At least one number ![Change own password in Prismatic app](/docs/img/configure-prismatic/user-settings/change-own-password.png) ##### Resetting a forgotten password[​](#resetting-a-forgotten-password "Direct link to Resetting a forgotten password") If you have forgotten your password, navigate to the Prismatic sign-in page, [app.prismatic.io](https://app.prismatic.io). If you are still logged in, first log out by clicking your avatar at the top-right of the screen and selecting **Logout**. Enter your email address, then click **Continue**. Click the **Forgot password?** button and then click **Continue** again. You will receive an email with a password reset link where you can create a new password. ##### Updating your name, avatar, or phone number[​](#updating-your-name-avatar-or-phone-number "Direct link to Updating your name, avatar, or phone number") After clicking your avatar at the top-right of the screen, select the **Details** tab. You can modify your name or phone number from this screen. If you provide a phone number, it can be used by your team members for monitoring and alerting purposes. If you modify your avatar image, the uploaded image will be resized and cropped to 500 x 500 pixels. Transparent PNG avatar images typically yield the best results. ##### Setting light or dark mode[​](#setting-light-or-dark-mode "Direct link to Setting light or dark mode") By default, the web application will present light or dark mode to match your operating system settings. To override the light/dark mode setting, click your avatar at the top-right of the screen, and click the **Light** or **Dark** button as needed. ![Set light/dark mode in Prismatic app](/docs/img/configure-prismatic/user-settings/dark-mode.png) Light/dark mode settings are associated with your user account, so if you set a preference on one computer, that preference will be retained when you log in from another computer. --- ### Customers Overview Customers in Prismatic represent *your* organization's customers. Your organization creates customers and [integrations](https://prismatic.io/docs/integrations.md) in Prismatic.
Your customers can either enable instances of your integrations themselves through your [integration marketplace](https://prismatic.io/docs/embed/marketplace.md), or your team members can deploy [instances](https://prismatic.io/docs/instances/deploying.md) to them. You can create [users](https://prismatic.io/docs/customers/customer-users.md) for your customers and assign them [roles](https://prismatic.io/docs/customers/customer-users.md#customer-user-roles) that determine how they can manage and gain insight into their deployed instances. ![Simple diagram of an org and customers](/docs/img/customers/organization_and_customers.png) Like all Prismatic resources, customers can be managed through Prismatic's CLI tool, [prism](https://prismatic.io/docs/cli.md), or through the web app by clicking the **Customers** link on the left-hand sidebar. ![List of customers in Prismatic app](/docs/img/customers/customers-page.png) --- ### Customer Users #### Customer users[​](#customer-users "Direct link to Customer users") **Customer users** are users associated with your customers. Their permissions are limited in scope to their specific customer account. They can view and manage users and instances of integrations that have been deployed to their customer, but they cannot access other customers' resources. Customer users can be granted permission to update instance configurations, enabling them to modify config variables and credentials tied to their instances without requiring direct support from your team. #### Customer user roles[​](#customer-user-roles "Direct link to Customer user roles") Customer users can be assigned one of two roles: * A customer **admin** can manage their users and the instances deployed to them. Assign this role to users who should be able to deploy, modify, enable, or disable instances of integrations. * A customer **member** has read-only access to view information about users and instances deployed to them. This role is suitable for users who need to view logs but should not have permission to modify instances. * A customer created through an [Embedded Marketplace](https://prismatic.io/docs/embed/marketplace.md) automatically receives the **marketplace admin** role, which allows them to deploy and manage their integrations. A **marketplace user** can only configure their own [User level configuration](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md) for an instance. Note: **marketplace** users cannot log in to Prismatic directly; they can only access it through the embedded integration marketplace. | | Admin | Member | Marketplace Admin | Marketplace User | | --------------------- | ----- | ------ | ----------------- | ---------------- | | View Customer Users | x | x | | | | Manage Customer Users | x | | | | | View Instances | x | x | x | | | View Instance Logs | x | x | x | | | Configure Instances | x | | x | | | Test Instances | x | | x | | | Configure ULC | | | x | x | #### Managing customer users[​](#managing-customer-users "Direct link to Managing customer users") you do not need to create embedded customer users Note that it is uncommon to create customer users manually, and you should only do so if you intend for your customer users to log in to Prismatic's web app. Most of the time, your users will interact with Prismatic through an [Embedded Marketplace](https://prismatic.io/docs/embed/marketplace.md). You do not need to create embedded users manually - a user will be automatically created when they authenticate within your app. 
Only customer users with the **admin** role or organization users with **integrator**, **admin**, or **owner** roles can manage customer users. To manage customers users in the web app, click **Customers** on the left-hand sidebar, select a customer from that list and open the customer's **Users** tab. If you are a customer user with the **admin** role, click the **Team Members** link on the left-hand sidebar to manage your users. ##### Listing customer users[​](#listing-customer-users "Direct link to Listing customer users") * Web App * CLI * API Users for a specific customer are listed on the main customer's **Users** tab. You can filter what users are shown by typing the name of a user into the search bar on the top of the page. You can also filter by email address by clicking the **Filter** link to the right of the search bar. ![Filter customer users in Prismatic app](/docs/img/customers/customer-users/list-customer-users.png) To list users with the Prismatic CLI tool, use the `prism customers:users:list` subcommand. List all users of customer 'FTL Rockets' ``` # Get your customer's ID CUSTOMER_ID=$(prism customers:list \ --columns=id \ --filter 'Name=^FTL Rockets$' \ --no-header) # List that customer's users prism customers:users:list --customer ${CUSTOMER_ID} Name Email ─────────── ─────────────────────────── Jim Simms james.simms@ftl-rockets.com Lisa Nguyen lisa.nguyen@ftl-rockets.com ``` To list all customers query [customers](https://prismatic.io/docs/api/schema/query/customers.md), or query [customer](https://prismatic.io/docs/api/schema/query/customer.md) for a specific customer's users: ``` query { customers { nodes { name users { nodes { name email } } } } } ``` ##### Adding a customer user[​](#adding-a-customer-user "Direct link to Adding a customer user") * Web App * CLI * API From the customer's **Users** tab, click the **+ User** button in the upper-right. Select an appropriate role for the new user (see above for permissions), and provide a name and email address for the user. ![Add customer user in Prismatic app](/docs/img/customers/customer-users/add-customer-user.png) To add a new customer, use the `customers:users:create` subcommand. You can use `customers:list` to get a customer's ID, and `customers:users:roles` to get a desired role ID. Add a new 'Admin' user for customer 'FTL Rockets' ``` # Get your customer's ID CUSTOMER_ID=$(prism customers:list \ --columns=id \ --filter 'Name=^FTL Rockets$' \ --no-header) # Get the ID of the "Admin" role ROLE_ID=$(prism customers:users:roles \ --columns id \ --no-header \ --filter 'name=Admin') # Create a user "Lisa Nguyen" that has the "Admin" role prism customers:users:create \ --email 'lisa.nguyen@ftl-rockets.com' \ --name 'Lisa Nguyen' \ --customer ${CUSTOMER_ID} \ --role ${ROLE_ID} ``` To create a new customer user, use the [createCustomerUser](https://prismatic.io/docs/api/schema/mutation/createCustomerUser.md) mutation. 
You will need to know the ID of the role you want to assign the user, as well as the ID of the customer: ``` mutation { createCustomerUser( input: { name: "Lisa Nguyen" email: "lisa.nguyen@ftl-rockets.com" role: "Um9sZTplMTRlZjUzNC0yOTZiLTQ4MjAtYjhmNS1jZjQ1Zjg4N2I0YjM=" customer: "Q3VzdG9tZXI6ZDIyOGUwNjItYzc0NC00NDFkLWE0MDMtNjQ1NTU4MDQ1OTZk" } ) { user { id } } } ``` ##### Changing a customer user's role, name, avatar picture or phone number[​](#changing-a-customer-users-role-name-avatar-picture-or-phone-number "Direct link to Changing a customer user's role, name, avatar picture or phone number") From the customer's **Users** tab, click the name of a user. Like organization users, you can change the role of a customer user under the **Details** tab. You can also change the user's name, avatar picture, or phone number under the **Details** tab. ![Edit customer user in Prismatic app](/docs/img/customers/customer-users/edit-customer-user.png) ##### Deleting a customer user[​](#deleting-a-customer-user "Direct link to Deleting a customer user") * Web App * CLI * API From the specific customer's **Users** tab, click the name of a user. Open the **Details** tab, and click the **Delete user** button on the bottom of the page. Enter the **Confirmation text** and click the **Remove user** button to confirm the removal. ![Delete customer user in Prismatic app](/docs/img/customers/customer-users/delete-customer-user.png) You can find the ID of the user to delete through the `customers:users:list` subcommand, and can delete the user with the `customers:users:delete` subcommand. Delete customer user 'Lisa Nguyen' ``` USER_ID=$(prism customers:users:list \ --customer EXAMPLEtZXI6NTRlNDQyMDgtNTJiNi00ZGVhLTgyODYtOWRkNDU4MTA2ZTYw \ --columns id \ --no-header \ --filter 'Email=lisa.nguyen@ftl-rockets.com') prism customers:users:delete ${USER_ID} ``` ``` mutation { deleteUser( input: { id: "VXNlcjpmYjFiNDMyZC0zN2Y5LTQyZTUtOTljNy1hNjc1ZWIzZGUyNTA=" } ) { user { id } } } ``` ##### Searching all customer users[​](#searching-all-customer-users "Direct link to Searching all customer users") You can search for users on a per-customer basis from the customer's **Users** tab. To search users of all customers, click the **Users** link on the left-hand sidebar. To search for a user by name, enter their name in the search bar on the top of the page. To search for a user by email, click the **Filter** link on the top of the page. ![Search all customer users in Prismatic app](/docs/img/customers/customer-users/search-customer-users.png) --- ### Managing Customers #### Creating new customers[​](#creating-new-customers "Direct link to Creating new customers") * Web App * CLI * API After clicking on the **Customers** link on the left-hand sidebar, click the **+ Add Customer** button on the upper-right. Enter an appropriate name and description for your customer. ![Add customer in Prismatic app](/docs/img/customers/managing-customers/add-customer.png) To create new customers, use the `prism customers:create` subcommand. 
``` prism customers:create \ --name 'FTL Rockets' \ --description 'Faster-Than-Light Rocket Inc' ``` Create a new customer using the [createCustomer](https://prismatic.io/docs/api/schema/mutation/createCustomer.md) mutation: ``` mutation { createCustomer( input: { name: "FTL Rockets", description: "Faster-Than-Light Rocket Inc" } ) { customer { id } } } ``` when using embedded, you do not need to create customer *users* If you are using [embedded](https://prismatic.io/docs/embed.md) to present an integration marketplace or workflow builder in your app, you must create customers that have external IDs (you can do this through the Web App, CLI or API), but you should not create customer *users*. Customer users are automatically created for you when you [sign a JWT](https://prismatic.io/docs/embed/authenticate-users.md) for authentication. Embedded customer users are different from standard customer users - standard customer users require a valid email address and receive Prismatic-branded confirmation emails. Embedded customer users can have any value (like a UUID) for their unique identifier and do not receive transactional emails. #### Searching customers[​](#searching-customers "Direct link to Searching customers") * Web App * CLI * API After clicking the **Customers** link on the left-hand sidebar, you can enter a portion of a customer's name into the search bar to filter customers by name. To filter customers by **description**, **[external ID](#customer-external-ids)**, or **[label](#customer-labels)** instead, click the **Filter** button to the right of the search bar. ![Search customer in Prismatic app](/docs/img/customers/managing-customers/search-customers.png) You can list customers using `prism customers:list`, and can `--filter` or `--sort` the results: ``` prism customers:list --filter "Name=Smith Rocket Company" ``` Query [customers](https://prismatic.io/docs/api/schema/query/customers.md) for a list of customers: ``` query { customers { nodes { id name description } } } ``` #### Modifying customers[​](#modifying-customers "Direct link to Modifying customers") After clicking the **Customers** link on the left-hand sidebar, you will be presented with a list of your customers. Clicking the name of any customer will bring you to the customer's page. This page contains a menu with options to manage instances assigned to the customer, alert monitors, logs, customer users, and file attachments. ##### Editing customer name, description and logo[​](#editing-customer-name-description-and-logo "Direct link to Editing customer name, description and logo") * Web App * CLI * API From the customer's page, click into the **Details** tab on the top of the page to change the customer's name or modify the longer customer description. To modify the customer's avatar icon, click the *Upload a photo* link on the **Details** tab. The avatar icon you upload will be resized and cropped to 500 x 500 pixels. Transparent PNG images tend to look the best. 
![Edit customer details in Prismatic app](/docs/img/customers/managing-customers/details.png) Update the customer name or description from the command line using the `prism customers:update` subcommand with the customer's ID: ``` prism customers:update \ EXAMPLEtZXI6M2JkYzcwNTAtZTU2ZS00ZGJkLThmMzQtNWI0MDdhOTEXAMPLE \ --name "New Customer Name" \ --description "New Customer Description" ``` Use the [updateCustomer](https://prismatic.io/docs/api/schema/mutation/updateCustomer.md) mutation to update a customer: ``` mutation { updateCustomer( input: { id: "Q3VzdG9tZXI6ZDIyOGUwNjItYzc0NC00NDFkLWE0MDMtNjQ1NTU4MDQ1OTZk" name: "New Customer Name" description: "New customer description" } ) { customer { id } } } ``` ##### Customer labels[​](#customer-labels "Direct link to Customer labels") Labels help you keep your customers organized. You can assign any number of labels to a customer from the customer's **summary** tab, and can then [search](#searching-customers) for customers by label. *Note*: for consistency, labels are always lower-cased. ![Search customers by label in Prismatic app](/docs/img/customers/managing-customers/search-by-label.png) #### Deleting customers[​](#deleting-customers "Direct link to Deleting customers") Deleting a Customer is Permanent When a customer is deleted, all associated users, and instances are also deleted. Use caution when deleting customers. * Web App * CLI * API After clicking on the **Customers** link on the left-hand sidebar, click the name of the customer you would like to delete, and then select the **Details** tab. Verify that the name shown matches the customer you wish to delete, and click the **Delete customer** button. Confirm your choice by typing the customer name exactly, and then click **Remove customer**. ![Delete customer in Prismatic app](/docs/img/customers/managing-customers/delete-customer.png) To delete a customer, use the `prism customers:delete` subcommand. ``` # Get the customer's ID CUSTOMER_ID=$(prism customers:list \ --columns id \ --filter 'Name=^FTL Rockets$' \ --no-header) prism customers:delete ${CUSTOMER_ID} ``` Delete a customer using the [deleteCustomer](https://prismatic.io/docs/api/schema/mutation/deleteCustomer.md) mutation: ``` mutation { deleteCustomer( input: { id: "Q3VzdG9tZXI6ZDIyOGUwNjItYzc0NC00NDFkLWE0MDMtNjQ1NTU4MDQ1OTZk" } ) { customer { id } } } ``` #### Customer external IDs[​](#customer-external-ids "Direct link to Customer external IDs") A customer can be assigned an **External ID** - a unique identifier from an external system. So, if you know "Smith Rocket Company" as a customer with an ID of `abc-123` in another system you use, you can assign "Smith Rocket Company" in Prismatic the `externalId` of `abc-123`. This is helpful if you need to quickly look up or associate customers in Prismatic with customers in your external system. External IDs can be set programmatically (see the next section), or can be edited within the Prismatic web app by clicking the **Customers** link on the left-hand sidebar, selecting a customer, and then clicking on the **Details** tab. ![Set customer external IDs in Prismatic app](/docs/img/customers/managing-customers/external-id.png) External IDs are required if you want to route webhook requests to instances deployed to specific customers using [shared](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md) webhook triggers. 
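If you create customers through the API, you can set the external ID at creation time rather than editing it afterward. Here's a minimal sketch that adds the `externalId` input to the `createCustomer` mutation shown earlier (the same field is used in the Python examples in the next section); the name and ID values are the examples from above:

```
mutation {
  createCustomer(
    input: {
      name: "Smith Rocket Company"
      externalId: "abc-123"
    }
  ) {
    customer {
      id
      externalId
    }
  }
}
```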
--- ### Sync Customers Programmatically When you embed Prismatic into your app, you need to [authenticate](https://prismatic.io/docs/embed/authenticate-users.md) your customer users as users of specific customers in Prismatic. You can establish these customer records in one of two ways: 1. Create them programmatically through our API. When authenticating embedded customer users, ensure that your customer record's `externalId` matches your embedded JWT's `customer` claim. 2. Specify both the customer's **External ID** *and* **Name** as embedded JWT claims. With this approach, customer records will be automatically created if they don't already exist. This guide focuses on option 1, demonstrating how to programmatically create and manage customer records rather than relying on automatic creation from your embedded application. We'll explore **External IDs**, which provide a way to link your external customer identifiers with Prismatic customer records. #### Setting up[​](#setting-up "Direct link to Setting up") We'll use Python in this tutorial, though the same ideas can be adapted to any language that has a [GraphQL client library](https://graphql.org/code/). Full code shown in this tutorial can be found on [GitHub](https://github.com/prismatic-io/examples/blob/main/api/customers/customers.py). To start, we'll add `gql` to our dependencies for our Python project: ``` pip install gql ``` Next, we'll import our necessary libraries and create a GraphQL client so we can run queries against Prismatic's API. We'll define the transport layer for getting to our API (HTTP), which will include an authorization header to authenticate our client against Prismatic: ``` import json import os import sys from gql import gql, Client from gql.transport.requests import RequestsHTTPTransport token = os.environ['PRISMATIC_API_KEY'] api_endpoint = "https://app.prismatic.io/api/" transport = RequestsHTTPTransport( url=api_endpoint, headers={'Authorization': f'Bearer {token}'} ) client = Client(transport=transport) ``` This code assumes that an environment variable, `PRISMATIC_API_KEY`, has been set (it's best practice not to hard-code API keys). To get an API key into our environment variables, we can leverage the `me:token` [prism subcommand](https://prismatic.io/docs/cli/prism.md#metoken): ``` export PRISMATIC_API_KEY=$(prism me:token) ``` Now that we have a working client, let's write some GraphQL queries and mutations to manage customers within Prismatic: #### GraphQL queries and mutations[​](#graphql-queries-and-mutations "Direct link to GraphQL queries and mutations") To sync customers, we'll use the [customers](https://prismatic.io/docs/api/schema/query/customers.md) query and [createCustomer](https://prismatic.io/docs/api/schema/mutation/createCustomer.md) and [deleteCustomer](https://prismatic.io/docs/api/schema/mutation/deleteCustomer.md) mutations. GraphQL Queries and Mutations In GraphQL lingo, a **query** is similar to a `SELECT` statement in SQL. You query for a particular set of information in a read-only way. A **mutation** is similar to an `INSERT`, `UPDATE`, or `DELETE` statement in SQL. You mutate data by creating new records, or updating or deleting existing records. 
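Before writing the customer functions, it can help to confirm that the client and API key actually work. A minimal sanity check, assuming the `client` and `gql` import from the setup above (the `authenticatedUser` query is the same one used elsewhere in these docs to look up role IDs):

```
# Quick check that the API key is valid: fetch the current user's email
result = client.execute(gql("query { authenticatedUser { email } }"))
print(result["authenticatedUser"]["email"])
```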
#### List all customers[​](#list-all-customers "Direct link to List all customers") First, let's write a function that gets a list of all of our customers, including their `name`, `description`, Prismatic `id`, and `externalId`:

```
def getCustomers():
    query = gql("""
        query {
            customers (isSystem: false) {
                nodes {
                    id
                    name
                    description
                    externalId
                }
            }
        }
    """)
    result = client.execute(query)
    return result['customers']['nodes']
```

This function runs a [customers](https://prismatic.io/docs/api/schema/query/customers.md) query against Prismatic's API. We specify `isSystem: false` because your Prismatic tenant has a set of "system" customers that are used behind-the-scenes for running test integrations as you build them. This will filter out those "system" customers, and return only your "real" customers. The query returns an object with a `customers` key, which (in GraphQL lingo) has a series of **nodes** (customers). We return `result['customers']['nodes']` so that just a list of customer objects is returned. Let's invoke this function, and use `json.dumps()` for readability:

```
print(json.dumps(getCustomers()))
```

```
[
  {
    "id": "Q3VzdG9tZXI6MThjZTBjM2EtYmQ5NS00MWJiLWIyMjUtN2MwYjVjMDg3YmE4",
    "name": "Mars Missions",
    "description": "Mars Missions Corp",
    "externalId": "abc-123"
  },
  {
    "id": "Q3VzdG9tZXI6MDllZDQyNTctMTNkMS00YTY4LWFiNTktY2Y5NzNmZGUyOTQy",
    "name": "Eastern Space Flight",
    "description": "Eastern Space Flight - Houston, TX",
    "externalId": "xyz-456"
  }
]
```

We could use the output of this function to figure out which customers have not been synced *into* Prismatic, or to sync Prismatic customers back to an outside system. ##### GraphQL pagination[​](#graphql-pagination "Direct link to GraphQL pagination") Before moving on, I want to address pagination. By default, the Prismatic API returns the first 100 results for a query. So, the [customers](https://prismatic.io/docs/api/schema/query/customers.md) query only returns 100 customers, even if we have more than 100 in the system. If we want to download **all** customers, we'll need to update our `getCustomers()` function a bit. We'll update our query so that it requests pagination information (`pageInfo`). We'll request whether or not there are additional pages to fetch (`hasNextPage`), and a unique ID indicating where to start the next page (`endCursor`). We'll feed that `endCursor` back into our query as a parameter `after`, so the GraphQL API knows where it last left off:

```
def getCustomers():
    query = gql("""
        query ($startCursor: String) {
            customers(after: $startCursor, isSystem: false) {
                nodes {
                    id
                    name
                    description
                    externalId
                }
                pageInfo {
                    hasNextPage
                    endCursor
                }
            }
        }
    """)
    cursor = ""
    hasNextPage = True
    customers = []  # Used to accumulate customer objects
    while hasNextPage:
        result = client.execute(query, variable_values={"startCursor": cursor})
        customers += result['customers']['nodes']
        hasNextPage = result['customers']['pageInfo']['hasNextPage']
        cursor = result['customers']['pageInfo']['endCursor']
    return customers
```

#### Fetch a specific customer by external ID[​](#fetch-a-specific-customer-by-external-id "Direct link to Fetch a specific customer by external ID") Now, let's look at how to look up a specific customer by their `externalId`. The [customers](https://prismatic.io/docs/api/schema/query/customers.md) query allows you to filter customers by `externalId`. We will pass in the `externalId` as a variable to the customers query using an object, `params`, that contains the GraphQL variables we want to use.
We'll also assert that we got exactly one customer object back from the API:

```
def getCustomerByExternalId(externalId):
    query = gql("""
        query ($externalId: String!) {
            customers (externalId: $externalId) {
                nodes {
                    id
                    name
                    description
                    externalId
                }
            }
        }
    """)
    params = {"externalId": externalId}
    response = client.execute(query, variable_values=params)
    assert len(response["customers"]["nodes"]) == 1, f"No customer with external ID '{externalId}' exists."
    return response["customers"]["nodes"][0]
```

If we invoke this function now, we'll get a single customer based on the `externalId` that we provide:

```
print(json.dumps(getCustomerByExternalId("abc-123")))
```

```
{
  "id": "Q3VzdG9tZXI6MThjZTBjM2EtYmQ5NS00MWJiLWIyMjUtN2MwYjVjMDg3YmE4",
  "name": "Mars Missions",
  "description": "Mars Missions Corp",
  "externalId": "abc-123"
}
```

#### Create a new customer[​](#create-a-new-customer "Direct link to Create a new customer") Next, we'll write a function that creates a new customer given the customer's `name`, `description`, and `externalId`. To do that, we'll use the [createCustomer](https://prismatic.io/docs/api/schema/mutation/createCustomer.md) GraphQL mutation. Similar to the [getCustomerByExternalId](#fetch-a-specific-customer-by-external-id) function above, we'll supply some input variables to our mutation with a `params` object:

```
def createCustomer(name, description="", externalId=""):
    mutation = gql("""
        mutation($name: String!, $description: String, $externalId: String) {
            createCustomer(
                input: {
                    name: $name,
                    description: $description,
                    externalId: $externalId
                }
            ) {
                customer {
                    id
                    name
                    description
                    externalId
                }
                errors {
                    messages
                }
            }
        }
    """)
    params = {
        "name": name,
        "description": description,
        "externalId": externalId
    }
    result = client.execute(mutation, variable_values=params)
    if "errors" in result:
        raise Exception(result["errors"])
    else:
        return result["createCustomer"]["customer"]
```

The `createCustomer` mutation will return `errors` if the `name` or `externalId` you specified is already in the Prismatic system. If the mutation does not throw an error, it will return a customer object containing the new customer's `name`, `description`, `externalId` and generated Prismatic `id`. Let's try it out:

```
print(
    json.dumps(
        createCustomer(
            name="Rockets Rockets Rockets",
            description="Rockets^3",
            externalId="456-xyz"
        )))
```

```
{
  "id": "Q3VzdG9tZXI6NGQ3ZDc3ZTktOTllNy00NmJiLWFlNDktMTg1N2JlNWNiYjUz",
  "name": "Rockets Rockets Rockets",
  "description": "Rockets^3",
  "externalId": "456-xyz"
}
```

#### Delete a customer by external ID[​](#delete-a-customer-by-external-id "Direct link to Delete a customer by external ID") For the last customer-related function, let's delete a customer given their external ID. To do that, we'll use the [deleteCustomer](https://prismatic.io/docs/api/schema/mutation/deleteCustomer.md) mutation. First, we'll look up the customer we would like to delete using the [getCustomerByExternalId](#fetch-a-specific-customer-by-external-id) function we wrote previously. Then, we'll feed the Prismatic ID that we get into the `deleteCustomer` mutation:

```
def deleteCustomer(externalId):
    customer = getCustomerByExternalId(externalId)
    mutation = gql("""
        mutation ($id: ID!)
If we supply an external ID that is not attached to a customer, the `getCustomerByExternalId()` function will throw an error indicating that.

---

### Prism MCP Server

The **Prism MCP Server** is a local Model Context Protocol (MCP) server that helps AI assistants work with the Prismatic API for code-native integration and custom component development. Source code for the Prism MCP server is available on [GitHub](https://github.com/prismatic-io/prism-mcp).

Not to be confused with Prismatic's MCP flow server!

The Prism MCP server is different from Prismatic's MCP flow server.

* This tool, the Prism MCP server, is a development tool for use with AI coding assistants to help you build [custom connectors](https://prismatic.io/docs/custom-connectors.md) and [code-native integrations](https://prismatic.io/docs/integrations/code-native.md).
* [Prismatic's MCP flow server](https://prismatic.io/docs/ai/model-context-protocol.md) is a hosted service that lets AI agents interact with workflows deployed on the Prismatic platform.

You'd use this server for building integrations and connectors, and the Prismatic MCP flow server for interacting with deployed workflows.

#### Features[​](#features "Direct link to Features")

This MCP server provides several tools, organized into categories. You can register whichever set of tools is most relevant to your use case.

![Claude Code using the Prism MCP server to assist with code-native integration development](/docs/img/dev-tools/prism-mcp/claude-vscode.png)

##### General tools (always available)[​](#general-tools-always-available "Direct link to General tools (always available)")

* **prism\_me**: Check login status and display current user profile information
* **prism\_components\_list**: List all available components with version options

##### Integration tools (toolset: "integration")[​](#integration-tools-toolset-integration "Direct link to Integration tools (toolset: \"integration\")")

###### Utilities[​](#utilities "Direct link to Utilities")

* **prism\_integrations\_list**: List all integrations
* **prism\_integrations\_init**: Initialize a new Code Native Integration
* **prism\_integrations\_convert**: Convert a Low-Code Integration's YAML file to Code Native
* **prism\_integrations\_flows\_list**: List flows for an integration
* **prism\_integrations\_flows\_test**: Test a flow in an integration
* **prism\_integrations\_import**: Import an integration from a specific directory

###### Code generation[​](#code-generation "Direct link to Code generation")

* **prism\_install\_component\_manifest**: Generate component manifest in CNI src directory (requires spectral@10.6.0 or greater)
* **prism\_install\_legacy\_component\_manifest**: Generate line to add to a CNI's devDependencies for legacy component manifest installation
* **prism\_integrations\_generate\_flow**: Generate boilerplate file for a CNI flow
* **prism\_integrations\_generate\_config\_page**: Generate boilerplate code for a CNI config page
* **prism\_integrations\_generate\_config\_var**: Generate boilerplate code for a config variable
* **prism\_integrations\_add\_connection\_config\_var**: Returns path to connection wrapper function if available, otherwise generates boilerplate code for a connection config variable
* **prism\_integrations\_add\_datasource\_config\_var**: Returns path to datasource wrapper function if available, otherwise generates boilerplate code for a datasource config variable

##### Component tools (toolset: "component")[​](#component-tools-toolset-component "Direct link to Component tools (toolset: \"component\")")

* **prism\_components\_init**: Initialize a new Component (supports WSDL/OpenAPI generation)
* **prism\_components\_publish**: Publish a component from a specific directory
* **prism\_components\_generate\_manifest**: Generate the manifest for a Prismatic component

##### Toolset configuration[​](#toolset-configuration "Direct link to Toolset configuration")

Tools are organized into **toolsets** that can be selectively enabled via the `TOOLSETS` environment variable:

* **`integration`** - Enables all integration-related tools
* **`component`** - Enables all component-related tools
* **General tools** are always available regardless of toolset configuration

If no `TOOLSETS` environment variable is set, all tools are registered by default.

#### Prerequisites[​](#prerequisites "Direct link to Prerequisites")

1. Install the Prism CLI globally:

   ```
   npm install --global @prismatic-io/prism
   ```

2. Authenticate with Prismatic:

   ```
   prism login
   ```

#### Usage[​](#usage "Direct link to Usage")

##### General configuration[​](#general-configuration "Direct link to General configuration")

Configuration location and methods vary slightly depending on the AI tool you are using, but the following is relatively standard. More specific instructions are below. Example setup:

```
{
  "mcpServers": {
    "prism": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@prismatic-io/prism-mcp", "."],
      "env": {
        "PRISMATIC_URL": "https://app.prismatic.io"
      }
    }
  }
}
```

If you would like the MCP server to run in a different directory than the currently open workspace, replace the `.` path argument with your working directory:

```
{
  "args": ["-y", "@prismatic-io/prism-mcp", "/path/to/code-native/integration"]
}
```

Command-line arguments:

* First argument: **Required.** Working directory path that determines where Prism CLI commands are run from. Most coding agents (like Cursor and Claude Code) will interpret `.` as the current workspace directory. If your coding agent does not support this, you can specify an absolute path to your code-native integration or custom component project instead.
* Remaining arguments: **Optional.** Toolsets to enable (`integration`, `component`). If no toolsets are specified, all tools are registered by default. Being selective about toolsets may improve performance. For example, to enable only integration-related tools:

```
{
  "args": ["-y", "@prismatic-io/prism-mcp", ".", "integration"]
}
```

Optional environment variables:

* `PRISMATIC_URL`: `https://app.prismatic.io` by default. If your Prismatic tenant is hosted in a different region, or if you use a private stack deployment, set this variable to your Prismatic URL.

##### Installing with Claude Desktop[​](#installing-with-claude-desktop "Direct link to Installing with Claude Desktop")

Add the above JSON config to your `claude_desktop_config.json` file.

##### Installing with Claude Code[​](#installing-with-claude-code "Direct link to Installing with Claude Code")

To use this MCP server with Claude Code, add the above config to your working directory's `.mcp.json` configuration file.
Alternatively, run ``` claude mcp add-json prism '{"type":"stdio","command":"npx","args":["-y","@prismatic-io/prism-mcp","."],"env":{"PRISMATIC_URL":"https://app.prismatic.io"}}' ``` ##### Installing with Cursor[​](#installing-with-cursor "Direct link to Installing with Cursor") You can configure available MCP Servers via `Cursor Settings` > `MCP Tools`, then add the above config to your `mcp.json` file. Or, click this link to install automatically: [Add MCP Server to Cursor](cursor://anysphere.cursor-deeplink/mcp/install?name=prism\&config=ewogICJ0eXBlIjogInN0ZGlvIiwKICAiY29tbWFuZCI6ICJucHgiLAogICJhcmdzIjogWwogICAgIkBwcmlzbWF0aWMtaW8vcHJpc20tbWNwIiwKICAgICIuIgogIF0sCiAgImVudiI6IHsKICAgICJQUklTTUFUSUNfVVJMIjogImh0dHBzOi8vYXBwLnByaXNtYXRpYy5pbyIKICB9Cn0%3D) ##### Installing with VS Code / GitHub Copilot[​](#installing-with-vs-code--github-copilot "Direct link to Installing with VS Code / GitHub Copilot") Add the above config to the `.vscode/mcp.json` in your workspace, or the global `mcp.json` file (accessible via the "Add MCP Server..." option in the Command Palette). Or, click this link to install automatically: [Add MCP Server to VS Code](vscode:mcp/install?%7B%22name%22%3A%22prism%22%2C%22type%22%3A%22stdio%22%2C%22command%22%3A%22npx%22%2C%22args%22%3A%5B%22%40prismatic-io%2Fprism-mcp%22%2C%22.%22%5D%2C%22env%22%3A%7B%22PRISMATIC_URL%22%3A%22https%3A%2F%2Fapp.prismatic.io%22%7D%7D) ##### Other tools[​](#other-tools "Direct link to Other tools") If your agent of choice is not listed, please reference their official documentation for setup instructions. --- ### Prismatic Extension for VS Code & Cursor An extension for VSCode & Cursor that improves the developer experience around Code-Native Integrations (CNI) by enabling test execution, integration imports, instance configuration, and inspection of execution results directly within the IDE. #### Purpose[​](#purpose "Direct link to Purpose") The main intent of this extension is to offer: 1. **Seamless Development Workflow Integration:** This extension bridges the gap between local development and the Prismatic platform by providing direct access to integration testing, configuration, and debugging tools within your IDE. Instead of constantly switching between your code editor and the Prismatic web interface, developers can manage their entire CNI development lifecycle from VS Code, reducing context switching and improving productivity. 2. **Real-time Testing and Debugging:** The extension provides immediate feedback on integration performance through real-time test execution and detailed step-by-step output streaming. This allows developers to quickly identify issues, debug problems, and iterate on their integrations without leaving their development environment, significantly reducing the feedback loop between coding and testing. 3. **Unified Configuration Management:** By integrating the Prismatic CLI directly into VS Code, the extension ensures consistent configuration management across different environments and team members. The Config Wizard provides a guided interface for setting up integration instances, while maintaining synchronization with the Prismatic platform, ensuring that local development configurations stay aligned with production environments. #### Features[​](#features "Direct link to Features") * **Authentication**: Secure login and token management through the Prismatic CLI. * **Config Wizard**: Configure integration instances with a guided interface. 
* **Execution Results**: View detailed step-by-step outputs and logs.
* **Integration Import**: Direct import of integrations from Prismatic.
* **Message Passing**: Bi-directional communication between webviews and the extension.
* **React Integration**: Modern UI components using React and styled-components.
* **State Management**: Persistent state across extension sessions.
* **VSCode Theming**: Seamless integration with VS Code's theme system.

#### Prerequisites[​](#prerequisites "Direct link to Prerequisites")

* A [Prismatic account](https://prismatic.io).
* The [Prism](https://prismatic.io/docs/cli/#installing-the-cli-tool) CLI installed globally.
* VSCode [version 1.96.0 or higher](https://code.visualstudio.com/updates/v1_96).
* A [Prismatic Code-Native Integration (CNI) project](https://prismatic.io/docs/integrations/code-native/).

#### Installation[​](#installation "Direct link to Installation")

Install the Prismatic extension for VS Code by visiting the [Visual Studio Marketplace](https://marketplace.visualstudio.com/items?itemName=prismatic.prismatic-io) or by searching for "Prismatic" in the VS Code Extensions view.

#### Usage[​](#usage "Direct link to Usage")

The extension provides commands and webview panels that can be accessed through the VS Code command palette:

1. Press `Cmd+Shift+P` (Mac) or `Ctrl+Shift+P` (Windows/Linux).
2. Type "Prismatic" to see available commands.

##### Available Commands[​](#available-commands "Direct link to Available Commands")

![Available Commands](/docs/img/dev-tools/vscode-extension/marketplace-commands.png)

###### `Prismatic: Config Wizard`[​](#prismatic-config-wizard "Direct link to prismatic-config-wizard")

Launches the Config Wizard to edit configuration values for your integration instance.

###### `Prismatic: Import Integration`[​](#prismatic-import-integration "Direct link to prismatic-import-integration")

Imports the Code-Native Integration (CNI) from your local project into the Prismatic platform.

###### `Prismatic: Test Integration`[​](#prismatic-test-integration "Direct link to prismatic-test-integration")

Executes a test for the Code-Native Integration (CNI). After the test is complete, it streams step outputs and logs for debugging.

###### `Prismatic: Create Flow Payload`[​](#prismatic-create-flow-payload "Direct link to prismatic-create-flow-payload")

Generates a sample flow payload for testing purposes. This can be used to simulate real-world data inputs during integration testing.

###### `Prismatic: Login`[​](#prismatic-login "Direct link to prismatic-login")

Logs in to your Prismatic account using your globally installed Prismatic CLI (Prism), then stores your authentication session.

###### `Prismatic: Logout`[​](#prismatic-logout "Direct link to prismatic-logout")

Logs out of your Prismatic account using your globally installed Prismatic CLI (Prism), then clears your authentication session.

###### `Prismatic: Prismatic URL`[​](#prismatic-prismatic-url "Direct link to prismatic-prismatic-url")

Sets your system's PRISMATIC\_URL environment variable for your Prismatic CLI (Prism), then syncs it to the extension. This allows you to change your Prismatic stack environment.

###### `Prismatic: Open Integration in Browser`[​](#prismatic-open-integration-in-browser "Direct link to prismatic-open-integration-in-browser")

Opens the Code-Native Integration (CNI) in the browser. This is useful for debugging and inspecting the integration from the Prismatic application.
###### `Prismatic: Me`[​](#prismatic-me "Direct link to prismatic-me") Displays details about the currently authenticated Prismatic user, including name, organization, and Prismatic stack environment information using your globally installed Prismatic CLI (Prism). ###### `Prismatic: Refresh Token`[​](#prismatic-refresh-token "Direct link to prismatic-refresh-token") Refreshes your Prismatic authentication token to ensure continued access without needing to logout and log in again using your globally installed Prismatic CLI (Prism). ###### `Prismatic: Focus on Execution Results View`[​](#prismatic-focus-on-execution-results-view "Direct link to prismatic-focus-on-execution-results-view") Displays the results of the Code-Native Integration (CNI) test. This includes the executions, step results (onTrigger and onExecution), and step outputs & logs. ##### Available Webviews[​](#available-webviews "Direct link to Available Webviews") ###### `Prismatic: Focus on Execution Results View`[​](#prismatic-focus-on-execution-results-view-1 "Direct link to prismatic-focus-on-execution-results-view-1") Displays the results of the Code-Native Integration (CNI) test. This includes the executions, step results (onTrigger and onExecution), and step outputs & logs. ![Execution Results](/docs/img/dev-tools/vscode-extension/marketplace-execution-results.png) ###### `Prismatic: Config Wizard`[​](#prismatic-config-wizard-1 "Direct link to prismatic-config-wizard-1") Displays the Config Wizard to edit configuration values for your integration instance. ![Config Wizard](/docs/img/dev-tools/vscode-extension/marketplace-config-wizard.png) #### Troubleshooting[​](#troubleshooting "Direct link to Troubleshooting") If you encounter issues with the Prismatic CLI: 1. Ensure the CLI is installed globally: `npm install -g @prismatic-io/prism` 2. Verify the installation: `prism --version` 3. Check your PATH environment variable includes the npm global bin directory 4. Try reinstalling the extension --- ### Prismatic Insider #### Upcoming events[​](#upcoming-events "Direct link to Upcoming events") ##### Subscribing to Events with Prismatic's API and Event Webhooks (2025-11-18)[​](#subscribing-to-events-with-prismatics-api-and-event-webhooks-2025-11-18 "Direct link to Subscribing to Events with Prismatic's API and Event Webhooks (2025-11-18)") Learn how to subscribe to Prismatic's event webhooks to monitor your integrations and workflows, and how to query Prismatic's GraphQL API to get real-time data about your integrations. Register [here](https://www.prismatic.io/events/webinar-event-webhooks-2025/). #### Past events[​](#past-events "Direct link to Past events") ##### Incorporating code-native integrations into your CI/CD pipeline. (2025-10-21)[​](#incorporating-code-native-integrations-into-your-cicd-pipeline-2025-10-21 "Direct link to Incorporating code-native integrations into your CI/CD pipeline. (2025-10-21)") Learn how to organize your integration code repository, leverage GitHub Actions to deploy connectors and integrations, and include best practices for sharing logic between custom connectors and code-native integrations. 
##### Integration Marketplace Best Practices (2025-09-16)[​](#integration-marketplace-best-practices-2025-09-16 "Direct link to Integration Marketplace Best Practices (2025-09-16)")

Learn how to embed and deploy a marketplace in your app.

##### Productized Integrations (2025-08-19)[​](#productized-integrations-2025-08-19 "Direct link to Productized Integrations (2025-08-19)")

Learn how Prismatic enables you to turn your integrations into products.

##### Incorporate AI in Your Integrations (2025-07-22)[​](#incorporate-ai-in-your-integrations-2025-07-22 "Direct link to Incorporate AI in Your Integrations (2025-07-22)")

Learn about the new AI capabilities in Prismatic and how you can leverage them for your integrations.

##### Visibility features for efficient integrations (2025-06-17)[​](#visibility-features-for-efficient-integrations-2025-06-17 "Direct link to Visibility features for efficient integrations (2025-06-17)")

Explore Prismatic's visibility features, including debugging tools recently added to the custom connector and code-native integration SDK.

##### Using a code-native approach to build on your low-code integration (2025-05-20)[​](#using-a-code-native-approach-to-build-on-your-low-code-integration-2025-05-20 "Direct link to Using a code-native approach to build on your low-code integration (2025-05-20)")

Learn how to convert Prismatic's low-code integrations into code-native integrations (CNI) with our new CNI converter. This session includes a brief overview of CNI and a live demo of the converter.

---

### Instances Overview

An **instance** of an [integration](https://prismatic.io/docs/integrations.md) is a copy of an integration that has been configured for a specific [customer](https://prismatic.io/docs/customers.md). When configuring an instance, you or your customers set up connections to third-party applications and services, along with customer-specific configuration variables, by walking through a [configuration wizard](https://prismatic.io/docs/integrations/config-wizard.md). You can deploy instances of your integrations [on behalf of your customers](https://prismatic.io/docs/instances/deploying.md), or your customers can enable instances themselves through the [integration marketplace](https://prismatic.io/docs/embed/marketplace.md).

For your customers, the term **instance** doesn't have any specific meaning - they either have an integration or they don't. When customers log in to Prismatic, they see phrases such as "activate this integration" or "configure this integration." A customer "activates" an "integration" - which is equivalent to deploying an "instance" in organization user terminology.

When a [flow](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md) within an instance is triggered, an [execution](https://prismatic.io/docs/monitor-instances/executions.md) of that instance's flow runs.

#### What happens when an instance is deployed[​](#what-happens-when-an-instance-is-deployed "Direct link to What happens when an instance is deployed")

When an instance is deployed, any triggers marked as [deploy triggers](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger) are invoked. We recommend adding an [alert monitor](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md) to instances you deploy so you can be notified if an execution (including a deploy flow) fails to complete. Webhooks are generated for each flow and become available for invocation after the instance is deployed.
Schedule triggers are registered with the Prismatic scheduler and will run at your specified intervals.

Instances are billed based on how long the instance is enabled. You are not billed when an instance is paused.

---

### Deploying Instances

#### Options for deploying instances[​](#options-for-deploying-instances "Direct link to Options for deploying instances")

[Options for deploying an instance](https://player.vimeo.com/video/894996278)

Once you've built and published an integration, it's time to configure and deploy an instance of your integration to a customer. You can either deploy the instance yourself, or grant your customer access to deploy the instance themselves.

#### Option 1: Deploy an instance yourself[​](#option-1-deploy-an-instance-yourself "Direct link to Option 1: Deploy an instance yourself")

As you work with your first few customers and build out an MVP, it may be easiest to [deploy instances yourself](#configuring-instances).

**Advantages**:

* You can quickly iterate on your integration and fix any production issues that arise.
* No additional development work is required to support self-deployment.

**Disadvantages**:

* You'll need to manually deploy instances for each customer.
* You'll possibly need to handle the customers' credentials for the third-party service as you configure the instance.

#### Option 2: Grant customers access to deploy instances through the Prismatic app[​](#option-2-grant-customers-access-to-deploy-instances-through-the-prismatic-app "Direct link to Option 2: Grant customers access to deploy instances through the Prismatic app")

You can invite your customers to log in to Prismatic and deploy instances themselves. You can do this by publishing your integration to the [integration marketplace](https://prismatic.io/docs/embed/marketplace.md) and then inviting the [customer user](https://prismatic.io/docs/customers/customer-users.md) to your tenant. The customer user will only be able to see instances that are deployed to them (and not to your other customers).

**Advantages**:

* Your customer can configure and deploy instances themselves.
* You don't need to handle the customers' credentials for the third-party service.

**Disadvantages**:

* Your customer will require an additional login to Prismatic.

#### Option 3: Embed the integration marketplace in your app[​](#option-3-embed-the-integration-marketplace-in-your-app "Direct link to Option 3: Embed the integration marketplace in your app")

You can [embed the integration marketplace](https://prismatic.io/docs/embed/marketplace.md) in your app so that your customers can deploy instances without leaving your app.

**Advantages**:

* Customers can deploy instances for themselves without leaving your app.
* You can leverage your existing authentication system to authenticate customers.
* You have some control over the fonts, colors, and other styling of the marketplace, so you can make the embedded iframe match your app.

**Disadvantages**:

* Some development work is required to embed the marketplace in your app (though the Prismatic embedded SDK makes this easy!).

#### Option 4: Build your own UI for deploying instances[​](#option-4-build-your-own-ui-for-deploying-instances "Direct link to Option 4: Build your own UI for deploying instances")

If you want to build a custom UI for deploying instances, you can use the Prismatic embedded SDK to [make API requests](https://prismatic.io/docs/embed/embedded-api-requests.md) to the Prismatic API on behalf of your customer user.
You can query for integrations in the integration marketplace, components, etc., and can map those records to custom UI elements in your app. **Advantages**: * You have full control over the UI and can build a custom UI that fits your app's look and feel. **Disadvantages**: * This option requires the most development work. #### Configuring instances[​](#configuring-instances "Direct link to Configuring instances") When you or your customer deploy an instance, you will work through the [Configuration Wizard](https://prismatic.io/docs/integrations/config-wizard.md) that your integration builders created. This is where you can set values for connections and configuration variables for your deployed instance. If your integration developers set default values for the config variables, those will be set initially, but you can override them if you choose. Depending on config variable type, you'll have the option to toggle boolean values, enter string values, enter JSON or XML for a code config variable, or select options from a dropdown. ![Configure instance in Prismatic app](/docs/img/instances/deploying/configuration.png) Be sure to click **Save and deploy** on the last page of the **Configuration Wizard** to save any changes you make. ##### Creating an unconfigured instance[​](#creating-an-unconfigured-instance "Direct link to Creating an unconfigured instance") The embedded marketplace shows only integrations that have been explicitly added to marketplace. But, there are situations where you may want to provide a particular integration that is not part of marketplace to a customer. To do that, you will need to create an unconfigured instance for a customer and elect to [show all instances](https://prismatic.io/docs/embed/marketplace.md#showing-all-instances-in-marketplace) to your customer. Your customer will be able to log in to marketplace and configure that instance for themselves. To create an unconfigured instance, select the **Skip configuration** button when creating your instance. ![Skip instance configuration](/docs/img/instances/deploying/skip-configuration.png) An unconfigured instance will not be deployed until your customer enters their configuration information, and will not count towards your monthly instance count. ##### Setting integration version for an instance[​](#setting-integration-version-for-an-instance "Direct link to Setting integration version for an instance") When an integration is published, a new version of the integration is created. Instances of the integration can then be updated to use the new integration version. In the instance's configuration page, you'll see **New Version Available** if your instance can be updated. To update your instance, click the **Reconfigure** button at the top right of the page, and then select the latest version from the **Integration Version** field. ![Set instance version in Prismatic app](/docs/img/instances/deploying/set-integration-version.png) You can pin instances to different integration versions Not all instances need to run the same version of the integration. For example, one customer might be running a legacy version of a third-party app. They can continue to run "version X" of an integration until they upgrade their third-party app, at which point their instance can be upgraded to "version Y". 
If an instance upgrade causes problems - suppose a new definition of an integration has a bug that an older one didn't have - you can always reconfigure the instance to run an older version of the integration by similarly clicking **Reconfigure** and choosing an older known working integration version. --- ### Integration Marketplace Prismatic's **integration marketplace** allows you to present your integrations to your customers. You can [embed](https://prismatic.io/docs/embed/marketplace.md) the marketplace within your app so your users can seamlessly deploy integrations natively using your existing authentication system. Within the integration marketplace, your customers can: * Explore your integration offerings in an attractive marketplace * Easily self-activate integrations that connect your app to third-party services they use * Monitor their active integrations using powerful logging and alerting tools You can choose which integrations to include in your integration marketplace, and how they're presented to your customers. Customers follow a simple configuration and deployment experience that [you create](https://prismatic.io/docs/integrations/config-wizard.md) when you build the integration. They enter configuration values and credentials, select a few options from some dropdown menus, and click "activate". Disambiguating "integrations" and "instances" For organization users, an **integration** refers to a general, productized and published integration that can be configured and deployed to multiple customers. An **instance** of an integration is a copy of the integration that has been configured and deployed to a specific customer. For your customers, **instance** has no meaning - they either have an integration or they don't. So, when customers log in to Prismatic they see phrases like "activate this integration", or "configure this integration". A customer "activates" an "integration" - which is the same as deploying an "instance" in your lingo as an organization user. #### Presenting the integration marketplace to your customers[​](#presenting-the-integration-marketplace-to-your-customers "Direct link to Presenting the integration marketplace to your customers") You have a few options for presenting the integration marketplace to your customers: 1. You can [embed the integration marketplace](https://prismatic.io/docs/embed/marketplace.md) directly into your app as an iframe. This option allows your customers to activate integrations seamlessly within your app. You can [theme](https://prismatic.io/docs/embed/theming.md) the integration marketplace to match your app's design. ![](/docs/img/integrations/integration-marketplace/presenting-embed-iframe.png) Your customers log in to your app and activate integrations through an embedded iframe 2. For a completely custom experience, you can use the [embedded SDK](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md) to [query Prismatic's API](https://prismatic.io/docs/embed/embedded-api-requests.md) for information on integrations. You can then use this information to build your own integration marketplace within your app. This option requires the most development effort, but allows you to build a completely custom integration marketplace experience. ![](/docs/img/integrations/integration-marketplace/presenting-custom-ui.png) Your customers log in to your app and activate integrations through a custom UI 3. You can create Prismatic accounts for your customers and let them log in to Prismatic to activate integrations. 
This is the simplest option, and works well if you don't have an existing authentication system. However, this option requires your customers to log in to Prismatic to activate integrations, rather than activating integrations directly within your app. ![](/docs/img/integrations/integration-marketplace/presenting-login-to-prismatic.png) Your customers log in to Prismatic to activate integrations #### Preparing an integration for the integration marketplace[​](#preparing-an-integration-for-the-integration-marketplace "Direct link to Preparing an integration for the integration marketplace") If you would like to configure an integration to appear in the integration marketplace for your customers, follow these steps: 1. First, [publish your integration](https://prismatic.io/docs/integrations/low-code-integration-designer.md#publishing-an-integration) from the integration designer. 2. Next, open your integration's marketplace configuration page. You can either: 1. Click the **Settings** link on the left-hand side of the integration designer and choose **Marketplace Configuration**. Click the **RECONFIGURE** button that appears in the upper-right corner of the next screen. Or, 2. Select the **Integration Marketplace** link on the left-hand sidebar of the main screen, click **+ Integration**, and select the integration you'd like to add. 3. Select the version of your integration that you would like to appear in the integration marketplace. Provide a nice overview to explain to users what your integration does. Your overview can contain [markdown](https://markdownlivepreview.com/). ![Configure integration for marketplace in Prismatic app](/docs/img/integrations/integration-marketplace/org-configure-integration.png) Enable **Customer Deployable** if you would like your customers to be able to activate the integration themselves. If **Customer Deployable** is disabled, your integration will appear in the integration marketplace with a note instructing your customers to contact your company to activate the integration: ![Toggle customer deployable option for integration in Prismatic app](/docs/img/integrations/integration-marketplace/customer-deployable-off.png) Click the **UPDATE** button when you are ready for the integration to be available in the integration marketplace. To change any details about your integration marketplace offering, open the **Integration Marketplace** link on the left-hand sidebar again. Select the integration you want to modify. Add an icon to your integration To make your integration stand out in the integration marketplace, you can [add an icon](https://prismatic.io/docs/integrations/low-code-integration-designer.md#assigning-an-icon-to-an-integration) to your integration. ##### Preparing the configuration experience[​](#preparing-the-configuration-experience "Direct link to Preparing the configuration experience") Integrations are driven by [config variables](https://prismatic.io/docs/integrations/config-wizard/config-variables.md). As you develop your integration, you create a series of config variables and connections that your integration steps reference. We recommend grouping related config variables together. For example, put all Amazon S3 variables together. Add appropriate headers between groups of config variables. This will give your customers a better deployment experience. 
You can choose which config variables customers can configure when you create the config variables: ![Enable config variables for customers in Prismatic app](/docs/img/integrations/integration-marketplace/customer-configurable-config-variable.png) This is useful if you have a config variable or credential that you want to use for all customers, but don't want customers to see its value. For example, you might have an API key that you use organization-wide but don't want customers to see. If a config variable is **not** customer-configurable, the variable's **Default Value** is used. Set default values for non-customer-configurable variables All non-customer-configurable config variables must have default values. If one does not have a default value, the integration cannot be deployed. Instead, your customers will get a message when they attempt to activate the integration. The message will instruct them to contact your company, and your team will need to set a value for that missing config variable. #### Updating integration marketplace version[​](#updating-integration-marketplace-version "Direct link to Updating integration marketplace version") Your customers have access to the version of the integration available in the integration marketplace. You can update the version in two ways: 1. Select the **Marketplace configuration** link on the left-hand sidebar of the main screen and select your integration, and then select a version of your integration to present. 2. When you publish a new version of your integration, you can choose to update your integration marketplace version from the publish drawer in the integration designer. ![Update integration version in integration marketplace from designer](/docs/img/integrations/integration-marketplace/update-integration-version.png) #### Activating integrations as a customer[​](#activating-integrations-as-a-customer "Direct link to Activating integrations as a customer") You can embed marketplace, or let your customers log in to Prismatic Most organizations choose to [embed](https://prismatic.io/docs/embed/marketplace.md) marketplace in their apps so integration activation is a seamless experience and their users never have to leave their app. The instructions below apply if you're choosing *not* to embed marketplace in your app. You have the option to create [customer users](https://prismatic.io/docs/customers/customer-users.md) in Prismatic, allowing your customers to log in to Prismatic and activate integrations for themselves. Once you have an integration in your marketplace, customers can activate the integration to themselves. The customer's view of Prismatic differs from an organization user's view - a customer can only see integrations, configuration, users, etc., that are specific to them. As a customer user, open the **Integration Marketplace** link on the left-hand sidebar. Integrations that have been activated are marked with a green check mark: ![List of integrations with activated integrations highlighted in Prismatic integration marketplace](/docs/img/integrations/integration-marketplace/integration-marketplace.png) A customer can activate a new integration by clicking into an integration that hasn't been activated (no green check mark). 
They will be greeted with a popover with the name, description, and overview of the integration: ![Customer activates integration in Prismatic integration marketplace](/docs/img/integrations/integration-marketplace/deploy-modal.png) Once a customer has clicked **ACTIVATE**, they are brought to the integration configuration screen: ![Customer configures integration in Prismatic integration marketplace](/docs/img/integrations/integration-marketplace/customer-configure-integration.png) Your customer can now enter values for config variables and connections, and click **ACTIVATE** when they are finished. They can test the integration by clicking into the **Test** tab, and view logs and execution data by clicking into the **Logs** and **Executions** tabs. Webhook URL(s) are listed at the bottom of the page so they can be used to configure third-party apps and services. #### Deactivating an integration as a customer[​](#deactivating-an-integration-as-a-customer "Direct link to Deactivating an integration as a customer") To remove an integration as a customer, open the integration from the **Integration Marketplace** page, and click **Deactivate Integration** on the bottom of the page. #### Supporting and modifying deployed instances[​](#supporting-and-modifying-deployed-instances "Direct link to Supporting and modifying deployed instances") To modify an activated integration as a customer, open the integration from the **Integration Marketplace** page again and click **Reconfigure**. Make changes to configuration and click **SAVE** to save changes. As an organization user, you can modify an integration that a customer activated as you would any other deployed [instance](https://prismatic.io/docs/instances/deploying.md). You also have the ability to modify config variables that are marked non-customer-configurable. #### Removing an integration from the integration marketplace[​](#removing-an-integration-from-the-integration-marketplace "Direct link to Removing an integration from the integration marketplace") To remove an integration from the integration marketplace, open the **Integration Marketplace** page from the left-hand sidebar. Select the integration you would like to remove, and then click **Remove Integration** on the bottom of the page. Activated Instances will not be removed If you remove an integration from the integration marketplace, instances of the integration that customers have activated will not automatically be removed. You can safely remove an integration marketplace offering without affecting existing deployments, and instances will still show up on the **Instances** page. #### Embedding marketplace in an application[​](#embedding-marketplace-in-an-application "Direct link to Embedding marketplace in an application") Marketplace can be embedded into your application. See our [Embedding Marketplace](https://prismatic.io/docs/embed/marketplace.md) article for more details. --- ### Managing Instances #### Enabling and disabling instances[​](#enabling-and-disabling-instances "Direct link to Enabling and disabling instances") If you would like to stop a deployed instance from executing, click the **Enabled** button under **Status** on the instance's **Summary** tab. When disabled, your instance will not execute on a cron schedule (if configured to use [scheduled triggers](https://prismatic.io/docs/integrations/triggers/schedule.md)), nor respond to webhook invocations. 
![Pause instance in Prismatic app](/docs/img/instances/managing/pause-instance.png) To re-enable a disabled instance, click the yellow **Paused** button. #### Viewing instance execution results[​](#viewing-instance-execution-results "Direct link to Viewing instance execution results") It's useful for debugging purposes to be able to see execution results of instance invocations. Click the **Executions** tab from an instance's page to see the logs and step outputs of each execution of the instance. ![Instance execution results in Prismatic app](/docs/img/instances/managing/execution-results.png) If an instance failed to run to completion for whatever reason, you can review the data that was passed in to the instance when it was invoked, and that can help you to debug instances. Results for all instances and all customers can be found by clicking the **Executions** link on the left-hand sidebar, and results for a specific customer can be found by clicking into a **Customer**, and then selecting their **Executions** tab. #### Instance execution retry and replay[​](#instance-execution-retry-and-replay "Direct link to Instance execution retry and replay") Executions sometimes fail. Errors can be thrown for a variety of reasons. A third-party app your integration relies on may be down, or your integration may encounter an edge-case that it doesn't handle correctly. With **retry** and **replay** you can re-run failed executions so you don't miss important data. * **Retry** allows you to automatically re-run an execution if it fails to run to completion. This is useful if you want to handle temporary outages of third-party apps or services. Following a failure, your instance will re-attempt the execution a configurable number of times, with a configurable delay between each attempt. * **Replay** allows you to manually re-run an execution. This is useful if you want to fix a bug in your integration and then re-run an execution with the same payload that caused it to run initially. ##### Execution retry[​](#execution-retry "Direct link to Execution retry") Integrations can be configured to automatically [retry](https://prismatic.io/docs/monitor-instances/retry-and-replay/automatic-retry.md) in the event that an instance fails to run to completion. Information about instance retries can be found on the [execution results pages](#viewing-instance-execution-results). There, you will see when the instance last ran, and when it will attempt to run again. ![Instance execution details in Prismatic app](/docs/img/instances/managing/execution-retry.png) An icon beside an execution indicates that the execution was an automatic retry of an instance execution that failed. ##### Execution replay[​](#execution-replay "Direct link to Execution replay") To manually retry (i.e. "Replay") an invocation of an instance, click the icon beside any execution run. The instance will be run again with the same webhook payload that caused it to run initially. To programmatically retry many failed executions, the [executionResults](https://prismatic.io/docs/api/schema/query/executionResults.md) query can be used to find executions that failed to run to completion. You can then use the [replayExecution](https://prismatic.io/docs/api/schema/mutation/replayExecution.md) mutation to replay the execution. 
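As a rough sketch of that pattern, the example below reuses the `gql`/`client` setup from the customer management examples earlier on this page. The filter arguments on `executionResults` (`error_Isnull`, `replayForExecution_Isnull`) and the `replayExecution` input shape shown here are assumptions for illustration, so confirm the exact names against the API schema reference; pagination is also omitted for brevity.

```
def replayFailedExecutions():
    # Find executions that ended in an error and have not already been replayed.
    # NOTE: these filter argument names are assumptions - verify them against
    # the executionResults schema before relying on this sketch.
    find_failed = gql("""
        query {
            executionResults(error_Isnull: false, replayForExecution_Isnull: true) {
                nodes {
                    id
                }
            }
        }
    """)
    failed = client.execute(find_failed)["executionResults"]["nodes"]

    # Replay each failed execution with the same payload that originally invoked it.
    replay = gql("""
        mutation ($id: ID!) {
            replayExecution(input: { id: $id }) {
                errors {
                    messages
                }
            }
        }
    """)
    for execution in failed:
        client.execute(replay, variable_values={"id": execution["id"]})
```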
For an example of how to bulk-replay failed executions, see our [examples repository](https://github.com/prismatic-io/examples/tree/main/api/replay-failed-executions) on GitHub or review the [Execution Retry & Replay](https://prismatic.io/docs/monitor-instances/retry-and-replay.md) documentation. Replays are linked with the original execution, so you can query for only original executions that have not had a successful subsequent replay. #### Adding alert monitors to instances[​](#adding-alert-monitors-to-instances "Direct link to Adding alert monitors to instances") Instance alert monitors allow you to notify your team when a variety of things occur, including failed instance executions, slow executions, instances in unexpected disabled states, etc. They can be found by clicking the **Monitors** tab from the instance's page. **For more information**: [Creating Alert Monitors](https://prismatic.io/docs/monitor-instances/alerting.md). #### Viewing instance logs[​](#viewing-instance-logs "Direct link to Viewing instance logs") Logs for an instance can be viewed by clicking the **Logs** tab from the instance's page. You can search log message text through the **Search Logs** search bar on the top of the page, and you can filter logs by Log Severity or date range by clicking the **Filter** link to the right of the search bar. ![Filter instance logs in Prismatic app](/docs/img/instances/managing/instance-logs.png) **For More Information**: [Logging](https://prismatic.io/docs/monitor-instances/logging.md), [Log Retention](https://prismatic.io/docs/monitor-instances/logging.md#log-retention) #### Deleting instances[​](#deleting-instances "Direct link to Deleting instances") Deleting an instance removes the instance and any associated data. Before choosing to delete an instance, consider if you want to [disable](#enabling-and-disabling-instances) the instance from running instead. If you choose to delete the instance, scroll to the bottom of the instance's **Details** tab. Click the **Delete instance** button, and type the name of the instance in the input field to confirm that you want to delete the instance. Click **Remove instance**. ![Delete instance in Prismatic app](/docs/img/instances/managing/delete-instance.png) --- ### Testing Instances #### Invoking instances[​](#invoking-instances "Direct link to Invoking instances") An instance's flows can be invoked one of four ways: 1. You can set up your integration to run [on a schedule](https://prismatic.io/docs/integrations/triggers/schedule.md) 2. You can invoke them [through a webhook](https://prismatic.io/docs/integrations/triggers/webhook.md) 3. You can configure your flow to run [on deployment](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger) or [on instance removal](https://prismatic.io/docs/integrations/triggers/management.md#instance-remove-trigger) 4. You can test a flow manually #### Testing instances from the web app[​](#testing-instances-from-the-web-app "Direct link to Testing instances from the web app") You can invoke an instance outside of its cron schedule or webhook invocations to ensure it functions properly. To run a test of an instance, open the **Test** tab. You can fill in a test payload body and custom HTTP headers to simulate a webhook trigger payload. Click the **Run** button to invoke the test. 
Alternatively, look up the ID of a flow in an instance with `prism instances:flow-configs:list ${INSTANCE_ID}` and then run `prism instances:flow-configs:test ${FLOW_ID}` from the command line. ![Test instance in Prismatic app](/docs/img/instances/testing/test-instance.png) Logs from the test can be found by clicking the **Logs** tab. #### Invoking instances with webhook triggers[​](#invoking-instances-with-webhook-triggers "Direct link to Invoking instances with webhook triggers") If you choose to invoke your instance's flows with a webhook trigger, webhook URLs will be generated for each flow when you deploy the instance. To invoke an instance's flow programmatically, you can send a POST request to the webhook URL with an optional payload. Here's an example using `curl`, though you can use any language you prefer: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --location \ --header "Content-Type: application/json" \ --data '{"examplePayloadKey": "examplePayloadValue"}' ``` **More Information**: [Webhook Triggers](https://prismatic.io/docs/integrations/triggers/webhook.md) --- ### How Do I Build Integrations? Integrations between your app and the other apps your customers use require several pieces: 1. Code that moves data between systems 2. Compute resources to run your code 3. UI/UX that allows customers to configure integrations independently 4. Secure mechanisms for OAuth 2.0 and other authentication 5. Infrastructure to handle incoming webhook notifications 6. Logging, monitoring, and alerting 7. *and more...* With the Prismatic platform, you focus on #1 (building flows that move data between systems), while the platform handles the rest. You can build and test integrations in our [low-code designer](https://prismatic.io/docs/integrations/low-code-integration-designer.md), or, if you are a developer, you can build a [code-native](https://prismatic.io/docs/integrations/code-native.md) integration in TypeScript using your preferred IDE and our integration SDK. ![Low-code and code-native development](/docs/img/intro/faq/building-integrations/low-code-and-code-native.png) --- ### Can I Offer Self-Serve Building? Your customers may require integrations between your application and their internal systems, or with bespoke applications unique to them. Prismatic's [embedded workflow builder](https://prismatic.io/docs/embed/workflow-builder.md) enables your customers to build custom integrations directly within your application. When logged in, customers can create integrations that synchronize data between your application and their internal or custom systems - all through a drag-and-drop interface that maintains your application's look and feel. ![Embedded workflow builder](/docs/img/intro/faq/offer-self-serve-building/embedded-workflow-builder.png) Prismatic provides authentication, connectors, and templates, while your customers gain the flexibility to build and maintain their own integration logic within your ecosystem. This reduces custom development overhead and empowers customers to manage their integrations independently. --- ### Can I Offer Self-Serve Deployment? Your customers should be able to enable integrations independently, without requiring assistance from your team. You can embed the Prismatic [integration marketplace](https://prismatic.io/docs/embed/marketplace.md) within your application so that it appears as a native feature. 
![Embedded marketplace](/docs/img/intro/faq/offer-self-serve-deployment/embedded-marketplace.png) Customers can browse available integrations and, upon selecting one, enter authentication and configuration details to enable the integration for themselves. --- ### How Do I Productize Integrations? A **productized integration** is built as a core feature of your product, rather than as a one-off solution for a specific customer. Productizing an integration typically involves market and user research, product design, development, testing, deployment, and ongoing monitoring and management. #### Benefits of productizing integrations[​](#benefits-of-productizing-integrations "Direct link to Benefits of productizing integrations") Productized integrations offer several advantages for your SaaS application, including: * Prospective customers are more likely to choose your application if it integrates with the other systems they use. * Integrations reduce the "time to value" for new customers during onboarding, as they quickly see their data reflected in your application. * Existing customers are more likely to remain engaged ("sticky"), reducing churn and making your application a core part of their workflow. * Integrations create upsell opportunities. Depending on your commercial model, you can leverage integrations to increase contract value. #### Steps to productize integrations[​](#steps-to-productize-integrations "Direct link to Steps to productize integrations") To productize integrations, follow these steps: ##### Conduct user and market research[​](#conduct-user-and-market-research "Direct link to Conduct user and market research") Interview current and prospective customers to identify which other applications and services they use. Determine which systems are most valuable to integrate with. ##### Build your integration[​](#build-your-integration "Direct link to Build your integration") [Develop](https://prismatic.io/docs/integrations.md) an integration between your app and the other apps you've identified. ![Build an integration](/docs/img/intro/faq/productize-integrations/build-integration.png) ##### Create a configuration experience[​](#create-a-configuration-experience "Direct link to Create a configuration experience") Design an intuitive [configuration experience](https://prismatic.io/docs/integrations/config-wizard.md) to streamline deployment. Configuration should be straightforward for end users, minimizing the need for your team's intervention. ![Configure an instance of an integration](/docs/img/intro/faq/productize-integrations/configure-instance.png) ##### Create an integration marketplace[​](#create-an-integration-marketplace "Direct link to Create an integration marketplace") Add an [integration marketplace](https://prismatic.io/docs/embed/marketplace.md) to your application so customers can self-serve integrations. ![Integration marketplace](/docs/img/intro/faq/productize-integrations/integration-marketplace.png) #### Customer stories[​](#customer-stories "Direct link to Customer stories") See [Customer Stories](https://prismatic.io/customers/) for interviews with Prismatic customers describing how they implemented productized integrations and integration marketplaces in their applications. 
--- ### Terminology | Term | Definition | | ------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Integration | A collection of logical flows and steps that move data between your app and other apps your customers use. | | Customer | A business that purchases your product. Typically, there is a one-to-one relationship between customers in your application and those created in Prismatic. | | Instance | A configured deployment of an integration for a specific customer, using their credentials and configuration options. | | Flow | A sequence of steps beginning with a trigger, designed to accomplish a specific task (such as moving records between systems). An integration can contain multiple flows. | | Execution | A single run of an instance's flow. An instance may have multiple concurrent executions. | | Component (Connector) | Reusable code that performs specific tasks or connects to third-party services. Components can be utility-focused (e.g., math operations) or connection-focused ("connectors" such as the Salesforce component). Components contain actions (e.g., "Get Record"), triggers, data sources for dynamic UI elements, and connections for third-party authentication. | | Custom Component | A component you build to connect to your own application or a third-party service you integrate with. | | Trigger | Determines when a flow should run. Types include scheduled triggers (e.g., "run every 5 minutes on weekdays"), webhook triggers that respond to incoming payloads, and instance lifecycle triggers (e.g., run when an instance is deployed). | | Action | An individual operation within a component that performs a specific task (such as "Create Record" or "List Accounts"). | | Raw Request | A mechanism to interact with API endpoints not covered by built-in component actions. While components wrap common endpoints as actions, Raw Request allows you to send HTTP requests to any endpoint. | | Config Variable | A configuration option presented as an input box, dropdown menu, boolean toggle, or other UI element. Customers set these when configuring integration instances. Config variables can be referenced throughout the integration and drive its logic. | | Connection | A specialized config variable containing authentication and connection details such as usernames, passwords, API keys, OAuth 2.0 credentials, endpoints, and API versions required for connecting to external services. | | Data Source | A dynamic config variable that uses a customer's connection to populate UI elements with data from third-party systems. For example, a "List Channels" data source might populate a dropdown menu with the customer's Slack channels. | | Configuration Wizard | The interface where customers enter credentials and select config variable values when enabling an integration instance. | | OAuth 2.0 | An authorization protocol commonly used in integrations. Customers authorize access to their third-party application data by clicking a "connect" button in your configuration wizard. | | Webhook | A real-time notification sent between applications when events occur. Webhooks enable systems to notify each other of changes and can trigger integration flows to sync those changes. 
| | Marketplace | A collection of your integrations that customers can browse, configure, and enable in a self-service manner. | | Embedded Marketplace | A marketplace integrated into your application, allowing customers to configure and enable integrations without leaving your app. | | Embedded Workflow Builder | A tool that enables customers to build their own integrations directly within your application. | | Alert Monitor | A system for notifying your team via SMS, email, or webhook when integrations behave unexpectedly (e.g., errors occur or performance degrades). | | Prism CLI Tool | A command-line interface for managing customers, integrations, instances, and custom components. | | Prismatic API | A GraphQL-based API used by Prismatic's frontend and CLI tools. You can use it to create scripts for programmatically managing customers, deploying instances, and more. | --- ### What is Prismatic? As a B2B software company, your customers expect seamless data synchronization between your application and other platforms they use. Prismatic is an embedded integration platform as a service (iPaaS) that provides the tools and infrastructure you need to build, deploy, and manage integrations at scale. #### How does my team build integrations?[​](#how-does-my-team-build-integrations "Direct link to How does my team build integrations?") ![Low-code or code-native integration development](/docs/img/intro/what-is-prismatic/low-code-or-code-native.png) You can build [integrations](https://prismatic.io/docs/integrations.md) in three ways with Prismatic: * The [low-code integration designer](https://prismatic.io/docs/integrations/low-code-integration-designer.md) offers an intuitive drag-and-drop UX for both developers and technical non-developers. Use pre-built [connectors](https://prismatic.io/docs/components.md) to create [workflows](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md) that sync data between apps. * Developers can use Prismatic's integration SDK to build [code-native integrations](https://prismatic.io/docs/integrations/code-native.md) in TypeScript, leveraging their preferred IDE and development tools. * [Embed the low-code workflow builder](https://prismatic.io/docs/embed/workflow-builder.md) in your application to allow customers to create their own custom integrations. #### How do customers enable integrations?[​](#how-do-customers-enable-integrations "Direct link to How do customers enable integrations?") ![Embedding marketplace in your app](/docs/img/intro/what-is-prismatic/embedded-marketplace.png) After building and testing an integration, it's time to configure and deploy copies of the integration for your customers. You can deploy integrations [on behalf of your customers](https://prismatic.io/docs/instances/deploying.md), or, more commonly, embed the [Prismatic marketplace](https://prismatic.io/docs/embed/marketplace.md) in your application to enable customer self-service. The marketplace requires minimal code using our embedded SDK and appears native to your app. When customers deploy an integration, they are guided through a custom, intuitive [configuration wizard](https://prismatic.io/docs/integrations/config-wizard.md) that you design for their specific integration instance. 
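To give a feel for the embedded piece, here is a minimal sketch of rendering the marketplace with the `@prismatic-io/embedded` SDK. The `/api/prismatic-token` route and the `#integration-marketplace` element are placeholders for your own backend endpoint and markup; check the embedding docs for the exact options your SDK version supports.

```
import prismatic from "@prismatic-io/embedded";

// Placeholder: a backend route in your app that signs a Prismatic-compatible
// JWT for the currently signed-in customer user.
async function fetchPrismaticToken(): Promise<string> {
  const response = await fetch("/api/prismatic-token");
  const { token } = await response.json();
  return token;
}

async function mountMarketplace() {
  // Initialize the embedded SDK and authenticate the embedded session.
  prismatic.init();
  await prismatic.authenticate({ token: await fetchPrismaticToken() });

  // Render the integration marketplace inside an element of your app.
  prismatic.showMarketplace({
    selector: "#integration-marketplace",
    usePopover: false,
  });
}

mountMarketplace();
```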
#### How do I maintain integrations?[​](#how-do-i-maintain-integrations "Direct link to How do I maintain integrations?") ![Monitoring integrations in Prismatic](/docs/img/intro/what-is-prismatic/monitoring.png) Once customers have deployed integration instances, you need to ensure reliable data flow. Prismatic provides comprehensive tools for [logging](https://prismatic.io/docs/monitor-instances/logging.md), [monitoring](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md), [alerting](https://prismatic.io/docs/monitor-instances/alerting.md), and [observability](https://prismatic.io/docs/monitor-instances/executions.md). You can stream logs to your favorite logging platform (like [DataDog](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-to-datadog.md) or [New Relic](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-to-new-relic.md)), and you can configure notifications for issues through SMS, email, or your favorite alerting system (like [Slack](https://prismatic.io/docs/monitor-instances/alerting/sending-alerts-to-slack.md) or [PagerDuty](https://prismatic.io/docs/monitor-instances/alerting/sending-alerts-to-pagerduty.md)). As your integration offerings scale, you can leverage Prismatic's [GraphQL API](https://prismatic.io/docs/api.md) to automate the deployment and management of your integrations. #### Why not build integrations in-house?[​](#why-not-build-integrations-in-house "Direct link to Why not build integrations in-house?") Your engineering team is capable of building native integrations. Writing code to move data between systems is rarely the most challenging aspect - most modern applications offer robust APIs, and a developer can quickly script data transfers. But [building an integration isn't the hard part](https://prismatic.io/blog/building-an-integration-isnt-the-hard-part/). Beyond the core logic, you must address: * Provisioning compute resources to run your integrations * Deployment pipelines to push updates * Scaling integration infrastructure to handle load * Presenting integrations within your application * Customer configuration workflows * Secure authentication management * OAuth 2.0 standards (callback URLs, token refresh, etc.) * Webhook handling and retry * Scheduled (non-webhook) integration execution * Log storage and observability * Alerting and monitoring * *and more...* ![Monitoring integrations in Prismatic](/docs/img/intro/what-is-prismatic/integration-environment.png) While your team can build this infrastructure, doing so diverts resources from your core product. Prismatic provides these capabilities out of the box, allowing your engineers to focus on delivering product value rather than burning dev hours on maintaining integration infrastructure. #### How do I start?[​](#how-do-i-start "Direct link to How do I start?") First, [sign up](https://prismatic.io/docs/free-trial) for a free trial. After signing up, we recommend working through our [getting started](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) tutorial to build your first integration. --- ### Monitoring Overview No system is perfect. An API that you integrate with may go down for a period of time, you might encounter an unexpected edge-case when importing data, or a cosmic ray might [flip a bit](https://www.youtube.com/watch?v=o3Cx2wmFyQQ) in a computer! Whatever the case, rapid detection, alerting and resolution of issues is critical. 
The Prismatic platform provides three core monitoring capabilities for customer-deployed instances: 1. Comprehensive [logging](https://prismatic.io/docs/monitor-instances/logging.md) enables detailed analysis of each execution step. When anomalies occur, logs provide the primary source of information for understanding data flow and execution behavior within your instances. 2. Configurable [alerts](https://prismatic.io/docs/monitor-instances/alerting.md) notify your team (via webhook, email, or SMS) of specific events, such as execution failures or missed execution schedules. 3. Automated [retry](https://prismatic.io/docs/monitor-instances/retry-and-replay/automatic-retry.md) mechanisms handle transient issues with third-party applications without manual intervention. For integration logic issues, you can update the integration and [replay](https://prismatic.io/docs/monitor-instances/retry-and-replay/replaying-failed-executions.md) previously failed executions. --- ### Alerting Effective monitoring and alerting are crucial for maintaining reliable integrations. When issues occur - such as a dependent REST endpoint becoming unavailable or an integration's performance degrading significantly - your incident response team needs immediate notification. Proactive monitoring ensures that your team can identify and address integration issues before they impact your customers. With properly configured monitoring and alerting you can put your mind at ease - no news is good news! Prismatic alert monitors are configurable. * Choose from multiple alert triggers including elevated log levels, execution time thresholds, and failed executions * Notify your integration team through various channels: * Email and SMS notifications * Integration with services like Slack and PagerDuty * Custom webhook support for any notification system [How to Alert People When an Instance has an Error](https://player.vimeo.com/video/499651619) #### Terminology[​](#terminology "Direct link to Terminology") * An **alert group** defines a set of users and webhooks to notify when an instance exhibits noteworthy or unexpected behavior, such as execution failures. * An **alert trigger** specifies the conditions that initiate an alert monitor. Triggers can be configured for: * Performance issues (e.g., exceeded execution time thresholds) * Error conditions (e.g., error or warning log messages) * Status changes (e.g., successful runs or instance enablement) For a comprehensive list of available triggers, see [Alert Triggers](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md#alert-triggers). * An **alert monitor** combines an alert group with alert triggers for a specific [instance](https://prismatic.io/docs/instances.md). It defines which conditions should trigger alerts and which groups should be notified. * An **alert event** is generated when an alert trigger's conditions are met. For example, if an instance scheduled to run every 15 minutes fails, an alert event would notify the DevOps team. Subsequent failures would generate new events until the issue is resolved. --- ### Alert Groups #### Alert groups[​](#alert-groups "Direct link to Alert groups") Alert groups enable you to efficiently manage notifications by grouping users who should be notified when specific integrations encounter issues. Instead of configuring notifications individually for each integration, you can create an **alert group** and assign it to multiple alert monitors. 
This centralized approach simplifies user management - when you add a new team member, such as a DevOps engineer, you can add them to the relevant alert group, and they'll automatically receive notifications for all associated alert monitors. Alert groups can include both **organization** team members and **customer** users. For optimal organization, we recommend creating separate alert groups for: * Your internal teams (e.g., DevOps, Support) * Each customer who needs notifications This structure allows you to: * Attach your team's alert group(s) to all relevant alert monitors * Associate customer-specific alert groups only with monitors for their respective instances ##### Creating alert groups[​](#creating-alert-groups "Direct link to Creating alert groups") * Web App * CLI * API Click **Settings** on the left-hand sidebar, and select the **Alert Groups** tab. Click the **+ Add alert group** button on the upper-right and give your alert group a name (e.g., "Progix DevOps Team"). From there, you can enumerate users to be notified and webhooks to be invoked upon an alert being triggered. ![Create alert group in Prismatic app](/docs/img/monitor-instances/alerting/alert-groups/create-alert-group.png) Use the `alerts:groups:create` subcommand to create a new alert group. You can pass in JSON-formatted lists of user IDs and webhook IDs: ``` USER_IDS="[ \"$(prism organization:users:list --columns id --no-header --filter 'Email=edward.davis@progix.io')\", \"$(prism organization:users:list --columns id --no-header --filter 'Email=kristin.henry@progix.io')\", \"$(prism organization:users:list --columns id --no-header --filter 'Email=samantha.johnson@progix.io')\" ]" WEBHOOK_IDS="[\"$(prism alerts:webhooks:list --columns id --no-header --filter 'name=Devops Webhook')\"]" # Create an alert group to email your DevOps Team prism alerts:groups:create \ --name DevOps \ --users "${USER_IDS}" \ --webhooks "${WEBHOOK_IDS}" ``` You can take advantage of [jq](https://stedolan.github.io/jq/) to process JSON on the command line to simplify your user ID lookups. ``` # Create an alert group to alert customer users at FTL Rockets CUSTOMER_ID=$(prism customers:list --columns id --no-header --filter 'Name=^FTL Rockets$') CUSTOMER_USER_IDS=$( prism customers:users:list \ --customer $CUSTOMER_ID \ --output json \ --columns id | jq '[.[].id]') prism alerts:groups:create \ --name 'Customer - FTL Rockets' \ --users "${CUSTOMER_USER_IDS}" ``` To create an alert group, you will need to know the IDs of the users and webhooks that you would like to add to the group. 
You can look up user IDs grouped by customer name with this query: ``` query { customers { nodes { name users { nodes { id name } } } } } ``` Alert webhook IDs can be retrieved with this query: ``` query { alertWebhooks { nodes { id name } } } ``` Once you have user IDs and alert webhook IDs, create an alert group using the [createAlertGroup](https://prismatic.io/docs/api/schema/mutation/createAlertGroup.md) mutation: ``` mutation createAlertGroup($name: String!, $users: [ID], $webhooks: [ID]) { createAlertGroup(input: { name: $name, users: $users, webhooks: $webhooks }) { alertGroup { id } } } ``` Query Variables ``` { "name": "DevOps", "users": [ "VXNlcjplZTI3N2I4My0zOTBmLTQ3ODAtOGU4ZS1iYmNjOGY1Y2RlMTk=", "VXNlcjpiNmZmNDJhNS1mOTM3LTRlOWEtYWMyYi0yNjNjYTFiYjgzYjQ=" ], "webhooks": [ "QWxlcnRXZWJob29rOjA2NmJkN2Q1LThiYTgtNGJlMi1hM2MyLTE3NzFlMzY3NmI3Zg==" ] } ``` ##### Editing existing alert groups[​](#editing-existing-alert-groups "Direct link to Editing existing alert groups") To modify an existing alert group, return to the same screen you used to create it: click **Settings** on the left-hand sidebar, then select the **Alert Groups** tab. Click into an existing alert group. Within this screen, you can modify the name of the group by clicking the group's name at the top of the page. You can also modify the list of users and webhooks associated with the group. 
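The queries and mutation above can also be sent from a script instead of the API explorer. Below is a minimal sketch that lists alert webhook IDs; it assumes an API token in a `PRISMATIC_API_KEY` environment variable and the default `https://app.prismatic.io/api` endpoint (adjust for dedicated or white-labeled stacks).

```
// Minimal sketch: list alert webhook IDs via Prismatic's GraphQL API.
const query = `
  query {
    alertWebhooks {
      nodes {
        id
        name
      }
    }
  }
`;

const response = await fetch("https://app.prismatic.io/api", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.PRISMATIC_API_KEY}`,
  },
  body: JSON.stringify({ query }),
});

const { data } = await response.json();
console.log(data.alertWebhooks.nodes);
```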
Find the ID of the alert group you want to delete using ``` prism alerts:groups:list --extended ``` and then reference that ID using ``` prism alerts:groups:delete ${ALERT_GROUP_ID} ``` Delete an alert group using the [deleteAlertGroup](https://prismatic.io/docs/api/schema/mutation/deleteAlertGroup.md) mutation: ``` mutation { deleteAlertGroup( input: { id: "QWxlcnRHcm91cDo3MmU0OTMyNi1lMWYyLTRlNGEtYTNmZi00ZGIxZmY1NWViNmU=" } ) { alertGroup { id } } } ``` --- ### Alert Monitors #### Alert triggers[​](#alert-triggers "Direct link to Alert triggers") Many events can trigger an alert monitor: * **Execution Completed**: Triggers when an instance runs successfully * Use case: Notify customers of successful integration runs * **Execution Duration Matched or Exceeded**: Triggers when execution time exceeds a specified threshold * Example: Alert if an integration takes longer than 10 seconds when it typically takes 5 * **Execution Failed**: Triggers on instance execution failure * **Execution Failed, Retry Pending**: Triggers when an execution fails but is scheduled for retry * **Execution Overdue**: Triggers when an expected execution hasn't occurred within the specified interval * **Execution Started**: Triggers when an instance begins execution * **Instance Disabled**: Triggers when an instance is deactivated * **Instance Enabled**: Triggers when an instance becomes active * Use case: Notify project managers when an instance is ready for customer use * **Instance Removed**: Triggers when an instance is deleted * **Log Level Matched or Exceeded**: Triggers when logs meet or exceed specified severity levels * Monitors unexpected `error` or `warn` log entries * **Connection Threw an Exception**: Triggers on connection failures * Indicates expired credentials, invalid authentication, or API availability issues Note: Some triggers are instance-wide (like status changes), while others are flow-specific (like execution events). This allows for granular monitoring configuration through [alert monitors](#alert-monitors). **For More Information**: [Log Levels](https://prismatic.io/docs/monitor-instances/logging.md#log-levels) #### Alert monitors[​](#alert-monitors "Direct link to Alert monitors") An **alert monitor** is a combination of an [alert group](https://prismatic.io/docs/monitor-instances/alerting/alert-groups.md) (users and webhooks) and an [alert trigger](#alert-triggers) that is configured for an [instance](https://prismatic.io/docs/instances.md). When you add an alert monitor to an instance, you specify when the monitor should be triggered, and which alert group(s) should be notified in the event of a trigger firing. Alert monitors cannot be bound to preprocess flows Note that if your instances are configured to use a [shared endpoint](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#shared-endpoint-configuration) and a **preprocess flow**, an alert monitor cannot be assigned to the preprocess flow since the preprocess flow runs independently of any deployed instance. ##### Creating an alert monitor[​](#creating-an-alert-monitor "Direct link to Creating an alert monitor") * Web App * CLI * API After selecting an instance from a customer's **Instances** tab or the **Instances** link on the left-hand sidebar, click the instance's **Monitors** tab. Click the **+ Add alert monitor** button on the top-right of the screen. Specify a name for the monitor and select a trigger. If you are in a customer's **Instances** tab, you'll also need to specify the instance. 
![Create alert monitor in Prismatic app](/docs/img/monitor-instances/alerting/alert-monitors/create-alert-monitor.png) After creating the alert monitor you will find yourself in the monitor's **Details** tab. Within this tab, you can add additional triggers to your alert monitor within the **Triggers** card. You can also choose the groups or users to notify and webhooks to trigger when an alert trigger fires. ![Configure alert monitor in Prismatic app](/docs/img/monitor-instances/alerting/alert-monitors/configure-alert-monitor.png) To create an alert monitor you need to look up ID of the trigger you would like to create a monitor for, the instance's ID, and the ID of the group(s) to alert. Then, you can create a monitor using that trigger ID, instance ID, and group ID(s). ``` # Get Trigger ID prism alerts:triggers:list --extended --filter 'name=^Execution Failed$' Id Name ──────────────────────────────────────────────────────────────────── ──────────────── QWxlcnRUcmlnZ2VyOjQyYmM2MDY5LTE5YTktNDE1MS04ZjAwLTQ4ZWExN2E3MzZjMQ== Execution Failed # Get Alert Group ID prism alerts:groups:list --extended --filter 'name=^DevOps$' Id Name ──────────────────────────────────────────────────────────────── ────── QWxlcnRHcm91cDplMzcwMzY2OC0yZWM4LTQ0MWEtODdlYS02OGZjYTg1N2U5N2E= DevOps INSTANCE_ID=$(prism instances:list --columns id --filter 'name=^Fabricate 3D Model for FTL Rockets$' --no-header) prism alerts:monitors:create \ --groups "[\"QWxlcnRHcm91cDplMzcwMzY2OC0yZWM4LTQ0MWEtODdlYS02OGZjYTg1N2U5N2E=\"]" \ --name 'Alert Devops on Failure' \ --instance ${INSTANCE_ID} \ --triggers "[\"QWxlcnRUcmlnZ2VyOjQyYmM2MDY5LTE5YTktNDE1MS04ZjAwLTQ4ZWExN2E3MzZjMQ==\"]" ``` To create an alert monitor for an instance, you will need to query [alertTriggers](https://prismatic.io/docs/api/schema/query/alertTriggers.md) and select which types of triggers should result in an alert: ``` query listTriggers { alertTriggers { nodes { id name } } } ``` You will also need the ID of the instance you want to attach the alert to, and the IDs of any users or groups who should be notified. Then, use the [createAlertMonitor](https://prismatic.io/docs/api/schema/mutation/createAlertMonitor.md) mutation to create the alert monitor: ``` mutation ( $name: String! $instance: ID! $triggers: [ID]! $groups: [ID] $users: [ID] ) { createAlertMonitor( input: { name: $name instance: $instance triggers: $triggers groups: $groups users: $users } ) { alertMonitor { id } } } ``` Query Variables ``` { "name": "Alert Alex and DevOps on Execution Failure", "instance": "SW5zdGFuY2U6OTc1YzgyMTEtYTIxZi00OTg1LThhODYtMTUxMTczM2ZiYTJh", "triggers": [ "QWxlcnRUcmlnZ2VyOjhiOTg3YmYxLTk4YmMtNDViNy1hZDFkLTEwNWY0YTExZjdlOA==", "QWxlcnRUcmlnZ2VyOjdlOWEzMDA2LTQxODItNDQ0MC1iYzE2LTFiNjNjMzI2NzkwZA==" ], "groups": ["QWxlcnRHcm91cDo5MDQyYmM1ZC1hYTU5LTQ3Y2EtOWE4NC00NWIxNDBmZjYzYmQ"], "users": ["VXNlcjo4MzBjZTZmYS1iNDFlLTQ2MTQtODgzNi04NjA1MTcyY2IyOTc="] } ``` ##### Alerting on connection errors[​](#alerting-on-connection-errors "Direct link to Alerting on connection errors") You can set up an alert monitor to notify you if a connection in an instance becomes invalid (i.e. credentials expired or have been revoked, an API is down, etc). To alert on connection errors, create a new alert monitor and select **Connection Threw an Exception** as the trigger. This is especially useful with OAuth 2.0 connections. 
You can be alerted if refreshing your access key fails for any reason, and you will be directed straight to relevant logs from the alert message that is sent to you or your team members. ##### Editing existing alert monitors[​](#editing-existing-alert-monitors "Direct link to Editing existing alert monitors") To modify an existing alert monitor, click **Instances** on the left-hand sidebar and then select an instance. Under the instance's **Monitors** tab, select a monitor. This will bring you to the same screen you saw when you created the monitor, where you can modify who is notified and under what conditions in the **Details** tab. ##### Deleting an alert monitor[​](#deleting-an-alert-monitor "Direct link to Deleting an alert monitor") * Web App * CLI * API Click **Customers** from the left-hand sidebar and select a customer. Under the customer's **Instances** tab, select an instance and then click **Monitors**. Click into an alert monitor and open the **Details** tab. Scroll to the bottom of the page. Click **Delete Monitor** and confirm deletion by clicking **Remove monitor**. Find the ID of the alert monitor you would like to delete using ``` prism alerts:monitors:list --extended ``` and then delete the monitor with ``` prism alerts:monitors:delete ${ALERT_MONITOR_ID} ``` Delete an alert monitor with the [deleteAlertMonitor](https://prismatic.io/docs/api/schema/mutation/deleteAlertMonitor.md) mutation: ``` mutation { deleteAlertMonitor( input: { id: "QWxlcnRNb25pdG9yOjQ4ZjVkZjkzLWU3MTAtNGFmNi1iZmRmLWU5ZWM4MDAzYTAyOA==" } ) { alertMonitor { id } } } ``` --- ### Alert Webhooks #### Alert webhooks[​](#alert-webhooks "Direct link to Alert webhooks") Beyond email and SMS notifications, alert monitors can be configured to send HTTP requests to webhook endpoints with customizable payloads. Alert webhooks enable integration with incident management systems like PagerDuty or OpsGenie, custom DevOps alert endpoints, or any HTTP-based alerting service. ##### Creating alert webhooks[​](#creating-alert-webhooks "Direct link to Creating alert webhooks") * Web App * CLI * API To create or modify a webhook endpoint, navigate to the **Settings** page and select the **Alert Webhooks** tab. Click the **+ Add alert webhook** button to configure the webhook name, URL, and payload template. Alert webhooks are designed to be reusable across multiple alert monitors through configurable payload templates. 
In the **Payload Template** section, you can use predefined variables that are dynamically replaced when an alert monitor triggers: * `$SUBJECT` - The static string "Prismatic.io Alert" * `$NAME` - The name of the [alert monitor](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md) that triggered * `$INSTANCE` - The name of the instance associated with the triggered alert monitor * `$INSTANCE_ID` - The global identifier of the instance (the `SW5z....` portion of the instance URL) * `$EXECUTION_ID` - The global identifier of the execution * `$CUSTOMER` - The name of the customer to whom the instance is deployed * `$CUSTOMER_EXTERNAL_ID` - The [external ID](https://prismatic.io/docs/customers/managing-customers.md#customer-external-ids) of the customer to whom the instance is deployed * `$FLOW` - The name of the flow that was executing when the alert monitor triggered * `$TRIGGER` - The name of the [alert trigger](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md#alert-triggers) (e.g., "Execution Failed") * `$STEP` - The name of the step within the integration that triggered the alert monitor * `$URL` - A direct link to the triggered alert monitor ![Configure alert webhook in Prismatic app](/docs/img/monitor-instances/alerting/alert-webhooks/alert-webhook.png) After creating the alert webhook, you can update the name, URL, or payload template, and optionally configure HTTP headers. Headers are commonly used for authentication, such as passing an API token to the webhook endpoint. ![Configure HTTP headers for alert webhook in Prismatic app](/docs/img/monitor-instances/alerting/alert-webhooks/alert-webhook-headers.png) To create an alert webhook, use the `alerts:webhooks:create` subcommand: ``` prism alerts:webhooks:create \ --name 'Devops Webhook' \ --headers '{"Authorization": "Bearer abc123"}' \ --payloadTemplate 'The instance "$INSTANCE" was triggered by "$TRIGGER" on monitor "$NAME". It appears that step "$STEP" generated an error. Respond by visiting $URL.' \ --url https://devops.progix.io/alerts/webhook ``` To create an alert webhook, use the [createAlertWebhook](https://prismatic.io/docs/api/schema/mutation/createAlertWebhook.md) mutation: ``` mutation { createAlertWebhook( input: { name: "Devops Webhook" url: "https://devops.progix.io/alerts/webhook" headers: "{\"Authorization\": \"Bearer abc123\"}" payloadTemplate: "The instance \"$INSTANCE\" was triggered by \"$TRIGGER\" on monitor \"$NAME\". It appears that step \"$STEP\" generated an error. Respond by visiting $URL." } ) { alertWebhook { id } } } ``` ##### Editing existing alert webhooks[​](#editing-existing-alert-webhooks "Direct link to Editing existing alert webhooks") To modify an existing alert webhook, navigate to **Settings** in the left-hand sidebar and select the **Alert Webhooks** tab. Select the webhook to edit. You can update the webhook name by clicking the name at the top of the page. In the **Details** tab, you can modify the webhook template, payload template, or URL, and configure optional HTTP headers for authentication or other requirements. ![Edit alert webhook configuration in Prismatic app](/docs/img/monitor-instances/alerting/alert-webhooks/edit-webhook.png) ##### Deleting alert webhooks[​](#deleting-alert-webhooks "Direct link to Deleting alert webhooks") * Web App * CLI * API To delete an alert webhook, navigate to the **Settings** page from the left-hand sidebar. Select the **Alert Webhooks** tab and choose the webhook to delete. 
On the webhook's page, click **Delete Alert Webhook**. Confirm the deletion by clicking **Remove alert webhook**. Retrieve the webhook's ID with: ``` prism alerts:webhooks:list --extended ``` Then delete the webhook using: ``` prism alerts:webhooks:delete ${WEBHOOK_ID} ``` Delete an alert webhook using the [deleteAlertWebhook](https://prismatic.io/docs/api/schema/mutation/deleteAlertWebhook.md) mutation: ``` mutation { deleteAlertWebhook( input: { id: "QWxlcnRXZWJob29rOjczOGNiNTM2LWFhMGMtNGUwNS05ZTBmLTQ5ZDMzZDE5ODYwNA==" } ) { alertWebhook { id } } } ``` --- ### Creating Alert Monitors Programmatically #### Programmatically creating alert monitors[​](#programmatically-creating-alert-monitors "Direct link to Programmatically creating alert monitors") [Alert monitors](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md) enable automated notifications when specific events occur within an instance's flow execution. Most commonly, alert monitors are used to notify you when an execution fails. This guide demonstrates how to programmatically create alert monitors across all instances. An example script that creates alert monitors for all flows of all instances is available in the [examples repository](https://github.com/prismatic-io/examples/tree/main/api/create-alert-monitors). ##### List instances programmatically[​](#list-instances-programmatically "Direct link to List instances programmatically") First, retrieve all customer instances, including their flows, associated customers, and existing monitors. The following GraphQL query provides this information: ``` query myGetInstancesQuery($cursor: String) { instances( isSystem: false enabled: true sortBy: { direction: ASC, field: CREATED_AT } after: $cursor ) { nodes { id name flowConfigs { nodes { id flow { name } monitors { nodes { id name groups { nodes { id } } } } } } customer { id name } } pageInfo { hasNextPage endCursor } } } ``` Note three important details about this query: * `isSystem: false` excludes test instances used in the integration designer * `enabled: true` filters for currently active instances * The combination of `sortBy { direction: ASC, field: CREATED_AT }`, `after: $cursor`, and `pageInfo` enables [pagination](https://prismatic.io/docs/api/pagination.md) through results If you'd like to see an example of how to paginate through results, check out the [example script](https://github.com/prismatic-io/examples/blob/main/api/create-alert-monitors/queries/get-instances.ts) which implements the query above. ##### Fetch alert trigger information[​](#fetch-alert-trigger-information "Direct link to Fetch alert trigger information") Next, retrieve information about the desired alert trigger. Alert monitors can be [triggered](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md#alert-triggers) by several events, such as execution failures, successful executions, or execution duration thresholds. Query available alert triggers using: ``` { alertTriggers { nodes { id name } } } ``` The returned `id` is required for creating the alert monitor. An example of this query is available in the [example script](https://github.com/prismatic-io/examples/blob/main/api/create-alert-monitors/queries/get-alert-trigger.ts). ##### Fetch notification recipient information[​](#fetch-notification-recipient-information "Direct link to Fetch notification recipient information") To configure user notifications, retrieve the recipient's information. 
The user must be registered in your Prismatic organization to receive email notifications. Query a user by email address: ``` query myGetUsersByEmail($email: String!) { users(email: $email) { nodes { id name email } } } ``` Note that this query may return zero or one users - you'll need to check the length of the `nodes` array to determine if a user was found, like the example script does [here](https://github.com/prismatic-io/examples/blob/main/api/create-alert-monitors/queries/get-user.ts). The user's `id` is required for alert monitor creation. ##### Create the alert monitor[​](#create-the-alert-monitor "Direct link to Create the alert monitor") Finally, iterate through instances and their flows to create alert monitors. For each flow, check for existing monitors before creating a new one using the `createAlertMonitor` mutation: ``` mutation myCreateAlertMonitor( $name: String! $instanceId: ID! $flowConfigId: ID! $triggerId: ID! $userId: ID! ) { createAlertMonitor( input: { name: $name instance: $instanceId flowConfig: $flowConfigId triggers: [$triggerId] users: [$userId] } ) { alertMonitor { id } errors { field messages } } } ``` Required mutation parameters: * `name`: Alert monitor identifier. Use a consistent naming pattern (e.g., `[Generated] Alert on Error - FLOW NAME`) to ensure idempotency * `instanceId`: Target instance ID from the instance listing * `flowConfigId`: Target flow ID from the instance listing * `triggerId`: Alert trigger ID from the trigger listing * `userId`: Notification recipient ID from the user query Handle potential errors by checking the `errors` field in the response. See the [example implementation](https://github.com/prismatic-io/examples/blob/main/api/create-alert-monitors/queries/create-alert-monitor.ts) for error handling. --- ### Responding to Alert Events #### Alert events[​](#alert-events "Direct link to Alert events") An **alert event** is generated when an alert monitor is triggered. Upon event creation, users in the monitor's associated alert groups receive notifications (email or SMS) containing a direct link to the event. Team members can acknowledge and indicate active resolution of the issue by **clearing** the alert event. ##### Viewing alert events[​](#viewing-alert-events "Direct link to Viewing alert events") * Web App * CLI * API The most direct method to access an alert event is through the link provided in the notification email or SMS. Alternatively, navigate to the **Instances** section via the left-hand sidebar to view all instances. Each instance displays an indicator in the lower-right corner when alert monitors are triggered but not yet cleared. ![Alert monitor status indicator on instance in Prismatic app](/docs/img/monitor-instances/alerting/responding-to-alert-events/triggered-alert-monitor.png) Select an instance with triggered monitors and navigate to the **Monitors** tab to view active alerts. ![Instance monitors with active alerts highlighted in Prismatic app](/docs/img/monitor-instances/alerting/responding-to-alert-events/list-instance-monitors.png) Select a triggered monitor to access its **Details** tab, then navigate to the **Events** tab for specific event information. Selecting an individual alert event displays relevant logs from the time period surrounding the event at the bottom of the page. 
![Contextual logs for alert event in Prismatic app](/docs/img/monitor-instances/alerting/responding-to-alert-events/alert-event-logs.png) To view alert events, use the `alerts:events:list` subcommand with the target alert monitor's ID: ``` prism alerts:monitors:list --extended Id Name Triggered ──────────────────────────────────────────────────────────────────── ────────────────────────────────── ───────── QWxlcnRNb25pdG9yOmQyM2NlOGZlLTZiMzktNGFkNy1hMGM1LTFlMTRjMjY4MTI1Mg== Alert Project Managers on Enabling true QWxlcnRNb25pdG9yOmQ0MTM3N2M5LWE1NTItNDJjNi04ZWYwLWNiY2ZkM2E2ODMxYg== Alert Devops false prism alerts:events:list QWxlcnRNb25pdG9yOmQ0MTM3N2M5LWE1NTItNDJjNi04ZWYwLWNiY2ZkM2E2ODMxYg== ``` Query [alertEvents](https://prismatic.io/docs/api/schema/query/alertEvents.md) to list alert events for a specific alert monitor. **For More Information:** [Log Retention](https://prismatic.io/docs/monitor-instances/logging.md#log-retention) #### Clearing a triggered alert monitor[​](#clearing-a-triggered-alert-monitor "Direct link to Clearing a triggered alert monitor") [How to Respond to an Alert Message](https://player.vimeo.com/video/500203509) When multiple team members receive alert notifications, it's crucial to track the event's resolution status. Clearing an alert monitor indicates acknowledgment of the event and active resolution efforts. Navigate to the **Monitors** section via the left-hand sidebar. Select one or more triggered monitors. Click the icon to clear the selected events. ![Clear selected alert events in Prismatic app](/docs/img/monitor-instances/alerting/responding-to-alert-events/clear-events.png) --- ### Sending Alerts to PagerDuty PagerDuty is a leading incident response platform that helps teams manage and track production issues effectively. Prismatic's alert webhooks can be sent to PagerDuty using [PagerDuty's Events API](https://developer.pagerduty.com/docs/events-api-v2/trigger-events/) to automatically create and manage incidents. 1. Create a new alert webhook in Prismatic 2. Set the webhook URL to: `https://events.pagerduty.com/v2/enqueue` 3. Configure the payload template with the following JSON: ``` { "routing_key": "YOUR-PAGERDUTY-KEY", "event_action": "trigger", "links": [{ "href": "$URL", "text": "Link to Prismatic alert monitor" }], "payload": { "summary": "$NAME triggered - $INSTANCE failed to run.", "severity": "error", "source": "$SUBJECT" } } ``` * Additional fields can be added to the payload template as documented in the [PagerDuty API documentation](https://developer.pagerduty.com/docs/events-api-v2/trigger-events/) * No additional headers are required as the PagerDuty integration key is included in the payload * Each alert monitor trigger will create a corresponding incident in PagerDuty ![Sample alert details in PagerDuty app](/docs/img/monitor-instances/alerting/pagerduty.png) --- ### Sending Alerts to Slack [How to Send Alerts to Slack](https://player.vimeo.com/video/500205967) Many operations teams use Slack to notify themselves of production issues. Prismatic alert webhooks can be configured to send messages to a Slack channel. To send alerts as messages to Slack, first generate a new Slack webhook: 1. Visit [https://api.slack.com/apps](https://api.slack.com/apps) 2. Click **Create New App** and select your workspace 3. Under **Add features and functionality**, select **Incoming Webhooks** 4. Toggle **Activate Incoming Webhooks** to enable the feature 5. Click **Add New Webhook to Workspace** and select your target channel 6. 
Copy the generated Webhook URL (format: `https://hooks.slack.com/services/foo/bar/baz`) Next, create a new alert webhook in Prismatic to send notifications to Slack. 1. Create a new alert webhook in Prismatic 2. Enter the Slack webhook URL from Step 1 3. Configure the payload template with the following JSON: ``` { "text": "$NAME triggered - $INSTANCE failed to run. See $URL" } ``` No additional headers are required for Slack integration. Once configured, any alert monitor using this webhook will automatically send notifications to your specified Slack channel. ![Sample channel with alert details in Slack app](/docs/img/monitor-instances/alerting/slack.png) --- ### Instance Executions When a flow is triggered, an execution is initiated. An execution represents a single run of a flow. Executions can be [triggered](https://prismatic.io/docs/integrations/triggers.md) by multiple event types: 1. Scheduled runs via [schedule triggers](https://prismatic.io/docs/integrations/triggers/schedule.md) 2. Webhook invocations ([webhook triggers](https://prismatic.io/docs/integrations/triggers/webhook.md)) 3. Instance deployment or removal events ([deployment triggers](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger)) A flow within an instance may be triggered concurrently, resulting in multiple simultaneous executions. Flows can also invoke other flows within the same instance by calling sibling flows' webhook URLs. Each invocation is a distinct execution. If an execution fails, or if you need to re-run an execution, you can [replay](https://prismatic.io/docs/monitor-instances/retry-and-replay.md) previous executions. #### Viewing execution step results[​](#viewing-execution-step-results "Direct link to Viewing execution step results") For debugging and analysis, you can review the results of instance executions. Navigate to the **Executions** tab on an instance's page to view logs and step outputs for each execution. Alternatively, view executions for all instances by selecting **Executions** from the left-hand sidebar. ![Instance execution results in Prismatic app](/docs/img/executions/execution-results.png) If an instance fails to complete successfully, you can inspect the input data provided at invocation to assist with debugging. Execution results for all instances and customers are accessible via the **Executions** link in the sidebar. For a specific customer, navigate to their **Executions** tab. #### Fetching step results from the API[​](#fetching-step-results-from-the-api "Direct link to Fetching step results from the API") Step results are available via the Prismatic GraphQL API using the `executionResult` query. Results are serialized with [MessagePack](https://msgpack.org/index.html) and can be deserialized using the MessagePack library for your preferred language. For more information: [Fetching and Unpacking Step Results](https://prismatic.io/docs/api/common-queries/fetching-step-results.md) #### Viewing execution logs[​](#viewing-execution-logs "Direct link to Viewing execution logs") Instance logs are accessible from the **Logs** tab on the instance's page. You can also view logs for all instances via the **Logs** link in the sidebar, or for a specific customer by selecting their **Logs** tab. Use the **Search Logs** bar at the top of the page to search log messages. Filter logs by severity or date range using the **Filter** link to the right of the search bar. 
![Filter instance logs in Prismatic app](/docs/img/executions/instance-logs.png) **For More Information**: [Logging](https://prismatic.io/docs/monitor-instances/logging.md) --- ### Logging Comprehensive log access is essential for building, deploying, and supporting integrations. When an [alert monitor](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md) notifies your team of unexpected instance behavior, detailed logs provide insight into execution timing, step status, and error details. Prismatic offers access to logs for all instance invocations and test runs. You can also [stream logs](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-externally.md) to an external logging system for centralized analysis. #### Log retention[​](#log-retention "Direct link to Log retention") Logs and step results are retained for 14 days before automatic deletion. ##### Disabling logs and step results[​](#disabling-logs-and-step-results "Direct link to Disabling logs and step results") Organizations may need to disable log and step result storage for compliance reasons. To discuss retention policy adjustments, contact [support](mailto:support@prismatic.io). When storage is disabled, log and step result data is neither persisted in Prismatic's database nor available in the web app. If your organization has custom retention policies, a toggle will appear in the instance configuration wizard to disable storage for specific instances. ![Disable logs and step results in Prismatic app](/docs/img/monitor-instances/logging/disable-logs-and-step-results.png) #### Viewing logs for all customers[​](#viewing-logs-for-all-customers "Direct link to Viewing logs for all customers") To view logs for all instances across all customers, select **Logs** from the left-hand sidebar. Displayed columns include log **messages**, **timestamps** (in your local time), **instance** name, **integration** name, and **customer** name. #### Viewing logs for a specific customer[​](#viewing-logs-for-a-specific-customer "Direct link to Viewing logs for a specific customer") To view logs for a specific customer, select **Customers** in the sidebar, choose a customer, and click the **Logs** tab. Displayed columns include log **messages**, **timestamps** (in your local time), **instance** name, and **integration** name. **For More Information**: [Customers](https://prismatic.io/docs/customers.md) #### Viewing logs for a specific instance[​](#viewing-logs-for-a-specific-instance "Direct link to Viewing logs for a specific instance") To view logs for a specific instance: 1. Click **Instances** in the sidebar and select an instance, or 2. Click **Customers**, select a customer, and choose an instance under the **Instances** tab. Once viewing an instance, select the **Logs** tab. Displayed columns include log **messages**, **timestamps** (in your local time), **integration** name, and **customer** name. **For More Information**: [Instances](https://prismatic.io/docs/instances.md) #### Searching and filtering logs[​](#searching-and-filtering-logs "Direct link to Searching and filtering logs") Search log messages using the **Search Logs** bar at the top of any log page. For detailed information about a specific log entry, click the log line to display an information panel at the bottom of the screen. ![Customer log details in Prismatic app](/docs/img/monitor-instances/logging/log-line-more-info.png) Filter logs using the **Filter** dropdown to the right of the search bar. 
Filter by: * **Log Type** (execution, connection, data source, or trigger logs) * **Time range** * **Log Severity** (Error, Warn, Info, Debug) * **Flow** ![Filter customer logs in Prismatic app](/docs/img/monitor-instances/logging/filter-customer-logs.png) #### Viewing connection logs[​](#viewing-connection-logs "Direct link to Viewing connection logs") Connections generate logs during testing in the integration designer and when used in deployed instances. If a connection encounters an error (e.g., expired credentials), it is recorded in the connection's logs. To view a connection's logs, click the log icon next to the connection. ![Connection logs in Prismatic app](/docs/img/monitor-instances/logging/connection-log-button.png) Click any log line in the resulting popover to view more details. #### Viewing data source config variable logs[​](#viewing-data-source-config-variable-logs "Direct link to Viewing data source config variable logs") [Data sources](https://prismatic.io/docs/integrations/data-sources.md) fetch data from third-party APIs and present it in the [config wizard](https://prismatic.io/docs/integrations/config-wizard.md). Data source logs are not tied to specific executions. Organization users can view data source logs by clicking the log icon near the data source config variable. **Note**: This icon is not available to customer users configuring integrations in your [embedded marketplace](https://prismatic.io/docs/embed/marketplace.md). ![Data source logs in Prismatic app](/docs/img/monitor-instances/logging/data-source-logs.png) Data source logs are also available with their associated config variables in the **Test Configuration** drawer under **Logs**. ![Test configuration drawer logs in Prismatic app](/docs/img/monitor-instances/logging/test-configuration-logs-drawer.png) #### Viewing trigger lifecycle logs[​](#viewing-trigger-lifecycle-logs "Direct link to Viewing trigger lifecycle logs") Most trigger functions run as part of an execution - receiving a webhook request and returning a value, or running on a schedule. Some [trigger lifecycle functions](https://prismatic.io/docs/custom-connectors/triggers.md#instance-deploy-and-delete-events-for-triggers) (such as `onInstanceDeploy` and `onInstanceDelete`) execute when instances are created or deleted. These function logs are available in the **Test Configuration** drawer under **Logs**. ![Test configuration drawer logs in Prismatic app](/docs/img/monitor-instances/logging/test-configuration-logs-drawer.png) #### What gets logged?[​](#what-gets-logged "Direct link to What gets logged?") When a component calls `context.logger.{debug,info,warn,error}()`, the log entry is saved in Prismatic's logging system. 
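As a rough illustration, a custom component action written with the Spectral SDK might emit its own log lines like this (the action name and input are made up for the example):

```
import { action, input } from "@prismatic-io/spectral";

// Illustrative action that writes its own log lines while it runs.
export const importRecords = action({
  display: {
    label: "Import Records",
    description: "Import records and log progress",
  },
  inputs: {
    records: input({ label: "Records", type: "string", required: true }),
  },
  perform: async (context, params) => {
    context.logger.info("Starting record import");
    try {
      // ... call the third-party API and process params.records here ...
      context.logger.debug("Record import finished without warnings");
      return { data: params.records };
    } catch (err) {
      context.logger.error(`Record import failed: ${err}`);
      throw err;
    }
  },
});
```

Each of these `context.logger` calls appears alongside the standard log lines described below.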
In addition to component-generated logs, the following standard log types are recorded: | Type | Example | Purpose | Log Level | | -------------- | ----------------------------------- | -------------------------------------------------------- | --------- | | Instance Start | Starting Instance 'Sample Instance' | Marks the beginning of an instance run | info | | Instance End | Ending Instance 'Sample Instance' | Indicates successful instance completion | info | | Step Started | Fetch file from Dropbox | Shows the name of the step being executed | info | | Step Failed | `{{ ERROR MESSAGE }}` | Indicates step failure with the associated error message | error | **For More Information**: [`context.logger`](https://prismatic.io/docs/custom-connectors/actions.md#logger-object) #### Log levels[​](#log-levels "Direct link to Log levels") Prismatic uses four log levels: `debug`, `info`, `warn`, and `error`. Each level is visually distinguished: * `debug`: green icons * `info`: gray icons * `warn`: yellow icons * `error`: red icons ![Log levels illustrated and explained](/docs/img/monitor-instances/logging/levels-logs.png) --- ### Streaming Logs Externally [Streaming Prismatic Logs to DataDog](https://player.vimeo.com/video/894997344) #### External log streaming[​](#external-log-streaming "Direct link to External log streaming") Feature Availability The external log streaming feature is available to customers on some pricing plans. Refer to your contract, or contact Prismatic support for details. Prismatic can stream logs to external logging services (such as [DataDog](https://www.datadoghq.com/), [New Relic](https://newrelic.com/)), or to your own logging infrastructure. Most logging services accept HTTP POST requests with JSON payloads containing log data. To configure log streaming in Prismatic, specify the destination URL, payload format, and any required headers (e.g., authorization or API keys). To set up external log streaming, open **Settings** in the sidebar and select the **Log Streams** tab. Create a new log stream by clicking **+ Log stream**. ![Add log stream in Prismatic app](/docs/img/monitor-instances/logging/streaming-logs-externally/add-log-stream.png) Enter the destination URL and add any required headers (such as API keys or authorization tokens). ![Configure log streaming to external service in Prismatic app](/docs/img/monitor-instances/logging/streaming-logs-externally/url-and-headers.png) Next, define a log message template. This template determines the structure of the log message sent to your logging service. You can include placeholders for log content and metadata about the instance, customer, flow, and step. 
![Create log message template in Prismatic app](/docs/img/monitor-instances/logging/streaming-logs-externally/sample-template.png) The following placeholders are available and will be replaced with actual values when a log message is sent: | Placeholder | Description | Example | | ------------------------------ | -------------------------------------------------------------------------------------------------------------- | ------------------------------------------ | | `{{ timestamp }}` | Timestamp in milliseconds since epoch | 1637087683123 | | `{{ timestamp_s }}` | Timestamp in seconds since epoch | 1637087683 | | `{{ timestamp_ns }}` | Timestamp in nanoseconds since epoch | 1637087683123000000 | | `{{ timestamp_iso }}` | Timestamp in ISO format | "2021-11-16T18:34:43.123Z" | | `{{ message }}` | Full log message | "This is a test" | | `{{ severity }}` | Log level (debug, info, warn, error, metric) | "warn" | | `{{ severityNumber }}` | [Syslog severity level](https://en.wikipedia.org/wiki/Syslog#Severity_level) | 4 | | `{{ instanceId }}` | Global ID of the instance | "SW5zdEXAMPLE" | | `{{ instanceName }}` | Name of the instance | "Update Inventory" | | `{{ instanceLabels }}` | Labels assigned to the instance | \["label1", "label2"] | | `{{ flowConfigId }}` | Global ID of the instance's configured flow | "SW5zdEXAMPLE" | | `{{ integrationId }}` | Global ID of the deployed integration version | "SW5zdEXAMPLE" | | `{{ integrationName }}` | Name of the integration | "Update Inventory" | | `{{ logType }}` | Log type: "DATA\_SOURCE", "CONNECTION", "EXECUTION", or "MANAGEMENT" | "EXECUTION" | | `{{ flowId }}` | Global ID of the deployed integration flow | "SW5zdEXAMPLE" | | `{{ flowName }}` | Name of the integration flow | "Remove inventory after order fulfillment" | | `{{ stepName }}` | Name of the step, if available | "Loop over order items" | | `{{ isTestExecution }}` | Indicates if the log is from a test in the integration designer | true | | `{{ executionId }}` | Global ID of the execution | "SW5zdEXAMPLE" | | `{{ customerExternalId }}` | [External ID](https://prismatic.io/docs/customers/managing-customers.md#customer-external-ids) of the customer | "abc-123" | | `{{ customerName }}` | Name of the customer | "Acme Corp" | | `{{ executionErrorStepName }}` | Name of the step that resulted in an execution error | "Loop over order items" | | `{{ durationMS }}` | Duration in milliseconds of the execution | "1000" | | `{{ succeeded }}` | Whether the step or execution succeeded | "true" | | `{{ errorMessage }}` | Error message for the step or execution | "This is an error" | | `{{ retryAttemptNumber }}` | Number of retry attempts for the step or execution | "0" | | `{{ retryForExecutionId }}` | Global ID of the original execution in case of retry | "SW5zdEXAMPLE" | This template is compatible with most logging platforms, but you may need to adjust it for your specific requirements. 
###### Default message template[​](#default-message-template "Direct link to Default message template") ``` { "message": {{ message }}, "timestamp": {{ timestamp }}, "severity": {{ severity }}, "service": "Prismatic", "instance": {{ instanceName }}, "customer": {{ customerExternalId }}, "integration": {{ integrationName }}, "logType": {{ logType }}, "isTestExecution": {{ isTestExecution }}, "flow": {{ flowName }}, "step": {{ stepName }}, "executionid": {{ executionId }}, "instanceId": {{ instanceId }}, "flowConfigId": {{ flowConfigId }}, "integrationId": {{ integrationId }}, "flowId": {{ flowId }}, "executionErrorStepName": {{ executionErrorStepName }}, "duration": {{ durationMS }}, "succeeded": {{ succeeded }}, "errorMessage": {{ errorMessage }}, "retryAttempt": {{ retryAttemptNumber }}, "retryForExecutionId": {{ retryForExecutionId }} } ``` ##### Testing log streaming[​](#testing-log-streaming "Direct link to Testing log streaming") After saving your configuration, test your external logging setup by clicking the **Test payload** button at the top right of the log stream screen. This sends a test log message to your external logging system, substituting test values (e.g., "Test message", "Test integration") into your template. **Note**: If your logging provider enforces CORS and blocks logs sent directly from the browser, the **Test payload** button may not work. In this case, save your configuration and run a test integration; logs from the execution will be sent to your external provider. ##### Logging metrics to an external service[​](#logging-metrics-to-an-external-service "Direct link to Logging metrics to an external service") In addition to log lines, you can use [`context.logger`](https://prismatic.io/docs/custom-connectors/actions.md#logger-object) to emit objects containing metrics for external streaming. For example, a code component can include: ``` logger.metric({ inventoryItem: { id: "123", price: 10.55, quantity: 3 } }); ``` Your external streaming configuration can extract attributes from the object passed to `metric()`. For example: ``` { "message": {{ message }}, "timestamp": {{ timestamp }}, "severity": {{ severity }}, "itemId": {{ inventoryItem.id }}, "itemPrice": {{ inventoryItem.price }}, "itemQuantity": {{ inventoryItem.quantity }} } ``` When a metric log contains `inventoryItem.id`, those attributes are included in the payload sent to the logging system. Messages without these fields simply omit them. info When `logger.metric()` is called, `{{ message }}` is the [stringified](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify) JSON version of the object, and `{{ level }}` is set to `99`. --- ### Streaming Logs to DataDog [DataDog](https://www.datadoghq.com/) is an application monitoring platform and logging system. To stream logs to DataDog, first [generate an API key](https://docs.datadoghq.com/account_management/api-app-keys/#add-an-api-key-or-client-token). Be sure to generate an API key (not an application key). Set the endpoint as specified in the below table, and add a header named `DD-API-KEY` and provide your API key. 
#### DataDog Log Intake Endpoints by Region[​](#datadog-log-intake-endpoints-by-region "Direct link to DataDog Log Intake Endpoints by Region") | Region | Endpoint URL | | ------------ | -------------------------------------------------------- | | US (default) | `https://http-intake.logs.datadoghq.com/api/v2/logs` | | US3 | `https://http-intake.logs.us3.datadoghq.com/api/v2/logs` | | US5 | `https://http-intake.logs.us5.datadoghq.com/api/v2/logs` | | EU | `https://http-intake.logs.datadoghq.eu/api/v2/logs` | | AP1 | `https://http-intake.logs.ap1.datadoghq.com/api/v2/logs` | | US1-FED | `https://http-intake.logs.ddog-gov.com/api/v2/logs` | > [See DataDog docs for the latest endpoints.](https://docs.datadoghq.com/api/latest/logs/#endpoints) ![Configure DataDog log streaming in Prismatic app](/docs/img/monitor-instances/logging/streaming-logs-to-datadog/datadog-filled-in.png) The default message template ([see here](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-externally.md#default-message-template)) is compatible with DataDog, but you may customize it to match your attribute naming conventions. Once configured, logs from all enabled customer instances and all test runs in the integration designer will be streamed to DataDog: ![List of logs in DataDog app](/docs/img/monitor-instances/logging/streaming-logs-to-datadog/datadog.png) Testing Payloads with DataDog For security reasons, DataDog prohibits sending logs directly from your web browser when using an API key. As a result, the **TEST PAYLOAD** button does not work with DataDog. Rest assured, you should see instance logs in DataDog as long as your API key is valid. --- ### Streaming Logs to Google Cloud Streaming logs to Google Cloud Logging requires a service account and mapping Prismatic's severity levels to Google's (`WARNING` vs `warn`). The recommended approach is to send logs to a Google Cloud Function, which then writes to Cloud Logging. 1. In [Google Cloud IAM](https://console.cloud.google.com/iam-admin/serviceaccounts), create a new service account. Grant it the **Logging Admin** role via the [IAM dashboard](https://console.cloud.google.com/iam-admin/iam). 2. Create a [Google Cloud Function](https://console.cloud.google.com/functions/) and assign the service account. Add [@google-cloud/logging](https://www.npmjs.com/package/@google-cloud/logging) as a dependency in `package.json`: ``` { "dependencies": { "@google-cloud/functions-framework": "^3.0.0", "@google-cloud/logging": "11.2.0" } } ``` Its `index.js` can read something like this (replace `PROJECT_ID`): ``` const functions = require("@google-cloud/functions-framework"); const { Logging } = require("@google-cloud/logging"); const PROJECT_ID = "INSERT YOUR PROJECT ID HERE"; const LOG_NAME = "prismatic"; const SEVERITY_MAP = { debug: "DEBUG", info: "INFO", warn: "WARNING", error: "ERROR", }; functions.http("helloHttp", async (req, res) => { // Creates a client const logging = new Logging({ projectId: PROJECT_ID }); // Selects the log to write to const log = logging.log(LOG_NAME); const { timestamp, severity, message, ...rest } = req.body; // Labels must be strings const labels = Object.entries(rest).reduce( (acc, [key, value]) => ({ [key]: value === null ? 
"" : `${value}`, ...acc, }), {}, ); // The metadata associated with the entry const metadata = { resource: { type: "global" }, severity: SEVERITY_MAP[severity], timestamp, labels, }; // Prepares a log entry const entry = log.entry(metadata, message); await log.write(entry); res.send({ success: true }); }); ``` 3. Deploy the function and note its URL (e.g., `https://us-central1-your-project-12345.cloudfunctions.net/your-function`). 4. In Prismatic, configure an external log stream to point to your function's URL. Ensure the timestamp is sent in ISO format. Use the following payload template: ``` { "message": {{ message }}, "timestamp": {{ timestamp_iso }}, "severity": {{ severity }}, "instance": {{ instanceName }}, "customer": {{ customerExternalId }}, "integration": {{ integrationName }}, "isTestExecution": {{ isTestExecution }}, "flow": {{ flowName }}, "step": {{ stepName }}, "executionId": {{ executionId }}, "instanceId": {{ instanceId }}, "flowConfigId": {{ flowConfigId }}, "integrationId": {{ integrationId }}, "flowId": {{ flowId }}, "executionErrorStepName": {{ executionErrorStepName }}, "duration": {{ durationMS }}, "succeeded": {{ succeeded }}, "errorMessage": {{ errorMessage }}, "retryAttempt": {{ retryAttemptNumber }}, "retryForExecutionId": {{ retryForExecutionId }} } ``` ![](/docs/img/monitor-instances/logging/streaming-logs-to-google-cloud/google-logging.png) --- ### Streaming Logs to New Relic [New Relic](https://newrelic.com/) is an application monitoring platform and logging service. To stream logs to New Relic, first [generate an API key](https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/#ingest-license-key). Ensure you create an **INGEST - LICENSE** key. Next, create a new external log stream in Prismatic. * For US-hosted data, use the endpoint: `https://log-api.newrelic.com/log/v1` * For EU-hosted data, use: `https://log-api.eu.newrelic.com/log/v1` Add a header named `X-License-Key` and provide the license key you generated. **Note**: When copying your key, select "Copy key" (not "key ID"). ![Configure New Relic log streaming in Prismatic app](/docs/img/monitor-instances/logging/streaming-logs-to-new-relic/new-relic-filled-in.png) The default message template ([see here](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-externally.md#default-message-template)) is compatible with New Relic, but you may customize it to match your attribute naming conventions. Once configured, logs from all enabled customer instances and all test runs in the integration designer will be streamed to New Relic: ![List of logs in New Relic app](/docs/img/monitor-instances/logging/streaming-logs-to-new-relic/newrelic.png) --- ### Execution Retry and Replay Because external systems are not always available, it is important to [implement retry functionality](https://prismatic.io/docs/monitor-instances/retry-and-replay/automatic-retry.md) in your integrations where appropriate. Retry enables the system to automatically resend payloads to destination systems at regular intervals or using exponential backoff (e.g., retrying after 2 minutes, then 4, then 8, etc.). Implementing robust retry logic helps ensure your integration succeeds even if the destination system experiences a temporary outage. By leveraging retry, you can avoid involving your development team in integration issues until you have confirmed the problem is more serious than a transient outage. This prevents many ephemeral issues from reaching your support team. 
However, if errors persist (for example, if the destination system appears operational but the integration continues to return internal server errors), you may need to escalate [from retry to replay functionality](https://prismatic.io/docs/monitor-instances/retry-and-replay/replaying-failed-executions.md). Unlike retry, replay is a manual operation. It is initiated by clicking the replay button in the instance UI. Replay reruns the entire integration with the original payload, allowing you to debug by monitoring each step's execution and pinpoint where the process succeeds or fails. --- ### Automatic Execution Retry #### Integration retry configuration[​](#integration-retry-configuration "Direct link to Integration retry configuration") You can configure your [asynchronously](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md)-invoked instances to retry if they fail to run to completion. This is especially useful if your integration relies on an unreliable third-party API that may experience brief outages. By enabling retry, your integration can attempt execution again after a short delay, reducing unnecessary alerting and manual intervention. To enable automatic retry, select your trigger and then choose **Flow retry**. ![Set integration to retry in Prismatic app](/docs/img/monitor-instances/retry-and-replay/automatic-retry/configure-retry.png) **Retry Attempts** specifies the maximum number of times (up to 10) Prismatic will attempt to run the same instance invocation after a failure. If the number of failures exceeds **Retry Attempts**, the run is marked as *execution failed* and any configured [alert monitors](https://prismatic.io/docs/monitor-instances/alerting/alert-monitors.md) will fire. **Minutes Between Attempts** sets the interval (in minutes) between retry attempts. For example, if set to **4** minutes and the first attempt fails at 10:24, subsequent attempts will occur at 10:28, 10:32, 10:36, 10:40, and 10:44 if failures persist. *Note:* Retry intervals are precise to the minute (not the second). Thus, a retry scheduled for 4 minutes after a failure at 10:24 may occur at 10:28 or 10:29. If **Exponential Backoff** is enabled, the interval between retries increases exponentially (factor of 2). For example, with **Minutes Between Attempts** set to 3 and **Exponential Backoff** enabled, retries will occur after 3, 6, 12, 24, and 48 minutes. *Note*: The maximum delay before a retry is **24 hours**. If exponential backoff would result in a longer delay, the retry will occur after 24 hours instead. **Retry Cancellation** allows you to cancel pending retries if a more recent invocation occurs. For example, if your integration processes payloads with unique IDs, you may want to cancel retries for older invocations when new data arrives, preventing outdated data from overwriting newer updates. To configure retry cancellation, select a unique request ID from the trigger payload. For example, you might pass a header, `x-my-unique-id: abc123`, as part of your trigger payload. If another invocation with that header comes in and updates resource `abc123`, you likely want to cancel currently queued retries. To do that, select your trigger's `results.headers.x-my-unique-id` reference as your **Unique Cancellation ID**. Cancellation IDs do not need to be headers. Instead, you can select a key from the payload body.
For example, if your instance invocation looks like this: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --location \ --header "Content-Type: application/json" \ --data '{"productId":"abc123","price":"250","description":"A box of widgets"}' ``` You can key your unique cancellation ID off of `results.body.data.productId`. **For More Information**: [Instance Retry and Replay](https://prismatic.io/docs/monitor-instances/retry-and-replay.md) --- ### Replay Failed Executions #### Replaying failed executions programmatically[​](#replaying-failed-executions-programmatically "Direct link to Replaying failed executions programmatically") Errors are inevitable in software integrations. * A third-party API may be unavailable when your instance attempts to access it. While retry can address brief outages, it does not handle scenarios where the API is down for extended periods. * A third party may begin sending data in an unexpected format. * Your instance may encounter other edge cases that are not handled gracefully. Regardless of the error's cause, it is useful to re-run your instance with the exact same input data after the third-party API is restored or after you have updated your integration to handle new data formats. Replay allows you to re-execute a previous run's data through your instance. This can be performed easily via [Prismatic's GraphQL API](https://prismatic.io/docs/api.md). ##### Querying for failed executions[​](#querying-for-failed-executions "Direct link to Querying for failed executions") First, query an instance for failed executions. You can find an instance's ID in your browser's URL bar or via the API. Filter for executions where `error_Isnull: false` (i.e., executions that encountered an error). To exclude replays, include `replayForExecution_Isnull: true`. To fetch replays that have since succeeded, use `replays(error_Isnull: true)`: Query for Failed Executions ``` query getFailedExecutions($instanceId: ID!, $startCursor: String) { executionResults( instance: $instanceId error_Isnull: false replayForExecution_Isnull: true after: $startCursor ) { nodes { id startedAt replays(error_Isnull: true) { nodes { id startedAt } } error } pageInfo { hasNextPage endCursor } } } ``` Query Variables ``` { "instanceId": "SW5zdGFuY2U6ZGVkZDQ3ZjQtNmQ4OC00NjJmLWE5YmYtNWM1OGNiMTg0MDAy", "startCursor": "" } ``` [Try It Out ❯](https://prismatic.io/docs/explorer?query=query+getFailedExecutions%28%24instanceId%3A+ID%21%2C+%24startCursor%3A+String%29+%7B%0A++executionResults%28%0A++++instance%3A+%24instanceId%0A++++error_Isnull%3A+false%0A++++replayForExecution_Isnull%3A+true%0A++++after%3A+%24startCursor%0A++%29+%7B%0A++++nodes+%7B%0A++++++id%0A++++++startedAt%0A++++++replays%28error_Isnull%3A+true%29+%7B%0A++++++++nodes+%7B%0A++++++++++id%0A++++++++++startedAt%0A++++++++%7D%0A++++++%7D%0A++++++error%0A++++%7D%0A++++pageInfo+%7B%0A++++++hasNextPage%0A++++++endCursor%0A++++%7D%0A++%7D%0A%7D\&query_variables=%7B%0A++%22instanceId%22%3A+%22SW5zdGFuY2U6ZGVkZDQ3ZjQtNmQ4OC00NjJmLWE5YmYtNWM1OGNiMTg0MDAy%22%2C%0A++%22startCursor%22%3A+%22%22%0A%7D) If there are multiple pages of executions (more than 100), use the `endCursor` as the `startCursor` to paginate results. 
The GraphQL API will return failed executions for the instance, along with any successful replays: ``` { "data": { "executionResults": { "nodes": [ { "id": "SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6NjBkZDliOWMtOGIyOS00NDQyLWFkNDctMjZkZTg5Y2NlNWM5", "startedAt": "2023-07-26T17:18:15.886806+00:00", "replays": { "nodes": [] }, "error": "Unable to connect to API" }, { "id": "SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6MmYxNTcxZTktNDVmOS00Mzc2LTg2OGUtMTJkNjZkNDhiNzRl", "startedAt": "2023-07-26T17:18:13.800335+00:00", "replays": { "nodes": [ { "id": "SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6N2U1MDBkZTgtY2ZmYS00NWY5LWI0OGYtNGU1YjU2YWMzMzFh", "startedAt": "2023-07-26T17:28:10.003443+00:00" } ] }, "error": "Unable to connect to API" } ] } } } ``` ##### Replaying failed executions programmatically[​](#replaying-failed-executions-programmatically-1 "Direct link to Replaying failed executions programmatically") With the IDs of failed executions, issue a [replayExecution](https://prismatic.io/docs/api/schema/mutation/replayExecution.md) mutation for each one that does not have a successful replay: Replay a failed execution ``` mutation myReplayExecution($executionId: ID!) { replayExecution(input: {id: $executionId}) { instanceExecutionResult { id } errors { field messages } } } ``` Query Variables ``` { "executionId": "SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6NjBkZDliOWMtOGIyOS00NDQyLWFkNDctMjZkZTg5Y2NlNWM5" } ``` [Try It Out ❯](https://prismatic.io/docs/explorer?query=mutation+myReplayExecution%28%24executionId%3A+ID%21%29+%7B%0A++replayExecution%28input%3A+%7Bid%3A+%24executionId%7D%29+%7B%0A++++instanceExecutionResult+%7B%0A++++++id%0A++++%7D%0A++++errors+%7B%0A++++++field%0A++++++messages%0A++++%7D%0A++%7D%0A%7D\&query_variables=%7B%0A++%22executionId%22%3A+%22SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6NjBkZDliOWMtOGIyOS00NDQyLWFkNDctMjZkZTg5Y2NlNWM5%22%0A%7D) The mutation returns the ID of the new execution. You can then query the API for that execution to verify success or perform further debugging if needed. For a script that automates these GraphQL calls, see our [examples GitHub repo](https://github.com/prismatic-io/examples/blob/main/api/replay-failed-executions/queries.ts). For more information, see the [API docs](https://prismatic.io/docs/api.md).
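The query and mutation above can be scripted together. Below is a minimal TypeScript sketch that pages through an instance's failed executions and requests a replay for each one that has no successful replay. The GraphQL endpoint URL and the `PRISMATIC_API_KEY` environment variable are assumptions for this illustration; the examples repo script linked above is the fuller reference.

```typescript
// Minimal sketch, not the official example script. Assumes the US commercial
// GraphQL endpoint and an API key in PRISMATIC_API_KEY; adjust for your tenant.
const API_URL = "https://app.prismatic.io/api";
const API_KEY = process.env.PRISMATIC_API_KEY ?? "";

const FAILED_EXECUTIONS_QUERY = `
  query getFailedExecutions($instanceId: ID!, $startCursor: String) {
    executionResults(
      instance: $instanceId
      error_Isnull: false
      replayForExecution_Isnull: true
      after: $startCursor
    ) {
      nodes {
        id
        replays(error_Isnull: true) { nodes { id } }
      }
      pageInfo { hasNextPage endCursor }
    }
  }`;

const REPLAY_MUTATION = `
  mutation myReplayExecution($executionId: ID!) {
    replayExecution(input: { id: $executionId }) {
      instanceExecutionResult { id }
      errors { field messages }
    }
  }`;

// Small helper that POSTs a GraphQL document and returns the "data" payload
async function gql<T>(query: string, variables: Record<string, unknown>): Promise<T> {
  const response = await fetch(API_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({ query, variables }),
  });
  const { data, errors } = await response.json();
  if (errors) throw new Error(JSON.stringify(errors));
  return data as T;
}

interface ExecutionPage {
  executionResults: {
    nodes: { id: string; replays: { nodes: { id: string }[] } }[];
    pageInfo: { hasNextPage: boolean; endCursor: string };
  };
}

export async function replayFailedExecutions(instanceId: string): Promise<void> {
  let startCursor = "";
  let hasNextPage = true;
  while (hasNextPage) {
    // Fetch one page (up to 100) of failed executions for the instance
    const page = await gql<ExecutionPage>(FAILED_EXECUTIONS_QUERY, { instanceId, startCursor });
    for (const execution of page.executionResults.nodes) {
      // Skip executions that already have a successful replay
      if (execution.replays.nodes.length > 0) continue;
      await gql(REPLAY_MUTATION, { executionId: execution.id });
      console.log(`Requested replay of execution ${execution.id}`);
    }
    hasNextPage = page.executionResults.pageInfo.hasNextPage;
    startCursor = page.executionResults.pageInfo.endCursor;
  }
}
```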
--- ### Spectral 10.6 Upgrade Guide Spectral 10.6 introduces a new way to reference existing component actions, data sources and connections in a code-native integration. New action, data source, and connection reference functions make it easier to build code-native integrations that leverage existing custom components, and provide better type safety and autocompletion in your IDE. Existing reference syntax will continue to work, but we recommend updating your code-native integrations to use the new reference functions. #### Generating new component manifests[​](#generating-new-component-manifests "Direct link to Generating new component manifests") Previously, all component type manifests were installed as npm dependencies. Now, you can install component manifests into your `src/manifests/` directory. This lets you skip the step of generating manifests for your custom components and publishing them to npm - the manifest is generated from information in the Prismatic API. New manifests can be generated for both public and private components, and contain new type wrappers. To install a component manifest, run the following command: ``` # Public component npx cni-component-manifest slack # Private component npx cni-component-manifest slack --private ``` Next, remove your component's npm dependency from `package.json` and run `npm install` or `yarn install`. Finally, update your import statements to import from the manifest file instead of the npm package. componentRegistry.ts ``` @@ -1,5 +1,5 @@ import { componentManifests } from "@prismatic-io/spectral"; -import slack from "@component-manifests/slack"; +import slack from "./manifests/slack"; export const componentRegistry = componentManifests({ slack, ``` #### New component action functions[​](#new-component-action-functions "Direct link to New component action functions") Actions can now be imported from a component manifest and invoked directly. yourFlow.ts ``` @@ -8,6 +8,7 @@ import { flow, util } from "@prismatic-io/spectral"; import axios from "axios"; +import slackActions from "../manifests/slack/actions"; interface TodoItem { id: number; @@ -34,7 +35,7 @@ export const todoAlertsFlow = flow({ } else { logger.info(`Sending message for item ${item.id}`); try { - await context.components.slack.postMessage({ + await slackActions.postMessage.perform({ channelName: util.types.toString( configVars["Select Slack Channel"] ``` #### New component trigger functions[​](#new-component-trigger-functions "Direct link to New component trigger functions") Similar to actions, triggers can be imported from a component manifest and invoked directly.
``` import { flow } from "@prismatic-io/spectral"; import { salesforceFlowOutboundMessageTrigger } from "./manifests/salesforce/triggers/flowOutboundMessageTrigger"; export const salesforceAccountNotifications = flow({ name: "Listen for Salesforce Account Notifications", stableKey: "salesforce-account-notifications", description: "This flow uses an existing component trigger to listen for Account notifications from Salesforce.", onTrigger: salesforceFlowOutboundMessageTrigger({ connection: { configVar: "Salesforce Connection" }, prefix: { value: "acme" }, triggerObject: { value: "Account" }, fields: { value: ["Id", "Name"] }, }), onExecution: async (context, params) => { // ... }, }); ``` #### New component connection functions[​](#new-component-connection-functions "Direct link to New component connection functions") Connections can be imported from a component manifest and are named after the component and connection key (for example, `slackOauth2` for the Slack component's `oauth2` connection). The first parameter of the connection function is the stable key of the connection config var. configPages.ts ``` @@ -1,9 +1,9 @@ import { configPage, configVar, - connectionConfigVar, dataSourceConfigVar, } from "@prismatic-io/spectral"; +import { slackOauth2 } from "./manifests/slack/connections/oauth2"; import { SLACK_CLIENT_ID, SLACK_CLIENT_SECRET, @@ -14,34 +14,26 @@ export const configPages = { Connections: configPage({ tagline: "Authenticate with Slack", elements: { - "Slack OAuth Connection": connectionConfigVar({ - stableKey: "slack-oauth-connection", - dataType: "connection", - connection: { - component: "slack", - key: "oauth2", - values: { - clientId: { - value: SLACK_CLIENT_ID, - permissionAndVisibilityType: "organization", - visibleToOrgDeployer: false, - }, - clientSecret: { - value: SLACK_CLIENT_SECRET, - permissionAndVisibilityType: "organization", - visibleToOrgDeployer: false, - }, - signingSecret: { - value: SLACK_SIGNING_SECRET, - permissionAndVisibilityType: "organization", - visibleToOrgDeployer: false, - }, - scopes: { - value: "chat:write chat:write.public channels:read", - permissionAndVisibilityType: "organization", - visibleToOrgDeployer: false, - }, - }, + "Slack OAuth Connection": slackOauth2("slack-oauth-connection", { + clientId: { + value: SLACK_CLIENT_ID, + permissionAndVisibilityType: "organization", + visibleToOrgDeployer: false, + }, + clientSecret: { + value: SLACK_CLIENT_SECRET, + permissionAndVisibilityType: "organization", + visibleToOrgDeployer: false, + }, + signingSecret: { + value: SLACK_SIGNING_SECRET, + permissionAndVisibilityType: "organization", + visibleToOrgDeployer: false, + }, + scopes: { + value: "chat:write chat:write.public channels:read", + permissionAndVisibilityType: "organization", + visibleToOrgDeployer: false, }, }), }, ``` #### New component data source functions[​](#new-component-data-source-functions "Direct link to New component data source functions") Similar to connections, data sources can be imported from a component manifest and are named after the component and data source key (for example, `slackSelectChannels`). The first parameter of the data source function is the stable key of the data source config var.
configPages.ts ``` @@ -1,9 +1,6 @@ import { configPage, configVar, - dataSourceConfigVar, } from "@prismatic-io/spectral"; import { slackOauth2 } from "./manifests/slack/connections/oauth2"; +import { slackSelectChannels } from "./manifests/slack/dataSources/selectChannels"; import { SLACK_CLIENT_ID, SLACK_CLIENT_SECRET, @@ -61,16 +58,9 @@ export const configPages = { "Slack Config": configPage({ tagline: "Select a Slack channel from a dropdown menu", elements: { - "Select Slack Channel": dataSourceConfigVar({ - stableKey: "select-slack-channel", - dataSource: { - component: "slack", - key: "selectChannels", - values: { - connection: { configVar: "Slack OAuth Connection" }, - includePublicChannels: { value: true }, - }, - }, + "Select Slack Channel": slackSelectChannels("select-slack-channel", { + connection: { configVar: "Slack OAuth Connection" }, + includePublicChannels: { value: true }, }), }, }), ``` --- ### Spectral 10.x Upgrade Guide Spectral 10.x introduces the ability to add custom icons to connections on a per-connection basis. This is useful when building a code-native integration that interacts with multiple third-party APIs. One connection in your integration can have a Pied Piper icon, while another can have a Hooli icon. Prior to Spectral 10.x, an OAuth 2.0 connection in a custom component or code-native integration was defined like this: Connection in Spectral 9.x ``` export const myConnection = oauth2Connection({ key: "myConnection", label: "This is my label", comments: "This is my description", // Optional in-app description iconPath: "connect.png", // Optional OAuth connect button override inputs: { /* ... */ }, }); ``` Now, `label`, `comments`, and `iconPath` are nested within a `display` object, where: * `display.label` represents the connection's "type" in the integration builder * `display.description` was previously called `comments` and represents an optional in-app description * `display.icons.avatarPath` is a new option and represents the icon displayed next to the connection when a customer deploys your integration * `display.icons.oauth2ConnectionIconPath` was previously called `iconPath` and, when present, is displayed instead of the default OAuth 2.0 **Connect** button Connection in Spectral 10.x ``` export const myConnection = oauth2Connection({ key: "myConnection", display: { label: "This is my label", description: "This is my description", // Previously called "comments" icons: { avatarPath: "hooli.png", oauth2ConnectionIconPath: "connect.png", // Previously called "iconPath" }, }, inputs: { /* ... */ }, }); ``` --- ### Spectral 2.x Upgrade Guide Several syntactical changes were made between Spectral 1.x and Spectral 2.x to improve the developer experience and catch common errors at compile time rather than at runtime. Let's examine those changes and the upgrade process from version 1.x to 2.x. To see an example of upgrading a component from Spectral 1.x to 2.x, [this commit](https://github.com/prismatic-io/examples/commit/411598bac4163c8a25b492a9275590b3fd7855a9) upgrades the "Format Name" example component from the [writing custom components](https://prismatic.io/docs/custom-connectors.md) article. 
To start, update `@prismatic-io/spectral` in your `package.json` file and then run `npm install` or `yarn install`: ``` { "dependencies": { "@prismatic-io/spectral": "^2.0.0" } } ``` #### Input keys have moved[​](#input-keys-have-moved "Direct link to Input keys have moved") **Motivation**: In Spectral 1.x, each `input` had a `key` attribute to uniquely identify it. This caused problems, however, as the value of the `input`'s `key` had to exactly match the `action`'s perform function's `params.keyName` to reference inputs correctly. Mismatched input keys and parameter keys would yield `unknown` values for inputs. TypeScript generics were used in 2.x to ensure that your editor and compiler catch mismatched keys prior to deployment. First, remove the `key:` property from each input: ``` const myFirstInput = input({ /* key: "first", */ // Removed in spectral 2.x label: "My Input Field", placeholder: "Some example input", type: "string", required: true, }); const mySecondInput = input({ /* key: "second", */ label: "My Input Field", placeholder: "Some example input", type: "string", required: true, }); ``` Inputs on actions are now an **object** (key-value pairs) instead of an **array**. Adjust your actions accordingly: ``` /* Spectral 1.x */ const myAction = action({ perform: async (context, { first, second }) => { /*...*/ }, inputs: [myFirstInput, mySecondInput], }); /* Spectral 2.x */ const myAction = action({ perform: async (context, { first, second }) => { /*...*/ }, inputs: { first: myFirstInput, second: mySecondInput, }, }); ``` Ensure that your destructured `params` parameter (in this case, `{ first, second }`) has keys that match the keys you provide in the `inputs` object. If they do not match, your compiler will throw an error. #### Action keys have moved[​](#action-keys-have-moved "Direct link to Action keys have moved") Similar to inputs, action keys have also moved. This addresses an issue in Spectral 1.x where if an action's `key` property did not match the name of the variable representing the `action`, a component would compile but the component's action code would not be found at runtime. Action's unique keys are now declared as part of the `component` declaration, and the `key` property has been removed from the `action` declaration: ``` /* Spectral 1.x */ const myAction = action({ key: "myAction", // Remove this /*...*/ }); export default component({ /*...*/ actions: { // Remove the spread operators ...myAction, ...myOtherAction, }, }); /* Spectral 2.x */ const myAction = action({ /*...*/ }); export default component({ /*...*/ actions: { myAction: myAction, myOtherAction: myOtherAction, }, }); ``` Note: you can use JavaScript [shorthand property names](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Object_initializer) to write `actions: { myAction, myOtherAction }` instead. #### Component version is deprecated[​](#component-version-is-deprecated "Direct link to Component version is deprecated") [Component versioning](https://prismatic.io/docs/custom-connectors/publishing.md#component-versioning) is handled by the platform, so there's no need to pass in a `version` property in your `component` declaration. You can remove it. ``` export default component({ key: "my-component", /* version: "123", */ }); ``` #### Input types are more strict[​](#input-types-are-more-strict "Direct link to Input types are more strict") In Spectral 1.x, all inputs had an `any` type. 
So, if a helper function expected `(string, number)` inputs, this type of invocation would be accepted by TypeScript with Spectral 1.x: ``` const helper = (foo: string, bar: number) => `Hi, ${foo}, adding 3 gives you ${bar + 3}`; const myAction = action({ perform: async (context, { myInput1, myInput2 }) => { helper(myInput1, myInput2); }, }); ``` This caused runtime issues. What if a component's user entered the string `"2"` for `myInput2`? That would cause problems, since `"2" + 3` equals `"23"` in JavaScript. That's probably not the desired result. In Spectral 2.x, we changed all inputs to have an `unknown` type, so you need to explicitly convert inputs into the type expected by your helper functions and third-party libraries. We created a series of utility functions to help convert inputs to types your helper functions and third-party libraries require. The above code could be written using `util.types` functions: ``` const helper = (foo: string, bar: number) => `Hi, ${foo}, adding 3 gives you ${bar + 3}`; const myAction = action({ perform: async (context, { myInput1, myInput2 }) => { helper(util.types.toString(myInput1), util.types.toNumber(myInput2)); }, }); ``` If the string `"2"` is now entered for `myInput2` in Spectral 2.x, it is converted to a number and then passed into `helper`, and the result is `5` instead of `"23"`. #### Unit testing[​](#unit-testing "Direct link to Unit testing") In Spectral 2.x, it is assumed that you want a `PerformDataReturn` object when you run an `invoke` function in a unit test. Passing the return object type to the invoke function is no longer required. ``` /* Spectral 1.x */ await invoke<PerformDataReturn>(myAction, { someInput }); /* Spectral 2.x */ await invoke(myAction, { someInput }); ``` --- ### Spectral 3.x Upgrade Guide Several changes were made between Spectral 2.x and Spectral 3.x to allow component developers to specify authorization methods (Basic Auth, OAuth2, etc.) at the action level rather than at the component level. Let's examine this change and how to upgrade a component from Spectral 2.x to Spectral 3.x. To start, update `@prismatic-io/spectral` to `^3.0.0` in your `package.json` file and then run `npm install` or `yarn install`: ``` { "dependencies": { "@prismatic-io/spectral": "^3.0.0" } } ``` Next, **remove** the `authorization` block from your `component` definition: ``` export default component({ key: "sampleComponent", public: false, display: { label: "Sample Component", description: "sampleComponent", iconPath: "icon.png", }, actions: { myAction }, authorization: { required: true, methods: ["basic", "api_key"], }, }); ``` Finally, **add** an `authorization` block to any `action` definition that requires credentials: ``` export const myAction = action({ display: { label: "My Action", description: "This is my action", }, perform: async ({ credential }, { myInput }) => { // Do something with the credential ... }, inputs: { myInput: myInputField }, authorization: { required: true, methods: ["basic", "api_key"], }, }); ``` --- ### Spectral 4.x Upgrade Guide Spectral 4.x introduces the ability to write [component-specific triggers](https://prismatic.io/docs/custom-connectors/triggers.md). This version is purely additive, so very few changes to your code are required.
To start, update `@prismatic-io/spectral` to `^4.0.6` in your `package.json` file and then run `npm install` or `yarn install`: ``` { "dependencies": { "@prismatic-io/spectral": "^4.0.6" } } ``` Next, optionally write some [triggers](https://prismatic.io/docs/custom-connectors/triggers.md) and add those triggers to your `component` definition: ``` export default component({ key: "sampleComponent", public: false, display: { label: "Sample Component", description: "sampleComponent", iconPath: "icon.png", }, actions: { myAction1, myAction2, myAction3 }, triggers: { myTrigger1, myTrigger2 }, }); ``` --- ### Spectral 5.x Upgrade Guide The major change in Spectral 5.x is the introduction of `connections`, which allow you to package all required information (endpoints, credentials, etc.) into single inputs. Connections work similarly to credentials, except that you can create connections with any number of custom fields. It's easiest to illustrate the upgrade through an example. Suppose you have a component that interacts with "Acme Inventory". The component has a single action, "Get Item", which fetches information about a specific item in inventory. To connect to an Acme Inventory server, you need to know the server's hostname. To authenticate with Acme, you need the user's `username`, `token`, and `tenantid`. In Spectral 4.x, this component might have looked like this: Sample Component using Spectral 4.x ``` import { action, input, component } from "@prismatic-io/spectral"; import { acmeClient } from "@acme-inventory/client"; const itemIdInput = input({ label: "Item ID", type: "string" }); const acmeEndpointInput = input({ label: "Acme Endpoint", type: "string" }); const tenantIdInput = input({ label: "Tenant ID", type: "string" }); const getItem = action({ display: { label: "Get Item", description: "Get an item from inventory", }, inputs: { item: itemIdInput, endpoint: acmeEndpointInput, tenant: tenantIdInput, }, perform: async ({ credential }, { item, endpoint, tenant }) => { const { username, password: token } = credential.fields; const client = new acmeClient({ endpoint, username, token, tenant }); const itemData = await client.get(item); return { data: itemData }; }, authorization: { required: true, methods: ["basic"] }, }); export default component({ key: "acme-inventory", public: false, display: { label: "Acme Inventory", description: "Manage items in Acme inventory", iconPath: "icon.png", }, actions: { getItem }, }); ``` Note that secret information, such as `username` and `token`, was slotted into basic auth (username/password) out of convenience (each has two fields). The action declared with its `authorization` block that it supports basic auth. Other information about how to connect to Acme Inventory (`endpoint` and `tenant`) was passed in as inputs. Since username, token, endpoint, and tenant are all used to connect to Acme, it makes more sense to package them together into one cohesive **connection** input.
Here's the same component and action in Spectral 5.x, using connections: Sample Component using Spectral 5.x ``` import { action, input, component, connection } from "@prismatic-io/spectral"; import { acmeClient } from "@acme-inventory/client"; const itemIdInput = input({ label: "Item ID", type: "string" }); const acmeConnection = connection({ key: "acmeTokenAuth", label: "Acme Token Authentication", inputs: { endpoint: { label: "Acme Endpoint", type: "string" }, tenant: { label: "Tenant ID", type: "string", example: "606B0F820C0C" }, token: { label: "Token", type: "string", example: "30A5979A91F0" }, username: { label: "Username", type: "string" }, }, }); const getItem = action({ display: { label: "Get Item", description: "Get an item from inventory", }, inputs: { connection: acmeConnection, item: itemIdInput }, perform: async (context, { connection, item }) => { const { endpoint, tenant, token, username } = connection.fields; const client = new acmeClient({ endpoint, username, token, tenant }); const itemData = await client.get(item); return { data: itemData }; }, }); export default component({ key: "acme-inventory", public: false, display: { label: "Acme Inventory", description: "Manage items in Acme inventory", iconPath: "icon.png", }, actions: { getItem }, connections: [acmeConnection], }); ``` Endpoint, tenant ID, token, and username are now presented as a cohesive **connection** config variable to integration builders, and that config variable is passed to an action as an input with multiple `fields`. Finally, the component has a `connections:` block which declares the types of custom connections that this component supports. --- ### Spectral 6.x Upgrade Guide Spectral 6.x is completely backwards-compatible with Spectral 5.x. You can safely upgrade your version of Spectral from 5.x to 6.x without changes to your custom component code. The focus of Spectral 6.x was developer experience enhancements - helping developers follow [DRY](https://en.wikipedia.org/wiki/Don%27t_repeat_yourself) principles, ensuring type safety, improving error handling, and building a more robust component testing suite. #### New - cleaning and typing input values[​](#new---cleaning-and-typing-input-values "Direct link to New - cleaning and typing input values") An input of an action can be anything - a number, string, boolean, JavaScript Buffer, a complex object with numerous properties, etc. Prior to Spectral 6.x, this meant that inputs were passed to `perform` functions with TypeScript type `unknown`. ![Screenshot of code editor showing unknown type](/docs/img/spectral/spectral-6-upgrade-guide/unknown-type.png) To ensure that the input was properly typed, each step that used that input had to call a utility function, like `util.types.toNumber()`, to cast the value to the proper type. If multiple actions shared the same input, this resulted in repetitive code. You can now `clean` an input prior to it being presented to the `perform` function. Ensure an input is cast to a string with a clean function ``` const lastName = input({ label: "Last Name", placeholder: "Last name of a person", type: "string", required: true, clean: (value) => util.types.toString(value), }); ``` With a typed clean function (like `util.types.toString`, which always returns a `string`), your perform function will be cognizant of the input's type.
![Screenshot of code editor showing string type](/docs/img/spectral/spectral-6-upgrade-guide/string-type.png) The `clean` function always takes one parameter - the input from the integration runner - and should return a typed value. Your clean functions can be as simple or complex as required, and you can incorporate data validation within the clean function to catch incorrectly formatted input. For example, you can ensure that the input you received is an array, and that the array's values are cast to numbers (in case they happen to come in as strings): Ensure input is an array of numbers ``` const prices = input({ label: "Prices", placeholder: "A list of prices", type: "string", required: true, clean: (value) => { if (!Array.isArray(value)) { throw new Error("Provided list is not an array."); } return value.map(util.types.toNumber); }, }); ``` An example of a more complex `clean` function that returns an object with multiple fields is available in our [examples repository](https://github.com/prismatic-io/examples/blob/9ff35d3174c6d55b25381a9b0c33997aea5625f1/components/data-example/src/index.ts#L10-L27). #### New - global error handlers[​](#new---global-error-handlers "Direct link to New - global error handlers") This is another improvement that helps keep your component code DRY. The actions in your component might all wrap API endpoints using an HTTP client, and that client might throw specific errors. You could handle those errors within each action, but you would end up writing the same error handlers repeatedly. You can now specify an error handler function to run whenever any of your actions throws an error. To specify an error handler, add a `handlers` block to your `component({})` function definition: ``` component({ // ... handlers: { error: (error) => doSomething(error), }, }); ``` For example, the popular HTTP client [axios](https://www.npmjs.com/package/axios) throws an error whenever it receives a status code that's [*not* between 200-299](https://github.com/axios/axios/blob/1f13dd7e26124a27c373c83eff0a8614acc1a04f/lib/defaults/index.js#L127-L129). If your HTTP client receives a status code in the 4xx or 5xx range, an error is thrown with a minimal message. If you require additional information, such as the status code or full response to the HTTP request, you can inspect the error being thrown and return a more detailed error message, as illustrated in Spectral [here](https://github.com/prismatic-io/spectral/blob/v6.5.0/packages/spectral/src/clients/http/index.ts#L53-L62). #### New - spectral testing harness[​](#new---spectral-testing-harness "Direct link to New - spectral testing harness") New error handlers are defined at the *component* level, so testing your component's behavior holistically is important. Prior to Spectral 6.x, you would write a Jest unit test for an action using an `invoke` function like this: Example of unit testing in Spectral 5.x ``` import { myAction } from "."; import { invoke } from "@prismatic-io/spectral/dist/testing"; describe("test my action", () => { test("verify the return value of my action", async () => { const sampleInputData = { productName: "Widget", price: 1.25, quantity: 75, }; const expectedOutput = "This is an invoice for 75 Widgets at price $1.25. Total price: $93.75"; const { result } = await invoke(myAction, { pointOfSale: sampleInputData, }); expect(result.data).toBe(expectedOutput); }); }); ``` The same test can be performed by creating a testing harness with `new ComponentTestHarness(component)`.
The advantage of this method is that your component-level global error handling hooks and input `clean` functions will be included in your test. The same test in Spectral 6.x ``` import component from "."; import { ComponentTestHarness } from "@prismatic-io/spectral/dist/testing"; const harness = new ComponentTestHarness(component); describe("test my action", () => { test("verify the return value of my action", async () => { const sampleInputData = { productName: "Widget", price: 1.25, quantity: 75, }; const expectedOutput = "This is an invoice for 75 Widgets at price $1.25. Total price: $93.75"; const result = await harness.action("myAction", { pointOfSale: sampleInputData, }); expect(result.data).toBe(expectedOutput); }); }); ``` --- ### Spectral 7.x Upgrade Guide Spectral 7.x is completely backwards-compatible with Spectral 6.x. You can safely upgrade your version of Spectral from 6.x to 7.x without changes to your custom component code. The focus of Spectral 7.x was improved instance deployment. You can now create **Data Sources**. Similar to triggers or actions, data sources use connections to reach out to third-party APIs to gather data. In this case, they gather data to dynamically insert into config variables in an instance configuration wizard. Read more about the instance configuration wizard [here](https://prismatic.io/docs/integrations/data-sources.md), and about writing your own data source [here](https://prismatic.io/docs/custom-connectors/data-sources.md). --- ### Spectral 8.x Upgrade Guide Spectral 8.x is completely backwards-compatible with Spectral 7.x. You can safely upgrade your version of Spectral from 7.x to 8.x without changes to your custom component code. Spectral 8.x introduced [code-native integrations](https://prismatic.io/docs/integrations/code-native.md), which allow you to build integrations entirely in code using the Spectral SDK. --- ### Spectral 9.x Upgrade Guide Spectral 9.x includes an upgrade to TypeScript 5, which introduces several breaking changes. While the Spectral component APIs remain backwards-compatible with Spectral 8.x, you will need to update your custom component's TypeScript to version 5 or later. For [code-native integrations](https://prismatic.io/docs/integrations/code-native.md), Spectral 9.x introduces the ability to reference existing components' actions within a code-native flow. To update a code-native integration to Spectral 9.x, it may be easiest to re-initialize a code-native project and copy your flows and config wizard code to the new project. Alternatively, initialize a new project for reference and perform the following: * Copy `.spectral/index.ts` from the new project into your current project * Copy `.npmrc` from the new project into your current project * Update `@prismatic-io/spectral` in your `package.json` file to the latest version Several syntax changes were made between 8.x and 9.x: * Flows no longer require `configPage` generics to infer the shape of your configuration variables. A `flow` call that previously passed config-page generics can be changed simply to `flow()`. * References to existing components' data sources and connections are now done by installing a component's manifest package into your project. See [Using existing components in code-native integrations](https://prismatic.io/docs/integrations/code-native/existing-components.md). * References to existing components' triggers remain syntactically unchanged. * Visibility of inputs on connections can now be explicitly set.
For example, a reference to an existing Slack OAuth connection can now read: ``` export const configPages = { Connections: configPage({ tagline: "Authenticate with Slack", elements: { "Slack OAuth Connection": connectionConfigVar({ stableKey: "slack-oauth-connection", dataType: "connection", connection: { component: "slack", key: "oauth2", values: { clientId: { value: SLACK_CLIENT_ID, permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, clientSecret: { value: SLACK_CLIENT_SECRET, permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, signingSecret: { value: SLACK_SIGNING_SECRET, permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, scopes: { value: "chat:write chat:write.public channels:read", permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, }, }, }), }, }), }; ``` --- ### Prismatic Event Webhooks A webhook is a way for an application to provide other applications with real-time information. This article covers how to set up and use **outbound event webhooks** in Prismatic, so you can be notified of events that occur in your Prismatic account. This article covers *outbound* webhooks Here, we're talking about Prismatic-specific events that occur (e.g. an integration in Prismatic was published or a customer in Prismatic was updated), and you want an external app to know about that change. If you're interested in *incoming webhooks* for your integrations (e.g. A Salesforce contact was updated and you want your integration's trigger to be notified), see [What is a Webhook?](https://prismatic.io/docs/integrations/triggers/webhook.md). #### Why use Prismatic event webhooks?[​](#why-use-prismatic-event-webhooks "Direct link to Why use Prismatic event webhooks?") Prismatic event webhooks allow you to stay informed about important changes and activities within your Prismatic platform. Here are some common use cases: * **Audit Trail**: Keep a record of all platform changes for compliance and tracking purposes * **Monitoring & Alerting**: Get notified when critical events occur, such as integrations being published or instances being deployed * **Integration with External Systems**: Send Prismatic events to your monitoring tools, logging systems, or custom applications * **Real-time Updates**: Receive immediate notifications instead of manually checking the web app or querying the Prismatic API #### Setting up event webhooks[​](#setting-up-event-webhooks "Direct link to Setting up event webhooks") To set up or manage your event webhooks, first navigate to the **Event Webhooks** tab in your Prismatic organization settings. Click **+Event Webhook** to configure a new webhook, or click an existing webhook to modify it. 
![](/docs/img/webhooks/configure.png) 2. **Fill in the required information** * **Name**: A descriptive name for your webhook (e.g., "Production Monitoring") * **URL**: The endpoint where webhooks will be sent * **Secret**: (Optional) An HMAC secret key for [verifying webhook signatures](https://prismatic.io/docs/webhooks.md#prismatic-event-webhook-security) * **Description**: Additional context about the webhook's purpose * **Is Enabled**: Toggle to enable or disable the webhook 3. **Select the [events](https://prismatic.io/docs/webhooks.md#available-event-types) that should trigger this webhook** * You can select individual events or use category checkboxes to select all events of a certain type * Common selections include: * Instance lifecycle events (created, updated, deployed) * Integration changes (published, updated) * Customer management events * Alert monitor events 4. **Test your webhook** * Use the **Test Webhook** button to verify your endpoint is working * Check that you receive the test payload at your endpoint 5. **Save and enable** * Click **Save** to create the webhook * Ensure the **Is Enabled** toggle is turned on ##### Testing Prismatic event webhooks[​](#testing-prismatic-event-webhooks "Direct link to Testing Prismatic event webhooks") Use the **Test Webhook** button in the Prismatic UI to: * Verify your endpoint is accessible * Confirm the payload format is correct * Test your webhook processing logic * Validate authentication and security measures When you click **Test Webhook**, your endpoint will receive a `webhook.test` event with a payload that looks like this: ``` { "message": "This is a test webhook event from Prismatic.", "webhook_endpoint": { "id": "V2ViaG9va0VuZHBvaW50OmJiNGVjYTIzLWI5NzgtNDU1Mi05MDljLTI5YmRlMzZjZTYxMQ==", "name": "Notify Acme of Changes" }, "user": { "id": "VXNlcjoyMzZkMDA3ZS0zZGIxLTQ4MWItOTMyNS0zMjhhYTE0OTY5MDA=", "email": "john.doe@example.io", "name": "John Doe" }, "event_type": "webhook.test", "timestamp": "2025-08-21T20:21:40.405396+00:00", "organization_id": "T3JnYW5pemF0aW9uOmJjYjE0NjEzLTNjZTItNGQ0MC04OTZmLTIyNTZiNjcyYTllYw==", "webhook_id": "bdea273d-2500-42a2-854b-5189c3f66cfa" } ``` #### Webhook payload structure[​](#webhook-payload-structure "Direct link to Webhook payload structure") When an event occurs, Prismatic sends a POST request to your webhook URL with a JSON payload.
Here's an example of what the payload looks like: ``` { "integration": { "id": "SW50ZWdyYXRpb246ZWM5YzViM2EtZjNhNy00MDliLTllM2QtODA3MDAxNDVlNWU0", "name": "Slack Integration", "description": "Get alerts in Slack when new contacts are created", "category": "Communication", "has_unpublished_changes": false, "version_number": 6, "created_at": "2025-08-21T20:22:54.828609+00:00", "updated_at": "2025-08-21T20:22:54.828609+00:00" }, "customer": null, "parent_integration": null, "user": { "id": "VXNlcjoyMzZkMDA3ZS0zZGIxLTQ4MWItOTMyNS0zMjhhYTE0OTY5MDA=", "email": "user@example.com", "name": "John Doe" }, "event_type": "integration.published", "timestamp": "2025-08-21T20:22:57.855591+00:00", "organization_id": "T3JnYW5pemF0aW9uOmJjYjE0NjEzLTNjZTItNGQ0MC04OTZmLTIyNTZiNjcyYTllYw==", "webhook_id": "c1e226a7-8d13-4d46-98d9-a99a2d6bdb37" } ``` ##### Payload fields[​](#payload-fields "Direct link to Payload fields") | Field | Description | | ----------------- | ------------------------------------------------------------------ | | `event_type` | The type of event that occurred (e.g., `integration.published`) | | `timestamp` | When the event occurred (ISO 8601 format) | | `webhook_id` | Unique identifier for the webhook that sent this notification | | `organization_id` | Your Prismatic organization ID | | `user` | Information about the user who triggered the event (if applicable) | | `integration` | Integration details (for integration-related events) | | `customer` | Customer details (for customer-related events) | | `instance` | Instance details (for instance-related events) | | `workflow` | Workflow details (for workflow-related events) | | `component` | Component details (for component-related events) | | `connection` | Connection details (for connection-related events) | | `alert_monitor` | Alert monitor details (for alert-related events) | | `alert_group` | Alert group details (for alert group events) | | `log_stream` | Log stream details (for log stream events) | #### Available event types[​](#available-event-types "Direct link to Available event types") Prismatic provides comprehensive coverage of platform events. 
Here's a complete list of available event types: ##### Instance Events[​](#instance-events "Direct link to Instance Events") | Event | ID | Description | | ----------------- | ------------------- | ------------------------------------------------------- | | Instance Created | `instance.created` | Triggered when a new integration instance is created | | Instance Updated | `instance.updated` | Triggered when an instance's configuration is modified | | Instance Deleted | `instance.deleted` | Triggered when an instance is removed from the platform | | Instance Deployed | `instance.deployed` | Triggered when an instance is successfully deployed | | Instance Enabled | `instance.enabled` | Triggered when an instance is activated | | Instance Disabled | `instance.disabled` | Triggered when an instance is deactivated | ##### Customer Events[​](#customer-events "Direct link to Customer Events") | Event | ID | Description | | ---------------- | ------------------ | ------------------------------------------------------- | | Customer Created | `customer.created` | Triggered when a new customer is added to your platform | | Customer Updated | `customer.updated` | Triggered when customer information is modified | | Customer Deleted | `customer.deleted` | Triggered when a customer is removed from the platform | ##### User Events[​](#user-events "Direct link to User Events") | Event | ID | Description | | ------------ | -------------- | --------------------------------------------------- | | User Created | `user.created` | Triggered when a new user account is created | | User Updated | `user.updated` | Triggered when user profile information is modified | | User Deleted | `user.deleted` | Triggered when a user account is removed | ##### Integration Events[​](#integration-events "Direct link to Integration Events") | Event | ID | Description | | --------------------- | ----------------------- | ------------------------------------------------------------- | | Integration Created | `integration.created` | Triggered when a new integration is created | | Integration Updated | `integration.updated` | Triggered when integration configuration is modified | | Integration Deleted | `integration.deleted` | Triggered when an integration is removed | | Integration Published | `integration.published` | Triggered when an integration is published to the marketplace | ##### Workflow Events[​](#workflow-events "Direct link to Workflow Events") | Event | ID | Description | | ------------------ | -------------------- | -------------------------------------------------------------- | | Workflow Created | `workflow.created` | Triggered when a new workflow is created within an integration | | Workflow Updated | `workflow.updated` | Triggered when workflow configuration is modified | | Workflow Deleted | `workflow.deleted` | Triggered when a workflow is removed | | Workflow Published | `workflow.published` | Triggered when a workflow is published | | Workflow Enabled | `workflow.enabled` | Triggered when a workflow is activated | | Workflow Disabled | `workflow.disabled` | Triggered when a workflow is deactivated | ##### Component Events[​](#component-events "Direct link to Component Events") | Event | ID | Description | | ------------------- | --------------------- | ---------------------------------------------------------- | | Component Deleted | `component.deleted` | Triggered when a custom component is removed | | Component Published | `component.published` | Triggered when a component is published to the marketplace | 
##### Connection Events[​](#connection-events "Direct link to Connection Events") | Event | ID | Description | | ------------------ | -------------------- | --------------------------------------------------- | | Connection Updated | `connection.updated` | Triggered when connection configuration is modified | | Connection Deleted | `connection.deleted` | Triggered when a connection is removed | ##### Alert & Monitoring Events[​](#alert--monitoring-events "Direct link to Alert & Monitoring Events") | Event | ID | Description | | --------------------- | ----------------------- | ---------------------------------------------------- | | Alert Monitor Created | `alert_monitor.created` | Triggered when a new alert monitor is configured | | Alert Monitor Updated | `alert_monitor.updated` | Triggered when alert monitor settings are modified | | Alert Monitor Deleted | `alert_monitor.deleted` | Triggered when an alert monitor is removed | | Alert Group Created | `alert_group.created` | Triggered when a new alert group is created | | Alert Group Updated | `alert_group.updated` | Triggered when alert group configuration is modified | | Alert Group Deleted | `alert_group.deleted` | Triggered when an alert group is removed | ##### Log Stream Events[​](#log-stream-events "Direct link to Log Stream Events") | Event | ID | Description | | ------------------ | -------------------- | ----------------------------------------------- | | Log Stream Created | `log_stream.created` | Triggered when a new log stream is configured | | Log Stream Updated | `log_stream.updated` | Triggered when log stream settings are modified | | Log Stream Deleted | `log_stream.deleted` | Triggered when a log stream is removed | ##### OAuth2 Events[​](#oauth2-events "Direct link to OAuth2 Events") | Event | ID | Description | | ------------------------------ | -------------------------------- | ------------------------------------------------------------- | | OAuth2 Authorization Completed | `oauth2.authorization_completed` | Triggered when OAuth2 authorization is successfully completed | | OAuth2 Authorization Failed | `oauth2.authorization_failed` | Triggered when OAuth2 authorization fails | | OAuth2 Token Refreshed | `oauth2.token_refreshed` | Triggered when an OAuth2 token is successfully refreshed | | OAuth2 Token Refresh Failed | `oauth2.token_refresh_failed` | Triggered when OAuth2 token refresh fails | ##### System Events[​](#system-events "Direct link to System Events") | Event | ID | Description | | ----- | -------------- | ---------------------------------------------- | | Test | `webhook.test` | Triggered when testing a webhook configuration | #### Prismatic event webhook security[​](#prismatic-event-webhook-security "Direct link to Prismatic event webhook security") If you set a **Secret** when configuring your webhook, Prismatic will use it to generate an HMAC SHA-256 signature for each webhook request. The signature will be sent as a header, `x-webhook-signature`, in the form `sha256=<signature>`. This allows you to ensure that the webhook request is coming from Prismatic and has not been tampered with.
For example, suppose you receive the following webhook event payload: Example webhook event payload ``` {"message":"This is a test webhook event from Prismatic.","webhook_endpoint":{"id":"V2ViaG9va0VuZHBvaW50OjEwNTI1MjE3LTE4NDMtNGRiNC04YjYyLTgwZTdmOTc5OGEzZA==","name":"Testing"},"user":{"id":"VXNlcjozNDkwNjA3MC0wMjRmLTQxNzMtYjYxMy1mN2I0MWFmYmEwNDM=","email":"john.doe@example.com","name":"John Doe"},"event_type":"webhook.test","timestamp":"2025-10-08T14:13:05.914923+00:00","organization_id":"T3JnYW5pemF0aW9uOjQ0ZjkyMTlkLWU0ZGEtNGEwZi04ZmNhLWJkZmJlNTdiMzBjNA==","webhook_id":"4c79940b-b166-41cb-8b45-944e25b480f3"} ``` If your secret is set to `my-secret-key-abc-123`, the `x-webhook-signature` header would contain Example x-webhook-signature header ``` sha256=88563276df8a665d1e57bf8a05c2c2432ff80b583297082b768fb06f173e0b59 ``` ##### Example of verifying signatures with Express[​](#example-of-verifying-signatures-with-express "Direct link to Example of verifying signatures with Express") In this example [Express](https://expressjs.com/) app, we verify the request signature using Node.js's `crypto` module before processing the webhook event: Verifying webhook signatures in Node.js with Express ``` import express from "express"; import { createHmac } from "node:crypto"; const PORT = 3000; const PRISMATIC_SIGNING_SECRET = "my-secret-key-abc-123"; const app = express(); app.post("/my-webhook-endpoint", express.raw({ type: "*/*" }), (req, res) => { // Get HMAC signature from header and compare it to the one we generate const signatureHeader = req.headers["x-webhook-signature"]; const signature = createHmac("sha256", PRISMATIC_SIGNING_SECRET) .update(req.body) .digest("hex"); // If the signatures don't match, return a 401 if (signatureHeader !== `sha256=${signature}`) { console.warn("Rejecting request with invalid HMAC signature"); return res.status(401).send({ error: "Invalid signature" }); } // Parse the event request and handle the event const payload = JSON.parse(req.body.toString()); switch (payload.event_type) { case "webhook.test": console.log("Got a test webhook"); break; case "instance.created": { console.log( `Instance (${payload.instance.id}) created from integration (${payload.integration.name}) for customer ${payload.customer.external_id}`, ); break; } default: console.warn(`Unhandled event type: ${req.body.toString()}`); } res.status(200).send({ received: true }); }); app.listen(PORT, () => { console.log(`Example webhook receiver listening on port ${PORT}`); }); ``` Similar strategies can be used in other programming languages and frameworks. #### Webhook retry[​](#webhook-retry "Direct link to Webhook retry") If a Prismatic event webhook request fails (e.g., due to a network error or a non-200 HTTP response), Prismatic will automatically retry the request up to three times after 100, 200, and 400ms. If more than 25 webhook requests have failed within one minute, the webhook will be disabled for 5 minutes. All events that accumulate during that duration will be sent once the webhook is re-enabled after 5 minutes. 
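Because retried deliveries repeat the same payload, it also helps to make your receiver idempotent and to acknowledge events quickly. Here is a minimal sketch that extends the Express receiver above; the raw-body hash used as a deduplication key and the in-memory `seenDeliveries` set are illustrative assumptions, not documented Prismatic identifiers:

```
import express from "express";
import { createHash } from "node:crypto";

const app = express();

// Track payload hashes we've already processed so retried deliveries are no-ops.
// An in-memory Set is illustrative only; use a shared store in production.
const seenDeliveries = new Set<string>();

app.post("/my-webhook-endpoint", express.raw({ type: "*/*" }), (req, res) => {
  // Retried deliveries carry the same body, so a hash of the raw payload
  // works as a simple dedup key (an assumption, not a Prismatic-provided ID).
  const deliveryKey = createHash("sha256").update(req.body).digest("hex");

  if (seenDeliveries.has(deliveryKey)) {
    // Already handled; acknowledge so Prismatic stops retrying.
    return res.status(200).send({ received: true, duplicate: true });
  }
  seenDeliveries.add(deliveryKey);

  // Acknowledge immediately, then do slow work asynchronously so a timeout or
  // downstream error doesn't trigger the retry and disable logic described above.
  res.status(200).send({ received: true });

  const payload = JSON.parse(req.body.toString());
  setImmediate(() => {
    console.log(`Processing ${payload.event_type} event`);
  });
});

app.listen(3000);
```

In production you would back the deduplication key with a shared store (such as a database or cache) so retried deliveries are recognized across processes.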
#### Troubleshooting[​](#troubleshooting "Direct link to Troubleshooting") **Webhook not receiving events** * Verify the webhook is enabled * Check that the correct event types are selected * Ensure your endpoint is accessible and responding with 2xx status codes **Missing events** * Confirm the event types you've selected * Check if the events actually occurred in Prismatic * Verify your webhook configuration is saved **Authentication errors** * Ensure your endpoint accepts POST requests * Check that you're not requiring authentication that Prismatic can't provide * Verify your endpoint can handle the webhook payload format --- ### AI Integrations #### AI Overview Agent Flows, AI Components, and MCP acceleration #### Build AI-powered integrations Transform your product's integrations with built-in intelligence. Prismatic lets you drop in AI connectors, orchestrate agent flows across systems, and give your AI agents live context via the Model Context Protocol (MCP). You don't need to rebuild your integration layer to launch smart features quickly. [Ship AI Features Faster]() [Connect to OpenAI, ChatGPT, and other AI services through ready-to-use components. Focus on building the integration logic that matters to your customers while we handle the AI service plumbing]() [Unify AI Across Your Integrations]() [Add sentiment analysis to support tickets, extract entities from documents, or generate responses with ChatGPT. Mix AI components with your current logic flows, all within your existing integrations.]() [Future-Proof Your Integrations]() [As AI services evolve and new providers emerge, Prismatic's components stay updated. Switch between AI providers, test new models, and adapt to changing requirements without rebuilding entire integrations.]() [Request AI preview](https://prismatic.io/request-a-demo/) #### What you can build[​](#what-you-can-build "Direct link to What you can build") Explore the AI capabilities below. Each one includes a quick example to show how it works, with links to more detailed documentation if you want to go deeper. ##### Use built-in AI components (Preview)[​](#use-built-in-ai-components-preview "Direct link to Use built-in AI components (Preview)") Ready-to-use connectors for services like OpenAI and Anthropic let you classify text, enrich data, and generate content. You can configure them in the UI or call them from code. Code-NativeLow-Code[](https://github.com/prismatic-io/examples) ``` export const processTranscript = flow({ name: "ProcessTranscript", description: "Process incoming transcripts and extracts key insights.", onExecution: async (context, params) => { const conversation = await context.components.googleGemini.sendMessage({ connection: context.configVars["Google Gemini Connection"], model: "gemini-pro", prompt: `Summarize this message and extract key entities: ${params.onTrigger.results.body.data}`, }); return { data: conversation }; }, }); ``` [Explore AI Components β†’](https://prismatic.io/docs/components/openai.md) ##### Bring your own AI[​](#bring-your-own-ai "Direct link to Bring your own AI") Use your preferred agent SDKs and frameworks to integrate with any AI service or model. Build code-native flows with tools like LangChain, CrewAI, AutoGen, or connect to proprietary and on-prem models. 
``` // Example: OpenAI agents SDK in a Prismatic flow import { flow } from "@prismatic-io/spectral"; import { Agent, run, setDefaultOpenAIKey } from "@openai/agents"; export default flow({ name: "AI Agent Flow", onExecution: async ({ configVars }, params) => { // Set OpenAI API key from config const apiKey = configVars.openAiApiKey.fields.apiKey; setDefaultOpenAIKey(apiKey); // Create a simple agent const agent = new Agent({ name: "Assistant", instructions: "You help users with their requests", }); // Run the agent with user input const result = await run(agent, [ { role: "user", content: params.message }, ]); return { data: result.finalOutput }; }, }); ``` [Code Native Guide β†’](https://prismatic.io/docs/integrations/code-native/get-started/setup.md) ##### Expose integrations as AI tools with Agent Flows[​](#expose-integrations-as-ai-tools-with-agent-flows "Direct link to Expose integrations as AI tools with Agent Flows") Agent flows let you build workflows that can make decisions, call APIs, and interact with external systems autonomously. They're ideal for routing data, validating inputs, or triggering the right actions at the right time. Code-NativeLow-Code[](https://github.com/prismatic-io/examples/blob/main/integrations/code-native-integrations/outlook/src/flows/checkCalendarEvents.ts) ``` import { flow } from "@prismatic-io/spectral"; import { type CheckCalendarEventsInput } from "../schemas/flows"; import { createGraphClient, extractAccessToken } from "../services/graphClient"; export const checkCalendarEvents = flow({ name: "Check Calendar Events", description: "Check upcoming calendar events for a specified number of days ahead", schemas: { invoke: { properties: { daysAhead: { type: "number", description: "Number of days ahead to check", }, includeAllDay: { type: "boolean", description: "Include all-day events", }, }, required: ["daysAhead", "includeAllDay"], }, }, onExecution: async (context, params) => {/* workflow logic */}, }); ``` [Agent flow schema β†’](https://prismatic.io/docs/ai/flow-invocation-schema.md) ##### Combine AI with human oversight[​](#combine-ai-with-human-oversight "Direct link to Combine AI with human oversight") For critical workflows, combine automation with human approval to ensure accuracy and maintain trust. Approval tasks can be added to agent flows to pause and wait for review before proceeding. ![Incident Monitoring](/docs/img/ai/hitl-slack-2.png) ###### Reference Examples:[​](#reference-examples "Direct link to Reference Examples:") * **Slack Chatbot Agent** - Build interactive AI agents that request human approval via Slack before taking actions. See the [complete example on GitHub](https://github.com/prismatic-io/examples/tree/main/ai/slack-chatbot-agent). * **Human Approval Tools** - Use the OpenAI Agent component to create custom human-in-the-loop tools for your workflows. Learn how to [create human approval tools](https://prismatic.io/docs/components/openai/#agent-create-human-approval-tool). ##### Accelerate with MCP[​](#accelerate-with-mcp "Direct link to Accelerate with MCP") The Model Context Protocol (MCP) flow server connects AI agents to your integration platform. Choose the right MCP server based on your use case: **Two ways to use MCP:** * **Building with AI** - Use the **Prism MCP Server** in your IDE to accelerate integration development. Connect your AI copilot (Cursor, Windsurf, etc.) to generate code, publish integrations, and reference components directly from your development environment. 
[Get started with Prism MCP β†’](https://github.com/prismatic-io/prism-mcp) * **Running AI in Production** - Use the **Prismatic MCP Flow Server** to connect your production AI agents to customer integrations. This exposes your agent flows as tools that AI agents can invoke to interact with external systems and data. ``` { "mcpServers": { "prismatic": { "command": "npx", "args": ["mcp-remote", "https://mcp.prismatic.io/mcp"] } } } ``` [MCP flow server documentation β†’](https://prismatic.io/docs/ai/model-context-protocol.md) #### Common AI use cases[​](#common-ai-use-cases "Direct link to Common AI use cases") Explore practical patterns for implementing AI capabilities in your integrations. Each example includes working code and detailed implementation guides. [Data Enrichment](https://prismatic.io/docs/ai/how-to-guide.md#data-enrichment) [Add context and insights to existing data using AI analysis. Research entities, analyze sentiment, score records based on custom criteria, or augment data with external information.](https://prismatic.io/docs/ai/how-to-guide.md#data-enrichment) [See lead enrichment example β†’](https://prismatic.io/docs/ai/how-to-guide.md#data-enrichment) [AI Routing & Classification](https://prismatic.io/docs/ai/how-to-guide.md#ai-routing-and-classification) [Route data based on AI-powered content analysis. Detect duplicates, classify incoming data into categories, determine priority levels, or make branching decisions based on confidence thresholds.](https://prismatic.io/docs/ai/how-to-guide.md#ai-routing-and-classification) [See duplicate detection example β†’](https://prismatic.io/docs/ai/how-to-guide.md#ai-routing-and-classification) [Smart Data Extraction](https://prismatic.io/docs/ai/how-to-guide.md#smart-data-extraction) [Convert unstructured input into structured data using defined schemas. Parse free-form text, extract entities from documents, or transform logs into actionable records.](https://prismatic.io/docs/ai/how-to-guide.md#smart-data-extraction) [See log parsing example β†’](https://prismatic.io/docs/ai/how-to-guide.md#smart-data-extraction) [Conversational Interfaces](https://prismatic.io/docs/ai/how-to-guide.md#conversational-interfaces) [Build AI-powered interfaces that process natural language input. Handle user queries, execute commands through conversation, or create interactive workflows.](https://prismatic.io/docs/ai/how-to-guide.md#conversational-interfaces) [See Slack chatbot example β†’](https://prismatic.io/docs/ai/how-to-guide.md#conversational-interfaces) [Human-in-the-Loop Approvals](https://prismatic.io/docs/ai/how-to-guide.md#human-in-the-loop-approval-flows) [Gate AI actions behind human approval workflows. Pause execution for review, request confirmation before sensitive operations, or implement escalation paths.](https://prismatic.io/docs/ai/how-to-guide.md#human-in-the-loop-approval-flows) [See approval workflow example β†’](https://prismatic.io/docs/ai/how-to-guide.md#human-in-the-loop-approval-flows) [View all implementation details in Common Use Cases β†’](https://prismatic.io/docs/ai/how-to-guide.md) #### How to get started[​](#how-to-get-started "Direct link to How to get started") Follow these steps to build your first AI-enabled integration: 1. **Start with a template** - Choose an AI-enabled template that includes common patterns and best practices. [Browse templates β†’](https://prismatic.io/docs/integrations/low-code-integration-designer.md#integration-templates) 2. 
**Add AI components** - Insert a built-in component or create a custom one to add intelligence to your flows. [Add AI components β†’](https://prismatic.io/docs/components/openai.md) 3. **Experiment with agent flows** - Build autonomous workflows that can decide and act. [Agent flow guide β†’](https://prismatic.io/docs/ai/flow-invocation-schema.md) 4. **Enable MCP** - Use MCP servers to provide your AI agents with context and accelerate development. [MCP flow server setup β†’](https://prismatic.io/docs/ai/model-context-protocol.md) [Request AI preview β†’](https://prismatic.io/docs/request-a-demo) #### Best practices and considerations[​](#best-practices-and-considerations "Direct link to Best practices and considerations") To build reliable, secure AI integrations, keep these guidelines in mind: * **Pick the right approach** - Start with built-in components for common tasks; use custom connectors for specialized models; use agent flows for multi-step processes. * **Provide context and handle errors** - Use structured output for deterministic AI responses, implement fallback logic, and use timeouts where appropriate. * **Monitor and control usage** - Track API usage and costs; set rate limits and implement throttling. * **Secure your data** - Never expose API keys in client-side code; use environment variables for secrets; sanitize inputs and follow AI service security best practices. If you need more guidance, consult our documentation on custom connector development, integration patterns, and security configuration. #### Frequently Asked Questions[​](#frequently-asked-questions "Direct link to Frequently Asked Questions") ##### What's the difference between Prismatic MCP Flow Server and Prism MCP Server? Prismatic MCP Flow Server runs in production and gives AI agents access to your deployed integrations and data. Prism MCP Server runs in your IDE (like Kiro) and helps with development by providing AI-powered code generation and integration building assistance. [Learn more about MCP β†’](https://prismatic.io/docs/ai/model-context-protocol.md) ##### How is my data kept safe when using AI features? Your data is processed securely with encryption in transit and at rest. AI services only receive the specific data you configure to send, and you maintain full control over what information is shared with AI models. [Security documentation β†’](https://prismatic.io/docs/configure-prismatic.md) ##### Can I use my own custom LLMs or AI models? Yes! You can create custom components to connect to any AI service or model. The Spectral SDK provides the tools to build connectors for proprietary models, on-premises AI services, or specialized AI APIs. [Custom connector guide β†’](https://prismatic.io/docs/custom-connectors.md) ##### How do I get access to the AI preview features? AI features including agent flows and MCP servers are currently in preview. Contact our team to request early access and get started with AI-powered integrations. [Request preview access β†’](https://prismatic.io/docs/ai/connect-ai-agent.md) ##### Are there code samples and examples available? Yes, we provide comprehensive examples for AI components, agent flows, and MCP server implementations. You'll find code samples in our documentation and GitHub repositories. 
[Browse examples β†’](https://prismatic.io/docs/integrations.md) #### Resources[​](#resources "Direct link to Resources") [AI Component Library](https://prismatic.io/docs/components/openai.md) [Browse built-in AI components and their capabilities](https://prismatic.io/docs/components/openai.md) [Learn more β†’](https://prismatic.io/docs/components/openai.md) [AI Agent Guide](https://prismatic.io/docs/ai/connect-ai-agent.md) [Step-by-step guide to connecting AI agents with MCP](https://prismatic.io/docs/ai/connect-ai-agent.md) [Learn more β†’](https://prismatic.io/docs/ai/connect-ai-agent.md) [GitHub Examples](https://github.com/prismatic-io) [Sample integrations and AI component implementations](https://github.com/prismatic-io) [Learn more β†’](https://github.com/prismatic-io) --- #### Connect AI Agents to Prismatic The [Prismatic MCP flow server](https://prismatic.io/docs/ai/model-context-protocol.md) allows you to connect AI agents like OpenAI, Claude, or Cursor to your [agent flows](https://prismatic.io/docs/ai/flow-invocation-schema.md). Let's explore how to connect an AI agent to your agent flows. #### Connecting to all agent flows[​](#connecting-to-all-agent-flows "Direct link to Connecting to all agent flows") If you would like to connect your AI agent to all of your agent flows, use the global MCP endpoint for your [region](https://prismatic.io/docs/ai/model-context-protocol.md#prismatics-mcp-flow-server). #### Connecting to a specific integration's agent flows[​](#connecting-to-a-specific-integrations-agent-flows "Direct link to Connecting to a specific integration's agent flows") Suppose you have a Salesforce integration among dozens of other integrations, but you want to connect only to the agent flows associated with that Salesforce integration. In that case, open the MCP tab of your Salesforce integration and take note of the custom MCP endpoint. It will look like `https://mcp.prismatic.io/SW5...../mcp` (depending on your regions). ![Custom MCP endpoint for a specific integration](/docs/img/ai/integration-designer-mcp-menu.png) If your customers have instances of this integration deployed to them, they can also connect to the agent flows associated with their customer by using the same MCP endpoint. #### Authentication with Prismatic's MCP flow server[​](#authentication-with-prismatics-mcp-flow-server "Direct link to Authentication with Prismatic's MCP flow server") Prismatic's MCP flow server uses OAuth (similar to how you log in with the [prism CLI tool](https://prismatic.io/docs/cli.md)). When connecting an AI client to Prismatic's MCP flow server, your client will likely direct you to Prismatic's OAuth 2.0 consent screen in order to connect. Alternatively, you can run `prism me:token` to generate a temporary access token that you can use to authenticate your AI client. Provide the token as a bearer token to Prismatic's MCP flow server. ##### Authentication for embedded customer users[​](#authentication-for-embedded-customer-users "Direct link to Authentication for embedded customer users") If your AI client is embedded in your application (for example, you've built a chat bot with the [AI SDK](https://www.npmjs.com/package/ai)), you will need to [create an embedded JWT](https://prismatic.io/docs/embed/authenticate-users.md) for your customer user and provide that JWT as a bearer token to Prismatic's MCP flow server. 
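Whichever token you use (a temporary token from `prism me:token` for organization users, or an embedded JWT for customer users), it is provided the same way: as a bearer token on requests to the MCP endpoint. The following is a minimal, hypothetical sketch using the MCP TypeScript SDK to confirm that authentication works by listing the available tools; the `PRISMATIC_MCP_TOKEN` environment variable and client name are illustrative, and you may need to swap in the MCP endpoint for your region:

```
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Token from `prism me:token` (org users) or an embedded JWT (customer users).
const accessToken = process.env.PRISMATIC_MCP_TOKEN ?? "";

async function listAgentFlowTools() {
  // Point the transport at the global MCP endpoint (or an integration-specific
  // endpoint from the MCP tab) and pass the token as a bearer token.
  const transport = new StreamableHTTPClientTransport(
    new URL("https://mcp.prismatic.io/mcp"),
    {
      requestInit: {
        headers: { Authorization: `Bearer ${accessToken}` },
      },
    },
  );

  const client = new Client({ name: "example-mcp-client", version: "1.0.0" });
  await client.connect(transport);

  // Each tool corresponds to a tool-enabled agent flow, plus defaults like get-me.
  const { tools } = await client.listTools();
  for (const tool of tools) {
    console.log(`${tool.name}: ${tool.description ?? ""}`);
  }

  await client.close();
}

listAgentFlowTools().catch((err) => {
  console.error("Failed to connect to the MCP flow server:", err);
});
```

Listing tools is a quick way to confirm that your token works before wiring the flow server into an AI client.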
#### Connecting AI Agents to Prismatic's MCP flow server[​](#connecting-ai-agents-to-prismatics-mcp-flow-server "Direct link to Connecting AI Agents to Prismatic's MCP flow server") ##### General notes[​](#general-notes "Direct link to General notes") Prismatic's MCP flow server is designed to be used with any client that supports the current draft of the [Model Context Protocol](https://modelcontextprotocol.io/specification/draft/basic/overview) (MCP) specification. Specifically, your client must support the following: * [MCP OAuth](https://modelcontextprotocol.io/specification/draft/basic/authorization) as the means to manage authorization between your application and Prismatic, so your client implementation must provide this support. * Streamable HTTP transport, not stdio, nor the legacy HTTP SSE. Stand-alone clients can leverage the [mcp-remote](https://www.npmjs.com/package/mcp-remote) package to connect to Prismatic's MCP servers. `mcp-remote` handles OAuth connection and stores your Prismatic token for you in `$HOME/.mcp-auth/`. ##### Claude Desktop[​](#claude-desktop "Direct link to Claude Desktop") Here's a sample `claude_desktop_config.json` config file that will connect to Prismatic's MCP flow server. You may need to adjust the URL to match your region. claude\_desktop\_config.json ``` { "mcpServers": { "prismatic": { "command": "npx", "args": ["-y", "mcp-remote", "https://mcp.prismatic.io/mcp"] } } } ``` This will instruct Claude to use the `mcp-remote` shim to connect to Prismatic's MCP server. If successful, after editing your configuration file and *restarting* Claude, you should automatically be directed to an OAuth 2.0 consent screen to grant Claude access to your Prismatic environment. After authenticating, ask Claude "Who am I?". Claude will use Prismatic's `get-me` tool (which is a default tool you'll have access to from Prismatic's MCP server). ![Connecting Claude to Prismatic's MCP server](/docs/img/ai/connect-ai-agent/claude.png) ##### Claude Code[​](#claude-code "Direct link to Claude Code") Similar to Claude Desktop, claude code will use the `mcp-remote` package to connect to Prismatic's MCP flow server. Running this command will add Prismatic's MCP flow server to Claude Code: ``` claude mcp add-json prismatic '{"type":"stdio","command":"npx","args":["-y","mcp-remote","https://mcp.prismatic.io/mcp"]}' ``` You may need to run `claude mcp list` once in order for authentication to be successful. ![Connecting Claude Code to Prismatic's MCP server](/docs/img/ai/connect-ai-agent/claude-code.png) ##### Cursor[​](#cursor "Direct link to Cursor") To add Prismatic's MCP flow server to Cursor, add this JSON to your `mcp.json` configuration file: mcp.json ``` { "mcpServers": { "prismatic": { "type": "stdio", "command": "npx", "args": ["-y", "mcp-remote", "https://mcp.prismatic.io/mcp"], "env": {} } } } ``` Alternatively, [add Prismatic MCP to Cursor](cursor://anysphere.cursor-deeplink/mcp/install?name=prismatic\&config=ewogICJ0eXBlIjogInN0ZGlvIiwKICAiY29tbWFuZCI6ICJucHgiLAogICJhcmdzIjogWyIteSIsICJtY3AtcmVtb3RlIiwgImh0dHBzOi8vbWNwLnByaXNtYXRpYy5pby9tY3AiXQp9) directly. ![Connecting Cursor to Prismatic's MCP server](/docs/img/ai/connect-ai-agent/cursor.png) ##### Node AI SDK[​](#node-ai-sdk "Direct link to Node AI SDK") If you're building your own AI client using the [AI SDK](https://www.npmjs.com/package/ai), you can use the `createMCPClient` function to register Prismatic's MCP flow server with the SDK. 
You'll need to provide a `StreamableHTTPClientTransport` that points to Prismatic's MCP flow server, and also provide an [embedded JWT](https://prismatic.io/docs/embed/authenticate-users.md) for the customer user (it can be the same JWT you use for embedding Prismatic). When invoking your LLM with `streamText`, provide the tools fetched from Prismatic's MCP flow server to the `tools` option. Connecting to Prismatic's MCP flow server with the AI SDK ``` import { openai } from "@ai-sdk/openai"; import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js"; import { streamText, experimental_createMCPClient as createMCPClient, } from "ai"; // Allow streaming responses up to 30 seconds export const maxDuration = 30; // Fetch all tools available from Prismatic's MCP server given an embedded customer user's access token const getTools = async (prismaticAccessToken: string) => { const transport = new StreamableHTTPClientTransport( new URL("https://mcp.prismatic.io/mcp"), { requestInit: { headers: { Authorization: `Bearer ${prismaticAccessToken}`, }, }, }, ); const mcpClient = await createMCPClient({ transport: transport, onUncaughtError(error) { console.error("Error in MCP client:", error); throw error; }, }); // Remove the "get-me" tool if it exists const { "get-me": getMe, ...tools } = await mcpClient.tools(); return tools; }; // Handle chat completion requests from a chat bot export async function POST(req: Request) { const { messages } = await req.json(); const mcpTools = await getTools( (req.headers.get("Authorization") ?? "").replace("Bearer ", ""), ); const result = streamText({ model: openai("gpt-4o"), messages, tools: { ...mcpTools }, // TODO add built-in tools maxSteps: 20, onError: (error) => { console.error("Error in AI response:", error); throw error; }, }); return result.toDataStreamResponse(); } ``` A full example Next.js app is available on [GitHub](https://github.com/prismatic-io/examples/tree/main/ai/nextjs-chatbot). * [util/tools.ts](https://github.com/prismatic-io/examples/blob/main/ai/nextjs-chatbot/src/util/tools.ts) is a helper to fetch tools from Prismatic's MCP flow server. * [app/api/chat/route.ts](https://github.com/prismatic-io/examples/blob/main/ai/nextjs-chatbot/src/app/api/chat/route.ts) is the API route that handles chat completions. * [app/ai-chat/page.tsx](https://github.com/prismatic-io/examples/blob/main/ai/nextjs-chatbot/src/app/ai-chat/page.tsx) is the front-end chat UI. --- #### Flow Invocation Schemas Similar to other remote web services and APIs, Prismatic Flows can be invoked by AI agents when they are expressed as tools. To make a flow compatible with LLM calls, you must do two things: 1. Give the flow an **invocation schema** 2. Mark the flow as **tool-enabled** #### Invocation schema[​](#invocation-schema "Direct link to Invocation schema") The LLM must understand the structure of requests that the flow expects. 
For example, if you have a flow that fetches a person given their first and last name, it may expect a request body like this: ``` { "first": "string", "last": "string" } ``` The shape of the request can be expressed with JSON schema, which is a standard way to describe the shape of JSON data: Example invocation schema in JSON ``` { "$schema": "https://json-schema.org/draft/2020-12/schema", "title": "search-people", "$comment": "Given a first and last name of a person, search for matching people in Acme CRM", "type": "object", "properties": { "first": { "description": "A person's first name", "type": "string" }, "last": { "description": "A person's last name", "type": "string" } } } ``` **Optional**: The response that your flow returns can also be described using JSON schema. Here, we describe a response that returns an array of people records, each with an `id`, `name`, and `address`. Example result schema in JSON ``` { "$schema": "https://json-schema.org/draft/2020-12/schema", "title": "search-people-result", "$comment": "Returns any people records that match the search query", "type": "object", "properties": { "people": { "description": "An array of people who match the query", "items": { "properties": { "address": { "description": "Address of person in Acme CRM", "type": "string" }, "id": { "description": "ID of a person in Acme CRM", "type": "number" }, "name": { "description": "Full name of a person in Acme CRM", "type": "string" } }, "type": "object" }, "type": "array" } } } ``` ##### Adding invocation schema to a flow[​](#adding-invocation-schema-to-a-flow "Direct link to Adding invocation schema to a flow") To add an invocation schema to a flow, click on the flow's trigger and select the **Schemas** tab. ![Schemas tab in the flow designer](/docs/img/ai/flow-invocation-schema/edit-invocation-schema.png) Note that `invoke` schemas are required but `result` schemas are optional. Your agent flows must be synchronous In order to return a pertinent response to an AI agent, your flow must respond [synchronously](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md#synchronous-invocations-and-redirects). As a reminder, a synchronous flow will return the results of the last step of the flow to the caller, and must complete its work within 30 seconds. This very simple example integration receives a request with `first` and `last` properties, fetches users from [JSON Placeholder](https://jsonplaceholder.typicode.com/users), and returns a list of users whose names match the searched strings. When an AI client like Claude invokes this flow, it translates a human language prompt like: > Search people in Acme CRM whose first name includes "Clem" Into an HTTP request to the agent flow's webhook URL with a payload of `{ "first": "Clem" }`. ![Claude using an agent flow to fetch data](/docs/img/ai/flow-invocation-schema/claude-fetch-person.png) #### Marking a flow as tool-enabled[​](#marking-a-flow-as-tool-enabled "Direct link to Marking a flow as tool-enabled") After adding an invocation schema to your flow, you must also mark the flow as **tool-enabled**. This is done in the integration designer by clicking the MCP icon on the left side of the flow designer, and toggling the **Tool-enabled** switch for specific flows. 
![Marking a flow as tool-enabled](/docs/img/ai/integration-designer-mcp-menu.png) #### Querying for invocation schemas[​](#querying-for-invocation-schemas "Direct link to Querying for invocation schemas") The flows you build that have invocation schemas are considered **agent flows**. As a Prismatic organization user, you can programmatically query for agent flows that are associated with test instances (the instances used when testing an integration in the low-code designer). Customer users, on the other hand, can only query for agent flows that are associated with instances deployed to the customer they are logged in as. Information about these flows, including webhook URL and invokeSchema / resultSchema can be fetched from the Prismatic API using the `ai { agentFlows }` query: Query for deployed agent flows ``` query agentFlows { ai { agentFlows { nodes { id name description webhookUrl apiKeys invokeSchema resultSchema } } } } ``` The above query, when run by an organization team member, will return information about all agent flows for your test instances. If you are querying as a customer user using an embedded JWT, on the other hand, you will see only agent flows deployed to the customer you're logged in as. Agent flows response ``` { "data": { "ai": { "agentFlows": { "nodes": [ { "id": "SW5zdGFuY2VGbG93Q29uZmlnOmI2N2EzMjU2LWMzOTgtNDRkMy04Mzg0LWExZDkyMjZkN2JhYg==", "name": "search-people", "description": "Given a first and last name of a person, search for matching people in Acme CRM", "webhookUrl": "https://hooks.dev.prismatic-dev.io/trigger/SW5zdGFuY2VGbG93Q29uZmlnOmI2N2EzMjU2LWMzOTgtNDRkMy04Mzg0LWExZDkyMjZkN2JhYg==", "apiKeys": [], "invokeSchema": "{\"type\": \"object\", \"title\": \"search-people\", \"$comment\": \"Given a first and last name of a person, search for matching people in Acme CRM\", \"properties\": {\"last\": {\"type\": \"string\", \"description\": \"A person's last name\"}, \"first\": {\"type\": \"string\", \"description\": \"A person's first name\"}}}", "resultSchema": "{\"type\": \"object\", \"title\": \"search-people-result\", \"$comment\": \"Returns any people records that match the search query\", \"properties\": {\"people\": {\"type\": \"array\", \"items\": {\"type\": \"object\", \"properties\": {\"id\": {\"type\": \"number\", \"description\": \"ID of a person in Acme CRM\"}, \"name\": {\"type\": \"string\", \"description\": \"Full name of a person in Acme CRM\"}}}, \"description\": \"An array of people who match the query\"}}}" } ] } } } } ``` Once you have the `invokeSchema` and `webhookUrl`, you can [create tools](https://prismatic.io/docs/ai/connect-ai-agent.md) for AI agents (like OpenAI or Claude) to consume. #### Flow schema in code-native integrations[​](#flow-schema-in-code-native-integrations "Direct link to Flow schema in code-native integrations") To add flow schema to a [code-native flow](https://prismatic.io/docs/integrations/code-native/flows.md), add a `schemas.invoke` property to your flow definition. Additionally, you must set `isAgentFlow: true` on the flow definition. In this example, we define a flow that searches for people in Acme CRM by first and last name. Our invocation schema expects two optional string properties, `first` and `last`. 
Example code-native flow with invocation schema ``` import { flow } from "@prismatic-io/spectral"; import axios from "axios"; export const searchPeople = flow({ name: "Search People", stableKey: "search-people", description: "Search for People in Acme CRM", isAgentFlow: true, schemas: { invoke: { $schema: "https://json-schema.org/draft/2020-12/schema", $comment: "Given a first and last name of a person, search for matching people in Acme CRM", properties: { first: { description: "A person's first name", type: "string", }, last: { description: "A person's last name", type: "string", }, }, title: "search-people-in-acme", type: "object", }, }, isSynchronous: true, onExecution: async (context, params) => { const { first: firstNameSearch, last: lastNameSearch } = params.onTrigger .results.body.data as { first: string; last: string }; const { data: people } = await axios.get<{ name: string }[]>( "https://jsonplaceholder.typicode.com/users", ); const matchingPeople = people.filter((person) => { const [firstName, lastName] = person.name.split(" "); if (firstNameSearch) { if (!firstName.toLowerCase().includes(firstNameSearch.toLowerCase())) { return false; } } if (lastNameSearch) { if (!lastName.toLowerCase().includes(lastNameSearch.toLowerCase())) { return false; } } return true; }); return { data: matchingPeople }; }, }); export default [searchPeople]; ``` --- #### Common Use Cases This guide provides practical patterns for implementing AI capabilities in your Prismatic integrations. Each section explains the core concept, shows how to implement it, and provides real-world examples. #### Data Enrichment[​](#data-enrichment "Direct link to Data Enrichment") Enrich existing data by using AI agents equipped with tools. Agents gather information from multiple sources, analyze content, and augment records with AI-generated insights. ##### CRM Lead Enrichment[​](#crm-lead-enrichment "Direct link to CRM Lead Enrichment") Automatically research incoming leads and enrich them with company information, scoring, and insights before adding to your CRM. This example demonstrates researching a lead's company, analyzing fit criteria, and creating an enriched record in Salesforce. Code-NativeLow-Code[](https://github.com/prismatic-io/examples/tree/main/ai/salesforce-lead-enricher-and-routing) ``` /** * Enrich Incoming Lead Flow * * Researches incoming leads using web search, enriches them with company data, * and creates them in Salesforce. */ export const enrichIncomingLead = flow({ name: "Enrich Incoming Lead", description: "Research and enrich leads before creating them in Salesforce", onExecution: async (context, params) => { const { configVars } = context; const triggerPayload = params.onTrigger.results.body as TriggerPayload; // Step 1: Set up web search tool const setupWebResearchTool = await context.components.openai.createWebSearchTool({ name: "Web Search", searchContextSize: "high", }); // Step 2: Create lead research agent const createLeadResearchAgent = await context.components.openai.createAgent({ instructions: `When provided a new lead, attempt to research them based on their domain. 
Look for their industry, employee count, and the problem the company solves`, modelName: "gpt-5-mini-2025-08-07", name: "Lead Researcher", outputSchema: { type: "object", properties: { company: { type: "string", }, employeeCount: { type: "number", }, vertical: { type: "string", }, companyDescription: { type: "string", }, }, required: ["employeeCount", "vertical", "companyDescription"], additionalProperties: false, }, outputSchemaName: "output", outputSchemaStrict: false, tools: [setupWebResearchTool.data], }); // Step 3: Research and enrich the lead const researchAndEnrichLead = await context.components.openai.runAgent({ agentConfig: createLeadResearchAgent.data, maxTurns: "10", openaiConnection: configVars["OpenAI Connection"], userInput: `Research this lead on the web: Company: ${triggerPayload.data.company} Email: ${triggerPayload.data.email} Name: ${triggerPayload.data.firstName} ${triggerPayload.data.lastName}`, }); const enrichedData = researchAndEnrichLead.data.finalOutput; // Step 4: Create lead in Salesforce const createLead = await context.components.salesforce.createLead({ company: triggerPayload.data.company, connection: configVars["Salesforce Connection"], description: enrichedData.companyDescription, email: triggerPayload.data.email, employeeCount: enrichedData.employeeCount.toString(), firstName: triggerPayload.data.firstName, lastName: triggerPayload.data.lastName, leadStatus: "Open", version: "63.0", }); return { data: createLead }; }, }); ``` #### AI Routing and Classification[​](#ai-routing-and-classification "Direct link to AI Routing and Classification") Use AI to analyze incoming data and make intelligent decisions. The AI evaluates content against defined criteria, classifies it, and produces a structured output to enable branching and intelligent routing. ##### Duplicate Record Detection[​](#duplicate-record-detection "Direct link to Duplicate Record Detection") Prevent duplicate records by using AI to analyze and compare incoming data against existing records. This example queries for potential matches and uses AI classification to determine if an account already exists, with confidence thresholds to ensure accuracy. Code-NativeLow-Code[](https://github.com/prismatic-io/examples/salesforce-lead-enricher-and-routing) ``` /** * Check for Duplicates Flow * * Analyzes incoming leads against existing Salesforce accounts to identify and prevent * duplicate entries. Uses AI classification to determine similarity with high confidence. 
*/ export const checkForDuplicates = flow({ name: "Check for Duplicates", description: "Prevent duplicate lead creation by checking against existing Salesforce accounts", onExecution: async (context, params) => { const { configVars } = context; const incomingLead = params.onTrigger.results.body.data as IncomingLeadData; // Step 1: Query Salesforce for potential duplicate accounts const findAccounts = await context.components.salesforce.query({ connection: configVars["Salesforce Connection"], queryString: `SELECT Id, Name, Website FROM Account WHERE Name like '${incomingLead.company}%'`, version: "63.0", }); // Step 2: Use AI to classify if the lead is a duplicate const classification = await context.components.openai.classifyAndBranch({ openaiConnection: configVars["OpenAI Connection"], model: "gpt-5-mini-2025-08-07", branches: { Duplicate: "The name, domain, or firmographics suggest it is a duplicate", "Not a Duplicate": "The account appears to be unique based on the provided information.", Else: "You are unable to determine if it is a duplicate.", }, classificationInstructions: `Analyze the account and possible duplicates. Use all available information to determine if this is a duplicate account.`, inputText: `New Account: ${JSON.stringify(incomingLead)} Possible Duplicates: ${JSON.stringify(findAccounts.data.records)}`, }); // Step 3: Route based on classification result if (classification.data.selectedBranch === "Not a Duplicate") { // Create new lead in Salesforce const createLead = await context.components.salesforce.createLead({ connection: configVars["Salesforce Connection"], company: incomingLead.company, email: incomingLead.email, firstName: incomingLead.firstName, lastName: incomingLead.lastName, leadStatus: "Open", version: "63.0", }); return { data: createLead }; } return { data: null }; }, }); ``` #### Smart Data Extraction[​](#smart-data-extraction "Direct link to Smart Data Extraction") Transform unstructured content into structured data by defining JSON schemas that enforce consistent output formats. AI agents use these schemas to parse documents, logs, and other content into predictable, validated structures ##### Error Logs to Jira Issues[​](#error-logs-to-jira-issues "Direct link to Error Logs to Jira Issues") Convert application logs into structured Jira issues by extracting error details, severity, and priority. This example uses a JSON schema to enforce output structure, ensuring the AI returns data in the exact format needed for ticket creation. Code-NativeLow-Code[](https://github.com/prismatic-io/examples/tree/main/ai/jira-issues-from-error-logs) ``` /** * Create Jira Issue for Error Logs Flow * * This flow automatically analyzes system error logs and creates Jira issues * for significant errors that require attention. It uses AI to intelligently * classify errors, determine their severity, and generate appropriate ticket * descriptions. */ export const createJiraIssueForErrorLogs = flow({ name: "Create Jira Issue for Error Logs", description: "Automatically analyze error logs and create Jira issues for critical errors using AI", onExecution: async (context, params) => { const { configVars } = context; const logData = params.onTrigger.results.body.data as TriggerPayload; // Step 1: Create AI agent for log analysis const createLogAnalyzer = await context.components.openai.createAgent({ instructions: `You are a log analyzer that creates Jira issues from system errors. ## Your Task 1. Identify the main error in the logs 2. 
Extract the details necessary to create a Jira issue ## Priority Rules - CRITICAL logs or customer-facing errors β†’ High - ERROR logs β†’ Medium - WARN logs β†’ Low ## Severity Scale 1. **Minimal** - Cosmetic issue, no functional impact 2. **Minor** - Small feature affected, easy workaround exists 3. **Moderate** - Feature degraded, some users impacted 4. **Major** - Feature broken, many users affected 5. **Critical** - System down, data loss, or security issue ## Confidence Score Rate 0.0 to 1.0 based on: - Clear error with stack trace β†’ 0.8-1.0 - Timeout or connection issue β†’ 0.6-0.8 - Warning that might be transient β†’ 0.3-0.5 - Unclear if action needed β†’ 0.0-0.3 ## Important Guidelines - Keep the title clear and actionable`, mcpServers: [], modelName: "gpt-5-mini-2025-08-07", name: "Log Analysis Expert", outputSchema: JSON.stringify(JIRA_ISSUE_SCHEMA), outputSchemaName: "jira_issue_output", outputSchemaStrict: false, tools: [], }); // Step 2: Analyze logs and extract Jira issue data const extractJiraIssueInputs = await context.components.openai.runAgent({ agentConfig: createLogAnalyzer.data, maxTurns: "10", openaiConnection: configVars["OpenAI Connection"], previousResponseId: "", userInput: `Analyze the following logs and attempt to extract the necessary fields to create a Jira issue:\n\n${JSON.stringify( logData, )}`, }); const issueData = extractJiraIssueInputs.data.finalOutput; // Step 3: Check confidence threshold before creating issue if (issueData.confidence < 0.3) { return { data: { message: "No significant errors requiring Jira ticket", confidence: issueData.confidence, analysis: issueData, success: true, }, }; } // Step 4: Create Jira issue const createIssue = await context.components.atlassianJira.createIssue( { issueTypeId: configVars["Issue Type"], projectId: configVars["Project"], summary: issueData.title, description: issueData.description, jiraConnection: configVars["Jira Connection"], }, ); // Step 5: Return comprehensive result return { data: { jiraIssue: createIssue.data, analysis: { title: issueData.title, priority: issueData.priority, severity: issueData.severity, confidence: issueData.confidence, }, success: true, }, }; }, }); /** * Jira issue output schema for structured data extraction */ const JIRA_ISSUE_SCHEMA = { $schema: "http://json-schema.org/draft-07/schema#", title: "JiraIssueOutput", type: "object", required: [ "title", "description", "priority", "issue_type", "severity", "confidence", ], properties: { title: { type: "string", maxLength: 100, description: "Brief description of the error", }, description: { type: "string", description: "Detailed description including what happened, when, and error details", }, priority: { type: "string", enum: ["High", "Medium", "Low"], description: "Issue priority level", }, issue_type: { type: "string", enum: ["Bug"], description: "Type of Jira issue. Always capitalized", }, severity: { type: "integer", minimum: 1, maximum: 5, description: "Impact severity (1=minimal, 5=critical)", }, confidence: { type: "number", minimum: 0.0, maximum: 1.0, description: "Confidence score that this needs a Jira ticket", }, }, additionalProperties: false, }; ``` ##### Extract Invoices from PDFs in Dropbox[​](#extract-invoices-from-pdfs-in-dropbox "Direct link to Extract Invoices from PDFs in Dropbox") Automatically process PDF invoices from a Dropbox folder by extracting structured data and creating records in your accounting system. 
This example shows how to combine file monitoring, PDF parsing, and AI extraction with schema validation. Code-NativeLow-Code[](https://github.com/prismatic-io/examples/tree/main/ai/dropbox-extract-receipt-from-pdf) ``` /* Import Receipts from PDFs Flow This flow automatically processes PDF files from a Dropbox folder, identifies receipts/invoices using AI classification, and extracts structured data from valid documents. The flow runs every 5 minutes and performs the following steps: 1. Lists all files in the configured Dropbox import folder 2. Downloads each PDF file 3. Uploads files to OpenAI for processing 4. Classifies documents to identify receipts/invoices 5. Extracts structured data from valid receipts @returns Extracted receipt data or empty object if no valid receipts found */ export const importReceiptsFromPdFs = flow({ name: "Import Receipts from PDFs", description: "Automatically process PDF receipts from Dropbox, classify documents, and extract structured receipt data using AI", onExecution: async (context) => { const { configVars } = context; const processedReceipts: ExtractedReceipt[] = []; const listImportFolder = await context.components.dropbox.listFolder({ dropboxConnection: configVars["Dropbox Connection"], path: configVars["Import Folder"], }); // Step 1: Process each file for (const file of listImportFolder.data.result.entries) { // Step 2: Download the file from Dropbox const downloadFile = await context.components.dropbox.downloadFile({ dropboxConnection: configVars["Dropbox Connection"], path: file.path_lower, }); // Step 3: Upload file to OpenAI for processing const uploadFile = await context.components.openai.uploadFile({ connection: configVars["OpenAI Connection"], file: downloadFile.data as any, filename: file.name, purpose: "assistants", timeout: 10000, }); // Step 4: Classify the document to determine if it's a receipt/invoice const agentClassifyAndBranch = await context.components.openai.classifyAndBranch( { agentMcpServers: [], agentTools: [], branches: { "Needs Processing": "The analyzed file is an invoice or receipt that contains transaction data.", }, classificationInstructions: `Analyze the provided file carefully. Determine if it is an invoice or receipt that should be processed. A document should be classified as "Needs Processing" if it contains: - Transaction details (items, prices, totals) - Vendor/store information - Date of transaction - Receipt or invoice number If the document doesn't contain these elements or you cannot determine its type, return the "Else" branch. Always return the required output schema with confidence and reasoning.`, fileIds: [uploadFile.data.id], inputText: `Analyze this PDF file and determine if it's a receipt or invoice that contains extractable transaction data.`, model: "gpt-5-2025-08-07", openaiConnection: configVars["OpenAI Connection"], }, ); // Step 5: Extract structured data from the receipt/invoice if (agentClassifyAndBranch.branch === "Needs Processing") { // Create pdf extraction ai agent const pdfExtractionAgent = await context.components.openai.createAgent({ instructions: `You are an expert at analyzing PDFs and extracting receipt and invoice information. Your task is to: 1. Carefully read and analyze the entire document 2. Extract all transaction details including items, prices, and totals 3. Identify store/vendor information 4. Extract dates in ISO format 5. Ensure all numeric values are accurate 6. 
If a receipt ID is not visible, generate one based on the store name and date Be thorough and accurate in your extraction.`, mcpServers: [], modelName: "gpt-5-mini-2025-08-07", name: "PDF Receipt Data Extractor", outputSchema: JSON.stringify(RECEIPT_SCHEMA), outputSchemaName: "receipt_data", outputSchemaStrict: false, tools: [], }); // Run the extraction agent against the uploaded pdf const extractedReceipt = await context.components.openai.runAgent<{ data: { finalOutput: ExtractedReceipt }; }>({ agentConfig: pdfExtractionAgent.data, maxTurns: "10", openaiConnection: configVars["OpenAI Connection"], fileIds: [uploadFile.data.id], userInput: `Please analyze this PDF document and extract all receipt/invoice information according to the provided schema. Be thorough in identifying all line items, calculating totals, and extracting vendor information.`, }); processedReceipts.push(extractedReceipt.data.finalOutput); } } // Return summary of processed receipts return { data: { processedReceipts, summary: { totalProcessed: processedReceipts.length, totalFiles: listImportFolder.data.result.entries.length, success: true, }, }, }; }, }); ``` #### Conversational Interfaces[​](#conversational-interfaces "Direct link to Conversational Interfaces") Build AI-powered chat interfaces that maintain conversation context across multiple interactions. Agents store and retrieve chat history to understand context, process natural language, and execute workflows based on user intent. ##### Slack Assistant[​](#slack-assistant "Direct link to Slack Assistant") Create an AI assistant that responds to user messages in Slack threads, providing application intelligence directly in the customer's Slack. Code-NativeLow-Code[](https://github.com/prismatic-io/examples/tree/main/ai/slack-chatbot-agent) ``` export const eventHandler = flow({ name: "Slack Message Handler", description: "Handles Slack Events and generates responses with OpenAI Assistant SDK", onExecution: async (context, params) => { const { configVars, customer, integration, instanceState } = context; const connection = configVars["Slack Connection"]; const openaiKey = util.types.toString( configVars.OPENAI_API_KEY.fields.apiKey, ); const prismaticRefreshToken = util.types.toString( configVars.PRISMATIC_REFRESH_TOKEN, ); // Set OpenAI API key globally setDefaultOpenAIKey(openaiKey); // Build agent tools const tools = await buildTools( customer.externalId !== "testCustomerExternalId" ? 
customer : undefined, prismaticRefreshToken, integration.id, ); const agent = new Agent({ name: "Slack Assistant", instructions: configVars.SYSTEM_PROMPT, tools, }); const executionId = params.onTrigger.results.executionId; // Create slack assistant const assistant = new Assistant({ userMessage: async (args) => { const { client, message, logger, setStatus } = args; if ( !("text" in message) || !("thread_ts" in message) || !message.text || !message.thread_ts ) { return; } setStatus("is typing..."); const conversationId = message.thread_ts; const userInput = message.text; try { // Get stored state for this conversation const convState = instanceState[conversationId]; const lastResponseId = convState.lastResponseId; // Run the agent with the message const result = await run(agent, [user(userInput)], { previousResponseId: lastResponseId, }); // Handle interruptions if (result.interruptions && result.interruptions.length > 0) { const firstInterruption = result.interruptions[0]; // Store state in instanceState instanceState[conversationId] = { state: result.state.toString(), lastResponseId: result.lastResponseId, pendingInterruption: { functionId: firstInterruption.rawItem.id!, name: firstInterruption.rawItem.name, arguments: firstInterruption.rawItem.arguments, }, }; // Post approval block await client.chat.postMessage({ channel: message.channel, thread_ts: message.thread_ts, blocks: createApprovalBlocks( firstInterruption.rawItem.name, firstInterruption.rawItem.arguments, executionId, ), text: `Approval required for tool: ${firstInterruption.rawItem.name}`, metadata: { event_type: "tool_approval", event_payload: { conversationId }, }, }); } else { // Store lastResponseId for next message instanceState[conversationId] = { lastResponseId: result.lastResponseId, }; // Post response await client.chat.postMessage({ channel: message.channel, thread_ts: message.thread_ts, text: result.finalOutput || "I couldn't generate a response.", metadata: { event_type: "execution_id", event_payload: { execution_id: executionId }, }, }); } } catch (e) { await args.say({ text: "I encountered an error processing your request. Please try again.", }); } }, threadStarted: async (args) => { await args.say("Hi! I'm your AI assistant. How can I help you today?"); await args.saveThreadContext(); }, }); const actionHandlers: ActionHandlers = { onToolApproval: async ({ approved, previousExecutionId, userId, conversationId, channelId, client, updateMessage, }) => { // Get stored state for this conversation const convState = instanceState[conversationId] as ConversationState; // Deserialize and apply users decision let agentState = await RunState.fromString(agent, convState.state); const interrupts = agentState.getInterruptions(); const interrupt = interrupts[0]; if (approved) { agentState.approve(interrupt); } else { agentState.reject(interrupt); } // Update message to show decision await updateMessage( approved ? 
`βœ… Tool execution approved by <@${userId}>` : `❌ Tool execution denied by <@${userId}>`, ); // Continue execution const result = await run(agent, agentState); instanceState[conversationId] = { lastResponseId: result.lastResponseId, } as ConversationState; // Post final response await client.chat.postMessage({ channel: channelId, thread_ts: conversationId, text: result.finalOutput || "Task completed.", metadata: { event_type: "execution_id", event_payload: { execution_id: executionId, }, }, }); }, }; const app = App(connection, { assistant, actionHandlers }); const handler = await app.start(); await handler(params.onTrigger.results); return { data: { result: "Event processed successfully", }, }; }, }); ``` ##### Next.js Chatbot[​](#nextjs-chatbot "Direct link to Next.js Chatbot") Expose your Prismatic integrations as tools for external AI applications using the Model Context Protocol (MCP). This Next.js example demonstrates how to discover and invoke your deployed integration flows from a standalone AI chat interface. [Try the example β†’](https://github.com/prismatic-io/examples/tree/main/ai/nextjs-chatbot) ``` import { generateToken } from "@/util/token"; import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js"; import { experimental_createMCPClient as createMCPClient } from "ai"; import { getTools } from "@/util/tools"; import { openai } from "@ai-sdk/openai"; import { streamText } from "ai"; // Allow streaming responses up to 30 seconds export const maxDuration = 30; // Utility function to get available tools from Prismatic MCP export const getTools = async () => { const prismaticAccessToken = generateToken(); const MCP_URL = process.env.MCP_URL || "https://mcp.prismatic.io/mcp"; const transport = new StreamableHTTPClientTransport(new URL(MCP_URL), { requestInit: { headers: { Authorization: `Bearer ${prismaticAccessToken}`, }, }, }); const mcpClient = await createMCPClient({ transport: transport, onUncaughtError(error) { console.error("Error in MCP client:", error); throw error; }, }); const tools = await mcpClient.tools(); return tools; }; // Chat API Endpoint using Prismatic flows as tools export async function POST(req: Request) { const { messages } = await req.json(); // Fetch prismatic tools const mcpTools = await getTools(); const result = streamText({ model: openai("gpt-5"), messages, tools: { ...mcpTools }, maxSteps: 20, onError: (error) => { console.error("Error in AI response:", error); throw error; }, }); return result.toDataStreamResponse(); } ``` #### Human-in-the-Loop Approval Flows[​](#human-in-the-loop-approval-flows "Direct link to Human-in-the-Loop Approval Flows") Combine AI automation with human oversight by restricting tool access and implementing approval gates for sensitive operations. Define permission levels (read vs. write), create approval workflows that pause execution for human review, and maintain audit logs. ##### API Operation Tools Requiring Approval[​](#api-operation-tools-requiring-approval "Direct link to API Operation Tools Requiring Approval") Gate sensitive API operations behind human approval workflows. This example shows how to differentiate between read-only operations (no approval needed) and write operations (approval required), with a mechanism to pause execution and resume after human review. 
Code-NativeLow-Code[](https://github.com/prismatic-io/examples/blob/main/ai/openai-agent/src/flows/agentWithApprovals.ts) ``` export const approvalFlow = flow({ name: "Approval Flow", description: "Demonstrates wrapping REST APIs as AI tools for interaction", onExecution: async ({ configVars }, params) => { const openaiKey = util.types.toString( configVars.OPENAI_API_KEY.fields.apiKey, ); // Set the OpenAI API key setDefaultOpenAIKey(openaiKey); // Create agent with API tools const agent = new Agent({ name: "API Assistant", instructions: `You are an API assistant that helps users interact with their data. Use the available tools to fulfill user requests.`, tools: [ // Read-only tools apiTools.getCurrentUserInfo, apiTools.getPosts, apiTools.getPost, apiTools.getPostComments, // Write tools apiTools.createPost, //needsApproval: true apiTools.updatePost, //needsApproval: true ], }); // Get the message from the payload const { message, conversationId, lastResponseId, state, interruptions: userResponses, } = params.onTrigger.results.body.data as ChatRequest; if (userResponses && state) { let agentState = await RunState.fromString(agent, state); agentState = updateStateWithUserResponse( agentState, agentState.getInterruptions(), userResponses, ); const result = await run(agent, agentState); const interruptions: Interruption[] = handleInterrupt( result.interruptions, ); return { data: { response: interruptions.length > 0 ? undefined : result.finalOutput, interruptions, lastResponseId: result.lastResponseId, conversationId, state: result.state.toString(), }, }; } else { if (!message) { throw new Error("Message is required to run the agent"); } // Run the agent with the message const result = await run(agent, [user(message)], { previousResponseId: lastResponseId, }); const interruptions: Interruption[] = handleInterrupt( result.interruptions, ); return { data: { response: interruptions.length > 0 ? undefined : result.finalOutput, interruptions, lastResponseId: result.lastResponseId, conversationId, state: result.state.toString(), }, }; } }, }); function updateStateWithUserResponse( state: RunState>, interrupts: RunToolApprovalItem[], userResponses: Interruption[], ) { for (const userResponse of userResponses) { const interrupt = interrupts.find( (i) => i.rawItem.id === userResponse.functionId, ); if (interrupt) { if (userResponse.approved) { state.approve(interrupt); } else { state.reject(interrupt); } } } return state; } function handleInterrupt(interrupts: RunToolApprovalItem[]): Interruption[] { if (interrupts.length === 0) { return []; } const userApprovalItems = interrupts.map((intr) => ({ functionId: intr.rawItem.id!, name: intr.rawItem.name, approved: false, arguments: intr.rawItem.arguments, })); return userApprovalItems; } ``` ##### Incident Monitoring Slackbot[​](#incident-monitoring-slackbot "Direct link to Incident Monitoring Slackbot") Monitor backend services for errors and automatically alert your team via Slack with AI-generated summaries. Enable operators to trigger deeper investigation or create incidents directly from Slack, combining automated detection with human decision-making. ![Incident Monitoring](/docs/img/ai/hitl-slack-2.png) Code-NativeLow-Code[](https://github.com/prismatic-io/examples/tree/main/ai/slack-acme-incident-monitoring) ``` /* Flow for processing new incident alerts with AI agent assistance. This flow: 1. Creates tools for the AI agent (get on-call staff, create incident) 2. Configures an AI agent with incident response capabilities 3. 
Runs the agent to process the alert 4. If approval is needed, posts an interactive message to Slack 5. Stores agent state for resumption when approval is received */ export const newIncidentAlert = flow({ name: "New Incident Alert", description: "Create a new incident from an incoming alert", onExecution: async (context, params) => { const { configVars } = context; // Setup tools for the agent const agentCreateIncidentTool = await context.components.openai.createFlowTool({ flowName: "Create Incident", requiresApproval: true, strictMode: false, toolDescription: "Create a new incident using the provided description", }); const agentGetOnCallStaffTool = await context.components.openai.createFlowTool({ flowName: "Get On Call Staff", requiresApproval: false, strictMode: false, toolDescription: "Get On Call Staff", }); // Create the AI agent with our configuration const agentCreateAssistantAgent = await context.components.openai.createAgent({ instructions: AGENT_INSTRUCTIONS, mcpServers: [], modelName: "gpt-5-2025-08-07", name: "Acme SaaS Assistant", outputSchema: JSON.stringify(INCIDENT_RESPONSE_SCHEMA), outputSchemaName: "output", outputSchemaStrict: false, tools: [ agentCreateIncidentTool.data, agentGetOnCallStaffTool.data, ], }); // Prepare the alert input for the agent const setupAlertInputPrompt = `You must create a new incident from the provided alert for the on-call user. First, use a tool to get the on call staff, second create an incident using the create incident tool from the following alert. \nAlert Detected: ${JSON.stringify( params.onTrigger.results.body.data, )}`; // Run the agent to process the alert const runAgentCreateIncident = await context.components.openai.runAgent({ agentConfig: agentCreateAssistantAgent.data, fileIds: [], handoffs: [], history: "", maxTurns: "10", openaiConnection: configVars["OpenAI Connection"], previousResponseId: "", userInput: setupAlertInputPrompt, }); // Handle approval interruptions if (runAgentCreateIncident.data.hasInterruptions) { const approvalRequest = runAgentCreateIncident.data.pendingApprovals?.[0].approvalRequest; const approvalArgs = { ...JSON.parse(runAgentCreateIncident.data.pendingApprovals[0].arguments), approvalRequest, }; // Build approval message blocks const createApprovalBlocks = buildApprovalMessage(approvalArgs); // Post approval request to Slack await context.components.slack.postBlockMessage({ channelName: configVars["Alert Channel"], connection: configVars["slackConnection"], blocks: createApprovalBlocks as any, message: "An approval is required to create a new incident", }); // Store agent state for resumption after approval const crossFlowState = context.crossFlowState; crossFlowState[approvalArgs.anomaly_id] = { ...runAgentCreateIncident.data, agentConfig: agentCreateAssistantAgent.data, }; return { data: { interrupted: true }, crossFlowState, }; } }, }); ``` Code-NativeLow-Code[](https://github.com/prismatic-io/examples/tree/main/ai/slack-acme-incident-monitoring) ``` /* Handles Slack events and interactions for the incident management system. This flow processes approval actions from Slack buttons when users decide whether to create an incident from an anomaly alert. Integration flow: 1. newIncidentAlert flow detects anomaly and requests approval 2. User clicks approve/investigate/ignore button in Slack 3. This flow processes the interaction and resumes the AI agent 4. Agent completes the incident creation or rejection 5. 
Result is posted back to Slack */ export const handleSlackEventsAndInteractions = flow({ name: "Handle Slack Events and Interactions", onExecution: async (context, params) => { const triggerResults = params.onTrigger.results.body; // Decode the URL-encoded payload from Slack const rawBody = util.types.toString(triggerResults.data); const formData = new URLSearchParams(rawBody); const payloadString = formData.get("payload"); if (!payloadString) { console.log("No payload found in request body"); return { data: { error: "No payload found" } }; } // Parse the JSON payload const interactionPayload = JSON.parse(payloadString) as any; // Build response data const responseData = { type: interactionPayload.type, }; // Add common fields if (interactionPayload.trigger_id) { responseData.trigger_id = interactionPayload.trigger_id; } if (interactionPayload.user) { responseData.user = interactionPayload.user; } // Process different interaction types switch (interactionPayload.type) { case "block_actions": { const blockAction = interactionPayload as BlockAction; responseData.actions = blockAction.actions; responseData.response_url = blockAction.response_url; responseData.container = blockAction.container; // Process approval action const action = blockAction.actions[0]; // Parse the action value const approvalAction = parseApprovalAction(action); const { anomalyId, functionId, approved } = approvalAction; try { const storedState = retrieveStoredAgentState(context, anomalyId); const pendingApprovals = storedState.pendingApprovals || []; const matchingApproval = findMatchingApproval( pendingApprovals, functionId, ); // Create approval response const approvalResponses = createApprovalResponse( functionId, approved, action.action_id, ); // Resume the agent with approval response const resumeResult = await resumeAgent( context, storedState, approvalResponses, ); // Handle the agent's final output const finalOutput = resumeResult.data.finalOutput; await postIncidentResult(context, finalOutput); // Update the original approval message await updateApprovalMessage( context, blockAction, approved, action.action_id, ); // Clean up stored state await cleanupStoredState(context, anomalyId); responseData.handled = true; responseData.anomalyId = anomalyId; responseData.finalOutput = finalOutput; } catch (error) { console.error("Error handling approval:", error); responseData.error = error instanceof Error ? error.message : String(error); } break; } default: console.log("Unsupported interaction type:", interactionPayload); responseData.raw = interactionPayload; } return { data: responseData }; }, }); ``` --- #### MCP Flow Server [MCP](https://modelcontextprotocol.io/) is an open protocol, introduced by Anthropic, that standardizes how LLMs request context from applications. When an LLM connects to an MCP server, it requests a list of **tools** that the MCP server offers. These tools can perform actions such as "Find people with a specified name in HubSpot" or "Add an event to my Google Calendar." An MCP server offers a standardized schema so LLMs know to send a first and last name to the "find people" tool or an event name and date to the "add event" tool. **How does Prismatic's MCP Flow Server work?** In the Prismatic MCP Flow Server, the **tools** are implemented as [agent flows](https://prismatic.io/docs/ai/flow-invocation-schema.md). 
By using [agent flows](https://prismatic.io/docs/ai/flow-invocation-schema.md) as your tools, you can add validation and better error handling using either the low-code UI or the full power of code-native integrations to build your flows. Prismatic offers a built-in MCP flow server which allows you to connect to all of your [agent flows](https://prismatic.io/docs/ai/flow-invocation-schema.md). This is the best-supported approach for interacting with the Prismatic platform via MCP, but if you have specific use cases, you can opt to [build your own MCP server](https://prismatic.io/docs/ai/model-context-protocol.md#building-your-own-mcp-flow-server-using-prismatic-flows) and still interact with your Prismatic agent flows. Before connecting Prismatic's MCP flow server (or before building your own), ensure that you've created [agent flows](https://prismatic.io/docs/ai/flow-invocation-schema.md) that have an invocation schema, so your LLM has something to query. **Organization team members have access to agent flows of test instances**: When querying for agent flows as an organization team member, you will receive a list of agent flows within your test instances (the instances you interact with in the integration designer). You will not see your customers' instances' agent flows. Customer users, on the other hand, will see agent flows of production instances deployed to their customer. #### Prismatic's MCP flow server[​](#prismatics-mcp-flow-server "Direct link to Prismatic's MCP flow server") Prismatic offers a hosted MCP flow server that can be used to connect to your agent flows. Below are MCP endpoints for Prismatic's public regions. If your organization uses a private stack, please contact support for your MCP endpoint. | Region | MCP Endpoint | | ----------------------- | ------------------------------------- | | US Commercial (default) | `mcp.prismatic.io/mcp` | | US GovCloud | `mcp.us-gov-west-1.prismatic.io/mcp` | | Europe (Ireland) | `mcp.eu-west-1.prismatic.io/mcp` | | Europe (London) | `mcp.eu-west-2.prismatic.io/mcp` | | Canada (Central) | `mcp.ca-central-1.prismatic.io/mcp` | | Australia (Sydney) | `mcp.ap-southeast-2.prismatic.io/mcp` | | Private Stack | `mcp./mcp` | These global MCP endpoints will provide your AI agent with access to all agent flows. **Connecting to a specific integration's agent flows**: If you would like the MCP server to return only the agent flows for a specific integration, open the MCP tab of your integration and take note of the custom MCP endpoint. It will look like `https://mcp.prismatic.io/SW5...../mcp`. ![Custom MCP endpoint for a specific integration](/docs/img/ai/integration-designer-mcp-menu.png) ##### Connecting an MCP Client to Prismatic's MCP flow server[​](#connecting-an-mcp-client-to-prismatics-mcp-flow-server "Direct link to Connecting an MCP Client to Prismatic's MCP flow server") See [Connecting AI Agents](https://prismatic.io/docs/ai/connect-ai-agent.md) for more details on how to connect an AI agent (like Claude, OpenAI, Cursor or other agents) to Prismatic's MCP flow server. #### Building your own MCP flow server using Prismatic flows[​](#building-your-own-mcp-flow-server-using-prismatic-flows "Direct link to Building your own MCP flow server using Prismatic flows") If you would like to build your own MCP flow server and instruct it to interact with **agent flows** in Prismatic, your MCP flow server must be told which flows are available to be queried.
This example code instructs an MCP flow server to fetch agent flows from Prismatic's API, and creates a [resource](https://modelcontextprotocol.io/specification/2025-03-26/server/resources) for each: ``` import { type JSONSchema, JSONSchemaToZod, } from "@dmitryrechkin/json-schema-to-zod"; import { McpServer, type ToolCallback, } from "@modelcontextprotocol/sdk/server/mcp.js"; import { gql, request } from "graphql-request"; const STACK_BASE_URL = "https://app.prismatic.io"; // Change this to your own region const PRISMATIC_API_KEY = ""; interface AgentFlowResponse { ai: { agentFlows: { nodes: { id: string; name: string; description: string; webhookUrl: string; apiKeys: string[]; invokeSchema: string; resultSchema: string; }[]; }; }; } // Query for agentFlows const result = await request( new URL("/api", STACK_BASE_URL).toString(), gql` query agentFlows { ai { agentFlows { nodes { id name description webhookUrl apiKeys invokeSchema resultSchema } } } } `, {}, { Authorization: `Bearer ${PRISMATIC_API_KEY}`, }, ); // Customize MCP server to allow custom tool registrations, particularly // for use with JSON Schema (as, by default, only zod schemas are supported). class DynamicMcpServer extends McpServer { jsonSchemaTool( name: string, description: string, schema: JSONSchema, cb: ToolCallback, ): void { const zodSchema = JSONSchemaToZod.convert(schema) as any; super.tool(name, description, zodSchema.shape, cb); } } const server = new DynamicMcpServer({ name: "person-getter", version: "1.0.0", capabilities: { resources: {}, tools: {}, }, }); // For each flow, create a for (const flow of result.ai.agentFlows.nodes) { server.jsonSchemaTool( flow.name, flow.description, JSON.parse(flow.invokeSchema), async (args) => { console.log("flow invoke", flow.name, args); const result = await fetch(flow.webhookUrl, { method: "POST", body: JSON.stringify(args), headers: { "Content-Type": "application/json", "prismatic-synchronous": "true", }, }); return { content: [{ type: "text", text: await result.text() }] }; }, ); } ``` --- #### Documentation for LLMs Prismatic provides documentation optimized for large language models (LLMs). * [llms.txt](https://prismatic.io/docs/llms.txt) is a directory of all documentation pages optimized for LLMs. * [llms-full.txt](https://prismatic.io/docs/llms-full.txt) is a single file containing all documentation content optimized for LLMs. Additionally, an LLM-optimized version of any article can be accessed by replacing the trailing slash of its URL with `.md`. For example, * This article's LLM-optimized version can be found at `/docs/ai/prismatic-docs.md`. * `/docs/integrations/low-code-integration-designer/get-started/first-integration/` becomes `/docs/integrations/low-code-integration-designer/get-started/first-integration.md` --- ### Build Integrations #### Integrations Overview An **integration** is a collection of logical flows and steps that move data between your app and another app that your customers use. When you build an integration, you can build it in the [low-code designer](https://prismatic.io/docs/integrations/low-code-integration-designer.md) or as a TypeScript project in your preferred IDE. We call an integration built with code a [code-native integration](https://prismatic.io/docs/integrations/code-native.md) (or CNI). 
![Low-code or code-native](/docs/img/integrations/overview/low-code-or-code-native.png) An **integration** built with the low-code builder consists of a series of [steps](https://prismatic.io/docs/integrations/low-code-integration-designer/steps.md) that execute sequentially. Each step runs an **action** - a discrete piece of code designed to perform a specific task. Actions can be operations like "HTTP - GET" to fetch the contents of a webpage from the internet, or "Amazon S3 - Put Object" to save a file to Amazon S3. You can use a combination of actions from common [built-in components](https://prismatic.io/docs/components.md) and your own [custom components](https://prismatic.io/docs/custom-connectors.md) to build an integration. A code-native **integration** is a set of flows (functions) written in TypeScript that run when a trigger fires. An integration starts when its [trigger](https://prismatic.io/docs/integrations/triggers.md) fires. Triggers can either follow a [schedule](https://prismatic.io/docs/integrations/triggers/schedule.md) or can be invoked via a [webhook URL](https://prismatic.io/docs/integrations/triggers/webhook.md). Integrations should be developed to be **configuration-driven**, so they can be deployed to multiple customers with potentially different configurations. This is accomplished by leveraging [config variables](https://prismatic.io/docs/integrations/config-wizard.md) and configuring steps to reference those variables. Some integrations have a single [flow](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md). That is, they have a single trigger that fires and a set of steps that are executed sequentially. Prismatic also supports grouping multiple related flows together into a single deployable integration. For example, if you have a third-party service (Acme ERP) that sends data via various webhooks to your integrations, it probably makes sense to have a single Acme ERP integration that you or your customers deploy that consists of several logical flows. Each flow has its own trigger, though they all share config variables. When an integration is completed, it can be published. [Customers](https://prismatic.io/docs/customers.md) can then enable the integration for themselves through the [embedded integration marketplace](https://prismatic.io/docs/embed/marketplace.md), or your team members can deploy an [instance](https://prismatic.io/docs/instances.md) of the integration on the customer's behalf. Regardless of who enables an instance of the integration - your team member or your customer - the person deploying the instance configures the instance with customer-specific config variables. We recommend that you follow our low-code [Getting Started](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) tutorials to first familiarize yourself with integration development and deployment. #### Low-code vs code-native[​](#low-code-vs-code-native "Direct link to Low-code vs code-native") The Prismatic low-code designer and code-native SDK are both excellent tools that you can use to build, test, and deploy integrations. When using the **low-code designer**, you build integrations by adding triggers, actions, loops, and branches to a canvas. When using the **code-native SDK**, you write TypeScript code to define your triggers and flow logic. Depending on your team structure, technical expertise, and the complexity of the integration you are building, you may choose to use one or the other. 
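For a sense of what the code-native side looks like, here is a minimal sketch of a single flow defined with the code-native SDK (modeled on the `flow()` examples elsewhere on this page; the names and payload handling are illustrative):

```
import { flow } from "@prismatic-io/spectral";

// Minimal illustrative code-native flow: runs when its webhook trigger fires,
// logs the incoming payload, and returns it as the flow's result.
export const echoFlow = flow({
  name: "Echo Payload",
  stableKey: "echo-payload",
  description: "Log and return the incoming webhook payload",
  onExecution: async (context, params) => {
    const payload = params.onTrigger.results.body.data;
    console.log("Received payload:", payload);
    return { data: { received: payload } };
  },
});
```

Flows like this are then registered in the project's `integration()` definition, built, and imported into Prismatic as a new integration version.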
Let's examine a quick comparison, with more detail below: | Topic | Low-Code | Code-Native | | --------------------- | ------------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------- | | Build method | Integration builders add triggers, flows and steps to an integration using the low-code integration builder. | Integration builders write triggers and integration logic in TypeScript using the code-native SDK. | | Flows | A low-code flow is a sequence of steps that run in a specific order. | A Code-Native flow is a JavaScript function that executes when the flow's trigger is invoked. | | Config Wizard | Config wizards are built within the low-code builder. | The config wizard is defined in TypeScript using the code-native SDK. | | Testing | Integrations are tested from within the integration designer. | Integrations can be tested within Prismatic or locally through unit tests. | | Step results and logs | Step results for each step are collected and stored and can be viewed later. | Code-native integrations are "single-step", and logging can be used for debugging. | | Best fit for | Hybrid teams of developers and non-developers | Highly technical teams who prefer code | ##### When to use low-code vs code-native[​](#when-to-use-low-code-vs-code-native "Direct link to When to use low-code vs code-native") You may want to reach for the low-code designer when: * Your team is looking to save development time and has non-dev resources that are technical enough to build integrations. * It is important for your non-developer team members to have a visual representation of the integration. * You would like your customers to build their own integrations using [embedded workflow builder](https://prismatic.io/docs/embed/workflow-builder.md). You might want to use code-native when: * You have a highly technical team that is comfortable writing TypeScript. * Your integration requires complex logic that is easier to write in code. * You want to unit test entire integrations rather than individual actions and triggers. * You want to use a version control system to manage your integration code. --- #### Connections Overview **Connections** contain the information necessary for the steps in your integration to connect to third-party apps and services. A connection is made up of fields for things like usernames, passwords, API keys, OAuth 2.0 secrets, host endpoints, API versions, and more - whatever a component needs to know to connect to an outside service. For example, an [Asana](https://prismatic.io/docs/components/asana.md) personal access token requires a single API key, and the [Slack](https://prismatic.io/docs/components/slack.md) connection requires an OAuth 2.0 authorize URL, token URL, client ID, and client secret (though end users will only see a single "Connect" button when they deploy an integration). Connections are generally presented to customer users on the first page of your integration's configuration wizard, but you can also set up the connection on your customers' behalf if you know the values the connection requires. 
![Acme and Slack connection in Prismatic config wizard](/docs/img/integrations/connections/asana-and-slack.png) #### Integration-specific connections[​](#integration-specific-connections "Direct link to Integration-specific connections") [Integration-specific connections](https://prismatic.io/docs/integrations/connections/integration-specific.md) are the fastest way to add a connection to an integration. With this approach, you can configure all connection inputs directly within the design canvas, eliminating extra steps. However, if you need to reuse the connection in future integrations or want to set up connections on behalf of your customers (hide them from the config wizard), then [integration-agnostic connections](https://prismatic.io/docs/integrations/connections.md#integration-agnostic-connections) are the better choice. ![Integration-specific connections](/docs/img/integrations/connections/integration-specific-connections.png) #### Integration-agnostic connections[​](#integration-agnostic-connections "Direct link to Integration-agnostic connections") Integration-agnostic connections are centrally managed and can be referenced across one or multiple integrations. They provide several key advantages over integration-specific connections: 1. **Reusable** - Once set up, they can be easily referenced in future integrations that require connecting to the same app. 2. **Flexible Customer Interaction** - Purpose-built connection types allow you to control whether customers should or should not interact with the connection in the config wizard. 3. **Test Connection Support** - Easily define test connections for test runs. There are three types of integration-agnostic connections. The descriptions below will help you determine which one best fits your use case. * An [Organization-Activated Global Connection](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-global.md) is used when a third-party account *you own* is used by your customers' instances. Your customers are not aware of organization-activated global connections and will not see the connection when configuring an instance. **Example**: Your organization has a [Twilio](https://prismatic.io/docs/components/twilio.md) API key that all of your customers' instances will use for SMS. * An [Organization-Activated Customer Connection](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-customer.md) is used when each of your customers has unique credentials, and *you* know their values and want to provide them on behalf of your customer. When your customers deploy an integration that uses an organization-activated customer connection, the connection you created on their customer record will be used, and they will not see the connection in the config wizard. **Example**: Your customers' instances need to connect to *your* app. It would be an awkward experience for your customers to log in to your app, navigate to your [embedded marketplace](https://prismatic.io/docs/embed/marketplace.md), and configure an integration only to be prompted for an API key for the app they're already logged in to. You can generate API keys for your app on behalf of your customers.
* A [Customer-Activated Connection](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/customer-activated.md) is used when each of your customers have unique credentials to a third-party app, and they need to enter the credentials (or go through an OAuth 2.0 flow) themselves. Your customer will enter usernames/passwords/API keys/etc for the third-party app, and that connection can then be used for one or more instances that require that connection type. **Example**: You've built a couple of Salesforce-related integrations with a Salesforce OAuth 2.0 connection, and you're expecting your customers to work through the OAuth 2.0 auth code flow to enable those integrations. Rather than entering your Salesforce client ID and client secret in several integrations, you enter them once for a single customer-activated connection and use that customer-activated connection in several integrations. See [Code-Native Config Wizard](https://prismatic.io/docs/integrations/code-native/config-wizard.md#integration-agnostic-connections-in-code-native) for details on how to use integration-agnostic connections in code-native integrations. --- #### Customer-Activated Connections *** *** A **Customer-Activated Connection** is used when each of your customers have unique credentials to a third-party app, and they need to enter the credentials (or go through an OAuth 2.0 auth code flow) themselves. Your customer will enter usernames/passwords/API keys/etc for the third-party app when they deploy an instance of an integration that uses them. The customer-activated connection is defined once by you, the integration author, and then can be used in one or more integrations. When your customer activates an integration that uses a customer-activated connection, they will be prompted to enter their unique credentials to the third-party app. If they have previously saved credentials for that connection type, they can select from existing saved connections, saving them time and effort. If you would like to provide your customers with a central page where they can manage all of their reusable connections, you can embed a [connections page](https://prismatic.io/docs/embed/additional-screens.md#showing-the-connection-screen) using the embedded SDK. #### Creating a new customer-activated connection[​](#creating-a-new-customer-activated-connection "Direct link to Creating a new customer-activated connection") To create a new customer-activated connection, open your organization's settings page by clicking your organization's name on the bottom-left, and then open the **Connections** tab. * Select **+ Add Connection**. * Select a connector and connection type * Under **Connection Type**, select **Customer-Activated Connection** * Give your customer-activated connection a recognizable name and description. * If the connector supports more than one connection type (like OAuth 2.0 and Personal API Key), select the connection type you'd like to use. * If you have more than one Prismatic tenant (e.g. US and EU tenants, or dev and prod tenants), update the default **Stable Key** to the same value in each tenant. That key will be used to match up customer-activated connections across your tenants. ![Create a new customer-activated connection](/docs/img/integrations/connections/integration-agnostic-connections/customer-activated/create-new.png) * Fill in all connection inputs and then click **Next**. 
* You'll have the option to provide test credentials, which can be used in the test runner as you build an integration that uses this connection. If you want your customers to be able to use your connection in the embedded workflow builder, be sure to toggle **Use for Workflows**. ![Create a new customer-activated connection, step 2](/docs/img/integrations/connections/integration-agnostic-connections/customer-activated/enabled-for-workflows.png) #### Using a customer-activated connection in an integration[​](#using-a-customer-activated-connection-in-an-integration "Direct link to Using a customer-activated connection in an integration") The next time you add a connection to an integration, if you have a customer-activated connection defined for that connection type, your integration designer will automatically reference it. ![Customer-activated connection in the config wizard designer](/docs/img/integrations/connections/integration-agnostic-connections/customer-activated/config-wizard-designer.png) When running a test execution within the integration designer, the test connection that you set previously will be used (or you can set one up within your integration). ![Test execution using a customer-activated connection](/docs/img/integrations/connections/integration-agnostic-connections/customer-activated/config-wizard-test-connection.png) If you deploy this integration to a customer, the customer will be prompted for their unique credentials to the third-party app. #### Reusing customer-activated connections[​](#reusing-customer-activated-connections "Direct link to Reusing customer-activated connections") When a customer-activated connection is assigned to a [marketplace](https://prismatic.io/docs/embed/marketplace.md) integration or used in the [Embedded Workflow Builder](https://prismatic.io/docs/embed/workflow-builder.md), customers can save their credentials and reuse them across multiple integrations and workflows. This eliminates the need to re-enter credentials they've already provided, simplifying the configuration experience. Their credentials are saved outside of the context of the instance they've deployed, so they can be reused in other instances of the same integration or in different integrations that are configured to use the same customer-activated connection. **When customer-activated connections are not used**: If you don't assign a customer-activated connection to your integration, the configuration wizard will fall back to requiring credentials be entered each time a marketplace integration is activated. **Note**: You must use `@prismatic-io/embedded` version `4.2.0` or later to support reusing customer-activated connections. 
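If you embed the connections management screen mentioned above, a minimal sketch with the embedded SDK might look like the following. `prismatic.init()` and `prismatic.authenticate()` are standard embedded SDK calls; the options passed to `prismatic.showConnections()` here (a CSS selector mount point) are an assumption for illustration, so check the embedding documentation for the exact signature.

```
import prismatic from "@prismatic-io/embedded";

// Sketch only: render the reusable-connections management screen for the
// signed-in customer user. Requires @prismatic-io/embedded 4.2.0 or later.
prismatic.init();

async function showCustomerConnections(embeddedJwt: string) {
  // embeddedJwt is a signed JWT for the customer user, generated by your backend
  await prismatic.authenticate({ token: embeddedJwt });
  prismatic.showConnections({
    selector: "#connections-container", // hypothetical element in your app's UI
  });
}
```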
#### FAQ[​](#faq "Direct link to FAQ") ##### What will customer users see?[​](#what-will-customer-users-see "Direct link to What will customer users see?") When a customer activates a marketplace integration that requires a customer-activated connection: * If they have never saved credentials for that customer activated connection type, they will be prompted to enter them, as they would for any standard connection * If they have previously saved credentials for that connection type, they can select from existing saved connections or opt to enter a new set of credentials ![Customer view of a customer-activated connection](/docs/img/integrations/connections/integration-agnostic-connections/customer-activated/config-wizard-customer-view.png) * Within the embedded workflow builder, customers will see a list of their saved connections when they add a step that uses a component with an associated customer-activated connection. ![Customer view of a customer-activated connection in the embedded workflow builder](/docs/img/integrations/connections/integration-agnostic-connections/customer-activated/workflow-builder-customer-view.png) ##### Can customer-activated connections be used if I have multiple tenants?[​](#can-customer-activated-connections-be-used-if-i-have-multiple-tenants "Direct link to Can customer-activated connections be used if I have multiple tenants?") Yes. If your organization has multiple tenants (for example, a US and EU tenant, or a dev and prod tenant), be sure to assign your customer-activated connections the same **Stable Key** in each tenant. Then, when you [sync integrations](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md) between tenants, your integrations will automatically use the correct customer-activated connection in the new tenant. --- #### Organization-Activated Customer Connections *** *** If you rely on connections that are customer-specific but are used in multiple integrations, **organization-activated customer connections** help you (the organization) configure a connection once for a customer that can be used in several of the customer's instances. For example, suppose you are Acme SaaS. Several of your integrations interact with your API, and each of your customers have their own Acme SaaS API key. *You* know their Acme SaaS API keys. It would feel strange for a customer user who is logged in to your Acme SaaS web app to enter their own Acme SaaS API key into an embedded config wizard. So, you can set each of your customers' Acme SaaS API keys on their behalf once, and the instances that your customer deploys that rely on that key will reference the customer-specific connection. #### Creating a new organization-activated customer connection[​](#creating-a-new-organization-activated-customer-connection "Direct link to Creating a new organization-activated customer connection") To create a new organization-activated customer connection, open your organization's settings page by clicking your organization's name on the bottom-left, and then open the **Connections** tab. * Select **+ Add Connection**. * Select a connector and connection type * Under **Connection Type**, select **Organization-Activated Customer Connection** * Give your organization-activated customer connection a recognizable name and description. * If the connector supports more than one connection type (like OAuth 2.0 and Personal API Key), select the connection type you'd like to use. * If you have more than one Prismatic tenant (e.g. 
US and EU tenants, or dev and prod tenants), update the default **Stable Key** to the same value in each tenant. That key will be used to match up organization-activated customer connections across your tenants. ![Create a new organization-activated customer connection](/docs/img/integrations/connections/integration-agnostic-connections/org-activated-customer/create-new.png) Under the **Stable Key** is a section called **Inputs**. Here, you can specify whether each of the connection's inputs are **Global** (same value for all customers) or **Customer-Specific** (unique for each customer). For example, if your customers all have accounts on an SFTP server, their `username` and `password` may be unique, but they may all use the same `host` and `port`. Check global values carefully Be sure to check global values that you set carefully. Once set, they cannot be changed. ![Create a new organization-activated customer connection input config](/docs/img/integrations/connections/integration-agnostic-connections/org-activated-customer/create-new-input-config.png) When ready, click **Create**. #### Configuring a test connection for building integrations[​](#configuring-a-test-connection-for-building-integrations "Direct link to Configuring a test connection for building integrations") After creating a new organization-activated customer connection, you'll be prompted to supply test credentials for that connection. These test credentials will be used in the integration designer as you're building your integrations if your integration relies on an organization-activated customer connection. If you'd like to change your test credentials at any time, you can return to the **Connections** tab in your organization settings, select your connection, and edit your credentials under **Test Runner Connection**. Your test instances that run as you test your integration in the integration designer will immediately begin using the new test credentials that you supply. #### Configuring a customer's connection[​](#configuring-a-customers-connection "Direct link to Configuring a customer's connection") Once an organization-activated customer connection is created, you can assign connection values for each of your customers. You can do this through the UI or [programmatically](#programmatically-creating-a-customers-connection). Through the UI, open **Customers** and select a customer. Under the customer's **Connections** tab, select **+ Add connection** and then select the organization-activated customer connection you just created. ![Create a new customer's organization-activated customer connection](/docs/img/integrations/connections/integration-agnostic-connections/org-activated-customer/create-customer-oac.png) Fill in the fields that your connection requires (API key, username, password, endpoint URL, etc.) with customer-specific values. ![Fill in a new customer's organization-activated customer connection fields](/docs/img/integrations/connections/integration-agnostic-connections/org-activated-customer/fill-in-customer-oac.png) The values you fill in here will be used whenever this customer deploys an instance of an integration that relies on your connection. #### Programmatically creating a customer's connection[​](#programmatically-creating-a-customers-connection "Direct link to Programmatically creating a customer's connection") Programmatically setting a customer's connection values is a three-query process using the [Prismatic GraphQL API](https://prismatic.io/docs/api.md): 1. 
Fetch the organization-activated customer connection's ID You can do this by running a `scopedConfigVariables` query: ``` query { scopedConfigVariables(stableKey: "acme-api-key") { nodes { id key stableKey connection { inputs { nodes { id key } } } } } } ``` This will yield your organization-activated customer connection's ID along with its inputs and their keys. Take note of the ID you get back - it should start with `U2Nvc...`. 2. Fetch your customer's ID. You can do that through a `customers` query. ``` query { customers(externalId: "my-external-id") { nodes { id name } } } ``` Take note of your customer's ID (note: not their external ID) - it should start with `Q3Vzd...`. 3. Create a connection for your customer with some values you know. You can do that with a `createCustomerConfigVariable` mutation using the connection and customer ID you noted, along with an array of `inputs`. Inputs should have the shape ``` { value: string, name: string, type: "value" } ``` Each input's `name` is the key of the connection input (e.g. "username", "apiKey", "host", etc.). `value` is the customer's value for that input. ``` mutation { createCustomerConfigVariable( input: { scopedConfigVariable: "U2NvcGVkQ29uZmlnVmFyaWFibGU6NjIzZjM2NWItMzc3Ny00MWRkLWEyODAtNTBjNjliMjQyMGQ4" customer: "Q3VzdG9tZXI6NzFlY2NiYzQtYjc5OC00YzQzLWIzZDAtZjdmYzE5OTEyYzlj" inputs: [{ value: "Testing", name: "apiKey", type: "value" }] } ) { customerConfigVariable { id } errors { field messages } } } ``` #### Using organization-activated customer connections in an integration[​](#using-organization-activated-customer-connections-in-an-integration "Direct link to Using organization-activated customer connections in an integration") The next time you add a connection to an integration, if you have an organization-activated customer connection defined for that connection type, your integration designer will automatically reference it. ![Organization-activated customer connection in the config wizard designer](/docs/img/integrations/connections/integration-agnostic-connections/org-activated-customer/config-wizard-designer.png) When running a test execution within the integration designer, the test connection that you set previously will be used. If you deploy this integration to a customer, that customer's organization-activated customer connection will be used. The customer will not see the connection as they walk through your integration's config wizard. #### FAQ[​](#faq "Direct link to FAQ") ##### What will customer users see?[​](#what-will-customer-users-see "Direct link to What will customer users see?") Nothing. You as an organization will create the connection for your customer. When a customer configures an instance that relies on that connection, no UI elements will appear in their config wizard. ##### Can organization-activated customer connections be used if I have multiple tenants?[​](#can-organization-activated-customer-connections-be-used-if-i-have-multiple-tenants "Direct link to Can organization-activated customer connections be used if I have multiple tenants?") Yes. If your organization has multiple tenants (for example, a US and EU tenant, or a dev and prod tenant), be sure to assign your organization-activated customer connections the same **Stable Key** in each tenant. Then, when you [sync integrations](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md) between tenants, your integrations will automatically use the correct organization-activated customer connection in the new tenant. 
--- #### Organization-Activated Global Connections *** *** Organization-activated global connections are used when a third-party account *you own* is used by your customers' instances. Your customers are not aware of organization-activated global connections and will not see the connection when configuring an instance. For example, your organization may have a [Twilio](https://prismatic.io/docs/components/twilio.md) API key that all of your customers' instances will use for SMS. #### Creating a new organization-activated global connection[​](#creating-a-new-organization-activated-global-connection "Direct link to Creating a new organization-activated global connection") To create a new organization-activated global connection, open your organization's settings page by clicking your organization's name on the bottom-left, and then open the **Connections** tab. * Select **+ Add Connection**. * Select a connector and connection type * Under **Connection Type**, select **Organization-Activated Global Connection** * Give your organization-activated global connection a recognizable name and description. * If the connector supports more than one connection type (like OAuth 2.0 and Personal API Key), select the connection type you'd like to use. * If you have more than one Prismatic tenant (e.g. US and EU tenants, or dev and prod tenants), update the default **Stable Key** to the same value in each tenant. That key will be used to match up organization-activated global connections across your tenants. ![Create a new organization-activated global connection](/docs/img/integrations/connections/integration-agnostic-connections/org-activated-global/create-new.png) * Fill in all connection inputs and then click **Next**. If you're creating an OAuth 2.0 connection, you'll be prompted to go through the Authorization Code flow on the next screen. After that, you'll be prompted for test configuration, which allows you to use a distinct configuration in the test runner than will be used for production instances. #### Using an organization-activated global connection in an integration[​](#using-an-organization-activated-global-connection-in-an-integration "Direct link to Using an organization-activated global connection in an integration") The next time you add a connection to an integration, if you have an organization-activated global connection defined for that connection type, your integration designer will automatically reference it. ![Organization-activated global connection in the config wizard designer](/docs/img/integrations/connections/integration-agnostic-connections/org-activated-global/config-wizard-designer.png) When running a test execution within the integration designer, the test connection that you set previously will be used. If you did not set up a test connection on the organization-activated global connection, the default global connection will be used. If you deploy this integration to a customer, the global connection will be used. The customer will not see the connection as they walk through your integration's config wizard. #### FAQ[​](#faq "Direct link to FAQ") ##### What will customer users see?[​](#what-will-customer-users-see "Direct link to What will customer users see?") Nothing. You as an organization will create the global connection. When a customer configures an instance that relies on that connection, no UI elements will appear in their config wizard. 
##### Can organization-activated global connections be used if I have multiple tenants?[​](#can-organization-activated-global-connections-be-used-if-i-have-multiple-tenants "Direct link to Can organization-activated global connections be used if I have multiple tenants?") Yes. If your organization has multiple tenants (for example, a US and EU tenant, or a dev and prod tenant), be sure to assign your organization-activated global connections the same **Stable Key** in each tenant. Then, when you [sync integrations](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md) between tenants, your integrations will automatically use the correct organization-activated global connection in the new tenant. --- #### Integration-Specific Connections **Integration-specific connections** are the "classic" way of adding connections to an integration. They were first introduced [when we developed connections](https://prismatic.io/blog/simpler-more-flexible-authentication/) and are useful when you have a connection that is used by a single integration, and you want your customers to enter their connection information (or go through the OAuth 2.0 flow). #### Adding a connection to an integration[​](#adding-a-connection-to-an-integration "Direct link to Adding a connection to an integration") When you add a step to your integration that requires a connection, that step will be bound to a connection config variable that represents that connection. If a connection config variable for that component doesn't exist yet, a new one will be created. One connection config variable can be used for multiple steps (so all of your Amazon S3 steps can reference a single connection config variable). ![Connection step input in Prismatic app](/docs/img/integrations/connections/integration-specific/connection-step-input.png) The config variable that is referenced will contain the information necessary for the step to connect to the outside app or service. You can provide default values for the connection's input fields, or you can choose to have customers enter their own values when they deploy an instance of the integration. ![Connection config variables in Prismatic app](/docs/img/integrations/connections/integration-specific/config-var-default-values.png) You can control the visibility of each field of a connection config variable by clicking the icon. You can choose to show the field to customer users, make it settable programmatically in embedded, or hide it entirely from customers. #### Connection templates[​](#connection-templates "Direct link to Connection templates") A **Connection Template** allows you to pre-fill a connection's input fields with default values. A template can be referenced by an integration you build or by an integration your customers build using [embedded workflow builder](https://prismatic.io/docs/embed/workflow-builder.md). Connection templates are useful for a few reasons: 1. If several of your integrations require the same connection input fields (for example, an OAuth 2.0 client ID and secret), you can create a template that contains those fields and reference it in each integration. 2. If you want to provide a template for your customers to use in their own integrations, you can create a template and share it with them. They can then reference the template in their own integrations but will not be able to view or change the template's input fields (so they can't access your client secret). 3. 
If you store your integrations' [YAML definitions](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md#exporting-an-integrations-yaml-definition) in source control, your YAML will be cleaner if you reference a template instead of including the connection's input fields directly in the YAML. By referencing a template that contains a client ID and client secret, you can avoid committing sensitive information to source control. connection templates vs integration-agnostic connections If your goal is to pre-fill some connection values for integration builders (your own team or customer embedded workflow builders), connection templates make sense. If you're interested in establishing a connection that is reusable, check out [integration-agnostic connections](https://prismatic.io/docs/integrations/connections.md#integration-agnostic-connections) instead. ##### Creating a connection template[​](#creating-a-connection-template "Direct link to Creating a connection template") To create a connection template, first click on your organization's name in the bottom left corner of the screen. Then, open the **Connections** tab. Click **+Add Connection** to create a new connection template. ![Add connection template in Prismatic app](/docs/img/integrations/connections/integration-specific/add-connection-template.png) Select the connector to add a connection template for, and then select **Connection Template** for the **Connection Type**. Give your template a name, and then add input fields that you would like to pre-fill to the template. Fields that you omit will be configurable by an integration builder (either an organization member or a customer using embedded designer). ![Add connection template inputs in Prismatic app](/docs/img/integrations/connections/integration-specific/connection-template-inputs.png) Updating a connection template's input values A connection template's input values can be updated until a version of an integration that uses the template has been published. If an integration that uses the template has been published, the template's input values cannot be updated. To update a connection template's input values, create a new connection template with your updated values and update your integration to reference the new connection template. Your deployed instances (perhaps on `v10` of your integration) will continue to reference the old connection template, but new instances (perhaps on `v11` of your integration) will reference the new connection template and its values. ##### Referencing a connection template in a low-code integration[​](#referencing-a-connection-template-in-a-low-code-integration "Direct link to Referencing a connection template in a low-code integration") To reference a connection template in an integration, add a new connection to the integration. Then, select the template you created from the **Connection Template** dropdown menu. You will now only see fields that were not included in the template. 
![Reference connection template in Prismatic app](/docs/img/integrations/connections/integration-specific/connection-template-reference.png) ##### Referencing a connection template in a code-native integration[​](#referencing-a-connection-template-in-a-code-native-integration "Direct link to Referencing a connection template in a code-native integration") Connection templates can be used if you [reference an existing component's connection](https://prismatic.io/docs/integrations/code-native/existing-components.md#using-existing-connections-in-code-native) in your code-native integration. You can do this by adding a `template` property to your connection reference. Reference connection template in code-native connection ``` connectionConfigVar({ stableKey: "my-salesforce-connection", dataType: "connection", connection: { component: "salesforce", key: "oauth2", values: {}, template: "My Salesforce Connection", }, }); ``` --- #### What is OAuth 2.0? [OAuth 2.0](https://oauth.net/2/) is a special type of connection that is ubiquitous in integration development. OAuth 2.0 allows your customers to authorize your integration to perform certain functions on their behalf without needing to give you their username or password. For example, customers can authorize your integration to fetch their Salesforce leads, create Slack channels, or generate Quickbooks invoices for them. You've probably come across OAuth 2.0 at some point - any time you click "Log in with my Google Account" or "Connect my Dropbox" on a website, that website leverages OAuth 2.0 to fetch information (your email address, files, etc.) on your behalf. You don't enter your Google or Dropbox credentials into the website. Instead, you enter your credentials on a Google, Dropbox, etc. page, and the OAuth provider generates a unique code that grants the website a set of your permissions. With Prismatic, you can offer your customers a single "Connect to Acme" button in your integrations' config wizards, and your customers can seamlessly grant you permission to their accounts in other platforms. Prismatic's OAuth 2.0 service takes care of generating authentication URLs, handling token exchange and token refresh, and ensuring that up-to-date keys are available to your instances when they run. ![Configure OAuth 2.0 connection via Prismatic app](/docs/img/integrations/connections/oauth2/connect-oauth-app.webp) #### Why use OAuth 2.0?[​](#why-use-oauth-20 "Direct link to Why use OAuth 2.0?") OAuth 2.0 has some advantages over other authentication mechanisms (like basic auth): 1. OAuth 2.0 provides your users with a seamless authentication experience. They only need to click a "connect" button and then select "I approve" on a permissions consent screen, and the OAuth service takes care of the rest. 2. Permissions are granular. Your customers can grant you permission to do specific tasks in their account, like "read Salesforce leads" or "write Slack messages". This gives customers peace of mind. 3. Customers don't need to hand you their credentials. They don't enter a username and password for a third-party in your app. Instead, they authenticate with the third-party app, and your app is handed an access token with granular permission to do specific tasks. It's fine if they change their third-party password; they don't need to log in to your app and change their integration configuration, too. 4. Tokens can generally be revoked at any time. 
Most apps have a screen that displays what apps have access to their account, where they can see things like *Acme has read access to your Dropbox files*. Those screens generally have a "revoke access" button. #### OAuth 2.0 grant types[​](#oauth-20-grant-types "Direct link to OAuth 2.0 grant types") The [OAuth 2.0 framework](https://oauth.net/2/) supports several **grant types**, three of which are common for B2B integrations: 1. Most common is the [Authorization Code grant type](https://prismatic.io/docs/integrations/connections/oauth2/authorization-code-grant-type.md). When one of your customers configures an integration, they click a "Connect to Dropbox" or "Connect my Salesforce" button. After logging in to the external app and consenting to give you permissions to their account, the user returns to a Prismatic callback URL, where the **auth code** they brought back is exchanged for an access token that you can use to access their data. 2. The [Client Credentials grant type](https://prismatic.io/docs/integrations/connections/oauth2/client-credentials-grant-type.md) is also common in integrations. Sometimes called the **machine to machine** (M2M) grant type, this process is a little more involved for your customer. They log in to their third-party app, generate a **Client ID** / **Client Secret** key pair, and enter their key into your integration's config wizard. That key pair is exchanged for an access token for the third-party app. 3. While officially [deprecated](https://oauth.net/2/grant-types/password/), the [Password grant type](https://prismatic.io/docs/integrations/connections/oauth2/password-grant-type.md) prompts a user for their username and password for a third-party app. That username and password are exchanged for an access token. --- #### OAuth 2.0 Authorization Code Grant Type #### Authorization code grant type overview[​](#authorization-code-grant-type-overview "Direct link to Authorization code grant type overview") The OAuth 2.0 **Authorization Code** grant type is something you've probably used before. Any time you've clicked "Log in with Google" or "Connect my Outlook Calendar", the application asking for your Google account information or for access to your Outlook calendar uses the authorization code flow to fetch an API key that they can use to interact with Google or Microsoft on your behalf. **Additional resources**: #### How does the authorization code grant type work?[​](#how-does-the-authorization-code-grant-type-work "Direct link to How does the authorization code grant type work?") At a high level, the OAuth 2.0 authorization code flow works like this: 1. You as a software vendor register with the third-party application. You tell them your application's name, a description, and a callback URL. They give you a **client ID** and **client secret** that are unique to you. 2. You send customers to the third-party application's OAuth 2.0 **authorize endpoint**. You include your **client ID**, a **redirect URL**, and an optional list of [scopes](https://prismatic.io/docs/integrations/connections/oauth2/authorization-code-grant-type.md#authorization-code-grant-type-scopes) (permissions) that you want to request from your user as a set of search parameters. For example, you might ask for `files.content.read` from Dropbox, so you can read your customer's Dropbox files. 3. The customer authenticates with the third-party application and then interacts with a **consent screen**. 
This is the page you've likely seen before that says something like *Acme corp would like to read files in your Dropbox folder. Are you okay with that?* Once they consent, the user is directed back to your application with a unique authentication `code`. 4. Your application exchanges the `code` using the third-party application's **token endpoint**, along with your **client ID** and **client secret**. The third-party application verifies that the code is valid, and if so, responds with an **access token** and an optional **refresh token**. 5. You periodically exchange the refresh token for a new access token and use the access token to make API calls on behalf of the customer. ![Infographic describing the OAuth 2.0 auth code flow](/docs/img/integrations/connections/oauth2/how-oauth2-works.png) #### Creating an OAuth 2.0 "app" in a third-party service[​](#creating-an-oauth-20-app-in-a-third-party-service "Direct link to Creating an OAuth 2.0 \"app\" in a third-party service") To use the authorization code grant type in your integration, you will need to work with the third-party service to create an "OAuth 2.0 Application". Most common SaaS platforms have documentation on how to create an OAuth application, and we link to that documentation on our component documentation pages. It's usually found in an "API Access" section of a settings page, or on a similarly named page. registering with a third-party can take time Most third-party services allow you to create an unverified OAuth application for testing purposes but require you to go through a verification process before you can use the application in production. The verification process can take days or weeks, so it's best to start the process early. The third-party may require you to provide a privacy policy, terms of service, and other information about your use case and may require a partnership agreement. ##### Authorization code callback URL[​](#authorization-code-callback-url "Direct link to Authorization code callback URL") When you configure your OAuth application, you'll likely need to set up an authorized **callback URL**. That's the URL to which users return with a special `code` after granting you permission to their account. Your callback URL depends on the region where your tenant resides and whether or not you use a [custom domain](https://prismatic.io/docs/configure-prismatic/custom-domains.md):

| Region                  | Callback URL                                           |
| ----------------------- | ------------------------------------------------------ |
| US Commercial (default) | `https://oauth2.prismatic.io/callback`                 |
| US GovCloud             | `https://oauth2.us-gov-west-1.prismatic.io/callback`   |
| Europe (Ireland)        | `https://oauth2.eu-west-1.prismatic.io/callback`       |
| Europe (London)         | `https://oauth2.eu-west-2.prismatic.io/callback`       |
| Canada (Central)        | `https://oauth2.ca-central-1.prismatic.io/callback`    |
| Australia (Sydney)      | `https://oauth2.ap-southeast-2.prismatic.io/callback`  |
| Custom Domain           | `https://oauth2.<your-custom-domain>/callback`         |

![Set up callback URL for OAuth in Dropbox app console](/docs/img/integrations/connections/oauth2/dropbox-configure-app.png) ##### Authorization code client ID and secret[​](#authorization-code-client-id-and-secret "Direct link to Authorization code client ID and secret") The third-party application will supply you with a **client ID** and **client secret**. These are sometimes called "App ID", "App Key", or something similar. Take note of these - you'll use them to configure your Prismatic integration's connection.
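To make step 2 of the flow above concrete, here is a minimal sketch of the kind of authorization URL a customer is redirected to. Prismatic's OAuth 2.0 service constructs this URL for you; the authorize endpoint, client ID, scope, and state values below are hypothetical and only illustrate how the client ID, callback URL, and scopes fit together:

```
// Hypothetical values - your authorize endpoint, client ID, and scopes
// come from the third-party app's OAuth documentation.
const authorizeUrl = new URL("https://auth.example.com/authorize");
authorizeUrl.searchParams.set("response_type", "code");
authorizeUrl.searchParams.set("client_id", "abc-123");
authorizeUrl.searchParams.set(
  "redirect_uri",
  "https://oauth2.prismatic.io/callback",
);
authorizeUrl.searchParams.set("scope", "files.content.read offline_access");
authorizeUrl.searchParams.set("state", "some-opaque-state-value");

// https://auth.example.com/authorize?response_type=code&client_id=abc-123&redirect_uri=https%3A%2F%2Foauth2.prismatic.io%2Fcallback&scope=files.content.read+offline_access&state=some-opaque-state-value
console.log(authorizeUrl.toString());
```

The `state` value here is a stand-in; in practice Prismatic uses it to associate the callback with the correct connection config variable.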
##### Configuring an authorization code consent screen[​](#configuring-an-authorization-code-consent-screen "Direct link to Configuring an authorization code consent screen") Depending on what application you're integrating with, they likely let you specify your application's name, your company's icon, a link to a privacy policy, and more. Be sure to enter *your* application's name - not "Prismatic". This is the page that users will see after clicking the "connect" button in your integration's config wizard. The page will say something like > Acme corp would like access to view and create leads in your Salesforce account. Are you okay with that? ##### Authorization code grant type scopes[​](#authorization-code-grant-type-scopes "Direct link to Authorization code grant type scopes") A **scope** is a specific permission that you would like to request from your customer. For example, you might request `file.contents.write` permission for your customer's Dropbox account, so you can write files to their Dropbox, or you might request the `channels:read` permission from your customer's Slack account so you can get a list of public channels they have access to. Some applications, like Dropbox and Salesforce, have you identify which permissions you need when you create your application. Others have you specify scopes as a URL search parameter when you send your customers to their **authorization URL**. ###### The offline\_access scope[​](#the-offline_access-scope "Direct link to The offline_access scope") Many applications offer a scope called `offline_access`. Granting this permission signifies that you want long-term access and will need a refresh token so you can continually refresh the access token you have. #### Adding an OAuth 2.0 authorization code connection to an integration[​](#adding-an-oauth-20-authorization-code-connection-to-an-integration "Direct link to Adding an OAuth 2.0 authorization code connection to an integration") Once you have an OAuth 2.0 Application configured in a third-party service, add a new connection in your Prismatic integration for the third-party you're integrating with. You can do that by either adding a step for the third-party (add a Salesforce step to automatically create a Salesforce connection, etc.) or opening the **Configuration Wizard Designer** and creating a new **Connection**. Enter the **client ID** and **client secret** that you noted in the previous step. You may need to enter an **Auth URL** or **Token URL** if those are different for different tenants. You can find those in the third-party applications' documentation, and they often look like `https://example.com/authorize` and `https://example.com/oauth/token` respectively. There's a good chance the URLs are the same for everyone, so they are hidden in the Prismatic UI. Mark fields that you don't want your customers to see "hidden" by clicking the icon. Your client ID and client secret are hidden by default. ![Add OAuth connection to integration in Prismatic app](/docs/img/integrations/connections/oauth2/oauth-config-var.png) If the application you're connecting with allows you to request **scopes** on a per-connection basis, you'll be prompted for scopes here as well. #### Configuring an authorization code connection[​](#configuring-an-authorization-code-connection "Direct link to Configuring an authorization code connection") If you've created an integration with an OAuth 2.0 connection, customers will see a **Connect** button when they enable the integration. 
When they click the **Connect** button, they will be brought to the third-party application's OAuth service and will be prompted to verify that they want to grant permissions to your integration. Once they are done, they'll see an "Authorization completed successfully" page, which they can close to return to the instance configuration screen. ![Configure OAuth 2.0 connection via Prismatic app](/docs/img/integrations/connections/oauth2/connect-oauth-app.webp) To change an OAuth connection (for example, if you logged in as the incorrect person when you clicked **Connect**), you can click **Disconnect** and then **Connect** again to reauthenticate against the OAuth provider. If any problems occur during the OAuth flow (incorrect Auth or Token URL, incorrect scopes, etc.), you can view related connection logs by clicking the button to the right of the connection config variable. The connection will be marked with a green indicator if the connection has been used successfully in an execution of the instance, a yellow indicator if it has been configured (but not yet used), and a red indicator if the component using the connection threw a connection-related error. #### Disconnecting an authorization code connection[​](#disconnecting-an-authorization-code-connection "Direct link to Disconnecting an authorization code connection") When you **disconnect** an OAuth 2.0 connection, two things (and one optional thing) happen: 1. Prismatic stops periodically refreshing the access token 2. Prismatic deletes the access and refresh tokens from its database, so steps can't reference them 3. \[Optional] Some OAuth 2.0 providers allow you to *revoke* a token. [Quickbooks](https://prismatic.io/docs/components/quickbooks.md) is a prominent example of an API that supports revocation. If a *revocation endpoint* is present in a component's connection, Prismatic reaches out to that endpoint to revoke the token with the third-party's API. To disconnect an active OAuth 2.0 connection, click the **Disconnect** button under the config variable name: ![Disconnect active OAuth 2.0 connection in Prismatic app](/docs/img/integrations/connections/oauth2/disconnect.png) #### Authorization code connections in custom components[​](#authorization-code-connections-in-custom-components "Direct link to Authorization code connections in custom components") If you would like to build a custom component that implements an OAuth 2.0 auth code connection, see the example code in the custom connectors [connections](https://prismatic.io/docs/custom-connectors/connections.md#writing-oauth-20-connections) article. --- #### OAuth 2.0 Client Credentials Grant Type #### Client credentials grant type overview[​](#client-credentials-grant-type-overview "Direct link to Client credentials grant type overview") The OAuth 2.0 **Client Credentials** grant type is sometimes called the Machine-to-Machine (M2M) grant type and allows your application to communicate with a third-party directly. The **Client Credentials** flow is different from the [Authorization Code](https://prismatic.io/docs/integrations/connections/oauth2/authorization-code-grant-type.md) flow in a few key ways: 1. Customers do not work through a consent screen. Rather, customers generate their own client ID / secret key pair and explicitly grant permissions to the key pair they generate. 2. Key pairs are generally not associated with a specific user. Instead, the key pairs have permissions to access certain resources in their account. 3.
This flow generally does not require an approval process from the third-party app, since you don't create an OAuth 2.0 app. Instead, your customer logs in to their account to create the key pair that you will use. #### How does the client credentials grant type work?[​](#how-does-the-client-credentials-grant-type-work "Direct link to How does the client credentials grant type work?") At a high level, the OAuth 2.0 client credentials flow works like this: 1. You ask your customer to log in to their third-party app account and generate a **Client ID** / **Client Secret** key pair. You ask them to grant that key pair a certain set of permissions. 2. Your customers enter their key pair in your integration's config wizard. 3. Your app exchanges the key pair using the third-party app's **token URL** for an access token that you can use to interact with your customer's third-party account. The Prismatic OAuth service takes care of the token exchange for you. #### Adding an OAuth 2.0 client credentials connection to an integration[​](#adding-an-oauth-20-client-credentials-connection-to-an-integration "Direct link to Adding an OAuth 2.0 client credentials connection to an integration") When your customer configures an instance of your integration, they'll need to create their client ID and secret key pair and enter those values into your configuration wizard. If your token URL is the same for all users, we recommend that you mark that input as only organization-visible (so your customers don't risk editing it). You can do the same with the scopes input. ![Client credentials input visibility](/docs/img/integrations/connections/oauth2/client-credentials-input-visibility.png) Add helpful instructions to your config wizard Creating a client ID and secret and assigning the key pair a set of permissions can be a daunting task for a customer user. You can add [helpful instructions](https://prismatic.io/docs/integrations/config-wizard/config-pages.md#displaying-additional-helper-text-in-the-configuration-wizard) including links to documentation and screenshots to guide the user through the key pair creation process. #### Configuring a client credentials connection[​](#configuring-a-client-credentials-connection "Direct link to Configuring a client credentials connection") When your customer walks through your configuration wizard, they will be prompted to enter their **Client ID** and **Client Secret**. Clicking **Connect** will cause Prismatic's OAuth 2.0 service to exchange their key pair with the third-party API for an access token that your integration will then begin to use. ![Client credentials config wizard](/docs/img/integrations/connections/oauth2/client-credentials-config-wizard.png) After clicking **Connect**, the user will either see an "Authorization Complete" or an "Authorization Failed" screen, depending on whether their connection was successful or not. If you'd like this screen to close immediately, see [the documentation on closing OAuth 2.0 success pages immediately](https://prismatic.io/docs/integrations/connections/oauth2/custom-redirects.md#closing-oauth-20-success-pages-immediately). --- #### Custom OAuth 2.0 Redirects #### Configuring custom OAuth 2.0 redirects[​](#configuring-custom-oauth-20-redirects "Direct link to Configuring custom OAuth 2.0 redirects") Normally, a customer user who completes an OAuth 2.0 flow finds themselves on an "Authorization Complete" screen.
If you would like to customize where a customer is redirected after a successful or failed OAuth 2.0 flow, toggle the **Custom OAuth Redirects** option on the connection and enter URLs for **OAuth Success Redirect URI** and **OAuth Failure Redirect URI**. ![Custom oauth redirect configuration](/docs/img/integrations/connections/oauth2/custom-oauth-redirect-config.png) Your user will be redirected to those URLs with URL search parameters representing: * The instance's `instanceId` and `instanceName` * The integration's `integrationId` and `integrationName` * The required config variable's `requiredConfigVariableId` and `requiredConfigVariableKey` * The connection's `id`. For instance-level connections, this will be the instance's config variable ID. For [user-level](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md) connections, this will be the user-level config variable ID. ![Custom oauth redirect result](/docs/img/integrations/connections/oauth2/custom-oauth-redirect-result.png) #### Closing OAuth 2.0 success pages immediately[​](#closing-oauth-20-success-pages-immediately "Direct link to Closing OAuth 2.0 success pages immediately") If you'd like to omit the "connection successful" page altogether, create a publicly-available HTML page that immediately runs a JavaScript `parent.close()` function, like this: ```

<html>
  <head>
    <script>
      // Close this window/tab as soon as the page loads
      parent.close();
    </script>
  </head>
  <body>
    <h1>Success!</h1>
    <p>You have successfully authorized the application to access your account.</p>
    <p>You can now close this window</p>
  </body>
</html>

``` After arriving at Prismatic's OAuth 2.0 callback URL, the user will be redirected to your HTML page that immediately closes their tab. That should leave them on your integration's config wizard, ready to complete the rest of the integration configuration. --- #### OAuth 2.0 for Microsoft Apps [Configuring OAuth 2.0 for integrations with Microsoft apps](https://player.vimeo.com/video/907604023) #### Configuring OAuth 2.0 for integrations with Microsoft applications[​](#configuring-oauth-20-for-integrations-with-microsoft-applications "Direct link to Configuring OAuth 2.0 for integrations with Microsoft applications") Many Microsoft applications (like [Teams](https://prismatic.io/docs/components/ms-teams.md), [Outlook](https://prismatic.io/docs/components/ms-outlook.md), [OneDrive](https://prismatic.io/docs/components/ms-onedrive.md), etc.) use OAuth 2.0 for authorization. To enable OAuth 2.0 authentication in your integration, you'll first need to register your application with Microsoft. 1. Open [Azure Portal](https://portal.azure.com/) and create a new application registration. 2. Be sure to select **Any Azure AD directory - Multi-tenant** as the supported account type, so your customers (who have different Microsoft tenants) can use your integration. 3. Select **Web** under **Platforms** and add the Prismatic OAuth 2.0 callback URL as the **Redirect URI**. The Prismatic OAuth 2.0 callback URL for the US commercial region is `https://oauth2.prismatic.io/callback`. If your Prismatic tenant is in a different region or you're using a custom domain, you'll need to use the appropriate callback URL for your region or domain. See [OAuth 2.0 callback URLs](https://prismatic.io/docs/integrations/connections/oauth2/authorization-code-grant-type.md#creating-an-oauth-20-app-in-a-third-party-service) for more information. 4. Open **Certificates & Secrets** and add a new Client Secret. Note the **value** of the secret (not the ID!). 5. Note the **Application (client) ID** from the **Overview** page. With your application registered, you can now configure your integration to use OAuth 2.0 using the **client ID** and **client secret** you generated. #### Customizing the Microsoft OAuth 2.0 consent screen[​](#customizing-the-microsoft-oauth-20-consent-screen "Direct link to Customizing the Microsoft OAuth 2.0 consent screen") You can customize the icon and name that appear on the OAuth 2.0 consent screen by adding a **Branding & properties** section to your application registration. #### Microsoft OAuth 2.0 app approval[​](#microsoft-oauth-20-app-approval "Direct link to Microsoft OAuth 2.0 app approval") Microsoft will allow you to test your integration with your own Microsoft account, but you'll need to submit your application for approval before it can be used by other users. You can do that by adding your MPN ID under the **Branding & properties** section of your application registration. --- #### OAuth 2.0 Password Grant Type #### Password grant type overview[​](#password-grant-type-overview "Direct link to Password grant type overview") The OAuth 2.0 **Password** grant type is a legacy way to exchange a user's username and password for an access token. This grant type is generally not recommended, since it requires a user to enter their credentials to a third-party app within your app. 
#### How does the password grant type work?[​](#how-does-the-password-grant-type-work "Direct link to How does the password grant type work?") At a high level, the OAuth 2.0 password flow works like this: 1. As a software vendor, you ask your users for their username and password for a third-party app. 2. You exchange their credentials for an access token using the third-party's **token URL**. 3. You use the access token to access third-party resources the user has access to. #### Implementing OAuth 2.0 password grant type in custom components[​](#implementing-oauth-20-password-grant-type-in-custom-components "Direct link to Implementing OAuth 2.0 password grant type in custom components") The password grant type is [deprecated](https://datatracker.ietf.org/doc/html/draft-ietf-oauth-security-topics-29#section-2.4), and Prismatic's OAuth 2.0 service does not automatically exchange usernames and passwords for access tokens. That logic will need to be implemented within your custom component. Whenever an action that calls the third-party app is run, it will need to exchange the username and password for an access token and then initialize an HTTP client that uses that access token. This example HTTP client code handles the password exchange and returns an authenticated HTTP client:

```
import { Connection, util } from "@prismatic-io/spectral";
import { createClient } from "@prismatic-io/spectral/dist/clients/http";

export const createAcmeClient = async (connection: Connection) => {
  // Extract necessary fields from connection definition
  const { tokenUrl, username, password, client_id, client_secret } =
    connection.fields;

  // Password grant often requires a client ID / secret base64-encoded as an authorization header
  const authHeader = Buffer.from(`${client_id}:${client_secret}`).toString(
    "base64",
  );

  // Create an HTTP client to make a token exchange request
  const authClient = createClient({
    baseUrl: "https://auth.acme.com",
    headers: {
      Authorization: `Basic ${authHeader}`,
    },
  });

  // Exchange username/password for an access token
  const { data: authResponseData } = await authClient.post("/oauth/token", {
    grant_type: "password",
    username,
    password,
  });
  const { access_token } = authResponseData;

  // Return an authenticated HTTP client
  return createClient({
    baseUrl: "https://api.acme.com",
    headers: {
      Authorization: `Bearer ${access_token}`,
    },
  });
};
```

--- #### Troubleshooting OAuth 2.0 Connections This page focuses on troubleshooting OAuth 2.0 [authorization code](https://prismatic.io/docs/integrations/connections/oauth2/authorization-code-grant-type.md) connections, but similar debugging concepts can be applied to [client credentials](https://prismatic.io/docs/integrations/connections/oauth2/client-credentials-grant-type.md) connections. Note that every application implements OAuth 2.0 slightly differently; this page provides general recommendations for debugging OAuth 2.0 connections. #### Troubleshooting authorization endpoints[​](#troubleshooting-authorization-endpoints "Direct link to Troubleshooting authorization endpoints") When a customer user clicks **Connect**, they are brought to a third-party app's **Authorization URL**. Your client ID, the config variable's ID, permission scopes, and the redirect URI are appended as search parameters to the Authorize URL.
For example, if the external application's authorize URL is `https://auth.example.com/authorize`, and your client ID is `abc-123`, your user will be directed to ``` https://auth.example.com/authorize?client_id=abc-123&redirect_uri=https%3A%2F%2Foauth2.prismatic.io%2Fcallback&scope=widget%3Aread+widget%3Awrite&state=SW5example ``` Here, `state` represents the config variable's ID in Prismatic and is used when the user returns to our callback URL to determine which config variable to update. ##### Invalid client\_id errors[​](#invalid-client_id-errors "Direct link to Invalid client_id errors") If your users arrive at an authorization page that says "Invalid client\_id parameter" (or a similar error), you should double-check your client ID. Verify that there are no leading or trailing whitespace characters and that the client ID matches the client ID that you saw when you configured your application in the third-party system. ##### Incorrect redirect\_uri errors[​](#incorrect-redirect_uri-errors "Direct link to Incorrect redirect_uri errors") If your customers see an authorization page that says "redirect\_uri mismatch" (or a similar error), you should verify that the callback URL you configured in the third-party system is correct. For the US region, the callback URL is `https://oauth2.prismatic.io/callback`. For other public regions, private cloud hosted options, or white-labeled callback URLs, see [Authorization code callback URL](https://prismatic.io/docs/integrations/connections/oauth2/authorization-code-grant-type.md#authorization-code-callback-url). #### Troubleshooting code token exchange[​](#troubleshooting-code-token-exchange "Direct link to Troubleshooting code token exchange") After authenticating with a third-party app and walking through the app's consent screen, a user will return to Prismatic's OAuth 2.0 callback URL with their **authorization code** in hand. Depending on whether you white-label the callback URL or if you're hosted in a different region, they'll end up on a URL that looks similar to: ``` https://oauth2.prismatic.io/callback?code=some-unique-auth-code&state=SW5example ``` The Prismatic OAuth 2.0 service then loads the config variable from the `state` parameter to match a config variable's ID and attempts to exchange the `code` for an `access_token` using the third-party app's **token URL**. ##### Flavors of auth code token exchange[​](#flavors-of-auth-code-token-exchange "Direct link to Flavors of auth code token exchange") Different apps implement auth code exchange in different ways. Some apps expect you to pass your client ID and secret as a base64-encoded auth header. Others expect that they're passed in a body. Some apps expect that your body is formdata-encoded, while others expect JSON. The Prismatic OAuth 2.0 service attempts each of the six common "flavors" of auth code token exchange, in order of popularity, and succeeds once one has succeeded, and fails if none succeed. Suppose your client ID is `my-client-id`, your client secret is `my-client-secret`, and the `code` your customer returned with is `some-unique-auth-code`. Prismatic would attempt these token exchanges: 1. Client ID / secret are URL-encoded and then base64-encoded and sent as an auth header. 
Body is formdata-encoded

```
curl -X POST \
  --header 'Authorization: Basic bXktY2xpZW50LWlkOm15LWNsaWVudC1zZWNyZXQ=' \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  https://app.example.com/oauth2/token \
  --data 'grant_type=authorization_code&scope=widgets%3Aread%20widgets%3Awrite%20offline_access&redirect_uri=https%3A%2F%2Foauth2.prismatic.io%2Fcallback&code=some-unique-auth-code'
```

2. Client ID / secret are URL-encoded and then base64-encoded and sent as an auth header. Body is JSON-encoded

```
curl -X POST \
  --header 'Authorization: Basic bXktY2xpZW50LWlkOm15LWNsaWVudC1zZWNyZXQ=' \
  --header 'Content-Type: application/json' \
  https://app.example.com/oauth2/token \
  --data '{"code":"some-unique-auth-code","grant_type":"authorization_code","redirect_uri":"https://oauth2.prismatic.io/callback","scope":"widgets:read widgets:write offline_access"}'
```

3. Client ID / secret are sent within the body. Body is formdata-encoded

```
curl -X POST \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  https://app.example.com/oauth2/token \
  --data 'grant_type=authorization_code&scope=widgets%3Aread%20widgets%3Awrite%20offline_access&redirect_uri=https%3A%2F%2Foauth2.prismatic.io%2Fcallback&code=some-unique-auth-code&client_id=my-client-id&client_secret=my-client-secret'
```

4. Client ID / secret are sent within the body. Body is JSON-encoded

```
curl -X POST \
  --header 'Content-Type: application/json' \
  https://app.example.com/oauth2/token \
  --data '{"client_id":"my-client-id","client_secret":"my-client-secret","code":"some-unique-auth-code","grant_type":"authorization_code","redirect_uri":"https://oauth2.prismatic.io/callback","scope":"widgets:read widgets:write offline_access"}'
```

5. Client ID / secret are just base64-encoded and sent as an auth header. Body is formdata-encoded. This will be the same as flavor #1, but characters (like whitespace) are not URL-encoded before base64-encoding. 6. Client ID / secret are just base64-encoded and sent as an auth header. Body is JSON-encoded. This will be the same as flavor #2, but characters (like whitespace) are not URL-encoded before base64-encoding. ##### Mocking a token exchange endpoint[​](#mocking-a-token-exchange-endpoint "Direct link to Mocking a token exchange endpoint") If you'd like to see exactly what the Prismatic OAuth 2.0 service attempts to send to a token endpoint, you can spin up a Docker container locally to simulate a token endpoint. Run a [Smocker](https://smocker.dev) container:

```
docker run -d --restart=always \
  -p 8080:8080 \
  -p 8081:8081 \
  --name smocker \
  thiht/smocker
```

Then, declare a token "smock" endpoint:

```
curl -XPOST \
  localhost:8081/mocks \
  --header "Content-Type: application/x-yaml" \
  --data \
  '
  - request:
      method: POST
      path: /oauth2/token
    response:
      status: 200
      headers:
        Content-Type: application/json
      body: >
        {
          "access_token": "my-access-token",
          "token_type": "bearer",
          "expires_in": 60,
          "example_parameter": "example_value",
          "refresh_token": "my-refresh-token"
        }
  '
```

This endpoint will accept any `POST` request to `/oauth2/token` and return a fake access token response. Next, expose your Docker container with [ngrok](https://ngrok.com/):

```
ngrok http 8080
```

From there, you can take note of your `ngrok` endpoint and set your **Token URL** in your connection to something like `https://31ce-123-123-123-123.ngrok-free.app/oauth2/token`. By visiting `http://localhost:8081`, you'll be able to view each request Prismatic's OAuth 2.0 service made.
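When you inspect a captured request in the Smocker UI, it can also help to confirm that the `Authorization` header matches the client ID and secret you expect. Here is a minimal sketch that reuses the hypothetical `my-client-id` / `my-client-secret` values from the flavors above (for these particular values, URL-encoding changes nothing, which is why flavors 1 and 5 produce the same header):

```
// Hypothetical credentials matching the flavor examples above
const clientId = "my-client-id";
const clientSecret = "my-client-secret";

// Flavors 1 and 2 URL-encode the ID and secret before base64-encoding;
// flavors 5 and 6 base64-encode the raw values.
const encoded = Buffer.from(
  `${encodeURIComponent(clientId)}:${encodeURIComponent(clientSecret)}`,
).toString("base64");

console.log(`Authorization: Basic ${encoded}`);
// Authorization: Basic bXktY2xpZW50LWlkOm15LWNsaWVudC1zZWNyZXQ=
```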
If you change the `status` in the smock to `status: 500`, you'll see all six token exchange requests that Prismatic's OAuth 2.0 token exchange service made. ![Mocking OAuth 2.0 token endpoint with Smocker](/docs/img/integrations/connections/oauth2/mocking-token-endpoint-smocker.png) ##### Verify token exchange works with Postman[​](#verify-token-exchange-works-with-postman "Direct link to Verify token exchange works with Postman") Once you've captured a token request, try to send that equivalent request to your third-party app either with `curl` or [Postman](https://postman.com/). If you get a 404, you may have the wrong token URL configured. You may also get a more descriptive response that can help you identify changes you need to make to scopes, etc. #### Token refresh errors[​](#token-refresh-errors "Direct link to Token refresh errors") If a token exchange works initially, but later the token fails to refresh, it's possible that you omitted an [offline\_access](https://prismatic.io/docs/integrations/connections/oauth2/authorization-code-grant-type.md#the-offline_access-scope) scope, which many apps use to indicate that you need long-term access. --- #### On-Prem Agent The **On-Prem Agent** lets you connect your instances to resources that are not accessible from the public internet. This is useful when you or your customers have databases, file storage systems, or other services that reside on a private network behind a firewall. Feature Availability The on-prem feature is available to customers on specific pricing plans. Refer to your pricing plan or contract, or contact the Prismatic support team to learn more. #### How the on-prem agent works[​](#how-the-on-prem-agent-works "Direct link to How the on-prem agent works") The on-prem agent is a lightweight [Docker container](https://hub.docker.com/r/prismaticio/on-prem-agent) that you or your customer can install on your own infrastructure. When the Docker container is started, it establishes a secure [mutual TLS](https://en.wikipedia.org/wiki/Mutual_authentication) (mTLS) connection to an on-prem service running within the Prismatic platform and thereafter maintains a persistent connection with Prismatic. When an instance of your integration is deployed, your customer can select the OPA as the connection method. When an on-prem connection is used in the instance, the instance communicates with the OPA on the private network using the established connection, which in turn communicates with your resource on the private network. Data sent from the instance to the OPA through the on-prem service is encrypted using mTLS, and data is transmitted on [OSI Layer 4](https://www.cloudflare.com/learning/ddos/glossary/open-systems-interconnection-model-osi/) (transport layer). This allows you to send both HTTP and non-HTTP traffic through the OPA. No inbound ports need to be opened Note that the on-prem agent initiates the connection to the Prismatic platform, so you do not need to open any inbound ports on your firewall. The on-prem agent only needs to be able to make *outbound* connections to the Prismatic platform on ports 22 and 443: * The agent will connect on **port 22** to `onprem.prismatic.io` (or `onprem.` for other regions or white-label domains) to create a persistent connection. For example, `onprem.eu-west-1.prismatic.io` for the Europe (Ireland) region, or `onprem.integrations.example.com` for a white-labeled domain. 
* The agent will also connect on **port 443** to `app.prismatic.io` (or your region or white-labeled domain) for authentication and configuration data. #### Setting up the on-prem agent[​](#setting-up-the-on-prem-agent "Direct link to Setting up the on-prem agent") To set up an on-prem agent, you will need a system on your private network that is capable of running a [Docker](https://www.docker.com/) container. This can be the same server that serves the database, filesystem, or other resource you want to connect to, or a separate server on the same network that can access the resource. The on-prem container itself is very lightweight, generally consuming less than 100MB of memory and a small amount of CPU. While we recommend using a Linux Docker host for the on-prem container, you can run the on-prem agent on Windows as well. Please see the [On-Prem Agent on Windows](https://prismatic.io/docs/integrations/connections/on-prem-agent/on-prem-agent-windows.md) article. ##### Configuring the on-prem Docker container[​](#configuring-the-on-prem-docker-container "Direct link to Configuring the on-prem Docker container") An on-prem resource is configured for a specific customer. As an organization team member, you can view all on-prem resources by running `prism on-prem-resources:list`: ``` prism on-prem-resources:list Name Status Customer ─────────────── ─────────── ──────── Acme PostgreSQL AVAILABLE Acme Corp Hooli SFTP UNAVAILABLE Hooli ``` To create a new on-prem resource, first look up the ID of the customer whom the resource is for: ``` prism customers:list --columns "Id,Name" Id Name ──────────────────────────────────────────────────────────── ───────────────── Q3VzdG9tZXI6YjBmZDAyZTItYmE1OC00NzE0LWJhYzgtMDMwNWM5N2JiY2Vj Acme Corp Q3VzdG9tZXI6MTE0ODdlYmItNDdlMC00MGFjLWI1NjYtYzBiZWVjNjlkZTMz Initech Q3VzdG9tZXI6M2RkMjAwYjAtMjlmYy00MzZjLTk2OWYtMmNkMjUzYWNkYzY1 Stark Enterprises Q3VzdG9tZXI6NzFlY2NiYzQtYjc5OC00YzQzLWIzZDAtZjdmYzE5OTEyYzlj Hooli ``` Next, generate a registration JSON web token (JWT) for your customer: ``` prism on-prem-resources:registration-jwt \ --customerId Q3VzdG9tZXI6YjBmZDAyZTItYmE1OC00NzE0LWJhYzgtMDMwNWM5N2JiY2Vj eyJ0eXAiO.... ``` create org-only resources for testing To test the on-prem agent in the integration designer, you can create an on-prem resource that is only visible to your organization (and not attached to a particular customer). To do that, run `prism on-prem-resources:registration-jwt --orgOnly` Now, with a registration JWT in hand, you can start the on-prem agent Docker. The container takes a set of environment variables to configure the connection to the Prismatic platform: * `PRISMATIC_URL` is the URL of the Prismatic platform. For the US commercial region, that's `https://app.prismatic.io`. For [other regions](https://prismatic.io/docs/configure-prismatic/deployment-regions.md), use the appropriate URL. * `APP_HOST` is the hostname of the service running on the private network. For example, if you're connecting to a database that runs on a host with IP address `10.1.2.3`, enter that as the `APP_HOST`. Connect to the docker host If you run the on-prem agent on the same host as the service you're connecting to, you can use the special hostname `host.docker.internal` to connect to the host. `host.docker.internal` resolves to the internal IP address of the host running the Docker container. Note that `localhost` or `127.0.0.1` does not work in this context, as it refers to the container itself. 
* `APP_PORT` is the port on which the service is running (`5432` for PostgreSQL, `3306` for MySQL, `22` for SFTP, etc.). * `NAME` is the name of the on-prem resource that you will see when you run `prism on-prem-resources:list`. * `REGISTRATION_JWT` is the JWT you generated for the customer. Start the on-prem agent Docker container ``` export REGISTRATION_JWT=$(prism on-prem-resources:registration-jwt --customerId Q3VzdG9tZXI6YjBmZDAyZTItYmE1OC00NzE0LWJhYzgtMDMwNWM5N2JiY2Vj) docker run \ --env PRISMATIC_URL=https://app.prismatic.io \ --env APP_PORT=1433 \ --env APP_HOST=host.docker.internal \ --env "NAME=Acme MS SQL" \ --env REGISTRATION_JWT \ -t prismaticio/on-prem-agent:latest ``` ##### Running the on-prem agent using Docker Compose[​](#running-the-on-prem-agent-using-docker-compose "Direct link to Running the on-prem agent using Docker Compose") [Docker Compose](https://docs.docker.com/compose/) allows you to define and run multi-container Docker applications and has some useful features like automatic restart of containers on system reboot. Here's an example `docker-compose.yml` file that starts the on-prem agent: On-Prem docker-compose.yml ``` services: on-prem-agent: image: prismaticio/on-prem-agent:latest environment: PRISMATIC_URL: https://app.prismatic.io APP_PORT: 1433 APP_HOST: host.docker.internal # Or specify the IP of the service NAME: Acme MS SQL REGISTRATION_JWT: ${REGISTRATION_JWT} # Source from host's environment variable restart: always # Use "always" to start this service when the Docker engine starts ``` After creating a `docker-compose.yml` file, you can run `docker-compose up` from the command line to start the on-prem agent, or `docker-compose up -d` to start it in the background. ##### Configuring an instance to use the on-prem agent[​](#configuring-an-instance-to-use-the-on-prem-agent "Direct link to Configuring an instance to use the on-prem agent") Once an on-prem agent is running and has connected to the Prismatic platform, you can configure an instance to use the on-prem agent. First, you need to update connections on your integration to support an on-prem connection. Open a connection in your config wizard designer and select **Allow On-Prem Connections**. ![](/docs/img/integrations/connections/on-prem-agent/allow-on-prem-connections.png) When your customer configures an instance of your integration, they can select an existing on-prem agent to use for the connection by toggling **Use On-Prem Connection** and selecting a connection to use: ![](/docs/img/integrations/connections/on-prem-agent/use-on-prem-connection.png) Note that when an on-prem connection is selected, the connection's "Host" and "Port" inputs disappear. That is because the on-prem service is responsible for connecting to the private network service, and the instance communicates with the on-prem service. The on-prem service will provide the instance with a local host and port to connect to when an execution is run. #### Regenerating or revoking the registration JWT[​](#regenerating-or-revoking-the-registration-jwt "Direct link to Regenerating or revoking the registration JWT") If you lose the registration JWT for an on-prem resource, you can regenerate it using the `prism on-prem-resources:registration-jwt` command. You will need to provide the command with a `--customerId` and `--resourceId` of the on-prem resource you want to regenerate the JWT for. Those values can be found by running `prism on-prem-resources:list --extended --output json`. 
If you need to revoke an on-prem resource registration JWT, you revoke all old JWTs and generate a new one by running `prism on-prem-resources:registration-jwt --customerId {ID} --resourceId {ID} --rotate`. #### Connectors with on-prem support[​](#connectors-with-on-prem-support "Direct link to Connectors with on-prem support") The following built-in connectors support on-prem connections: * [FTP](https://prismatic.io/docs/components/ftp.md) * [HTTP](https://prismatic.io/docs/components/http.md) * [IMAP](https://prismatic.io/docs/components/imap.md) * [Active Directory](https://prismatic.io/docs/components/ldap.md) * [Microsoft SQL Server](https://prismatic.io/docs/components/ms-sql-server.md) * [MySQL](https://prismatic.io/docs/components/mysql.md) * [Oracle Database](https://prismatic.io/docs/components/oracledb.md) * [PostgreSQL](https://prismatic.io/docs/components/postgres.md) * [SAP Business One](https://prismatic.io/docs/components/sap-business-one.md) * [SFTP](https://prismatic.io/docs/components/sftp.md) * [SMTP](https://prismatic.io/docs/components/smtp.md) #### Supporting on-prem connections in a custom connector[​](#supporting-on-prem-connections-in-a-custom-connector "Direct link to Supporting on-prem connections in a custom connector") Prismatic provides several built-in connectors that connect to systems that are often on-prem (like PostgreSQL, MySQL, and MS SQL databases, SFTP file systems, SMTP and IMAP email servers, etc.). Those built-in connectors support on-prem connections out of the box. If you've built a connector that connects to a system that is often hosted on-prem, you can add support for on-prem connections as well. Within your custom connector, change your `connection()` invocation to an `onPremConnection` invocation. The `onPremConnection` function takes the same arguments as `connection` but requires that your connection include inputs named `host` and `port`. Additionally, update your `host` and `port` to have the property `onPremControlled: true`. For example, here is a basic auth on-prem connection for a custom connector: ``` import { onPremConnection } from "@prismatic-io/spectral"; export const basicAuth = onPremConnection({ key: "basicAuth", display: { label: "Username, password and endpoint", description: "Basic auth username and password and endpoint", }, inputs: { username: { label: "Username", placeholder: "Username", type: "string", example: "john.doe", required: false, shown: true, }, password: { label: "Password", placeholder: "Password", type: "password", example: "p@s$W0Rd", required: false, shown: true, }, host: { label: "Host", placeholder: "Name of the host", type: "string", required: true, comments: "The address of the Acme server. This should be an IP address or hostname.", example: "server.example.io", onPremControlled: true, }, port: { label: "Port", placeholder: "Port of the host", default: "1234", required: true, comments: "The port of the Acme server.", type: "string", onPremControlled: true, }, }, }); ``` For a full example of a connector that supports on-prem connections, see our SFTP connector source code in GitHub. `src/connections.ts` contains the connection definitions for the connector. [SFTP component source code](https://github.com/prismatic-io/examples/tree/main/components/sftp) When an execution is run, the instance will provide the connection with the host and port of the on-prem agent to connect to. 
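To show how those system-provided values might be consumed, here is a minimal sketch of a hypothetical custom action (the `listWidgets` name and `/widgets` path are made up) that reads `host` and `port` from the connection at execution time rather than hard-coding them:

```
import { action, input, util } from "@prismatic-io/spectral";
import { createClient } from "@prismatic-io/spectral/dist/clients/http";

export const listWidgets = action({
  display: {
    label: "List Widgets",
    description: "Fetch widgets from an Acme server over an on-prem connection",
  },
  inputs: {
    connection: input({ label: "Connection", type: "connection", required: true }),
  },
  perform: async (context, { connection }) => {
    // With an on-prem connection selected, `host` and `port` point at
    // Prismatic's on-prem service rather than the private resource itself.
    const host = util.types.toString(connection.fields.host);
    const port = util.types.toString(connection.fields.port);

    // Build the client inside perform - these values can change between
    // executions, so they should not be cached at module scope.
    const client = createClient({ baseUrl: `http://${host}:${port}` });
    const { data } = await client.get("/widgets");
    return { data };
  },
});
```

Building the client inside `perform` (rather than caching it at module scope) matters here, since the host and port can change between executions.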
Support connections that don't have host and port inputs What if your custom connector doesn't have `host` and `port` inputs? You might have an input called `endpoint`, for example, that represents your customer's (generally publicly available) app endpoint - a URL like `https://my-customer-id.example.com`. Add `host` and `port` inputs but set them to `required: false, shown: false`. Then, in your connector's code, you can check if `yourConnection.fields.host` has a value. If it does, construct the endpoint from `https://${yourConnection.fields.host}:${yourConnection.fields.port}`. ##### Handling servers that use host-based routing with the on-prem agent[​](#handling-servers-that-use-host-based-routing-with-the-on-prem-agent "Direct link to Handling servers that use host-based routing with the on-prem agent") Some HTTP servers use host-based routing to determine which site to serve. For example, a server with IP `10.1.2.3` might serve both your app and a different app on port 80, and determine which app to serve based on the `Host` header in the HTTP request. When using the on-prem agent, the `Host` header in your HTTP request will default to the IP address of the on-prem service. You can override the `Host` header in your HTTP client to match the hostname of the service you want to connect to. For example, you could specify the `Host` header as the `endpoint` input of your connection:

```
const response = client.get(
  `https://${yourConnection.fields.host}:${yourConnection.fields.port}`,
  { headers: { Host: yourConnection.fields.endpoint } },
);
```

A full example connector that supports on-prem connections with host-based routing can be found in the GitHub examples repo. [Example on-prem-compatible component](https://github.com/prismatic-io/examples/tree/main/components/on-prem-example) ##### Handling HTTPS-based connections with the on-prem agent[​](#handling-https-based-connections-with-the-on-prem-agent "Direct link to Handling HTTPS-based connections with the on-prem agent") If the service that you are connecting to uses HTTPS on the private network, you will need to make sure that the HTTP client in your connector is configured to trust (or ignore) the SSL certificate of the service. The HTTPS client that the custom connector SDK provides is an instance of Axios, which uses the `https` module from Node.js. You can ignore SSL certificate errors by creating an `https.Agent` with `rejectUnauthorized` set to `false` and passing it to your request:

```
const https = require("https");

const agent = new https.Agent({
  rejectUnauthorized: false,
});

const response = client.get(
  `https://${yourConnection.fields.host}:${yourConnection.fields.port}`,
  {
    headers: {
      Host: yourConnection.fields.endpoint,
    },
    httpsAgent: agent,
  },
);
```

#### White-labeling the on-prem agent[​](#white-labeling-the-on-prem-agent "Direct link to White-labeling the on-prem agent") If you would like to white-label the on-prem agent, so your customers install and run a Docker container from your organization, follow these steps: 1. Create a `Dockerfile` that reads:

```
FROM prismaticio/on-prem-agent:latest
ENV PRISMATIC_URL=https://app.prismatic.io
```

2. Build and publish the image with a white-labeled name to Docker Hub:

```
docker build . -t acme-corp/taylor-on-prem-agent:latest
docker push acme-corp/taylor-on-prem-agent:latest
```

3.
Your customers can then start a Docker container using your white-label name, and can omit the `PRISMATIC_URL` parameter, since that's hard-coded in your `Dockerfile` above: ``` docker run \ --env APP_PORT=1433 \ --env APP_HOST=host.docker.internal \ --env "NAME=Acme MS SQL" \ --env REGISTRATION_JWT \ -t acme-corp/taylor-on-prem-agent:latest ``` --- #### On-Prem Databases Requiring Client Libraries or Drivers Prismatic's integration runner executes NodeJS code. Node.js is great for connecting to HTTP-based APIs (like REST, SOAP, or GraphQL APIs) or when connecting to databases when pure JavaScript clients are available. For example, connectors like [MySQL](https://prismatic.io/docs/components/mysql.md), [PostgreSQL](https://prismatic.io/docs/components/postgres.md), or [Redis](https://prismatic.io/docs/components/redis.md) can rely on pure JavaScript libraries like [mysql2](https://www.npmjs.com/package/mysql2), [pg-promise](https://www.npmjs.com/package/pg-promise), and [redis](https://www.npmjs.com/package/redis) respectively. However, some on-prem databases like [IBM DB2](https://www.ibm.com/db2) do not have pure JavaScript libraries. They may offer NodeJS packages (like [ibm\_db](https://www.npmjs.com/package/ibm_db)), but those packages are wrappers around ODBC drivers or non-JavaScript libraries that require additional compilation or installation. #### Connecting to databases like IBM DB2[​](#connecting-to-databases-like-ibm-db2 "Direct link to Connecting to databases like IBM DB2") When connecting to on-prem databases that lack pure JavaScript client libraries, we recommend installing the required database libraries, drivers, or binaries on a Docker image that lives alongside the [on-prem agent](https://prismatic.io/docs/integrations/connections/on-prem-agent.md) container. This database client container can translate HTTP requests from your integration into requests that the database understands and can return HTTP-based responses to your integration. ##### Creating an IBM DB2 client container[​](#creating-an-ibm-db2-client-container "Direct link to Creating an IBM DB2 client container") IBM offers a [set of libraries](https://www.ibm.com/docs/en/db2-warehouse?topic=python-sqlalchemy-django-framework) that wrap their database client. In this example, we'll create a database client using Python and [Flask](https://github.com/pallets/flask) (though you could do the same with NodeJS and Express, PHP, etc.). Dockerfile for an IBM DB2 client in Python ``` FROM --platform=linux/x86_64 python:3 WORKDIR /app # Download and decompress database driver ADD https://public.dhe.ibm.com/ibmdl/export/pub/software/data/db2/drivers/odbc_cli/linuxx64_odbc_cli.tar.gz /tmp/linuxx64_odbc_cli.tar.gz RUN mkdir /app/db2_driver RUN tar -xzvf /tmp/linuxx64_odbc_cli.tar.gz -C /app/db2_driver ENV IBM_DB_HOME=/app/db2_driver ENV LD_LIBRARY_PATH="/app/db2_driver/lib:$LD_LIBRARY_PATH" # Install Python Dependencies RUN pip install --upgrade setuptools pip RUN pip install flask RUN pip install ibm_db # Copy our app code and run a server on port 4000 COPY app.py . EXPOSE 4000 CMD ["flask", "run", "--host=0.0.0.0", "--port=4000"] ``` In this `Dockerfile`, we generate a container from the `python:3` image. We download the DB2 driver from IBM and extract the driver, setting two required environment variables. Then, we install two Python dependencies: `flask`, which is a webserver, and `ibm_db`, which relies on the IBM driver we downloaded. Finally, we run our web server on port 4000. 
The webserver is defined in `app.py`. It's a short script that declares a single `POST` endpoint that takes a `username`, `password`, `database`, and SQL `query` from the `POST` request and issues that query against that database. It returns the resulting records in an HTTP response as JSON. app.py to translate HTTP to IBM DB2 ``` import ibm_db import os from flask import Flask, request, jsonify, make_response app = Flask(__name__) # Handle SELECT statements and return results @app.route("/select", methods=['POST']) def query(): data = request.get_json() username = data["username"] password = data["password"] database = data["database"] query = data["query"] hostname = os.environ["DB2_HOST"] port = os.environ["DB2_PORT"] conn = ibm_db.connect(f"DATABASE={database};HOSTNAME={hostname};PORT={port};PROTOCOL=TCPIP;UID={username};PWD={password};", "", "") stmt = ibm_db.exec_immediate(conn, query) response_data = [] result = True while(result): result = ibm_db.fetch_assoc(stmt) if (result): response_data.append(result) return make_response(jsonify(response_data)) ``` This short script only handles `SELECT` queries but could easily be extended to respond appropriately to `CREATE`, `UPDATE`, or `DELETE` SQL statements. ##### Running a client container alongside an on-prem agent[​](#running-a-client-container-alongside-an-on-prem-agent "Direct link to Running a client container alongside an on-prem agent") It is important that the database client container runs alongside the on-prem agent container, and you can do that by declaring the database client service in the same `docker-compose.yml` file as the on-prem agent service. Here, we run three services: * `db2` is an IBM DB2 database that we run locally to simulate a database on our network. Your customer likely runs a full IBM DB2 database. * `db_client` runs our Python code above, translating HTTP requests to DB2 queries * `on-prem-agent` is the Prismatic [on-prem agent](https://prismatic.io/docs/integrations/connections/on-prem-agent.md) which establishes a connection between your integration in Prismatic's cloud and the `db_client` container on your customer's network. docker-compose.yml ``` version: "3.1" volumes: dbdata: name: db2-data services: # This container runs an IBM DB2 database and simulates an on-prem database # After starting this container, wait up to 15 minutes for the database to # be initialized and ready for connections (it takes a while!). db2: platform: linux/x86_64 image: ibmcom/db2 privileged: true environment: LICENSE: accept DBNAME: testdb DB2INST1_PASSWORD: my-pass volumes: - dbdata:/database # This container takes HTTP requests and translates them into DB2 queries db_client: platform: linux/x86_64 build: ./db-client environment: DB2_HOST: db2 DB2_PORT: 50000 # Prismatic on-prem agent which will proxy requests to the "client" container on-prem-agent: image: prismaticio/on-prem-agent:latest environment: PRISMATIC_URL: https://app.prismatic.io APP_PORT: 4000 APP_HOST: db_client NAME: DB2 Client REGISTRATION_JWT: eyJ0e... restart: always # Use "always" to start this service when the Docker engine starts ``` ##### Invoking the database client from an integration[​](#invoking-the-database-client-from-an-integration "Direct link to Invoking the database client from an integration") Now that an HTTP server is running alongside the on-prem agent, you can either use the built-in [HTTP](https://prismatic.io/docs/components/http.md) connector or build a custom connector to send requests to the database client. 
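If you go the custom connector route, a minimal sketch of such an action might look like the following (the action name, input names, and the assumption that your connection defines `username`, `password`, and `database` fields are all hypothetical); it simply forwards a SELECT statement to the `/select` endpoint defined in `app.py` above:

```
import { action, input, util } from "@prismatic-io/spectral";
import { createClient } from "@prismatic-io/spectral/dist/clients/http";

export const runSelectQuery = action({
  display: {
    label: "Run SELECT Query",
    description: "Send a SELECT statement to the on-prem DB2 client container",
  },
  inputs: {
    connection: input({ label: "Connection", type: "connection", required: true }),
    query: input({ label: "Query", type: "string", required: true }),
  },
  perform: async (context, { connection, query }) => {
    // host and port are supplied by the on-prem service at execution time
    const host = util.types.toString(connection.fields.host);
    const port = util.types.toString(connection.fields.port);
    const client = createClient({ baseUrl: `http://${host}:${port}` });

    // POST the query to the Flask endpoint defined in app.py
    const { data } = await client.post("/select", {
      username: util.types.toString(connection.fields.username),
      password: util.types.toString(connection.fields.password),
      database: util.types.toString(connection.fields.database),
      query: util.types.toString(query),
    });
    return { data };
  },
});
```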
Since our database client runs a web server on port 4000 and we're proxying requests through the on-prem agent to the database client, we can point an [HTTP POST](https://prismatic.io/docs/components/http.md#post-request) request to `http://localhost:4000/select` and send a query that we would like to run. ![Send a POST request to our on-prem database client](/docs/img/integrations/connections/on-prem-agent/databases-with-client-libraries/post-request.png) --- #### On-Prem Agent on Windows The on-prem agent is a Linux-based Docker container, which naturally lends itself well to a Linux host. For ease of installation, we recommend running the on-prem agent on a Linux host. However, if for compliance or other reasons you are required to run the on-prem agent on a Windows host, you can. This document outlines considerations for installing the on-prem agent on a Windows host. #### Installing Docker on a Windows host[​](#installing-docker-on-a-windows-host "Direct link to Installing Docker on a Windows host") You have several options for installing Docker on a Windows host. 1. You can run a Linux host in Windows Subsystem for Linux (WSL2) and run `docker` within WSL2. To install WSL2 on your Windows host, run ``` wsl --install ``` from PowerShell and follow the prompts. Once WSL2 is installed, assume the root user with `sudo su` and run ``` apt update && apt upgrade -y ``` Then, follow the [steps on Docker's website](https://docs.docker.com/engine/install/ubuntu/#install-using-the-repository) to add Docker's aptitude repository and install the latest Docker packages. Additionally, install `docker-compose` with ``` apt install docker-compose ``` You can start the docker service as the root user with ``` service docker start ``` 2. You can download and install [Docker Desktop](https://www.docker.com/) for Windows. Note that depending on your company's size and other factors, you may need to license Docker Desktop. Please consult Docker's licensing information. 3. You can purchase and install the [Mirantis Container Runtime](https://www.mirantis.com/software/mirantis-container-runtime/) Docker engine. #### Ensuring containers run on boot on a Windows host[​](#ensuring-containers-run-on-boot-on-a-windows-host "Direct link to Ensuring containers run on boot on a Windows host") By default, WSL2 and Docker Desktop are not launched until a user has logged in to the Windows host. This makes maintenance difficult - a system that reboots does not automatically launch WSL2 or Docker Desktop without a manual login. However, you can use Windows scheduled tasks to ensure that WSL2 or Docker Desktop start automatically. note Regardless of whether you use WSL2, Docker Desktop, or another Docker service on your Windows host, please remember to mark your on-prem agent as `restart: always` and run docker compose in "detached" mode (i.e. `docker-compose up --detach`) For example, docker-compose.yml ``` services: on-prem-agent: image: prismaticio/on-prem-agent:latest environment: PRISMATIC_URL: https://app.prismatic.io APP_PORT: 1433 APP_HOST: 10.0.0.123 NAME: Acme MS SQL REGISTRATION_JWT: eyJ0eXAiOiJK... restart: always # Use "always" to start this service when the Docker engine starts ``` To create a new scheduled task, search for "Task Scheduler" from your Windows search bar. 
![Open the task scheduler from the windows search bar](/docs/img/integrations/connections/on-prem-agent/on-prem-agent-windows/task-scheduler.png) Then, select **Action** > **Create Task** ![Create a new scheduled task](/docs/img/integrations/connections/on-prem-agent/on-prem-agent-windows/create-task.png) Within the scheduled task, ensure that **Run whether user is logged on or not** and **Run with highest privileges** are selected. Give the task an identifiable name and select an appropriate version of Windows for your installation. ![Configure a task to run on boot](/docs/img/integrations/connections/on-prem-agent/on-prem-agent-windows/run-as-user.png) ##### Configuring Docker Desktop to run on boot[​](#configuring-docker-desktop-to-run-on-boot "Direct link to Configuring Docker Desktop to run on boot") Within a new scheduled task's **Triggers** tab, create a new trigger to begin the task **At startup**. Set the **Delay task for** to **1 minute** and ensure **Enabled** is selected. ![Setting up triggers for Docker Desktop](/docs/img/integrations/connections/on-prem-agent/on-prem-agent-windows/triggers-docker-desktop.png) Under the **Actions** tab, create a new action and start the program `"C:\Program Files\Docker\Docker\Docker Desktop.exe"` ![Setting up actions for Docker Desktop](/docs/img/integrations/connections/on-prem-agent/on-prem-agent-windows/actions-docker-desktop.png) After rebooting your Windows host, you should see Docker Desktop and any containers marked `restart: always` automatically start without user login within a few minutes. ##### Ensuring WSL2 runs on boot[​](#ensuring-wsl2-runs-on-boot "Direct link to Ensuring WSL2 runs on boot") Within a new scheduled task's **Triggers** tab, create a new trigger to begin the task **At startup**. Set the **Delay task for** to **1 minute** and ensure **Enabled** is selected. Additionally, set **Repeat task every** to every couple of minutes for the first 15 minutes. HyperV may not be ready to start the Docker service one minute after boot, and the additional task repeats help to ensure that the task completes eventually. ![Setting up triggers for WSL2](/docs/img/integrations/connections/on-prem-agent/on-prem-agent-windows/triggers-wsl.png) Under the **Actions** tab, create a new action and start the program `"C:\Program Files\WSL\wsl.exe"`. Under **Add arguments (optional)**, include the arguments: ``` -u root -e sh -c "service docker status || service docker start" ``` ![Setting up actions for WSL2](/docs/img/integrations/connections/on-prem-agent/on-prem-agent-windows/actions-wsl.png) This instructs your machine to run `wsl.exe` on boot, and as the root user it will start the `docker` service if it is not already running. After rebooting your Windows host, you should see WSL, its Docker service, and any containers marked `restart: always` automatically start without user login within a few minutes. --- #### Troubleshooting On-Prem Connections When an integration makes a request to an on-prem resource, it takes several steps to get there: 1. The integration sends a request to the cloud-based on-prem service. 2. The on-prem service forwards that request to an on-prem agent on your customer's network. 3. The on-prem agent forwards the request to your customer's private resource (database, file server, etc.). When a step that relies on an on-prem connection throws an error, it's important to be able to determine where in the chain the connection failed. 
This article offers basic troubleshooting tips you can use to verify connectivity. #### Verifying integration connectivity to the on-prem service[​](#verifying-integration-connectivity-to-the-on-prem-service "Direct link to Verifying integration connectivity to the on-prem service") There's not much to debug when it comes to this initial network hop. When an instance configured with an on-prem connection runs, the connection contains two system-generated inputs: `host` and `port`. These values refer to a host and port of Prismatic's **on-prem service** running alongside your integration. If you're using a [built-in connector that supports on-prem](https://prismatic.io/docs/integrations/connections/on-prem-agent.md#connectors-with-on-prem-support), it will use these host and port values automatically. If you want to build a custom connector that supports on-prem connections, see [Supporting on-prem connections in a custom connector](https://prismatic.io/docs/integrations/connections/on-prem-agent.md#supporting-on-prem-connections-in-a-custom-connector). Verify that you are not hard-coding or caching host or port values. Some database libraries persist database connections behind the scenes and can cause issues if the `port` value changes between executions. Ensure that you close connections when a custom action finishes. #### Verifying the on-prem agent is connected[​](#verifying-the-on-prem-agent-is-connected "Direct link to Verifying the on-prem agent is connected") Next, you should verify that the on-prem agent on your customer's network is connected to Prismatic's on-prem service. Running `prism on-prem-resources:list` will show you a list of on-prem resources and their statuses: ``` $ prism on-prem-resources:list Name Status Customer ───────────────────── ─────────── ───────────── Test MSSQL Server AVAILABLE Acme MSSQL Server UNAVAILABLE Acme Corp IBM DB2 Test Server AVAILABLE Hooli IBM DB2 Server AVAILABLE Hooli ``` Resources without a **Customer** listed are those that you've marked `--orgOnly` and are used for testing on-prem connections in the integration designer. Verify that your customer's on-prem resource appears on this list and is marked `AVAILABLE`. Additionally, check the on-prem agent container's logs. You should see lines that look like this: ``` 2024-12-18 11:48:38 Name: IBM DB2 Test Server 2024-12-18 11:48:39 Registered successfully. ``` If the `Registered successfully` log line is missing, your customer may not allow network egress from the Docker network and will need to open outbound connections to Prismatic on ports 22 and 443. Ensure your Docker container can reach `onprem.prismatic.io` (or the equivalent on-prem endpoint for other regions or white-label domains) on **port 22**, and `app.prismatic.io` (or your region or white-label domain) on **port 443**. From the Docker network, your customer can verify connectivity to Prismatic's on-prem service by running `telnet <host> <port>`, replacing `<host>` with the appropriate on-prem service endpoint for their region or white-label domain, and `<port>` with 22 or 443. They should see a connection message like this: ``` > telnet onprem.eu-west-1.prismatic.io 22 Trying 99.80.255.255... Connected to onprem.eu-west-1.prismatic.io. Escape character is '^]'.
SSH-2.0-OpenSSH_9.9 ``` #### Verifying connectivity to the customer's resource[​](#verifying-connectivity-to-the-customers-resource "Direct link to Verifying connectivity to the customer's resource") The easiest way to verify that the on-prem agent can access the customer's resource is to attempt to `telnet` from the on-prem container to the resource. That can be done by opening `bash` in the on-prem agent, installing telnet, and checking for connectivity: ``` # Identify the ID of your on-prem container (something like 0ff1cec0ffee) $ docker container ls # Open a bash shell using the container's ID within your on-prem container $ docker exec -it 0ff1cec0ffee bash # Make sure APP_HOST and APP_PORT environment variables look correct in the on-prem container root@0ff1cec0ffee:/home/envoy$ printenv # Install telnet in the on-prem container root@0ff1cec0ffee:/home/envoy$ apt update && apt install telnet # Verify connectivity from on-prem container to private resource root@0ff1cec0ffee:/home/envoy$ telnet ${APP_HOST} ${APP_PORT} ``` If the `telnet` command throws an error, the on-prem agent container is unable to access the private resource. The private resource may have a firewall in place, or the customer's network might prevent connectivity between the Docker network and the private resource's network. Ask your customer to enable network traffic between the Docker container and private resource. Additionally, check the on-prem container's logs. If it has received requests from your integration but cannot forward those requests to the customer's private resource, you'll see a log that looks like this: ``` Received data but failed to forward it to the final destination service (172.21.0.2:4000) ``` --- #### Data Sources **Data Sources** allow you to dynamically populate config variables after users enter their connection settings. This is particularly useful because customers' configurations in third-party applications vary. Customers have different Facebook business names, Google Analytics accounts, Salesforce resource fields, and more. You can retrieve this information and allow customers to choose specific options, such as Slack channels, Google Drive folders, or Salesforce field mappings. To populate a config variable dynamically from a data source, first ensure that a **connection** config variable for the third-party application exists on your first config page. These connections are typically created automatically when you add a step from that third-party to your integration. On subsequent pages, add your desired config variables. Under **Data Source**, select the component and data source you want to use (for example, the [Slack Select Channel](https://prismatic.io/docs/components/slack.md#select-channel) data source). When customers configure the integration, they'll authenticate with the third-party application on the first config page. Then, they'll see their data dynamically loaded from the third-party service populate the config variables. Many [public connectors](https://prismatic.io/docs/components.md) include data sources for common configuration tasks (like selecting records from dropdown menus). To build your own data sources, either using an existing public connector's connection or adding a data source to your custom connector, see [config Wizard Data Sources](https://prismatic.io/docs/custom-connectors/data-sources.md). 
--- #### Debugging JSON Forms Locally In this video, we debug a JSON Form config variable that is throwing an error by running the data source locally in our IDE. Invoking the data source within our editor as we edit and test our code provides a tighter feedback loop and allows us to iterate quickly. The debugging demonstrated in this video leveraged the [`prism components:dev:run`](https://prismatic.io/docs/cli/prism.md#componentsdevrun) command to fetch an active Slack OAuth 2.0 connection from a Prismatic test instance. You can read more about the custom connector unit testing harness that was used [here](https://prismatic.io/docs/custom-connectors/unit-testing.md). --- #### Salesforce Field Mapper Every customer of Salesforce uses SFDC differently. Some customers add unique fields to existing resources. Others use existing fields in unique ways. If you integrate with a CRM, you may want your users to map fields from the CRM to your product. In this tutorial, we'll examine how to use an existing Salesforce connection to fetch your customer's SFDC fields and present them to a user as a field map during the instance configuration process using [JSON Forms](https://prismatic.io/docs/custom-connectors/data-sources.md#json-forms-data-sources). ![Screenshot of a data mapper in the configuration wizard](/docs/img/integrations/data-sources/field-mapping/salesforce-field-mapper.png) The final product is available in our [examples repo](https://github.com/prismatic-io/examples/blob/main/components/salesforce-field-mapping-example/src/index.ts). [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=data-mapper-accordion) #### Initializing the data source component[​](#initializing-the-data-source-component "Direct link to Initializing the data source component") You can initialize the data mapper data source project as you would [any other custom component](https://prismatic.io/docs/custom-connectors/initializing.md). After initializing the component, remove all templated files in the `src/` directory. Then, install Salesforce's NPM package, [jsforce](https://jsforce.github.io/): Install JSforce and its TypeScript definitions ``` npm install jsforce npm install --save-dev @types/jsforce ``` If you are integrating with a different CRM, you can use their respective NPM package or a [generic HTTP client](https://prismatic.io/docs/custom-connectors/connections.md#using-the-built-in-createclient-http-client). #### Reusing HTTP connections[​](#reusing-http-connections "Direct link to Reusing HTTP connections") It would be a poor user experience to require a user to authenticate with Salesforce twice. Instead, add a Salesforce step to your integration (that will automatically create a Salesforce connection config variable). You can re-use that existing Salesforce connection for your data source! You do not need to define any `connections` for your component. Simply add a connection input to your data source: Reuse the existing Salesforce connection ``` const salesforceFieldMappingExample = dataSource({ dataSourceType: "jsonForm", display: { label: "Salesforce field mapping example", description: "Map fields from a Salesforce 'Lead' to an acme 'Sale'", }, inputs: { sfConnection: input({ label: "Salesforce Connection", type: "connection", required: true, }), }, // ... }); ``` The Salesforce connection uses OAuth, so the access token that you'll need will be available via `sfConnection.token?.access_token`. 
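Because the token is read with optional chaining, it can be worth failing fast with a clear error if the OAuth flow hasn't completed yet (for example, an unconfigured test connection). This is an optional sketch, not part of the original example; `requireAccessToken` is a hypothetical helper name. Optionally validate the connection's access token
```
import { Connection, util } from "@prismatic-io/spectral";

// Optional helper: return the access token or throw a descriptive error.
const requireAccessToken = (connection: Connection): string => {
  const accessToken = util.types.toString(connection.token?.access_token);
  if (!accessToken) {
    throw new Error(
      "The Salesforce connection has no access token. Re-authenticate the connection and try again.",
    );
  }
  return accessToken;
};
```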
#### Fetch fields from SFDC[​](#fetch-fields-from-sfdc "Direct link to Fetch fields from SFDC") Next, we'll use the existing connection to fetch fields from SFDC. Fetch custom fields on the Lead resource from Salesforce ``` { // ... perform: async (context, params) => { // Reference an existing SFDC OAuth access token const salesforceClient = new jsforce.Connection({ instanceUrl: util.types.toString(params.sfConnection.token?.instance_url), version: "51.0", accessToken: util.types.toString(params.sfConnection.token?.access_token), }); // Fetch all fields on a Lead using https://jsforce.github.io/document/#describe const { fields } = await salesforceClient.sobject("Lead").describe(); // Filter out non-required fields const salesforceRequiredLeadFields = fields.filter( ({ nillable }) => !nillable, ); }; } ``` For illustration purposes, we fetched fields on the "Lead" resource and filtered them down to only fields that are required (i.e. not `nillable`). You can fetch fields on any resource and can choose to filter those fields or not. #### Generate a JSON Forms schema[​](#generate-a-json-forms-schema "Direct link to Generate a JSON Forms schema") JSON Forms allows you to define a **schema** where you declare how the UI that your customer uses should look. Here, we hard-code some fields from "Acme" and create an array where every element of the array has one Salesforce Lead field and one Acme field: Create the JSON Form Schema ``` // Hard-code Acme fields - these can be fetched from an external source, too const acmeSaleFields: { name: string; id: number }[] = [ { id: 123, name: "First Field" }, { id: 456, name: "Second Field" }, { id: 789, name: "Third Field" }, ]; // Schema defines the shape of the object to be returned to the integration, // along with options for dropdown menus const schema = { type: "object", properties: { mymappings: { // Arrays allow users to make one or more mappings type: "array", items: { // Each object in the array should contain a salesforceField and an acmeField type: "object", properties: { salesforceLeadField: { type: "string", // Have users select "one of" a dropdown of items oneOf: salesforceRequiredLeadFields.map((field) => ({ // Display the pretty "label" like "My First Name" to the user title: field.label, // Feed programmatic "name" like "My_First_Name__c" to the integration const: field.name, })), }, acmeSaleField: { type: "string", oneOf: acmeSaleFields.map((field) => ({ title: field.name, const: util.types.toString(field.id), // JSON Forms requires string values })), }, }, }, }, }, }; ``` JSON Forms also require UI schema, which determines how the above UI elements should be placed (vertically, horizontally, etc): Define UI Schema ``` // UI Schema defines how the schema should be displayed in the configuration wizard const uiSchema = { type: "VerticalLayout", elements: [ { type: "Control", scope: "#/properties/mymappings", label: "Salesforce Lead <> Acme Sale Field Mapper", }, ], }; ``` #### Add an optional default mapping[​](#add-an-optional-default-mapping "Direct link to Add an optional default mapping") If you have an idea of what SFDC fields should map to your fields, you can provide a default mapping. 
Here, for illustration purposes, we simply map the first three SFDC fields to our three fields: Add an optional default mapping ``` const defaultValues = { mymappings: [ { salesforceLeadField: util.types.toString( salesforceRequiredLeadFields[0].name, ), acmeSaleField: util.types.toString(acmeSaleFields[0].id), }, { salesforceLeadField: util.types.toString( salesforceRequiredLeadFields[1].name, ), acmeSaleField: util.types.toString(acmeSaleFields[1].id), }, { salesforceLeadField: util.types.toString( salesforceRequiredLeadFields[2].name, ), acmeSaleField: util.types.toString(acmeSaleFields[2].id), }, ], }; return { result: { schema, uiSchema, data: defaultValues }, }; ``` #### The completed code[​](#the-completed-code "Direct link to The completed code") The full source code of this field mapper can be found in our [examples repo](https://github.com/prismatic-io/examples/blob/main/components/salesforce-field-mapping-example/src/index.ts). It serves as a good jumping-off point for your own field mapping, and you can modify it however you like. For example, you could: * Fetch your app's fields dynamically rather than hard-coding them * Fetch fields for other SFDC resources, like opportunities or accounts * Fetch fields from another CRM #### Using the field mapper data in an integration[​](#using-the-field-mapper-data-in-an-integration "Direct link to Using the field mapper data in an integration") The next step is to use the results of that field mapper to map data in a flow. In the video above, we create a field mapper that maps Salesforce fields to "Acme" fields. The result of the field mapper config variable is a JavaScript object that looks like this: ``` [ { "source": "Id", "destination": "external_id", }, { "source": "Name", "destination": "acct_name", }, { "source": "AnnualRevenue", "destination": "revenue", }, ]; ``` Salesforce yields data that looks like this, with keys in [Pascal Case](https://en.wikipedia.org/wiki/Camel_case): ``` { "Id": "0018c0000321QvpAAE", "Name": "Example Account", "AnnualRevenue": 1000000 } ``` To map fields from the Salesforce payload to an "Acme" payload, we can apply a JavaScript [reduce](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/reduce) function to the map configuration object: ``` module.exports = async ({ logger, configVars }, stepResults) => { const sfdcAccount = stepResults.loopOverAccounts.currentItem; const mapping = configVars["Salesforce Account Field Mapping"]; const mappedFields = mapping.reduce( (acc, { source, destination }) => ({ [destination]: sfdcAccount[source], ...acc, }), {}, ); return { data: mappedFields }; }; ``` That will yield an object that our "Acme" API can consume: ``` { "external_id": "0018c0000321QvpAAE", "acct_name": "Example Account", "revenue": 1000000 } ``` --- #### JSON Forms Config Variables JSON Forms help customize the deployment experience for your customers. They allow you to add one or many custom fields to the configuration wizard by defining a JSON schema and UI schema. This article describes how JSON Forms can be used in the [configuration wizard](https://prismatic.io/docs/integrations/config-wizard.md) to provide your customers a tailored deployment experience. Check out the [JSON Forms Playground](https://prismatic.io/docs/jsonforms/playground) to see several examples of JSON Forms in action. Several additional examples can be found in the JSON Forms project [documentation](https://jsonforms.io/).
To build a JSON form within a custom connector, check out the docs and video [here](https://prismatic.io/docs/custom-connectors/data-sources.md#json-forms-data-sources). #### Schema and UI schema[​](#schema-and-ui-schema "Direct link to Schema and UI schema") A JSON Form is defined by `schema`, which is the data model (the shape of the data you expect to be returned), and `uiSchema`, which describes how various input fields should be rendered. A simple `schema` might look like this: Example JSON Schema ``` { "type": "object", "properties": { "companyName": { "type": "string" }, "companyDescription": { "type": "string", "description": "You can enter multiple lines here" }, "numEmployees": { "type": "integer", "description": "Include employees in all offices" }, "continent": { "type": "string", "enum": [ "North America", "South America", "Europe", "Asia", "Africa", "Australia" ] }, "biDirectionalSync": { "type": "boolean" } }, "required": ["companyName"] } ``` In the example above, we declare that this JSON form will return an object, and that object will have five properties: `companyName`, `companyDescription`, `numEmployees`, `continent`, and `biDirectionalSync`. Company name is required, but the other fields are optional. The `companyDescription` field is a string, and the `numEmployees` field is an integer. The `continent` field is a string, but it can only be one of the values in the `enum` array. The `biDirectionalSync` field is a boolean. This JSON form will return an object like this: Example JSON Form data ``` { "companyName": "Acme Corp", "companyDescription": "We make everything", "numEmployees": 100, "continent": "North America", "biDirectionalSync": true } ``` In order to render this form, we need to define a `uiSchema` that describes how the fields should be rendered. A `uiSchema` could look something like this: Simple UI Schema ``` { "type": "VerticalLayout", "elements": [ { "type": "Control", "scope": "#/properties/companyName" }, { "type": "Control", "scope": "#/properties/companyDescription", "options": { "multi": true } }, { "type": "Control", "label": "Employee Count", "scope": "#/properties/numEmployees" }, { "type": "Control", "scope": "#/properties/continent" }, { "type": "Control", "label": "Sync Data Bi-Directionally?", "scope": "#/properties/biDirectionalSync" } ] } ``` In the example above, we declare that the form should be rendered as a vertical layout, and that the fields should be rendered in the order they are defined in the `elements` array. Some optional labels were added to override the default labels that are derived from property names. ![Screenshot of a basic JSON form](/docs/img/integrations/data-sources/json-forms/basic-form.png) [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=basic) #### Types of fields in JSON Forms[​](#types-of-fields-in-json-forms "Direct link to Types of fields in JSON Forms") Several types of input fields are supported in JSON Forms, including plain strings, boolean checkboxes or toggles, numbers, datetime pickers, and more. Note that time and datetime fields are `string` type fields with a `time` or `date-time` format property. 
Examples of fields in JSON forms ``` { "type": "object", "properties": { "string": { "type": "string" }, "boolean": { "type": "boolean", "description": "Boolean description as a tooltip" }, "number": { "type": "number" }, "integer": { "type": "integer" }, "date": { "type": "string", "format": "date" }, "time": { "type": "string", "format": "time" }, "dateTime": { "type": "string", "format": "date-time" } } } ``` #### Schema UI layouts[​](#schema-ui-layouts "Direct link to Schema UI layouts") UI Elements can be laid out horizontally or vertically, and layouts can be nested. In this example, a `HorizontalLayout` contains a `VerticalLayout` element and a `Group` element. The `Group` type layout adds a container and `label` around its vertically organized input elements. Nested Layouts ``` { "type": "HorizontalLayout", "elements": [ { "type": "VerticalLayout", "elements": [ { "type": "Control", "scope": "#/properties/companyName" }, { "type": "Control", "scope": "#/properties/companyDescription" } ] }, { "type": "Group", "label": "Additional Data", "elements": [ { "type": "Control", "label": "Employee Count", "scope": "#/properties/numEmployees" }, { "type": "Control", "scope": "#/properties/continent" } ] } ] } ``` ![Screenshot of nested layouts in JSON forms](/docs/img/integrations/data-sources/json-forms/nested-layouts.png) [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=nested-layouts) #### Dropdown menus in JSON Forms[​](#dropdown-menus-in-json-forms "Direct link to Dropdown menus in JSON Forms") If you would like to present your customer with a dropdown menu of prepopulated options, you can use an `enum` or `oneOf` property in your schema. An `enum` can be used to select from one of a set of string values: Example using enum ``` { "type": "object", "properties": { "continent": { "type": "string", "enum": [ "North America", "South America", "Europe", "Asia", "Africa", "Australia" ] } } } ``` ![Screenshot of dropdown menu in JSON forms](/docs/img/integrations/data-sources/json-forms/enum.png) In this case, if someone selects "North America", the data presented to the integration will be `{ continent: "North America" }`. `oneOf` is handy if you would like to present a dropdown menu with human-readable labels, but you want to return a different underlying value to the integration. In this example, the same dropdown menu is presented, but the data returned to the integration will be the two-letter continent code (i.e. "NA" for North America): Example using oneOf ``` { "type": "object", "properties": { "continent": { "type": "string", "oneOf": [ { "title": "North America", "const": "NA" }, { "title": "South America", "const": "SA" }, { "title": "Europe", "const": "EU" }, { "title": "Asia", "const": "AS" }, { "title": "Africa", "const": "AF" }, { "title": "Australia", "const": "AU" } ] } } } ``` #### Arrays of fields in JSON Forms[​](#arrays-of-fields-in-json-forms "Direct link to Arrays of fields in JSON Forms") Arrays allow your users to specify values for several copies of a set of fields. In this example, two fields (`channel` which is a `string` and `notifications` which is a dropdown menu) are properties of an object. The object can be represented one or many times within an `array` named `slackChannels`.
Example of an array ``` { "type": "array", "items": { "type": "object", "properties": { "channel": { "type": "string" }, "notifications": { "type": "string", "enum": [ "Opportunity Created", "Opportunity Updated", "Opportunity Closed/Won", "Opportunity Closed/Lost" ] } } } } ``` You can optionally choose to display sort buttons, so a customer user can reorder the array elements within the `uiSchema`: Example UI Schema ``` { "type": "VerticalLayout", "elements": [ { "type": "Control", "scope": "#", "options": { "showSortButtons": true } } ] } ``` ![Screenshot of a basic array in JSON forms](/docs/img/integrations/data-sources/json-forms/basic-array.png) Arrays are presented to the integration as a list of objects: ``` [ { "channel": "#sales-opportunities", "notifications": "Opportunity Created" }, { "channel": "#sales-opportunities", "notifications": "Opportunity Updated" }, { "channel": "#sales-opportunities-won", "notifications": "Opportunity Closed/Won" }, { "channel": "#sales-opportunities-lost", "notifications": "Opportunity Closed/Lost" } ] ``` [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=basic-array) Arrays can be presented in an `Accordion` layout to save space in the configuration wizard. The field used to derive the accordion labels can be specified with the `elementLabelProp` option: Accordion layout UI Schema ``` { "type": "VerticalLayout", "elements": [ { "type": "Control", "scope": "#", "options": { "layout": "Accordion", "elementLabelProp": "channel", "showSortButtons": true } } ] } ``` ![Screenshot of a basic array in JSON forms](/docs/img/integrations/data-sources/json-forms/basic-array-accordion.png) [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=basic-array-accordion) ##### Data mapping with JSON Forms[​](#data-mapping-with-json-forms "Direct link to Data mapping with JSON Forms") Arrays are especially helpful when building a data mapping UI. See the [Building a Field Mapper Data Source](https://prismatic.io/docs/integrations/data-sources/field-mapping/salesforce-field-mapper.md) tutorial for an example of how to build a data mapper between custom fields in Salesforce, and custom fields in your app. [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=data-mapper-accordion) #### Showing or hiding inputs with UI rules[​](#showing-or-hiding-inputs-with-ui-rules "Direct link to Showing or hiding inputs with UI rules") You can choose to show or hide fields conditionally based on the value of other fields. A `rule` is defined in UI schema and contains a `condition` that determines whether or not an `effect` should be applied. In this example, the `convertToUSD` input field is only shown if the value of the `country` field is *not* `United States`: ``` { "type": "Control", "scope": "#/properties/convertToUSD", "options": { "toggle": true }, "rule": { "effect": "SHOW", "condition": { "scope": "#/properties/country", "schema": { "not": { "const": "United States" } } } } } ``` [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=ui-rules) Additional examples are available in JSON Forms' [documentation](https://jsonforms.io/docs/uischema/rules) #### Input validation in JSON Forms[​](#input-validation-in-json-forms "Direct link to Input validation in JSON Forms") You can validate what your customers enter into input fields by adding `format`, `minimum`, `maximum`, `maxLength` and other properties to your schema. 
For string inputs, a `pattern` property allows you to specify a [regular expression](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_expressions) (regex) pattern that values must adhere to - this is useful if you are expecting a very specific format of user-supplied input. Examples of input validation ``` { "type": "object", "properties": { "stringMinLength": { "type": "string", "minLength": 5, "description": "Please enter a string with at least 5 characters" }, "stringMaxLength": { "type": "string", "maxLength": 5, "description": "Please enter a string with at most 5 characters" }, "email": { "type": "string", "format": "email", "description": "Please enter a valid email address." }, "uri": { "type": "string", "format": "uri", "description": "Please enter a valid URI." }, "ipv4": { "type": "string", "format": "ipv4", "description": "Please enter a valid IPv4 address." }, "regex": { "type": "string", "pattern": "^(\\([0-9]{3}\\))?[0-9]{3}-[0-9]{4}$", "description": "Please enter a valid phone number in the form (123)456-7890." }, "intOneTen": { "type": "integer", "minimum": 1, "maximum": 10, "description": "Please enter an integer between 1 and 10." }, "startDate": { "type": "string", "format": "date", "description": "Please enter a date between the first and last day of the current month.", "formatMinimum": "2023-08-01", "formatMaximum": "2023-08-31" } }, "required": [ "stringMinLength", "stringMaxLength", "email", "uri", "ipv4", "regex", "intOneTen", "startDate" ] } ``` [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=input-validation) If you would like to add custom validation rules to your JSON Form, see [JSON Form Validation](https://prismatic.io/docs/integrations/data-sources/json-forms/form-validation.md). #### Autocomplete for JSON Forms dropdown inputs[​](#autocomplete-for-json-forms-dropdown-inputs "Direct link to Autocomplete for JSON Forms dropdown inputs") You can provide a list of options for a dropdown input, and JSON Forms will provide an autocomplete feature for the dropdown menu. Autocomplete can be added to a `oneOf` dropdown menu input by specifying the `autocomplete` option in the `uiSchema`: ``` { "options": { "autocomplete": true } } ``` In this example, begin typing `"Cor"`. Both `"Acme Corp"` and `"Umbrella Corp"` will match. ![Screenshot of a autocomplete in dropdown menu in JSON forms](/docs/img/integrations/data-sources/json-forms/autocomplete.png) [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=autocomplete) #### Field mapper with custom layout[​](#field-mapper-with-custom-layout "Direct link to Field mapper with custom layout") The [Building a Field Mapper Data Source](https://prismatic.io/docs/integrations/data-sources/field-mapping/salesforce-field-mapper.md) tutorial demonstrates how to build a basic field mapper with JSON Forms. If you would like to control the layout of the field mapper, you can use a custom layout. To specify your custom layout, give your `uiSchema` a `options.layout` of `"Accordion"` and design your custom layout using `options.detail`. 
Custom layout UI Schema ``` { "type": "Control", "label": "Salesforce Lead <> Acme Sale Field Mapper", "scope": "#", "options": { "layout": "Accordion", "detail": { "type": "HorizontalLayout", "elements": [ { "type": "Control", "scope": "#/properties/source", "options": { "autocomplete": true } }, { "type": "Control", "scope": "#/properties/destination", "options": { "autocomplete": true } } ] } } } ``` ![Screenshot of a custom field mapper in JSON forms](/docs/img/integrations/data-sources/json-forms/field-mapper-custom-ui.png) [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=data-mapper-detail) #### JSON Forms with multiple tabs[​](#json-forms-with-multiple-tabs "Direct link to JSON Forms with multiple tabs") JSON forms can be used to build complex field mappers with multiple tabs. For example, suppose you would like to map fields from multiple objects in Salesforce to multiple objects. Create tabs with a `"type": "Categorization"` layout in your `uiSchema`. Give each category a `label`: Tabs UI Schema ``` { "type": "Categorization", "elements": [ { "type": "Category", "label": "Contacts", "elements": [ { "type": "Control", "scope": "#/properties/contacts" } ] }, { "type": "Category", "label": "Leads", "elements": [ { "type": "Control", "scope": "#/properties/leads" } ] }, { "type": "Category", "label": "Opportunities", "elements": [ { "type": "Control", "scope": "#/properties/opportunities" } ] } ] } ``` ![Screenshot of a field mapper with tabs in JSON forms](/docs/img/integrations/data-sources/json-forms/field-mapper-tabs.png) [See example in playground](https://prismatic.io/docs/jsonforms/playground?key=data-mapper-tabs) #### Wrapping JSON Forms in a custom component[​](#wrapping-json-forms-in-a-custom-component "Direct link to Wrapping JSON Forms in a custom component") If your form is static and does not depend on fetching data from third-party apps before it renders, you can provide `schema` and `uiSchema` directly to the [JSONForms component](https://prismatic.io/docs/components/jsonforms.md) for rendering. If you would like your form to present dropdown menus and other fields that are sourced from a third-party app, you can build a form in a custom component using the custom component SDK (see [documentation](https://prismatic.io/docs/custom-connectors/data-sources.md#json-forms-data-sources)). ##### Pre-filling default data[​](#pre-filling-default-data "Direct link to Pre-filling default data") In addition to `schema` and `uiSchema` properties, your custom JSON Forms data sources can include a `data` property. When `data` is included, your form will be pre-filled with the data you provide. #### Detecting changes to inputs and overriding default data[​](#detecting-changes-to-inputs-and-overriding-default-data "Direct link to Detecting changes to inputs and overriding default data") Suppose your integration has a config wizard that looks like this: * **Page 1**: Connect to Microsoft Sharepoint with OAuth 2.0 * **Page 2**: Select a Sharepoint site from a dropdown menu * **Page 3**: Present a JSON form tailored to the selected sharepoint site, with default data from the selected site Your config wizard works fine the first time a user walks through its pages, and will continue to work fine if the user re-opens the config wizard and makes the same selections again. But, what if they open the config wizard again and select a different site from page 2? 
By default, if a config variable has user-supplied data, the default `data` your config variable generates is ignored in favor of the user's selections. You can override that behavior. Within the config wizard designer, navigate to **Data Source Reset** within your JSON Forms config variable. ![Configure data source resetting in config wizard](/docs/img/integrations/data-sources/json-forms/data-source-reset.png) * If you select **Never**, the default behavior is used and the user's previous data is displayed. * If you select **Prompt**, the end user will be notified that their data may be stale, and they'll have the option to reset the form to the default values supplied by your data source. ![Configure data source resetting in config wizard](/docs/img/integrations/data-sources/json-forms/data-source-reset-prompt.png) * If you select **Always** and inputs of the data source changed, your datasource's default `data` will override the user's previous selection. --- #### JSON Forms Data Validation [JSON Forms Validation](https://player.vimeo.com/video/1053980092) Specific fields of a JSON Forms config variable can have [validation rules](https://prismatic.io/docs/integrations/data-sources/json-forms.md#input-validation-in-json-forms) applied. For example, you can check that an input matches a [regular expression](https://en.wikipedia.org/wiki/Regular_expression), ensuring that it is an email address or phone number, or you can ensure a date is within a certain range. But, not all data validation can be done with regex and min/max rules. Sometimes it's useful to examine the form in its entirety for correctness. For example, you may want to ensure that a [Salesforce field mapper](https://prismatic.io/docs/integrations/data-sources/field-mapping/salesforce-field-mapper.md) contains a one-to-one field mapping (so a customer can't map "Phone" to both "Cell Phone" and "Work Phone" fields, etc). One straightforward way to tackle JSON Forms form validation is to feed the results of a JSON Form config variable into a subsequent JSON Form data source that validates the data with JavaScript rules you write. This "validator" form can show helpful, human-readable error messages and prevent a user from completing a config wizard until they've corrected their mistakes. In this example, we'll validate a simple JSON Form. #### Our example JSON Form[​](#our-example-json-form "Direct link to Our example JSON Form") For this example, we have a simple form that contains: 1. A field mapper mapping "source" fields to "destination" fields. For example, you can map `Source Option 2` to `Destination Option 5`, etc. It's important that these mappings are unique. So, you can't map both `Source Option 2` and `Source Option 3` to `Destination Option 5`. 2. A number input representing a person's age. While you could validate a number like this with `minimum` and `maximum` [input validators](https://prismatic.io/docs/integrations/data-sources/json-forms.md#input-validation-in-json-forms), we'll show how to validate it with JavaScript here. 
Example JSON Form data source ``` const myJsonForm = dataSource({ dataSourceType: "jsonForm", display: { label: "My JSON Form", description: "A JSON form for testing", }, inputs: {}, perform: async () => { const schema = { type: "object", properties: { mappings: { type: "array", items: { type: "object", properties: { source: { type: "string", enum: [ "Source Option 1", "Source Option 2", "Source Option 3", "Source Option 4", "Source Option 5", ], }, destination: { type: "string", enum: [ "Destination Option 1", "Destination Option 2", "Destination Option 3", "Destination Option 4", "Destination Option 5", ], }, }, }, required: ["source", "destination"], }, age: { type: "integer", }, }, }; const uiSchema = { type: "VerticalLayout", elements: [ { type: "Control", scope: "#/properties/mappings", }, { type: "Control", scope: "#/properties/age", }, ], }; return Promise.resolve({ result: { schema, uiSchema } }); }, }); ``` ![Example JSON Form with field mapping](/docs/img/integrations/data-sources/json-forms/form-validation/example-form.png) Our JSON Forms config variable will yield an object that looks like this: ``` { "mappings": [ { "source": "Source Option 1", "destination": "Destination Option 1" }, { "source": "Source Option 2", "destination": "Destination Option 5" }, { "source": "Source Option 2", "destination": "Destination Option 3" }, { "source": "Source Option 4", "destination": "Destination Option 3" } ], "age": -5 } ``` #### Our example validator form[​](#our-example-validator-form "Direct link to Our example validator form") Within our form validator we want to: 1. Verify that each `source` field was selected at most once. 2. Verify that each `destination` field was selected at most once. 3. Verify that `age` was a positive number no greater than 130. If any of these checks fail, we want to display an error and disallow a user from continuing (note the disabled "Finish" button). ![Failed JSON Forms validation](/docs/img/integrations/data-sources/json-forms/form-validation/validation-failed.png) If all checks pass, we want to display confirmation that their data looks correct and allow the user to continue. ![Passed JSON Forms validation](/docs/img/integrations/data-sources/json-forms/form-validation/validation-passed.png) In our validator code below, we pass our previous JSON Form's results to our "validator" data source. We initialize an array, `errors`, to `[]`. Then, if we detect invalid data in the form that is being processed, we push error messages onto our `errors` array. If `errors` is empty at the end of the function, we display a JSON Form with a label that says `βœ… No errors found`, and the user is able to click the "finish" button in the config wizard. If `errors` contains error messages, those messages like `❌ Age must be a positive number` are displayed in the config wizard. The form then requires an invisible field, `isInvalid`, which cannot be set because it is invisible. This prevents a user from clicking "Finish" when errors are present. 
Validation JSON Form ``` interface JsonFormData { mappings: { source: string; destination: string }[]; age: number; } const myValidator = dataSource({ dataSourceType: "jsonForm", display: { label: "Validator", description: "Validates previous JSON form", }, inputs: { data: input({ label: "Data", type: "data", required: true, clean: (value) => value as JsonFormData, }), }, perform: async (context, { data }) => { const { mappings, age } = data; // Initialize with an empty set of errors const errors: string[] = []; if ((mappings || []).length === 0) { // If the user submitted no mappings, add an error errors.push("❌ At least one mapping is required"); } else { mappings.forEach((mapping, index) => { // If multiple source fields are mapped to a single destination, add an error if (mappings.findIndex((m) => m.source === mapping.source) !== index) { errors.push( `❌ Duplicate source of "${mapping.source}" selected. Only use each source once.`, ); } // If multiple destination fields were mapped to a single source, add an error if ( mappings.findIndex((m) => m.destination === mapping.destination) !== index ) { errors.push( `❌ Duplicate destination of "${mapping.destination}" selected. Only use each destination once.`, ); } }); } // Add an error if there is no age, or the age is too high or low if (age === undefined) { errors.push("❌ You must specify an age"); } else { if (age < 0) { errors.push("❌ Age must be a positive number"); } if (age > 130) { errors.push("❌ Nobody is that old."); } } // If any errors were added to the errors array, return a series of labels displaying the errors if (errors.length) { return Promise.resolve({ result: { schema: { type: "object", properties: { isInvalid: { type: "string", }, }, // Add an invisible, but required, input to prevent the "Finish" button from being clickable required: ["isInvalid"], }, uiSchema: { type: "VerticalLayout", elements: errors.map((error) => ({ type: "Label", text: `Error: ${error}`, })), }, }, }); } else { // If no errors were present, display a single affirmative label and allow a user to continue return Promise.resolve({ result: { schema: { type: "object", properties: {}, }, uiSchema: { type: "VerticalLayout", elements: [{ type: "Label", text: "βœ… No errors found" }], }, }, }); } }, }); ``` --- #### Flows Overview An integration in Prismatic is comprised of one or more **flows** that each serve a specific purpose. For example, if you're building a bidirectional integration with Salesforce, you might have several flows: 1. A flow that is invoked by your app, and syncs leads from your app to Salesforce. 2. A flow that is invoked by your app, and syncs contacts from your app to Salesforce. 3. A flow that queries Salesforce for new leads, and syncs changes to your app. 4. A flow that queries Salesforce for new contacts, and syncs changes to your app. A [flow in the low-code designer](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md) represents a series of steps that are run one after another. Each [step](https://prismatic.io/docs/integrations/low-code-integration-designer/steps.md) runs an action (like "Create Salesforce Lead" or "Get Salesforce Contacts"). You can feed the results of steps into subsequent steps, so the results of "Get Salesforce Leads" can be fed into a step that syncs SFDC leads to your app. A [flow in code-native](https://prismatic.io/docs/integrations/code-native/flows.md) represents a TypeScript function that runs when the flow is invoked. 
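For example, a minimal code-native flow might look like the following sketch. The flow name, endpoint, and response handling are illustrative; see the code-native flow docs linked above for the full API.
```
import { flow } from "@prismatic-io/spectral";

// A minimal, illustrative code-native flow: fetch records from a hypothetical
// API and return them as the flow's result.
export const syncContacts = flow({
  name: "Sync Contacts",
  stableKey: "sync-contacts",
  description: "Fetch contacts from the Acme API and return them",
  onExecution: async (context) => {
    const response = await fetch("https://api.example.com/contacts");
    const contacts = await response.json();
    context.logger.info(`Fetched ${contacts.length} contacts`);
    return { data: contacts };
  },
});
```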
Your flow's code can be as simple or complex as required - it can be a simple HTTP call to fetch data from a third-party, or it can be a sophisticated function that fetches, parses, filters and transforms data from one app to another. Regardless of whether you're using low-code or code-native to build your integrations, each flow begins with a [trigger](https://prismatic.io/docs/integrations/triggers.md). A trigger determines when the flow starts, which can be on a [schedule](https://prismatic.io/docs/integrations/triggers/schedule.md) (i.e. "Every 5 minutes") or can be invoked in response to an [app event](https://prismatic.io/docs/integrations/triggers/app-events.md) (i.e. "when a new file in Dropbox is created."). When a flow's trigger is invoked, an [execution](https://prismatic.io/docs/monitor-instances/executions.md) starts. Prismatic is designed to scale, and multiple executions of a flow can run concurrently (though you can opt to run executions sequentially using a [FIFO queue](https://prismatic.io/docs/integrations/triggers/fifo-queue.md)). --- #### Integration Limits #### The Prismatic runner environment[​](#the-prismatic-runner-environment "Direct link to The Prismatic runner environment") Instances of integrations (and test runs of integrations) execute in isolated NodeJS containers with distinct filesystems and memory. Two instances of the same integration run in distinct isolated environments. Integrations currently run using NodeJS version 18.x. #### Runner limitations[​](#runner-limitations "Direct link to Runner limitations") ##### Memory allocation[​](#memory-allocation "Direct link to Memory allocation") The Prismatic integration runner is allocated 1GB of RAM for execution. If you find that your instances are running out of memory, please review [Memory Management](https://prismatic.io/docs/integrations/memory-management.md) for strategies on how to stay within memory limits. ##### Execution time limitations[​](#execution-time-limitations "Direct link to Execution time limitations") An instance will run for up to 15 minutes. If you have large datasets to process, you can break data into smaller chunks and process chunks [in parallel](https://prismatic.io/docs/integrations/common-patterns/processing-data-in-parallel.md) or in several subsequent executions with [recursive flows](https://prismatic.io/docs/integrations/common-patterns/processing-data-recursive-flows.md). #### Webhook limitations[​](#webhook-limitations "Direct link to Webhook limitations") ##### Webhook request size limitations[​](#webhook-request-size-limitations "Direct link to Webhook request size limitations") Webhook payload size is limited to 6MB. 6MB is generally large enough to handle most JSON, XML, or other webhook payloads. If the payload you need to process exceeds 6MB (i.e. you are processing large images, PDFs, etc.), we recommend saving the large file to a file storage system first (Dropbox, Amazon S3, Azure Files, etc.) and sending *metadata* about the file in your webhook request. Your integration can use the metadata to fetch the file for processing. ##### Synchronous webhook response size limitations[​](#synchronous-webhook-response-size-limitations "Direct link to Synchronous webhook response size limitations") When a webhook is invoked synchronously, the response contains the results of the last step of the flow (so if the last step returned a PDF file, the webhook response would be a PDF file). 
Prismatic writes the response to a file in Amazon S3 and responds with an HTTP 303 (Redirect) to the object in S3. Step results have a maximum size of 500MB. If the results that you generate exceed 500MB, consider writing the file to a file storage system (Dropbox, your own Amazon S3 bucket, etc.) and returning metadata about the file instead. **Read More**: [Synchronous Invocations and Redirects](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md#synchronous-invocations-and-redirects) ##### Synchronous invocation timeouts[​](#synchronous-invocation-timeouts "Direct link to Synchronous invocation timeouts") A webhook request will time out after 30 seconds. Webhook requests to [synchronous triggers](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md) (triggers that wait until the execution finishes running before responding) must complete their work in under 30 seconds. ##### Webhook rate limiting and concurrent executions[​](#webhook-rate-limiting-and-concurrent-executions "Direct link to Webhook rate limiting and concurrent executions") The number of concurrent executions your organization can run is determined by your pricing plan. If your organization is already running that many executions and an additional request is received, the requester will receive a 429 "too many requests" response. When an execution starts, the first log line includes the number of executions that are currently running. ![Concurrent executions in logs](/docs/img/integrations/integration-runner-environment-limits/concurrent-executions.png) ##### Alerting on concurrency limits[​](#alerting-on-concurrency-limits "Direct link to Alerting on concurrency limits") You can alert your team when you approach your concurrent execution limit by creating a concurrency threshold alert monitor. To do that, open **Monitors** from the left sidebar and then select **+ Add alert monitor**. Give your monitor a name and select **Concurrency Threshold Warning** as the **Trigger**. ![Concurrent execution alert monitor](/docs/img/integrations/integration-runner-environment-limits/alert-monitor.png) Within your trigger, select recipients to receive alerts (via email or webhook request). You will be alerted if the number of concurrent executions you're running exceeds 80% of your maximum limit (for example, if your contract allows for 500 concurrent executions, you will receive an alert if you run over 400 executions at one time). --- #### Memory Management By default, the integration runner is allocated [1024 MB (1 GB)](https://prismatic.io/docs/integrations/integration-runner-environment-limits.md#memory-allocation) of memory. This is sufficient memory to complete most common integration tasks. This article covers common memory issues (and how to avoid them). #### Common out-of-memory scenarios[​](#common-out-of-memory-scenarios "Direct link to Common out-of-memory scenarios") There are a few common situations where you may run out of memory and encounter an error stating `Execution has exceeded the maximum allowed memory`. ##### Loading entire large files into memory[​](#loading-entire-large-files-into-memory "Direct link to Loading entire large files into memory") Suppose that your integration needs to pull down a CSV file from an SFTP server, deserialize the file into a JavaScript object, and perform some task on each record. 
If your file is large (say, 80MB in size) and you use several steps to accomplish your goal, you can end up using well over 1 GB of memory: * The SFTP step will use at least 160MB when downloading the file, as it fetches the file contents as a Buffer and converts that to a string. * The SFTP step will use at least 80MB when serializing and persisting the step result. * The step that deserializes the CSV string to a JavaScript object will (in our experience) use 12 times the size of the file in memory. So, this step will likely consume around 960MB of memory! It's easy to see how you could run out of memory quickly. When processing large files, you may benefit from writing a custom action that processes data in a [NodeJS stream](https://nodejs.org/api/stream.html). This way, you load smaller portions of the large file into memory at a time, processing and then discarding each portion. See [Handling Large Files in Custom Components](https://prismatic.io/docs/custom-connectors/handling-large-files-in-custom-components.md) for examples of how to process large files as streams. ##### Excessive logging[​](#excessive-logging "Direct link to Excessive logging") [Logs](https://prismatic.io/docs/monitor-instances/logging.md) are written asynchronously by the runner's logger service to ensure that your flows are not slowed down by logs. If you log thousands of debug lines in a step, the logger service will use a large amount of memory to serialize and commit each of those lines. Try to avoid placing log lines in a loop that iterates thousands of times. ##### Ending a loop with a large step result[​](#ending-a-loop-with-a-large-step-result "Direct link to Ending a loop with a large step result") If you have a set of steps in a [loop](https://prismatic.io/docs/components/loop.md) and the final step returns large step results, the loop step itself may cause an out-of-memory error. This is because the loop step returns an array containing the result of the last step of the loop from each iteration. For example, suppose you have a loop with three steps, and the final step returns a result that is 50 KB. If the loop iterates 10,000 times, then the loop step will return an array that is 50 KB x 10,000 = 500 MB. Add a no-op code step that returns `{ data: null }` at the end of your loop to avoid high-memory-usage loop steps. #### Displaying memory usage[​](#displaying-memory-usage "Direct link to Displaying memory usage") If you'd like to see how much memory is used by each step of your flow, you can enable [debug mode](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode). After each step, you'll see a log line that displays memory usage. `memoryUsage.rss` represents the total memory used by the flow (in *megabytes*). ![Memory usage per step](/docs/img/integrations/memory-management/step-memory-report.png) Additionally, when your execution completes, you will see the maximum amount of memory used by your flow in your execution's logs. ![Memory usage per execution](/docs/img/integrations/memory-management/execution-memory-report.png) If you would like to profile a single custom code or custom connector step within your integration to determine which portion of your code consumes memory, you can use [`debug.memoryUsage`](https://prismatic.io/docs/integrations/troubleshooting.md#measuring-memory-performance-in-a-code-block-or-custom-connector).
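If you want a coarser, do-it-yourself checkpoint, you can also log memory from within a code step using Node's built-in `process.memoryUsage()`. This is an optional sketch; the `downloadFile` step name is hypothetical.
```
module.exports = async ({ logger }, stepResults) => {
  // Log resident set size (in MB) before and after a memory-heavy operation
  // to narrow down which part of the step consumes memory.
  const rssMb = () => Math.round(process.memoryUsage().rss / 1024 / 1024);

  logger.info(`RSS before parse: ${rssMb()} MB`);
  const records = JSON.parse(stepResults.downloadFile.results); // hypothetical prior step
  logger.info(`RSS after parse: ${rssMb()} MB`);

  return { data: records.length };
};
```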
#### Configuring memory limits[​](#configuring-memory-limits "Direct link to Configuring memory limits") By default, the integration runner runs with 1GB of memory. If you continue to encounter memory constraints and have already implemented the above strategies, please contact your Prismatic account manager. We can work out a proposal to allocate additional memory to some instances. --- #### Persisting State between Executions #### Persisting state between executions[​](#persisting-state-between-executions "Direct link to Persisting state between executions") Sometimes it's useful to save data from one execution of an instance so it can be used in a subsequent execution. For example, imagine you have an integration that pulls down and processes records from a third-party API. Your integration recently processed a record with ID `123`, and the next time your integration runs, you want to ensure it processes ID `124` and above. Prismatic provides components and programmatic access to persisted state, so you can save data in one execution and use it in the next. You can persist `123` using a [Save Value](https://prismatic.io/docs/components/persist-data.md#flow---save-value) action, and then the next time your integration runs, it can use [Get Value](https://prismatic.io/docs/components/persist-data.md#flow---get-value) to know that `123` was the most recently processed record. #### Levels of persisted state[​](#levels-of-persisted-state "Direct link to Levels of persisted state") There are four levels of persisted data: * **Execution state** stores state for a single execution of a flow. Data stored are ephemeral and not persisted between executions. Execution state is generally used as a temporary variable or as an accumulator. For example, if you are looping over an array of records and fetching data for each one, you could use the [Execution - Append Value to List](https://prismatic.io/docs/components/persist-data.md#execution---append-value-to-list) action to append each record to a list, which you could load up after your loop in its entirety. * **Flow state** (programmatically called `instanceState` for historical reasons) stores persisted data for a single flow. A flow can access its own state but not its sibling flows' states. Each flow has its own state, and two different flows can run concurrently without overwriting one another's flow state. Flow state is useful if you have a scheduled process that checks for new records in a third-party app. You can use flow state to persist a cursor, so the next time your flow runs, it can pick up where the previous execution left off. * **Cross-Flow state** is shared between all flows within an instance. A flow can access its sibling flows' cross-flow state. If two flows run concurrently and both change state, the flow that finishes last overwrites the data that the first stored. * **Integration state** stores persisted data for all flows of all instances of an integration. All instances of an integration deployed to different customers share state. Integration state can be useful if you're building an integration with an app that only allows you to specify a single inbound webhook URL for all of your customers. In that situation, you could generate a key-value store, matching customers' third-party external IDs to their instance's webhook URLs, allowing you to route requests that arrive at a shared endpoint to the proper instances. 
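As a concrete example of the flow state case above, a scheduled flow could persist a cursor with a code step like this rough sketch (the `fetchNewRecords` step and the record shape are hypothetical):
```
module.exports = async (context, stepResults) => {
  // Flow state is exposed programmatically as "instanceState".
  // Read the cursor saved by the previous execution (default to 0 on the first run).
  const lastProcessedId = context.instanceState["Last Processed ID"] || 0;

  // "fetchNewRecords" is a hypothetical earlier step that fetched records
  // from the third-party API. Keep only records newer than the cursor.
  const newRecords = stepResults.fetchNewRecords.results.filter(
    (record) => record.id > lastProcessedId,
  );

  // Save the highest ID we saw so the next execution picks up where this one left off.
  if (newRecords.length > 0) {
    context.instanceState["Last Processed ID"] = newRecords[newRecords.length - 1].id;
  }

  return { data: newRecords };
};
```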
#### How persisted state works in Prismatic[​](#how-persisted-state-works-in-prismatic "Direct link to How persisted state works in Prismatic") The persist data lifecycle is straightforward: 1. When an execution begins, flow state, cross-flow state, and integration state are downloaded and parsed from JSON files. Execution state is initialized to an empty object `{}`. 2. Throughout your execution, you may create, update, or delete key-value pairs in one or more of the states. You can either use the [Persist Data](https://prismatic.io/docs/components/persist-data.md) component or programmatically do something like `context.crossFlowState["Last Product ID"] = "abc-123";`. 3. When the execution completes successfully, execution state disappears. Flow state, cross-flow state, and integration state are compared to their initial values. If their values changed, they are serialized to JSON and written to storage to be loaded in the next execution. Levels of state are evaluated independently Flow, cross-flow, and integration state are evaluated independently. If you change cross-flow state but not flow or integration state, only cross-flow state will be persisted at the end of the execution. #### Limitations of persisted state[​](#limitations-of-persisted-state "Direct link to Limitations of persisted state") It's important to know what persisted state is, and more importantly, what it is not. Persisted state is a useful tool to cache small key/value pairs between executions. It is not a database (and certainly not an [ACID](https://en.wikipedia.org/wiki/ACID) database). Generally, either your app or the app you're integrating with should be considered the source of truth. ##### Concurrent execution limitations[​](#concurrent-execution-limitations "Direct link to Concurrent execution limitations") Persisted state is loaded at the start of an execution and written at the end of a successful execution. State is written out in its entirety when it is changed. Let's look at a few scenarios where you may run concurrent executions: 1. Suppose you want to keep track of a list of records to process. You have two flows that use cross-flow state (one flow adds items to cross-flow state, and one flow reads and removes items from state). Suppose that both flows are invoked at the same time with an initial cross-flow state of `["a", "b"]`. The first flow adds `"c"` to the list and finishes first. It writes out `["a", "b", "c"]` to persisted state. The second flow reads and removes `"a"` and `"b"` from state and writes `[]` to persisted state. In this case, the second flow would overwrite the first flow's state (`[]` would overwrite `["a", "b", "c"]`), and item `"c"` would never be processed. Depending on which flow completes first, you may miss items or double-process items. When processing items, if order is important, consider leveraging a [FIFO queue](https://prismatic.io/docs/integrations/triggers/fifo-queue.md) to ensure that each item is processed exactly once. If order is not important, consider omitting persisted state and processing records in the same flow that receives them. 2. Suppose you have a flow that is invoked via webhook and tracks orders that are processed as key-value pairs. Flow state might look like this: ``` { "id-abc-123": { "item": "Widgets", "qty": 5 }, "id-def-456": { "item": "Gadgets", "qty": 10 } } ``` If two invocations of the same flow occur at the same time, and each attempts to add a key-value pair to flow state, each flow will write out state with different key-value pairs.
The flow that finishes last will overwrite (effectively removing) the key that the first flow wrote. State is written in its entirety Note that state is written in its entirety (rather than key by key). That means that for two concurrently running flows, if one flow writes a value for `instanceState["foo"]`, and then another writes a value for `instanceState["bar"]`, the change to `"foo"` will be overwritten. Generally, Prismatic should be used as the mechanism to move data between systems. The systems (your app and the app you're integrating with) should be the sources of truth where records are stored. 3. Suppose you have two flows, and one calls another via [cross-flow trigger](https://prismatic.io/docs/integrations/triggers/cross-flow.md). The first flow writes state for the second flow to read. This scenario doesn't work, since the second flow starts before the first completes. The second would load state that doesn't contain the first flow's persisted values. When invoking sibling flows, consider sending the data to the sibling flow via POST request. The [cross-flow trigger](https://prismatic.io/docs/integrations/triggers/cross-flow.md) lets you specify data to send to the sibling flow. 4. Suppose you have a flow that processes records that are stored in a list. When your flow runs, it loads 50 records from persisted state. After processing 20 records and removing them from the persisted list, the API you're working with throws an error. This causes your flow to throw an error and stop. In this scenario, the 30 remaining records would *not* be persisted, since your flow did not complete successfully. When it runs next, it will attempt to process the first 20 records a second time. If processing and removing records is important, consider leveraging a [FIFO queue](https://prismatic.io/docs/integrations/triggers/fifo-queue.md). Alternatively, if you know an API is unreliable and may throw errors, you can configure [step-level error handling](https://prismatic.io/docs/integrations/low-code-integration-designer/error-handling.md#step-level-error-handling) to ignore errors from records that cannot be processed, or you can send bad records to a dead-letter queue that you can examine later. ##### Persisted state size limitations[​](#persisted-state-size-limitations "Direct link to Persisted state size limitations") Persisted state is ideal for storing small amounts of data in key-value storage between executions. When serialized to JSON, integration state, cross-flow state, and flow state combined should not exceed 64 MB. If you attempt to store more than 64 MB of state, you will encounter an error stating `Unable to complete execution, persisted state exceeded maximum limit of 67108864 bytes.` ##### When should I use alternative data stores?[​](#when-should-i-use-alternative-data-stores "Direct link to When should I use alternative data stores?") If you are attempting to persist large items (like PDFs or images), consider writing the files to a file storage system like [Amazon S3](https://prismatic.io/docs/components/aws-s3.md) or [Google Drive](https://prismatic.io/docs/components/google-drive.md). If you need to store thousands of key-value pairs, consider a purpose-built key-value store, like [Firebase](https://prismatic.io/docs/components/firebase.md) or [Amazon DynamoDB](https://prismatic.io/docs/components/aws-dynamodb.md). If you need to process records that you receive in order, consider leveraging a [FIFO queue](https://prismatic.io/docs/integrations/triggers/fifo-queue.md). 
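If you're unsure how close your state is to the 64 MB limit described above, a quick diagnostic in a code step can measure the serialized size of each level of state (a sketch using standard Node APIs; the `context` properties used here are described in the next section):

```
module.exports = async (context, stepResults) => {
  // Approximate the JSON-serialized size (in bytes) of each level of persisted state
  const sizes = {
    flowState: Buffer.byteLength(JSON.stringify(context.instanceState)),
    crossFlowState: Buffer.byteLength(JSON.stringify(context.crossFlowState)),
    integrationState: Buffer.byteLength(JSON.stringify(context.integrationState)),
  };
  context.logger.info(`Persisted state sizes (bytes): ${JSON.stringify(sizes)}`);
  return { data: sizes };
};
```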
#### The persist data component[​](#the-persist-data-component "Direct link to The persist data component") Data can be persisted between runs using the [Persist Data](https://prismatic.io/docs/components/persist-data.md) component. Data are stored in key-value pairs, and values can be strings, numbers, objects, or lists. You can choose to persist data with the `Flow -` actions - that lets you persist data scoped to the current flow. You can also use the `Cross Flow -` actions to persist data that can be shared between flows of an instance. Or you can use the `Integration -` actions to persist data between instances of the same integration (so multiple customers can share a data store). You can store a key/value pair using the [Save Value](https://prismatic.io/docs/components/persist-data.md#flow---save-value) action, or you can use [Persist Data](https://prismatic.io/docs/components/persist-data.md)'s other actions to append to a persisted list. If you would like to save a timestamp instead, you can use the [Save Current Time](https://prismatic.io/docs/components/persist-data.md#flow---save-current-time) action to save the current time into a key of your choosing. Later, in a subsequent run, you can fetch the value you saved using the [Get Value](https://prismatic.io/docs/components/persist-data.md#flow---get-value) action. If a key is not set, **Get Value** will return `null`. You can remove data from an array or remove a key/value pair altogether using [Persist Data](https://prismatic.io/docs/components/persist-data.md)'s other actions. #### Accessing persisted data in a code block or custom component[​](#accessing-persisted-data-in-a-code-block-or-custom-component "Direct link to Accessing persisted data in a code block or custom component") Persisted state is accessible through the `context` parameter, which can be referenced in custom components and code steps. ##### Reading persisted state programmatically[​](#reading-persisted-state-programmatically "Direct link to Reading persisted state programmatically") The `context` parameter contains execution state (`executionState`), flow state (`instanceState` for historical reasons), cross-flow state (`crossFlowState`), and integration state (`integrationState`). Within a `perform` function or code step, you can access variables like this: ``` for (const item of context.crossFlowState["My Items"]) { // Process each item } ``` For more information, see [Execution, instance, and cross-flow state](https://prismatic.io/docs/custom-connectors/actions.md#execution-instance-and-cross-flow-state). ##### Writing persisted state programmatically[​](#writing-persisted-state-programmatically "Direct link to Writing persisted state programmatically") To set new values for persisted state keys, you can either return the new values in your return block or mutate `context.*` objects directly. ``` return { data: "Some Data", crossFlowState: { exampleKey: "example value", anotherKey: [1, 2, 3] }, }; // or context.crossFlowState["exampleKey"] = "example value"; context.crossFlowState["anotherKey"] = [1, 2, 3]; ``` To delete a value from state, return a `null` value for the key you want removed: ``` return { data: "Some Data", crossFlowState: { exampleKey: null }, }; ``` ##### Persisted data in code-native integrations[​](#persisted-data-in-code-native-integrations "Direct link to Persisted data in code-native integrations") Persisted data can be accessed in code-native integrations using the same `context` parameter, just as you do for custom connectors. 
See [Code-Native Flows](https://prismatic.io/docs/integrations/code-native/flows.md#persisting-data-between-executions). --- #### Preparing Integrations for the Marketplace Once you've built your integration, it's time to offer it to your customers through the [embedded marketplace](https://prismatic.io/docs/embed/marketplace.md). Before you do that, however, you should take a moment to polish your integration, so it's as user-friendly to set up as possible. Here are a few recommendations. #### Name your integration after the third-party app you're integrating with[​](#name-your-integration-after-the-third-party-app-youre-integrating-with "Direct link to Name your integration after the third-party app you're integrating with") Generally speaking, you can name your integration after the third-party app you're integrating with. Your customer is already logged into your app and is aware that they're enabling an integration between your app and the third-party. They're also aware that you provide integrations, so including "integration" in your integration's name is extraneous. * βœ… Hubspot * βœ… Slack * βœ… Salesforce * ❌ Hubspot Integration * ❌ Acme -> Slack * ❌ Sync records Salesforce <> Acme #### Give your integration an icon[​](#give-your-integration-an-icon "Direct link to Give your integration an icon") Ensure you've added an [icon](https://prismatic.io/docs/integrations/low-code-integration-designer.md#assigning-an-icon-to-an-integration) to your integration. Typically, you can use the icon of the app you're integrating with. If you're integrating with an app that is in our [public connector library](https://prismatic.io/docs/components.md), you can download and use an icon from our documentation. If you're integrating with another app, try to find a square PNG between 128 and 512 pixels wide. ![Integration with icon in marketplace](/docs/img/integrations/preparing-for-marketplace/integration-with-icon.png) #### Give your integration an eye-catching description[​](#give-your-integration-an-eye-catching-description "Direct link to Give your integration an eye-catching description") Your integration's **description** is what appears just under your integration's name in the embedded marketplace. The description is plain text and should be a short summary of what your integration does. For example, `Sync contact records with Salesforce`. Longer, more detailed information about your integration should appear in the [marketplace overview](https://prismatic.io/docs/integrations/preparing-for-marketplace.md#provide-context-and-helpful-text-in-the-marketplace-overview). #### Provide context and helpful text in the marketplace overview[​](#provide-context-and-helpful-text-in-the-marketplace-overview "Direct link to Provide context and helpful text in the marketplace overview") The marketplace overview is seen when a user selects an integration from your marketplace (but has not yet clicked **Configure**). The marketplace overview supports [markdown syntax](https://www.markdownguide.org/basic-syntax), and you can add headings, lists, hyperlinks, images, and more to your overview. This is where you should add additional detail about how your integration works, what information is transferred, what setup is required, etc.
![Integration marketplace overview screen](/docs/img/integrations/preparing-for-marketplace/marketplace-overview.png) #### Add helpful text and links to your integration's config wizard[​](#add-helpful-text-and-links-to-your-integrations-config-wizard "Direct link to Add helpful text and links to your integration's config wizard") Within the config wizard, you can add [helpful headings, text, images, hyperlinks](https://prismatic.io/docs/integrations/config-wizard/config-pages.md#displaying-additional-helper-text-in-the-configuration-wizard) and more. You can add any HTML you'd like to your config wizard. If, for example, setting up authentication in a third-party app is a complex, multi-step process, you can include a step-by-step guide complete with screenshots and links to documentation. ![Netsuite auth guide in config wizard](/docs/img/integrations/preparing-for-marketplace/netsuite-auth-guide.png) #### Ensure your integration's configuration is data source-driven[​](#ensure-your-integrations-configuration-is-data-source-driven "Direct link to Ensure your integration's configuration is data source-driven") Configuration of your integration should be low-effort on the part of the end user. Where possible, pull data from the third-party you're integrating with using [data sources](https://prismatic.io/docs/integrations/data-sources.md). For example, your SharePoint integration may require a SharePoint site ID. Rather than having your customer open their SharePoint account and copy/paste a site ID, use the [List Sites from SharePoint](https://prismatic.io/docs/components/ms-sharepoint.md#list-sites-from-sharepoint) data source to allow them to select a site from a dropdown menu. [Custom data sources](https://prismatic.io/docs/custom-connectors/data-sources.md) can be written to dynamically fetch data from third-party apps and can be used to build dropdown menus or [custom field mappers](https://prismatic.io/docs/integrations/data-sources/json-forms.md#field-mapper-with-custom-layout) using JSON Forms. --- #### Troubleshooting and Debugging Integrations #### Debug mode[​](#debug-mode "Direct link to Debug mode") Debug mode can be enabled in the [integration designer](https://prismatic.io/docs/integrations/troubleshooting.md#enabling-debug-mode-in-the-integration-designer) or for a [specific instance](https://prismatic.io/docs/integrations/troubleshooting.md#enabling-debug-mode-for-an-instance). When debug mode is enabled, two things happen: 1. After each step runs, a log is emitted detailing how long the step took to run and how much memory was consumed. 2. Each code step and component is provided a `debug` object, which you can use to emit additional debug logs or measure memory or time performance of your code. debug mode can cause performance issues Note that debug mode generates additional logs and performance metrics and can cause performance issues when left on. Please use debug mode selectively and deactivate it on instances when you are not actively debugging production issues. ##### Enabling debug mode in the integration designer[​](#enabling-debug-mode-in-the-integration-designer "Direct link to Enabling debug mode in the integration designer") To enable debug mode in the integration designer, click the gray **Debug Mode** button on the right side of the test runner drawer. Then, confirm that you truly want to enable debug mode (as logs can be verbose and can cause performance issues). 
![Debug mode button in the integration designer](/docs/img/integrations/troubleshooting/integration-designer-debug-mode.png) To disable debug mode, click the same (now yellow) button again. ##### Enabling debug mode for an instance[​](#enabling-debug-mode-for-an-instance "Direct link to Enabling debug mode for an instance") To enable debug mode for a specific instance, open the instance's **Summary** tab and select **Disabled** under **Debug Mode**. ![Debug mode button in an instance](/docs/img/integrations/troubleshooting/instance-debug-mode.png) To disable debug mode on an instance, click the same (now yellow) **Enabled** button again. ##### Debug mode logs[​](#debug-mode-logs "Direct link to Debug mode logs") When debug mode is enabled, a log line will be emitted after each step that looks like this: Example line emitted when in debug mode ``` { "stepName": "codeBlock", "memoryUsage": { "rss": 381.484375, "heapTotal": 58.97265625, "heapUsed": 53.90106964111328, "external": 258.9808874130249, "arrayBuffers": 0.14438152313232422 }, "duration": 3.401575000025332, "platform": true } ``` Within the log line are two useful metrics: 1. `duration` indicates the number of milliseconds that the step took to run. 2. `memoryUsage` indicates how much memory was used when this step completed. The `rss` value is the most important metric - it represents the total amount of memory (in MB) that the runner was using. The other values are outlined [here](https://nodejs.org/api/process.html#processmemoryusage). Note that unlike NodeJS documentation (which displays memory values in **bytes**), the log line presents **MB** for readability. ##### Running steps conditionally based on debug mode[​](#running-steps-conditionally-based-on-debug-mode "Direct link to Running steps conditionally based on debug mode") A flow's trigger returns, among other things, the value of `globalDebug`. ![Trigger step global debug flag](/docs/img/integrations/troubleshooting/trigger-step-global-debug.png) You can reference that value in a [branch](https://prismatic.io/docs/components/branch.md) step to determine which branch to follow. #### Using debug mode in a code block or custom connector[​](#using-debug-mode-in-a-code-block-or-custom-connector "Direct link to Using debug mode in a code block or custom connector") The [context](https://prismatic.io/docs/custom-connectors/actions.md#the-context-parameter) parameter passed to a custom connector's actions and triggers contains an object, `debug`, which can be used to write additional debug logging or measure memory or time performance of specific parts of your code. ##### Checking if debug mode is enabled in a code block or custom connector[​](#checking-if-debug-mode-is-enabled-in-a-code-block-or-custom-connector "Direct link to Checking if debug mode is enabled in a code block or custom connector") The context parameter's `debug.enabled` property will be `true` if debug mode is enabled, and `false` otherwise. Determining if debug is enabled in a code step ``` module.exports = async (context, stepResults) => { const widgetId = stepResults.getWidget.results.id; if (context.debug.enabled) { context.logger.log(`Got widget ID "${widgetId}"`); } // ... }; ``` Similar concepts can be applied to custom connector code or code-native integrations using their `context` parameters. 
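For example, the same `debug.enabled` check inside a custom connector action might look like the following sketch (the action shown is hypothetical):

```
import { action } from "@prismatic-io/spectral";

export const myAction = action({
  display: { label: "My Action", description: "Log extra detail in debug mode" },
  inputs: {},
  perform: async (context, params) => {
    // Only emit verbose logging when debug mode is enabled on the instance
    if (context.debug.enabled) {
      context.logger.log("Debug mode is enabled; emitting extra detail");
    }
    return { data: null };
  },
});
```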
##### Measuring time performance in a code block or custom connector[​](#measuring-time-performance-in-a-code-block-or-custom-connector "Direct link to Measuring time performance in a code block or custom connector") If a code block or custom action is running slowly, you can book-end a portion of potentially "slow" code with `context.debug.timeElapsed.mark` functions to note when the "slow" code started and finished. The `context.debug.timeElapsed.measure` function can then be used to determine the time between start and finish, giving you performance metrics for each portion of your code. Get metrics on runtime of parts of a code step ``` async function sleep(ms) { return new Promise((resolve) => setTimeout(resolve, ms)); } const mySlowFunction = async () => sleep(2000); const mySlowerFunction = async () => sleep(5000); module.exports = async (context, stepResults) => { context.debug.timeElapsed.mark(context, "slowCode1_start"); await mySlowFunction(); // A "long-running" process that takes about 2 seconds context.debug.timeElapsed.mark(context, "slowCode1_end"); context.debug.timeElapsed.measure(context, "longRunning1", { start: "slowCode1_start", end: "slowCode1_end", }); context.debug.timeElapsed.mark(context, "slowCode2_start"); await mySlowerFunction(); // A "long-running" process that takes about 5 seconds context.debug.timeElapsed.mark(context, "slowCode2_end"); context.debug.timeElapsed.measure(context, "longRunning2", { start: "slowCode2_start", end: "slowCode2_end", }); return { data: null }; }; ``` Note that in the above code, two functions book-ended with `mark` functions take about 2000 and 5000ms to run respectively. We can find these durations in our instance's logs: ![Time metrics on a code step](/docs/img/integrations/troubleshooting/time-metrics-code-step.png) ##### Measuring memory performance in a code block or custom connector[​](#measuring-memory-performance-in-a-code-block-or-custom-connector "Direct link to Measuring memory performance in a code block or custom connector") At any point in your code block or custom action or trigger, you can invoke `context.debug.memoryUsage`, which will take a snapshot of memory usage at a specific time. Here, we determine how much memory was used before and after fetching data from an API: Determine how much memory was used before and after fetching data ``` module.exports = async (context, stepResults) => { context.debug.memoryUsage(context, "Before Fetching Data"); const response = await fetch("https://jsonplaceholder.typicode.com/todos"); const todoItems = await response.json(); context.debug.memoryUsage(context, "After Fetching Data"); return { data: todoItems }; }; ``` Memory usage (`rss`) represents the total amount of memory consumed by the execution **in MB**. Here, we can see that fetching data consumed about 10MB of memory. ![Memory metrics on a code step](/docs/img/integrations/troubleshooting/memory-metrics-code-step.png) #### Common error messages[​](#common-error-messages "Direct link to Common error messages") This section outlines common error messages that you may see and how to fix them. ##### P10001: Result too large to serialize[​](#p10001-result-too-large-to-serialize "Direct link to P10001: Result too large to serialize") You may see this error when dealing with large files (greater than 128MB) due to limitations on how large a JavaScript `Buffer` can be. Reduce the size of your step result if possible. 
Consider [processing files using streams](https://prismatic.io/docs/custom-connectors/handling-large-files-in-custom-components.md). ##### P10002: Result contains non-serializable functions[​](#p10002-result-contains-non-serializable-functions "Direct link to P10002: Result contains non-serializable functions") You may see this error if your step result contains JavaScript functions. Prismatic serializes step results using [MessagePack](https://msgpack.org/index.html). JavaScript functions are non-serializable. So, you cannot `return { data: { foo: () => "myReturnValue" } }`, for example. As a work-around, you can JSON stringify and JSON parse an object, and functions will be automatically removed. ``` const myReturnValue = { foo: 123, bar: "Hello, World", baz: () => { console.log("Hi!"); }, }; const sanitizedResults = JSON.parse(JSON.stringify(myReturnValue)); return { data: sanitizedResults }; // Returns {foo: 123, bar: "Hello, World"} ``` ##### P10003: Error serializing step results[​](#p10003-error-serializing-step-results "Direct link to P10003: Error serializing step results") An unknown error occurred when serializing your step results. Prismatic serializes step results using [MessagePack](https://msgpack.org/index.html). MessagePack supports a number of complex and primitive types (numbers, strings, objects, arrays, Buffers, null, etc.). Ensure that the data you are returning is included in the [MessagePack spec](https://github.com/msgpack/msgpack/blob/master/spec.md). --- #### Code-Native Integrations ##### Code-Native Integrations New to code-native? Are you new to code-native? Check out our [getting started guide](https://prismatic.io/docs/integrations/code-native/get-started/first-integration.md) to build your first integration. When building an integration, you can use either the [low-code designer](https://prismatic.io/docs/integrations/low-code-integration-designer.md) or create a TypeScript project in your favorite IDE. We call integrations built with code "code-native integrations" (or CNIs). This article explains how to build an integration entirely in code. ![A code-native integration being built in Visual Studio Code](/docs/img/integrations/code-native/cni-in-vscode.png) For example code-native integrations, please visit our [GitHub repository](https://github.com/prismatic-io/examples/tree/main/integrations/code-native-integrations). The rest of the platform is the same Regardless of how you build your integrations, the rest of the platform remains identical. Both code-native and low-code integrations are deployed to the same runner environment. Both can include OAuth 2.0 connections, data sources, and other advanced configuration wizard steps. Integrations built using either approach can include multiple flows and can be added to your integration marketplace. The same logging, monitoring, and alerting tools are available for both types of integrations. The key difference lies in how you build the integration - you either assemble a set of low-code steps in the designer or write TypeScript code to accomplish the same task. --- ##### Code-Native Config Wizard Just like low-code integrations, code-native integrations include a [config wizard](https://prismatic.io/docs/integrations/config-wizard.md). The config wizard can include things like OAuth 2.0 connections, API key connections, dynamically-sourced UI elements (data sources), and other advanced configuration wizard steps. A config wizard consists of multiple pages. 
Each page has a title (derived from the `key` of the `configPage` object), an optional `tagline`, and a set of `elements` (individual config variables). For example, a config wizard might contain a page for a Slack OAuth 2.0 connection, a page where the user selects a channel from a dynamically-populated dropdown menu, and a page where a user enters two static string inputs: Example config pages definition ``` import { configPage, configVar } from "@prismatic-io/spectral"; import { slackConnectionConfigVar } from "./connections"; import { slackSelectChannelDataSource } from "./dataSources"; export const configPages = { Connections: configPage({ tagline: "Authenticate with Slack", elements: { "Slack OAuth Connection": slackConnectionConfigVar, }, }), "Slack Config": configPage({ tagline: "Select a Slack channel from a dropdown menu", elements: { "Select Slack Channel": slackSelectChannelDataSource, }, }), "Other Config": configPage({ elements: { "Acme API Endpoint": configVar({ stableKey: "acme-api-endpoint", dataType: "string", description: "The endpoint to fetch TODO items from Acme", defaultValue: "https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo", }), "Webhook Config Endpoint": configVar({ stableKey: "webhook-config-endpoint", dataType: "string", description: "The endpoint to call when deploying or deleting an instance", }), }, }), }; ``` #### Config variable visibility in code-native[​](#config-variable-visibility-in-code-native "Direct link to Config variable visibility in code-native") Each config variable can have a `permissionAndVisibilityType` property with one of three values: * `customer` is the default value. Customer users can view and edit the config variable, and it will always appear in the config wizard. * `embedded` makes it so the config variable does not show up in the config wizard, but your app is able to [set it programmatically](https://prismatic.io/docs/embed/marketplace.md#dynamically-setting-config-variables-in-marketplace) through the embedded SDK. This is useful if you want to set an API key for a user during the configuration process but not allow the user to see or edit the value that is set. * `organization` makes it so the config variable is not visible to your customer and is not able to be set programmatically by your app. Config variables marked **organization** must have a default value, or else your team members will need to set the value on behalf of your customer. Additionally, `visibleToOrgDeployer` determines if an organization user will see this config variable in the config wizard UI. While organization team members always have programmatic access to instances' config variables and their values, this helps to visually conceal some config variable values like generated metadata from data sources, etc. Defaults to `true`. A debug config variable that is only visible to org team members ``` configVar({ stableKey: "debug", dataType: "boolean", description: "Enable debug logging", defaultValue: "false", permissionAndVisibilityType: "organization", visibleToOrgDeployer: true, }); ``` #### Connections in code-native integrations[​](#connections-in-code-native-integrations "Direct link to Connections in code-native integrations") Connection definitions in CNI are very similar to [custom component connection definitions](https://prismatic.io/docs/custom-connectors/connections.md), but you use the `connectionConfigVar` function instead of the `connection` function.
For example, a Slack OAuth 2.0 connection might look like this: Example connection definition ``` export const slackConnectionConfigVar = connectionConfigVar({ stableKey: "slack-oauth-connection", dataType: "connection", oauth2Type: OAuth2Type.AuthorizationCode, inputs: { authorizeUrl: { label: "Authorize URL", placeholder: "Authorize URL", type: "string", default: "https://slack.com/oauth/v2/authorize", required: true, shown: false, comments: "The OAuth 2.0 Authorization URL for the API", }, tokenUrl: { label: "Token URL", placeholder: "Token URL", type: "string", default: "https://slack.com/api/oauth.v2.access", required: true, shown: false, comments: "The OAuth 2.0 Token URL for the API", }, revokeUrl: { label: "Revoke URL", placeholder: "Revoke URL", type: "string", required: true, shown: false, comments: "The OAuth 2.0 Revocation URL for Slack", default: "https://slack.com/api/auth.revoke", }, scopes: { label: "Scopes", placeholder: "Scopes", type: "string", required: false, shown: false, comments: "Space separated OAuth 2.0 permission scopes for the API", default: "chat:write chat:write.public channels:read", }, clientId: { label: "Client ID", placeholder: "Client ID", type: "string", required: true, shown: false, comments: "Client Identifier of your app for the API", default: SLACK_CLIENT_ID, }, clientSecret: { label: "Client Secret", placeholder: "Client Secret", type: "password", required: true, shown: false, comments: "Client Secret of your app for the API", default: SLACK_CLIENT_SECRET, }, signingSecret: { label: "Signing Secret", type: "password", required: true, shown: false, default: SLACK_SIGNING_SECRET, }, }, }); ``` The above OAuth 2.0 connection would be rendered in a config wizard as a card with a **connect** button that a user can click to authenticate with Slack. ![A card with a connect button that a user can click to authenticate with Slack](/docs/img/integrations/code-native/config-wizard/config-page-oauth.png) Connection inputs yield objects. To access one of the fields in the connection definition (like the `signingSecret` input) in a trigger or `onExecution` function, you can access it using the `configVars` object in the `context` parameter. Accessing connection inputs in a trigger or onExecution function ``` // Access a field defined in the connection definition context.configVars["Slack OAuth Connection"].fields.signingSecret; // Access the access token from the OAuth 2.0 flow context.configVars["Slack OAuth Connection"].token?.access_token; ``` ##### Integration-agnostic connections in code-native[​](#integration-agnostic-connections-in-code-native "Direct link to Integration-agnostic connections in code-native") [Integration-agnostic connections](https://prismatic.io/docs/integrations/connections.md#integration-agnostic-connections) are centrally managed and can be referenced across one or multiple integrations. You can configure * [org-activated global connections](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-global.md) (all customers share a connection) * [org-activated customer connectors](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-customer.md) (you as an organization create a connection for your customer) * [customer-activated connections](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/customer-activated.md) (your customer creates their own connection which can be used in multiple integrations). 
To use an integration-agnostic connection in a code-native integration, take note of the connection's **stable key**. If the connection is a customer-activated connection, use the `customerActivatedConnection` function and add your connection to the first `configPage` in your config wizard. Including a customer-activated connection in your code-native integration ``` import { configPage, customerActivatedConnection, } from "@prismatic-io/spectral"; export const configPages = { Connections: configPage({ elements: { // Customer-activated connection "Salesforce Connection": customerActivatedConnection({ stableKey: "acme-sfdc-connection", }), }, }), }; ``` If the connection is an org-activated customer or org-activated global connection, use the `organizationActivatedConnection` function and add the connection to the `scopedConfigVars` property of your integration definition in `index.ts`. Be sure to export `scopedConfigVars` from your `index.ts` file so TypeScript can infer config variable types in your flows. Including an org-activated connection in your code-native integration ``` import { integration, organizationActivatedConnection, } from "@prismatic-io/spectral"; import flows from "./flows"; import { configPages } from "./configPages"; import { componentRegistry } from "./componentRegistry"; export { configPages } from "./configPages"; export { componentRegistry } from "./componentRegistry"; export const scopedConfigVars = { // Org-activated customer connection "Acme API Key": organizationActivatedConnection({ stableKey: "acme-api-key", }), // Org-activated global connection "OpenAI API Key": organizationActivatedConnection({ stableKey: "openai-api-key", }), }; export default integration({ name: "Acme OpenAI Integration", description: "Connect Acme to OpenAI", iconPath: "icon.png", flows, configPages, componentRegistry, scopedConfigVars, }); ``` #### Data sources in code-native integrations[​](#data-sources-in-code-native-integrations "Direct link to Data sources in code-native integrations") Data sources are defined in CNI similarly to how they are defined in [custom components](https://prismatic.io/docs/custom-connectors/data-sources.md), but you use the `dataSourceConfigVar` function rather than the `dataSource` function.
For example, if you would like to add a picklist dropdown menu to your config wizard that displays a list of Slack channels, you could define a data source like this: Example data source definition ``` import { Connection, Element, dataSourceConfigVar, } from "@prismatic-io/spectral"; import { createSlackClient } from "./slackClient"; import { AxiosResponse } from "axios"; interface Channel { id: string; name: string; } interface ListChannelsResponse { ok: boolean; channels: Channel[]; response_metadata?: { next_cursor: string; }; } export const slackSelectChannelDataSource = dataSourceConfigVar({ stableKey: "slack-channel-selection", dataSourceType: "picklist", perform: async (context) => { const client = createSlackClient( context.configVars["Slack OAuth Connection"] as Connection, ); let channels: Channel[] = []; let cursor = null; let counter = 1; // Loop over pages of conversations, fetching up to 10,000 channels // If we loop more than 10 times, we risk hitting Slack API limits, // and returning over 10,000 channels can cause the UI to hang do { const response: AxiosResponse = await client.get( "conversations.list", { params: { exclude_archived: true, types: "public_channel", cursor, limit: 1000, }, }, ); if (!response.data.ok) { throw new Error( `Error when fetching data from Slack: ${response.data}`, ); } channels = [...channels, ...response.data.channels]; cursor = response.data.response_metadata?.next_cursor; counter += 1; } while (cursor && counter < 10); // Map conversations to key/label objects, sorted by name const objects = channels .sort((a, b) => (a.name < b.name ? -1 : 1)) .map((channel) => ({ key: channel.id, label: channel.name, })); return { result: objects }; }, }); ``` The above data source would be rendered in a config wizard as a picklist dropdown menu that displays a list of Slack channels. ![A picklist dropdown menu that displays a list of Slack channels](/docs/img/integrations/code-native/config-wizard/config-page-datasource.png) You can build advanced UI elements, like field mappers, into your config wizard using [JSON Forms data sources](https://prismatic.io/docs/integrations/data-sources/json-forms.md). #### Other config variable types in code-native integrations[​](#other-config-variable-types-in-code-native-integrations "Direct link to Other config variable types in code-native integrations") Other types of config variables that Prismatic supports (picklists, text inputs, schedules, etc.) can be added to CNI integrations as well. These are generally defined inline alongside other `elements` in a `configPage`: Example config variable definition ``` export const configPages = { // ... "Other Config": configPage({ elements: { "Acme API Endpoint": configVar({ stableKey: "1F886045-27E7-452B-9B44-776863F6A862", dataType: "string", description: "The endpoint to fetch TODO items from Acme", defaultValue: "https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo", }), }, }), }; ``` ![A page in the config wizard with two static string inputs](/docs/img/integrations/code-native/config-wizard/config-page-static-inputs.png) #### Additional config wizard helper text in code-native integrations[​](#additional-config-wizard-helper-text-in-code-native-integrations "Direct link to Additional config wizard helper text in code-native integrations") In addition to config variables, you can add helpful text and images to guide your customers as they work through your config wizard.
To add HTML to the config wizard (which can include links, images, etc.), include a string `element` in a `configPage` definition: Include helper text in the config wizard ``` export const configPages = { Connections: configPage({ elements: { helpertext1: "<h1>Asana Instructions</h1>",
", helpertext2: "To generate an Asana API Key, visit the " + 'developer portal ' + 'and select "Create new token".', "Asana API Key": connectionConfigVar({ stableKey: "f0eab60f-545b-4b46-bebf-04d3aca6b63c", dataType: "connection", inputs: { // ... }, }), }, }), }; ``` ![A page in the config wizard with additional helper text](/docs/img/integrations/code-native/config-wizard/helper-text.png) #### User-level config wizards in code-native integrations[​](#user-level-config-wizards-in-code-native-integrations "Direct link to User-level config wizards in code-native integrations") If your integration relies on [user-level config](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md), you can add a user-level config wizard similar to how you create the integration's config wizard. Within `configPages.ts` create a `userLevelConfigPages` object that has the same shape as `configPages`: User-level config wizard ``` export const userLevelConfigPages = { Options: configPage({ elements: { "My ULC Config Variables": configVar({ dataType: "string", stableKey: "my-ulc-config-var", description: "Enter a widget value", }), }, }), }; ``` Then, in `index.ts` import the `userLevelConfigPages` object. Provide the object as an export of your project (so TypeScript can infer types via `.spectral/index.ts`), and include it in your `integration()` definition: Including user-level config in your component ``` import { integration } from "@prismatic-io/spectral"; import flows from "./flows"; import { configPages, userLevelConfigPages } from "./configPages"; import { componentRegistry } from "./componentRegistry"; export { configPages, userLevelConfigPages } from "./configPages"; export { componentRegistry } from "./componentRegistry"; export default integration({ name: "ulc-example", description: "My user-level config example integration", iconPath: "icon.png", flows, configPages, userLevelConfigPages, componentRegistry, }); ``` #### Config variable stable keys[​](#config-variable-stable-keys "Direct link to Config variable stable keys") Config variables each have a user-supplied `stableKey` property. These keys are used to uniquely identify the config variable in the Prismatic API, and help guard against inadvertent changes to the name of the config variable. Without a stable key, if a config variable's name can be changed the Prismatic API will treat it as a new config variable and existing values assigned to the config variable will be lost. With a stable key, the Prismatic API will be able to map the old config variable to the renamed one, and retain config variable values. Stable keys can be any user-supplied string. You can choose a random UUID, or a string that describes the flow or config variable. --- ##### Code-Native Endpoint Configuration By default, instances of integrations that you deploy will be assigned unique webhook URLs - one URL for each flow. We call this **Instance and Flow Specific** endpoint configuration. Alternatively, you can choose **Instance Specific** endpoint configuration (each instance gets its own webhook URL and all flows share the single URL) or **Shared** endpoint configuration, where all flows of all instances share one URL. To specify endpoint type, add an `endpointType` property to the `integration()` definition in `src/index.ts`. 
It can have values `"instance_specific"`, `"flow_specific"`, or `"shared_instance"` and defaults to `"flow_specific"`: ``` import { integration } from "@prismatic-io/spectral"; import flows from "./flows"; import { configPages } from "./configPages"; import { componentRegistry } from "./componentRegistry"; export default integration({ name: "shared-endpoint-example", description: "Shared Endpoint Example", iconPath: "icon.png", flows, configPages, componentRegistry, endpointType: "instance_specific", }); ``` When **Instance Specific** or **Shared** endpoint configuration is selected, you need some logic to determine which flow (and which customer's instance in the case of **Shared**) should be run. This can be done with or without a [preprocess flow](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#instance-specific-endpoint-with-a-preprocess-flow), and both methods are described below. Full documentation on endpoint configuration is available in the [Endpoint Configuration](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md) article. #### Endpoint configuration in code-native without preprocess flow[​](#endpoint-configuration-in-code-native-without-preprocess-flow "Direct link to Endpoint configuration in code-native without preprocess flow") If the flow that you want to run is specified in the webhook request's body or in a header, you can configure a shared endpoint without a preprocess flow. If, for example, the flow you want to run is specified using a header named `x-acme-flow`, note that header's name in your integration definition using the `triggerPreprocessFlowConfig` property: Instance specific endpoint configuration without a preprocess flow ``` export default integration({ name: "shared-endpoint-example", description: "Shared Endpoint Example", iconPath: "icon.png", flows, configPages, componentRegistry, endpointType: "instance_specific", triggerPreprocessFlowConfig: { flowNameField: "headers.x-acme-flow", }, }); ``` Once an instance has been deployed, this `curl` command would invoke its flow named "Create Opportunity": Invoke an instance specific endpoint with a flow name specified in a header ``` curl https://hooks.prismatic.io/trigger/SW5ExampleInstanceSpecificEndpoint \ -X POST \ --header "content-type: application/json" \ --header "x-acme-flow: Create Opportunity" \ --data '{ "opportunity": { "name": "Foo", "amount": 10000 } }' ``` If all of your instances share an endpoint, you can similarly specify a customer external ID from the request body or headers: Shared endpoint configuration without a preprocess flow ``` export default integration({ name: "shared-endpoint-example", description: "Shared Endpoint Example", iconPath: "icon.png", flows, configPages, componentRegistry, endpointType: "shared_instance", triggerPreprocessFlowConfig: { flowNameField: "headers.x-acme-flow", externalCustomerIdField: "body.data.acmeAccountId", }, }); ``` Invoke a shared endpoint with a flow name header and customer ID in the body ``` curl https://hooks.prismatic.io/trigger/SW5ExampleSharedEndpoint \ -X POST \ --header "content-type: application/json" \ --header "x-acme-flow: Create Opportunity" \ --data '{ "acmeAccountId": "abc-123", "opportunity": { "name": "Foo", "amount": 10000 } }' ``` #### Endpoint configuration in code-native with preprocess flow[​](#endpoint-configuration-in-code-native-with-preprocess-flow "Direct link to Endpoint configuration in code-native with preprocess flow") A preprocess flow allows you to run custom logic to determine which flow should be run
(and in the case of **Shared** endpoint config, which customer should be run). One of your flows can look at the request body or headers, make API calls, etc., and then return the name of the flow (and customer) to run. If you use a preprocess flow, one (and exactly one) of your flows must be marked as the preprocess flow. You cannot specify both a preprocess flow and a `triggerPreprocessFlowConfig` property. This example preprocess flow has an `onExecution` function (like any other flow). This flow returns two properties: `myFlowName` and `myCustomerId` - you can name those properties whatever you like. The `preprocessFlowConfig` property specifies which properties to look for in the response from the preprocess flow: Shared endpoint configuration with a preprocess flow ``` import axios from "axios"; import { flow } from "@prismatic-io/spectral"; const flowMapper = { create_opportunity: "Create Opportunity", update_opportunity: "Update Opportunity", }; interface CreateOpportunityPayload { event: "create_opportunity"; acctId: string; opportunity: { name: string; amount: number; }; } interface UpdateOpportunityPayload { event: "update_opportunity"; acctId: string; opportunity: { id: string; name: string; amount: number; }; } type Payload = CreateOpportunityPayload | UpdateOpportunityPayload; export const myPreprocessFlow = flow({ name: "My Preprocess Flow", stableKey: "my-preprocess-flow", preprocessFlowConfig: { flowNameField: "myFlowName", externalCustomerIdField: "myCustomerId", }, description: "This determines which sibling flow should be invoked", onExecution: async (context, params) => { const { event, acctId } = params.onTrigger.results.body.data as Payload; const customerIdResponse = await axios.post( "https://api.example.com/get-customer-id", { acmeAcctId: acctId, }, ); return Promise.resolve({ data: { myFlowName: flowMapper[event], myCustomerId: customerIdResponse.data.customerId, }, }); }, }); ``` The above preprocess flow will look at a property named `event` in the request body and map an event of `create_opportunity` to the string `Create Opportunity`, returning `Create Opportunity` as the name of the flow to run. It will also extract an `acctId` from the request body and make an HTTP request to `https://api.example.com/get-customer-id` to get an external customer ID, returning that customer ID as well. ``` curl https://hooks.prismatic.io/trigger/SW5ExampleSharedEndpoint \ -X POST \ --header "content-type: application/json" \ --data '{ "event": "create_opportunity", "acctId": "abc-123", "opportunity": { "name": "Foo", "amount": 10000 } }' ``` To create a preprocess flow for **Instance Specific** endpoint configuration, omit the `externalCustomerIdField` property from the `preprocessFlowConfig` object. --- ##### Existing Components in Code-Native Prismatic provides a number of [existing components](https://prismatic.io/docs/components.md) that you can use in your code-native integrations. By leveraging an existing component, you can save time and effort by reusing existing functionality. An example integration that uses the existing Slack OAuth 2.0 connection, Slack "Select Channel" data source, and Slack "Post Message" action is available in [GitHub](https://github.com/prismatic-io/examples/tree/main/integrations/code-native-integrations/slack-with-components). 
#### Adding component manifests to your code-native project[​](#adding-component-manifests-to-your-code-native-project "Direct link to Adding component manifests to your code-native project") Tip: Leverage prism-mcp If you are using an AI coding agent like [Cursor](https://cursor.com/en) or [GitHub Copilot](https://github.com/features/copilot), you can leverage [prism-mcp](https://github.com/prismatic-io/prism-mcp?tab=readme-ov-file#codegen) to accelerate code-native integration development. `prism-mcp` comes with tools that generate code snippets for referencing existing components. To use an existing trigger, connection, data source, or action, you need to add the component's **manifest** to your code-native project. This can be done in one of two ways: 1. **Recommended Method**: Generate a component manifest from Prismatic's API. From a terminal in your code-native directory, run: Generate a component manifest from Prismatic's API for Slack ``` npx cni-component-manifest slack ``` The component **key** that you supply to the `cni-component-manifest` command can be found at the top of a component's docs page. A private component manifest can be generated by adding the `--private` flag to the command: Generate a component manifest from a private component ``` npx cni-component-manifest my-private-component --private ``` 2. *Legacy Method*: Manifests for Prismatic-provided public components are available through Prismatic's component manifests repository. When you initialized your code-native integration, an `.npmrc` file was created that read: .npmrc ``` @component-manifests:registry=https://app.prismatic.io/packages/npm ``` That instructs your package manager to look for packages that begin with `@component-manifests` in the Prismatic repository. To add a component's manifest package to your code-native project, take note of the component's key and run: Add the Slack component's manifest to a code-native project ``` npm install @component-manifests/slack ``` Update to new component reference syntax If you previously added a component manifest using the legacy method, see our [Spectral 10.6 migration guide](https://prismatic.io/docs/spectral/spectral-10-6-upgrade-guide.md) for instructions on how to use new component reference syntax. ##### Including components in your component registry[​](#including-components-in-your-component-registry "Direct link to Including components in your component registry") Once the component manifest is installed, add it to `componentRegistry.ts`: componentRegistry.ts ``` import { componentManifests } from "@prismatic-io/spectral"; import slack from "./manifests/slack"; // Or, if you installed the manifest as an npm dependency, // import slack from "@component-manifests/slack"; export const componentRegistry = componentManifests({ slack, }); ``` Behind the scenes, `.spectral/index.ts` will inspect your code-native project's exported `componentRegistry` object and will provide type hinting to your TypeScript based on which components are included in your component registry. #### Using existing connections in code-native[​](#using-existing-connections-in-code-native "Direct link to Using existing connections in code-native") After adding an existing component's manifest to your code-native project, you can use a component's connection in your code-native integration's `configPages` definition. The component manifest includes a connection definition wrapper function that you can reference. 
Using an existing connection in a code-native integration ``` import { configPage } from "@prismatic-io/spectral"; import { slackOauth2 } from "./manifests/slack/connections/oauth2"; export const configPages = { Connections: configPage({ tagline: "Authenticate with Slack", elements: { "Slack OAuth Connection": slackOauth2("my-slack-connection", { clientId: { value: "REPLACE_ME_WITH_YOUR_CLIENT_ID", permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, clientSecret: { value: "REPLACE_ME_WITH_YOUR_CLIENT_SECRET", permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, signingSecret: { value: "REPLACE_ME_WITH_YOUR_SIGNING_SECRET", permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, scopes: { value: "chat:write chat:write.public channels:read", permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, authorizeUrl: { permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, isUser: { permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, tokenUrl: { permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, revokeUrl: { permissionAndVisibilityType: "organization", visibleToOrgDeployer: false, }, }), }, }), }; ``` #### Using existing data sources in code-native[​](#using-existing-data-sources-in-code-native "Direct link to Using existing data sources in code-native") After adding an existing component's manifest to your code-native project, you can use a component's data source in your code-native integration's `configPages` definition. At the top of your config pages file, import your data source wrapper function from the component's manifest: Using an existing data source in a code-native integration ``` import { configPage } from "@prismatic-io/spectral"; import { slackSelectChannels } from "./manifests/slack/dataSources/selectChannels"; export const configPages = { // ... "Slack Config": configPage({ tagline: "Select a Slack channel from a dropdown menu", elements: { "Select Slack Channel": slackSelectChannels("my-slack-channel-picklist", { connection: { configVar: "Slack OAuth Connection" }, includePublicChannels: { value: true }, }), }, }), }; ``` If the data source requires a connection, you can pass a connection to the data source using the appropriate input value by specifying a `configVar` by name (`connection: { configVar: "Slack OAuth Connection" }` in the example above). #### Using existing triggers in code-native[​](#using-existing-triggers-in-code-native "Direct link to Using existing triggers in code-native") After adding an existing component's manifest to your code-native project, you can use a component's trigger in your code-native integration's `flow` definition. 
At the top of your flows file, import your trigger wrapper function from the component's manifest: Using an existing trigger in a code-native integration ``` import { flow } from "@prismatic-io/spectral"; import { hashHmacWebhookTrigger } from "./manifests/hash/triggers/hmacWebhookTrigger"; export const existingComponentTriggerFlow = flow({ name: "Existing Component Trigger Flow", stableKey: "6f58c32c-b29a-4f55-97e6-b86bf9e24551", description: "This flow uses an existing component trigger", onTrigger: hashHmacWebhookTrigger({ hmacHeaderName: { value: "x-signature-256" }, secretKey: { configVar: "My Secret Key Config Var" }, hashFunction: { value: "sha256" }, headers: { value: [] }, }), onExecution: async (context, params) => { return Promise.resolve({ data: null }); }, }); export default [existingComponentTriggerFlow]; ``` If the trigger takes inputs, each input is passed either as a static `value` or as a reference to a config variable with `configVar: "Config Variable Name"`. How do lifecycle functions work with existing component triggers? If a trigger has an [`onInstanceDeploy` function](https://prismatic.io/docs/custom-connectors/triggers.md#app-event-webhook-triggers), it will be called automatically when the integration is deployed as an instance. The same applies to `onInstanceDelete` functions. If your flow also specifies an `onInstanceDeploy` function, the trigger's function will run first, followed by the flow's function. #### Using existing actions in code-native[​](#using-existing-actions-in-code-native "Direct link to Using existing actions in code-native") After importing an existing component's manifest and adding it to your `componentRegistry` export, you can call one of the component's actions from within your `flow` by importing the action from your component manifest. For example: Using an existing action in a code-native integration ``` import { flow, util } from "@prismatic-io/spectral"; import slackActions from "../manifests/slack/actions"; export const sendSlackMessageFlow = flow({ name: "Send a Slack Message", stableKey: "send-a-slack-message", description: "Send 'Hello World' to a Slack channel", onExecution: async (context, params) => { const { configVars } = context; await slackActions.postMessage.perform({ channelName: util.types.toString(configVars["Select Slack Channel"]), connection: configVars["Slack OAuth Connection"], message: "Hello World", }); return { data: null }; }, }); ``` #### Salesforce CNI example[​](#salesforce-cni-example "Direct link to Salesforce CNI example") In this example, we use the existing Salesforce component's [OAuth 2.0 connection](https://prismatic.io/docs/components/salesforce.md#salesforce-oauth-20), [Flow Outbound Message Webhook](https://prismatic.io/docs/components/salesforce.md#flow-outbound-message-webhook) trigger, and [Get Record](https://prismatic.io/docs/components/salesforce.md#get-record) action to create a flow that listens for Account notifications from Salesforce and fetches the full record for each notification. * configPages.ts * flows.ts ``` import { configPage } from "@prismatic-io/spectral"; import { salesforceOauth2 } from "./manifests/salesforce/connections/oauth2"; if ( !process.env.SALESFORCE_CLIENT_ID || !process.env.SALESFORCE_CLIENT_SECRET ) { throw new Error( "Missing Salesforce OAuth2 client ID or client secret.
Please set the SALESFORCE_CLIENT_ID and SALESFORCE_CLIENT_SECRET environment variables.", ); } export const configPages = { Connections: configPage({ elements: { "Salesforce Connection": salesforceOauth2("salesforce-connection", { clientId: { value: process.env.SALESFORCE_CLIENT_ID, permissionAndVisibilityType: "organization", }, clientSecret: { value: process.env.SALESFORCE_CLIENT_SECRET, permissionAndVisibilityType: "organization", }, }), }, }), }; ``` ``` import { flow } from "@prismatic-io/spectral"; import { salesforceFlowOutboundMessageTrigger } from "./manifests/salesforce/triggers/flowOutboundMessageTrigger"; import salesforceActions from "./manifests/salesforce/actions"; interface SalesforceNotification { Id: string; Name: string; type: string; } export const salesforceAccountNotifications = flow({ name: "Listen for Salesforce Account Notifications", stableKey: "salesforce-account-notifications", description: "This flow uses an existing component trigger to listen for Account notifications from Salesforce.", onTrigger: salesforceFlowOutboundMessageTrigger({ connection: { configVar: "Salesforce Connection" }, prefix: { value: "acme" }, triggerObject: { value: "Account" }, fields: { value: ["Id", "Name"] }, }), onExecution: async (context, params) => { const notifications = params.onTrigger.results.body .data as SalesforceNotification[]; for (const notification of notifications) { const record = await salesforceActions.getRecord.perform({ connection: context.configVars["Salesforce Connection"], recordId: notification.Id, recordType: notification.type, }); context.logger.info("Fetched record", { record }); } return { data: null }; }, }); export default [salesforceAccountNotifications]; ``` --- ##### Code-Native Flows Just like low-code integrations, code-native integrations can include multiple flows. Flows consist of a `name` and `description`, and a `stableKey` that uniquely identifies the flow. They also contain four functions: * `onTrigger` is called when a flow is invoked. If a flow is invoked asynchronously, you can return a custom HTTP response to the caller from the trigger. * `onExecution` is run immediately after the `onTrigger` function and is where the bulk of the flow's work is done. * `onInstanceDeploy` is called when a new instance of the integration is deployed. These functions are optional and can be used to set up resources or perform other tasks when an instance is deployed. * `onInstanceDelete` is called when an instance of the integration is deleted. #### Code-native flow triggers[​](#code-native-flow-triggers "Direct link to Code-native flow triggers") The `onTrigger` function of a flow is called when the flow is invoked. A simple no-op trigger that is called asynchronously can simply return the payload it was called with: ``` flow({ // ... onTrigger: async (context, payload) => { return Promise.resolve({ payload }); }, }); ``` Use the generic webhook trigger If you omit the `onTrigger` function, the Prismatic platform will automatically use the generic Webhook trigger. This will yield the payload to the next step, the `onExecution` function. If you want to return a custom HTTP response to the caller or would like to complete additional work in the trigger, you can additionally return an `HttpResponse` object from the trigger. 
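For instance, a minimal pass-through trigger might acknowledge the caller with a plain-text 200 response, roughly like this (a sketch that simply forwards the payload unchanged):

```
import { HttpResponse, flow } from "@prismatic-io/spectral";

flow({
  // ...
  onTrigger: async (context, payload) => {
    // Acknowledge the caller with a simple plain-text 200 response
    const response: HttpResponse = {
      statusCode: 200,
      contentType: "text/plain",
      body: "Accepted",
    };
    // Pass the payload through to onExecution unchanged
    return Promise.resolve({ payload, response });
  },
});
```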
In this example, suppose the trigger is invoked via an XML webhook payload that looks like this: ``` <notification> <type>new_account</type> <challenge>067DEAB4-B89C-4211-9767-84C96A39CF8C</challenge> <account> <first>Nelson</first> <last>Bighetti</last> <company> <name>Hooli</name> <city>Palo Alto</city> <state>CA</state> </company> </account> </notification> ``` The app calling the trigger requires that you parse the XML payload and return the `challenge` property in the body of an HTTP 200 response. You could write a trigger that parses the XML payload and returns the `challenge` property: ``` import { HttpResponse, flow, util } from "@prismatic-io/spectral"; import { XMLParser } from "fast-xml-parser"; flow({ // ... onTrigger: async (context, payload) => { // Parse the raw XML from the webhook request const parser = new XMLParser(); const parsedBody = parser.parse(util.types.toString(payload.rawBody.data)); // Respond to the request with a plaintext response that includes the challenge key const response: HttpResponse = { statusCode: 200, contentType: "text/plain", body: parsedBody.notification.challenge, }; // Ensure that the payload is updated with the parsed body return Promise.resolve({ payload: { ...payload, body: { data: parsedBody } }, response, }); }, }); ``` ##### Cross-flow invocations[​](#cross-flow-invocations "Direct link to Cross-flow invocations") If you'd like one of your flows to invoke an execution of a sibling flow, you can use the flow's `context.invokeFlow` function. See the example [here](https://prismatic.io/docs/integrations/triggers/cross-flow.md#using-cross-flow-triggers-in-code-native). #### Running code-native flows on a schedule[​](#running-code-native-flows-on-a-schedule "Direct link to Running code-native flows on a schedule") Code-native flows support running on a schedule. The schedule can either be a static schedule that you define in your code, or you can create a config variable of type `"schedule"` and let your customer define the schedule. In the example below, the first flow defines a static schedule of "Run at 10:20 every day, US Central time". For tips on creating cron strings, check out [crontab.guru](https://crontab.guru/). The second flow uses a config variable to define the schedule.
``` import { configPage, flow, integration } from "@prismatic-io/spectral"; const scheduleWithCronExpression = flow({ name: "Schedule Trigger with cron expression", description: "This flow is triggered by a schedule following a cron expression", stableKey: "schedule-trigger-cron-expression", onExecution: async (context) => { const now = new Date(); context.logger.info(`Flow executed at ${now}`); return Promise.resolve({ data: null }); }, schedule: { value: "20 10 * * *", timezone: "America/Chicago" }, // Run at 10:20 AM US Central time }); const scheduleWithConfigVar = flow({ name: "Schedule with config var", description: "This flow is triggered by a schedule following a config var", stableKey: "schedule-trigger-config-var", onExecution: async (context) => { const now = new Date(); context.logger.info(`Flow executed at ${now}`); return Promise.resolve({ data: null }); }, schedule: { configVar: "My Schedule" }, // Run on a user-defined schedule }); export default integration({ name: "schedule-trigger-test", description: "Schedule Trigger Test", iconPath: "icon.png", flows: [scheduleWithCronExpression, scheduleWithConfigVar], configPages: { "Page One": configPage({ elements: { "My Schedule": { dataType: "schedule", stableKey: "my-schedule", }, }, }), }, }); ``` #### Enabling singleton executions for code-native flows[​](#enabling-singleton-executions-for-code-native-flows "Direct link to Enabling singleton executions for code-native flows") To ensure that only one execution of your schedule-based code-native flow runs at a time, you can enable [singleton executions](https://prismatic.io/docs/integrations/triggers/schedule.md#ensuring-singleton-executions-for-scheduled-flows). Add a `queueConfig.singletonExecutions` property to your flow that runs on a schedule: Run every 5 minutes, unless the previous execution is still running ``` export const fetchDataFlow = flow({ name: "Fetch data every 5 minutes", stableKey: "fetch-data", description: "Fetch and import data every 5 minutes", schedule: { value: "*/5 * * * *" }, queueConfig: { singletonExecutions: true }, onExecution: async (context, params) => { return Promise.resolve({ data: context.configVars }); }, }); ``` #### Code-native flow onInstanceDeploy and onInstanceDelete[​](#code-native-flow-oninstancedeploy-and-oninstancedelete "Direct link to Code-native flow onInstanceDeploy and onInstanceDelete") Code-native flows support `onInstanceDeploy` and `onInstanceDelete` callback functions. These functions run when an instance of the integration is deployed and deleted, respectively. They are useful for setting up or cleaning up resources when an instance is deployed or deleted, and are often used to create or remove webhooks in third-party apps. The functions work the same as custom trigger functions, which are documented in the [Writing Custom Components](https://prismatic.io/docs/custom-connectors/triggers.md#instance-deploy-and-delete-events-for-triggers) article. #### Code-native flow onExecution[​](#code-native-flow-onexecution "Direct link to Code-native flow onExecution") The `onExecution` function runs immediately after the `onTrigger` function and is where the bulk of the flow's work is done.
The `onExecution` function takes two parameters: * `context` - in addition to the [attributes](https://prismatic.io/docs/custom-connectors/actions.md#the-context-parameter) that a normal custom component receives (like a logger, persisted data, metadata about the integration, customer, and instance), a CNI flow's `context` object also contains a `configVars` object that has the values of all config variables that your integration includes. * `params` - the `params` object contains the payload that was returned from the `onTrigger` function. This example `onExecution` function performs the same logic that the low-code [Build Your First Integration](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) integration did, but in TypeScript: ``` import { flow } from "@prismatic-io/spectral"; import axios from "axios"; import { createSlackClient } from "../slackClient"; interface TodoItem { id: number; completed: boolean; task: string; } export const todoAlertsFlow = flow({ // ... onExecution: async (context) => { // Config variables are accessed using the context object const { logger, configVars } = context; // Make an HTTP request to the Acme API using the config variable const { data: todoItems } = await axios.get( configVars["Acme API Endpoint"], ); // Create an HTTP Slack client using the Slack OAuth connection const slackClient = createSlackClient(configVars["Slack OAuth Connection"]); // Loop over the todo items for (const item of todoItems) { if (item.completed) { logger.info(`Skipping completed item ${item.id}`); } else { // Send a message to the Slack channel for each incomplete item logger.info(`Sending message for item ${item.id}`); try { await slackClient.post("chat.postMessage", { channel: configVars["Select Slack Channel"], text: `Incomplete task: ${item.task}`, }); } catch (e) { throw new Error(`Failed to send message for item ${item.id}: ${e}`); } } } // Asynchronously-invoked flows should simply return null return { data: null }; }, }); ``` ##### Referencing the trigger payload in the onExecution function[​](#referencing-the-trigger-payload-in-the-onexecution-function "Direct link to Referencing the trigger payload in the onExecution function") The trigger will generally return the payload it received, but you can also return a modified payload from the trigger. The `onExecution` function will receive the payload that was returned from the trigger. The trigger may receive a payload of any format, so annotating a TypeScript `interface` is helpful for type hinting and code completion: Reference the trigger payload in the onExecution function ``` import { createSlackClient } from "../slackClient"; interface AccountNotification { notification: { type: string; challenge: string; account: { first: string; last: string; company: { name: string; city: string; state: string; }; }; }; } const sendMessagesFlow = flow({ // ... 
onExecution: async (context, params) => { const { configVars } = context; const slackClient = createSlackClient(configVars["Slack OAuth Connection"]); // The parsed XML payload is available in the params object const data = params.onTrigger.results.body.data as AccountNotification; // Construct a message to send to Slack const message = `New account received:\n` + `Name: ${data.notification.account.first} ${data.notification.account.last}\n` + `Company: ${data.notification.account.company.name}\n` + `Location: ${data.notification.account.company.city}, ${data.notification.account.company.state}\n`; await slackClient.post("chat.postMessage", { channel: configVars["Select Slack Channel"], text: message, }); return { data: null }; }, }); ``` #### Flow stable keys[​](#flow-stable-keys "Direct link to Flow stable keys") Flows have a user-supplied `stableKey` property. These keys are used to uniquely identify the flow in the Prismatic API, and help guard against inadvertent changes to the name of a flow. Without a stable key, if a flow name is changed the Prismatic API will treat it as a new flow, and deployed flows will receive new webhook URLs. With a stable key, the Prismatic API will be able to map the renamed flow and retain its webhook URL. Stable keys can be any user-supplied string. You can choose a random UUID, or a string that describes the flow or config variable. #### Persisting data between executions[​](#persisting-data-between-executions "Direct link to Persisting data between executions") Code-native flows can persist data between executions using the `context.instanceState`, `context.crossFlowState`, and `context.integrationState` objects. * `context.instanceState` (named for historical reasons) is scoped to the current flow of the current instance. Only the current flow can read and write to this state. * `context.crossFlowState` is scoped to the current instance, but can be read and written by any flow in the current instance. * `context.integrationState` is scoped to the entire integration, and can be read and written by any instance of the integration deployed to any customer. These state objects behave like simple key-value stores. To set a value, assign a value to a key on the state object. ``` context.instanceState["lastRun"] = new Date().toISOString(); ``` To read instance state, simply access the key on the state object. ``` const lastRun = context.instanceState["lastRun"]; if (lastRun) { context.logger.info(`The last run was at ${lastRun}`); } else { context.logger.info("This is the first run"); } ``` **What about `executionState`?** The `context.executionState` object is handy in the low-code designer as an accumulator or temporary variable holder, but in code-native flows you can simply use Node.js variables in your `onExecution` function. #### Building AI-compatible code-native flows[​](#building-ai-compatible-code-native-flows "Direct link to Building AI-compatible code-native flows") Flows can be invoked by AI agents using Prismatic's [MCP flow server](https://prismatic.io/docs/ai/model-context-protocol.md). To make your flows AI-compatible, add a `schemas.invoke` property to your flow definition. See [Flow Invocation Schema](https://prismatic.io/docs/ai/flow-invocation-schema.md#flow-schema-in-code-native-integrations) for an example. 
--- ##### Convert Low-Code Integrations to Code-Native You might build a proof-of-concept integration in the [low-code builder](https://prismatic.io/docs/integrations/low-code-integration-designer.md) but later want to switch to code-native. The low-code-to-code-native converter tool allows you to turn your low-code integration into a [code-native](https://prismatic.io/docs/integrations/code-native.md) integration, so you can extend your integration using native TypeScript in your favorite IDE. This is a one-way conversion All concepts (running steps, branching, looping, etc.) in low-code have a code-native equivalent. A "loop over items" will become a `for ... of` loop in code-native, for example. The same is not true in reverse: some code-native concepts (like using `Promise.all` to run steps in parallel) do not have a low-code equivalent. Converting a low-code integration to code-native is a one-way operation. You cannot convert a code-native integration back to low-code. #### Converting a low-code YAML definition to code-native[​](#converting-a-low-code-yaml-definition-to-code-native "Direct link to Converting a low-code YAML definition to code-native") To convert a low-code integration to code-native, first [export the low-code integration](https://prismatic.io/docs/integrations/low-code-integration-designer.md#yaml-definition) as a YAML file. Next, ensure that you have the latest version of the [prism CLI tool](https://prismatic.io/docs/cli.md) installed. Then run the converter command: Convert a low-code YAML definition to code-native ``` # Create a code-native integration in a new folder called "my-cni-integration" prism integrations:convert --yamlFile /path/to/my-integration.yaml --folder ./my-cni-integration ``` After generating the code-native project, it's a good idea to install dependencies and automatically format your code: ``` cd ./my-cni-integration npm install npm run format npm update --save ``` ##### Handling custom components[​](#handling-custom-components "Direct link to Handling custom components") If your low-code integration uses custom components, you can either: 1. Abstract the logic in the custom component into a package that both your custom component and code-native integration use 2. Invoke the custom component actions, connections, triggers, and data sources from your code-native integration If you would like to do the latter, you likely publish your custom component manifests under a custom package registry prefix. You can specify your custom component package prefix with the `--registryPrefix` flag: Include custom components ``` prism integrations:convert --yamlFile /path/to/my-integration.yaml --registryPrefix "@acme-connectors" ``` #### Post-generation instructions[​](#post-generation-instructions "Direct link to Post-generation instructions") Depending on your integration, a few manual steps may be required to get your integration to compile and run properly. ##### Remove unused step result assignments[​](#remove-unused-step-result-assignments "Direct link to Remove unused step result assignments") If a low-code step's result is not referenced by a subsequent step, you may see `myStep is declared but its value is never read`.
Simply remove the variable assignment: ``` // go from: const myAction = await components.myComponent.myAction({}); // to: await components.myComponent.myAction({}); ``` ##### Provide TypeScript types for each step[​](#provide-typescript-types-for-each-step "Direct link to Provide TypeScript types for each step") By default, a step returns an object of `unknown` type. Use [generics](https://www.typescriptlang.org/docs/handbook/2/generics.html) to provide your step with a return type. In this example, an HTTP - Get step returns an array of todo items. We create a TypeScript interface and provide that interface as a generic for the step invocation: Add TypeScript types to steps ``` interface TodoItem { id: number; completed: boolean; task: string; } export const flow1 = flow({ //... onExecution: async (context, params) => { const getToDoTasks = await context.components.http.httpGet<{ data: TodoItem[]; }>({ url: "https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo", responseType: "json", }); for (const item of getToDoTasks.data) { // TypeScript now knows item is of type TodoItem } }, }); ``` With type generics, TypeScript now knows the shape of our `getToDoTasks` variable. ##### Simplify conditionals in branches[​](#simplify-conditionals-in-branches "Direct link to Simplify conditionals in branches") To ensure that generated code-native code behaves identically to low-code branching conditionals, we import `isEqual` and other functions from Spectral. These can generally be safely converted to JavaScript equivalents. ``` // Generated code import { isEqual } from "@prismatic-io/spectral/dist/conditionalLogic"; if (isEqual(val1, val2)) { doSomething(); } // Likely equivalent code with no import if (val1 === val2) { doSomething(); } ``` ##### Code component usage[​](#code-component-usage "Direct link to Code component usage") If your low-code integration used [code steps](https://prismatic.io/docs/components/code.md), invoking a code step in code-native is redundant. You can refactor your code so that you do not invoke the code step and instead simply run the code within your flow's code. This may require updating subsequent steps' references to your code step. ##### Conditional branch names[​](#conditional-branch-names "Direct link to Conditional branch names") The [branch](https://prismatic.io/docs/components/branch.md) component's result is the name of the branch that was traversed. Some low-code integrations use that value to determine what to do once the branch has completed. The code-native equivalent looks like this: ``` let myBranchStep = "Else"; if (something()) { doSomethingElse(); myBranchStep = "Unexpected Error"; } else { doYetAnotherThing(); myBranchStep = "Else"; } ``` You can remove `myBranchStep` if your low-code integration did not make use of the branch step's result. #### Importing your converted code-native integration[​](#importing-your-converted-code-native-integration "Direct link to Importing your converted code-native integration") You can import your converted code-native integration the same way you would [import any code-native integration](https://prismatic.io/docs/integrations/code-native/get-started/setup.md#building-and-importing-your-code-native-integration). **Note:** by default, a safeguard exists to prevent accidentally overwriting a low-code integration with a code-native integration. When importing your integration, a *new* integration will be created.
If you would like to overwrite your existing low-code integration with your new code-native one, run the import with a `--replace` flag: Overwrite a low-code integration with a code-native one ``` prism integrations:import --integrationId SW5example --replace ``` Use caution when replacing a low-code integration with code-native We strongly recommend that you make a backup of your low-code integration's YAML file prior to replacing it with a code-native integration. If you'd like to revert a code-native integration back to low code, you can issue a similar `prism` command using your low-code integration's definition file: ``` prism integrations:import --integrationId SW5example --replace --path /path/to/low-code/file.yaml ``` If you don't have your integration's YAML definition, you can view the YAML definition of a previously published version of your integration from the **Management** > **Version history** drawer. --- ##### Get Started with Code-Native This tutorial will guide you through fundamental code-native development concepts. You will: * Create a new code-native integration project * Build a simple integration that retrieves JSON data from a REST API and sends messages to Slack * Create a configuration wizard for your integration * Add a second flow that handles incoming webhook requests * Build, import and test your integration in Prismatic Please follow along with the videos and code snippets below. #### Integration overview[​](#integration-overview "Direct link to Integration overview") The integration you'll build will retrieve data (a list of todo tasks) from an API, loop over the list, identify tasks marked "incomplete", and notify you via Slack of any incomplete tasks. We'll use a placeholder API for the todo list data - `https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo`. You'll design this integration to be **configurable**, meaning the same integration can be deployed to multiple customers with different API endpoints, Slack credentials, and Slack channel names. #### Start coding[​](#start-coding "Direct link to Start coding") ##### Code-native prerequisites[​](#code-native-prerequisites "Direct link to Code-native prerequisites") * Install a recent version of [Node.js](https://nodejs.org/en) * Install [Prismatic's CLI tool](https://prismatic.io/docs/cli.md) * (Optional) install the [Prism MCP server](https://prismatic.io/docs/dev-tools/prism-mcp.md) to enable AI coding assistant support * (Optional) install the [Prismatic VS Code extension](https://prismatic.io/docs/dev-tools/vscode-extension.md) to enable IDE integration for code-native development ##### Initialize a new code-native project[​](#initialize-a-new-code-native-project "Direct link to Initialize a new code-native project") To create a new code-native integration project, run the following command in your terminal: ``` prism integrations:init todo-slack-integration ``` You will be prompted to give your integration a description and select a connection type (OAuth 2.0 or basic auth). Then, a boilerplate TypeScript project will be created in the `todo-slack-integration` directory. After creating the project, navigate to the new directory and install dependencies: ``` cd todo-slack-integration npm install ``` ##### Build and publish the integration[​](#build-and-publish-the-integration "Direct link to Build and publish the integration") The boilerplate project includes a sample integration that you can build and deploy immediately. 
To build the integration, run: ``` npm run build ``` This will create a production build of your integration in the `dist` directory. Next, publish the integration to your Prismatic account: ``` prism integrations:import --open ``` examine the boilerplate code Take a moment to explore the code in the `src` directory to understand how the sample integration is structured. Key files to review: * `src/index.ts`: Defines the integration's metadata and references its flows and configuration pages. * `src/flows.ts`: Contains the integration's flows, including a sample flow that retrieves data from a placeholder API. * `src/configPages.ts`: Defines configuration pages that end users will interact with when deploying the integration. Notice how the configuration variables in the code correspond to the fields you'll see in the Prismatic UI when configuring a test integration. Note: The boilerplate code uses a mock API endpoint. When you test the integration, you can enter any value for the "API Key" connection config variable input. ##### Clean up the boilerplate code[​](#clean-up-the-boilerplate-code "Direct link to Clean up the boilerplate code") To focus on the core concepts, we'll start fresh by replacing the boilerplate code with our own implementation. * Delete `src/client.ts` and `src/flows.test.ts` * Replace the contents of `src/configPages.ts` and `src/flows.ts` with the following: - flows.ts - configPages.ts ``` import { flow } from "@prismatic-io/spectral"; export const processTodoItems = flow({ name: "Process Todo Items", stableKey: "process-todo-items", description: "Fetch items from an API and post incomplete items to Slack", onExecution: async (context) => { return Promise.resolve({ data: null }); }, }); export default [processTodoItems]; ``` ``` import { configPage } from "@prismatic-io/spectral"; export const configPages = { Connections: configPage({ elements: {}, }), }; ``` We now have a project with a blank config wizard and a single flow that does nothing. #### Fetch data from a REST API[​](#fetch-data-from-a-rest-api "Direct link to Fetch data from a REST API") Now that we have a blank slate, let's start building our integration by fetching data from a REST API. We'll use [axios](https://www.npmjs.com/package/axios) to fetch a list of todo items from an API endpoint specified by the user (though you can use another HTTP client of your choice). For each item, we log whether the task is completed or pending. Our configuration page now includes a configuration variable for the user to specify the API endpoint. This allows customers to use different API endpoints when they deploy our integration. 
* flows.ts * configPages.ts ``` import axios from "axios"; import { flow } from "@prismatic-io/spectral"; interface TodoItem { id: number; completed: boolean; task: string; } export const processTodoItems = flow({ name: "Process Todo Items", stableKey: "process-todo-items", description: "Fetch items from an API and post incomplete items to Slack", onExecution: async (context, params) => { const response = await axios.get( context.configVars["Todo API Endpoint"], ); for (const item of response.data) { if (item.completed) { context.logger.info(`Completed: ${item.task}`); } else { context.logger.warn(`Pending: ${item.task}`); } } return { data: null }; }, }); export default [processTodoItems]; ``` ``` import { configPage, configVar } from "@prismatic-io/spectral"; export const configPages = { Connections: configPage({ elements: { "Todo API Endpoint": configVar({ stableKey: "todo-api-endpoint", dataType: "string", description: "The endpoint for the Todo API", }), }, }), }; ``` #### Add Slack to our project[​](#add-slack-to-our-project "Direct link to Add Slack to our project") Now that we're successfully retrieving data from the API, the next step is to send that data to Slack. We'll do that by first creating a Slack connection, then adding the Slack SDK to our project, and finally posting messages to Slack. ##### Create Slack connection[​](#create-slack-connection "Direct link to Create Slack connection") Slack uses OAuth 2.0 (authorization code flow) for authentication, so we'll create an OAuth 2.0 `connectionConfigVar` that has `oauth2Type: OAuth2Type.AuthorizationCode`. We can reference [Slack's documentation](https://docs.slack.dev/authentication/installing-with-oauth/) for the authorization and token URLs, as well as the required scopes for posting messages to Slack and listing Slack channels. We'll hardcode the authorization URL, token URL, and scopes in our config page, since they're well-known and all customers will use the same values. However, we'll use environment variables for the client ID and client secret (which are sensitive). You can reference environment variables in your code-native project using `process.env.VARIABLE_NAME`, and save them to a `.env` file in the root of your project. * configPages.ts * .env ``` import { configPage, configVar, connectionConfigVar, OAuth2Type, } from "@prismatic-io/spectral"; export const configPages = { Connections: configPage({ elements: { "Slack Connection": connectionConfigVar({ stableKey: "slack-connection", dataType: "connection", oauth2Type: OAuth2Type.AuthorizationCode, inputs: { authorizeUrl: { label: "Authorize URL", default: "https://slack.com/oauth/v2/authorize", type: "string", shown: false, }, tokenUrl: { label: "Token URL", default: "https://slack.com/api/oauth.v2.access", type: "string", shown: false, }, clientId: { label: "Client ID", type: "string", shown: false, default: process.env.SLACK_CLIENT_ID, }, clientSecret: { label: "Client Secret", type: "string", shown: false, default: process.env.SLACK_CLIENT_SECRET, }, scopes: { label: "Scopes", type: "string", shown: false, default: "chat:write chat:write.public channels:read", }, }, }), "Todo API Endpoint": configVar({ stableKey: "todo-api-endpoint", dataType: "string", description: "The endpoint for the Todo API", }), }, }), }; ``` ``` SLACK_CLIENT_ID=your-client-id SLACK_CLIENT_SECRET=your-client-secret ``` After building and publishing your integration, you'll see a new Slack connection option in the Prismatic UI when configuring a test integration. 
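If you want the import to fail fast when these environment variables are missing, you can add a guard near the top of `configPages.ts`, mirroring the pattern used in the Salesforce CNI example earlier in these docs. A minimal sketch (the variable names match the `.env` file above):

```
// Guard against a missing or incomplete .env file at build/import time.
// SLACK_CLIENT_ID and SLACK_CLIENT_SECRET are the variables defined in .env above.
if (!process.env.SLACK_CLIENT_ID || !process.env.SLACK_CLIENT_SECRET) {
  throw new Error(
    "Missing Slack client ID or client secret. " +
      "Please set the SLACK_CLIENT_ID and SLACK_CLIENT_SECRET environment variables.",
  );
}
```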
##### Add the Slack SDK to our project[​](#add-the-slack-sdk-to-our-project "Direct link to Add the Slack SDK to our project") Next, we'll add the Slack SDK to our project so we can post messages to Slack. For this tutorial, we'll use [@slack/web-api](https://www.npmjs.com/package/@slack/web-api). You could also use another Slack SDK or call the Slack REST API directly with your preferred Node.js HTTP client. ``` npm install @slack/web-api ``` ##### Post messages to Slack[​](#post-messages-to-slack "Direct link to Post messages to Slack") Now that we have a Slack connection and the Slack SDK installed, we can post messages to Slack. We'll update our flow to post a message to a Slack channel for each incomplete task we find. We can initialize the Slack client using the OAuth 2.0 access token from our Slack connection, which we can access via `context.configVars["Slack Connection"].token?.access_token`. For simplicity, we'll hardcode the Slack channel name in our code, but we'll make that configurable in the next step. flows.ts ``` import axios from "axios"; import { WebClient } from "@slack/web-api"; import { flow, util } from "@prismatic-io/spectral"; interface TodoItem { id: number; completed: boolean; task: string; } export const checkTodoItems = flow({ name: "Check Todo Items", stableKey: "check-todo-items", description: "Fetch todo items from an API and send to Slack", onExecution: async (context, params) => { const slackClient = new WebClient( util.types.toString( context.configVars["Slack Connection"].token?.access_token, ), ); const response = await axios.get( context.configVars["Todo API Endpoint"], ); for (const item of response.data) { if (item.completed) { context.logger.info(`Completed: ${item.task}`); } else { await slackClient.chat.postMessage({ text: `Pending Task: ${item.task}`, channel: "#cni-todo-demo", // replace with your channel name }); } } return { data: context.configVars }; }, }); export default [checkTodoItems]; ``` After building and testing your integration, you should see messages in your specified Slack channel for any incomplete tasks retrieved from the API. ![](/docs/img/integrations/code-native/get-started/first-integration/slack-messages.png) ##### Create Slack channel picker data source[​](#create-slack-channel-picker-data-source "Direct link to Create Slack channel picker data source") In the previous step we hardcoded the Slack channel name in our code. Let's make that configurable by adding a `dataSourceConfigVar` to our config page that retrieves a list of Slack channels using the Slack SDK. Since our connection is on the first `configPage` and our data source requires the connection, we'll add a second `configPage` that contains the data source config variable. We'll also extract the Slack client creation logic into a separate `slackClient.ts` file for better code organization, and update our flow to use the new Slack client function and config variable. 
* configPages.ts * flows.ts * slackClient.ts ``` import { configPage, configVar, Connection, connectionConfigVar, dataSourceConfigVar, Element, OAuth2Type, } from "@prismatic-io/spectral"; import { createSlackClient } from "./slackClient"; export const configPages = { Connections: configPage({ elements: { // same as before }, }), "Configure Slack": configPage({ tagline: "Select Slack Channel", elements: { "Select Slack Channel": dataSourceConfigVar({ stableKey: "my-select-slack-channel", dataSourceType: "picklist", perform: async (context) => { const slackClient = createSlackClient( context.configVars["Slack Connection"] as Connection, ); const response = await slackClient.conversations.list({ types: "public_channel", }); const result: Element[] = (response.channels ?? []) .map((channel) => ({ label: channel.name || "", key: channel.id || "", })) .sort((a, b) => (a.label < b.label ? -1 : 1)); return { result }; }, }), }, }), }; ``` ``` +import { flow } from "@prismatic-io/spectral"; import axios from "axios"; -import { WebClient } from "@slack/web-api"; -import { flow, util } from "@prismatic-io/spectral"; +import { createSlackClient } from "./slackClient"; @@ -13,10 +13,8 @@ export const checkTodoItems = flow({ stableKey: "check-todo-items", description: "Fetch todo items from an API and send to Slack", onExecution: async (context, params) => { - const slackClient = new WebClient( - util.types.toString( - context.configVars["Slack Connection"].token?.access_token - ) + const slackClient = createSlackClient( + context.configVars["Slack Connection"] ); const response = await axios.get( context.configVars["Todo API Endpoint"] @@ -27,7 +25,7 @@ export const checkTodoItems = flow({ } else { await slackClient.chat.postMessage({ text: `Pending Task: ${item.task}`, - channel: "#cni-todo-demo", + channel: context.configVars["Select Slack Channel"], }); } } ``` ``` import { Connection, util } from "@prismatic-io/spectral"; import { WebClient } from "@slack/web-api"; export function createSlackClient(connection: Connection) { return new WebClient(util.types.toString(connection.token?.access_token)); } ``` With the new config page and data source in place, your users will be able to select a Slack channel when configuring your integration. ![](/docs/img/integrations/code-native/get-started/first-integration/data-source.png) #### Add a second flow that handles webhook requests[​](#add-a-second-flow-that-handles-webhook-requests "Direct link to Add a second flow that handles webhook requests") Integrations can have multiple flows, each handling different types of events or data processing. Let's add a second flow that handles incoming webhook requests. This flow will be triggered when a new account is created in a CRM system.
The CRM system sends an XML payload to our integration's webhook URL, like this: Example webhook body ``` <notification> <type>new_account</type> <account> <first>Nelson</first> <last>Bighetti</last> <company> <name>Hooli</name> <city>Palo Alto</city> <state>CA</state> </company> </account> </notification> ``` We'll install an XML parser using `npm install fast-xml-parser` and then update our `flows.ts` file to add a new flow that: * Parses the incoming XML payload in the trigger * Passes the parsed payload to the execution step via the trigger's `body.data` field * Sends a message to Slack with details from the new account flows.ts ``` import { flow, util } from "@prismatic-io/spectral"; import axios from "axios"; import { XMLParser } from "fast-xml-parser"; import { createSlackClient } from "./slackClient"; export const checkTodoItems = flow({ /* as before */ }); interface AccountNotificationPayload { notification: Notification; } interface Notification { type: string; account: Account; } interface Account { first: string; last: string; company: Company; } interface Company { name: string; city: string; state: string; } export const newAccountFlow = flow({ name: "New Account Flow", stableKey: "new-account-flow", description: "Flow to run when a new account is created", onTrigger: async (context, payload) => { const parser = new XMLParser(); const xmlData = parser.parse(util.types.toString(payload.rawBody.data)); return Promise.resolve({ payload: { ...payload, body: { data: xmlData } }, response: { statusCode: 200, contentType: "text/plain", body: "OK" }, }); }, onExecution: async (context, params) => { const data = params.onTrigger.results.body .data as AccountNotificationPayload; // Parsed XML data const slackClient = createSlackClient( context.configVars["Slack Connection"], ); const message = `New Account Created:\n` + `Name: ${data.notification.account.first} ${data.notification.account.last}\n` + `Company: ${data.notification.account.company.name}\n` + `Location: ${data.notification.account.company.city} ${data.notification.account.company.state}`; await slackClient.chat.postMessage({ text: message, channel: context.configVars["Select Slack Channel"], }); return { data: null }; }, }); export default [checkTodoItems, newAccountFlow]; ``` You can simulate the request from the CRM by taking note of your flow's webhook URL and running the following curl command: Simulate webhook request from CRM ``` curl -X POST "https://hooks.prismatic.io/trigger/SW5example==" \ -H "Content-Type: application/xml" \ -d '<notification><type>new_account</type><account><first>Nelson</first><last>Bighetti</last><company><name>Hooli</name><city>Palo Alto</city><state>CA</state></company></account></notification>' ``` #### Prepare for marketplace[​](#prepare-for-marketplace "Direct link to Prepare for marketplace") Now that our integration is complete, we can prepare it for production use in our integration marketplace. We'll perform a few final cleanup tasks: * We'll add instructional text to our config pages to guide users through the configuration process. You can add any HTML you like here, including links and images. Add helper text to config pages ``` export const configPages = { Connections: configPage({ elements: { Instructions: "
<h2>Slack Connection</h2><p>Click the button below to authorize Acme to use your Slack account.</p>", "Slack Connection": connectionConfigVar({ /* Same as before */ }), "Todo Instructions": "<h2>Acme Todo API</h2><p>Enter the endpoint for the Acme Todo API.</p>", "Todo API Endpoint": configVar({ /* Same as before */ }), }, }), "Configure Slack": configPage({ tagline: "Select Slack Channel", elements: { "Channel Instructions": "<h2>Slack Channel</h2><p>Select the Slack channel to post todo items to.</p>
", "Select Slack Channel": dataSourceConfigVar({ /* Same as before */ }), }, }), }; ``` * Generally, you'll want to name your integration after the app you're integrating with. Since customers will know they're configuring a Slack integration, it makes sense to rename your integration in `index.ts` to simply `name: "Slack"`. You should also give your integration a better description, such as `description: "Send notifications to Slack"`. * Finally, you can update the icon representing your integration. Replace `assets/icon.png` with the below Slack logo. ![](/docs/img/integrations/code-native/get-started/first-integration/icon.png) Additionally, you can update the Slack connection to have a "Sign in with Slack" button by adding this to your `connectionConfigVar()` after saving the below icon file to your `assets` directory: ``` icons: { oauth2ConnectionIconPath: "sign_in_with_slack.png", }, ``` ![](/docs/img/integrations/code-native/get-started/first-integration/sign_in_with_slack.png) With the icons in place, your users will see a more polished experience when configuring your integration. ![](/docs/img/integrations/code-native/get-started/first-integration/with-slack-icons.png) --- ##### Set Up a New Project #### Setting up your development environment[​](#setting-up-your-development-environment "Direct link to Setting up your development environment") To build a code-native integration, you'll need to set up your development environment. The requirements are the same as the requirements for building custom components. Check out [this article](https://prismatic.io/docs/custom-connectors/get-started/setup.md) for a detailed guide on setting up your development environment. In addition to those requirements, you may also want to install the [Prism MCP server](https://prismatic.io/docs/dev-tools/prism-mcp.md) and the [Prismatic VS Code extension](https://prismatic.io/docs/dev-tools/vscode-extension.md) to enhance your development workflow. #### Initializing a new code-native integration[​](#initializing-a-new-code-native-integration "Direct link to Initializing a new code-native integration") To initialize a new code-native integration, you can use the Prismatic CLI. Run the following command to create a new code-native integration: ``` prism integrations:init my-new-integration ``` You will be prompted to give your integration a description and select a type of connection (OAuth 2.0 or basic auth). Then, a boilerplate TypeScript project will be created in the `my-new-integration` directory. 
Your project's directory will look like this: ``` β”œβ”€β”€ .npmrc # Instructs NPM to look for component manifests in Prismatic's NPM repository β”œβ”€β”€ .spectral β”‚ └── index.ts # A helper file that will enable type hinting for component references β”‚ └── prism.json # A metadata file that tracks this code-native integration's integration ID β”œβ”€β”€ assets β”‚ └── icon.png # The icon for your integration β”œβ”€β”€ jest.config.js # Configuration for the Jest unit testing suite β”œβ”€β”€ package.json # Includes a dependency on @prismatic-io/spectral β”œβ”€β”€ src β”‚ β”œβ”€β”€ client.ts # Code for connecting to a third-party API β”‚ β”œβ”€β”€ componentRegistry.ts # Where you list components that you reference β”‚ β”œβ”€β”€ configPages.ts # The config wizard experience β”‚ β”œβ”€β”€ flows.ts # Your integration's flows β”‚ β”œβ”€β”€ index.test.ts # Unit testing code β”‚ └── index.ts # Metadata about the integration β”œβ”€β”€ tsconfig.json └── webpack.config.js ``` #### Configuring code-native integration metadata[​](#configuring-code-native-integration-metadata "Direct link to Configuring code-native integration metadata") Your integration's name, description, and other metadata are defined in the `src/index.ts` file. The `name`, `description`, and `category` properties are customer-facing and will be visible when customers deploy your integration from the embedded marketplace. ``` export default integration({ name: "Example Slack Integration with CNI", description: "My code-native Slack integration!", category: "Communication", labels: ["chat", "beta", "paid"], iconPath: "icon.png", flows, configPages, componentRegistry, }); ``` ![The integration's name, description, and category are visible when customers deploy your integration from the embedded marketplace](/docs/img/integrations/code-native/get-started/setup/integration-metadata-marketplace.png) ##### Semantic versioning in code-native integrations[​](#semantic-versioning-in-code-native-integrations "Direct link to Semantic versioning in code-native integrations") Each time you publish your integration, the version number will be incremented. Versions are not synced between stacks. That means that if you've published your Salesforce integration ten times in the US stack and twice on the EU stack, instances on v10 in the US stack will be running the same code as instances on v2 in the EU stack. You can specify your own semantic versioning in the `integration()` definition in `src/index.ts`: ``` export default integration({ // ... version: "1.2.3", }); ``` This version is accessible in the Prismatic API as the integration's `externalVersion` property. For example, this query will return these results: Query for Integration External Versions ``` query getIntegrationExternalVersions ($myIntegrationId: ID!) 
{ integration( id: $myIntegrationId ) { versionSequence { nodes { versionNumber externalVersion } } } } ``` Query Variables ``` { "myIntegrationId": "SW50ZWdyYXRpb246ZDRjZjlmMWYtYWI5Mi00OTJiLWI1YzAtNThjNDkwOTUzM2Mw" } ``` [Try It Out ❯](https://prismatic.io/docs/explorer?query=query+getIntegrationExternalVersions+%28%24myIntegrationId%3A+ID%21%29+%7B%0A++++integration%28%0A++++++id%3A+%24myIntegrationId%0A++++%29+%7B%0A++++++versionSequence+%7B%0A++++++++nodes+%7B%0A++++++++++versionNumber%0A++++++++++externalVersion%0A++++++++%7D%0A++++++%7D%0A++++%7D%0A++%7D\&query_variables=%7B%0A++%22myIntegrationId%22%3A+%22SW50ZWdyYXRpb246ZDRjZjlmMWYtYWI5Mi00OTJiLWI1YzAtNThjNDkwOTUzM2Mw%22%0A%7D) Response ``` { "data": { "integration": { "versionSequence": { "nodes": [ { "versionNumber": 2, "externalVersion": "1.2.3" }, { "versionNumber": 1, "externalVersion": "1.2.1" } ] } } } } ``` Supplying an external version in this way is optional but a great way to determine which version of your integration is running in a given stack. #### Building and importing your code-native integration[​](#building-and-importing-your-code-native-integration "Direct link to Building and importing your code-native integration") To build your project, run: ``` npm run build ``` Webpack will compile your TypeScript code into a single JavaScript file, and the `dist` directory will be created. Once your integration is built, you can import your code-native integration using the `prism` CLI tool: ``` prism integrations:import --open ``` The `--open` flag is optional and will open the integration in the Prismatic designer, where you can configure a test instance and test data sources and flows. Renaming integrations By default, the `integrations:import` command will use the name of the integration defined in code to determine which existing integration to replace (or, if no integration with that name exists, it will create a new integration). If you update your integration's name in code, you will need to use an `--integrationId` flag to specify the ID of the integration you want to update. #### Publishing code-native integrations in a CI/CD pipeline[​](#publishing-code-native-integrations-in-a-cicd-pipeline "Direct link to Publishing code-native integrations in a CI/CD pipeline") If you have multiple tenants, or if you want to automate the publishing of your code-native integrations, you can incorporate integration publishing into your CI/CD pipeline. At a high level, the steps to publish a code-native integration in a CI/CD pipeline are: 1. Install the [`prism` CLI tool](https://prismatic.io/docs/cli.md) 2. [Authenticate](https://prismatic.io/docs/api/ci-cd-system.md) the `prism` CLI tool with your Prismatic tenant 3. Build your code-native integration 4. Use the `prism integrations:import` command to publish your integration If you use GitHub, you can use Prismatic's [GitHub Actions](https://prismatic.io/docs/api/github-actions.md) to publish your code-native integration as part of your GitHub Actions workflow. The [integration-publisher](https://github.com/marketplace/actions/prismatic-integration-publisher) GitHub Action supports both YAML-defined integrations and code-native integrations. ##### Including git commit information[​](#including-git-commit-information "Direct link to Including git commit information") When publishing code-native integrations in a CI/CD pipeline, you may want to include git commit information with your integration publish. 
This information can help you track which version of your integration code is associated with each published integration version. In this example, we first import our code-native integration with `integrations:import`. Then we run `integrations:publish`, deriving the commit hash, commit URL, and repository URL from the git repository and including that information when publishing the integration: Example: Publishing a code-native integration with git commit info ``` export INTEGRATION_ID=$(prism integrations:import) prism integrations:publish ${INTEGRATION_ID} \ --comment "Refactored config wizard" \ --commitHash $(git rev-parse HEAD) \ --commitUrl $(git config --get remote.origin.url)/commit/$(git rev-parse HEAD) \ --repoUrl $(git config --get remote.origin.url) ``` This information is available in the web app when viewing the integration's version history. ![The integration version history shows commit information associated with each published version](/docs/img/integrations/code-native/get-started/setup/version-history-commit-info.png) **Note**: If you use Prismatic's [GitHub Actions](https://prismatic.io/docs/api/github-actions.md), the action automatically includes git commit information when publishing your integration. --- ##### Testing Code-Native Integrations #### Testing a code-native integration[​](#testing-a-code-native-integration "Direct link to Testing a code-native integration") There are two types of testing that you can do with a code-native integration: you can run unit tests of your code locally in your IDE, and you can import the integration and test it in the Prismatic runner. * Unit tests in your IDE are great for testing the logic of your integration and testing modular portions of your code * Testing in the Prismatic runner is great for trying out the configuration wizard experience you've built and testing the integration's behavior in a real-world environment. You will probably want to incorporate both types of testing into your development process. ##### Testing a code-native integration in Prismatic[​](#testing-a-code-native-integration-in-prismatic "Direct link to Testing a code-native integration in Prismatic") After building with `npm run build` and importing your code-native integration with `prism integrations:import --open`, you can test your integration in the Prismatic runner similarly to how you test a low-code integration. ![Testing a code-native integration in the Prismatic runner](/docs/img/integrations/code-native/testing/testing-cni-in-prismatic.png) To test your config wizard experience, click the **Test Configuration** button. To run a test of a flow, select the flow from the dropdown menu on the top right of the page and then click the green **Run** button. When you're satisfied with your integration, you can click **Publish** to publish a new version of your integration and manage instances of your integration from the **Management** tab. Use a debug logger A code-native integration has no steps - just a trigger and `onExecution` function, so there are no step results to inspect. To debug your integration, use the `context.logger` object in your `onExecution` function. You can even conditionally log lines based on whether or not your test instance has [debug mode](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode) enabled.
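For example, you might log the trigger payload and a few intermediate values so they appear in the test instance's logs. A minimal sketch (the flow name and the placeholder work are illustrative):

```
import { flow } from "@prismatic-io/spectral";

export const debugExampleFlow = flow({
  name: "Debug Example Flow",
  stableKey: "debug-example-flow",
  description: "Illustrates logging intermediate values while testing",
  onExecution: async (context, params) => {
    const { logger } = context;

    // Log the payload that the trigger handed to onExecution
    logger.info(
      `Trigger payload: ${JSON.stringify(params.onTrigger.results.body.data)}`,
    );

    // ...do the flow's real work here, logging intermediate values as you go...
    const processedCount = 0; // placeholder for real work
    logger.info(`Processed ${processedCount} records`);

    return { data: null };
  },
});
```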
##### Testing a code-native integration from the CLI[​](#testing-a-code-native-integration-from-the-cli "Direct link to Testing a code-native integration from the CLI") If you would like to test your flow from within your IDE, use the `prism integrations:flows:test` command. After importing your code-native integration with `prism integrations:import`, a test instance of your integration is deployed and can be invoked [from the Prismatic UI](https://prismatic.io/docs/integrations/code-native/testing.md#testing-a-code-native-integration-in-prismatic). The same testing you can do in the UI can be done from the command line. We recommend using the `--tail-logs` flag to watch for logs from your invocation. If you would like to send a custom payload to your flow's trigger, use the `--payload` flag to send a local file as an HTTP body to your flow's webhook URL. ``` > prism integrations:flows:test --tail-logs --payload ./my-payload.json ? Select the flow to test: Flow 1 (flow-1) Starting execution...... done To re-run this flow directly: prism integrations:flows:test -u=https://hooks.prismatic.io/trigger/SW5zdGFuY2VGbG93Q29uZmlnOjhjYTZiZGU2LWM1MTktNGI5Ni1iYzVjLTc5NWJiZDMxMTcyNw== -p=./my-payload.json --tail-logs {"executionId":"SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6ZGVkOGE0YWEtMTYzZC00OTMwLWE0NTMtOTNlNGVmODlkYWQw"} β€Ί Warning: While the timestamps are accurate, logs & step results may not arrive in chronological order. Press CMD+C/CTRL+C to stop polling. This process will timeout after 20 minutes. 2025-05-12T17:05:01.197000+00:00 LOG_INFO Starting Instance 'My First Integration'. Total Concurrent Executions: 1 2025-05-12T17:05:05.115000+00:00 LOG_INFO "Select a database" is already marked complete. 2025-05-12T17:05:05.687000+00:00 LOG_INFO "Fix CORS configuration on API gateway" is already marked complete. 2025-05-12T17:05:06.013000+00:00 LOG_INFO "Document API authentication" is already marked complete. 2025-05-12T17:05:08.237000+00:00 LOG_INFO Ending Instance 'My First Integration' ``` ##### Measuring performance of a code-native integration[​](#measuring-performance-of-a-code-native-integration "Direct link to Measuring performance of a code-native integration") When in [debug mode](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode), you can leverage functions of `context.debug` to measure [how long](https://prismatic.io/docs/integrations/troubleshooting.md#measuring-time-performance-in-a-code-block-or-custom-connector) specific portions of your flows take to run and [how much memory](https://prismatic.io/docs/integrations/troubleshooting.md#measuring-memory-performance-in-a-code-block-or-custom-connector) they consume (see links for examples). ##### Unit tests for code-native integrations[​](#unit-tests-for-code-native-integrations "Direct link to Unit tests for code-native integrations") You can also write unit tests for your code-native integration, similar to [unit tests for custom components](https://prismatic.io/docs/custom-connectors/unit-testing.md). The `invokeFlow` function from the custom component SDK is used to invoke a test of a flow in a code-native integration. You can specify a sample payload to "send" to your flow's `onTrigger` function, and the `invokeFlow` function will run both `onTrigger` and `onExecution` and return the result of the flow's `onExecution` function. 
**Unit testing only works for integrations that do not leverage existing components**: If you use [existing components](https://prismatic.io/docs/integrations/code-native/existing-components.md) within your flows, you will not be able to build unit tests for your flows, since your local dev environment does not have access to the existing components. Please consider testing your flow [from the CLI](https://prismatic.io/docs/integrations/code-native/testing.md#testing-a-code-native-integration-from-the-cli) instead. Below is a simple flow that takes a payload and sends the payload to an API, returning the results of the API call. The corresponding unit test code invokes the flow, "sending" a sample payload and verifying that the results received are as expected. * CNI Code * Unit test code

index.ts

```
import {
  configPage,
  configVar,
  flow,
  integration,
} from "@prismatic-io/spectral";
import axios from "axios";

const configPages = {
  "Acme Config": configPage({
    elements: {
      "Acme API Endpoint": configVar({
        stableKey: "1F886045-27E7-452B-9B44-776863F6A862",
        dataType: "string",
        description: "The endpoint to fetch TODO items from Acme",
        defaultValue:
          "https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo",
      }),
      "Acme API Key": configVar({
        stableKey: "acme-api-key",
        dataType: "string",
        description: "The API key used to authenticate with Acme",
      }),
    },
  }),
};

export const myFlow = flow({
  name: "Create Acme Opportunity",
  stableKey: "create-acme-opportunity",
  description: "Create an opportunity in Acme",
  onExecution: async (context, params) => {
    const { id, name, value } = params.onTrigger.results.body.data as {
      id: string;
      name: string;
      value: number;
    };
    if (value < 0) {
      throw new Error("Invalid value - values cannot be negative");
    }
    const acmeEndpoint = context.configVars["Acme API Endpoint"];
    const response = await axios.post(
      `${acmeEndpoint}/opportunity`,
      { id, name, value },
      {
        headers: {
          Authorization: `Bearer ${context.configVars["Acme API Key"]}`,
        },
      },
    );
    return { data: response.data };
  },
});

export default integration({
  name: "acme-cni",
  description: "Acme CNI",
  iconPath: "icon.png",
  flows: [myFlow],
  configPages,
});
```

index.test.ts

```
import { myFlow } from ".";
import {
  invokeFlow,
  defaultTriggerPayload,
} from "@prismatic-io/spectral/dist/testing";

interface MyFlowResponse {
  externalId: string;
  id: string;
  name: string;
  value: number;
}

describe("test myFlow", () => {
  test("Verify that the API returns an external ID that matches the specified ID", async () => {
    const { result } = await invokeFlow(
      myFlow,
      {
        "Acme API Endpoint": "https://staging.api.example.com",
        "Acme API Key": "my-api-key",
      },
      {},
      {
        ...defaultTriggerPayload(),
        body: {
          data: { id: "123", name: "my-opportunity", value: 1000 },
          contentType: "application/json",
        },
      },
    );
    expect((result?.data as MyFlowResponse).externalId).toBe("123");
  });

  test("Verify that errors are thrown when provided negative values", async () => {
    await expect(
      invokeFlow(
        myFlow,
        {
          "Acme API Endpoint": "https://staging.api.example.com",
          "Acme API Key": "my-api-key",
        },
        {},
        {
          ...defaultTriggerPayload(),
          body: {
            data: { id: "123", name: "my-opportunity", value: -1000 },
            contentType: "application/json",
          },
        },
      ),
    ).rejects.toThrow("Invalid value - values cannot be negative");
  });
});
```

You can run a unit test with

```
npm run test
```

![Running a unit test for a code-native integration](/docs/img/integrations/code-native/testing/cni-unit-test.png) **Testing code-native integrations with component references**:
Note that if your code-native integration depends on existing components' actions, your local environment does not have the necessary component code and you must test your integration [within Prismatic](#testing-a-code-native-integration-in-prismatic). ###### Unit testing a code-native integration with an OAuth 2.0 connection[​](#unit-testing-a-code-native-integration-with-an-oauth-20-connection "Direct link to Unit testing a code-native integration with an OAuth 2.0 connection") If your integration includes an OAuth 2.0 connection, you can use the same strategy outlined in the [Unit Testing Custom Components](https://prismatic.io/docs/custom-connectors/unit-testing.md) guide. Both custom components and code-native integrations can take advantage of the `prism components:dev:run` command to fetch an established connection from an existing test instance. --- #### Common Integration Patterns ##### Integration Types There is significant variety when it comes to building integrations. Despite that, your integrations will follow one of a few common patterns. Understanding these patterns and which ones are best suited for different scenarios will help when planning your approach to integrations - from how you set up your application API to how you leverage custom components. Here are the common patterns we will examine: * **Event-driven** - An event occurs in an app, which causes data to be sent to another app. The event may be in your system or the third-party system you are integrating with. This is probably the most common type of integration we see being used. * **Scheduled** - An integration that runs on a regular schedule (every minute, every hour, daily, etc.). These integrations include exports (recurring reports) and imports (polling an API to request data for import to your system). Not as common as event-driven integrations. * **Synchronous** - One system makes a request, then waits for the integration to complete and respond with an answer. Perhaps the most difficult type of integration. We don't see this one implemented very often for reasons we'll get into shortly. * **Hybrid** - This isn't so much a pattern as parts of other patterns used together in a single integration. Let's examine some use cases for these types and see when we should use each type and what is needed to build them in Prismatic. #### Event-driven integrations[​](#event-driven-integrations "Direct link to Event-driven integrations") As noted, this is probably the most common type of integration. When an event occurs in a source system (for example, a customer record is updated in a CRM or an invoice is paid in a POS), data about the event is sent to a destination system. One of the benefits of event-driven integrations is having a near-real-time data exchange. When an event occurs in the source system, the data is sent to the destination system in seconds (or whatever time it takes to run the integration). Event-driven integrations are further broken down into the following, which we'll cover shortly: * One-way import * One-way export * Two-way ##### Webhooks & payloads[​](#webhooks--payloads "Direct link to Webhooks & payloads") We've touched on events, but we also need to cover webhooks and payloads. A **webhook** is an event-triggered request sent by an app when something changes. For an import integration, that app is the third-party app; for an export integration, it's your app. The **payload** is the data sent by the webhook. While Prismatic commonly uses the term webhook, some companies use different terms.
For example, Salesforce refers to these as [Outbound Messages](https://developer.salesforce.com/docs/atlas.en-us.api.meta/api/sforce_api_om_outboundmessaging_understanding.htm), and Slack calls them [Events](https://api.slack.com/apis/connections/events-api). The bottom line is that *something* changes in the source system, which causes the source system to send data outside the system. **Webhooks are defined by the *source* system**: When the destination system creates an endpoint, it tells the source system, "when *thing* changes, notify me at this URL." That is, the destination only defines where the payload is sent - the source system determines when data is sent and what the payload contains. For example, you could set up a webhook in an inventory app to notify you when the quantity for an inventory item is updated. When an update occurs in the inventory system, the system might send an HTTP request with a payload like this:

```
{
  "event": "INVENTORY_UPDATE",
  "updates": [
    { "item": "widgets", "action": "add", "quantity": 20 },
    { "item": "gadgets", "action": "remove", "quantity": 5 }
  ]
}
```

For a great example of an event-driven API, let's consider GitHub, which has a variety of events that can trigger a webhook. GitHub also provides the shape of the payload sent via each webhook. When you integrate with GitHub, you'll create endpoints to receive those payloads. ##### Pub/sub systems[​](#pubsub-systems "Direct link to Pub/sub systems") Though webhooks are perhaps the most common type of pub/sub (publisher/subscriber) system, the source system for an integration may also be using Amazon SNS, Apache Kafka, email, or something else altogether. In each case, these systems provide you with something you subscribe to as the starting point for the integration. ##### One-way import integration[​](#one-way-import-integration "Direct link to One-way import integration") Now that we've covered event-driven integrations in general, let's dive into the details of a (relatively) simple one-way event-driven import. This is an import of data from the third-party app into your app based on events in the third-party app. Once you've built a couple of these imports, you should be able to put the next one together pretty quickly and with a minimum of components. ![Simple diagram of one-way event-driven import integration](/docs/img/integrations/common-patterns/import-diagram.png) ###### Multiple events[​](#multiple-events "Direct link to Multiple events") First, let's talk about importing data from a third-party app that supports webhooks (or another pub/sub system). You may initially be importing a single, simple payload and may want to subscribe to only what you immediately need from the third-party system. However, you should probably subscribe to several webhooks (if available) in the third-party app to support the different payloads you may eventually need to receive. After all, an import integration does not need to be limited to a single type of data or payload. A single integration, for example, may allow you to get updated customer records from a third-party system and updated order records from the same system. These different datasets may all be provided to your app via the same integration, but they may be sent via multiple webhooks in the third-party app. ###### Branching & flows[​](#branching--flows "Direct link to Branching & flows") To support different events in the third-party app, you can either use [branching](https://prismatic.io/docs/components/branch.md) or set up a unique flow within the integration for each event type.
If your integration handles a small number of webhook events, then branching may be the best option. If, however, the integration is more complex, with substantially different logic for each payload, it is better to use [flows](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md). For an example of flows to support multiple events, let's consider Salesforce. You might have one flow in your integration that supports the Opportunity update event, with separate flows for the Opportunity create event and the Contact create event. By setting things up this way, each flow handles a small piece of the overall integration, and code used in separate flows is tailored precisely to the payload the flow is processing. If the third-party app supports it, you could create a separate webhook for each flow. Alternatively, you could use a [pre-process flow](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#instance-specific-endpoint-with-a-preprocess-flow) with a single endpoint. A pre-process flow examines incoming payloads and dispatches them appropriately to the integration's other flows. ###### Deploy-time flow[​](#deploy-time-flow "Direct link to Deploy-time flow") [A deploy-time flow](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger) helps ensure that the webhooks are set up correctly. Deploy-time flows are set to run when you (or one of your customers) activate the integration. Since a deploy-time flow knows the webhook URLs (endpoints) for all its sibling flows, it can define those endpoints for the third-party app. This ensures that payloads are sent to the correct endpoints. ###### Third-party app component for import[​](#third-party-app-component-for-import "Direct link to Third-party app component for import") If you are integrating with a common app for which Prismatic already has a component, you shouldn't need to write any custom code. If you need to integrate with a niche app in your industry, building that [custom component](https://prismatic.io/docs/custom-connectors.md) should be straightforward. In many cases, this component consists of a few actions that create, list, and remove webhook URLs (there's a rough sketch of one such action at the end of this section). You might also have a scenario where the third-party app returns a subset of needed data (maybe IDs of records that have changed but not the actual records themselves). In this scenario, you may need to wrap a bit of the third-party API with your component to get the records associated with the IDs before sending the entire payload to the next step in the integration. ###### Your app component for import[​](#your-app-component-for-import "Direct link to Your app component for import") If your app has an API, you'll want to build a custom component that wraps your API for import purposes. This will allow you to build import integrations from many different third-party apps and simply reuse your custom component in each case instead of needing to build a new custom component each time to connect to your API. After your first couple of import integrations, you shouldn't have to make further changes to the custom component. Once you reach that point, most teams are able to hand off building additional import integrations to technical non-developers. If your app doesn't have an API, then you may need to use [scheduled integrations](#scheduled-integrations) to pass data through a third party.
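To make the "wrap an API in a component" idea above more concrete, here is a minimal sketch of a custom component action that registers a webhook with a hypothetical third-party app. The `https://api.example.com/webhooks` endpoint, the input names, and the connection shape are all placeholder assumptions - substitute your own API's details - while the overall `action`/`component` structure follows Prismatic's custom component SDK:

```
import { action, component, connection, input, util } from "@prismatic-io/spectral";
import axios from "axios";

const apiKeyConnection = connection({
  key: "apiKey",
  display: { label: "Example App API Key", description: "API key for the example app" },
  inputs: {
    apiKey: { label: "API Key", type: "password", required: true },
  },
});

const createWebhook = action({
  display: {
    label: "Create Webhook",
    description: "Register a webhook so the example app notifies this integration of events",
  },
  inputs: {
    connection: input({ label: "Connection", type: "connection", required: true }),
    callbackUrl: input({ label: "Callback URL", type: "string", required: true }),
    event: input({ label: "Event Type", type: "string", required: true }),
  },
  perform: async (context, params) => {
    // Hypothetical endpoint - replace with your third-party app's webhook API
    const response = await axios.post(
      "https://api.example.com/webhooks",
      {
        url: util.types.toString(params.callbackUrl),
        event: util.types.toString(params.event),
      },
      {
        headers: {
          Authorization: `Bearer ${util.types.toString(params.connection.fields.apiKey)}`,
        },
      },
    );
    return { data: response.data };
  },
});

export default component({
  key: "example-app",
  public: false,
  display: { label: "Example App", description: "Wraps the example app's webhook API", iconPath: "icon.png" },
  actions: { createWebhook },
  connections: [apiKeyConnection],
});
```

A matching "Delete Webhook" action, plus a deploy-time flow that calls these actions with each sibling flow's webhook URL, rounds out the pattern described above.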
##### One-way export integration[​](#one-way-export-integration "Direct link to One-way export integration") Now that we've looked at the one-way event-driven import integration, let's flip everything around and look at the reverse. The primary difference between an export integration and an import integration is that the export integration starts with your core application, giving you absolute control over the initial steps of the integration. ![Simple diagram of one-way event-driven export integration](/docs/img/integrations/common-patterns/export-diagram.png) ###### Your app[​](#your-app "Direct link to Your app") You'll want to write the core product functionality for the export one time, so you don't have to keep building additional functionality. The best approach is to set up your app with an API that includes events, webhooks, and the corresponding payload definitions. Doing this saves considerable time for your developers since they won't need to keep defining new export functionality for the API. Instead, you'll be able to use Prismatic to create integrations that subscribe to your webhooks, filter/transform the resulting payloads, and send the data to the third-party app. If your app does not have an API, you'll probably need to use [scheduled integrations](#scheduled-integrations) instead of event-driven ones. ###### Your app component for export[​](#your-app-component-for-export "Direct link to Your app component for export") In a one-way export, the component to handle the export from your app tends to be simple. You've already done the hard work by setting up your app with an API. As a result, this component usually consists of the actions necessary to create, list, and remove the webhook URLs. ###### Third-party app component for export[​](#third-party-app-component-for-export "Direct link to Third-party app component for export") If a standard component is available on Prismatic to access the third-party API, you will want to use that. But, if you need to build a custom component, you'll want to [wrap the third-party API](https://prismatic.io/docs/custom-connectors/get-started/wrap-an-api.md) so you can re-use this component any time you have an integration that requires you to export data to this third-party system. We discussed this same approach to wrap your API in a component when importing data into your app. ##### Two-way event-driven integration[​](#two-way-event-driven-integration "Direct link to Two-way event-driven integration") A two-way event-driven integration melds the one-way event-driven export and the one-way event-driven import into a single integration. The import will have its own flow or [flows](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md), and the export will have its own flow or flows, but everything is contained within a single integration. A two-way integration most likely uses a custom component for your app but a built-in component for the third-party app (since this is both an import and export integration). Your customers will then have a single integration to activate in the [marketplace](https://prismatic.io/docs/embed/marketplace.md), while your customer-facing teams have a single integration to support. We recommend doing this as a single integration using multiple flows instead of assembling the pieces into several one-way integrations. Building a single integration simplifies integration development, deployment, and support. 
![Simple diagram of two-way event-driven integration](/docs/img/integrations/common-patterns/two-way-diagram.png) #### Scheduled integrations[​](#scheduled-integrations "Direct link to Scheduled integrations") When the system you need to transfer data from does not have webhooks or another pub/sub system to which you can subscribe, you need to use a scheduled integration. A scheduled integration can be either import or export. Instead of being triggered when the integration receives a notification, a scheduled integration is triggered at a specific time or time interval. For example, a scheduled integration may run every two minutes or every Thursday at 7:00 PM. By definition, a scheduled integration is not near-real-time. For a scheduled integration that recurs frequently, the data may be only a minute or two old, but it will not be as fresh as one can get from an event-driven integration. The main difference between a scheduled integration and an event-driven one is that the source system in an event-driven integration says, "I have something," but a scheduled integration asks the source system, "What do you have for me?" Scheduled integrations generally fall into one of the following: * Scheduled data import * Scheduled file import * Scheduled file export ##### Scheduled data import integration[​](#scheduled-data-import-integration "Direct link to Scheduled data import integration") You'll use a scheduled integration to import when you have a third-party API that does not have webhooks or another pub/sub system. If a pre-built component is available, then you will use that. If a pre-built component for this third-party app isn't available, you'll need to build a fully fleshed-out custom component that wraps the various API endpoints containing the data the integration will query. The API then processes the query and returns the requested payload. ![Simple diagram of one-way scheduled export integration](/docs/img/integrations/common-patterns/scheduled-import-diagram.png) ###### Your reusable custom component for import[​](#your-reusable-custom-component-for-import "Direct link to Your reusable custom component for import") In the event-driven import section, we talked about how you'll want to build a custom component for your app that can accept any data you need to import. The good news is that once you've built that component, you can also use it here within a scheduled import integration. It doesn't care how the data was sourced; all it cares about is taking the payload from Prismatic and providing it to your API. ##### Scheduled file import integrations[​](#scheduled-file-import-integrations "Direct link to Scheduled file import integrations") Some third-party apps may be configured with a file export instead of an API, which is more common with non-SaaS (legacy) systems. In this case, the app may write out data as PDF, XML, CSV, or another file type to an external data source such as Dropbox, SFTP, MS SQL, Queue, etc. To import this data, you'll build an integration that checks the location(s) according to a regular schedule to see if there are new files to process. When there is new data within the location (for example, new XML files in a Dropbox folder), the third-party custom component you've built for your integration will loop over each of the files, process the files, and then move or delete the processed files. 
As with other scheduled integrations, this still is not near-real-time but can be current within a few minutes of the file(s) being placed in the external data source. ![Simple diagram of one-way scheduled file import integration](/docs/img/integrations/common-patterns/scheduled-import-file-diagram.png) ##### Scheduled file export integration[​](#scheduled-file-export-integration "Direct link to Scheduled file export integration") We tend to see this type of integration where you generate a regular report for your customers. You set up an integration where you are either querying your API (for your app) or a third-party API for specific data. Once the data has been returned, your integration generates a file with the data in the necessary format (PDF, XML, CSV, etc.). Finally, this file is packaged up and sent off to its destination (which could range from a filesystem to email or SMS). You should be able to use a built-in Prismatic component to handle the part where the file is placed into a filesystem, sent via email, etc. Some of the common data platforms for which Prismatic has built-in components are [Dropbox](https://prismatic.io/docs/components/dropbox.md), [Amazon S3](https://prismatic.io/docs/components/aws-s3.md), [Google Drive](https://prismatic.io/docs/components/google-drive.md), and [Azure Files](https://prismatic.io/docs/components/azure-files.md). ![Simple diagram of one-way scheduled file export integration](/docs/img/integrations/common-patterns/scheduled-export-file-diagram.png) #### Synchronous integrations[​](#synchronous-integrations "Direct link to Synchronous integrations") A synchronous integration may be what you need if you have data from a third-party app that you must have *right* now. With this type of integration, the calling system invokes the integration's URL and then waits for a response. If a synchronous integration is working quickly and has very little to no lag between the call and the response, then the integration should work well. However, if the integration takes a while to run (a lot of data to process, a third-party app that is slow to fulfill requests, etc.), it becomes susceptible to network disconnects and timeouts. You are probably better off building this as a two-way asynchronous integration. Then, the caller can send off the request and "hang up." When the integration completes the process, it can "redial" to return the requested data. ![Simple diagram of one-way synchronous import integration](/docs/img/integrations/common-patterns/synchronous-diagram.png) #### Hybrid integrations[​](#hybrid-integrations "Direct link to Hybrid integrations") As noted, hybrid integrations are some combination of other integration types. For example, suppose that your app supports webhooks, but the third-party app you are working with only has an API with no webhooks. As a result, you might build a hybrid integration where you have an event-driven export from your app but are running a scheduled data import from the third-party app. Instead of setting this up as two separate integrations, you would be able to combine them in the same integration with distinct flows. --- ##### Using a FIFO Queue to Ensure In-Order Processing **Use native queueing**: This tutorial walks through implementing your own FIFO queue using a third-party queuing service, which can give you granular control over how messages are queued and processed. Prismatic also offers [native queueing](https://prismatic.io/docs/integrations/triggers/fifo-queue.md).
Prismatic's integration runner is designed to process requests in parallel. If you invoke an integration with multiple requests in quick succession, the runner will scale and process all of the requests simultaneously. If you have a workflow that requires requests to be processed in a specific order, or a workflow that requires you to process only one record at a time, you'll need to take additional steps to ensure that the requests are processed sequentially. Queuing systems, like [Amazon SQS](https://aws.amazon.com/sqs/) or [Azure Service Bus](https://learn.microsoft.com/en-us/azure/service-bus-messaging/) often offer a FIFO (first-in, first-out) queue type which allows you to write messages to the queue, and retrieve them in the order that they were added. You can use a FIFO queue in an integration to queue up requests your integration receives and process them one by one. #### First-in, first-out (FIFO) flows[​](#first-in-first-out-fifo-flows "Direct link to First-in, first-out (FIFO) flows") Regardless of which queuing system you use, the general flow of a FIFO integration is the same: * One flow receives requests in parallel and quickly writes them to the queue. * Another flow runs on a regular schedule, reads one message from the queue at a time, and processes messages in series. ![Illustration of a FIFO queue](/docs/img/integrations/common-patterns/fifo-queue/fifo-queue.png) Additional flows can be added to the integration to handle other tasks, like configuring queues or cleaning up unprocessable requests from a [dead letter queue](https://en.wikipedia.org/wiki/Dead_letter_queue). #### FIFO queues in Amazon SQS[​](#fifo-queues-in-amazon-sqs "Direct link to FIFO queues in Amazon SQS") You can use the built-in [Amazon SQS](https://prismatic.io/docs/components/aws-sqs.md) component to ensure that your integration processes requests one at a time. [Example Integration](https://github.com/prismatic-io/examples/blob/main/integrations/amazon-sqs-fifo-queue.yml) An Amazon SQS-based FIFO integration will generally have four flows: 1. A "setup" flow that creates and configures the SQS queue. The flow is triggered by an [instance deploy trigger](https://prismatic.io/docs/components/management-triggers.md#instance-deploy) so that it runs when a customer deploys an instance. It contains a single action - [Create Queue](https://prismatic.io/docs/components/aws-sqs.md#create-queue) - which is idempotent and can be run many times. We can use the instance's ID as the queue name to ensure that each instance has its own queue. 2. A "write" flow that receives requests and writes them to the queue. This flow is triggered by a webhook request, and can run several executions in parallel. This flow contains two steps - one step that fetches the queue's URL based on its name, and another step that writes the trigger's payload to the queue. You can once again use the instance's ID as the group ID to ensure only one message is processed at a time. 3. A "read" flow that reads messages from the queue and processes them. This flow is triggered on a schedule (as often as every minute). The flow enters a loop and requests a message from the queue. If the queue is empty, the flow exits. If a message is returned, the flow processes the message and deletes it from the queue. 4. A "cleanup" flow that deletes the queue. This flow is triggered by an [instance delete trigger](https://prismatic.io/docs/components/management-triggers.md#instance-remove) so that it runs when a customer deletes an instance. 
It contains two actions - one that fetches the queue's URL based on its name, and another that deletes the queue. ##### Message deduplication in Amazon SQS[​](#message-deduplication-in-amazon-sqs "Direct link to Message deduplication in Amazon SQS") When creating an Amazon SQS queue, you have the option to enable [content-based deduplication](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/FIFO-queues.html#FIFO-queues-exactly-once-processing). When enabled, if two messages with the same content are added to the queue within a 5-minute window, only one message will be added to the queue. This is helpful if your integration receives duplicate requests. If you have your own mechanism for determining whether a message is a duplicate, you can disable content-based deduplication and use the [Message Deduplication ID](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/using-messagededuplicationid-property.html) in the "write" flow to determine whether a message is a duplicate. ##### Assured ordering in Amazon SQS[​](#assured-ordering-in-amazon-sqs "Direct link to Assured ordering in Amazon SQS") One concern you may have with this approach is that if the "read" flow runs every minute, and one flow takes time to process a large batch of messages, another "read" flow may begin running. *Will this cause messages to be processed out of order?* No. Amazon SQS will not return additional messages to any reader until the current message has been processed / deleted (or the message runs past its [visibility timeout](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-visibility-timeout.html)). If one "read" flow is already processing a message, an additional "read" flow will be told that no new messages are available, and will exit. ##### Real-time processing in an Amazon SQS FIFO integration[​](#real-time-processing-in-an-amazon-sqs-fifo-integration "Direct link to Real-time processing in an Amazon SQS FIFO integration") The "read" flow runs every minute to process messages. If you need to process messages in real-time, you can use a webhook trigger to trigger the "read" flow from the "write" flow after a message is written to the queue. ##### Handling AWS credentials in an Amazon SQS FIFO integration[​](#handling-aws-credentials-in-an-amazon-sqs-fifo-integration "Direct link to Handling AWS credentials in an Amazon SQS FIFO integration") If you leverage Amazon SQS in your integration, you will need to provide AWS credentials to the integration runner. You can either: * Have your customer provide their own AWS credentials when they deploy an instance. This requires that your customer have an AWS account and create an IAM user with the appropriate permissions. * Use your own AWS account and credentials. You can provide default credentials in the config wizard designer, but mark the connection as [organization-visible](https://prismatic.io/docs/integrations/config-wizard/config-variables.md#config-variable-visibility). Your customers' instances will then use your credentials to access AWS, but they will not be able to see the credentials in the config wizard or through the API. 
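If you implement the "write" flow yourself - for example, in a custom component or a code-native integration instead of the built-in Amazon SQS actions described above - the FIFO behavior comes down to two parameters on the send call. Here is a rough sketch using the AWS SDK for JavaScript v3; the region, queue URL, and payload shape are placeholder assumptions:

```
import { SQSClient, SendMessageCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({ region: "us-east-1" }); // placeholder region

export const enqueue = async (
  queueUrl: string,
  instanceId: string,
  payload: Record<string, unknown>,
) => {
  await sqs.send(
    new SendMessageCommand({
      QueueUrl: queueUrl,
      MessageBody: JSON.stringify(payload),
      // Messages that share a group ID are delivered in order, one at a time.
      // Using the instance's ID gives each instance its own ordered stream.
      MessageGroupId: instanceId,
      // Only needed when content-based deduplication is disabled on the queue;
      // here we assume the payload carries its own unique event ID.
      MessageDeduplicationId: String(payload.eventId),
    }),
  );
};
```

This mirrors what the "write" flow described above does when you supply the instance's ID as the message group ID.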
--- ##### Handling Large Files If your integration transfers large files between your app and a partner app, you may encounter several runner limitations: * The payload you can send to a webhook URL is limited to approximately 6MB * Your webhook request must complete within 30 seconds * Your flow can run for a maximum of 15 minutes * The runner is allocated 1GB of RAM A complete list of runner limits can be found [here](https://prismatic.io/docs/integrations/integration-runner-environment-limits.md). When sending large files through a flow, those files may exceed upload size and time limits, or may require more memory than is available (resulting in an out-of-memory error). There are several strategies you can use to handle large files in your Prismatic integration. #### Upload files directly to a file storage system[​](#upload-files-directly-to-a-file-storage-system "Direct link to Upload files directly to a file storage system") If you and your partner app both use a file storage system like Amazon S3 or Dropbox, you can upload files directly to that system. This can be accomplished using the file storage system's API, where you can request a temporary or presigned URL that allows you to upload a file directly from your application to the file storage system. ##### Upload files directly to Amazon S3[​](#upload-files-directly-to-amazon-s3 "Direct link to Upload files directly to Amazon S3") To upload a file directly to your customer's Amazon S3 bucket, you can use the [Generate Presigned URL](https://prismatic.io/docs/components/aws-s3.md#generate-presigned-url) action from the Amazon S3 component. ![Screenshot of generating a presigned URL from S3](/docs/img/integrations/common-patterns/large-files/generate-presigned-url-s3.png) If you'd like your flow to return a presigned upload URL whenever it is invoked: 1. Ensure that the trigger has a **Response Type** of **Synchronous** 2. Ensure that the **Generate Presigned URL** action is the last action of your flow. If those things are true, when your app calls an instance's flow's webhook URL, it will receive a response with the results of the **Generate Presigned URL** action (the presigned URL as a string) as the body of the response. 
You can use the returned presigned URL to upload a file directly to Amazon S3 through an HTTP PUT request:

```
# Fetch the presigned URL from the webhook response and remove double-quotes
$ curl 'https://hooks.dev.prismatic-dev.io/trigger/SW5zdEXAMPLE==' --location | tr -d '"'
https://example-bucket.s3.us-west-2.amazonaws.com/my-file.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=AKIAIOSFODNN7EXAMPLE%2F20240221%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20240221T191715Z&X-Amz-Expires=3600&X-Amz-Signature=82a604673c3fffc2671b2dd7c7a86036af67693509ba0d01f172ef0b1f84fb20&X-Amz-SignedHeaders=host&x-id=PutObject

# Use the presigned URL to upload a file directly to Amazon S3
$ curl --request PUT \
  --upload-file ./my-example-file.png \
  'https://example-bucket.s3.us-west-2.amazonaws.com/my-file.png?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Credential=AKIAIOSFODNN7EXAMPLE%2F20240221%2Fus-west-2%2Fs3%2Faws4_request&X-Amz-Date=20240221T191715Z&X-Amz-Expires=3600&X-Amz-Signature=82a604673c3fffc2671b2dd7c7a86036af67693509ba0d01f172ef0b1f84fb20&X-Amz-SignedHeaders=host&x-id=PutObject'
```

##### Upload files directly to Dropbox[​](#upload-files-directly-to-dropbox "Direct link to Upload files directly to Dropbox") Similar to presigned URLs for Amazon S3, you can use the [Generate Temporary Upload Link](https://prismatic.io/docs/components/dropbox.md#get-temporary-upload-link) action from the Dropbox component to upload a file directly to Dropbox. ![Screenshot of generating a presigned URL from Dropbox](/docs/img/integrations/common-patterns/large-files/generate-temporary-upload-link-dropbox.png) Provided that you have a synchronous trigger and the **Generate Temporary Upload Link** action is the last action in your flow, the presigned URL will be returned to the caller of your webhook.

```
# Invoke a flow that generates a presigned URL, and parse the URL from the JSON response
$ curl 'https://hooks.dev.prismatic-dev.io/trigger/SW5zdEXAMPLE==' --location | jq -r .result.link
https://content.dropboxapi.com/apitul/1/ExAmPlE

# Use the presigned URL to upload a file directly to Dropbox
$ curl --request POST \
  --upload-file ./my-example-file.png \
  --header "content-type: application/octet-stream" \
  https://content.dropboxapi.com/apitul/1/ExAmPlE
```

Note that Dropbox expects a `POST` request (unlike Amazon S3, which expects a `PUT` request) and requires a `content-type` header with a value of `application/octet-stream`. ##### Query an instance's connection config variable and upload a file directly to a third-party[​](#query-an-instances-connection-config-variable-and-upload-a-file-directly-to-a-third-party "Direct link to Query an instance's connection config variable and upload a file directly to a third-party") If you need to upload a file to a third-party service that does not support presigned upload URLs, but your customer provided connection information for the third-party service when they configured an instance of your integration, you can query the Prismatic API for the instance's connection config variables and use those credentials to upload the file directly to the third-party service. To query an instance's config variables, you can query for an `instance` object's `configVariables` property - particularly their `inputs.nodes.value` fields: Query instance connection config variables ``` query getAcmeConnectionInfo($myInstanceId:ID!)
{ instance(id: $myInstanceId) { configVariables { nodes { id requiredConfigVariable { key } meta inputs { nodes { name value } } } } } } ``` Query Variables ``` { "myInstanceId": "SW5zdGFuY2U6NDY2MGQ4MDgtYjUwZS00NDdhLThhZmQtOWU0NzAxMzJkZThk" } ``` [Try It Out ❯](https://prismatic.io/docs/explorer?query=query+getAcmeConnectionInfo%28%24myInstanceId%3AID%21%29+%7B%0A++++instance%28id%3A+%24myInstanceId%29+%7B%0A++++++configVariables+%7B%0A++++++++nodes+%7B%0A++++++++++id%0A++++++++++requiredConfigVariable+%7B%0A++++++++++++key%0A++++++++++%7D%0A++++++++++meta%0A++++++++++inputs+%7B%0A++++++++++++nodes+%7B%0A++++++++++++++name%0A++++++++++++++value%0A++++++++++++%7D%0A++++++++++%7D%0A++++++++%7D%0A++++++%7D%0A++++%7D%0A++%7D\&query_variables=%7B%0A++%22myInstanceId%22%3A+%22SW5zdGFuY2U6NDY2MGQ4MDgtYjUwZS00NDdhLThhZmQtOWU0NzAxMzJkZThk%22%0A%7D) This query will return a result like this: ``` { "data": { "instance": { "configVariables": { "nodes": [ { "id": "SW5zdGFuY2VDb25maWdWYXJpYWJsZTo3MzkzYjU0YS1kNWMxLTQzYjEtOTM3ZS1iNTM0ZDhiYTA1NzA=", "requiredConfigVariable": { "key": "My String Config Variable" }, "meta": null, "inputs": { "nodes": [] } }, { "id": "SW5zdGFuY2VDb25maWdWYXJpYWJsZTo1OTY1ZjEwMy0xNGIyLTRmZWItYmI4My1hZWI4NmViNmRhYmI=", "requiredConfigVariable": { "key": "Acme Inc Connection" }, "meta": null, "inputs": { "nodes": [ { "name": "password", "value": "my-pass" }, { "name": "username", "value": "my-user" } ] } } ] } } } } ``` From the result you can extract connection information (like a username and password or API key) and use those credentials to make an API request directly to the file storage system. If the file storage system uses OAuth 2.0, your customer's API key will be present in the config variable's `meta` property. For more information on querying the Prismatic API, see the [Prismatic API documentation](https://prismatic.io/docs/api.md). #### Instruct Dropbox to download a file from a URL[​](#instruct-dropbox-to-download-a-file-from-a-url "Direct link to Instruct Dropbox to download a file from a URL") If you have a file that is publicly available at a certain URL, you can use the [Save from URL](https://prismatic.io/docs/components/dropbox.md#save-from-url) action from the Dropbox component to instruct Dropbox to download the file from the internet. Provide the URL where the file is located, and Dropbox will download the file and save it to your specified path in the user's Dropbox account. ![Screenshot of saving a file from a URL to Dropbox](/docs/img/integrations/common-patterns/large-files/save-from-url-dropbox.png) #### Pass a reference to a file via webhook[​](#pass-a-reference-to-a-file-via-webhook "Direct link to Pass a reference to a file via webhook") If you have a file that is too large to upload directly via webhook, but is small enough that the 1GB of memory available to the runner can handle it, you can pass a reference to the file to the runner via a webhook request. This reference could be a URL where the file is located, or a unique identifier that the runner can use to download the file from a file storage system. Your flow can use the [HTTP - GET](https://prismatic.io/docs/components/http.md#get-request) action or a comparable FTP or SFTP action to download the file from the URL, or use the file storage system's API to download the file using the unique identifier, and can then process the file as if it were uploaded directly to the flow via webhook request. 
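As a rough sketch of what the receiving end of this reference-passing approach can look like in a code-native flow: the `fileUrl` field name, the flow name, and the processing step below are placeholder assumptions - the only requirement is that the caller sends a JSON body containing a URL (or identifier) the runner can use to retrieve the file:

```
import { flow } from "@prismatic-io/spectral";
import axios from "axios";

export const processFileReference = flow({
  name: "Process File Reference", // hypothetical flow name
  stableKey: "process-file-reference",
  description: "Downloads a file referenced in the webhook payload and processes it",
  onExecution: async (context, params) => {
    // The caller POSTs something like { "fileUrl": "https://..." } to the flow's webhook URL
    const { fileUrl } = params.onTrigger.results.body.data as { fileUrl: string };

    // Download the file into memory; it still needs to fit within the runner's 1GB of RAM
    const response = await axios.get(fileUrl, { responseType: "arraybuffer" });
    const file = Buffer.from(response.data);

    context.logger.info(`Downloaded ${file.length} bytes`);

    // ...process the file here, then hand it off to its destination...
    return { data: { bytesReceived: file.length } };
  },
});
```

In a low-code integration, the HTTP - GET action described above plays the same role.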
#### Stream data using a custom action[​](#stream-data-using-a-custom-action "Direct link to Stream data using a custom action") If you need to load a file from one location, process its data, and store the file in another location, you can build a custom component and leverage Node.js streams to process small portions of the file at a time. See [Handling Large Files in Custom Components](https://prismatic.io/docs/custom-connectors/handling-large-files-in-custom-components.md) for examples. --- ##### Loop Over and Process Files In this tutorial we will build an integration that downloads and processes files stored in [Google Cloud Storage](https://prismatic.io/docs/components/google-cloud-storage.md), but similar concepts can be applied to files stored in Dropbox, Amazon S3, Azure Blob Storage, an SFTP server, or any other file storage system. For this integration, assume that some third-party service writes an XML file to a Google Cloud Platform (GCP) storage bucket whenever it processes an order. We'll configure our integration to run every five minutes, and our integration will do the following: * Look for files in the `unprocessed/` directory of our GCP Storage bucket * For each file that we find: * Download the file * Deserialize the XML contained in the file * Perform data mapping * Post the transformed data to an HTTP endpoint * Move the file from the `unprocessed/` directory to a `processed/` directory Our integration will leverage the [loop](https://prismatic.io/docs/components/loop.md) component to process files one by one. If you would like to view the YAML definition of this example integration, it's available on [GitHub](https://github.com/prismatic-io/examples/blob/main/integrations/example-with-loop.yaml). You can import it by creating a new integration and selecting **Import**. #### List files[​](#list-files "Direct link to List files") We'll start by adding a Google Cloud Storage **List Files** action to our integration. This will automatically add a Google Cloud Storage connection to our config wizard. We'll want our customers to be able to specify their own bucket, so let's add a [Select Bucket](https://prismatic.io/docs/components/google-cloud-storage.md#select-bucket) data source config variable to our config wizard: ![Google Cloud Storage - Select Bucket data source](/docs/img/integrations/common-patterns/loop-over-files/select-bucket-data-source.png) We can also add two more string config variables to represent the `unprocessed/` and `processed/` directories in the bucket (which our customer may want to change, so it's handy to make these config variables). Now, configure the **List Files** action to reference our config variables: ![Google Cloud Storage - List Files inputs](/docs/img/integrations/common-patterns/loop-over-files/list-files-inputs.png) Next, we'll open the **Test Configuration** drawer and select **Test-instance configuration** to set some test credentials and config variable values. ![Google Cloud Storage - Test config variables](/docs/img/integrations/common-patterns/loop-over-files/test-config-variables.png) Finally, we'll click **Run**. If you see any errors about permissions, ensure that the Google IAM account you created has the proper permissions to the bucket you created.
You should see the files in your `unprocessed/` directory: ![Google Cloud Storage - List Files result](/docs/img/integrations/common-patterns/loop-over-files/list-files-results.png) #### Create our loop[​](#create-our-loop "Direct link to Create our loop") Next, we'll loop over the files that our **List Files** step found. We'll add a **Loop Over Items** step. Under the **Items** input we will reference the list of files our previous step returned: ![Loop Over Items in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-files/loop-step.png) #### Add tasks to the loop[​](#add-tasks-to-the-loop "Direct link to Add tasks to the loop") Our loop is now configured to run once for each file that was found in the `unprocessed/` directory in our GCP bucket. Our loop will contain several steps to process and send the data to an external system. ##### Download the file we're currently looping over[​](#download-the-file-were-currently-looping-over "Direct link to Download the file we're currently looping over") First, we'll download the file we're currently processing. The item that we're currently processing from our loop is accessible using the `currentItem` key of the loop. We'll add a **Download File** action from the GCP component. * For **File Name** we'll reference the loop's `currentItem`. * For **Bucket Name** we'll reference the bucket name config variable we created. * Our **Connection** is already set up for us. For example, if there's a file named `unprocessed/order-123.xml` in our bucket, `loopOverEachFile.currentItem` would be equal to `"unprocessed/order-123.xml"`: ![Download current file inputs](/docs/img/integrations/common-patterns/loop-over-files/download-file-inputs.png) Because we're downloading an XML file, this action will return the file's XML contents as a string. ![Download current file results](/docs/img/integrations/common-patterns/loop-over-files/download-file-results.png) ##### Deserialize the XML[​](#deserialize-the-xml "Direct link to Deserialize the XML") Next, we'll use the [Deserialize XML](https://prismatic.io/docs/components/change-data-format.md#deserialize-xml) action to convert the XML string into a JavaScript object whose keys can be referenced by subsequent steps. ![Deserialize XML results](/docs/img/integrations/common-patterns/loop-over-files/deserialize-xml-results.png) ##### Map the data[​](#map-the-data "Direct link to Map the data") Next, suppose the API we're sending the data to expects a different format ("quantity" instead of "qty", etc.). We can use the [Collection Tools](https://prismatic.io/docs/components/collection-tools.md) **Create Object** action to create a new object for us, referencing the results of the **Deserialize XML** step: ![Create object inputs](/docs/img/integrations/common-patterns/loop-over-files/create-object-inputs.png) ##### Send the data[​](#send-the-data "Direct link to Send the data") Next, we'll use the [HTTP](https://prismatic.io/docs/components/http.md) component's **POST Request** action to send the data we generated. As a placeholder for an external API, we'll post the data to [Postman's](https://www.postman.com/) `https://postman-echo.com/post` endpoint.
For our **Data** input, we'll reference the **Create Object** step's results: ![HTTP POST inputs](/docs/img/integrations/common-patterns/loop-over-files/http-post-inputs.png) ##### Move the file to a processed directory[​](#move-the-file-to-a-processed-directory "Direct link to Move the file to a processed directory") Finally, we'll move the file that we downloaded out of the way, from `unprocessed/` to `processed/`. First, we need to replace the word `unprocessed` with `processed` in the file's path. We'll use the [Text Manipulation](https://prismatic.io/docs/components/text-manipulation.md) component's **Find & Replace** action for that, once again referencing the loop's `currentItem`: ![Find-and-replace inputs](/docs/img/integrations/common-patterns/loop-over-files/find-and-replace-inputs.png) We'll add a Google Cloud Storage **Move File** action to move our file from one directory to another: ![Move File inputs](/docs/img/integrations/common-patterns/loop-over-files/move-file-inputs.png) #### Conclusion[​](#conclusion "Direct link to Conclusion") That's it! At this point we have an integration that loops over files in a directory, processes them, and sends the data to an HTTP endpoint. This integration can be published, and [instances](https://prismatic.io/docs/instances.md) of this integration can be configured and deployed to customers. --- ##### Loop Over a Paginated API When the number of records that an API stores is large, it's not economical to return all possible records at once. Instead, many APIs implement **pagination**. This means that the API returns a small number of records at a time. As the consumer of the API, you can "page" through the records, requesting more records by asking for the next "page". In an integration it's often helpful to be able to loop over a paginated API. In this tutorial, we'll examine how to use the [loop component](https://prismatic.io/docs/components/loop.md) to loop over a paginated API. #### JSON Placeholder API[​](#json-placeholder-api "Direct link to JSON Placeholder API") For illustration purposes, we'll use Typicode's [JSON Placeholder](https://jsonplaceholder.typicode.com/). You can request all 100 "posts" in JSON Placeholder by making a request to `https://jsonplaceholder.typicode.com/posts`, and you can request fewer posts by making the same request with a `_limit` parameter. You can also choose an offset by passing in a `_start` parameter. For example, if you want just 10 posts, but you want to start at the 25th post, you can make a request to `/posts?_limit=10&_start=25`. In this exercise we're going to page through all 100 posts, 25 at a time, so we'll be making a request to `/posts?_limit=25&_start=0`, then `/posts?_limit=25&_start=25`, `/posts?_limit=25&_start=50`, etc. until there are no more posts left to process. **This works for any number of records**: With paginated APIs you often don't know how many total results exist. We happen to know that we're going to fetch 100 total "posts", but the looping strategy we cover here will accommodate any unknown number of posts. We'll simply loop until there are no more records left to loop over. #### Our paginated integration[​](#our-paginated-integration "Direct link to Our paginated integration") Our integration will follow this flow to process all posts in the paginated API: * Start a loop * Fetch a page of up to 25 posts * Are we out of posts to process? * If so, break out of the loop * If not, loop over each post * Do something with the post (we'll just log out the post's title here) * Is this the last post?
* If so, make a note of what `_start` we should use for the next loop iteration * Go back to the start of the loop Let's get building! ##### Our main loop[​](#our-main-loop "Direct link to Our main loop") First, we'll create a [Loop N Times](https://prismatic.io/docs/components/loop.md#loop-n-times) loop step to loop over pages. Under "Number of Iterations" we'll put in some high number, like 20, even though we'll break out of the loop before 20 iterations. Having a maximum number of iterations in a loop helps guard against an infinite loop - we don't want to introduce unintended load on the vendor's API from an infinite loop. If you know that you're going to loop over thousands of pages, you can choose a higher number of maximum iterations. ##### Fetching a page of data[​](#fetching-a-page-of-data "Direct link to Fetching a page of data") Next, we'll add two steps: 1. A [Persist Data - Get Execution Value](https://prismatic.io/docs/components/persist-data.md#execution---get-value) step will help to track the `_start` value in our API request. This action references a variable that is scoped to the current execution. That variable's value will be set by another step in the loop later. We'll provide a variable name, `Latest Post ID`, and default value `0` (since it hasn't been set yet): ![Persist Data - Get Execution Value in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/get-execution-value.png) 2. Next, we'll use the value from the previous step to make a call to JSON Placeholder. We'll add an [HTTP - GET Request](https://prismatic.io/docs/components/http.md#get-request) step and fetch `https://jsonplaceholder.typicode.com/posts` with a `_limit` search parameter of `25` and a `_start` search parameter of the value we retrieved: ![HTTP Get Request in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/get-posts.png) ##### Are we done?[​](#are-we-done "Direct link to Are we done?") Next, we'll determine if we're done fetching posts. We'll do this by adding a [Branch on Expression](https://prismatic.io/docs/components/branch.md#branch-on-expression) step, and we'll check to see if the "Get Posts" step we invoked returned an empty array: ![Branch on Expression to Check Empty in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/branch-on-empty.png) If the array returned was empty, we know there are no more results to page through. In that case we'll add a [Break Loop](https://prismatic.io/docs/components/loop.md#break-loop) step to exit our main loop: ![Break Loop in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/break-loop.png) ##### Process posts[​](#process-posts "Direct link to Process posts") Assuming there are posts to process, we'll create an interior loop to loop over each post. 1. Add a [Loop Over Items](https://prismatic.io/docs/components/loop.md#loop-over-items) action that takes the results from the "Get Posts" step to loop over the results: ![Loop Over Items in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/loop-over-posts.png) 2. We'll "process" each post. For illustration purposes we'll just log out the post's **id** and **title**. 
We can access each post's title by referencing the interior loop's `currentItem` property: ![Log Write Message in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/log-out-post.png) ##### Store the last item's ID[​](#store-the-last-items-id "Direct link to Store the last item's ID") Finally, we'll determine the ID of the last post in the page we loaded, so we can adjust our `_start` parameter for the next loop. 1. We'll use a [Branch on Expression](https://prismatic.io/docs/components/branch.md#branch-on-expression) action to determine if the `currentItem` has `isLast=true` (which indicates if we're looping over the last post): ![Branch on Expression to check for last post in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/is-last-post.png) 2. If we are on the last post, we'll use a [Persist Data - Save Execution Value](https://prismatic.io/docs/components/persist-data.md#execution---save-value) to save the ID of the last post. We'll use the same variable name - `Latest Post ID` - that we used before, and we'll save out the loop's `currentItem.id`. The "Get Execution Value" step we added at the beginning of the integration will pick up this value when the loop runs again: ![Persist Data - Save Execution Value in Prismatic integration designer](/docs/img/integrations/common-patterns/loop-over-paginated-api/save-execution-value.png) If we run our integration and look at logs, we can see that IDs and titles of the posts we loaded were logged out. After every 25 posts (after post ID 25, 50, etc.) we can also see that our main loop ran again and an additional page of posts was loaded. We'll see the logs from all loops in the test runner drawer: ![Logs are displayed for all posts that were fetched](/docs/img/integrations/common-patterns/loop-over-paginated-api/logs-are-displayed.png) #### Pagination implementations[​](#pagination-implementations "Direct link to Pagination implementations") Different APIs implement their pagination differently. Some page payloads contain a value indicating if you're on the last page or not. Others contain a value to let you know what value to ask for with your next API call. Your pagination loop implementation may look slightly different than this one, but hopefully this provided you with a general idea of how to implement pagination in an integration. --- ##### Processing Data in Parallel When you have a large set of records to process, you may want to process data in parallel to accelerate computation. This tutorial demonstrates how to process data in parallel by splitting the data into manageable chunks, and simultaneously processing each chunk. For this example, we'll fetch a "large" dataset from the internet - here we'll pull down 500 "comment" records from the JSONPlaceholder API: [Example Integration](https://github.com/prismatic-io/examples/blob/main/integrations/split-payload-example.yml) #### Split the data into manageable chunks[​](#split-the-data-into-manageable-chunks "Direct link to Split the data into manageable chunks") Once we've fetched our data, we can use the [Collection Tools](https://prismatic.io/docs/components/collection-tools.md#chunks) component's **Chunks** action to split the data into manageable chunks. Here, we split our 500 records into 10 groups of 50 records each. 
![Configuring the Chunks action to split the data into 10 groups of 50 records each.](/docs/img/integrations/common-patterns/processing-data-in-parallel/configure-chunks.png) If your data is not evenly divisible by the number of elements you specify, the last chunk will contain the remaining elements. For example, if you have 108 records, and split them into chunks of 25, you'll get 4 chunks of 25 records, and 1 chunk of 8 records. If we open our chunks action's results, we can see 10 groups of 50 records each. ![The results of the Chunks action, showing 10 groups of 50 records each.](/docs/img/integrations/common-patterns/processing-data-in-parallel/chunks-results.png) #### Loop over each chunk[​](#loop-over-each-chunk "Direct link to Loop over each chunk") Next, add a [Loop Over Items](https://prismatic.io/docs/components/loop.md#loop-over-items) action to your integration and configure it to loop over the chunks you generated. ![Configuring the Loop Over Items action to loop over the chunks generated in the previous step.](/docs/img/integrations/common-patterns/processing-data-in-parallel/loop-over-chunks.png) #### Send each chunk to a sibling flow[​](#send-each-chunk-to-a-sibling-flow "Direct link to Send each chunk to a sibling flow") We need to send each chunk of records to a sibling flow. To accomplish that, we'll use [cross-flow](https://prismatic.io/docs/integrations/triggers/cross-flow.md) invocations. We'll add an [Invoke Flow](https://prismatic.io/docs/components/cross-flow.md#invoke-flow) step to our integration, and select a sibling flow to send our chunk to. So, first add a sibling flow that has a [Cross-Flow Trigger](https://prismatic.io/docs/components/cross-flow.md#cross-flow-trigger). Then, select the sibling flow for your **Invoke Flow** step's **Flow Name** input. Reference the **Loop Over Item**'s `currentItem` property for the **Data** input of the **Invoke Flow** action - that'll represent the current chunk of records and will configure the step to send the chunk of records to the sibling flow. ![Invoke a sibling flow by sending the current chunk of records to the sibling flow's webhook URL.](/docs/img/integrations/common-patterns/processing-data-in-parallel/configure-invoke-flow-action.png) If we open the **Process Records** flow after running a test of our parent flow, we can see that **Process Records** was invoked ten times, and each invocation received a chunk of 50 records. ![The Process Records flow was invoked 10 times, and each invocation received a chunk of 50 records.](/docs/img/integrations/common-patterns/processing-data-in-parallel/ten-invocations.png) What makes this parallel? By default, Prismatic executions are asynchronous, meaning that our main flow will not wait for an invocation of the **Process Records** flow to complete before beginning the next invocation. This allows us to process multiple chunks of records simultaneously, effectively processing data in parallel. #### Process each chunk in the sibling flow[​](#process-each-chunk-in-the-sibling-flow "Direct link to Process each chunk in the sibling flow") Now that records are being sent to the sibling flow, we can process each chunk of records in parallel. Your integration will require some business logic and will connect to some APIs to fetch or update records. 
For this example, we'll simply capitalize the body of each comment: ![The Process Records flow capitalizes the body of each comment in the chunk it receives.](/docs/img/integrations/common-patterns/processing-data-in-parallel/capitalize-bodies.png) #### (Optional) Aggregate the results[​](#optional-aggregate-the-results "Direct link to (Optional) Aggregate the results") If your integration is unidirectional (i.e. it pulls data from a source and sends the data to a destination), you may not need to aggregate the results. But, if your integration is bidirectional or you need to aggregate the results for any other reason, you can fetch the results of the parallelized invocations using the execution IDs that your **Invoke Flow** steps returned. If you look at the step results of the Loop Over Items action, you'll see an array of execution IDs, one for each **Invoke Flow** call. ![The Loop Over Items action returns an array of execution IDs.](/docs/img/integrations/common-patterns/processing-data-in-parallel/execution-ids.png) We can loop over these execution IDs and fetch the results of each invocation. To accomplish that, we'll: * Loop over the execution IDs * Loop up to 10 times using the **Loop N Times** action * Check if the execution is finished by querying the Prismatic API * If it's finished, fetch the step results of a step in the sibling flow. Break the inner loop. * If it's not finished, sleep for a few seconds and check again ##### Fetch step results from the Prismatic API[​](#fetch-step-results-from-the-prismatic-api "Direct link to Fetch step results from the Prismatic API") We can use the Prismatic component's **Raw GraphQL Request** action to query the Prismatic API for the step results of a step in the sibling flow. ``` query myGetExecutionResults($executionId: ID!, $stepName: String!) { executionResult(id: $executionId) { id endedAt stepResults(displayStepName: $stepName) { nodes { resultsUrl } } } } ``` ![Fetch data from the Prismatic API](/docs/img/integrations/common-patterns/processing-data-in-parallel/prismatic-api.png) If the execution is finished (indicated by whether `endedAt` has a value), we can use the `resultsUrl` that we received from the Prismatic API to fetch the step results of a step in the sibling flow, and then break the loop. If the execution is not finished, we'll sleep and then check again. ![Fetch results from S3](/docs/img/integrations/common-patterns/processing-data-in-parallel/fetch-step-results.png) A few notes: * Step results are stored as binary files in S3, and are compressed using [MessagePack](https://msgpack.org/index.html). Since they are binary files, we need to set the GET Request action's **Response Type** to **Binary**. * We need to decompress the step results using the MessagePack **Decompress** action (a sketch of this fetch-and-decode step appears at the end of this section). * We can either process the step results in the loop, or save them to an array using the Persist Data component's **Execution - Append Value to List** action to aggregate the results into a single array. That's what we do in the example integration here. ##### Process the results[​](#process-the-results "Direct link to Process the results") Finally, we can load the array of step results using an **Execution - Get Value** action. The results will be an array of arrays, so we can use the Collection Tools component's **Flatten** action to flatten them into a single array. When we do that in our example, we get an array of 500 records, each with a capitalized body.
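As noted above, each step result is a MessagePack-encoded binary file. Outside of the designer, fetching and decoding one result might look roughly like the sketch below; this assumes the `@msgpack/msgpack` package, and the shape of the decoded value depends on the step whose results you fetch.

```
import { decode } from "@msgpack/msgpack";

// Sketch only: download a step result from its resultsUrl and decode it.
// Step results are binary MessagePack, so read the response as raw bytes
// before decoding.
export const fetchStepResult = async (resultsUrl: string): Promise<unknown> => {
  const response = await fetch(resultsUrl);
  const buffer = await response.arrayBuffer();
  return decode(new Uint8Array(buffer));
};
```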
![Flatten the array of step results into a single array.](/docs/img/integrations/common-patterns/processing-data-in-parallel/flatten-results.png) We can then proceed to perform work on each record in the array of results. #### Limitations and considerations[​](#limitations-and-considerations "Direct link to Limitations and considerations") There are several limitations and considerations to keep in mind when processing data in parallel: **Simultaneous Execution Limit**. The number of concurrent executions your organization can run is determined by your pricing plan. If you try to run more than that many executions at once, you may receive a `429 Too Many Requests` error, and will need to handle that in your integration. **Execution Time Limit**. An execution can run for up to 15 minutes. If your execution takes longer than 15 minutes, it will be terminated. When sizing chunks of records, consider how many can be processed within 15 minutes. **Payload size limit**. A webhook request can be up to approximately 6MB in size. If a chunk of records exceeds 6MB, the chunks will need to be smaller. **Rate limits**. The APIs you integrate with may have rate limits, and parallelizing requests may exceed those limits. Be sure to check the rate limits of the APIs you integrate with. If you run into rate limiting constraints, consider running your flows in sequence with a [recursive trigger](https://prismatic.io/docs/integrations/common-patterns/processing-data-recursive-flows.md). For information on Prismatic integration limits, see [Integration Limits](https://prismatic.io/docs/integrations/integration-runner-environment-limits.md). --- ##### Processing Data with Recursive Flows Suppose your customers have a large number of records in a CRM that you need to sync to your app when an integration is deployed. Processing large amounts of data can take considerable time. For example, if your customers have around 100,000 records, and you know from testing that you can fetch and process about 10 records per second, then doing some back-of-the-envelope math you'll find it'll take almost 3 hours to do an initial sync of those records. But, an execution can run for a maximum of [15 minutes](https://prismatic.io/docs/integrations/integration-runner-environment-limits.md#execution-time-limitations). To process 3 hours of data, you'll need to split the work across at least 12 executions. You can accomplish this in one of two ways: 1. You can parallelize the work, running 12 executions concurrently. [Processing Data in Parallel](https://prismatic.io/docs/integrations/common-patterns/processing-data-in-parallel.md) documents how to do that. Note that with this strategy you may run into execution concurrency or third-party API rate limits if you run too many executions in parallel. 2. Run several executions in series, allowing one execution to process a chunk of data and then call itself with a cursor noting where it left off. This document details how to run a flow recursively to process large amounts of data. #### Recursive flows[​](#recursive-flows "Direct link to Recursive flows") A recursive flow is a flow that calls itself. In Prismatic, they're useful when processing large datasets that take longer than 15 minutes to process. One execution processes a number of records you know can be processed within the time constraints, and then it calls itself with a cursor indicating where it left off. The next execution continues the work. 
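In TypeScript-flavored pseudocode, the overall shape looks something like this (the helpers here are illustrative stubs standing in for steps like **HTTP - GET Request** and **Invoke Recursive Trigger**, not real Prismatic components):

```
// Sketch of the general shape of one execution of a recursive flow.
type Post = { id: number; title: string };

// Stand-ins for the real steps; stubbed so the sketch is self-contained.
const fetchPage = async (page: number): Promise<Post[]> =>
  page <= 100 ? [{ id: page, title: `Post ${page}` }] : [];
const processRecords = async (records: Post[]) =>
  records.forEach((r) => console.log(r.id, r.title));
const invokeRecursiveTrigger = async (payload: { cursor: number }) =>
  console.log("Next execution starts at cursor", payload.cursor);

export const runOneExecution = async (cursor: number): Promise<void> => {
  const pagesPerExecution = 20;
  for (let page = cursor; page < cursor + pagesPerExecution; page++) {
    const records = await fetchPage(page);
    if (records.length === 0) {
      return; // No more data: don't re-invoke the trigger, so the recursion stops.
    }
    await processRecords(records);
  }
  // More pages may remain; hand off to a new execution with an updated cursor.
  await invokeRecursiveTrigger({ cursor: cursor + pagesPerExecution });
};
```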
For example, if the API you're integrating with is paginated (i.e. you fetch page 1 of records, then page 2, and so on), your first execution might loop 20 times, fetching pages 1-20, and then call itself with a cursor of `21`, indicating that the next execution should process pages 21-40. #### The recursive flow component[​](#the-recursive-flow-component "Direct link to The recursive flow component") The [recursive flow](https://prismatic.io/docs/components/recursive-flow.md) component can be used to build recursive flows. It contains a trigger and three actions: * The [Recursive Trigger](https://prismatic.io/docs/components/recursive-flow.md#recursive-trigger) takes a JSON payload in the shape of `{"cursor": "some-cursor"}`. If a cursor is passed to it, it saves that cursor to [execution state](https://prismatic.io/docs/integrations/persist-data.md) for use by the flow. If no cursor is present, it falls back to a default value that you provide as an input. * The [Invoke Recursive Trigger](https://prismatic.io/docs/components/recursive-flow.md#invoke-recursive-trigger) action reads the current cursor from execution state and calls its own trigger to start a new execution. * The [Get Recursive Cursor](https://prismatic.io/docs/components/recursive-flow.md#get-recursive-cursor) action reads the current cursor from execution state and returns it. This is handy to use at the top of a loop to fetch the current cursor value. * The [Set Recursive Cursor](https://prismatic.io/docs/components/recursive-flow.md#set-recursive-cursor) action saves a given value to the cursor in execution state, which is then read by a Get Recursive Cursor or Invoke Recursive Trigger action. ##### Running an initial data import[​](#running-an-initial-data-import "Direct link to Running an initial data import") If you would like your recursive flow to run when a customer deploys an instance of your integration, toggle the trigger's **Run on Deploy?** setting to `true`. Supply a reasonable **Default Cursor Value** to be used in the first execution of the flow. For example, you could enter `1970-01-01 00:00:00` if your cursor is an "Updated At" timestamp, or `0` if your cursor is a paginated API page value. Deploy flows run each time an instance is deployed Note that a deploy flow runs each time an instance is deployed. So, if a customer deploys an instance, then reconfigures and re-deploys it, the recursive trigger will fire a second time. Ensure that your flows are built in an [idempotent](https://en.wikipedia.org/wiki/Idempotence) way. You could, for example, set a [flow state](https://prismatic.io/docs/integrations/persist-data.md) persisted value when the initial import completes, and short-circuit your flow if an initial import has previously completed. ##### Calling the recursive trigger yourself[​](#calling-the-recursive-trigger-yourself "Direct link to Calling the recursive trigger yourself") If you would like to call a recursive trigger yourself to begin a series of executions, you can invoke your flow's webhook URL (as you would a standard [webhook trigger](https://prismatic.io/docs/integrations/triggers/webhook.md)). To override the default cursor value and supply your own, `POST` a request in the format `{"cursor": "some-cursor"}`.
For example, ``` curl 'https://hooks.prismatic.io/trigger/SW5zexample==' \ --location \ --header "Content-Type: application/json" \ --data '{"cursor":"2000-01-01 00:00:00"}' ``` This is helpful if the initial recursive cursor you want to use is a dynamic value - you can have one flow compute the value (e.g. "Datetime 3 years ago"), and call your recursive flow with that value. #### Stopping a recursive flow[​](#stopping-a-recursive-flow "Direct link to Stopping a recursive flow") When a flow calls itself, it's easy to accidentally create an infinite loop. Ensure that you have logic within your flow that leads away from an **Invoke Recursive Trigger** action when data processing completes. If you do run into an infinite loop: * If you're in the integration designer, simply add a [Stop Execution](https://prismatic.io/docs/components/stop-execution.md) step to the top of your flow and hit 'save' to cause the next execution to stop before it calls itself again. * If you have an instance deployed to a customer that is in an infinite loop, [disable](https://prismatic.io/docs/instances/managing.md#enabling-and-disabling-instances) the instance for a short time. The next time the instance's flow attempts to call itself, it won't be able to. #### Example recursive flows[​](#example-recursive-flows "Direct link to Example recursive flows") This integration in our GitHub examples repository contains four flows that illustrate how to loop over various paginated APIs: [Example Integration](https://github.com/prismatic-io/examples/blob/main/integrations/recursive-flow-examples.yml) You can [import](https://prismatic.io/docs/integrations/low-code-integration-designer.md#yaml-definition) the integration into your own tenant for testing. ##### JSON Placeholder example recursive flow[​](#json-placeholder-example-recursive-flow "Direct link to JSON Placeholder example recursive flow") The **JSON Placeholder Example** flow loops over JSON Placeholder's [comments API](https://jsonplaceholder.typicode.com/comments), retrieving 5 pages of 15 records each execution. JSON Placeholder uses page numbers as pagination tokens (i.e. page 1, page 2, etc.), so this flow processes pages 1 through 5 during its first execution, then 6-10, 11-15, etc. No authentication is required for this flow. ##### PostgreSQL example recursive flow[​](#postgresql-example-recursive-flow "Direct link to PostgreSQL example recursive flow") The **PostgreSQL Example** flow loops over records in a PostgreSQL table. It uses a `createdat` column on the table as a cursor. It starts by querying all records created after UNIX epoch (1970-01-01), takes note of the last record's `createdat` value as a cursor, and then the next loop or execution fetches records with a `createdat` value larger than the cursor it stored. To test this flow locally, you'll need to spin up a PostgreSQL database that is publicly accessible (or accessible with an [on-prem agent](https://prismatic.io/docs/integrations/connections/on-prem-agent.md)). Then, you'll need to create a table with records that have a `createdat` timestamp. ##### Salesforce example recursive flow[​](#salesforce-example-recursive-flow "Direct link to Salesforce example recursive flow") The **Salesforce Example** flow loops over `Contact` records in SFDC. It uses the [SOQL](https://developer.salesforce.com/docs/atlas.en-us.soql_sosl.meta/soql_sosl/sforce_api_calls_soql.htm) query language to order and fetch records. 
![SOQL recursive example](/docs/img/integrations/common-patterns/processing-data-recursive-flows/soql-example.png) This flow uses SOQL's `OFFSET` property to fetch pages. First it fetches records 1-5, then it fetches 5 more records with `OFFSET 5` (records 6-10), and so on. To test this flow, you'll need a Salesforce account with basic auth, or you will need to update the connection to use OAuth 2.0. ##### Prismatic example recursive flow[​](#prismatic-example-recursive-flow "Direct link to Prismatic example recursive flow") The **Prismatic API Example** flow dog-foods Prismatic's API and loops over components (connectors) in Prismatic, fetching 5 pages of 10 connectors each execution. Prismatic's API returns a cursor that can be used in a subsequent query to fetch additional records. To test this flow, run `prism me:token --type refresh` to generate a refresh token for the Prismatic connection to use. #### Tracking recursive invocation lineage[​](#tracking-recursive-invocation-lineage "Direct link to Tracking recursive invocation lineage") When a flow invokes another flow, either with the [recursive flow](https://prismatic.io/docs/components/recursive-flow.md) or [cross-flow](https://prismatic.io/docs/components/cross-flow.md) components, that call lineage is tracked. This way, you can see that execution A called execution B, which called execution C, etc. To view lineage in the low-code designer, select **View linked executions** from the first execution that ran. There, you will see which execution invoked which subsequent execution, as well as what cursor was sent to each trigger. ![Execution lineage for recursive flows](/docs/img/integrations/common-patterns/processing-data-recursive-flows/lineage.png) #### Recursive flows in code-native[​](#recursive-flows-in-code-native "Direct link to Recursive flows in code-native") If you're building a recursive flow in a [code-native integration](https://prismatic.io/docs/integrations/code-native.md), you can accomplish the same pattern using `context.invokeFlow` with the current flow's name and a cursor value. Your flow's `onExecution` function can reference the cursor from the trigger's `results.body.data.cursor` property (if it exists), or default to some value. Here, we use numeric pagination, defaulting to 0 and incrementing the cursor by 1 each time. You can adapt this pattern to whatever cursor type you're using. Recursive flow in code-native

```
import { flow } from "@prismatic-io/spectral";

export const exampleRecursiveFlow = flow({
  name: "Example Recursive Flow",
  stableKey: "example-recursive-flow",
  description: "Example of a recursive flow",
  onExecution: async (context, stepResults) => {
    // Get cursor from trigger payload (or default to 0)
    let cursor =
      (stepResults.onTrigger.results.body.data as Record<string, number> | undefined)
        ?.cursor || 0;
    context.logger.info(`Cursor value is ${cursor}`);

    /* Do work here */

    // Increment the cursor and invoke the current flow with the updated cursor
    await context.invokeFlow(context.flow.name, { cursor: ++cursor });
    return { data: null };
  },
});
```

--- #### Config Wizard ##### Config Wizard Overview The integrations you build are meant to be reusable and deployable across your heterogeneous customer base. To achieve this, you need to provide customers with a mechanism to configure integrations for their specific environments. When customers configure and activate an instance of your integration, they follow a configuration wizard where they authenticate with third-party apps and provide any additional information required for the integration to function.
Some integrations have simple configuration wizards that only require third-party authentication, while others may involve multiple configuration pages with dynamically generated dropdown menus, toggles, text fields, and other input types. --- ##### Config Pages #### Config pages overview[​](#config-pages-overview "Direct link to Config pages overview") Customers enable and configure an instance of an integration through a **Configuration Wizard**. They work through **Configuration Pages**, authenticating with third-party apps and setting config variables. **Configuration pages** can contain config variables of various types and helper text and images to guide the user on where to look. If your integration requires manual configuration of webhooks, the config wizard can also display the instance's webhook endpoints and API keys (see [Endpoint API keys in the config wizard](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#endpoint-api-keys-in-the-config-wizard)). * Low-Code * Code-Native If you're building your integration with low-code, you can use the Config Wizard Designer within the integration designer to create your customers' configuration experience. ![Screenshot of the configuration wizard designer](/docs/img/integrations/config-wizard/config-pages/configuration-wizard-designer.png) You can add a configuration page by clicking **+ Config Page**, and you can rename a config page or add a short description to the page by clicking the pencil icon beside the page. Just like low-code integrations, code-native integrations include a [config wizard](https://prismatic.io/docs/integrations/config-wizard.md). The config wizard can include things like OAuth 2.0 connections, API key connections, dynamically-sourced UI elements (data sources), and other advanced configuration wizard steps. A config wizard consists of multiple pages. Each page has a title, which is derived from the `key` of the configPage object, and a `tagline` as well as a set of `elements` (individual config variables). For example, a config wizard might contain a page for a Slack OAuth 2.0 connection, a page where the user selects a channel from a dynamically-populated dropdown menu, and a page where a user enters two static string inputs: Example config pages definition ``` import { configPage, configVar } from "@prismatic-io/spectral"; import { slackConnectionConfigVar } from "./connections"; import { slackSelectChannelDataSource } from "./dataSources"; export const configPages = { Connections: configPage({ tagline: "Authenticate with Slack", elements: { "Slack OAuth Connection": slackConnectionConfigVar, }, }), "Slack Config": configPage({ tagline: "Select a Slack channel from a dropdown menu", elements: { "Select Slack Channel": slackSelectChannelDataSource, }, }), "Other Config": configPage({ elements: { "Acme API Endpoint": configVar({ stableKey: "acme-api-endpoint", dataType: "string", description: "The endpoint to fetch TODO items from Acme", defaultValue: "https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo", }), "Webhook Config Endpoint": configVar({ stableKey: "webhook-config-endpoint", dataType: "string", description: "The endpoint to call when deploying or deleting an instance", }), }, }), }; ``` For full documentation, see the [Build Code-Native](https://prismatic.io/docs/integrations/code-native/config-wizard.md) article. 
#### Displaying additional helper text in the configuration wizard[​](#displaying-additional-helper-text-in-the-configuration-wizard "Direct link to Displaying additional helper text in the configuration wizard") * Low-Code * Code-Native To add **helper text**, including headings (H1 - H6) or paragraphs, click the **+ Text/Image** button and select the type of text you'd like to add. To add an **image**, your image will need to be publicly accessible online. Enter the public URL of the image you'd like shown on your config page. For further customization, you can choose to add **Raw HTML** to your config page. In addition to config variables, you can add helpful text and images to guide your customers as they work through your config wizard. To add HTML to the config wizard (which can include links, images, etc), include a string `element` to a `configPage` definition: Include helper text in the config wizard ``` export const configPages = { Connections: configPage({ elements: { helpertext1: "
<h2>Asana Instructions</h2>
", helpertext2: "To generate an Asana API Key, visit the " + 'developer portal ' + 'and select "Create new token".', "Asana API Key": connectionConfigVar({ stableKey: "f0eab60f-545b-4b46-bebf-04d3aca6b63c", dataType: "connection", inputs: { // ... }, }), }, }), }; ``` ![A page in the config wizard with additional helper text](/docs/img/integrations/config-wizard/config-pages/helper-text.png) #### Displaying webhook information in the configuration wizard[​](#displaying-webhook-information-in-the-configuration-wizard "Direct link to Displaying webhook information in the configuration wizard") Your instance's webhook endpoints and API keys can be displayed in the configuration wizard. Click **+ Text/Image** and then select **Trigger Details** as the **Element Type**. You can opt to show all flows' URLs, or the URL for a specific flow. ![Add trigger details to the configuration wizard](/docs/img/integrations/config-wizard/config-pages/add-trigger-details.png) When your customers deploy an instance of your integration, they'll see the webhook information on the configuration page. This is helpful if they need to manually configure webhooks in a third-party app. ![Display trigger details in the configuration wizard](/docs/img/integrations/config-wizard/config-pages/display-trigger-details.png) --- ##### Config Variables You can define names, descriptions, variable types, and optional default values of config variables for your configuration wizard from the config wizard designer in the low-code builder, and you'll reference the values that customers set for each in your integration. ![Config variables drawer in Prismatic application](/docs/img/integrations/config-wizard/config-variables/integration-config-vars.png) When it comes time for your customer-facing teams to deploy your integration, they can enter or select configuration options and tailor the integration for a particular customer without the involvement of integration builders. Config variables that you define in the config variable drawer can be used within your integration as inputs to steps, or through the [Branch](https://prismatic.io/docs/components/branch.md) component to drive branching logic. Use only letters, numbers and spaces as config variable names The config variable name is used to reference the config variable's value. Please use only letters, numbers and spaces as config variable names. Why? If you have a config variable name like `MyApp.com Connection`, the `.` character can make config variable reference difficult, since `configVars.MyApp.com Connection` is ambiguous - it's unclear if `com Connection` is a property of config variable `MyApp`, or if `MyApp.com Connection` is the full config variable name. Some components may throw an error if they encounter a config variable with a `.` character, throwing error `Cannot read properties of undefined (reading 'key')`. #### Config variable data types[​](#config-variable-data-types "Direct link to Config variable data types") There are several types of configuration variables: * **String** is a standard string of characters * **Date** follows the form `mm/dd/yyyy`, and presents end users a calendar widget to choose a date * **Timestamp** follows the form `mm/dd/yyy, HH:MM [AM/PM]`, and presents end users a calendar and time widget to choose a date and time * **Picklist** allows you to define a series of options that your end user can choose from. Picklists are presented to end users as a dropdown menu of options. A picklist value can be up to 64 characters in length. 
* **Code** lets your end user enter JSON, XML, or other formatted code blocks. This is helpful if customers have unique formats for recurring reports, or other formatted documents that differ from customer to customer. Choose a code language when you create the config variable to enable syntax highlighting. * **Boolean** allows your end user to choose either true or false. * **Number** allows your end user to enter a number (integer or decimal). * **Object Selection** allows your end user to select zero or more objects from a list. This config variable type always sources data from a [data source](https://prismatic.io/docs/integrations/data-sources.md). * **Object Field Map** allows your end user to map a series of fields. This config variable type always sources data from a [data source](https://prismatic.io/docs/integrations/data-sources.md). * **JSON Form** allows you to leverage [JSON Forms](https://jsonforms.io/) to build your users' configuration experience. The code backing JSON Form config variables is developed in [custom components](https://prismatic.io/docs/custom-connectors.md) and returns objects made up of key/value pairs. * **Connection** is made up of multiple fields that determine how a component should connect to an external API. It might include a username, password, API key, endpoint URL, or several other things. Note that connection config variables can only be added to the first config page, as subsequent pages may use the connection to dynamically generate other config variables. Inputs are sent to actions as strings The type of config variable you choose affects the UI that the end user interacts with (they get toggles for booleans, date pickers for timestamps, an editor with syntax highlighting for code, etc). Regardless of which type of config variable you choose, though, all values are presented to actions as strings. If you're [writing a custom component](https://prismatic.io/docs/custom-connectors.md), note that you will need to cast your action's input to the correct format. For example, you can `JSON.parse()` a JSON string, or run `util.types.toNumber()` or `util.types.toBool()` on a number or boolean input. You can use a [`clean`](https://prismatic.io/docs/custom-connectors/actions.md#cleaning-inputs) function to simplify type casting. Once you've added a config variable, you can use it as an input to actions within your integration. ##### List and key/value list config variables[​](#list-and-keyvalue-list-config-variables "Direct link to List and key/value list config variables") In addition to representing a **single** value, some config variable types can represent a **list** of values, or a list of **key/value pairs**. This is helpful when you want your users to be able to enter an unknown number of items as the values of a config variable. For example, you may want users to select one or more values from a **picklist** menu. Config variables with a data type of **string**, **date**, **timestamp**, **picklist**, **code**, or **boolean** can be configured as lists or key/value lists. * Low-Code * Code-Native To create a **list** config variable, create a new config variable and select **LIST** under **Config Var Type**: ![Create list config variable in Prismatic application](/docs/img/integrations/config-wizard/config-variables/list-config-variable.png) When a list config variable is referenced by a step's input, that step's action receives a JavaScript array of values.
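As a concrete illustration of both points (string inputs that need casting, and list inputs that arrive as arrays), here is a hedged sketch of a custom component action; the action and input names are made up for this example.

```
import { action, input, util } from "@prismatic-io/spectral";

// Hypothetical action for illustration: a numeric input is cast with a clean
// function, and a value-list input is normalized to an array of strings.
export const notifyUsers = action({
  display: {
    label: "Notify Users",
    description: "Illustrates casting config variable and input values",
  },
  inputs: {
    maxRetries: input({
      label: "Max Retries",
      type: "string",
      clean: util.types.toNumber, // "3" arrives as a string; cast to 3
    }),
    emails: input({
      label: "Email Addresses",
      type: "string",
      collection: "valuelist",
      // The raw value is a list; normalize it to string[].
      clean: (value) => (Array.isArray(value) ? value.map(String) : []),
    }),
  },
  perform: async (context, params) => {
    // params.maxRetries is a number, params.emails is a string[]
    return {
      data: { retries: params.maxRetries, recipients: params.emails.length },
    };
  },
});
```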
To create a **key/value list** config variable, create a new config variable and select **KEY/VALUE LIST** under **Config Var Type**: ![Create key/value list config variable in Prismatic application](/docs/img/integrations/config-wizard/config-variables/key-value-list-config-variable.png) To create a **list** or **key/value list** config variable in a code-native integration, give your config variable a `collectionType` property of `valuelist` or `keyvaluelist`: Create a valuelist config variable in code-native ``` configVar({ dataType: "string", stableKey: "my-vals", collectionType: "valuelist", description: "Provide a list of vals", }); ``` When a **list** config variable is referenced by a step in the low-code designer or by a flow in a code-native integration, the config variable contains an array of strings like `["First Option", "Third Option", "Second Option"]`. When a **key/value list** config variable is referenced by a low-code step or code-native flow, the config variable contains an array of key/value pairs. ``` [ { key: "some-key", value: "Some value", }, { key: "another-key", value: "Another value", }, ]; ``` #### Config variable visibility[​](#config-variable-visibility "Direct link to Config variable visibility") By default, config variables that you add to your integration's configuration wizard are visible to customers who deploy instances of your integration. But, there are some situations where you may want to hide a config variable from the config wizard. For example: * All instances of your integration might share an API key to a third-party application. You may want to set that API key as a config variable, but not make it accessible or visible to your customer. * Your customer's instance needs an API key to access your application, but you want to set it on their behalf as part of the instance deployment process. In that case, you want the customer user to be able to set it programmatically behind the scenes, but not see its value in the UI. - Low-Code - Code-Native To configure visibility in the low-code designer, open the config wizard designer and select a config variable. Then, select an option from the **Permission and Visibility** dropdown menu. You have three options: * **Customer** is the default value. Customer users can view and edit the config variable, and it will always appear in the config wizard. * **Embedded** makes it so the config variable does not show up in the config wizard, but your application is able to [set it programmatically](https://prismatic.io/docs/embed/marketplace.md#dynamically-setting-config-variables-in-marketplace) through the embedded SDK. This is helpful if you want to set an API key for a user during the configuration process, but not allow the user to see or edit the value that is set. * **Organization** makes it so the config variable is not visible to your customer, and is not able to be set programmatically by your application. Config variables marked **organization** must have a default value, or else your team members will need to set the value on behalf of your customer. ![Set visibility for config variables in Prismatic application](/docs/img/integrations/config-wizard/config-variables/config-var-visibility.png) Additionally, you can toggle the **Visible to Organization** toggle to false to hide the config variable from organization team members who open a customer's instance config wizard screen. 
The config variable is still available programmatically to organization members, but this prevents a sensitive config variable from being displayed unintentionally on an organization team member's screen. If you are building a code-native integration, each config variable can have a `permissionAndVisibilityType` property with one of three values: * `customer` is the default value. Customer users can view and edit the config variable, and it will always appear in the config wizard. * `embedded` makes it so the config variable does not show up in the config wizard, but your application is able to [set it programmatically](https://prismatic.io/docs/embed/marketplace.md#dynamically-setting-config-variables-in-marketplace) through the embedded SDK. This is helpful if you want to set an API key for a user during the configuration process, but not allow the user to see or edit the value that is set. * `organization` makes it so the config variable is not visible to your customer, and is not able to be set programmatically by your application. Config variables marked **organization** must have a default value, or else your team members will need to set the value on behalf of your customer. Additionally, `visibleToOrgDeployer` determines if an organization user will see this config variable in the config wizard UI. While organization team members always have programmatic access to instances' config variables and their values, this helps to visually conceal some config variable values, like generated metadata from data sources. Defaults to `true`. A debug config variable that is only visible to org team members

```
configVar({
  stableKey: "debug",
  dataType: "boolean",
  description: "Enable debug logging",
  defaultValue: "false",
  permissionAndVisibilityType: "organization",
  visibleToOrgDeployer: true,
});
```

#### Connection config variables[​](#connection-config-variables "Direct link to Connection config variables") Connections are a special type of config variable that contain the information necessary to connect to a third-party application. A connection might include a simple username and password pair, or might declare all the fields required for OAuth 2.0 (like auth URL, client ID, etc.). If several of your integrations use the same connection (for example, an API key for your application), you can create an [organization-activated connection](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/org-activated-customer.md). This allows you as an organization to create connections for each of your customers (e.g. customer-specific API keys for your application) once, and all of your customers' instances can use their customer-specific connections. To read more about OAuth 2.0 connections, see the [What is OAuth 2.0?](https://prismatic.io/docs/integrations/connections/oauth2.md) article. ##### Write-only connection inputs[​](#write-only-connection-inputs "Direct link to Write-only connection inputs") In some situations, it can be helpful to make a connection input **write-only** (e.g. a user can write a value, but not read it). For example: 1. Bob and Sue may be two customer users within the same customer. Sue is an administrator for a third-party app, and Bob is not, but Bob knows more about your integration's configuration. It can be helpful to have Sue enter her credentials into their instance, but have Bob take care of configuring or reconfiguring the instance.
By making the connection inputs write-only, Bob will not be able to see Sue's credentials, but can view and change other config variables on the instance. 2. Your support team may want to view the configuration of a customer's instance. But, you don't want your support team to accidentally view your customer's API keys. By setting a connection's inputs to write-only, your customer can configure their instance and your team can observe the rest of the instance's config variables (but not the write-only values). To configure a connection's input to be write-only, open a connection config variable within the config wizard designer and click the gear icon next to an input. Toggle **Write Only**. ![Enable write-only on an input value](/docs/img/integrations/config-wizard/config-variables/enable-write-only.png) Note that once you set a connection input to **Write Only** and save your integration, you will be unable to disable the write-only setting. When a customer first deploys an instance of your integration, they will see inputs like they normally would, but with text indicating that the values are write-only. ![Customer enters their write-only credentials the first time](/docs/img/integrations/config-wizard/config-variables/write-only-first-time.png) If the customer reconfigures the instance, the sensitive values are not accessible via the API and masked placeholders are presented instead. A customer user can choose to overwrite the write-only values with new values, but cannot view the existing values. ![Customer enters their write-only credentials on subsequent times](/docs/img/integrations/config-wizard/config-variables/write-only-subsequent-times.png) --- ##### User Level Configuration Typically, one instance of an integration is configured and deployed to one customer. That "one instance for one customer" setup works well for many integrations, but what do you do when multiple users of a single customer each have their own third-party credentials or configuration requirements? **User Level Configuration** (ULC) allows your customers' users to each configure user-specific settings on an instance of an integration. One instance is deployed to a customer, but it contains user configuration information for one or more users within that customer. The instance then runs using the appropriate user's configuration depending on rules you define. **Example use case**: Suppose your customers use Dropbox for file storage. Your customer would like to sync data from your app with some subset of their users' Dropbox accounts. Using ULC, your customer would deploy an instance of your Dropbox integration using the standard deployment process. Then, the users who would like to use the integration within that customer would each go through a ULC configuration, each supplying connection information for their Dropbox account. ULC is an opt-in feature ULC is enabled on an as-needed basis. Please contact [support](mailto:support@prismatic.io) to discuss enabling ULC for your organization. #### User level configuration wizard[​](#user-level-configuration-wizard "Direct link to User level configuration wizard") * Low-Code * Code-Native When ULC is enabled in your account, you will see a **User Level Configuration Wizard** button next to the standard **Configuration Wizard** button in the integration designer. 
![ULC button in the integration designer](/docs/img/integrations/config-wizard/user-level-configuration/ulc-button.png) The ULC configuration wizard designer is very similar to the [configuration wizard designer](https://prismatic.io/docs/integrations/config-wizard/config-pages.md) - you can add connections and other types of config variables like you would in the normal configuration wizard. * The configuration that you define in the **Configuration Wizard Designer** will be seen by an admin user of your customer. That user will initially create the instance of the integration for the customer and fill in any customer-wide configuration. * The configuration that you define in the **User Level Configuration Wizard Designer** will be seen by standard users of your customer. It will prompt individual users for user-specific credentials and config variables. To add a ULC config wizard to a code-native integration, create a `userLevelConfigPages` object within `configPages.ts` that has the same shape as `configPages`: User-level config wizard ``` export const userLevelConfigPages = { Options: configPage({ elements: { "My ULC Config Variables": configVar({ dataType: "string", stableKey: "my-ulc-config-var", description: "Enter a widget value", }), }, }), }; ``` Then, in `index.ts` import the `userLevelConfigPages` object. Provide the object as an export of your project (so TypeScript can infer types via `.spectral/index.ts`), and include it in your `integration()` definition: Including user-level config in your component ``` import { integration } from "@prismatic-io/spectral"; import flows from "./flows"; import { configPages, userLevelConfigPages } from "./configPages"; import { componentRegistry } from "./componentRegistry"; export { configPages, userLevelConfigPages } from "./configPages"; export { componentRegistry } from "./componentRegistry"; export default integration({ name: "ulc-example", description: "My user-level config example integration", iconPath: "icon.png", flows, configPages, userLevelConfigPages, componentRegistry, }); ``` #### Testing user level configuration in the integration designer[​](#testing-user-level-configuration-in-the-integration-designer "Direct link to Testing user level configuration in the integration designer") To test ULC in the integration designer, open up the **Test Runner** drawer and complete both the standard config wizard as well as user-specific config wizard. ![ULC button in the integration designer](/docs/img/integrations/config-wizard/user-level-configuration/test-runner-drawer.png) Tests you run will load the test user level configuration that you set. #### User level configuration in embedded marketplace[​](#user-level-configuration-in-embedded-marketplace "Direct link to User level configuration in embedded marketplace") When you [authenticate users in the embedded marketplace](https://prismatic.io/docs/embed/authenticate-users.md) you will need to include a `role` property in your signed JWT that has a value of either `"admin"` or `"user"`. If a `role` property is omitted, it defaults to `"admin"`. If you plan to use shared endpoints for ULC integrations, ensure your signed JWT includes an `external_id` property. This represents the customer *user's* external ID. This property generally matches `sub`, and is used to invoke ULC instances with shared endpoints. ##### Marketplace admins[​](#marketplace-admins "Direct link to Marketplace admins") A user with `role: "admin"` can deploy an instance of a ULC integration for the customer. 
They can also supply user-specific configuration on their own behalf after creating the instance. An admin user will [configure](https://prismatic.io/docs/instances/deploying.md) a ULC instance like they would a non-ULC instance - by stepping through a configuration wizard to set up customer-wide configuration settings. Once an instance is deployed, a marketplace admin can click **Configure User Level Configuration** to add their own user-specific credentials and config variables to the instance. The admin user is not required to enter those - they only need to set them if they themselves would like to use the integration. ![Configure user level config button in embedded](/docs/img/integrations/config-wizard/user-level-configuration/configure-ulc-button.png) ##### Marketplace users[​](#marketplace-users "Direct link to Marketplace users") A user with a `role: "user"` cannot deploy the instance, but can add user-specific configuration on their own behalf. A standard user will only see ULC integrations on the list view screen within the embedded marketplace. When a standard user selects an integration to configure, they will walk through the [user level configuration wizard](#user-level-configuration-wizard). ![User level config wizard](/docs/img/integrations/config-wizard/user-level-configuration/standard-user-ulc-config-wizard.png) A standard user will *not* be able to see the instance's executions, logs, or other tabs. Clicking an integration again will show the ULC config wizard, where they can update or remove their user-specific configuration. #### Managing an instance's user level config[​](#managing-an-instances-user-level-config "Direct link to Managing an instance's user level config") An organization user can view the various user-specific configurations by opening an instance and then opening the **User Configurations** tab. ![Instance user configurations](/docs/img/integrations/config-wizard/user-level-configuration/instance-user-configurations.png) You can click the **Details** button beside a user to view a user-specific webhook URL that can be used to call the instance using that user's configuration. #### User level configuration and endpoint config[​](#user-level-configuration-and-endpoint-config "Direct link to User level configuration and endpoint config") You have the same options for [endpoint configuration](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md) in ULC integrations that you do for non-ULC integrations, with the caveat that when you use shared endpoints you also need to specify a user by external user ID (like you do for flow name and external customer ID): * When endpoint type is **Instance and Flow Specific**, each flow for each user configured for each instance receives its own unique webhook URL. In other words, Bob's "Import Records" flow webhook URL differs from Jane's "Import Records" flow webhook URL and differs from Bob's "Export Documents" flow webhook URL. User config is loaded based on unique webhook URL when the instance is invoked. * When endpoint type is **Instance Specific**, all flows for all users of a particular customer share a webhook URL. Executions are dispatched to specific flows running with user-specific configurations by sourcing a **Flow Name** and **External Customer User ID**. The external customer user ID should match the `external_id` that you set in your embedded users' JWT tokens, and often matches the JWT's `sub` property. 
![Instance specific endpoint configurations](/docs/img/integrations/config-wizard/user-level-configuration/instance-specific-endpoint-configuration.png) * When endpoint type is **Shared**, all users and their flows of all customers share a single webhook URL. In addition to specifying **Flow Name** and **External Customer User ID**, like in **Instance Specific**, you also need to specify **External Customer ID**. ULC Shared Endpoints require ULC configurations for customer users An organization team member can create a ULC configuration on an instance deployed to a customer. However, such configurations are only meant for testing purposes from within the Prismatic web app. If you try to invoke a shared endpoint for a ULC instance and include an organization user's external ID (rather than a customer user's external ID), the execution will fail as the organization team member is not a customer user. Like non-ULC integrations, ULC integrations with **Instance Specific** and **Shared** endpoint configuration can leverage a [preprocess flow](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#shared-endpoint-without-a-preprocess-flow) to process incoming data and dispatch executions to the proper instance / user / flow. #### User level configuration information in trigger payloads[​](#user-level-configuration-information-in-trigger-payloads "Direct link to User level configuration information in trigger payloads") The default webhook trigger and most other non-custom triggers include information about the user whose user level config was used for an execution. To get ULC information for the current execution within an integration, reference the trigger's `results.user`, which is an object containing the user's `id`, `email` (which may be a UUID or their ID), `name` and `externalId`. --- #### Low-Code Integrations ##### Low-Code Integration Designer New to the low-code designer? Are you new to the low-code designer? Check out our [getting started guide](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) to build your first integration. When building an integration, you can use either the low-code designer or create a TypeScript project in your favorite IDE using the [Code-Native SDK](https://prismatic.io/docs/integrations/code-native.md). This article explains how to build an integration with the low-code designer. After [creating a new integration](#creating-a-new-integration) or selecting one from your list of integrations, you'll find yourself in the low-code **integration designer**. Here, you can build, test, and publish integrations. The integration designer has four main features: 1. Configuration menus that let you edit your integration's name, description, and other metadata, publish versions, add integrations to your marketplace, deploy instances to customers and more. 2. A step configuration drawer that lets you configure integration steps and modify runtime/webhook settings. 3. A testing drawer that lets you [run integration tests](https://prismatic.io/docs/integrations/low-code-integration-designer/testing.md), supply sample payloads, and view test results. 4. The integration editor pane, which occupies most of the page. Here, you can add steps to your integration, create branches and loops, and arrange the flow of your integration. You can also create multiple **flows** - each with its own [trigger](https://prismatic.io/docs/integrations/triggers.md) and series of steps to execute. 
![Prismatic integration designer highlighting configuration drawer, testing drawer, version history drawer, and integration editor pane](/docs/img/integrations/low-code-integration-designer/integration-designer.png) #### Creating a new integration[​](#creating-a-new-integration "Direct link to Creating a new integration") To create a new integration in the web application, click **Integrations** in the left-side menu, then click the **+ Add Integration** button in the upper-right. When creating a new integration, you have several options: * Create a **blank** integration from scratch by selecting **Quickstart**. * Import an integration from a YAML file or clipboard (integrations are saved as [YAML definitions](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md#exporting-an-integrations-yaml-definition) behind the scenes). * Start from an existing [template](#integration-templates) - either one you've created or a Prismatic-built example. ![Configure a new integration](/docs/img/integrations/low-code-integration-designer/configure-new-integration.png) You'll need to provide a **name** for your integration and select a trigger for its first flow. The trigger determines when your integration's flow will run, and you can modify it at any time. ![Add integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/new-integration-name.png) #### Assigning an icon to an integration[​](#assigning-an-icon-to-an-integration "Direct link to Assigning an icon to an integration") To make integrations more visually appealing in the [integration marketplace](https://prismatic.io/docs/embed/marketplace.md), you can assign them icons. To add an icon to an integration in the designer, click the icon space to the left of your integration's name: ![Add icon to integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/add-icon.png) #### Assigning labels to an integration[​](#assigning-labels-to-an-integration "Direct link to Assigning labels to an integration") You can assign multiple labels to an integration through the **Integration details** menu in the designer: ![Assign labels to integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/labels.png) #### Categorizing integrations[​](#categorizing-integrations "Direct link to Categorizing integrations") Integrations can be assigned categories for easy sorting and filtering. For example, you might have several "ERP" integrations and some "Inventory Management" integrations. Categorizing integrations helps your team and customers in the [integration marketplace](https://prismatic.io/docs/embed/marketplace.md) view integrations sorted by category. To set a category for an integration, click the **Integration details** button in the top-left of the integration designer: ![Set category for integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/edit-integration-category.png) #### Publishing an integration[​](#publishing-an-integration "Direct link to Publishing an integration") **Publishing** an integration marks it as ready for customer deployment. To publish an integration, open the **Version history** tab on the left side of the page. If you have unpublished changes, you'll see an **Unpublished Draft** listed among the integration's versions. 
Enter a note describing your changes, then click **Save & Publish** to release a new version: ![Publish integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/version-history.png) Integration versions can be marked **Available** or **Unavailable** using the blue toggles to the right of each version. Marking a version **Unavailable** prevents it from being deployed as an [instance](https://prismatic.io/docs/instances.md) to customers. #### Integration templates[​](#integration-templates "Direct link to Integration templates") Many integrations share similar patterns. For example, if you're importing opportunities and accounts from multiple CRM vendors, the data fetching steps will differ, but the steps that send data to your application will remain consistent. Rather than recreating these common steps repeatedly, you can save time by creating an integration template for your team to use when building similar integrations. To convert your integration into a template, first [publish](#publishing-an-integration) a version. Then, open the **Integration details** modal and select **Available as Template**. ![Create a new template of an integration](/docs/img/integrations/low-code-integration-designer/available-as-template.png) You can make the template available only to organization users, or if you offer the [embedded workflow builder](https://prismatic.io/docs/embed/workflow-builder.md), you can make it available to customer integration builders. #### Integration attachments[​](#integration-attachments "Direct link to Integration attachments") Your team can share integration-related documents by clicking the **Documentation & attachments** button on the left side of the integration designer. ![Add attachments to integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/attachments.png) #### Internal integration documentation[​](#internal-integration-documentation "Direct link to Internal integration documentation") Sharing documentation, specifications, and notes about an integration with team members is valuable. You can add internal (non-customer-facing) notes and documentation by clicking the **Documentation & attachments** button on the left side of the integration designer. This space allows you to share notes, links, and other documentation with your team. ![Add internal documentation to integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/integration-documentation.png) #### Integration metadata[​](#integration-metadata "Direct link to Integration metadata") You can programmatically attach metadata to your integration, which is useful for assigning properties not covered by integration [categories](https://prismatic.io/docs/integrations/low-code-integration-designer.md#categorizing-integrations) or [labels](https://prismatic.io/docs/integrations/low-code-integration-designer.md#assigning-labels-to-an-integration). Metadata must be JSON-formatted and is accessible only through the [Prismatic GraphQL API](https://prismatic.io/docs/api.md). 
To set metadata on an integration, use the [`updateIntegration`](https://prismatic.io/docs/api/schema/mutation/updateIntegration.md) mutation: ``` mutation setIntegrationMetadata { updateIntegration( input: { id: "SW50ZWdyYXRpb246MGVjNDlhZmYtNjE2YS00NmU2LWExMTQtN2RjOThjY2Q1MzU4" metadata: "{\"price\": 100, \"compatible_plans\": [\"professional\", \"enterprise\"]}" } ) { integration { metadata } errors { field messages } } } ``` You can read this metadata as a property of an `integration` record: ``` query getIntegrationMetadata { integration( id: "SW50ZWdyYXRpb246MGVjNDlhZmYtNjE2YS00NmU2LWExMTQtN2RjOThjY2Q1MzU4" ) { name metadata } } ``` Important notes about metadata: * It must be a valid JSON string * Maximum length is 4096 characters * Only accessible via the API, not visible in the UI * Not included in an integration's [YAML definition](https://prismatic.io/docs/integrations/low-code-integration-designer.md#yaml-definition) #### YAML definition[​](#yaml-definition "Direct link to YAML definition") Integrations are represented in YAML behind the scenes. To view the YAML that defines your integration's flows, steps, inputs, connections, and config variables, click the **Integration details** button at the top-left of the integration designer and select **View YAML**. ![YAML for integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/yaml-definition.png) When exporting an integration for import in a different region (e.g., US to EU), make sure to click the **latest component versions** button, as component versions may differ between regions. Track integration changes The YAML shown corresponds to the currently displayed integration version. To view the YAML of a previous version, open the **VERSION HISTORY** drawer and select an older version. To identify differences between versions, compare their YAML definitions using your preferred diff tool (VSCode includes an excellent built-in [diff tool](https://vscode.one/diff-vscode/)). --- ##### Branching The [branch](https://prismatic.io/docs/components/branch.md) component allows you to add branching logic to your integration. Think of **branches** as logical paths that your integration can take. Given some information about config variables or step results, your integration can follow one of many paths. Branch actions are useful when you need to conditionally execute some steps. Here are a couple of examples of things you can accomplish with branching: **Example 1:** The webhook request your integration receives could contain an "Order Created", "Order Updated", or "Order Deleted" payload. You need to branch accordingly. **Example 2:** Your customers want to be alerted when their rocket fuel level is below a certain threshold. You can branch into "send an alert" and "fuel level is okay" branches depending on results of a "check rocket fuel level" step. **Example 3:** You want to [upsert](https://en.wikipedia.org/wiki/Merge_\(SQL\)) data into a system that doesn't support upsert. You can check if a record exists and branch into "add a new record" or "update the existing record" branches depending on whether the record exists. **For More Information**: [The Branch Component](https://prismatic.io/docs/components/branch.md), #### Branching on a value[​](#branching-on-a-value "Direct link to Branching on a value") Adding a [Branch on Value](https://prismatic.io/docs/components/branch.md#branch-on-value) action to your integration allows you to create a set of branches based on the value of some particular variable. 
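For instance, the routing described in **Example 1** could be pictured like this (a hypothetical JavaScript sketch for illustration only - in the integration designer you configure these paths visually rather than writing code):

```
// Hypothetical sketch only - the designer configures this routing for you.
// "payloadType" stands in for whatever value (a header, a field, etc.)
// identifies the incoming payload.
const routeOrderEvent = (payloadType) => {
  switch (payloadType) {
    case "Order Created":
      return "follow the create branch";
    case "Order Updated":
      return "follow the update branch";
    case "Order Deleted":
      return "follow the delete branch";
    default:
      return "follow the Else branch";
  }
};
```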
It's very similar to the switch/case construct present in many programming languages. In **Example 1** above, the webhook request you receive has a header, `payload-type`, that can be one of three values: `order-create`, `order-update`, or `order-delete`. You can look at that value and branch accordingly. ![Branch on value in Prismatic app](/docs/img/integrations/low-code-integration-designer/branching/branch-on-value.png) **For More Information**: [Branch on Value Action](https://prismatic.io/docs/components/branch.md#branch-on-value) #### Branching on an expression[​](#branching-on-an-expression "Direct link to Branching on an expression") The [Branch on Expression](https://prismatic.io/docs/components/branch.md#branch-on-expression) action allows you to create branches within your integration based on more complex inputs. You can compare values, like config variables, step results, or static values, and follow a branch based on the results of the comparisons. Consider **Example 2** above. You have a step that checks rocket fuel level for a customer, and you want to alert users in different ways if their fuel levels are low. You can express this problem with some pseudocode: ``` if fuelLevel < 50: sendAnSMS() else if fuelLevel < 100: sendAnEmail() else: doNothing() ``` To express this pseudocode in an integration, add a step that looks up rocket fuel level. Then, add a **Branch on an Expression** action to your integration. Create one branch named **Fuel Critical** and under **Condition Inputs** check that `results` of the fuel level check step **is less than** 50. Then, create another branch named **Fuel Warning** and check that `results` of the fuel level check step **is less than** 100. This will generate a branching step that executes the **Fuel Critical** branch (send an alert SMS) if fuel levels are less than 50, the **Fuel Warning** branch (send a warning email) if fuel levels are less than 100, or the **Else** branch if fuel levels are 100 or above. ![Branch on expression in Prismatic app](/docs/img/integrations/low-code-integration-designer/branching/branch-on-expression.png) #### Branch on expression operators[​](#branch-on-expression-operators "Direct link to Branch on expression operators") You can compare config variables, results from previous steps, or static values to one another using a variety of comparison operators. These operators each evaluate to `true` or `false` and can be chained together with **And** and **Or** clauses. ##### Equals branch operator[​](#equals-branch-operator "Direct link to Equals branch operator") The **equals** operator evaluates if two fields are equal to one another, regardless of type. | Left Field | Right Field | Result | Comments | | --------------- | --------------- | ------- | -------------------------------------------------------------- | | `5.2` | `5.2` | `true` | | | `5.2` | `5` | `false` | | | `"5.2"` | `5.2` | `true` | Strings are cast to numbers when compared to numbers | | `"Hello"` | `"Hello"` | `true` | | | `"Hello"` | `"hello"` | `false` | String comparison is case-sensitive | | `false` | `0` | `true` | Boolean `false` evaluates to `0`, and `true` evaluates to `1`.
| | `[1,2,3]` | `[1,2,3]` | `true` | Arrays whose elements are the same are considered equal | | `{"foo":"bar"}` | `{"foo":"bar"}` | `true` | Objects with the same keys/values are equal | ##### Does not equal branch operator[​](#does-not-equal-branch-operator "Direct link to Does not equal branch operator") The **does not equal** operator evaluates if two fields are *not* equal to one another, regardless of type. | Left Field | Right Field | Result | | ---------- | ----------- | ------ | | `5.3` | `5.2` | true | | `[1,2,3]` | `[1,2,4]` | true | ##### Is greater than branch operator[​](#is-greater-than-branch-operator "Direct link to Is greater than branch operator") The **is greater than** operator evaluates if the left field is greater than the right field and is an implementation of the JavaScript [greater than operator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Greater_than). | Left Field | Right Field | Result | Comments | | ---------- | ----------- | ------- | ------------------------------------------------------------------------------ | | `5.2` | `5.3` | `false` | | | `5.3` | `5.3` | `false` | The values are equal; one is not greater than the other | | `"5.3"` | `5.2` | `true` | Strings are cast to numbers when compared to numbers | | `"Hello"` | `"World"` | `false` | Strings are compared alphabetically - `"Hello"` does not occur after `"World"` | | `"hello"` | `"World"` | `true` | The ASCII value for `"h"` occurs [after](https://www.asciitable.com/) `"W"` | | `true` | `false` | `true` | `true` (1) is greater than `false` (0) | ##### Is greater than or equal to branch operator[​](#is-greater-than-or-equal-to-branch-operator "Direct link to Is greater than or equal to branch operator") The **is greater than or equal to** operator is similar to **is greater than** but also returns true if the values being compared are equal to one another. | Left Field | Right Field | Result | Comments | | ---------- | ----------- | ------ | ---------------------------------------------------- | | `5.3` | `"5.3"` | `true` | Strings are cast to numbers when compared to numbers | ##### Is less than branch operator[​](#is-less-than-branch-operator "Direct link to Is less than branch operator") The **is less than** operator evaluates if the left field is less than the right field. | Left Field | Right Field | Result | Comments | | ---------- | ----------- | ------ | --------------------------------------- | | `3` | `4` | `true` | | | `"abc"` | `"daa"` | `true` | `"a"` is less than `"d"` alphabetically | ##### Is less than or equal to branch operator[​](#is-less-than-or-equal-to-branch-operator "Direct link to Is less than or equal to branch operator") The **is less than or equal to** operator is similar to **is less than** but also returns true if the values being compared are equal to one another. ##### Contained in branch operator[​](#contained-in-branch-operator "Direct link to Contained in branch operator") The **contained in** operator evaluates if the value of the left field is contained in the right field. The right field must be an array or a string. 
| Left Field | Right Field | Result | Comments | | ---------- | ----------------- | ------- | ----------------------------------------------------------------- | | `"world"` | `"Hello, world!"` | `true` | | | `"World"` | `"Hello, world!"` | `false` | String comparison is case-sensitive | | `2` | `[1,2,3]` | `true` | | | `"2"` | `[1,2,3]` | `false` | The string `"2"` does not occur in the array of numbers `[1,2,3]` | ##### Not contained in branch operator[​](#not-contained-in-branch-operator "Direct link to Not contained in branch operator") The **not contained in** operator evaluates if the value of the left field does not appear in the right field. | Left Field | Right Field | Result | | ---------- | ----------- | ------- | | `2` | `[1,2,3]` | `false` | | `'Hi'` | `'Hello'` | `true` | ##### Is empty branch operator[​](#is-empty-branch-operator "Direct link to Is empty branch operator") The **is empty** operator evaluates if the given value is an empty string or an empty array. | Field | Result | | --------- | ------- | | `""` | `true` | | `"hello"` | `false` | | `[]` | `true` | | `[1,2,3]` | `false` | ##### Exactly matches branch operator[​](#exactly-matches-branch-operator "Direct link to Exactly matches branch operator") The **exactly matches** operator evaluates if the two fields are equal to one another, taking the type of the values into consideration. | Left Field | Right Field | Result | Comments | | ---------- | ----------- | ------- | ----------------------------------------------- | | `"5"` | `5` | `false` | The string `"5"` is not equal to the number `5` | ##### Does not exactly match branch operator[​](#does-not-exactly-match-branch-operator "Direct link to Does not exactly match branch operator") The **does not exactly match** operator evaluates if the two fields are not equal to one another, taking the type of the values into consideration. | Left Field | Right Field | Result | Comments | | ---------- | ----------- | ------ | ----------------------------------------------- | | `"5"` | `5` | `true` | The string `"5"` is not equal to the number `5` | ##### Starts the string branch operator[​](#starts-the-string-branch-operator "Direct link to Starts the string branch operator") The **starts the string** operator evaluates if the right field's value begins with the left field's value. Both right and left values must be strings. | Left Field | Right Field | Result | Comments | | ---------- | ------------------- | ------- | -------------------------------------------------------------- | | `"Test"` | `"Testing Value"` | `true` | | | `"test"` | `"Testing Value"` | `false` | Comparisons are case-sensitive | | `"Test"` | `"A Testing Value"` | `false` | The right field must start with (not *contain*) the left value | ##### Does not start the string branch operator[​](#does-not-start-the-string-branch-operator "Direct link to Does not start the string branch operator") The **does not start the string** operator returns the opposite of the **starts with** operator. ##### Ends the string branch operator[​](#ends-the-string-branch-operator "Direct link to Ends the string branch operator") The **ends the string** operator evaluates if the right field ends with the left field. Both right and left values must be strings. 
| Left Field | Right Field | Result | | ---------- | --------------- | ------- | | `orld!` | `Hello, World!` | `true` | | `orld` | `Hello, World!` | `false` | ##### Does not end the string branch operator[​](#does-not-end-the-string-branch-operator "Direct link to Does not end the string branch operator") The **does not end the string** operator returns the opposite of the **ends with** operator. Accepted DateTime Formats The following three comparison operators accept date/times as ISO strings (like `2021-03-20` or `2021-03-20T11:52:21.881Z`), Unix epoch timestamps in seconds or milliseconds (for example, the number `1631568050` represents a time on 2021-09-13), or `Date()` JavaScript objects. ##### Is after (date/time) branch operator[​](#is-after-datetime-branch-operator "Direct link to Is after (date/time) branch operator") The **is after (date/time)** operator attempts to parse the left and right fields as dates and evaluates if the left field occurs after the right field. | Left Field | Right Field | Result | Comments | | ---------------------------- | ---------------------------- | ------- | --------------------------------------------------------------------------------- | | `"2021-03-20"` | `"2021-04-13"` | `false` | | | `"2021-03-20T12:50:30.105Z"` | `"2021-03-20T11:52:21.881Z"` | `true` | When dates are equivalent, time is compared | | `"2021-03-20"` | `1631568050` | `false` | `1631568050` is the UNIX epoch time for 2021-09-13, which occurs after 2021-03-20 | ##### Is before (date/time) branch operator[​](#is-before-datetime-branch-operator "Direct link to Is before (date/time) branch operator") The **is before (date/time)** operator attempts to parse the left and right fields as dates and evaluates if the left field occurs before the right field. ##### Is the same (date/time) branch operator[​](#is-the-same-datetime-branch-operator "Direct link to Is the same (date/time) branch operator") The **is the same (date/time)** operator attempts to parse the left and right fields as dates and evaluates if the timestamps are identical. | Left Field | Right Field | Result | Comments | | ---------------------------- | ---------------------------- | ------- | -------------------------------------------------------------------------------------- | | `"2021-03-20T12:50:30.105Z"` | `"2021-03-20T12:50:30.105Z"` | `true` | | | `"2021-03-20T12:50:30Z"` | `1616244630000` | `true` | `1616244630000` is the millisecond UNIX epoch representation of `March 20, 2021 12:50:30` | | `"2021-03-20T12:50:30Z"` | `"2021-03-20T12:50:31Z"` | `false` | | ##### Is true branch operator[​](#is-true-branch-operator "Direct link to Is true branch operator") The **is true** operator evaluates if an input field is "truthy". Common "truthy" values include `true`, `"true"`, `"True"`, `"Yes"`, `"yes"`, `"Y"`, and `"y"`. Common "falsy" values include `false`, `"false"`, `"False"`, `"No"`, `"no"`, `"N"`, and `"n"`; these evaluate to `false`. Other values that evaluate to `false` are `0`, `null`, `undefined`, `NaN`, and `""`. All other values (a non-zero number, a non-empty string, any array or object, etc.) evaluate to `true`.
| Field | Result | | --------- | ------- | | `"Yes"` | `true` | | `"True"` | `true` | | `[]` | `true` | | `{}` | `true` | | `"Hello"` | `true` | | `-5` | `true` | | `"n"` | `false` | | `false` | `false` | | `""` | `false` | | `null` | `false` | | `0` | `false` | ##### Is false branch operator[​](#is-false-branch-operator "Direct link to Is false branch operator") The **is false** operator returns the opposite of the **is true** operator. ##### Does not exist branch operator[​](#does-not-exist-branch-operator "Direct link to Does not exist branch operator") The **does not exist** operator evaluates to `true` if the presented value is one of the following: `undefined`, `null`, `0`, `NaN`, `false` or `""`. | Field | Result | | ----------- | ------- | | `undefined` | `true` | | `NaN` | `true` | | `1` | `false` | | `"Hello"` | `false` | ##### Exists branch operator[​](#exists-branch-operator "Direct link to Exists branch operator") The **exists** operator returns the opposite of the `does not exist` operator. #### Combining multiple comparison operators[​](#combining-multiple-comparison-operators "Direct link to Combining multiple comparison operators") Multiple expressions can be grouped together with **And** or **Or** clauses, which execute like programming **and** and **or** clauses. Take, for example, this programming expression: ``` if ((foo > 500 and bar <= 20) or ("b" in ["a","b","c"])) ``` The same logic can be represented with a group of conditionals in a [Branch on Expression](https://prismatic.io/docs/components/branch.md#branch-on-expression) action: ![Branch on expression using conditionals in Prismatic app](/docs/img/integrations/low-code-integration-designer/branching/branch-on-expression-logic.png) **For More Information**: [Branch on Expression Action](https://prismatic.io/docs/components/branch.md#branch-on-expression) #### Converging branches[​](#converging-branches "Direct link to Converging branches") Regardless of which branch is followed, branches always converge to a single point. Once a branch has executed, the integration will continue with the next step listed below the branch convergence. This presents a problem: how do steps below the convergence reference steps in branches that may or may not have executed (depending on which branch was followed)? In your integration you may want to say "if branch *foo* was executed, get the results from *step A*, and if branch *bar* was executed, get the results instead from *step B*." Prismatic provides the [Select Executed Step Result](https://prismatic.io/docs/components/branch.md#select-executed-step-result) to handle that scenario. Imagine that you have two branches - one for incoming invoices, and one for outgoing invoices, with different logic contained in each. Regardless of which branch was executed, you'd like to insert the resulting data into an ERP. You can leverage the [Select Executed Step Result](https://prismatic.io/docs/components/branch.md#select-executed-step-result) action to say "get me the incoming or outgoing invoice - whichever one was executed." This action iterates over the list of step results that you specify, and returns the first one that has a non-null value (which indicates that it ran). 
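In plain JavaScript terms, the selection behaves roughly like the sketch below (a conceptual illustration, not the component's actual implementation; the step names in the trailing comment are made up for the invoice example):

```
// Conceptual sketch: given the step results you selected, return the first
// one that is not null/undefined - i.e., the branch step that actually ran.
const selectExecutedStepResult = (candidateResults) =>
  candidateResults.find((result) => result !== null && result !== undefined);

// For the invoice example above, this would be something like:
// selectExecutedStepResult([incomingInvoiceStep.results, outgoingInvoiceStep.results]);
```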
![Select executed step result while branching in Prismatic app](/docs/img/integrations/low-code-integration-designer/branching/select-executed-step-result.png) Within the component configuration drawer, select the step(s) whose results you would like, and the **Select Executed Step Result** step will yield the result of whichever one was executed. --- ##### Code Step Overview The [code component](https://prismatic.io/docs/components/code.md) allows you to execute product or industry-specific code within an integration. This page outlines when and how to use a code component. #### Why use the code component?[​](#why-use-the-code-component "Direct link to Why use the code component?") You will likely have integration logic that can't be solved using the [standard components](https://prismatic.io/docs/components.md) Prismatic provides. The portion of your integrations that are specific to your product or industry can be accomplished using code component steps or [custom components](https://prismatic.io/docs/custom-connectors.md). #### Code component vs custom connector[​](#code-component-vs-custom-connector "Direct link to Code component vs custom connector") Generally, the code component is useful if: * Your code does not depend on external libraries * Your code is short and succinct * Your code is step-specific and not reusable elsewhere You should consider building a [custom connector](https://prismatic.io/docs/custom-connectors.md) if: * Your code relies on external libraries * You would like to unit test your code independent of your integration * Your code is reusable and could be duplicated in multiple flows or integrations #### Adding a code step to an integration[​](#adding-a-code-step-to-an-integration "Direct link to Adding a code step to an integration") Within the integration designer, [add a step](https://prismatic.io/docs/integrations/low-code-integration-designer/steps.md#adding-steps-to-integrations) to your integration. Select the **Code** component, **Code Block** action. You will be presented with a new code step in your integration, and you can click the "Edit" button to open the code editor: ![Code editor in Prismatic app](/docs/img/integrations/low-code-integration-designer/code-step/code-step.png) #### Code structure[​](#code-structure "Direct link to Code structure") The code component provides a stub function by default. Let's examine the structure of the code: ``` module.exports = async ({ logger, configVars }, stepResults) => { return { data: null }; }; ``` The code component requires you to export an asynchronous function. The default code uses [arrow function notation](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/Arrow_functions) to create an `async` function to return. #### Code component parameters[​](#code-component-parameters "Direct link to Code component parameters") This function is provided a few parameters: 1. The first positional parameter is comprised of several properties: * `logger` allows you to write out log lines. * `debug` is an object which you can use when [debug mode](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode) is enabled to emit additional debug log lines or measure time or memory costs of specific portions of your code. * `configVars` lets you access config variables (including connections). 
* `instanceState`, `crossFlowState`, `integrationState`, and `executionState` give you access to [persisted state](https://prismatic.io/docs/custom-connectors/actions.md#execution-instance-and-cross-flow-state). * `stepId` is the ID of the current step being executed. * `executionId` is the ID of the current execution. * `webhookUrls` contains the URLs of the running instance's sibling flows. * `webhookApiKeys` contains the API keys of the running instance's sibling flows. * `invokeUrl` is the URL that was used to invoke the integration. * `customer` is an object containing an `id`, `name`, and `externalId` of the customer the instance is assigned to. * `user` is an object containing an `id`, `name`, `email` (their ID), and `externalId` of the customer user whose user-level configuration was used for this execution. This only applies to instances with [User Level Configuration](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md). * `integration` is an object containing an `id`, `name`, and `versionSequenceId` of the integration the instance was created from. * `instance` is an object containing an `id` and `name` of the running instance. * `flow` is an object containing the `id` and `name` of the running flow. 2. The second positional parameter, `stepResults`, is an object that contains output from previous steps of the integration. ##### Logging[​](#logging "Direct link to Logging") `context.logger` is an object that can be used for logging and debugging. `context.logger` has four functions: `debug`, `info`, `warn`, and `error`. For example: ``` module.exports = async (context, stepResults) => { context.logger.info("Things are going great"); context.logger.warn("Now less great..."); }; // or module.exports = async ({ logger }, stepResults) => { logger.info("Hello World"); }; ``` **Note**: Log lines are truncated after 4096 characters. If you need longer log lines, consider [streaming logs](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-externally.md) to an external log service. ##### Config variables[​](#config-variables "Direct link to Config variables") `context.configVars` provides the [Code Component](https://prismatic.io/docs/components/code.md) with access to all [config variables](https://prismatic.io/docs/integrations/config-wizard/config-variables.md), including connections, associated with the integration. If you have a config variable named `Acme ERP Base URL`, for example, you could reference that config variable in a code step with `context.configVars["Config Variable Name"]` syntax: ``` module.exports = async ({ configVars }, stepResults) => { const fuelApiUrl = `${configVars["Acme ERP Base URL"]}/fuel`; // ... }; ``` ##### Connections[​](#connections "Direct link to Connections") [Connections](https://prismatic.io/docs/integrations/connections.md) are a special type of config variable. You can access the contents of a connection the same way that you would any other config variable.
In this example, suppose you have a connection config variable named `Acme ERP Connection` that contains two fields, `tenantId` and `apiKey`: Destructuring a connection config variable ``` module.exports = async ({ logger, configVars }, stepResults) => { const { fields: { tenantId, apiKey }, } = configVars["Acme ERP Connection"]; const result = await doAThing({ tenantId, apiKey }); return { data: result }; }; ``` #### Referencing previous step outputs[​](#referencing-previous-step-outputs "Direct link to Referencing previous step outputs") Most steps of an integration return some sort of value. An **HTTP - GET** action, for example, might return a JSON payload from a REST API. An **Amazon S3 - Get Object** action will return a binary file pulled from S3. The code component can reference those outputs through the `stepResults` parameter. `stepResults` is an object that contains results from all previous steps. For example, if you have an **HTTP - GET** step named **Fetch Users List** that pulls down an array of users from an API, you can generate an array of email addresses with this code: ``` module.exports = async (context, stepResults) => { const userArray = stepResults.fetchUsersList.results; const emailArray = userArray.map((user) => user.email); return { data: emailArray }; }; ``` Step results are often objects Many components return objects that have multiple keys. So, you can reference `stepResults.myStepName.results.someKey`. It's rare for a component to return serialized JSON, so there's rarely a need to `JSON.parse()` results from a previous step. ##### Previous step names as variables[​](#previous-step-names-as-variables "Direct link to Previous step names as variables") Since step names can include spaces and other characters that aren't valid in JavaScript identifiers, the alphanumeric characters of a step name are converted to camelCase when the step is referenced. So, a step named **Download JSON from API** would be converted to **downloadJsonFromApi**. ##### Referencing integration trigger payload data[​](#referencing-integration-trigger-payload-data "Direct link to Referencing integration trigger payload data") The integration trigger is simply another step that can have a unique name. Suppose an integration is triggered by a webhook, the trigger is named `My Integration Trigger`, and the webhook is provided a payload `body.data` of `{"exampleKey": "exampleValue"}`.
![Reference integration trigger payload in Prismatic app](/docs/img/integrations/low-code-integration-designer/code-step/trigger-payload.png) That `exampleKey` would be accessible using `stepResults.myIntegrationTrigger` like so: ``` module.exports = async ({ logger }, stepResults) => { const exampleKey = stepResults.myIntegrationTrigger.results.body.data.exampleKey; logger.info(`Received '${exampleKey}'`); }; ``` Using JavaScript destructuring, you can instead write this: ``` module.exports = async ( { logger }, { myIntegrationTrigger: { results: { body: { data: { exampleKey }, }, }, }, }, ) => { logger.info(`Received '${exampleKey}'`); }; ``` Notice the logged message in the testing drawer: ![Test runner step results in Prismatic app](/docs/img/integrations/low-code-integration-designer/code-step/using-stepresults.png) #### Persisted data in a code step[​](#persisted-data-in-a-code-step "Direct link to Persisted data in a code step") Like a custom component, a code step can save and reference [persisted state](https://prismatic.io/docs/custom-connectors/actions.md#execution-instance-and-cross-flow-state) at the flow (`instanceState`), cross-flow (`crossFlowState`), integration (`integrationState`), or execution (`executionState`) level. To save a value `bar` as key `foo` at the `executionState` level: ``` module.exports = async ({ logger, configVars }, stepResults) => { return { data: null, executionState: { foo: "bar" }, }; }; ``` To load the value of the key `foo` at the execution level, you can reference your function's first parameter's `executionState` property: ``` module.exports = async ({ executionState }, stepResults) => { const myvalue = executionState["foo"]; return { data: `My value is ${myvalue}` }; }; ``` #### Code component return values[​](#code-component-return-values "Direct link to Code component return values") The code component can optionally return a value for use by a subsequent step. The return value can be an object, string, integer, etc., and will retain its type as the value is passed to the next step. The return value is specified using the `data` key in the return object. In this example, we return a string with value `"https://ipinfo.io/ip"`: ``` module.exports = async (context, stepResults) => { return { data: "https://ipinfo.io/ip" }; }; ``` The output can be used as input for the next step by referencing `codeComponentStepName.results`. ![Use output for prior step for input to new step in Prismatic app](/docs/img/integrations/low-code-integration-designer/code-step/return-values.png) To see an example of returning more complex data structures, and a good example use case for a code component, see the [Using a Code Component to Transform Data](https://prismatic.io/docs/integrations/low-code-integration-designer/code-step/code-component-to-transform-data.md) quickstart. ##### Returning binary data from a code step[​](#returning-binary-data-from-a-code-step "Direct link to Returning binary data from a code step") Sometimes you'll want your code component to return binary data (like a rendered image or PDF). To do that, return an object with a `data` property of type `Buffer` (a file buffer), and a `contentType` property of type `String` that contains the file's [MIME type](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/MIME_types/Common_types): ``` module.exports = async (context, stepResults) => { // ... 
const fileBuffer = SomePdfLibrary.generatePdf(); return { data: fileBuffer, contentType: "application/pdf", }; }; ``` To see an example use case for returning binary data from a code component, check out our [Generate a PDF with a Code Component](https://prismatic.io/docs/integrations/low-code-integration-designer/code-step/generating-a-pdf-with-a-code-component.md) quickstart. **For More Information**: [Using a Code Component to Transform Data](https://prismatic.io/docs/integrations/low-code-integration-designer/code-step/code-component-to-transform-data.md) #### Making HTTP calls from a code step[​](#making-http-calls-from-a-code-step "Direct link to Making HTTP calls from a code step") The [fetch](https://developer.mozilla.org/en-US/docs/Web/API/fetch) API is baked into the code component. To make an HTTP call to an API, you can use the `fetch` function: Make an HTTP POST request from a code step ``` module.exports = async (context, stepResults) => { const options = { method: "POST", headers: { Accept: "application/json", "Content-Type": "application/json", Authorization: "Bearer abc-123", }, body: JSON.stringify({ foo: "bar", baz: 123 }), }; const response = await fetch("https://postman-echo.com/post", options); const jsonData = await response.json(); return { data: jsonData }; }; ``` #### Adding dependencies to a code step[​](#adding-dependencies-to-a-code-step "Direct link to Adding dependencies to a code step") If your code component depends on node modules from `npm`, dependencies will be dynamically imported from the [UNPKG](https://unpkg.com/) and [jsDelivr](https://www.jsdelivr.com/) CDNs. For example, if your code component reads: Import lodash as a dependency ``` const lodash = require("lodash@4.17.21/lodash.js"); module.exports = async (context, stepResults) => { const mergedData = lodash.merge( { cpp: "12" }, { cpp: "23" }, { java: "23" }, { python: "35" }, ); return { data: mergedData }; }; ``` Then [lodash](https://unpkg.com/browse/lodash@4.17.21/lodash.js) version 4.17.21 will be imported as a dependency. You should pin specific, known-working versions of `npm` packages for your code component: ``` const lodash = require("lodash@2.4.2"); const { PDFDocument } = require("pdf-lib@1.17.1/dist/pdf-lib.js"); ``` You can require any file from `npm` using `package[@version][/file]` syntax. Note that with the `lodash` import above, no file was specified. If no file is specified, the `main` file defined in the `npm` package's `package.json` is imported. An explicit path was called out for the `pdf-lib` import because the `pdf-lib` package defaults to importing an index file that itself requires other files, and `dist/pdf-lib.js` is a completely independent file that can be imported on its own. Downstream dependencies In order for an external dependency to be compatible with a code step, all JavaScript code must be compiled into a single file. For example, `pdf-lib`'s `dist/pdf-lib.js` bundle (referenced above) contains all of the code necessary to run in a single file, while the package's default index file does not - it has its own `require()` statements and depends on other files. The former would work in the code step; the latter would not. If the external package has its own dependencies that are not compiled in, or if the file you reference has its own `require()` statements, you will see errors. CDN outages can cause downtime When a `require()` line is encountered in a code step, the code step will attempt to download the dependency from the UNPKG CDN. If UNPKG is down or otherwise unavailable, the code step will fall back to downloading the dependency from the jsDelivr CDN.
If both CDNs are down, your code step will error. If you need external dependencies, we strongly recommend using a code step for prototyping, but building a [custom component](https://prismatic.io/docs/custom-connectors.md) for production use. Custom components have their dependencies compiled in, and are not dependent on the uptime of an external CDN. ##### Requiring built-in NodeJS modules[​](#requiring-built-in-nodejs-modules "Direct link to Requiring built-in NodeJS modules") You can also require built-in NodeJS modules, like `crypto` or `path`. If the library you specify is built into NodeJS, the client will *not* reach out to a CDN, and will instead use the built-in module. ``` const crypto = require("crypto"); module.exports = async () => { const { publicKey, privateKey } = crypto.generateKeyPairSync("rsa", { modulusLength: 4096, publicKeyEncoding: { type: "spki", format: "pem", }, privateKeyEncoding: { type: "pkcs8", format: "pem", cipher: "aes-256-cbc", passphrase: "top secret", }, }); return { data: { publicKey, privateKey, }, }; }; ``` #### Using spectral utility functions in a code step[​](#using-spectral-utility-functions-in-a-code-step "Direct link to Using spectral utility functions in a code step") Prismatic's SDK, [@prismatic-io/spectral](https://www.npmjs.com/package/@prismatic-io/spectral), is automatically imported into each code block as `spectral`. You can reference any utility functions that Spectral declares. For example, if you need to cast a string like `"NO"` (which is truthy in JavaScript but should be treated as `false`) to a proper boolean, you can do this: ``` module.exports = async ({ configVars }, stepResults) => { const doAThing = spectral.util.types.toBool(configVars["Do a Thing?"]); if (doAThing) { return { data: "Did a thing" }; } else { return { data: "Didn't do a thing" }; } }; ``` A list of all util type functions is available in the [Spectral SDK](https://github.com/prismatic-io/spectral/blob/main/packages/spectral/src/util.ts). --- ##### Using a Code Component to Transform Data Prismatic's code component lets you incorporate custom JavaScript code anywhere in your integration. The code component, alongside [custom components](https://prismatic.io/docs/custom-connectors.md), allows you to write the product- or industry-specific portions of your integration that aren't readily solved using built-in components from the [component catalog](https://prismatic.io/docs/components.md). Both code and custom components have their use cases. As a reminder: code components are useful for writing simple, one-off, single-integration code snippets. If you need to run the same code for multiple integrations, if your code is complex enough that it would benefit from unit testing, or if you are reliant on lots of external Node.js libraries, consider creating a [custom component](https://prismatic.io/docs/custom-connectors.md) instead. #### Today's problem[​](#todays-problem "Direct link to Today's problem") Today we'll address a problem that B2B companies often see when working with third-party vendors: data not coming in according to an agreed-upon spec.
Suppose that we and a third-party vendor agreed that we would be sent JSON-formatted data via webhook payload that looked like this: ``` [ { "firstName": "John", "lastName": "Smith", "dob": "1987-05-20", "userid": "123" }, { "firstName": "Jane", "lastName": "Smith", "dob": "1992-07-16", "userid": "172" } ] ``` Our integration worked during testing with sample data, but when we turn on our integration for third-party vendor consumption, errors are generated when our integration tries to parse the data that was sent. Logs indicate that the data the vendor is sending is formatted entirely differently than what we agreed to: ``` { "123": { "name": "John Smith", "dob": "05/20/1987" }, "172": { "name": "Jane Smith", "dob": "07/16/1992" } } ``` The other vendor drags its feet and claims a fix is a "long ways off". We don't have time to wait for the other vendor to fix its software - our customer is clamoring for the new integration they paid for! We can fortunately implement a quick fix by adding a code component to the top of our integration to transform the malformed data into the format we expect. #### Using a code component as a shim[​](#using-a-code-component-as-a-shim "Direct link to Using a code component as a shim") Our integration was written to expect one format of input but received another. We need to transform the data like this: ``` { "123": { "name": "John Smith", "dob": "05/20/1987" } } ``` into something like this: ``` { "firstName": "John", "lastName": "Smith", "dob": "1987-05-20", "userid": "123" } ``` That should be pretty easy to do. We really just need to do three things: * Split the name at the space character to form a first and last name * Reformat the date of birth into a more reasonable format * Pull the JSON key ("123") into a value of `userid` If we add a code component to our integration, by default it reads: ``` module.exports = async (context, stepResults) => { const results = null; // Result of calculation, API request, etc. return { data: results }; }; ``` The first thing I want to do is use JavaScript destructuring to capture the JSON that the third-party vendor sent as part of the integration trigger's webhook payload: ``` module.exports = async ( { logger }, { integrationTrigger: { results: { body: { data: userData }, }, }, }, ) => { logger.info(userData); // Verify we're capturing user data properly const result = null; return { data: result }; }; ``` Looking at logs, it looks like we've written our destructuring correctly. Let's next [map](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/map) over the objects (users) that we were sent with `Object.entries(userData).map()`. Each iteration of our `map()` will generate an object with `firstName`, `lastName`, `dob`, and `userid`. We'll reformat the date of birth with the `Date()` object, split the name that was provided to us into a first and last name using `split()`, and grab the `userid` from the keys that were provided to us: ``` module.exports = async ( { logger }, { integrationTrigger: { results: { body: { data: userData }, }, }, }, ) => { const result = Object.entries(userData).map(([userid, user]) => { const dob = new Date(Date.parse(user.dob)).toISOString().slice(0, 10); const firstName = user.name.split(" ").slice(0, -1).join(" "); const lastName = user.name.split(" ").slice(-1).join(" "); return { firstName, lastName, dob, userid, }; }); return { data: result }; }; ``` That's it! 
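For the sample payload shown above, the step's results now come back in the format we originally agreed to (assuming the runner parses the `MM/DD/YYYY` dates as UTC):

```
[
  { "firstName": "John", "lastName": "Smith", "dob": "1987-05-20", "userid": "123" },
  { "firstName": "Jane", "lastName": "Smith", "dob": "1992-07-16", "userid": "172" }
]
```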
The rest of our integration can now be configured to reference the results of the code component rather than the body of the integration payload, and it'll start working as we'd expect it to despite the poorly formatted JSON from the third-party vendor. A test of the code component confirms that it reformats the data as we expect: ![Step outputs in Prismatic integration designer](/docs/img/integrations/low-code-integration-designer/code-step/code-component-to-transform-data/results.png) --- ##### Generating a PDF with a Code Component In this quickstart, we'll write a short code component snippet that references some JSON data and outputs a rendered PDF file. We'll cover how to import external libraries in a code component and how to output binary data (like a PDF file) from a code component. For this example, assume that our integration receives a list of rocket launches that have occurred recently in JSON format: ``` [ { "rocketName": "Deep Space 9", "launchTime": "2020-10-27T20:29:46.139Z", "launchSupervisor": "Robert Smith", "launchNotes": "Rocket was launched into orbit where it will remain for several months." }, { "rocketName": "Voyager", "launchTime": "2020-10-27T21:34:15.229Z", "launchSupervisor": "Sally Smith", "launchNotes": "Rocket launched without a hitch. Thrusters were retrieved 30 minutes after launch." } ] ``` Our customers would like to generate PDF files from this data with launch information, one launch per page, for their managers to print and read through. #### Should we use a custom component instead?[​](#should-we-use-a-custom-component-instead "Direct link to Should we use a custom component instead?") This use case straddles the line between needing a custom component and using a code component. This code is only used for a single integration, is relatively short, and after some preliminary testing *probably* doesn't need extensive unit testing. It should be noted that when your code step has external dependencies (like on a PDF library), that external dependency is pulled down from a CDN on each execution. That can be slow, and your integration will be dependent on the CDN being available. You may be better off building a custom component, where the external dependency will be compiled into your component. #### Importing external libraries[​](#importing-external-libraries "Direct link to Importing external libraries") Today we'll use the [PDF-LIB](https://www.npmjs.com/package/pdf-lib) library to render our PDF. To do that, let's add the following to the top of a new code component in our integration: ``` const { PDFDocument, StandardFonts, } = require("pdf-lib@1.17.1/dist/pdf-lib.js"); ``` It's wise to pin requirements to known working versions. Since we're testing our integration with PDF-LIB version 1.17.1, we'll select that specific version. If we omitted the version, our integration would import whatever the latest available version is. Adding this `require()` line to the top of our code component will cause the code component to [dynamically import](https://prismatic.io/docs/integrations/low-code-integration-designer/code-step.md#adding-dependencies-to-a-code-step) the PDF-LIB library as a dependency. #### Writing our code component snippet[​](#writing-our-code-component-snippet "Direct link to Writing our code component snippet") Next, we'll write the code that generates a PDF.
Let's first test that we can generate a blank PDF file: ``` const { PDFDocument } = require("pdf-lib@1.17.1/dist/pdf-lib.js"); module.exports = async (context, stepResults) => { const doc = await PDFDocument.create(); const pdfBytes = await doc.save(); return { data: pdfBytes, contentType: "application/pdf" }; }; ``` If we run this code component and look at step outputs, we see that our code component generated a 583 byte binary file. By adding a **Save File** step after the code component to GCP Storage, we can write that blank file out to GCP Storage to verify that it looks like we'd expect. You can choose to write the file out to Amazon S3, Azure, an SFTP share, DropBox, etc... your choice. We now have a blank PDF written out. What's left to do is add some text to the PDF based on the data that was presented to our integration's webhook: ``` const { PDFDocument, StandardFonts, } = require("pdf-lib@1.17.1/dist/pdf-lib.js"); module.exports = async (context, stepResults) => { // Pull in data from the webhook trigger payload const rocketLaunches = stepResults.integrationTrigger.results.body.data; // Generate a PDF Document const doc = await PDFDocument.create(); // Embed the Times Roman font const timesRomanFont = await doc.embedFont(StandardFonts.TimesRoman); // Loop over each rocket launch, adding a page to our document for each one rocketLaunches.forEach((rocketLaunch) => { const { rocketName, launchTime, launchSupervisor, launchNotes } = rocketLaunch; // Create a new page for each launch const page = doc.addPage(); const { width, height } = page.getSize(); const _launchTime = new Date(launchTime).toLocaleString(); // Print information about the launch page.drawText(`Rocket: ${rocketName}`, { x: 30, y: height - 120, size: 30, font: timesRomanFont, }); page.drawText(`Launch Time: ${_launchTime}`, { x: 30, y: height - 150, size: 12, font: timesRomanFont, }); page.drawText(`${launchSupervisor}: ${launchNotes}`, { x: 30, y: height - 166, size: 12, font: timesRomanFont, }); }); // Get PDF file as a file UInt8Array const pdfBytes = await doc.save(); // Return a PDF file with proper MIME type return { data: Buffer.from(pdfBytes), contentType: "application/pdf" }; }; ``` Note the format of the object that is returned from this code component: ``` return { data: Buffer.from(pdfBytes), contentType: "application/pdf" }; ``` The return object specifies both a `data` property that is a file `Buffer` and a `contentType` specifying the MIME type of the file being returned. If we run a test again, we can see that a two-page PDF is being generated with formatted content from the webhook payload: ![Sample PDF output file from webhook payload](/docs/img/integrations/low-code-integration-designer/code-step/generating-a-pdf-with-a-code-component/final-product.png) #### Further reading[​](#further-reading "Direct link to Further reading") That's it! With just about 40 lines of code (if you omit comments and blank lines), we have a code component that renders PDFs. For more information on code components, check out the [code component usage](https://prismatic.io/docs/integrations/low-code-integration-designer/code-step.md) page. --- ##### Error Handling #### Handling errors in integrations[​](#handling-errors-in-integrations "Direct link to Handling errors in integrations") Errors happen. An API you integrate with may encounter a temporary outage, or the "eventually" part of an "eventually consistent" database may need a couple more seconds to save a record. 
When you encounter errors, you have two tools to handle them: 1. Flow-level error handling. 2. Step-level error handling. #### Flow-level error handling[​](#flow-level-error-handling "Direct link to Flow-level error handling") If an execution fails, you can have the runner automatically retry a few minutes later. The webhook payload that you received will be passed back through your flow again, and your flow will start again at its first step. This is useful if your flow is [idempotent](https://en.wikipedia.org/wiki/Idempotence) and you don't know which step might fail. Read more about flow-level error handling on the [automatic retry](https://prismatic.io/docs/monitor-instances/retry-and-replay/automatic-retry.md) article. #### Step-level error handling[​](#step-level-error-handling "Direct link to Step-level error handling") You might not want your entire flow to stop because one step failed, especially if you're looping over hundreds of items and one item has issues. You can configure how the runner should handle errors on each step. To do that, click a step that you would like to configure and then open the **Error Handling** tab in the step configuration drawer. Under **Error Handler Type**, you have three options: * **Fail** - stop the flow and throw an error. * **Ignore** - ignore the error and continue running the rest of the flow. * **Retry** - wait for an amount of time (**Seconds Between Attempts**) and then try the step again, a maximum of **Max Attempts** times. Optionally wait longer and longer (**Exponential Backoff**, twice as long each time) between retries. If the last attempt still fails, either fail the integration or ignore the error depending on whether **Ignore Final Error** is true or false. ![Screenshot of step-level error handling configuration](/docs/img/integrations/low-code-integration-designer/error-handling/step-level-error-handling.png) ##### Branching after ignored errors[​](#branching-after-ignored-errors "Direct link to Branching after ignored errors") If a step is configured to **Ignore** errors, or if the step has retried its configured number of times and then ignored the final error, the step returns a result with an `error` property detailing the error that occurred. You can use the [branch](https://prismatic.io/docs/components/branch.md) component to branch based on that error. This is useful if you have some sort of [dead letter queue](https://en.wikipedia.org/wiki/Dead_letter_queue) to write the failed item to, or if you would like to notify someone of the problematic item. You can branch on whether or not the step's returned `error` **exists** and act accordingly. ![Screenshot of branching on step-level error handling](/docs/img/integrations/low-code-integration-designer/error-handling/branch-on-step-error.png) --- ##### Flows #### Flows in integrations[​](#flows-in-integrations "Direct link to Flows in integrations") Some integrations contain a single **flow** (one trigger and a series of steps that execute sequentially). For integrations requiring multiple logical flows - such as when integrating with Acme ERP that sends various webhook payload types - you can combine multiple flows into a single integration, with each flow handling a specific webhook type. This approach is more manageable than deploying dozens of distinct instances to each customer to integrate with Acme ERP; instead, you deploy a single integration composed of multiple flows. 
An integration's [config variables](https://prismatic.io/docs/integrations/config-wizard/config-variables.md) are scoped at the integration level. Therefore, config variables set for an integration are shared and accessible by any of the integration's flows. Each flow has its own unique trigger and its own [webhook URL](https://prismatic.io/docs/integrations/triggers/webhook.md) for invoking that specific flow. ##### Managing integration flows[​](#managing-integration-flows "Direct link to Managing integration flows") To add a new flow to your integration, click the **+ Add new flow** button at the top of the designer. ![Manage integration flows in Prismatic app](/docs/img/integrations/low-code-integration-designer/flows/manage-flows.png) To edit a flow, click your current flow's name, then click the pencil icon to the right of the flow. Each flow should have a unique name and may include an optional description. To delete a flow from an integration, click the trash icon to the right of the flow's name and description. ##### Cloning a flow[​](#cloning-a-flow "Direct link to Cloning a flow") When you need to add a flow similar to an existing one, you can **clone** (copy) the flow. To clone a flow, open the flow menu by clicking the flow name at the top of the integration designer. Then, select the clone flow button and provide a new name for the copy. ![Clone integration flow in Prismatic app](/docs/img/integrations/low-code-integration-designer/flows/clone-flow.png) --- ##### Add to Marketplace In this step you'll prepare the [integration you created](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) for the [integration marketplace](https://prismatic.io/docs/embed/marketplace.md). Once the integration has been added to your integration marketplace, your customers can activate it themselves. They can do that by logging in to Prismatic using [customer users](https://prismatic.io/docs/customers/customer-users.md), or if you [embed the marketplace](https://prismatic.io/docs/embed/marketplace.md) in your app, they can activate the integration without leaving your app. #### Adding a logo[​](#adding-a-logo "Direct link to Adding a logo") Integrations in the marketplace look best with a logo. Since this is a Slack integration, it makes sense to use the Slack logo. Right-click this image and click "Save As" to save a copy locally: ![Slack icon](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/slack-icon.png) Next, open your integration back up and click the image icon to the left of your integration's name. Select the Slack icon you just downloaded to make that the icon for this integration. ![Add icon to integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/add-icon.png) #### Give your integration a category[​](#give-your-integration-a-category "Direct link to Give your integration a category") Click the **Integration Details** button on the left of your integration designer and give your integration a category and description. This will help your customers filter and find your integration more easily. Since this is a Slack integration, a category of "communication" makes sense, but you can choose whatever category you want. 
![Add category to integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/add-category.png) #### Clean up config variables[​](#clean-up-config-variables "Direct link to Clean up config variables") For integrations with few config variables, this may not be necessary, but it's nice to organize required config variables with headers. Open the config wizard designer from the top of your page. Add two headers by clicking the **+ Text/Image** button - one reading "Slack Info" and another reading "Todo App Info." Drag the headers and config variables to rearrange them. This will give your customers a more polished configuration experience when they deploy this integration: ![Clean up config variables in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/clean-up-config-variables.png) While you're in the configuration wizard designer, open each config variable by clicking the pencil icon, and remove the default value from each - this will require your customers to enter their own values for each config variable. #### Re-publish your integration[​](#re-publish-your-integration "Direct link to Re-publish your integration") Open the **Version History** drawer and **Save & Publish** a new version of this integration that includes your config variable and icon changes. ![Republish integration in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/republish.png) #### Add your integration to the integration marketplace[​](#add-your-integration-to-the-integration-marketplace "Direct link to Add your integration to the integration marketplace") Click **Marketplace Configuration** on the left-hand sidebar and select **Add integration** from the top right. Provide your marketplace offering an **Overview**: ![Configure integration overview for integration marketplace via Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/configure-integration-marketplace.png) Click **Update** when you are done. #### Let customers activate this integration[​](#let-customers-activate-this-integration "Direct link to Let customers activate this integration") Now, you can either [add customer users](https://prismatic.io/docs/customers/customer-users.md) to Prismatic and let them activate this integration, or you can [embed the marketplace](https://prismatic.io/docs/embed/marketplace.md) in your own app so users can seamlessly activate integrations without leaving your app. 
Your integration within an embedded marketplace would look like this: ![Sample list of integrations in embedded marketplace](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/embedded-marketplace-list.png) Clicking on the integration brings customers to a screen with the description and overview you set: ![Activate integration from embedded marketplace](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/embedded-marketplace-overview.png) If a customer chooses to activate this integration, their configuration experience looks clean and straightforward, thanks to the changes you made to the configuration experience (above): ![Configure integration from embedded marketplace](/docs/img/integrations/low-code-integration-designer/get-started/add-to-marketplace/embedded-marketplace-configure.png) #### Next steps[​](#next-steps "Direct link to Next steps") You are now ready to create your own integrations. Here are a few things we recommend you try next: * Modify your integration to use [additional components](https://prismatic.io/docs/components.md). * Write [your own component](https://prismatic.io/docs/custom-connectors.md) to accomplish industry-specific business logic for your company. --- ##### Deploy to a Customer Now that you've [built your first integration](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md), let's deploy the integration to a customer. We use the term **instance** to describe a copy of an integration that has been configured for a specific customer, using customer-specific configuration variables. Note: you have the option to deploy your integration to a customer yourself, or you can add your integration to your integration marketplace, so your customers can configure and deploy instances themselves. Learn more about your options [here](https://prismatic.io/docs/instances/deploying.md). #### Create a customer[​](#create-a-customer "Direct link to Create a customer") Your software company has [customers](https://prismatic.io/docs/customers.md). For this example we will create a customer - "FTL Rockets". You can choose another name if you would like. Navigate to the customers screen by clicking **Customers** on the left-hand sidebar, and then click the **+ Add customer** button in the upper right. ![Customers page in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/deploy-first-integration/customers-page.png) Give your customer the name "FTL Rockets", and a description of your choice. Click the **Add** button. ![Add Customer page in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/deploy-first-integration/add-customer.png) #### Deploy an instance[​](#deploy-an-instance "Direct link to Deploy an instance") Open the customer you created and then open the **Summary** tab. Click the **+ Add instance** link in the Instances card. ![Create an instance in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/deploy-first-integration/create-instance-1.png) Select the integration you created and published from the menu, and provide a name and description for the instance. ![Name an instance in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/deploy-first-integration/create-instance-2.png) Once your instance has been created, you will be brought to the instance's **Summary** tab. 
From here you can click the **Reconfigure** button in the top right to open the Config Wizard. In the Config Wizard, you can customize **Slack Webhook URL** and **Todo API URL** for this customer (or leave the defaults). Once you are satisfied with the configuration, click the **Finish** button at the bottom right. ![Configure and deploy an instance in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/deploy-first-integration/configure-and-deploy.png) You can now test the instance by clicking into the **Test** tab and clicking **Save and Run Test**. You should receive a Slack message, as you did before. #### Delete your deployed instance[​](#delete-your-deployed-instance "Direct link to Delete your deployed instance") Since Prismatic charges per deployed instance, and free accounts have a limit of 4 deployed instances, you should delete this instance when you're done. When you're ready, click the **Delete instance** button at the bottom of the **Summary** tab. #### Next steps[​](#next-steps "Direct link to Next steps") Your first integration has been deployed to one customer, but it can now be deployed to multiple customers (each with their own API endpoint and Slack webhook URL). On the next page you'll learn how to productize your integration and [prepare it for marketplace](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/add-to-marketplace.md). --- ##### Get Started with Low-Code [Get Started with Low-Code](https://player.vimeo.com/video/1109467401) #### Overview[​](#overview "Direct link to Overview") This tutorial will guide you through fundamental integration development concepts. You will: * Fetch data from an API * Understand how data flows between integration steps * Use loops to iterate over retrieved records * Use logical branches to determine record processing logic * Send specific records to another system (this example uses Slack) #### Integration overview[​](#integration-overview "Direct link to Integration overview") The integration you'll build will retrieve data (a list of to-do tasks) from an API, loop over the list, identify tasks marked "incomplete", and notify you via Slack of any incomplete tasks. We'll use a placeholder API for the to-do list data - `https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo`. You'll design this integration to be **configurable**, enabling the same integration to be deployed to multiple customers with unique API endpoints, Slack credentials, and channel names. #### Let's build\![​](#lets-build "Direct link to Let's build!") You're encouraged to follow along with the video above or the steps below. If you'd like to import a completed integration, you can download this integration's YAML definition [here](https://prismatic.io/docs/samples/my-first-integration.yml). *Note*: This YAML definition assumes that you've created a [customer-activated](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/customer-activated.md) connection for Slack with an ID of `slack-connection`. As you build, you'll retrieve data from this API endpoint: `https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo` You will need to create a Slack app and have access to a Slack workspace (a free workspace is sufficient). Ensure you assign your Slack app the `chat:write`, `chat:write.public`, and `channels:read` scopes. Documentation for creating a Slack app can be found [here](https://api.slack.com/authentication/basics).
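Before you start building, it can help to see the data you'll be working with. Here is a quick sketch (plain Node.js 18+ run as an ES module, not part of the integration itself) that fetches the placeholder endpoint and prints the incomplete tasks; the `completed` and `task` fields are the ones this tutorial relies on, and any other fields on the records are incidental:

```javascript
// Preview the to-do data the integration will process.
const url =
  "https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo";

const todos = await fetch(url).then((res) => res.json());

// The tutorial branches on each record's boolean `completed` field
// and sends the `task` text of incomplete items to Slack.
for (const todo of todos.filter((t) => !t.completed)) {
  console.log(`Incomplete task: ${todo.task}`);
}
```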
![Sample integration diagram in Prismatic app](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/completed-integration.png) The completed integration ##### Create a new integration[​](#create-a-new-integration "Direct link to Create a new integration") From the integrations page, click the **New Integration** button. Select **Quickstart** to create a new integration from scratch. Then, name your integration "My First Integration" or similar and select **Schedule** as your trigger. Select how often you'd like your flow to run. ![Create a new integration](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/add-new-integration.png) ##### Fetch data from an API[​](#fetch-data-from-an-api "Direct link to Fetch data from an API") Next, add a new step to your integration by clicking the **+** button below your trigger. Search for the **HTTP** component and add a **GET Request** step. ![Add an HTTP GET step](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/add-http-get-step.png) Configure your **GET Request** step to fetch data from the to-do API endpoint - `https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo`. ![Configure HTTP GET step](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/configure-http-get-step.png) Finally, click the green **Run** button at the bottom of your screen to test your integration so far. If you select the **Get Request** step in your test runner drawer at the bottom of your screen, you can expand the **results** section in the **Output** tab on the right to see the data retrieved from the API. ![HTTP GET step results](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/http-get-step-results.png) ##### Loop over the records[​](#loop-over-the-records "Direct link to Loop over the records") Now that we're fetching a list of to-do tasks, we need to loop over each task to determine which ones are incomplete. To do this, we'll add a **Loop** step to our integration. Add another step under your **GET Request** step, this time searching for the **Loop** component and its **Loop Over Items** action. Configure your loop step to loop over the `results` property of the previous step. You can do that by clicking **Configure Reference** in the loop step's configuration panel and selecting the `results` property of the **Get Request** step. ![Configure loop step](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/configure-loop-step.png) Run a test of your integration again. This time, you'll see that the loop step has a **currentItem** property in its output that contains the data for the current item being processed. We'll use this property in the next step to determine if the current item is incomplete. ![Loop step results](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/loop-step-results.png) ##### Identify complete items[​](#identify-complete-items "Direct link to Identify complete items") Now that we're looping over each item, we need to determine if the current item is complete or not. `completed` is a boolean property on each item that indicates whether the item is complete. Add a step within your loop - select the **Branch** component, **Branch on Expression** action. Configure the branch step to have a condition called **Is Complete?**. 
Select the loop step's **currentItem.completed** property as the **Field** input and under **Operator** select **is true**. ![Configure branch step](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/configure-branch-step.png) Now, items with `completed: true` will follow the **Is Complete?** branch, while items with `completed: false` will follow the **Else** branch. ##### Send incomplete items to Slack[​](#send-incomplete-items-to-slack "Direct link to Send incomplete items to Slack") Next, we want to send a message to Slack for each incomplete item. Until now, the steps we've added have not required additional connection configuration or credentials. This step will require additional setup. We'll first need to create a connection to Slack. Ensure you have created a Slack OAuth app and have access to a Slack workspace ([see Slack component docs](https://prismatic.io/docs/components/slack.md#slack-oauth-20)). ###### (Recommended) Option 1: Configure a reusable Slack connection[​](#recommended-option-1-configure-a-reusable-slack-connection "Direct link to (Recommended) Option 1: Configure a reusable Slack connection") Connections can be defined on the integration or at the customer level. The advantage of creating a reusable connection is that it can be easily shared across multiple integrations, reducing the need for duplicate configuration. After creating a Slack app in Slack, enter your connection information in a [customer-activated connection](https://prismatic.io/docs/integrations/connections/integration-agnostic-connections/customer-activated.md). ![Slack connection configuration](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/slack-customer-activated-connection.png) ###### Option 2: Configure single Slack connection[​](#option-2-configure-single-slack-connection "Direct link to Option 2: Configure single Slack connection") You can define the Slack connection on the integration. This is useful for quick setups or when the connection is only needed for a single integration. Click the **Configuration Wizard** button at the top of your screen and edit your **Slack Connection** config variable. Enter your **Client ID**, **Client Secret**, and **Signing Secret** from the Slack app you created. ![Slack connection configuration](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/slack-connection-config.png) ###### Add a Slack step[​](#add-a-slack-step "Direct link to Add a Slack step") With a connection established, within the **Else** branch, add a **Slack** component, **Post Message** step. Now, configure the Slack step to send a message to a channel in your Slack workspace. For now, hard-code a channel name like `general` in the **Channel Name** input - we'll make that dynamic shortly. Your **Message** input can be a template, concatenating a string like `Incomplete task: ` with the `currentItem.task` property. ![Configure slack step](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/configure-slack-step.png) ###### Set up a test connection with Slack[​](#set-up-a-test-connection-with-slack "Direct link to Set up a test connection with Slack") Configure your test runner to use your test Slack workspace by clicking **Test Configuration** at the bottom of the screen and selecting **Test-instance configuration** - this will open the configuration wizard for your test instance. Finally, run a test of your integration.
You should see messages in your Slack workspace for each incomplete task. ![Example Slack messages](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/example-slack-messages.png) ##### Make the integration configurable[​](#make-the-integration-configurable "Direct link to Make the integration configurable") Now that we have a working integration, let's make it configurable so we can deploy it to multiple customers. We'll make the Slack channel name configurable, as well as the API endpoint we're fetching data from. First, we'll make the Slack channel name configurable. Click the **Config Wizard** button at the top of your screen and create a second **Config Page**. Name it something like "Slack Configuration". Then, add a new Config Variable. Name it **Slack Channel** and choose **Picklist** as its **Data Type**. We want our config variable to be a picklist of all the channels in our customer's Slack workspace. We'll use a [data source](https://prismatic.io/docs/integrations/data-sources.md) to make the dropdown menu dynamic. Select the Slack component's **Select Channel** data source, as shown here: ![Slack channel config variable](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/slack-channel-data-source.png) Next, add an additional **String** config variable called **API Endpoint** and give it a default value of `https://my-json-server.typicode.com/prismatic-io/placeholder-data/todo`: ![API endpoint config variable](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/api-endpoint-config-variable.png) Finally, we need to update our integration to use these new config variables. Update the HTTP step to use the **API Endpoint** config variable as the URL to fetch data from. Then, update the Slack step to use the **Slack Channel** config variable as the channel name to send messages to. ![Configure slack step config var](/docs/img/integrations/low-code-integration-designer/get-started/first-integration/configure-slack-step-config-variable.png) Run a test of your integration again and verify that it still works as expected. #### Next steps[​](#next-steps "Direct link to Next steps") Congratulations! You created your first integration! Here are a few things you should try next: * Send your message to a different system, like [Microsoft Teams](https://prismatic.io/docs/components/ms-teams.md). * Swap out the HTTP GET action for a different data source component (perhaps pull from an [SFTP](https://prismatic.io/docs/components/sftp.md) server)? * Write your message out to a file in [Dropbox](https://prismatic.io/docs/components/dropbox.md) or [Amazon S3](https://prismatic.io/docs/components/aws-s3.md) instead of Slack. --- ##### Build a Full Integration The intro guide [Your First Integration](https://prismatic.io/docs/integrations/low-code-integration-designer/get-started/first-integration.md) walked you through basic concepts in Prismatic, including fetching data from an API, passing data between integration steps, and using built-in connectors to send data to third-party applications. This tutorial expands on those concepts. We'll build a more complex integration from start to finish that uses webhooks to sync data between two systems. 
#### The GitHub - Zendesk integration overview[​](#the-github---zendesk-integration-overview "Direct link to The GitHub - Zendesk integration overview") For this example, pretend that we are engineers at [Zendesk](https://www.zendesk.com/) (or a similar customer service SaaS). Some of our customers are software companies and maintain public [GitHub](https://github.com/) repositories. They've requested an integration that syncs GitHub repository **issues** with Zendesk **tickets**. Our product team provided us with this spec for an integration: * When a customer enables this integration, they should use OAuth 2.0 to connect their GitHub account. * After authenticating, the customer should be able to select one of their GitHub repositories from a dropdown menu. * If someone creates an **issue** in their chosen GitHub repository, a corresponding Zendesk **ticket** should be created. * If someone comments on the GitHub issue, that comment should be added to the same Zendesk ticket. * If a support person comments on the Zendesk ticket, that comment should be added to the GitHub issue. * If someone closes the GitHub issue, the Zendesk ticket should be automatically closed. ![Screenshot of the GitHub-Zendesk integration in action](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/github-zendesk-sync.png) The desired result - GitHub issues syncing with Zendesk tickets We'll implement this integration in the following videos. If you'd like to follow along with this example, you can download the YAML definition of the integration and [import](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md#importing-an-integrations-yaml-definition) it into your tenant. [Integration YAML](https://github.com/prismatic-io/examples/blob/main/integrations/github.yml) #### Connecting to GitHub and Zendesk[​](#connecting-to-github-and-zendesk "Direct link to Connecting to GitHub and Zendesk") When building any integration, the first step is to ensure you can connect to all applications you're integrating with. This video covers how to connect to both Zendesk and GitHub. Correction: use the repo scope This video erroneously uses a `repos` scope for GitHub that does not exist. That should be singular `repo` instead. A full list of GitHub scopes can be found [here](https://docs.github.com/en/apps/oauth-apps/building-oauth-apps/scopes-for-oauth-apps). Zendesk and GitHub both support the [OAuth 2.0 Auth Code flow](https://prismatic.io/docs/integrations/connections/oauth2.md) for authentication. This means your customers can connect their GitHub and Zendesk accounts by clicking a button and consenting to give your integration access to their data. ![GitHub configure connection](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/configure-connection.png) #### Handling webhook requests[​](#handling-webhook-requests "Direct link to Handling webhook requests") We want our integration to be event-driven, so when something happens in one app (like a comment being made on a GitHub issue), our integration is alerted to the change so it can make a corresponding change in Zendesk. This video covers how to handle webhook requests from GitHub. GitHub will send our integration several types of webhook requests, including: * Issue created * Issue updated * Issue closed * Comment added Our integration will need to handle each of these webhook events differently. 
The best way to do that is with [branching](https://prismatic.io/docs/components/branch.md) logic - we'll detect what type of request we received, and follow a series of steps to process the request accordingly. ![GitHub to Zendesk flow](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/github-zendesk-flow.png) ##### Avoiding infinite loops with branching[​](#avoiding-infinite-loops-with-branching "Direct link to Avoiding infinite loops with branching") Looking at the flow that handles GitHub webhook requests, you may notice the "short-circuit" logic under the **comment** branch. It would be very easy to create an infinite loop with this integration: * A comment is added to a GitHub issue - this triggers a GitHub webhook * A corresponding Zendesk comment is added - this triggers a Zendesk webhook * A corresponding GitHub comment is added * GOTO 10 We avoid this by prepending messages from the integration with `[Comment Created in GitHub]` or `[Comment Created in Zendesk]`. Then, when we receive a webhook request from GitHub or Zendesk we check if the message contains those strings. If the message does contain them, we stop and don't process the webhook request. There are likely more elegant approaches for tracking messages that have been passed, but for illustration purposes this is sufficient. ##### Leveraging external IDs[​](#leveraging-external-ids "Direct link to Leveraging external IDs") To map data from one application to another, it's common to leverage **external IDs**. In our case, we use GitHub's **issue number** as a Zendesk ticket's external ID. For GitHub issue number 13, for example, the Zendesk ticket is assigned an external ID `gh-13`. External IDs allow you to easily look up matching records. When a comment is added to a GitHub issue, we can look up the corresponding Zendesk ticket by fetching the ticket with external ID `gh-13`, and then we can add a comment to the ticket we looked up. ![GitHub to Zendesk flow](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/leverage-external-ids.png) ##### Validating HMAC signatures[​](#validating-hmac-signatures "Direct link to Validating HMAC signatures") We want to ensure that messages received from GitHub and Zendesk originated from GitHub and Zendesk (rather than from some malicious party). It's common practice with webhooks to **sign** a message using **Hash-Based Message Authentication Codes** (HMAC). We have an entire [article on HMAC](https://prismatic.io/docs/integrations/triggers/webhook/what-is-hmac.md), but the quick summary is that you and the third party (GitHub or Zendesk) know some secret, and the third-party uses that secret to generate a unique hash of the message they sent. By verifying the HMAC signature of a webhook request, you can be sure that the message originated from the correct party, since no one else knows your HMAC signing secret. Different apps implement HMAC in different ways: * When Zendesk sends a webhook request, it includes an HMAC signature header and a webhook ID header. The Zendesk trigger, then, uses the Zendesk connection to fetch the webhook's signing secret and verifies the HMAC signature. This is all done for you by the built-in Zendesk trigger. * GitHub lets you optionally set an HMAC signing secret when you create the webhook. We can use a config variable that users set at deployment time as the HMAC signing secret. The GitHub trigger references that config variable to verify webhook HMAC signature headers. 
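Under the hood, HMAC verification is just recomputing the hash of the raw request body and comparing it to the signature the sender supplied. The built-in triggers do this for you, but as a rough sketch of what GitHub's `X-Hub-Signature-256` check amounts to (the secret handling here is simplified):

```javascript
const crypto = require("crypto");

// Recompute the HMAC-SHA256 digest of the raw request body using the shared
// signing secret, then compare it to the signature GitHub sent in the
// X-Hub-Signature-256 header (formatted as "sha256=<hex digest>").
function isValidGitHubSignature(rawBody, signatureHeader, signingSecret) {
  const expected =
    "sha256=" +
    crypto.createHmac("sha256", signingSecret).update(rawBody).digest("hex");

  // timingSafeEqual avoids leaking information through comparison timing.
  const a = Buffer.from(expected);
  const b = Buffer.from(signatureHeader || "");
  return a.length === b.length && crypto.timingSafeEqual(a, b);
}
```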
For both apps, their respective triggers throw an error and immediately stop an execution if HMAC signatures are not correct, preventing malicious messages from being processed. #### Automating webhook setup[​](#automating-webhook-setup "Direct link to Automating webhook setup") In the previous video, we manually set up webhooks in GitHub. But, that's not a process we want our customers to go through. In this video, we'll automate the process of setting up webhooks in GitHub and Zendesk when a customer deploys our integration. To run some "setup" logic when an instance of our integration is deployed, we'll use an [Instance Deploy](https://prismatic.io/docs/components/management-triggers.md#instance-deploy) trigger. You can similarly use an [Instance Remove](https://prismatic.io/docs/components/management-triggers.md#instance-remove) trigger to run "teardown" logic when an instance is removed. ##### GitHub repos and data sources[​](#github-repos-and-data-sources "Direct link to GitHub repos and data sources") We need to know which GitHub repository to configure a webhook for. To determine that, we'll need to present a list of the customer's GitHub repositories in a dropdown menu when they configure our integration. The GitHub component's [List Repos](https://prismatic.io/docs/components/github.md#list-repos) data source will allow us to create that dropdown menu. ![GitHub configure connection](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/github-configure-dropdown.png) The data source will use the end user's OAuth 2.0 credentials to fetch a list of repositories that the user has access to, and it will present the repositories as a dropdown picklist menu. ![GitHub complete connection](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/github-dropdown-menu.png) ##### Removing webhooks on instance removal[​](#removing-webhooks-on-instance-removal "Direct link to Removing webhooks on instance removal") We also want to remove webhooks that we've created if someone removes an instance of the integration. To do that, we'll begin another flow with an [Instance Remove](https://prismatic.io/docs/components/management-triggers.md#instance-remove) trigger, which executes when someone deletes an instance. We can leverage GitHub and Zendesk's respective **Delete Instance Webhooks** actions. These actions are aware of the current instance's webhook URLs, and remove only webhooks in GitHub and Zendesk that target those URLs. The clean-up flow consists of a trigger and three steps: ![Advanced integration instance remove flow to remove webhooks](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/instance-remove-flow.png) Don't delete other apps' webhooks! It's easy to make the mistake of listing webhooks and removing all of them. Use caution when listing and removing webhooks - some of your customer's webhooks may be configured for other apps and are completely unrelated to your integration. Most built-in components that support webhooks have a **Delete Instance Webhooks** action that removes only webhooks for your instance. 
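For context on what the deploy flow is automating: registering a webhook with GitHub's REST API is a single call per repository, pointing GitHub at the flow's webhook URL and (optionally) setting the HMAC secret discussed earlier. A hedged sketch of that underlying request follows; the event list and parameter names are illustrative, and inside the integration the built-in GitHub steps make this call on your behalf:

```javascript
// Roughly what webhook setup against GitHub's REST API looks like for one repo.
async function createIssueWebhook(accessToken, owner, repo, flowWebhookUrl, secret) {
  const response = await fetch(`https://api.github.com/repos/${owner}/${repo}/hooks`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      Accept: "application/vnd.github+json",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      name: "web",
      active: true,
      events: ["issues", "issue_comment"], // issue and comment events from the spec
      config: { url: flowWebhookUrl, content_type: "json", secret },
    }),
  });
  return response.json();
}
```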
#### Preparing for the integration marketplace[​](#preparing-for-the-integration-marketplace "Direct link to Preparing for the integration marketplace") Now that we've built and tested a fully functional integration that sets up webhooks and syncs data between GitHub and Zendesk, we're ready to publish it to our [embedded Prismatic marketplace](https://prismatic.io/docs/embed/marketplace.md). Additional metadata can be [added](https://prismatic.io/docs/integrations/low-code-integration-designer.md#categorizing-integrations) to our integration, and helpful documentation, images, and even raw HTML can be added to the [config wizard](https://prismatic.io/docs/integrations/config-wizard.md) that your customers work through. ##### What your customer sees[​](#what-your-customer-sees "Direct link to What your customer sees") When your customer deploys an instance of your integration for themselves, they will not see the flow editor or any of the components that you used to build the integration. Instead, they'll see a configuration wizard that you've designed. The wizard will guide them through the process of connecting their GitHub and Zendesk accounts: ![GitHub to Zendesk config wizard 1](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/config-wizard-1.png) Then, it will ask them to select a GitHub repository from a dropdown menu: ![GitHub to Zendesk config wizard 2](/docs/img/integrations/low-code-integration-designer/get-started/full-integration/config-wizard-2.png) #### Conclusion[​](#conclusion "Direct link to Conclusion") This example integration modeled what a bi-directional, event-driven integration could look like. We used Zendesk and GitHub for illustration purposes, but you can build similar integrations between your app and a [variety](https://prismatic.io/docs/components.md) of other third-party apps. --- ##### Looping For many integrations, it's useful to be able to loop over an array of items or to loop a certain number of times. If your integration processes files on an SFTP server, for example, you might want to loop over an array of files on the server. If your integration sends alerts to users, you might want to loop over an array of users. Prismatic provides the [loop component](https://prismatic.io/docs/components/loop.md) to allow you to loop over an array of items, or you can loop a predetermined number of times. After adding a **loop** step to your integration, you can then add steps within the loop that will execute over and over again. The **loop** component takes one input: **items**. **Items** is an array - an array of numbers, strings, objects, etc. For example, one step might generate an array of files that your integration needs to process. 
Its output might look like this: ``` [ "path/to/file1.txt", "path/to/file2.txt", "path/to/file3.txt", "path/to/file4.txt" ] ``` The loop component can then be configured to loop over those files by referencing the `results` of the **list files** step: ![Loop over files by referencing results of list files step in Prismatic app](/docs/img/integrations/low-code-integration-designer/looping/loop.png) Subsequent steps can reference the loop step's `currentItem` and `index` parameters to get values like `path/to/file3.txt` and `2` respectively: ![Loop over items to get file paths in Prismatic app](/docs/img/integrations/low-code-integration-designer/looping/loop-current-item.png) **For More Information**: [The Loop Component](https://prismatic.io/docs/components/loop.md), [Looping Over Files Quickstart](https://prismatic.io/docs/integrations/common-patterns/loop-over-files.md), [Looping Over a Paginated API](https://prismatic.io/docs/integrations/common-patterns/loop-over-paginated-api.md) #### Looping over lists of objects[​](#looping-over-lists-of-objects "Direct link to Looping over lists of objects") The list of objects passed into a loop component can be as simple or complex as you like. In this example, if we have a loop named **Loop Over Users**, and the loop was presented **items** in the form: ``` [ { "name": "Bob Smith", "email": "bob.smith@progix.io" }, { "name": "Sally Smith", "email": "sally.smith@progix.io" } ] ``` Then the loop will iterate twice - once for each object in the list, and we can write a code component that accesses the loop's `currentItem` and `index` values and sub-properties of `currentItem` like this: ``` module.exports = async ( { logger }, { loopOverUsers: { currentItem, index } }, ) => { logger.info(`User #${index + 1}: ${currentItem.name} - ${currentItem.email}`); }; ``` That will log lines like `User #1: Bob Smith - bob.smith@progix.io`. #### Looping over a paginated API[​](#looping-over-a-paginated-api "Direct link to Looping over a paginated API") Many third-party APIs limit the number of records you can fetch at once and let you load a batch (or "page") of records at a time. You may need to loop over an unknown number of pages of records in an integration. You can accomplish that with a combination of two loops (one to loop over pages and one to loop over records on each page) and a [break loop](https://prismatic.io/docs/components/loop.md#break-loop) action that stops loading pages when there are no more left to load: ![Loop over paginated API in Prismatic app](/docs/img/integrations/low-code-integration-designer/looping/paginated-loop.png) Please reference [this quickstart](https://prismatic.io/docs/integrations/common-patterns/loop-over-paginated-api.md) for an example of how to loop over a paginated API. #### Return values of loops[​](#return-values-of-loops "Direct link to Return values of loops") A loop will collect the results of the **last** step within the loop and will save those results as an array.
For example, if the loop is presented with the list of JSON-formatted user objects [above](#looping-over-lists-of-objects), and the last step in the loop is a code component reading: ``` module.exports = async (context, { loopOverUsers: { currentItem } }) => { return { data: `Processed ${currentItem.email}` }; }; ``` Then the `result` of the loop will yield: ``` ["Processed bob.smith@progix.io", "Processed sally.smith@progix.io"] ``` --- ##### Passing Data Between Steps #### Step outputs[​](#step-outputs "Direct link to Step outputs") When a step runs, it may output data that subsequent steps can consume as input. For example, an SFTP **List Files** step outputs an array of file names: ![Example of step outputs as a list in Prismatic app](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/step-output-0.png) An SFTP **Get File** step outputs the contents of a file retrieved from an SFTP server (in this case, an image): ![Example of step outputs as a file in Prismatic app](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/step-output-1.png) Outputs take one of three forms: * A primitive value, like a **string**, **boolean**, **number**, or **array** of primitives. A subsequent step that references this output will receive the string, boolean, number, or array as input. * An **object**. An output might include multiple key-value pairs: ``` { "key1": "value1", "key2": ["value2.0", "value2.1", "value2.2"] } ``` You might see this after retrieving JSON from an HTTP endpoint. Specific values from an object can be referenced as inputs using familiar JavaScript `.dot` and `["bracket"]` notation. Using the above example, to access `value2.1`, you would reference `results.key2[1]`. * A **binary file**. Binary file outputs contain a combination of a file `Buffer` and content type (like `"image/png"`) in the form: ``` { "data": Buffer, "contentType": String } ``` Note: An action can return a combination of JSON and binary file(s) if properties of the JSON object are objects with `data` and `contentType` properties. **For More Information**: [Custom Component Action Results](https://prismatic.io/docs/custom-connectors/actions.md#perform-function-return-values) #### Configuring step inputs[​](#configuring-step-inputs "Direct link to Configuring step inputs") After adding a step to your integration, you will typically need to configure inputs for that step. Inputs might include a RESTful URL endpoint, an S3 bucket name, a Slack webhook to invoke, or even a binary file such as an image or PDF to upload or process. Some inputs are required and denoted with a `*` symbol, while others are optional. Inputs can take one of four forms: **value**, **reference**, **config variable**, or **template**. **Value** inputs are static strings, **reference** inputs reference the results of a previous step, **config variable** inputs reference customer-specific config variables, and **template** inputs allow you to concatenate static strings, config variables, and step result references. use the Join Lines action for multi-line input values If you need to enter multiple lines of text for an input value, you can use the [Join Lines](https://prismatic.io/docs/components/text-manipulation.md#join-lines) action to concatenate multiple lines of text into a single string.
![Join lines action in an input](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/join-lines-input.png) ##### Value inputs[​](#value-inputs "Direct link to Value inputs") A **value** is a simple string (perhaps a URL for an HTTP request). When you set a **value** for an input, that static value will be used as input for all your customers: ![Set input value in Prismatic app](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/input-value.png) ##### Reference inputs[​](#reference-inputs "Direct link to Reference inputs") A **reference** is a reference to the output of a previous step. For example, if a previous step retrieves a file from Amazon S3 and the step is named **Fetch my file**, then you can reference **Fetch my file** as input for another step, and that subsequent step will receive the file that **Fetch my file** returned. Outputs from one step can be referenced by a subsequent step by referencing the previous step's **results** field. For instance, if a previous step returned an object - such as when an **HTTP - GET** action retrieved JSON reading `{ "firstKey": "firstvalue", "secondKey": "secondvalue" }` - you can access that `firstvalue` property in a subsequent step's input by selecting the **HTTP - GET** step and choosing `results.firstKey` in your **Reference search**: ![Reference earlier step result as step input in Prismatic app](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/input-reference.png) ##### Config variable inputs[​](#config-variable-inputs "Direct link to Config variable inputs") A **config variable** references one of the integration's config variables. For example, we can select a config variable, `CMS API Endpoint`, as input for one of our steps. Config variables can be distinct for each customer, allowing each customer to be configured with a different `CMS API Endpoint`: ![Config variable inputs in Prismatic app](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/input-config-var.png) ##### Template inputs[​](#template-inputs "Direct link to Template inputs") Finally, a **template** is a combination of string values, config variables, or step result references. You can concatenate several strings, config variables, and step results together to serve as a single input. Templates are useful when an input needs to be composed from various config variables and step results. For example, suppose you want to make an HTTP request to an API endpoint that is stored as a config variable and fetch an item whose ID was obtained in a previous step. You could combine the API endpoint, URL path, and product ID like this: ![Add template input in Prismatic app](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/input-template.png) A static string, like `/product?id=`, can be intermixed with config variables and step result references. You can add references to config variables or step results by clicking the **+** button. ![Template inputs in Prismatic app](/docs/img/integrations/low-code-integration-designer/passing-data-between-steps/input-template-add-reference.png) **For More Information**: [Custom Component Inputs](https://prismatic.io/docs/custom-connectors/actions.md#input-parameters) --- ##### Raw Request Actions Components contain actions that wrap a large number of API endpoints. But, some APIs are vast with thousands of endpoints (only some of which are relevant to integrations). 
Not every endpoint that an app offers is represented by an action in the component. That's where an **HTTP Raw Request** action is useful. Raw request actions allow you to send a request to any endpoint that an API offers, using an HTTP client that is already authenticated with the third-party. Most built-in components include raw request actions, and depending on the API they may include an action for generic HTTP requests or an action for GraphQL requests. This document details how to use raw request actions in your integration. #### Determining what endpoint to specify[​](#determining-what-endpoint-to-specify "Direct link to Determining what endpoint to specify") You can determine the endpoint URL to use by looking at the API's documentation. For example, the [Asana API documentation](https://developers.asana.com/reference/rest-api-reference) lists all of the endpoints that they offer. To [Get audit log events](https://developers.asana.com/reference/getauditlogevents) from Asana, you need to send a GET request to `https://app.asana.com/api/1.0/workspaces/{workspace_gid}/audit_log_events`. The Asana component helper text notes that it fills in the base URL `https://app.asana.com/api/1.0` for you. So, you would just need to construct the remaining `/workspaces/{workspace_gid}/audit_log_events` portion of the URL. ![Raw request URL input with config variable template](/docs/img/integrations/low-code-integration-designer/raw-request-actions/url-input.png) The comments and example you see when you first input the URL are provided by the component developer and let you know what the base URL is and what the rest of the path should look like. Override a raw request base URL You can override a raw request base URL by specifying a fully qualified URL in the URL input. For example, if you wanted to send a request to `https://my-api.example.com/some/endpoint` from the Asana raw request component, you can specify that full URL in the URL input and the component's base URL will be ignored. #### Sending JSON to an API using raw request[​](#sending-json-to-an-api-using-raw-request "Direct link to Sending JSON to an API using raw request") The majority of modern APIs expect JSON data in the request body. You can send JSON data to an API using a raw request action by constructing a JSON string in the `data` input. include a content-type header Most JSON-based APIs require that you specify a `content-type` header of `application/json` when sending JSON data. Otherwise, you may see an error from the API that the request body is not valid JSON. If you reference a JavaScript object in the `data` input, the component will automatically set the `content-type` header to `application/json` for you. If you specify a JSON string, manually include the `content-type` header in the `Headers` input. ![Raw request JSON content type header](/docs/img/integrations/low-code-integration-designer/raw-request-actions/json-content-type.png) Additionally, you can reference a JavaScript object in the `data` input, and the component will automatically convert it to a JSON string for you. This is useful if you have a code step that constructs a JavaScript object that you want to send to the API. 
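For instance, a code step like the following (a sketch; the object contents are hypothetical) could build the payload, and you would then reference that step's `results` in the raw request's `data` input:

```javascript
// Code step that assembles a request body as a JavaScript object.
// Referencing this step's results in a raw request `data` input lets the
// component serialize it to JSON and set the content-type header for you.
module.exports = async (context, stepResults) => {
  return {
    data: {
      name: "Example workspace", // hypothetical fields for illustration
      description: "Created by my integration",
      tags: ["demo", "raw-request"],
    },
  };
};
```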
![Raw request data from a JavaScript object](/docs/img/integrations/low-code-integration-designer/raw-request-actions/javascript-object-data-input.png) #### Sending non-JSON text to an API using raw request[​](#sending-non-json-text-to-an-api-using-raw-request "Direct link to Sending non-JSON text to an API using raw request") For non-JSON text data, like XML, CSV, etc., you can use the [Change Data Format](https://prismatic.io/docs/components/change-data-format.md) component to serialize data into the appropriate format. Like JSON data, ensure that you specify an appropriate `content-type` header (e.g. `text/csv`, `application/xml`, etc.). #### Sending form data to an API using raw request[​](#sending-form-data-to-an-api-using-raw-request "Direct link to Sending form data to an API using raw request") Form data inputs are useful for sending data to APIs that expect a content type `application/x-www-form-urlencoded`. Form data is largely used when you need to send several types of data in a single request, such as a file along with some metadata. To send form data, first ensure that you have cleared the `data` input - you can't send both data and form data together. Then, specify form data key/value pairs. In the below example, we send both a simple string `userid` and an XML payload `person-xml`. We also send a file `profile-picture` by referencing a picture from a previous step, and we give the file a name using the `File Data File Names` input: ![Raw request form data inputs](/docs/img/integrations/low-code-integration-designer/raw-request-actions/form-data-inputs.png) Serialize JSON before sending Unlike the `data` input, form data inputs cannot accept JavaScript objects. Serialize the JavaScript object into a JSON string (or XML string, etc.) before referencing it. #### Sending custom parameters and headers using raw request[​](#sending-custom-parameters-and-headers-using-raw-request "Direct link to Sending custom parameters and headers using raw request") The `Query Parameters` input allows you to specify custom query parameters to send to the API (that's the `?key=value` portion of the URL). While you could specify query parameters in the URL input through a [template input](https://prismatic.io/docs/integrations/low-code-integration-designer/passing-data-between-steps.md#template-inputs), string concatenation is prone to encoding issues. The `Query Parameters` input ensures that your query parameters are properly URI-encoded. The `Headers` input allows you to specify custom headers to send to the API. In addition to the usual `Content-Type` header, you may need to specify other headers like `Accept` or a custom header like `X-Tenant-ID`. ![Raw request parameters and headers inputs](/docs/img/integrations/low-code-integration-designer/raw-request-actions/parameters-and-headers.png) #### Response data types for raw request actions[​](#response-data-types-for-raw-request-actions "Direct link to Response data types for raw request actions") The `Response Type` input allows you to specify how you would like the response data to be formatted. * `json` is the default response type and will return the response data as a JavaScript object. This type assumes that the API returns `application/json` response data. * `text` will return the response data as a string. This type assumes that the API returns `text/plain`, `text/html`, or other text-based response data. * `arraybuffer` is used when you expect a binary file response. Use this if you expect a file, like a PDF, image, etc. 
#### Debugging raw request actions[​](#debugging-raw-request-actions "Direct link to Debugging raw request actions") If you're having trouble with a raw request action, you can toggle the `Debug` input to see the full request and response data in logs (just remember to toggle it back before deploying to production!). This is useful for debugging issues with the request body, headers, etc. A tool like [Postman Echo](https://learning.postman.com/docs/developer/echo-api/) is also useful for echoing back the request that you're sending. To use Postman Echo, set the URL to `https://postman-echo.com/post` and the `HTTP Method` to `POST`. The raw request step's result will contain the full request data that you sent, which you can compare to the API's documentation to ensure that you're sending the correct data. Another tool that is useful for debugging raw requests is the [mendhak/http-https-echo](https://hub.docker.com/r/mendhak/http-https-echo) Docker image. This Docker container will print and echo any request that it receives. To run the Docker container and expose its ports, run the following command: ``` docker run -p 8080:8080 -p 8443:8443 --rm -t mendhak/http-https-echo:latest ``` In a separate terminal, run [ngrok](https://ngrok.com/) to expose the Docker container to the internet: ``` ngrok http 8080 ``` `ngrok` will give you a public URL, like `https://73ed-123-45-67-89.ngrok-free.app`, which you can use as the URL in your raw request action. When you run your integration, you can see the full request data in the Docker container's logs. ![Raw request ngrok echo](/docs/img/integrations/low-code-integration-designer/raw-request-actions/ngrok-echo.png) tip If you can get an HTTP request to work in a tool like `curl` or [Postman](https://www.postman.com/) but cannot get it to work in a raw request action, send both the raw request and Postman request to your `ngrok` / echo endpoint. Comparing the two requests side-by-side can help you identify what is different between them. #### Building an HTTP raw request action in your custom component[​](#building-an-http-raw-request-action-in-your-custom-component "Direct link to Building an HTTP raw request action in your custom component") You can build your own raw request actions for your custom components. That's useful if you have a large API but don't have the development resources to build out actions for every endpoint. A raw request action can be used by your integration builders to send requests to any endpoint that your API offers. For an example raw request action, see our [GitHub examples repo](https://github.com/prismatic-io/examples/blob/main/components/asana/src/actions/rawRequest.ts) for the code that backs our Asana component's [raw request action](https://prismatic.io/docs/components/asana.md#raw-request). Prismatic components use standard inputs and a `sendRawRequest` function to build built-in raw request actions. You can import the same inputs and function from the custom component SDK, `@prismatic-io/spectral`, so you don't need to write that code yourself.
Example raw request action from the Asana component ``` import { action } from "@prismatic-io/spectral"; import { inputs as httpClientInputs, sendRawRequest, } from "@prismatic-io/spectral/dist/clients/http"; import { connectionInput } from "../inputs"; const rawRequest = action({ display: { label: "Raw Request", description: "Send a raw HTTP request to Asana API", }, inputs: { connection: connectionInput, ...httpClientInputs, url: { // Optional; this adds component-specific instructions to the URL input ...httpClientInputs.url, comments: "Input the path only (/goals), The base URL is already included (https://app.asana.com/api/1.0). For example, to connect to https://app.asana.com/api/1.0/goals, only /goals is entered in this field.", example: "/goals", }, }, perform: async (context, { connection, ...httpClientInputs }) => { const asanaToken = connection?.token?.access_token || connection?.fields?.apiKey; const { data } = await sendRawRequest( "https://app.asana.com/api/1.0", // Change this to your API's base URL httpClientInputs, { Authorization: `Bearer ${asanaToken}` }, // Authorization methods vary by API ); return { data }; }, }); export default rawRequest; ``` You can likely copy and paste the above code, changing the helpful `comments` and `example` for the `URL` input and changing the base URL and authorization header to match your API. #### Sending GraphQL requests using raw request[​](#sending-graphql-requests-using-raw-request "Direct link to Sending GraphQL requests using raw request") Some APIs, like [Fluent Commerce](https://prismatic.io/docs/components/fluent-commerce.md), are GraphQL-based. These built-in components generally have `Generic GraphQL Request` actions that you can use to send GraphQL requests. The generic GraphQL request action has a `Query or Mutation` input, which is the GraphQL query or mutation that you want to send. It's wise to parameterize queries and mutations using variables (to avoid QL-injection issues). Most generic GraphQL request actions have a `Variables` input, which is a key/value input where you can specify variables and their values that your mutation uses. It also generally includes a `Variables Object` input if you would like to provide a key/value object from a previous step. `Variables` and `Variables Object` are merged together and can be used in tandem. For example, suppose we want to send this mutation: ``` mutation myMutation( $customerName: String! $customerDescription: String! $labels: [String] ) { createCustomer( input: { name: $customerName description: $customerDescription labels: $labels } ) { id } } ``` You could reference `customerName` from a previous step but also supply `customerDescription` or `labels` from a previous step using the `Variables Object` input: ![GraphQL Raw request variables object input](/docs/img/integrations/low-code-integration-designer/raw-request-actions/graphql-variables-object-input.png) Construct GraphQL queries and mutations first using a GraphQL client GraphQL APIs often offer a web-based GraphQL explorer where you can construct queries and mutations. We recommend using a GraphQL client tool to construct your query or mutation first and then copy/pasting it into the `Query or Mutation` input. 
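Continuing the mutation above, a previous code step could assemble the values for `$customerDescription` and `$labels`, and you would then reference that step's `results` in the **Variables Object** input. A sketch with hypothetical values:

```javascript
// Code step producing an object to feed the GraphQL Variables Object input.
// The keys match the variable names declared in the mutation above.
module.exports = async (context, stepResults) => {
  return {
    data: {
      customerDescription: "Rocket manufacturer onboarded via integration",
      labels: ["aerospace", "priority"],
    },
  };
};
```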
#### Building a GraphQL raw request action in a custom component[​](#building-a-graphql-raw-request-action-in-a-custom-component "Direct link to Building a GraphQL raw request action in a custom component") A generic GraphQL raw request action is similar to an HTTP raw request action, but instead of generic data inputs it has inputs for the query/mutation to run and the parameterized variables to use. This example requires three additional dependencies: ``` npm install graphql graphql-request lodash.merge ``` In the below example, we use the `graphql-request` library to prepare and send the GraphQL request (including the query/mutation and parameterized variables). `lodash.merge` is used to merge the `Variables` and `Variables Object` inputs together, as you may want to specify some variables in the UI and reference other variables as an object from a previous step. The portions of code you will need to change are the GraphQL API URL, and any custom authorization headers that your API requires. Example GraphQL raw request action ``` import { GraphQLClient } from "graphql-request"; import { Connection, action, component, connection, input, util, } from "@prismatic-io/spectral"; import merge from "lodash.merge"; const createClient = (connection: Connection) => new GraphQLClient( "https://app.prismatic.io/api", // Replace this URL with your API { headers: { Authorization: `Bearer ${connection.fields.apiKey}` }, // Authorization methods vary by API }, ); const genericGraphQLQuery = action({ display: { label: "Generic GraphQL Query", description: "Issue a query or mutation against the GraphQL API", }, inputs: { connection: input({ label: "Connection", type: "connection", required: true, }), query: input({ label: "Query or Mutation", type: "code", required: true, language: "graphql", clean: util.types.toString, }), // Variables are presented in the UI as a key-value pair list, and are handy // if you know ahead of time how many variables your query includes variables: input({ label: "Variables", type: "string", required: false, collection: "keyvaluelist", clean: (val: any) => util.types.keyValPairListToObject(val), }), // Variables Object is presented in the UI as a JSON editor, and is handy // if you don't know ahead of time how many variables your query includes, // or if you want to reference entire key/value objects from previous steps. variablesObject: input({ label: "Variables Object", type: "code", language: "json", required: false, clean: (value) => (value ? util.types.toObject(value) : {}), }), }, perform: async (context, params) => { const client = createClient(params.connection); const data = await client.request( params.query, merge(params.variables, params.variablesObject), // Merge the two variables inputs together ); return { data }; }, }); ``` --- ##### Steps #### Integration steps[​](#integration-steps "Direct link to Integration steps") Actions, like downloading a file from an [SFTP server](https://prismatic.io/docs/components/sftp.md) or posting a message to [Slack](https://prismatic.io/docs/components/slack.md), are added as **steps** of an integration. Steps are executed in order, and outputs from one step can be used as inputs for subsequent steps. Steps are run in order from top to bottom, and you can add conditional logic to your integration with a [branch](https://prismatic.io/docs/integrations/low-code-integration-designer/branching.md) or run a series of steps on a data set in a [loop](https://prismatic.io/docs/integrations/low-code-integration-designer/looping.md). 
If one step throws an error, the integration stops running. #### The trigger step[​](#the-trigger-step "Direct link to The trigger step") The first step of your integration is the **trigger** step, which determines when instances of your integration will run. The [integration triggers](https://prismatic.io/docs/integrations/triggers.md) article details how triggers work and how to invoke your integration. #### Adding steps to integrations[​](#adding-steps-to-integrations "Direct link to Adding steps to integrations") To add a step to an integration, click the **+** icon underneath the trigger or another action. Select the component and action you would like to add to your integration. For example, you can choose the **Amazon DynamoDB** component and then select the **Create Item** action. You can begin to type the name of the component or action you would like to add to filter the list of components and actions available. ![Choose component or action to add step in Prismatic app](/docs/img/integrations/low-code-integration-designer/steps/add-step.png) ##### Choosing component versions[​](#choosing-component-versions "Direct link to Choosing component versions") Components are [versioned](https://prismatic.io/docs/custom-connectors/publishing.md#component-versioning). You can choose a version of each component (custom or built-in) that works for your integration. "Pinning" component versions for your integration prevents accidental regressions if a new version of a component is published that contains breaking changes. To choose what version of each component your integration uses, click the **Component Versions** button on the right-hand side of the integration designer. You can choose to run the latest version of a component or any previous version. Components running the latest available version will be marked in grey, while components running a version for which there is a newer version available will be marked in yellow. ![Component version list highlighting latest version in Prismatic app](/docs/img/integrations/low-code-integration-designer/steps/component-version-drawer.png) To change the version of a component your integration uses, click the **CHANGE VERSION** button to the right of the component you want to change and select a version from the dropdown. #### Cloning steps[​](#cloning-steps "Direct link to Cloning steps") If you would like to make a copy of a step in your integration, click the **...** button next to the step and then select **Duplicate**. ![Clone a step in Prismatic app](/docs/img/integrations/low-code-integration-designer/steps/clone-step.png) This will copy the step, including any inputs you've configured for the action. #### Changing step actions[​](#changing-step-actions "Direct link to Changing step actions") If you would like to change the action that a step uses, click the **...** button next to the step and then select **Change Step Action**. ![Change a step's action in Prismatic app](/docs/img/integrations/low-code-integration-designer/steps/change-step-action.png) You will be prompted to select a different action and then will be prompted to fill in that new action's inputs. #### Changing step names[​](#changing-step-names "Direct link to Changing step names") By default, steps are uniquely named after the action they invoke (so they're named things like **CSV to YAML** or **Delete Object**). To override that default name, click the step and open the **Details** tab in the step configuration drawer. 
Like descriptive variable names in a program, descriptive step names make your integration easier to read. Rather than `HTTP - PUT`, you could give your step a name like **Update Record in Acme**. We recommend giving your steps descriptive names and descriptions so your team members can read through integrations and understand their purpose more readily. ![Rename a step in Prismatic app](/docs/img/integrations/low-code-integration-designer/steps/rename-step.png) #### Reordering steps[​](#reordering-steps "Direct link to Reordering steps") Steps are executed serially. To reorder steps, click and drag a step up or down. ![Reorder flow steps in Prismatic app](/docs/img/integrations/low-code-integration-designer/steps/reorder-steps.webp) --- ##### Testing Integrations The integration designer provides a sandbox for testing integrations. From the designer, you can invoke a test instance of your integration, configure test values for config variables, and view test logs in real time. After you set test values for config variables, you can test your integration by clicking the green **Run** button. If your integration is made up of multiple [flows](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md), each flow is tested independently. Click the flow name on the top of the integration designer area, select the flow you would like to test, and then click **Run** for that flow. Note that each flow has a distinct webhook URL, so if you are invoking the integration from a third-party app via webhook, you'll need to note the flow's webhook URL. #### The test runner drawer[​](#the-test-runner-drawer "Direct link to The test runner drawer") The test runner drawer is where you can configure and run tests of your integration. It's located at the bottom of the integration designer screen. ![Test runner drawer in Prismatic app](/docs/img/integrations/low-code-integration-designer/testing/test-runner-drawer.png) If you build a code-native integration, the designer canvas will be hidden (as all of your integration logic exists in code), but the same test configuration experience will be available to you. #### Test instance config variables[​](#test-instance-config-variables "Direct link to Test instance config variables") If your integration uses config variables, you can specify testing values for those variables by clicking **Test Configuration** in the **Test Runner** drawer and then selecting **Test-instance configuration**. You will be prompted to fill out the same configuration wizard that your customers will see when they deploy an instance of your integration. If you specified default values for your [config variables](https://prismatic.io/docs/integrations/config-wizard/config-variables.md), those will be preset for you. Otherwise, fill in testing values and connection information for the purposes of testing your integration. ![Config wizard for testing integrations in Prismatic app](/docs/img/integrations/low-code-integration-designer/testing/config-wizard.png) We recommend that you create non-production sandbox credentials for integration tests. #### Running a test of your flow[​](#running-a-test-of-your-flow "Direct link to Running a test of your flow") To run a test of your flow, click the green **Run** button in the **Test Runner** drawer. If you would like to send data to your integration's trigger as a webhook payload, you can specify that payload in the **Test Configuration** tab by clicking **Trigger payload**.
![Trigger payload dialog for testing integrations in Prismatic app](/docs/img/integrations/low-code-integration-designer/testing/trigger-payload.png) Within the trigger payload dialog, you can also specify custom HTTP headers to be sent with the webhook request. If you would like to invoke your integration from an external system (i.e. send a webhook from your app or a third-party system), copy the webhook URL that is displayed in the **Trigger payload** dialog and send HTTP requests to that endpoint. #### Replaying test invocations[​](#replaying-test-invocations "Direct link to Replaying test invocations") Time-saving tip Just like [instance replays](https://prismatic.io/docs/monitor-instances/retry-and-replay.md), you can replay a test integration invocation. That comes in handy if you are testing an integration invocation from a third-party app. You don't need to set up your third-party environment every time - you can send a webhook invocation once from your third-party app with a payload and run that same payload through your integration until you're happy with the results. To replay a test integration invocation, open the **Test Runner** drawer and select a test that you'd like to replay. Click the replay button to the right of the test. ![Replay test integration invocation in Prismatic app](/docs/img/integrations/low-code-integration-designer/testing/test-integration-replay.png) The payload that was sent to trigger this integration test will be fed back into another test of the integration. This allows you to make changes to your integration and iterate quickly, without needing to reconfigure your third-party apps and services to fire new webhook requests over and over. #### Test run results and logs[​](#test-run-results-and-logs "Direct link to Test run results and logs") After running an integration test, the steps that ran are displayed in the **Steps** column of the **Test Runner** drawer. You can toggle the **Logs** option to show logs for each step that ran. Clicking on a step will display the step's outputs and logs in the third column. This is helpful for debugging and verifying the flow of data within your integration. **For More Information**: [Logging](https://prismatic.io/docs/monitor-instances/logging.md), [Log Retention](https://prismatic.io/docs/monitor-instances/logging.md#log-retention) #### Testing a trigger's instance deploy and instance delete events[​](#testing-a-triggers-instance-deploy-and-instance-delete-events "Direct link to Testing a trigger's instance deploy and instance delete events") In addition to receiving and processing a webhook request, some triggers contain code that runs when an instance of the integration is deployed or removed. This code is useful for setting up or removing resources that are required for the integration to run (like configuring webhooks in third-party apps). If the trigger you are using contains code that runs on instance deploy or delete, you can test that code by clicking **Test Configuration** in the **Test Runner** drawer and then selecting **Test trigger functions**. You can opt to test the instance deploy or instance delete functions. ![Test deploy and delete events in Prismatic app](/docs/img/integrations/low-code-integration-designer/testing/test-deploy-delete.png) --- #### Triggers ##### Triggers Overview Integration **triggers** define *when* a flow should run.
If your integration has multiple [flows](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md), each flow has its own trigger (and so its own webhook URL or schedule). There are several types of triggers: [App Events](https://prismatic.io/docs/integrations/triggers/app-events.md) [Invoke a flow when data changes in a third-party app. Some app event triggers rely on webhook requests sent from the third-party, while others poll the third-party app's API for changes.](https://prismatic.io/docs/integrations/triggers/app-events.md) [**Example:** A new lead is created in Hubspot, and you want to ingest the new lead data in your app.](https://prismatic.io/docs/integrations/triggers/app-events.md) [Universal Webhook](https://prismatic.io/docs/integrations/triggers/webhook.md) [The Universal Webhook trigger allows you to invoke a flow by making an HTTP request to the trigger's URL.](https://prismatic.io/docs/integrations/triggers/webhook.md) [**Example:** You want to invoke a customer's flow when the customer clicks a button in your app.](https://prismatic.io/docs/integrations/triggers/webhook.md) [Schedule](https://prismatic.io/docs/integrations/triggers/schedule.md) [Scheduled triggers allow you to create a regular schedule to dictate how often your integration should run.](https://prismatic.io/docs/integrations/triggers/schedule.md) [**Example:** A customer would like their data synced from your app to Salesforce each weekday at 8:00 AM.](https://prismatic.io/docs/integrations/triggers/schedule.md) [Management](https://prismatic.io/docs/integrations/triggers/management.md) [The management trigger allows you to invoke a flow as part of a setup or management task.](https://prismatic.io/docs/integrations/triggers/management.md) [**Example:** When a customer completes configuration of your Dropbox integration, run a flow that creates a set of directories in their Dropbox account.](https://prismatic.io/docs/integrations/triggers/management.md) [Cross Flow](https://prismatic.io/docs/integrations/triggers/cross-flow.md) [The cross flow trigger allows you to create flows that are designed to be invoked by other flows.](https://prismatic.io/docs/integrations/triggers/cross-flow.md) [**Example:** One instance fetches 10,000 records to process, and splits those records into 10 sets of 1000 records, sending each chunk to a sibling flow to be processed in parallel.](https://prismatic.io/docs/integrations/triggers/cross-flow.md) --- ##### App Event Triggers A flow with an **app event trigger** runs when some data changes in a third-party app. For example, you may want to be notified when an [Asana Project](https://prismatic.io/docs/components/asana.md#workspace-projects-trigger) is created, updated, or deleted, or when a [PagerDuty Incident](https://prismatic.io/docs/components/pagerduty.md#incidents-trigger) occurs. ![Salesforce app event trigger in flow builder](/docs/img/integrations/triggers/app-events/sfdc.png) If the connector you're working with does not have a built-in app event trigger, you can leverage the [universal webhook trigger](https://prismatic.io/docs/integrations/triggers/universal-webhook.md) to receive event notifications from a third-party app or the [schedule trigger](https://prismatic.io/docs/integrations/triggers/schedule.md) to periodically check for updates from the third-party. #### What are app events?[​](#what-are-app-events "Direct link to What are app events?") You want your flows to run when data is updated in a third-party app. 
There are two ways to detect changes to data: 1. The app notifies you of a change via a [webhook](https://prismatic.io/docs/integrations/triggers/webhook.md) request. 2. You [poll](#app-event-triggers-with-polling) the app for changes on a regular cadence, like every few minutes. ##### App event triggers with webhooks[​](#app-event-triggers-with-webhooks "Direct link to App event triggers with webhooks") Some app event triggers receive update notifications from the third-party app via [webhook](https://prismatic.io/docs/integrations/triggers/webhook.md) request. You'll need to configure the third-party app to notify you when changes occur. This can be done in one of three ways: 1. Some triggers are built to configure webhooks in a third-party app when an instance of your integration is deployed and remove webhook configuration if the instance is deleted. Examples of those triggers include [Asana's Workspace Projects Trigger](https://prismatic.io/docs/components/asana.md#workspace-projects-trigger) or [PagerDuty's Incidents Trigger](https://prismatic.io/docs/components/pagerduty.md#incidents-trigger). 2. Some triggers are built to receive webhook requests, but you'll need to build logic into a [deploy flow](https://prismatic.io/docs/integrations/common-patterns.md#deploy-time-flow) that configures the third-party app to send requests to another flow. You can see an example of that in the [Gmail](https://prismatic.io/docs/components/google-gmail.md#receiving-notifications-from-gmail) connector. 3. Some apps do not allow you to configure webhooks programmatically. You'll need to [display the instance's webhook URLs](https://prismatic.io/docs/integrations/config-wizard/config-pages.md#displaying-webhook-information-in-the-configuration-wizard) in your config wizard and have your customer manually configure webhooks in the third-party app using these URLs. Or, you could use [polling triggers](https://prismatic.io/docs/integrations/triggers/app-events.md#app-event-triggers-with-polling) to poll for changes. If you are integrating with an app that sends all of your customers' webhook requests to a single endpoint, see [Single-Endpoint Webhook Integrations](https://prismatic.io/docs/integrations/triggers/single-endpoint-webhook-integrations.md). To build your own app event trigger in a custom connector that supports webhooks, see [instance deploy and delete events for triggers](https://prismatic.io/docs/custom-connectors/triggers.md#instance-deploy-and-delete-events-for-triggers). ###### Rate limiting app events[​](#rate-limiting-app-events "Direct link to Rate limiting app events") If you are receiving a high volume of webhook requests, you may need to implement rate limiting to avoid overwhelming your integration or downstream services. This can be done by queuing incoming requests and processing them at a controlled rate using [Flow Concurrency Management](https://prismatic.io/docs/integrations/triggers/fifo-queue.md). ##### App event triggers with polling[​](#app-event-triggers-with-polling "Direct link to App event triggers with polling") Some apps do not support webhooks, or their webhook configuration is tedious. Polling triggers are useful when you want to be notified when data changes in those apps. A polling trigger will poll an external API on a schedule that you set (for example, "every 5 minutes"), and if new data is available since the last time it polled, a full execution will run so your flow can process the data. Some sort of cursor is stored internally on the trigger.
The cursor might be an ID of a record that was last seen or an "Updated At" timestamp. The cursor acts as a "bookmark" so the trigger knows what records to fetch the next time it runs. Use singleton executions to ensure you don't double-process data Like other [persisted data](https://prismatic.io/docs/integrations/persist-data.md), your trigger's cursor is loaded when an execution starts and is written out when the execution finishes. If your flow takes 10 minutes to run and you configure your trigger to run every 5 minutes, a second execution will fetch and process the same changes that the first fetched, since the first didn't finish before the second began. To prevent double-processing data, enable [singleton executions](https://prismatic.io/docs/integrations/triggers/schedule.md#ensuring-singleton-executions-for-scheduled-flows) on your polling trigger. Note that an execution runs each time a polling trigger looks for new data. If no new data is available, the execution immediately stops, and no additional steps run. If you navigate to an executions screen, you can filter executions that found no new data to process by clicking **Filter** and selecting **Exclude executions without trigger-detected changes**. Examples of connectors that implement polling triggers include [Dropbox](https://prismatic.io/docs/components/dropbox.md#new-and-updated-files) and [Google Drive](https://prismatic.io/docs/components/google-drive.md#new-and-updated-files). To build your own app event trigger in a custom connector that supports polling, see the [Custom Triggers](https://prismatic.io/docs/custom-connectors/triggers.md#app-event-polling-triggers) article. --- ##### Cross-Flow Trigger #### Cross-Flow trigger overview[​](#cross-flow-trigger-overview "Direct link to Cross-Flow trigger overview") Sometimes, you may want to have one flow invoke an execution of another flow. Common examples include: * [Parallel processing](https://prismatic.io/docs/integrations/common-patterns/processing-data-in-parallel.md) of tens of thousands of records * Simplifying large and complex integrations into smaller chunks * Creating reusable logic that can be referenced by other flows For situations like these, we recommend using the [Cross-Flow](https://prismatic.io/docs/components/cross-flow.md) component's trigger and Invoke Flow action. Cross-flow executions are linked in Prismatic's API, and you can readily identify which flows called other flows. #### Adding a cross-flow trigger[​](#adding-a-cross-flow-trigger "Direct link to Adding a cross-flow trigger") To declare that a certain flow should be invoked by others, begin the flow with a [Cross-Flow Trigger](https://prismatic.io/docs/components/cross-flow.md#cross-flow-trigger). ![Add a cross-flow trigger to an integration](/docs/img/integrations/triggers/cross-flow/add-trigger.png) Note that cross-flow triggers can be invoked like regular [webhook triggers](https://prismatic.io/docs/integrations/triggers/webhook.md) through their webhook URL, but invoking the sibling flow through the [Invoke Flow](https://prismatic.io/docs/components/cross-flow.md#invoke-flow) step ensures that the flows' executions are linked. ##### Synchronous and asynchronous cross-flow invocations[​](#synchronous-and-asynchronous-cross-flow-invocations "Direct link to Synchronous and asynchronous cross-flow invocations") When you configure a cross-flow trigger, you can select a **Response Type** of **Synchronous** or **Asynchronous**.
* If you select **synchronous**, your sibling flow must complete its execution within 30 seconds. The flow calling its sibling will wait for a response before continuing its execution. * If you select **asynchronous**, your sibling flow can run for up to 15 minutes as usual. The flow calling its sibling will continue its execution without waiting for the sibling to finish. Full documentation on synchronous and asynchronous invocations is available [here](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md). #### Invoking a sibling flow[​](#invoking-a-sibling-flow "Direct link to Invoking a sibling flow") To invoke a sibling flow, add an [Invoke Flow](https://prismatic.io/docs/components/cross-flow.md#invoke-flow) step to your integration. Select a sibling flow using the **Flow Name** input. If you would like to send data to the sibling flow, generate the data in one step - [Code](https://prismatic.io/docs/integrations/low-code-integration-designer/code-step.md) or [Create Object](https://prismatic.io/docs/components/collection-tools.md#create-object) actions are great for generating payloads - and then reference that data as the **Data** input. ![Configure an invoke flow step](/docs/img/integrations/triggers/cross-flow/configure-action.png) The **data** that you send will be available to the sibling flow via the trigger's payload under `results.body.data`. Flows that invoke themselves Precautions in the Cross-Flow component are in place to prevent a flow from calling itself (as it is easy to create an unintended infinite loop or [fork bomb](https://en.wikipedia.org/wiki/Fork_bomb)). If you need a flow to call itself, please use caution. You can instruct a flow to call itself by adding an [HTTP](https://prismatic.io/docs/components/http.md) step to your integration that references your flow's trigger's `results.webhookUrls.Flow Name`. ##### Dynamically selecting a flow to invoke[​](#dynamically-selecting-a-flow-to-invoke "Direct link to Dynamically selecting a flow to invoke") Each **Invoke Flow** step must select a single flow to invoke. If you'd like to invoke other flows dynamically (i.e. conditionally sometimes invoke "Flow 1" and other times invoke "Flow 2"), add a [Branch](https://prismatic.io/docs/components/branch.md) step and use branching logic to determine which **Invoke Flow** step to run. ![Invoke other steps conditionally](/docs/img/integrations/triggers/cross-flow/conditional-invocation.png) #### Tracing executions across flows[​](#tracing-executions-across-flows "Direct link to Tracing executions across flows") When the [Cross-Flow](https://prismatic.io/docs/components/cross-flow.md) component is used, executions that call one another are linked together within the Prismatic API. You can access an execution's `lineage`, which tells you two things: 1. Which execution invoked this execution? 2. Did this execution invoke any others? ![Query for an execution's lineage](/docs/img/integrations/triggers/cross-flow/lineage-query.png) Within the integration designer, you can click **View linked executions** beside an execution to see its lineage. In this example, our parent execution invoked three children. One of the children invoked its own child execution. ![Linked executions in the designer](/docs/img/integrations/triggers/cross-flow/linked-executions-designer.png) You can see linked executions within an instance's **Executions** tab as well, which can help when debugging an integration that calls sibling flows. 
![Linked executions in an instance screen](/docs/img/integrations/triggers/cross-flow/linked-executions-instance.png) #### Using cross-flow triggers in code-native[​](#using-cross-flow-triggers-in-code-native "Direct link to Using cross-flow triggers in code-native") If you're building a [code-native integration](https://prismatic.io/docs/integrations/code-native.md), you can use the `context.invokeFlow` function that the cross-flow component wraps in your own code. In this example, "Parent Flow" pulls down 100 records from the JSON Placeholder API. It breaks those records into 5 chunks of 20 and sends those chunks to "Capitalize Titles" using `context.invokeFlow`. "Capitalize Titles" is configured to be synchronous, so "Parent Flow" waits for each invocation of its sibling to complete before continuing. It accumulates the results returned from the sibling flow. ``` import axios from "axios"; import { flow } from "@prismatic-io/spectral"; interface Post { userId: number; id: number; title: string; body: string; } const CHUNK_SIZE = 20; export const parentFlow = flow({ name: "Parent Flow", stableKey: "parent-flow", description: "Parent flow that invokes its sibling flow", onExecution: async (context) => { // Fetch 100 posts from JSON Placeholder API const { data: posts } = await axios.get( "https://jsonplaceholder.typicode.com/posts", ); const processedTitles = []; // Send posts to sibling flow to be capitalized, 20 at a time for (let i = 0; i < posts.length; i += CHUNK_SIZE) { // Get chunk of 20 posts const chunk = posts.slice(i, i + CHUNK_SIZE); // Invoke sibling flow with chunk of posts const siblingFlowResponse = await context.invokeFlow( "Capitalize Titles", { posts: chunk }, ); // Response was synchronous; append returned capitalized titles // to accumulator processedTitles.push(...siblingFlowResponse.data); } return { data: processedTitles }; }, }); // This flow is contrived, but simulates "work" to be done by a sibling flow export const capitalizeTitles = flow({ name: "Capitalize Titles", stableKey: "capitalize-titles", description: "Capitalize titles of all posts it receives", isSynchronous: true, onExecution: async (context, params) => { const { posts } = params.onTrigger.results.body.data as { posts: Post[] }; const titlesCapitalized = posts.map((post) => post.title.toUpperCase()); // Synchronously return titles of posts, capitalized. return Promise.resolve({ data: titlesCapitalized }); }, }); export default [parentFlow, capitalizeTitles]; ``` --- ##### Endpoint Configuration Webhook triggers can be configured for an integration in one of three ways, depending on your needs: * **Instance and Flow Specific**: Each flow on each instance gets its own unique endpoint. This is the default configuration. * **Instance Specific**: Each instance gets a unique endpoint, and the integration determines which flow to run based on header or payload data. This is useful when each customer can configure their own webhook endpoint, but all webhook events are sent to the same endpoint. You can route a "Create Widget" request to a "Create Widget" flow and an "Update Gadget" request to an "Update Gadget" flow. * **Shared**: All customers' instances of the integration share an endpoint. Data in the header or payload determines which customer and flow should run. This is useful when you are only allowed to configure a single webhook URL in a third-party app for all of your customers' webhook events. When deciding on a webhook endpoint configuration, ask yourself two questions: 1. 
Do my webhook endpoints need to be the same for each of my customers, or can customers be configured to use different webhook endpoints? 2. If my customers can have unique webhook endpoints, can webhooks be configured to send data to unique endpoints depending on what activity they're responding to? For example, if a third-party app invokes an integration when it has an inventory update **or** when a new order is created, can those two activities be configured to invoke distinct webhook endpoints? Once you have answers to those questions, you can choose the appropriate webhook endpoint configuration. Shared Endpoint Configuration vs Single-Endpoint Webhooks The **shared** endpoint configuration is useful when the third-party app you're integrating with sends webhook requests one at a time (i.e. one request is for customer A, the next request is for customer B, etc.). Your preprocess flow can determine which customer to dispatch the request to. If the third-party app sends requests in bulk (i.e. one request is an array of updates, with a few for Customer A and a few for Customer B), you should reach for a "router" integration to break down and route all requests to their respective destinations. See [Single-Endpoint Webhooks](https://prismatic.io/docs/integrations/triggers/single-endpoint-webhook-integrations.md). #### Selecting endpoint configuration[​](#selecting-endpoint-configuration "Direct link to Selecting endpoint configuration") * Low-Code * Code-Native To configure your integration's endpoint settings, click the **Endpoint Configuration** button on the top of the integration designer. From the **Endpoint Type** tab, select one of the three endpoint types listed above. ![Select endpoint type in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/select-endpoint-type.png) Depending on what endpoint type you choose, you will be presented with different configuration options (described next). By default, instances of integrations that you deploy will be assigned unique webhook URLs - one URL for each flow. We call this **Instance and Flow Specific** endpoint configuration. Alternatively, you can choose **Instance Specific** endpoint configuration (each instance gets its own webhook URL and all flows share the single URL) or **Shared** endpoint configuration, where all flows of all instances share one URL. To specify endpoint type, add an `endpointType` property to the `integration()` definition in `src/index.ts`. It can have values `"instance_specific"`, `"flow_specific"`, or `"shared_instance"` and defaults to `"flow_specific"`: ``` import { integration } from "@prismatic-io/spectral"; import flows from "./flows"; import { configPages } from "./configPages"; import { componentRegistry } from "./componentRegistry"; export default integration({ name: "shared-endpoint-example", description: "Shared Endpoint Example", iconPath: "icon.png", flows, configPages, componentRegistry, endpointType: "instance_specific", }); ``` When **Instance Specific** or **Shared** endpoint configuration is selected, you need some logic to determine which flow (and which customer's instance in the case of **Shared**) should be run. This can be done with or without a [preprocess flow](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#instance-specific-endpoint-with-a-preprocess-flow), and both methods are described below.
#### Instance and flow-specific endpoint configuration[​](#instance-and-flow-specific-endpoint-configuration "Direct link to Instance and flow-specific endpoint configuration") This is the *default* configuration. When an instance is deployed to a customer, each flow within the instance is assigned its own webhook endpoint. Customer A's "Update Inventory" flow has a unique endpoint that is different from Customer A's "Process Order" flow endpoint and different from Customer B's "Update Inventory" flow endpoint. Integrations that use this endpoint configuration often set up webhooks with a [deploy trigger](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger) and remove them with an [instance remove](https://prismatic.io/docs/components/management-triggers.md#instance-remove) trigger. #### Instance-specific endpoint configuration[​](#instance-specific-endpoint-configuration "Direct link to Instance-specific endpoint configuration") When an instance is deployed to a customer, that instance is assigned a single webhook endpoint. The flows that comprise the instance all share that endpoint. Each customer's instance has a unique endpoint, so Customer A's instance of the "Acme ERP" integration will have one endpoint, and Customer B's instance of the same integration will have a different endpoint. **If flows share an endpoint, which flow is executed?** Since several flows share an endpoint URL, you need a way to determine which flow should run when data is sent to your endpoint. You can determine which flow to run in two ways: 1. **Without a preprocess flow**. You can send the name of the flow that should run as an HTTP header or as a value in the HTTP payload. 2. **With a preprocess flow**. You can designate one flow of your integration to be a **preprocess** flow - that flow will determine which sibling flow should run. ##### Instance-specific endpoint without a preprocess flow[​](#instance-specific-endpoint-without-a-preprocess-flow "Direct link to Instance-specific endpoint without a preprocess flow") If you do not use a preprocess flow, you can send the name of the flow to run as an HTTP header or as a value in the HTTP payload. ###### Flow name from HTTP payload[​](#flow-name-from-http-payload "Direct link to Flow name from HTTP payload") For example, you could send the name of the flow you'd like to execute as part of your payload like this: Determine flow name from HTTP payload ``` curl https://hooks.prismatic.io/trigger/EXAMPLE== \ --header "Content-Type: application/json" \ --data '{"myFlowName":"Update Inventory","item":"widgets","quantity":5,"state":"removed"}' ``` Within the **Endpoint Configuration** drawer, you could choose to reference `results.body.data.myFlowName` to determine which flow to run: ![Set flow name from HTTP payload for Endpoint Configuration in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/flow-name-from-payload.png) Given the `curl` invocation above, the `Update Inventory` flow would be run with the rest of the payload that was provided. Run a test of endpoint configuration In order to populate the result picker in the screenshot above, click open the testing drawer's **Test Configuration** tab and then select **Endpoint Payload**. Enter a sample payload and click the globe icon to the right of the **Run** button. 
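If your own application is the system sending these requests, the same payload-routed invocation can be made from code. Below is a minimal sketch using axios; the endpoint URL is a placeholder for your instance's webhook URL, and the `myFlowName` key is just the example field from the `curl` invocation above - use whatever field you referenced in the **Endpoint Configuration** drawer.

```ts
import axios from "axios";

// Placeholder URL - replace with the instance's actual webhook endpoint.
const endpointUrl = "https://hooks.prismatic.io/trigger/EXAMPLE==";

// Send an inventory update and route it to the "Update Inventory" flow by
// including the flow name in the JSON payload. The "myFlowName" key must
// match the payload field selected in your Endpoint Configuration.
async function sendInventoryUpdate() {
  const response = await axios.post(endpointUrl, {
    myFlowName: "Update Inventory",
    item: "widgets",
    quantity: 5,
    state: "removed",
  });
  console.log(`Webhook accepted with status ${response.status}`);
}

sendInventoryUpdate();
```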
###### Flow name from HTTP header[​](#flow-name-from-http-header "Direct link to Flow name from HTTP header") If you'd like to pass flow name as an HTTP header instead, a `curl` invocation could look like this: Determine flow name from an HTTP header ``` curl https://hooks.prismatic.io/trigger/EXAMPLE== \ --location \ --header "Content-Type: application/json" \ --header "myflowname: Update Inventory" \ --data '{"item":"widgets","quantity":5,"state":"removed"}' ``` In that case, you would reference `results.headers.myflowname` to determine which flow to run: ![Set flow name from HTTP header for Endpoint Configuration in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/flow-name-from-header.png) Use lower-case HTTP header keys Per [HTTP RFC 2616](https://datatracker.ietf.org/doc/html/rfc2616#section-4.2), HTTP headers should be case-insensitive. We've found HTTP clients to be inconsistent about their behavior and implementations, though. [Postman](https://www.postman.com/), for example, will send camel-cased headers, while others will always lowercase header keys. We recommend that you use lowercase HTTP header keys to avoid compatibility issues. ##### Instance-specific endpoint with a preprocess flow[​](#instance-specific-endpoint-with-a-preprocess-flow "Direct link to Instance-specific endpoint with a preprocess flow") If you need additional logic to determine which flow to run (for example, if you need to inspect an XML payload's shape to determine what kind of data was received), you can leverage a **preprocess flow**. This flow executes when data is sent to the instance's endpoint. It can be comprised of any number of steps, and the last step's results determine which sibling flow to execute. To configure a preprocess flow, first build a flow that can inspect an incoming payload and verify that the last step returns the name of the flow that you'd like to run next. Then, open the **Endpoint Configuration** drawer, select your preprocess flow from the **Preprocess Flow** dropdown menu, and under **Flow Name** select the key representing the name of the flow that should run: ![Set flow name from preprocess flow for Endpoint Configuration in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/flow-name-from-preprocess-flow.png) You may need to run a test of your preprocess flow in order to populate the result picker in the **Endpoint Configuration** drawer. #### Shared endpoint configuration[​](#shared-endpoint-configuration "Direct link to Shared endpoint configuration") All customers that have an instance of a particular integration deployed to them share a webhook endpoint, and data is routed to the proper customer and flow based on data contained in the HTTP request. Like [Instance-Specific Endpoint Configuration](#instance-specific-endpoint-configuration), **Shared Endpoint Configuration** can be configured with or without a **preprocess flow**. ##### Shared endpoint without a preprocess flow[​](#shared-endpoint-without-a-preprocess-flow "Direct link to Shared endpoint without a preprocess flow") If you do not use a preprocess flow, the shared endpoint's webhook invocation must include an [external customer ID](https://prismatic.io/docs/customers/managing-customers.md#customer-external-ids) and **flow name** either in the HTTP payload or as HTTP headers. You can mix-and-match if you'd like - provide one value as an HTTP header and the other in the HTTP payload. 
For example, if "Customer A" had an external ID of `abc-123` and you wanted to invoke their `Update Inventory` flow, you could send this `curl` request with the flow name represented as an HTTP header and customer ID represented in the HTTP payload: Routing a request by header and payload ``` curl https://hooks.prismatic.io/trigger/EXAMPLE== \ --location \ --header "Content-Type: application/json" \ --header "myflowname: Update Inventory" \ --data '{"myCustomerId":"abc-123","item":"widgets","quantity":5,"state":"removed"}' ``` ![Set flow name from HTTP payload or header for shared endpoint configuration in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/flow-name-and-customer-id.png) Flow name is not required for single-flow Integrations If your integration is comprised of just a single flow, then you only need to specify an external customer ID and not a flow name. ##### Shared endpoint with a preprocess flow[​](#shared-endpoint-with-a-preprocess-flow "Direct link to Shared endpoint with a preprocess flow") If you need additional logic to determine which flow to run or need to look up a customer's external ID, you should leverage a **preprocess flow**. This flow executes when data is sent to a shared endpoint. It can be comprised of any number of steps, and the last step's results determine which flow to execute for which customer. The final step must return an object containing both an [external customer ID](https://prismatic.io/docs/customers/managing-customers.md#customer-external-ids) and a **flow name**. ![Set flow name with preprocess flow for shared endpoint configuration in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/flow-name-and-customer-id-from-preprocess-flow.png) Shared endpoint preprocess flows cannot reference config variables When you use a shared endpoint, the preprocess flow runs without knowing yet what customer the data is destined for. It runs as a "system" instance and is not bound to any particular customer. Because of this, a preprocess flow cannot reference any customer-specific config variables or connections. ##### Shared endpoint config and versioning[​](#shared-endpoint-config-and-versioning "Direct link to Shared endpoint config and versioning") An integration can have [multiple versions](https://prismatic.io/docs/integrations/low-code-integration-designer.md#publishing-an-integration), and customers' instances can be on different versions. The endpoint configuration can change between versions, but a shared endpoint exists outside of a specific instance. So, **which version's endpoint configuration is used?** The answer is *the latest version that is currently deployed to a customer*. If your integration currently has three available published versions: 4, 5 and 6, and some of your customers are on version 4 and some are on version 5, then the endpoint configuration on version 5 is used for the shared endpoint for all customers. If another instance is then deployed using version 6, then the endpoint configuration for version 6 is used for all customers. If that single instance of version 6 is removed, leaving just versions 4 and 5 deployed, the endpoint configuration for version 5 will be used for all customers. 
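To make the preprocess-flow contract above concrete, here is a minimal, hypothetical sketch of the routing logic a preprocess flow's final step might implement. The payload shape (`AcmeWebhookPayload`), the function name, and the result keys (`flowName`, `externalCustomerId`) are illustrative only; the keys you select in the **Endpoint Configuration** drawer's result picker are whatever your final step actually returns. For an instance-specific endpoint, only the flow name is needed; for a shared endpoint, both values are required.

```ts
// Hypothetical webhook payload from a third-party app - the real shape
// depends on whatever the app sends you.
interface AcmeWebhookPayload {
  accountId: string; // identifies which customer the event belongs to
  eventType: "inventory.updated" | "order.created";
}

// The kind of object a shared-endpoint preprocess flow's final step should
// produce: the flow to run and the customer's external ID.
interface RoutingResult {
  flowName: string;
  externalCustomerId: string;
}

export function routeEvent(payload: AcmeWebhookPayload): RoutingResult {
  const flowName =
    payload.eventType === "inventory.updated" ? "Update Inventory" : "Process Order";
  // Assumes customers' external IDs in Prismatic match the third party's account IDs.
  return { flowName, externalCustomerId: payload.accountId };
}
```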
#### Testing endpoint configuration[​](#testing-endpoint-configuration "Direct link to Testing endpoint configuration") You can test each of your flows individually (including a **preprocess flow**, if applicable) by clicking the **Run** button on the bottom of the integration designer in the testing drawer. If you would like to test your endpoint configuration, click the globe icon to the right of **Run**. To configure a test payload, open the testing drawer, select **Test Configuration**, and then select **Endpoint payload**. You can enter the payload and any headers that you would like to send to the shared endpoint. ![Test endpoint configuration in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/test-endpoint-config-payload.png) Within the **Logs** and **Step Outputs** tabs you will see logs and step results for both the preprocess flow (if you have one), and the flow that the request was routed to. If an error is thrown (for example, the flow name that the preprocess flow generated was not found), that error will appear in the **Logs** tab. Invocations from the integration designer are always dispatched to a test customer The integration designer is a sandbox. No test invocations will go to your customers' instances. Instead, if you use [Shared Endpoint Configuration](#shared-endpoint-configuration) (where all customers' instances share an endpoint), the execution will always be dispatched to a "test customer" within the integration designer. So, you can reference any external customer ID and the endpoint configuration test will be routed to the "test customer." #### Securing endpoints with API keys[​](#securing-endpoints-with-api-keys "Direct link to Securing endpoints with API keys") Endpoints can be configured to only run when an API key is included with the webhook request as an HTTP header. You can elect to use API keys for all flows, or only for specific flows. * Low-Code * Code-Native To configure instances of your integration to use API keys, open the **Endpoint Configuration** drawer in the integration designer, and select the **Security Type** for each of your flows. You have three options: * **No API Keys** indicates that the flow can be invoked without an API key. This option is often paired with a trigger that handles security in another way (like with [HMAC](https://prismatic.io/docs/integrations/triggers/webhook/what-is-hmac.md)). * **Secured by Customer** allows a customer to generate API keys for an endpoint when they deploy an instance of your integration. The API keys can either be generated automatically, or the customer can provide their own. If you select **Required**, your customer will be required to provide one or more API keys when they deploy an instance of your integration. * **Secured by Organization** gives you the option to set an API key that will be used by all customers' instances for that endpoint. ![Endpoint configuration security options in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/endpoint-config-drawer-security.png) To configure instances of your integration to use API keys, specify an `endpointSecurityType`. You have four options: * `unsecured` - do not use API keys. Secure webhooks some other way (like with [HMAC](https://prismatic.io/docs/integrations/triggers/webhook/what-is-hmac.md)). * `customer_optional` - customers can choose to secure endpoints with API keys when they deploy an instance of your integration. * `customer_required` - customers are required to supply API keys. 
* `organization` - you specify the API keys that will be used. If you specify `"organization"`, also provide a string array of API keys for the `organizationApiKeys` property. Secure a flow with API keys ``` import { flow } from "@prismatic-io/spectral"; export const flow1 = flow({ name: "Flow 1", stableKey: "9499d1d8-dddd-4d9b-aaff-c054f59d02cc", description: "This is the first flow", isSynchronous: true, endpointSecurityType: "organization", organizationApiKeys: ["my-first-key", "p@s$W0Rd"], onExecution: async (context, params) => { return { data: null }; }, }); ``` ##### Endpoint API keys in the config wizard[​](#endpoint-api-keys-in-the-config-wizard "Direct link to Endpoint API keys in the config wizard") Endpoints marked **Secured by Customer** will appear on the first page of the config wizard when a customer deploys an instance of your integration. ![Endpoint configuration in config wizard in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/endpoint-config-in-config-wizard.png) Your customers are required to provide API keys if you selected **Required**, and can optionally provide API keys if you didn't. An endpoint can have multiple API keys. If part of your configuration experience involves showing the customer the endpoint URL and API key, you can add a **Trigger Details** section to your config wizard by clicking **+Text/Image** and then selecting **Trigger Details**. ![Add trigger details to config wizard in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/add-trigger-details.png) You can add additional headings, helper text, and images to assist your customers as they configure webhooks in third-party apps. ![Trigger details in config wizard in Prismatic app](/docs/img/integrations/triggers/endpoint-configuration/trigger-details-in-config-wizard.png) ##### Sending requests to an endpoint secured with an API key[​](#sending-requests-to-an-endpoint-secured-with-an-api-key "Direct link to Sending requests to an endpoint secured with an API key") If your instance's flow has an API key, pass in an additional `Api-Key` header as part of your POST request: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --location \ --header "Content-Type: application/json" \ --header "Api-Key: 5cc74e1546382c52a8e93dce6795a5d4" \ --data '{"examplePayloadKey": "examplePayloadValue"}' ``` #### Troubleshooting shared endpoints in production[​](#troubleshooting-shared-endpoints-in-production "Direct link to Troubleshooting shared endpoints in production") If you have an integration with [Instance-Specific Endpoint Configuration](#instance-specific-endpoint-configuration), then all logs and execution records will appear in the instance's execution results page. Executions that run through a [preprocess flow](#instance-specific-endpoint-with-a-preprocess-flow) and then trigger a sibling flow are packaged together as one execution on the executions page. If your instance's preprocess flow throws an error or yields the name of a flow that doesn't exist, you can see those errors and step results from that page. If you have an integration with [Shared Endpoint Configuration](#shared-endpoint-configuration), then the preprocess flow runs before it knows what customer it will dispatch the work to and is not tied to a specific instance. --- ##### FIFO Queues By default, executions run concurrently, meaning multiple webhook invocations of the same flow are processed at the same time.
This is great for performance, since dozens of webhook requests can be processed in parallel, but it can lead to out-of-order processing or rate limiting. To mitigate this, you can opt to place a queue in front of your flow. When **First In, First Out** (FIFO) is enabled, requests are processed one at a time in the order they are received. If your flow is already processing a request when a new request arrives, the new request is placed in a queue until the flow is ready to process it. This is helpful in a few situations: * If it's important that requests are processed in the order they are received (e.g., financial transactions). * If your integration is sensitive to the load it places on downstream systems (e.g., third-party rate limits). * If you expect to encounter [execution rate limits](https://prismatic.io/docs/integrations/integration-runner-environment-limits.md#webhook-rate-limiting-and-concurrent-executions) in Prismatic. A queue lets your integration handle bursts of traffic without overwhelming downstream third-party apps, and ensures that messages are processed in the order they are received. #### Enabling FIFO on a flow[​](#enabling-fifo-on-a-flow "Direct link to Enabling FIFO on a flow") To enable a FIFO queue on a flow, select the flow's trigger and open the **Flow control** tab. Toggle **Enable FIFO** on. ![](/docs/img/integrations/triggers/fifo-queue/enabling.png) When enabled, the flow processes requests one at a time in the order they're received (FIFO). Subsequent events wait in a queue until the current execution completes. **Note**: This feature is available for webhook-based [app event triggers](https://prismatic.io/docs/integrations/triggers/app-events.md) and generic [webhook triggers](https://prismatic.io/docs/integrations/triggers/webhook.md). It is not available for [management](https://prismatic.io/docs/integrations/triggers/management.md) or [pre-process flows](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#instance-specific-endpoint-with-a-preprocess-flow). Additionally, flows that have FIFO enabled *must* be [asynchronous](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md). For [app event polling](https://prismatic.io/docs/integrations/triggers/app-events.md#app-event-triggers-with-polling) or [scheduled](https://prismatic.io/docs/integrations/triggers/schedule.md) triggers, see [singleton executions](https://prismatic.io/docs/integrations/triggers/schedule.md#ensuring-singleton-executions-for-scheduled-flows). #### Message deduplication[​](#message-deduplication "Direct link to Message deduplication") Many applications ensure "at least once" delivery of outbound webhook requests, which can result in duplicate events being processed. To prevent processing duplicate requests, you can implement message deduplication strategies in your FIFO-enabled flows. To enable automatic deduplication of messages, specify a **Deduplication ID** in your trigger's **Flow control** configuration. For example, if a third-party sends a header called `x-acme-webhook-id`, you can use that value as the Deduplication ID. If two requests with the same `x-acme-webhook-id` header are received within a 10-minute window, the second request will be considered a duplicate and will be ignored.
![](/docs/img/integrations/triggers/fifo-queue/message-deduplication.png) #### Enabling FIFO queue in code-native integrations[​](#enabling-fifo-queue-in-code-native-integrations "Direct link to Enabling FIFO queue in code-native integrations") FIFO queues can be enabled in code-native integrations' flows by adding a `queueConfig` property to your flow. `usesFifoQueue` must be set to `true` to enable FIFO. You can optionally specify a `dedupeIdField` to enable [message deduplication](#message-deduplication). ``` export const listItems = flow({ name: "List Items", stableKey: "abc-123", description: "Fetch items from an API", queueConfig: { usesFifoQueue: true, dedupeIdField: "body.data.webhook-id", }, onTrigger: () => {}, onExecution: () => {}, }); ``` The above example assumes that the body of the incoming webhook request contains a field called `webhook-id` that uniquely identifies the event. To reference a header (for example, one named `x-acme-webhook-id`), you can use the following syntax: ``` dedupeIdField: "headers.x-acme-webhook-id", ``` #### Flow concurrency management FAQ[​](#flow-concurrency-management-faq "Direct link to Flow concurrency management FAQ") ##### Where can I see my queued requests?[​](#where-can-i-see-my-queued-requests "Direct link to Where can I see my queued requests?") Queued requests will appear alongside your other executions. In the integration designer, if you view a queued execution's logs, you will see a message like `Queuing Execution for Instance 'Salesforce'. Total Queued Executions: 4`. ![](/docs/img/integrations/triggers/fifo-queue/queued-logs.png) Make sure that you toggle **Logs** on. When the queued execution is processed, you will see a message like `Resuming Queued Execution for Instance 'Salesforce'. Total Queued Executions: 3. Total Concurrent Executions: 15` in the logs. The `15` there represents all concurrent executions of all flows. ![](/docs/img/integrations/triggers/fifo-queue/resumed-logs.png) In an instance's **Executions** tab, queued executions will also appear alongside running executions. ![](/docs/img/integrations/triggers/fifo-queue/queued-instance-executions.png) ##### Why can't FIFO be enabled for a synchronous flow?[​](#why-cant-fifo-be-enabled-for-a-synchronous-flow "Direct link to Why can't FIFO be enabled for a synchronous flow?") The goal of a synchronous flow is to process requests in real time, providing immediate feedback to the caller. Queueing the request and processing it later would defeat that purpose. ##### Can I configure the number of concurrent executions for a flow?[​](#can-i-configure-the-number-of-concurrent-executions-for-a-flow "Direct link to Can I configure the number of concurrent executions for a flow?") Currently, the number of concurrent executions for a flow is fixed and cannot be configured. A maximum of one execution will run at a time. ##### What happens to my FIFO queue if my instance is paused?[​](#what-happens-to-my-fifo-queue-if-my-instance-is-paused "Direct link to What happens to my FIFO queue if my instance is paused?") No new executions will queue, but any existing executions that were queued will resume after the instance is enabled again. ##### What happens to my FIFO queue if my instance is deleted?[​](#what-happens-to-my-fifo-queue-if-my-instance-is-deleted "Direct link to What happens to my FIFO queue if my instance is deleted?") All queued executions will be permanently removed and cannot be recovered.
##### What happens if I enable flow retry with FIFO?[​](#what-happens-if-i-enable-flow-retry-with-fifo "Direct link to What happens if I enable flow retry with FIFO?") If you enable [flow retry](https://prismatic.io/docs/monitor-instances/retry-and-replay/automatic-retry.md), failed executions will be retried in the order they were received, preserving the FIFO semantics. For example, if an execution fails and your flow is configured to retry up to 3 times, waiting 2 minutes between failures, the failed execution will be retried after 2 minutes, then again after 4 minutes, and finally after 6 minutes (assuming it continues to fail). During that time, no other executions will be processed from the queue. If an execution ultimately fails after all retries, it will be marked as failed and the next execution in the queue will be processed. ##### How many executions can be queued?[​](#how-many-executions-can-be-queued "Direct link to How many executions can be queued?") There is currently no hard limit on the number of executions that can be queued, but keep in mind that excessive queuing will result in delayed processing of new requests. --- ##### Management Triggers #### Instance deploy trigger[​](#instance-deploy-trigger "Direct link to Instance deploy trigger") An integration flow can be configured to run when an instance of the integration is [deployed](https://prismatic.io/docs/components/management-triggers.md#instance-deploy). This is useful when your integration needs to complete a series of tasks when it's deployed. For example, your integration might need to configure a third-party app to send data to the other flows' webhooks. Or, your integration might need to enable features in a third-party app or create a series of directories in a file share before the integration is invoked. If there are tasks that need to occur when an instance is deployed, set up those tasks as a flow and configure the trigger to run at deploy time. #### Instance remove trigger[​](#instance-remove-trigger "Direct link to Instance remove trigger") The opposite of an instance deploy trigger is an [instance remove](https://prismatic.io/docs/components/management-triggers.md#instance-remove) trigger. Flows with this trigger run when an instance is removed (deleted) and can be used to clean up webhooks and other configuration that an instance deploy trigger created. #### User-level config deploy and remove triggers[​](#user-level-config-deploy-and-remove-triggers "Direct link to User-level config deploy and remove triggers") If your integration supports [user-level configuration](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md), you can use the [User Level Config Deploy](https://prismatic.io/docs/components/management-triggers.md#user-level-config-deploy) and [User Level Config Remove](https://prismatic.io/docs/components/management-triggers.md#user-level-config-remove) triggers to run a flow when a user-level config is completed or removed, respectively. --- ##### Schedule Triggers Scheduled triggers allow you to create a regular schedule to specify how often your integration should run. This is useful when you have an integration that should be triggered consistently at a specific time. You can set up your integration to run at the same time for all customers, or you can set up schedules on a per-customer basis. To set up the same schedule for all customers, click the integration's trigger, open the **Schedule** input, and enter the schedule you would like your integration to follow.
You can configure your integration to run every X minutes, hours, days, or weeks: ![Set static integration trigger in Prismatic app](/docs/img/integrations/triggers/schedule/static-schedule.png) You can alternatively select **Custom** and provide a [cron string](https://en.wikipedia.org/wiki/Cron) (interpreted in UTC time). For example, a trigger of `*/5 8-16 * * 1-5` would cause your integration to run every five minutes during business hours (8:00-16:55), Monday through Friday. For help computing a cron schedule, see this [Cron Calculator](https://crontab.guru/). #### Ensuring singleton executions for scheduled flows[​](#ensuring-singleton-executions-for-scheduled-flows "Direct link to Ensuring singleton executions for scheduled flows") By default, if a scheduled trigger is set to run every 5 minutes, and one execution takes longer than 5 minutes, a second execution will start while the first execution is still running. This can lead to multiple executions running concurrently, which may not be desirable for your integration. To ensure that only one execution of your integration runs at a time, select your scheduled trigger and open its **Flow control** tab. Toggle **Enable Singleton Executions**. ![Enable singleton executions for integration trigger in Prismatic app](/docs/img/integrations/triggers/schedule/singleton-executions.png) When this option is enabled, if an execution is still running when the next scheduled time occurs, the new execution will be skipped. To enable singleton executions in a code-native integration, see [Code Native Flows](https://prismatic.io/docs/integrations/code-native/flows.md#enabling-singleton-executions-for-code-native-flows). #### Setting trigger schedules from a config variable[​](#setting-trigger-schedules-from-a-config-variable "Direct link to Setting trigger schedules from a config variable") To configure schedules on a per-customer basis, first create a config variable of type **Schedule** within the config wizard designer. You can give your config variable any name you choose: ![Configure integration trigger to use config variable in Prismatic app](/docs/img/integrations/triggers/schedule/schedule-config-variable.png) Then, click your integration trigger and reference the **Config Variable** you created: ![Set config variable for integration trigger in Prismatic app](/docs/img/integrations/triggers/schedule/config-driven-schedule.png) When your integration deployment team later deploys an instance of your integration, they can configure a custom schedule for that instance. #### Staggering schedules for customers[​](#staggering-schedules-for-customers "Direct link to Staggering schedules for customers") If you have many customers, you may want to stagger the schedules to avoid [concurrent execution constraints](https://prismatic.io/docs/integrations/integration-runner-environment-limits.md#webhook-rate-limiting-and-concurrent-executions) or prevent overwhelming your integration's target system. To stagger schedules, you can use a string type [config variable](https://prismatic.io/docs/integrations/triggers/schedule.md#setting-trigger-schedules-from-a-config-variable) to set a schedule for each customer. If you use a [code](https://prismatic.io/docs/components/code.md#code-block-string) data source and mark your config variable as [embedded](https://prismatic.io/docs/integrations/config-wizard/config-variables.md#config-variable-visibility), you can automatically generate a schedule for each customer without their intervention. 
This allows you to set different schedules for each customer without manually configuring each instance. This [Code Block String](https://prismatic.io/docs/components/code.md#code-block-string) example shows how to set a schedule that runs every 5 minutes, but with a random offset of up to 5 minutes for each customer: Configure each customer to run every 5 minutes with a random offset ``` module.exports = async () => { // Random integer between 0 and 4 const randomNumber = Math.floor(Math.random() * 5); // Create a cron string using the random number // "0-59/5 * * * *" means "At every 5th minute from 0 through 59." (i.e. :00, :05, :10, :15, :20...) // "3-59/5 * * * *" means "At every 5th minute from 3 through 59." (i.e. :03, :08, :13, :18, :23...) const cronString = `${randomNumber}-59/5 * * * *`; return { result: cronString }; }; ``` One customer may have a schedule of [`0-59/5 * * * *`](https://crontab.guru/#0-59/5_*_*_*_*), while another customer may have a schedule of [`3-59/5 * * * *`](https://crontab.guru/#3-59/5_*_*_*_*). An example integration that uses the above code is available [here](https://prismatic.io/docs/samples/random-schedule.yml). Similar code snippets can be used to generate other random schedules. See [Cron Guru](https://crontab.guru/) for help with cron syntax. --- ##### Single-Endpoint Webhook Integrations Some apps, notably Slack, Dropbox, and HubSpot, do not have customer-specific webhooks. Instead, all events are sent to a single endpoint, and these apps expect the receiving server to route the events to the correct customer. If you would like to build an event-driven integration with Slack, Dropbox, HubSpot, or another app that offers single-endpoint webhooks, you can do so by building out a "router" integration that will receive all events and route them to the correct customer's instance. This video explains the concept: #### Example Dropbox router integration[​](#example-dropbox-router-integration "Direct link to Example Dropbox router integration") The router integration and Dropbox integration demonstrated in the video above are available in our GitHub examples repo. You can [import](https://prismatic.io/docs/configure-prismatic/integrations-multiple-regions.md#importing-an-integrations-yaml-definition) the integrations for yourself and extend them however you see fit. [Router Integration](https://github.com/prismatic-io/examples/blob/main/integrations/dropbox/example-dropbox-router.yml)
[Example Dropbox Integration](https://github.com/prismatic-io/examples/blob/main/integrations/dropbox/example-dropbox-integration.yml) ##### How the Dropbox router integration works[​](#how-the-dropbox-router-integration-works "Direct link to How the Dropbox router integration works") When an instance of the Dropbox integration is deployed, its `Register Instance` flow looks up the authenticated Dropbox connection's Dropbox account ID and sends that account ID along with the webhook URL for the `Handle Change Notifications` flow to the router integration. The router integration stores the Dropbox account ID and webhook URL in a map. When the router integration receives a webhook event from Dropbox, it looks up the Dropbox account ID associated with the event and sends the event to the correct instance of the Dropbox integration using the corresponding webhook URL. --- ##### Universal Webhook Trigger The universal webhook trigger can be used to invoke a flow from any service that can [send an HTTP request](https://prismatic.io/docs/integrations/triggers/webhook/sending-data.md) to a custom URL. Each flow that begins with a universal webhook trigger has its own uniquely generated webhook URL, and sending HTTP requests to that webhook URL causes the flow to execute. Each of your customers' instances' flows has its own webhook URL by default (though you can change that with [Endpoint Configuration](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md)). The universal webhook trigger is useful in many situations. To name a few: * A third-party API may support webhooks, but its connector may not have a dedicated trigger for the webhook events you're interested in. A flow that starts with the universal trigger can receive requests from any third-party app for any event. * You can implement webhooks in your own application. Your webhooks can invoke a universal webhook trigger, allowing you to send data from your app to your customers' third-party apps in real time. * Your frontend app can make [synchronous](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md) requests to a flow with a universal webhook trigger in order to fetch data from a third party in real time. #### When should I build a custom webhook trigger?[​](#when-should-i-build-a-custom-webhook-trigger "Direct link to When should I build a custom webhook trigger?") The [universal webhook trigger](https://prismatic.io/docs/components/webhook-triggers.md) is useful if the party calling the trigger sends a JSON or form data payload and expects a generic HTTP 200 response. If the third party you're working with requires a custom response (like a challenge code), if they send atypical payloads (like XML or YAML that needs to be parsed first), or if they secure requests by signing payloads with [HMAC](https://prismatic.io/docs/integrations/triggers/webhook/what-is-hmac.md), you may need to wrap custom third-party logic in a [custom trigger](https://prismatic.io/docs/custom-connectors/triggers.md). --- ##### Using Shared Webhooks and Preprocess Flows In this tutorial, we'll cover how to create a single webhook endpoint that can be invoked by multiple customers. For our scenario, imagine we want to integrate with a third-party ERP, "Acme ERP". The Acme ERP tracks things like inventory and orders. Configuration of the ERP is limited and only allows you to specify a single webhook URL for all of your customers to post inventory and order updates.
We need to create a single webhook URL that can accept the webhook payload that Acme ERP sends. That endpoint will need to inspect the payload, determine what customer it pertains to and what sort of payload it is (an "inventory update" payload or an "order creation" payload), and then it'll need to route that information to the correct flow in an instance deployed to the correct customer in Prismatic. #### Our Acme ERP Integration[​](#our-acme-erp-integration "Direct link to Our Acme ERP Integration") The Acme ERP integration that we deploy to customers has two flows. The first flow takes a collection of inventory updates ("remove 5 bowler hats", "add 20 pencils", etc.) and ensures that those updates are reflected in Progix, our own app in this example: ![Acme ERP Update Inventory flow in Prismatic integration designer](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/inventory-flow.png) The second flow takes an order created in Acme ERP, processes the data in the order, and submits an order request to Progix: ![Acme ERP Create Order flow in Prismatic integration designer](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/order-flow.png) All of our customers are going to use the same webhook endpoint, so we'll need to configure our integration to create just one URL for all instances of this integration that are deployed. To do that, let's open up the **Endpoint Configuration** drawer using the button on the right-hand side of the integration designer. We'll toggle **Endpoint Type** to **Shared**. ![AcmeERP Endpoint Configuration for shared endpoint in Prismatic integration designer](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/endpoint-type-shared-instance.png) We'll leave **Preprocess Flow**, **Flow Name**, and **External Customer Id** alone for now and come back to them in a moment. Read more about webhook endpoint configuration in the [Integration Triggers](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md) article. #### The webhook payloads[​](#the-webhook-payloads "Direct link to The webhook payloads") When a customer in Acme ERP updates their inventory, Acme ERP sends a payload in this format to the shared webhook: ``` { "customerId": "A06DFFAC", "type": "inventory_update", "data": [ { "txid": "E865298B-A3EC-4D17-B410-5FDFC8861BA7", "item": "bowler hat", "quantity": 5, "state": "removed" }, { "txid": "4A38DE96-EA6B-4AB9-B9E2-3B033CA997CC", "item": "pencils", "quantity": 20, "state": "added" } ] } ``` When a customer in Acme ERP creates a new order, Acme ERP sends a payload in this format to the shared webhook: ``` { "customerId": "C3E72B0C", "type": "create_order", "data": { "orderid": "75F5AEC5-D482-4386-8878-219F92185DEC", "date": "2021-09-08", "shipped": false, "addr_1": "177A Bleecker St.", "addr_2": "New York, NY 10012", "total": 122.57, "paid": true } } ``` These two webhook payloads share some similarities: they both contain a `customerId` for your customers in Acme ERP, and they both have a `type` indicating the type of the webhook payload. We're going to use those two values to determine which customer in Prismatic to send the webhook to and which flow to invoke for that customer. To determine which customer to dispatch the webhook request to, we'll need a way to map an Acme ERP `customerId` to the [external ID](https://prismatic.io/docs/customers/managing-customers.md#customer-external-ids) that we use in Prismatic. We'll also need to map `type` to a flow name.
To accomplish those tasks, we'll create another flow that will execute when a webhook is first received but before it's been dispatched to a customer's instance. #### The preprocess flow[​](#the-preprocess-flow "Direct link to The preprocess flow") A "preprocess" flow allows us to process data that comes in to a shared webhook URL and dispatch the data to a specific customer's instance and flow accordingly. The preprocess flow is called synchronously and returns whatever the last step of the flow returns. Our preprocess flow will contain two steps: one step will look up the customer's Progix external ID given the Acme ERP customer ID, and the second step will map `type` to a flow name and return both customer ID and flow name: ![AcmeERP preprocess flow in Prismatic integration designer](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/preprocess-flow.png) The `customerId` from the Acme ERP webhook request (accessible from `integrationTrigger.results.body.data.customerId`) can be passed in to a Progix external ID lookup action. The code component can then grab the external ID that is returned and can map `type` to a flow name: ``` const flowNamesMap = { inventory_update: "Update Inventory", create_order: "Create Order", }; module.exports = async ({ logger, configVars }, stepResults) => { const customerExternalId = stepResults.getCustomerByAcmeId.results.id; const webhookType = stepResults.trigger.results.body.data.type; const flowName = flowNamesMap[webhookType]; return { data: { customerExternalId, flowName }, }; }; ``` If we run our flow with a sample payload from above using the **Run** button, we can see that our last step returns a `customerExternalId` and a `flowName`. ![AcmeERP Update Inventory test runner step results in Prismatic integration designer](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/preprocess-step-results.png) With a working flow, now we just need to indicate in Prismatic that this flow is a preprocess flow. To do that, let's open up the **Endpoint Configuration** once again and select our "Preprocess" flow as the **Preprocess Flow** (you can name your preprocess flow whatever you want). ![Set preprocess flow for Acme ERP in Prismatic integration designer](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/set-preprocess-flow.png) #### Routing the webhook to the correct customer and flow[​](#routing-the-webhook-to-the-correct-customer-and-flow "Direct link to Routing the webhook to the correct customer and flow") Now that our preprocess flow has yielded a customer's `customerExternalId` and `flowName`, we need to configure our integration to route the webhook invocation to the proper customer and flow. All we need to do is instruct our integration where to look to get the `customerExternalId` and `flowName` to key off of. Let's open up the **Endpoint Configuration** drawer once more. This time, we'll open the **Flow Name** input, which works a lot like a step input. The results of the preprocess flow test run are available here as an object named `results`. We can select the `results.flowName` value that our preprocess flow returned from the result picker. We'll do the same for the **External Customer ID** input - this time we'll select `results.customerExternalId`: ![Reference preprocess flow results in Prismatic integration designer](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/reference-preprocess-results.png) That's it.
All instances we deploy of this integration will share a single webhook endpoint, and webhook invocations will be routed to the proper customer and flow based on the information contained in the body of the webhook request. #### Follow-up questions[​](#follow-up-questions "Direct link to Follow-up questions") ##### Can I test endpoint config in the designer?[​](#can-i-test-endpoint-config-in-the-designer "Direct link to Can I test endpoint config in the designer?") Yes! Your requests will always be routed to your integration designer (and not actual instances), but you can open the test runner drawer and open **Test Configuration**. Then, select **Endpoint payload** and paste in a sample payload. ![Endpoint Test Payload](/docs/img/integrations/triggers/using-shared-webhooks-and-preprocess-flows/endpoint-test-payload.png) Once you've done that, click the globe icon to the right of **Run** to run a test of endpoint configuration. ##### Was the preprocess flow necessary here?[​](#was-the-preprocess-flow-necessary-here "Direct link to Was the preprocess flow necessary here?") Possibly. If the webhook trigger had contained the flow name and customer external ID that we needed, we could have omitted the preprocess flow. When no preprocess flow is assigned, the `results` object that **External Customer ID** and **Flow Name** can reference contains the header and payload information from the webhook invocation. We could have mapped the Prismatic customer's external ID from the webhook's `results.body.data.externalId`. That would require Acme ERP to be cognizant of the external IDs we assign to customers in Prismatic, which may or may not be possible in the third-party system. For **Flow Name**, we could have named our flows `inventory_update` rather than `Update Inventory` and `create_order` rather than `Create Order` and could have configured **Flow Name** to reference the webhook request's `results.body.data.type` instead. That would look less attractive in the integration designer, so again, there are trade-offs. ##### Could the customer ID have been passed in as a header instead of in the body?[​](#could-the-customer-id-have-been-passed-in-as-a-header-instead-of-in-the-body "Direct link to Could the customer ID have been passed in as a header instead of in the body?") Yes. The customer ID could have just as easily been a header as a value in the body. In that case, the lookup action could have referenced `stepResults.integrationTrigger.results.headers.customerId` rather than `stepResults.integrationTrigger.results.body.data.customerId`. ##### Do the same concepts apply to an instance-specific webhook?[​](#do-the-same-concepts-apply-to-an-instance-specific-webhook "Direct link to Do the same concepts apply to an instance-specific webhook?") Yes. Instance-specific webhooks provide you with a unique webhook per instance you deploy (so each customer gets its own webhook endpoint). In that situation, you wouldn't need to worry about **Customer External ID**, but you would still want a way to route to a particular flow, so the flow-name-mapping portion of this tutorial still applies. --- ##### What is a Webhook? [What are webhooks?](https://www.youtube.com/embed/CriaG014ocM?rel=0) A **webhook** is an automated message that is sent from one application to another application when certain events occur. Webhooks let applications notify one another in real time when something has changed in one system and can be used to trigger an instance's flow so the change is reflected in the other system.
A webhook consists of two main parts: * The **event** that causes the webhook to fire. The event is usually a change to a record in an application. For example, you may have a `contact.changed` or `report.created` event. * The **endpoint** where information about the event is sent. The endpoint is a URL that you provide to the sending application and that will receive the webhook request. #### Webhook request payloads[​](#webhook-request-payloads "Direct link to Webhook request payloads") When a webhook fires, the application where the event occurred will send a POST request to the endpoint you provided. Most applications will send a JSON payload in the request body that contains information about the event that occurred (though some, notably Salesforce, send XML payloads). Some payloads contain the entire record that changed, while others contain only the record's ID and you are expected to fetch the record yourself. #### Webhook security[​](#webhook-security "Direct link to Webhook security") Webhooks are often secured using hash-based message authentication codes (HMAC) to ensure that the request is coming from the application that you expect. **For More Information**: [Secure Webhook Endpoints with HMAC](https://prismatic.io/docs/integrations/triggers/webhook/what-is-hmac.md). #### Responses to webhooks[​](#responses-to-webhooks "Direct link to Responses to webhooks") Some applications expect a particular response to the webhook request to ensure that your application understood the request. These applications typically expect an HTTP 200 response with a special message in the response body that is usually derived from the webhook's payload or request headers. Other applications simply expect an HTTP 200 response. #### Webhooks in Prismatic[​](#webhooks-in-prismatic "Direct link to Webhooks in Prismatic") In Prismatic, the **endpoint** of a webhook is generally an instance's flow's trigger URL. Your integration configures webhooks in the third-party app when the instance is deployed either using an [onInstanceDeploy](https://prismatic.io/docs/custom-connectors/triggers.md#instance-deploy-and-delete-events-for-triggers) function in a custom trigger, or by creating a "setup" flow that is [triggered on instance deploy](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger). Some built-in components will subscribe to webhooks automatically when the instance is deployed. If you configure webhooks on instance deploy, you should also remove webhooks on instance delete using a similar process (a rough code-native sketch of that cleanup appears at the end of this article). If the third-party app expects a specific response to the webhook request, you can build a [custom trigger](https://prismatic.io/docs/custom-connectors/triggers.md) that validates HMAC signatures and responds to the webhook request appropriately. Many built-in components already have triggers that handle this for you. #### Ensuring webhook ordering with a FIFO queue[​](#ensuring-webhook-ordering-with-a-fifo-queue "Direct link to Ensuring webhook ordering with a FIFO queue") Prismatic's integration runner is designed to process requests in parallel. If you invoke an integration with multiple requests in quick succession, the runner will scale and process all of the requests simultaneously. If you have a workflow that requires requests to be processed in a specific order, or a workflow that requires you to process only one record at a time, you can leverage [Flow Concurrency Management](https://prismatic.io/docs/integrations/triggers/fifo-queue.md) to create a "first in, first out" (FIFO) queue.
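For completeness, here is a rough code-native sketch of the webhook cleanup mentioned above, mirroring the deploy-time setup pattern shown later in these docs. The Acme API, the `My Connection` config variable, and the assumption that the deploy handler stored the webhook ID in instance state are all illustrative; treat this as an outline rather than a drop-in implementation.

```
import axios from "axios";
import { flow } from "@prismatic-io/spectral";

const myFlow = flow({
  // ...
  onInstanceRemove: async (context, params) => {
    // Read back the webhook ID that the onInstanceDeploy handler is assumed
    // to have stored in this flow's instance state
    const webhookId = context.instanceState.webhookId;

    // Delete the webhook in the hypothetical Acme API so it stops sending
    // events to an endpoint that no longer exists
    await axios.delete(`https://api.acme.com/webhooks/${webhookId}`, {
      headers: {
        Authorization: `Bearer ${context.configVars["My Connection"].fields.apiKey}`,
      },
    });

    // Clear the stored state now that the webhook is gone
    return { instanceState: {} };
  },
});
```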
--- ##### Custom HMAC Trigger **Hash-Based Message Authentication Code** (or HMAC) is an authentication mechanism used frequently by webhooks to verify that a webhook message is legitimate. It helps ensure that messages sent to a webhook endpoint originated from a particular third-party and not from a malicious actor on the internet. If you compute an HMAC hash from the webhook's request body, your secret key is a string, and the hash is included as a header, you can use the built-in [HMAC Webhook Trigger](https://prismatic.io/docs/components/hash.md#hmac-webhook-trigger) to validate webhook requests. Most (but not all) HMAC implementations follow this pattern. In this tutorial, we'll examine the code behind the [Slack Webhook Trigger](https://prismatic.io/docs/components/slack.md#app-webhook), which differs from the generic HMAC trigger in two ways: * Slack concatenates the request body with a timestamp before computing the HMAC hash (see [Slack documentation](https://api.slack.com/authentication/verifying-requests-from-slack#verifying-requests-from-slack-using-signing-secrets__a-recipe-for-security__step-by-step-walk-through-for-validating-a-request)). * When a webhook is first configured in Slack, Slack sends a URL verification ["challenge" request](https://api.slack.com/apis/connections/events-api#the-events-api__subscribing-to-event-types__events-api-request-urls__request-url-configuration--verification__url-verification-handshake) to the webhook endpoint and expects the endpoint to respond with the same challenge string. The Slack trigger responds with that dynamic response and branches into "URL Verify" or "Notification" branches, depending on whether it received a "challenge" confirmation request or an actual Slack event. Full source code for the Slack trigger can be found in our [examples repository on GitHub](https://github.com/prismatic-io/examples/blob/main/components/slack/src/triggers.ts). #### Computing the Slack HMAC hash[​](#computing-the-slack-hmac-hash "Direct link to Computing the Slack HMAC hash") Rather than hashing only the request's body, Slack hashes a timestamp as well, which is passed in as a header. The string that is hashed follows this format: ``` v0:{TIMESTAMP}:{REQUEST BODY} ``` Then, an HMAC hash is computed and appended to the end of a string that starts with `v0=`. So, we built a function that takes a request body, timestamp, and signing secret, and returns a string in the form Slack sends: Helper function to compute Slack hashes ``` const computeSignature = ( requestBody: string, signingSecret: string, timestamp: number, ) => { const signatureBaseString = `v0:${timestamp}:${requestBody}`; const signature = crypto .createHmac("sha256", signingSecret) .update(signatureBaseString, "utf8") .digest("hex"); return `v0=${signature}`; }; ``` #### Verifying the incoming hash signature[​](#verifying-the-incoming-hash-signature "Direct link to Verifying the incoming hash signature") Now that we have a helper function, let's examine the trigger's `perform` function that is invoked when a webhook request is received. The trigger first gathers the necessary pieces of information: * We fetch the raw, unprocessed request **body** from `payload.rawBody.data` and convert it to a string. * We get the **timestamp** from an HTTP header through `payload.headers["X-Slack-Request-Timestamp"]`. The timestamp is an integer (Unix epoch seconds). * We get the **signing secret** from the Slack connection. 
A connection contains OAuth 2.0 information in addition to a signing secret and is passed in as an input, `params.slackConnection.fields.signingSecret`. Then, we'll run our `computeSignature` function on the data we gathered. If the hash we generate is not the same as the hash that Slack provided (through `payload.headers["X-Slack-Signature"]`), we'll throw an error and the integration will stop running: Check Slack HMAC hash ``` const signingSecret = util.types.toString( params.slackConnection.fields.signingSecret, ); const requestBody = util.types.toString(payload.rawBody.data); const timestamp = util.types.toInt( payload.headers["X-Slack-Request-Timestamp"], ); const computedSignature = computeSignature( requestBody, signingSecret, timestamp, ); const payloadSignature = util.types.toString( payload.headers["X-Slack-Signature"], ); if (payloadSignature !== computedSignature) { throw new Error( "Error validating message signature. Check your signing secret and verify that this message came from Slack.", ); } ``` Assuming the signature Slack generated, `payloadSignature`, matches the hash we computed, `computedSignature`, our trigger continues. #### Returning a Slack challenge and branching[​](#returning-a-slack-challenge-and-branching "Direct link to Returning a Slack challenge and branching") Once we've verified that the webhook we received is, indeed, from Slack, we can return a proper challenge response to Slack if it needs one and then branch appropriately. We'll get the challenge that Slack sent from the request's body (if we got a URL verification request), and then we'll generate a response that contains exactly that challenge string: ``` const challenge = (payload.body.data as Request)?.challenge; const response: HttpResponse = { statusCode: 200, contentType: "text/plain", body: challenge, }; ``` Note: if we didn't receive a `challenge` request, the response body will be blank. That's fine. Slack doesn't expect a special response for other webhook events. Finally, we'll return the `payload` that we received (so it can be used by the integration), the `response` that our webhook should send back to Slack, and the `branch` that the integration should follow (depending on whether we received a URL verify challenge request): ``` return Promise.resolve({ payload, response, branch: challenge ? "URL Verify" : "Notification", }); ``` #### Conclusion[​](#conclusion "Direct link to Conclusion") You can build your own trigger that validates webhooks by following similar patterns, and you can make your HTTP responses as static/simple or dynamic/complex as you'd like. For more information on building custom components and custom triggers, check out the [Writing Custom Components](https://prismatic.io/docs/custom-connectors.md) article. For full source code of the Slack trigger, check out our [examples repo on GitHub](https://github.com/prismatic-io/examples/blob/main/components/slack/src/triggers.ts). --- ##### Sending data to webhook triggers Webhook triggers allow you to run a particular instance or [flow](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md) of an instance by making an HTTP POST, PUT, PATCH, DELETE, or GET request to the webhook's URL. This is useful when you would like an outside application to invoke an integration when something within the outside application occurs. The outside application can assemble data and send that data to a Prismatic webhook URL via an HTTP request. 
For example, third-party software could invoke an instance with a JSON payload whenever a job in the third-party application is complete, like this: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --data '{"renderId":51266,"s3Bucket":"test-customer-renders","status":"complete"}' \ --header "Content-Type: application/json" ``` Note that the payload of the request is available by referencing the integration's trigger, shown in the screenshot below. Steps can then reference data from the webhook payload through the trigger's results. Headers are available through `results.headers` and body data is available through `results.body.data`: ![Set webhook trigger via POST request in Prismatic app](/docs/img/integrations/triggers/webhook/sending-data/webhook-trigger-payload.png) The GET verb is supported You *can* use the GET verb to invoke an instance, but note that the GET verb does not allow you to send data with your request. If you need to send data with your request, use the POST, PUT, DELETE or PATCH verbs. The GET verb was introduced because some applications send a GET request when a webhook is configured to verify that the webhook endpoint is ready to receive requests. #### Adding a webhook trigger to a flow[​](#adding-a-webhook-trigger-to-a-flow "Direct link to Adding a webhook trigger to a flow") * Low-Code * Code-Native When you [add a flow](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md) to your integration, you are prompted to add a trigger to the flow. Select the **Webhook Trigger** option to add a webhook trigger to the flow. If you would like to switch the trigger to a different type, click the three dots to the left of the trigger in the flow and select **Change step action**. The generic webhook trigger is used by default if you omit an `onTrigger` property from your `flow` definition. If you need to parse the payload that the trigger receives or send a custom response to the caller, you can define a custom trigger in your `flow` definition. See [Code-native flow triggers](https://prismatic.io/docs/integrations/code-native/flows.md#code-native-flow-triggers). #### Webhook trigger responses[​](#webhook-trigger-responses "Direct link to Webhook trigger responses") By default, webhook triggers provide an HTTP code 200 ("OK") response to callers of the webhook. The response body contains an execution ID, which can be used later to retrieve logs and step results from the Prismatic API. The response looks like this: ``` curl \ --data '{}' \ --header "Content-Type: application/json" \ 'https://hooks.prismatic.io/trigger/EXAMPLE==' {"executionId":"SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6OTdiNWQxYmEtZGUyZi00ZDY4LWIyMTgtMDFlZGMwMTQxNTM5"} ``` * Low-Code * Code-Native You can customize the response by clicking the integration webhook trigger and selecting a different HTTP code, response body, or response content type: ![Customize webhook trigger response code in Prismatic app](/docs/img/integrations/triggers/webhook/sending-data/webhook-trigger-response.png) Custom webhook responses can be defined in the `onTrigger` block of your `flow` definition. Create an `HttpResponse` object with a `statusCode`, `contentType` and `body` to be returned to the caller: ``` import { HttpResponse, flow, util } from "@prismatic-io/spectral"; import { XMLParser } from "fast-xml-parser"; flow({ // ... 
onTrigger: async (context, payload) => { // Parse the raw XML from the webhook request const parser = new XMLParser(); const parsedBody = parser.parse(util.types.toString(payload.rawBody.data)); // Respond to the request with a plaintext response that includes the challenge key const response: HttpResponse = { statusCode: 200, contentType: "text/plain", body: parsedBody.notification.challenge, }; // Ensure that the payload is updated with the parsed body return Promise.resolve({ payload: { ...payload, body: { data: parsedBody } }, response, }); }, }); ``` #### Other webhook trigger responses[​](#other-webhook-trigger-responses "Direct link to Other webhook trigger responses") You may encounter other responses to your webhook trigger request. ##### HTTP 303 See Other / Redirect to S3 results bucket[​](#http-303-see-other--redirect-to-s3-results-bucket "Direct link to HTTP 303 See Other / Redirect to S3 results bucket") When your webhook trigger is invoked [synchronously](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md), your HTTP client sends a request and waits for the integration to finish running before closing the request. Prismatic responds to the request with an HTTP 303 redirect, redirecting your HTTP client to an object in an Amazon S3 bucket that contains the results of the last step of your integration. Ensure that your HTTP client follows redirects (for `curl`, you add a `--location` flag). Synchronous request redirected ``` $ curl 'https://hooks.prismatic.io/trigger/example==' --location -v ... < HTTP/2 303 < content-type: application/json < content-length: 0 < location: https://example.s3.us-east-2.amazonaws.com/example < date: Wed, 28 Sep 2022 19:42:24 GMT < x-amzn-requestid: 4c2a5179-89a2-4351-afe0-336df2cdef11 < access-control-allow-headers: Accept,CloudFront-Forwarded-Proto,CloudFront-Is-Desktop-Viewer,CloudFront-Is-Mobile-Viewer,CloudFront-Is-SmartTV-Viewer,CloudFront-Is-Tablet-Viewer,CloudFront-Viewer-ASN,CloudFront-Viewer-Country,Host,User-Agent,Via,X-Amz-Cf-Id,X-Amzn-Trace-Id,X-Forwarded-For,X-Forwarded-Port,X-Forwarded-Proto < x-amz-apigw-id: ZL6A7GegCYcF5Ew= < prismatic-executionid: SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6NGVjMTJiNWMtODk2ZC00ZGJiLThjZDgtZWUwYzNlMDE4OTBh < access-control-allow-methods: GET < x-amzn-trace-id: Root=1-6334a39f-0f975a9a7bb343c768645c36;Sampled=1 < x-cache: Miss from cloudfront < via: 1.1 d67353af1bc95b93fa6102d888271954.cloudfront.net (CloudFront) < x-amz-cf-pop: ORD58-P7 < x-amz-cf-id: xraa6hPBPc_texazIJXdKgavfwVRMW2oV85GqwCx6xUkUck7MxGKPg== < ... * Connection #0 to host hooks.prismatic.io left intact * Issue another request to this URL: 'https://example.s3.us-east-2.amazonaws.com/example' ... < HTTP/1.1 200 OK < x-amz-id-2: zx2ZHsGlC/Szu3HCp7xjRAVDALsEdQ/TJ5x/MkcbFkAB8DjLmRJOWCyTGmpI93UhdpqeYxt7hVo= < x-amz-request-id: SE5KV3CJ5AXCYGZA < Date: Wed, 28 Sep 2022 19:42:25 GMT < Last-Modified: Wed, 28 Sep 2022 19:42:25 GMT < x-amz-expiration: expiry-date="Sat, 29 Oct 2022 00:00:00 GMT", rule-id="expire-old-step-results" < ETag: "083be81885a78809b54f4deead0e6c24" < x-amz-server-side-encryption: AES256 < x-amz-version-id: 7bc_zpEQGmpc_stsaC.9vcF1DqxzXWx. 
< Accept-Ranges: bytes < Content-Type: application/octet-stream < Server: AmazonS3 < Content-Length: 11 < * Connection #1 to host payload-bucket20200616192411543900000009.s3.us-east-2.amazonaws.com left intact {"item":"Widgets","quantity":5} ``` Avoid combining --location and --request POST Combining `--location` and an explicit `--request POST` (or `-X POST`) flag in the same `curl` command can have unintended consequences. `curl` will be redirected to an S3 bucket but will attempt to make a `POST` request (rather than a `GET` request) to the S3 bucket. This results in S3 responding with a `SignatureDoesNotMatch` error. You will see the S3 error within [Postman](https://www.postman.com/) if you enable **Follow original HTTP method**. Keep that option unchecked. ![Avoid specifying HTTP method with redirects](/docs/img/integrations/triggers/webhook/sending-data/postman-redirect-warning.png) ##### HTTP 400 Bad Request[​](#http-400-bad-request "Direct link to HTTP 400 Bad Request") You'll see an HTTP 400 response for one of two reasons: * If your request was malformed (for example, you have a header `content-type: application/json`, but the data you sent wasn't valid JSON). Malformed payload ``` $ curl 'https://hooks.treece.prismatic-dev.io/trigger/example==' -v \ --data "{bad-data" \ --header "content-type: application/json" ... < HTTP/2 400 < content-type: application/json < content-length: 44 < date: Wed, 28 Sep 2022 19:47:09 GMT < x-amzn-requestid: 64a06065-c2d5-4d1b-8188-1513f072cb8e < access-control-allow-headers: Accept,CloudFront-Forwarded-Proto,CloudFront-Is-Desktop-Viewer,CloudFront-Is-Mobile-Viewer,CloudFront-Is-SmartTV-Viewer,CloudFront-Is-Tablet-Viewer,CloudFront-Viewer-ASN,CloudFront-Viewer-Country,content-type,Host,User-Agent,Via,X-Amz-Cf-Id,X-Amzn-Trace-Id,X-Forwarded-For,X-Forwarded-Port,X-Forwarded-Proto < x-amz-apigw-id: ZL6tYEjPCYcFmMA= < prismatic-executionid: SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6YmIwZmQwMjgtYmRlOS00NDFhLTg4ZTYtNjcwZDlkMDY2NjZm < access-control-allow-methods: POST < x-amzn-trace-id: Root=1-6334a4bb-5a02f67725c134b505472c11;Sampled=1 < x-cache: Error from cloudfront < via: 1.1 ee57d6770700357db4b696b4c5250b82.cloudfront.net (CloudFront) < x-amz-cf-pop: ORD58-P7 < x-amz-cf-id: qSGjLnZJNDjUkUGU2Wl297s6eezPSFs5UEF58H_hceHj9JokJkfB3A== < * Connection #0 to host hooks.treece.prismatic-dev.io left intact {"error":"Received malformed JSON payload."} ``` * When your webhook trigger is invoked [synchronously](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md) and stops because an error is thrown, your request will receive an HTTP 400 response with the error that was thrown. Error thrown in synchronous execution ``` curl 'https://hooks.treece.prismatic-dev.io/trigger/example==' -v ... * We are completely uploaded and fine * Connection state changed (MAX_CONCURRENT_STREAMS == 128)! 
< HTTP/2 400 < content-type: application/json < content-length: 55 < date: Wed, 28 Sep 2022 19:53:33 GMT < x-amzn-requestid: 8e46051e-2783-4e25-b95d-35ee323ce523 < access-control-allow-headers: Accept,CloudFront-Forwarded-Proto,CloudFront-Is-Desktop-Viewer,CloudFront-Is-Mobile-Viewer,CloudFront-Is-SmartTV-Viewer,CloudFront-Is-Tablet-Viewer,CloudFront-Viewer-ASN,CloudFront-Viewer-Country,content-type,Host,User-Agent,Via,X-Amz-Cf-Id,X-Amzn-Trace-Id,X-Forwarded-For,X-Forwarded-Port,X-Forwarded-Proto < x-amz-apigw-id: ZL7paEZjiYcFU4A= < prismatic-executionid: SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6NTM5ZGE0YjItNDViMC00MDQxLTg3MTgtMzFhZDkwMDg1Y2Iw < access-control-allow-methods: POST < x-amzn-trace-id: Root=1-6334a63c-0d58198b7c2e26d11f4331f3;Sampled=1 < x-cache: Error from cloudfront < via: 1.1 26c731836eb716e46fe9852a7aaeb508.cloudfront.net (CloudFront) < x-amz-cf-pop: ORD58-P7 < x-amz-cf-id: ZKCbTC07GI90gLLyeRNmGaHub0gpULRK45_GZ7d7uGM3njGFYKaPkw== < * Connection #0 to host hooks.treece.prismatic-dev.io left intact {"error":"The widget requested is not in the database"} ``` ##### HTTP 429 Too Many Requests / Rate Limiting[​](#http-429-too-many-requests--rate-limiting "Direct link to HTTP 429 Too Many Requests / Rate Limiting") A webhook endpoint URL can be invoked up to 50 times per second. If a request is received for an endpoint URL that has already received 50 requests in the last second, the request will receive a 429 "too many requests" response. Request rate limited ``` $ curl 'https://hooks.prismatic.io/trigger/example==' -v ... < HTTP/2 429 < content-type: application/json < content-length: 200 < date: Wed, 28 Sep 2022 19:36:55 GMT < x-amzn-requestid: f53fbf8c-9ce6-4c4a-80a6-84edbc29fdeb < x-amz-apigw-id: ZL5NtH82CYcFbFg= < x-amzn-trace-id: Root=1-6334a257-1ca7ad23574a8c3255b24776;Sampled=1 < x-cache: Error from cloudfront < via: 1.1 a044221a7cde0fa9b5dc69d5ceb4439a.cloudfront.net (CloudFront) < x-amz-cf-pop: ORD58-P7 < x-amz-cf-id: BoOs_sbk7gB-t16ZCR4uVtD8NcPnmD40rGUPwgaouVe2hiHu60Rcjw== < * Connection #0 to host hooks.prismatic.io left intact {"error":"Endpoint with id: example== has exceeded maximum allowed throughput of 50 requests/second. Please throttle your requests."} ``` #### Webhook endpoint configuration[​](#webhook-endpoint-configuration "Direct link to Webhook endpoint configuration") Webhook triggers can be configured for an integration in one of three ways: * **Instance and Flow Specific**: Each flow on each instance gets its own unique endpoint. This is the default configuration. * **Instance Specific**: Each instance gets a unique endpoint, and the integration determines which flow to run based on header or payload data. * **Shared**: All customers' instances of the integration share an endpoint. Data in the header or payload determines which customer and flow should run. For information on configuring and troubleshooting webhook endpoints, see our [Endpoint Configuration](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md) article. #### Sending data to webhook triggers[​](#sending-data-to-webhook-triggers "Direct link to Sending data to webhook triggers") A webhook parses data from the following sources: * The **request body** - the JSON (or other) data that is sent to the webhook as an [HTTP request body](https://developer.mozilla.org/en-US/docs/Web/HTTP/Messages#body). * The **request headers** - the [HTTP headers](https://developer.mozilla.org/en-US/docs/Web/HTTP/Messages#headers). 
* The **URL path** - The [path to resource](https://developer.mozilla.org/en-US/docs/Learn/Common_questions/What_is_a_URL#path_to_resource) that follows the integration webhook URL. * The **URL parameters** - The [parameters](https://developer.mozilla.org/en-US/docs/Learn/Common_questions/What_is_a_URL#parameters) that follow the `?` in a URL. Take, for example, this `curl` invocation: ``` curl \ 'https://hooks.prismatic.io/trigger/EXAMPLE==/my/custom/path?param-one=ParamValueOne&param-two=ParamValueTwo' \ --header "header-one: First header value" \ --header "header-two: Second header value" \ --header "Content-Type: application/json" \ --data '{"Payload Key 1":"Payload Value 1","Do Thing?":true,"quantity":123}' ``` * The request body - `{"Payload Key 1":"Payload Value 1","Do Thing?":true,"quantity":123}` - is parsed (if JSON) and is accessible to the integration by referencing the trigger's `results.body.data.KEY-NAME`. Non-JSON payloads (like XML, images, etc.) are accessible through `results.rawBody` and can be parsed in subsequent steps that handle that type of data. * The request headers are accessible through the trigger's `results.headers.HEADER-NAME`. * The URL path - `my/custom/path` - is accessible through the trigger's `results.pathFragment`. You can pass that data into the built-in [split string](https://prismatic.io/docs/components/text-manipulation.md#split-string) action and split on the `/` character to split the URL path into an array `['my','custom','path']`. * The URL parameters - `?param-one=ParamValueOne&param-two=ParamValueTwo` - are parsed and accessible through the trigger's `results.queryParameters.PARAMETER-NAME`. ![Screenshot of Sending Data to Webhook Triggers](/docs/img/integrations/triggers/webhook/sending-data/sending-data-to-webhook-triggers.png) #### Posting binary data with webhook triggers[​](#posting-binary-data-with-webhook-triggers "Direct link to Posting binary data with webhook triggers") If you have binary data (like an image or PDF) that you would like to post as part of your webhook invocation, you can pass that binary data in as part of your request. For example, if you have an image, `my-image.png`, you could invoke a test of an integration with: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --request POST \ --header 'Content-Type: image/png' \ --data-binary '@/path/to/my-image.png' ``` The binary file can be accessed by subsequent steps by referencing `integrationTrigger.results.body.data`. #### Posting multipart data with webhook triggers[​](#posting-multipart-data-with-webhook-triggers "Direct link to Posting multipart data with webhook triggers") It's useful to be able to post a combination of binary and text data to a Prismatic webhook. For example, you might want to post information about a person, as well as an avatar image of the person, to be processed by an integration.
To do that, use a content type of `multipart/form-data` with your webhook invocation: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --request POST \ --header "Content-Type: multipart/form-data" \ --form person='{"firstname":"Taylor","lastname":"Reece"};type=application/json' \ --form photo=@taylor.png ``` The first name in this example is accessible by referencing the trigger's `results.body.data.person.data.firstname`, and the avatar image is accessible by referencing `results.body.data.photo`: ![Post multipart data with webhook triggers in Prismatic app](/docs/img/integrations/triggers/webhook/sending-data/webhook-multipart-payload.png) #### Accessing webhook URLs in an integration[​](#accessing-webhook-urls-in-an-integration "Direct link to Accessing webhook URLs in an integration") An integration is aware of its own webhook URLs, which is handy for setting up webhooks in third-party apps on deploy, or when you need one flow to call a sibling flow by webhook URL. * Low-Code * Code-Native Those URLs are accessible by referencing the trigger's `results.webhookUrls` object: ![Access webhook URLS for integration in Prismatic app](/docs/img/integrations/triggers/webhook/sending-data/webhook-urls-payload.png) This comes in handy if you need to configure a third-party service to send data to your webhooks. A common pattern is for one [flow](https://prismatic.io/docs/integrations/low-code-integration-designer/flows.md) of your integration to be run when an instance is deployed using a [deploy trigger](https://prismatic.io/docs/integrations/triggers/management.md#instance-deploy-trigger). That deploy-time flow can set up webhooks in a third-party app by referencing its trigger's `results.webhookUrls` values. Then, the third-party app will invoke the other flows of the integration when it needs to. If you use [shared endpoint configuration](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md), the shared endpoint URL is accessible from `results.invokeUrl`. If you set up [API keys](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#securing-endpoints-with-api-keys) for your deployed instances, you can access `results.webhookApiKeys` similarly. An instance's flow can have multiple API keys assigned to it, so each `results.webhookApiKeys.MY FLOW NAME` is an array. You will likely reference the first API key, `results.webhookApiKeys.MY FLOW NAME.0`. `webhookUrls` in a code-native integration is available in the `context` object passed to the `onTrigger`, `onInstanceDeploy`, `onInstanceRemove` or `onExecution` functions. For example, if you want to establish a webhook in a third-party app when an instance is deployed, you can do so in the `onInstanceDeploy` function: ``` const myFlow = flow({ // ... onInstanceDeploy: async (context, params) => { // Get the current flow's webhook URL const flowWebhookUrl = context.webhookUrls[context.flow.name]; // Create a webhook in Acme const { data } = await axios.post( "https://api.acme.com/webhooks", { endpoint: flowWebhookUrl, events: ["message.created", "message.updated"], }, { headers: { Authorization: `Bearer ${context.configVars["My Connection"].fields.apiKey}`, }, }, ); // Store the webhook ID in the flow's persistent state so it can be // deleted on instance delete return { instanceState: { webhookId: data.id }, }; }, }); ``` --- ##### Synchronous and Asynchronous Invocations Integrations are configured by default to run **asynchronously**. 
This means that whenever an integration is invoked by its trigger webhook URL, the integration begins to run and the system that invoked the integration can proceed to complete other work. This is the most common case for integrations - you want to start up an instance when a certain event occurs, but you don't want to wait while the instance runs. Sometimes, however, it's useful for an application to get information back from the instance that was invoked. For example, you might want your proprietary software to wait until an instance runs to completion before completing other work. In that case, you can choose to run your integration **synchronously**. Then, when your software makes a call to the instance's webhook trigger URL, the HTTP request is held open until the instance run is complete. When you choose to run your integrations **synchronously**, the HTTP request that invokes an instance returns a *redirect* to a URL containing the output results of the final step of the integration. For example, if the final step of your integration pulls down JSON from a third-party API, you will see this when you invoke the integration synchronously: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --data '{}' \ --header "Content-Type: application/json" \ --header "prismatic-synchronous: true" \ --location {"id":1,"name":"Leanne Graham","username":"Bret","email":"Sincere@april.biz","address":{"street":"Kulas Light","suite":"Apt. 556","city":"Gwenborough","zipcode":"92998-3874","geo":{"lat":"-37.3159","lng":"81.1496"}},"phone":"1-770-736-8031 x56442","website":"hildegard.org","company":{"name":"Romaguera-Crona","catchPhrase":"Multi-layered client-server neural-net","bs":"harness real-time e-markets"}} ``` You can toggle whether your integration is synchronous or asynchronous by clicking the trigger and selecting a **Response Type**. You can also pass in a header, `prismatic-synchronous`, with a webhook invocation to instruct your instance to run synchronously or asynchronously: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --header "prismatic-synchronous: false" \ --request POST ``` ##### Synchronous invocations and redirects[​](#synchronous-invocations-and-redirects "Direct link to Synchronous invocations and redirects") When you invoke an instance synchronously, the HTTP response you receive contains the result of the final step of your flow. If the result is over 5MB in size, the result is written to Amazon S3, and you receive an HTTP 303 redirect to an object in S3. Because of this possible redirect, you should ensure that your HTTP client is configured to follow HTTP status code 303 redirects. For `curl`, for example, include a `-L / --location` flag so it follows redirects. Ensure redirects are followed ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --header "prismatic-synchronous: true" \ --location \ --request POST \ --data "{}" ``` If you would like Prismatic's runner to always return 303 redirects, you can include an optional header, `prismatic-prefer-redirect-sync-response: true`, and the runner will return an HTTP 303 response to an S3 object, regardless of size. Response content type You can control the `content-type` of the response by adding a `contentType` property to your last step's result. For example, ``` const returnData = `<note><to>Tove</to><from>Jani</from><heading>Reminder</heading><body>Don't forget me this weekend!</body></note>
`; return { data: returnData, contentType: "application/xml", }; ``` #### HTTP status codes for synchronous integrations[​](#http-status-codes-for-synchronous-integrations "Direct link to HTTP status codes for synchronous integrations") When an instance is configured to run synchronously or is invoked synchronously with the `prismatic-synchronous` header, the HTTP response returns a status code `200 - OK` by default. It's sometimes useful, however, to return other HTTP status codes. For example, if a client submits incorrectly formatted data to be processed by an instance, it might be helpful to return a `406 - Not Acceptable` or `415 - Unsupported Media Type`. * Low-Code * Code-Native To accomplish this in a low-code integration, you can configure the final step of your integration to return a different status code. Most commonly, you can add a [Stop Execution](https://prismatic.io/docs/components/stop-execution.md) step to the end of your integration and specify an HTTP response that it should return. ![Configure HTTP status codes for synchronous integrations in Prismatic app](/docs/img/integrations/triggers/webhook/synchronous-and-asynchronous/stop-execution-status-code.png) If you would like to return HTTP status codes from a custom component at the end of your integration instead, return an object with a `statusCode` attribute instead of a `data` attribute: ``` return { statusCode: 415 }; ``` To accomplish this in a code-native integration, mark your flow as `isSynchronous: true,` and return an object with a `statusCode` attribute: Return a 415 status code from a code-native integration ``` export const flow1 = flow({ name: "Flow 1", stableKey: "9499d1d8-dddd-4d9b-aaff-c054f59d02cc", description: "This is the first flow", isSynchronous: true, onExecution: async (context, params) => { return { data: { customError: "Invalid file type. You must provide an SVG.", }, statusCode: 415, headers: { "X-Custom-Header": "foo" }, contentType: "application/json", }; }, }); ``` ``` $ curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --verbose \ --location \ --header "prismatic-synchronous: true" * TCP_NODELAY set * Connected to hooks.prismatic.io (13.227.37.2) port 443 (#0) ... < HTTP/2 415 ``` #### Response headers for synchronous integrations[​](#response-headers-for-synchronous-integrations "Direct link to Response headers for synchronous integrations") In addition to HTTP status codes (above), you can also yield custom response headers from your synchronous integrations. This is useful if you would like to redirect the client to a different URL once the flow is complete. Your code step, for example, can read: ``` module.exports = async ({ logger, configVars }, stepResults) => { return { data: "Redirecting you...", statusCode: 303, headers: { Location: "https://example.com" }, }; }; ``` When a client invokes the integration synchronously, they will receive a `303 - See Other` status code and be redirected to `https://example.com`. #### Synchronous call limitations[​](#synchronous-call-limitations "Direct link to Synchronous call limitations") ##### Response body and status code limitations[​](#response-body-and-status-code-limitations "Direct link to Response body and status code limitations") When an integration is invoked synchronously, the integration redirects the caller to a URL containing the output results of the final step of the integration. 
If the final step of the integration is a [Stop Execution](https://prismatic.io/docs/components/stop-execution.md) action, or any custom component action that returns a `statusCode`, the redirect does not occur and the caller receives a `null` response body instead. ##### API gateway size and time limitations[​](#api-gateway-size-and-time-limitations "Direct link to API gateway size and time limitations") AWS API Gateway times out requests after 29 seconds, and our maximum response size is 500MB. So, to get a response from an instance that is invoked synchronously, please ensure that your integration runs in under 29 seconds and produces a final step payload of less than 500MB. If your integration regularly takes over 29 seconds to run, or produces large responses, we recommend that you run your integrations asynchronously instead. When you invoke an integration asynchronously, you receive an `executionId`: ``` curl 'https://hooks.prismatic.io/trigger/EXAMPLE==' \ --data '{}' \ --header "Content-Type: application/json" {"executionId":"SW5zdGFuY2VFeGVjdXRpb25SZXN1bHQ6OTdiNWQxYmEtZGUyZi00ZDY4LWIyMTgtMDFlZGMwMTQxNTM5"} ``` That execution ID can be exchanged later with the Prismatic API for logs and step results using the [executionResult](https://prismatic.io/docs/api/schema/query/executionResult.md) GraphQL query. --- ##### What is HMAC? When it comes to transferring data via integrations, security is a top concern. To secure data being passed via webhooks ([for event-driven integrations](https://prismatic.io/docs/integrations/common-patterns.md#event-driven-integrations)), you have a few options, including API keys, your own authorization headers, or HMAC. Prismatic recommends using HMAC because it's the simplest (and strongest) approach you could use. It doesn't require that your team learn a new language or gain an advanced understanding of encryption, but it does an excellent job of protecting the integrity of your data at the point of transfer. #### What is HMAC?[​](#what-is-hmac "Direct link to What is HMAC?") HMAC, or hash-based message authentication code, is an authentication method that generates a hash from a message and a cryptographic key. When you implement HMAC for your webhook, you'll use an algorithm such as MD5, SHA-256, or RIPEMD-128 for the hash to ensure the HTTP request that shows up at your webhook endpoint is legitimate. #### How does HMAC work?[​](#how-does-hmac-work "Direct link to How does HMAC work?") Before the source app sends an HTTP request via the webhook, it hashes the serialized payload (request body) with HMAC using the secret key. The resulting hash is then bundled into the HTTP request as a header, and the entire request (header and body) is sent to the webhook endpoint. The Secret Key is never sent in the payload! It's important to be aware that the secret key is not (and should never be) sent in the payload -- it is only used to generate and validate the HMAC signature. Upon receiving the HTTP request, the destination app hashes the body with the secret key and then compares the result to the hash provided in the header. If the values match, the destination app knows the data is legitimate and processes it. If the values do not match, the destination app rejects the data and executes whatever code was written for that scenario - perhaps creating a log entry or sending a notification. If someone tries to spoof the payload, they won't be able to generate a valid hash since they don't have the secret key. Door closed.
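To make the destination app's validation step concrete, here is a minimal sketch of receiving-side verification in Node.js. It assumes the sender placed the hash in an `x-hmac-hash` header (as in the example request below) and that the raw request body is available as a string; the `verifyHmac` helper and header name are illustrative, not part of any Prismatic API.

```
import { createHmac, timingSafeEqual } from "crypto";

// Shared secret, exchanged with the sender out-of-band (illustrative value)
const SECRET_KEY = "secret-FA782CF7-060E-484E-B3DC-055CF2C9ED99";

// Returns true only if the hash supplied in the x-hmac-hash header matches
// a hash we compute ourselves from the raw request body and the shared secret
const verifyHmac = (rawBody: string, receivedHash: string): boolean => {
  const expectedHash = createHmac("sha256", SECRET_KEY)
    .update(rawBody, "utf-8")
    .digest("hex");
  // timingSafeEqual requires equal-length buffers and avoids leaking
  // information through string-comparison timing
  return (
    expectedHash.length === receivedHash.length &&
    timingSafeEqual(Buffer.from(expectedHash), Buffer.from(receivedHash))
  );
};
```

If `verifyHmac` returns `false`, reject the request (for example, with an HTTP 401) rather than processing the payload.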
Key Rotation can reduce security risks Rotating the secret key on a regular schedule can decrease the risk of key exposure, leakage, long-term compromise, orphaned integration compromise, insider threats, and compliance violations. Consider this approach when setting up HMAC validation. #### HMAC language support[​](#hmac-language-support "Direct link to HMAC language support") Here are links to popular languages with HMAC capabilities: * [NodeJS](https://nodejs.org/api/crypto.html) * [Python](https://docs.python.org/3/library/hmac.html) * [PHP](https://www.php.net/manual/en/function.hash-hmac.php) * [.NET C#](https://docs.microsoft.com/en-us/dotnet/api/system.security.cryptography.hmac?view=net-6.0) #### Example code for HMAC[​](#example-code-for-hmac "Direct link to Example code for HMAC") Here is an example of how HMAC might be set up in NodeJS using the built-in crypto module: ``` const crypto = require("crypto"); const SECRET_KEY = "secret-FA782CF7-060E-484E-B3DC-055CF2C9ED99"; const payload = JSON.stringify({ event: "REFUND_REQUEST", user: "realcustomer@notabaddie.com", amount: "50.25", }); const hash = crypto .createHmac("sha256", SECRET_KEY) .update(payload, "utf-8") .digest("hex"); console.log(hash); // Prints d12f95e3f98240cff00b2743160455fdf70cb8d431db2981a9af8414fc4ad5f8 ``` The corresponding HTTP request using HMAC might look like this: ``` curl https://my.webhook.endpoint.com/callback \ --request POST \ --header "x-hmac-hash: d12f95e3f98240cff00b2743160455fdf70cb8d431db2981a9af8414fc4ad5f8" \ --data '{"event":"REFUND_REQUEST","user":"realcustomer@notabaddie.com","amount":"50.25"}' ``` #### Example code for HMAC multipart form data[​](#example-code-for-hmac-multipart-form-data "Direct link to Example code for HMAC multipart form data") If you're planning to use HMAC to send data as multipart form data, you'll need to hash the entire request body and use buffers to convert the data into the appropriate format. Here's an example of how you might do that in NodeJS: ``` const crypto = require("crypto"); const formData = require("form-data"); const fs = require("fs"); const SECRET_KEY = "secret-FA782CF7-060E-484E-B3DC-055CF2C9ED99"; // Create the Forms Data object const data = new formData(); data.append("foo", "bar"); data.append("baz", JSON.stringify({ buz: "biz" }), { contentType: "application/json", }); data.append("my-buffer", Buffer.from(fs.readFileSync("/Path/To/File/")), { filename: "name_of_file.ext", contentType: "content/type", }); // Create the HMAC hash for the fetch request const hash = crypto .createHmac("sha256", SECRET_KEY) .update(data.getBuffer().toString()) .digest("hex"); console.log(hash); ``` #### Sending HMAC through Axios or fetch[​](#sending-hmac-through-axios-or-fetch "Direct link to Sending HMAC through Axios or fetch") If you're using Axios or fetch to send the HTTP request, you can include the HMAC hash in the headers like this: With fetch, you'll need to ensure you provide the body as a buffer object and include the boundary from the form data object in the content-type header.
``` const response = await fetch("https://my.webhook.endpoint.com/callback", { method: "POST", body: data.getBuffer(), headers: { "x-hmac-hash": hash, "content-type": `multipart/form-data; boundary=${data.getBoundary()}`, }, }); console.log(response.json()); ``` With Axios, you can include the HMAC hash in the headers like this: ``` axios.post("https://my.webhook.endpoint.com/callback", data, { headers: { "x-hmac-hash": hash, }, }); ``` #### Further resources[​](#further-resources "Direct link to Further resources") Setting up HMAC for one of your integrations should be straightforward. To make things even simpler, we've added an [HMAC Webhook trigger](https://prismatic.io/docs/components/hash.md#hmac-webhook-trigger) to our built-in Hash component. In many cases, this will address your needs, but for times it doesn't, here is a quickstart tutorial for [Writing a Custom Webhook Trigger with HMAC Validation](https://prismatic.io/docs/integrations/triggers/webhook/custom-hmac-trigger.md). --- ### Custom Connectors #### Custom Connectors Overview #### Overview[​](#overview "Direct link to Overview") Prismatic is extensible and allows developer users to develop their own custom connectors. Connectors that Prismatic users develop are proprietary to their organization and are private. Connectors are Node.js/TypeScript projects that accomplish specific tasks or connect to an outside service. Connectors comprise: * [Connections](https://prismatic.io/docs/custom-connectors/connections.md) which contain information such as passwords, API keys, or OAuth 2.0 connection information. Your customers fill in their authentication information for connections when they deploy your integration. * [Actions](https://prismatic.io/docs/custom-connectors/actions.md) which are purpose-built functions that you can use in your integration. You might build "Create Widget" or "List Gadgets" actions in a custom connector, and each action can be used as a step within an [integration](https://prismatic.io/docs/integrations.md). * [Triggers](https://prismatic.io/docs/custom-connectors/triggers.md) determine when an integration should run and how it should handle requests. You might create a custom trigger that configures webhooks in a third-party application when a customer configures an instance of your integration and responds to webhook requests from the third-party appropriately. * [Data Sources](https://prismatic.io/docs/custom-connectors/data-sources.md) fetch data dynamically during the [config wizard](https://prismatic.io/docs/integrations/config-wizard.md) experience. For example, you might create a data source that fetches a list of projects in a CMS and presents the projects as a dropdown menu in the config wizard. [Sample connector code](https://github.com/prismatic-io/examples/tree/main/components) is referenced throughout these docs. For a sample connector that wraps an HTTP-based API, see our quickstart on [Wrapping an API in a Custom Connector](https://prismatic.io/docs/custom-connectors/get-started/wrap-an-api.md). #### Custom connector library[​](#custom-connector-library "Direct link to Custom connector library") [![Spectral NPM version](https://badge.fury.io/js/@prismatic-io%2Fspectral.svg)](https://www.npmjs.com/package/@prismatic-io/spectral) Prismatic provides a Node.js package, [@prismatic-io/spectral](https://www.npmjs.com/package/@prismatic-io/spectral), which provides TypeScript typing and some utility functions. 
Source code for Spectral is available on [GitHub](https://github.com/prismatic-io/spectral). **Node.js Version Support**: While many versions of Node.js may work for connector development, we recommend using the latest LTS (long-term support) version of Node.js. You can find the latest LTS version on the [Node.js download page](https://nodejs.org/en/download/). Connector vs Component The terms **connector** and **component** are used interchangeably in the Prismatic platform. Generally, if a component connects to a third-party API, we call it a **connector**. But, not all components connect to external APIs. The [Change Data Format](https://prismatic.io/docs/components/change-data-format.md) component, for example, converts data between common formats (JSON, XML, etc), and the [Math](https://prismatic.io/docs/components/math.md) component provides a variety of math utility functions, but neither component makes a network request to a third-party app. #### Customer users and custom components[​](#customer-users-and-custom-components "Direct link to Customer users and custom components") If you use the [embedded workflow builder](https://prismatic.io/docs/embed/workflow-builder.md), your customers may want to build their own custom components and use them in their workflows. These private components are scoped to the customer and are not visible to other customers. This is beneficial when customers need to build custom components specific to their business (for example, to wrap their own API). The development of customer-scoped custom components follows the same process as organization-wide custom components. Customer users must be able to log in to Prismatic For a customer user to publish a custom component, you must [create a customer user account](https://prismatic.io/docs/customers/customer-users.md) and assign them the "Admin" role. The customer user must be able to log into Prismatic to authenticate the Prism CLI tool for publishing custom components. A customer user cannot use a signed JWT to publish custom components. Once a customer user has logged into Prismatic, they can authenticate the Prism CLI tool by running `prism login` and then publish a component scoped to their customer with `prism components:publish`, as an organization team member would. To publish a component as an organization team member for a specific customer, run `prism components:publish --customer <customer-id>`, using an ID that you can fetch by running `prism customers:list --columns "id,name"`. --- #### Custom Actions #### Overview[​](#overview "Direct link to Overview") A component comprises zero, one, or many actions. For example, the [HTTP component](https://prismatic.io/docs/components/http.md) contains actions to [GET](https://prismatic.io/docs/components/http.md#get-request) (`httpGet`), [POST](https://prismatic.io/docs/components/http.md#post-request) (`httpPost`), etc. An action can be added as a step of an integration. An `action` has three required properties: 1. `display` which affects how the action renders within the Prismatic web application 2. A series of `input` fields 3. A function to `perform` when the action is encountered in a flow. An action may return some `data` that can be used in a subsequent step.
``` import { action, input } from "@prismatic-io/spectral"; const myAction = action({ display: { label: "Say Hello", description: "Concatenate the first and last name of a person", }, inputs: { firstName: input({ label: "First Name", type: "string", required: true }), lastName: input({ label: "Last Name", type: "string", required: true }), }, perform: async (context, params) => { const myMessage = `Hello, ${params.firstName} ${params.lastName}`; return Promise.resolve({ data: myMessage }); }, }); ``` #### Action inputs[​](#action-inputs "Direct link to Action inputs") Components can take `inputs`. Each `input` comprises a required `label` and `type`, and optional `placeholder`, `default`, `comments`, `required`, and `model` properties. Consider this example input: ``` const middleName = input({ label: "Middle Name", placeholder: "Middle name of a person", type: "string", required: false, default: "", comments: "Leave blank if the user has no middle name", clean: (value) => util.types.toString(value), }); ``` This contributes to an input prompt that looks like this: ![Step Config - Properly Format Name in Prismatic app](/docs/img/custom-connectors/actions/inputs.png) Note where the `label` and `placeholder` text appeared in the web application, and note that First Name and Last Name are required - indicated with a `*`, but Middle Name is not. ##### Action input types[​](#action-input-types "Direct link to Action input types") An input can take a number of types, which affects how the input renders in the Prismatic web application: * **string** will allow users to input or reference a string of characters. ![String input in Prismatic app](/docs/img/custom-connectors/actions/input-types/string.png) * **password** will allow users to input or reference a string of characters, and the string will be obfuscated in the UI. ![Password input in Prismatic app](/docs/img/custom-connectors/actions/input-types/password.png) * **boolean** allows users to enter one of two values: true or false. ![Boolean input in Prismatic app](/docs/img/custom-connectors/actions/input-types/boolean.png) * **code** opens a code editor so users can enter XML, HTML, JSON, etc. Syntax highlighting can be added to a **code** input's definition and can reference any language supported by [PrismJS](https://prismjs.com/#supported-languages). (e.g. `input({ label: "My Code", type: "code", language: "json" })`) ![Code editor in Prismatic app](/docs/img/custom-connectors/actions/input-types/code.png) * **conditional** allows users to enter a series of logical conditionals. This is most notably used in the [branch](https://prismatic.io/docs/components/branch.md) component. ![Conditional input in Prismatic app](/docs/img/custom-connectors/actions/input-types/conditional.png) You can also create **connection** inputs for actions. Read more about [connections](https://prismatic.io/docs/custom-connectors/connections.md). ##### Dropdown menu inputs[​](#dropdown-menu-inputs "Direct link to Dropdown menu inputs") Rather than allowing integration builders to enter values for an input, you might want to have users choose a value from a list of possible values. You can do that by making your input into a dropdown menu.
![Dropdown menu in Prismatic app](/docs/img/custom-connectors/actions/dropdown-input.png) To create an input with a dropdown menu, add a `model` property to your input: ``` export const acmeEnvironment = input({ label: "Acme Inc Environment to Use", placeholder: "ACME Environment", type: "string", required: true, model: [ { label: "Production", value: "https://api.acme.com/", }, { label: "Staging", value: "https://staging.acme.com/api", }, { label: "Sandbox", value: "https://sandbox.acme.com/api", }, ], }); ``` The `model` property should be an array of objects, with each object containing a `label` and a `value`. The `label` is shown in the dropdown menu. The `value` is passed in as the input's value to the custom component. ##### Collection inputs[​](#collection-inputs "Direct link to Collection inputs") Most inputs represent single strings. A **collection** input, on the other hand, represents an array of values or key-value pairs. Collections are handy when you don't know how many items a component user might need. ###### Value list collection[​](#value-list-collection "Direct link to Value list collection") For example, your component might require an array of record IDs to query, but you might not know how many record IDs a component user will enter. You can create a `valuelist` collection in code like this: Value List Collection Example ``` const recordIdsInputField = input({ label: "Record ID", type: "string", collection: "valuelist", required: true, }); ``` The corresponding UI in the integration designer would then prompt a user for any number of record IDs that they would like to enter: ![Value List collection in Prismatic app](/docs/img/custom-connectors/actions/input-types/value-list-collection.png) When the input is received by an action's [perform function](#the-perform-function), the input is a `string[]`. ###### Key value list collection[​](#key-value-list-collection "Direct link to Key value list collection") If you would like users to enter a number of key-value pairs as an input, you can use a `keyvaluelist` collection. The *Header* input on the [HTTP component](https://prismatic.io/docs/components/http.md#get-request) actions is an example of a `keyvaluelist` collection, and is defined in code like this: Key Value List Input ``` export const headersInputField = input({ label: "Header", type: "string", collection: "keyvaluelist", required: false, comments: "A list of headers to send with the request.", example: "User-Agent: curl/7.64.1", }); ``` The "Header" input, then, appears like this in the integration designer: ![Key Value List Collection in Prismatic app](/docs/img/custom-connectors/actions/input-types/key-value-list-collection.png) When the input is received by an action's [perform function](#the-perform-function), the input is an array of objects of the form: ``` [ { key: "foo", value: "bar", }, { key: "baz", value: 5, }, ]; ``` If you would like to convert the input to a key-value pair object, you can use the built-in Spectral function, `keyValPairListToObject`: ``` import { util } from "@prismatic-io/spectral"; const myObject = util.types.keyValPairListToObject(myInput); // { foo: "bar", baz: 5 } ``` ##### Cleaning inputs[​](#cleaning-inputs "Direct link to Cleaning inputs") An input of an action can be anything - a number, string, boolean, JavaScript Buffer, a complex object with lots of properties, etc. If you reuse an input for multiple actions, it's handy to do some preprocessing and type conversion on the input.
That's where a `clean` function on an input comes in. For example, suppose you expect an input to be a number. But, inputs by default are presented to `perform` functions as strings. You can leverage the `util.types.toNumber()` utility function and `clean` property to ensure that the input is presented to the `perform` function as a number: Ensure input is a number ``` const serverPortInput = input({ label: "Server Port", placeholder: "The port of the API server", comments: "Look for the number after the colon (my-server.com:3000)", type: "string", default: "3000", required: true, clean: (value) => util.types.toNumber(value), }); ``` You can also add validation to the input. For example, if you want to validate that the input is an IPv4 IP address, you can build a more complex `clean` function: Validate that an input is an IP address ``` const validateIpAddress = (value: unknown) => { const ipAddressRegex = /^(?:(?:2(?:[0-4][0-9]|5[0-5])|[0-1]?[0-9]?[0-9])\.){3}(?:(?:2([0-4][0-9]|5[0-5])|[0-1]?[0-9]?[0-9]))$/; const inputValue = util.types.toString(value); if (!ipAddressRegex.test(inputValue)) { throw new Error(`The value "${inputValue}" is not a valid IP address`); } return inputValue; }; const ipAddressInput = input({ label: "IP Address", placeholder: "Server IP Address", type: "string", default: "192.168.1.1", required: true, clean: validateIpAddress, }); ``` ##### Handle complex inputs in a custom action[​](#handle-complex-inputs-in-a-custom-action "Direct link to Handle complex inputs in a custom action") When an API endpoint that you're wrapping in a custom action expects a simple payload, like ``` POST /widgets { "name": "string", "color": "string", "quantity": "number" } ``` it's easy to map each value in the POST request to an input. Here, we'd create a "name" input, a "color" input, and a "quantity" input. Then, we'd apply `clean: util.types.toNumber` to the "quantity" input to ensure it is cast to a number. But, some endpoints expect complex payloads that may contain arrays of objects with optional properties, etc. ``` POST /widgets { "externalId": "abc-123", "variants": [ { "name": "Variant 1", "color": "red", "price": { "usd": 5, "ca": 5.5 } }, { "name": "Variant 2", "color": "blue", "price": { "usd": 6 } } ] } ``` In this case, it's likely that an integration builder will want to construct a property like `variants` programmatically, and it's probably best to present two inputs, "External ID" which is `type: "string"` and "Variants" which is `type: "code"`. To accommodate both JSON and JavaScript object inputs, use the `util.types.toObject` function to ensure that what is entered becomes a JavaScript object. For example, Convert a complex input to an object ``` const createWidgets = action({ display: { label: "Create Widgets", description: "Create widgets and their variants", }, inputs: { connection: connectionInput, externalId: input({ label: "External ID", type: "string", comments: "The ID stored in Acme for this Widget type", clean: util.types.toString, }), variants: input({ label: "Variants", comments: "Variant types of the widget.
Ensure you provide an array of variant objects.", type: "code", language: "json", clean: util.types.toObject, example: JSON.stringify( [ { name: "Variant 1", color: "red", price: { usd: 5, ca: 5.5, }, }, { name: "Variant 2", color: "blue", price: { usd: 6, }, }, ], null, 2, ), }), }, perform: async (context, params) => { const client = createAcmeClient(params.connection); const { data } = await client.post("/widgets", { externalId: params.externalId, variants: params.variants, }); return { data }; }, }); ``` #### The perform function[​](#the-perform-function "Direct link to The perform function") Each action contains one `perform` function, which is an async function with two parameters that may or may not have a return value. In this example, `firstName`, `middleName`, and `lastName` are input parameters for this `perform` function: ``` export const properFormatName = action({ display: { label: "Properly Format Name", description: "Properly format a person's name (Last, First M.)", }, perform: async (context, { firstName, middleName, lastName }) => { if (middleName) { return { data: `${lastName}, ${firstName} ${middleName[0]}.`, }; } else { return { data: `${lastName}, ${firstName}` }; } }, inputs: { firstName, middleName, lastName }, }); ``` ##### `perform` Function Parameters[​](#perform-function-parameters "Direct link to perform-function-parameters") The `perform` function takes two parameters, `context` and `params`, that can be destructured into their respective properties: ``` perform: async (context, params) => {}, // or perform: async ( { logger }, { paramName1, paramName2, ... } ) => {}, ``` ##### The `context` parameter[​](#the-context-parameter "Direct link to the-context-parameter") The `context` parameter is an object that contains the following attributes: * `logger` allows you to write out log lines. * `debug` is an object which you can use when [debug mode](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode) is enabled to emit additional debug log lines or measure time or memory costs of specific portions of your code. * `instanceState`, `crossFlowState`, `integrationState` and `executionState` give you access to [persisted state](#execution-instance-and-cross-flow-state). * `stepId` is the ID of the current step being executed. * `executionId` is the ID of the current execution. * `webhookUrls` contains the URLs of the running instance's sibling flows. * `webhookApiKeys` contains the API keys of the running instance's sibling flows. * `invokeUrl` is the URL that was used to invoke the integration. * `customer` is an object containing an `id`, `name`, and `externalId` of the customer the instance is assigned to. * `user` is an object containing an `id`, `name`, `email`, and `externalId` of the customer user whose user-level config was used for this execution. This only applies to instances with [User Level Configuration](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md). * `integration` is an object containing an `id`, `name`, and `versionSequenceId` of the integration the instance was created from. * `instance` is an object containing an `id` and `name` of the running instance. * `flow` is an object containing the `id` and `name` of the running flow. * `invokeFlow` is a function that lets you invoke another flow by name. Generally, you'll want to use the [Invoke Flow](https://prismatic.io/docs/components/cross-flow.md#invoke-flow) action which wraps this function.
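As a quick illustration of a few of these properties (a sketch only, using attributes listed above), a `perform` function might log which customer, instance, and flow an execution belongs to:

```
perform: async (context, params) => {
  const { logger, customer, instance, flow, executionId } = context;
  // Log metadata about the execution for easier troubleshooting
  logger.info(
    `Running flow "${flow.name}" on instance "${instance.name}" for customer "${customer.name}" (execution ${executionId})`,
  );
};
```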
###### Step ID[​](#step-id "Direct link to Step ID") `context.stepId` contains the unique identifier (UUID) of the step. It is used by the [Process Data - DeDuplicate](https://prismatic.io/docs/components/process-data.md#deduplicate) action to track which items in an array have or have not been previously seen. You can use it similarly in a custom component to persist step-specific data. ###### Webhook URLs[​](#webhook-urls "Direct link to Webhook URLs") You can reference an instance's webhook URLs through the `context.webhookUrls` object. This is useful when writing actions to configure and delete webhooks in a third-party app. ``` perform: async (context, params) => { const inventoryUrl = context.webhookUrls["My Inventory Flow"]; }; ``` You can reference `context.flow.name` to fetch the current flow's webhook URL: ``` perform: async (context, params) => { const myCurrentUrl = context.webhookUrls[context.flow.name]; }; ``` ##### Logger object[​](#logger-object "Direct link to Logger object") `context.logger` is a logging object and can be helpful to debug components. ``` perform: async ({ logger }, params) => { logger.info("Things are going great"); logger.warn("Now less great..."); }; ``` Available log functions in increasing order of severity include `logger.debug`, `logger.info`, `logger.warn` and `logger.error`. You can also execute `logger.metric` on an object, which helps when [streaming logs and metrics](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-externally.md) to an external logging service. **Note**: Log lines are truncated after 4096 characters. If you need longer log lines, consider [streaming logs](https://prismatic.io/docs/monitor-instances/logging/streaming-logs-externally.md) to an external log service. ##### Execution, instance, and cross-flow state[​](#execution-instance-and-cross-flow-state "Direct link to Execution, instance, and cross-flow state") `context.executionState`, `context.instanceState`, `context.integrationState` and `context.crossFlowState` are key/value stores that may be used to store small amounts of data for future use: * `context.executionState` stores state for the duration of the execution, and is often used as an accumulator for loops. * `context.instanceState` stores state that is persisted between executions. This state is scoped to a specific flow. The flow may persist data, and reference it in a subsequent execution. Shouldn't `instanceState` be called `flowState`? Great question! We developed state storage prior to multi-flow, and the name `instanceState` was retained for historical reasons. * `context.crossFlowState` also stores state that is persisted between executions. This state is scoped to the instance, and flows may reference one another's stored state. * `context.integrationState` stores state between flows in instances of the same integration. Customer A's flow 1 can share data with Customer B's flow 2. State is most notably used by the [Persist Data](https://prismatic.io/docs/components/persist-data.md) and [Process Data](https://prismatic.io/docs/components/process-data.md) components, but you can use it in your custom components, too. If, for example, a previous run of the flow saved a state key of `sampleKey`, you can reference `context.instanceState['sampleKey']` to access that key's value.
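For example, a step could read that saved value in its `perform` function (a minimal sketch using the `sampleKey` key mentioned above):

```
perform: async (context, params) => {
  // Read a value saved by a previous run of this flow (undefined on the first run)
  const lastValue = context.instanceState["sampleKey"];
  return { data: lastValue ?? "no value stored yet" };
};
```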
To do the reverse, and save data to a flow's state storage for subsequent runs, add an `instanceState` property to your perform function's return value: ``` return { data: "Some Data", instanceState: { exampleKey: "example value", anotherKey: [1, 2, 3] }, }; ``` **Note**: To remove a key from persisted state, set it to `null`: Remove a key from crossFlowState ``` return { data: "Some Data", crossFlowState: { exampleKey: null }, }; ``` ##### Input parameters[​](#input-parameters "Direct link to Input parameters") The `params` parameter is an object that has attributes for each input field the action supports. For example, for the perform action [defined above](#the-perform-function), `params` has `params.firstName`, `params.middleName`, and `params.lastName`. `firstName`, `middleName`, and `lastName` are based off of the input objects that are provided to the action as `inputs`. Shorthand property names You can use [shorthand property names](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Object_initializer) for inputs. If your input object variables have different names - say you have a `const myFirstNameInput = input ({...})`, you can structure your action's input property like this: ``` inputs: { firstName: myFirstNameInput, middleName: myMiddleNameInput, lastName: myLastNameInput, } ``` and the `params` object passed into `perform` will have keys `firstName`, `middleName`, and `lastName`. Using non-shorthand property names is preferable to some developers to avoid [variable shadowing](https://en.wikipedia.org/wiki/Variable_shadowing). The function is written with a destructured `params` parameter. It could be rewritten without being destructured. ``` perform: async (context, params) => { if (params.middleName == "") { return { data: `${params.lastName}, ${params.firstName}` }; } else { return { data: `${params.lastName}, ${params.firstName} ${params.middleName[0]}.`, }; } }, ``` ##### Coercing input types[​](#coercing-input-types "Direct link to Coercing input types") TypeScript-based Node libraries often have strict rules about the type of variables that are passed into their functions, but inputs to `perform` functions are of type `unknown` since it's not known ahead of time what types of values users of components are going to pass in. For example, you might expect one of your inputs to be a `number`, but a user might pass in a `string` instead. That's obviously a problem since `"2" + 3` is `"23"`, while `2 + 3` is `5` in JavaScript. The Spectral package includes several utility functions for coercing input to be the type of variable that you need. Looking at the number/string example, suppose you have some input - `quantity` - that you need turned into a number (even if someone passes in `"123.45"` as a string), and you have another input - `itemName` - that you'd like to be a string. 
You can use `util.types.toNumber()` and `util.types.toString()` to ensure that the input has been converted to a number and a string, respectively: ``` import { action, util } from "@prismatic-io/spectral"; import { someThirdPartyApiCall } from "some-example-third-party-library"; action({ /*...*/ perform: async (context, { quantity, itemName }) => { const response = await someThirdPartyApiCall({ orderQuantity: util.types.toNumber(quantity), // Guaranteed to be a number orderItemName: util.types.toString(itemName), // Guaranteed to be a string }); return { data: response }; }, }); ``` If an input cannot be coerced into the type you've chosen - for example, suppose you pass `"Hello World"` into `util.types.toNumber()` - an error will be thrown indicating that `"Hello World"` cannot be coerced into a number. ###### Writing your own type checking functions[​](#writing-your-own-type-checking-functions "Direct link to Writing your own type checking functions") Prismatic provides a variety of type check and type coercion functions for common types (number, integer, string, boolean, etc). If you require a uniquely shaped object, you can create your own type check and coercion functions to ensure that inputs your custom component receives have the proper shape that the libraries you rely on expect. You can import an `interface` or `type` (or write one yourself) and write a function that converts inputs into an expected shape. For example, the SendGrid SDK expects an object that has this form: ``` { "to": [string], "from": string, "subject": string, "text": string, "html": string } ``` We can pull in that defined type, `MailDataRequired`, from the SendGrid SDK, and write a function that takes inputs and converts them to an object containing a series of strings: ``` import { MailDataRequired } from "@sendgrid/mail"; import { util } from "@prismatic-io/spectral"; export const createEmailPayload = ({ to, from, subject, text, html, }): MailDataRequired => ({ to: util.types .toString(to) .split(",") .map((recipient: string) => recipient.trim()), from: util.types.toString(from), subject: util.types.toString(subject), text: util.types.toString(text), html: util.types.toString(html), }); ``` #### Perform function return values[​](#perform-function-return-values "Direct link to Perform function return values") An action can return a variety of data types. To return a simple string, number, boolean, array, or object, your return block can read: ``` // return a string: return { data: "some string", }; // return a number: return { data: 123.45, }; // return a boolean: return { data: true, }; // return an array: return { data: [1, 2, 3, 4, "a", "b"], }; // return an object: return { data: { key1: "value1", key2: ["value2", 123], }, }; ``` Those values can be used as inputs in subsequent steps by referencing this step's `results`: ![Step results from an action in Prismatic app](/docs/img/custom-connectors/actions/step-results.png) ##### Setting synchronous HTTP status codes[​](#setting-synchronous-http-status-codes "Direct link to Setting synchronous HTTP status codes") If you invoke your instances [synchronously](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md#http-status-codes-for-synchronous-integrations) and would like to return an HTTP status code other than `200 - OK`, you can configure the final step of your integration to be a custom component that returns any HTTP status code you want.
To return an HTTP status code other than 200, return a `statusCode` attribute in the object you return from your custom component instead of a `data` attribute: ``` return { statusCode: 415, }; ``` If this custom component is the last step of an integration, then the integration will return an HTTP status code of 415 if invoked synchronously. Note: When an integration is invoked synchronously, by default the integration redirects the caller to a URL containing the output results of the final step of the integration. If the final step of the integration is a [Stop Execution](https://prismatic.io/docs/components/stop-execution.md) action, or any custom component action that returns a `statusCode`, the redirect does not occur. Instead, the caller receives an HTTP response with the `statusCode` specified. Read more about [HTTP status codes for synchronous integrations](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md#http-status-codes-for-synchronous-integrations). ##### Example action payloads[​](#example-action-payloads "Direct link to Example action payloads") As noted above, actions return results for subsequent steps to consume. It's often handy for an integration builder to have access to the shape of the results prior to a test being run. Your action can provide an `examplePayload` that can be referenced before test data is available: ``` { /* ... */ examplePayload: { data: { username: "john.doe", name: { first: "John", last: "Doe", }, age: 20, }, }, } ``` In the integration designer, this example payload can be referenced as an input. **Note:** your `examplePayload` must match the exact TypeScript type of the return value of your `perform` function. If your `perform` function's return value does not match the type of the example payload, TypeScript will generate a helpful error message: ![Example Result Data Type Mismatch in Typescript](/docs/img/custom-connectors/actions/example-result-data-type-mismatch.png) --- #### Cursor Rules and AI Coding Assistants Prismatic provides [a set of best-practice rules](https://github.com/prismatic-io/spectral-coding-assistant-best-practices) to help AI tools like Cursor generate well-structured, production-ready custom components. These guidelines are designed to reduce the time it takes to build components by steering AI toward consistent, proven patterns that align with Prismatic's SDK. Support for code-native integration development is coming! While the current rules focus on custom components, we plan to expand support to code-native integrations and additional AI coding environments in the future. #### What the rules help with[​](#what-the-rules-help-with "Direct link to What the rules help with") The Cursor Rules guide AI agents to adopt the same best practices we follow internally. For example: * Logging should use Prismatic's logger: `logger.debug()`, `logger.info()`, etc. * Configuration values should come from inputs and not be embedded directly in code * Errors should be handled gracefully to give users sufficient information to debug issues * Functions should be clearly named and avoid unnecessary async wrappers or IIFEs #### How to use with Cursor[​](#how-to-use-with-cursor "Direct link to How to use with Cursor") To enable the rules in Cursor: 1. Clone or reference the [Spectral rule set](https://github.com/prismatic-io/spectral-coding-assistant-best-practices) 2. Add it to your project's `.spectral.yaml` configuration 3. 
Use Cursor to scaffold or update your component code - your AI will be guided to follow the structure Prismatic expects #### Example prompts[​](#example-prompts "Direct link to Example prompts") Here are two real-world examples of how Cursor can use Prismatic's ruleset and supporting files to accelerate custom component development: ##### Example 1: Create a new custom connector[​](#example-1-create-a-new-custom-connector "Direct link to Example 1: Create a new custom connector") **Prompt**: > Help me create a component that wraps the API @docs.my-messaging-service.com while referencing @start-new-component.md **What Happens**: Cursor uses the linked API docs and `start-new-component.md` (which outlines how to write a Prismatic component) along with the ruleset to scaffold a new component. It generates a task list (`tasks.md`) covering all needed actions and connections, which you can review and guide through follow-up prompts like `@next-task.md`. ##### Example 2: Add actions to an existing connector[​](#example-2-add-actions-to-an-existing-connector "Direct link to Example 2: Add actions to an existing connector") **Prompt**: > Add a trigger for when a new channel is created > > Add a "Get User Details" action **What Happens**: Cursor uses the `component-dev.mdc` context file to understand how to apply the Prismatic SDK. With your project-specific best practices loaded, the assistant can safely extend an existing component with new triggers or actions - ensuring it aligns with your development standards. --- #### Handling Binary Files Integrations in Prismatic generally process serialized JSON, XML or other simple strings and pass deserialized JavaScript objects between steps. However, there are situations when you may want to process and pass binary data between steps. By "binary data", we mean files that are not plain text - PDF files, images, MP3 audio, etc. Within an integration, a binary file is represented by its `contentType` (MIME type), and a `Buffer` that contains the file's data. See Mozilla's documentation for a list of common file [MIME types](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/MIME_types/Common_types). ``` { data: Buffer.from("..."), contentType: "application/pdf" }; ``` ##### Processing binary data as an input[​](#processing-binary-data-as-an-input "Direct link to Processing binary data as an input") Inputs for binary files are similar to any other input you might create, though you can use the `util.types.toData` function to ensure that the input has the form `{ data: Buffer, contentType: string }`: ``` { inputs: { myFile: input({ label: "My File", type: "data", required: true, clean: util.types.toData, }); } } ``` The `myFile` property that comes in to your `perform` function will have the form of a binary file, with `data` and `contentType` properties that you can reference. ``` { perform: async (context, params) => { const { data: fileData, contentType } = params.myFile; // Send the data to an endpoint axios.post("http://my-endpoint", fileData, { headers: { "Content-Type": contentType }, }); }; } ``` ##### Returning binary data from an action[​](#returning-binary-data-from-an-action "Direct link to Returning binary data from an action") To return a binary file from your action, ensure that the data you return is a `Buffer` and optionally include a `contentType` property alongside `data` that indicates its MIME type. 
For example, if your custom component returns a rendered PDF file and the PDF contents are saved in a `Buffer` variable named `pdfContents`, the return block might look like this: ``` return { data: pdfContents, contentType: "application/pdf", }; ``` You can return multiple files, or binary files in a nested object with a similar structure: ``` return { myKey: "myValue", myPdf: { data: pdfBuffer, contentType: "application/pdf", }, myPng: { data: pngBuffer, contentType: "image/png", }, }; ``` ##### Fetching binary data with the Spectral HTTP client[​](#fetching-binary-data-with-the-spectral-http-client "Direct link to Fetching binary data with the Spectral HTTP client") When fetching binary data from an API, you must configure your HTTP client to expect binary data and to write the data to a `Buffer`. For the HTTP client (which is Axios-based), use the `responseType: "arraybuffer"` configuration option to ensure the `data` property returned is a `Buffer`: ``` { perform: async (context, params) => { const client = createBambooClient(params.connection); const { data, headers } = await client.get(`/v1/files/${params.fileId}`, { responseType: "arraybuffer", }); return { data, contentType: headers["content-type"] }; }; } ``` --- #### Branching in Custom Actions and Triggers Similar to the [branch](https://prismatic.io/docs/components/branch.md) component, your custom actions and triggers can make use of logical branches. To support branches, give your `action()` or `trigger()` two additional properties, `allowsBranching: true` and `staticBranchNames: ["List", "Of", "Branches"]`, and ensure that the object that your `perform` function returns includes a `branch` property: Example action with branching ``` export const branchExample = action({ display: { label: "Branch Example", description: "An example action that branches", }, inputs: { myValue: input({ label: "My Value", type: "string", required: true }), }, allowsBranching: true, staticBranchNames: ["First", "Second", "Other"], perform: async (context, params) => { let branchName = "Other"; if (params.myValue === "One") { branchName = "First"; } else if (params.myValue === "Two") { branchName = "Second"; } return await Promise.resolve({ branch: branchName, data: null }); }, }); ``` Similar code can be used in a custom trigger. `allowsBranching: true` indicates to the integration designer that it should render branches beneath your action or trigger. `staticBranchNames` is an array of strings representing names of possible branches that can be followed. The branch name that matches the `branch` return value will be followed. ![Logs showing which branch was followed in a custom branch step](/docs/img/custom-connectors/branching/custom-branch-logs.png) Execution logs will always indicate which branch of a branching step was followed. --- #### Custom Connections #### Overview[​](#overview "Direct link to Overview") A **connection** is a special type of input for an action that contains information about how to connect to an external application or service. A connection can contain one or more inputs representing API endpoints, keys, passwords, OAuth 2.0 fields, and other authentication details. The inputs within a connection use the same structure as other inputs, described [here](https://prismatic.io/docs/custom-connectors/actions.md#action-inputs). For example, suppose you're writing a component for an API that accepts either a username/password combination or an API key. 
You would create two connections - one for username/password authentication and one for API key authentication. You also want to allow your customers to point to either a sandbox or production environment - each connection should include an input for the endpoint. Your connections might look like this: ``` import { connection, input } from "@prismatic-io/spectral"; // Define this once to avoid repetition in the two connections const acmeEnvironment = input({ label: "Acme Inc Environment to Use", placeholder: "ACME Environment", type: "string", required: true, model: [ { label: "Production", value: "https://api.acme.com/", }, { label: "Sandbox", value: "https://sandbox.acme.com/api", }, ], }); const basicAuth = connection({ key: "basicAuth", display: { label: "Acme username and password", description: "Acme basic auth", }, inputs: { username: { label: "Acme Username", placeholder: "Username", type: "string", required: true, }, password: { label: "Acme Password", placeholder: "Password", type: "string", required: true, }, acmeEnvironment, }, }); const apiKey = connection({ key: "apiKey", display: { label: "Acme API Key", description: "Acme API key auth", }, inputs: { apiKey: { label: "Acme API Key", placeholder: "API Key", type: "string", required: true, }, acmeEnvironment, }, }); ``` After defining connections, include them in your `component` definition. This allows users to enter connection information once and reuse it in actions that require that connection. It also makes connections available to all inputs of type "connection" in your component: ``` import { component } from "@prismatic-io/spectral"; // ... export default component({ key: "acme", public: false, display: { label: "Acme Inc", description: "Interact with Acme Inc's API", iconPath: "icon.png", }, actions: { myAction1, myAction2 }, triggers: { myTrigger1 }, connections: [basicAuth, apiKey], }); ``` Connection ordering The **first** connection listed in the `connections:` array becomes the default connection. In this example, `basicAuth` would be the default connection for this component. The default connection is recommended to users when they add an action from your component to their integration, but they can select other connection types as well. #### Referencing connections as inputs in actions[​](#referencing-connections-as-inputs-in-actions "Direct link to Referencing connections as inputs in actions") Actions can reference connections like any other input. To allow users to assign a connection to an action, create an `input` of type `connection` and add it as an input to your action: Reference a connection input from an action ``` import { action, input } from "@prismatic-io/spectral"; import axios from "axios"; const connectionInput = input({ label: "Connection", type: "connection" }); export const getAcmeData = action({ display: { label: "Get Item", description: "Get an Item from Acme", }, inputs: { itemId: itemIdInput, myConnection: connectionInput }, perform: async (context, { itemId, myConnection }) => { const response = await axios({ method: "get", url: `${myConnection.fields.acmeEnvironment}/item/${itemId}`, headers: { Authorization: `Bearer ${myConnection.fields.apiKey}`, }, }); return { data: response.data }; }, }); ``` #### Throwing connection errors[​](#throwing-connection-errors "Direct link to Throwing connection errors") It's important to know whether a connection is valid and to track any connection failures.
Within your custom component, you can throw a `ConnectionError` to signal to Prismatic that there's a problem with the connection (such as inability to connect to an endpoint or invalid credentials). For example, if you know the API returns a 401 "Unauthorized" response for invalid credentials, you could throw a `ConnectionError` when your HTTP client receives a status code 401: Throw a connection error ``` import { action, ConnectionError, input, util } from "@prismatic-io/spectral"; import axios from "axios"; const getItem = action({ display: { label: "Get Item", description: "Get an item from Acme", }, perform: async (context, { myConnection, itemId }) => { const apiKey = util.types.toString(myConnection.fields.apiKey); const response = await axios.get(`https://api.acme.com/items/${itemId}`, { headers: { Authorization: apiKey }, // Don't throw on error status codes so the 401 check below can run validateStatus: () => true, }); if (response.status === 401) { throw new ConnectionError( myConnection, "Invalid Acme credentials have been configured.", ); } return { data: response.data, }; }, inputs: { myConnection: input({ label: "Connection", type: "connection" }), itemId: itemIdInput, }, }); ``` The thrown error will be indicated by a red mark next to customers' connections on an instance, and messages will appear in logs. #### Writing OAuth 2.0 connections[​](#writing-oauth-20-connections "Direct link to Writing OAuth 2.0 connections") An OAuth 2.0 authorization code connection follows the [OAuth 2.0](https://oauth.net/2/) protocol and requires five inputs: * `authorizeUrl` - The URL where users authorize an OAuth 2.0 connection * `tokenUrl` - The URL for exchanging authorization codes for API keys and optional refresh tokens, and for refreshing API keys using refresh tokens * `scopes` - A space-delimited list of required permissions (scopes) for your application * `clientId` - Your OAuth 2.0 application's client ID * `clientSecret` - Your OAuth 2.0 application's client secret The first three fields are typically found in the API documentation of the service you're integrating with. Client ID and secret are created when you register an application with the third-party service. You can allow integration builders to edit any of these fields. Alternatively, you can mark fields as `shown: false`, in which case the default value will always be used and integration developers won't see the value. For example, when writing an OAuth 2.0 connection to Google Drive, the `authorizeUrl` and `tokenUrl` are constant. These can have default values and be marked as `shown: false`. Integration developers typically need to adjust scopes, client ID, and client secret (though you may already know which scopes you need).
Here's an example connection: Example OAuth 2.0 Connection with Google Drive ``` import { oauth2Connection, OAuth2Type } from "@prismatic-io/spectral"; export const oauth2 = oauth2Connection({ key: "googleDriveOauth", display: { label: "OAuth2", description: "OAuth2 Connection", icons: { oauth2ConnectionIconPath: "oauth-icon.png", }, }, required: true, oauth2Type: OAuth2Type.AuthorizationCode, inputs: { authorizeUrl: { label: "Authorize URL", placeholder: "Authorization URL", type: "string", required: true, shown: false, comments: "The Authorization URL for Google Drive.", default: "https://accounts.google.com/o/oauth2/v2/auth", }, tokenUrl: { label: "Token URL", placeholder: "Token URL", type: "string", required: true, shown: false, comments: "The Token URL for Google Drive.", default: "https://oauth2.googleapis.com/token", }, scopes: { label: "Scopes", placeholder: "Scopes", type: "string", required: true, comments: "Space delimited listing of scopes. https://developers.google.com/identity/protocols/oauth2/scopes#drive", default: "https://www.googleapis.com/auth/drive", }, clientId: { label: "Client ID", placeholder: "Client Identifier", type: "password", required: true, comments: "The Google Drive app's Client Identifier.", }, clientSecret: { label: "Client Secret", placeholder: "Client Secret", type: "password", required: true, comments: "The Google Drive app's Client Secret.", }, }, }); ``` Use `oauth2Connection` for OAuth Connections Note that we used `oauth2Connection()` rather than `connection()` to define this OAuth connection. That's because the `oauth2Connection` helper function gives us additional TypeScript hinting about what fields are required. An `oauth2Connection` can be assigned to a component and referenced as an input just like a `connection`. The input that is received by a `perform` function will have the form: ``` { "token": { "access_token": "EXAMPLE-TOKEN", "token_type": "bearer", "expires_in": 14400, "refresh_token": "EXAMPLE-REFRESH-TOKEN", "scope": "account_info.read account_info.write file_requests.read file_requests.write files.content.read files.content.write files.metadata.read files.metadata.write", "uid": "123456789", "account_id": "dbid:EXAMPLEIRNhsZ3wECJZ3aXK3Gm47Di74", "expires_at": "2021-12-07T01:54:38.096Z" }, "context": { "code": "EXAMPLEqMEAAAAAAAAON5iBXhk_yOxjkfDeWy_vSE0", "state": "EXAMPLE2VDb25maWdWYXJpYWJsZTpmMDZlMDVkNy1kMjY0LTQ0YTgtYWI0Ni01MDhiOTNmZjU5ZjI=" }, "instanceConfigVarId": "EXAMPLE2VDb25maWdWYXJpYWJsZTpmMDZlMDVkNy1kMjY0LTQ0YTgtYWI0Ni01MDhiOTNmZjU5ZjI=", "key": "oauth", "fields": { "scopes": "", "clientId": "example-client-id", "tokenUrl": "https://api.dropboxapi.com/oauth2/token", "authorizeUrl": "https://www.dropbox.com/oauth2/authorize?token_access_type=offline", "clientSecret": "example-client-secret" } } ``` You will likely want to reference `myConnection.token.access_token`. Add a custom button to your OAuth 2.0 Connection You can specify what the OAuth 2.0 button looks like in the instance configuration page by specifying an optional `display.icons.oauth2ConnectionIconPath` (see the above example). An icon must be a PNG file, and we recommend that it be wider than it is tall with text indicating what it does: ![OAuth buttons including Dropbox, Facebook, GitHub, Google, Tumblr, and Twitter](/docs/img/custom-connectors/connections/oauth-buttons.png) Without an `oauth2ConnectionIconPath`, a simple button that says **CONNECT** will be placed in the configuration page. 
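To show how that token shape is typically consumed, here's a minimal sketch of an action `perform` that sends the access token as a bearer token (the endpoint and the `myConnection` input name are illustrative):

```
perform: async (context, params) => {
  const accessToken = params.myConnection.token.access_token;
  // Send the OAuth 2.0 access token as a bearer token on each API request
  const response = await axios.get("https://www.googleapis.com/drive/v3/files", {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  return { data: response.data };
};
```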
##### Supporting PKCE with OAuth 2.0[​](#supporting-pkce-with-oauth-20 "Direct link to Supporting PKCE with OAuth 2.0") If the application that you are integrating with supports [Proof Key for Code Exchange](https://oauth.net/2/pkce/) (PKCE), you can add PKCE to your OAuth 2.0 connection by adding a `oauth2PkceMethod` property. You can specify either the `plain` or `S256` method, or omit the property to specify "no PKCE". Example PKCE declaration ``` export const oauth = oauth2Connection({ oauth2Type: OAuth2Type.AuthorizationCode, oauth2PkceMethod: OAuth2PkceMethod.S256, key: "oauth", display: { label: "Airtable OAuth 2.0", description: "Airtable OAuth 2.0 Auth Code", }, inputs: { // ... }, }); ``` ##### Overriding OAuth 2.0 token refresh URL[​](#overriding-oauth-20-token-refresh-url "Direct link to Overriding OAuth 2.0 token refresh URL") The OAuth 2.0 standard specifies that the refresh endpoint URL is the same as the token endpoint URL. It is generally something like `https://example.com/oauth2/token`. However, some OAuth 2.0 providers use a different URL for refreshing tokens. For example, they may use `/oauth2/token` for the initial auth code exchange, but `/oauth2/refresh` for refreshing tokens. To override the refresh endpoint URL, add a `refreshUrl` property to your OAuth 2.0 connection: Example refresh URL override ``` export const oauth = oauth2Connection({ oauth2Type: OAuth2Type.AuthorizationCode, key: "oauth", display: { label: "Acme OAuth 2.0", description: "Acme OAuth 2.0 Auth Code", }, inputs: { // ... refreshUrl: { label: "Refresh URL", placeholder: "Refresh URL", type: "string", required: true, shown: false, comments: "The Refresh URL for Acme Inc.", default: "https://example.com/oauth2/refresh", }, }, }); ``` #### Templating connection inputs[​](#templating-connection-inputs "Direct link to Templating connection inputs") Some apps provide their customers with unique OAuth 2.0 authorization and token URLs. When authenticating two of your customers, you may need to send one to `https://hooli.acme.com/oauth/authorize` and another to `https://pied-piper.acme.com/oauth/authorize`. Shopify is an example of an app that provisions customers with [unique OAuth endpoints](https://shopify.dev/docs/apps/build/authentication-authorization/access-tokens/token-exchange#step-2-get-an-access-token). Asking a customer to add their custom domain to several inputs (authorize URL, token URL, optional refresh URL, etc) is error-prone and not a good user experience. That's where templated connection inputs are helpful (not to be confused with [connection templates!](https://prismatic.io/docs/integrations/connections/integration-specific.md#connection-templates)). You can prompt a user for their custom domain (or other information) once, and generate other inputs' values automatically. So, a user can enter `pied-piper` once, and a connection could derive an authorization URL `https://pied-piper.acme.com/oauth/authorize` and token URL `https://pied-piper.acme.com/oauth/token` automatically. To add templated connection inputs to your custom connector, import `templateConnectionInputs`. Provide user-specified or global inputs as the function's first parameter, and templated inputs as the second parameter. In this example, we prompt a user for their Acme `domain`, and `authorizeUrl` and `tokenURL` are derived using `domain`. 
Example OAuth 2.0 auth code connection with templated inputs ``` import { OAuth2Type, oauth2Connection, templateConnectionInputs, } from "@prismatic-io/spectral"; export const acmeOAuth = oauth2Connection({ key: "acmeOauth", display: { label: "Acme OAuth 2.0", description: "Connect to Acme with OAuth 2.0 auth code flow", }, oauth2Type: OAuth2Type.AuthorizationCode, inputs: templateConnectionInputs( { domain: { label: "Acme Subdomain", example: "pied-piper", type: "string", required: true, shown: true, comments: "Your acme subdomain. The **pied-piper** portion of **pied-piper**.acme.com.", }, clientId: { label: "Client ID", type: "string", required: true, shown: true, comments: "Obtain by creating an OAuth app [here](https://partners.acme.com/)", }, clientSecret: { label: "Client Secret", type: "password", required: true, shown: true, comments: "Obtain by creating an OAuth app [here](https://partners.acme.com/)", }, scopes: { label: "Scopes", example: "widgets.read widgets.write offline_access", default: "widgets.read widgets.write offline_access", type: "string", required: false, shown: true, comments: "A space-delimited set of scopes (permissions) to request from your user. Read more [here](https://acme.dev/api/usage/access-scopes#authenticated-access-scopes)", }, }, { authorizeUrl: { label: "Authorize URL", placeholder: "Authorize URL", type: "template", comments: "The OAuth 2.0 Authorization URL for Acme", templateValue: "https://{{#domain}}.acme.com/oauth/authorize/", }, tokenUrl: { label: "Token URL", placeholder: "Token URL", type: "template", comments: "The OAuth 2.0 Token URL for Acme", templateValue: "https://{{#domain}}.acme.com/oauth/token/", }, }, OAuth2Type.AuthorizationCode, ), }); ``` ![Screenshot of templating connection inputs](/docs/img/custom-connectors/connections/templating-connection-inputs.png) Use templated connection inputs for non-OAuth connections Templated connection inputs are not only for OAuth 2.0 authorize and token URLs. You can use templated connection inputs for any connection type. You can also use multiple inputs within a template. Suppose, for example, you need to build a URL using several inputs. You could prompt a user for their `username`, `password`, `host`, and `serviceName` and template a value of `https://{{#username}}:{{#password}}@{{#host}}/api/{{#serviceName}}`. #### Using connections with HTTP clients[​](#using-connections-with-http-clients "Direct link to Using connections with HTTP clients") While the majority of APIs you'll interact with are HTTP based, and most present a RESTful interface, not all are the same. Some APIs (like Prismatic's!) use GraphQL. Others use remote procedure calls (RPCs), like gRPC, XML RPC, or SOAP. Luckily, there is an [NPM](https://www.npmjs.com/) package for almost any protocol. * If you are working with an HTTP-based **REST API**, we recommend using Spectral's built-in `createClient` function, which creates an [Axios](https://axios-http.com/docs/intro) HTTP client behind the scenes with some useful settings pre-configured (see [example](#using-the-built-in-createclient-http-client) below). If your team is more comfortable with vanilla Axios or [node-fetch](https://www.npmjs.com/package/node-fetch), you can certainly use those, too. * For **GraphQL APIs**, we recommend using [graphql-request](https://www.npmjs.com/package/graphql-request). You can use a generic HTTP client, but `graphql-request` provides a handy `gql` string literal tag.
* For **XML RPC APIs**, you can import [xmlrpc](https://www.npmjs.com/package/xmlrpc) into your component project, or you can reach for [soap](https://www.npmjs.com/package/soap) if it's a **SOAP API**. * It's far less common for HTTP API integrations, but [@grpc/grpc-js](https://www.npmjs.com/package/@grpc/grpc-js) can be used for **gRPC APIs**. Regardless of which client you use, you will likely need to set some HTTP headers for authentication, content type, etc. #### Using the built-in createClient HTTP client[​](#using-the-built-in-createclient-http-client "Direct link to Using the built-in createClient HTTP client") Spectral comes with a built-in HTTP client for integrating with REST APIs. Behind the scenes, `createClient` creates an Axios-based HTTP client with some timeout, retry, and debug logic built on top of it. You can see the source code for `createClient` in Spectral's [GitHub repo](https://github.com/prismatic-io/spectral/blob/main/packages/spectral/src/clients/http/index.ts). To create an HTTP client, feed the client a base URL for your API along with the header information you need for authentication. You can fetch authentication values from a connection. It may look something like this: Example createClient usage ``` import { createClient } from "@prismatic-io/spectral/dist/clients/http"; action({ perform: async (context, params) => { // Create the authenticated HTTP client const myClient = createClient({ baseUrl: "https://example.com/api", debug: false, headers: { "X-API-Key": params.connection.fields.apiKey, Accept: "application/json", }, responseType: "json", }); // Use the HTTP client to POST data to the API const response = await myClient.post("/items", { sku: "12345", quantity: 3, price: 20.25, }); // Return the response as the action's result return { data: response.data }; }, }); ``` Debugging an HTTP Connection If you would like to see the full contents of the HTTP request and response, set `debug: true`. You will see all endpoints, headers, response codes, etc. in the integration logs. Just remember to turn off debugging for production! #### Using existing component connections in data sources[​](#using-existing-component-connections-in-data-sources "Direct link to Using existing component connections in data sources") You may want to extend an existing component to populate a config variable. For example, you may want to fetch and filter specific information from a CRM or ERP and present the data to your user as a picklist menu. Your data source can reference any existing connection config variable - including those from built-in components. To use an existing component's connection, reference its connection's key names. The [AWS Glue component](https://prismatic.io/docs/components/aws-glue.md) , for example, has an `accessKeyId` and `secretAccessKey`. Your data source can reference those with: ``` { perform: async (context, params) => { const { accessKeyId, secretAccessKey } = params.myConnection.fields; }; } ``` The field that you likely need to use for OAuth 2.0 connections is the connection's `access_token`, which is nested under `token` like this: ``` { perform: async (context, params) => { const myAccessToken = params.myConnection.token.access_token; }; } ``` An example of reusing existing connections is available in the [Building a Field Mapper Data Source](https://prismatic.io/docs/integrations/data-sources/field-mapping/salesforce-field-mapper.md) tutorial which covers pulling down custom fields from Salesforce. 
--- #### Config Wizard Data Sources #### Writing custom data sources[​](#writing-custom-data-sources "Direct link to Writing custom data sources") A **Data Source** fetches data from a third-party API that will be used to dynamically generate a config variable. When your customer deploys an instance, they use a [connection](https://prismatic.io/docs/custom-connectors/connections.md) to authenticate with a third-party API. A data source can generate a variety of [types of data](https://github.com/prismatic-io/spectral/blob/main/packages/spectral/src/types/DataSourceResult.ts) including a `string`, `date`, `picklist` (which is a `string[]`), complex `objectSelection` objects, and more. Here's a simple data source that fetches an array of customers, each with a `name` and `id`. It maps them to a `label`/`key` object so the customers' names show in a `picklist` and the selected customer's `id` is saved: Fetch a picklist of customers from an external API ``` import { dataSource, input } from "@prismatic-io/spectral"; import { createAcmeClient } from "./client"; interface Customer { name: string; id: string; } const fetchCustomers = dataSource({ display: { label: "Fetch Customers", description: "Fetch an array of customers' names", }, inputs: { connection: input({ label: "Connection", type: "connection", required: true, }), }, perform: async (context, params) => { const client = createAcmeClient(params.connection); const response = await client.get<{ customers: Customer[] }>("/customers"); const customers = response.data.customers?.map((customer) => ({ label: customer.name, key: customer.id, })); return { result: customers }; }, dataSourceType: "picklist", examplePayload: { result: [ { label: "Smith Rocket Company", key: "abc-123" }, { label: "Mars Rocket Corp", key: "def-456" }, ], }, }); ``` In this example, we fetch several items from an API, including metadata about each item, so that a user can select one or more of the items and get the metadata for each: Fetch items and their metadata from an external API ``` import { dataSource, input, ObjectSelection } from "@prismatic-io/spectral"; import { createAcmeClient } from "./client"; const fetchItems = dataSource({ display: { label: "Fetch Items", description: "Fetch all available items", }, inputs: { connection: input({ label: "Connection", type: "connection", required: true, }), }, perform: async (context, params) => { const client = createAcmeClient(params.connection); const response = await client.get("/items"); const objects: ObjectSelection = response.data.items.map((item) => ({ object: { key: item.id, label: item.name }, fields: [ { key: item.quantity, label: "Quantity" }, { key: item.sku, label: "SKU" }, ], })); return { result: objects }; }, dataSourceType: "objectSelection", examplePayload: { result: [ { object: { key: "abc-123", label: "widgets" }, fields: [ { key: "5", label: "Quantity" }, { key: "0000000000", label: "SKU" }, ], }, ], }, }); ``` An example of a data source that generates a picklist is available in the [Slack component](https://github.com/prismatic-io/examples/blob/main/components/slack/src/dataSources/index.ts). #### JSON Forms data sources[​](#json-forms-data-sources "Direct link to JSON Forms data sources") [JSON Forms](https://jsonforms.io/) is a form-generating framework that allows you to create forms through JSON schema that you generate. A JSON Form can contain any number of string, boolean, number, date, time, datetime or enum (dropdown menu) inputs, and you have some control over how the input elements are rendered (in tabs, grouped, vertical or horizontal layout, etc).
Full documentation on JSON Forms is available on their [documentation page](https://jsonforms.io/examples), including several examples. Prismatic offers a [JSON Forms playground](https://prismatic.io/docs/jsonforms/playground) where you can create new forms and see how they would be rendered in Prismatic. A JSON Form config data source must return two required properties (and may include one optional property): * `schema` defines the types of inputs your form contains (its `properties`), and some optional validators, like which properties are required. * `uiSchema` defines how those inputs should be rendered, like whether the inputs should be vertically or horizontally aligned. * `data` (optional) allows you to specify some default values for your form inputs. This simple example's `schema` declares two inputs - `name` (a string) and `age` (an integer between 0 and 150) - and the `uiSchema` places the two input fields horizontally next to one another. A simple JSON Form ``` return { result: { schema: { type: "object", properties: { name: { type: "string", }, age: { type: "integer", minimum: 0, maximum: 150, }, }, }, uiSchema: { type: "HorizontalLayout", elements: [ { type: "Control", scope: "#/properties/name", }, { type: "Control", scope: "#/properties/age", }, ], }, data: { name: "Bob", age: 20 }, }, }; ``` The resulting JSON Form looks like this: ![Simple JSON form config variable](/docs/img/custom-connectors/data-sources/simple-example-result.png) ##### Data mapping with JSON Forms[​](#data-mapping-with-json-forms "Direct link to Data mapping with JSON Forms") A common use-case for JSON Forms is presenting a data-mapping UI to a user. For the sake of illustration, we'll pull down [users](https://jsonplaceholder.typicode.com/users) and [to-do tasks](https://jsonplaceholder.typicode.com/todos?_limit=10) from [JSON Placeholder](https://jsonplaceholder.typicode.com/), and give our users the ability to map any number of users to tasks. In order to provide any number of mappings of user-to-task, our JSON schema will need to contain an [`array`](https://jsonforms.io/examples/array) of `object`. Each `object` will contain a `user` property and a `task` property. The `user` and `task` properties will each have a [`oneOf`](https://jsonforms.io/docs/multiple-choice#one-of) property, which represents a dropdown menu. We also provide a `data` value containing some defaults that our UI should show; this property can be omitted.
Data mapping with JSON forms ``` import axios from "axios"; import { dataSource, util } from "@prismatic-io/spectral"; interface User { id: number; name: string; email: string; } interface Task { id: number; title: string; } const userTaskExample = dataSource({ dataSourceType: "jsonForm", display: { label: "User/Task Mapper", description: "Map users to to-do tasks", }, inputs: {}, perform: async (context, params) => { const { data: users } = await axios.get( "https://jsonplaceholder.typicode.com/users", ); const { data: tasks } = await axios.get( "https://jsonplaceholder.typicode.com/todos?_limit=10", ); const schema = { type: "object", properties: { mymappings: { // Arrays allow users to make one or more mappings type: "array", items: { // Each object in the array should contain a user and task type: "object", properties: { user: { type: "string", // Have users select "one of" a dropdown of items oneOf: users.map((user) => ({ // JSON Forms expects a string value: const: util.types.toString(user.id), title: user.name, })), }, task: { type: "string", oneOf: tasks.map((task) => ({ const: util.types.toString(task.id), title: task.title, })), }, }, }, }, }, }; const uiSchema = { type: "VerticalLayout", elements: [ { type: "Control", scope: "#/properties/mymappings", label: "User <> Task Mapper", }, ], }; // Provide a default value, mapping of the first user to the first task const defaultValues = { mymappings: [ { user: util.types.toString(users[0].id), task: util.types.toString(tasks[0].id), }, ], }; return { result: { schema, uiSchema, data: defaultValues }, }; }, }); ``` The resulting JSON Form looks like this: ![Data mapping JSON form result](/docs/img/custom-connectors/data-sources/data-mapping-json-form-result.png) ##### JSON Forms data validation[​](#json-forms-data-validation "Direct link to JSON Forms data validation") You can validate the data returned by a JSON Form by passing the form data into a subsequent data source. See [JSON Form Validation](https://prismatic.io/docs/integrations/data-sources/json-forms/form-validation.md). --- #### Error Handling #### Global error handlers[​](#global-error-handlers "Direct link to Global error handlers") The actions in your component might all wrap API endpoints using an HTTP client, and that client might throw certain errors. You could handle those errors within each action, but you would end up writing the same error handlers over and over. You can specify an error handler function to run whenever any of your actions throws an error. To specify an error handler, add a `hooks` block with an `error` function to your `component({})` definition: ``` component({ // ... hooks: { error: (error) => doSomething(error), }, }); ``` For example, the popular HTTP client [axios](https://www.npmjs.com/package/axios) throws an error whenever it receives a status code that's [*not* between 200-299](https://github.com/axios/axios/blob/1f13dd7e26124a27c373c83eff0a8614acc1a04f/lib/defaults/index.js#L127-L129). If your HTTP client receives a status code in the 4xx or 5xx range, an error is thrown with a minimal message. If you would like additional information, like the status code or full response to the HTTP request, you can inspect the error being thrown and return a more detailed error message, as illustrated in Spectral [here](https://github.com/prismatic-io/spectral/blob/v6.5.0/packages/spectral/src/clients/http/index.ts#L53-L62).
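As a rough sketch of that pattern (assuming your actions make requests with Axios, and that re-throwing an enriched error from the hook is how you want to surface it; the component details here are placeholders), an error hook might look something like this:

```
import { component } from "@prismatic-io/spectral";
import axios from "axios";

export default component({
  key: "acme",
  public: false,
  display: { label: "Acme", description: "Acme ERP", iconPath: "icon.png" },
  actions: {},
  hooks: {
    // Runs whenever any action in this component throws an error
    error: (error) => {
      if (axios.isAxiosError(error)) {
        // Surface the HTTP status and response body instead of Axios' terse default message
        throw new Error(
          `Request failed with status ${error.response?.status}: ${JSON.stringify(
            error.response?.data,
          )}`,
        );
      }
      throw error;
    },
  },
});
```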
#### Measuring performance of a custom connector[​](#measuring-performance-of-a-custom-connector "Direct link to Measuring performance of a custom connector") When in [debug mode](https://prismatic.io/docs/integrations/troubleshooting.md#debug-mode), you can leverage functions of `context.debug` to measure [how long](https://prismatic.io/docs/integrations/troubleshooting.md#measuring-time-performance-in-a-code-block-or-custom-connector) specific portions of your actions take to run, and [how much memory](https://prismatic.io/docs/integrations/troubleshooting.md#measuring-memory-performance-in-a-code-block-or-custom-connector) they consume (see links for examples). --- #### Hello, Prismatic Here, we'll walk through initializing, modifying, publishing and testing a new component. Be sure that you've [set up your environment](https://prismatic.io/docs/custom-connectors/get-started/setup.md) first. [Your first custom component](https://player.vimeo.com/video/897263444) #### Initialize your component[​](#initialize-your-component "Direct link to Initialize your component") The Prismatic CLI tool, `prism`, has a variety of commands, one of which initializes a new custom component project. Run `prism components:init hello-prismatic`. You can give your component a description of "My First Component", and for **Type of Connection** select **Basic Connection**. Initialize a new component ``` $ prism components:init hello-prismatic Creating component directory for "hello-prismatic"... ? Description of Component My First Component ? Type of Connection Basic Connection create package.json create assets/icon.png create src/index.ts create src/index.test.ts create src/client.ts create jest.config.js create tsconfig.json create webpack.config.js create src/actions.ts create src/triggers.ts create src/dataSources.ts create src/connections.ts Running npm install for you to install the required dependencies. added 798 packages, and audited 799 packages in 41s found 0 vulnerabilities "hello-prismatic" is ready for development. To install dependencies, run either "npm install" or "yarn install" To test the component, run "npm run test" or "yarn test" To build the component, run "npm run build" or "yarn build" To publish the component, run "prism components:publish" For documentation on writing custom components, visit https://prismatic.io/docs/custom-components/writing-custom-components/ ``` This will create a new directory called `hello-prismatic` in your current directory. Open that directory in your code editor. #### Edit your component[​](#edit-your-component "Direct link to Edit your component") The boilerplate component code that is generated contains custom triggers, data sources, connections, and more. These are all topics that are covered fully in the [Writing Custom Components](https://prismatic.io/docs/custom-connectors.md) article, but we can safely remove them for now. Open the component directory that you just created in your code editor, and remove all files from the `hello-prismatic/src/` directory *except* for `index.ts`. 
Then, replace the contents of `index.ts` with this: index.ts ``` import { action, component, input } from "@prismatic-io/spectral"; const myAction = action({ // This is what the action should look like in the integration designer display: { label: "Say Hello", description: "Take a first and last name, and say hello", }, // This action should have these inputs inputs: { firstName: input({ label: "First Name", type: "string", required: true, example: "John", }), lastName: input({ label: "Last Name", type: "string", required: true, example: "Doe", }), }, // When run, this action should execute this function perform: async (context, params) => { const message = `Hello ${params.firstName} ${params.lastName}!`; // Promise.resolve because perform functions are async return Promise.resolve({ data: message }); }, }); export default component({ key: "hello-prismatic", public: false, // This is what the component should look like in the integration designer display: { label: "Hello Prismatic", description: "I'm testing custom components", iconPath: "icon.png", }, // The component includes these actions actions: { myAction }, }); ``` ![Screenshot of VS Code editor and a Prismatic custom component](/docs/img/custom-connectors/get-started/hello-prismatic/vscode-index-ts.png) Initialize your custom component #### Build your component[​](#build-your-component "Direct link to Build your component") The boilerplate component code uses [webpack](https://webpack.js.org/) to compile TypeScript code into minified JavaScript. To invoke webpack, run the build command from your `hello-prismatic` directory: ``` npm run build ``` That will create a `dist/` directory that contains an `icon.png` file for your component (you can swap that out by replacing `assets/icon.png`), and an `index.js` file - the compiled JavaScript code for your component. Next, run the publish command from your `hello-prismatic/` directory: ``` prism components:publish ``` ![Screenshot of VS Code editor building and publishing a custom Prismatic component](/docs/img/custom-connectors/get-started/hello-prismatic/vscode-build-publish.png) Publish your component to your Prismatic organization #### Test your component[​](#test-your-component "Direct link to Test your component") Log in to Prismatic and create a new integration. Add an action, search for "Hello Prismatic", and select your component's **Say Hello** action. Fill in the **First Name** and **Last Name** inputs of the action. Then, click **Run** to run a test of your integration. Select your **Say Hello** step in the test drawer and verify that your step returned a string that concatenates your first and last name together. ![Screenshot of the Hello Prismatic step results](/docs/img/custom-connectors/get-started/hello-prismatic/say-hello-test.png) #### Next steps[​](#next-steps "Direct link to Next steps") Developers learn best by trying things out in code. Try to add a middle name input, but only print the middle initial, or add a title input that presents a [dropdown menu](https://prismatic.io/docs/custom-connectors/actions.md#dropdown-menu-inputs), so users can select "Mr.", "Mrs.", "Dr.", etc. Make sure to `npm run build` and `prism components:publish` to publish new component changes to Prismatic, and within the Prismatic integration designer, you'll need to click **Component Versions** and bump your component version to the latest version.
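For the title exercise, a minimal sketch (assuming a dropdown is defined by attaching a `model` list of label/value pairs to a string input, per the dropdown menu inputs documentation linked above; the `title` input name is hypothetical) might look like this:

```
import { input } from "@prismatic-io/spectral";

// A hypothetical "title" input rendered as a dropdown menu via the `model` property
const title = input({
  label: "Title",
  type: "string",
  required: false,
  model: [
    { label: "Mr.", value: "Mr." },
    { label: "Mrs.", value: "Mrs." },
    { label: "Dr.", value: "Dr." },
  ],
});
```

You would then add `title` to the action's `inputs` block and reference `params.title` in your `perform` function.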
Once you're ready, head to the [next tutorial](https://prismatic.io/docs/custom-connectors/get-started/wrap-an-api.md) where we'll cover making API calls from a custom component. --- #### Set Up Your Environment In this section you'll learn how to set up your environment for custom component development. #### Operating system considerations[​](#operating-system-considerations "Direct link to Operating system considerations") You can develop custom components on any operating system that supports Node.js (Windows, Linux, macOS, etc.). If you use Windows, you may find it helpful to work in a Windows Subsystem for Linux ([WSL](https://learn.microsoft.com/en-us/windows/wsl/install)) environment. #### Install Node.js[​](#install-nodejs "Direct link to Install Node.js") Custom components are written in Node.js (with TypeScript layered on top). While many versions of Node.js can be used to build custom components, we recommend using the latest LTS version available on [NodeJS.org](https://nodejs.org/). Once you install Node.js, ensure that you can run both `node --version` and `npm --version`. #### Install an IDE[​](#install-an-ide "Direct link to Install an IDE") If you have a favorite code editor, use that. We've found that [VS Code](https://code.visualstudio.com/) works great for custom component development, but some of our developers prefer [Sublime](https://www.sublimetext.com/) or [neovim](https://neovim.io/). It's not necessary, but if you use VS Code we've found the following extensions helpful: * [ESLint](https://marketplace.visualstudio.com/items?itemName=dbaeumer.vscode-eslint) for code linting * [Prettier](https://marketplace.visualstudio.com/items?itemName=esbenp.prettier-vscode) for code formatting * [Version Lens](https://marketplace.visualstudio.com/items?itemName=pflannery.vscode-versionlens) for showing the latest version for each package in package.json VS Code has built-in TypeScript IntelliSense (auto-complete), but if you want to use the latest-and-greatest, you can also install the [TypeScript Nightly build extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode.vscode-typescript-next). #### Install the Prismatic CLI tool[​](#install-the-prismatic-cli-tool "Direct link to Install the Prismatic CLI tool") The Prismatic CLI tool, `prism`, is used for initializing and publishing custom components. You can install it after installing Node.js and `npm` with this command: Install prism ``` npm install --global @prismatic-io/prism ``` Once you've installed `prism`, log in by typing `prism login`. You'll be prompted to enter your Prismatic credentials, and then to authorize `prism` to access your Prismatic account. Click **Accept**. Once you've logged in, run `prism me` to verify that you are logged in. ``` $ prism me Name: John Doe Email: john.doe@example.com Organization: Example Corp - US Region Endpoint URL: https://app.prismatic.io ``` #### Enable AI agents like Cursor[​](#enable-ai-agents-like-cursor "Direct link to Enable AI agents like Cursor") [Cursor](https://www.cursor.com/) is a powerful AI code editor that can accelerate development. Prismatic provides a set of coding best practices that can guide Cursor (and similar AI agents) when building custom Prismatic connectors.
#### Install additional development tools[​](#install-additional-development-tools "Direct link to Install additional development tools") In addition to those requirements, you may also want to install the [Prism MCP server](https://prismatic.io/docs/dev-tools/prism-mcp.md) into your IDE to enhance AI coding assistant support. --- #### Wrap an API in a Custom Connector Component authors are frequently asked to develop a component that implements a series of API calls for some third-party service (or possibly your own company's API). If the API you're creating a component for has an OpenAPI or WSDL spec, you can have `prism` [generate a component](https://prismatic.io/docs/custom-connectors/initializing.md#custom-connectors-from-wsdls-or-openapi-specs) based off of your YAML, JSON, or XML spec file. If, however, your API does not have an OpenAPI or WSDL spec, or if you're just looking to implement actions for a small handful of endpoints, we have some "best practices" you can follow for developing a custom component that implements those endpoints. The code for our custom component is available on [GitHub](https://github.com/prismatic-io/examples/tree/main/components/acmeerp). Let's dive into the code here. [How to Wrap a RESTful API in a Custom Component](https://player.vimeo.com/video/894996672) #### Our component's spec[​](#our-components-spec "Direct link to Our component's spec") For today's tutorial, suppose we're writing a component for "Acme ERP" - a made-up ERP system, and we would like to implement actions for a subset of endpoints that the Acme ERP API offers. We're handed a few specs for our component: * Customers each have distinct instances of Acme ERP, so the API endpoint is different for each customer. * Customers authenticate by entering an API key that they generate. * For our first integration, there are four endpoints that we want actions for: * **List All Items**: This is a GET call to the API's `/items/` endpoint which returns an array of items in our inventory. * **Get Item**: This is a GET call to `/items/{ ITEM_ID }` and fetches a single item from our inventory by ID. * **Add Item**: This is a POST call to `/items/` to add an item to our inventory. * **Delete Item**: This is a DELETE call to `/items/{ ITEM_ID }` to delete an item from inventory. #### Initializing the component[​](#initializing-the-component "Direct link to Initializing the component") Like any new custom component, we can create our new component using a subcommand of [prism](https://prismatic.io/docs/cli.md): Initialize our component ``` $ prism components:init acmeerp Creating component directory for "acmeerp"... "acmeerp" is ready for development. To install dependencies, run either "npm install" or "yarn install" To test the component, run "npm run test" or "yarn test" To build the component, run "npm run build" or "yarn build" To publish the component, run "prism components:publish" For documentation on writing custom components, visit https://prismatic.io/docs/custom-components/writing-custom-components/ ``` Give your component whatever description you'd like, and select **Basic Connection** for connection type for now - we'll change this later. This will generate a boilerplate custom component. 
Remove all files in the `src/` directory and replace `index.ts` with an empty component definition: index.ts ``` import { component } from "@prismatic-io/spectral"; export default component({ key: "acmeerp", public: false, display: { label: "acmeerp", description: "Acme ERP", iconPath: "icon.png", }, actions: {}, }); ``` #### Creating a shared HTTP client[​](#creating-a-shared-http-client "Direct link to Creating a shared HTTP client") Our component needs to reach out to an endpoint that is unique for each customer, and needs to use API Key authentication. So, our connections will need to contain both endpoint and credential information. To handle those things, let's create a function that takes a connection and returns an HTTP client pointed at that endpoint. The custom component SDK provides an HTTP client that we can use. Let's first create a `connection`, which will contain the information needed to connect to Acme ERP: the customer's unique endpoint URL and the customer's API key. src/connections.ts ``` import { connection } from "@prismatic-io/spectral"; // Create a connection that contains an API endpoint URL // and an API key. export const apiKeyConnection = connection({ key: "apiKey", display: { label: "Acme Connection", description: "Acme Connection", }, inputs: { endpoint: { label: "Acme Endpoint URL", placeholder: "Acme Endpoint URL", type: "string", required: true, comments: "Acme API Endpoint URL", default: "https://my-json-server.typicode.com/prismatic-io/placeholder-data", example: "https://my-company.api.acme.com/", }, apiKey: { label: "Acme API Key", placeholder: "Acme API Key", type: "password", required: true, comments: "Generate at https://app.acme.com/settings/api-keys", }, }, }); export default [apiKeyConnection]; ``` To add this connection to our component, we'll need to import the connection into `index.ts` and add it to our component definition: Add connection to index.ts ``` import { component } from "@prismatic-io/spectral"; import connections from "./connections"; export default component({ key: "acmeerp", public: false, display: { label: "acmeerp", description: "Acme ERP", iconPath: "icon.png", }, actions: {}, connections, }); ``` Next, we'll create a helper function, `getAcmeErpClient`, which will take a connection input and create an HTTP client pointed at the connection's endpoint URL and authenticated with the connection's API key. Let's place this code in a new file called `client.ts`: src/client.ts ``` import { createClient } from "@prismatic-io/spectral/dist/clients/http"; import { Connection, util } from "@prismatic-io/spectral"; export function getAcmeErpClient(acmeConnection: Connection) { const { apiKey, endpoint } = acmeConnection.fields; // Return an HTTP client that has been configured to point // towards the endpoint URL and passes an API key as a header return createClient({ baseUrl: util.types.toString(endpoint), headers: { Authorization: `Bearer ${apiKey}`, }, }); } ``` Now, each of our actions simply needs to pass in a connection to the `getAcmeErpClient` function, and they'll be able to make HTTP calls to the Acme ERP API. #### Write some actions[​](#write-some-actions "Direct link to Write some actions") Now it's time to implement our actions. Remember, we want to create four actions that list all items in our inventory, fetch a specific item, delete a specific item, and add an item to our inventory. ##### List all items action[​](#list-all-items-action "Direct link to List all items action") Let's start with the "list all items" action.
This action will take a single input - the connection we just defined. Let's keep our code organized, so we'll create a new file, `src/actions.ts`, with that single action. At the top of the file, we'll import the `action` and `input` functions from our [custom component SDK](https://prismatic.io/docs/custom-connectors.md) library, and the `getAcmeErpClient` function we defined in `client.ts`: src/actions.ts - List All Items Action ``` import { action, input } from "@prismatic-io/spectral"; import { getAcmeErpClient } from "./client"; const listAllItems = action({ display: { label: "List All Items", description: "List all items in our inventory", }, inputs: { // Declare an input for this action acmeConnection: input({ label: "Acme ERP", required: true, type: "connection", }), }, perform: async (context, { acmeConnection }) => { // Initialize our HTTP client const acmeErpClient = getAcmeErpClient(acmeConnection); // Make a GET call to "{ endpoint }/items/": const response = await acmeErpClient.get("/items/"); // Return the data that we got back return { data: response.data }; }, // Show an example payload in the Prismatic UI: examplePayload: { data: [ { id: 1, name: "Widgets", quantity: 20, }, { id: 2, name: "Whatsits", quantity: 100, }, ], }, }); export default { listAllItems }; ``` Next, import actions from `actions.ts` into `index.ts` and update our component definition: index.ts ``` import { component } from "@prismatic-io/spectral"; import connections from "./connections"; import actions from "./actions"; export default component({ key: "acmeerp", public: false, display: { label: "acmeerp", description: "Acme ERP", iconPath: "icon.png", }, actions, connections, }); ``` At this point we can run `npm run build` and `prism components:publish` to publish our single-action component to Prismatic. If we add our action to an integration, we can see that the step takes the connection input we specified: ![Acme ERP - List all Items in Prismatic integration designer](/docs/img/custom-connectors/get-started/wrap-an-api/first-action-inputs.png) If we open the test instance configuration, we're prompted for the connection information that we specified previously. ![Acme ERP - config wizard in Prismatic integration designer](/docs/img/custom-connectors/get-started/wrap-an-api/config-wizard.png) If we want to test our action, we can use the **Run** button in the Prismatic UI to run our step and see the results: ![Acme ERP - List all Items results in Prismatic integration designer](/docs/img/custom-connectors/get-started/wrap-an-api/first-action-results.png) ##### Get item action[​](#get-item-action "Direct link to Get item action") Next, let's build another action that fetches a specific item from our API's inventory. The "get item" action will be very similar to the previous action - it'll initialize an HTTP client and return some data. The `display`, `inputs`, `perform` logic, and `examplePayload` will differ slightly, but the bulk of the action will be the same as before: src/actions.ts - Get Item Action ``` // ...
const getItem = action({ display: { label: "Get Item", description: "Get an Item by ID", }, inputs: { acmeConnection: input({ label: "Acme ERP", required: true, type: "connection", }), itemId: input({ label: "Item ID", required: true, type: "string", }), }, perform: async (context, { acmeConnection, itemId }) => { const acmeErpClient = getAcmeErpClient(acmeConnection); const response = await acmeErpClient.get(`/items/${itemId}`); return { data: response.data }; }, examplePayload: { data: { id: 1, name: "Widgets", quantity: 20, }, }, }); export default { getItem, listAllItems }; ``` ##### Delete item action[​](#delete-item-action "Direct link to Delete item action") At this point, we "rinse and repeat" - the "delete an item" action will get an HTTP client, send a DELETE HTTP request to an endpoint, and this time return nothing (since a DELETE of an item on our API returns nothing). Our "delete an item" action will take a connection and item ID, like the "get an item" action: src/actions.ts - Delete an Item Action ``` // ... const deleteItem = action({ display: { label: "Delete Item", description: "Delete an Item by ID", }, inputs: { acmeConnection: input({ label: "Acme ERP", required: true, type: "connection", }), itemId: input({ label: "Item ID", required: true, type: "string", }), }, perform: async (context, { acmeConnection, itemId }) => { const acmeErpClient = getAcmeErpClient(acmeConnection); await acmeErpClient.delete(`/items/${itemId}`); return { data: null }; }, }); export default { deleteItem, getItem, listAllItems }; ``` ##### Add an item action[​](#add-an-item-action "Direct link to Add an item action") The "add an item" action has a couple of subtle differences from the other three actions: * It takes additional inputs (`name` and `quantity`). These inputs are not shared by other actions, so we can define them in-line. * The HTTP call we use is a POST call, and we pass our `name` and `quantity` inputs to the API as POST parameters. The rest of the action is very similar to the previous actions: src/actions.ts - Add Item Action ``` // ... const addItem = action({ display: { label: "Add Item", description: "Add an Item to Inventory", }, // We can define some inputs inline if they're not reused: inputs: { acmeConnection: input({ label: "Acme ERP", required: true, type: "connection", }), name: input({ label: "Item Name", type: "string" }), quantity: input({ label: "Item Quantity", type: "string" }), }, perform: async (context, { acmeConnection, name, quantity }) => { const acmeErpClient = getAcmeErpClient(acmeConnection); const response = await acmeErpClient.post("/items/", { name, quantity, }); return { data: response.data }; }, // This API call returns the item object that was created: examplePayload: { data: { id: 1, name: "Widgets", quantity: 20, }, }, }); export default { addItem, deleteItem, getItem, listAllItems }; ``` #### Unit testing our component[​](#unit-testing-our-component "Direct link to Unit testing our component") Now that we've implemented our four actions for our component, let's add some [unit tests](https://prismatic.io/docs/custom-connectors/unit-testing.md) to verify that our actions return what we'd expect them to return. To do that, we'll create a new file, `src/actions.test.ts`. We can import a helper function from Prismatic's SDK - `createHarness`, which will create a testing harness for our tests.
Then, we can import the actions that we want to test (we'll test "add an item" here) and we can invoke the action and verify that the `result` we receive is what we expect: src/actions.test.ts - Unit Test the Add Item Action ``` import { createConnection, createHarness, } from "@prismatic-io/spectral/dist/testing"; import { apiKeyConnection } from "./connections"; import myComponent from "."; const harness = createHarness(myComponent); const acmeConnection = createConnection(apiKeyConnection, { endpoint: "https://my-json-server.typicode.com/prismatic-io/placeholder-data", apiKey: process.env.ACME_ERP_API_KEY, // Get API key from an environment variable }); describe("test the add item action", () => { test("test that we get back what we sent", async () => { const name = "widgets"; const quantity = 123; const result = await harness.action( "addItem", // Invoke the "addItem" action { acmeConnection, name, quantity }, // Pass in some inputs that we declared ); const data = result?.data as Record<string, unknown>; expect(data?.name).toEqual(name); // Verify that the response had the same item name expect(data?.quantity).toEqual(quantity); // Verify that the response had the same item quantity }); }); ``` --- #### Handling Large Files in Custom Components If you need to process large files in your integration, it's easy to exhaust your available memory and encounter out-of-memory (OOM) problems. For example, if you pull down a 100 MB file from an SFTP server, deserialize the CSV to a JavaScript object, map each row to a new format, serialize each row, etc., you can end up with a dozen copies of the data in memory and can overflow the 1GB of memory that the integration runner has by default. Rather than loading an entire large file at once, it's often better to load and process smaller portions of the file at a time. That way, you can load a few kilobytes of a file, or a few rows of a CSV, process those, and then move to the next set of bytes or rows. If done correctly, a step can process very large files with only a few megabytes of memory. Processing large files a small portion at a time is generally accomplished in Node.js with [streams](https://nodejs.org/api/stream.html). Let's look at a couple of examples. #### Streaming a large file from HTTP to SFTP[​](#streaming-a-large-file-from-http-to-sftp "Direct link to Streaming a large file from HTTP to SFTP") Suppose that your integration needs to pull down a file from an HTTP endpoint, and save that file to an SFTP server. If your file is large (say, 200MB in size), and you use several steps to accomplish your goal, you can end up using well over 1 GB of memory: * The HTTP step will use 200MB when downloading the file * The HTTP step will use 200MB+ when serializing and persisting the step result * Depending on the output format, the HTTP step may use another 200MB+ to deserialize the file to JSON, etc. * The SFTP step will use 200MB when converting the file's contents to a communication format that SFTP understands If you download the file from the HTTP endpoint a few KB at a time, and stream those bytes directly to the SFTP server, your step will only use a few MB of memory at a time - the entire file will never be loaded into memory at once. In the example below, the `axios.get` function takes a parameter, `{ responseType: "stream" }`. That will cause `response.data` to be of type `stream.Readable`. That stream can be passed to an SFTP client's `.put` function.
When it detects a readable stream, [ssh2-sftp-client](https://www.npmjs.com/package/ssh2-sftp-client) will pipe that stream to the SFTP server as chunks are received. * actions.ts * inputs.ts * connections.ts Stream a file from the internet to an SFTP server ``` import { action, util, ConnectionError } from "@prismatic-io/spectral"; import { connectionInput, sftpPathInput, sourceUrlInput } from "./inputs"; import axios from "axios"; import SFTPClient from "ssh2-sftp-client"; const uploadFileFromUrl = action({ display: { label: "Upload file from URL", description: "Upload a file from a URL to an SFTP server", }, inputs: { connection: connectionInput, sourceUrl: sourceUrlInput, sftpPath: sftpPathInput, }, perform: async (context, params) => { const sftpClient = new SFTPClient(); const { username, password, host, port, timeout } = params.connection.fields; try { await sftpClient.connect({ username: util.types.toString(username), password: util.types.toString(password), host: util.types.toString(host), port: util.types.toInt(port), readyTimeout: util.types.toInt(timeout) || 3000, }); } catch (err) { throw new ConnectionError( params.connection, `Unable to connect to SFTP server. ${err}`, ); } const response = await axios.get(params.sourceUrl, { responseType: "stream", }); try { const result = await sftpClient.put(response.data, params.sftpPath); return { data: result }; } finally { await sftpClient.end(); } }, }); export default { uploadFileFromUrl }; ``` ``` import { input, util } from "@prismatic-io/spectral"; export const connectionInput = input({ label: "Connection", type: "connection", required: true, }); export const sourceUrlInput = input({ label: "Source File URL", type: "string", clean: util.types.toString, required: true, example: "https://files.example.com/my-file.pdf", }); export const sftpPathInput = input({ label: "Destination File Path", type: "string", clean: util.types.toString, required: true, example: "/path/to/my-file.pdf", }); ``` ``` import { connection } from "@prismatic-io/spectral"; export const basic = connection({ key: "basic", display: { label: "Basic Username/Password", description: "Basic Username and Password connection", }, inputs: { username: { label: "Username", type: "string", required: true }, password: { label: "Password", type: "password", required: true }, host: { label: "Host", type: "string", required: true }, port: { label: "Port", type: "string", default: "22", required: true }, }, }); export default [basic]; ``` You can extend the HTTP call to be authenticated, have search parameters, etc. As long as you specify `{ responseType: "stream" }`, your response will be a readable stream. Similar concepts can be applied to stream a file from HTTP to Dropbox, Google Drive, Azure Files, or most other file storage systems - most Node.js file storage libraries accept streams as inputs or have writeable stream functions. #### Streaming and processing a large CSV from Amazon S3[​](#streaming-and-processing-a-large-csv-from-amazon-s3 "Direct link to Streaming and processing a large CSV from Amazon S3") In this example, suppose you host large CSV files in Amazon S3 that represent transactions. These files are formatted like this: transactions.csv ``` id,product,quantity,price 1,widgets,5,100 2,gadgets,10,3.5 3,whatsits,1,200 . . . ``` However, there are thousands of records and the file is hundreds of MB in size and cannot be loaded into memory all at once.
You want to find the total price of the transactions (sum of `quantity x price`) and return just the total price. 1. First, we'll use the AWS SDK to fetch an object from Amazon S3. The resulting object's `.Body` property is an instance of `stream.Readable`. 2. Then, we'll stream the readable file into a popular CSV parser, [PapaParse](https://www.papaparse.com/). PapaParse accepts streams and provides a callback function, `step`, which is run whenever a line of a CSV stream is processed. As we process each record, we'll add `quantity x price` to the total price. 3. Finally, we'll return the total price as the step's result. Because we're not returning the entire file that was read, the runner does not spend time and memory serializing the file as a step result. ``` import { GetObjectCommand, S3Client } from "@aws-sdk/client-s3"; import { action, input, util } from "@prismatic-io/spectral"; import { parse } from "papaparse"; import { Readable } from "stream"; interface CsvRecord { data: { id: string; product: string; quantity: number; price: number; }; } export const processLargeCsvFromS3 = action({ display: { label: "Process Large CSV from S3", description: "Find the total price of many transactions in a CSV file", }, inputs: { connection: input({ label: "Connection", type: "connection", required: true, }), bucket: input({ label: "Bucket Name", type: "string", required: true, clean: util.types.toString, }), objectKey: input({ label: "Object Key", type: "string", required: true, clean: util.types.toString, }), }, perform: async (context, params) => { // Initialize an Amazon S3 client const s3 = new S3Client({ region: "us-east-2", credentials: { accessKeyId: util.types.toString(params.connection.fields.accessKeyId), secretAccessKey: util.types.toString( params.connection.fields.secretAccessKey, ), }, }); // Initialize an accumulator let total = 0; // Fetch an object from Amazon S3. // The returned item.Body is stream.Readable const command = new GetObjectCommand({ Bucket: params.bucket, Key: params.objectKey, }); const item = await s3.send(command); // Parse the stream as it is read from S3, running "step" for each record read await new Promise((resolve) => { parse(item.Body as Readable, { header: true, dynamicTyping: true, // As each line in the CSV is read, function "step" is called step: (record: CsvRecord, parser) => { parser.pause(); // Pause the parser while work is done total += record.data.quantity * record.data.price; parser.resume(); // Re-enable the CSV parser after work is complete }, // When the stream ends, run complete complete: () => { resolve(null); }, }); }); return { data: total }; }, }); export default { processLargeCsvFromS3 }; ``` The `parser.pause()` and `parser.resume()` above are not necessary for our example, but if you need to do work on each record that you read (for example, transform the data and send it to an API), pausing the CSV parser while that work is done can help you to avoid overwhelming the API you're sending data to. Some File Formats stream better than others A CSV file is able to be processed readily as a stream because it can be read line-by-line. Other formats, like JSON or XML, have beginning and ending brackets or tags that may require you to load the file in its entirety. If you are dealing with JSON, consider [JSONL](https://jsonlines.org/) format, which can be read line-by-line. 
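For instance, here is a minimal sketch (assuming you already have a readable stream of JSONL data, and a hypothetical numeric `quantity` field on each record) that uses Node's `readline` module to process records one line at a time:

```
import { createInterface } from "readline";
import { Readable } from "stream";

// Sum a numeric field across a JSONL stream without loading the whole file into memory
export async function sumQuantities(source: Readable): Promise<number> {
  const lines = createInterface({ input: source, crlfDelay: Infinity });
  let total = 0;
  for await (const line of lines) {
    if (!line.trim()) {
      continue; // skip blank lines
    }
    const record = JSON.parse(line); // each JSONL line is a standalone JSON document
    total += Number(record.quantity) || 0;
  }
  return total;
}
```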
If you are parsing XML, you can parse the XML file by streaming data into [node-xml-stream-parser](https://www.npmjs.com/package/node-xml-stream-parser) and looking for specific XML tags. --- #### Initializing a New Component #### Initializing a new connector[​](#initializing-a-new-connector "Direct link to Initializing a new connector") To initialize a new project, run `prism components:init {{ CONNECTOR NAME }}`. If you do not have Prismatic's CLI tool, `prism`, installed, please take a moment to look through the [Prism overview page](https://prismatic.io/docs/cli.md). ``` prism components:init acme-erp ``` Your component name must be comprised of alphanumeric characters, hyphens, and underscores, and start and end with alphanumeric characters. You will be prompted with a couple of questions - to give a description for your component and to determine what connection authorization type should be templated (you can edit those later). This will create a directory structure that looks like this: ``` acme-erp β”œβ”€β”€ assets β”‚ └── icon.png β”œβ”€β”€ jest.config.js β”œβ”€β”€ package.json β”œβ”€β”€ src β”‚ β”œβ”€β”€ actions.ts β”‚ β”œβ”€β”€ client.ts β”‚ β”œβ”€β”€ connections.ts β”‚ β”œβ”€β”€ index.test.ts β”‚ β”œβ”€β”€ index.ts β”‚ └── triggers.ts β”œβ”€β”€ tsconfig.json └── webpack.config.js ``` * `assets/icon.png` is the icon that will be displayed next to your component. Square transparent PNGs at least 128 x 128 pixels in size look best, and will be scaled by the web application appropriately. * `jest.config.js` contains configuration for the [Jest](https://jestjs.io/) testing framework. * `package.json` is a standard node package definition file. * `src/actions.ts` contains your component's actions. This can be broken out into distinct files as your code grows. * `src/client.ts` contains a shared "client". This is handy for actions that share a mechanism for connecting to an API. The "client" will probably be an authenticated HTTP client that's configured to make requests of a particular endpoint. * `src/connections.ts` contains the connections that your component uses to authenticate with third-party APIs. * `src/index.test.ts` contains tests for the component's actions. See [Unit Testing Custom Components](https://prismatic.io/docs/custom-connectors/unit-testing.md). * `src/index.ts` contains your component definition. * `src/triggers.ts` contains custom triggers. * `tsconfig.json` contains configuration for [TypeScript](https://www.typescriptlang.org/). * `webpack.config.js` contains configuration for [Webpack](https://webpack.js.org/). #### Custom connectors from WSDLs or OpenAPI specs[​](#custom-connectors-from-wsdls-or-openapi-specs "Direct link to Custom connectors from WSDLs or OpenAPI specs") Third-party applications and services often provide APIs with hundreds of RESTful endpoints. It would be tedious to manually write actions for each individual endpoint. Luckily, many companies also provide an API specification - commonly a [Web Service Definition Language (WSDL)](https://www.w3.org/TR/2001/NOTE-wsdl-20010315) file, or an [OpenAPI (Swagger)](https://swagger.io/specification/) specification. 
You can generate a custom component from a WSDL file with `prism` by passing the `--wsdl-path` flag to the `components:init` subcommand: ``` prism components:init myThirdPartyComponent --wsdl-path ./thirdPartySpec.wsdl ``` You can generate a custom component from an OpenAPI definition (you can use a YAML or JSON file - both work fine) with `prism` by passing the `--open-api-path` flag to the `components:init` subcommand: ``` prism components:init myThirdPartyComponent --open-api-path ./third-party-openapi-spec.json ``` The custom component code that is generated may require some tweaking - some APIs have undocumented required headers, or irregular authentication schemes (so you may need to touch up `src/client.ts` or `src/connections.ts`). But, this does give you a great jumping-off point when building a custom component for a third-party app. --- #### Handling API Pagination #### Pagination in custom components[​](#pagination-in-custom-components "Direct link to Pagination in custom components") Every application implements pagination differently. Some applications require you to pass page number and number of records to return as URL search parameters (e.g. `?page=5&page_size=20`). In that case, it's your job to keep track of which page you're on. Others return a "cursor" with the response (either in the body or as a response header). You can include that cursor with your next request to get another page of results. As you build an action for an API that paginates, ask yourself this question: *is it reasonable to pull down all records at one time?* If the API you're interacting with returns 100 records at a time, for example, and you know that customers never have more than a few hundred records of a particular type, it probably makes sense to pack pagination logic into your custom action. That way, your customers don't need to keep track of page numbers or cursors - your action simply returns all results. In this example, Airtable returns a JSON payload with an `offset` property and an array of `records` that we accumulate in a `do`/`while` loop: Handling pagination within an action ``` export interface AirtableRecord { id: string; createdTime: string; fields: Record<string, unknown>; } export interface AirtableRecordResponse { offset: string; records: AirtableRecord[]; } const listRecords = action({ display: { label: "List Records", description: "List all records inside of the given table", }, inputs: { airtableConnection: connectionInput, baseId: baseIdInput, tableName: tableNameInput, view: viewInput, }, perform: async (context, params) => { const client = createAirtableClient(params.airtableConnection); const records: AirtableRecord[] = []; let offset = ""; do { const { data } = await client.get( `/v0/${params.baseId}/${params.tableName}`, { params: { view: params.view, offset, }, }, ); records.push(...data.records); offset = data.offset; } while (offset); return { data: records }; }, }); ``` On the other hand, if you know that your customers have a significant number of records stored in a third-party application (e.g. they have millions of records in their Airtable base), it's more memory-efficient to fetch a page of records at a time, processing each page before fetching the next page. In that case, we recommend adding `offset`, `cursor`, `page_number`, etc., as inputs of your action, and ensuring that your action returns those values for the next iteration.
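As a rough sketch of that approach (assuming a hypothetical API whose `/records` endpoint accepts a `cursor` query parameter and returns a `nextCursor` value, plus a `createAcmeClient` helper like the ones shown earlier), a single-page action might look like this:

```
import { action, input, util } from "@prismatic-io/spectral";
import { createAcmeClient } from "./client"; // hypothetical helper that wraps createClient

const listRecordsPage = action({
  display: {
    label: "List Records (Single Page)",
    description: "Fetch one page of records and return a cursor for the next page",
  },
  inputs: {
    connection: input({ label: "Connection", type: "connection", required: true }),
    cursor: input({
      label: "Cursor",
      type: "string",
      required: false,
      clean: util.types.toString,
      comments: "Leave blank for the first page; pass the previous execution's nextCursor afterwards",
    }),
  },
  perform: async (context, params) => {
    const client = createAcmeClient(params.connection);
    // The name of the cursor parameter depends on the API you're wrapping
    const { data } = await client.get("/records", {
      params: { cursor: params.cursor || undefined },
    });
    // Return the page of records along with the cursor the next execution should use
    return { data: { records: data.records, nextCursor: data.nextCursor } };
  },
});

export default { listRecordsPage };
```

Each execution can then feed `nextCursor` back into the action's `cursor` input until no cursor is returned.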
parsing link headers If the API you're working with returns link headers, we recommend the [parse-link-header](https://www.npmjs.com/package/parse-link-header) package, which can be used to extract the next URL to use when paginating. --- #### Publishing a Custom Component #### Publishing a custom component[​](#publishing-a-custom-component "Direct link to Publishing a custom component") Package a component with `webpack` by running `npm run build` or `yarn build`: ``` $ yarn build yarn run v1.22.10 $ webpack asset icon.png 94.2 KiB [compared for emit] [from: assets/icon.png] [copied] asset index.js 92.2 KiB [emitted] (name: main) runtime modules 1.04 KiB 5 modules modules by path ./node_modules/@prismatic-io/spectral/ 49.6 KiB modules by path ./node_modules/@prismatic-io/spectral/dist/types/*.js 3.92 KiB 12 modules modules by path ./node_modules/@prismatic-io/spectral/dist/*.js 21.4 KiB ./node_modules/@prismatic-io/spectral/dist/index.js 4.21 KiB [built] [code generated] ./node_modules/@prismatic-io/spectral/dist/util.js 11.9 KiB [built] [code generated] ./node_modules/@prismatic-io/spectral/dist/testing.js 5.29 KiB [built] [code generated] ./node_modules/@prismatic-io/spectral/node_modules/jest-mock/build/index.js 24.2 KiB [built] [code generated] modules by path ./node_modules/date-fns/ 16 KiB modules by path ./node_modules/date-fns/_lib/ 780 bytes ./node_modules/date-fns/_lib/toInteger/index.js 426 bytes [built] [code generated] ./node_modules/date-fns/_lib/requiredArgs/index.js 354 bytes [built] [code generated] 4 modules ./src/index.ts 2.46 KiB [built] [code generated] ./node_modules/valid-url/index.js 3.99 KiB [built] [code generated] webpack 5.41.1 compiled successfully in 1698 ms ✨ Done in 2.86s. ``` This will create a `dist/` directory containing your compiled JavaScript and icon image. Now use `prism` to publish your component. If you do not have Prismatic's CLI tool, `prism`, installed, please take a moment to look through the [Prism overview page](https://prismatic.io/docs/cli.md). ``` $ prism components:publish Format Name - Format a person's name given a first, middle, and last name Would you like to publish Format Name? (y/N): y Successfully submitted Format Name (v6)! The publish should finish processing shortly. ``` #### Component versioning[​](#component-versioning "Direct link to Component versioning") Components are versioned. The first time a component is published, it is given version "1". Thereafter, each time the component is republished, its version number increments by one. Within the integration designer, you can choose which version of a component to use. Components marked in grey are using the latest version, and components marked in yellow have newer versions available. ![Component version drawer in Prismatic app](/docs/img/custom-connectors/component-version-drawer.png) In most cases, you'll want to use the latest version of each component. For the sake of stability and consistency, though, versions of components used in integrations are not bumped to the latest version automatically. This prevents unintended issues in your integration if a team member were to publish a broken custom component. You can keep an integration pinned to component "version 6", for example, while your team members experiment with a newer "version 7" of a custom component for another integration.
When your team publishes a new version of a custom component, or when Prismatic publishes a new version of a built-in component, you will see a notification in the integration designer on the **Component Versions** button indicating that some components are outdated. To upgrade those components, click the **Component Versions** button and select **CHANGE VERSION** for the component whose version you want to change. #### Publishing components in a CI/CD pipeline[​](#publishing-components-in-a-cicd-pipeline "Direct link to Publishing components in a CI/CD pipeline") If you have multiple tenants, or if you want to automate the publishing of your custom components, you can incorporate component publishing into your CI/CD pipeline. At a high level, the steps to publish a component in a CI/CD pipeline are: 1. Install the [`prism` CLI tool](https://prismatic.io/docs/cli.md) 2. [Authenticate](https://prismatic.io/docs/api/ci-cd-system.md) the `prism` CLI tool with your Prismatic tenant 3. Build your component 4. Use the `prism components:publish` command to publish your component If you use GitHub, you can use Prismatic's [GitHub Actions](https://prismatic.io/docs/api/github-actions.md) to publish your component as part of your GitHub Actions workflow. [This example repo](https://github.com/prismatic-io/example-project-structure) contains an example project structure and GitHub Actions workflow that builds and publishes multiple custom components to Prismatic whenever changes are pushed to the `main` branch. ##### Including git commit information[​](#including-git-commit-information "Direct link to Including git commit information") When publishing components in a CI/CD pipeline, you may want to include git commit information with your component publish. This information can help you track which version of your component code is associated with each published component version. In this example, we derive the commit hash, commit URL, and repository URL from the git repository and include that information when publishing the component: Example: Publishing with git commit info ``` prism components:publish \ --comment "Refactored the 'Get Widgets' action" \ --commitHash $(git rev-parse HEAD) \ --commitUrl $(git config --get remote.origin.url)/commit/$(git rev-parse HEAD) \ --repoUrl $(git config --get remote.origin.url) ``` This information is available in the web app when viewing the component's details. ![Component commit information in Prismatic app](/docs/img/custom-connectors/publishing/component-details.png) **Note**: If you use Prismatic's [GitHub Actions](https://prismatic.io/docs/api/github-actions.md), the action automatically includes git commit information when publishing your component. --- #### Custom Triggers #### Writing custom triggers[​](#writing-custom-triggers "Direct link to Writing custom triggers") Integrations are usually triggered [on a schedule](https://prismatic.io/docs/integrations/triggers/schedule.md) (meaning instances of the integration run every X minutes, or at a particular time of day) or [via webhook](https://prismatic.io/docs/integrations/triggers/webhook.md) (meaning some outside system sends JSON data to a unique URL and an instance processes the data that was sent). The vast majority of integrations built in Prismatic start with a schedule trigger or webhook trigger. There are situations, though, where neither the schedule nor the standard webhook trigger is suitable for one reason or another. That's where writing your own triggers comes in handy.
Triggers are custom bits of code that are similar to [actions](https://prismatic.io/docs/custom-connectors/actions.md). They give you fine-grained control over how a webhook's payload is presented to the rest of the steps of an integration and what HTTP response is returned to whatever invoked the trigger's webhook URL. Similar to an action, a trigger consists of `display` information, a `perform` function, and `inputs`. Additionally, you specify if your trigger can be invoked [synchronously](https://prismatic.io/docs/integrations/triggers/webhook/synchronous-and-asynchronous.md) (`synchronousResponseSupport`) and if your trigger supports scheduling (`scheduleSupport`). Suppose, for example, a third-party app can be configured to send CSV data via webhook and requires that the webhook echo a header, `x-confirmation-code`, back in plaintext to confirm it got the payload. The default webhook trigger accepts JSON, and responds with an [execution ID](https://prismatic.io/docs/api/schema/object/InstanceExecutionResult.md#return-fields), so it's not suitable for integrating with this third-party app. This trigger will return an HTTP 200 and echo a particular header back to the system invoking the webhook, and then it'll parse the CSV payload into an object so that subsequent steps can reference it through the trigger's `results.body.data`: ``` import { input, trigger, TriggerPayload, HttpResponse, util, } from "@prismatic-io/spectral"; import papaparse from "papaparse"; // CSV Library export const csvTrigger = trigger({ display: { label: "CSV Webhook", description: "Accepts and parses CSV data into a referenceable object and returns a plaintext ACK to the webhook caller.", }, perform: async (context, payload, { hasHeader }) => { // Create a custom HTTP response that echoes a header, // x-confirmation-code, that was received as part of // the webhook invocation const response: HttpResponse = { statusCode: 200, contentType: "text/plain; charset=utf-8", body: payload.headers["x-confirmation-code"], }; // Create a copy of the webhook payload, deserialize // the CSV raw body, and add the deserialized object // to the trigger's outputs const finalPayload: TriggerPayload = { ...payload }; const parseResult = papaparse.parse( util.types.toString(payload.rawBody.data), { header: util.types.toBool(hasHeader), }, ); finalPayload.body.data = parseResult.data; // Return the modified trigger payload and custom HTTP response return Promise.resolve({ payload: finalPayload, response, }); }, inputs: { // Declare if the incoming CSV has header information hasHeader: input({ label: "CSV Has Header", type: "boolean", default: "false", }), }, synchronousResponseSupport: "invalid", // Do not allow synchronous invocations scheduleSupport: "invalid", // Do not allow scheduled invocations }); export default { csvTrigger }; ``` Notice a few things about this example: * The `trigger`'s form is very similar to that of an `action` definition. * The `response` contains an HTTP `statusCode`, `body`, and `contentType` to be returned to the webhook caller. * The second argument to the `perform` function - `payload` - contains the same information that a standard webhook trigger returns. The `rawBody.data` presumably contains some CSV text - the `body.data` key of the payload is replaced by the deserialized version of the CSV data. * `inputs` work the same way that they work for actions - you define a series of `input`s, and they're passed in as the third parameter of the `perform` function.
##### Instance deploy and delete events for triggers[​](#instance-deploy-and-delete-events-for-triggers "Direct link to Instance deploy and delete events for triggers") Similar to a `perform` function, a trigger can also define `onInstanceDeploy` and `onInstanceDelete` functions. These functions are called when an instance is deployed or deleted, respectively. They are handy for creating or deleting resources in a third-party system that are associated with an instance (like webhooks, file directories, etc). ##### Adding a trigger to your component[​](#adding-a-trigger-to-your-component "Direct link to Adding a trigger to your component") Once you've written a trigger, you can add it to an existing component the same way you add an action to your component, but using the `triggers` key: ``` import { csvTrigger } from "./csvTrigger"; export default component({ key: "format-name", public: false, display: { label: "Format Name", description: "Format a person's name given a first, middle, and last name", iconPath: "icon.png", }, actions: { improperFormatName, properFormatName, }, triggers: { csvTrigger }, }); ``` #### App event triggers[​](#app-event-triggers "Direct link to App event triggers") It's common for users to want to know when records are created or updated in a third-party app. There are a couple of ways you can achieve this: 1. An event-based system uses [webhooks](https://prismatic.io/docs/integrations/triggers/webhook.md) to notify your flow whenever something happens. 2. A trigger polls the third-party API for changes on a time interval. Generally speaking, webhook triggers are preferable over polling triggers as they provide near real-time updates. ##### App event webhook triggers[​](#app-event-webhook-triggers "Direct link to App event webhook triggers") An app event webhook trigger takes advantage of `onInstanceDeploy` and `onInstanceDelete` functions (described [above](https://prismatic.io/docs/custom-connectors/triggers.md#instance-deploy-and-delete-events-for-triggers)). When a customer configures and deploys an instance of your integration, `onInstanceDeploy` configures a webhook. When the instance is removed, the `onInstanceDelete` trigger removes the webhook. ###### Example app event trigger using webhooks[​](#example-app-event-trigger-using-webhooks "Direct link to Example app event trigger using webhooks") This example trigger will create a webhook in a third-party app when an instance is deployed, storing the webhook ID in persistent data, and delete the webhook when the instance is deleted: ``` const acmeWebhookTrigger = trigger({ display: { label: "Acme Webhook Trigger", description: "Acme will notify your app when certain events occur in Acme", }, scheduleSupport: "invalid", synchronousResponseSupport: "invalid", inputs: { connection: input({ label: "Acme Connection", type: "connection", required: true, }), events: input({ type: "string", label: "Events", comments: "The events that would cause an Acme webhook request to be sent to this flow", collection: "valuelist", model: [ { label: "Lead Created", value: "lead_created" }, { label: "Lead Updated", value: "lead_updated" }, { label: "Lead Deleted", value: "lead_deleted" }, ], }), }, /** Run when a trigger is invoked. This function could contain additional logic for verifying HMAC signatures, etc. 
*/ perform: async (_context, payload, _params) => { return Promise.resolve({ payload }); }, /** Run when an instance with this trigger is deployed */ onInstanceDeploy: async (context, params) => { // Get the current flow's webhook URL const flowWebhookUrl = context.webhookUrls[context.flow.name]; // Create a webhook in Acme const { data } = await axios.post( "https://api.acme.com/webhooks", { endpoint: flowWebhookUrl, events: params.events, }, { headers: { Authorization: `Bearer ${params.connection.token?.access_token}`, }, }, ); // Store the webhook ID in the instance state return { crossFlowState: { [`${context.flow.name}-webhook-id`]: data.id }, }; }, /** Run when an instance with this trigger is removed */ onInstanceDelete: async (context, params) => { // Get the webhook ID from the instance state const webhookId = context.crossFlowState[`${context.flow.name}-webhook-id`]; // Delete the webhook from Acme await axios.delete(`https://api.acme.com/webhooks/${webhookId}`, { headers: { Authorization: `Bearer ${params.connection.token?.access_token}`, }, }); }, }); ``` Ensure your onInstanceDeploy function is idempotent Either the external third-party API, or your trigger, should be designed to be idempotent - meaning that if the `onInstanceDeploy` is created twice, it won't cause any problems. To test your trigger's `onInstanceDeploy` and `onInstanceDelete` functions in the integration designer, open the **Test Runner** drawer and click **Test Deploy** or **Test Delete** within the **Trigger** tab. warning Note that `onInstanceDeploy` and `onInstanceDelete` functions do not have access to flow-specific persisted data. Both functions should read and write data at the `crossFlowState` level. You can store unique data for each flow using key names that include the flow name in order to generate unique persisted data keys, like `${context.flow.name}-webhook-id` in the example above. ##### App event polling triggers[​](#app-event-polling-triggers "Direct link to App event polling triggers") Polling triggers are used when you want to be notified of changes in an external app, but the app does not support webhooks. The trigger's job is to fetch any new data since the last time it ran. A `pollingTrigger` is similar to a standard trigger that supports running on a schedule. Its `perform` function receives an additional parameter, `context.polling`, which has a few functions: * `context.polling.getState()` will fetch existing poll state. * `context.polling.invokeAction()` can invoke an existing component's action (if one exists) to fetch data from the external app. This is handy if you don't want to duplicate logic in your trigger and an action. * `context.polling.setState()` sets state for the next execution to load. Generally, a polling trigger's `perform` function will look like this: 1. **Get current poll state** from `context.polling.getState()`. This state will represent a cursor of some kind, depending on the API you're working with. If the API is paginated with pages that are numbers, your state may represent the number of the last page you fetched. If records in the API have "updated at" timestamps, this state may represent the most recent timestamp you've processed. 2. **Fetch new records.** Using the cursor you loaded, fetch records that you have not yet processed. You can either use `context.polling.invokeAction()` to run an action that fetches new data, or you can implement the logic yourself. If the API uses numbered pagination, fetch `lastPage + 1`. 
If the API uses "updated at" timestamps, query for records where `updated_at > ${previous_updated_at}`. Implementations will be different depending on the service you're integrating with. 3. **Update poll state** using `context.polling.setState()`. Save the newest page number or "updated at" timestamp that you fetched. 4. **Return new records** for the flow to process. If no new records were found, return `polledNoChanges: true` which will cause the execution to stop immediately. ###### Example PostgreSQL polling trigger[​](#example-postgresql-polling-trigger "Direct link to Example PostgreSQL polling trigger") This example polling trigger connects to a [PostgreSQL](https://prismatic.io/docs/components/postgres.md) database and queries a table called `people` which has columns `firstname TEXT`, `lastname TEXT`, and `updated_at TIMESTAMP`. While PostgreSQL can trigger a webhook request when data changes through a combination of a PostgreSQL [TRIGGER](https://www.postgresql.org/docs/current/sql-createtrigger.html) function and HTTP plugin, implementing webhooks in your database can cause the database to slow down considerably, since every INSERT or UPDATE waits for an HTTP request. Polling makes more sense when looking for updates in a PostgreSQL database. The first time this polling trigger runs, it finds `MAX(updated_at)::TEXT`. We cast the timestamp to `TEXT` so that it can be stored in persisted state readily, and so that PostgreSQL returns a timestamp with microseconds (it normally returns just milliseconds). On subsequent runs, we load the `cursor` (previous timestamp) that was found, and execute `"SELECT firstname, lastname FROM people WHERE updated_at > ${cursor}"`, polling any record that has an `updated_at` timestamp greater than the previous timestamp. Example PostgreSQL polling trigger ``` import { pollingTrigger } from "@prismatic-io/spectral"; import { connectionInput } from "./inputs"; import { createDB } from "./client"; export const pollPeople = pollingTrigger({ display: { label: "Poll people table for changes", description: "Fetch any updated records in the Acme people table", }, inputs: { postgresConnection: connectionInput, }, perform: async (context, payload, params) => { const db = createDB(params.postgresConnection); const state = context.polling.getState(); const cursorQuery = "SELECT MAX(updated_at)::TEXT AS cursor FROM people"; try { if (!state?.cursor) { // No previous cursor was found. This is the first time this // trigger has run, so fetch an initial cursor and then exit const { cursor: newCursor } = await db.one(cursorQuery); context.polling.setState({ cursor: newCursor }); context.logger.log( `First time running. Next time records with "updated_at" greater than "${newCursor}" will be fetched.`, ); return { payload, polledNoChanges: true, }; } // The trigger has run previously. 
Fetch results since it last ran const result = await db.tx(async (task) => { return { recordsQuery: await task.manyOrNone( "SELECT firstname, lastname FROM people WHERE updated_at > ${cursor}", { cursor: state.cursor }, ), cursorQuery: await task.one(cursorQuery), // Also fetch new cursor in the same transaction }; }); const newCursor = result.cursorQuery.cursor; const records = result.recordsQuery; context.polling.setState({ cursor: newCursor }); if (records.length > 0) { // If any new records were found, return them return { payload: { ...payload, body: { data: records } }, polledNoChanges: false, }; } else { // If no results were found, return nothing and exit return { payload, polledNoChanges: true }; } } finally { await db.$pool.end(); } }, }); ``` Note that if you return `polledNoChanges: true`, the runner will immediately stop and your flow will not continue to run. Use this property if you checked for new changes, but found none. ###### Example polling trigger using existing action[​](#example-polling-trigger-using-existing-action "Direct link to Example polling trigger using existing action") In this example, imagine you already have a custom component with an action `listProducts` that returns a result like this: List Products action return value ``` { "products": [ {"id": 123, "color": "red", "name": "Widget"}, {"id": 456, "color": "red", "name": "Gadget"} ], "page_info": { "limit": 100, "page": 20 } } ``` You can leverage this already-existing action in a polling trigger using the `pollAction` property and the `context.polling.invokeAction()` function: Invoking an action in a polling trigger ``` import { pollingTrigger } from "@prismatic-io/spectral"; import { listProducts } from "./actions"; import { connectionInput } from "./inputs"; interface MyPollingState { limit?: number; page?: number; } interface Product { id: number; color: string; name: string; } interface ListProductsResult { products: Product[]; page_info: { limit: number; page: number; }; } const myPollingTrigger = pollingTrigger({ display: { label: "Poll products API for changes", description: "Fetch new products from Acme", }, pollAction: listProducts, inputs: { connection: connectionInput }, perform: async (context, payload, params) => { const { limit, page: oldPage = 0 }: MyPollingState = context.polling.getState(); const { data } = (await context.polling.invokeAction({ connection: params.connection, limit, page: oldPage + 1, // Fetch the next page of results })) as { data: ListProductsResult }; const { page: newPage } = data.page_info; const { products } = data; if (products.length) { // Some products were found; save the newest page so the next poll continues from here context.polling.setState({ limit, page: newPage }); return { payload: { ...payload, body: { data: products } }, polledNoChanges: false, }; } else { return { payload, polledNoChanges: true, }; } }, }); ``` --- #### Unit Testing for Custom Connectors #### Overview[​](#overview "Direct link to Overview") It's important to have good unit tests for software - custom components are no exception. You want to catch errors or breaking changes before they wreak havoc on your customers' integrations. Prismatic's Spectral library provides some utility functions to make writing unit tests easier. In the examples below, we assume that you use the [Jest](https://jestjs.io/) testing framework, which is installed by default when you run `prism components:init`. You can swap Jest out for another testing framework if you prefer. 
##### Test file naming conventions[​](#test-file-naming-conventions "Direct link to Test file naming conventions") To create a unit test file, create a new file alongside your code that has the extension `test.ts` (rather than `.ts`). For example, if your code lives in `index.ts`, create a file named `index.test.ts`. If you separate out your component actions into `actions.ts`, create a corresponding `actions.test.ts`. ##### Testing component actions[​](#testing-component-actions "Direct link to Testing component actions") A component action's `perform` function takes two arguments: * `context` is an object that contains a `logger`, `executionId`, `instanceState`, and `stepId`. * `params` is an object that contains input parameters as key-value pairs. Test `context` parameters are described [here](https://prismatic.io/docs/custom-connectors/actions.md#the-context-parameter). Let's ignore them for now and look at the `params` object. Consider the example "Format Proper Name" action described previously: ``` export const properFormatName = action({ display: { label: "Properly Format Name", description: "Properly format a person's name (Last, First M.)", }, perform: async (context, params) => { if (params.middleName) { return { data: `${params.lastName}, ${params.firstName} ${params.middleName[0]}.`, }; } else { return { data: `${params.lastName}, ${params.firstName}` }; } }, inputs: { firstName, middleName, lastName }, }); ``` You can use the `ComponentTestHarness` class and `createHarness` helper function to test your actions. The test harness's `action` function takes two required and one optional parameters: * The action's key (i.e. `properFormatName`) * An object containing input parameters * An optional `context` object containing `logger`, `executionId`, `instanceState`, and `stepId` A Jest test file, then, could look like this: ``` import component from "."; import { createHarness } from "@prismatic-io/spectral/dist/testing"; const harness = createHarness(component); describe("Test the Proper Name formatter", () => { test("Verify first, middle, and last name", async () => { const result = await harness.action("properFormatName", { firstName: "John", middleName: "James", lastName: "Doe", }); expect(result.data).toStrictEqual("Doe, John J."); }); test("Verify first and last name without middle", async () => { const result = await harness.action("properFormatName", { firstName: "John", middleName: null, lastName: "Doe", }); expect(result.data).toStrictEqual("Doe, John"); }); }); ``` You can then run `yarn run jest`, and Jest will run each test, returning an error code if a test failed. ##### Verifying correct logging in action tests[​](#verifying-correct-logging-in-action-tests "Direct link to Verifying correct logging in action tests") You may want to verify that your action generates some logs of particular severities in certain situations. In addition to step results, the test utility's `invoke` function returns an object, `loggerMock`, with information on what was logged during the action invocation. 
You can use Jest to verify that certain lines were logged like this: ``` import { myExampleAction } from "./actions"; import { invoke } from "@prismatic-io/spectral/dist/testing"; test("Ensure that an error is logged", async () => { const level = "error"; const message = "Error code 42 occurred."; const { loggerMock } = await invoke(myExampleAction, { exampleInput1: "exampleValue1", exampleInput2: "exampleValue2", }); expect(loggerMock[level]).toHaveBeenCalledWith(message); }); ``` In the above example, the test would pass if an `error` log line of `Error code 42 occurred.` were generated, and would fail otherwise. ##### Providing test connection inputs to an action test[​](#providing-test-connection-inputs-to-an-action-test "Direct link to Providing test connection inputs to an action test") Many actions require a connection to interact with third-party services. You can create a connection object with the `createConnection` function from `@prismatic-io/spectral/dist/testing`: ``` import { createConnection, createHarness, } from "@prismatic-io/spectral/dist/testing"; import component from "."; import { myConnection } from "./connections"; const harness = createHarness(component); const myBasicAuthTestConnection = createConnection(myConnection, { username: "myUsername", password: "myPassword", }); describe("test my action", () => { test("verify the return value of my action", async () => { const result = await harness.action("myAction", { someInput: "abc-123", connection: myBasicAuthTestConnection, someOtherInput: "def-456", }); }); }); ``` It's not good practice to hard-code authorization secrets. Please use best practices, like setting environment variables to store secrets in your CI/CD environment: ``` import { createConnection } from "@prismatic-io/spectral/dist/testing"; import { myConnection } from "./connections"; const myBasicAuthTestConnection = createConnection(myConnection, { username: process.env.ACME_ERP_USERNAME, password: process.env.ACME_ERP_PASSWORD, }); ``` Use an Existing Integration's Connections for Testing If you would like to fetch an access key from an existing OAuth 2.0 connection in an integration (or username / password, API key, etc.), leverage the `prism components:dev:run` command to fetch the connection's fields and tokens. You can then reference the `PRISMATIC_CONNECTION_VALUE` environment variable in your Jest tests. More info is in our [prism docs](https://prismatic.io/docs/cli/prism.md#componentsdevrun). ##### Testing a trigger[​](#testing-a-trigger "Direct link to Testing a trigger") Testing a trigger is similar to [testing an action](#testing-component-actions), except you use the `harness.trigger` function instead. 
For example, if you want to use Jest to test the `csvTrigger` outlined above, your test could look like this: ``` import component from "."; import { createHarness, defaultTriggerPayload, } from "@prismatic-io/spectral/dist/testing"; const harness = createHarness(component); describe("test csv webhook trigger", () => { test("verify the return value of the csv webhook trigger", async () => { const payload = defaultTriggerPayload(); // The payload you can expect a generic trigger to receive payload.rawBody.data = "first,last,age\nJohn,Doe,30\nJane,Doe,31"; payload.headers.contentType = "text/csv"; payload.headers["x-confirmation-code"] = "some-confirmation-code-123"; const expectedData = [ { first: "John", last: "Doe", age: "30" }, { first: "Jane", last: "Doe", age: "31" }, ]; const expectedResponse = { statusCode: 200, contentType: "text/plain; charset=utf-8", body: payload.headers["x-confirmation-code"], }; const { payload: { body: { data }, }, response, } = await harness.trigger("csvTrigger", null, payload, { hasHeader: true, }); expect(data).toStrictEqual(expectedData); expect(response).toStrictEqual(expectedResponse); }); }); ``` #### Testing components from the CLI[​](#testing-components-from-the-cli "Direct link to Testing components from the CLI") The [`prism` CLI tool](https://prismatic.io/docs/cli.md) provides two commands for testing custom components: * `prism components:dev:run` fetches an integration's active connection and saves the fields as an environment variable so you can run unit tests and other commands locally. This is helpful, since many unit tests require an access token from a validated OAuth 2.0 connection - this provides a way of fetching the token from a connection you've already authenticated in the Prismatic integration designer. * `prism components:dev:test` publishes your component under a temporary name, and runs a single-action test integration for you that tests the action. This is helpful for quickly testing an action in the real Prismatic integration runner environment. ##### Access connections for local testing[​](#access-connections-for-local-testing "Direct link to Access connections for local testing") The `prism components:dev:run` command fetches an active connection from the Prismatic integration designer, so you can use the connection's fields for unit testing. The connection's values are set to an environment variable named `PRISMATIC_CONNECTION_VALUE`, which can be used by a subsequent command. In this example, we use `printenv` to print the environment variable, and pipe the result into [jq](https://stedolan.github.io/jq/) for pretty printing: ``` prism components:dev:run \ --integrationId SW50ZEXAMPLE \ --connectionKey "Dropbox Connection" -- printenv PRISMATIC_CONNECTION_VALUE | jq { "token": { "access_token": "sl.EXAMPLE", "token_type": "bearer", "expires_in": 14400, "expires_at": "2022-10-13T20:09:53.739Z", "refresh_token": "EXAMPLE" }, "context": { "code": "sU4pEXAMPLE", "state": "SW5zdEXAMPLE" }, "fields": { "clientId": "EXAMPLE", "clientSecret": "EXAMPLE" } } ``` Note that the command you want to run with the environment variable should follow a `--`. 
Within your unit test code, you can use `harness.connectionValue()`, which pulls in the `PRISMATIC_CONNECTION_VALUE` environment variable into a connection that you can use for tests: Use PRISMATIC\_CONNECTION\_VALUE for a Jest test ``` import { createHarness } from "@prismatic-io/spectral/dist/testing"; import { oauthConnection } from "./connections"; import component from "."; // Initialize a testing harness const harness = createHarness(component); // Parse the OAuth 2.0 connection from the PRISMATIC_CONNECTION_VALUE environment variable const parsedConnection = harness.connectionValue(oauthConnection); describe("listFolder", () => { test("listRootFolder", async () => { const result = await harness.action("listFolder", { dropboxConnection: parsedConnection, // Pass in our connection path: "/", }); const files = result["data"]["result"]["entries"]; // Verify a folder named "Public" exists in the response expect(files).toEqual( expect.arrayContaining([expect.objectContaining({ name: "Public" })]), ); }); }); ``` From your component, you can then run: ``` prism components:dev:run \ --integrationId SW50ZEXAMPLE \ --connectionKey "Dropbox Connection" -- npm run jest ``` ##### Run a test of an action from the command line[​](#run-a-test-of-an-action-from-the-command-line "Direct link to Run a test of an action from the command line") The `prism components:dev:test` command allows you to test an action quickly from the command line in the real integration runner environment. * Run `prism components:dev:test` from your component's root directory. * You will be prompted to select an action to test. Select one. * For each input of the action, supply a value * If your action requires a connection, you will be prompted for values for that connection (username, password, client\_id, etc). * If your action requires an OAuth 2.0 connection, a web browser will open to handle the OAuth flow. Once all inputs are entered, your action will run in the integration runner, and you will see logs from your action. ###### Test run environment files[​](#test-run-environment-files "Direct link to Test run environment files") You do not need to enter the same inputs each time you want to run a test of your action. To set some values for your test inputs, create a new file called `.env` in the same directory where you're invoking `prism` and enter your inputs and values as key/value pairs. For example, if you plan to leave `cursor` and `limit` inputs blank, set `path` to `/`, and you have an OAuth client ID and secret that you want to use each time, your `.env` file can look like this: ``` CURSOR= LIMIT= PATH=/ CLIENT_ID=xlexample CLIENT_SECRET=4yexample ``` --- #### Webhooks #### Handle webhooks in custom components[​](#handle-webhooks-in-custom-components "Direct link to Handle webhooks in custom components") If the API you're building a custom component for supports event-driven notifications ([webhooks](https://prismatic.io/docs/integrations/triggers/webhook.md)), it's helpful to include actions that subscribe an instance's flow to a webhook. Generally, we've found that four webhook-related actions are helpful to have: 1. **List Webhooks**. This action lists all webhooks that are configured in the third-party app. We recommend only displaying webhooks that are pointed at the current instance's flows (otherwise, you'll see webhooks that are configured for other applications). 
Check out the example in our [GitHub examples repo](https://github.com/prismatic-io/examples/blob/2e45a9aa2af25ddb511df364cb29df057367afd5/components/asana/src/actions/webhooks.ts#L55-L73) which demonstrates how to reference `context.webhookUrls` to filter webhooks down to only ones that match your current instance. 2. **Create Webhook** This action takes an event (or list of events, like `contact.create`) and a URL, which can be a webhook URL of a sibling flow. If an ID of a webhook is returned, this action can return that ID. See our [examples repo](https://github.com/prismatic-io/examples/blob/2e45a9aa2af25ddb511df364cb29df057367afd5/components/asana/src/actions/webhooks.ts#L115-L143) for an example of creating a webhook. 3. **Delete Webhook by ID** This action can take an ID of a webhook (fetched by the **List Webhooks** action), and delete that webhook by ID. 4. **Delete Instance Webhooks** This is a handy action to include, as it fetches a list of all webhooks in the third-party application, filters them down to only webhooks pointed at the current instance, and removes just those webhooks. You can leverage `context.webhookUrls` to determine which webhooks to delete. Having this logic baked in to a single action can reduce complexity on your component's users. Check out our [examples repo](https://github.com/prismatic-io/examples/blob/2e45a9aa2af25ddb511df364cb29df057367afd5/components/asana/src/actions/webhooks.ts#L176-L204) for an example of a **Delete Instance Webhooks** action. The above actions assume that an integration builder will create two flows: one that creates webhooks on instance deploy and one that removes webhooks on instance delete. If you'd like to simplify that process for users of your component, you can bake webhook logic into a custom trigger's `onInstanceDeploy` and `onInstanceDelete` functions. See our [examples repository](https://github.com/prismatic-io/examples/blob/2e45a9aa2af25ddb511df364cb29df057367afd5/components/asana/src/triggers/eventTriggers.ts#L115-L152) for an example of how to create and delete webhooks automatically when an instance is created or deleted. --- ### Embed Prismatic #### Embedded Overview #### Overview[​](#overview "Direct link to Overview") You can embed Prismatic's integration marketplace and workflow builder directly into your frontend application, allowing users to deploy and manage integrations. You can customize the appearance of the embedded components to seamlessly blend with your application's design, providing a consistent user experience. Embedding involves three key steps: 1. [Install](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md) the embedded SDK into your application. 2. [Authenticate](https://prismatic.io/docs/embed/authenticate-users.md) embedded users through your existing authentication system (eliminating the need for separate Prismatic credentials). 3. Customize the marketplace or workflow builder interface to match your application's design system. Prismatic is displayed within your application as an iframe, enabling authenticated users to interact with the marketplace and workflow builder as if they were part of your application. 
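Taken together, those three steps require only a small amount of frontend code. Below is a minimal sketch of the first two steps, assuming a hypothetical `/api/prismatic-auth` endpoint on your backend that returns a signed JWT for the current user (the `init` and `authenticate` functions are covered in detail later in this section):

```
import prismatic from "@prismatic-io/embedded";

// Initialize the embedded client once when your app loads
prismatic.init();

// Fetch a signed JWT for the current user from your backend,
// then authenticate them with Prismatic
const authenticateWithPrismatic = async () => {
  const response = await fetch("/api/prismatic-auth"); // Hypothetical backend endpoint
  const { token } = await response.json();
  await prismatic.authenticate({ token });
};
```

With the user authenticated, you can render the marketplace, workflow builder, or the additional screens described below into your application.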
![Example of embedded integration marketplace](/docs/img/embed/overview/acme-saas-example.png) --- #### Embedding Additional Screens In addition to the embedded workflow builder and marketplace, you can also embed: * The **customer dashboard**, giving your customers a one-stop shop for managing integrations, instances, reusable connections and workflows, and monitoring executions and logs. * **Component screens**, allowing customers to view all components or a specific component's details. * **Logs**, providing customers with access to logs for all of their instances and workflows. * **Connections**, enabling customers to manage their reusable connections in one place. #### Showing the customer dashboard[​](#showing-the-customer-dashboard "Direct link to Showing the customer dashboard") To provide your customer with a comprehensive view of their integrations, workflows, instances, etc., you can display the customer dashboard. Showing the customer dashboard ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "dashboard-div"; function Dashboard() { useEffect(() => { prismatic.showDashboard({ selector: `#${id}` }); }, []); return <div id={id}>Loading...</div>; } export default Dashboard; ``` ![Open the customer dashboard in embedded](/docs/img/embed/additional-screens/dashboard.png) ##### Hiding tabs in the dashboard[​](#hiding-tabs-in-the-dashboard "Direct link to Hiding tabs in the dashboard") By default, all tabs are displayed to customers when `showDashboard()` is invoked. To hide specific tabs, include the `screenConfiguration.dashboard.hideTabs` parameter when calling `showDashboard()`: Hiding tabs in the customer dashboard ``` prismatic.showDashboard({ selector: `#${id}`, theme: "LIGHT", screenConfiguration: { dashboard: { hideTabs: ["Attachments"], // Hide the Attachments tab }, }, }); ``` #### Showing the connection screen[​](#showing-the-connection-screen "Direct link to Showing the connection screen") Customers can manage all of their reusable connections in one place when you embed the connections screen (requires `@prismatic-io/embedded@4.2.0` or later). Showing the connections screen ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "connections-div"; function Connections() { useEffect(() => { prismatic.showConnections({ selector: `#${id}` }); }, []); return <div id={id}>Loading...</div>; } export default Connections; ``` ![Embedded connections screen](/docs/img/embed/additional-screens/connections.png) Clicking on a connection from the listview will open the connection detail screen. From there, users can view the instances and workflows that use that connection, as well as edit or delete the connection. #### Showing component screens[​](#showing-component-screens "Direct link to Showing component screens") You can embed component screens in your application using the `prismatic.showComponents` and `prismatic.showComponent` functions. ##### Showing all components[​](#showing-all-components "Direct link to Showing all components") To display all components available in your organization, use the `showComponents()` function: Showing all components ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "my-embedded-builder-div"; function ComponentListView() { useEffect(() => { prismatic.showComponents({ selector: `#${id}` }); }, []); return <div id={id}>Loading...</div>
; } export default ComponentListView; ``` ![Embedded components listview screen](/docs/img/embed/additional-screens/components.png) ##### Showing a specific component[​](#showing-a-specific-component "Direct link to Showing a specific component") To show details about a specific component, use the `showComponent()` function and provide the component's ID. You will need to get the component ID using the `prismatic.graphqlRequest` function: Showing a specific component ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "my-embedded-builder-div"; function DropboxComponent() { useEffect(() => { const showDropboxComponent = async () => { if (authenticated) { // "authenticated" represents your app's Prismatic auth state const query = `query getComponentByKey($componentKey: String!) { components(key: $componentKey) { nodes { id } } }`; const result = await prismatic.graphqlRequest({ query, variables: { componentKey: "dropbox" }, }); prismatic.showComponent({ selector: `#${id}`, theme: "LIGHT", componentId: result.data.components.nodes[0].id, }); } }; showDropboxComponent(); }, []); return <div id={id}>Loading...</div>
; } export default DropboxComponent; ``` ![Embedded component detail screen](/docs/img/embed/additional-screens/component.png) Additional documentation on querying the Prismatic API as a customer user from the embedded SDK is available in the [Embedded API Requests](https://prismatic.io/docs/embed/embedded-api-requests.md) article. #### Showing logs[​](#showing-logs "Direct link to Showing logs") The `prismatic.showLogs` function presents customer users with the logs listview page. This allows them to view logs from all their instances in one location. It is the same view you would see as an organization team member when opening a customer's logs tab. Showing all instance logs ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "logs-div"; function AllInstanceLogs() { useEffect(() => { prismatic.showLogs({ selector: `#${id}` }); }, []); return <div id={id}>Loading...</div>
; } export default AllInstanceLogs; ``` ![Embedded logs screen](/docs/img/embed/additional-screens/logs.png) --- #### Authenticating Embedded Users #### Authenticating users[​](#authenticating-users "Direct link to Authenticating users") One advantage of embedding the Prismatic marketplace or workflow builder is that users don't need to remember an additional set of credentials. They can log in to your application, and you can provide them with an authentication token that allows them to interact with Prismatic. You do this by generating a JSON Web Token (JWT) that is signed by a unique private key that you get from Prismatic. The JWT contains information about the user from your system. When Prismatic is presented with a signed JWT, the JWT signature is verified and the user is granted a [customer role](https://prismatic.io/docs/customers/customer-users.md#customer-user-roles) of **Marketplace**, which allows them to deploy integrations for their specific customer in Prismatic. #### JWT signing keys[​](#jwt-signing-keys "Direct link to JWT signing keys") Before you can generate a JWT, you will need a valid **signing key** from Prismatic. Within Prismatic, click on your organization name on the bottom of the left-hand sidebar, and then open the **Embedded** tab. Click the **+ Add signing Key** button. *Note*: You must be an [owner or admin](https://prismatic.io/docs/configure-prismatic/organization-users.md#organization-team-member-roles) to create a signing key. You will be presented with a private signing key. Store this key somewhere safe - it's the key you'll use to sign JWTs for users in your application. ![Get signing key in Prismatic app](/docs/img/embed/authenticate-users/example-private-key.png) Private keys are not stored in Prismatic Prismatic does not store the private signing key that is generated. Instead, we only save the last 8 characters so you can easily match up a private key you have with one in our system. We store the corresponding public key to verify signatures of JWTs you send. Save the private key that you generate somewhere safe. If it's ever compromised or you lose it, you can deactivate old keys and generate a new one. ##### Importing your own private signing key[​](#importing-your-own-private-signing-key "Direct link to Importing your own private signing key") You can also import your own private signing key for embedded authentication. The OpenSSL CLI tool is most commonly used for generating public/private key pairs yourself: ``` # Generate a private key with 4096 bit encryption openssl genrsa -out my-private-key.pem 4096 # Generate the corresponding public key openssl rsa -in my-private-key.pem -pubout > my-public-key.pub ``` This will generate two files - a private key called `my-private-key.pem` and a public key called `my-public-key.pub`. Your public key will look like this: ``` -----BEGIN PUBLIC KEY----- EXAMPLE -----END PUBLIC KEY----- ``` Import the public key using the [prism CLI tool](https://prismatic.io/docs/cli/prism.md#organizationsigning-keysimport): ``` prism organization:signing-keys:import -p my-public-key.pub ``` #### Create and sign a JWT[​](#create-and-sign-a-jwt "Direct link to Create and sign a JWT") Now that you have a signing key, you can create and sign a JSON web token (JWT). Your backend API (not your frontend) should generate a JWT for your users. Your frontend client should request this JWT from your backend API. Do your JWT generation on the backend Generate the JWT tokens on the backend. 
Baking JWT generation (including the signing key) into your frontend presents a security problem - someone with the signing key could sign their own JWT and pretend to be any user. Most programming languages offer JWT libraries for generating tokens - see [jwt.io](https://jwt.io/). The JWT that you generate for a user should have the following properties: * The **header** should indicate that it's signed with RSA SHA-256, and should read: ``` { "alg": "RS256", "typ": "JWT" } ``` * The **private signing key** you generated * A **payload** containing the following fields: * A unique user ID `sub`. This should generally be a UUID. * An optional `external_id` sets the external ID of the *user* in Prismatic. This typically matches the `sub` and represents the ID of the user in your system. * The user's `name` (*optional*) * Your `organization` ID, found on the **Embedded** tab where you generate signing certificates * The [external ID](https://prismatic.io/docs/customers/managing-customers.md#customer-external-ids) of the `customer` the user belongs to * An optional `customer_name`. If present, and if a `customer` with the given external ID does not exist, a customer record will automatically be created. If a customer is found with the provided external ID, this property will be ignored. * A signing time (current time) `iat`. This must be a number representing the current Unix timestamp. * A `role` is only required for [user level configuration](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md) (ULC). A role of `"admin"` allows the user to deploy ULC instances. A role of `"user"` allows the user to supply user configuration to an already deployed ULC instance. The role defaults to `"admin"`. * An expiration time `exp`. This must be a number representing the Unix timestamp when the token expires. Example JWT Payload ``` { "sub": "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1", "external_id": "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1", "name": "Phil Embedmonson", "organization": "T3JnYW5pemF0aW9uOmU5ZGVhZDU5LWU3YzktNDNkMi1hNjhhLWFhMjcyMzEyMTAxNw==", "customer": "abc-123", "customer_name": "Hooli", "role": "admin", "iat": 1631676917, "exp": 1631680517 } ``` Use unique identifiers as JWT subjects The `sub` (subject) within the JWT identifies the user who is logged in to your system. The `sub` value can be any unique identifier - a UUID, email address, etc. A customer user with that identifier will be created in Prismatic if it doesn't already exist, and will be granted permissions to configure and deploy instances to the customer they're assigned to (you assign the user to a customer in the examples below). Here are a couple of code snippets for JavaScript and Python that would create a valid JWT to authenticate a user in Prismatic: * JavaScript Example * Python Example * .Net (C#) Example ``` import jsonwebtoken from "jsonwebtoken"; /* This is for illustrative purposes only; Obviously don't hard-code a signing key in your code. 
*/ const signingKey = `-----BEGIN PRIVATE KEY----- MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDP3+OrT0IXqCu4 EXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEE c5R7QVzxgmGRXjPZGPf5huA1 -----END PRIVATE KEY-----`; const currentTime = Math.floor(Date.now() / 1000); const token = jsonwebtoken.sign( { sub: "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1", // Some unique identifier for the user external_id: "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1", // Generally matches sub name: "Phil Embedmonson", // Optional organization: "T3JnYW5pemF0aW9uOmU5ZGVhZDU5LWU3YzktNDNkMi1hNjhhLWFhMjcyMzEyMTAxNw==", customer: "abc-123", // This is an external ID of a customer customer_name: "Hooli", // The optional name to use if we need to create a new customer record iat: currentTime, exp: currentTime + 60 * 60, // 1 hour from now }, signingKey, // Store this somewhere safe { algorithm: "RS256" }, ); ``` ``` import jwt import math from time import time # This is for illustrative purposes only; # Obviously don't hard-code a signing key in your code. signing_key = '''-----BEGIN PRIVATE KEY----- MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDP3+OrT0IXqCu4 EXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEE c5R7QVzxgmGRXjPZGPf5huA1 -----END PRIVATE KEY-----''' current_time = math.floor(time()) token = jwt.encode( { "sub": "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1", # Some unique identifier for the user "external_id": "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1", # Generally matches sub "name": "Phil Embedmonson", # Optional "organization": "T3JnYW5pemF0aW9uOmU5ZGVhZDU5LWU3YzktNDNkMi1hNjhhLWFhMjcyMzEyMTAxNw==", "customer": "abc-123", # This is an external ID of a customer "customer_name": "Hooli", # The optional name to use if we need to create a new customer record "iat": current_time, "exp": current_time + 60 * 60, # 1 hour from now }, signing_key, algorithm="RS256") ``` ``` using Microsoft.IdentityModel.Tokens; using System.Security.Cryptography; using System.IdentityModel.Tokens.Jwt; using System.Security.Claims; /* This is for illustrative purposes only; Obviously don't hard-code a signing key in your code. 
*/ var pem = @"-----BEGIN PRIVATE KEY----- MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDP3+OrT0IXqCu4 EXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEEXAMPLEE c5R7QVzxgmGRXjPZGPf5huA1 -----END PRIVATE KEY-----"; Task<string> GetToken() { using var rsa = RSA.Create(); rsa.ImportFromPem(pem); var descriptor = new SecurityTokenDescriptor(); descriptor.SigningCredentials = new SigningCredentials(new RsaSecurityKey(rsa), SecurityAlgorithms.RsaSha256) { CryptoProviderFactory = new CryptoProviderFactory { CacheSignatureProviders = false } }; var claims = new List<Claim> { new Claim("sub", "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1"), // Some unique identifier for the user new Claim("external_id", "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1"), // Generally matches sub new Claim("name", "Phil Embedmonson"), // Optional new Claim("organization", "T3JnYW5pemF0aW9uOmU5ZGVhZDU5LWU3YzktNDNkMi1hNjhhLWFhMjcyMzEyMTAxNw=="), new Claim("customer", "abc-123"), // This is an external ID of a customer new Claim("customer_name", "Hooli") // The optional name to use if we need to create a new customer record }; descriptor.Subject = new ClaimsIdentity(claims); descriptor.IssuedAt = DateTime.UtcNow; descriptor.Expires = DateTime.UtcNow.AddHours(1); // Expire 1 hour from now var token = new JwtSecurityTokenHandler().CreateEncodedJwt(descriptor); return Task.FromResult(token); } var token = await GetToken(); ``` An example NextJS implementation of JWT generation is available in [GitHub](https://github.com/prismatic-io/embedded/blob/main/example-embedded-app/pages/api/prismatic-auth.tsx). #### Use the JWT to authenticate the user[​](#use-the-jwt-to-authenticate-the-user "Direct link to Use the JWT to authenticate the user") Now that a user in your application has a signed JWT from the backend, you can authenticate them with the Prismatic library using the `prismatic.authenticate()` function in your frontend application: ``` // Some function that fetches the JWT from your API: const token = getJwtToken(); try { await prismatic.authenticate({ token }); } catch (error) { console.error(`Authentication failed with error ${error}`); } ``` If the customer or organization ID in your JWT is incorrect, if your JWT is not signed correctly, or if the JWT is expired, `prismatic.authenticate()` will throw an error. For an example React hook that wraps the `prismatic.authenticate()` function, see [GitHub](https://github.com/prismatic-io/embedded/blob/main/example-embedded-app/src/usePrismaticAuth.ts#L66). #### Refreshing an embedded JWT[​](#refreshing-an-embedded-jwt "Direct link to Refreshing an embedded JWT") If a customer user's JWT expires, the customer user will see a 404 in their embedded iframe. To reauthenticate a user prior to expiration, ensure that your frontend app fetches a new token for your user and then run `prismatic.authenticate({ token })` with the new token. Existing iframes and the embedded client will be updated to use the new token. --- #### Custom Marketplace UI #### Why build a custom marketplace UI?[​](#why-build-a-custom-marketplace-ui "Direct link to Why build a custom marketplace UI?") The [embedded marketplace](https://prismatic.io/docs/embed/marketplace.md) enables you to embed Prismatic's marketplace as an [iframe](https://www.w3schools.com/tags/tag_iframe.ASP) in your application. [Theming](https://prismatic.io/docs/embed/theming.md) provides flexibility with the marketplace's appearance and allows you to match your application's branding colors and typography. But you may require further customization. 
With a custom marketplace UI, you can query Prismatic's API for available integrations and map them to native UI elements within your application. You can present integrations as material cards, in a table, as a listview - any format you prefer. ![Example integration marketplace UI](/docs/img/embed/custom-marketplace-ui/example.png) A full example implementation of a custom UI is available on [GitHub](https://github.com/prismatic-io/embedded/blob/main/example-embedded-app/pages/examples/custom-ui-elements.tsx). #### Querying for marketplace integrations[​](#querying-for-marketplace-integrations "Direct link to Querying for marketplace integrations") The [marketplaceIntegrations](https://prismatic.io/docs/api/schema/query/marketplaceIntegrations.md) query returns an array of integrations available in the marketplace. Since Prismatic's API is GraphQL-based, you can query for more or fewer fields, but this query (and its corresponding TypeScript types) contains the information needed to display a marketplace: * GraphQL Query * TypeScript Types GraphQL query to fetch marketplace integrations ``` query getMarketplaceIntegrations { marketplaceIntegrations(includeActiveIntegrations: true) { nodes { id name allowMultipleMarketplaceInstances avatarUrl category description isCustomerDeployable marketplaceConfiguration overview versionNumber firstDeployedInstance { id } deployedInstances deploymentStatus } } } ``` Corresponding TypeScript types ``` interface MarketplaceIntegration { id: string; name: string; allowMultipleMarketplaceInstances: boolean; avatarUrl?: string; category: string; description: string; isCustomerDeployable: boolean; marketplaceConfiguration: string; overview: string; versionNumber: number; firstDeployedInstance?: { id: string }; deployedInstances: "ZERO" | "ONE" | "MULTIPLE"; deploymentStatus?: "ACTIVATED" | "PAUSED" | "UNCONFIGURED"; } type MarketplaceIntegrationsResponse = { data: { marketplaceIntegrations: { nodes: MarketplaceIntegration[]; }; }; }; ``` Key points about this query: * The `includeActiveIntegrations` parameter is optional. Include it if you have customer-specific integrations deployed to only a subset of customers and not added to your marketplace. This will display those integrations to customers who have instances of them. * The `firstDeployedInstance`, `deployedInstances`, and `deploymentStatus` properties are used for performance optimization - they're more efficient than including an `instances { nodes { } }` connection in the query. * `firstDeployedInstance` represents the first instance of this integration deployed to the customer making the query (if any). * `deployedInstances` will have one of three values - `"ZERO"`, `"ONE"` or `"MULTIPLE"` - representing the number of instances this customer has deployed. * `deploymentStatus` will have one of four values - `"ACTIVATED"`, `"PAUSED"`, `"UNCONFIGURED"` or `null` - depending on the state of the instance the customer has deployed (if any). To execute the query against Prismatic, you can either use the embedded package's `prismatic.graphqlRequest` function or leverage a GraphQL client like [graphql-request](https://www.npmjs.com/package/graphql-request). See [Embedded API Requests](https://prismatic.io/docs/embed/embedded-api-requests.md) for more details. 
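To see what that looks like in practice, here's a minimal sketch that runs the query above with `prismatic.graphqlRequest` and returns the integration nodes so your application can map them to its own UI elements (this assumes the customer user has already been authenticated with the embedded SDK):

```
import prismatic from "@prismatic-io/embedded";

// Fetch marketplace integrations visible to the authenticated customer user
const listMarketplaceIntegrations = async () => {
  const query = `query getMarketplaceIntegrations {
    marketplaceIntegrations(includeActiveIntegrations: true) {
      nodes {
        id
        name
        category
        description
        avatarUrl
        deploymentStatus
      }
    }
  }`;
  const result = await prismatic.graphqlRequest({ query });
  // Map these nodes to cards, table rows, or whatever UI your app uses
  return result.data.marketplaceIntegrations.nodes;
};
```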
#### Displaying integration avatars[​](#displaying-integration-avatars "Direct link to Displaying integration avatars") Each integration returned from the query above has an optional `avatarUrl` property, which you can use to fetch the icon associated with your integration. Your `avatarUrl` will look something like `/media/UUID/Integration/UUID/EnumMeta.AVATAR/UUID.png`. With an `avatarUrl`, make an authenticated API call to Prismatic with your URL as a relative path (e.g. `https://app.prismatic.io/media/UUID/Integration/UUID/EnumMeta.AVATAR/UUID.png`). Ensure your request is authenticated with your embedded JWT as a header: `Authorization: Bearer ${TOKEN}`. This request will return JSON data in this format: ``` { "data": { "url": "https://s3.us-west-2.amazonaws.com/PRESIGNED-URL" } } ``` You can then set that presigned URL as your image's `src` property. Here's an example ReactJS implementation that displays an integration's avatar icon (if available) or defaults to a generic icon if not: Example of displaying integration avatars using ReactJS ``` function PrismaticAvatar({ avatarUrl, token }) { const [src, setSrc] = React.useState(""); useEffect(() => { let mounted = true; if (avatarUrl) { // Fetch the presigned URL from // https://app.prismatic.io/media/UUID/Integration/UUID/EnumMeta.AVATAR/UUID.png fetch(`https://app.prismatic.io${avatarUrl}`, { headers: { Authorization: `Bearer ${token}` }, }).then((response) => { response.json().then((data) => { if (mounted) { setSrc(data.url); } }); }); } return () => { mounted = false; }; }, []); // If the integration has no avatar URL, display a generic avatar icon (placeholder path below) // Otherwise, display an avatar with the presigned URL we fetched return src ? ( <img src={src} alt="Integration avatar" /> ) : ( <img src="/img/generic-integration-avatar.png" alt="Generic integration avatar" /> ); } ``` #### Opening integration configuration windows[​](#opening-integration-configuration-windows "Direct link to Opening integration configuration windows") After mapping marketplace integrations to UI elements in your application, you'll want to make them interactive so users can open the configuration wizard modal and deploy instances. While you can build the config wizard from scratch and set config variables programmatically, we recommend leveraging Prismatic's implementation of the config wizard. Config wizards are complex! To open a config wizard, invoke `prismatic.configureInstance()` with your integration's name (or an existing instance's ID). See [Embedding Marketplace](https://prismatic.io/docs/embed/marketplace.md#configure-a-specific-integration) for more details. --- #### Embedded API Requests The embedded SDK enables you to embed the marketplace and workflow builder into your application. But you may want to query and display additional data from the Prismatic API. By leveraging the Prismatic API, you can map API data into custom UI components and fully customize your customers' integration management and deployment experience. This article details how to fetch data from the Prismatic API using the embedded SDK. For information on installing the embedded SDK and authenticating customer users, see [Installing Embedded SDK](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md). For an example of querying the Prismatic API to implement a custom marketplace UI, see [GitHub](https://github.com/prismatic-io/embedded/blob/main/example-embedded-app/pages/examples/custom-ui-elements.tsx). #### The Prismatic GraphQL API[​](#the-prismatic-graphql-api "Direct link to The Prismatic GraphQL API") Prismatic's API is built on GraphQL. 
Comprehensive API documentation is available in the [API docs](https://prismatic.io/docs/api.md), including [example queries and mutations](https://prismatic.io/docs/api/common-queries/creating-instances.md). You can test the Prismatic API using our [GraphiQL explorer tool](https://prismatic.io/docs/explorer). #### Embedded user scope and API permissions[​](#embedded-user-scope-and-api-permissions "Direct link to Embedded user scope and API permissions") When you authenticate a customer user through the embedded SDK, that user is associated with one of your customers in Prismatic. The embedded customer user has access permissions limited to resources available to that specific customer (i.e., they cannot access other customers' integrations, instances, custom components, users, etc.). For example, they can execute this query to retrieve instances deployed to their customer: ``` { authenticatedUser { customer { instances { nodes { id name } } } } } ``` #### Example embedded API requests[​](#example-embedded-api-requests "Direct link to Example embedded API requests") The following examples demonstrate common use cases for embedded API requests. All examples utilize `prismatic.graphqlRequest`, which accepts two parameters: * `query`: A string representing the GraphQL query or mutation to execute against the Prismatic API. * `variables`: An object containing key-value pairs of variables to include with the query or mutation. ##### Listing deployed instances[​](#listing-deployed-instances "Direct link to Listing deployed instances") This example retrieves a list of instances from the API and renders them as `<li>` elements.
While simple, it demonstrates how to fetch arbitrary data from the API and render it using custom UI components. Fetch and display data from the API ``` import { Button, Typography } from "@mui/material"; import prismatic from "@prismatic-io/embedded"; import { useState } from "react"; /** * Get a list of instances deployed to the current user's customer */ const loadInstances = async (setInstances: Function) => { const query = `{ authenticatedUser { customer { instances { nodes { id name flowConfigs { nodes { id flow { name } apiKeys webhookUrl } } } } } } }`; const result = await prismatic.graphqlRequest({ query }); setInstances(result.data.authenticatedUser.customer.instances.nodes); }; interface FlowConfig { id: string; flow: { name: string }; webhookUrl: string; } interface Instance { id: string; name: string; flowConfigs: { nodes: FlowConfig[]; }; } function ListInstances() { const [instances, setInstances] = useState<Instance[]>([]); return ( <>
      <Typography>
        In this example, all instances that are currently deployed to the
        current user's customer are listed, along with each instance's webhook
        URLs.
      </Typography>
      <Button onClick={() => loadInstances(setInstances)}>Load Instances</Button>
      <ul>
        {instances.map((instance) => (
          <li key={instance.id}>
            {instance.name} ({instance.id})
            <ul>
              {instance.flowConfigs.nodes.map((flowConfig) => (
                <li key={flowConfig.id}>
                  {flowConfig.flow.name}: {flowConfig.webhookUrl}
                </li>
              ))}
            </ul>
          </li>
        ))}
      </ul>
    </>
); } export default ListInstances; ``` ##### Deploying an instance[​](#deploying-an-instance "Direct link to Deploying an instance") For instances with minimal configuration requirements (e.g., only a single OAuth flow), you may want to bypass the instance configuration wizard. This example presents a "Deploy Dropbox" button that, when clicked, retrieves the current user's customer ID and the Dropbox integration ID. It then deploys a Dropbox instance and opens a new window for the user to complete the OAuth flow. Deploy an instance without the config wizard ``` import { Button, Typography } from "@mui/material"; import prismatic from "@prismatic-io/embedded"; import { useState } from "react"; /** * Get the ID of the version of the Dropbox integration * that is available in the integration marketplace */ const getDropboxVersionId = async () => { const query = `query getMarketplaceIntegrations($name: String) { marketplaceIntegrations( name: $name sortBy: [{field: CATEGORY, direction: ASC}, {field: NAME, direction: ASC}] ) { nodes { id name versionSequence(first: 1, versionIsAvailable: true) { nodes { id versionNumber } } } } }`; const variables = { name: "Dropbox" }; const result = await prismatic.graphqlRequest({ query, variables }); return result.data.marketplaceIntegrations.nodes[0].versionSequence.nodes[0] .id; }; /** * Get the current user's customer ID */ const getCustomerId = async () => { const query = `{ authenticatedUser { customer { id } } }`; const result = await prismatic.graphqlRequest({ query }); return result.data.authenticatedUser.customer.id; }; interface CreateInstanceProps { dropboxVersionId: string; customerId: string; instanceName: string; } /** * Create a new instance of the Dropbox integration, returning the * OAuth authorize URL where the user should be sent */ const createInstance = async ({ dropboxVersionId, customerId, instanceName, }: CreateInstanceProps) => { const query = `mutation createDropboxInstance($customerId: ID!, $integrationId: ID!, $instanceName: String!) { createInstance(input: {customer: $customerId, integration: $integrationId, name: $instanceName}){ instance { id name configVariables { nodes { authorizeUrl } } flowConfigs { nodes { id flow { name } webhookUrl } } } } }`; const variables = { customerId, integrationId: dropboxVersionId, instanceName, }; const result = await prismatic.graphqlRequest({ query, variables }); return result; }; interface DeployInstanceProps { instanceId: string; } /** * Deploy the instance after configuration */ const deployInstance = async ({ instanceId }: DeployInstanceProps) => { const query = `mutation deployDropbox($instanceId: ID!){ deployInstance(input:{id:$instanceId}) { instance { lastDeployedAt } } }`; const variables = { instanceId }; await prismatic.graphqlRequest({ query, variables }); }; interface Instance { data?: { createInstance: { instance: { id: string }; }; }; } function DeployDropbox() { const [instance, setInstance] = useState<Instance>({}); return ( <>
      <Typography>
        In this example, an instance of an integration named Dropbox is
        created, the user is redirected to an OAuth screen, and the instance is
        then deployed.
      </Typography>
      <Typography>
        Note: this assumes that you have an integration in your marketplace
        called "Dropbox", and that the integration has only one config variable
        - the Dropbox connection.
      </Typography>
      <Button
        onClick={async () => {
          // Look up the marketplace Dropbox integration version and the
          // current user's customer
          const dropboxVersionId = await getDropboxVersionId();
          const customerId = await getCustomerId();
          // Create the instance (the instance name here is arbitrary)
          const result = await createInstance({
            dropboxVersionId,
            customerId,
            instanceName: "Dropbox",
          });
          setInstance(result);
          // Send the user to the OAuth screen for the Dropbox connection
          window.open(
            result.data.createInstance.instance.configVariables.nodes[0]
              .authorizeUrl,
          );
          // Deploy the newly created instance
          await deployInstance({
            instanceId: result.data.createInstance.instance.id,
          });
        }}
      >
        Deploy Dropbox
      </Button>
      <pre>{JSON.stringify(instance, null, 2)}</pre>
    </>
    ); } export default DeployDropbox; ``` --- #### Embedding Without SDK While we strongly recommend using the [Embedded SDK](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md) to embed the Prismatic marketplace and workflow builder, some frontend tech stacks prohibit Node.js module installation. If your frontend tech stack requires that you write custom HTML and JavaScript, you can still embed the marketplace. #### Generate a JWT[​](#generate-a-jwt "Direct link to Generate a JWT") Similar to using the SDK, you will need to generate a [JSON Web Token (JWT)](https://jwt.io/) for your user. See the [Authenticating Users](https://prismatic.io/docs/embed/authenticate-users.md) documentation for details on how to generate a JWT. You can generate a JWT using your backend server or any other secure environment - the JWT should *not* be generated in the client's browser. #### Authenticate the user[​](#authenticate-the-user "Direct link to Authenticate the user") Before displaying the iframe, you must call the authentication endpoint `https://app.prismatic.io/embedded/authenticate` with your customer's JWT. This will do two things: 1. Validate your JWT 2. Verify that the associated customer user exists (a user will be created if one does not exist). #### Embed the marketplace[​](#embed-the-marketplace "Direct link to Embed the marketplace") The embedded marketplace is essentially an [iframe](https://www.w3schools.com/tags/tag_iframe.ASP) pointing to `https://app.prismatic.io/integration-marketplace/` with a few query parameters: * `jwt`: The JWT you generate (see [Authenticating Users](https://prismatic.io/docs/embed/authenticate-users.md)) * `embedded`: Must be set to `"true"` * `theme`: Can be `LIGHT` or `DARK` In this pure HTML/JS example, on page load we fetch a JWT for our user and then call the authentication endpoint. Once we receive a response, we set the `src` property of our iframe appropriately. ```
<!-- A minimal sketch of embedding without the SDK. fetchJwtForCurrentUser() is a
     placeholder - fetch the JWT for the signed-in user from your own backend. -->
<!DOCTYPE html>
<html>
  <head>
    <title>Pure HTML Embed Example</title>
  </head>
  <body>
    <iframe id="marketplace-iframe" width="100%" height="800" frameborder="0"></iframe>
    <script>
      async function embedMarketplace() {
        // Get a JWT for the current user from your backend (never mint it in the browser)
        const jwt = await fetchJwtForCurrentUser();
        // Validate the JWT and create the customer user if needed
        // (the JWT is assumed to be accepted as a bearer token here - adjust to your setup)
        await fetch("https://app.prismatic.io/embedded/authenticate", {
          method: "POST",
          headers: { Authorization: "Bearer " + jwt },
        });
        // Point the iframe at the embedded marketplace
        const params = new URLSearchParams({ jwt, embedded: "true", theme: "LIGHT" });
        document.getElementById("marketplace-iframe").src =
          "https://app.prismatic.io/integration-marketplace/?" + params.toString();
      }
      window.addEventListener("load", embedMarketplace);
    </script>
  </body>
</html>
``` --- #### Installing the Embedded SDK #### Installing Prismatic's SDK[​](#installing-prismatics-sdk "Direct link to Installing Prismatic's SDK") To embed Prismatic in your app, you must include Prismatic JavaScript code with your client application. You have two choices for incorporating this code: 1. **(Recommended)** Install the [@prismatic-io/embedded](https://www.npmjs.com/package/@prismatic-io/embedded) NPM package in your web application Node.js project. We strongly recommend installing the SDK via NPM for intellisense and TypeScript support. 2. Build the package from source and include it in your app through a `<script>` tag. Once the script has loaded, `prismatic.*` functions (described below) are available for use. You can also download the built `index.js` file from a JavaScript CDN like UNPKG: [unpkg.com/browse/@prismatic-io/embedded/](https://unpkg.com/browse/@prismatic-io/embedded/). #### Initialize the Prismatic client[​](#initialize-the-prismatic-client "Direct link to Initialize the Prismatic client") When Prismatic is embedded in your app, it's presented in an [iframe](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe) that by default points to Prismatic's application URL (`https://app.prismatic.io`). To initialize the Prismatic client, execute the `init` function: ``` prismatic.init(); ``` If you use a [custom white-label domain](https://prismatic.io/docs/configure-prismatic/custom-domains.md) (something like `https://integrations.my-example-company.com`), or if your Prismatic tenant is hosted in a different region (like EU or Australia), you can direct the embedded SDK to an alternative endpoint by specifying a `prismaticUrl`: Optional endpoint config for Prismatic's SDK ``` prismatic.init({ prismaticUrl: "https://integrations.my-example-company.com", }); // or prismatic.init({ prismaticUrl: "https://app.eu-west-1.prismatic.io", }); ``` For a complete list of public endpoints, see [Deployment Regions](https://prismatic.io/docs/configure-prismatic/deployment-regions.md#logging-in-to-additional-regions). #### Embedded screen configuration[​](#embedded-screen-configuration "Direct link to Embedded screen configuration") You can control how the embedded workflow builder, marketplace, dashboard and other pages appear in your app with these configuration options: * `usePopover`: A boolean that controls whether the screen should display as a popover. By default, the screen is embedded as an iframe into an existing DOM element. * `selector`: Specifies which DOM element to embed the iframe into when `usePopover` is false. * `theme`: Overrides the [custom theming](https://prismatic.io/docs/embed/theming.md) default behavior and can be set to `"LIGHT"` or `"DARK"` mode. --- #### Embedding Marketplace [Embed Prismatic Marketplace in 20 Minutes](https://player.vimeo.com/video/906041989) The video above demonstrates the process of embedding Prismatic in your application, starting from a blank React application. #### Embed the integration marketplace[​](#embed-the-integration-marketplace "Direct link to Embed the integration marketplace") You can embed the Integration Marketplace into your application with minimal code, providing your customers with a native integration deployment experience. This article details the implementation of Prismatic's integration marketplace within your application.
For information on installing the embedded SDK and authenticating customer users, see [Installing Embedded SDK](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md). Once a user has been authenticated, you can display the integration marketplace using the `prismatic.showMarketplace()` function. The `showMarketplace()` function accepts three optional parameters: * `usePopover`: Determines whether the marketplace should display as a popover. By default, the screen is embedded as an iframe into an existing DOM element. * `selector`: Specifies the DOM element to embed the iframe into when `usePopover` is false. * `theme`: Overrides [custom theming](https://prismatic.io/docs/embed/theming.md) default behavior, accepting either `"LIGHT"` or `"DARK"` mode. ##### iframe embedding method[​](#iframe-embedding-method "Direct link to iframe embedding method") When `usePopover` is set to `false`, an [iframe](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe) HTML element is embedded into the element specified by the `selector` parameter. For example, if you have a section of your application that reads: ```
<div id="integration-marketplace-placeholder">Loading...</div>
    ``` You can embed the integration marketplace iframe into that div, replacing the "Loading..." text, by calling: ``` prismatic.showMarketplace({ selector: `#integration-marketplace-placeholder`, usePopover: false, }); ``` The result is that the integration marketplace appears as part of your application: ![Prismatic integration marketplace embedded in your app](/docs/img/embed/marketplace/acme-saas-example.png) Theming At this point, the marketplace has not been themed to match your application. To implement custom branding, see [Theming Embedded Marketplace](https://prismatic.io/docs/embed/theming.md). For an example of the iframe embedding method, see [GitHub](https://github.com/prismatic-io/embedded/blob/main/example-embedded-app/pages/examples/embedded-marketplace.tsx). ##### Popover method[​](#popover-method "Direct link to Popover method") If you set `usePopover` to `true`, the integration marketplace will display in a popover: ``` prismatic.showMarketplace({ usePopover: true, }); ``` ![Enable popover for Prismatic embedded marketplace](/docs/img/embed/marketplace/popover.png) ##### Filtering integrations[​](#filtering-integrations "Direct link to Filtering integrations") You can filter the displayed integrations by applying `filters` to your `showMarketplace()` call. ###### Simple integration filters[​](#simple-integration-filters "Direct link to Simple integration filters") You can filter integrations by a single [category](https://prismatic.io/docs/integrations/low-code-integration-designer.md#categorizing-integrations) or [label](https://prismatic.io/docs/integrations/low-code-integration-designer.md#assigning-labels-to-an-integration) (or both). If both filters are applied, integrations must match both `category` and `label` to appear. For example, to show only integrations with the category "ERP" and the label "enterprise" in an embedded marketplace, use: Filter by category and label ``` prismatic.showMarketplace({ selector: `#my-div-id`, usePopover: false, filters: { marketplace: { category: "ERP", label: "enterprise", }, }, }); ``` ###### Advanced integration filters[​](#advanced-integration-filters "Direct link to Advanced integration filters") You can use more sophisticated filtering logic using a `filterQuery`. You can filter integrations based on their `name`, `labels`, or `category`. The following operators are available: * `TermOperator.equal`: Returns integrations where the first term equals the second term, works with `name` and `category`. Example: To show only the "Dropbox" integration: ``` prismatic.showMarketplace({ filters: { marketplace: { filterQuery: [TermOperator.equal, "name", "Dropbox"], }, }, }); ``` * `TermOperator.notEqual`: Returns integrations where the first term does not equal the second term, works with `name` and `category`. Example: To show all integrations except those with the category "ERP": ``` prismatic.showMarketplace({ filters: { marketplace: { filterQuery: [TermOperator.notEqual, "category", "ERP"], }, }, }); ``` * `TermOperator.in`: Returns integrations whose `labels` include a specific label. Example: To show integrations with a "paid" label: ``` prismatic.showMarketplace({ filters: { marketplace: { filterQuery: [TermOperator.in, "labels", "paid"], }, }, }); ``` * `TermOperator.notIn`: Returns integrations that do not have specified labels. * `TermOperator.startsWith`: Returns integrations with names or categories that begin with a specific string. 
Example: To show all integrations starting with "Algolia": ``` prismatic.showMarketplace({ filters: { marketplace: { filterQuery: [TermOperator.startsWith, "name", "Algolia"], }, }, }); ``` This will match integrations with names like `Algolia - Dropbox` or `Algolia - SFTP`. * `TermOperator.doesNotStartWith`: Returns integrations with names or categories that do not start with a specific string. * `TermOperator.endsWith`: Returns integrations with names or categories that end with a specific string. * `TermOperator.doesNotEndWith`: Returns integrations with names or categories that do not end with a specific string. You can combine multiple conditions using `and` and `or` operators. For example, to show all integrations that have the category "ERP" and label "paid", plus the Dropbox and Slack integrations: ``` import prismatic, { BooleanOperator, TermOperator, } from "@prismatic-io/embedded"; prismatic.showMarketplace({ filters: { marketplace: { filterQuery: [ BooleanOperator.or, [ BooleanOperator.and, [TermOperator.equal, "category", "ERP"], [TermOperator.in, "labels", "paid"], ], [TermOperator.equal, "name", "Dropbox"], [TermOperator.equal, "name", "Slack"], ], }, }, }); ``` ##### Configure a specific integration[​](#configure-a-specific-integration "Direct link to Configure a specific integration") You can use custom UI elements (buttons, divs, hyperlinks, etc.) to start integration deployment. Call the `prismatic.configureInstance()` function from your UI element to display a configuration screen for a specific integration. Provide the integration name as `integrationName`. You can include other display parameters (like `usePopover` and `selector`) as described in the iframe embedding and popover methods above. You can also specify an optional `skipRedirectOnRemove` boolean parameter, which defaults to `false`. This determines whether users should be redirected to the integration listview upon removing an integration. If you're embedding the marketplace, you likely want to keep this value set to `false`. If you're implementing a custom API wrapper and handling integration display yourself, you might want to set this to `true`. For example, to display the "Salesforce" integration configuration when a button is clicked: ``` const deploySalesforce = () => { prismatic.configureInstance({ integrationName: "Salesforce", skipRedirectOnRemove: false, usePopover: true, }); }; <button onClick={deploySalesforce}>Configure Salesforce</button>; ``` For an example of implementing a custom marketplace UI and opening the configuration screen for a specific integration with `prismatic.configureInstance()`, see [GitHub](https://github.com/prismatic-io/embedded/blob/main/example-embedded-app/pages/examples/custom-ui-elements.tsx#L212-L219). ##### Listening to marketplace events[​](#listening-to-marketplace-events "Direct link to Listening to marketplace events") The embedded marketplace emits custom JavaScript [events](https://www.w3schools.com/js/js_events.asp) when certain things happen: * `INSTANCE_CREATED`: Emitted when an integration's configuration screen is opened for the first time. * `INSTANCE_CONFIGURATION_OPENED`: Emitted when an instance configuration screen is opened. * `INSTANCE_CONFIGURATION_LOADED`: Emitted when an instance configuration screen has loaded. This is the optimal event to hook into for [programmatically setting config variables](#dynamically-setting-config-variables-in-marketplace). * `INSTANCE_CONFIGURATION_PAGE_LOADED`: Emitted when an instance configuration screen transitions between pages.
* `INSTANCE_CONFIGURATION_CLOSED`: Emitted when an instance configuration screen is closed (regardless of whether configuration was completed). * `INSTANCE_DEPLOYED`: Emitted when an instance has been configured and enabled for a customer. * `INSTANCE_DELETED`: Emitted when an integration has been deactivated (the instance has been deleted). * `POPOVER_CLOSED`: Emitted when a user closes the popover modal (only applicable when `usePopover: true`). You can subscribe to these events to trigger additional actions in your application when instances are created, configured, deleted, etc. All events return data in this structure (except for `INSTANCE_CONFIGURATION_LOADED`, which includes additional data): ``` { "event": "INSTANCE_CONFIGURATION_OPENED", "data": { "customerId": "Q3VzdG9tZXI6OThiMjU3MDUtZmMzNC00NWYwLTk0ZDItODA0ZjFkYzEyYTZk", "customerName": "Smith Rockets", "instanceId": "SW5zdGFuY2U6YzJlYTliZjEtY2Y3MS00NTg1LTk2MjMtYjZhNDAxYjQyNWRm", "instanceName": "Salesforce", "integrationName": "Salesforce", "integrationVersionNumber": 18, "readOnly": false } } ``` In this example, we log the customer name and the integration they opened: ``` import { PrismaticMessageEvent } from "@prismatic-io/embedded"; window.addEventListener("message", (event) => { // Check if the message event is of type INSTANCE_CONFIGURATION_OPENED if ( event.data.event === PrismaticMessageEvent.INSTANCE_CONFIGURATION_OPENED ) { // Extract relevant data const { customerName, integrationName } = event.data.data; // Log the customer and integration name. Replace this with your desired data handling or helper functions console.log( `${customerName} opened the configuration page for ${integrationName}`, ); } }); ``` You can implement similar event listeners for instance configuration page closure, loading, creation, or deletion events. ##### User-level configuration marketplace events[​](#user-level-configuration-marketplace-events "Direct link to User-level configuration marketplace events") If [user-level configuration](https://prismatic.io/docs/integrations/config-wizard/user-level-configuration.md) is enabled for an integration, the following marketplace events will be emitted: * `USER_CONFIGURATION_OPENED`: Emitted when a ULC configuration screen is opened. * `USER_CONFIGURATION_LOADED`: Emitted when a ULC configuration screen has loaded. This is the best event to hook into for [programmatically setting config variables](#dynamically-setting-config-variables-in-marketplace). * `USER_CONFIGURATION_PAGE_LOADED`: Emitted when a ULC configuration screen transitions between pages. * `USER_CONFIGURATION_CLOSED`: Emitted when a ULC configuration screen is closed (regardless of whether configuration was completed). * `USER_CONFIGURATION_DEPLOYED`: Emitted when a ULC configuration is completed. * `USER_CONFIGURATION_DELETED`: Emitted when a ULC configuration is deleted. 
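You can handle these user-level events with the same `window` message listener pattern used for instance-level events above. Here is a minimal sketch, assuming the `PrismaticMessageEvent` enum exposes the ULC event names, that reacts when a user completes their user-level configuration:

Listen for a completed user-level configuration

```
import { PrismaticMessageEvent } from "@prismatic-io/embedded";

window.addEventListener("message", (event) => {
  // React when a customer user finishes their user-level configuration
  if (event.data.event === PrismaticMessageEvent.USER_CONFIGURATION_DEPLOYED) {
    const { customerName, integrationName, userName } = event.data.data;
    // Replace this with your own handling (analytics, onboarding state, etc.)
    console.log(
      `${userName} at ${customerName} completed user-level configuration for ${integrationName}`,
    );
  }
});
```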
For all of these events, the event's `data` property will contain: ``` { "event": "USER_CONFIGURATION_LOADED", "data": { "customerId": "Q3VzdG9tZXI6ZjYxMzY4MzItMzYxYi00NmI3LTlmMmItN2ZkNTc3YzY1YTg1", "customerName": "Smith Real Estate", "instanceId": "SW5zdGFuY2U6N2I1MjFiNzItYzllNS00YjkwLWIzZGEtZTY4OTY5OWU2ZjBl", "instanceName": "Dropbox", "integrationName": "Dropbox", "integrationVersionNumber": 9, "userConfigId": "VXNlckxldmVsQ29uZmlnOjI0NDU5MjA2LWQxNGUtNDRhMy04NWE4LTJjMzgwMzg2Y2NmZg==", "userEmail": "2E52B7CB-071B-4EA2-8E9D-F64910EBDBB1", "userId": "VXNlcjpmYjc4YzE1OS1kOWMwLTQxMDctYjIyNC0zYmNhMDFlOTQ5NzY=", "userLevelConfigVariables": { "Dropbox Connection": { "status": "ACTIVE" } }, "readOnly": false, "userName": "Phil Embedmonson" } } ``` #### Dynamically setting config variables in marketplace[​](#dynamically-setting-config-variables-in-marketplace "Direct link to Dynamically setting config variables in marketplace") consider integration-agnostic connections If you need to programmatically set connection values, consider implementing [integration-agnostic connections](https://prismatic.io/docs/integrations/connections.md#integration-agnostic-connections), which enable you to configure connections for each customer. You can also use the embedded SDK to set connection values at deployment time, as demonstrated below. You can programmatically set values for your customers' config variables by leveraging [marketplace events](#listening-to-marketplace-events). This is particularly useful when you have access to certain values (API keys, endpoints, etc.) and want to set config variables on behalf of your customers (eliminating the need for them to look up these values). When your application receives an `INSTANCE_CONFIGURATION_LOADED` event message (or `USER_CONFIGURATION_LOADED` for ULC), the message payload includes the properties listed above, plus current config variable values. You can analyze that event's data and conditionally set values for specific config variables using the `prismatic.setConfigVars()` function. Let's examine the event message structure and how to respond with config variable values: * Event Payload * NodeJS Example * React Example Here's the structure of the `INSTANCE_CONFIGURATION_LOADED` payload. 
The event message's `.data.configVars` property contains all currently set configuration variables and their values: Example INSTANCE\_CONFIGURATION\_LOADED event payload ``` { "event": "INSTANCE_CONFIGURATION_LOADED", "data": { "instanceId": "SW5zdGFuY2U6ZTE4NTNkYWItZjJhMi00OGIyLTk1ZWItODRjYzQ3YzRiMzc4", "instanceName": "Test Embedded config vars", "integrationName": "Test Embedded config vars", "integrationVersionNumber": 1, "customerId": "Q3VzdG9tZXI6OThiMjU3MDUtZmMzNC00NWYwLTk0ZDItODA0ZjFkYzEyYTZk", "customerName": "Smith Rockets", "readOnly": false, "configVars": { "Acme Connection": { "inputs": { "username": { "value": "" }, "password": { "value": "" } }, "status": "PENDING" }, "My Key/Value List": { "value": [], "collectionType": "keyvaluelist", "codeLanguage": null, "dataType": "string", "pickList": null, "scheduleType": null, "timeZone": null }, "My String": { "value": "", "collectionType": null, "codeLanguage": null, "dataType": "string", "pickList": null, "scheduleType": null, "timeZone": null }, "My List": { "value": [], "collectionType": "valuelist", "codeLanguage": null, "dataType": "string", "pickList": null, "scheduleType": null, "timeZone": null }, "My String With Default": { "value": "Some Default", "collectionType": null, "codeLanguage": null, "dataType": "string", "pickList": null, "scheduleType": null, "timeZone": null }, "My List With Default": { "value": ["Foo1", "Foo2"], "collectionType": "valuelist", "codeLanguage": null, "dataType": "string", "pickList": null, "scheduleType": null, "timeZone": null }, "My Picklist": { "value": "", "collectionType": null, "codeLanguage": null, "dataType": "picklist", "pickList": ["Foo", "Bar", "Baz"], "scheduleType": null, "timeZone": null }, "My Boolean": { "value": false, "collectionType": null, "codeLanguage": null, "dataType": "boolean", "pickList": null, "scheduleType": null, "timeZone": null } } } } ``` In this example, we implement a [message event listener](https://developer.mozilla.org/en-US/docs/Web/API/Window/message_event) to monitor for the `PrismaticMessageEvent.INSTANCE_CONFIGURATION_LOADED` event. If the event contains a config variable named `Acme Connection`, we assign a value to that variable. Additionally, if this is the `Salesforce` integration, we assign values to some example config variables. 
Dynamically set config variable values ``` import prismatic, { getMessageIframe, PrismaticMessageEvent, } from "@prismatic-io/embedded"; const myListener = (message: MessageEvent) => { // Extract event and data information from the message const { event, data } = message.data; // Verify this is an "INSTANCE_CONFIGURATION_LOADED" event and // that the config screen was not opened in read-only mode if ( event === PrismaticMessageEvent.INSTANCE_CONFIGURATION_LOADED && !data.readOnly ) { // Extract integration name and config variables const { integrationName, configVars } = data; // Identify the iframe that sent the message for response targeting const iframe = getMessageIframe(message); // Check if the instance has an "Acme Connection" config variable: if (Object.keys(configVars).includes("Acme Connection")) { // Verify if they're empty or already populated: if ( configVars["Acme Connection"].inputs.username === "" && configVars["Acme Connection"].inputs.password === "" && configVars["Acme Connection"].status === "PENDING" ) { // Set username and password fields for "Acme Connection" prismatic.setConfigVars({ iframe, // Target the iframe that sent the message configVars: { "Acme Connection": { inputs: { username: { value: "My-User" }, password: { value: "supersecretpassword" }, }, }, }, }); } } // Set config variables if this is the Salesforce integration if (integrationName === "Salesforce") { prismatic.setConfigVars({ iframe, configVars: { "String Config Var": { value: "Updated Value" }, // Update a simple string "String Valuelist": { value: ["Value 1", "Value 2"] }, // Update a value list of strings // Update a key-value list "String Keyvaluelist": { value: [ { key: "A Key", value: "A Value" }, { key: "B Key", value: "B Value" }, ], }, }, }); } } }; window.addEventListener("message", myListener); ``` Here's how to implement a similar JavaScript example using a React [useEffect hook](https://reactjs.org/docs/hooks-effect.html): setConfigVars Example in React ``` import React, { useEffect } from "react"; import prismatic, { getMessageIframe, PrismaticMessageEvent, } from "@prismatic-io/embedded"; import Loading from "../components/Loading"; const id = "embedded-marketplace-container"; const EmbeddedMarketplace = () => { useEffect(() => { prismatic.showMarketplace({ selector: `#${id}`, usePopover: false, theme: "LIGHT", }); }, []); useEffect(() => { const listener = (message: MessageEvent) => { const { event, data } = message.data; if ( event === PrismaticMessageEvent.INSTANCE_CONFIGURATION_LOADED && !data.readOnly ) { const iframe = getMessageIframe(message); if (data.integrationName === "Test Embedded config vars") { prismatic.setConfigVars({ iframe, configVars: { "Amazon S3 Connection": { inputs: { accessKeyId: { value: "supersecretpassword" }, secretAccessKey: { value: "My-User" }, }, }, }, }); } } }; window.addEventListener("message", listener); return () => { window.removeEventListener("message", listener); }; }, []); return (
    <div id={id}>
      <Loading />
    </div>
  ); }; export default EmbeddedMarketplace; ``` ##### The getMessageIframe helper function[​](#the-getmessageiframe-helper-function "Direct link to The getMessageIframe helper function") Your application needs to identify which iframe to send a `setConfigVars` message to. Your application might contain multiple iframes, potentially including multiple instances of the embedded marketplace on a single page. The `getMessageIframe` function helps identify the iframe that generated the `INSTANCE_CONFIGURATION_LOADED` event. If you need to send a `setConfigVars` message to a specific iframe, you can alternatively provide a `selector` property to the `setConfigVars` function (similar to `prismatic.showMarketplace()`): setConfigVars message by selector ``` prismatic.setConfigVars({ selector: "#my-marketplace-container", configVars: { "My Config Var Name": { value: "My config var value" }, }, }); ``` #### Hiding UI elements in marketplace[​](#hiding-ui-elements-in-marketplace "Direct link to Hiding UI elements in marketplace") You can optionally hide the **Back to Marketplace** link, specific tabs from the instance screen, and elements of the instance configuration wizard. This is useful for preventing customers from running tests from the **Test** tab or reconfiguring alert monitors independently. To disable specific UI elements, add a `screenConfiguration` block to your `prismatic.showMarketplace()` or `prismatic.configureInstance()` invocations: Hide the 'back' link and the 'Test' and 'Logs' tabs ``` prismatic.showMarketplace({ selector: `#${id}`, usePopover: false, theme: "LIGHT", screenConfiguration: { instance: { hideBackToMarketplace: true, hideTabs: ["Test", "Logs"], hidePauseButton: true, }, configurationWizard: { hideSidebar: true, isInModal: true, triggerDetailsConfiguration: "hidden", mode: "streamlined", // Hide the initial configuration page }, marketplace: { hideSearch: true, hideActiveIntegrationsFilter: true, }, }, }); ``` * `instance.hideBackToMarketplace`: Controls whether to hide the **Back to Marketplace** link in the instance configuration screen (defaults to `false`). * `instance.hideTabs`: Specifies which tabs to hide from the instance configuration screen. Available options are **Test**, **Executions**, or **Logs** tabs. No tabs are hidden by default. * `instance.hidePauseButton`: Controls whether a customer user can [pause or unpause](https://prismatic.io/docs/instances/managing.md#enabling-and-disabling-instances) an instance. * `configurationWizard.hideSidebar`: Hides the left-hand sidebar from the configuration wizard (config wizard page titles and numbers). * `configurationWizard.isInModal`: Determines whether the config wizard should appear as a modal overlay on the current page. When set to `true`, it assumes your embedded marketplace is already in a modal (preventing a modal-in-modal scenario) and opens the config wizard to fill its containing `<div>
    `. * `configurationWizard.triggerDetailsConfiguration`: Controls the display of trigger details in the config wizard. Options are `hidden` to completely hide trigger details, `default` to include the element in collapsed state, or `default-open` to include the element in expanded state. Endpoint details may also be hidden if they are not configured to be **Secured by Customer** (see [Endpoint Configuration](https://prismatic.io/docs/integrations/triggers/endpoint-configuration.md#securing-endpoints-with-api-keys)). * `configurationWizard.mode`: Can be set to `traditional` or `streamlined` and affects the [initial configuration page](https://prismatic.io/docs/embed/marketplace.md#configuration-wizard-customization). * `marketplace.hideSearch`: Hides the search bar in the marketplace. * `marketplace.hideActiveIntegrationsFilter`: Hides the top-right filter for active integrations in the marketplace. To apply these preferences globally (for multiple `showMarketplace()` and `configureInstance()` invocations), configure settings in `prismatic.init()`: Hide UI elements globally ``` prismatic.init({ screenConfiguration: { instance: { hideBackToMarketplace: true, hideTabs: ["Test", "Monitors"], }, }, }); ``` Screen configuration settings in `prismatic.init()` can be overridden within specific `showMarketplace()` or `configureInstance()` invocations. Note that `hideTabs` properties are not merged in this case - the `screenConfiguration` settings in `showMarketplace` / `configureInstance` completely override the default settings in `init`. #### Integration configuration detail screen[​](#integration-configuration-detail-screen "Direct link to Integration configuration detail screen") The embedded SDK configuration enables fine-tuning of the user experience for accessing the integration configuration details screen. It includes the ability to prevent marketplace users from accessing the detail screen, which is useful if you want to restrict access to the **Test**, **Executions**, **Monitors**, or **Logs** functionality. * `allow-details`: Redirects the marketplace user back to the marketplace listing after completing the configuration wizard. When a user selects an integration from the marketplace listing, they see a summary of the integration including its status. Users can manage instances from the overflow menu at the bottom. "View Details" opens the full integration details screen. You can [hide specific UI elements](#hiding-ui-elements-in-marketplace) on this page. This is the default behavior in SDK version 2.0.0 and later if no configuration is provided. * `disallow-details`: Provides the same experience as `allow-details` but removes the details option from the overflow menu. * `always-show-details`: Redirects the marketplace user to the full integration details screen after configuration or when selecting an instance from the marketplace listing. Inline configuration experience that allows marketplace users to access integration config details ``` prismatic.showMarketplace({ selector: `#my-div`, screenConfiguration: { marketplace: { configuration: "allow-details", }, }, }); ``` #### Configuration wizard customization[​](#configuration-wizard-customization "Direct link to Configuration wizard customization") When customers configure or reconfigure an integration in your marketplace, they will either see an initial configuration page for editing the instance name and viewing webhook URLs, or they will be taken directly to the first page of the configuration wizard. 
This behavior is controlled by the `screenConfiguration.configurationWizard.mode` option: * `traditional`: Presents customers with an initial configuration page where they can set the instance name and view general instance details, like flow webhook URLs. This is the default behavior in `@prismatic-io/embedded` versions before 3.0. * `streamlined`: Skips the initial configuration page. This is the default in `@prismatic-io/embedded` version 3.x and later. Note that if you enable [multiple instances](https://prismatic.io/docs/embed/marketplace.md#multiple-instances-of-one-integration-in-marketplace) of your integration to be deployed by a single customer, instance names must be unique. Customers will still see an initial configuration page for editing instance names, but webhook information will not be displayed. If your config wizard has a single page, the left-hand sidebar displaying config page names will be hidden in `streamlined` mode. If you choose `streamlined` mode but still want to display instance webhook URLs and API keys within your config wizard, [you can](https://prismatic.io/docs/integrations/config-wizard/config-pages.md#displaying-webhook-information-in-the-configuration-wizard). #### Showing all instances in marketplace[​](#showing-all-instances-in-marketplace "Direct link to Showing all instances in marketplace") Sometimes you may want to deploy an integration to a specific customer via Marketplace without making it generally available to all customers. To achieve this, deploy an [unconfigured instance](https://prismatic.io/docs/instances/deploying.md#creating-an-unconfigured-instance) of the integration to your customer as an organization user. Then, set the `filters.marketplace.includeActiveIntegrations` marketplace option to `true` to display instances that are enabled for a customer (even if the integration is not in the marketplace): Show all instances in marketplace ``` prismatic.showMarketplace({ selector: `#my-div`, filters: { marketplace: { includeActiveIntegrations: true, }, }, }); ``` filters take precedence over `includeActiveIntegrations` by default Note that if you have [filters](#filtering-integrations) enabled, the `includeActiveIntegrations` option will not override the filters. By default, active integrations deployed to a customer must match the specified filters to be displayed. If you want to display all active integrations regardless of filters, in addition to marketplace integrations that match your filters, set the `includeActiveIntegrations` option to `true` and set the `strictMatchFilterQuery` property to `false`, as sketched below.
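Here is a minimal sketch of that combination, assuming `strictMatchFilterQuery` sits alongside `includeActiveIntegrations` under `filters.marketplace`:

Show active instances alongside filtered marketplace integrations

```
import prismatic, { TermOperator } from "@prismatic-io/embedded";

prismatic.showMarketplace({
  selector: `#my-div`,
  filters: {
    marketplace: {
      // Only list marketplace integrations in the "ERP" category...
      filterQuery: [TermOperator.equal, "category", "ERP"],
      // ...but also show every integration already deployed to this customer,
      // even if it does not match the filter above
      includeActiveIntegrations: true,
      strictMatchFilterQuery: false,
    },
  },
});
```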
#### Multiple instances of one integration in marketplace[​](#multiple-instances-of-one-integration-in-marketplace "Direct link to Multiple instances of one integration in marketplace") By default, customer users can enable a single instance of an integration through the embedded marketplace. However, there are scenarios where you want customers to deploy multiple instances of an integration. For example, your customers might have multiple Shopify stores and require separate instances of your Shopify integration for each store. Enabling customers to deploy multiple instances of an integration is configured on a per-integration basis. From the marketplace configuration screen or the marketplace drawer in the workflow builder, enable the **Allow Multiple Instances** toggle. ![Allow multiple instances toggle in marketplace drawer](/docs/img/embed/marketplace/allow-multiple-instances.png) Embedded customer users who select your integration will be able to view existing enabled instances of the integration and will have the option to add additional instances. ![Multiple instances in embedded marketplace](/docs/img/embed/marketplace/multiple-instances.png) Customers will see a similar view if you have enabled multiple instances of an integration for them. #### Custom marketplace UI[​](#custom-marketplace-ui "Direct link to Custom marketplace UI") The embedded marketplace listview screen is presented as an iframe. If you prefer to present marketplace integrations using your own custom UI elements, you can. This is particularly useful if you build some integrations in Prismatic and others outside of Prismatic - you can display both using consistent UI elements. For complete details, see [Custom Marketplace UI](https://prismatic.io/docs/embed/custom-marketplace-ui.md). --- #### Theming Embedded You can theme your embedded marketplace to match your app's look and feel. To create a custom theme for your embedded marketplace and workflow builder, click your organization's name at the bottom of the left-hand sidebar, then open the **Theme** tab. Permissions required You must have the *admin* or *owner* [role](https://prismatic.io/docs/configure-prismatic/organization-users.md#organization-team-member-roles) to edit custom themes. The left side of the **Theme** tab displays customizable properties (colors and styles) in 3 separate tabs: Brand, Banner & Logo, and Neutral. The Neutral tab allows you to choose one neutral color, and additional neutral values used throughout the theme are calculated for you based on your neutral selection. The right side provides a preview of how various UI elements will appear with those custom properties. ![Theming the Prismatic integration marketplace for your app](/docs/img/embed/theming/theming-options.png) Your embedded themes will be applied to your embedded application: ![Example of themed integration marketplace](/docs/img/embed/theming/progix-themed.png) ##### Light and dark mode themes[​](#light-and-dark-mode-themes "Direct link to Light and dark mode themes") In the **Theme Mode** section, you have four options: * **Light** and **Dark** control the appearance of the Prismatic application when using light or dark mode. Each team member can configure dark or light mode settings from their [profile settings page](https://prismatic.io/docs/configure-prismatic/user-settings.md#setting-light-or-dark-mode). Alternatively, they can choose to have Prismatic follow their operating system's dark/light mode theme. Customer themes Your customers will not see **Light** or **Dark** themes unless you create customer users for them and they log directly into Prismatic. * **Embedded Light** and **Embedded Dark** control the theme for your embedded marketplace. These are the themes that your customers will see in the embedded marketplace. By default, the embedded marketplace automatically switches between dark and light themes based on your customers' operating system settings. This is beneficial if your app also follows OS theme settings. The embedded marketplace will switch between dark and light modes alongside your app.
To override the dark/light mode behavior for embedded and display only either the dark or light theme, add a `theme` property to your `showMarketplace` calls: Only show light mode theme ``` prismatic.showMarketplace({ selector: `#my-embedded-div`, usePopover: false, theme: "LIGHT", // or "DARK" }); ``` ##### Using a custom font[​](#using-a-custom-font "Direct link to Using a custom font") You can apply a custom font for your embedded marketplace. This is useful if your app uses a custom font and you want to maintain consistency in the embedded marketplace. Prismatic supports any font available on [Google Fonts](https://fonts.google.com/). To apply a custom font, add it to `prismatic.init()` as `fontConfiguration.google.families`: Use a custom font ``` prismatic.init({ fontConfiguration: { google: { families: ["Inter"], }, }, }); ``` ##### Customizing the loading screen[​](#customizing-the-loading-screen "Direct link to Customizing the loading screen") A loading screen appears briefly when `prismatic.showMarketplace()` is called. The screen shows a loading icon on a solid-color background. You can customize the background color and the color of the loading icon and "Loading" text. Add a `screenConfiguration.initializing` argument to do this. Add it to `prismatic.init()` to customize colors for all marketplace loading screens. Or add it to `prismatic.configureInstance()` or `prismatic.showMarketplace()` to customize colors for a specific marketplace div. Colors can be any valid CSS color. Customize loading screen colors ``` prismatic.init({ screenConfiguration: { initializing: { background: "#FF5733", color: "blue", }, }, }); // or prismatic.showMarketplace({ screenConfiguration: { initializing: { background: "rgb(5,102,0)", color: "rgba(255,153,255,.2)", }, }, }); ``` #### Renaming "Integration" and "Marketplace"[​](#renaming-integration-and-marketplace "Direct link to Renaming \"Integration\" and \"Marketplace\"") The embedded integration marketplace is labeled "Marketplace" by default and a collection of flows with a config wizard is called an "Integration". You can customize these concepts to match your organization's terminology. For example, your organization might refer to an integration as a "Solution" or a "Workflow". To modify these terms in the embedded marketplace and workflow builder, navigate to the settings page. Select your organization in the bottom left corner, then choose the **Embedded** tab. ![Rename marketplace and integrations in the embedded app](/docs/img/embed/rename-marketplace-integrations.png) Your custom terms for "integration" and "marketplace" will be displayed throughout the embedded marketplace and workflow builder interfaces. ![Renamed marketplace in the embedded app](/docs/img/embed/renamed-marketplace.png) For multi-language support, refer to the internationalization settings in the [i18n documentation](https://prismatic.io/docs/embed/translations-and-internationalization.md). --- #### Translations and Internationalization (i18n) #### Translations and internationalization[​](#translations-and-internationalization "Direct link to Translations and internationalization") Your customers may require non-English language support. Internationalization (i18n) enables you to localize the embedded marketplace interface into multiple languages. Review the marketplace's [translations package](https://github.com/prismatic-io/translations/tree/main/src). This repository contains all translatable phrases in the marketplace, along with their English source text. 
There are three categories of translatable phrases: * `SimplePhrase`: Direct string-to-string translations. Example: `input.integrationVersionLabel` "Integration Version" translates to French "Version d'intégration". * `ComplexPhrase`: Template strings that incorporate variables representing customer names, integration names, counts, etc. Example: `integrations.id__banner.customerActivateText` with value `"Please contact %{organization} to activate this integration."` This template injects your organization's name into the string. * **Dynamic Phrases**: Translations of your custom content (integration names, configuration variable keys, etc.), detailed [below](#dynamic-phrases). An example of i18n in action is available in our example embedded app [on GitHub](https://github.com/prismatic-io/embedded/blob/main/example-embedded-app/pages/examples/i18n.tsx). #### Adding an i18n dictionary for a user[​](#adding-an-i18n-dictionary-for-a-user "Direct link to Adding an i18n dictionary for a user") To apply an i18n dictionary to a user, include a `translation` property in your `prismatic.init` invocation. You can also add phrases to `prismatic.showMarketplace` or similar functions to apply translations to specific iframes (see the sketch below). Include any `phrases` that require translation: Translating phrases ``` prismatic.init({ translation: { phrases: { // Static Translations: "integration-marketplace__filterBar.allButton": "Alle, bitte!", "integration-marketplace__filterBar.activateButton": "Solo activado", "detail.categoryLabel": "カテゴリー", "detail.descriptionLabel": "विवरण", "detail.overviewLabel": "概述", // Complex translation with variables: "activateIntegrationDialog.banner.text--isNotConfigurable": { _: "Veuillez contacter %{organization} pour activer cette intégration", }, }, }, }); ``` After implementing changes, reload the marketplace to view the new translations. ![](/docs/img/embed/translations-and-internationalization/i18n-example.png)
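To scope phrases to a single embedded screen instead of applying them globally, pass the same `translation` object to the screen function itself. A minimal sketch, assuming `prismatic.showMarketplace` accepts the same `translation` shape as `prismatic.init`:

Translate phrases for a specific iframe

```
prismatic.showMarketplace({
  selector: "#my-marketplace-div",
  usePopover: false,
  // These phrases apply only to the marketplace iframe embedded in this div
  translation: {
    phrases: {
      "integration-marketplace__filterBar.allButton": "Alle, bitte!",
    },
  },
});
```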
Use Intellisense The Marketplace NPM package includes inline documentation showing current English values for translatable phrases. Enable intellisense in your code editor and hover over the target phrase to view its English source text. ![](/docs/img/embed/translations-and-internationalization/i18n-intellisense.png) #### i18n debug mode[​](#i18n-debug-mode "Direct link to i18n debug mode") To identify which phrase corresponds to which UI element in Prismatic, enable **debug mode** by adding `debugMode: true` to the `translation` property of `prismatic.init`: Enable debug mode for translations ``` prismatic.init({ translation: { debugMode: true, }, }); ``` This will display phrase keys and their current values for all UI elements in the embedded marketplace: ![](/docs/img/embed/translations-and-internationalization/i18n-debug-mode.png) #### Namespaced phrases[​](#namespaced-phrases "Direct link to Namespaced phrases") Certain phrases are shared across multiple pages in the embedded marketplace. A comprehensive list of these common phrases is available on [GitHub](https://github.com/prismatic-io/translations/tree/main/src/lib/shared). For example, `common.loading` translates to `"Loading"` in English across multiple screens. Setting `common.loading` once will affect all instances of this phrase. To customize common phrases on a per-page basis, you can namespace phrases. For instance, to translate `common.loading` differently only on the [alert monitors](https://prismatic.io/docs/monitor-instances/alerting.md) page, prefix the phrase key with a `PhraseNamespace` using the format `NAMESPACE__PHRASE-KEY` (see [translations](https://github.com/prismatic-io/translations/tree/main/src)): Namespacing phrases ``` prismatic.init({ translation: { phrases: { "integrations.id.alert-monitors__common.loading": "Sit tight a sec...", }, }, }); ``` #### Dynamic phrases[​](#dynamic-phrases "Direct link to Dynamic phrases") As you develop integrations, you create organization-specific phrases. These custom phrases include: * Integration names * Config variable names * Config wizard page titles * Config wizard page descriptions * Config wizard helper text * Flow names (visible to customers in execution results) * Step names (visible to customers in execution results) You can translate these phrases by adding a `dynamicPhrase` property to the `phrases` object. For example, to translate an integration named `"Sync customer data with Salesforce"` to French, add a `dynamicPhrase` to your `prismatic.init` invocation: ``` prismatic.init({ translation: { phrases: { dynamicPhrase: { "Sync customer data with Salesforce": "Synchronisez les données clients avec Salesforce", }, }, }, }); ``` You can override any other dynamic phrases using the same method. ``` prismatic.init({ translation: { phrases: { dynamicPhrase: { "Microsoft Teams": "Microsoft Squadre", // Integration name "Notify a Teams channel of new leads": "Notifica un canale di Teams di nuovi lead", // Integration description "Teams Configuration": "Configurazione di Teams", // Config wizard page title "Enter your Teams authentication info": "", // Config wizard page subtitle "Teams Authentication": "Autenticazione di Teams", // Config variable
"<h2>Teams OAuth</h2>": "<h2>OAuth di Teams</h2>", // HTML helper text in the config wizard }, }, }, }); ``` Dynamic phrases must match exactly Dynamic phrases must exactly match the phrases used in your integration. If your config variable includes capitalization or punctuation, you must include these in the dynamic phrase. Partial matches will not translate (e.g., translating `{"Customer": "Cliente"}` will not translate `"Customer Name"` to `"Cliente Name"`). Note the `<h2>` tag in the last example. When providing custom HTML helper text in your configuration wizard, you must translate the complete HTML string. ##### Listing all dynamic phrases[​](#listing-all-dynamic-phrases "Direct link to Listing all dynamic phrases") To view all translatable dynamic phrases from your account, use the `prism` CLI: List all dynamic phrases for all integrations in marketplace ``` prism translations:list ``` This command generates a `translations_output.json` file in your current directory containing all dynamic phrases for your account or a specific integration. Use this file as a reference when populating your `dynamicPhrase` object. --- #### Embedded Workflow Builder Overview Feature Availability The embedded workflow builder feature is available to customers on specific pricing plans. Refer to your pricing plan or contract, or contact the Prismatic support team to learn more. Prismatic's Embedded Workflow Builder lets your customers create and manage custom workflows right inside your application. This enables them to solve their unique integration and automation needs, without relying on your support or engineering teams. ![Embedded workflow builder](/docs/img/embed/workflow-builder/overview/overview.png) The UI of the embedded workflow builder is similar to the [low-code integration designer](https://prismatic.io/docs/integrations/low-code-integration-designer.md) that your team has access to, with a few key differences: 1. Workflows that customers create contain a single flow. If they need multiple flows, they can create multiple workflows. 2. Configuration is done inline. Customer users configure connections and steps directly in the workflow builder, rather than in a separate [configuration wizard](https://prismatic.io/docs/integrations/config-wizard.md). 3. [Connections](https://prismatic.io/docs/integrations/connections.md) are scoped to the customer, and can be shared across workflows. That means that they can authenticate with Slack or Salesforce once, and use those connections in multiple workflows. 4. Deployment is simpler. Once they've tested their workflow, they can click a single **Enable** button to deploy it. To get started with embedding the workflow builder, first [install the Prismatic embedded SDK](https://prismatic.io/docs/embed/get-started/install-embedded-sdk.md) and then see [Embedding the Workflow Builder](https://prismatic.io/docs/embed/workflow-builder/workflow-builder.md). *** Classic embedded designer docs If you are looking for documentation for the classic embedded designer, see [embedded designer](https://prismatic.io/docs/embed/workflow-builder/designer.md). --- #### Embedding Integration Designer The embedded integration designer allows you to embed Prismatic's low-code integration designer in your app, enabling your customers to create and manage integrations directly within your application using the same low-code designer that your team uses. Consider the embedded workflow builder If your goal is to let customers quickly build simple, one-off workflows in your app, use the [embedded workflow builder](https://prismatic.io/docs/embed/workflow-builder/workflow-builder.md) instead. #### Embedded workflow builder vs embedded designer[​](#embedded-workflow-builder-vs-embedded-designer "Direct link to Embedded workflow builder vs embedded designer") The **embedded workflow builder** and the **embedded designer** are both low-code editors, but they serve different purposes and have different user experiences.
**Number of flows**: * The embedded workflow builder is designed for your customers to build single-flow workflows. * The embedded designer is intended for building reusable integrations that can be deployed multiple times, and can contain multiple flows. The embedded workflow builder is ideal for customers who need to create simple, one-off workflows that connect their systems, while the embedded designer is better suited for customers who want to build complex integrations that contain multiple flows. **Configuration**: * With the embedded workflow builder, all configuration is done inline, making it easier for technical non-developers to build workflows without needing a separate configuration wizard. Users complete connections and select data source inputs (like a dropdown menu containing their Slack channels) as they build the workflow. * The embedded designer leverages the same low-code config wizard that your team uses, allowing for more complex configuration options. The embedded workflow builder is designed to be more user-friendly and accessible for non-technical users, while the embedded designer provides more advanced configuration options for users who need them. **Deployment**: * Deployment is streamlined in the embedded workflow builder. After testing, a customer user clicks the **Enable** button in the workflow builder to activate their workflow. * In the embedded designer, customer users publish their integration and add it to the marketplace. Then, either they or other users within their customer group follow the configuration wizard to deploy one or more instances of their integration. The embedded workflow builder has a simpler deployment process, making it easier for customer users to get their workflows up and running quickly, while the embedded designer provides more advanced deployment options and allows for multiple instances of the same integration. #### Listing integrations[​](#listing-integrations "Direct link to Listing integrations") The `prismatic.showIntegrations` function displays all integrations owned by the customer user's customer. Customers can create their own integrations or have integrations assigned to them. ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "integration-list-div"; function IntegrationList() { useEffect(() => { prismatic.showIntegrations({ selector: `#${id}` }); }, []); return
<div id={id}>Loading...</div>
    ; } export default IntegrationList; ``` From the `showIntegrations` screen, customer users can open existing integrations or create new integrations by clicking the **+Add integration** button (similar to how organization team members do). ![List integrations as a customer user in embedded](/docs/img/embed/workflow-builder/list-integrations.png) integrations are scoped to the customer Integrations are scoped to a customer. If a customer creates a new integration or you share an integration with a customer, only customer users for that customer can view that integration. #### Assigning an integration to a customer[​](#assigning-an-integration-to-a-customer "Direct link to Assigning an integration to a customer") You can build an integration and transfer ownership to a customer. To do this, open the integration and select the **Details** tab at the top of the screen. Then select the customer who should own the integration from the **Customer** dropdown menu. ![Assign an integration to a customer](/docs/img/embed/workflow-builder/assign-integration.png) #### Opening the integration designer directly[​](#opening-the-integration-designer-directly "Direct link to Opening the integration designer directly") If you know the ID of an integration, you can open it directly using `prismatic.showDesigner`. You will need to get the integration ID using the `prismatic.graphqlRequest` function: Open an embedded workflow builder directly ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "my-embedded-builder-div"; /** * Get an integration's ID given its name */ async function getIntegrationId(integrationName: string) { const query = `query getMyIntegration($integrationName: String!) { integrations(name: $integrationName) { nodes { id name } } }`; const variables = { integrationName }; const result = await prismatic.graphqlRequest({ query, variables }); return result.data.integrations.nodes?.[0].id; } function EmbeddedDesigner() { useEffect(() => { const showDesigner = async () => { const integrationId = await getIntegrationId("Acme Inc"); prismatic.showDesigner({ selector: `#${id}`, integrationId, theme: "LIGHT", }); }; showDesigner(); }, []); return
<div id={id}>Loading...</div>
    ; } export default EmbeddedDesigner; ``` ![Build an integration as a customer user](/docs/img/embed/workflow-builder/embedded-designer.png) #### Filtering components[​](#filtering-components "Direct link to Filtering components") To limit the components that a customer user can use in the embedded workflow builder, specify a `filters.components.filterQuery` object when calling `prismatic.showIntegrations` or `prismatic.showDesigner`. You can filter using a component's `key`, whether it's a `public` component, and component `category`. In this example, we list several file-related components, a private version of the Slack component, and all components in the "Logic" category (such as Branch, Loop, etc.): Filter components in the embedded workflow builder ``` import prismatic, { BooleanOperator, TermOperator, } from "@prismatic-io/embedded"; prismatic.showIntegrations({ selector: `#my-integrations-div`, filters: { components: { filterQuery: [ BooleanOperator.or, [TermOperator.equal, "key", "aws-s3"], [TermOperator.equal, "key", "dropbox"], [TermOperator.equal, "key", "google-drive"], [ BooleanOperator.and, [TermOperator.equal, "key", "slack"], [TermOperator.equal, "public", false], ], [TermOperator.equal, "category", "Logic"], ], }, }, }); ``` ![Filter component list in the embedded workflow builder](/docs/img/embed/workflow-builder/filter-components.png) #### Requiring components[​](#requiring-components "Direct link to Requiring components") To require that customers use certain components within their integrations, open your organization settings page and select the **Embedded** tab. Add any components you want to require under the **Required components** section. ![Require certain components in the embedded workflow builder](/docs/img/embed/workflow-builder/required-components.png) Embedded builder users must include components you specify before they can publish their integration. ![Require certain components in the embedded workflow builder before publishing](/docs/img/embed/workflow-builder/required-components-prevent-publish.png) --- #### Embedded Workflow Builder White-label Docs When you embed the [workflow builder](https://prismatic.io/docs/embed/workflow-builder/workflow-builder.md) in your application, your customers will naturally ask for documentation on how to use it. Prismatic provides a way for you to offer white-label documentation to your users, so that they can learn how to use the workflow builder without needing to refer to Prismatic's documentation. #### Why white-label documentation?[​](#why-white-label-documentation "Direct link to Why white-label documentation?") * Empower your users to learn and build workflows independently. * Reduce support questions and tickets. * Provide a fully branded, seamless experience for your customers. #### Providing white-label documentation to your customers[​](#providing-white-label-documentation-to-your-customers "Direct link to Providing white-label documentation to your customers") To provide white-label documentation to your users: 1. Clone the embedded workflow builder docs repository from GitHub: [prismatic-io/embedded-workflow-builder-docs](https://github.com/prismatic-io/embedded-workflow-builder-docs). 2. Configure and build the white-labeled docs to fit your needs. 1. If you want to build static HTML, CSS and JS files that you can host on your own servers, see [Building Branded HTML Assets](https://github.com/prismatic-io/embedded-workflow-builder-docs/blob/main/BUILDING-HTML.md). 2. 
If you want to ingest the white-labeled markdown files into your own content management system (CMS), see [Building Branded Markdown Files](https://github.com/prismatic-io/embedded-workflow-builder-docs/blob/main/BUILDING-MARKDOWN.md). Most CMSs support importing vanilla markdown files. --- #### Embedding Workflow Builder #### Enabling workflow builder[​](#enabling-workflow-builder "Direct link to Enabling workflow builder") The embedded workflow builder is disabled for all customers by default. To enable it for a customer, open the customer from the **Customers** screen and select the **Details** tab. Enable the **Allow Embedded Designer** option. ![Enable embedded workflow builder for a customer](/docs/img/embed/workflow-builder/customer-allow-embedded-designer.png) #### Listing workflows[​](#listing-workflows "Direct link to Listing workflows") The `prismatic.showWorkflows` function displays all workflows owned by the customer user's customer. ``` import prismatic from "@prismatic-io/embedded"; import { useEffect } from "react"; const id = "workflow-list-div"; function WorkflowList() { useEffect(() => { prismatic.showWorkflows({ selector: `#${id}` }); }, []); return
<div id={id}>Loading...</div>
; } export default WorkflowList; ``` From the `showWorkflows` screen, customer users can open existing workflows or create new workflows by clicking the **+Add workflow** button. ![List workflows as a customer user in embedded](/docs/img/embed/workflow-builder/list-workflows.png) #### Opening the workflow builder directly[​](#opening-the-workflow-builder-directly "Direct link to Opening the workflow builder directly") If you know the ID of a workflow, you can open it directly using `prismatic.showWorkflow`. You will need to get the workflow's ID using the `prismatic.graphqlRequest` function: Open an embedded workflow builder directly ``` const embeddedDivId = "embedded-marketplace-div"; interface Workflow { id: string; name: string; } function Workflows() { const { authenticated } = usePrismaticAuth(); const [workflows, setWorkflows] = React.useState<Workflow[]>([]); React.useEffect(() => { const fetchWorkflows = async () => { const response = await prismatic.graphqlRequest({ query: `query getWorkflows { workflows(draft: true) { nodes { id name } } }`, }); setWorkflows(response.data.workflows.nodes); }; if (authenticated) { fetchWorkflows(); } }, [authenticated]); return ( <> Embedded Marketplace {workflows.map((workflow) => ( ))}