Incorporating Prismatic Integrations into Your Dev Processes

This post refers to an earlier version of Prismatic. Consult our docs on integration YAML or contact us if you'd like help with any of the topics addressed in this post.
In our previous two posts, we examined a use case for Prismatic, and dove into how to assemble an integration. Today I'd like to look at incorporating integration development with Prismatic into your existing software development processes, first discussing reasons why you should consider saving your integrations as YAML within source control, and then looking at how to codify the integration from the previous post into a single YAML file.
Why should I consider storing integrations as YAML?
Let's first take a step back and think about how we manage cloud infrastructure and server configuration today.
When I create a proof-of-concept for a new tech stack in AWS, GCP, or Azure, I like to futz with cloud resources using the cloud providers' respective web consoles. The visual representations of how my resources interconnect help me to quickly debug and make tweaks to infrastructure. When it comes time to replicate my tech stack for production, though, I definitely don't do it by hand. Replicating a complex environment for CI, QA, beta, staging, and prod would be a nightmare to build and maintain manually. So, I reach for something like CloudFormation or Terraform so I can readily cookie-cutter out environments.
The same holds true for configuring servers. I like to play around with installing packages and editing Apache or HAProxy configuration manually until I get my environment set up how I like it, but in the end I write up my configuration using Ansible playbooks or Chef recipes so I can replicate my work readily.
Assembling Prismatic integrations is no different. The integration designer is a powerful tool that allows you to assemble an integration from scratch.
You can test your integration from within the designer as you build it and observe inputs and outputs of your steps as you go.
It's a great way to prototype and debug a new integration.
Once you're satisfied with your integration within the integration designer, you can even export it to a YAML file with a simple prism integrations:export command.

Now, you could maintain your cloud environments by hand, configure every server you ever touch manually, and maintain all of your integrations from within Prismatic's web app. But just as with infrastructure and server configuration definitions, saving your integration definitions in the same code repository as your core product gives you several advantages:
- It's saved in source control. The normal advantages of source control (having code reviews, feature branches, merge requests, etc.) now apply to your integrations. Your team can see how an integration has changed over time, and can read through commit messages to figure out what changed when, and why. This gives you the added benefit of keeping your integrations in lock-step with your APIs. If your APIs change, your integrations that consume them can be modified and shipped out at the same time.
- Easy to replicate and QA. If your integration is saved in source control, your QA team can easily import your new integration definition into a test QA tenant to verify that it works alongside any new code you've written.
- Fits into existing CI/CD pipelines. Your integration can be shipped automatically with minimal changes to your CI/CD pipeline. Just use Prismatic's CLI tool and add a simple prism integrations:import command to your build pipeline (see the sketch after this list). You can build CI tests around your integrations with prism integrations:test to verify that the integration works as expected. When it comes time to deploy to production, simply have your production pipeline run prism integrations:import on your YAML file, and deploy instances of the integration to customers who need it. Deployment of your integration is entirely scriptable and testable.
- Developers work efficiently in code. This sounds like an obvious thing to say, but think back to the first time you watched an experienced developer wrangle a code base using Vim or Emacs. I remember being astounded at just how quickly a good developer can search through files, pull up dependencies, and make changes to multiple lines in multiple files with a few keystrokes. If you're supporting hundreds of Prismatic integrations, you'll similarly want to be able to check which integrations use a particular API endpoint or configuration variable, and that's easy in most code editors if your integrations are saved as code.
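To make the CI/CD point concrete, here's a rough sketch of what such a pipeline could look like as a GitHub Actions job. This is illustrative only: the npm package name, the authentication approach, and the exact prism command arguments are assumptions on our part, so check the Prismatic CLI docs for the real invocation.

# Hypothetical GitHub Actions workflow. The npm package name, authentication
# mechanism, and exact prism command arguments are assumptions; consult the
# Prismatic CLI docs for the real invocation.
name: integration-ci
on:
  push:
    paths:
      - "integrations/**"
jobs:
  import-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 18
      # Assumption: the CLI is installed from npm as @prismatic-io/prism
      - run: npm install --global @prismatic-io/prism
      # Assumption: CI credentials are supplied via a repository secret; the
      # variable name here is illustrative, not an official Prismatic setting.
      - run: prism integrations:import rocket-fuel-to-acmeerp.yml
        env:
          PRISM_TOKEN: ${{ secrets.PRISM_TOKEN }}
      # Run integration tests against the freshly imported definition
      - run: prism integrations:test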
Now that we've touched on why saving integrations as YAML is advantageous, let's look at how to assemble an integration in YAML.
Progix's ERP integration as YAML
The Progix ERP integration, introduced previously, consisted of five pieces: a trigger and four action steps. Let's look at the entirety of the YAML code first (also available on GitHub), and then highlight specific features:
---
name: Rocket Fuel to AcmeERP
description: >
  After a rocket is launched, fuel data is sent to this integration via a
  trigger payload. The payload is verified, fuel data is converted from pounds
  to gallons, and XML-formatted data is sent to the customer's AcmeERP instance.
requiredConfigVars:
  acmeErpFuelEndpoint: https://api.acmeerp.com/fuel
  secret: secret
trigger:
  name: Webhook Trigger
  description: Expects a data payload and X-Progix-Signature header
steps:
  - name: Verify Signature
    description: Verify that the X-Progix-Signature is valid
    action:
      key: verifySignature
      componentKey: progix-sig-check
    inputs:
      signature:
        type: reference
        value: webhookTrigger.results.headers.x-progix-signature
      body:
        type: reference
        value: webhookTrigger.results.rawBody.data
      secret:
        type: configVar
        value: secret
  - name: Compute Gallons Fuel
    description: Convert incoming fuel data from pounds to gallons
    action:
      key: runCode
      componentKey: code
    inputs:
      code:
        type: value
        value: |
          module.exports = async (context, { webhookTrigger }) => {
            const gallonsToPoundsConversion = {
              Hydrazine: 8.38,
              Kerosene: 6.68,
              Nitromethane: 9.49,
              O2: 9.52,
            };
            const fuelUsed = webhookTrigger.results.body.data.fuelUsed;
            return {
              data: {
                fuelGallonsUsed: fuelUsed.reduce((obj, item) => {
                  return {
                    ...obj,
                    [item.type]: item.pounds / gallonsToPoundsConversion[item.type],
                  };
                }, {}),
              },
            };
          };
  - name: Convert JSON to XML
    description: Convert JSON data from the code component to the XML that AcmeERP expects.
    action:
      key: jsonToXml
      componentKey: change-data-format
    inputs:
      data:
        type: reference
        value: computeGallonsFuel.results.fuelGallonsUsed
  - name: Send Data to AcmeERP
    description: HTTP POST XML data to AcmeERP endpoint using OAuth 2.0
    action:
      key: httpPost
      componentKey: http
    inputs:
      url:
        type: configVar
        value: acmeErpFuelEndpoint
      data:
        type: reference
        value: convertJsonToXml.results
      responseType:
        type: value
        value: text
Right away, notice that the file starts with name and description blocks, which are pretty straightforward. If you have done any AWS CloudFormation templating or Ansible playbook creation, YAML should be familiar to you. In YAML we can take advantage of multi-line strings using the block scalar > and | characters, so long descriptions like this are readable, but render as a single cohesive sentence in the Prismatic web app:
description: >
  After a rocket is launched, fuel data is sent to this integration via a
  trigger payload. The payload is verified, fuel data is converted from pounds
  to gallons, and XML-formatted data is sent to the customer's AcmeERP instance.
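As a quick aside (this is standard YAML, not anything Prismatic-specific): the > style folds line breaks into spaces, while the | style preserves them exactly, which is why | is used for the multi-line code step later on. For example:

# ">" folds the line breaks into spaces, yielding one long line of text
folded: >
  These lines are
  joined with spaces.
# "|" keeps the line breaks exactly as written
literal: |
  These lines keep
  their line breaks.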
Next, we define our requiredConfigVars like we did within the web app. That's a simple key-value pairing of variable names and their default values (so acmeErpFuelEndpoint defaults to https://api.acmeerp.com/fuel, for example):
requiredConfigVars:
  acmeErpFuelEndpoint: https://api.acmeerp.com/fuel
  secret: secret
After that we have the integration's trigger. Our trigger is simple, and contains a name and optional description. By default a trigger creates a webhook, though you can instead configure the trigger to fire on a schedule using crontab notation:
trigger:
  name: Webhook Trigger
  description: Expects a data payload and X-Progix-Signature header
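For comparison, a scheduled trigger might look something like the sketch below. Take the field names with a grain of salt: the schedule key is an assumption for illustration, so check the integration YAML docs for the exact schema.

# Hypothetical scheduled trigger; the schedule key name is illustrative only.
trigger:
  name: Nightly Fuel Sync
  description: Run every day at 02:00 UTC instead of waiting for a webhook
  # Standard crontab notation: minute hour day-of-month month day-of-week
  schedule: "0 2 * * *"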
Finally, and most importantly, our steps block defines the series of steps that make up the integration. Each step contains a name and an optional description. Then we declare what component action to invoke by listing a component key and an action key, which we can get by referencing our component catalog or by running prism components:actions:list -x. We then choose values for the inputs that the action step takes (inputs are enumerated in our component catalog docs). Inputs can be static strings or variables (like step outputs or configuration variables). In this step, for example, we invoke the httpPost action against an HTTP URL that is defined by a configuration variable, and post the output data from the step named Convert JSON to XML to that endpoint:
- name: Send Data to AcmeERP
  description: HTTP POST XML data to AcmeERP endpoint using OAuth 2.0
  action:
    key: httpPost
    componentKey: http
  inputs:
    url:
      type: configVar
      value: acmeErpFuelEndpoint
    data:
      type: reference
      value: convertJsonToXml.results
    responseType:
      type: value
      value: text
We can also define code component steps using YAML multi-line syntax to pass in custom JavaScript code that handles vertical-specific business logic:
- name: Compute Gallons Fuel
  description: Convert incoming fuel data from pounds to gallons
  action:
    key: runCode
    componentKey: code
  inputs:
    code:
      type: value
      value: |
        module.exports = async (context, { webhookTrigger }) => {
          const gallonsToPoundsConversion = {
            Hydrazine: 8.38,
            Kerosene: 6.68,
            Nitromethane: 9.49,
            O2: 9.52,
          };
          const fuelUsed = webhookTrigger.results.body.data.fuelUsed;
          return {
            data: {
              fuelGallonsUsed: fuelUsed.reduce((obj, item) => {
                return {
                  ...obj,
                  [item.type]: item.pounds / gallonsToPoundsConversion[item.type],
                };
              }, {}),
            },
          };
        };
That's it!
With just 72 lines of YAML (26% of which is whitespace or optional description text), we have a fully functional integration that we can import into our Prismatic account.
From there, we can rapidly make changes to our code and then run prism integrations:import and prism integrations:test to verify that our changes work as expected.
Learn more
We've looked at both the why and the how of managing Prismatic integrations as YAML. For any other questions, check out our docs or reach out – we'd love to hear from you!