
AI agent integrations

When your app includes an AI assistant or chatbot, customers will ask it questions that require data or actions from tools they already use: "How many open support tickets do I have in Zendesk?" or "Schedule a follow-up meeting in Google Calendar."

Prismatic's MCP Flow Server exposes your integration flows as tools that AI agents can discover and invoke through the standard MCP protocol. You define the flows once; your AI agent handles the rest.

Design considerations

Think in questions, not endpoints. When deciding what flows to build, ask what questions your customers will direct at the AI. Each distinct question type or action becomes a flow. "Get open ticket count by status" is more useful to an AI than "list all tickets."

Write clear invocation schemas. Your AI agent uses the invocation schema to understand what a flow does and what parameters it needs. Treat schema descriptions like prompts: be specific about what the flow returns and when to call it.
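As an illustration, here is what a well-described tool might look like to an MCP client. The `name`/`description`/`inputSchema` shape follows the MCP tool convention; the flow name and fields are hypothetical, and the exact structure Prismatic generates for your flows may differ.

```typescript
// Hypothetical MCP tool definition for a ticket-count flow.
// Note how the description tells the LLM both what is returned
// and when the tool should be called.
const getOpenTicketCount = {
  name: "get_open_ticket_count",
  description:
    "Returns the number of open Zendesk support tickets, grouped by status. " +
    "Call this when the user asks how many tickets are open, pending, or on hold.",
  inputSchema: {
    type: "object",
    properties: {
      status: {
        type: "string",
        enum: ["open", "pending", "on_hold"],
        description: "Optional status filter; omit to count all open tickets.",
      },
    },
    required: [],
  },
};
```

A vague description like "Gets tickets" forces the LLM to guess; the specific version above lets it pick the right tool and parameters on the first try.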

Keep flows focused and fast. Agent flows run synchronously within an LLM conversation. A flow that does one thing quickly is better than one that handles multiple cases. Avoid long-running or batch-style logic.

Return structured, descriptive responses. The LLM interprets your flow's output to compose a response for the user. Return clean JSON with meaningful field names rather than raw API responses. Include human-readable labels alongside IDs.
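A minimal sketch of this reshaping, assuming a hypothetical raw ticket payload and a lookup of assignee names:

```typescript
// Sketch: reshape a raw ticket API response into LLM-friendly JSON.
// RawTicket mirrors a typical third-party payload (hypothetical shape).
interface RawTicket {
  id: number;
  status: string;
  assignee_id: number | null;
}

function summarizeTickets(
  tickets: RawTicket[],
  assigneeNames: Map<number, string>
) {
  // Aggregate counts so the LLM doesn't have to count items itself.
  const countsByStatus: Record<string, number> = {};
  for (const t of tickets) {
    countsByStatus[t.status] = (countsByStatus[t.status] ?? 0) + 1;
  }
  return {
    totalOpenTickets: tickets.length,
    countsByStatus,
    tickets: tickets.map((t) => ({
      ticketId: t.id,
      status: t.status,
      // Human-readable label alongside the raw ID.
      assignee:
        t.assignee_id != null
          ? assigneeNames.get(t.assignee_id) ?? "Unknown"
          : "Unassigned",
    })),
  };
}
```

Returning `totalOpenTickets` and `countsByStatus` directly is safer than handing the LLM a raw array and hoping it counts correctly.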

Handle errors gracefully. If a flow fails or the third-party API is unavailable, return a clear error message the LLM can relay to the user rather than throwing an unhandled exception.
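One way to sketch this pattern, with `fetchTickets` standing in for any third-party call:

```typescript
// Sketch: wrap a third-party call so failures become structured,
// relayable messages instead of unhandled exceptions.
// fetchTickets is a hypothetical client function.
type FlowResult =
  | { ok: true; data: unknown }
  | { ok: false; error: string };

async function getTicketsSafely(
  fetchTickets: () => Promise<unknown>
): Promise<FlowResult> {
  try {
    return { ok: true, data: await fetchTickets() };
  } catch (err) {
    const reason = err instanceof Error ? err.message : String(err);
    // A plain-language error the LLM can pass on to the user.
    return {
      ok: false,
      error: `Could not reach the ticketing system (${reason}). Please try again shortly.`,
    };
  }
}
```

Because the error arrives as ordinary structured output, the LLM can apologize and suggest retrying, rather than surfacing a stack trace or silently failing.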

Scope by customer automatically. Agent flows run in the context of a customer's deployed instance, so they use that customer's credentials and configuration. You don't need to pass tenant identifiers.

Getting started

  1. Build your first agent flow
  2. Test with an MCP client such as Postman, Claude, or ChatGPT
  3. Connect your AI agent to Prismatic's MCP server