
Workflow recipes

End-to-end workflow examples for common integration patterns.

These four patterns cover the core data flows: webhook ingestion with AI, external service sync, scheduled MCP tool access, and outbound sync. Each recipe is a complete workflow you can build in the Lightfield builder.

Pattern: Webhook in → AI interprets → record writes

Stripe sends dozens of event types with inconsistent schemas. A customer.subscription.created payload looks nothing like an invoice.payment_succeeded payload. Traditional automation would require a branching tree of conditionals to map each event type to the right fields. The AI agent step handles this with a single prompt.
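To make the divergence concrete, here are abbreviated sketches of the two payloads as Python dicts (fields trimmed; shapes follow Stripe's published event envelope, with illustrative IDs):

```python
# Abbreviated Stripe event envelopes. Both share the outer shape
# {"type": ..., "data": {"object": ...}}, but the inner object differs entirely.
subscription_created = {
    "type": "customer.subscription.created",
    "data": {"object": {
        "id": "sub_123",
        "customer": "cus_456",
        "status": "active",          # subscription status vocabulary
    }},
}

payment_succeeded = {
    "type": "invoice.payment_succeeded",
    "data": {"object": {
        "id": "in_789",
        "customer": "cus_456",
        "amount_paid": 4900,         # integer cents; no equivalent on a subscription
        "status": "paid",            # invoice status vocabulary, different meanings
    }},
}

# The only reliably shared parts are the envelope and the customer reference.
shared = set(subscription_created["data"]["object"]) & set(payment_succeeded["data"]["object"])
```

Even the keys the two objects do share (`status`) carry different vocabularies, which is why a static field mapping breaks down.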

Trigger: Webhook
Step 1: Agent request
Step 2: Log

Trigger: Webhook

Configure Stripe to POST webhook events to your Lightfield webhook URL. The raw JSON payload becomes the trigger output.
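A quick way to exercise the trigger before wiring up Stripe is to POST a sample event yourself. A minimal sketch (the webhook URL is a placeholder; copy the real one from your workflow's trigger config):

```python
import json
import urllib.request

# Placeholder URL: substitute your workflow's actual webhook endpoint.
WEBHOOK_URL = "https://hooks.lightfield.example/wf_abc123"

# Minimal Stripe-shaped test event; the raw JSON becomes {{trigger}}.
event = {
    "type": "invoice.payment_succeeded",
    "data": {"object": {"customer": "cus_test_1", "amount_paid": 4900}},
}

body = json.dumps(event).encode("utf-8")
req = urllib.request.Request(
    WEBHOOK_URL,
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually send the test event
```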

(Screenshot: Stripe webhook configuration)

Step 1: Agent request

Prompt the agent to interpret the Stripe event and take appropriate actions:

You are receiving a Stripe webhook event. Based on the event type and payload:
1. If this is a customer-related event, find or create a Contact using the
   customer email. Set the contact source to "Stripe".
2. If this is a subscription or payment event, find or create an Account
   using the customer's company name (from metadata or email domain).
3. If this involves revenue (payment, subscription, invoice), create or
   update an Opportunity with the amount and status mapped from the
   Stripe event status.

Stripe payload:
{{trigger}}

Enable entity creation and entity updates. The agent has access to all your Lightfield data. It will search for existing records before creating duplicates, map Stripe’s nested metadata fields to the right attributes, and handle edge cases (missing company name, multiple subscriptions per customer) that would break rule-based automation.

Step 2: Log

Processed Stripe event {{trigger.type}} for {{trigger.data.object.customer}}

Stripe’s schema varies by event type, and the mapping from payment data to records requires judgment. Is a subscription renewal a new opportunity or an update to an existing one? Should a refund close the opportunity or just update the amount? The AI agent makes these decisions contextually, using the same reasoning a human rep would.
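For contrast, the rule-based version of step 1 is a dispatch table that must enumerate every event type up front. A hypothetical sketch (the record and field names are illustrative, not Lightfield's API):

```python
# Hypothetical rule-based mapper: every Stripe event type needs its own branch,
# and unknown types fall through unhandled -- the brittleness the agent step avoids.
def map_event(event: dict):
    obj = event["data"]["object"]
    etype = event["type"]
    if etype == "customer.created":
        return {"record": "Contact", "email": obj.get("email"), "source": "Stripe"}
    if etype == "customer.subscription.created":
        return {"record": "Opportunity", "status": obj.get("status")}
    if etype == "invoice.payment_succeeded":
        return {"record": "Opportunity", "amount": obj.get("amount_paid")}
    # customer.subscription.updated? charge.refunded? Each needs another branch,
    # and none of these branches can decide whether a renewal is a new
    # Opportunity or an update to an existing one.
    return None
```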


Pattern: Webhook in → AI interprets → contact enrichment

Kondo captures LinkedIn DM conversations and can forward them via webhook. This workflow ingests those conversations and attaches them as context to the right contact records in Lightfield.

Trigger: Webhook
Step 1: Agent request

Trigger: Webhook

Configure Kondo to POST LinkedIn DM data to your Lightfield webhook URL. The payload includes the conversation participants, message content, and metadata.

(Screenshot: Kondo webhook configuration)

Step 1: Agent request

You are receiving LinkedIn DM conversation data from Kondo. For each
conversation:
1. Find the Contact in Lightfield by matching the LinkedIn profile name
   or email. If no match exists, create a new Contact with the available
   information (name, LinkedIn URL, company if available).
2. Create a Note on the Contact with the conversation summary.
   Title: "LinkedIn DM - {{date}}"
   Include key discussion points, any mentioned next steps, and the
   conversation sentiment.
3. If the conversation mentions a deal, product interest, or buying signal,
   create or update an Opportunity linked to the Contact's Account.

Kondo payload:
{{trigger}}

Enable entity creation and entity updates. The agent will fuzzy-match contact names against your existing records, handle cases where a LinkedIn name doesn’t exactly match the Lightfield record, and extract structured signals (buying intent, next steps) from unstructured conversation text.
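The fuzzy matching the agent performs can be approximated with standard-library tooling. A minimal sketch, assuming existing contact names are already loaded (the names and cutoff are illustrative):

```python
import difflib

# Existing Lightfield contact names (illustrative data).
contacts = ["Katherine Ruiz", "Jon Smith", "Priya Natarajan"]

def match_contact(linkedin_name: str, cutoff: float = 0.8):
    """Return the closest existing contact name, or None to signal 'create new'."""
    hits = difflib.get_close_matches(linkedin_name, contacts, n=1, cutoff=cutoff)
    return hits[0] if hits else None
```

A similarity cutoff tolerates the small spelling variations common between LinkedIn display names and CRM records, while names below the cutoff fall through to contact creation.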

LinkedIn DMs are unstructured text. Mapping them to records requires understanding who the conversation is with (name matching across systems), what was discussed (content extraction), and whether it’s sales-relevant (intent classification). This is exactly the kind of work that is impossible with field-mapping rules and trivial for an AI agent.


Pattern: Scheduled trigger → AI with MCP tools → record enrichment

Granola records and transcribes meetings. This workflow runs daily, pulls recent meeting notes via Granola’s MCP server, and updates Lightfield with the relevant context: tasks, notes, and opportunity updates derived from what was actually discussed.

Trigger: Scheduled (Daily, 9:00 AM)
Step 1: Agent request (with Granola MCP)

Trigger: Scheduled

Set to daily, 9:00 AM in your team’s timezone. The workflow fires once per day and hands off to the AI agent.

Step 1: Agent request (with Granola MCP access)

Pull yesterday's meeting notes from Granola. For each meeting:
1. Identify the attendees and match them to Contacts in Lightfield.
2. Find the relevant Account and Opportunity for the meeting context.
3. Create a Note on the Account summarizing the key discussion points,
   decisions made, and any objections or concerns raised.
4. Create Tasks for any action items mentioned in the meeting, assigned
   to the appropriate team member, with due dates if mentioned.
5. If the meeting revealed a change in deal status, confidence, or
   timeline, update the Opportunity fields accordingly.

Focus on extracting actionable information, not transcription summaries.

The agent uses Granola’s MCP server to fetch meeting data, then uses Lightfield’s built-in tools to create and update records. No HTTP step needed. MCP provides structured access to Granola’s data directly within the agent’s tool ecosystem.

Meeting notes are the highest-signal, lowest-structure data in a sales organization. Reps discuss deal blockers, timeline changes, and next steps in conversation, but that context rarely makes it into the system of record. This workflow closes the loop automatically: meetings happen, Granola captures them, and the AI agent extracts structured updates from unstructured conversation.

The MCP integration is key. Instead of building a custom HTTP integration with Granola’s API, the agent accesses Granola through a standard tool interface. When Granola updates their API, the MCP server handles the change. Your workflow prompt stays the same.


Pattern: Object lifecycle trigger → HTTP request out

When a new opportunity is created in Lightfield, push it to an external system: your ERP, a Slack channel, a data warehouse, an internal dashboard. This is the outbound counterpart to the inbound webhook recipes above.

Trigger: Object lifecycle (Opportunity created)
Step 1: HTTP request

Trigger: Object lifecycle (create)

Set the entity type to Opportunity and the event to create. No field watching needed. This fires on every new opportunity.

Step 1: HTTP request

POST the opportunity data to your external system:

URL: https://your-system.example.com/api/opportunities

Headers:

Authorization: Bearer YOUR_API_TOKEN
Content-Type: application/json

Body (JSON):

{
  "lightfield_id": "{{trigger.id}}",
  "name": "{{trigger.fields.system_name.value}}",
  "stage": "{{trigger.fields.stage.value}}",
  "amount": "{{trigger.fields.amount.value}}",
  "created_at": "{{trigger.occurredAt}}",
  "source": "lightfield"
}
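On the receiving side, the external system just needs to parse and sanity-check the posted body. A minimal sketch of that validation (field names mirror the example body above, which is itself illustrative):

```python
import json

# Fields the hypothetical receiving endpoint requires before accepting a sync.
REQUIRED = {"lightfield_id", "name", "stage", "amount", "source"}

def parse_opportunity(raw_body: bytes) -> dict:
    """Parse the POSTed JSON body and reject payloads missing required fields."""
    payload = json.loads(raw_body)
    missing = REQUIRED - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return payload
```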

Slack notification: Replace the HTTP endpoint with a Slack incoming webhook URL. Structure the body as a Slack Block Kit message:

{
  "blocks": [
    {
      "type": "section",
      "text": {
        "type": "mrkdwn",
        "text": "New opportunity: {{trigger.fields.system_name.value}}"
      }
    }
  ]
}

Multi-step sync: Add additional HTTP request steps after the first to push to multiple systems in sequence. One step for your ERP, then one for Slack, then one for your data warehouse.

Conditional sync with AI: Replace the HTTP step with an Agent request step that decides whether and where to sync based on the opportunity’s characteristics. High-value deals go to Slack and the ERP. Small deals just get logged. The agent makes the routing decision instead of a brittle conditional.

Most data eventually needs to exist somewhere else. The object lifecycle trigger catches the moment data is created or changes, and the HTTP request pushes it immediately. No polling, no batch sync, no stale data. For more complex routing logic, swap the HTTP step for an AI agent that makes the decision contextually.


These recipes demonstrate four integration patterns, but the building blocks compose freely. A single workflow can combine a webhook trigger with an AI agent step, multiple Object operations, and an HTTP request to an external system.

For the full reference on each trigger and action type, see Building workflows. For architecture details on how the execution engine handles concurrency, retries, and failure recovery, see How workflows work.