For the complete documentation index, see llms.txt.

Firecrawl + Vennio MCP

Firecrawl and Vennio are two separate MCP servers that work well together. Firecrawl gives agents web intelligence — scrape, search, crawl. Vennio gives agents scheduling primitives — availability, bookings, proposals. Run both and an agent can go from "scrape the web" to "book the time" in a single prompt.

Two separate MCP servers

Firecrawl is not embedded in Vennio's MCP server. They are independent servers you configure side-by-side. Vennio exposes 12 scheduling tools. Firecrawl exposes web tools. Your agent calls whichever it needs.

What each server provides

Vennio MCP tools (12 total)

Vennio exposes scheduling primitives at https://api.vennio.app/mcp. Key tools:

  find_availability — Bookable slots for a business, accounting for existing calendar events and bookings
  find_mutual_availability — Slots where 2–5 people are all free (requires consent from each)
  create_booking — Confirm a booking directly: creates calendar event, sends emails, fires webhooks
  propose_meeting — Create a multi-party meeting proposal with proposed time slots
  respond_to_proposal — Accept, counter, or reject a proposal on behalf of a participant
  get_proposal — Retrieve a proposal and optionally the full negotiation thread
  get_booking — Retrieve a booking by ID
  cancel_booking — Cancel a confirmed booking, fires cancellation webhooks
  check_consent — Verify you have consent to access a principal's calendar data
  get_network — Snapshot of your connection network: counts, types, recent contacts
  list_connections — List network connections with filtering by type, status, or search
  find_mutual_connections — Find people connected to all specified principals

See the AI Agent Integration (MCP) guide for full parameter docs and examples for each tool.
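
As a sketch of how agent-side code might post-process these tools' results, the helper below picks the earliest slot from a find_mutual_availability response that starts within working hours in a given timezone. The response shape (a mutual_slots array of { start, end } ISO-8601 strings) follows the workflow examples later on this page — treat it as an assumption, and verify it against the actual tool output.

```javascript
// Pick the earliest mutual slot that starts within working hours in `timezone`.
// Assumes slots look like { start, end } with ISO-8601 strings, as in the
// workflow examples below — an assumption, not a contract.
function pickWorkingHoursSlot(mutualSlots, { timezone, openHour = 9, closeHour = 17 } = {}) {
  const sorted = [...mutualSlots].sort((a, b) => new Date(a.start) - new Date(b.start))
  return sorted.find((slot) => {
    // Local hour of the slot's start time in the requested timezone
    const hour = Number(
      new Intl.DateTimeFormat('en-US', { hour: 'numeric', hour12: false, timeZone: timezone })
        .format(new Date(slot.start))
    )
    return hour >= openHour && hour < closeHour
  }) ?? null
}
```

The chosen slot's start and end can be passed straight to create_booking's start_time and end_time.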

Firecrawl MCP tools

Firecrawl provides web data extraction tools. The three you'll use most alongside Vennio:

  firecrawl_scrape — Scrape a URL and return clean markdown or structured data
  firecrawl_search — Search the web and return ranked results with scraped content
  firecrawl_crawl — Crawl a site and return content across multiple pages

Full Firecrawl tool docs are at docs.firecrawl.dev/mcp.

Configure both servers

Claude Desktop

Add both servers to your Claude Desktop config file (on macOS: ~/Library/Application Support/Claude/claude_desktop_config.json). Firecrawl can run locally via npx or connect to Firecrawl's hosted endpoint.

Option A — local Firecrawl (npx):

{
  "mcpServers": {
    "vennio": {
      "type": "http",
      "url": "https://api.vennio.app/mcp",
      "headers": {
        "Authorization": "Bearer vennio_sk_live_YOUR_KEY"
      }
    },
    "firecrawl": {
      "command": "npx",
      "args": ["-y", "firecrawl-mcp"],
      "env": {
        "FIRECRAWL_API_KEY": "fc-YOUR_FIRECRAWL_KEY"
      }
    }
  }
}

Option B — hosted Firecrawl endpoint:

{
  "mcpServers": {
    "vennio": {
      "type": "http",
      "url": "https://api.vennio.app/mcp",
      "headers": {
        "Authorization": "Bearer vennio_sk_live_YOUR_KEY"
      }
    },
    "firecrawl": {
      "type": "http",
      "url": "https://mcp.firecrawl.dev/YOUR_FIRECRAWL_KEY/v2/mcp"
    }
  }
}

Restart Claude Desktop after saving. Both "vennio" and "firecrawl" will appear in the MCP tools panel.

Cursor

In Cursor, go to Settings → Cursor Settings → MCP and add both servers under mcpServers. Cursor reads the same JSON structure as Claude Desktop, so you can reuse either config above.

Example workflows

Each workflow below shows the agent prompt and the sequence of MCP tool calls the agent makes to complete the task.

Workflow 1: Pricing alert review meeting

An agent monitors a competitor's pricing page and books a review meeting when it detects a change.

Check acme.com/pricing. If the pricing has changed since last week,
book a 30-minute review meeting with Sarah and Marcus for this week.

Tool sequence:

  1. firecrawl_scrape — fetch acme.com/pricing and extract structured pricing data
  2. Compare against a stored snapshot (or ask the user to confirm a change was detected)
  3. find_mutual_availability — find slots where Sarah and Marcus are both free this week
  4. create_booking — book the first available 30-minute slot
// Assumes `firecrawl` and `vennio` MCP clients are connected as in Workflow 3.

// 1. Scrape competitor pricing
const page = await firecrawl.callTool({
  name: 'firecrawl_scrape',
  arguments: {
    url: 'https://acme.com/pricing',
    formats: ['markdown'],
  },
})

// 2. Pricing change detected — find a time for Sarah and Marcus
const slots = await vennio.callTool({
  name: 'find_mutual_availability',
  arguments: {
    principal_ids: ['SARAH_UUID', 'MARCUS_UUID'],
    duration_minutes: 30,
    from: '2026-04-07T00:00:00Z',
    to: '2026-04-11T23:59:59Z',
    timezone: 'America/New_York',
    limit: 3,
  },
})

// 3. Book the first available slot
const booking = await vennio.callTool({
  name: 'create_booking',
  arguments: {
    business_id: 'BUSINESS_UUID',
    customer_email: 'sarah@example.com',
    customer_name: 'Sarah Chen',
    start_time: slots.content[0].mutual_slots[0].start,
    end_time: slots.content[0].mutual_slots[0].end,
    notes: 'Acme pricing change detected — review required',
    agent_name: 'Pricing Monitor Agent',
  },
})
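
Step 2 of the sequence ("compare against a stored snapshot") is glossed over in the tool calls. One minimal approach, assuming you persist the previous scrape's markdown somewhere (a file, a KV store), is to compare normalized content hashes:

```javascript
import crypto from 'node:crypto'

// Returns true when the scraped markdown differs from the stored snapshot.
// Whitespace is normalized first so trivial reflows don't trigger false alerts.
function pricingChanged(previousMarkdown, currentMarkdown) {
  const hash = (text) =>
    crypto.createHash('sha256').update(text.replace(/\s+/g, ' ').trim()).digest('hex')
  return hash(previousMarkdown) !== hash(currentMarkdown)
}
```

A hash comparison only tells you *that* something changed; for a price-level diff you would parse the scraped markdown into structured fields first (firecrawl_scrape can also return structured data).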

Workflow 2: Search weekly digest booking

An agent searches for relevant news and books a team digest meeting to discuss it.

Search Hacker News for posts about scheduling APIs this week.
Book a 30-minute team digest for Friday afternoon.

Tool sequence:

  1. firecrawl_search — search for recent HN posts on scheduling APIs
  2. find_availability — find open 30-minute slots on Friday afternoon
  3. create_booking — book the slot with a summary of top posts in the notes
// Assumes `firecrawl` and `vennio` MCP clients are connected as in Workflow 3.

// 1. Search for relevant posts
const results = await firecrawl.callTool({
  name: 'firecrawl_search',
  arguments: {
    query: 'scheduling API site:news.ycombinator.com',
    limit: 5,
    scrapeOptions: { formats: ['markdown'] },
  },
})

// Extract top post titles for the booking notes
const summary = results.content[0].results
  .map(r => `- ${r.title}`)
  .join('\n')

// 2. Find availability on Friday afternoon (2026-04-10, 1–6pm ET)
const availability = await vennio.callTool({
  name: 'find_availability',
  arguments: {
    business_id: 'BUSINESS_UUID',
    duration_minutes: 30,
    from: '2026-04-10T17:00:00Z',
    to: '2026-04-10T22:00:00Z',
    timezone: 'America/New_York',
  },
})

// 3. Book the first available slot
const booking = await vennio.callTool({
  name: 'create_booking',
  arguments: {
    business_id: 'BUSINESS_UUID',
    customer_email: 'team@example.com',
    customer_name: 'Team Digest',
    start_time: availability.content[0].slots[0].start,
    end_time: availability.content[0].slots[0].end,
    notes: `Top HN posts this week:\n${summary}`,
    agent_name: 'Digest Scheduler',
  },
})

Workflow 3: Enrich a booking with company context

When a demo is booked via webhook, an agent scrapes the prospect's company website and adds a briefing to the booking notes.

A new demo booking just came in from jane@acmecorp.com.
Scrape acmecorp.com and add a company briefing to the booking notes.

Tool sequence:

  1. Webhook fires booking.created — handler reads customer_email and derives the company domain
  2. firecrawl_scrape — scrape the company homepage for context
  3. get_booking — fetch the existing booking
  4. Update the booking notes via the REST API with the enriched briefing
import express from 'express'
import { Client } from '@modelcontextprotocol/sdk/client/index.js'
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js'

const app = express()
app.use(express.json())

const vennio = new Client({ name: 'demo-enricher', version: '1.0.0' })
const firecrawl = new Client({ name: 'demo-enricher', version: '1.0.0' })

await vennio.connect(new StreamableHTTPClientTransport(
  new URL('https://api.vennio.app/mcp'),
  { requestInit: { headers: { Authorization: `Bearer ${process.env.VENNIO_API_KEY}` } } }
))

await firecrawl.connect(new StreamableHTTPClientTransport(
  new URL(`https://mcp.firecrawl.dev/${process.env.FIRECRAWL_API_KEY}/v2/mcp`)
))

app.post('/webhooks/vennio', async (req, res) => {
  const { event, data } = req.body
  if (event !== 'booking.created') return res.sendStatus(200)

  const { id: booking_id, customer_email } = data.booking
  const domain = customer_email.split('@')[1]

  // Scrape the company site
  const page = await firecrawl.callTool({
    name: 'firecrawl_scrape',
    arguments: { url: `https://${domain}`, formats: ['markdown'] },
  })

  const briefing = `Company briefing (scraped ${new Date().toISOString().slice(0,10)}):\n`
    + page.content[0].markdown.slice(0, 500)

  // Update the booking notes via Vennio REST API
  await fetch(`https://api.vennio.app/v1/bookings/${booking_id}`, {
    method: 'PATCH',
    headers: {
      'Authorization': `Bearer ${process.env.VENNIO_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ notes: briefing }),
  })

  res.sendStatus(200)
})

app.listen(3000)

Webhook setup

Register your webhook endpoint in the Vennio dashboard under Settings → Webhooks. Subscribe to the booking.created event. See the Webhooks guide for signature verification.
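
As a sketch of what that verification typically looks like: the helper below checks an HMAC-SHA256 signature over the raw request body. The header name and encoding here (a hex digest, compared in constant time) are assumptions for illustration — the Webhooks guide is authoritative for the actual scheme Vennio uses.

```javascript
import crypto from 'node:crypto'

// Sketch of webhook signature verification. Assumes the signature is an
// HMAC-SHA256 hex digest of the raw request body — confirm the real header
// name and encoding in the Webhooks guide before relying on this.
function verifyWebhookSignature(rawBody, signatureHeader, secret) {
  const expected = crypto.createHmac('sha256', secret).update(rawBody).digest('hex')
  const a = Buffer.from(expected)
  const b = Buffer.from(String(signatureHeader ?? ''))
  // timingSafeEqual throws on length mismatch, so guard first
  return a.length === b.length && crypto.timingSafeEqual(a, b)
}
```

In the Express handler above you would capture the raw body (e.g. via express.raw(), or the verify callback of express.json()) and reject any request for which this returns false before processing the event.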

Get started

  1. Vennio API key — Create a secret key at vennio.app/api-keys (vennio_sk_live_*)
  2. Firecrawl API key — Sign up at firecrawl.dev and copy your API key from the dashboard
  3. Add both servers to your MCP client config as shown above
  4. Review Vennio MCP docs — full tool parameter reference is in the AI Agent Integration (MCP) guide