Leonar exposes a native Model Context Protocol (MCP) server that lets AI assistants call Leonar tools and read contextual resources directly.
AI Assistant  →  MCP Protocol  →  Leonar MCP Server  →  Leonar CRM data

Quick start

1. Create an API key

Go to Settings > API in your Leonar dashboard and create a key with the scopes your assistant needs (see Scopes).

2. Configure your MCP client

Add the Leonar MCP server to your client’s configuration. The easiest way is via our NPX package, which works with all MCP clients.
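As an illustration, an NPX-based server entry in an MCP client configuration file (for example Claude Desktop's claude_desktop_config.json) typically looks like the sketch below. The package name @leonar/mcp and the LEONAR_API_KEY variable name are assumptions — check the published package for the exact values:

```json
{
  "mcpServers": {
    "leonar": {
      "command": "npx",
      "args": ["-y", "@leonar/mcp"],
      "env": {
        "LEONAR_API_KEY": "leo_your_api_key"
      }
    }
  }
}
```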

3. Start using tools

Once connected, your AI assistant has access to the same safe workflow tools used by Leonar’s integrated AI agent, plus 4 contextual resources. Ask it things like:
  • “Search for backend engineers in Paris”
  • “Show me the context for the Senior PM project”
  • “Find sourcing candidates for this project”

Authentication

The MCP server uses the same leo_ API keys as the REST API. Pass your key as a Bearer token in the Authorization header.
Authorization: Bearer leo_your_api_key
Each tool requires specific scopes. If the API key lacks a required scope, the tool returns an error. See the scopes reference for the full list.
Use the minimum scopes your assistant needs. For read-only exploration, start with contacts:read, projects:read, and companies:read.
Approval-gated mutations used by the in-app agent (for example project creation or launching sourcing directly into a pipeline) are intentionally not exposed over external MCP until a reviewed approval flow is available for third-party MCP clients.

Transport

The MCP server uses Streamable HTTP transport over a single endpoint (POST /api/mcp). It is stateless — no session management is required. All MCP clients that support the url transport type can connect directly.
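Because the transport is stateless HTTP, you can also drive the endpoint with a raw JSON-RPC request. A minimal sketch, assuming the host app.leonar.app (substitute your workspace's base URL):

```typescript
// Build a JSON-RPC request for Leonar's stateless MCP endpoint.
// The base URL is an assumption; the Bearer header matches the
// authentication scheme described above.
interface McpHttpRequest {
  url: string;
  init: {
    method: "POST";
    headers: Record<string, string>;
    body: string;
  };
}

function buildMcpRequest(
  apiKey: string,
  method: string,
  params: Record<string, unknown> = {}
): McpHttpRequest {
  return {
    url: "https://app.leonar.app/api/mcp", // assumed host
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ jsonrpc: "2.0", id: 1, method, params }),
    },
  };
}

// Usage: const { url, init } = buildMcpRequest(key, "tools/list");
// then: await fetch(url, init);
```

No session handshake is needed before a request — each POST stands alone.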

Available resources

The server also exposes these read-only resources:
| Resource URI | Description |
| --- | --- |
| leonar://workspace/context | Workspace metadata, billing state, and lightweight counters |
| leonar://accounts/connected | Active connected accounts available for sending and sourcing |
| leonar://team/members | Workspace members and roles |
| leonar://pipeline/templates | Pipeline templates with their stages |
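Resources are fetched with the standard MCP resources/read method. A minimal JSON-RPC payload (the method name comes from the MCP specification; the id is arbitrary) looks like:

```typescript
// Sketch: JSON-RPC payload for reading a Leonar MCP resource.
const readWorkspaceContext = {
  jsonrpc: "2.0",
  id: 2,
  method: "resources/read",
  params: { uri: "leonar://workspace/context" },
};
```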

Available tools

Search & context

| Tool | Scopes | Description |
| --- | --- | --- |
| search_leads | contacts:read | Search leads/contacts by text, job title, company, and location. |
| get_project_context | projects:read, pipeline:read | Get project context including pipeline stages and candidate counts. |
| get_project_sourcing_context | projects:read, pipeline:read, sourcing:read | Get sourcing-specific project context, owners, and available sourcing sources. |
| search_candidates | projects:read, sourcing:read | Search sourcing candidates using the same engine as the integrated AI agent. |
| suggest_followups | contacts:read | Produce structured follow-up suggestions for selected contacts. |

Memory

| Tool | Scopes | Description |
| --- | --- | --- |
| save_user_memory | agents:write | Save a memory item for the current API key owner. |
| update_user_memory | agents:write | Update a saved memory item. |
| delete_user_memory | agents:write | Soft-delete a saved memory item. |

Tool parameters reference

search_leads

{
  query?: string
  jobTitle?: string
  company?: string
  location?: string
  limit?: number
}

get_project_context

{
  projectId: string
}

get_project_sourcing_context

{
  projectId?: string
  projectName?: string
}

search_candidates

{
  projectId?: string
  projectName?: string
  query?: string
  jobTitle?: string
  titles?: string[]
  location?: string
  locations?: string[]
  companies?: string[]
  skills?: string[]
  languages?: string[]
  includeOpenToWork?: boolean
  sourcePreference?: "auto" | "linkedin_recruiter" | "unified"
  excludeAlreadyProcessed?: boolean
  excludeInProject?: boolean
  page?: number
  pageSize?: number
  searchId?: string
}
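Tools are invoked with the standard MCP tools/call method. As a sketch, a payload invoking search_candidates with a subset of the parameters above (the project name and filter values are purely illustrative):

```typescript
// Sketch: tools/call payload for search_candidates.
// Argument values are illustrative; only pass the filters you need.
const callSearchCandidates = {
  jsonrpc: "2.0",
  id: 3,
  method: "tools/call",
  params: {
    name: "search_candidates",
    arguments: {
      projectName: "Acme frontend",
      titles: ["Backend Engineer", "Software Engineer"],
      locations: ["Paris"],
      excludeAlreadyProcessed: true,
      pageSize: 25,
    },
  },
};
```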

suggest_followups

{
  contactIds: string[]
  context?: string
  tone?: "formal" | "casual" | "friendly"
}

save_user_memory

{
  content: string
  memoryType?: string
  key?: string
  projectId?: string
}

update_user_memory

{
  memoryId: string
  content?: string
  key?: string
  status?: string
}

delete_user_memory

{
  memoryId: string
}

Example use cases for recruiters

Pipeline review

“Show me the context for the Senior PM project and how many candidates are in each stage.”

The assistant calls get_project_context and summarizes the pipeline without opening the dashboard.

Candidate search

“Find backend candidates for our Paris project, excluding already processed people.”

Uses search_candidates with the same sourcing defaults as the integrated AI agent.

Outreach status

“Give me three follow-up angles for these contacts before my outreach block.”

Uses suggest_followups to return structured talking points your assistant can turn into drafts.

Project sourcing context

“What sourcing sources are available for the Acme frontend project?”

Uses get_project_sourcing_context to inspect owners, pipeline state, and available sourcing providers.

Memory assist

“Remember that this workspace prefers concise, founder-style outreach.”

Uses save_user_memory so future agent interactions can reuse durable context.

Rate limits

The MCP server shares the same rate limits as the REST API: 1000 requests per hour per API key. Each tool call counts as one request.
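Since each tool call counts against the shared hourly budget, a client-side guard can help an assistant back off before hitting 429s. A minimal sliding-window sketch (the class and its defaults are illustrative, not part of Leonar's SDK):

```typescript
// Sketch: client-side guard for the shared 1000 requests/hour budget.
// Tracks call timestamps inside a sliding one-hour window.
class RateLimitGuard {
  private timestamps: number[] = [];

  constructor(
    private limit = 1000,
    private windowMs = 60 * 60 * 1000
  ) {}

  // Returns true and records the call if under the limit, false otherwise.
  tryAcquire(now: number = Date.now()): boolean {
    // Drop timestamps that have aged out of the window.
    this.timestamps = this.timestamps.filter((t) => now - t < this.windowMs);
    if (this.timestamps.length >= this.limit) return false;
    this.timestamps.push(now);
    return true;
  }
}
```

Call tryAcquire() before each tool invocation and queue or delay the call when it returns false.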

MCP vs REST API

|  | MCP Server | REST API |
| --- | --- | --- |
| Best for | AI assistants (Claude, Cursor, Codex) | Custom integrations, scripts |
| Protocol | Model Context Protocol | HTTP REST |
| Auth | Same leo_ API keys | Same leo_ API keys |
| Scopes | Same scope system | Same scope system |
| Tools | 8 workflow tools + 4 resources | 60+ endpoints (full CRUD) |
Use the MCP server when connecting an AI assistant. Use the REST API when building custom integrations or scripts that need full CRUD access to all resources.