This is a beta feature according to Algolia’s Terms of Service (“Beta Services”).

Before you begin

To connect an MCP server, make sure you have:
  • An Algolia application with at least one index and some records. Create one in the dashboard if needed.
  • One of the following:
    • An OpenAI account with access to the OpenAI Playground.
    • A ChatGPT account with custom connectors enabled (you should see a Create button in the settings).
    • A Claude Pro subscription with access to custom connectors (you should see an Add Custom Connector option on the Connector settings page).

Use the Algolia MCP Server with OpenAI Playground

Step 1: Create or locate an MCP server

  1. Go to the MCP servers page in the Algolia dashboard and select the appropriate application.
  2. If an existing MCP server is available, click the 🔗 button to copy the URL and go to the next step: Configure in OpenAI Playground.
  3. To create a new MCP server:
    1. Select the indices you want to expose.
    2. Add descriptions to help AI choose the right index or tool.
    3. Save and copy the URL of the new MCP instance.
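Once you've copied the URL, you can sanity-check what a client will send to it. MCP's Streamable HTTP transport speaks JSON-RPC 2.0, and the first message any client sends is an initialize request. The sketch below (an illustration, not Algolia-specific code) builds that payload; the protocol version and client name are example values, and the {REGIONS} and {UNIQ_ID} placeholders must be replaced with the values from your copied URL.

```python
import json

# The MCP server URL copied from the Algolia dashboard.
# Replace {REGIONS} and {UNIQ_ID} with your actual values.
SERVER_URL = "https://mcp.{REGIONS}.algolia.com/1/{UNIQ_ID}/mcp"

# JSON-RPC 2.0 "initialize" request: the opening message of the
# MCP handshake over the Streamable HTTP transport.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # example protocol revision
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Clients POST this body to the server URL with
# Content-Type: application/json.
body = json.dumps(initialize_request)
print(body)
```

MCP clients such as the OpenAI Playground, ChatGPT, and Claude perform this handshake for you; it's shown here only to clarify what "connecting" to the URL means.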
Step 2: Configure in OpenAI Playground

  1. Open OpenAI Platform → Chat.
  2. Go to Tools → + Add.
  3. Choose MCP server.
  4. Click + Server and enter:
    • Server URL: https://mcp.{REGIONS}.algolia.com/1/{UNIQ_ID}/mcp (created or copied in the previous step: “Create or locate an MCP server”).
    • Label: Algolia
    • Description: add a description that helps the LLM understand the purpose of the tool.
    • Authentication: none.
  5. Click Connect.
  6. For Model, choose gpt-4o. This model is recommended because GPT-5 may not format results correctly.
  7. Use the following system prompt to ensure accurate results:
You are an AI assistant that helps users with searching for results in an Algolia index.
Display the Algolia results exactly as returned. Do not filter, reorder, or summarize the hits.
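The same configuration can also be expressed programmatically through the OpenAI Responses API, which accepts remote MCP servers as a tool type. The sketch below builds the request structure only; actually sending it requires the openai package and an API key, and the input text is a hypothetical example.

```python
# Sketch: the Playground configuration above, expressed as a
# Responses API request using OpenAI's remote MCP tool type.
SYSTEM_PROMPT = (
    "You are an AI assistant that helps users with searching for results "
    "in an Algolia index. Display the Algolia results exactly as returned. "
    "Do not filter, reorder, or summarize the hits."
)

request = {
    "model": "gpt-4o",
    "instructions": SYSTEM_PROMPT,
    "input": "Find records about running shoes.",  # example query
    "tools": [
        {
            "type": "mcp",
            "server_label": "Algolia",
            "server_url": "https://mcp.{REGIONS}.algolia.com/1/{UNIQ_ID}/mcp",
            "require_approval": "never",  # the server uses no authentication
        }
    ],
}

# With the `openai` package installed and an API key configured,
# this would be sent as:
#   from openai import OpenAI
#   client = OpenAI()
#   response = client.responses.create(**request)
#   print(response.output_text)
print(request["tools"][0]["server_label"])
```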
Step 3: Review the tools exposed by the MCP Server

The Algolia MCP Server makes several tools available to LLMs.
Each tool corresponds to a specific Algolia API endpoint that the LLM can call through the MCP connection.
  • algolia_search_{index_name_1}, algolia_search_{index_name_2}, … : search tools for each index you selected when creating the MCP server.
  • algolia_search_for_facet_values
  • algolia_recommendations
In addition, OpenAI clients (like ChatGPT) include two built-in tools: search and fetch. These tools let you query and display Algolia results using the MCP connection. Once you understand which tools are available, use them in your chosen client.
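The search-tool names follow directly from the indices you selected when creating the MCP server. A small sketch of that naming scheme (the index names here are examples, not real indices):

```python
# Sketch: tool names the Algolia MCP Server exposes, derived from the
# indices selected at creation time. "products" and "articles" are
# example index names.
selected_indices = ["products", "articles"]

# One search tool per selected index.
search_tools = [f"algolia_search_{name}" for name in selected_indices]

# Tools available regardless of index selection.
fixed_tools = ["algolia_search_for_facet_values", "algolia_recommendations"]

print(search_tools + fixed_tools)
# → ['algolia_search_products', 'algolia_search_articles',
#    'algolia_search_for_facet_values', 'algolia_recommendations']
```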

Use the Algolia MCP Server with other MCP clients

  • With ChatGPT
  • With Claude
  1. Open Connectors settings, then click Create.
  2. Fill in the following:
    • Server URL: https://mcp.{REGIONS}.algolia.com/1/{UNIQ_ID}/mcp (created or copied in the previous “Create or locate an MCP server” step).
    • Label: Algolia
    • Description: add a description that helps the LLM understand the purpose of the tool.
    • Authentication: none.
  3. Check I trust this application.
  4. Click Create.
  5. Enable the Algolia MCP connection: in the client’s prompt bar, you should find a connectors button to enable the Algolia MCP connection you just configured.
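For Claude, the connector can also be attached programmatically: Anthropic's Messages API accepts remote MCP servers through a beta mcp_servers parameter. The sketch below builds the request structure only (the beta flag, model name, and message text are examples; check Anthropic's current documentation before relying on the exact shape).

```python
# Sketch: attaching the Algolia MCP server via Anthropic's Messages API
# (beta MCP connector). Request structure only; sending it requires the
# `anthropic` package and an API key.
request = {
    "model": "claude-sonnet-4-20250514",  # example model name
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "Search the Algolia index for running shoes."}
    ],
    "mcp_servers": [
        {
            "type": "url",
            "url": "https://mcp.{REGIONS}.algolia.com/1/{UNIQ_ID}/mcp",
            "name": "Algolia",
        }
    ],
}

# With the `anthropic` package installed and an API key configured:
#   from anthropic import Anthropic
#   client = Anthropic()
#   response = client.beta.messages.create(
#       betas=["mcp-client-2025-04-04"],  # beta flag at time of writing
#       **request,
#   )
print(request["mcp_servers"][0]["name"])
```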

Guidelines for using the Algolia MCP Server

When using the Algolia MCP Server with an LLM, follow these best practices to ensure secure and efficient operation:
  • Usage and billing: API requests made through the MCP Server count toward your Algolia usage. Typically, each tool call from the LLM results in one API request. However, in some cases, the MCP may issue additional calls to gather context and provide more accurate responses.
  • Data access and permissions: The MCP grants the connected LLM access to your Algolia records according to the permissions defined in your Search API key. Before enabling this connection, make sure you’re comfortable sharing that data with your chosen LLM provider.
  • Prompt design: Use a clear and well-defined system prompt that specifies the assistant’s role and instructs it to return results exactly as provided by Algolia. This helps maintain accuracy and consistency in the responses generated by the LLM.