Generative AI connectors

Use these connectors to connect to third-party large language model (LLM) services and Elastic's own LLM offerings.

  • AI Connector: Connect to third-party LLM services including Amazon Bedrock, Azure, Google Gemini, OpenAI, and Elastic Inference Service.
  • Amazon Bedrock: Send a request to Amazon Bedrock.
  • Elastic Managed LLMs: Send a request to Elastic Managed LLMs.
  • Google Gemini: Send a request to Google Gemini.
  • OpenAI: Send a request to OpenAI.

External MCP Server

  • MCP: Connect to MCP servers and call their tools.
Important

Connecting to LLM providers through a proxy is in technical preview. If you use a proxy, it must support streaming and be compatible with server-sent events (SSE), because Elastic parses only streamed responses.

To check whether a problem is caused by the proxy, test your LLM service directly, without the proxy.
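For context on the SSE requirement above: an SSE stream is a sequence of newline-delimited `data:` events, which a proxy must forward unbuffered for streamed parsing to work. The sketch below shows the general shape of such a stream and how its events are parsed; the `delta` field, the `[DONE]` sentinel, and the payloads are illustrative assumptions, not Elastic's or any provider's exact wire format.

```python
import json

def parse_sse_events(raw: str):
    """Parse newline-delimited SSE 'data:' events from a streamed response.
    The event payloads here are illustrative, not a real provider's format."""
    events = []
    for line in raw.splitlines():
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and non-data fields
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # common (assumed) end-of-stream sentinel
            break
        events.append(json.loads(payload))
    return events

# Example stream, as an SSE-compatible proxy would forward it unmodified
stream = (
    'data: {"delta": "Hel"}\n'
    "\n"
    'data: {"delta": "lo"}\n'
    "\n"
    "data: [DONE]\n"
)
print("".join(event["delta"] for event in parse_sse_events(stream)))  # prints Hello
```

A proxy that buffers the whole response and delivers it as one body, or that rewrites it into a non-SSE format, breaks this incremental parsing, which is why a non-streaming proxy fails even when the underlying LLM service works.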