﻿---
title: Generative AI connectors
description: Use these connectors to connect to third-party large language model (LLM) services and Elastic's own LLM offerings.
url: https://www.elastic.co/elastic/docs-builder/docs/3016/reference/kibana/connectors-kibana/gen-ai-connectors
products:
  - Kibana
---

# Generative AI connectors
Use these connectors to connect to third-party large language model (LLM) services and Elastic's own LLM offerings.

## Available connectors

- [AI Connector](https://www.elastic.co/elastic/docs-builder/docs/3016/reference/kibana/connectors-kibana/ai-connector): Connect to third-party LLM services including Amazon Bedrock, Azure, Google Gemini, OpenAI, and Elastic Inference Service.
- [Amazon Bedrock](https://www.elastic.co/elastic/docs-builder/docs/3016/reference/kibana/connectors-kibana/bedrock-action-type): Send a request to Amazon Bedrock.
- [Elastic Managed LLMs](https://www.elastic.co/elastic/docs-builder/docs/3016/reference/kibana/connectors-kibana/elastic-managed-llm): Send a request to Elastic Managed LLMs.
- [Google Gemini](https://www.elastic.co/elastic/docs-builder/docs/3016/reference/kibana/connectors-kibana/gemini-action-type): Send a request to Google Gemini.
- [OpenAI](https://www.elastic.co/elastic/docs-builder/docs/3016/reference/kibana/connectors-kibana/openai-action-type): Send a request to OpenAI.

**External MCP Server**
- [MCP](https://www.elastic.co/elastic/docs-builder/docs/3016/reference/kibana/connectors-kibana/mcp-action-type): Connect to MCP servers and call their tools. <applies-to>Elastic Stack: Preview in 9.3</applies-to> <applies-to>Elastic Cloud Serverless: Preview</applies-to>

<important>
  Connecting to LLM providers through a proxy is in technical preview. If you use a proxy, it must support streaming and be compatible with server-sent events (SSE), because Elastic only parses streamed responses. To determine whether a problem is caused by the proxy, test your LLM service without it.
</important>
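Because Elastic only parses streamed responses, a compatible proxy must relay SSE framing unchanged: each event is one or more `data:` lines terminated by a blank line. The sketch below illustrates that framing with a minimal parser; the sample payload is hypothetical and not an actual provider response.

```python
def parse_sse(stream: str) -> list[str]:
    """Split a server-sent-events stream into its event data strings.

    SSE events consist of `data:` lines; a blank line ends each event.
    A proxy that buffers or collapses these boundaries breaks streaming.
    """
    events, data_lines = [], []
    for line in stream.splitlines():
        if line.startswith("data:"):
            data_lines.append(line[len("data:"):].lstrip())
        elif line == "" and data_lines:
            events.append("\n".join(data_lines))
            data_lines = []
    return events

# Illustrative streamed chunks, similar in shape to what an LLM service emits
sample = 'data: {"delta": "Hel"}\n\ndata: {"delta": "lo"}\n\ndata: [DONE]\n\n'
print(parse_sse(sample))
```

If the proxy delivers the whole response as a single unframed body instead of discrete events like these, streamed parsing fails even though the underlying LLM call succeeded.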