---
title: Configure access to LLMs
description: Elastic's AI features work with the out-of-the-box Elastic Managed LLMs or with third-party LLMs configured using one of the available connectors.
url: https://www.elastic.co/elastic/docs-builder/docs/3028/explore-analyze/ai-features/llm-guides/llm-connectors
products:
  - Elastic Cloud Serverless
  - Elastic Observability
  - Elastic Security
  - Elasticsearch
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Generally available
---

# Configure access to LLMs
Elastic's [AI features](https://www.elastic.co/elastic/docs-builder/docs/3028/explore-analyze/ai-features) work with the out-of-the-box Elastic Managed LLMs or with third-party LLMs configured using one of the available connectors.

## Elastic Managed LLMs

[Elastic Managed LLMs](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/kibana/connectors-kibana/elastic-managed-llm) are available out of the box in Kibana deployments and serverless projects with an appropriate [subscription or feature tier](https://www.elastic.co/pricing). They provide immediate access to generative AI features without any setup, manual configuration, or API key management.
Alternatively, you can configure and use third-party LLM connectors, such as OpenAI, Azure, or Amazon Bedrock.
For details on the models, security, and data privacy, refer to [Elastic Managed LLMs](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/kibana/connectors-kibana/elastic-managed-llm).
<important>
  Using Elastic Managed LLMs incurs additional costs. Refer to [Elastic Cloud pricing](https://www.elastic.co/pricing/serverless-search) for more information.
</important>


## Tested models and performance ratings

While connectors let you connect to a wide range of LLMs, model performance varies by solution and use case. Refer to the following performance matrices to find performance information for models tested by Elastic:
- [LLM performance matrix for Observability](https://www.elastic.co/elastic/docs-builder/docs/3028/solutions/observability/ai/llm-performance-matrix)
- [LLM performance matrix for Elastic Security](https://www.elastic.co/elastic/docs-builder/docs/3028/solutions/security/ai/large-language-model-performance-matrix)

<note>
  Models that do not appear in these matrices may still work, but Elastic hasn't tested them. We recommend selecting a model with strong ratings for your intended use case.
</note>


## Connect to a third-party or self-managed LLM

Follow these guides to connect to one or more third-party LLM providers:
- [Azure OpenAI](https://www.elastic.co/elastic/docs-builder/docs/3028/explore-analyze/ai-features/llm-guides/connect-to-azure-openai)
- [Amazon Bedrock](https://www.elastic.co/elastic/docs-builder/docs/3028/explore-analyze/ai-features/llm-guides/connect-to-amazon-bedrock)
- [OpenAI](https://www.elastic.co/elastic/docs-builder/docs/3028/explore-analyze/ai-features/llm-guides/connect-to-openai)
- [Google Vertex](https://www.elastic.co/elastic/docs-builder/docs/3028/explore-analyze/ai-features/llm-guides/connect-to-google-vertex)
- [Self-managed LLMs](https://www.elastic.co/elastic/docs-builder/docs/3028/explore-analyze/ai-features/llm-guides/local-llms-overview)


## Preconfigured connectors

<applies-to>
  - Elastic Cloud Serverless: Unavailable
  - Elastic Stack: Generally available
</applies-to>

You can also use [preconfigured connectors](https://docs-v3-preview.elastic.dev/elastic/docs-builder/docs/3028/reference/kibana/connectors-kibana/pre-configured-connectors) to set up third-party LLM connectors by editing the `kibana.yml` file. This allows you to enable a connector for multiple spaces at once, without performing setup in the Kibana UI for each space.
If you use a preconfigured connector for your LLM connector, we recommend adding the `exposeConfig: true` parameter to the `xpack.actions.preconfigured` section of the `kibana.yml` config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which LLM the connector uses.
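For example, a preconfigured OpenAI connector in `kibana.yml` might look like the following sketch. The connector ID `my-openai-connector`, the model name, and the API key placeholder are illustrative; adapt them to your provider and environment.

```yaml
xpack.actions.preconfigured:
  # Illustrative connector ID; choose any unique identifier
  my-openai-connector:
    name: Preconfigured OpenAI connector
    actionTypeId: .gen-ai       # OpenAI connector type
    exposeConfig: true          # adds config details to debug logs
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
      defaultModel: gpt-4o      # illustrative model name
    secrets:
      apiKey: <your-openai-api-key>
```

Because the connector is defined in `kibana.yml` rather than in the UI, it is available in every space, and its secrets are not editable or visible from Kibana.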