Enable large language model (LLM) access

Applies to: Serverless, Elastic Stack

Elastic uses large language model (LLM) connectors to power its AI features. These features work with the out-of-the-box Elastic Managed LLM or with a third-party LLM connector that you configure.

Elastic Managed LLM is the default LLM connector available in Kibana for eligible users. It provides immediate access to generative AI features without requiring any setup or external model integration.

Elastic Managed LLM is available out of the box; it does not require manual connector setup or API key management. However, you can configure and use a third-party LLM connector, such as OpenAI, Azure OpenAI, or Amazon Bedrock, if you prefer.

To learn more about security and data privacy, refer to the connector documentation and download the model card.

Important

Using the Elastic Managed LLM incurs additional costs. Refer to Elastic Cloud pricing for more information.

Follow these guides to connect to one or more third-party LLM providers:

Applies to: Elastic Stack (unavailable in Serverless)

You can also use preconfigured connectors to set up a third-party LLM connector.

If you use a preconfigured connector as your LLM connector, we recommend adding the exposeConfig: true parameter to the xpack.actions.preconfigured section of the kibana.yml config file. This parameter makes debugging easier by including connector configuration details, such as which LLM the connector uses, in the debug logs.
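For example, a preconfigured OpenAI connector with exposeConfig enabled might look like the following sketch in kibana.yml. The connector ID, display name, model, and API key shown here are illustrative placeholders; adjust the config and secrets fields to match your provider's connector settings.

```yaml
xpack.actions.preconfigured:
  my-openai-connector:              # illustrative connector ID
    name: Preconfigured OpenAI connector
    actionTypeId: .gen-ai           # OpenAI connector type
    exposeConfig: true              # include connector config in debug logs
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
      defaultModel: gpt-4o          # illustrative model name
    secrets:
      apiKey: <your-api-key>
```

With exposeConfig: true set, the debug logs record the connector's configuration (but not its secrets), so you can verify which LLM a given request used.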