Configure access to LLMs
Elastic's AI features work with the out-of-the-box Elastic Managed LLMs or with third-party LLMs configured using one of the available connectors.
Elastic Managed LLMs are available in Kibana deployments and serverless projects with an appropriate subscription or feature tier. They provide immediate access to generative AI features without requiring any setup or external model integration.
Elastic Managed LLMs are available out of the box and do not require manual configuration or API key management. Alternatively, you can configure and use third-party LLM connectors, such as OpenAI, Azure, or Amazon Bedrock.
To learn more about security and data privacy, refer to the Elastic Managed LLMs documentation, which provides details on the models.
Using Elastic Managed LLMs incurs additional costs. Refer to Elastic Cloud pricing for more information.
While connectors let you connect to a wide range of LLMs, model performance varies by solution and use case. Refer to the following matrices for performance information on models tested by Elastic:
Models that do not appear in these matrices may still work, but Elastic hasn't tested them. We recommend selecting a model with strong ratings for your intended use case.
Follow these guides to connect to one or more third-party LLM providers:
You can also set up third-party LLM connectors as preconfigured connectors by editing the kibana.yml file. This allows you to enable a connector for multiple spaces at once, without performing setup in the Kibana UI for each space.
If you use a preconfigured connector for your LLM connector, we recommend adding the exposeConfig: true parameter to the xpack.actions.preconfigured section of the kibana.yml config file. This parameter makes debugging easier by adding configuration information to the debug logs, including which LLM the connector uses.
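For example, a preconfigured OpenAI connector with exposeConfig enabled might look like the following sketch in kibana.yml. The connector ID, name, model, and API key shown here are placeholders; substitute values appropriate for your deployment:

```yaml
xpack.actions.preconfigured:
  my-openai-connector:        # placeholder connector ID
    name: Preconfigured OpenAI connector
    actionTypeId: .gen-ai     # OpenAI connector type
    exposeConfig: true        # adds connector configuration to debug logs
    config:
      apiProvider: OpenAI
      apiUrl: https://api.openai.com/v1/chat/completions
      defaultModel: gpt-4o    # placeholder model name
    secrets:
      apiKey: <your-api-key>  # placeholder; keep secrets out of version control
```

Because the connector is preconfigured, it becomes available in every Kibana space without additional setup in the UI.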