﻿---
title: External inference
description: You can use your own API keys to integrate with third-party model providers like Amazon Bedrock, Anthropic, Azure AI Studio, Cohere, Google AI, Mistral,...
url: https://docs-v3-preview.elastic.dev/elastic/docs-content/pull/5528/explore-analyze/elastic-inference/external
products:
  - Kibana
applies_to:
  - Elastic Cloud Serverless: Generally available
  - Elastic Stack: Planned
---

# External inference
You can use your own API keys to integrate with third-party model providers like Amazon Bedrock, Anthropic, Azure AI Studio, Cohere, Google AI, Mistral, OpenAI, Hugging Face, and more.
The **External inference** app provides an interface for managing external inference models and endpoints.
Available actions include:
- Add a new endpoint
- View endpoint details
- Copy the inference endpoint ID
- Delete endpoints

Alternatively, you can manage endpoints programmatically with the [inference APIs](https://www.elastic.co/docs/api/doc/elasticsearch/group/endpoint-inference).
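For scripted workflows, the same actions map onto REST calls. A minimal sketch, assuming a placeholder deployment URL, API key, and endpoint ID (all hypothetical values you would replace with your own), that prepares a request to list all inference endpoints and another to delete one by ID:

```python
import urllib.request

# Placeholder values: replace with your own deployment URL and API key.
ES_URL = "https://my-deployment.es.example.com"
API_KEY = "<your-api-key>"
headers = {"Authorization": f"ApiKey {API_KEY}"}

# List all inference endpoints (GET _inference/_all).
list_req = urllib.request.Request(f"{ES_URL}/_inference/_all", headers=headers)

# Delete an endpoint by its inference endpoint ID (copied from the UI).
endpoint_id = "my-endpoint"  # placeholder ID
delete_req = urllib.request.Request(
    f"{ES_URL}/_inference/{endpoint_id}", headers=headers, method="DELETE"
)

# Sending is omitted here; pass a request to urllib.request.urlopen() to execute it.
```

The requests are built but not sent, so the sketch stays self-contained; in practice you would execute them against a live cluster.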

## Add new inference endpoint

1. Go to the **External inference** model management page in the navigation menu or use the [global search field](https://docs-v3-preview.elastic.dev/elastic/docs-content/pull/5528/explore-analyze/find-and-organize/find-apps-and-objects).
2. Select the **Add endpoint** button.
3. Select a service from the drop-down menu.
4. Provide the required configuration details.
   For service-specific information, refer to the relevant API documentation.
   For example, [create a JinaAI inference endpoint](https://www.elastic.co/docs/api/doc/elasticsearch/operation/operation-inference-put-jinaai).
5. Select **Save** to create the endpoint.
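The steps above correspond to a single API call. A minimal sketch, using the JinaAI service mentioned in step 4, that assembles the request path and body for `PUT _inference/{task_type}/{inference_id}`; the endpoint ID, API key, and model ID are placeholder values, and the exact service settings are documented in the linked API reference:

```python
import json

# Placeholder endpoint ID and task type for this example.
inference_id = "jinaai-embeddings"
task_type = "text_embedding"

# Request body for PUT _inference/{task_type}/{inference_id}.
body = {
    "service": "jinaai",
    "service_settings": {
        "api_key": "<your-jinaai-api-key>",  # placeholder: your provider API key
        "model_id": "jina-embeddings-v3",    # example model ID
    },
}

path = f"_inference/{task_type}/{inference_id}"
print("PUT", path)
print(json.dumps(body, indent=2))
```

Once created this way, the endpoint appears in the **External inference** app alongside endpoints created through the UI.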