[role="xpack"] [[put-inference-api]] === Create {infer} API experimental[] Creates an {infer} endpoint to perform an {infer} task. IMPORTANT: The {infer} APIs enable you to use certain services, such as built-in {ml} models (ELSER, E5), models uploaded through Eland, Cohere, OpenAI, Mistral, Azure OpenAI, Google AI Studio or Hugging Face. For built-in models and models uploaded though Eland, the {infer} APIs offer an alternative way to use and manage trained models. However, if you do not plan to use the {infer} APIs to use these models or if you want to use non-NLP models, use the <>. [discrete] [[put-inference-api-request]] ==== {api-request-title} `PUT /_inference//` [discrete] [[put-inference-api-prereqs]] ==== {api-prereq-title} * Requires the `manage_inference` <> (the built-in `inference_admin` role grants this privilege) [discrete] [[put-inference-api-desc]] ==== {api-description-title} The create {infer} API enables you to create an {infer} endpoint and configure a {ml} model to perform a specific {infer} task. The following services are available through the {infer} API, click the links to review the configuration details of the services: * <> * <> * <> * <> (for built-in models and models uploaded through Eland) * <> * <> * <> * <> * <>