[DOCS] Documents that deployment_id can be used as inference_id in certain cases. (#121055) (#121059)

István Zoltán Szabó 2025-01-28 17:28:34 +01:00 committed by GitHub
parent 7e03356faf
commit 4614cb1f59


@@ -62,11 +62,14 @@ GET _search
(Required, string) The name of the field that contains the token-weight pairs to be searched against.

`inference_id`::
(Optional, string)
The <<inference-apis,inference ID>> to use to convert the query text into token-weight pairs.
It must be the same inference ID that was used to create the tokens from the input text.
Only one of `inference_id` and `query_vector` is allowed.
If `inference_id` is specified, `query` must also be specified.
If all queried fields are of type <<semantic-text, semantic_text>>, the inference ID associated with the `semantic_text` field will be inferred.
You can reference a `deployment_id` of a {ml} trained model deployment as an `inference_id`.
For example, if you download and deploy the ELSER model in the {ml-cap} trained models UI in {kib}, you can use the `deployment_id` of that deployment as the `inference_id`, as shown in the example after this parameter list.

`query`::
(Optional, string) The query text you want to use for search.
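
A minimal sketch of such a request, assuming a field named `ml.tokens` that holds the token-weight pairs and an ELSER deployment whose `deployment_id` is `my-elser-deployment` (both names are hypothetical placeholders, not values from this change):

[source,console]
----
GET _search
{
  "query": {
    "sparse_vector": {
      "field": "ml.tokens",
      "inference_id": "my-elser-deployment",
      "query": "How is the weather in Jamaica?"
    }
  }
}
----

If every queried field is of type `semantic_text`, the `inference_id` parameter can be omitted, because the inference ID associated with the `semantic_text` field is inferred.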