--
:api: put-trained-model
:request: PutTrainedModelRequest
:response: PutTrainedModelResponse
--
[role="xpack"]
[id="{upid}-{api}"]
=== Create trained models API

Creates a new trained model for inference.
The API accepts a +{request}+ object as a request and returns a +{response}+.

[id="{upid}-{api}-request"]
==== Request

A +{request}+ requires the following argument:

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-request]
--------------------------------------------------
<1> The configuration of the {infer} trained model to create
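
Purely as an illustrative sketch (the variable names are placeholders, not the tagged
snippet above), the +{request}+ simply wraps a fully built `TrainedModelConfig`:

["source","java"]
--------------------------------------------------
// Sketch: `config` is assumed to be a fully built TrainedModelConfig,
// as described in the next section.
PutTrainedModelRequest request = new PutTrainedModelRequest(config);
--------------------------------------------------
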
[id="{upid}-{api}-config"]
==== Trained model configuration

The `TrainedModelConfig` object contains all the details about the trained model
configuration and accepts the following arguments:

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-config]
--------------------------------------------------
<1> The {infer} definition for the model
<2> Optionally, if the {infer} definition is large, you may choose to compress it for transport.
Do not supply both the compressed and uncompressed definitions.
<3> The unique model id
<4> The type of model being configured. If not set, the type is inferred from the model definition
<5> The input field names for the model definition
<6> Optionally, a human-readable description
<7> Optionally, an object map containing metadata about the model
<8> Optionally, an array of tags to organize the model
<9> The default inference config to use with the model. It must match the
`target_type` of the underlying definition.
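
As an illustration only (this is not the tagged snippet above), a configuration for a
regression model might be assembled roughly as follows; the model id, field names,
metadata values, and the `definition` variable are placeholders for this sketch:

["source","java"]
--------------------------------------------------
// Illustrative values only; `definition` is assumed to be a previously
// built TrainedModelDefinition for a regression model.
// Uses java.util.Arrays and java.util.Collections.
TrainedModelConfig config = TrainedModelConfig.builder()
    .setDefinition(definition)                                          // model definition
    .setModelId("my-new-trained-model")                                 // unique model id
    .setInput(new TrainedModelInput(Arrays.asList("col1", "col2")))     // input field names
    .setDescription("A regression model trained outside the cluster")   // description
    .setMetadata(
        Collections.<String, Object>singletonMap("author", "example"))  // metadata
    .setTags(Collections.singletonList("regression"))                   // tags
    .setInferenceConfig(new RegressionConfig("predicted_value", 0))     // default inference config
    .build();
--------------------------------------------------
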
include::../execution.asciidoc[]

[id="{upid}-{api}-response"]
==== Response

The returned +{response}+ contains the newly created trained model.
The +{response}+ will omit the model definition as a precaution against
streaming large model definitions back to the client.

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-response]
--------------------------------------------------
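
A brief sketch of reading the stored configuration back from the +{response}+
(variable names are illustrative):

["source","java"]
--------------------------------------------------
// The returned configuration echoes what was stored, without the definition.
TrainedModelConfig createdModel = response.getResponse();
String modelId = createdModel.getModelId();
--------------------------------------------------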