mirror of
https://github.com/elastic/elasticsearch.git
synced 2025-04-24 23:27:25 -04:00
HLRC: Adding Update datafeed API (#34882)
* HLRC: Adding Update datafeed API
* Addressing unused import
* Adjusting docs and fixing minor comments
* fixing comment
This commit is contained in:
parent
11fa8d3744
commit
052dfa5646
10 changed files with 351 additions and 2 deletions
58
docs/java-rest/high-level/ml/update-datafeed.asciidoc
Normal file
@@ -0,0 +1,58 @@
--
:api: update-datafeed
:request: UpdateDatafeedRequest
:response: PutDatafeedResponse
--
[id="{upid}-{api}"]
=== Update Datafeed API

The Update Datafeed API can be used to update a {ml} datafeed
in the cluster. The API accepts a +{request}+ object
as a request and returns a +{response}+.

[id="{upid}-{api}-request"]
==== Update Datafeed Request

A +{request}+ requires the following argument:

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-request]
--------------------------------------------------
<1> The updated configuration of the {ml} datafeed
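The tagged snippet above lives in the HLRC doc tests and is not reproduced in this diff. As a rough, plain-JDK sketch of what the request carries on the wire: the `DatafeedUpdate` wrapped by a +{request}+ serializes only the settings being changed into a JSON body. The helper class, field values, and rendering below are illustrative assumptions, not HLRC types; the field names follow the datafeed configuration.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch only: renders a partial-update body containing just the fields
// that are actually being changed. Not an HLRC class.
public class UpdateDatafeedBody {

    // Render a flat map of changed settings as a JSON object.
    public static String toJson(Map<String, String> changes) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> e : changes.entrySet()) {
            if (!first) sb.append(',');
            sb.append('"').append(e.getKey()).append("\":\"").append(e.getValue()).append('"');
            first = false;
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        Map<String, String> changes = new LinkedHashMap<>();
        changes.put("job_id", "total-requests");   // illustrative job id
        changes.put("frequency", "30s");           // query interval while running
        // The datafeed id itself travels in the URL path, not the body.
        System.out.println(toJson(changes));
        // prints: {"job_id":"total-requests","frequency":"30s"}
    }
}
```

The point of the sketch is the partial-update shape: unset settings simply do not appear in the body, so the server leaves them untouched.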

[id="{upid}-{api}-config"]
==== Updated Datafeed Arguments

A `DatafeedUpdate` requires an existing non-null `datafeedId` and
allows updating various settings.

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-config]
--------------------------------------------------
<1> Mandatory, non-null `datafeedId` referencing an existing {ml} datafeed
<2> Optional, sets the datafeed aggregations for data gathering
<3> Optional, the indices that contain the data to retrieve and feed into the job
<4> Optional, specifies how data searches are split into time chunks
<5> Optional, the interval at which scheduled queries are made while the datafeed runs in real time
<6> Optional, a query to filter the search results by. Defaults to the `match_all` query
<7> Optional, the time interval behind real time that data is queried
<8> Optional, allows the use of script fields
<9> Optional, the `size` parameter used in the searches
<10> Optional, the `jobId` referencing the job that the datafeed should be associated with
after the update
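Every argument except the `datafeedId` is optional, which implies partial-update semantics: settings left unset in the `DatafeedUpdate` keep their current values on the existing datafeed. A minimal sketch of that merge rule, using an illustrative `Settings` class rather than the real HLRC types:

```java
// Sketch of the update semantics behind DatafeedUpdate: fields left null in
// the update fall back to the existing value, so only explicitly-set
// arguments change. The Settings class here is illustrative, not an HLRC type.
public class DatafeedUpdateSemantics {

    static final class Settings {
        final String jobId;
        final String frequency;
        final Integer scrollSize;
        Settings(String jobId, String frequency, Integer scrollSize) {
            this.jobId = jobId;
            this.frequency = frequency;
            this.scrollSize = scrollSize;
        }
    }

    // Apply a partial update: null update fields keep the current value.
    public static Settings apply(Settings current, Settings update) {
        return new Settings(
            update.jobId != null ? update.jobId : current.jobId,
            update.frequency != null ? update.frequency : current.frequency,
            update.scrollSize != null ? update.scrollSize : current.scrollSize);
    }

    public static void main(String[] args) {
        Settings current = new Settings("total-requests", "10s", 1000);
        Settings update = new Settings(null, "30s", null); // only frequency changes
        Settings result = apply(current, update);
        System.out.println(result.jobId + " " + result.frequency + " " + result.scrollSize);
        // prints: total-requests 30s 1000
    }
}
```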

include::../execution.asciidoc[]

[id="{upid}-{api}-response"]
==== Response

The returned +{response}+ contains the full representation of
the updated {ml} datafeed if it has been successfully updated.

["source","java",subs="attributes,callouts,macros"]
--------------------------------------------------
include-tagged::{doc-tests-file}[{api}-response]
--------------------------------------------------
<1> The updated datafeed
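Over the wire, that full representation is a JSON object with fields such as `datafeed_id` and `job_id`. As an illustrative sketch only (the HLRC parses responses with its XContent machinery, not string scanning), pulling one field out of such a body might look like:

```java
// Sketch only: minimal extraction of a top-level string field from a flat
// JSON object, standing in for what response parsing conceptually does.
public class DatafeedResponseSketch {

    // Return the value of a top-level string field, or null if absent.
    public static String field(String json, String name) {
        String key = "\"" + name + "\":\"";
        int start = json.indexOf(key);
        if (start < 0) return null;
        start += key.length();
        return json.substring(start, json.indexOf('"', start));
    }

    public static void main(String[] args) {
        // Illustrative response body for an updated datafeed.
        String body = "{\"datafeed_id\":\"datafeed-total-requests\","
            + "\"job_id\":\"total-requests\",\"frequency\":\"30s\"}";
        System.out.println(field(body, "datafeed_id"));
        // prints: datafeed-total-requests
    }
}
```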
@@ -240,6 +240,7 @@ The Java High Level REST Client supports the following Machine Learning APIs:
* <<{upid}-update-job>>
* <<{upid}-get-job-stats>>
* <<{upid}-put-datafeed>>
* <<{upid}-update-datafeed>>
* <<{upid}-get-datafeed>>
* <<{upid}-delete-datafeed>>
* <<{upid}-preview-datafeed>>
@@ -266,6 +267,7 @@ include::ml/close-job.asciidoc[]
include::ml/update-job.asciidoc[]
include::ml/flush-job.asciidoc[]
include::ml/put-datafeed.asciidoc[]
include::ml/update-datafeed.asciidoc[]
include::ml/get-datafeed.asciidoc[]
include::ml/delete-datafeed.asciidoc[]
include::ml/preview-datafeed.asciidoc[]