[6.8] [DOCS] Removes Extend your use case section (#49812) (#49925)

Lisa Cawley 2019-11-15 10:39:06 -08:00 committed by GitHub
parent cce96d2893
commit fb91a5d519
9 changed files with 47 additions and 41 deletions

@@ -27,3 +27,7 @@ This page has moved. Please see {stack-ov}/create-jobs.html[Creating {anomaly-jo
 This page has moved. Please see {stack-ov}/job-tips.html[Machine learning job tips].
+[role="exclude",id="extend"]
+== Extend your use case
+This page was deleted. See <<xpack-graph>> and <<xpack-ml>>.

@@ -1,15 +0,0 @@
-[[extend]]
-= Extend your use case
-[partintro]
---
-//TBD
-* <<xpack-graph>>
-* <<xpack-ml>>
---
-include::graph/index.asciidoc[]
-include::ml/index.asciidoc[]

@@ -1,6 +1,6 @@
 [role="xpack"]
 [[graph-configuration]]
-=== Configuring Graph
+== Configuring Graph
 When a user saves a graph workspace in Kibana, it is stored in the `.kibana`
 index along with other saved objects like visualizations and dashboards.
@@ -49,7 +49,7 @@ explicitly selects the include data option.
 [float]
 [[disable-drill-down]]
-==== Disabling drill down configuration
+=== Disabling drill down configuration
 By default, users can configure _drill down_ URLs to display additional
 information about a selected vertex in a new browser window. For example,
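For context on the drill-down feature touched by this hunk: a drill-down is just a URL template into which Graph substitutes the selected vertex terms. A minimal sketch, assuming the `{{gquery}}` placeholder syntax used by the Graph UI (the destination URL is purely illustrative):

```console
https://www.google.com/#q={{gquery}}
```

When a user drills down on a vertex, the selected terms replace the placeholder and the resulting URL opens in a new browser window.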

@@ -1,6 +1,6 @@
 [role="xpack"]
 [[graph-getting-started]]
-=== Getting started
+== Getting started
 Graph is automatically enabled in {es} and {kib}.

@@ -1,7 +1,9 @@
 [role="xpack"]
 [[xpack-graph]]
-== Graphing connections in your data
+= Graphing connections in your data
+[partintro]
+--
 The {graph-features} enable you to discover how items in an
 Elasticsearch index are related. You can explore the connections between
 indexed terms and see which connections are the most meaningful. This can be
@@ -17,9 +19,10 @@ and an interactive graph visualization tool for Kibana. Both work out of the
 box with existing Elasticsearch indices--you don't need to store any
 additional data to use these features.
-[float]
-=== How graphs work
+[discrete]
+[[how-graph-works]]
+== How graphs work
 The Graph API provides an alternative way to extract and summarize information
 about the documents and terms in your Elasticsearch index. A _graph_ is really
 just a network of related items. In our case, this means a network of related
@@ -61,6 +64,7 @@ multi-node clusters and scales with your Elasticsearch deployment.
 Advanced options let you control how your data is sampled and summarized.
 You can also set timeouts to prevent graph queries from adversely
 affecting the cluster.
+--
 include::getting-started.asciidoc[]
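The "How graphs work" text in this file describes the `_graph/explore` endpoint. A minimal request sketch, assuming a hypothetical `clicklogs` index with `query` and `product` fields:

```console
POST clicklogs/_graph/explore
{
  "query": {
    "match": { "query": "midi" }
  },
  "vertices": [
    { "field": "product" }
  ],
  "connections": {
    "vertices": [
      { "field": "query" }
    ]
  }
}
```

The response is a network of `vertices` (terms) and `connections` (co-occurrence relationships between them), which the Kibana Graph UI renders interactively.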

@@ -1,12 +1,12 @@
 [role="xpack"]
 [[graph-limitations]]
-=== Graph limitations
+== Graph limitations
 ++++
 <titleabbrev>Limitations</titleabbrev>
 ++++
-[float]
-==== Limited support for multiple indices
+[discrete]
+=== Limited support for multiple indices
 The graph API can explore multiple indices, types, or aliases in a
 single API request, but the assumption is that each "hop" it performs
 is querying the same set of indices. Currently, it is not possible to
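As a sketch of the limitation described in this hunk: a single explore request can target several indices at once, but every hop queries that same set. The index and field names below are hypothetical:

```console
POST index1,index2/_graph/explore
{
  "vertices": [
    { "field": "user" }
  ],
  "connections": {
    "vertices": [
      { "field": "host" }
    ]
  }
}
```

There is no per-hop index selection; to "walk" from one index into a different one, you would need separate explore requests.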

@@ -1,12 +1,12 @@
 [role="xpack"]
 [[graph-troubleshooting]]
-=== Graph Troubleshooting
+== Graph Troubleshooting
 ++++
 <titleabbrev>Troubleshooting</titleabbrev>
 ++++
-[float]
-==== Why are results missing?
+[discrete]
+=== Why are results missing?
 The default settings in Graph API requests are configured to tune out noisy
 results by using the following strategies:
@@ -29,8 +29,8 @@ of any statistical correlation with the sample.
 * Set the `min_doc_count` for your vertices to 1 to ensure only one document is
 required to assert a relationship.
-[float]
-==== What can I do to improve performance?
+[discrete]
+=== What can I do to improve performance?
 With the default setting of `use_significance` set to `true`, the Graph API
 performs a background frequency check of the terms it discovers as part of
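The tuning knobs this troubleshooting page refers to live in the request's `controls` object and vertex definitions. A hedged sketch of both adjustments (index and field names hypothetical; `timeout` is in milliseconds):

```console
POST clicklogs/_graph/explore
{
  "controls": {
    "use_significance": false,
    "sample_size": 2000,
    "timeout": 5000
  },
  "vertices": [
    { "field": "product", "min_doc_count": 1 }
  ],
  "connections": {
    "vertices": [
      { "field": "query" }
    ]
  }
}
```

Setting `use_significance` to `false` skips the background frequency checks (faster, but noisier results), while `min_doc_count: 1` asserts a relationship from a single matching document.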

@@ -20,7 +20,9 @@ include::timelion.asciidoc[]
 include::canvas.asciidoc[]
-include::extend.asciidoc[]
+include::graph/index.asciidoc[]
+include::ml/index.asciidoc[]
 include::{kib-repo-dir}/maps/index.asciidoc[]

@@ -1,17 +1,14 @@
 [role="xpack"]
 [[xpack-ml]]
-== {ml-cap}
+= {ml-cap}
 [partintro]
 --
 As datasets increase in size and complexity, the human effort required to
 inspect dashboards or maintain rules for spotting infrastructure problems,
-cyber attacks, or business issues becomes impractical. The Elastic {ml}
-{anomaly-detect} feature automatically models the normal behavior of your time
-series data — learning trends, periodicity, and more — in real time to identify
-anomalies, streamline root cause analysis, and reduce false positives.
-{anomaly-detect-cap} runs in and scales with {es}, and includes an
-intuitive UI on the {kib} *Machine Learning* page for creating {anomaly-jobs}
-and understanding results.
+cyber attacks, or business issues becomes impractical. Elastic {ml-features}
+such as {anomaly-detect} make it easier to notice suspicious activities with
+minimal human interference.
@@ -25,9 +22,23 @@ experimental[] You can also upload a CSV, NDJSON, or log file (up to 100 MB in s
 The *Data Visualizer* identifies the file format and field mappings. You can then
 optionally import that data into an {es} index.
-If you have a trial or platinum license, you can
-create {anomaly-jobs} and manage jobs and {dfeeds} from the *Job
-Management* pane:
+--
+[role="xpack"]
+[[xpack-ml-anomalies]]
+== {anomaly-detect-cap}
+The Elastic {ml} {anomaly-detect} feature automatically models the normal
+behavior of your time series data — learning trends, periodicity, and more — in
+real time to identify anomalies, streamline root cause analysis, and reduce
+false positives.
+{anomaly-detect-cap} runs in and scales with {es}, and includes an
+intuitive UI on the {kib} *Machine Learning* page for creating {anomaly-jobs}
+and understanding results.
+If you have a license that includes the {ml} features, you can create
+{anomaly-jobs} and manage jobs and {dfeeds} from the *Job Management* pane:
 [role="screenshot"]
 image::user/ml/images/ml-job-management.jpg[Job Management]
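The *Job Management* pane described above wraps the {ml} REST APIs, so an equivalent {anomaly-job} could be created directly. A sketch using the create-job API (the job ID and time field are hypothetical; note the endpoint prefix differs across versions, `_xpack/ml/` in 6.x versus `_ml/` in 7.x):

```console
PUT _ml/anomaly_detectors/low_request_rate
{
  "analysis_config": {
    "bucket_span": "15m",
    "detectors": [
      { "function": "low_count" }
    ]
  },
  "data_description": {
    "time_field": "@timestamp"
  }
}
```

The job then needs a {dfeed} pointing at the source index before it can analyze data, which the Kibana wizard otherwise sets up for you.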