Backports PR #8934
**Commit 1:**
[docs] Add plugin install instructions for packages. Closes #8845
* Original sha: ad6a4f7b53
* Authored by Jonathan Budzenski <jon@jbudz.me> on 2016-11-02T15:04:28Z
Backports PR #8397
**Commit 1:**
sorting chart xValues by metric sum
* Original sha: 64db47ef08
* Authored by ppisljar <peter.pisljar@gmail.com> on 2016-09-21T09:14:49Z
**Commit 2:**
fixing tests
* Original sha: b6bcd55c11
* Authored by ppisljar <peter.pisljar@gmail.com> on 2016-09-21T09:40:54Z
**Commit 3:**
adding order buckets by value option to point series charts
* Original sha: c03bc99be5
* Authored by ppisljar <peter.pisljar@gmail.com> on 2016-09-22T07:50:02Z
**Commit 4:**
fixing tests
* Original sha: 591729394f
* Authored by ppisljar <peter.pisljar@gmail.com> on 2016-09-22T15:11:40Z
**Commit 5:**
fixing tests
* Original sha: 19bcdfe614
* Authored by ppisljar <peter.pisljar@gmail.com> on 2016-09-22T15:21:07Z
**Commit 6:**
Updating based on CJ's comments and adding documentation
* Original sha: cccc06fa68
* Authored by ppisljar <peter.pisljar@gmail.com> on 2016-09-23T06:06:06Z
Backports PR #8999
**Commit 1:**
Docs: Updated Loading Sample Data to download all datasets from S3. Closes #8997
* Original sha: 780af3a64c
* Authored by debadair <debadair@elastic.co> on 2016-11-07T23:55:50Z
These changes include all three known upgrade scenarios:
1. New installation (K3 -> K5)
2. Standard upgrade (K4.2 -> K5)
3. Standard upgrade with reindex (K4.1 -> K5)
This overhaul of the docs structure puts Kibana's documentation more
in line with the structure used in Elasticsearch. This will help
us better organize the docs going forward as more docs are added.
This also includes a few necessary content changes for 5.0.
Since we introduced the ability for users to manually manipulate some of
the JSON used in Elasticsearch requests in the form of the filter
editor in 4.3, it's possible that saved objects are taking advantage of
then-deprecated and now-removed Elasticsearch functionality.
This calls out that possibility and links to the Elasticsearch breaking
changes documentation for more information. Even though this is
technically not a Kibana BC break, it manifests as if it were an issue
with Kibana since it results in broken
searches/visualizations/dashboards.
Fixes #8511
---------
**Commit 1:**
typo fix for advanced-settings.asciidoc
* Original sha: 45a1e5b8f8
* Authored by fanfan <yu.qifan@zte.com.cn> on 2016-10-13T02:08:25Z
---------
**Commit 1:**
[docs] Init breaking changes
* Original sha: 3e60953880
* Authored by Jonathan Budzenski <jon@jbudz.me> on 2016-09-15T17:35:44Z
**Commit 2:**
[docs] Add separate section for 5.0
* Original sha: 9f0101f286
* Authored by Jonathan Budzenski <jon@jbudz.me> on 2016-09-15T19:09:36Z
**Commit 3:**
[docs] Capitalize breaking changes title
* Original sha: f10a949cad
* Authored by Jonathan Budzenski <jon@jbudz.me> on 2016-10-10T14:17:01Z
Remove Upload Data API
Remove CSV Upload server config
Remove other unused Add Data code
Remove Upload CSV from getting started docs
Remove unused npm deps
---------
**Commit 1:**
Make expected server.basePath usage more clear
* Original sha: 02c9030c6a
* Authored by Matthew Bargar <mbargar@gmail.com> on 2016-09-20T15:48:35Z
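As context for the commit above, a minimal sketch of a `server.basePath` setting in kibana.yml (the path value is hypothetical, and this assumes the Kibana 5 behavior of not rewriting requests itself):

```yaml
# Hypothetical example: serve Kibana behind a reverse proxy at /kibana.
# The value must start with a slash and must not end with one. Kibana
# does not strip the basePath from incoming requests on its own; a
# reverse proxy is expected to forward /kibana/* traffic to Kibana.
server.basePath: "/kibana"
```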
---------
**Commit 1:**
Make CSV upload limit configurable
Kibana's Upload CSV feature isn't intended for gigantic import jobs, so
I originally set a sane default of 1GB. Some users expressed a desire to
import slightly larger files; they should be able to import something
that's 1.1GB without being blocked by an arbitrary limit. So I've made
the limit configurable via kibana.yml.
This change includes a few pieces:
* Added optional `kibana.addDataMaxBytes` key to kibana.yml with 1GB
default
* Set upload data route payload limit based on new config value
* Updated help text in UI to use the dynamic config value on the parse
csv page
* Updated parse csv page to check file size and fail early if the
selected file is too big
Resolves: https://github.com/elastic/kibana/issues/7671
* Original sha: 0503aa8a31
* Authored by Matthew Bargar <mbargar@gmail.com> on 2016-09-12T18:13:44Z
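As a sketch of the setting described above (the value shown is illustrative; per the commit, the key takes a byte count and defaults to 1GB):

```yaml
# Hypothetical example: raise the Upload CSV payload limit from the
# 1GB default to roughly 2GB. The value is a number of bytes.
kibana.addDataMaxBytes: 2147483648
```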
The version shown in the docs always trails the current version of
the project, since the published docs cover the latest release of this
version line rather than the version that is currently in development.
A new server-side configuration, elasticsearch.customHeaders, allows
people to configure any number of custom headers that will get sent
along to all requests to Elasticsearch that are made via the proxy or
exposed client.
This allows for advanced architectures that do things such as dynamic
routing based on install-specific headers.
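A minimal sketch of what such a configuration might look like in kibana.yml (the header name and value are hypothetical; any number of headers can be listed under the key):

```yaml
# Hypothetical example: attach a custom header to every request Kibana
# sends to Elasticsearch, e.g. so an intermediate proxy can route the
# request to a specific cluster.
elasticsearch.customHeaders:
  x-cluster-route: "us-east-1"
```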
When following the description in the docs to set the Elasticsearch URL, I got a warning that the key was deprecated. This updates the docs to use the non-deprecated key. I haven't checked whether other parts of the documentation need the same update.