Mirror of https://github.com/elastic/kibana.git
Merge branch 'master' of github.com:elastic/kibana into implement/npmCleanTask
commit 636bc7d9ce
1113 changed files with 53587 additions and 33563 deletions

@ -30,6 +30,7 @@ rules:
  no-bitwise: 0
  no-caller: 2
  no-cond-assign: 0
  no-const-assign: 2
  no-debugger: 2
  no-empty: 2
  no-eval: 2

@ -4,16 +4,16 @@
At any given time the Kibana team at Elastic is working on dozens of features and enhancements to Kibana and other projects at Elastic. When you file an issue we'll take the time to digest it, consider solutions, and weigh its applicability to both the broad Kibana user base and our own goals for the project. Once we've completed that process we will assign the issue a priority.

- **P1**: A high priority issue that affects almost all Kibana users. Bugs that would cause incorrect results, security issues, and features that would vastly improve the user experience for everyone. Workarounds for P1s generally don't exist without a code change.
- **P2**: A broadly applicable, high visibility issue that enhances the usability of Kibana for a majority of users.
- **P3**: Nice-to-have bug fixes or functionality. Workarounds for P3 items generally exist.
- **P4**: Niche and special interest issues that may not fit our core goals. We would take a high quality pull for this if implemented in such a way that it does not meaningfully impact other functionality or existing code. Issues may also be labeled P4 if they would be better implemented in Elasticsearch.
- **P5**: Highly niche or in opposition to our core goals. Should usually be closed. This doesn't mean we wouldn't take a pull for it, but if someone really wanted this they would be better off working on a plugin. The Kibana team will usually not work on P5 issues but may be willing to assist plugin developers on IRC.

#### How to express the importance of an issue

Let's just get this out there: **Feel free to +1 an issue**. That said, a +1 isn't a vote. We keep up on highly commented issues, but comments are but one of many reasons we might, or might not, work on an issue. A solid write-up of your use case is more likely to make your case than a comment that says *+10000*.

#### My issue isn't getting enough attention

First of all, sorry about that, we want you to have a great time with Kibana! You should join us on IRC (#kibana on freenode) and chat about it. GitHub is terrible for conversations. With that out of the way, there are a number of variables that go into deciding what to work on. These include priority, impact, difficulty, applicability to use cases, and, last but importantly, what we feel like working on.

### I want to help!

**Now we're talking**. If you have a bugfix or new feature that you would like to contribute to Kibana, please **find or open an issue about it before you start working on it.** Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.

@ -74,6 +74,20 @@ optimize:
  lazyPrebuild: false
```

#### SSL

When Kibana runs in development mode it will automatically use bundled SSL certificates. These certificates won't be trusted by your OS by default, which will likely cause your browser to complain about the cert. You can deal with this in a few ways:

1. Supply your own cert using the `config/kibana.dev.yml` file (see the sketch after this list).
1. Configure your OS to trust the cert:
   - OSX: https://www.accuweaver.com/2014/09/19/make-chrome-accept-a-self-signed-certificate-on-osx/
   - Windows: http://stackoverflow.com/a/1412118
   - Linux: http://unix.stackexchange.com/a/90607
1. Click through the warning and accept future warnings.
1. Disable SSL with the `--no-ssl` flag:
   - `npm start -- --no-ssl`
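For the first option, a minimal sketch of a `config/kibana.dev.yml` entry. The file paths are placeholders, but `server.ssl.cert` and `server.ssl.key` are the same keys the dev server wires up in `src/cli/serve/serve.js` later in this commit:

```yaml
# Hypothetical dev-only config; point these at your own cert/key pair.
server.ssl.cert: /path/to/your/dev-server.crt
server.ssl.key: /path/to/your/dev-server.key
```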

#### Linting

A note about linting: We use [eslint](http://eslint.org) to check that the [styleguide](STYLEGUIDE.md) is being followed. It runs in a pre-commit hook and as part of the tests, but most contributors integrate it with their code editors for real-time feedback.
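The pre-commit hook and tests use the npm scripts defined in `package.json` (visible later in this diff); to run the same checks by hand:

```sh
npm run lint        # grunt eslint:source — report style violations
npm run lintroller  # grunt eslint:fixSource — auto-fix what eslint can
```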
@ -133,6 +147,7 @@ Distributable packages can be found in `target/` after the build completes.
Packages are built using fpm, pleaserun, dpkg, and rpm. fpm and pleaserun can be installed using gem. Package building has only been tested on Linux and is not supported on any other platform.
```sh
gem install pleaserun
apt-get install ruby-dev
gem install fpm
npm run build:ospackages
```

@ -1,4 +1,4 @@
Copyright 2012–2014 Elasticsearch BV
Copyright 2012–2015 Elasticsearch BV

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

@ -31,6 +31,8 @@ For example, a chart of dates with incident counts can display dates in chronolo
priority of the incident-reporting aggregation to show the most active dates first. The chronological order might show
a time-dependent pattern in incident count, and sorting by active dates can reveal particular outliers in your data.

include::color-picker.asciidoc[]

You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:

*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.

4 docs/color-picker.asciidoc Normal file
@ -0,0 +1,4 @@
You can customize the colors of your visualization by clicking the color dot next to each label to display the
_color picker_.

image::images/color-picker.png[An array of color dots that users can select]

@ -71,8 +71,13 @@ in your Web page.

NOTE: A user must have Kibana access in order to view embedded dashboards.

Click the *Share* button to display HTML code to embed the dashboard in another Web page, along with a direct link to
the dashboard. You can select the text in either option to copy the code or the link to your clipboard.
To share a dashboard, click the *Share* button image:images/share-dashboard.png[] to display the _Sharing_ panel.

image:images/sharing-panel.png[]

Click the *Copy to Clipboard* button image:images/share-link.png[] to copy the native URL or embed HTML to the clipboard.
Click the *Generate short URL* button image:images/share-short-link.png[] to create a shortened URL for sharing or
embedding.

[float]
[[embedding-dashboards]]

@ -195,7 +195,8 @@ yellow open logstash-2015.05.20 5 1 4750 0 16.4mb
[[tutorial-define-index]]
=== Defining Your Index Patterns

Each set of data loaded to Elasticsearch has an <<settings-create-pattern,index pattern>>. In the previous section, the
Shakespeare data set has an index named `shakespeare`, and the accounts
data set has an index named `bank`. An _index pattern_ is a string with optional wildcards that can match multiple
indices. For example, in the common logging use case, a typical index name contains the date in MM-DD-YYYY
format, and an index pattern for May would look something like `logstash-2015.05*`.

@ -211,6 +212,9 @@ The Logstash data set does contain time-series data, so after clicking *Add New*
set, make sure the *Index contains time-based events* box is checked and select the `@timestamp` field from the
*Time-field name* drop-down.

NOTE: When you define an index pattern, indices that match that pattern must exist in Elasticsearch. Those indices must
contain data.

[float]
[[tutorial-discovering]]
=== Discovering Your Data

@ -288,8 +292,10 @@ This shows you what proportion of the 1000 accounts fall in these balance ranges
we're going to add another bucket aggregation. We can break down each of the balance ranges further by the account
holder's age.

Click *Add sub-buckets* at the bottom, then select *Split Slices*. Choose the *Terms* aggregation and the *age* field from
the drop-downs.
Click the green *Apply changes* button image:images/apply-changes-button.png[] to add an external ring with the new
results.

image::images/tutorial-visualize-pie-3.png[]

@ -321,7 +327,8 @@ as well as change many other options for your visualizations, by clicking the *O
Now that you have a list of the smallest casts for Shakespeare plays, you might also be curious to see which of these
plays makes the greatest demands on an individual actor by showing the maximum number of speeches for a given part. Add
a Y-axis aggregation with the *Add metrics* button, then choose the *Max* aggregation for the *speech_number* field. In
the *Options* tab, change the *Bar Mode* drop-down to *grouped*, then click the green *Apply changes* button
image:images/apply-changes-button.png[]. Your
chart should now look like this:

image::images/tutorial-visualize-bar-3.png[]

@ -371,7 +378,8 @@ Write the following text in the field:
The Markdown widget uses **markdown** syntax.
> Blockquotes in Markdown use the > character.

Click the green *Apply changes* button image:images/apply-changes-button.png[] to display the rendered Markdown in the
preview pane:

image::images/tutorial-visualize-md-2.png[]

BIN docs/images/color-picker.png Normal file (new; 28 KiB, binary file not shown)
BIN docs/images/share-dashboard.png Normal file (new; 554 B, binary file not shown)
BIN docs/images/share-link.png Normal file (new; 589 B, binary file not shown)
BIN docs/images/share-short-link.png Normal file (new; 507 B, binary file not shown)
BIN docs/images/sharing-panel.png Normal file (new; 62 KiB, binary file not shown)

@ -1,12 +1,12 @@
[[setup-repositories]]
=== Kibana Repositories

Binary packages for Kibana are available for Unix distributions that support the `apt` and `yum` tools. We also have
repositories available for APT and YUM based distributions.

NOTE: Since the packages are created as part of the Kibana build, source packages are not available.

Packages are signed with the PGP key http://pgp.mit.edu/pks/lookup?op=vindex&search=0xD27D666CD88E42B4[D88E42B4], which
has the following fingerprint:

    4609 5ACC 8548 582C 1A26 99A9 D27D 666C D88E 42B4

@ -31,7 +31,7 @@ echo "deb http://packages.elastic.co/kibana/{branch}/debian stable main" | sudo
+
[WARNING]
==================================================
Use the `echo` method described above to add the Kibana repository. Do not use `add-apt-repository`, as that command
adds a `deb-src` entry with no corresponding source package.
When the `deb-src` entry is present, the commands in this procedure generate an error similar to the following:

@ -47,7 +47,7 @@ Delete the `deb-src` entry from the `/etc/apt/sources.list` file to clear the er
sudo apt-get update && sudo apt-get install kibana
--------------------------------------------------
+
. Configure Kibana to automatically start during bootup. If your distribution is using the System V version of `init`,
run the following command:
+
[source,sh]

@ -67,7 +67,7 @@ sudo /bin/systemctl enable kibana.service
[[kibana-yum]]
===== Installing Kibana with yum

WARNING: The repositories set up in this procedure are not compatible with distributions using version 3 of `rpm`, such
as CentOS version 5.

. Download and install the public signing key:

@ -96,8 +96,8 @@ enabled=1
yum install kibana
--------------------------------------------------
+
Configure Kibana to automatically start during bootup. If your distribution is using the System V version of `init`,
run the following command:
Configure Kibana to automatically start during bootup. If your distribution is using the System V version of `init`
(check with `ps -p 1`), run the following command:
+
[source,sh]
--------------------------------------------------

@ -11,6 +11,8 @@ if the splits are displayed in a row or a column by clicking the *Rows | Columns

include::x-axis-aggs.asciidoc[]

include::color-picker.asciidoc[]

You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:

*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.

@ -55,6 +55,8 @@ types.
When multiple aggregations are defined on a chart's axis, you can use the up or down arrows to the right of the
aggregation's type to change the aggregation's priority.

include::color-picker.asciidoc[]

You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:

*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.

@ -42,7 +42,7 @@ kibana_elasticsearch_password: kibana4-password
----

Kibana 4 users also need access to the `.kibana` index so they can save and load searches, visualizations, and dashboards.
For more information, see {shield}/kibana.html#kibana4-server-role[Configuring Roles for Kibana 4 Users] in
For more information, see {shield}/kibana.html[Using Kibana with Shield] in
the Shield documentation.

TIP: See <<kibana-dynamic-mapping, Kibana and Elasticsearch Dynamic Mapping>> for important information on Kibana and

@ -1,27 +1,27 @@
[[releasenotes]]
== Kibana 4.3 Release Notes
== Kibana 4.4 Release Notes

The 4.3 release of Kibana requires Elasticsearch 2.1 or later.
The 4.4 release of Kibana requires Elasticsearch 2.2 or later.

Using event times to create index names is *deprecated* in this release of Kibana. Support for this functionality will be
removed entirely in the next major Kibana release. Elasticsearch 2.1 includes sophisticated date parsing APIs that Kibana
uses to determine date information, removing the need to specify dates in the index pattern name.
Using event times to create index names is no longer supported as of this release. Current versions of Elasticsearch
include sophisticated date parsing APIs that Kibana uses to determine date information, removing the need to specify dates
in the index pattern name.

[float]
[[enhancements]]
== Enhancements

* {k4issue}5109[Issue 5109]: Adds custom JSON and filter alias naming for filters.
* {k4issue}1726[Issue 1726]: Adds a color field formatter for value ranges in numeric fields.
* {k4issue}4342[Issue 4342]: Increased performance for wildcard indices.
* {k4issue}1600[Issue 1600]: Support for global time zones.
* {k4pull}5275[Pull Request 5275]: Highlighting values in Discover can now be disabled.
* {k4issue}5212[Issue 5212]: Adds support for multiple certificate authorities.
* {k4issue}2716[Issue 2716]: The open/closed position of the spy panel now persists across UI state changes.
// * {k4issue}5109[Issue 5109]: Adds custom JSON and filter alias naming for filters.
// * {k4issue}1726[Issue 1726]: Adds a color field formatter for value ranges in numeric fields.
// * {k4issue}4342[Issue 4342]: Increased performance for wildcard indices.
// * {k4issue}1600[Issue 1600]: Support for global time zones.
// * {k4pull}5275[Pull Request 5275]: Highlighting values in Discover can now be disabled.
// * {k4issue}5212[Issue 5212]: Adds support for multiple certificate authorities.
// * {k4issue}2716[Issue 2716]: The open/closed position of the spy panel now persists across UI state changes.

[float]
[[bugfixes]]
== Bug Fixes

* {k4issue}5165[Issue 5165]: Resolves a display error in embedded views.
* {k4issue}5021[Issue 5021]: Improves visualization dimming for dashboards with auto-refresh.
// * {k4issue}5165[Issue 5165]: Resolves a display error in embedded views.
// * {k4issue}5021[Issue 5021]: Improves visualization dimming for dashboards with auto-refresh.

@ -35,11 +35,17 @@ list.
contains time-based events* option and select the index field that contains the timestamp. Kibana reads the index
mapping to list all of the fields that contain a timestamp.

. By default, Kibana restricts wildcard expansion of time-based index patterns to indices with data within the currently
selected time range. Click *Do not expand index pattern when searching* to disable this behavior.

. Click *Create* to add the index pattern.

. To designate the new pattern as the default pattern to load when you view the Discover tab, click the *favorite*
button.

NOTE: When you define an index pattern, indices that match that pattern must exist in Elasticsearch. Those indices must
contain data.

To use an event time in an index name, enclose the static text in the pattern and specify the date format using the
tokens described in the following table.
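As an illustration of the shape such a pattern takes (the full token table follows in the original document), static text goes in brackets and the date portion is built from the format tokens:

```
[logstash-]YYYY.MM.DD
```

Here `[logstash-]` is literal text and `YYYY.MM.DD` is assembled from date tokens, matching daily indices such as the `logstash-2015.05.20` index shown earlier in this diff.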
@ -195,6 +201,8 @@ Scripted fields compute data on the fly from the data in your Elasticsearch indi
the Discover tab as part of the document data, and you can use scripted fields in your visualizations.
Scripted field values are computed at query time, so they aren't indexed and cannot be searched.

NOTE: Kibana cannot query scripted fields.

WARNING: Computing data on the fly with scripted fields can be very resource intensive and can have a direct impact on
Kibana's performance. Keep in mind that there's no built-in validation of a scripted field. If your scripts are
buggy, you'll get exceptions whenever you try to view the dynamically generated data.
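To make the cost concrete: a scripted field is a small script evaluated once per matching document at query time. A hypothetical example (assuming Groovy scripting is enabled in Elasticsearch, and a numeric `bytes` field; neither is specified by this document):

```
doc['bytes'].value / 1024
```

Every document returned by every query that touches the field pays this evaluation cost, which is why the WARNING above matters.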
@ -449,10 +457,12 @@ To export a set of objects:
. Click the selection box for the objects you want to export, or click the *Select All* box.
. Click *Export* to select a location to write the exported JSON.

WARNING: Exported dashboards do not include their associated index patterns. Re-create the index patterns manually before
importing saved dashboards to a Kibana instance running on another Elasticsearch cluster.

To import a set of objects:

. Go to *Settings > Objects*.
. Click *Import* to navigate to the JSON file representing the set of objects to import.
. Click *Open* after selecting the JSON file.
. If any objects in the set would overwrite objects already present in Kibana, confirm the overwrite.

@ -34,6 +34,8 @@ if the splits are displayed in a row or a column by clicking the *Rows | Columns

include::x-axis-aggs.asciidoc[]

include::color-picker.asciidoc[]

You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:

*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.

11 package.json
@ -53,10 +53,11 @@
    "precommit": "grunt precommit",
    "karma": "karma start",
    "elasticsearch": "grunt esvm:dev:keepalive",
    "elasticsearchWithPlugins": "grunt esvm:withPlugins:keepalive",
    "lint": "grunt eslint:source",
    "lintroller": "grunt eslint:fixSource",
    "mocha": "mocha --compilers js:babel/register",
    "mocha:debug": "mocha --debug-brk --compilers js:babel/register",
    "mocha": "mocha",
    "mocha:debug": "mocha --debug-brk",
    "clean": "grunt scrub"
  },
  "repository": {

@ -104,6 +105,7 @@
    "good-squeeze": "2.1.0",
    "gridster": "0.5.6",
    "hapi": "8.8.1",
    "httpolyglot": "0.1.1",
    "imports-loader": "0.6.4",
    "jade": "1.11.0",
    "jade-loader": "0.7.1",

@ -141,7 +143,7 @@
    "Nonsense": "0.1.2",
    "angular-mocks": "1.4.7",
    "auto-release-sinon": "1.0.3",
    "babel-eslint": "4.1.3",
    "babel-eslint": "4.1.7",
    "chokidar": "1.0.5",
    "eslint": "1.5.1",
    "eslint-plugin-mocha": "1.0.0",

@ -181,7 +183,7 @@
    "portscanner": "1.0.0",
    "simple-git": "1.8.0",
    "sinon": "1.17.2",
    "source-map": "0.4.4"
    "source-map": "0.4.4",
    "supertest-as-promised": "2.0.2"
  },
  "engines": {
    "node": "4.2.4",

@ -1,9 +1,9 @@
let _ = require('lodash');
let Command = require('commander').Command;
import _ from 'lodash';

let red = require('./color').red;
let yellow = require('./color').yellow;
let help = require('./help');
import help from './help';
import { Command } from 'commander';
import { red } from './color';
import { yellow } from './color';

Command.prototype.error = function (err) {
  if (err && err.message) err = err.message;

@ -1,11 +1,11 @@
let _ = require('lodash');
let ansicolors = require('ansicolors');
import _ from 'lodash';
import ansicolors from 'ansicolors';

let log = _.restParam(function (color, label, rest1) {
  console.log.apply(console, [color(` ${_.trim(label)} `)].concat(rest1));
});

let color = require('./color');
import color from './color';

module.exports = class Log {
  constructor(quiet, silent) {

@ -1,8 +1,8 @@
let _ = require('lodash');
import _ from 'lodash';

let utils = require('requirefrom')('src/utils');
let pkg = utils('packageJson');
let Command = require('./Command');
import Command from './Command';

let argv = process.env.kbnWorkerArgv ? JSON.parse(process.env.kbnWorkerArgv) : process.argv.slice();
let program = new Command('bin/kibana');

98 src/cli/cluster/base_path_proxy.js Normal file
@ -0,0 +1,98 @@
import { Server } from 'hapi';
import { notFound } from 'boom';
import { merge, sample } from 'lodash';
import { format as formatUrl } from 'url';
import { fromNode } from 'bluebird';
import { Agent as HttpsAgent } from 'https';
import { readFileSync } from 'fs';

import Config from '../../server/config/config';
import setupConnection from '../../server/http/setup_connection';
import setupLogging from '../../server/logging';

const alphabet = 'abcdefghijklmnopqrstuvwxyz'.split('');

export default class BasePathProxy {
  constructor(clusterManager, userSettings) {
    this.clusterManager = clusterManager;
    this.server = new Server();

    const config = Config.withDefaultSchema(userSettings);

    this.targetPort = config.get('dev.basePathProxyTarget');
    this.basePath = config.get('server.basePath');

    const { cert } = config.get('server.ssl');
    if (cert) {
      this.proxyAgent = new HttpsAgent({
        ca: readFileSync(cert)
      });
    }

    if (!this.basePath) {
      this.basePath = `/${sample(alphabet, 3).join('')}`;
      config.set('server.basePath', this.basePath);
    }

    setupLogging(null, this.server, config);
    setupConnection(null, this.server, config);
    this.setupRoutes();
  }

  setupRoutes() {
    const { server, basePath, targetPort } = this;

    server.route({
      method: 'GET',
      path: '/',
      handler(req, reply) {
        return reply.redirect(basePath);
      }
    });

    server.route({
      method: '*',
      path: `${basePath}/{kbnPath*}`,
      handler: {
        proxy: {
          passThrough: true,
          xforward: true,
          agent: this.proxyAgent,
          mapUri(req, callback) {
            callback(null, formatUrl({
              protocol: server.info.protocol,
              hostname: server.info.host,
              port: targetPort,
              pathname: req.params.kbnPath,
              query: req.query,
            }));
          }
        }
      }
    });

    server.route({
      method: '*',
      path: `/{oldBasePath}/{kbnPath*}`,
      handler(req, reply) {
        const {oldBasePath, kbnPath = ''} = req.params;

        const isGet = req.method === 'get';
        const isBasePath = oldBasePath.length === 3;
        const isApp = kbnPath.slice(0, 4) === 'app/';

        if (isGet && isBasePath && isApp) {
          return reply.redirect(`${basePath}/${kbnPath}`);
        }

        return reply(notFound());
      }
    });
  }

  async listen() {
    await fromNode(cb => this.server.start(cb));
    this.server.log(['listening', 'info'], `basePath Proxy running at ${this.server.info.uri}${this.basePath}`);
  }

}
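The cluster manager below is the only consumer of this class; paraphrasing the `cluster_manager.js` hunks later in this diff (not a new API), the wiring looks roughly like:

```js
// Sketch of how ClusterManager uses the proxy (see cluster_manager.js below).
const basePathProxy = new BasePathProxy(clusterManager, settings);

// The real server worker is told to bind to the proxy's target port,
// behind the random three-letter base path:
//   --server.port=${basePathProxy.targetPort}
//   --server.basePath=${basePathProxy.basePath}

basePathProxy.listen(); // starts the hapi proxy in front of the dev server
```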
@ -1,30 +1,52 @@
let cluster = require('cluster');
let { join } = require('path');
let { debounce, compact, invoke, bindAll, once } = require('lodash');
import cluster from 'cluster';
const { join } = require('path');
const { format: formatUrl } = require('url');
import Hapi from 'hapi';
const { debounce, compact, get, invoke, bindAll, once, sample } = require('lodash');

let Log = require('../Log');
let Worker = require('./worker');
import Log from '../Log';
import Worker from './worker';
import BasePathProxy from './base_path_proxy';

process.env.kbnWorkerType = 'managr';

module.exports = class ClusterManager {
  constructor(opts) {
  constructor(opts, settings) {
    this.log = new Log(opts.quiet, opts.silent);
    this.addedCount = 0;

    const serverArgv = [];
    const optimizerArgv = [
      '--plugins.initialize=false',
      '--server.autoListen=false',
    ];

    if (opts.basePath) {
      this.basePathProxy = new BasePathProxy(this, settings);

      optimizerArgv.push(
        `--server.basePath=${this.basePathProxy.basePath}`
      );

      serverArgv.push(
        `--server.port=${this.basePathProxy.targetPort}`,
        `--server.basePath=${this.basePathProxy.basePath}`
      );
    }

    this.workers = [
      this.optimizer = new Worker({
        type: 'optmzr',
        title: 'optimizer',
        log: this.log,
        argv: compact([
          '--plugins.initialize=false',
          '--server.autoListen=false'
        ]),
        argv: optimizerArgv,
        watch: false
      }),

      this.server = new Worker({
        type: 'server',
        log: this.log
        log: this.log,
        argv: serverArgv
      })
    ];

@ -48,12 +70,15 @@ module.exports = class ClusterManager {
  startCluster() {
    this.setupManualRestart();
    invoke(this.workers, 'start');
    if (this.basePathProxy) {
      this.basePathProxy.listen();
    }
  }

  setupWatching() {
    var chokidar = require('chokidar');
    let utils = require('requirefrom')('src/utils');
    let fromRoot = utils('fromRoot');
    const chokidar = require('chokidar');
    const utils = require('requirefrom')('src/utils');
    const fromRoot = utils('fromRoot');

    this.watcher = chokidar.watch([
      'src/plugins',

@ -81,12 +106,12 @@ module.exports = class ClusterManager {
  }

  setupManualRestart() {
    let readline = require('readline');
    let rl = readline.createInterface(process.stdin, process.stdout);
    const readline = require('readline');
    const rl = readline.createInterface(process.stdin, process.stdout);

    let nls = 0;
    let clear = () => nls = 0;
    let clearSoon = debounce(clear, 2000);
    const clear = () => nls = 0;
    const clearSoon = debounce(clear, 2000);

    rl.setPrompt('');
    rl.prompt();

|
|||
let _ = require('lodash');
|
||||
let cluster = require('cluster');
|
||||
import _ from 'lodash';
|
||||
import cluster from 'cluster';
|
||||
let { resolve } = require('path');
|
||||
let { EventEmitter } = require('events');
|
||||
|
||||
let fromRoot = require('../../utils/fromRoot');
|
||||
import fromRoot from '../../utils/fromRoot';
|
||||
|
||||
let cliPath = fromRoot('src/cli');
|
||||
let baseArgs = _.difference(process.argv.slice(2), ['--no-watch']);
|
||||
|
|
|
@ -1,6 +1,6 @@

var _ = require('lodash');
var ansicolors = require('ansicolors');
import _ from 'lodash';
import ansicolors from 'ansicolors';

exports.green = _.flow(ansicolors.black, ansicolors.bgGreen);
exports.red = _.flow(ansicolors.white, ansicolors.bgRed);

@ -1,4 +1,4 @@
var _ = require('lodash');
import _ from 'lodash';

module.exports = function (command, spaces) {
  if (!_.size(command.commands)) {

31 src/cli/plugin/__tests__/file_type.js Normal file
@ -0,0 +1,31 @@
import expect from 'expect.js';
import fileType, { ZIP, TAR } from '../file_type';

describe('kibana cli', function () {
  describe('file_type', function () {
    it('returns ZIP for .zip filename', function () {
      const type = fileType('wat.zip');
      expect(type).to.equal(ZIP);
    });
    it('returns TAR for .tar.gz filename', function () {
      const type = fileType('wat.tar.gz');
      expect(type).to.equal(TAR);
    });
    it('returns TAR for .tgz filename', function () {
      const type = fileType('wat.tgz');
      expect(type).to.equal(TAR);
    });
    it('returns undefined for unknown file type', function () {
      const type = fileType('wat.unknown');
      expect(type).to.equal(undefined);
    });
    it('accepts paths', function () {
      const type = fileType('/some/path/to/wat.zip');
      expect(type).to.equal(ZIP);
    });
    it('accepts urls', function () {
      const type = fileType('http://example.com/wat.zip');
      expect(type).to.equal(ZIP);
    });
  });
});

@ -1,6 +1,6 @@
const expect = require('expect.js');
const sinon = require('sinon');
const plugin = require('../plugin');
import expect from 'expect.js';
import sinon from 'sinon';
import plugin from '../plugin';

describe('kibana cli', function () {

@ -1,10 +1,10 @@
const expect = require('expect.js');
const sinon = require('sinon');
const fs = require('fs');
const rimraf = require('rimraf');
import expect from 'expect.js';
import sinon from 'sinon';
import fs from 'fs';
import rimraf from 'rimraf';

const pluginCleaner = require('../plugin_cleaner');
const pluginLogger = require('../plugin_logger');
import pluginCleaner from '../plugin_cleaner';
import pluginLogger from '../plugin_logger';

describe('kibana cli', function () {

@ -1,12 +1,12 @@
const expect = require('expect.js');
const sinon = require('sinon');
const nock = require('nock');
const glob = require('glob');
const rimraf = require('rimraf');
const { join } = require('path');
const mkdirp = require('mkdirp');
const pluginLogger = require('../plugin_logger');
const pluginDownloader = require('../plugin_downloader');
import expect from 'expect.js';
import sinon from 'sinon';
import nock from 'nock';
import glob from 'glob';
import rimraf from 'rimraf';
import mkdirp from 'mkdirp';
import pluginLogger from '../plugin_logger';
import pluginDownloader from '../plugin_downloader';
import { join } from 'path';

describe('kibana cli', function () {

@ -124,6 +124,25 @@ describe('kibana cli', function () {
    });
  });

  it('should consider .tgz files as archive type .tar.gz', function () {
    const filePath = join(__dirname, 'replies/test_plugin_master.tar.gz');

    const couchdb = nock('http://www.files.com')
      .defaultReplyHeaders({
        'content-length': '10'
      })
      .get('/plugin.tgz')
      .replyWithFile(200, filePath);

    const sourceUrl = 'http://www.files.com/plugin.tgz';

    return downloader._downloadSingle(sourceUrl)
      .then(function (data) {
        expect(data.archiveType).to.be('.tar.gz');
        expectWorkingPathNotEmpty();
      });
  });

  it('should download a zip from a valid http url', function () {
    const filePath = join(__dirname, 'replies/test_plugin_master.zip');

@ -1,13 +1,13 @@
const expect = require('expect.js');
const sinon = require('sinon');
const glob = require('glob');
const rimraf = require('rimraf');
const { join } = require('path');
const mkdirp = require('mkdirp');
import expect from 'expect.js';
import sinon from 'sinon';
import glob from 'glob';
import rimraf from 'rimraf';
import mkdirp from 'mkdirp';

const pluginLogger = require('../plugin_logger');
const extract = require('../plugin_extractor');
const pluginDownloader = require('../plugin_downloader');
import pluginLogger from '../plugin_logger';
import extract from '../plugin_extractor';
import pluginDownloader from '../plugin_downloader';
import { join } from 'path';

describe('kibana cli', function () {

@ -1,10 +1,10 @@
const expect = require('expect.js');
const sinon = require('sinon');
const rimraf = require('rimraf');
const { mkdirSync } = require('fs');
const { join } = require('path');
const pluginLogger = require('../plugin_logger');
const pluginInstaller = require('../plugin_installer');
import expect from 'expect.js';
import sinon from 'sinon';
import rimraf from 'rimraf';
import pluginLogger from '../plugin_logger';
import pluginInstaller from '../plugin_installer';
import { mkdirSync } from 'fs';
import { join } from 'path';

describe('kibana cli', function () {

@ -1,6 +1,6 @@
const expect = require('expect.js');
const sinon = require('sinon');
const pluginLogger = require('../plugin_logger');
import expect from 'expect.js';
import sinon from 'sinon';
import pluginLogger from '../plugin_logger';

describe('kibana cli', function () {

@ -1,7 +1,7 @@
const expect = require('expect.js');
const sinon = require('sinon');
const progressReporter = require('../progress_reporter');
const pluginLogger = require('../plugin_logger');
import expect from 'expect.js';
import sinon from 'sinon';
import progressReporter from '../progress_reporter';
import pluginLogger from '../plugin_logger';

describe('kibana cli', function () {

@ -1,9 +1,9 @@
var path = require('path');
var expect = require('expect.js');
import path from 'path';
import expect from 'expect.js';

var utils = require('requirefrom')('src/utils');
var fromRoot = utils('fromRoot');
var settingParser = require('../setting_parser');
import settingParser from '../setting_parser';

describe('kibana cli', function () {

@ -76,11 +76,11 @@ describe('kibana cli', function () {
      options = { install: 'dummy/dummy', pluginDir: fromRoot('installedPlugins') };
    });

    it('should require the user to specify either install and remove', function () {
    it('should require the user to specify either install, remove, or list', function () {
      options.install = null;
      parser = settingParser(options);

      expect(parser.parse).withArgs().to.throwError(/Please specify either --install or --remove./);
      expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
    });

    it('should not allow the user to specify both install and remove', function () {

@ -88,7 +88,32 @@ describe('kibana cli', function () {
      options.install = 'org/package/version';
      parser = settingParser(options);

      expect(parser.parse).withArgs().to.throwError(/Please specify either --install or --remove./);
      expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
    });

    it('should not allow the user to specify both install and list', function () {
      options.list = true;
      options.install = 'org/package/version';
      parser = settingParser(options);

      expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
    });

    it('should not allow the user to specify both remove and list', function () {
      options.list = true;
      options.remove = 'package';
      parser = settingParser(options);

      expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
    });

    it('should not allow the user to specify install, remove, and list', function () {
      options.list = true;
      options.install = 'org/package/version';
      options.remove = 'package';
      parser = settingParser(options);

      expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
    });

    describe('quiet option', function () {

@ -293,7 +318,7 @@ describe('kibana cli', function () {
  describe('remove option', function () {

    it('should set settings.action property to "remove"', function () {
      options.install = null;
      delete options.install;
      options.remove = 'package';
      parser = settingParser(options);

@ -303,7 +328,7 @@ describe('kibana cli', function () {
    });

    it('should allow one part to the remove parameter', function () {
      options.install = null;
      delete options.install;
      options.remove = 'test-plugin';
      parser = settingParser(options);

@ -312,8 +337,8 @@ describe('kibana cli', function () {
      expect(settings).to.have.property('package', 'test-plugin');
    });

    it('should not allow more than one part to the install parameter', function () {
      options.install = null;
    it('should not allow more than one part to the remove parameter', function () {
      delete options.install;
      options.remove = 'kibana/test-plugin';
      parser = settingParser(options);

@ -322,7 +347,7 @@ describe('kibana cli', function () {
    });

    it('should populate the pluginPath', function () {
      options.install = null;
      delete options.install;
      options.remove = 'test-plugin';
      parser = settingParser(options);

@ -334,6 +359,21 @@ describe('kibana cli', function () {

    });

    describe('list option', function () {

      it('should set settings.action property to "list"', function () {
        delete options.install;
        delete options.remove;
        options.list = true;
        parser = settingParser(options);

        var settings = parser.parse();

        expect(settings).to.have.property('action', 'list');
      });

    });

  });

});

@ -1,5 +1,6 @@
const { createWriteStream, createReadStream, unlinkSync, statSync } = require('fs');
const getProgressReporter = require('../progress_reporter');
import getProgressReporter from '../progress_reporter';
import { createWriteStream, createReadStream, unlinkSync, statSync } from 'fs';
import fileType from '../file_type';

function openSourceFile({ sourcePath }) {
  try {

@ -36,15 +37,6 @@ async function copyFile({ readStream, writeStream, progressReporter }) {
  });
}

function getArchiveTypeFromFilename(path) {
  if (/\.zip$/i.test(path)) {
    return '.zip';
  }
  if (/\.tar\.gz$/i.test(path)) {
    return '.tar.gz';
  }
}

/*
// Responsible for managing local file transfers
*/

@ -67,7 +59,7 @@ export default async function copyLocalFile(logger, sourcePath, targetPath) {
  }

  // all is well, return our archive type
  const archiveType = getArchiveTypeFromFilename(sourcePath);
  const archiveType = fileType(sourcePath);
  return { archiveType };
} catch (err) {
  logger.error(err);

@ -1,7 +1,8 @@
const { fromNode: fn } = require('bluebird');
const { createWriteStream, unlinkSync } = require('fs');
const Wreck = require('wreck');
const getProgressReporter = require('../progress_reporter');
import Wreck from 'wreck';
import getProgressReporter from '../progress_reporter';
import { fromNode as fn } from 'bluebird';
import { createWriteStream, unlinkSync } from 'fs';
import fileType, { ZIP, TAR } from '../file_type';

function sendRequest({ sourceUrl, timeout }) {
  const maxRedirects = 11; //Because this one goes to 11.

@ -49,18 +50,12 @@ function getArchiveTypeFromResponse(resp, sourceUrl) {
  const contentType = (resp.headers['content-type'] || '');

  switch (contentType.toLowerCase()) {
    case 'application/zip': return '.zip';
    case 'application/x-gzip': return '.tar.gz';
    case 'application/zip': return ZIP;
    case 'application/x-gzip': return TAR;
    default:
      //If we can't infer the archive type from the content-type header,
      //fall back to checking the extension in the url
      if (/\.zip$/i.test(sourceUrl)) {
        return '.zip';
      }
      if (/\.tar\.gz$/i.test(sourceUrl)) {
        return '.tar.gz';
      }
      break;
      return fileType(sourceUrl);
  }
}

@ -1,6 +1,6 @@
const zlib = require('zlib');
const fs = require('fs');
const tar = require('tar');
import zlib from 'zlib';
import fs from 'fs';
import tar from 'tar';

async function extractArchive(settings) {
  await new Promise((resolve, reject) => {

@ -1,4 +1,4 @@
const DecompressZip = require('@bigfunger/decompress-zip');
import DecompressZip from '@bigfunger/decompress-zip';

async function extractArchive(settings) {
  await new Promise((resolve, reject) => {

14 src/cli/plugin/file_type.js Normal file
@ -0,0 +1,14 @@
export const TAR = '.tar.gz';
export const ZIP = '.zip';

export default function fileType(filename) {
  if (/\.zip$/i.test(filename)) {
    return ZIP;
  }
  if (/\.tar\.gz$/i.test(filename)) {
    return TAR;
  }
  if (/\.tgz$/i.test(filename)) {
    return TAR;
  }
}
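This helper centralizes the extension sniffing that was previously duplicated in the file and http downloaders (see the hunks above). Its behavior, as pinned down by the new tests earlier in this diff:

```js
// Behavior per __tests__/file_type.js in this same commit:
fileType('wat.zip');                    // ZIP  ('.zip')
fileType('wat.tar.gz');                 // TAR  ('.tar.gz')
fileType('wat.tgz');                    // TAR  ('.tgz' is treated as '.tar.gz')
fileType('/some/path/to/wat.zip');      // ZIP  (paths work)
fileType('http://example.com/wat.zip'); // ZIP  (urls work)
fileType('wat.unknown');                // undefined
```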
@ -1,9 +1,10 @@
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
const settingParser = require('./setting_parser');
const installer = require('./plugin_installer');
const remover = require('./plugin_remover');
const pluginLogger = require('./plugin_logger');
import settingParser from './setting_parser';
import installer from './plugin_installer';
import remover from './plugin_remover';
import lister from './plugin_lister';
import pluginLogger from './plugin_logger';

export default function pluginCli(program) {
  function processCommand(command, options) {

@ -18,11 +19,16 @@ export default function pluginCli(program) {

    const logger = pluginLogger(settings);

    if (settings.action === 'install') {
      installer.install(settings, logger);
    }
    if (settings.action === 'remove') {
      remover.remove(settings, logger);
    switch (settings.action) {
      case 'install':
        installer.install(settings, logger);
        break;
      case 'remove':
        remover.remove(settings, logger);
        break;
      case 'list':
        lister.list(settings, logger);
        break;
    }
  }

@ -30,6 +36,7 @@ export default function pluginCli(program) {
    .command('plugin')
    .option('-i, --install <org>/<plugin>/<version>', 'The plugin to install')
    .option('-r, --remove <plugin>', 'The plugin to remove')
    .option('-l, --list', 'List installed plugins')
    .option('-q, --quiet', 'Disable all process messaging except errors')
    .option('-s, --silent', 'Disable all process messaging')
    .option('-u, --url <url>', 'Specify download url')
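Taken together with the new `plugin_lister.js` below, this wires a third action into the plugin CLI; hypothetically, from a Kibana install:

```sh
# List installed plugins using the new -l/--list option added in this hunk:
bin/kibana plugin --list
```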
@ -1,5 +1,5 @@
const rimraf = require('rimraf');
const fs = require('fs');
import rimraf from 'rimraf';
import fs from 'fs';

export default function createPluginCleaner(settings, logger) {
  function cleanPrevious() {

@ -1,7 +1,7 @@
const _ = require('lodash');
const urlParse = require('url').parse;
const downloadHttpFile = require('./downloaders/http');
const downloadLocalFile = require('./downloaders/file');
import _ from 'lodash';
import downloadHttpFile from './downloaders/http';
import downloadLocalFile from './downloaders/file';
import { parse as urlParse } from 'url';

export default function createPluginDownloader(settings, logger) {
  let archiveType;

@ -1,12 +1,13 @@
const zipExtract = require('./extractors/zip');
const tarGzExtract = require('./extractors/tar_gz');
import zipExtract from './extractors/zip';
import tarGzExtract from './extractors/tar_gz';
import { ZIP, TAR } from './file_type';

export default function extractArchive(settings, logger, archiveType) {
  switch (archiveType) {
    case '.zip':
    case ZIP:
      return zipExtract(settings, logger);
      break;
    case '.tar.gz':
    case TAR:
      return tarGzExtract(settings, logger);
      break;
    default:

@ -1,14 +1,14 @@
const _ = require('lodash');
import _ from 'lodash';
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
const pluginDownloader = require('./plugin_downloader');
const pluginCleaner = require('./plugin_cleaner');
const pluginExtractor = require('./plugin_extractor');
const KbnServer = require('../../server/KbnServer');
const readYamlConfig = require('../serve/read_yaml_config');
const { statSync, renameSync } = require('fs');
const Promise = require('bluebird');
const rimrafSync = require('rimraf').sync;
import pluginDownloader from './plugin_downloader';
import pluginCleaner from './plugin_cleaner';
import pluginExtractor from './plugin_extractor';
import KbnServer from '../../server/KbnServer';
import readYamlConfig from '../serve/read_yaml_config';
import Promise from 'bluebird';
import { sync as rimrafSync } from 'rimraf';
import { statSync, renameSync } from 'fs';
const mkdirp = Promise.promisify(require('mkdirp'));

export default {

8 src/cli/plugin/plugin_lister.js Normal file
@ -0,0 +1,8 @@
import fs from 'fs';

export function list(settings, logger) {
  fs.readdirSync(settings.pluginDir)
    .forEach(function (pluginFile) {
      logger.log(pluginFile);
    });
}

@ -1,5 +1,5 @@
const fs = require('fs');
const rimraf = require('rimraf');
import fs from 'fs';
import rimraf from 'rimraf';

module.exports = {
  remove: remove

@ -1,5 +1,6 @@
const { resolve } = require('path');
const expiry = require('expiry-js');
import expiry from 'expiry-js';
import { intersection } from 'lodash';
import { resolve } from 'path';

export default function createSettingParser(options) {
  function parseMilliseconds(val) {

@ -22,6 +23,10 @@ export default function createSettingParser(options) {
    return 'https://download.elastic.co/' + settings.organization + '/' + settings.package + '/' + filename;
  }

  function areMultipleOptionsChosen(options, choices) {
    return intersection(Object.keys(options), choices).length > 1;
  }
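A quick check of what this helper does: it counts how many of the mutually exclusive flags are present as keys, regardless of value. Note that a key explicitly set to `null` still appears in `Object.keys`, which is presumably why the tests earlier in this diff switched from `options.install = null` to `delete options.install`:

```js
// Presence, not truthiness, is what counts here:
areMultipleOptionsChosen({ install: 'org/pkg/1.0', list: true }, ['install', 'remove', 'list']); // true
areMultipleOptionsChosen({ remove: 'pkg' }, ['install', 'remove', 'list']);                      // false
areMultipleOptionsChosen({ install: null, remove: 'pkg' }, ['install', 'remove', 'list']);       // true!
```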

  function parse() {
    let parts;
    let settings = {

@ -84,8 +89,12 @@ export default function createSettingParser(options) {
      settings.package = parts.shift();
    }

    if (!settings.action || (options.install && options.remove)) {
      throw new Error('Please specify either --install or --remove.');
    if (options.list) {
      settings.action = 'list';
    }

    if (!settings.action || areMultipleOptionsChosen(options, [ 'install', 'remove', 'list' ])) {
      throw new Error('Please specify either --install, --remove, or --list.');
    }

    settings.pluginDir = options.pluginDir;

@ -1,6 +1,6 @@
let _ = require('lodash');
let fs = require('fs');
let yaml = require('js-yaml');
import _ from 'lodash';
import fs from 'fs';
import yaml from 'js-yaml';

let utils = require('requirefrom')('src/utils');
let fromRoot = utils('fromRoot');

@ -1,10 +1,10 @@
let _ = require('lodash');
let { isWorker } = require('cluster');
let { resolve } = require('path');
import _ from 'lodash';
const { isWorker } = require('cluster');
const { resolve } = require('path');

let cwd = process.cwd();
let src = require('requirefrom')('src');
let fromRoot = src('utils/fromRoot');
const cwd = process.cwd();
const src = require('requirefrom')('src');
const fromRoot = src('utils/fromRoot');

let canCluster;
try {

@ -14,19 +14,61 @@ try {
  canCluster = false;
}

let pathCollector = function () {
  let paths = [];
const pathCollector = function () {
  const paths = [];
  return function (path) {
    paths.push(resolve(process.cwd(), path));
    return paths;
  };
};

let pluginDirCollector = pathCollector();
let pluginPathCollector = pathCollector();
const pluginDirCollector = pathCollector();
const pluginPathCollector = pathCollector();
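`pathCollector` is a closure factory: each collector keeps its own `paths` array and returns it after appending, which is the shape commander expects for repeatable options. The wiring isn't visible in this excerpt, so the following is a hypothetical sketch of how such a collector is typically attached:

```js
// Hypothetical wiring (not shown in this diff): commander passes each
// occurrence of a repeatable option through the collector function.
command.option(
  '--plugin-path <path>',
  'A path to a plugin which should be included by the server',
  pluginPathCollector
);
// `kibana serve --plugin-path a --plugin-path b` would then yield
// opts.pluginPath === [resolve(cwd, 'a'), resolve(cwd, 'b')]
```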
|
||||
|
||||
function initServerSettings(opts, extraCliOptions) {
|
||||
const readYamlConfig = require('./read_yaml_config');
|
||||
const settings = readYamlConfig(opts.config);
|
||||
const set = _.partial(_.set, settings);
|
||||
const get = _.partial(_.get, settings);
|
||||
const has = _.partial(_.has, settings);
|
||||
const merge = _.partial(_.merge, settings);
|
||||
|
||||
if (opts.dev) {
|
||||
try { merge(readYamlConfig(fromRoot('config/kibana.dev.yml'))); }
|
||||
catch (e) { null; }
|
||||
}
|
||||
|
||||
if (opts.dev) {
|
||||
set('env', 'development');
|
||||
set('optimize.lazy', true);
|
||||
if (opts.ssl && !has('server.ssl.cert') && !has('server.ssl.key')) {
|
||||
set('server.host', 'localhost');
|
||||
set('server.ssl.cert', fromRoot('test/dev_certs/server.crt'));
|
||||
set('server.ssl.key', fromRoot('test/dev_certs/server.key'));
|
||||
}
|
||||
}
|
||||
|
||||
if (opts.elasticsearch) set('elasticsearch.url', opts.elasticsearch);
|
||||
if (opts.port) set('server.port', opts.port);
|
||||
if (opts.host) set('server.host', opts.host);
|
||||
if (opts.quiet) set('logging.quiet', true);
|
||||
if (opts.silent) set('logging.silent', true);
|
||||
if (opts.verbose) set('logging.verbose', true);
|
||||
if (opts.logFile) set('logging.dest', opts.logFile);
|
||||
|
||||
set('plugins.scanDirs', _.compact([].concat(
|
||||
get('plugins.scanDirs'),
|
||||
opts.pluginDir
|
||||
)));
|
||||
|
||||
set('plugins.paths', [].concat(opts.pluginPath || []));
|
||||
merge(extraCliOptions);
|
||||
|
||||
return settings;
|
||||
}
|
||||
|
||||
module.exports = function (program) {
|
||||
let command = program.command('serve');
|
||||
const command = program.command('serve');
|
||||
|
||||
command
|
||||
.description('Run the kibana server')
|
||||
|
@@ -64,59 +106,30 @@ module.exports = function (program) {
   if (canCluster) {
     command
       .option('--dev', 'Run the server with development mode defaults')
      .option('--no-ssl', 'Don\'t run the dev server using HTTPS')
      .option('--no-base-path', 'Don\'t put a proxy in front of the dev server, which adds a random basePath')
       .option('--no-watch', 'Prevents automatic restarts of the server in --dev mode');
   }
 
   command
     .action(async function (opts) {
+      const settings = initServerSettings(opts, this.getUnknownOptions());
+
       if (canCluster && opts.dev && !isWorker) {
         // stop processing the action and handoff to cluster manager
-        let ClusterManager = require('../cluster/cluster_manager');
-        new ClusterManager(opts);
+        const ClusterManager = require('../cluster/cluster_manager');
+        new ClusterManager(opts, settings);
         return;
       }
 
-      let readYamlConfig = require('./read_yaml_config');
-      let KbnServer = src('server/KbnServer');
-
-      let settings = readYamlConfig(opts.config);
-
-      if (opts.dev) {
-        try { _.merge(settings, readYamlConfig(fromRoot('config/kibana.dev.yml'))); }
-        catch (e) { null; }
-      }
-
-      let set = _.partial(_.set, settings);
-      let get = _.partial(_.get, settings);
-
-      if (opts.dev) {
-        set('env', 'development');
-        set('optimize.lazy', true);
-      }
-
-      if (opts.elasticsearch) set('elasticsearch.url', opts.elasticsearch);
-      if (opts.port) set('server.port', opts.port);
-      if (opts.host) set('server.host', opts.host);
-      if (opts.quiet) set('logging.quiet', true);
-      if (opts.silent) set('logging.silent', true);
-      if (opts.verbose) set('logging.verbose', true);
-      if (opts.logFile) set('logging.dest', opts.logFile);
-
-      set('plugins.scanDirs', _.compact([].concat(
-        get('plugins.scanDirs'),
-        opts.pluginDir
-      )));
-
-      set('plugins.paths', [].concat(opts.pluginPath || []));
-
       let kbnServer = {};
 
+      const KbnServer = src('server/KbnServer');
       try {
-        kbnServer = new KbnServer(_.merge(settings, this.getUnknownOptions()));
+        kbnServer = new KbnServer(settings);
         await kbnServer.ready();
       }
       catch (err) {
-        let { server } = kbnServer;
+        const { server } = kbnServer;
 
         if (server) server.log(['fatal'], err);
         console.error('FATAL', err);
@@ -1,83 +1,81 @@
-define(function (require) {
-  return function GeoHashGridAggResponseFixture() {
-
-    var _ = require('lodash');
+import _ from 'lodash';
+export default function GeoHashGridAggResponseFixture() {
 
   // for vis:
   //
   // vis = new Vis(indexPattern, {
   //   type: 'tile_map',
   //   aggs:[
   //     { schema: 'metric', type: 'avg', params: { field: 'bytes' } },
   //     { schema: 'split', type: 'terms', params: { field: '@tags', size: 10 } },
   //     { schema: 'segment', type: 'geohash_grid', params: { field: 'geo.coordinates', precision: 3 } }
   //   ],
   //   params: {
   //     isDesaturated: true,
   //     mapType: 'Scaled%20Circle%20Markers'
   //   },
   // });
 
   var geoHashCharts = _.union(
     _.range(48, 57), // 0-9
     _.range(65, 90), // A-Z
     _.range(97, 122) // a-z
   );
 
   var totalDocCount = 0;
 
   var tags = _.times(_.random(4, 20), function (i) {
     // random number of tags
     var docCount = 0;
     var buckets = _.times(_.random(40, 200), function () {
       return _.sample(geoHashCharts, 3).join('');
     })
     .sort()
     .map(function (geoHash) {
       var count = _.random(1, 5000);
 
       totalDocCount += count;
       docCount += count;
 
       return {
         key: geoHash,
         doc_count: count,
         1: {
           value: 2048 + i
         }
       };
     });
 
     return {
       key: 'tag ' + (i + 1),
       doc_count: docCount,
       3: {
         buckets: buckets
       },
       1: {
         value: 1000 + i
       }
     };
   });
 
   return {
     took: 3,
     timed_out: false,
     _shards: {
       total: 4,
       successful: 4,
       failed: 0
     },
     hits: {
       total: 298,
       max_score: 0.0,
       hits: []
     },
     aggregations: {
       2: {
         buckets: tags
       }
     }
   };
-  };
-});
+};
@@ -1,22 +1,20 @@
-define(function (require) {
-  var results = {};
+var results = {};
 
 results.timeSeries = {
   data: {
     ordered: {
       date: true,
       interval: 600000,
       max: 1414437217559,
       min: 1414394017559
     }
   },
   label: 'apache',
   value: 44,
   point: {
     label: 'apache',
     x: 1414400400000,
     y: 44,
     y0: 0
   }
 };
-});
@@ -1,228 +1,226 @@
-define(function (require) {
-  var data = { };
+var data = { };
 
 data.metricOnly = {
   hits: { total: 1000, hits: [], max_score: 0 },
   aggregations: {
     agg_1: { value: 412032 },
   }
 };
 
 data.threeTermBuckets = {
   hits: { total: 1000, hits: [], max_score: 0 },
   aggregations: {
     agg_2: {
       buckets: [
         {
           key: 'png',
           doc_count: 50,
           agg_1: { value: 412032 },
           agg_3: {
             buckets: [
               {
                 key: 'IT',
                 doc_count: 10,
                 agg_1: { value: 9299 },
                 agg_4: {
                   buckets: [
                     { key: 'win', doc_count: 4, agg_1: { value: 0 } },
                     { key: 'mac', doc_count: 6, agg_1: { value: 9299 } }
                   ]
                 }
               },
               {
                 key: 'US',
                 doc_count: 20,
                 agg_1: { value: 8293 },
                 agg_4: {
                   buckets: [
                     { key: 'linux', doc_count: 12, agg_1: { value: 3992 } },
                     { key: 'mac', doc_count: 8, agg_1: { value: 3029 } }
                   ]
                 }
               }
             ]
           }
         },
         {
           key: 'css',
           doc_count: 20,
           agg_1: { value: 412032 },
           agg_3: {
             buckets: [
               {
                 key: 'MX',
                 doc_count: 7,
                 agg_1: { value: 9299 },
                 agg_4: {
                   buckets: [
                     { key: 'win', doc_count: 3, agg_1: { value: 4992 } },
                     { key: 'mac', doc_count: 4, agg_1: { value: 5892 } }
                   ]
                 }
               },
               {
                 key: 'US',
                 doc_count: 13,
                 agg_1: { value: 8293 },
                 agg_4: {
                   buckets: [
                     { key: 'linux', doc_count: 12, agg_1: { value: 3992 } },
                     { key: 'mac', doc_count: 1, agg_1: { value: 3029 } }
                   ]
                 }
               }
             ]
           }
         },
         {
           key: 'html',
           doc_count: 90,
           agg_1: { value: 412032 },
           agg_3: {
             buckets: [
               {
                 key: 'CN',
                 doc_count: 85,
                 agg_1: { value: 9299 },
                 agg_4: {
                   buckets: [
                     { key: 'win', doc_count: 46, agg_1: { value: 4992 } },
                     { key: 'mac', doc_count: 39, agg_1: { value: 5892 } }
                   ]
                 }
               },
               {
                 key: 'FR',
                 doc_count: 15,
                 agg_1: { value: 8293 },
                 agg_4: {
                   buckets: [
                     { key: 'win', doc_count: 3, agg_1: { value: 3992 } },
                     { key: 'mac', doc_count: 12, agg_1: { value: 3029 } }
                   ]
                 }
               }
             ]
           }
         }
       ]
     }
   }
 };
 
 data.oneRangeBucket = {
   'took': 35,
   'timed_out': false,
   '_shards': {
     'total': 1,
     'successful': 1,
     'failed': 0
   },
   'hits': {
     'total': 6039,
     'max_score': 0,
     'hits': []
   },
   'aggregations': {
     'agg_2': {
       'buckets': {
         '0.0-1000.0': {
           'from': 0,
           'from_as_string': '0.0',
           'to': 1000,
           'to_as_string': '1000.0',
           'doc_count': 606
         },
         '1000.0-2000.0': {
           'from': 1000,
           'from_as_string': '1000.0',
           'to': 2000,
           'to_as_string': '2000.0',
           'doc_count': 298
         }
       }
     }
   }
 };
 
 data.oneFilterBucket = {
   'took': 11,
   'timed_out': false,
   '_shards': {
     'total': 1,
     'successful': 1,
     'failed': 0
   },
   'hits': {
     'total': 6005,
     'max_score': 0,
     'hits': []
   },
   'aggregations': {
     'agg_2': {
       'buckets': {
         '_type:apache': {
           'doc_count': 4844
         },
         '_type:nginx': {
           'doc_count': 1161
         }
       }
     }
   }
 };
 
 data.oneHistogramBucket = {
   'took': 37,
   'timed_out': false,
   '_shards': {
     'total': 6,
     'successful': 6,
     'failed': 0
   },
   'hits': {
     'total': 49208,
     'max_score': 0,
     'hits': []
   },
   'aggregations': {
     'agg_2': {
       'buckets': [
         {
           'key_as_string': '2014-09-28T00:00:00.000Z',
           'key': 1411862400000,
           'doc_count': 8247
         },
         {
           'key_as_string': '2014-09-29T00:00:00.000Z',
           'key': 1411948800000,
           'doc_count': 8184
         },
         {
           'key_as_string': '2014-09-30T00:00:00.000Z',
           'key': 1412035200000,
           'doc_count': 8269
         },
         {
           'key_as_string': '2014-10-01T00:00:00.000Z',
           'key': 1412121600000,
           'doc_count': 8141
         },
         {
           'key_as_string': '2014-10-02T00:00:00.000Z',
           'key': 1412208000000,
           'doc_count': 8148
         },
         {
           'key_as_string': '2014-10-03T00:00:00.000Z',
           'key': 1412294400000,
           'doc_count': 8219
         }
       ]
     }
   }
 };
 
-  return data;
-});
+export default data;
@@ -1,22 +1,20 @@
-define(function (require) {
-  var _ = require('lodash');
+import _ from 'lodash';
 var longString = Array(200).join('_');
 
-  return function (id, mapping) {
+export default function (id, mapping) {
   function fakeVals(type) {
     return _.mapValues(mapping, function (f, c) {
       return c + '_' + type + '_' + id + longString;
     });
   }
 
   return {
     _id: id,
     _index: 'test',
     _source: fakeVals('original'),
     sort: [id],
     $$_formatted: fakeVals('formatted'),
     $$_partialFormatted: fakeVals('formatted'),
     $$_flattened: fakeVals('_flattened')
   };
-  };
-});
+};
@@ -1,62 +1,60 @@
-define(function (require) {
-  return {
+export default {
   test: {
     mappings: {
       testType: {
         'baz': {
           full_name: 'baz',
           mapping: {
             bar: {
               type: 'long'
             }
           }
         },
         'foo.bar': {
           full_name: 'foo.bar',
           mapping: {
             bar: {
               type: 'string',
             }
           }
         },
         'not_analyzed_field': {
           full_name: 'not_analyzed_field',
           mapping: {
             bar: {
               type: 'string',
               index: 'not_analyzed'
             }
           }
         },
         'index_no_field': {
           full_name: 'index_no_field',
           mapping: {
             bar: {
               type: 'string',
               index: 'no'
             }
           }
         },
         _id: {
           full_name: '_id',
           mapping: {
             _id: {
               store: false,
               index: 'no',
             }
           }
         },
         _timestamp: {
           full_name: '_timestamp',
           mapping: {
             _timestamp: {
               store: true,
               index: 'no',
             }
           }
         }
       }
     }
   }
 };
-});
@@ -1,7 +1,5 @@
-define(function (require) {
-  return {
+export default {
   meta: {
     index: 'logstash-*'
   }
 };
-});
@@ -1,24 +1,22 @@
-define(function (require) {
-  var _ = require('lodash');
-  return function fitsFixture() {
+import _ from 'lodash';
+export default function fitsFixture() {
   return _.map([
     {_source: {'@timestamp': 0, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 10, request: 'foo'}},
     {_source: {'@timestamp': 1, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 20, request: 'bar'}},
     {_source: {'@timestamp': 2, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 30, request: 'bar'}},
     {_source: {'@timestamp': 3, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 30, request: 'baz'}},
     {_source: {'@timestamp': 4, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 30, request: 'baz'}},
     {_source: {'@timestamp': 5, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 30, request: 'baz'}},
     {_source: {'@timestamp': 6, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 40.141592, request: 'bat'}},
     {_source: {'@timestamp': 7, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 40.141592, request: 'bat'}},
     {_source: {'@timestamp': 8, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 40.141592, request: 'bat'}},
     {_source: {'@timestamp': 9, ssl: true, ip: '192.168.0.1', extension: 'php', 'machine.os': 'Linux', bytes: 40.141592, request: 'bat'}},
   ], function (p, i) {
     return _.merge({}, p, {
       _score: 1,
       _id: 1000 + i,
       _type: 'test',
       _index: 'test-index'
     });
   });
-  };
-});
+};
@@ -1,37 +1,35 @@
-define(function (require) {
-  function stubbedLogstashFields() {
+function stubbedLogstashFields() {
   var sourceData = [
     { name: 'bytes', type: 'number', indexed: true, analyzed: true, sortable: true, filterable: true, count: 10 },
     { name: 'ssl', type: 'boolean', indexed: true, analyzed: true, sortable: true, filterable: true, count: 20 },
     { name: '@timestamp', type: 'date', indexed: true, analyzed: true, sortable: true, filterable: true, count: 30 },
     { name: 'time', type: 'date', indexed: true, analyzed: true, sortable: true, filterable: true, count: 30 },
     { name: '@tags', type: 'string', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: 'utc_time', type: 'date', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: 'phpmemory', type: 'number', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: 'ip', type: 'ip', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: 'request_body', type: 'attachment', indexed: true, analyzed: true, sortable: false, filterable: true },
     { name: 'point', type: 'geo_point', indexed: true, analyzed: true, sortable: false, filterable: false },
     { name: 'area', type: 'geo_shape', indexed: true, analyzed: true, sortable: true, filterable: false },
     { name: 'hashed', type: 'murmur3', indexed: true, analyzed: true, sortable: false, filterable: false },
     { name: 'geo.coordinates', type: 'geo_point', indexed: true, analyzed: true, sortable: false, filterable: true },
     { name: 'extension', type: 'string', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: 'machine.os', type: 'string', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: 'geo.src', type: 'string', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: '_type', type: 'string', indexed: true, analyzed: true, sortable: true, filterable: true },
     { name: '_id', type: 'string', indexed: false, analyzed: false, sortable: false, filterable: true},
     { name: '_source', type: 'string', indexed: false, analyzed: false, sortable: false, filterable: false},
     { name: 'custom_user_field', type: 'conflict', indexed: false, analyzed: false, sortable: false, filterable: true },
     { name: 'script string', type: 'string', scripted: true, script: '\'i am a string\'', lang: 'expression' },
     { name: 'script number', type: 'number', scripted: true, script: '1234', lang: 'expression' },
     { name: 'script murmur3', type: 'murmur3', scripted: true, script: '1234', lang: 'expression'},
   ].map(function (field) {
     field.count = field.count || 0;
     field.scripted = field.scripted || false;
     return field;
   });
 
   return sourceData;
 }
 
-  return stubbedLogstashFields;
-});
+export default stubbedLogstashFields;
@@ -1,40 +1,38 @@
-define(function (require) {
-  return {
+export default {
   test: {
     mappings: {
       testType: {
         'baz': {
           full_name: 'baz',
           mapping: {
             bar: {
               type: 'long'
             }
           }
         },
         'foo.bar': {
           full_name: 'foo.bar',
           mapping: {
             bar: {
               type: 'string'
             }
           }
         }
       }
     }
   },
   duplicates: {
     mappings: {
       testType: {
         'baz': {
           full_name: 'baz',
           mapping: {
             bar: {
               type: 'date'
             }
           }
         }
       }
     }
   }
 };
-});
@@ -1,17 +1,16 @@
-define(function (require) {
-  var _ = require('lodash');
-  var sinon = require('auto-release-sinon');
+import _ from 'lodash';
+import sinon from 'auto-release-sinon';
+import FixturesStubbedLogstashIndexPatternProvider from 'fixtures/stubbed_logstash_index_pattern';
 
-  return function (Private, Promise) {
-    var indexPatterns = Private(require('fixtures/stubbed_logstash_index_pattern'));
+export default function (Private, Promise) {
+  var indexPatterns = Private(FixturesStubbedLogstashIndexPatternProvider);
   var getIndexPatternStub = sinon.stub();
   getIndexPatternStub.returns(Promise.resolve(indexPatterns));
 
   var courier = {
     indexPatterns: { get: getIndexPatternStub },
     getStub: getIndexPatternStub
   };
 
   return courier;
-  };
-});
+};
@@ -1,18 +1,17 @@
-define(function (require) {
-  var _ = require('lodash');
-  var sinon = require('auto-release-sinon');
+import _ from 'lodash';
+import sinon from 'auto-release-sinon';
 
 function MockState(defaults) {
   this.on = _.noop;
   this.off = _.noop;
   this.save = sinon.stub();
+  this.replace = sinon.stub();
   _.assign(this, defaults);
 }
 
 MockState.prototype.resetStub = function () {
   this.save = sinon.stub();
   return this;
 };
 
-  return MockState;
-});
+export default MockState;
@@ -1,15 +1,13 @@
-define(function (require) {
-  var _ = require('lodash');
+import _ from 'lodash';
 var keys = {};
-  return {
+export default {
   get: function (path, def) {
     return keys[path] == null ? def : keys[path];
   },
   set: function (path, val) {
     keys[path] = val;
     return val;
   },
   on: _.noop,
   off: _.noop
 }
-})
@@ -1,227 +1,225 @@
-define(function (require) {
-
 /*
   Extensions:
   gif: 5
   html: 8
   php: 5 (thus 5 with phpmemory fields)
   png: 2
 
   _type:
   apache: 18
   nginx: 2
 
   Bytes (all unique except):
   374: 2
 
   All have the same index, ids are unique
 */
 
-  return [
+export default [
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '61',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 360.20000000000005
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '388',
     '_score': 1,
     '_source': {
       'extension': 'gif',
       'bytes': 5848.700000000001
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '403',
     '_score': 1,
     '_source': {
       'extension': 'png',
       'bytes': 841.6
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '415',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 1626.4
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '460',
     '_score': 1,
     '_source': {
       'extension': 'php',
       'bytes': 2070.6,
       'phpmemory': 276080
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '496',
     '_score': 1,
     '_source': {
       'extension': 'gif',
       'bytes': 8421.6
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '511',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 994.8000000000001
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '701',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 374
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '838',
     '_score': 1,
     '_source': {
       'extension': 'php',
       'bytes': 506.09999999999997,
       'phpmemory': 67480
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '890',
     '_score': 1,
     '_source': {
       'extension': 'php',
       'bytes': 506.09999999999997,
       'phpmemory': 67480
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'nginx',
     '_id': '927',
     '_score': 1,
     '_source': {
       'extension': 'php',
       'bytes': 2591.1,
       'phpmemory': 345480
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1034',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 1450
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1142',
     '_score': 1,
     '_source': {
       'extension': 'php',
       'bytes': 1803.8999999999999,
       'phpmemory': 240520
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1180',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 1626.4
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'nginx',
     '_id': '1224',
     '_score': 1,
     '_source': {
       'extension': 'gif',
       'bytes': 10617.2
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1243',
     '_score': 1,
     '_source': {
       'extension': 'gif',
       'bytes': 10961.5
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1510',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 382.8
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1628',
     '_score': 1,
     '_source': {
       'extension': 'html',
       'bytes': 374
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1729',
     '_score': 1,
     '_source': {
       'extension': 'png',
       'bytes': 3059.2000000000003
     }
   },
   {
     '_index': 'logstash-2014.09.09',
     '_type': 'apache',
     '_id': '1945',
     '_score': 1,
     '_source': {
       'extension': 'gif',
       'bytes': 10617.2
     }
   }
 ];
-});
@@ -1,18 +1,16 @@
-define(function (require) {
-  var hits = require('fixtures/real_hits');
+import hits from 'fixtures/real_hits';
 
-  return {
+export default {
   took: 73,
   timed_out: false,
   _shards: {
     total: 144,
     successful: 144,
     failed: 0
   },
   hits: {
     total : 49487,
     max_score : 1.0,
     hits: hits
   }
 };
-});
@@ -1,22 +1,22 @@
-define(function (require) {
-  function stubbedDocSourceResponse(Private) {
-    var mockLogstashFields = Private(require('fixtures/logstash_fields'));
+import FixturesLogstashFieldsProvider from 'fixtures/logstash_fields';
+
+function stubbedDocSourceResponse(Private) {
+  var mockLogstashFields = Private(FixturesLogstashFieldsProvider);
 
   return function (id, index) {
     index = index || '.kibana';
     return {
       _id: id,
       _index: index,
       _type: 'index-pattern',
       _version: 2,
       found: true,
       _source: {
         customFormats: '{}',
         fields: JSON.stringify(mockLogstashFields)
       }
     };
   };
 }
 
-  return stubbedDocSourceResponse;
-});
+export default stubbedDocSourceResponse;
@@ -1,24 +1,25 @@
-define(function (require) {
-  return function stubbedLogstashIndexPatternService(Private) {
-    var StubIndexPattern = Private(require('testUtils/stub_index_pattern'));
-    var fieldTypes = Private(require('ui/index_patterns/_field_types'));
-    var mockLogstashFields = Private(require('fixtures/logstash_fields'));
-
-    var _ = require('lodash');
+import _ from 'lodash';
+import TestUtilsStubIndexPatternProvider from 'testUtils/stub_index_pattern';
+import IndexPatternsFieldTypesProvider from 'ui/index_patterns/_field_types';
+import FixturesLogstashFieldsProvider from 'fixtures/logstash_fields';
+export default function stubbedLogstashIndexPatternService(Private) {
+  var StubIndexPattern = Private(TestUtilsStubIndexPatternProvider);
+  var fieldTypes = Private(IndexPatternsFieldTypesProvider);
+  var mockLogstashFields = Private(FixturesLogstashFieldsProvider);
 
   var fields = mockLogstashFields.map(function (field) {
     field.displayName = field.name;
     var type = fieldTypes.byName[field.type];
     if (!type) throw new TypeError('unknown type ' + field.type);
     if (!_.has(field, 'sortable')) field.sortable = type.sortable;
     if (!_.has(field, 'filterable')) field.filterable = type.filterable;
     return field;
   });
 
   var indexPattern = new StubIndexPattern('logstash-*', 'time', fields);
   indexPattern.id = 'logstash-*';
 
   return indexPattern;
 
-  };
-});
+};
@@ -1,39 +1,38 @@
-define(function (require) {
-  var sinon = require('auto-release-sinon');
-  var searchResponse = require('fixtures/search_response');
+import sinon from 'auto-release-sinon';
+import searchResponse from 'fixtures/search_response';
+import FixturesStubbedLogstashIndexPatternProvider from 'fixtures/stubbed_logstash_index_pattern';
 
-  return function stubSearchSource(Private, $q, Promise) {
-    var deferedResult = $q.defer();
-    var indexPattern = Private(require('fixtures/stubbed_logstash_index_pattern'));
+export default function stubSearchSource(Private, $q, Promise) {
+  var deferedResult = $q.defer();
+  var indexPattern = Private(FixturesStubbedLogstashIndexPatternProvider);
 
   return {
     sort: sinon.spy(),
     size: sinon.spy(),
     fetch: sinon.spy(),
     destroy: sinon.spy(),
     get: function (param) {
       switch (param) {
         case 'index':
           return indexPattern;
         default:
           throw new Error('Param "' + param + '" is not implemented in the stubbed search source');
       }
     },
     crankResults: function () {
       deferedResult.resolve(searchResponse);
       deferedResult = $q.defer();
     },
     onResults: function () {
       // Up to the test to resolve this manually
       // For example:
       // someHandler.resolve(require('fixtures/search_response'))
       return deferedResult.promise;
     },
     onError: function () { return $q.defer().promise; },
     _flatten: function () {
       return Promise.resolve({ index: indexPattern, body: {} });
     }
   };
 
-  };
-});
+};
@@ -1,21 +1,19 @@
-define(function (require) {
-  var sinon = require('auto-release-sinon');
+import sinon from 'auto-release-sinon';
 
 function MockMap(container, chartData, params) {
   this.container = container;
   this.chartData = chartData;
   this.params = params;
 
   // stub required methods
   this.addStubs();
 }
 
 MockMap.prototype.addStubs = function () {
   this.addTitle = sinon.stub();
   this.addFitControl = sinon.stub();
   this.addBoundingControl = sinon.stub();
   this.destroy = sinon.stub();
 };
 
-  return MockMap;
-});
+export default MockMap;
@@ -1,7 +1,20 @@
-var $ = require('jquery');
-var _ = require('lodash');
+import _ from 'lodash';
+import $ from 'jquery';
+import VislibVisProvider from 'ui/vislib/vis';
 
-var $visCanvas = $('<div>').attr('id', 'vislib-vis-fixtures').appendTo('body');
+var $visCanvas = $('<div>')
+  .attr('id', 'vislib-vis-fixtures')
+  .css({
+    height: '500px',
+    width: '1024px',
+    display: 'flex',
+    position: 'fixed',
+    top: '0px',
+    left: '0px',
+    overflow: 'hidden'
+  })
+  .appendTo('body');
 var count = 0;
 var visHeight = $visCanvas.height();
 
@@ -19,7 +32,7 @@ afterEach(function () {
 
 module.exports = function VislibFixtures(Private) {
   return function (visLibParams) {
-    var Vis = Private(require('ui/vislib/vis'));
+    var Vis = Private(VislibVisProvider);
     return new Vis($visCanvas.new(), _.defaults({}, visLibParams || {}, {
       shareYAxis: true,
       addTooltip: true,
@@ -1,4 +1,4 @@
-var moment = require('moment');
+import moment from 'moment';
 
 module.exports = {
   'columns': [

@@ -1,4 +1,4 @@
-var moment = require('moment');
+import moment from 'moment';
 
 module.exports = {
   'rows': [

@@ -1,4 +1,4 @@
-var moment = require('moment');
+import moment from 'moment';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var moment = require('moment');
+import moment from 'moment';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var moment = require('moment');
+import moment from 'moment';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var moment = require('moment');
+import moment from 'moment';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'columns': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'rows': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 module.exports = {
   'columns': [
     {

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'valueFormatter': _.identity,

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'rows': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'columns': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'rows': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'columns': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'rows': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'columns': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'rows': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var moment = require('moment');
+import moment from 'moment';
 
 module.exports = {
   'label': '',

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'columns': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'rows': [

@@ -1,4 +1,4 @@
-var _ = require('lodash');
+import _ from 'lodash';
 
 module.exports = {
   'label': '',
@@ -1,16 +1,16 @@
-let { inherits } = require('util');
-let { defaults } = require('lodash');
-let { resolve } = require('path');
-let { writeFile } = require('fs');
-let webpack = require('webpack');
-var Boom = require('boom');
-let DirectoryNameAsMain = require('webpack-directory-name-as-main');
-let ExtractTextPlugin = require('extract-text-webpack-plugin');
-var CommonsChunkPlugin = require('webpack/lib/optimize/CommonsChunkPlugin');
+import webpack from 'webpack';
+import Boom from 'boom';
+import DirectoryNameAsMain from 'webpack-directory-name-as-main';
+import ExtractTextPlugin from 'extract-text-webpack-plugin';
+import CommonsChunkPlugin from 'webpack/lib/optimize/CommonsChunkPlugin';
 
 let utils = require('requirefrom')('src/utils');
 let fromRoot = utils('fromRoot');
-let babelOptions = require('./babelOptions');
+import babelOptions from './babelOptions';
+import { inherits } from 'util';
+import { defaults } from 'lodash';
+import { resolve } from 'path';
+import { writeFile } from 'fs';
 
 let babelExclude = [/[\/\\](webpackShims|node_modules|bower_components)[\/\\]/];
 
 class BaseOptimizer {
@@ -1,8 +1,8 @@
-let { fromNode } = require('bluebird');
-let { writeFile } = require('fs');
-
-let BaseOptimizer = require('./BaseOptimizer');
-let fromRoot = require('../utils/fromRoot');
+import BaseOptimizer from './BaseOptimizer';
+import fromRoot from '../utils/fromRoot';
+import { fromNode } from 'bluebird';
+import { writeFile } from 'fs';
 
 module.exports = class FsOptimizer extends BaseOptimizer {
   async init() {