Merge remote-tracking branch 'origin/master' into feature/design

This commit is contained in:
Tyler Smalley 2016-03-09 10:08:35 -08:00
commit 9502adedc1
95 changed files with 919 additions and 820 deletions

View file

@ -1 +1 @@
4.3.0
4.3.2

View file

@ -13,7 +13,7 @@ At any given time the Kibana team at Elastic is working on dozens of features an
Let's just get this out there: **Feel free to +1 an issue**. That said, a +1 isn't a vote. We keep up on highly commented issues, but comments are but one of many reasons we might, or might not, work on an issue. A solid write up of your use case is more likely to make your case than a comment that says *+10000*.
#### My issue isn't getting enough attention
First of all, sorry about that, we want you to have a great time with Kibana! You should join us on IRC (#kibana on freenode) and chat about it. Github is terrible for conversations. With that out of the way, there are a number of variables that go into deciding what to work on. These include priority, impact, difficulty, applicability to use cases, and last, and importantly: What we feel like working on.
First of all, sorry about that, we want you to have a great time with Kibana! You should join us on IRC ([#kibana](https://kiwiirc.com/client/irc.freenode.net/?#kibana) on freenode) and chat about it. GitHub is terrible for conversations. With that out of the way, there are a number of variables that go into deciding what to work on. These include priority, impact, difficulty, applicability to use cases, and last, and importantly: What we feel like working on.
### I want to help!
**Now we're talking**. If you have a bugfix or new feature that you would like to contribute to Kibana, please **find or open an issue about it before you start working on it.** Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.
@ -111,36 +111,73 @@ Before running the tests you will need to install the projects dependencies as d
Once that is complete just run:
```sh
npm run test && npm run build
```
#### Testing and debugging tests
#### Debugging unit tests
The standard `npm run test` task runs several sub tasks and can take several minutes to complete, making debugging failures pretty painful. In order to ease the pain, specialized tasks provide alternate methods for running the tests.
<dl>
<dt><code>npm run test:quick</code></dt>
<dd>Runs both server and browser tests, but skips linting</dd>
<dt><code>npm run test:server</code> or <code>npm run test:browser</code></dt>
<dd>Runs the tests for just the server or browser</dd>
`npm run test:quick`
Runs both server and browser tests, but skips linting
<dt><code>npm run test:dev</code></dt>
<dd>
Initializes an environment for debugging the browser tests. Includes a dedicated instance of the Kibana server for building the test bundle, and a karma server. When running this task the build is optimized for the first time and then a karma-owned instance of the browser is opened. Click the "debug" button to open a new tab that executes the unit tests.
<br>
<img src="http://i.imgur.com/DwHxgfq.png">
</dd>
`npm run test:server`
Run only the server tests
<dt><code>npm run mocha [test file or dir]</code> or <code>npm run mocha:debug [test file or dir]</code></dt>
<dd>
Run a one off test with the local project version of mocha, babel compilation, and optional debugging. Great
for development and fixing individual tests.
</dd>
</dl>
`npm run test:browser`
Run only the browser tests
Distributable packages can be found in `target/` after the build completes.
`npm run test:dev`
Initializes an environment for debugging the browser tests. Includes a dedicated instance of the Kibana server for building the test bundle, and a karma server. When running this task the build is optimized for the first time and then a karma-owned instance of the browser is opened. Click the "debug" button to open a new tab that executes the unit tests.
![Browser test debugging](http://i.imgur.com/DwHxgfq.png)
`npm run mocha [test file or dir]` or `npm run mocha:debug [test file or dir]`
Run a one-off test with the local project version of mocha, babel compilation, and optional debugging. Great
for development and fixing individual tests.
#### Unit testing plugins
This should work super if you're using the [Kibana plugin generator](https://github.com/elastic/generator-kibana-plugin). If you're not using the generator, well, you're on your own. We suggest you look at how the generator works.
`npm run test:dev -- --kbnServer.testsBundle.pluginId=some_special_plugin --kbnServer.plugin-path=../some_special_plugin`
Run the tests for just your particular plugin, assuming your plugin lives outside of the `installedPlugins` directory, which it should.
#### Running browser automation tests:
*The Selenium server that is started currently only runs the tests in Firefox*
The following will start Kibana, Elasticsearch and Selenium for you. To run the functional UI tests, use the following commands:
`npm run test:ui`
Run the functional UI tests one time and exit. This is used by the CI systems and is great for quickly checking that things pass. It is essentially a combination of the next two tasks.
`npm run test:ui:server`
Start the server required for the `test:ui:runner` tasks. Once the server is started `test:ui:runner` can be run multiple times without waiting for the server to start.
`npm run test:ui:runner`
Execute the front-end selenium tests. This requires the server started by the `test:ui:server` task.
##### If you already have Elasticsearch, Kibana, and Selenium Server running:
Set your es and kibana ports in `test/intern.js` to 9220 and 5620, respectively. You can configure your Selenium server to run the tests on Chrome, IE, or other browsers here.
Once you've got the services running, execute the following:
```sh
npm run test:ui:runner
```
#### Browser automation notes:
- Using Page Objects pattern (https://theintern.github.io/intern/#writing-functional-test)
- At least the initial tests for the Settings, Discover, and Visualize tabs all depend on a very specific set of logstash-type data (generated with makelogs). Since that is a static set of data, all the Discover and Visualize tests use a specific Absolute time range. This guarantees the same results each run.
- These tests have been developed and tested with the Chrome and Firefox browsers. In theory, they should work on all browsers (that's the benefit of Intern using Leadfoot).
- These tests should also work with an external testing service like https://saucelabs.com/ or https://www.browserstack.com/ but that has not been tested.
- https://theintern.github.io/
- https://theintern.github.io/leadfoot/Element.html
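The Page Object pattern mentioned in the notes above can be sketched in plain JavaScript. Everything here is illustrative: `DiscoverPage`, the selectors, and the shape of the `remote` API are stand-ins, not Kibana's real test helpers:

```js
// A hypothetical page object wrapping an Intern/Leadfoot-style "remote" session.
// Tests interact with the page through named methods instead of raw selectors.
function DiscoverPage(remote) {
  this.remote = remote;
}

// Selector strings below are illustrative, not Kibana's real markup.
DiscoverPage.prototype.setQuery = function (query) {
  return this.remote
    .findByCssSelector('.discover-search input')
    .then(function (input) {
      return input.type(query);
    });
};

DiscoverPage.prototype.getHitCount = function () {
  return this.remote
    .findByCssSelector('.discover-hit-count')
    .then(function (el) {
      return el.getVisibleText();
    });
};
```

In a real Intern functional test, the `remote` would be the Leadfoot session handed to the test by the runner; the page object keeps selector details out of the test bodies.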
#### Building OS packages
@ -157,48 +194,7 @@ To specify a package to build you can add `rpm` or `deb` as an argument.
npm run build:ospackages -- --rpm
```
### Functional UI Testing
#### Handy references
- https://theintern.github.io/
- https://theintern.github.io/leadfoot/Element.html
#### Running tests using npm task:
*The Selenium server that is started currently only runs the tests in Firefox*
To run the functional UI tests use the following commands
<dl>
<dt><code>npm run test:ui</code></dt>
<dd>Run the functional UI tests one time and exit. This is used by the CI systems and is great for quickly checking that things pass. It is essentially a combination of the next two tasks.</dd>
<dt><code>npm run test:ui:server</code></dt>
<dd>Start the server required for the <code>test:ui:runner</code> tasks. Once the server is started <code>test:ui:runner</code> can be run multiple times without waiting for the server to start.</dd>
<dt><code>npm run test:ui:runner</code></dt>
<dd>Execute the front-end selenium tests. This requires the server started by the <code>test:ui:server</code> task.</dd>
</dl>
#### Running tests locally with your existing (and already running) ElasticSearch, Kibana, and Selenium Server:
Set your es and kibana ports in `test/intern.js` to 9220 and 5620, respectively. You can configure your Selenium server to run the tests on Chrome, IE, or other browsers here.
Once you've got the services running, execute the following:
```sh
npm run test:ui:runner
```
#### General notes:
- Using Page Objects pattern (https://theintern.github.io/intern/#writing-functional-test)
- At least the initial tests for the Settings, Discover, and Visualize tabs all depend on a very specific set of logstash-type data (generated with makelogs). Since that is a static set of data, all the Discover and Visualize tests use a specific Absolute time range. This guarantees the same results each run.
- These tests have been developed and tested with Chrome and Firefox browser. In theory, they should work on all browsers (that's the benefit of Intern using Leadfoot).
- These tests should also work with an external testing service like https://saucelabs.com/ or https://www.browserstack.com/ but that has not been tested.
Distributable packages can be found in `target/` after the build completes.
## Submitting a pull request
@ -225,6 +221,7 @@ Remember, someone is blocked by a pull awaiting review, make it count. Be thorou
1. **Understand the issue** that is being fixed, or the feature being added. Check the description on the pull, and check out the related issue. If you don't understand something, ask the submitter for clarification.
1. **Reproduce the bug** (or the lack of feature I guess?) in the destination branch, usually `master`. The referenced issue will help you here. If you're unable to reproduce the issue, contact the issue submitter for clarification.
1. **Check out the pull** and test it. Is the issue fixed? Does it have nasty side effects? Try to create suspect inputs. If it operates on the value of a field try things like: strings (including an empty string), null, numbers, dates. Try to think of edge cases that might break the code.
1. **Merge the target branch**. It is possible that tests or the linter have been updated in the target branch since the pull was submitted. Merging the pull could cause core to start failing.
1. **Read the code**. Understanding the changes will help you find additional things to test. Contact the submitter if you don't understand something.
1. **Go line-by-line**. Are there [style guide](https://github.com/elastic/kibana/blob/master/STYLEGUIDE.md) violations? Strangely named variables? Magic numbers? Do the abstractions make sense to you? Are things arranged in a testable way?
1. **Speaking of tests:** Are they there? If a new function was added does it have tests? Do the tests, well, TEST anything? Do they just run the function or do they properly check the output?

View file

@ -586,7 +586,7 @@ Use slashes for both single line and multi line comments. Try to write
comments that explain higher level mechanisms or clarify difficult
segments of your code. **Don't use comments to restate trivial things**.
***Exception:*** Comment blocks describing a function and it's arguments (docblock) should start with `/**`, contain a single `*` at the begining of each line, and end with `*/`.
***Exception:*** Comment blocks describing a function and its arguments (docblock) should start with `/**`, contain a single `*` at the beginning of each line, and end with `*/`.
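For instance, a docblock following this convention might look like the following (the `scale` function is purely illustrative):

```js
/**
 * Multiplies a value by a constant factor.
 *
 * @param {number} value - the value to scale
 * @param {number} factor - the multiplier
 * @returns {number} the scaled value
 */
function scale(value, factor) {
  return value * factor;
}
```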
*Right:*
@ -656,7 +656,7 @@ function ClassName() {
var ClassName = function () {};
```
### Inhertiance should be done with a utility
### Inheritance should be done with a utility
While you can do it with pure JS, a utility will remove a lot of boilerplate, and be more readable and functional.
@ -685,7 +685,7 @@ Square.prototype = Object.create(Shape);
### Keep Constructors Small
It is often the case that there are properties that can't be defined on the prototype, or work that needs to be done to completely create an object (like call it's Super class). This is all that should be done within constructors.
It is often the case that there are properties that can't be defined on the prototype, or work that needs to be done to completely create an object (like call its Super class). This is all that should be done within constructors.
Try to follow the [Write small functions](#write-small-functions) rule here too.
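A minimal sketch of that split (names are illustrative): only per-instance state is assigned in the constructor, while behavior lives on the prototype:

```js
function Counter(start) {
  // Per-instance state only; no heavy lifting in the constructor.
  this.count = start || 0;
}

// Shared behavior belongs on the prototype, not in the constructor.
Counter.prototype.increment = function () {
  this.count += 1;
  return this.count;
};
```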
@ -775,7 +775,7 @@ Several already exist, and can be found in `src/kibana/utils/_mixins.js`
## Filenames
All filenames should use `snake_case` and *can* start with an underscore if the module is not intended to be used outside of it's containing module.
All filenames should use `snake_case` and *can* start with an underscore if the module is not intended to be used outside of its containing module.
*Right:*
- `src/kibana/index_patterns/index_pattern.js`
@ -858,7 +858,7 @@ app.service('CustomService', function(Promise, otherDeps) {
### Routes
Angular routes are defined using a custom require modules named `routes` that remove much of the required boilerplate.
Angular routes are defined using a custom require module named `routes` that removes much of the required boilerplate.
```js
require('ui/routes')
@ -871,7 +871,7 @@ require('ui/routes')
## Multiple attribute values
When a node has multiple attributes that would cause it to exceed the line character limit, each attribute including the first should be on its own line with a single indent. Also, when a node that is styled in this way has child nodes, there should be a blank line between the openening parent tag and the first child tag.
When a node has multiple attributes that would cause it to exceed the line character limit, each attribute including the first should be on its own line with a single indent. Also, when a node that is styled in this way has child nodes, there should be a blank line between the opening parent tag and the first child tag.
```
<ul

View file

@ -56,7 +56,7 @@
# Time in milliseconds to wait for responses from the back end or Elasticsearch. This value
# must be a positive integer.
# elasticsearch.requestTimeout: 300000
# elasticsearch.requestTimeout: 30000
# Time in milliseconds for Elasticsearch to wait for responses from shards. Set to 0 to disable.
# elasticsearch.shardTimeout: 0

View file

@ -22,11 +22,11 @@ has the following fingerprint:
wget -qO - https://packages.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
--------------------------------------------------
+
. Add the repository definition to your `/etc/apt/sources.list` file:
. Add the repository definition to your `/etc/apt/sources.list.d/kibana.list` file:
+
[source, sh]
--------------------------------------------------
echo "deb http://packages.elastic.co/kibana/{branch}/debian stable main" | sudo tee -a /etc/apt/sources.list
echo "deb http://packages.elastic.co/kibana/{branch}/debian stable main" | sudo tee -a /etc/apt/sources.list.d/kibana.list
--------------------------------------------------
+
[WARNING]
@ -37,7 +37,7 @@ When the `deb-src` entry, is present, the commands in this procedure generate an
Unable to find expected entry 'main/source/Sources' in Release file (Wrong sources.list entry or malformed file)
Delete the `deb-src` entry from the `/etc/apt/sources.list` file to clear the error.
Delete the `deb-src` entry from the `/etc/apt/sources.list.d/kibana.list` file to clear the error.
==================================================
+
. Run `apt-get update` and the repository is ready for use. Install Kibana with the following command:

View file

@ -56,6 +56,7 @@
"elasticsearchWithPlugins": "grunt esvm:withPlugins:keepalive",
"lint": "grunt eslint:source",
"lintroller": "grunt eslint:fixSource",
"makelogs": "makelogs",
"mocha": "mocha",
"mocha:debug": "mocha --debug-brk",
"sterilize": "grunt sterilize"
@ -127,11 +128,10 @@
"moment-timezone": "0.4.1",
"raw-loader": "0.5.1",
"request": "2.61.0",
"requirefrom": "0.2.0",
"rimraf": "2.4.3",
"rjs-repack-loader": "1.0.6",
"script-loader": "0.6.1",
"semver": "4.3.6",
"semver": "5.1.0",
"style-loader": "0.12.3",
"tar": "2.2.0",
"url-loader": "0.5.6",
@ -146,7 +146,7 @@
"angular-mocks": "1.4.7",
"auto-release-sinon": "1.0.3",
"babel-eslint": "4.1.8",
"chokidar": "1.0.5",
"chokidar": "1.4.3",
"eslint": "1.10.3",
"eslint-plugin-mocha": "1.1.0",
"expect.js": "0.3.1",
@ -177,6 +177,7 @@
"libesvm": "3.3.0",
"license-checker": "3.1.0",
"load-grunt-config": "0.7.2",
"makelogs": "3.0.0-beta3",
"marked-text-renderer": "0.1.0",
"mocha": "2.3.0",
"nock": "2.10.0",
@ -188,7 +189,7 @@
"supertest-as-promised": "2.0.2"
},
"engines": {
"node": "4.3.0",
"npm": "2.14.15"
"node": "4.3.2",
"npm": "2.14.22"
}
}

View file

@ -1,7 +1,6 @@
import _ from 'lodash';
let utils = require('requirefrom')('src/utils');
let pkg = utils('packageJson');
import pkg from '../utils/packageJson';
import Command from './Command';
let argv = process.env.kbnWorkerArgv ? JSON.parse(process.env.kbnWorkerArgv) : process.argv.slice();

View file

@ -1,8 +1,8 @@
import cluster from 'cluster';
const { join } = require('path');
const { join, resolve } = require('path');
const { format: formatUrl } = require('url');
import Hapi from 'hapi';
const { debounce, compact, get, invoke, bindAll, once, sample } = require('lodash');
const { debounce, compact, get, invoke, bindAll, once, sample, uniq } = require('lodash');
import Log from '../Log';
import Worker from './worker';
@ -83,17 +83,21 @@ module.exports = class ClusterManager {
setupWatching(extraPaths) {
const chokidar = require('chokidar');
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
const fromRoot = require('../../utils/fromRoot');
this.watcher = chokidar.watch([
'src/plugins',
'src/server',
'src/ui',
'src/utils',
'config',
...extraPaths
], {
const watchPaths = uniq(
[
fromRoot('src/plugins'),
fromRoot('src/server'),
fromRoot('src/ui'),
fromRoot('src/utils'),
fromRoot('config'),
...extraPaths
]
.map(path => resolve(path))
);
this.watcher = chokidar.watch(watchPaths, {
cwd: fromRoot('.'),
ignored: /[\\\/](\..*|node_modules|bower_components|public|__tests__)[\\\/]/
});

View file

@ -1,8 +1,7 @@
import path from 'path';
import expect from 'expect.js';
var utils = require('requirefrom')('src/utils');
var fromRoot = utils('fromRoot');
import fromRoot from '../../../utils/fromRoot';
import settingParser from '../setting_parser';
describe('kibana cli', function () {

View file

@ -1,5 +1,4 @@
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
import fromRoot from '../../utils/fromRoot';
import settingParser from './setting_parser';
import installer from './plugin_installer';
import remover from './plugin_remover';

View file

@ -1,6 +1,5 @@
import _ from 'lodash';
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
import fromRoot from '../../utils/fromRoot';
import pluginDownloader from './plugin_downloader';
import pluginCleaner from './plugin_cleaner';
import pluginExtractor from './plugin_extractor';

View file

@ -2,8 +2,7 @@ import _ from 'lodash';
import fs from 'fs';
import yaml from 'js-yaml';
let utils = require('requirefrom')('src/utils');
let fromRoot = utils('fromRoot');
import fromRoot from '../../utils/fromRoot';
let legacySettingMap = {
// server
@ -67,4 +66,3 @@ module.exports = function (path) {
apply(config, val, key);
}, {});
};

View file

@ -3,8 +3,7 @@ const { isWorker } = require('cluster');
const { resolve } = require('path');
const cwd = process.cwd();
const src = require('requirefrom')('src');
const fromRoot = src('utils/fromRoot');
import fromRoot from '../../utils/fromRoot';
let canCluster;
try {
@ -61,7 +60,11 @@ function initServerSettings(opts, extraCliOptions) {
opts.pluginDir
)));
set('plugins.paths', [].concat(opts.pluginPath || []));
set('plugins.paths', _.compact([].concat(
get('plugins.paths'),
opts.pluginPath
)));
merge(extraCliOptions);
return settings;
@ -123,7 +126,7 @@ module.exports = function (program) {
}
let kbnServer = {};
const KbnServer = src('server/KbnServer');
const KbnServer = require('../../server/KbnServer');
try {
kbnServer = new KbnServer(settings);
await kbnServer.ready();

View file

@ -1,10 +1,8 @@
import LazyServer from './LazyServer';
import LazyOptimizer from './LazyOptimizer';
module.exports = async (kbnServer, kibanaHapiServer, config) => {
let src = require('requirefrom')('src');
let fromRoot = src('utils/fromRoot');
import fromRoot from '../../utils/fromRoot';
export default async (kbnServer, kibanaHapiServer, config) => {
let server = new LazyServer(
config.get('optimize.lazyHost'),
config.get('optimize.lazyPort'),
@ -20,7 +18,6 @@ module.exports = async (kbnServer, kibanaHapiServer, config) => {
})
);
let ready = false;
let sendReady = () => {

View file

@ -29,7 +29,7 @@ module.exports = function ({ Plugin }) {
key: string()
}).default(),
apiVersion: Joi.string().default('master'),
engineVersion: Joi.string().valid('^3.0.0').default('^3.0.0')
engineVersion: Joi.string().valid('^5.0.0').default('^5.0.0')
}).default();
},

View file

@ -3,8 +3,7 @@ import expect from 'expect.js';
import sinon from 'sinon';
import isUpgradeable from '../is_upgradeable';
let utils = require('requirefrom')('src/utils');
let pkg = utils('packageJson');
import pkg from '../../../../utils/packageJson';
let version = pkg.version;
describe('plugins/elasticsearch', function () {

View file

@ -1,7 +1,6 @@
import expect from 'expect.js';
import util from 'util';
const requireFromTest = require('requirefrom')('test');
const kbnTestServer = requireFromTest('utils/kbn_server');
import * as kbnTestServer from '../../../../../test/utils/kbn_server';
const format = util.format;
@ -12,6 +11,8 @@ describe('plugins/elasticsearch', function () {
let kbnServer;
before(function () {
this.timeout(60000); // sometimes waiting for server takes longer than 10
kbnServer = kbnTestServer.createServer();
return kbnServer.ready()
.then(() => kbnServer.server.plugins.elasticsearch.waitUntilReady());
@ -99,7 +100,7 @@ describe('plugins/elasticsearch', function () {
testRoute({
method: 'POST',
url: '/elasticsearch/_msearch?timeout=0&ignore_unavailable=true&preference=1429577952339',
payload: '{"index":"logstash-2015.04.21","ignore_unavailable":true}\n{"size":500,"sort":{"@timestamp":"desc"},"query":{"bool":{"must":[{"query_string":{"analyze_wildcard":true,"query":"*"}},{"bool":{"must":[{"range":{"@timestamp":{"gte":1429577068175,"lte":1429577968175}}}],"must_not":[]}}],"must_not":[]}},"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}}},"aggs":{"2":{"date_histogram":{"field":"@timestamp","interval":"30s","pre_zone":"-07:00","pre_zone_adjust_large_interval":true,"min_doc_count":0,"extended_bounds":{"min":1429577068175,"max":1429577968175}}}},"fields":["*","_source"],"script_fields":{},"fielddata_fields":["timestamp_offset","@timestamp","utc_time"]}\n' // eslint-disable-line max-len
payload: '{"index":"logstash-2015.04.21","ignore_unavailable":true}\n{"size":500,"sort":{"@timestamp":"desc"},"query":{"bool":{"must":[{"query_string":{"analyze_wildcard":true,"query":"*"}},{"bool":{"must":[{"range":{"@timestamp":{"gte":1429577068175,"lte":1429577968175}}}],"must_not":[]}}],"must_not":[]}},"highlight":{"pre_tags":["@kibana-highlighted-field@"],"post_tags":["@/kibana-highlighted-field@"],"fields":{"*":{}}},"aggs":{"2":{"date_histogram":{"field":"@timestamp","interval":"30s","min_doc_count":0,"extended_bounds":{"min":1429577068175,"max":1429577968175}}}},"fields":["*","_source"],"script_fields":{},"fielddata_fields":["timestamp_offset","@timestamp","utc_time"]}\n' // eslint-disable-line max-len
});
});

View file

@ -1,5 +1,7 @@
import SetupError from './setup_error';
import { format } from 'util';
import { mappings } from './kibana_index_mappings';
module.exports = function (server) {
const client = server.plugins.elasticsearch.client;
const index = server.config().get('kibana.index');
@ -16,16 +18,7 @@ module.exports = function (server) {
settings: {
number_of_shards: 1
},
mappings: {
config: {
properties: {
buildNum: {
type: 'string',
index: 'not_analyzed'
}
}
}
}
mappings
}
})
.catch(handleError('Unable to create Kibana index "<%= kibana.index %>"'))

View file

@ -1,23 +1,30 @@
import createAgent from './create_agent';
import mapUri from './map_uri';
import { resolve } from 'url';
import { assign } from 'lodash';
function createProxy(server, method, route, config) {
const options = {
method: method,
path: createProxy.createPath(route),
config: {
timeout: {
socket: server.config().get('elasticsearch.requestTimeout')
}
},
handler: {
proxy: {
mapUri: mapUri(server),
passThrough: true,
agent: createAgent(server),
xforward: true
xforward: true,
timeout: server.config().get('elasticsearch.requestTimeout')
}
},
};
if (config) options.config = config;
assign(options.config, config);
server.route(options);
};

View file

@ -9,6 +9,19 @@ import callWithRequest from './call_with_request';
module.exports = function (server) {
const config = server.config();
class ElasticsearchClientLogging {
error(err) {
server.log(['error', 'elasticsearch'], err);
}
warning(message) {
server.log(['warning', 'elasticsearch'], message);
}
info() {}
debug() {}
trace() {}
close() {}
}
function createClient(options) {
options = _.defaults(options || {}, {
url: config.get('elasticsearch.url'),
@ -19,6 +32,8 @@ module.exports = function (server) {
clientKey: config.get('elasticsearch.ssl.key'),
ca: config.get('elasticsearch.ssl.ca'),
apiVersion: config.get('elasticsearch.apiVersion'),
pingTimeout: config.get('elasticsearch.pingTimeout'),
requestTimeout: config.get('elasticsearch.requestTimeout'),
keepAlive: true,
auth: true
});
@ -45,21 +60,12 @@ module.exports = function (server) {
plugins: options.plugins,
apiVersion: options.apiVersion,
keepAlive: options.keepAlive,
pingTimeout: options.pingTimeout,
requestTimeout: options.requestTimeout,
defer: function () {
return Bluebird.defer();
},
log: function () {
this.error = function (err) {
server.log(['error', 'elasticsearch'], err);
};
this.warning = function (message) {
server.log(['warning', 'elasticsearch'], message);
};
this.info = _.noop;
this.debug = _.noop;
this.trace = _.noop;
this.close = _.noop;
}
log: ElasticsearchClientLogging
});
}
@ -69,6 +75,7 @@ module.exports = function (server) {
const noAuthClient = createClient({ auth: false });
server.on('close', _.bindKey(noAuthClient, 'close'));
server.expose('ElasticsearchClientLogging', ElasticsearchClientLogging);
server.expose('client', client);
server.expose('createClient', createClient);
server.expose('callWithRequestFactory', callWithRequest);

View file

@ -22,7 +22,7 @@ module.exports = function (plugin, server) {
plugin.status.yellow('Waiting for Elasticsearch');
function waitForPong() {
return client.ping({ requestTimeout: 1500 }).catch(function (err) {
return client.ping().catch(function (err) {
if (!(err instanceof NoConnections)) throw err;
plugin.status.red(format('Unable to connect to Elasticsearch at %s.', config.get('elasticsearch.url')));

View file

@ -1,5 +1,4 @@
import semver from 'semver';
const utils = require('requirefrom')('src/utils');
const rcVersionRegex = /(\d+\.\d+\.\d+)\-rc(\d+)/i;
module.exports = function (server, doc) {

View file

@ -0,0 +1,10 @@
export const mappings = {
config: {
properties: {
buildNum: {
type: 'string',
index: 'not_analyzed'
}
}
}
};

View file

@ -1,4 +1,5 @@
import upgrade from './upgrade_config';
import { mappings } from './kibana_index_mappings';
module.exports = function (server) {
const config = server.config();
@ -8,11 +9,16 @@ module.exports = function (server) {
type: 'config',
body: {
size: 1000,
sort: [ { buildNum: { order: 'desc', ignore_unmapped: true } } ]
sort: [
{
buildNum: {
order: 'desc',
unmapped_type: mappings.config.properties.buildNum.type
}
}
]
}
};
return client.search(options).then(upgrade(server));
};

View file

@ -3,8 +3,6 @@ import isUpgradeable from './is_upgradeable';
import _ from 'lodash';
import { format } from 'util';
const utils = require('requirefrom')('src/utils');
module.exports = function (server) {
const MAX_INTEGER = Math.pow(2, 53) - 1;
@ -54,4 +52,3 @@ module.exports = function (server) {
});
};
};

View file

@ -11,7 +11,7 @@ export default function HistogramVisType(Private) {
title: 'Vertical bar chart',
icon: 'fa-bar-chart',
description: 'The goto chart for oh-so-many needs. Great for time and non-time data. Stacked or grouped, ' +
'exact numbers or percentages. If you are not sure which chart your need, you could do worse than to start here.',
'exact numbers or percentages. If you are not sure which chart you need, you could do worse than to start here.',
params: {
defaults: {
shareYAxis: true,

View file

@ -127,7 +127,7 @@ app.controller('discover', function ($scope, config, courier, $route, $window, N
return {
query: $scope.searchSource.get('query') || '',
sort: getSort.array(savedSearch.sort, $scope.indexPattern),
columns: savedSearch.columns || ['_source'],
columns: savedSearch.columns.length > 0 ? savedSearch.columns : config.get('defaultColumns'),
index: $scope.indexPattern.id,
interval: 'auto',
filters: _.cloneDeep($scope.searchSource.getOwn('filter'))

View file

@ -3,7 +3,7 @@
<div class="bs-callout bs-callout-warning">
<h4>Caution: You can break stuff here</h4>
Be careful in here, these settings are for very advanced users only.
Tweaks you make here can break large portionsof Kibana. Some of these
Tweaks you make here can break large portions of Kibana. Some of these
settings may be undocumented, unsupported or experimental. If a field has
a default value, blanking the field will reset it to its default which
may be unacceptable given other configuration directives. Deleting a

View file

@ -13,7 +13,7 @@
<!-- tabs -->
<ul class="nav navbar-nav">
<li ng-class="{active: sidebar.section == 'data'}" ng-show="vis.type.schemas.metrics">
<li ng-class="{active: sidebar.section == 'data'}" ng-show="sidebar.showData">
<a class="navbar-link active" ng-click="sidebar.section='data'">Data</a>
</li>
<li ng-class="{active: sidebar.section == 'options'}">
@ -60,10 +60,9 @@
<div class="vis-editor-config" ng-show="sidebar.section == 'options'">
<!-- vis options -->
<vis-editor-vis-options vis="vis"></vis-editor-vis-options>
<vis-editor-vis-options vis="vis" saved-vis="savedVis"></vis-editor-vis-options>
</div>
</form>
</div>

View file

@ -16,7 +16,12 @@ uiModules
controller: function ($scope) {
$scope.$bind('vis', 'editableVis');
$scope.$bind('outputVis', 'vis');
this.section = _.get($scope, 'vis.type.requiresSearch') ? 'data' : 'options';
$scope.$watch('vis.type', (visType) => {
if (visType) {
this.showData = visType.schemas.buckets || visType.schemas.metrics;
this.section = this.section || (this.showData ? 'data' : 'options');
}
});
}
};
});

View file

@ -80,6 +80,7 @@
&-content {
.flex-parent();
z-index: 0;
// overrides for tablet and desktop
@media (min-width: @screen-md-min) {

View file

@ -11,6 +11,7 @@ uiModules
template: visOptionsTemplate,
scope: {
vis: '=',
savedVis: '=',
},
link: function ($scope, $el) {
const $optionContainer = $el.find('.visualization-options');

View file

@ -15,8 +15,6 @@ module.exports = function createMappingsFromPatternFields(fields) {
mapping = {
type: 'string',
index: 'analyzed',
omit_norms: true,
fielddata: {format: 'disabled'},
fields: {
raw: {type: 'string', index: 'not_analyzed', doc_values: true, ignore_above: 256}
}

View file

@ -59,8 +59,6 @@ module.exports = function registerPost(server) {
mapping: {
type: 'string',
index: 'analyzed',
omit_norms: true,
fielddata: {format: 'disabled'},
fields: {
raw: {type: 'string', index: 'not_analyzed', doc_values: true, ignore_above: 256}
}

View file

@ -1,10 +1,8 @@
import { union } from 'lodash';
import findSourceFiles from './findSourceFiles';
module.exports = (kibana) => {
let utils = require('requirefrom')('src/utils');
let fromRoot = utils('fromRoot');
import fromRoot from '../../utils/fromRoot';
export default (kibana) => {
return new kibana.Plugin({
config: (Joi) => {
return Joi.object({

View file

@ -27,6 +27,7 @@ window.__KBN__ = {
kbnIndex: '.kibana',
esShardTimeout: 1500,
esApiVersion: '2.0',
esRequestTimeout: '300000'
}
};

View file

@ -1,10 +1,10 @@
import Hapi from 'hapi';
import { constant, once, compact, flatten } from 'lodash';
import { promisify, resolve, fromNode } from 'bluebird';
import fromRoot from '../utils/fromRoot';
import pkg from '../utils/packageJson';
let utils = require('requirefrom')('src/utils');
let rootDir = utils('fromRoot')('.');
let pkg = utils('packageJson');
let rootDir = fromRoot('.');
module.exports = class KbnServer {
constructor(settings) {

View file

@ -1,10 +1,8 @@
import { resolve } from 'path';
import { fromNode as fn } from 'bluebird';
import expect from 'expect.js';
import requirefrom from 'requirefrom';
const requireFromTest = requirefrom('test');
const kbnTestServer = requireFromTest('utils/kbn_server');
import * as kbnTestServer from '../../../test/utils/kbn_server';
const basePath = '/kibana';
describe('Server basePath config', function () {

View file

@ -3,7 +3,7 @@ import Joi from 'joi';
import _ from 'lodash';
import override from './override';
import createDefaultSchema from './schema';
let pkg = require('requirefrom')('src/utils')('packageJson');
import pkg from '../../utils/packageJson';
import clone from './deepCloneWithBuffers';
import { zipObject } from 'lodash';

View file

@ -4,8 +4,7 @@ import path from 'path';
import { get } from 'lodash';
import { randomBytes } from 'crypto';
let utils = require('requirefrom')('src/utils');
let fromRoot = utils('fromRoot');
import fromRoot from '../../utils/fromRoot';
module.exports = () => Joi.object({
pkg: Joi.object({

View file

@ -1,8 +1,5 @@
import expect from 'expect.js';
import requirefrom from 'requirefrom';
const requireFromTest = requirefrom('test');
const kbnTestServer = requireFromTest('utils/kbn_server');
import * as kbnTestServer from '../../../../test/utils/kbn_server';
describe('cookie validation', function () {
let kbnServer;

View file

@ -1,9 +1,7 @@
import expect from 'expect.js';
import { fromNode as fn } from 'bluebird';
import { resolve } from 'path';
const requireFromTest = require('requirefrom')('test');
const kbnTestServer = requireFromTest('utils/kbn_server');
import * as kbnTestServer from '../../../../test/utils/kbn_server';
const nonDestructiveMethods = ['GET'];
const destructiveMethods = ['POST', 'PUT', 'DELETE'];

View file

@ -111,7 +111,7 @@ module.exports = async function (kbnServer, server, config) {
path: '/goto/{urlId}',
handler: async function (request, reply) {
const url = await shortUrlLookup.getUrl(request.params.urlId);
reply().redirect(url);
reply().redirect(config.get('server.basePath') + url);
}
});
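The one-line fix above can be sketched in isolation: the short-URL lookup returns an app-relative path, so a server mounted under a base path (e.g. `/kibana` behind a proxy) must prepend `server.basePath` before redirecting. The helper and values below are illustrative, not the actual server API.

```javascript
// Illustrative helper (not the real Kibana server code): prepend the
// configured basePath to the app-relative URL from shortUrlLookup.getUrl().
function buildRedirectTarget(basePath, url) {
  // basePath is '' when server.basePath is not configured, so this is a
  // no-op in the default setup.
  return basePath + url;
}

buildRedirectTarget('/kibana', '/app/kibana#/discover');
// → '/kibana/app/kibana#/discover'; without the fix the redirect dropped
// the '/kibana' prefix and 404'd behind a reverse proxy
```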

View file

@ -8,6 +8,42 @@ const defaultConfigSchema = Joi.object({
enabled: Joi.boolean().default(true)
}).default();
/**
* The server plugin class, used to extend the server
* and add custom behavior. A "scoped" plugin class is
 * created by the PluginApi class and provided to plugin
 * providers; it automatically binds all but the `opts`
 * argument.
*
* @class Plugin
* @param {KbnServer} kbnServer - the KbnServer this plugin
* belongs to.
* @param {String} path - the path from which the plugin hails
* @param {Object} pkg - the value of package.json for the plugin
 * @param {Object} opts - the options for this plugin
* @param {String} [opts.id=pkg.name] - the id for this plugin.
* @param {Object} [opts.uiExports] - a mapping of UiExport types
* to UI modules or metadata about
* the UI module
* @param {Array} [opts.require] - the other plugins that this plugin
* requires. These plugins must exist and
* be enabled for this plugin to function.
* The require'd plugins will also be
* initialized first, in order to make sure
* that dependencies provided by these plugins
* are available
* @param {String} [opts.version=pkg.version] - the version of this plugin
* @param {Function} [opts.init] - A function that will be called to initialize
* this plugin at the appropriate time.
* @param {Function} [opts.config] - A function that produces a configuration
* schema using Joi, which is passed as its
* first argument.
* @param {String|False} [opts.publicDir=path + '/public']
* - the public directory for this plugin. The final directory must
* have the name "public", though it can be located somewhere besides
* the root of the plugin. Set this to false to disable exposure of a
* public directory
*/
module.exports = class Plugin {
constructor(kbnServer, path, pkg, opts) {
this.kbnServer = kbnServer;
@ -18,7 +54,6 @@ module.exports = class Plugin {
this.uiExportsSpecs = opts.uiExports || {};
this.requiredIds = opts.require || [];
this.version = opts.version || pkg.version;
this.externalCondition = opts.initCondition || _.constant(true);
this.externalInit = opts.init || _.noop;
this.getConfigSchema = opts.config || _.noop;
this.init = _.once(this.init);
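To ground the documented options, here is a minimal sketch of how they flow into a plugin instance. The `PluginStub` class below only mirrors the constructor behavior described in the docblock (id defaulting to `pkg.name`, the `require` list, the version fallback) and is an assumption for illustration, not the real `kibana.Plugin`.

```javascript
// Stub standing in for kibana.Plugin, mirroring only the option handling
// documented in the docblock above; illustrative, not the real class.
class PluginStub {
  constructor(opts, pkg = { name: 'example_plugin', version: '1.0.0' }) {
    this.id = opts.id || pkg.name;              // [opts.id=pkg.name]
    this.requiredIds = opts.require || [];      // [opts.require]
    this.version = opts.version || pkg.version; // [opts.version=pkg.version]
  }
}

const kibana = { Plugin: PluginStub };

// A provider as a plugin author might write it:
const plugin = new kibana.Plugin({
  require: ['elasticsearch'], // must exist and be enabled first
});
// plugin.id falls back to 'example_plugin', plugin.version to '1.0.0'
```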

View file

@ -2,7 +2,7 @@
import PluginApi from './PluginApi';
import { inspect } from 'util';
import { get, indexBy } from 'lodash';
let Collection = require('requirefrom')('src')('utils/Collection');
import Collection from '../../utils/Collection';
let byIdCache = Symbol('byIdCache');
let pluginApis = Symbol('pluginApis');

View file

@ -64,6 +64,7 @@ module.exports = async (kbnServer, server, config) => {
defaultInjectedVars.kbnIndex = config.get('kibana.index');
}
if (config.has('elasticsearch')) {
defaultInjectedVars.esRequestTimeout = config.get('elasticsearch.requestTimeout');
defaultInjectedVars.esShardTimeout = config.get('elasticsearch.shardTimeout');
defaultInjectedVars.esApiVersion = config.get('elasticsearch.apiVersion');
}

View file

@ -4,8 +4,9 @@ import 'ui/number_list';
import AggTypesMetricsMetricAggTypeProvider from 'ui/agg_types/metrics/MetricAggType';
import AggTypesMetricsGetResponseAggConfigClassProvider from 'ui/agg_types/metrics/getResponseAggConfigClass';
import RegistryFieldFormatsProvider from 'ui/registry/field_formats';
export default function AggTypeMetricPercentileRanksProvider(Private) {
import getPercentileValue from './percentiles_get_value';
export default function AggTypeMetricPercentileRanksProvider(Private) {
var MetricAggType = Private(AggTypesMetricsMetricAggTypeProvider);
var getResponseAggConfigClass = Private(AggTypesMetricsGetResponseAggConfigClassProvider);
var fieldFormats = Private(RegistryFieldFormatsProvider);
@ -49,11 +50,7 @@ export default function AggTypeMetricPercentileRanksProvider(Private) {
return fieldFormats.getInstance('percent') || fieldFormats.getDefaultInstance('number');
},
getValue: function (agg, bucket) {
// values for 1, 5, and 10 will come back as 1.0, 5.0, and 10.0 so we
// parse the keys and respond with the value that matches
return _.find(bucket[agg.parentId] && bucket[agg.parentId].values, function (value, key) {
return agg.key === parseFloat(key);
}) / 100;
return getPercentileValue(agg, bucket) / 100;
}
});
};

View file

@ -5,8 +5,9 @@ import 'ui/number_list';
import AggTypesMetricsMetricAggTypeProvider from 'ui/agg_types/metrics/MetricAggType';
import AggTypesMetricsGetResponseAggConfigClassProvider from 'ui/agg_types/metrics/getResponseAggConfigClass';
import RegistryFieldFormatsProvider from 'ui/registry/field_formats';
export default function AggTypeMetricPercentilesProvider(Private) {
import getPercentileValue from './percentiles_get_value';
export default function AggTypeMetricPercentilesProvider(Private) {
var MetricAggType = Private(AggTypesMetricsMetricAggTypeProvider);
var getResponseAggConfigClass = Private(AggTypesMetricsGetResponseAggConfigClassProvider);
var fieldFormats = Private(RegistryFieldFormatsProvider);
@ -43,12 +44,6 @@ export default function AggTypeMetricPercentilesProvider(Private) {
return new ValueAggConfig(percent);
});
},
getValue: function (agg, bucket) {
// percentiles for 1, 5, and 10 will come back as 1.0, 5.0, and 10.0 so we
// parse the keys and respond with the value that matches
return _.find(bucket[agg.parentId] && bucket[agg.parentId].values, function (value, key) {
return agg.key === parseFloat(key);
});
}
getValue: getPercentileValue
});
};

View file

@ -0,0 +1,7 @@
import { find } from 'lodash';
export default function getPercentileValue(agg, bucket) {
const values = bucket[agg.parentId] && bucket[agg.parentId].values;
const percentile = find(values, value => agg.key === value.key);
return percentile ? percentile.value : NaN;
}
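The new helper above matches on the numeric `key` of each entry in the percentiles response rather than parsing property names like `"1.0"`, which is what the removed `getValue` implementations did. A self-contained sketch of the same logic, without lodash and with invented sample data:

```javascript
// Same logic as getPercentileValue, inlined with Array.prototype.find
// instead of lodash; the agg and bucket shapes are invented sample data.
function getPercentileValue(agg, bucket) {
  const values = bucket[agg.parentId] && bucket[agg.parentId].values;
  const percentile = (values || []).find(v => agg.key === v.key);
  return percentile ? percentile.value : NaN;
}

const agg = { parentId: '1', key: 95 };
const bucket = {
  '1': { values: [{ key: 50, value: 120 }, { key: 95, value: 310 }] }
};

getPercentileValue(agg, bucket);                        // 310
getPercentileValue({ parentId: '1', key: 99 }, bucket); // NaN (no match)
```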

View file

@ -32,7 +32,7 @@ export default function configDefaultsProvider() {
type: 'json',
value:
'[\n' +
' ["", "hh:mm:ss.SSS"],\n' +
' ["", "HH:mm:ss.SSS"],\n' +
' ["PT1S", "HH:mm:ss"],\n' +
' ["PT1M", "HH:mm"],\n' +
' ["PT1H",\n' +
@ -50,6 +50,10 @@ export default function configDefaultsProvider() {
value: null,
description: 'The index to access if no index is set',
},
'defaultColumns': {
value: ['_source'],
description: 'Columns displayed by default in the Discover tab',
},
'metaFields': {
value: ['_source', '_id', '_type', '_index', '_score'],
description: 'Fields that exist outside of _source to merge into our document when displaying it',
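The `hh` → `HH` change above switches the sub-second date format from a 12-hour to a 24-hour clock (moment.js format tokens). A plain-JavaScript sketch of the distinction, with an invented `formatHour` helper standing in for moment:

```javascript
// Illustrative only: moment's 'HH' token is a zero-padded 24-hour clock,
// 'hh' a zero-padded 12-hour clock. This helper mimics just that.
function formatHour(date, token) {
  const h = date.getHours();
  if (token === 'HH') return String(h).padStart(2, '0');
  if (token === 'hh') return String(h % 12 || 12).padStart(2, '0');
  throw new Error('unsupported token: ' + token);
}

const afternoon = new Date(2016, 2, 9, 13, 5, 0);
formatHour(afternoon, 'HH'); // '13' — unambiguous without an am/pm marker
formatHour(afternoon, 'hh'); // '01' — why the old default was misleading
```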

View file

@ -1,22 +1,22 @@
import sinon from 'auto-release-sinon';
import expect from 'expect.js';
import ngMock from 'ngMock';
import CourierDataSourceDocSourceProvider from 'ui/courier/data_source/doc_source';
import CourierFetchRequestDocProvider from 'ui/courier/fetch/request/doc';
import DocSourceProvider from '../../data_source/doc_source';
import DocRequestProvider from '../request/doc';
describe('Courier DocFetchRequest class', function () {
let storage;
let source;
let defer;
let req;
var storage;
var source;
var defer;
var req;
var setVersion;
let setVersion;
beforeEach(ngMock.module('kibana'));
beforeEach(ngMock.inject(function (Private, Promise, $injector) {
var DocSource = Private(CourierDataSourceDocSourceProvider);
var DocFetchRequest = Private(CourierFetchRequestDocProvider);
const DocSource = Private(DocSourceProvider);
const DocFetchRequest = Private(DocRequestProvider);
storage =
$injector.get('localStorage').store =

View file

@ -2,11 +2,12 @@ import ngMock from 'ngMock';
import expect from 'expect.js';
import sinon from 'auto-release-sinon';
import FetchProvider from 'ui/courier/fetch';
import IndexPatternProvider from 'fixtures/stubbed_logstash_index_pattern';
import searchResp from 'fixtures/search_response';
import CourierDataSourceDocSourceProvider from 'ui/courier/data_source/doc_source';
import CourierDataSourceSearchSourceProvider from 'ui/courier/data_source/search_source';
import FetchProvider from '../fetch';
import DocSourceProvider from '../../data_source/doc_source';
import SearchSourceProvider from '../../data_source/search_source';
describe('Fetch service', function () {
require('testUtils/noDigestPromises').activateForSuite();
@ -24,8 +25,8 @@ describe('Fetch service', function () {
Promise = $injector.get('Promise');
fetch = Private(FetchProvider);
indexPattern = Private(IndexPatternProvider);
DocSource = Private(CourierDataSourceDocSourceProvider);
SearchSource = Private(CourierDataSourceSearchSourceProvider);
DocSource = Private(DocSourceProvider);
SearchSource = Private(SearchSourceProvider);
}));
describe('#doc(docSource)', function () {

View file

@ -1,8 +1,9 @@
import _ from 'lodash';
import sinon from 'auto-release-sinon';
import expect from 'expect.js';
import ngMock from 'ngMock';
import CourierFetchFetchTheseProvider from 'ui/courier/fetch/_fetch_these';
import FetchTheseProvider from '../fetch_these';
describe('ui/courier/fetch/_fetch_these', () => {
let Promise;
@ -22,15 +23,15 @@ describe('ui/courier/fetch/_fetch_these', () => {
return fakeResponses;
}
PrivateProvider.swap(require('ui/courier/fetch/_call_client'), FakeResponsesProvider);
PrivateProvider.swap(require('ui/courier/fetch/_call_response_handlers'), FakeResponsesProvider);
PrivateProvider.swap(require('ui/courier/fetch/_continue_incomplete'), FakeResponsesProvider);
PrivateProvider.swap(require('ui/courier/fetch/call_client'), FakeResponsesProvider);
PrivateProvider.swap(require('ui/courier/fetch/call_response_handlers'), FakeResponsesProvider);
PrivateProvider.swap(require('ui/courier/fetch/continue_incomplete'), FakeResponsesProvider);
}));
beforeEach(ngMock.inject((Private, $injector) => {
$rootScope = $injector.get('$rootScope');
Promise = $injector.get('Promise');
fetchThese = Private(CourierFetchFetchTheseProvider);
fetchThese = Private(FetchTheseProvider);
request = mockRequest();
requests = [ request ];
}));

View file

@ -1,9 +0,0 @@
import CourierFetchRequestRequestProvider from 'ui/courier/fetch/request/request';
export default function CourierFetchIsRequestProvider(Private) {
var AbstractRequest = Private(CourierFetchRequestRequestProvider);
return function isRequest(obj) {
return obj instanceof AbstractRequest;
};
};

View file

@ -1,29 +1,31 @@
import _ from 'lodash';
import CourierFetchIsRequestProvider from 'ui/courier/fetch/_is_request';
import CourierFetchMergeDuplicateRequestsProvider from 'ui/courier/fetch/_merge_duplicate_requests';
import CourierFetchReqStatusProvider from 'ui/courier/fetch/_req_status';
import IsRequestProvider from './is_request';
import MergeDuplicatesRequestProvider from './merge_duplicate_requests';
import ReqStatusProvider from './req_status';
export default function CourierFetchCallClient(Private, Promise, es, esShardTimeout, sessionId) {
var isRequest = Private(CourierFetchIsRequestProvider);
var mergeDuplicateRequests = Private(CourierFetchMergeDuplicateRequestsProvider);
const isRequest = Private(IsRequestProvider);
const mergeDuplicateRequests = Private(MergeDuplicatesRequestProvider);
var ABORTED = Private(CourierFetchReqStatusProvider).ABORTED;
var DUPLICATE = Private(CourierFetchReqStatusProvider).DUPLICATE;
const ABORTED = Private(ReqStatusProvider).ABORTED;
const DUPLICATE = Private(ReqStatusProvider).DUPLICATE;
function callClient(strategy, requests) {
// merging docs can change status to DUPLICATE, capture new statuses
var statuses = mergeDuplicateRequests(requests);
const statuses = mergeDuplicateRequests(requests);
// get the actual list of requests that we will be fetching
var executable = statuses.filter(isRequest);
var execCount = executable.length;
const executable = statuses.filter(isRequest);
let execCount = executable.length;
// resolved by respond()
var esPromise;
var defer = Promise.defer();
let esPromise;
const defer = Promise.defer();
// for each respond with either the response or ABORTED
var respond = function (responses) {
const respond = function (responses) {
responses = responses || [];
return Promise.map(requests, function (req, i) {
switch (statuses[i]) {
@ -43,7 +45,7 @@ export default function CourierFetchCallClient(Private, Promise, es, esShardTime
// handle a request being aborted while being fetched
var requestWasAborted = Promise.method(function (req, i) {
const requestWasAborted = Promise.method(function (req, i) {
if (statuses[i] === ABORTED) {
defer.reject(new Error('Request was aborted twice?'));
}

View file

@ -1,12 +1,12 @@
import { SearchTimeout } from 'ui/errors';
import { RequestFailure } from 'ui/errors';
import { ShardFailure } from 'ui/errors';
import CourierFetchReqStatusProvider from 'ui/courier/fetch/_req_status';
import CourierFetchNotifierProvider from 'ui/courier/fetch/_notifier';
import { RequestFailure, SearchTimeout, ShardFailure } from 'ui/errors';
import ReqStatusProvider from './req_status';
import NotifierProvider from './notifier';
export default function CourierFetchCallResponseHandlers(Private, Promise) {
var ABORTED = Private(CourierFetchReqStatusProvider).ABORTED;
var INCOMPLETE = Private(CourierFetchReqStatusProvider).INCOMPLETE;
var notify = Private(CourierFetchNotifierProvider);
const ABORTED = Private(ReqStatusProvider).ABORTED;
const INCOMPLETE = Private(ReqStatusProvider).INCOMPLETE;
const notify = Private(NotifierProvider);
function callResponseHandlers(requests, responses) {
@ -15,7 +15,7 @@ export default function CourierFetchCallResponseHandlers(Private, Promise) {
return ABORTED;
}
var resp = responses[i];
let resp = responses[i];
if (resp.timed_out) {
notify.warning(new SearchTimeout());

View file

@ -1,10 +1,10 @@
import CourierFetchReqStatusProvider from 'ui/courier/fetch/_req_status';
import ReqStatusProvider from './req_status';
export default function CourierFetchContinueIncompleteRequests(Private) {
var INCOMPLETE = Private(CourierFetchReqStatusProvider).INCOMPLETE;
const INCOMPLETE = Private(ReqStatusProvider).INCOMPLETE;
function continueIncompleteRequests(strategy, requests, responses, fetchWithStrategy) {
var incomplete = [];
const incomplete = [];
responses.forEach(function (resp, i) {
if (resp === INCOMPLETE) {

View file

@ -1,18 +1,20 @@
import _ from 'lodash';
import CourierRequestQueueProvider from 'ui/courier/_request_queue';
import CourierFetchFetchTheseProvider from 'ui/courier/fetch/_fetch_these';
import CourierFetchCallResponseHandlersProvider from 'ui/courier/fetch/_call_response_handlers';
import CourierFetchReqStatusProvider from 'ui/courier/fetch/_req_status';
import RequestQueueProvider from '../_request_queue';
import FetchTheseProvider from './fetch_these';
import CallResponseHandlersProvider from './call_response_handlers';
import ReqStatusProvider from './req_status';
export default function fetchService(Private, Promise) {
var requestQueue = Private(CourierRequestQueueProvider);
var fetchThese = Private(CourierFetchFetchTheseProvider);
const requestQueue = Private(RequestQueueProvider);
const fetchThese = Private(FetchTheseProvider);
var callResponseHandlers = Private(CourierFetchCallResponseHandlersProvider);
var INCOMPLETE = Private(CourierFetchReqStatusProvider).INCOMPLETE;
const callResponseHandlers = Private(CallResponseHandlersProvider);
const INCOMPLETE = Private(ReqStatusProvider).INCOMPLETE;
function fetchQueued(strategy) {
var requests = requestQueue.getStartable(strategy);
const requests = requestQueue.getStartable(strategy);
if (!requests.length) return Promise.resolve();
else return fetchThese(requests);
}
@ -20,7 +22,7 @@ export default function fetchService(Private, Promise) {
this.fetchQueued = fetchQueued;
function fetchASource(source, strategy) {
var defer = Promise.defer();
const defer = Promise.defer();
fetchThese([
source._createRequest(defer)

View file

@ -1,22 +1,22 @@
import CourierFetchNotifierProvider from 'ui/courier/fetch/_notifier';
import CourierFetchForEachStrategyProvider from 'ui/courier/fetch/_for_each_strategy';
import CourierFetchCallClientProvider from 'ui/courier/fetch/_call_client';
import CourierFetchCallResponseHandlersProvider from 'ui/courier/fetch/_call_response_handlers';
import CourierFetchContinueIncompleteProvider from 'ui/courier/fetch/_continue_incomplete';
import CourierFetchReqStatusProvider from 'ui/courier/fetch/_req_status';
import NotifierProvider from './notifier';
import ForEachStrategyProvider from './for_each_strategy';
import CallClientProvider from './call_client';
import CallResponseHandlersProvider from './call_response_handlers';
import ContinueIncompleteProvider from './continue_incomplete';
import ReqStatusProvider from './req_status';
export default function FetchTheseProvider(Private, Promise) {
var notify = Private(CourierFetchNotifierProvider);
var forEachStrategy = Private(CourierFetchForEachStrategyProvider);
const notify = Private(NotifierProvider);
const forEachStrategy = Private(ForEachStrategyProvider);
// core tasks
var callClient = Private(CourierFetchCallClientProvider);
var callResponseHandlers = Private(CourierFetchCallResponseHandlersProvider);
var continueIncomplete = Private(CourierFetchContinueIncompleteProvider);
const callClient = Private(CallClientProvider);
const callResponseHandlers = Private(CallResponseHandlersProvider);
const continueIncomplete = Private(ContinueIncompleteProvider);
var ABORTED = Private(CourierFetchReqStatusProvider).ABORTED;
var DUPLICATE = Private(CourierFetchReqStatusProvider).DUPLICATE;
var INCOMPLETE = Private(CourierFetchReqStatusProvider).INCOMPLETE;
const ABORTED = Private(ReqStatusProvider).ABORTED;
const DUPLICATE = Private(ReqStatusProvider).DUPLICATE;
const INCOMPLETE = Private(ReqStatusProvider).INCOMPLETE;
function fetchThese(requests) {
return forEachStrategy(requests, function (strategy, reqsForStrategy) {
@ -66,7 +66,7 @@ export default function FetchTheseProvider(Private, Promise) {
}
return new Promise(function (resolve) {
var action = req.started ? req.continue : req.start;
const action = req.started ? req.continue : req.start;
resolve(action.call(req));
})
.catch(err => req.handleFailure(err));

View file

@ -1,13 +1,13 @@
import _ from 'lodash';
export default function FetchForEachRequestStrategy(Private, Promise) {
export default function FetchForEachRequestStrategy(Private, Promise) {
function forEachStrategy(requests, block) {
block = Promise.method(block);
var sets = [];
const sets = [];
requests.forEach(function (req) {
var strategy = req.strategy;
var set = _.find(sets, { 0: strategy });
const strategy = req.strategy;
const set = _.find(sets, { 0: strategy });
if (set) set[1].push(req);
else sets.push([strategy, [req]]);
});

View file

@ -0,0 +1,9 @@
import AbstractRequestProvider from './request';
export default function IsRequestProvider(Private) {
const AbstractRequest = Private(AbstractRequestProvider);
return function isRequest(obj) {
return obj instanceof AbstractRequest;
};
};

View file

@ -1,17 +1,17 @@
import CourierFetchIsRequestProvider from 'ui/courier/fetch/_is_request';
import CourierFetchReqStatusProvider from 'ui/courier/fetch/_req_status';
import IsRequestProvider from './is_request';
import ReqStatusProvider from './req_status';
export default function FetchMergeDuplicateRequests(Private) {
var isRequest = Private(CourierFetchIsRequestProvider);
var DUPLICATE = Private(CourierFetchReqStatusProvider).DUPLICATE;
const isRequest = Private(IsRequestProvider);
const DUPLICATE = Private(ReqStatusProvider).DUPLICATE;
function mergeDuplicateRequests(requests) {
// dedupe requests
var index = {};
const index = {};
return requests.map(function (req) {
if (!isRequest(req)) return req;
var iid = req.source._instanceid;
const iid = req.source._instanceid;
if (!index[iid]) {
// this request is unique so far
index[iid] = req;
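The dedupe step can be sketched on its own: requests sharing a source `_instanceid` collapse into the first one seen, and later slots are replaced by a duplicate marker so responses can be fanned back out by position. The marker and request shapes below are invented for illustration; the real courier also skips non-request entries and shares a `DUPLICATE` status object from `req_status`.

```javascript
// Simplified sketch of mergeDuplicateRequests; marker and request shapes
// are illustrative, not the actual courier types.
const DUPLICATE = Symbol('duplicate');

function mergeDuplicateRequests(requests) {
  const index = {};
  return requests.map(req => {
    const iid = req.source._instanceid;
    if (!index[iid]) {
      index[iid] = req; // first request for this source wins
      return req;
    }
    return DUPLICATE;   // later requests collapse into the first
  });
}

const a = { source: { _instanceid: 's1' } };
const b = { source: { _instanceid: 's1' } };
const c = { source: { _instanceid: 's2' } };
mergeDuplicateRequests([a, b, c]); // [a, DUPLICATE, c]
```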

View file

@ -1,10 +1,11 @@
import sinon from 'auto-release-sinon';
import expect from 'expect.js';
import ngMock from 'ngMock';
import CourierFetchRequestSegmentedProvider from 'ui/courier/fetch/request/segmented';
import CourierFetchRequestSearchProvider from 'ui/courier/fetch/request/search';
describe('ui/courier/fetch/request/segmented', () => {
import SegmentedRequestProvider from '../segmented';
import SearchRequestProvider from '../search';
describe('ui/courier/fetch/request/segmented', () => {
let Promise;
let $rootScope;
let SegmentedReq;
@ -16,8 +17,8 @@ describe('ui/courier/fetch/request/segmented', () => {
beforeEach(ngMock.inject((Private, $injector) => {
Promise = $injector.get('Promise');
$rootScope = $injector.get('$rootScope');
SegmentedReq = Private(CourierFetchRequestSegmentedProvider);
searchReqStart = sinon.spy(Private(CourierFetchRequestSearchProvider).prototype, 'start');
SegmentedReq = Private(SegmentedRequestProvider);
searchReqStart = sinon.spy(Private(SearchRequestProvider).prototype, 'start');
}));
describe('#start()', () => {

View file

@ -1,8 +1,11 @@
import sinon from 'auto-release-sinon';
import expect from 'expect.js';
import ngMock from 'ngMock';
import StubbedSearchSourceProvider from 'fixtures/stubbed_search_source';
import CourierFetchRequestSegmentedProvider from 'ui/courier/fetch/request/segmented';
import SegmentedRequestProvider from '../segmented';
describe('ui/courier/fetch/request/segmented/_createQueue', () => {
let Promise;
@ -16,7 +19,7 @@ describe('ui/courier/fetch/request/segmented/_createQueue', () => {
beforeEach(ngMock.inject((Private, $injector) => {
Promise = $injector.get('Promise');
$rootScope = $injector.get('$rootScope');
SegmentedReq = Private(CourierFetchRequestSegmentedProvider);
SegmentedReq = Private(SegmentedRequestProvider);
MockSource = class {
constructor() {
@ -29,7 +32,7 @@ describe('ui/courier/fetch/request/segmented/_createQueue', () => {
const req = new SegmentedReq(new MockSource());
req._queueCreated = null;
var promise = req._createQueue();
const promise = req._createQueue();
expect(req._queueCreated).to.be(false);
await promise;
expect(req._queueCreated).to.be(true);

View file

@ -6,7 +6,8 @@ import sinon from 'auto-release-sinon';
import HitSortFnProv from 'plugins/kibana/discover/_hit_sort_fn';
import NoDigestPromises from 'testUtils/noDigestPromises';
import StubbedSearchSourceProvider from 'fixtures/stubbed_search_source';
import CourierFetchRequestSegmentedProvider from 'ui/courier/fetch/request/segmented';
import SegmentedRequestProvider from '../segmented';
describe('Segmented Request Index Selection', function () {
let Promise;
@ -22,7 +23,7 @@ describe('Segmented Request Index Selection', function () {
Promise = $injector.get('Promise');
HitSortFn = Private(HitSortFnProv);
$rootScope = $injector.get('$rootScope');
SegmentedReq = Private(CourierFetchRequestSegmentedProvider);
SegmentedReq = Private(SegmentedRequestProvider);
MockSource = class {
constructor() {

View file

@ -6,7 +6,8 @@ import sinon from 'auto-release-sinon';
import HitSortFnProv from 'plugins/kibana/discover/_hit_sort_fn';
import NoDigestPromises from 'testUtils/noDigestPromises';
import StubbedSearchSourceProvider from 'fixtures/stubbed_search_source';
import CourierFetchRequestSegmentedProvider from 'ui/courier/fetch/request/segmented';
import SegmentedRequestProvider from '../segmented';
describe('Segmented Request Size Picking', function () {
let Promise;
@ -22,7 +23,7 @@ describe('Segmented Request Size Picking', function () {
Promise = $injector.get('Promise');
HitSortFn = Private(HitSortFnProv);
$rootScope = $injector.get('$rootScope');
SegmentedReq = Private(CourierFetchRequestSegmentedProvider);
SegmentedReq = Private(SegmentedRequestProvider);
MockSource = class {
constructor() {

View file

@ -1,29 +0,0 @@
import _ from 'lodash';
import EventsProvider from 'ui/events';
export default function CourierSegmentedReqHandle(Private) {
var Events = Private(EventsProvider);
/**
* Simple class for creating an object to send to the
* requester of a SegmentedRequest. Since the SegmentedRequest
* extends AbstractRequest, it wasn't able to be the event
* emitter it was born to be. This provides a channel for
* setting values on the segmented request, and an event
* emitter for the request to speak outwardly
*
 * @param {SegmentedRequest} req - the request this handle relates to
*/
_.class(SegmentedHandle).inherits(Events);
function SegmentedHandle(req) {
SegmentedHandle.Super.call(this);
// export a couple methods from the request
this.setDirection = _.bindKey(req, 'setDirection');
this.setSize = _.bindKey(req, 'setSize');
this.setMaxSegments = _.bindKey(req, 'setMaxSegments');
this.setSortFn = _.bindKey(req, 'setSortFn');
}
return SegmentedHandle;
};

View file

@ -1,42 +1,43 @@
import _ from 'lodash';
import CourierFetchStrategyDocProvider from 'ui/courier/fetch/strategy/doc';
import CourierFetchRequestRequestProvider from 'ui/courier/fetch/request/request';
import DocStrategyProvider from '../strategy/doc';
import AbstractRequestProvider from './request';
export default function DocRequestProvider(Private) {
var docStrategy = Private(CourierFetchStrategyDocProvider);
var AbstractRequest = Private(CourierFetchRequestRequestProvider);
const docStrategy = Private(DocStrategyProvider);
const AbstractRequest = Private(AbstractRequestProvider);
_.class(DocRequest).inherits(AbstractRequest);
function DocRequest(source, defer) {
DocRequest.Super.call(this, source, defer);
class DocRequest extends AbstractRequest {
constructor(...args) {
super(...args);
this.type = 'doc';
this.strategy = docStrategy;
}
DocRequest.prototype.canStart = function () {
var parent = DocRequest.Super.prototype.canStart.call(this);
if (!parent) return false;
var version = this.source._version;
var storedVersion = this.source._getStoredVersion();
// conditions that equal "fetch This DOC!"
var unknown = !version && !storedVersion;
var mismatch = version !== storedVersion;
return Boolean(mismatch || (unknown && !this.started));
};
DocRequest.prototype.handleResponse = function (resp) {
if (resp.found) {
this.source._storeVersion(resp._version);
} else {
this.source._clearVersion();
this.type = 'doc';
this.strategy = docStrategy;
}
return DocRequest.Super.prototype.handleResponse.call(this, resp);
};
canStart() {
const parent = super.canStart();
if (!parent) return false;
const version = this.source._version;
const storedVersion = this.source._getStoredVersion();
// conditions that equal "fetch This DOC!"
const unknown = !version && !storedVersion;
const mismatch = version !== storedVersion;
return Boolean(mismatch || (unknown && !this.started));
}
handleResponse(resp) {
if (resp.found) {
this.source._storeVersion(resp._version);
} else {
this.source._clearVersion();
}
return super.handleResponse(resp);
}
}
return DocRequest;
};

View file

@ -1,15 +1,16 @@
import CourierErrorHandlersProvider from 'ui/courier/_error_handlers';
import Notifier from 'ui/notify/notifier';
export default function RequestErrorHandlerFactory(Private) {
var errHandlers = Private(CourierErrorHandlersProvider);
import ErrorHandlersProvider from '../../_error_handlers';
var notify = new Notifier({
export default function RequestErrorHandlerFactory(Private) {
const errHandlers = Private(ErrorHandlersProvider);
const notify = new Notifier({
location: 'Courier Fetch Error'
});
function handleError(req, error) {
var myHandlers = [];
const myHandlers = [];
errHandlers.splice(0).forEach(function (handler) {
(handler.source === req.source ? myHandlers : errHandlers).push(handler);

View file

@ -1,115 +1,110 @@
import _ from 'lodash';
import moment from 'moment';
import errors from 'ui/errors';
import RequestQueueProvider from '../../_request_queue';
import ErrorHandlerRequestProvider from './error_handler';

export default function AbstractReqProvider(Private, Promise) {
  const requestQueue = Private(RequestQueueProvider);
  const requestErrorHandler = Private(ErrorHandlerRequestProvider);

  return class AbstractReq {
    constructor(source, defer) {
      this.source = source;
      this.defer = defer || Promise.defer();
      this._whenAbortedHandlers = [];

      requestQueue.push(this);
    }

    canStart() {
      return Boolean(!this.stopped && !this.source._fetchDisabled);
    }

    start() {
      if (this.started) {
        throw new TypeError('Unable to start request because it has already started');
      }

      this.started = true;
      this.moment = moment();

      const source = this.source;
      if (source.activeFetchCount) {
        source.activeFetchCount += 1;
      } else {
        source.activeFetchCount = 1;
      }

      source.history = [this];
    }

    getFetchParams() {
      return this.source._flatten();
    }

    transformResponse(resp) {
      return resp;
    }

    filterError(resp) {
      return false;
    }

    handleResponse(resp) {
      this.success = true;
      this.resp = resp;
    }

    handleFailure(error) {
      this.success = false;
      this.resp = error && error.resp;
      this.retry();
      return requestErrorHandler(this, error);
    }

    isIncomplete() {
      return false;
    }

    continue() {
      throw new Error('Unable to continue ' + this.type + ' request');
    }

    retry() {
      const clone = this.clone();
      this.abort();
      return clone;
    }

    // don't want people overriding this, so it becomes a natural
    // part of .abort() and .complete()
    _markStopped() {
      if (this.stopped) return;
      this.stopped = true;
      this.source.activeFetchCount -= 1;
      _.pull(requestQueue, this);
    }

    abort() {
      this._markStopped();
      this.defer = null;
      this.aborted = true;
      _.callEach(this._whenAbortedHandlers);
    }

    whenAborted(cb) {
      this._whenAbortedHandlers.push(cb);
    }

    complete() {
      this._markStopped();
      this.ms = this.moment.diff() * -1;
      this.defer.resolve(this.resp);
    }

    clone() {
      return new this.constructor(this.source, this.defer);
    }
  };
};
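The refactor above keeps the same lifecycle contract as the old prototype version: every request registers itself in a shared queue on construction, and `abort()` and `complete()` both funnel through a single non-overridable `_markStopped()`. A minimal standalone sketch of that pattern, with illustrative names and no Kibana dependencies:

```javascript
// Standalone sketch of the request lifecycle: a shared module-level queue,
// a start/stop flag, and abort handlers fired exactly once.
const requestQueue = [];

class Req {
  constructor(source) {
    this.source = source;
    this.stopped = false;
    this._whenAbortedHandlers = [];
    requestQueue.push(this);
  }

  canStart() {
    return Boolean(!this.stopped && !this.source._fetchDisabled);
  }

  _markStopped() {
    if (this.stopped) return;
    this.stopped = true;
    // remove this request from the shared queue
    const i = requestQueue.indexOf(this);
    if (i > -1) requestQueue.splice(i, 1);
  }

  abort() {
    this._markStopped();
    this.aborted = true;
    this._whenAbortedHandlers.forEach(fn => fn());
  }

  whenAborted(cb) {
    this._whenAbortedHandlers.push(cb);
  }
}

const req = new Req({ _fetchDisabled: false });
let aborted = false;
req.whenAborted(() => { aborted = true; });
req.abort();
// req is now stopped, out of the queue, and its abort handlers have fired
```

Collecting handlers in `_whenAbortedHandlers` from the constructor (instead of lazily creating the array, as the old code did) is what lets the new `abort()` call `_.callEach` unconditionally.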

View file

@ -1,19 +1,17 @@
import _ from 'lodash';

import SearchStrategyProvider from '../strategy/search';
import AbstractRequestProvider from './request';

export default function SearchReqProvider(Private) {
  const searchStrategy = Private(SearchStrategyProvider);
  const AbstractRequest = Private(AbstractRequestProvider);

  return class SearchReq extends AbstractRequest {
    constructor(...args) {
      super(...args);

      this.type = 'search';
      this.strategy = searchStrategy;
    }
  };
};

View file

@ -1,334 +1,337 @@
import _ from 'lodash';
import { isNumber } from 'lodash';

import Notifier from 'ui/notify/notifier';
import SearchRequestProvider from './search';
import SegmentedHandleProvider from './segmented_handle';

export default function SegmentedReqProvider(es, Private, Promise, timefilter, config) {
  const SearchReq = Private(SearchRequestProvider);
  const SegmentedHandle = Private(SegmentedHandleProvider);

  const notify = new Notifier({
    location: 'Segmented Fetch'
  });

  class SegmentedReq extends SearchReq {
    constructor(source, defer, initFn) {
      super(source, defer);

      this.type = 'segmented';

      // segmented request specific state
      this._initFn = initFn;

      this._desiredSize = null;
      this._maxSegments = config.get('courier:maxSegmentCount');
      this._direction = 'desc';
      this._sortFn = null;
      this._queueCreated = false;
      this._handle = new SegmentedHandle(this);

      this._hitWindow = null;

      // prevent the source from changing between requests,
      // all calls will return the same promise
      this._getFlattenedSource = _.once(this._getFlattenedSource);
    }

    /*********
     ** SearchReq overrides
     *********/

    start() {
      this._complete = [];
      this._active = null;
      this._segments = [];
      this._all = [];
      this._queue = [];

      this._mergedResp = {
        took: 0,
        hits: {
          hits: [],
          total: 0,
          max_score: 0
        }
      };

      // give the request consumer a chance to receive each segment and set
      // parameters via the handle
      if (_.isFunction(this._initFn)) this._initFn(this._handle);
      return this._createQueue().then((queue) => {
        this._all = queue.slice(0);

        // Send the initial fetch status
        this._reportStatus();

        return super.start();
      });
    }

    continue() {
      return this._reportStatus();
    }

    getFetchParams() {
      return this._getFlattenedSource().then(flatSource => {
        const params = _.cloneDeep(flatSource);

        // calculate the number of indices to fetch in this request in order to prevent
        // more than this._maxSegments requests. We use Math.max(1, n) to ensure that each request
        // has at least one index pattern, and Math.floor() to make sure that if the
        // number of indices does not round out evenly the extra index is tacked onto the last
        // request, making sure the first request returns faster.
        const remainingSegments = this._maxSegments - this._segments.length;
        const indexCount = Math.max(1, Math.floor(this._queue.length / remainingSegments));

        const indices = this._active = this._queue.splice(0, indexCount);
        params.index = _.pluck(indices, 'index');

        if (isNumber(this._desiredSize)) {
          params.body.size = this._pickSizeForIndices(indices);
        }

        return params;
      });
    }

    handleResponse(resp) {
      return this._consumeSegment(resp);
    }

    filterError(resp) {
      if (/ClusterBlockException.*index\sclosed/.test(resp.error)) {
        this._consumeSegment(false);
        return true;
      }
    }

    isIncomplete() {
      const queueNotCreated = !this._queueCreated;
      const queueNotEmpty = this._queue.length > 0;
      return queueNotCreated || queueNotEmpty;
    }

    clone() {
      return new SegmentedReq(this.source, this.defer, this._initFn);
    }

    complete() {
      this._reportStatus();
      this._handle.emit('complete');
      return super.complete();
    }

    /*********
     ** SegmentedReq specific methods
     *********/

    /**
     * Set the total number of segments to emit
     *
     * @param {number} maxSegments
     */
    setMaxSegments(maxSegments) {
      this._maxSegments = Math.max(_.parseInt(maxSegments), 1);
    }

    /**
     * Set the sort direction for the request.
     *
     * @param {string} dir - one of 'asc' or 'desc'
     */
    setDirection(dir) {
      switch (dir) {
        case 'asc':
        case 'desc':
          return (this._direction = dir);
        default:
          throw new TypeError('unknown sort direction "' + dir + '"');
      }
    }

    /**
     * Set the function that will be used to sort the rows
     *
     * @param {function} sortFn
     */
    setSortFn(sortFn) {
      this._sortFn = sortFn;
    }

    /**
     * Set the total number of documents to emit
     *
     * Setting to false will not limit the documents,
     * if a number is set the size of the request to es
     * will be updated on each new request
     *
     * @param {number|false} totalSize
     */
    setSize(totalSize) {
      this._desiredSize = _.parseInt(totalSize);
      if (isNaN(this._desiredSize)) this._desiredSize = null;
    }

    _createQueue() {
      const timeBounds = timefilter.getBounds();
      const indexPattern = this.source.get('index');
      this._queueCreated = false;

      return indexPattern.toDetailedIndexList(timeBounds.min, timeBounds.max, this._direction)
      .then(queue => {
        if (!_.isArray(queue)) queue = [queue];
        this._queue = queue;
        this._queueCreated = true;
        return queue;
      });
    }

    _reportStatus() {
      return this._handle.emit('status', {
        total: this._queueCreated ? this._all.length : NaN,
        complete: this._queueCreated ? this._complete.length : NaN,
        remaining: this._queueCreated ? this._queue.length : NaN,
        hitCount: this._queueCreated ? this._mergedResp.hits.hits.length : NaN
      });
    }

    _getFlattenedSource() {
      return this.source._flatten();
    }

    _consumeSegment(seg) {
      const index = this._active;
      this._complete.push(index);
      if (!seg) return; // segment was ignored/filtered, don't store it

      const hadHits = _.get(this._mergedResp, 'hits.hits.length') > 0;
      const gotHits = _.get(seg, 'hits.hits.length') > 0;
      const firstHits = !hadHits && gotHits;
      const haveHits = hadHits || gotHits;

      this._mergeSegment(seg);
      this.resp = _.omit(this._mergedResp, '_bucketIndex');

      if (firstHits) this._handle.emit('first', seg);
      if (gotHits) this._handle.emit('segment', seg);
      if (haveHits) this._handle.emit('mergedSegment', this.resp);
    }

    _mergeHits(hits) {
      const mergedHits = this._mergedResp.hits.hits;
      const desiredSize = this._desiredSize;
      const sortFn = this._sortFn;

      _.pushAll(hits, mergedHits);

      if (sortFn) {
        notify.event('resort rows', function () {
          mergedHits.sort(sortFn);
        });
      }

      if (isNumber(desiredSize)) {
        this._mergedResp.hits.hits = mergedHits.slice(0, desiredSize);
      }
    }

    _mergeSegment(seg) {
      const merged = this._mergedResp;

      this._segments.push(seg);

      merged.took += seg.took;
      merged.hits.total += seg.hits.total;
      merged.hits.max_score = Math.max(merged.hits.max_score, seg.hits.max_score);

      if (_.size(seg.hits.hits)) {
        this._mergeHits(seg.hits.hits);
        this._detectHitsWindow(merged.hits.hits);
      }

      if (!seg.aggregations) return;

      Object.keys(seg.aggregations).forEach(function (aggKey) {
        if (!merged.aggregations) {
          // start merging aggregations
          merged.aggregations = {};
          merged._bucketIndex = {};
        }

        if (!merged.aggregations[aggKey]) {
          merged.aggregations[aggKey] = {
            buckets: []
          };
        }

        seg.aggregations[aggKey].buckets.forEach(function (bucket) {
          let mbucket = merged._bucketIndex[bucket.key];
          if (mbucket) {
            mbucket.doc_count += bucket.doc_count;
            return;
          }

          mbucket = merged._bucketIndex[bucket.key] = bucket;
          merged.aggregations[aggKey].buckets.push(mbucket);
        });
      });
    }

    _detectHitsWindow(hits) {
      hits = hits || [];
      const indexPattern = this.source.get('index');
      const desiredSize = this._desiredSize;

      const size = _.size(hits);
      if (!isNumber(desiredSize) || size < desiredSize) {
        this._hitWindow = {
          size: size,
          min: -Infinity,
          max: Infinity
        };
        return;
      }

      let min;
      let max;

      hits.forEach(function (deepHit) {
        const hit = indexPattern.flattenHit(deepHit);
        const time = hit[indexPattern.timeFieldName];
        if (min == null || time < min) min = time;
        if (max == null || time > max) max = time;
      });

      this._hitWindow = { size, min, max };
    }

    _pickSizeForIndices(indices) {
      const hitWindow = this._hitWindow;
      const desiredSize = this._desiredSize;

      if (!isNumber(desiredSize)) return null;
      // we don't have any hits yet, get us more info!
      if (!hitWindow) return desiredSize;
      // the order of documents isn't important, just get us more
      if (!this._sortFn) return Math.max(desiredSize - hitWindow.size, 0);

      // if all of the documents in every index fall outside of our current doc set, we can ignore them.
      const someOverlap = indices.some(function (index) {
        return index.min <= hitWindow.max && hitWindow.min <= index.max;
      });

      return someOverlap ? desiredSize : 0;
    }
  }

  SegmentedReq.prototype._mergeSegment = notify.timed('merge response segment', SegmentedReq.prototype._mergeSegment);

  return SegmentedReq;
};
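The comment in `getFetchParams()` deserves a worked example: `Math.max(1, Math.floor(queue / remainingSegments))` sizes each request so at most `_maxSegments` requests are made, with the remainder falling on later requests so the first ones return faster. A standalone sketch of that allocation:

```javascript
// How the segmented fetch splits a queue of indices across segments:
// each request takes floor(remaining-indices / remaining-segments) indices
// (at least 1), so earlier requests are smaller and return sooner.
function indexCountFor(queueLength, remainingSegments) {
  return Math.max(1, Math.floor(queueLength / remainingSegments));
}

function planSegments(totalIndices, maxSegments) {
  const counts = [];
  let queue = totalIndices;
  let remaining = maxSegments;
  while (queue > 0 && remaining > 0) {
    const n = Math.min(queue, indexCountFor(queue, remaining));
    counts.push(n);
    queue -= n;
    remaining -= 1;
  }
  return counts;
}

const plan = planSegments(10, 4);
// 10 indices over 4 segments: [2, 2, 3, 3] - the remainder lands at the end
```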

View file

@ -0,0 +1,40 @@
import EventsProvider from 'ui/events';
export default function CourierSegmentedReqHandle(Private) {
const Events = Private(EventsProvider);
const segmentedRequest = Symbol('Actual Segmented Request');
/**
* Simple class for creating an object to send to the
* requester of a SegmentedRequest. Since the SegmentedRequest
* extends AbstractRequest, it wasn't able to be the event
* emitter it was born to be. This provides a channel for
* setting values on the segmented request, and an event
* emitter for the request to speak outwardly
*
* @param {SegmentedRequest} req - the request this handle relates to
*/
return class SegmentedHandle extends Events {
constructor(req) {
super();
this[segmentedRequest] = req;
}
setDirection(...args) {
this[segmentedRequest].setDirection(...args);
}
setSize(...args) {
this[segmentedRequest].setSize(...args);
}
setMaxSegments(...args) {
this[segmentedRequest].setMaxSegments(...args);
}
setSortFn(...args) {
this[segmentedRequest].setSortFn(...args);
}
};
};
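The module-local `Symbol` is what keeps the handle one-way: consumers can call the setters, but cannot reach the underlying request, because the key never leaves this file. A self-contained sketch of the pattern:

```javascript
// Minimal sketch of the Symbol-as-private-field pattern used by
// SegmentedHandle: the wrapped object is keyed by a module-local Symbol,
// so ordinary property access and Object.keys() can't reach it from
// outside this module (only Object.getOwnPropertySymbols could).
const wrapped = Symbol('wrapped request');

class Handle {
  constructor(req) {
    this[wrapped] = req;
  }

  setSize(...args) {
    return this[wrapped].setSize(...args);
  }
}

const req = { size: null, setSize(n) { this.size = n; } };
const handle = new Handle(req);
handle.setSize(500);
// the call was forwarded, but Object.keys(handle) stays empty
```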

View file

@ -2,7 +2,9 @@ import _ from 'lodash';
import sinon from 'auto-release-sinon';
import expect from 'expect.js';
import ngMock from 'ngMock';
import SearchStrategyProvider from '../search';
describe('ui/courier/fetch/strategy/search', () => {
let Promise;
@ -15,7 +17,7 @@ describe('ui/courier/fetch/strategy/search', () => {
beforeEach(ngMock.inject((Private, $injector) => {
Promise = $injector.get('Promise');
$rootScope = $injector.get('$rootScope');
search = Private(CourierFetchStrategySearchProvider);
search = Private(SearchStrategyProvider);
reqsFetchParams = [
{
index: ['logstash-123'],

View file

@ -1,6 +1,8 @@
import _ from 'lodash';
import angular from 'angular';
import { toJson } from 'ui/utils/aggressive_parse';
export default function FetchStrategyForSearch(Private, Promise, timefilter) {
return {
@ -20,7 +22,7 @@ export default function FetchStrategyForSearch(Private, Promise, timefilter) {
return indexList;
}
const timeBounds = timefilter.getBounds();
return indexList.toIndexList(timeBounds.min, timeBounds.max);
})
.then(function (indexList) {

View file

@ -15,25 +15,25 @@ module.directive('fieldName', function ($compile, $rootScope, $filter) {
var typeIcon = function (fieldType) {
switch (fieldType) {
case 'source':
  return '<i title="source" class="fa fa-file-text-o "></i>';
case 'string':
  return '<i title="string"><strong>t</strong></i>';
case 'murmur3':
  return '<i title="murmur3"><strong>h</strong></i>';
case 'number':
  return '<i title="number"><strong>#</strong></i>';
case 'date':
  return '<i title="date" class="fa fa-clock-o"></i>';
case 'ip':
  return '<i title="ip" class="fa fa-laptop"></i>';
case 'geo_point':
  return '<i title="geo_point" class="fa fa-globe"></i>';
case 'boolean':
  return '<i title="boolean" class="fa fa-adjust"></i>';
case 'conflict':
  return '<i title="conflict" class="fa fa-warning"></i>';
default:
  return '<i title="unknown"><strong>?</strong></i>';
}
};

View file

@ -5,13 +5,13 @@ import uiModules from 'ui/modules';
var es; // share the client amoungst all apps
uiModules
.get('kibana', ['elasticsearch', 'kibana/config'])
.service('es', function (esFactory, esUrl, $q, esApiVersion, esRequestTimeout) {
if (es) return es;
es = esFactory({
host: esUrl,
log: 'info',
requestTimeout: esRequestTimeout,
apiVersion: esApiVersion,
plugins: [function (Client, config) {

View file

@ -1,15 +1,10 @@
import uiModules from 'ui/modules';
import { words, capitalize } from 'lodash';

uiModules
.get('kibana')
.filter('label', function () {
  return function (str) {
    return words(str).map(capitalize).join(' ');
  };
});
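The rewritten filter delegates splitting and casing to lodash's `words()` and `capitalize()`. A self-contained approximation of what those calls do (the real lodash `words()` also splits camelCase boundaries, which this simplified stand-in does not):

```javascript
// Simplified stand-ins for the lodash helpers the new filter relies on.
function words(str) {
  return str.match(/[A-Za-z0-9]+/g) || [];
}

// capitalize() upper-cases the first character and lower-cases the rest,
// unlike the old capFirst, which left the tail of each word untouched.
function capitalize(word) {
  return word.charAt(0).toUpperCase() + word.slice(1).toLowerCase();
}

function label(str) {
  return words(str).map(capitalize).join(' ');
}
```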

View file

@ -71,7 +71,7 @@ module.service('Promise', function ($q, $timeout) {
});
Promise.try = function (fn, args, ctx) {
if (typeof fn !== 'function') {
return Promise.reject(new TypeError('fn must be a function'));
}
var value;
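Rejecting with a `TypeError` instead of a bare string matters because `Error` instances carry a stack trace and survive `instanceof` checks in downstream `.catch()` handlers. A minimal sketch of the guard (not the Angular-wrapped service itself):

```javascript
// Sketch of Promise.try's guard: reject with a real Error so callers can
// inspect err.message, err.stack, and branch on instanceof.
function tryFn(fn) {
  if (typeof fn !== 'function') {
    return Promise.reject(new TypeError('fn must be a function'));
  }
  try {
    return Promise.resolve(fn());
  } catch (err) {
    return Promise.reject(err);
  }
}

tryFn('not a function').catch(err => {
  // err instanceof TypeError is true; a string rejection would fail this check
});
```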

View file

@ -1,15 +1,18 @@
import chrome from 'ui/chrome';
export default function createUrlShortener(Notifier, $http, $location) {
const notify = new Notifier({
location: 'Url Shortener'
});
const basePath = chrome.getBasePath();
const baseUrl = `${$location.protocol()}://${$location.host()}:${$location.port()}${basePath}`;
async function shortenUrl(url) {
const relativeUrl = url.replace(baseUrl, '');
const formData = { url: relativeUrl };
try {
const result = await $http.post(`${basePath}/shorten`, formData);
return `${baseUrl}/goto/${result.data}`;
} catch (err) {

View file

@ -6,14 +6,14 @@
class="form-control url">
</input>
<button
class="shorten-button btn btn-default"
tooltip="Generate Short URL"
ng-click="generateShortUrl()"
ng-disabled="shortGenerated">
<span aria-hidden="true" class="fa fa-compress"></span>
</button>
<button
class="clipboard-button btn btn-default"
tooltip="Copy to Clipboard"
ng-click="copyToClipboard()">
<span aria-hidden="true" class="fa fa-clipboard"></span>

View file

@ -305,7 +305,7 @@ describe('Marker Tests', function () {
var arr = markerLayer._dataToHeatArray(max);
var index = _.random(mapData.features.length - 1);
var feature = mapData.features[index];
var featureValue = feature.properties.value / max;
var featureArr = feature.geometry.coordinates.slice(0).concat(featureValue);
expect(arr[index]).to.eql(featureArr);
});

View file

@ -106,6 +106,9 @@ export default function DispatchClass(Private) {
var isClickable = this.listenerCount('click') > 0;
var addEvent = this.addEvent;
var $el = this.handler.el;
if (!this.handler.highlight) {
this.handler.highlight = self.highlight;
}
function hover(d, i) {
// Add pointer if item is clickable
@ -113,7 +116,7 @@ export default function DispatchClass(Private) {
self.addMousePointer.call(this, arguments);
}
self.handler.highlight.call(this, $el);
self.emit('hover', self.eventResponse(d, i));
}
@ -129,9 +132,12 @@ export default function DispatchClass(Private) {
var self = this;
var addEvent = this.addEvent;
var $el = this.handler.el;
if (!this.handler.unHighlight) {
this.handler.unHighlight = self.unHighlight;
}
function mouseout() {
self.handler.unHighlight.call(this, $el);
}
return addEvent('mouseout', mouseout);
@ -225,21 +231,24 @@ export default function DispatchClass(Private) {
* Mouseover Behavior
*
* @param element {D3.Selection}
* @method highlight
*/
Dispatch.prototype.highlight = function (element) {
var label = this.getAttribute('data-label');
if (!label) return;
//Opacity 1 is needed to avoid the css application
$('[data-label]', element.parentNode).css('opacity', 1).not(
  function (els, el) { return `${$(el).data('label')}` === label;}
).css('opacity', 0.5);
};
/**
* Mouseout Behavior
*
* @param element {D3.Selection}
* @method unHighlight
*/
Dispatch.prototype.unHighlight = function (element) {
$('[data-label]', element.parentNode).css('opacity', 1);
};
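The dispatch change introduces an override hook: the dispatcher calls `handler.highlight`/`handler.unHighlight`, which default to the shared implementations unless a chart type (like the overlapping area chart below) installs its own. A sketch of that fallback pattern, with illustrative names rather than the real vislib API surface:

```javascript
// Sketch of the handler-override hook: use the chart-supplied highlight
// function when present, otherwise fall back to the shared default.
const defaultHighlight = label => `dim everything except ${label}`;

function buildHandler(customHighlight) {
  const handler = {};
  if (customHighlight) handler.highlight = customHighlight;
  // same shape as the dispatch code: only assign the default when unset
  if (!handler.highlight) handler.highlight = defaultHighlight;
  return handler;
}

const plain = buildHandler(null);
const area = buildHandler(label => `half-fade everything except ${label}`);
```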

View file

@ -33,7 +33,26 @@ export default function AreaChartFactory(Private) {
if (this.isOverlapping) {
// Default opacity should return to 0.6 on mouseout
var defaultOpacity = 0.6;
handler._attr.defaultOpacity = defaultOpacity;
handler.highlight = function (element) {
var label = this.getAttribute('data-label');
if (!label) return;
var highlightOpacity = 0.8;
var highlightElements = $('[data-label]', element.parentNode).filter(
function (els, el) {
return `${$(el).data('label')}` === label;
});
$('[data-label]', element.parentNode).not(highlightElements).css('opacity', defaultOpacity / 2); // half of the default opacity
highlightElements.css('opacity', highlightOpacity);
};
handler.unHighlight = function (element) {
$('[data-label]', element).css('opacity', defaultOpacity);
//The legend should keep max opacity
$('[data-label]', $(element).siblings()).css('opacity', 1);
};
}
this.checkIfEnoughData();

View file

@ -199,7 +199,7 @@ export default function HeatmapMarkerFactory(Private) {
heatIntensity = feature.properties.value;
} else {
// show bucket value normalized to max value
heatIntensity = feature.properties.value / max;
}
return [lat, lng, heatIntensity];
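The change from `parseInt(feature.properties.value / max * 100)` to `feature.properties.value / max` swaps an integer 0-100 scale for a fractional 0-1 ratio, which also stops small buckets from truncating to zero intensity. A quick comparison:

```javascript
// Old formula: truncate the ratio to an integer percentage. Any bucket
// under 1% of the max collapsed to intensity 0.
function oldIntensity(value, max) {
  return parseInt(value / max * 100, 10);
}

// New formula: keep the raw ratio, so small buckets stay distinguishable.
function newIntensity(value, max) {
  return value / max;
}
```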

View file

@ -6,8 +6,8 @@
<li
ng-repeat="legendData in labels track by legendData.label"
ng-mouseenter="highlight($event)"
ng-mouseleave="unhighlight($event)"
data-label="{{legendData.label}}"
class="legend-value color">

View file

@ -29,12 +29,18 @@ uiModules.get('kibana')
refresh();
});
$scope.highlight = function (event) {
var el = event.currentTarget;
var handler = $scope.renderbot.vislibVis.handler;
if (!handler) return;
handler.highlight.call(el, handler.el);
};
$scope.unhighlight = function (event) {
var el = event.currentTarget;
var handler = $scope.renderbot.vislibVis.handler;
if (!handler) return;
handler.unHighlight.call(el, handler.el);
};
$scope.setColor = function (label, color) {

View file

@ -1,6 +1,6 @@
import _ from 'lodash';
import UiApp from './ui_app';
import Collection from '../utils/Collection';
let byIdCache = Symbol('byId');

View file

@ -1,5 +1,5 @@
let { defaults } = require('lodash');
let babelOptions = require('../../src/optimize/babelOptions');
module.exports = {
build: {

View file

@ -2,7 +2,7 @@ module.exports = function (grunt) {
var resolve = require('path').resolve;
var directory = resolve(__dirname, '../../esvm');
var dataDir = resolve(directory, 'data_dir');
var uiConfig = require('../../test/serverConfig');
return {
options: {

View file

@ -21,7 +21,6 @@ module.exports = {
'mapping': {
'type': 'string',
'index': 'analyzed',
'fields': {
'raw': {
'index': 'not_analyzed',

View file

@ -21,7 +21,6 @@ module.exports = {
'mapping': {
'type': 'string',
'index': 'analyzed',
'fields': {
'raw': {
'index': 'not_analyzed',

View file

@ -1,11 +1,8 @@
import { defaultsDeep, set } from 'lodash';

import { header as basicAuthHeader } from './base_auth';
import { kibanaUser, kibanaServer } from '../shield';
import KbnServer from '../../src/server/KbnServer';
import fromRoot from '../../src/utils/fromRoot';
const SERVER_DEFAULTS = {
server: {