Merge branch 'master' into addDiscoverIntervalTests

LeeDr 2016-01-20 08:27:07 -06:00
commit 3da9d84fe7
49 changed files with 658 additions and 180 deletions


@ -4,16 +4,16 @@
At any given time the Kibana team at Elastic is working on dozens of features and enhancements to Kibana and other projects at Elastic. When you file an issue we'll take the time to digest it, consider solutions, and weigh its applicability to both the broad Kibana user base and our own goals for the project. Once we've completed that process we will assign the issue a priority.
- **P1**: A high-priority issue that affects almost all Kibana users. Bugs that would cause incorrect results, security issues, and features that would vastly improve the user experience for everyone. Workarounds for P1s generally don't exist without a code change.
- **P2**: A broadly applicable, high-visibility issue that enhances the usability of Kibana for a majority of users.
- **P3**: Nice-to-have bug fixes or functionality. Workarounds for P3 items generally exist.
- **P4**: Niche and special-interest issues that may not fit our core goals. We would take a high-quality pull request for this if implemented in such a way that it does not meaningfully impact other functionality or existing code. Issues may also be labeled P4 if they would be better implemented in Elasticsearch.
- **P5**: Highly niche or in opposition to our core goals. Should usually be closed. This doesn't mean we wouldn't take a pull for it, but if someone really wanted this they would be better off working on a plugin. The Kibana team will usually not work on P5 issues but may be willing to assist plugin developers on IRC.
#### How to express the importance of an issue
Let's just get this out there: **Feel free to +1 an issue**. That said, a +1 isn't a vote. We keep up on highly commented issues, but comments are but one of many reasons we might, or might not, work on an issue. A solid write-up of your use case is more likely to make your case than a comment that says *+10000*.
#### My issue isn't getting enough attention
First of all, sorry about that; we want you to have a great time with Kibana! You should join us on IRC (#kibana on freenode) and chat about it. GitHub is terrible for conversations. With that out of the way, there are a number of variables that go into deciding what to work on. These include priority, impact, difficulty, applicability to use cases, and, last but importantly, what we feel like working on.
### I want to help!
**Now we're talking**. If you have a bugfix or new feature that you would like to contribute to Kibana, please **find or open an issue about it before you start working on it.** Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.
@ -74,6 +74,20 @@ optimize:
lazyPrebuild: false
```
#### SSL
When Kibana runs in development mode, it will automatically use bundled SSL certificates. These certificates won't be trusted by your OS by default, which will likely cause your browser to complain about the cert. You can deal with this in a few ways:
1. Supply your own cert using the `config/kibana.dev.yml` file.
1. Configure your OS to trust the cert:
- OSX: https://www.accuweaver.com/2014/09/19/make-chrome-accept-a-self-signed-certificate-on-osx/
- Windows: http://stackoverflow.com/a/1412118
- Linux: http://unix.stackexchange.com/a/90607
1. Click through the warning and accept future warnings.
1. Disable SSL with the `--no-ssl` flag:
- `npm start -- --no-ssl`
#### Linting
A note about linting: We use [eslint](http://eslint.org) to check that the [styleguide](STYLEGUIDE.md) is being followed. It runs in a pre-commit hook and as a part of the tests, but most contributors integrate it with their code editors for real-time feedback.


@ -31,6 +31,8 @@ For example, a chart of dates with incident counts can display dates in chronolo
priority of the incident-reporting aggregation to show the most active dates first. The chronological order might show
a time-dependent pattern in incident count, and sorting by active dates can reveal particular outliers in your data.
include::color-picker.asciidoc[]
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.


@ -0,0 +1,4 @@
You can customize the colors of your visualization by clicking the color dot next to each label to display the
_color picker_.
image::images/color-picker.png[An array of color dots that users can select]


@ -71,8 +71,13 @@ in your Web page.
NOTE: A user must have Kibana access in order to view embedded dashboards.
Click the *Share* button to display HTML code to embed the dashboard in another Web page, along with a direct link to
the dashboard. You can select the text in either option to copy the code or the link to your clipboard.
To share a dashboard, click the *Share* button image:images/share-dashboard.png[] to display the _Sharing_ panel.
image:images/sharing-panel.png[]
Click the *Copy to Clipboard* button image:images/share-link.png[] to copy the native URL or embed HTML to the clipboard.
Click the *Generate short URL* button image:images/share-short-link.png[] to create a shortened URL for sharing or
embedding.
[float]
[[embedding-dashboards]]


@ -195,7 +195,8 @@ yellow open logstash-2015.05.20 5 1 4750 0 16.4mb
[[tutorial-define-index]]
=== Defining Your Index Patterns
Each set of data loaded to Elasticsearch has an <<settings-create-pattern,index pattern>>. In the previous section, the
Shakespeare data set has an index named `shakespeare`, and the accounts
data set has an index named `bank`. An _index pattern_ is a string with optional wildcards that can match multiple
indices. For example, in the common logging use case, a typical index name contains the date in MM-DD-YYYY
format, and an index pattern for May would look something like `logstash-2015.05*`.
@ -211,6 +212,9 @@ The Logstash data set does contain time-series data, so after clicking *Add New*
set, make sure the *Index contains time-based events* box is checked and select the `@timestamp` field from the
*Time-field name* drop-down.
NOTE: When you define an index pattern, indices that match that pattern must exist in Elasticsearch. Those indices must
contain data.
[float]
[[tutorial-discovering]]
=== Discovering Your Data
@ -288,8 +292,10 @@ This shows you what proportion of the 1000 accounts fall in these balance ranges
we're going to add another bucket aggregation. We can break down each of the balance ranges further by the account
holder's age.
Click *Add sub-buckets* at the bottom, then select *Split Slices*. Choose the *Terms* aggregation and the *age* field from
the drop-downs.
Click the green *Apply changes* button image:images/apply-changes-button.png[] to add an external ring with the new
results.
image::images/tutorial-visualize-pie-3.png[]
@ -321,7 +327,8 @@ as well as change many other options for your visualizations, by clicking the *O
Now that you have a list of the smallest casts for Shakespeare plays, you might also be curious to see which of these
plays makes the greatest demands on an individual actor by showing the maximum number of speeches for a given part. Add
a Y-axis aggregation with the *Add metrics* button, then choose the *Max* aggregation for the *speech_number* field. In
the *Options* tab, change the *Bar Mode* drop-down to *grouped*, then click the green *Apply changes* button
image:images/apply-changes-button.png[]. Your
chart should now look like this:
image::images/tutorial-visualize-bar-3.png[]
@ -371,7 +378,8 @@ Write the following text in the field:
The Markdown widget uses **markdown** syntax.
> Blockquotes in Markdown use the > character.
Click the green *Apply changes* button image:images/apply-changes-button.png[] to display the rendered Markdown in the
preview pane:
image::images/tutorial-visualize-md-2.png[]

Binary files not shown: five new images added (28 KiB, 554 B, 589 B, 507 B, and 62 KiB); only docs/images/share-link.png is named in this listing.


@ -11,6 +11,8 @@ if the splits are displayed in a row or a column by clicking the *Rows | Columns
include::x-axis-aggs.asciidoc[]
include::color-picker.asciidoc[]
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.


@ -55,6 +55,8 @@ types.
When multiple aggregations are defined on a chart's axis, you can use the up or down arrows to the right of the
aggregation's type to change the aggregation's priority.
include::color-picker.asciidoc[]
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.


@ -1,27 +1,27 @@
[[releasenotes]]
== Kibana 4.3 Release Notes
== Kibana 4.4 Release Notes
The 4.3 release of Kibana requires Elasticsearch 2.1 or later.
The 4.4 release of Kibana requires Elasticsearch 2.2 or later.
Using event times to create index names is *deprecated* in this release of Kibana. Support for this functionality will be
removed entirely in the next major Kibana release. Elasticsearch 2.1 includes sophisticated date parsing APIs that Kibana
uses to determine date information, removing the need to specify dates in the index pattern name.
Using event times to create index names is no longer supported as of this release. Current versions of Elasticsearch
include sophisticated date parsing APIs that Kibana uses to determine date information, removing the need to specify dates
in the index pattern name.
[float]
[[enhancements]]
== Enhancements
* {k4issue}5109[Issue 5109]: Adds custom JSON and filter alias naming for filters.
* {k4issue}1726[Issue 1726]: Adds a color field formatter for value ranges in numeric fields.
* {k4issue}4342[Issue 4342]: Increased performance for wildcard indices.
* {k4issue}1600[Issue 1600]: Support for global time zones.
* {k4pull}5275[Pull Request 5275]: Highlighting values in Discover can now be disabled.
* {k4issue}5212[Issue 5212]: Adds support for multiple certificate authorities.
* {k4issue}2716[Issue 2716]: The open/closed position of the spy panel now persists across UI state changes.
// * {k4issue}5109[Issue 5109]: Adds custom JSON and filter alias naming for filters.
// * {k4issue}1726[Issue 1726]: Adds a color field formatter for value ranges in numeric fields.
// * {k4issue}4342[Issue 4342]: Increased performance for wildcard indices.
// * {k4issue}1600[Issue 1600]: Support for global time zones.
// * {k4pull}5275[Pull Request 5275]: Highlighting values in Discover can now be disabled.
// * {k4issue}5212[Issue 5212]: Adds support for multiple certificate authorities.
// * {k4issue}2716[Issue 2716]: The open/closed position of the spy panel now persists across UI state changes.
[float]
[[bugfixes]]
== Bug Fixes
* {k4issue}5165[Issue 5165]: Resolves a display error in embedded views.
* {k4issue}5021[Issue 5021]: Improves visualization dimming for dashboards with auto-refresh.
// * {k4issue}5165[Issue 5165]: Resolves a display error in embedded views.
// * {k4issue}5021[Issue 5021]: Improves visualization dimming for dashboards with auto-refresh.


@ -35,11 +35,17 @@ list.
contains time-based events* option and select the index field that contains the timestamp. Kibana reads the index
mapping to list all of the fields that contain a timestamp.
. By default, Kibana restricts wildcard expansion of time-based index patterns to indices with data within the currently
selected time range. Click *Do not expand index pattern when searching* to disable this behavior.
. Click *Create* to add the index pattern.
. To designate the new pattern as the default pattern to load when you view the Discover tab, click the *favorite*
button.
NOTE: When you define an index pattern, indices that match that pattern must exist in Elasticsearch. Those indices must
contain data.
To use an event time in an index name, enclose the static text in the pattern and specify the date format using the
tokens described in the following table.
@ -195,6 +201,8 @@ Scripted fields compute data on the fly from the data in your Elasticsearch indi
the Discover tab as part of the document data, and you can use scripted fields in your visualizations.
Scripted field values are computed at query time so they aren't indexed and cannot be searched.
NOTE: Kibana cannot query scripted fields.
WARNING: Computing data on the fly with scripted fields can be very resource intensive and can have a direct impact on
Kibana's performance. Keep in mind that there's no built-in validation of a scripted field. If your scripts are
buggy, you'll get exceptions whenever you try to view the dynamically generated data.
@ -449,10 +457,12 @@ To export a set of objects:
. Click the selection box for the objects you want to export, or click the *Select All* box.
. Click *Export* to select a location to write the exported JSON.
WARNING: Exported dashboards do not include their associated index patterns. Re-create the index patterns manually before
importing saved dashboards to a Kibana instance running on another Elasticsearch cluster.
To import a set of objects:
. Go to *Settings > Objects*.
. Click *Import* to navigate to the JSON file representing the set of objects to import.
. Click *Open* after selecting the JSON file.
. If any objects in the set would overwrite objects already present in Kibana, confirm the overwrite.


@ -34,6 +34,8 @@ if the splits are displayed in a row or a column by clicking the *Rows | Columns
include::x-axis-aggs.asciidoc[]
include::color-picker.asciidoc[]
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.


@ -53,6 +53,7 @@
"precommit": "grunt precommit",
"karma": "karma start",
"elasticsearch": "grunt esvm:dev:keepalive",
"elasticsearchWithPlugins": "grunt esvm:withPlugins:keepalive",
"lint": "grunt eslint:source",
"lintroller": "grunt eslint:fixSource",
"mocha": "mocha --compilers js:babel/register",


@ -0,0 +1,98 @@
import { Server } from 'hapi';
import { notFound } from 'boom';
import { merge, sample } from 'lodash';
import { format as formatUrl } from 'url';
import { fromNode } from 'bluebird';
import { Agent as HttpsAgent } from 'https';
import { readFileSync } from 'fs';
import Config from '../../server/config/config';
import setupConnection from '../../server/http/setup_connection';
import setupLogging from '../../server/logging';
const alphabet = 'abcdefghijklmnopqrstuvwxyz'.split('');
export default class BasePathProxy {
constructor(clusterManager, userSettings) {
this.clusterManager = clusterManager;
this.server = new Server();
const config = Config.withDefaultSchema(userSettings);
this.targetPort = config.get('dev.basePathProxyTarget');
this.basePath = config.get('server.basePath');
const { cert } = config.get('server.ssl');
if (cert) {
this.proxyAgent = new HttpsAgent({
ca: readFileSync(cert)
});
}
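// default to a random three-letter base path when none is configured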
if (!this.basePath) {
this.basePath = `/${sample(alphabet, 3).join('')}`;
config.set('server.basePath', this.basePath);
}
setupLogging(null, this.server, config);
setupConnection(null, this.server, config);
this.setupRoutes();
}
setupRoutes() {
const { server, basePath, targetPort } = this;
server.route({
method: 'GET',
path: '/',
handler(req, reply) {
return reply.redirect(basePath);
}
});
server.route({
method: '*',
path: `${basePath}/{kbnPath*}`,
handler: {
proxy: {
passThrough: true,
xforward: true,
agent: this.proxyAgent,
mapUri(req, callback) {
callback(null, formatUrl({
protocol: server.info.protocol,
hostname: server.info.host,
port: targetPort,
pathname: req.params.kbnPath,
query: req.query,
}));
}
}
}
});
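// GET requests for app paths under a stale three-letter base path are redirected to the current one; anything else is a 404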
server.route({
method: '*',
path: `/{oldBasePath}/{kbnPath*}`,
handler(req, reply) {
const {oldBasePath, kbnPath = ''} = req.params;
const isGet = req.method === 'get';
const isBasePath = oldBasePath.length === 3;
const isApp = kbnPath.slice(0, 4) === 'app/';
if (isGet && isBasePath && isApp) {
return reply.redirect(`${basePath}/${kbnPath}`);
}
return reply(notFound());
}
});
}
async listen() {
await fromNode(cb => this.server.start(cb));
this.server.log(['listening', 'info'], `basePath Proxy running at ${this.server.info.uri}${this.basePath}`);
}
}


@ -1,15 +1,22 @@
let cluster = require('cluster');
let { join } = require('path');
let { debounce, compact, invoke, bindAll, once } = require('lodash');
const cluster = require('cluster');
const { join } = require('path');
const { format: formatUrl } = require('url');
const Hapi = require('hapi');
const { debounce, compact, get, invoke, bindAll, once, sample } = require('lodash');
let Log = require('../Log');
let Worker = require('./worker');
const Log = require('../Log');
const Worker = require('./worker');
const BasePathProxy = require('./base_path_proxy');
process.env.kbnWorkerType = 'managr';
module.exports = class ClusterManager {
constructor(opts) {
constructor(opts, settings) {
this.log = new Log(opts.quiet, opts.silent);
this.addedCount = 0;
this.basePathProxy = new BasePathProxy(this, settings);
this.workers = [
this.optimizer = new Worker({
type: 'optmzr',
@ -17,14 +24,19 @@ module.exports = class ClusterManager {
log: this.log,
argv: compact([
'--plugins.initialize=false',
'--server.autoListen=false'
'--server.autoListen=false',
`--server.basePath=${this.basePathProxy.basePath}`
]),
watch: false
}),
this.server = new Worker({
type: 'server',
log: this.log
log: this.log,
argv: compact([
`--server.port=${this.basePathProxy.targetPort}`,
`--server.basePath=${this.basePathProxy.basePath}`
])
})
];
@ -48,12 +60,13 @@ module.exports = class ClusterManager {
startCluster() {
this.setupManualRestart();
invoke(this.workers, 'start');
this.basePathProxy.listen();
}
setupWatching() {
var chokidar = require('chokidar');
let utils = require('requirefrom')('src/utils');
let fromRoot = utils('fromRoot');
const chokidar = require('chokidar');
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
this.watcher = chokidar.watch([
'src/plugins',
@ -81,12 +94,12 @@ module.exports = class ClusterManager {
}
setupManualRestart() {
let readline = require('readline');
let rl = readline.createInterface(process.stdin, process.stdout);
const readline = require('readline');
const rl = readline.createInterface(process.stdin, process.stdout);
let nls = 0;
let clear = () => nls = 0;
let clearSoon = debounce(clear, 2000);
const clear = () => nls = 0;
const clearSoon = debounce(clear, 2000);
rl.setPrompt('');
rl.prompt();


@ -76,11 +76,11 @@ describe('kibana cli', function () {
options = { install: 'dummy/dummy', pluginDir: fromRoot('installedPlugins') };
});
it('should require the user to specify either install and remove', function () {
it('should require the user to specify either install, remove, or list', function () {
options.install = null;
parser = settingParser(options);
expect(parser.parse).withArgs().to.throwError(/Please specify either --install or --remove./);
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
});
it('should not allow the user to specify both install and remove', function () {
@ -88,7 +88,32 @@ describe('kibana cli', function () {
options.install = 'org/package/version';
parser = settingParser(options);
expect(parser.parse).withArgs().to.throwError(/Please specify either --install or --remove./);
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
});
it('should not allow the user to specify both install and list', function () {
options.list = true;
options.install = 'org/package/version';
parser = settingParser(options);
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
});
it('should not allow the user to specify both remove and list', function () {
options.list = true;
options.remove = 'package';
parser = settingParser(options);
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
});
it('should not allow the user to specify install, remove, and list', function () {
options.list = true;
options.install = 'org/package/version';
options.remove = 'package';
parser = settingParser(options);
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
});
describe('quiet option', function () {
@ -293,7 +318,7 @@ describe('kibana cli', function () {
describe('remove option', function () {
it('should set settings.action property to "remove"', function () {
options.install = null;
delete options.install;
options.remove = 'package';
parser = settingParser(options);
@ -303,7 +328,7 @@ describe('kibana cli', function () {
});
it('should allow one part to the remove parameter', function () {
options.install = null;
delete options.install;
options.remove = 'test-plugin';
parser = settingParser(options);
@ -312,8 +337,8 @@ describe('kibana cli', function () {
expect(settings).to.have.property('package', 'test-plugin');
});
it('should not allow more than one part to the install parameter', function () {
options.install = null;
it('should not allow more than one part to the remove parameter', function () {
delete options.install;
options.remove = 'kibana/test-plugin';
parser = settingParser(options);
@ -322,7 +347,7 @@ describe('kibana cli', function () {
});
it('should populate the pluginPath', function () {
options.install = null;
delete options.install;
options.remove = 'test-plugin';
parser = settingParser(options);
@ -334,6 +359,21 @@ describe('kibana cli', function () {
});
describe('list option', function () {
it('should set settings.action property to "list"', function () {
delete options.install;
delete options.remove;
options.list = true;
parser = settingParser(options);
var settings = parser.parse();
expect(settings).to.have.property('action', 'list');
});
});
});
});


@ -3,6 +3,7 @@ const fromRoot = utils('fromRoot');
const settingParser = require('./setting_parser');
const installer = require('./plugin_installer');
const remover = require('./plugin_remover');
const lister = require('./plugin_lister');
const pluginLogger = require('./plugin_logger');
export default function pluginCli(program) {
@ -18,11 +19,16 @@ export default function pluginCli(program) {
const logger = pluginLogger(settings);
if (settings.action === 'install') {
installer.install(settings, logger);
}
if (settings.action === 'remove') {
remover.remove(settings, logger);
switch (settings.action) {
case 'install':
installer.install(settings, logger);
break;
case 'remove':
remover.remove(settings, logger);
break;
case 'list':
lister.list(settings, logger);
break;
}
}
@ -30,6 +36,7 @@ export default function pluginCli(program) {
.command('plugin')
.option('-i, --install <org>/<plugin>/<version>', 'The plugin to install')
.option('-r, --remove <plugin>', 'The plugin to remove')
.option('-l, --list', 'List installed plugins')
.option('-q, --quiet', 'Disable all process messaging except errors')
.option('-s, --silent', 'Disable all process messaging')
.option('-u, --url <url>', 'Specify download url')


@ -0,0 +1,8 @@
const fs = require('fs');
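// logs the name of each entry in the plugin directory, one per line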
export function list(settings, logger) {
fs.readdirSync(settings.pluginDir)
.forEach(function (pluginFile) {
logger.log(pluginFile);
});
}


@ -1,5 +1,6 @@
const { resolve } = require('path');
const expiry = require('expiry-js');
import { intersection } from 'lodash';
export default function createSettingParser(options) {
function parseMilliseconds(val) {
@ -22,6 +23,10 @@ export default function createSettingParser(options) {
return 'https://download.elastic.co/' + settings.organization + '/' + settings.package + '/' + filename;
}
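// true when more than one of the given option names is present on the options object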
function areMultipleOptionsChosen(options, choices) {
return intersection(Object.keys(options), choices).length > 1;
}
function parse() {
let parts;
let settings = {
@ -84,8 +89,12 @@ export default function createSettingParser(options) {
settings.package = parts.shift();
}
if (!settings.action || (options.install && options.remove)) {
throw new Error('Please specify either --install or --remove.');
if (options.list) {
settings.action = 'list';
}
if (!settings.action || areMultipleOptionsChosen(options, [ 'install', 'remove', 'list' ])) {
throw new Error('Please specify either --install, --remove, or --list.');
}
settings.pluginDir = options.pluginDir;


@ -1,10 +1,10 @@
let _ = require('lodash');
let { isWorker } = require('cluster');
let { resolve } = require('path');
const _ = require('lodash');
const { isWorker } = require('cluster');
const { resolve } = require('path');
let cwd = process.cwd();
let src = require('requirefrom')('src');
let fromRoot = src('utils/fromRoot');
const cwd = process.cwd();
const src = require('requirefrom')('src');
const fromRoot = src('utils/fromRoot');
let canCluster;
try {
@ -14,19 +14,61 @@ try {
canCluster = false;
}
let pathCollector = function () {
let paths = [];
const pathCollector = function () {
const paths = [];
return function (path) {
paths.push(resolve(process.cwd(), path));
return paths;
};
};
let pluginDirCollector = pathCollector();
let pluginPathCollector = pathCollector();
const pluginDirCollector = pathCollector();
const pluginPathCollector = pathCollector();
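// merges the YAML config file, config/kibana.dev.yml (in --dev mode), and CLI flags into a single settings object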
function initServerSettings(opts, extraCliOptions) {
const readYamlConfig = require('./read_yaml_config');
const settings = readYamlConfig(opts.config);
const set = _.partial(_.set, settings);
const get = _.partial(_.get, settings);
const has = _.partial(_.has, settings);
const merge = _.partial(_.merge, settings);
if (opts.dev) {
try { merge(readYamlConfig(fromRoot('config/kibana.dev.yml'))); }
catch (e) { null; }
}
if (opts.dev) {
set('env', 'development');
set('optimize.lazy', true);
if (opts.ssl && !has('server.ssl.cert') && !has('server.ssl.key')) {
set('server.host', 'localhost');
set('server.ssl.cert', fromRoot('test/dev_certs/server.crt'));
set('server.ssl.key', fromRoot('test/dev_certs/server.key'));
}
}
if (opts.elasticsearch) set('elasticsearch.url', opts.elasticsearch);
if (opts.port) set('server.port', opts.port);
if (opts.host) set('server.host', opts.host);
if (opts.quiet) set('logging.quiet', true);
if (opts.silent) set('logging.silent', true);
if (opts.verbose) set('logging.verbose', true);
if (opts.logFile) set('logging.dest', opts.logFile);
set('plugins.scanDirs', _.compact([].concat(
get('plugins.scanDirs'),
opts.pluginDir
)));
set('plugins.paths', [].concat(opts.pluginPath || []));
merge(extraCliOptions);
return settings;
}
module.exports = function (program) {
let command = program.command('serve');
const command = program.command('serve');
command
.description('Run the kibana server')
@ -64,59 +106,29 @@ module.exports = function (program) {
if (canCluster) {
command
.option('--dev', 'Run the server with development mode defaults')
.option('--no-ssl', 'Don\'t run the dev server using HTTPS')
.option('--no-watch', 'Prevents automatic restarts of the server in --dev mode');
}
command
.action(async function (opts) {
const settings = initServerSettings(opts, this.getUnknownOptions());
if (canCluster && opts.dev && !isWorker) {
// stop processing the action and handoff to cluster manager
let ClusterManager = require('../cluster/cluster_manager');
new ClusterManager(opts);
const ClusterManager = require('../cluster/cluster_manager');
new ClusterManager(opts, settings);
return;
}
let readYamlConfig = require('./read_yaml_config');
let KbnServer = src('server/KbnServer');
let settings = readYamlConfig(opts.config);
if (opts.dev) {
try { _.merge(settings, readYamlConfig(fromRoot('config/kibana.dev.yml'))); }
catch (e) { null; }
}
let set = _.partial(_.set, settings);
let get = _.partial(_.get, settings);
if (opts.dev) {
set('env', 'development');
set('optimize.lazy', true);
}
if (opts.elasticsearch) set('elasticsearch.url', opts.elasticsearch);
if (opts.port) set('server.port', opts.port);
if (opts.host) set('server.host', opts.host);
if (opts.quiet) set('logging.quiet', true);
if (opts.silent) set('logging.silent', true);
if (opts.verbose) set('logging.verbose', true);
if (opts.logFile) set('logging.dest', opts.logFile);
set('plugins.scanDirs', _.compact([].concat(
get('plugins.scanDirs'),
opts.pluginDir
)));
set('plugins.paths', [].concat(opts.pluginPath || []));
let kbnServer = {};
const KbnServer = src('server/KbnServer');
try {
kbnServer = new KbnServer(_.merge(settings, this.getUnknownOptions()));
kbnServer = new KbnServer(settings);
await kbnServer.ready();
}
catch (err) {
let { server } = kbnServer;
const { server } = kbnServer;
if (server) server.log(['fatal'], err);
console.error('FATAL', err);


@ -9,7 +9,9 @@ module.exports = function ({ Plugin }) {
return new Plugin({
require: ['kibana'],
config({ array, boolean, number, object, string }) {
config(Joi) {
const { array, boolean, number, object, string } = Joi;
return object({
enabled: boolean().default(true),
url: string().uri({ scheme: ['http', 'https'] }).default('http://localhost:9200'),


@ -1,10 +1,10 @@
var src = require('requirefrom')('src');
var expect = require('expect.js');
var util = require('util');
var requireFromTest = require('requirefrom')('test');
var kbnTestServer = requireFromTest('utils/kbn_server');
var format = util.format;
var KbnServer = src('server/KbnServer');
var fromRoot = src('utils/fromRoot');
describe('plugins/elasticsearch', function () {
describe('routes', function () {
@ -12,27 +12,7 @@ describe('plugins/elasticsearch', function () {
var kbnServer;
before(function () {
kbnServer = new KbnServer({
server: {
autoListen: false,
xsrf: {
disableProtection: true
}
},
logging: { quiet: true },
plugins: {
scanDirs: [
fromRoot('src/plugins')
]
},
optimize: {
enabled: false
},
elasticsearch: {
url: 'http://localhost:9210'
}
});
kbnServer = kbnTestServer.createServer();
return kbnServer.ready()
.then(() => kbnServer.server.plugins.elasticsearch.waitUntilReady());
});
@ -51,7 +31,7 @@ describe('plugins/elasticsearch', function () {
var statusCode = options.statusCode || 200;
describe(format('%s %s', options.method, options.url), function () {
it('should return ' + statusCode, function (done) {
kbnServer.server.inject(options, function (res) {
kbnTestServer.makeRequest(kbnServer, options, function (res) {
try {
expect(res.statusCode).to.be(statusCode);
done();


@ -20,7 +20,8 @@ module.exports = function (kibana) {
uses: [
'visTypes',
'spyModes',
'fieldFormats'
'fieldFormats',
'navbarExtensions'
],
injectVars: function (server, options) {


@ -40,7 +40,7 @@ define(function (require) {
}
if (!field.indexed) {
warnings.push('This field is not indexed and can not be visualized.');
warnings.push('This field is not indexed and might not be usable in visualizations.');
}
}


@ -35,7 +35,7 @@
</div>
</div>
<a ng-show="field.indexed || field.scripted"
<a
ng-href="{{vizLocation(field)}}"
class="sidebar-item-button primary">
Visualize
@ -43,8 +43,3 @@
( {{::warnings.length}} <ng-pluralize count="warnings.length" when="{'1':'warning', 'other':'warnings'}"></ng-pluralize> <i aria-hidden="true" class="fa fa-warning"></i> )
</span>
</a>
<div ng-show="!field.indexed && !field.scripted"
disabled="disabled"
tooltip="This field is not indexed thus unavailable for visualization and search"
class="sidebar-item-button primary">Not Indexed</div>


@ -141,7 +141,6 @@ define(function (require) {
if (fieldTypes) {
fields = $filter('fieldType')(fields, fieldTypes);
fields = $filter('filter')(fields, { bucketable: true });
fields = $filter('orderBy')(fields, ['type', 'name']);
}


@ -1,5 +1,4 @@
<div ng-controller="VisEditor" class="vis-editor vis-type-{{ vis.type.name }}">
<navbar ng-if="chrome.getVisible()">
<div class="fill bitty-modal-container">
<div ng-if="vis.type.requiresSearch && $state.linked && !unlinking"


@ -1,10 +1,10 @@
import { resolve } from 'path';
import { fromNode as fn } from 'bluebird';
import expect from 'expect.js';
import requirefrom from 'requirefrom';
import KbnServer from '../KbnServer';
const src = resolve.bind(__dirname, '../../');
const requireFromTest = requirefrom('test');
const kbnTestServer = requireFromTest('utils/kbn_server');
const basePath = '/kibana';
describe('Server basePath config', function () {
@ -13,14 +13,8 @@ describe('Server basePath config', function () {
let kbnServer;
before(async function () {
kbnServer = new KbnServer({
server: { autoListen: false, basePath },
plugins: { scanDirs: [src('plugins')] },
logging: { quiet: true },
optimize: { enabled: false },
elasticsearch: {
url: 'http://localhost:9210'
}
kbnServer = kbnTestServer.createServer({
server: { basePath }
});
await kbnServer.ready();
return kbnServer;
@ -30,12 +24,18 @@ describe('Server basePath config', function () {
await kbnServer.close();
});
it('appends the basePath to root redirect', async function () {
const response = await kbnServer.inject({
it('appends the basePath to root redirect', function (done) {
const options = {
url: '/',
method: 'GET'
};
kbnTestServer.makeRequest(kbnServer, options, function (res) {
try {
expect(res.payload).to.match(/defaultRoute = '\/kibana\/app\/kibana'/);
done();
} catch (e) {
done(e);
}
});
expect(response.payload).to.match(/defaultRoute = '\/kibana\/app\/kibana'/);
});
});


@ -3,6 +3,7 @@ let Joi = require('joi');
let _ = require('lodash');
let { zipObject } = require('lodash');
let override = require('./override');
let createDefaultSchema = require('./schema');
let pkg = require('requirefrom')('src/utils')('packageJson');
const clone = require('./deepCloneWithBuffers');
@ -12,6 +13,10 @@ const vals = Symbol('config values');
const pendingSets = Symbol('Pending Settings');
module.exports = class Config {
static withDefaultSchema(settings = {}) {
return new Config(createDefaultSchema(), settings);
}
constructor(initialSchema, initialSettings) {
this[schemaKeys] = new Map();


@ -20,6 +20,10 @@ module.exports = () => Joi.object({
prod: Joi.boolean().default(Joi.ref('$prod'))
}).default(),
dev: Joi.object({
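// port that the server worker listens on when it runs behind the dev-mode BasePathProxy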
basePathProxyTarget: Joi.number().default(5603),
}).default(),
pid: Joi.object({
file: Joi.string(),
exclusive: Joi.boolean().default(false)


@ -1,6 +1,4 @@
module.exports = function (kbnServer) {
let Config = require('./config');
let schema = require('./schema')();
kbnServer.config = new Config(schema, kbnServer.settings || {});
kbnServer.config = Config.withDefaultSchema(kbnServer.settings);
};


@ -1,12 +1,13 @@
import expect from 'expect.js';
import KbnServer from '../../KbnServer';
import requirefrom from 'requirefrom';
const requireFromTest = requirefrom('test');
const kbnTestServer = requireFromTest('utils/kbn_server');
describe('cookie validation', function () {
let kbnServer;
beforeEach(function () {
kbnServer = new KbnServer({
server: { autoListen: false }
});
kbnServer = kbnTestServer.createServer();
return kbnServer.ready();
});
afterEach(function () {
@ -14,26 +15,28 @@ describe('cookie validation', function () {
});
it('allows non-strict cookies', function (done) {
kbnServer.server.inject({
const options = {
method: 'GET',
url: '/',
headers: {
cookie: 'test:80=value;test_80=value'
}
}, (res) => {
};
kbnTestServer.makeRequest(kbnServer, options, (res) => {
expect(res.payload).not.to.contain('Invalid cookie header');
done();
});
});
it('returns an error if the cookie can\'t be parsed', function (done) {
kbnServer.server.inject({
const options = {
method: 'GET',
url: '/',
headers: {
cookie: 'a'
}
}, (res) => {
};
kbnTestServer.makeRequest(kbnServer, options, (res) => {
expect(res.payload).to.contain('Invalid cookie header');
done();
});


@ -2,7 +2,8 @@ import expect from 'expect.js';
import { fromNode as fn } from 'bluebird';
import { resolve } from 'path';
import KbnServer from '../../KbnServer';
const requireFromTest = require('requirefrom')('test');
const kbnTestServer = requireFromTest('utils/kbn_server');
const nonDestructiveMethods = ['GET'];
const destructiveMethods = ['POST', 'PUT', 'DELETE'];
@ -14,20 +15,16 @@ const version = require(src('../package.json')).version;
describe('xsrf request filter', function () {
function inject(kbnServer, opts) {
return fn(cb => {
kbnServer.server.inject(opts, (resp) => {
kbnTestServer.makeRequest(kbnServer, opts, (resp) => {
cb(null, resp);
});
});
}
const makeServer = async function () {
const kbnServer = new KbnServer({
server: { autoListen: false },
plugins: { scanDirs: [src('plugins')] },
logging: { quiet: true },
optimize: { enabled: false },
elasticsearch: {
url: 'http://localhost:9210'
const kbnServer = kbnTestServer.createServer({
server: {
xsrf: { disableProtection: false }
}
});


@ -28,6 +28,7 @@ let typeColors = {
debug: 'brightBlack',
server: 'brightBlack',
optmzr: 'white',
managr: 'green',
optimize: 'magenta',
listening: 'magenta'
};


@ -61,6 +61,7 @@ class UiExports {
case 'fieldFormats':
case 'spyModes':
case 'chromeNavControls':
case 'navbarExtensions':
return (plugin, spec) => {
this.aliases[type] = _.union(this.aliases[type] || [], spec);
};


@ -112,7 +112,6 @@ describe('index pattern', function () {
describe('fields', function () {
it('should have expected properties on fields', function () {
expect(indexPattern.fields[0]).to.have.property('bucketable');
expect(indexPattern.fields[0]).to.have.property('displayName');
expect(indexPattern.fields[0]).to.have.property('filterable');
expect(indexPattern.fields[0]).to.have.property('format');


@ -39,7 +39,6 @@ define(function (require) {
var indexed = !!spec.indexed;
var scripted = !!spec.scripted;
var sortable = spec.name === '_score' || ((indexed || scripted) && type.sortable);
var bucketable = indexed || scripted;
var filterable = spec.name === '_id' || scripted || (indexed && type.filterable);
obj.fact('name');
@ -59,7 +58,6 @@ define(function (require) {
// usage flags, read-only and won't be saved
obj.comp('format', format);
obj.comp('sortable', sortable);
obj.comp('bucketable', bucketable);
obj.comp('filterable', filterable);
// computed values


@ -0,0 +1,8 @@
define(function (require) {
return require('ui/registry/_registry')({
name: 'navbarExtensions',
index: ['name'],
group: ['appName'],
order: ['order']
});
});


@ -1,6 +1,6 @@
define(function (require) {
return require('ui/registry/_registry')({
name: 'visTypes',
name: 'spyModes',
index: ['name'],
order: ['order']
});


@ -0,0 +1,65 @@
const angular = require('angular');
const sinon = require('sinon');
const expect = require('expect.js');
const ngMock = require('ngMock');
require('ui/render_directive');
let $parentScope;
let $elem;
let $directiveScope;
function init(markup = '', definition = {}) {
ngMock.module('kibana/render_directive');
// Create the scope
ngMock.inject(function ($rootScope, $compile) {
$parentScope = $rootScope;
// create the markup
$elem = angular.element('<render-directive>');
$elem.html(markup);
if (definition !== null) {
$parentScope.definition = definition;
$elem.attr('definition', 'definition');
}
// compile the directive
$compile($elem)($parentScope);
$parentScope.$apply();
$directiveScope = $elem.isolateScope();
});
}
describe('render_directive', function () {
describe('directive requirements', function () {
it('should throw if not given a definition', function () {
expect(() => init('', null)).to.throwException(/must have a definition/);
});
});
describe('rendering with definition', function () {
it('should call link method', function () {
const markup = '<p>hello world</p>';
const definition = {
link: sinon.stub(),
};
init(markup, definition);
sinon.assert.callCount(definition.link, 1);
});
it('should call controller method', function () {
const markup = '<p>hello world</p>';
const definition = {
controller: sinon.stub(),
};
init(markup, definition);
sinon.assert.callCount(definition.controller, 1);
});
});
});


@ -0,0 +1,30 @@
const _ = require('lodash');
const $ = require('jquery');
const module = require('ui/modules').get('kibana/render_directive');
module.directive('renderDirective', function () {
return {
restrict: 'E',
scope: {
'definition': '='
},
template: function ($el, $attrs) {
return $el.html();
},
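// run the definition's controller (if any) against the isolate scope; controllerAs exposes the controller instance on the scope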
controller: function ($scope, $element, $attrs, $transclude) {
if (!$scope.definition) throw new Error('render-directive must have a definition attribute');
const { controller, controllerAs } = $scope.definition;
if (controller) {
if (controllerAs) $scope[controllerAs] = this;
$scope.$eval(controller, { $scope, $element, $attrs, $transclude });
}
},
link: function ($scope, $el, $attrs) {
const { link } = $scope.definition;
if (link) {
link($scope, $el, $attrs);
}
}
};
});


@ -62,6 +62,44 @@ module.exports = function (grunt) {
}
}
}
},
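// esvm target that starts Elasticsearch 2.1.0 with the license, shield, marvel-agent, and watcher plugins
// plus a few Shield test users (used by `npm run elasticsearchWithPlugins`)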
withPlugins: {
options: {
version: '2.1.0',
directory: resolve(directory, 'withPlugins'),
plugins: [
'license',
'shield',
'marvel-agent',
'watcher'
],
shield: {
users: [
{
username: 'kibana',
password: 'notsecure',
roles: ['kibana4_server']
},
{
username: 'user',
password: 'notsecure',
roles: ['kibana4', 'marvel']
},
{
username: 'admin',
password: 'notsecure',
roles: ['admin']
}
]
},
config: {
marvel: {
agent: {
interval: '60s'
}
}
}
}
}
};
};

test/dev_certs/server.crt (new file, 20 lines)

@ -0,0 +1,20 @@
-----BEGIN CERTIFICATE-----
MIIDMDCCAhgCCQCkOD7fnHiQrTANBgkqhkiG9w0BAQUFADBZMQswCQYDVQQGEwJB
VTETMBEGA1UECBMKU29tZS1TdGF0ZTEhMB8GA1UEChMYSW50ZXJuZXQgV2lkZ2l0
cyBQdHkgTHRkMRIwEAYDVQQDEwlsb2NhbGhvc3QwIBcNMTYwMTE5MjExNjQ5WhgP
MjA4NDAyMDYyMTE2NDlaMFkxCzAJBgNVBAYTAkFVMRMwEQYDVQQIEwpTb21lLVN0
YXRlMSEwHwYDVQQKExhJbnRlcm5ldCBXaWRnaXRzIFB0eSBMdGQxEjAQBgNVBAMT
CWxvY2FsaG9zdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBALR63SnR
LW/Dgl0Vuy8gB6KjWwvajRWeXNgvlf6LX56oNsUFREHBLQlC2uT5R26F3tOqCDbs
MFNoyDjMBinXRRFJJ2Sokue7GSsGvBv41LMTHnO/MeCCbEOqghJS/QI89cV+u+Aw
9U+v426KAlCa1sGuE2+3/JvqdBQyheiukmGLiJ0OofpfgpYuFmKi2uYBKU3qzjUx
D01wQ4rCpq5nEnksGhgBeBDnheYmmDsj/wDvnz1exK/WprvTiHQ5MwuIQ4OybwgV
WDF+zv8PXrObrsZvD/ulrjh1cakvnCe2kDYEKMRiHUDesHS2jNJkBUe+FJo4/E3U
pFoYOtdoBe69BIUCAwEAATANBgkqhkiG9w0BAQUFAAOCAQEAQhjF91G0R662XJJ7
jGMudA9VbRVCow8s68I/GWPZmpKxPAxwz0xiv1eFIoiP416LX9amnx3yAmUoN4Wr
Cq0jsgyT1AOiSCdxkvYsqQG3SFVVt5BDLjThH66Vxi7Bach6SyATa1NG588mg7n9
pPJ4A1rcj+5kZuwnd52kfVLP+535lylwMyoyJa2AskieRPLNSzus2eUDTR6F+9Mb
eLOwp5rMl2nNYfLXUCSqEeC6uPu0yq6Tu0N0SjThfKndd2NU1fk3zyOjxyCIhGPe
G8VhrPY4lkJ9EE9Tuq095jwd1+q9fYzlKZWhOmg+IcOwUMgbgeWpeZTAhUIZAnia
4UH6NA==
-----END CERTIFICATE-----

test/dev_certs/server.key (new file, 27 lines)

@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEowIBAAKCAQEAtHrdKdEtb8OCXRW7LyAHoqNbC9qNFZ5c2C+V/otfnqg2xQVE
QcEtCULa5PlHboXe06oINuwwU2jIOMwGKddFEUknZKiS57sZKwa8G/jUsxMec78x
4IJsQ6qCElL9Ajz1xX674DD1T6/jbooCUJrWwa4Tb7f8m+p0FDKF6K6SYYuInQ6h
+l+Cli4WYqLa5gEpTerONTEPTXBDisKmrmcSeSwaGAF4EOeF5iaYOyP/AO+fPV7E
r9amu9OIdDkzC4hDg7JvCBVYMX7O/w9es5uuxm8P+6WuOHVxqS+cJ7aQNgQoxGId
QN6wdLaM0mQFR74Umjj8TdSkWhg612gF7r0EhQIDAQABAoIBADNwfU6o3vFm4OYV
BofU8jgppQ6I2QNbYoz/axnksXkv6oRXDvBK1cI4+tieL/zRTQQ5ByRYRyHO0JpX
lD4iq/3UQtUOsug3TGIWBlFWp5DulxRYXyflJGRY2b/NRW144HfMulGYwqJWuFTO
IwDEUQdczQ9fejEaLsF+8Omzr+b6+mEDewqoHb4mbr7gXMdkK85FU2tjWRHlUR+N
GMXRhQQE7o76UyYwNRbI418LaZitHcVYMKSsyU/+Uo/ivzD793OFzk5sDR9eThgi
B50yK2ClB+1bUfis18TXMtC4mZZeypTWifJpBzIiVLMBuN7/Y0HrSomIzyPC7ytW
5jZWCOECgYEA4k+AEmYOiIfPajT3YfG3y0Dptoup9x6zkzVNZnde3LgGRfezzJ7S
+lGbLNnIiuxAeHwZnyrptO0cgm0+SOCRwGXqySkl14WzD1P684nRwlQkzKv/50mZ
ZMUjM83SuIzh8mLv3F5xQ9td5R/y2kCw4YgyYHvhp1nLxEdwPbjtttkCgYEAzCgr
wlP0Uu5PlaSMKSbtmC6tHGv+kgTLM89V6o71+HiUTG6TWGmM+6XlT1cfVWE+27np
MsYoFTKI/meAeJwm+0pxEwemHgrgffkdmsWpTByLaOY1t/eNHagaLCGm2ZRu/WK7
oltV18kPijnmFs1uZLvlBkxmkadrVAj/cw5uZ40CgYBFZt/9xHJ8iDmhdnDPBpO4
r0V9B8Ot1yp24IfF/qGGyqCR4G6xN5u3zELsNDV99QmoaVZqK3zUUUrG7L2HF+da
u2aPHiFOwN+yuaxh90fucmN+qNinkziJYLN09Y/DrOC1toWcbRILH0DiPTP6npAf
+eaJFDSVX8JPhSD0rLupsQKBgEzILuz/NjyadEQLhstTYLiDlYfC9hNkyifKKr30
1n2EnAHC9Jej2uoqEnwsgBRUZpes7A+0hw6x2uQTeTXjRKXt8Wj+z3MtFBFMx92V
yX5ene/t5PYznFczCeTfIylhsfyKTZdaUoa9j6Kk8+xPht1L7W7Y/Rp6pNsOJ0TW
gJ9hAoGBAOJDNPkmH1hY64bvR6P0zEyedClhxIclPHgCrntFTlspkMj/QY48K6IP
R8SZ5jDpxAquQDgvpiuZRDYMHBRJkUe2VpHJFYGl1MLTfoBYn1IH401ixUSBYxAW
yfE7/zMDZUov2Uu8Muto5R/yNHEKBMOLGjadkADhRIHbW4WG1yVr
-----END RSA PRIVATE KEY-----

test/utils/base_auth.js (new file, 4 lines)

@ -0,0 +1,4 @@
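// builds an HTTP Basic Authorization header value,
// e.g. header('user', 'notsecure') === 'Basic dXNlcjpub3RzZWN1cmU='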
export function header(user, pass) {
const encoded = new Buffer(`${user}:${pass}`).toString('base64');
return `Basic ${encoded}`;
}

test/utils/kbn_server.js (new file, 67 lines)

@ -0,0 +1,67 @@
import { defaultsDeep, set } from 'lodash';
import requirefrom from 'requirefrom';
import { header as basicAuthHeader } from './base_auth';
const src = requirefrom('src');
const KbnServer = src('server/KbnServer');
const fromRoot = src('utils/fromRoot');
const SERVER_DEFAULTS = {
server: {
autoListen: false,
xsrf: {
disableProtection: true
}
},
logging: {
quiet: true
},
plugins: {
scanDirs: [
fromRoot('src/plugins')
]
},
optimize: {
enabled: false
},
elasticsearch: {
url: 'http://localhost:9210',
username: 'kibana',
password: 'notsecure'
}
};
/**
* Creates an instance of KbnServer with default configuration
* tailored for unit tests
*
* @param {object} params Any config overrides for this instance
*/
export function createServer(params = {}) {
params = defaultsDeep({}, params, SERVER_DEFAULTS);
return new KbnServer(params);
};
/**
* Creates request configuration with a basic auth header
*/
export function authOptions() {
const authHeader = basicAuthHeader('user', 'notsecure');
return set({}, 'headers.Authorization', authHeader);
};
/**
* Makes a request with test headers via hapi server inject()
*
* The given options are decorated with default testing options, so it's
* recommended to use this function instead of using inject() directly whenever
* possible throughout the tests.
*
* @param {KbnServer} kbnServer
* @param {object} options Any additional options or overrides for inject()
* @param {Function} fn The callback to pass as the second arg to inject()
*/
export function makeRequest(kbnServer, options, fn) {
options = defaultsDeep({}, authOptions(), options);
return kbnServer.server.inject(options, fn);
};
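
The `createServer` and `makeRequest` helpers replace the hand-rolled `new KbnServer({...})` setup that the test diffs earlier in this commit remove. A minimal sketch of how a test might use them; the route, expected status code, and the mocha wrapper are illustrative, not taken from this commit:

```js
const expect = require('expect.js');
const kbnTestServer = require('requirefrom')('test')('utils/kbn_server');

describe('example route', function () {
  let kbnServer;

  before(function () {
    // createServer() applies SERVER_DEFAULTS (autoListen off, quiet logging, local test Elasticsearch)
    kbnServer = kbnTestServer.createServer();
    return kbnServer.ready();
  });

  after(function () {
    return kbnServer.close();
  });

  it('should return 200', function (done) {
    const options = { method: 'GET', url: '/' };
    // makeRequest() merges in the basic auth header from authOptions() and calls server.inject()
    kbnTestServer.makeRequest(kbnServer, options, function (res) {
      expect(res.statusCode).to.be(200);
      done();
    });
  });
});
```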