Merge branch 'master' of github.com:elastic/kibana into implement/esvmConfigForShield

This commit is contained in:
spalger 2016-01-05 13:18:09 -07:00
commit 102f6bf0a4
191 changed files with 5389 additions and 2404 deletions


@@ -1 +1 @@
0.12.9
4.2.4


@@ -1,8 +1,26 @@
If you have a bugfix or new feature that you would like to contribute to Kibana, please **find or open an issue about it before you start working on it.** Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.
# Contributing to Kibana
## How issues work
At any given time the Kibana team at Elastic is working on dozens of features and enhancements to Kibana and other projects at Elastic. When you file an issue we'll take the time to digest it, consider solutions, and weigh its applicability to both the broad Kibana user base and our own goals for the project. Once we've completed that process we will assign the issue a priority.
- **P1**: A high-priority issue that affects almost all Kibana users: bugs that would cause incorrect results, security issues, and features that would vastly improve the user experience for everyone. Workarounds for P1s generally don't exist without a code change.
- **P2**: A broadly applicable, high-visibility issue that enhances the usability of Kibana for a majority of users.
- **P3**: Nice-to-have bug fixes or functionality. Workarounds for P3 items generally exist.
- **P4**: Niche and special-interest issues that may not fit our core goals. We would take a high-quality pull request for this if implemented in such a way that it does not meaningfully impact other functionality or existing code. Issues may also be labeled P4 if they would be better implemented in Elasticsearch.
- **P5**: Highly niche or in opposition to our core goals. Should usually be closed. This doesn't mean we wouldn't take a pull request for it, but if someone really wanted this they would be better off working on a plugin. The Kibana team will usually not work on P5 issues but may be willing to assist plugin developers on IRC.
#### How to express the importance of an issue
Let's just get this out there: **Feel free to +1 an issue**. That said, a +1 isn't a vote. We keep up on highly commented issues, but comments are but one of many reasons we might, or might not, work on an issue. A solid write up of your use case is more likely to make your case than a comment that says *+10000*.
#### My issue isn't getting enough attention
First of all, sorry about that, we want you to have a great time with Kibana! You should join us on IRC (#kibana on freenode) and chat about it. GitHub is terrible for conversations. With that out of the way, there are a number of variables that go into deciding what to work on, including priority, impact, difficulty, applicability to use cases, and, last but importantly, what we feel like working on.
### I want to help!
**Now we're talking**. If you have a bugfix or new feature that you would like to contribute to Kibana, please **find or open an issue about it before you start working on it.** Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.
We enjoy working with contributors to get their code accepted. There are many approaches to fixing a problem and it is important to find the best approach before writing too much code.
The process for contributing to any of the Elasticsearch repositories is similar.
## How to contribute code
### Sign the contributor license agreement
@@ -83,9 +101,7 @@ Once that is complete just run:
npm run test && npm run build
```
Distributable packages can be found in `target/` after the build completes.
#### Debugging test failures
#### Testing and debugging tests
The standard `npm run test` task runs several sub tasks and can take several minutes to complete, making debugging failures pretty painful. To ease the pain, specialized tasks provide alternate methods for running the tests.
@@ -102,7 +118,7 @@ The standard `npm run test` task runs several sub tasks and can take several min
<br>
<img src="http://i.imgur.com/DwHxgfq.png">
</dd>
<dt><code>npm run mocha [test file or dir]</code> or <code>npm run mocha:debug [test file or dir]</code></dt>
<dd>
Run a one-off test with the local project version of mocha, babel compilation, and optional debugging. Great
@@ -110,6 +126,22 @@ The standard `npm run test` task runs several sub tasks and can take several min
</dd>
</dl>
Distributable packages can be found in `target/` after the build completes.
#### Building OS packages
Packages are built using fpm, pleaserun, dpkg, and rpm. fpm and pleaserun can be installed using gem. Package building has only been tested on Linux and is not supported on any other platform.
```sh
gem install pleaserun
gem install fpm
npm run build:ospackages
```
To specify a package to build you can add `rpm` or `deb` as an argument.
```sh
npm run build:ospackages -- --rpm
```
### Functional UI Testing
#### Handy references
@@ -121,15 +153,20 @@ The standard `npm run test` task runs several sub tasks and can take several min
*The Selenium server that is started currently only runs the tests in Firefox*
To run the functional UI tests, execute the following command:
To run the functional UI tests, use the following commands:
`npm run test:ui`
<dl>
The task above takes a little time to start the servers. You can also start the servers and leave them running, and then run the tests separately:
<dt><code>npm run test:ui</code></dt>
<dd>Run the functional UI tests one time and exit. This is used by the CI systems and is great for quickly checking that things pass. It is essentially a combination of the next two tasks.</dd>
`npm run test:ui:server` will start the server required to run the selenium tests; leave this open
<dt><code>npm run test:ui:server</code></dt>
<dd>Start the server required for the <code>test:ui:runner</code> tasks. Once the server is started <code>test:ui:runner</code> can be run multiple times without waiting for the server to start.</dd>
`npm run test:ui:runner` will run the frontend tests and close when complete
<dt><code>npm run test:ui:runner</code></dt>
<dd>Execute the front-end selenium tests. This requires the server started by the <code>test:ui:server</code> task.</dd>
</dl>
#### Running tests locally with your existing (and already running) Elasticsearch, Kibana, and Selenium Server:
@@ -137,7 +174,9 @@ Set your es and kibana ports in `test/intern.js` to 9220 and 5620, respectively
Once you've got the services running, execute the following:
`npm run test:ui:runner`
```sh
npm run test:ui:runner
```
#### General notes:
@@ -146,7 +185,7 @@ Once you've got the services running, execute the following:
- These tests have been developed and tested with the Chrome and Firefox browsers. In theory, they should work on all browsers (that's the benefit of Intern using Leadfoot).
- These tests should also work with an external testing service like https://saucelabs.com/ or https://www.browserstack.com/ but that has not been tested.
### Submit a pull request
## Submitting a pull request
Push your local changes to your forked copy of the repository and submit a pull request. In the pull request, describe what your changes do and mention the number of the issue where discussion has taken place, e.g. "Closes #123".

FAQ.md

@@ -1,5 +1,15 @@
**Kibana 3 Migration FAQ:**
# Frequently asked questions
**Q:** I'm getting `bin/node/bin/node: not found` but I can see the node binary in the package?
**A:** Kibana 4 packages are architecture specific. Ensure you are using the correct package for your architecture.
**Q:** Where do I go for support?
**A:** Please join us at [discuss.elastic.co](https://discuss.elastic.co) with questions. Your problem might be a bug, but it might just be a misunderstanding, or a feature we could improve. We're also available on Freenode in #kibana.
**Q:** OK, we talked about it and it's definitely a bug
**A:** Doh, OK, let's get that fixed. File an issue on [github.com/elastic/kibana](https://github.com/elastic/kibana). We'd recommend reading the beginning of CONTRIBUTING.md, just so you know how we'll handle the issue.
### Kibana 3 Migration
**Q:** Where is feature X that I loved from Kibana 3?
**A:** It might be coming! We've published our immediate roadmap as tickets. Check out the beta milestones on GitHub to see if the feature you're missing is coming soon.
@@ -12,6 +22,3 @@
**Q:** What happened to templated/scripted dashboards?
**A:** Check out the URL. The state of each app is stored there, including any filters, queries or columns. This should be a lot easier than constructing scripted dashboards. The encoding of the URL is RISON.
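As an illustration, a (hypothetical) dashboard URL carries its state as RISON in the `_g` (global) and `_a` (app) query parameters; the exact keys vary by app and version:

```
/#/dashboard/my-dashboard?_g=(time:(from:now-15m,mode:quick,to:now))&_a=(query:(query_string:(query:'status:200')))
```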
**Q:** I'm getting `bin/node/bin/node: not found` but I can see the node binary in the package?
**A:** Kibana 4 packages are architecture specific. Ensure you are using the correct package for your architecture.


@@ -4,15 +4,23 @@ Kibana is an open source ([Apache Licensed](https://github.com/elastic/kibana/bl
## Requirements
- Elasticsearch version 2.1.0 or later
- Elasticsearch version 2.2.0 or later
- Kibana binary package
## Installation
* Download: [http://www.elastic.co/downloads/kibana](http://www.elastic.co/downloads/kibana)
* Extract the files
* Run `bin/kibana` on unix, or `bin\kibana.bat` on Windows.
* Visit [http://localhost:5601](http://localhost:5601)
## Upgrade from previous version
* Move any custom configurations in your old kibana.yml to your new one
* Reinstall plugins
* Start or restart Kibana
## Quick Start
You're up and running! Fantastic! Kibana is now running on port 5601, so point your browser at http://YOURDOMAIN.com:5601.


@@ -825,7 +825,7 @@ Angular modules are defined using a custom require module named `ui/modules`. It
var app = require('ui/modules').get('app/namespace');
```
`app` above is a reference to an Angular module, and can be used to define controllers, providers and anything else used in Angular.
`app` above is a reference to an Angular module, and can be used to define controllers, providers and anything else used in Angular. While you can use this module to create/get any module with ui/modules, we generally use the "kibana" module for everything.
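The sharing behavior above relies on `get()` being a lazy lookup. The following is a minimal, hypothetical sketch of that registry pattern, not Kibana's actual `ui/modules` implementation: the first `get('kibana')` creates the module, and every later call returns the same instance, which is why most code can register onto the one "kibana" module.

```javascript
// Hypothetical stand-in for the lazy module registry behind
// require('ui/modules').get(name) -- illustrative only.
const modules = new Map();

function get(name) {
  if (!modules.has(name)) {
    modules.set(name, {
      name,
      controllers: {},
      // Register a controller and return the module for chaining,
      // mirroring Angular's module API shape.
      controller(id, fn) {
        this.controllers[id] = fn;
        return this;
      },
    });
  }
  // Later calls return the exact same module instance.
  return modules.get(name);
}

const app = get('kibana');
app.controller('myController', function () { /* ... */ });
```

Because `get('kibana')` always resolves to the same object, controllers registered in one file are visible anywhere else the module is fetched.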
### Private modules
@@ -838,6 +838,8 @@ app.controller('myController', function($scope, otherDeps, Private) {
});
```
*Use `Private` modules for everything except directives, filters, and controllers.*
### Promises
A more robust version of Angular's `$q` service is available as `Promise`. It can be used in the same way as `$q`, but it comes packaged with several utility methods that provide many of the same conveniences as Bluebird.
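To make that concrete, here is a hedged sketch of one Bluebird-style convenience (a `props()`-like helper) built on native promises; the helper name and the exact utilities on Kibana's `Promise` service are assumptions for illustration, not its documented API.

```javascript
// Resolve every value of an object (plain values are allowed too) and
// reassemble an object with the same keys -- the Bluebird Promise.props idea.
function props(obj) {
  const keys = Object.keys(obj);
  return Promise.all(keys.map((key) => obj[key])).then((values) => {
    const result = {};
    keys.forEach((key, i) => {
      result[key] = values[i];
    });
    return result;
  });
}

// Usage: mix promises and plain values freely.
props({ savedObject: Promise.resolve({ id: 1 }), title: 'dashboard' })
  .then((resolved) => {
    // resolved.title === 'dashboard', resolved.savedObject.id === 1
  });
```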


@@ -8,6 +8,9 @@
# specify that path here. The basePath can't end in a slash.
# server.basePath: ""
# The maximum payload size in bytes on incoming server requests.
# server.maxPayloadBytes: 1048576
# The Elasticsearch instance to use for all your queries.
# elasticsearch.url: "http://localhost:9200"

(binary image file changed: 5.5 KiB → 1.4 KiB)


@@ -21,22 +21,6 @@ bin/kibana plugin -i elasticsearch/marvel/latest
Because the organization given is `elasticsearch`, the plugin management tool automatically downloads the
plugin from `download.elastic.co`.
[float]
=== Installing Plugins from Github
When the specified plugin is not found at `download.elastic.co`, the plugin management tool parses the element
as a Github user name, as in the following example:
[source,shell]
bin/kibana plugin --install github-user/sample-plugin
Installing sample-plugin
Attempting to extract from https://download.elastic.co/github-user/sample-plugin/sample-plugin-latest.tar.gz
Attempting to extract from https://github.com/github-user/sample-plugin/archive/master.tar.gz
Downloading <some number> bytes....................
Extraction complete
Optimizing and caching browser bundles...
Plugin installation complete
[float]
=== Installing Plugins from an Arbitrary URL


@@ -48,6 +48,7 @@
"test:server": "grunt test:server",
"test:coverage": "grunt test:coverage",
"build": "grunt build",
"build:ospackages": "grunt build --os-packages",
"start": "./bin/kibana --dev",
"precommit": "grunt precommit",
"karma": "karma start",
@@ -63,6 +64,7 @@
"url": "https://github.com/elastic/kibana.git"
},
"dependencies": {
"@bigfunger/decompress-zip": "0.2.0-stripfix2",
"@spalger/angular-bootstrap": "0.12.1",
"@spalger/filesaver": "1.1.2",
"@spalger/leaflet-draw": "0.2.3",
@@ -86,6 +88,7 @@
"bootstrap": "3.3.5",
"brace": "0.5.1",
"bunyan": "1.4.0",
"clipboard": "1.5.5",
"commander": "2.8.1",
"css-loader": "0.17.0",
"d3": "3.5.6",
@@ -131,7 +134,8 @@
"url-loader": "0.5.6",
"webpack": "1.12.1",
"webpack-directory-name-as-main": "1.0.0",
"whatwg-fetch": "0.9.0"
"whatwg-fetch": "0.9.0",
"wreck": "6.2.0"
},
"devDependencies": {
"Nonsense": "0.1.2",
@@ -149,7 +153,7 @@
"grunt-cli": "0.1.13",
"grunt-contrib-clean": "0.6.0",
"grunt-contrib-copy": "0.8.1",
"grunt-esvm": "2.0.0-beta1",
"grunt-esvm": "2.0.0",
"grunt-karma": "0.12.0",
"grunt-run": "0.5.0",
"grunt-s3": "0.2.0-alpha.3",
@@ -167,7 +171,7 @@
"karma-ie-launcher": "0.2.0",
"karma-mocha": "0.2.0",
"karma-safari-launcher": "0.1.1",
"libesvm": "1.0.7",
"libesvm": "3.3.0",
"license-checker": "3.1.0",
"load-grunt-config": "0.7.2",
"marked-text-renderer": "0.1.0",
@@ -177,11 +181,10 @@
"portscanner": "1.0.0",
"simple-git": "1.8.0",
"sinon": "1.17.2",
"source-map": "0.4.4",
"wreck": "6.2.0"
"source-map": "0.4.4"
},
"engines": {
"node": "0.12.9",
"npm": "2.14.3"
"node": "4.2.4",
"npm": "2.14.15"
}
}


@@ -1,10 +1,6 @@
var expect = require('expect.js');
var sinon = require('sinon');
var plugin = require('../plugin');
var installer = require('../pluginInstaller');
var remover = require('../pluginRemover');
var settingParser = require('../settingParser');
const expect = require('expect.js');
const sinon = require('sinon');
const plugin = require('../plugin');
describe('kibana cli', function () {
@@ -12,7 +8,7 @@ describe('kibana cli', function () {
describe('commander options', function () {
var program = {
let program = {
command: function () { return program; },
description: function () { return program; },
option: function () { return program; },
@@ -38,9 +34,9 @@
});
it('should define the command line options', function () {
var spy = sinon.spy(program, 'option');
const spy = sinon.spy(program, 'option');
var options = [
const options = [
/-i/,
/-r/,
/-s/,
@@ -50,10 +46,10 @@
plugin(program);
for (var i = 0; i < spy.callCount; i++) {
var call = spy.getCall(i);
for (var o = 0; o < options.length; o++) {
var option = options[o];
for (let i = 0; i < spy.callCount; i++) {
const call = spy.getCall(i);
for (let o = 0; o < options.length; o++) {
const option = options[o];
if (call.args[0].match(option)) {
options.splice(o, 1);
break;


@@ -1,249 +0,0 @@
var expect = require('expect.js');
var sinon = require('sinon');
var nock = require('nock');
var glob = require('glob');
var rimraf = require('rimraf');
var { join } = require('path');
var pluginLogger = require('../pluginLogger');
var pluginDownloader = require('../pluginDownloader');
describe('kibana cli', function () {
describe('plugin downloader', function () {
var testWorkingPath = join(__dirname, '.test.data');
var logger;
var downloader;
beforeEach(function () {
logger = pluginLogger(false);
sinon.stub(logger, 'log');
sinon.stub(logger, 'error');
rimraf.sync(testWorkingPath);
});
afterEach(function () {
logger.log.restore();
logger.error.restore();
rimraf.sync(testWorkingPath);
});
describe('_downloadSingle', function () {
beforeEach(function () {
downloader = pluginDownloader({}, logger);
});
afterEach(function () {
});
it.skip('should throw an ENOTFOUND error for a 404 error', function () {
var couchdb = nock('http://www.files.com')
.get('/plugin.tar.gz')
.reply(404);
var source = 'http://www.files.com/plugin.tar.gz';
var errorStub = sinon.stub();
return downloader._downloadSingle(source, testWorkingPath, 0, logger)
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(true);
expect(errorStub.lastCall.args[0].message).to.match(/ENOTFOUND/);
var files = glob.sync('**/*', { cwd: testWorkingPath });
expect(files).to.eql([]);
});
});
it.skip('should download and extract a valid plugin', function () {
var filename = join(__dirname, 'replies/test-plugin-master.tar.gz');
var couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10'
})
.get('/plugin.tar.gz')
.replyWithFile(200, filename);
var source = 'http://www.files.com/plugin.tar.gz';
return downloader._downloadSingle(source, testWorkingPath, 0, logger)
.then(function (data) {
var files = glob.sync('**/*', { cwd: testWorkingPath });
var expected = [
'README.md',
'index.js',
'package.json',
'public',
'public/app.js'
];
expect(files.sort()).to.eql(expected.sort());
});
});
it('should abort the download and extraction for a corrupt archive.', function () {
var filename = join(__dirname, 'replies/corrupt.tar.gz');
var couchdb = nock('http://www.files.com')
.get('/plugin.tar.gz')
.replyWithFile(200, filename);
var source = 'http://www.files.com/plugin.tar.gz';
var errorStub = sinon.stub();
return downloader._downloadSingle(source, testWorkingPath, 0, logger)
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(true);
var files = glob.sync('**/*', { cwd: testWorkingPath });
expect(files).to.eql([]);
});
});
});
describe('download', function () {
beforeEach(function () {});
afterEach(function () {});
it.skip('should loop through bad urls until it finds a good one.', function () {
var filename = join(__dirname, 'replies/test-plugin-master.tar.gz');
var settings = {
urls: [
'http://www.files.com/badfile1.tar.gz',
'http://www.files.com/badfile2.tar.gz',
'I am a bad uri',
'http://www.files.com/goodfile.tar.gz'
],
workingPath: testWorkingPath,
timeout: 0
};
downloader = pluginDownloader(settings, logger);
var couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10'
})
.get('/badfile1.tar.gz')
.reply(404)
.get('/badfile2.tar.gz')
.reply(404)
.get('/goodfile.tar.gz')
.replyWithFile(200, filename);
var errorStub = sinon.stub();
return downloader.download(settings, logger)
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(false);
expect(logger.log.getCall(0).args[0]).to.match(/badfile1.tar.gz/);
expect(logger.log.getCall(1).args[0]).to.match(/badfile2.tar.gz/);
expect(logger.log.getCall(2).args[0]).to.match(/I am a bad uri/);
expect(logger.log.getCall(3).args[0]).to.match(/goodfile.tar.gz/);
expect(logger.log.lastCall.args[0]).to.match(/complete/i);
var files = glob.sync('**/*', { cwd: testWorkingPath });
var expected = [
'README.md',
'index.js',
'package.json',
'public',
'public/app.js'
];
expect(files.sort()).to.eql(expected.sort());
});
});
it.skip('should stop looping through urls when it finds a good one.', function () {
var filename = join(__dirname, 'replies/test-plugin-master.tar.gz');
var settings = {
urls: [
'http://www.files.com/badfile1.tar.gz',
'http://www.files.com/badfile2.tar.gz',
'http://www.files.com/goodfile.tar.gz',
'http://www.files.com/badfile3.tar.gz'
],
workingPath: testWorkingPath,
timeout: 0
};
downloader = pluginDownloader(settings, logger);
var couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10'
})
.get('/badfile1.tar.gz')
.reply(404)
.get('/badfile2.tar.gz')
.reply(404)
.get('/goodfile.tar.gz')
.replyWithFile(200, filename)
.get('/badfile3.tar.gz')
.reply(404);
var errorStub = sinon.stub();
return downloader.download(settings, logger)
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(false);
for (var i = 0; i < logger.log.callCount; i++) {
expect(logger.log.getCall(i).args[0]).to.not.match(/badfile3.tar.gz/);
}
var files = glob.sync('**/*', { cwd: testWorkingPath });
var expected = [
'README.md',
'index.js',
'package.json',
'public',
'public/app.js'
];
expect(files.sort()).to.eql(expected.sort());
});
});
it.skip('should throw an error when it doesn\'t find a good url.', function () {
var settings = {
urls: [
'http://www.files.com/badfile1.tar.gz',
'http://www.files.com/badfile2.tar.gz',
'http://www.files.com/badfile3.tar.gz'
],
workingPath: testWorkingPath,
timeout: 0
};
downloader = pluginDownloader(settings, logger);
var couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10'
})
.get('/badfile1.tar.gz')
.reply(404)
.get('/badfile2.tar.gz')
.reply(404)
.get('/badfile3.tar.gz')
.reply(404);
var errorStub = sinon.stub();
return downloader.download(settings, logger)
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(true);
expect(errorStub.lastCall.args[0].message).to.match(/not a valid/i);
var files = glob.sync('**/*', { cwd: testWorkingPath });
expect(files).to.eql([]);
});
});
});
});
});


@@ -1,28 +1,26 @@
var expect = require('expect.js');
var sinon = require('sinon');
var fs = require('fs');
var rimraf = require('rimraf');
const expect = require('expect.js');
const sinon = require('sinon');
const fs = require('fs');
const rimraf = require('rimraf');
var pluginCleaner = require('../pluginCleaner');
var pluginLogger = require('../pluginLogger');
const pluginCleaner = require('../plugin_cleaner');
const pluginLogger = require('../plugin_logger');
describe('kibana cli', function () {
describe('plugin installer', function () {
describe('pluginCleaner', function () {
var settings = {
const settings = {
workingPath: 'dummy'
};
describe('cleanPrevious', function () {
var cleaner;
var errorStub;
var logger;
var progress;
var request;
let cleaner;
let errorStub;
let logger;
let progress;
let request;
beforeEach(function () {
errorStub = sinon.stub();
@@ -46,7 +44,7 @@ describe('kibana cli', function () {
it('should resolve if the working path does not exist', function () {
sinon.stub(rimraf, 'sync');
sinon.stub(fs, 'statSync', function () {
var error = new Error('ENOENT');
const error = new Error('ENOENT');
error.code = 'ENOENT';
throw error;
});
@@ -61,7 +59,7 @@ describe('kibana cli', function () {
it('should rethrow any exception except ENOENT from fs.statSync', function () {
sinon.stub(rimraf, 'sync');
sinon.stub(fs, 'statSync', function () {
var error = new Error('An Unhandled Error');
const error = new Error('An Unhandled Error');
throw error;
});
@@ -112,8 +110,9 @@
});
describe('cleanError', function () {
var cleaner;
var logger;
let cleaner;
let logger;
beforeEach(function () {
logger = pluginLogger(false);
cleaner = pluginCleaner(settings, logger);


@@ -0,0 +1,316 @@
const expect = require('expect.js');
const sinon = require('sinon');
const nock = require('nock');
const glob = require('glob');
const rimraf = require('rimraf');
const { join } = require('path');
const mkdirp = require('mkdirp');
const pluginLogger = require('../plugin_logger');
const pluginDownloader = require('../plugin_downloader');
describe('kibana cli', function () {
describe('plugin downloader', function () {
const testWorkingPath = join(__dirname, '.test.data');
const tempArchiveFilePath = join(testWorkingPath, 'archive.part');
let logger;
let downloader;
function expectWorkingPathEmpty() {
const files = glob.sync('**/*', { cwd: testWorkingPath });
expect(files).to.eql([]);
}
function expectWorkingPathNotEmpty() {
const files = glob.sync('**/*', { cwd: testWorkingPath });
const expected = [
'archive.part'
];
expect(files.sort()).to.eql(expected.sort());
}
function shouldReject() {
throw new Error('expected the promise to reject');
}
beforeEach(function () {
logger = pluginLogger(false);
sinon.stub(logger, 'log');
sinon.stub(logger, 'error');
rimraf.sync(testWorkingPath);
mkdirp.sync(testWorkingPath);
});
afterEach(function () {
logger.log.restore();
logger.error.restore();
rimraf.sync(testWorkingPath);
});
describe('_downloadSingle', function () {
beforeEach(function () {
const settings = {
urls: [],
workingPath: testWorkingPath,
tempArchiveFile: tempArchiveFilePath,
timeout: 0
};
downloader = pluginDownloader(settings, logger);
});
describe('http downloader', function () {
it('should download an unsupported file type, but return undefined for archiveType', function () {
const filePath = join(__dirname, 'replies/banana.jpg');
const couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10',
'content-type': 'image/jpeg'
})
.get('/banana.jpg')
.replyWithFile(200, filePath);
const sourceUrl = 'http://www.files.com/banana.jpg';
return downloader._downloadSingle(sourceUrl)
.then(function (data) {
expect(data.archiveType).to.be(undefined);
expectWorkingPathNotEmpty();
});
});
it('should throw an ENOTFOUND error for an HTTP URL that returns 404', function () {
const couchdb = nock('http://www.files.com')
.get('/plugin.tar.gz')
.reply(404);
const sourceUrl = 'http://www.files.com/plugin.tar.gz';
return downloader._downloadSingle(sourceUrl)
.then(shouldReject, function (err) {
expect(err.message).to.match(/ENOTFOUND/);
expectWorkingPathEmpty();
});
});
it('should throw an ENOTFOUND error for an invalid url', function () {
const sourceUrl = 'i am an invalid url';
return downloader._downloadSingle(sourceUrl)
.then(shouldReject, function (err) {
expect(err.message).to.match(/ENOTFOUND/);
expectWorkingPathEmpty();
});
});
it('should download a tarball from a valid http url', function () {
const filePath = join(__dirname, 'replies/test_plugin_master.tar.gz');
const couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10',
'content-type': 'application/x-gzip'
})
.get('/plugin.tar.gz')
.replyWithFile(200, filePath);
const sourceUrl = 'http://www.files.com/plugin.tar.gz';
return downloader._downloadSingle(sourceUrl)
.then(function (data) {
expect(data.archiveType).to.be('.tar.gz');
expectWorkingPathNotEmpty();
});
});
it('should download a zip from a valid http url', function () {
const filePath = join(__dirname, 'replies/test_plugin_master.zip');
const couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '341965',
'content-type': 'application/zip'
})
.get('/plugin.zip')
.replyWithFile(200, filePath);
const sourceUrl = 'http://www.files.com/plugin.zip';
return downloader._downloadSingle(sourceUrl)
.then(function (data) {
expect(data.archiveType).to.be('.zip');
expectWorkingPathNotEmpty();
});
});
});
describe('local file downloader', function () {
it('should copy an unsupported file type, but return undefined for archiveType', function () {
const filePath = join(__dirname, 'replies/banana.jpg');
const sourceUrl = 'file://' + filePath.replace(/\\/g, '/');
const couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10',
'content-type': 'image/jpeg'
})
.get('/banana.jpg')
.replyWithFile(200, filePath);
return downloader._downloadSingle(sourceUrl)
.then(function (data) {
expect(data.archiveType).to.be(undefined);
expectWorkingPathNotEmpty();
});
});
it('should throw an ENOTFOUND error for an invalid local file', function () {
const filePath = join(__dirname, 'replies/i-am-not-there.tar.gz');
const sourceUrl = 'file://' + filePath.replace(/\\/g, '/');
return downloader._downloadSingle(sourceUrl)
.then(shouldReject, function (err) {
expect(err.message).to.match(/ENOTFOUND/);
expectWorkingPathEmpty();
});
});
it('should copy a tarball from a valid local file', function () {
const filePath = join(__dirname, 'replies/test_plugin_master.tar.gz');
const sourceUrl = 'file://' + filePath.replace(/\\/g, '/');
return downloader._downloadSingle(sourceUrl)
.then(function (data) {
expect(data.archiveType).to.be('.tar.gz');
expectWorkingPathNotEmpty();
});
});
it('should copy a zip from a valid local file', function () {
const filePath = join(__dirname, 'replies/test_plugin_master.zip');
const sourceUrl = 'file://' + filePath.replace(/\\/g, '/');
return downloader._downloadSingle(sourceUrl)
.then(function (data) {
expect(data.archiveType).to.be('.zip');
expectWorkingPathNotEmpty();
});
});
});
});
describe('download', function () {
it('should loop through bad urls until it finds a good one.', function () {
const filePath = join(__dirname, 'replies/test_plugin_master.tar.gz');
const settings = {
urls: [
'http://www.files.com/badfile1.tar.gz',
'http://www.files.com/badfile2.tar.gz',
'I am a bad uri',
'http://www.files.com/goodfile.tar.gz'
],
workingPath: testWorkingPath,
tempArchiveFile: tempArchiveFilePath,
timeout: 0
};
downloader = pluginDownloader(settings, logger);
const couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10'
})
.get('/badfile1.tar.gz')
.reply(404)
.get('/badfile2.tar.gz')
.reply(404)
.get('/goodfile.tar.gz')
.replyWithFile(200, filePath);
return downloader.download(settings, logger)
.then(function (data) {
expect(logger.log.getCall(0).args[0]).to.match(/badfile1.tar.gz/);
expect(logger.log.getCall(1).args[0]).to.match(/badfile2.tar.gz/);
expect(logger.log.getCall(2).args[0]).to.match(/I am a bad uri/);
expect(logger.log.getCall(3).args[0]).to.match(/goodfile.tar.gz/);
expectWorkingPathNotEmpty();
});
});
it('should stop looping through urls when it finds a good one.', function () {
const filePath = join(__dirname, 'replies/test_plugin_master.tar.gz');
const settings = {
urls: [
'http://www.files.com/badfile1.tar.gz',
'http://www.files.com/badfile2.tar.gz',
'http://www.files.com/goodfile.tar.gz',
'http://www.files.com/badfile3.tar.gz'
],
workingPath: testWorkingPath,
tempArchiveFile: tempArchiveFilePath,
timeout: 0
};
downloader = pluginDownloader(settings, logger);
const couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10'
})
.get('/badfile1.tar.gz')
.reply(404)
.get('/badfile2.tar.gz')
.reply(404)
.get('/goodfile.tar.gz')
.replyWithFile(200, filePath)
.get('/badfile3.tar.gz')
.reply(404);
return downloader.download(settings, logger)
.then(function (data) {
for (let i = 0; i < logger.log.callCount; i++) {
expect(logger.log.getCall(i).args[0]).to.not.match(/badfile3.tar.gz/);
}
expectWorkingPathNotEmpty();
});
});
it('should throw an error when it doesn\'t find a good url.', function () {
const settings = {
urls: [
'http://www.files.com/badfile1.tar.gz',
'http://www.files.com/badfile2.tar.gz',
'http://www.files.com/badfile3.tar.gz'
],
workingPath: testWorkingPath,
tempArchiveFile: tempArchiveFilePath,
timeout: 0
};
downloader = pluginDownloader(settings, logger);
const couchdb = nock('http://www.files.com')
.defaultReplyHeaders({
'content-length': '10'
})
.get('/badfile1.tar.gz')
.reply(404)
.get('/badfile2.tar.gz')
.reply(404)
.get('/badfile3.tar.gz')
.reply(404);
return downloader.download(settings, logger)
.then(shouldReject, function (err) {
expect(err.message).to.match(/no valid url specified/i);
expectWorkingPathEmpty();
});
});
});
});
});


@@ -0,0 +1,131 @@
const expect = require('expect.js');
const sinon = require('sinon');
const glob = require('glob');
const rimraf = require('rimraf');
const { join } = require('path');
const mkdirp = require('mkdirp');
const pluginLogger = require('../plugin_logger');
const extract = require('../plugin_extractor');
const pluginDownloader = require('../plugin_downloader');
describe('kibana cli', function () {
describe('plugin extractor', function () {
const testWorkingPath = join(__dirname, '.test.data');
const tempArchiveFilePath = join(testWorkingPath, 'archive.part');
let logger;
let downloader;
const settings = {
workingPath: testWorkingPath,
tempArchiveFile: tempArchiveFilePath
};
function shouldReject() {
throw new Error('expected the promise to reject');
}
beforeEach(function () {
logger = pluginLogger(false);
sinon.stub(logger, 'log');
sinon.stub(logger, 'error');
rimraf.sync(testWorkingPath);
mkdirp.sync(testWorkingPath);
downloader = pluginDownloader(settings, logger);
});
afterEach(function () {
logger.log.restore();
logger.error.restore();
rimraf.sync(testWorkingPath);
});
function copyReplyFile(filename) {
const filePath = join(__dirname, 'replies', filename);
const sourceUrl = 'file://' + filePath.replace(/\\/g, '/');
return downloader._downloadSingle(sourceUrl);
}
describe('extractArchive', function () {
it('successfully extracts a valid tarball', function () {
return copyReplyFile('test_plugin_master.tar.gz')
.then((data) => {
return extract(settings, logger, data.archiveType);
})
.then(() => {
const files = glob.sync('**/*', { cwd: testWorkingPath });
const expected = [
'archive.part',
'README.md',
'index.js',
'package.json',
'public',
'public/app.js'
];
expect(files.sort()).to.eql(expected.sort());
});
});
it('successfully extracts a valid zip', function () {
return copyReplyFile('test_plugin_master.zip')
.then((data) => {
return extract(settings, logger, data.archiveType);
})
.then(() => {
const files = glob.sync('**/*', { cwd: testWorkingPath });
const expected = [
'archive.part',
'README.md',
'index.js',
'package.json',
'public',
'public/app.js',
'extra file only in zip.txt'
];
expect(files.sort()).to.eql(expected.sort());
});
});
it('throws an error when extracting a corrupt zip', function () {
return copyReplyFile('corrupt.zip')
.then((data) => {
return extract(settings, logger, data.archiveType);
})
.then(shouldReject, (err) => {
expect(err.message).to.match(/error extracting/i);
});
});
it('throws an error when extracting a corrupt tarball', function () {
return copyReplyFile('corrupt.tar.gz')
.then((data) => {
return extract(settings, logger, data.archiveType);
})
.then(shouldReject, (err) => {
expect(err.message).to.match(/error extracting/i);
});
});
it('throws an error when passed an unknown archive type', function () {
return copyReplyFile('banana.jpg')
.then((data) => {
return extract(settings, logger, data.archiveType);
})
.then(shouldReject, (err) => {
expect(err.message).to.match(/unsupported archive format/i);
});
});
});
});
});
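The archiveType dispatch these extractor tests rely on can be sketched as a tiny factory. The extractors are injected so the sketch stays self-contained; the names and wiring are assumptions, not the actual plugin_extractor module:

```javascript
// Sketch: route an archive to the matching extractor by type.
// extractZip / extractTarball are injected stand-ins here.
function makeExtractor(extractZip, extractTarball) {
  return function extract(settings, logger, archiveType) {
    switch (archiveType) {
      case '.zip':
        return extractZip(settings, logger);
      case '.tar.gz':
        return extractTarball(settings, logger);
      default:
        throw new Error('Unsupported archive format.');
    }
  };
}
```

The unknown-type branch is what lets the `banana.jpg` test above fail with a message matching `/unsupported archive format/i`.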


@ -1,27 +1,22 @@
var expect = require('expect.js');
var sinon = require('sinon');
var nock = require('nock');
var rimraf = require('rimraf');
var fs = require('fs');
var { join } = require('path');
var Promise = require('bluebird');
var pluginLogger = require('../pluginLogger');
var pluginInstaller = require('../pluginInstaller');
const expect = require('expect.js');
const sinon = require('sinon');
const rimraf = require('rimraf');
const { mkdirSync } = require('fs');
const { join } = require('path');
const pluginLogger = require('../plugin_logger');
const pluginInstaller = require('../plugin_installer');
describe('kibana cli', function () {
describe('plugin installer', function () {
describe('pluginInstaller', function () {
let logger;
let testWorkingPath;
let processExitStub;
var logger;
var testWorkingPath;
var processExitStub;
var statSyncStub;
beforeEach(function () {
processExitStub = undefined;
statSyncStub = undefined;
logger = pluginLogger(false);
testWorkingPath = join(__dirname, '.test.data');
rimraf.sync(testWorkingPath);
@ -31,7 +26,6 @@ describe('kibana cli', function () {
afterEach(function () {
if (processExitStub) processExitStub.restore();
if (statSyncStub) statSyncStub.restore();
logger.log.restore();
logger.error.restore();
rimraf.sync(testWorkingPath);
@ -39,9 +33,9 @@ describe('kibana cli', function () {
it('should throw an error if the workingPath already exists.', function () {
processExitStub = sinon.stub(process, 'exit');
fs.mkdirSync(testWorkingPath);
mkdirSync(testWorkingPath);
var settings = {
let settings = {
pluginPath: testWorkingPath
};
@ -54,18 +48,6 @@ describe('kibana cli', function () {
});
});
it('should rethrow any non "ENOENT" error from fs.', function () {
statSyncStub = sinon.stub(fs, 'statSync', function () {
throw new Error('This is unexpected.');
});
var settings = {
pluginPath: testWorkingPath
};
expect(pluginInstaller.install).withArgs(settings, logger).to.throwException(/this is unexpected/i);
});
});
});


@ -1,15 +1,13 @@
var expect = require('expect.js');
var sinon = require('sinon');
var pluginLogger = require('../pluginLogger');
const expect = require('expect.js');
const sinon = require('sinon');
const pluginLogger = require('../plugin_logger');
describe('kibana cli', function () {
describe('plugin installer', function () {
describe('logger', function () {
var logger;
let logger;
describe('logger.log', function () {
@ -23,18 +21,18 @@ describe('kibana cli', function () {
it('should log messages to the console and append a new line', function () {
logger = pluginLogger({ silent: false, quiet: false });
var message = 'this is my message';
const message = 'this is my message';
logger.log(message);
var callCount = process.stdout.write.callCount;
const callCount = process.stdout.write.callCount;
expect(process.stdout.write.getCall(callCount - 2).args[0]).to.be(message);
expect(process.stdout.write.getCall(callCount - 1).args[0]).to.be('\n');
});
it('should log messages to the console and not append a new line', function () {
logger = pluginLogger({ silent: false, quiet: false });
for (var i = 0; i < 10; i++) {
for (let i = 0; i < 10; i++) {
logger.log('.', true);
}
logger.log('Done!');
@ -58,10 +56,10 @@ describe('kibana cli', function () {
it('should not log any messages when quiet is set', function () {
logger = pluginLogger({ silent: false, quiet: true });
var message = 'this is my message';
const message = 'this is my message';
logger.log(message);
for (var i = 0; i < 10; i++) {
for (let i = 0; i < 10; i++) {
logger.log('.', true);
}
logger.log('Done!');
@ -72,10 +70,10 @@ describe('kibana cli', function () {
it('should not log any messages when silent is set', function () {
logger = pluginLogger({ silent: true, quiet: false });
var message = 'this is my message';
const message = 'this is my message';
logger.log(message);
for (var i = 0; i < 10; i++) {
for (let i = 0; i < 10; i++) {
logger.log('.', true);
}
logger.log('Done!');
@ -97,7 +95,7 @@ describe('kibana cli', function () {
it('should log error messages to the console and append a new line', function () {
logger = pluginLogger({ silent: false, quiet: false });
var message = 'this is my error';
const message = 'this is my error';
logger.error(message);
expect(process.stderr.write.calledWith(message + '\n')).to.be(true);
@ -105,7 +103,7 @@ describe('kibana cli', function () {
it('should log error messages to the console when quiet is set', function () {
logger = pluginLogger({ silent: false, quiet: true });
var message = 'this is my error';
const message = 'this is my error';
logger.error(message);
expect(process.stderr.write.calledWith(message + '\n')).to.be(true);
@ -113,7 +111,7 @@ describe('kibana cli', function () {
it('should not log any error messages when silent is set', function () {
logger = pluginLogger({ silent: true, quiet: false });
var message = 'this is my error';
const message = 'this is my error';
logger.error(message);
expect(process.stderr.write.callCount).to.be(0);


@ -1,301 +0,0 @@
var expect = require('expect.js');
var sinon = require('sinon');
var progressReporter = require('../progressReporter');
var pluginLogger = require('../pluginLogger');
describe('kibana cli', function () {
describe('plugin installer', function () {
describe('progressReporter', function () {
var logger;
var progress;
var request;
beforeEach(function () {
logger = pluginLogger(false);
sinon.stub(logger, 'log');
sinon.stub(logger, 'error');
request = {
abort: sinon.stub(),
emit: sinon.stub()
};
progress = progressReporter(logger, request);
});
afterEach(function () {
logger.log.restore();
logger.error.restore();
});
describe('handleResponse', function () {
describe('bad response codes', function () {
function testErrorResponse(element, index, array) {
it('should set the state to error for response code = ' + element, function () {
progress.handleResponse({ statusCode: element });
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(true);
expect(errorStub.lastCall.args[0].message).to.match(/ENOTFOUND/);
});
});
}
var badCodes = [
'400', '401', '402', '403', '404', '405', '406', '407', '408', '409', '410',
'411', '412', '413', '414', '415', '416', '417', '500', '501', '502', '503',
'504', '505'
];
badCodes.forEach(testErrorResponse);
});
describe('good response codes', function () {
function testSuccessResponse(statusCode, index, array) {
it('should set the state to success for response code = ' + statusCode, function () {
progress.handleResponse({ statusCode: statusCode, headers: { 'content-length': 1000 } });
progress.handleEnd();
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(false);
expect(logger.log.getCall(logger.log.callCount - 2).args[0]).to.match(/1000/);
});
});
}
function testUnknownNumber(statusCode, index, array) {
it('should log "unknown number of" for response code = ' + statusCode + ' without content-length header', function () {
progress.handleResponse({ statusCode: statusCode, headers: {} });
progress.handleEnd();
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(false);
expect(logger.log.getCall(logger.log.callCount - 2).args[0]).to.match(/unknown number/);
});
});
}
var goodCodes = [
'200', '201', '202', '203', '204', '205', '206', '300', '301', '302', '303',
'304', '305', '306', '307'
];
goodCodes.forEach(testSuccessResponse);
goodCodes.forEach(testUnknownNumber);
});
});
describe('handleData', function () {
it('should do nothing if the reporter is in an error state', function () {
progress.handleResponse({ statusCode: 400 });
progress.handleData({ length: 100 });
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(progress.hasError()).to.be(true);
expect(request.abort.called).to.be(true);
expect(logger.log.callCount).to.be(0);
});
});
it('should do nothing if handleResponse hasn\'t successfully executed yet', function () {
progress.handleData({ length: 100 });
progress.handleEnd();
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(logger.log.callCount).to.be(1);
expect(logger.log.lastCall.args[0]).to.match(/complete/i);
});
});
it('should do nothing if handleResponse was called without a content-length header', function () {
progress.handleResponse({ statusCode: 200, headers: {} });
progress.handleData({ length: 100 });
progress.handleEnd();
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(logger.log.callCount).to.be(2);
expect(logger.log.getCall(0).args[0]).to.match(/downloading/i);
expect(logger.log.getCall(1).args[0]).to.match(/complete/i);
});
});
it('should show a max of 20 dots for full progress', function () {
progress.handleResponse({ statusCode: 200, headers: { 'content-length': 1000 } });
progress.handleData({ length: 1000 });
progress.handleEnd();
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(logger.log.callCount).to.be(22);
expect(logger.log.getCall(0).args[0]).to.match(/downloading/i);
expect(logger.log.getCall(1).args[0]).to.be('.');
expect(logger.log.getCall(2).args[0]).to.be('.');
expect(logger.log.getCall(3).args[0]).to.be('.');
expect(logger.log.getCall(4).args[0]).to.be('.');
expect(logger.log.getCall(5).args[0]).to.be('.');
expect(logger.log.getCall(6).args[0]).to.be('.');
expect(logger.log.getCall(7).args[0]).to.be('.');
expect(logger.log.getCall(8).args[0]).to.be('.');
expect(logger.log.getCall(9).args[0]).to.be('.');
expect(logger.log.getCall(10).args[0]).to.be('.');
expect(logger.log.getCall(11).args[0]).to.be('.');
expect(logger.log.getCall(12).args[0]).to.be('.');
expect(logger.log.getCall(13).args[0]).to.be('.');
expect(logger.log.getCall(14).args[0]).to.be('.');
expect(logger.log.getCall(15).args[0]).to.be('.');
expect(logger.log.getCall(16).args[0]).to.be('.');
expect(logger.log.getCall(17).args[0]).to.be('.');
expect(logger.log.getCall(18).args[0]).to.be('.');
expect(logger.log.getCall(19).args[0]).to.be('.');
expect(logger.log.getCall(20).args[0]).to.be('.');
expect(logger.log.getCall(21).args[0]).to.match(/complete/i);
});
});
it('should show dot for each 5% of completion', function () {
progress.handleResponse({ statusCode: 200, headers: { 'content-length': 1000 } });
expect(logger.log.callCount).to.be(1);
progress.handleData({ length: 50 }); //5%
expect(logger.log.callCount).to.be(2);
progress.handleData({ length: 100 }); //15%
expect(logger.log.callCount).to.be(4);
progress.handleData({ length: 200 }); //35%
expect(logger.log.callCount).to.be(8);
progress.handleData({ length: 590 }); //94%
expect(logger.log.callCount).to.be(20);
progress.handleData({ length: 60 }); //100%
expect(logger.log.callCount).to.be(21);
//Any progress over 100% should be ignored.
progress.handleData({ length: 9999 });
expect(logger.log.callCount).to.be(21);
progress.handleEnd();
expect(logger.log.callCount).to.be(22);
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(false);
expect(logger.log.getCall(0).args[0]).to.match(/downloading/i);
expect(logger.log.getCall(21).args[0]).to.match(/complete/i);
});
});
});
describe('handleEnd', function () {
it('should reject the deferred with a ENOTFOUND error if the reporter is in an error state', function () {
progress.handleResponse({ statusCode: 400 });
progress.handleEnd();
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.firstCall.args[0].message).to.match(/ENOTFOUND/);
expect(errorStub.called).to.be(true);
});
});
it('should resolve if the reporter is not in an error state', function () {
progress.handleResponse({ statusCode: 307, headers: { 'content-length': 1000 } });
progress.handleEnd();
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(false);
expect(logger.log.lastCall.args[0]).to.match(/complete/i);
});
});
});
describe('handleError', function () {
it('should log any errors', function () {
progress.handleError('ERRORMESSAGE', new Error('oops!'));
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(true);
expect(logger.error.callCount).to.be(1);
expect(logger.error.lastCall.args[0]).to.match(/oops!/);
});
});
it('should set the error state of the reporter', function () {
progress.handleError('ERRORMESSAGE', new Error('oops!'));
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(progress.hasError()).to.be(true);
});
});
it('should ignore all errors except the first.', function () {
progress.handleError('ERRORMESSAGE', new Error('oops!'));
progress.handleError('ERRORMESSAGE', new Error('second error!'));
progress.handleError('ERRORMESSAGE', new Error('third error!'));
progress.handleError('ERRORMESSAGE', new Error('fourth error!'));
var errorStub = sinon.stub();
return progress.promise
.catch(errorStub)
.then(function (data) {
expect(errorStub.called).to.be(true);
expect(logger.error.callCount).to.be(1);
expect(logger.error.lastCall.args[0]).to.match(/oops!/);
});
});
});
});
});
});


@ -0,0 +1,96 @@
const expect = require('expect.js');
const sinon = require('sinon');
const progressReporter = require('../progress_reporter');
const pluginLogger = require('../plugin_logger');
describe('kibana cli', function () {
describe('plugin installer', function () {
describe('progressReporter', function () {
let logger;
let progress;
let request;
beforeEach(function () {
logger = pluginLogger({ silent: false, quiet: false });
sinon.stub(logger, 'log');
sinon.stub(logger, 'error');
progress = progressReporter(logger);
});
afterEach(function () {
logger.log.restore();
logger.error.restore();
});
describe('handleData', function () {
it('should show a max of 20 dots for full progress', function () {
progress.init(1000);
progress.progress(1000);
progress.complete();
expect(logger.log.callCount).to.be(22);
expect(logger.log.getCall(0).args[0]).to.match(/transfer/i);
expect(logger.log.getCall(1).args[0]).to.be('.');
expect(logger.log.getCall(2).args[0]).to.be('.');
expect(logger.log.getCall(3).args[0]).to.be('.');
expect(logger.log.getCall(4).args[0]).to.be('.');
expect(logger.log.getCall(5).args[0]).to.be('.');
expect(logger.log.getCall(6).args[0]).to.be('.');
expect(logger.log.getCall(7).args[0]).to.be('.');
expect(logger.log.getCall(8).args[0]).to.be('.');
expect(logger.log.getCall(9).args[0]).to.be('.');
expect(logger.log.getCall(10).args[0]).to.be('.');
expect(logger.log.getCall(11).args[0]).to.be('.');
expect(logger.log.getCall(12).args[0]).to.be('.');
expect(logger.log.getCall(13).args[0]).to.be('.');
expect(logger.log.getCall(14).args[0]).to.be('.');
expect(logger.log.getCall(15).args[0]).to.be('.');
expect(logger.log.getCall(16).args[0]).to.be('.');
expect(logger.log.getCall(17).args[0]).to.be('.');
expect(logger.log.getCall(18).args[0]).to.be('.');
expect(logger.log.getCall(19).args[0]).to.be('.');
expect(logger.log.getCall(20).args[0]).to.be('.');
expect(logger.log.getCall(21).args[0]).to.match(/complete/i);
});
it('should show dot for each 5% of completion', function () {
progress.init(1000);
expect(logger.log.callCount).to.be(1);
progress.progress(50); //5%
expect(logger.log.callCount).to.be(2);
progress.progress(100); //15%
expect(logger.log.callCount).to.be(4);
progress.progress(200); //35%
expect(logger.log.callCount).to.be(8);
progress.progress(590); //94%
expect(logger.log.callCount).to.be(20);
progress.progress(60); //100%
expect(logger.log.callCount).to.be(21);
//Any progress over 100% should be ignored.
progress.progress(9999);
expect(logger.log.callCount).to.be(21);
progress.complete();
expect(logger.log.callCount).to.be(22);
expect(logger.log.getCall(0).args[0]).to.match(/transfer/i);
expect(logger.log.getCall(21).args[0]).to.match(/complete/i);
});
});
});
});
});
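The dot-per-5% behaviour verified above can be reproduced with a small standalone reporter. This is a sketch with assumed names, not the actual progress_reporter module:

```javascript
// Sketch: log one dot per 5% transferred, capped at 20 dots total.
function makeProgressReporter(log) {
  let total = 0;
  let transferred = 0;
  let dotsLogged = 0;

  return {
    init(size) {
      total = size;
      log('Transferring ' + total + ' bytes');
    },
    progress(size) {
      transferred += size;
      // Progress past 100% is clamped, so extra chunks log nothing.
      const dots = Math.round(Math.min(transferred, total) / total * 20);
      while (dotsLogged < dots) {
        log('.');
        dotsLogged += 1;
      }
    },
    complete() {
      log('Transfer complete');
    }
  };
}
```

The clamp on `transferred` is what makes the "any progress over 100% should be ignored" assertion above hold: once 20 dots are out, further chunks are no-ops.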

Binary file not shown (image added, 204 KiB).


@ -0,0 +1,97 @@
504b 0304 1400 0000 0000 d575 7147 0000
0000 0000 0000 0000 0000 1300 0000 7465
7374 2d70 6c75 6769 6e2d 6d61 7374 6572
2f50 4b03 040a 0000 0000 00f2 63d8 46a5
06bf 880c 0000 000c 0000 001d 0000 0074
6573 742d 706c 7567 696e 2d6d 6173 7465
722f 2e67 6974 6967 6e6f 7265 6e6f 6465
5f6d 6f64 756c 6573 504b 0304 1400 0000
0800 f263 d846 38c6 e53d 9d00 0000 ee00
0000 1b00 0000 7465 7374 2d70 6c75 6769
6e2d 6d61 7374 6572 2f69 6e64 6578 2e6a
733d 8cc1 0e82 3010 44ef 7cc5 de80 4469
133d 413c f807 1efc 8182 ab36 96ed 06b6
d1c4 f0ef 16a8 cc61 9399 7d33 bdbf 0587
157e d80f 32c2 09ee 813a b19e a078 d9d6
9029 e19b 010c 2861 2020 7cc3 1a57 1717
1e96 8af8 8c4a f57a 6617 19e6 c524 8915
8735 e457 1c05 d626 9c99 f3dd 46d8 ce53
049e 225c 2bc5 ce74 d89a 9855 84a2 8e5a
ab83 d611 dff8 ded8 99e7 656b 5412 87f7
ab51 260e 276e cafe 772a 9b6c 6a7e 504b
0304 1400 0000 0800 f263 d846 5c85 06c2
0901 0000 dc01 0000 1f00 0000 7465 7374
2d70 6c75 6769 6e2d 6d61 7374 6572 2f70
6163 6b61 6765 2e6a 736f 6e5d 90cd 6ec3
2010 84ef 790a ea4b 5a29 218e dd46 6a6e
7d8f a812 c62b 1b97 0262 975a 5695 772f
60e7 a7e1 c67c bb33 03bf 2bc6 0a23 bea1
38b2 8200 69eb 74e8 9429 3609 fc80 4765
4d62 7b5e f272 565b 40e9 95a3 850c 0189
0996 96d9 fdb2 0767 5191 f553 9c4a 3951
a3c9 e5a4 4e51 1cca 52f0 3a29 3d91 3bee
7623 3471 0778 1a88 fc9c 9de6 38bc d944
6352 0649 e8bc 6b6c 0b6c 0b6c 2dad 41ab
816b db3d 9f8a 78eb bca0 a045 aa8a 1b36
d9c0 466b 9efe 9f53 f1b2 ce59 cbe3 1c98
168c 5470 17d8 e800 8df2 6d4a fbac f83b
afcb 4b7f d022 9691 7cc0 0cf7 bce2 8f0c
4178 d967 fcc6 cb1b 1eac cae2 81bf f2fa
226a db0a 9c87 eb18 74d5 470f f26b f138
448f 6b63 ad24 18cc dffa e184 ec61 5b25
7c5e fd01 504b 0304 1400 0000 0000 d575
7147 0000 0000 0000 0000 0000 0000 1a00
0000 7465 7374 2d70 6c75 6769 6e2d 6d61
7374 6572 2f70 7562 6c69 632f 504b 0304
1400 0000 0800 f263 d846 674a 6865 4a00
0000 4e00 0000 2000 0000 7465 7374 2d70
6c75 6769 6e2d 6d61 7374 6572 2f70 7562
6c69 632f 6170 702e 6a73 05c1 c10d 8020
1004 c0bf 55ac 2fa1 062b f169 6091 4bc8
a178 e7c7 d8bb 3399 4594 a1b8 2693 ae08
8397 cb60 c43b 017b e3b0 b06c dd51 f787
104d cd33 33ac 12c6 db70 363f 44e7 25ae
d317 d71f 504b 0304 0a00 0000 0000 f263
d846 ac2f 0f2b 1200 0000 1200 0000 1c00
0000 7465 7374 2d70 6c75 6769 6e2d 6d61
7374 6572 2f52 4541 444d 452e 6d64 4920
616d 2061 2074 6573 7420 706c 7567 696e
504b 0304 1400 0000 0000 4b7e 7147 0000
0000 0000 0000 0000 0000 2d00 0000 7465
7374 2d70 6c75 6769 6e2d 6d61 7374 6572
2f65 7874 7261 2066 696c 6520 6f6e 6c79
2069 6e20 7a69 702e 7478 7450 4b01 0214
0014 0000 0000 00d5 7571 4700 0000 0000
0000 0000 0000 0013 0024 0000 0000 0000
0010 0000 0000 0000 0074 6573 742d 706c
7567 696e 2d6d 6173 7465 722f 0a00 2000
0000 0000 0100 1800 4634 e20f 7921 d101
4634 e20f 7921 d101 d449 e10f 7921 d101
504b 0102 1400 0a00 0000 0000 f263 d846
a506 bf88 0c00 0000 0c00 0000 1d00 2400
0000 0000 0000 2000 0000 3100 0000 7465
7374 2d70 6c75 6769 6e2d 6d61 7374 6572
2f2e 6769 7469 676e 6f72 650a 0020 0000
0000 0001 0018 0000 f483 00ac aed0 0179
98e1 0f79 21d1 017
0000 0008 00f2 63d8 4667 4a68 654a 0000
004e 0000 0020 0024 0000 0000 0000 0020
0000 00cc 0200 0074 6573 742d 706c 7567
696e 2d6d 6173 7465 722f 7075 626c 6963
2f61 7070 2e6a 730a 0020 0000 0000 0001
0018 0000 f483 00ac aed0 015b 5be2 0f79
21d1 015b 5be2 0f79 21d1 0150 4b01 0214
000a 0000 0000 00f2 63d8 46ac 2f0f 2b12
0000 0012 0000 001c 0024 0000 0000 0000
0020 0000 0054 0300 0074 6573 742d 706c
7567 696e 2d6d 6173 7465 722f 5245 4144
4d45 2e6d 640a 0020 0000 0000 0001 0018
0000 f483 00ac aed0 014e 0de2 0f79 21d1
014e 0de2 0f79 21d1 0150 4b01 0214 0014
0000 0000 004b 7e71 4700 0000 0000 0000
0000 0000 002d 0000 0000 0000 0000 0020
0000 00a0 0300 0074 6573 742d 706c 7567
696e 2d6d 6173 7465 722f 6578 7472 6120
6669 6c65 206f 6e6c 7920 696e 207a 6970
2e74 7874 504b 0506 0000 0000 0800 0800
5903 0000 eb03 0000 0000


@ -1,13 +0,0 @@
{
"name": "test-plugin",
"version": "1.0.0",
"description": "just a test plugin",
"repository": {
"type": "git",
"url": "http://website.git"
},
"dependencies": {
"bluebird": "2.9.30"
},
"license": "Apache-2.0"
}


@ -3,7 +3,7 @@ var expect = require('expect.js');
var utils = require('requirefrom')('src/utils');
var fromRoot = utils('fromRoot');
var settingParser = require('../settingParser');
var settingParser = require('../setting_parser');
describe('kibana cli', function () {
@ -205,9 +205,8 @@ describe('kibana cli', function () {
var settings = parser.parse();
expect(settings.urls).to.have.property('length', 2);
expect(settings.urls).to.have.property('length', 1);
expect(settings.urls).to.contain('https://download.elastic.co/kibana/test-plugin/test-plugin-latest.tar.gz');
expect(settings.urls).to.contain('https://github.com/kibana/test-plugin/archive/master.tar.gz');
});
it('should populate the urls collection properly version specified', function () {
@ -216,9 +215,8 @@ describe('kibana cli', function () {
var settings = parser.parse();
expect(settings.urls).to.have.property('length', 2);
expect(settings.urls).to.have.property('length', 1);
expect(settings.urls).to.contain('https://download.elastic.co/kibana/test-plugin/test-plugin-v1.1.1.tar.gz');
expect(settings.urls).to.contain('https://github.com/kibana/test-plugin/archive/v1.1.1.tar.gz');
});
it('should populate the pluginPath', function () {
@ -231,6 +229,26 @@ describe('kibana cli', function () {
expect(settings).to.have.property('pluginPath', expected);
});
it('should populate the workingPath', function () {
options.install = 'kibana/test-plugin';
parser = settingParser(options);
var settings = parser.parse();
var expected = fromRoot('installedPlugins/.plugin.installing');
expect(settings).to.have.property('workingPath', expected);
});
it('should populate the tempArchiveFile', function () {
options.install = 'kibana/test-plugin';
parser = settingParser(options);
var settings = parser.parse();
var expected = fromRoot('installedPlugins/.plugin.installing/archive.part');
expect(settings).to.have.property('tempArchiveFile', expected);
});
describe('with url option', function () {
it('should allow one part to the install parameter', function () {


@ -0,0 +1,76 @@
const { createWriteStream, createReadStream, unlinkSync, statSync } = require('fs');
const getProgressReporter = require('../progress_reporter');
function openSourceFile({ sourcePath }) {
try {
const fileInfo = statSync(sourcePath);
const readStream = createReadStream(sourcePath);
return { readStream, fileInfo };
} catch (err) {
if (err.code === 'ENOENT') {
throw new Error('ENOTFOUND');
}
throw err;
}
}
async function copyFile({ readStream, writeStream, progressReporter }) {
await new Promise((resolve, reject) => {
// if either stream errors, fail quickly
readStream.on('error', reject);
writeStream.on('error', reject);
// report progress as we transfer
readStream.on('data', (chunk) => {
progressReporter.progress(chunk.length);
});
// write the download to the file system
readStream.pipe(writeStream);
// when the write is done, we are done
writeStream.on('finish', resolve);
});
}
function getArchiveTypeFromFilename(path) {
if (/\.zip$/i.test(path)) {
return '.zip';
}
if (/\.tar\.gz$/i.test(path)) {
return '.tar.gz';
}
}
/*
// Responsible for managing local file transfers
*/
export default async function copyLocalFile(logger, sourcePath, targetPath) {
try {
const { readStream, fileInfo } = openSourceFile({ sourcePath });
const writeStream = createWriteStream(targetPath);
try {
const progressReporter = getProgressReporter(logger);
progressReporter.init(fileInfo.size);
await copyFile({ readStream, writeStream, progressReporter });
progressReporter.complete();
} catch (err) {
readStream.close();
writeStream.close();
throw err;
}
// all is well, return our archive type
const archiveType = getArchiveTypeFromFilename(sourcePath);
return { archiveType };
} catch (err) {
logger.error(err);
throw err;
}
};


@ -0,0 +1,96 @@
const { fromNode: fn } = require('bluebird');
const { createWriteStream, unlinkSync } = require('fs');
const Wreck = require('wreck');
const getProgressReporter = require('../progress_reporter');
function sendRequest({ sourceUrl, timeout }) {
const maxRedirects = 11; //Because this one goes to 11.
return fn(cb => {
const req = Wreck.request('GET', sourceUrl, { timeout, redirects: maxRedirects }, (err, resp) => {
if (err) {
if (err.code === 'ECONNREFUSED') {
err = new Error('ENOTFOUND');
}
return cb(err);
}
if (resp.statusCode >= 400) {
return cb(new Error('ENOTFOUND'));
}
cb(null, { req, resp });
});
});
}
function downloadResponse({ resp, targetPath, progressReporter }) {
return new Promise((resolve, reject) => {
const writeStream = createWriteStream(targetPath);
// if either stream errors, fail quickly
resp.on('error', reject);
writeStream.on('error', reject);
// report progress as we download
resp.on('data', (chunk) => {
progressReporter.progress(chunk.length);
});
// write the download to the file system
resp.pipe(writeStream);
// when the write is done, we are done
writeStream.on('finish', resolve);
});
}
function getArchiveTypeFromResponse(resp, sourceUrl) {
const contentType = (resp.headers['content-type'] || '');
switch (contentType.toLowerCase()) {
case 'application/zip': return '.zip';
case 'application/x-gzip': return '.tar.gz';
default:
//If we can't infer the archive type from the content-type header,
//fall back to checking the extension in the url
if (/\.zip$/i.test(sourceUrl)) {
return '.zip';
}
if (/\.tar\.gz$/i.test(sourceUrl)) {
return '.tar.gz';
}
break;
}
}
/*
Responsible for managing http transfers
*/
export default async function downloadUrl(logger, sourceUrl, targetPath, timeout) {
try {
const { req, resp } = await sendRequest({ sourceUrl, timeout });
try {
let totalSize = parseFloat(resp.headers['content-length']) || 0;
const progressReporter = getProgressReporter(logger);
progressReporter.init(totalSize);
await downloadResponse({ resp, targetPath, progressReporter });
progressReporter.complete();
} catch (err) {
req.abort();
throw err;
}
// all is well, return our archive type
const archiveType = getArchiveTypeFromResponse(resp, sourceUrl);
return { archiveType };
} catch (err) {
if (err.message !== 'ENOTFOUND') {
logger.error(err);
}
throw err;
}
};
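The content-type fallback in `getArchiveTypeFromResponse` can be exercised in isolation with a hypothetical standalone helper that mirrors the same logic — header first, URL extension as the fallback:

```javascript
// Standalone mirror of the inference logic above (for illustration):
// prefer the content-type header, fall back to the url's extension.
function inferArchiveType(contentType, sourceUrl) {
  switch ((contentType || '').toLowerCase()) {
    case 'application/zip': return '.zip';
    case 'application/x-gzip': return '.tar.gz';
  }
  if (/\.zip$/i.test(sourceUrl)) return '.zip';
  if (/\.tar\.gz$/i.test(sourceUrl)) return '.tar.gz';
}
```

Returning `undefined` for anything unrecognized leaves the "unsupported archive format" decision to the extractor, which is where the tests earlier in this diff assert on it.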


@ -0,0 +1,34 @@
const zlib = require('zlib');
const fs = require('fs');
const tar = require('tar');
async function extractArchive(settings) {
await new Promise((resolve, reject) => {
const gunzip = zlib.createGunzip();
const tarExtract = new tar.Extract({ path: settings.workingPath, strip: 1 });
const readStream = fs.createReadStream(settings.tempArchiveFile);
readStream.on('error', reject);
gunzip.on('error', reject);
tarExtract.on('error', reject);
readStream
.pipe(gunzip)
.pipe(tarExtract);
tarExtract.on('finish', resolve);
});
}
export default async function extractTarball(settings, logger) {
try {
logger.log('Extracting plugin archive');
await extractArchive(settings);
logger.log('Extraction complete');
} catch (err) {
logger.error(err);
throw new Error('Error extracting plugin archive');
}
};


@ -0,0 +1,32 @@
const DecompressZip = require('@bigfunger/decompress-zip');
async function extractArchive(settings) {
await new Promise((resolve, reject) => {
const unzipper = new DecompressZip(settings.tempArchiveFile);
unzipper.on('error', reject);
unzipper.extract({
path: settings.workingPath,
strip: 1,
filter(file) {
return file.type !== 'SymbolicLink';
}
});
unzipper.on('extract', resolve);
});
}
export default async function extractZip(settings, logger) {
try {
logger.log('Extracting plugin archive');
await extractArchive(settings);
logger.log('Extraction complete');
} catch (err) {
logger.error(err);
throw new Error('Error extracting plugin archive');
}
};


@ -1,14 +1,13 @@
var utils = require('requirefrom')('src/utils');
var fromRoot = utils('fromRoot');
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
const settingParser = require('./setting_parser');
const installer = require('./plugin_installer');
const remover = require('./plugin_remover');
const pluginLogger = require('./plugin_logger');
var settingParser = require('./settingParser');
var installer = require('./pluginInstaller');
var remover = require('./pluginRemover');
var pluginLogger = require('./pluginLogger');
module.exports = function (program) {
export default function pluginCli(program) {
function processCommand(command, options) {
var settings;
let settings;
try {
settings = settingParser(command).parse();
} catch (ex) {
@ -17,7 +16,7 @@ module.exports = function (program) {
process.exit(64); // eslint-disable-line no-process-exit
}
var logger = pluginLogger(settings);
const logger = pluginLogger(settings);
if (settings.action === 'install') {
installer.install(settings, logger);
@ -54,14 +53,12 @@ module.exports = function (program) {
`
Common examples:
-i username/sample
attempts to download the latest version from the following urls:
attempts to download the latest version from the following url:
https://download.elastic.co/username/sample/sample-latest.tar.gz
https://github.com/username/sample/archive/master.tar.gz
-i username/sample/v1.1.1
attempts to download version v1.1.1 from the following urls:
attempts to download version v1.1.1 from the following url:
https://download.elastic.co/username/sample/sample-v1.1.1.tar.gz
https://github.com/username/sample/archive/v1.1.1.tar.gz
-i sample -u http://www.example.com/other_name.tar.gz
attempts to download from the specified url,


@ -1,98 +0,0 @@
var _ = require('lodash');
var zlib = require('zlib');
var Promise = require('bluebird');
var url = require('url');
var fs = require('fs');
var request = require('request');
var tar = require('tar');
var progressReporter = require('./progressReporter');
module.exports = function (settings, logger) {
//Attempts to download each url in turn until one is successful
function download() {
var urls = settings.urls;
function tryNext() {
var sourceUrl = urls.shift();
if (!sourceUrl) {
throw new Error('Not a valid url.');
}
logger.log('Attempting to extract from ' + sourceUrl);
return Promise.try(function () {
return downloadSingle(sourceUrl, settings.workingPath, settings.timeout, logger)
.catch(function (err) {
if (err.message === 'ENOTFOUND') {
return tryNext();
}
if (err.message === 'EEXTRACT') {
throw (new Error('Error extracting the plugin archive... is this a valid tar.gz file?'));
}
throw (err);
});
})
.catch(function (err) {
//Special case for when request.get throws an exception
if (err.message.match(/invalid uri/i)) {
return tryNext();
}
throw (err);
});
}
return tryNext();
}
//Attempts to download a single url
function downloadSingle(source, dest, timeout) {
var gunzip = zlib.createGunzip();
var tarExtract = new tar.Extract({ path: dest, strip: 1 });
var requestOptions = { url: source };
if (timeout !== 0) {
requestOptions.timeout = timeout;
}
return wrappedRequest(requestOptions)
.then(function (fileStream) {
var reporter = progressReporter(logger, fileStream);
fileStream
.on('response', reporter.handleResponse)
.on('data', reporter.handleData)
.on('error', _.partial(reporter.handleError, 'ENOTFOUND'))
.pipe(gunzip)
.on('error', _.partial(reporter.handleError, 'EEXTRACT'))
.pipe(tarExtract)
.on('error', _.partial(reporter.handleError, 'EEXTRACT'))
.on('end', reporter.handleEnd);
return reporter.promise;
});
}
function wrappedRequest(requestOptions) {
return Promise.try(function () {
let urlInfo = url.parse(requestOptions.url);
if (/^file/.test(urlInfo.protocol)) {
return fs.createReadStream(urlInfo.path);
} else {
return request.get(requestOptions);
}
})
.catch(function (err) {
if (err.message.match(/invalid uri/i)) {
throw new Error('ENOTFOUND');
}
throw err;
});
}
return {
download: download,
_downloadSingle: downloadSingle
};
};

View file

@ -1,71 +0,0 @@
let _ = require('lodash');
var utils = require('requirefrom')('src/utils');
var fromRoot = utils('fromRoot');
var pluginDownloader = require('./pluginDownloader');
var pluginCleaner = require('./pluginCleaner');
var KbnServer = require('../../server/KbnServer');
var readYamlConfig = require('../serve/read_yaml_config');
var fs = require('fs');
module.exports = {
install: install
};
function install(settings, logger) {
logger.log(`Installing ${settings.package}`);
try {
fs.statSync(settings.pluginPath);
logger.error(`Plugin ${settings.package} already exists, please remove before installing a new version`);
process.exit(70); // eslint-disable-line no-process-exit
} catch (e) {
if (e.code !== 'ENOENT') throw e;
}
var cleaner = pluginCleaner(settings, logger);
var downloader = pluginDownloader(settings, logger);
return cleaner.cleanPrevious()
.then(function () {
return downloader.download();
})
.then(async function() {
logger.log('Optimizing and caching browser bundles...');
let serverConfig = _.merge(
readYamlConfig(settings.config),
{
env: 'production',
logging: {
silent: settings.silent,
quiet: !settings.silent,
verbose: false
},
optimize: {
useBundleCache: false
},
server: {
autoListen: false
},
plugins: {
initialize: false,
scanDirs: [settings.pluginDir, fromRoot('src/plugins')],
paths: [settings.workingPath]
}
}
);
let kbnServer = new KbnServer(serverConfig);
await kbnServer.ready();
await kbnServer.close();
})
.then(function () {
fs.renameSync(settings.workingPath, settings.pluginPath);
logger.log('Plugin installation complete');
})
.catch(function (e) {
logger.error(`Plugin installation was unsuccessful due to error "${e.message}"`);
cleaner.cleanError();
process.exit(70); // eslint-disable-line no-process-exit
});
}

View file

@ -1,9 +1,7 @@
var rimraf = require('rimraf');
var fs = require('fs');
var Promise = require('bluebird');
module.exports = function (settings, logger) {
const rimraf = require('rimraf');
const fs = require('fs');
export default function createPluginCleaner(settings, logger) {
function cleanPrevious() {
return new Promise(function (resolve, reject) {
try {
@ -27,8 +25,10 @@ module.exports = function (settings, logger) {
function cleanError() {
// delete the working directory.
// At this point we're bailing, so swallow any errors on delete.
try { rimraf.sync(settings.workingPath); }
try {
rimraf.sync(settings.workingPath);
rimraf.sync(settings.pluginPath);
}
catch (e) {} // eslint-disable-line no-empty
}

View file

@ -0,0 +1,51 @@
const _ = require('lodash');
const urlParse = require('url').parse;
const downloadHttpFile = require('./downloaders/http');
const downloadLocalFile = require('./downloaders/file');
export default function createPluginDownloader(settings, logger) {
let archiveType;
let sourceType;
//Attempts to download each url in turn until one is successful
function download() {
const urls = settings.urls.slice(0);
function tryNext() {
const sourceUrl = urls.shift();
if (!sourceUrl) {
throw new Error('No valid url specified.');
}
logger.log(`Attempting to transfer from ${sourceUrl}`);
return downloadSingle(sourceUrl)
.catch((err) => {
if (err.message === 'ENOTFOUND') {
return tryNext();
}
throw (err);
});
}
return tryNext();
}
function downloadSingle(sourceUrl) {
const urlInfo = urlParse(sourceUrl);
let downloadPromise;
if (/^file/.test(urlInfo.protocol)) {
downloadPromise = downloadLocalFile(logger, urlInfo.path, settings.tempArchiveFile);
} else {
downloadPromise = downloadHttpFile(logger, sourceUrl, settings.tempArchiveFile, settings.timeout);
}
return downloadPromise;
}
return {
download: download,
_downloadSingle: downloadSingle
};
};
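The `download()` function above tries each candidate URL in turn, falling through to the next one only when the failure is an `ENOTFOUND`. A minimal standalone sketch of that fallback pattern (the `download`/`downloadSingle` names here are stand-ins, not the actual module's exports):

```javascript
// A minimal sketch of the try-each-url-in-turn fallback shown above.
// These functions are illustrative stand-ins, not the module's real API.
function download(urls, downloadSingle) {
  const remaining = urls.slice(0); // copy so the caller's array is untouched
  function tryNext() {
    const sourceUrl = remaining.shift();
    if (!sourceUrl) {
      return Promise.reject(new Error('No valid url specified.'));
    }
    return downloadSingle(sourceUrl).catch((err) => {
      // ENOTFOUND means "this source didn't work"; fall through to the next url
      if (err.message === 'ENOTFOUND') return tryNext();
      throw err;
    });
  }
  return tryNext();
}

// Usage: the first url fails with ENOTFOUND, so the second is attempted.
download(
  ['https://first.example/sample.tar.gz', 'https://second.example/sample.tar.gz'],
  (url) => url.includes('first.example')
    ? Promise.reject(new Error('ENOTFOUND'))
    : Promise.resolve(url)
).then((winner) => console.log(winner));
```

Any error other than `ENOTFOUND` is re-thrown immediately, so genuine failures (for example a corrupt archive) are not silently retried against the next URL.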

View file

@ -0,0 +1,15 @@
const zipExtract = require('./extractors/zip');
const tarGzExtract = require('./extractors/tar_gz');
export default function extractArchive(settings, logger, archiveType) {
switch (archiveType) {
case '.zip':
return zipExtract(settings, logger);
case '.tar.gz':
return tarGzExtract(settings, logger);
default:
throw new Error('Unsupported archive format.');
}
};

View file

@ -0,0 +1,87 @@
const _ = require('lodash');
const utils = require('requirefrom')('src/utils');
const fromRoot = utils('fromRoot');
const pluginDownloader = require('./plugin_downloader');
const pluginCleaner = require('./plugin_cleaner');
const pluginExtractor = require('./plugin_extractor');
const KbnServer = require('../../server/KbnServer');
const readYamlConfig = require('../serve/read_yaml_config');
const { statSync, renameSync } = require('fs');
const Promise = require('bluebird');
const rimrafSync = require('rimraf').sync;
const mkdirp = Promise.promisify(require('mkdirp'));
export default {
install: install
};
function checkForExistingInstall(settings, logger) {
try {
statSync(settings.pluginPath);
logger.error(`Plugin ${settings.package} already exists, please remove before installing a new version`);
process.exit(70); // eslint-disable-line no-process-exit
} catch (e) {
if (e.code !== 'ENOENT') throw e;
}
}
async function rebuildKibanaCache(settings, logger) {
logger.log('Optimizing and caching browser bundles...');
const serverConfig = _.merge(
readYamlConfig(settings.config),
{
env: 'production',
logging: {
silent: settings.silent,
quiet: !settings.silent,
verbose: false
},
optimize: {
useBundleCache: false
},
server: {
autoListen: false
},
plugins: {
initialize: false,
scanDirs: [settings.pluginDir, fromRoot('src/plugins')]
}
}
);
const kbnServer = new KbnServer(serverConfig);
await kbnServer.ready();
await kbnServer.close();
}
async function install(settings, logger) {
logger.log(`Installing ${settings.package}`);
const cleaner = pluginCleaner(settings, logger);
try {
checkForExistingInstall(settings, logger);
await cleaner.cleanPrevious();
await mkdirp(settings.workingPath);
const downloader = pluginDownloader(settings, logger);
const { archiveType } = await downloader.download();
await pluginExtractor(settings, logger, archiveType);
rimrafSync(settings.tempArchiveFile);
renameSync(settings.workingPath, settings.pluginPath);
await rebuildKibanaCache(settings, logger);
logger.log('Plugin installation complete');
} catch (err) {
logger.error(`Plugin installation was unsuccessful due to error "${err.message}"`);
cleaner.cleanError();
process.exit(70); // eslint-disable-line no-process-exit
}
}

View file

@ -1,7 +1,7 @@
module.exports = function (settings) {
var previousLineEnded = true;
var silent = !!settings.silent;
var quiet = !!settings.quiet;
export default function createPluginLogger(settings) {
let previousLineEnded = true;
const silent = !!settings.silent;
const quiet = !!settings.quiet;
function log(data, sameLine) {
if (silent || quiet) return;
@ -33,7 +33,7 @@ module.exports = function (settings) {
data.pipe(process.stderr);
return;
}
process.stderr.write(data + '\n');
process.stderr.write(`${data}\n`);
previousLineEnded = true;
}

View file

@ -1,5 +1,5 @@
var fs = require('fs');
var rimraf = require('rimraf');
const fs = require('fs');
const rimraf = require('rimraf');
module.exports = {
remove: remove

View file

@ -1,71 +0,0 @@
var Promise = require('bluebird');
/*
Responsible for reporting the progress of the file stream
*/
module.exports = function (logger, stream) {
var oldDotCount = 0;
var runningTotal = 0;
var totalSize = 0;
var hasError = false;
var _resolve;
var _reject;
var _resp;
var promise = new Promise(function (resolve, reject) {
_resolve = resolve;
_reject = reject;
});
function handleError(errorMessage, err) {
if (hasError) return;
if (err) logger.error(err);
hasError = true;
if (stream.abort) stream.abort();
_reject(new Error(errorMessage));
}
function handleResponse(resp) {
_resp = resp;
if (resp.statusCode >= 400) {
handleError('ENOTFOUND', null);
} else {
totalSize = parseInt(resp.headers['content-length'], 10) || 0;
var totalDesc = totalSize || 'unknown number of';
logger.log('Downloading ' + totalDesc + ' bytes', true);
}
}
//Should log a dot for every 5% of progress
//Note: no progress is logged if the plugin is downloaded in a single packet
function handleData(buffer) {
if (hasError) return;
if (!totalSize) return;
runningTotal += buffer.length;
var dotCount = Math.round(runningTotal / totalSize * 100 / 5);
if (dotCount > 20) dotCount = 20;
for (var i = 0; i < (dotCount - oldDotCount); i++) {
logger.log('.', true);
}
oldDotCount = dotCount;
}
function handleEnd() {
if (hasError) return;
logger.log('Extraction complete');
_resolve();
}
return {
promise: promise,
handleResponse: handleResponse,
handleError: handleError,
handleData: handleData,
handleEnd: handleEnd,
hasError: function () { return hasError; }
};
};

View file

@ -0,0 +1,38 @@
/*
Generates file transfer progress messages
*/
export default function createProgressReporter(logger) {
let dotCount = 0;
let runningTotal = 0;
let totalSize = 0;
function init(size) {
totalSize = size;
let totalDesc = totalSize || 'unknown number of';
logger.log(`Transferring ${totalDesc} bytes`, true);
}
//Should log a dot for every 5% of progress
function progress(size) {
if (!totalSize) return;
runningTotal += size;
let newDotCount = Math.round(runningTotal / totalSize * 100 / 5);
if (newDotCount > 20) newDotCount = 20;
for (let i = 0; i < (newDotCount - dotCount); i++) {
logger.log('.', true);
}
dotCount = newDotCount;
}
function complete() {
logger.log(`Transfer complete`, false);
}
return {
init: init,
progress: progress,
complete: complete
};
};
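The reporter above logs one dot per 5% of progress, capped at 20 dots. A self-contained sketch driving it with a stub logger (the reporter body is copied from the diff; the stub logger is hypothetical):

```javascript
// Standalone copy of the progress reporter above, driven by a stub logger.
function createProgressReporter(logger) {
  let dotCount = 0;
  let runningTotal = 0;
  let totalSize = 0;
  function init(size) {
    totalSize = size;
    const totalDesc = totalSize || 'unknown number of';
    logger.log(`Transferring ${totalDesc} bytes`, true);
  }
  // one dot per 5% of progress, capped at 20 dots total
  function progress(size) {
    if (!totalSize) return;
    runningTotal += size;
    let newDotCount = Math.round(runningTotal / totalSize * 100 / 5);
    if (newDotCount > 20) newDotCount = 20;
    for (let i = 0; i < newDotCount - dotCount; i++) logger.log('.', true);
    dotCount = newDotCount;
  }
  function complete() {
    logger.log('Transfer complete', false);
  }
  return { init: init, progress: progress, complete: complete };
}

// Stub logger that records each message instead of writing to stderr.
const lines = [];
const logger = { log: (msg) => lines.push(msg) };
const reporter = createProgressReporter(logger);
reporter.init(100);
reporter.progress(50); // 50% transferred -> 10 dots
reporter.progress(50); // 100% transferred -> 10 more dots
reporter.complete();
```

Note that when the total size is unknown (`content-length` missing), `progress()` returns early and no dots are logged at all.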

View file

@ -1,12 +1,12 @@
var { resolve } = require('path');
var expiry = require('expiry-js');
const { resolve } = require('path');
const expiry = require('expiry-js');
module.exports = function (options) {
export default function createSettingParser(options) {
function parseMilliseconds(val) {
var result;
let result;
try {
var timeVal = expiry(val);
let timeVal = expiry(val);
result = timeVal.asMilliseconds();
} catch (ex) {
result = 0;
@ -16,22 +16,15 @@ module.exports = function (options) {
}
function generateDownloadUrl(settings) {
var version = (settings.version) || 'latest';
var filename = settings.package + '-' + version + '.tar.gz';
const version = (settings.version) || 'latest';
const filename = settings.package + '-' + version + '.tar.gz';
return 'https://download.elastic.co/' + settings.organization + '/' + settings.package + '/' + filename;
}
function generateGithubUrl(settings) {
var version = (settings.version) || 'master';
var filename = version + '.tar.gz';
return 'https://github.com/' + settings.organization + '/' + settings.package + '/archive/' + filename;
}
function parse() {
var parts;
var settings = {
let parts;
let settings = {
timeout: 0,
silent: false,
quiet: false,
@ -78,7 +71,6 @@ module.exports = function (options) {
settings.version = parts.shift();
settings.urls.push(generateDownloadUrl(settings));
settings.urls.push(generateGithubUrl(settings));
}
}
@ -100,6 +92,7 @@ module.exports = function (options) {
if (settings.package) {
settings.pluginPath = resolve(settings.pluginDir, settings.package);
settings.workingPath = resolve(settings.pluginDir, '.plugin.installing');
settings.tempArchiveFile = resolve(settings.workingPath, 'archive.part');
}
return settings;
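The `generateDownloadUrl` logic above defaults the version to `latest` and builds the `download.elastic.co` URL shown in the CLI help text. A standalone sketch of that logic (the destructured parameter shape is illustrative):

```javascript
// Standalone sketch of the download-url generation above; mirrors the
// `-i username/sample` examples in the CLI help text.
function generateDownloadUrl({ organization, package: pkg, version }) {
  const v = version || 'latest';
  const filename = `${pkg}-${v}.tar.gz`;
  return `https://download.elastic.co/${organization}/${pkg}/${filename}`;
}

generateDownloadUrl({ organization: 'username', package: 'sample' });
// → 'https://download.elastic.co/username/sample/sample-latest.tar.gz'
```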

View file

@ -60,7 +60,7 @@ describe('plugins/elasticsearch', function () {
});
});
it('should be created with 1 shard and 1 replica', function () {
it('should be created with 1 shard and default replica', function () {
var fn = createKibanaIndex(server);
return fn.then(function () {
var params = client.indices.create.args[0][0];
@ -71,22 +71,7 @@ describe('plugins/elasticsearch', function () {
expect(params.body.settings)
.to.have.property('number_of_shards', 1);
expect(params.body.settings)
.to.have.property('number_of_replicas', 1);
});
});
it('should be created with 1 shard and 1 replica', function () {
var fn = createKibanaIndex(server);
return fn.then(function () {
var params = client.indices.create.args[0][0];
expect(params)
.to.have.property('body');
expect(params.body)
.to.have.property('settings');
expect(params.body.settings)
.to.have.property('number_of_shards', 1);
expect(params.body.settings)
.to.have.property('number_of_replicas', 1);
.to.not.have.property('number_of_replicas');
});
});
@ -137,4 +122,3 @@ describe('plugins/elasticsearch', function () {
});
});

View file

@ -85,7 +85,7 @@ describe('plugins/elasticsearch', function () {
expect(plugin.status.yellow.args[0][0]).to.be('Waiting for Elasticsearch');
sinon.assert.calledOnce(plugin.status.red);
expect(plugin.status.red.args[0][0]).to.be(
'Unable to connect to Elasticsearch at http://localhost:9210. Retrying in 2.5 seconds.'
'Unable to connect to Elasticsearch at http://localhost:9210.'
);
sinon.assert.calledTwice(client.ping);
sinon.assert.calledOnce(client.nodes.info);
@ -109,7 +109,7 @@ describe('plugins/elasticsearch', function () {
expect(plugin.status.yellow.args[0][0]).to.be('Waiting for Elasticsearch');
sinon.assert.calledOnce(plugin.status.red);
expect(plugin.status.red.args[0][0]).to.be(
'Elasticsearch is still initializing the kibana index... Trying again in 2.5 second.'
'Elasticsearch is still initializing the kibana index.'
);
sinon.assert.calledOnce(client.ping);
sinon.assert.calledOnce(client.nodes.info);

View file

@ -76,7 +76,7 @@ describe('plugins/elasticsearch', function () {
testRoute({
method: 'POST',
url: '/elasticsearch/.kibana',
payload: {settings: { number_of_shards: 1, number_of_replicas: 1 }},
payload: {settings: { number_of_shards: 1 }},
statusCode: 200
});

View file

@ -1,225 +0,0 @@
var _ = require('lodash');
var expect = require('expect.js');
var src = require('requirefrom')('src');
var fromRoot = src('utils/fromRoot');
var KbnServer = src('server/KbnServer');
var validateRequest = require('../validate');
describe('plugins/elasticsearch', function () {
var kbnServer;
var server;
var config;
before(function () {
kbnServer = new KbnServer({
server: { autoListen: false },
plugins: { scanDirs: [ fromRoot('src/plugins') ] },
logging: { quiet: true },
optimize: { enabled: false },
elasticsearch: {
url: 'http://localhost:9210'
}
});
return kbnServer.ready()
.then(function () {
server = kbnServer.server;
config = kbnServer.config;
});
});
after(function () {
return kbnServer.close();
});
describe('lib/validate', function () {
function del(path, body, valid) {
run('delEte', path, body, valid);
run('delete', path, body, valid);
}
function send(path, body, valid) {
run('POST', path, body, valid);
run('post', path, body, valid);
run('PUT', path, body, valid);
run('put', path, body, valid);
}
function run(method, path, body, valid) {
if (typeof body === 'boolean') {
valid = body;
body = null;
}
if (_.isArray(body)) body = body.map(JSON.stringify).join('\n') + '\n';
if (_.isObject(body)) body = JSON.stringify(body);
var pass = false;
try {
validateRequest(server, {
method: method.toLowerCase(),
path: path,
payload: body
});
pass = true;
} catch (e) {} // eslint-disable-line no-empty
if (pass !== Boolean(valid)) {
var msg = 'Expected ' + method + ' ' +
path + ' ' + (body ? 'with body ' : '') +
'to ' + (!valid ? 'not ' : '') + 'validate';
if (body) {
msg += ' ' + body;
}
throw new Error(msg);
}
}
describe('index management', function () {
it('allows creating kibana index', function () {
send('/' + config.get('kibana.index'), true);
});
it('allows deleting the kibana index', function () {
del('/' + config.get('kibana.index'), true);
});
it('blocks creating a non-kibana index', function () {
send('/app-index', false);
});
it('blocks deleting non-kibana indices', function () {
del('/app-data', false);
});
});
describe('doc management', function () {
it('allows indexing to the kibana index', function () {
send('/' + config.get('kibana.index'), true);
send('/' + config.get('kibana.index') + '/index-patterns', true);
send('/' + config.get('kibana.index') + '/index-patterns/pattern-id', true);
});
it('allows deleting kibana documents', function () {
del('/' + config.get('kibana.index') + '/index-patterns', true);
del('/' + config.get('kibana.index') + '/index-patterns/pattern-id', true);
});
});
describe('allows any destructive non-bulk requests against kibana index', function () {
it('refresh', function () {
send('/' + config.get('kibana.index') + '/_refresh', true);
});
it('delete', function () {
del('/' + config.get('kibana.index') + '/pasta/lasagna', true);
});
});
describe('assorted methods that are non-destructive', function () {
it('validate', function () {
run('GET', '/_search?search_type=count', true);
run('GET', '/index/type/id', true);
run('GET', '/index/type/_mapping/field/field1', true);
run('GET', '/_aliases', true);
run('GET', '/_nodes/', true);
run('HEAD', '/', true);
run('HEAD', '/' + config.get('kibana.index'), true);
run('HEAD', '/other-index', true);
run('GET', '/_cluster/health', true);
run('POST', '/' + config.get('kibana.index') + '/__notRealIndex__/_validate/query?q=foo:bar', true);
run('POST', '/_validate', true);
run('POST', '/_search', true);
});
});
describe('bulk indexing', function () {
it('valid', function () {
send('/_bulk', [
{ create: { _index: config.get('kibana.index'), _type: 'index-pattern' } },
{ fields: [] },
{ create: { _index: config.get('kibana.index'), _type: 'vis' } },
{ aggs: [] }
], true);
send('/' + config.get('kibana.index') + '/_bulk', [
// implicit index
{ create: { _type: 'index-pattern' } },
{ fields: [] },
// explicit index
{ create: { _index: config.get('kibana.index'), _type: 'vis' } },
{ aggs: [] }
], true);
});
it('rejects bulks including even one other index', function () {
send('/' + config.get('kibana.index') + '/_bulk', [
// implicit index
{ create: { _type: 'index-pattern' } },
{ fields: [] },
// explicit index
{ create: { _index: 'app-data', _type: 'vis' } },
{ aggs: [] }
], false);
});
it('rejects malformed bulk bodies', function () {
send('/_bulk', '{}\n{ "_index": "john" }\n', false);
send('/_bulk', '{}\n{}\n', false);
send('/_bulk', '{ "field": "value" }', false);
send('/_bulk', '{ "field": "v', false);
});
});
describe('msearch', function () {
it('requires a bulk-formatted body', function () {
send('/_msearch', false);
send('/_msearch', '{}', false);
send('/_msearch', '{}\n{}\n', true);
send('/_msearch', '{}\n{}\n{}\n', false);
});
it('allows searching any index', function () {
send('/app-index/_msearch', [
{},
{ query: { match_all: {} } }
], true);
send('/app-index/data-type/_msearch', [
{},
{ query: { match_all: {} } }
], true);
send('/_msearch', [
{ _index: 'app-index', _type: 'data-type' },
{ query: { match_all: {} } },
{ _index: 'IT-index', _type: 'logs' },
{ query: { match_all: {} } },
{ _index: 'L33t', _type: '?' },
{ query: { match_all: {} } },
], true);
});
});
describe('mget', function () {
it('requires a valid json body', function () {
send('/_mget', false);
send('/_mget', '{}', true);
send('/_mget', '{}\n{}\n', false);
});
it('allows reading from any index', function () {
send('/app-index/_mget', { docs: { match_all: {} } }, true);
send('/app-index/data-type/_mget', { docs: [ {} ] }, true);
});
});
});
});

View file

@ -14,8 +14,7 @@ module.exports = function (server) {
index: index,
body: {
settings: {
number_of_shards: 1,
number_of_replicas: 1
number_of_shards: 1
},
mappings: {
config: {

View file

@ -19,7 +19,7 @@ module.exports = function (plugin, server) {
return client.ping({ requestTimeout: 1500 }).catch(function (err) {
if (!(err instanceof NoConnections)) throw err;
plugin.status.red(format('Unable to connect to Elasticsearch at %s. Retrying in 2.5 seconds.', config.get('elasticsearch.url')));
plugin.status.red(format('Unable to connect to Elasticsearch at %s.', config.get('elasticsearch.url')));
return Promise.delay(2500).then(waitForPong);
});
@ -42,7 +42,7 @@ module.exports = function (plugin, server) {
// If status === "red" that means that index(es) were found
// but the shards are not ready for queries
if (resp.status === 'red') {
plugin.status.red('Elasticsearch is still initializing the kibana index... Trying again in 2.5 second.');
plugin.status.red('Elasticsearch is still initializing the kibana index.');
return Promise.delay(2500).then(waitForShards);
}

View file

@ -1,112 +0,0 @@
var _ = require('lodash');
var parse = require('url').parse;
validate.Fail = function (index) {
this.message = 'Kibana only supports modifying the "' + index +
'" index. Requests that might modify other indices are not sent to elasticsearch.';
};
validate.BadIndex = function (index) {
validate.Fail.call(this, index);
this.message = 'Bad index "' + index + '" in request. ' + this.message;
};
function validate(server, req) {
var config = server.config();
var method = req.method.toUpperCase();
if (method === 'GET' || method === 'HEAD') return true;
var segments = _.compact(parse(req.path).pathname.split('/'));
var maybeIndex = _.first(segments);
var maybeMethod = _.last(segments);
var add = (method === 'POST' || method === 'PUT');
var rem = (method === 'DELETE');
// everything below this point assumes a destructive request of some sort
if (!add && !rem) throw new validate.Fail(config.get('kibana.index'));
var bodyStr = String(req.payload);
var jsonBody = bodyStr && parseJson(bodyStr);
var bulkBody = bodyStr && parseBulk(bodyStr);
// methods that accept standard json bodies
var maybeMGet = ('_mget' === maybeMethod && add && jsonBody);
var maybeSearch = ('_search' === maybeMethod && add);
var maybeValidate = ('_validate' === maybeMethod && add);
// methods that accept bulk bodies
var maybeBulk = ('_bulk' === maybeMethod && add && bulkBody);
var maybeMsearch = ('_msearch' === maybeMethod && add && bulkBody);
// indication that this request is against kibana
var maybeKibanaIndex = (maybeIndex === config.get('kibana.index'));
if (!maybeBulk) validateNonBulkDestructive();
else validateBulkBody(bulkBody);
return true;
function parseJson(str) {
try {
return JSON.parse(str);
} catch (e) {
return;
}
}
function parseBulk(str) {
var parts = str.split(/\r?\n/);
var finalLine = parts.pop();
var evenJsons = (parts.length % 2 === 0);
if (finalLine !== '' || !evenJsons) return;
var body = new Array(parts.length);
for (var i = 0; i < parts.length; i++) {
var part = parseJson(parts[i]);
if (!part) throw new validate.Fail(config.get('kibana.index'));
body[i] = part;
}
return body;
}
function stringifyBulk(body) {
return body.map(JSON.stringify).join('\n') + '\n';
}
function validateNonBulkDestructive() {
// allow any destructive request against the kibana index
if (maybeKibanaIndex) return;
// allow json bodies sent to _mget _search and _validate
if (maybeMGet || maybeSearch || maybeValidate) return;
// allow bulk bodies sent to _msearch
if (maybeMsearch) return;
throw new validate.Fail(config.get('kibana.index'));
}
function validateBulkBody(toValidate) {
while (toValidate.length) {
let header = toValidate.shift();
let body = toValidate.shift();
let op = _.keys(header).join('');
let meta = header[op];
if (!meta) throw new validate.Fail(config.get('kibana.index'));
let index = meta._index || maybeIndex;
if (index !== config.get('kibana.index')) {
throw new validate.BadIndex(index);
}
}
}
}
module.exports = validate;

View file

@ -17,7 +17,8 @@ module.exports = function (kibana) {
main: 'plugins/kibana/kibana',
uses: [
'visTypes',
'spyModes'
'spyModes',
'fieldFormats'
],
autoload: kibana.autoload.require.concat(

View file

@ -0,0 +1,2 @@
rules:
no-console: 2

View file

@ -25,7 +25,8 @@
<span ng-bind="error"></span>
</div>
<visualize ng-switch-when="visualization"
<visualize
ng-switch-when="visualization"
vis="savedObj.vis"
search-source="savedObj.searchSource"
show-spy-panel="chrome.getVisible()"

View file

@ -10,6 +10,7 @@ define(function (require) {
require('ui/config');
require('ui/notify');
require('ui/typeahead');
require('ui/share');
require('plugins/kibana/dashboard/directives/grid');
require('plugins/kibana/dashboard/components/panel/panel');
@ -233,15 +234,7 @@ define(function (require) {
ui: $state.options,
save: $scope.save,
addVis: $scope.addVis,
addSearch: $scope.addSearch,
shareData: function () {
return {
link: $location.absUrl(),
// This sucks, but seems like the cleanest way. Uhg.
embed: '<iframe src="' + $location.absUrl().replace('?', '?embed&') +
'" height="600" width="800"></iframe>'
};
}
addSearch: $scope.addSearch
};
init();

View file

@ -1,21 +1,4 @@
<form role="form" class="vis-share">
<p>
<div class="input-group">
<label>
Embed this dashboard
<small>Add to your html source. Note all clients must still be able to access kibana</small>
</label>
<div class="form-control" disabled>{{opts.shareData().embed}}</div>
</div>
</p>
<p>
<div class="input-group">
<label>
Share a link
</label>
<div class="form-control" disabled>{{opts.shareData().link}}</div>
</div>
</p>
</form>
<share
object-type="dashboard"
object-id="{{opts.dashboard.id}}">
</share>

View file

@ -11,7 +11,6 @@ define(function (require) {
require('ui/doc_table');
require('ui/visualize');
require('ui/notify');
require('ui/timepicker');
require('ui/fixedScroll');
require('ui/directives/validate_json');
require('ui/filters/moment');
@ -20,6 +19,7 @@ define(function (require) {
require('ui/state_management/app_state');
require('ui/timefilter');
require('ui/highlight/highlight_tags');
require('ui/share');
var app = require('ui/modules').get('apps/discover', [
'kibana/notify',
@ -91,7 +91,8 @@ define(function (require) {
// config panel templates
$scope.configTemplate = new ConfigTemplate({
load: require('plugins/kibana/discover/partials/load_search.html'),
save: require('plugins/kibana/discover/partials/save_search.html')
save: require('plugins/kibana/discover/partials/save_search.html'),
share: require('plugins/kibana/discover/partials/share_search.html')
});
$scope.timefilter = timefilter;
@ -485,6 +486,7 @@ define(function (require) {
}
$scope.vis = new Vis($scope.indexPattern, {
title: savedSearch.title,
type: 'histogram',
params: {
addLegend: false,
@ -492,7 +494,7 @@ define(function (require) {
},
listeners: {
click: function (e) {
console.log(e);
notify.log(e);
timefilter.time.from = moment(e.point.x);
timefilter.time.to = moment(e.point.x + e.data.ordered.interval);
timefilter.time.mode = 'absolute';

View file

@ -50,6 +50,16 @@
<i aria-hidden="true" class="fa fa-folder-open-o"></i>
</button>
</kbn-tooltip>
<kbn-tooltip text="Share" placement="bottom" append-to-body="1">
<button
aria-label="Share Search"
aria-haspopup="true"
aria-expanded="{{ configTemplate.is('share') }}"
ng-class="{active: configTemplate.is('share')}"
ng-click="configTemplate.toggle('share');">
<i aria-hidden="true" class="fa fa-external-link"></i>
</button>
</kbn-tooltip>
</div>
</navbar>
@ -176,7 +186,13 @@
</header>
<visualize ng-if="vis && rows.length != 0" vis="vis" ui-state="uiState" es-resp="mergedEsResp" search-source="searchSource"></visualize>
<visualize
ng-if="vis && rows.length != 0"
vis="vis"
ui-state="uiState"
es-resp="mergedEsResp"
search-source="searchSource">
</visualize>
</div>
<div class="discover-table" fixed-scroll>

View file

@ -0,0 +1,5 @@
<share
object-type="search"
object-id="{{opts.savedSearch.id}}"
allow-embed="false">
</share>

View file

@ -3,6 +3,7 @@ require('plugins/kibana/visualize/index');
require('plugins/kibana/dashboard/index');
require('plugins/kibana/settings/index');
require('plugins/kibana/doc/index');
require('ui/timepicker');
var moment = require('moment-timezone');
@ -12,6 +13,8 @@ var modules = require('ui/modules');
var kibanaLogoUrl = require('ui/images/kibana.svg');
routes.enable();
routes
.otherwise({
redirectTo: `/${chrome.getInjected('kbnDefaultAppId', 'discover')}`

View file

@ -73,6 +73,36 @@
</small>
</div>
<div class="form-group" ng-if="canExpandIndices()">
<label>
<input ng-model="index.notExpandable" type="checkbox">
Do not expand index pattern when searching <small>(Not recommended)</small>
</label>
<div ng-if="index.notExpandable" class="alert alert-info">
This index pattern will be queried directly rather than being
expanded into more performant searches against individual indices.
Elasticsearch will receive a query against <em>{{index.name}}</em>
and will have to search through all matching indices regardless
of whether they have data that matches the current time range.
</div>
<p class="help-block">
By default, searches against any time-based index pattern that
contains a wildcard will automatically be expanded to query only
the indices that contain data within the currently selected time
range.
</p>
<p class="help-block">
Searching against the index pattern <em>logstash-*</em> will
actually query Elasticsearch for the specific matching indices
(e.g. <em>logstash-2015.12.21</em>) that fall within the current
time range.
</p>
</div>
<section>
<div class="alert alert-danger" ng-repeat="err in index.patternErrors">
{{err}}
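The help text added above describes how a time-based wildcard pattern like `logstash-*` is expanded into the specific daily indices inside the selected time range. A rough, hypothetical sketch of that expansion (not the actual Kibana implementation):

```javascript
// Hypothetical sketch of the expansion the help text describes: a daily,
// time-based wildcard pattern is replaced by the concrete indices that fall
// inside the selected time range.
function expandDaily(pattern, from, to) {
  const indices = [];
  const d = new Date(from.getTime());
  while (d <= to) {
    const yyyy = d.getUTCFullYear();
    const mm = String(d.getUTCMonth() + 1).padStart(2, '0');
    const dd = String(d.getUTCDate()).padStart(2, '0');
    indices.push(pattern.replace('*', `${yyyy}.${mm}.${dd}`));
    d.setUTCDate(d.getUTCDate() + 1);
  }
  return indices;
}

expandDaily('logstash-*',
  new Date(Date.UTC(2015, 11, 20)),
  new Date(Date.UTC(2015, 11, 21)));
// → ['logstash-2015.12.20', 'logstash-2015.12.21']
```

With the new `notExpandable` option checked, this step is skipped and Elasticsearch receives the wildcard pattern itself, searching every matching index regardless of time range.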

View file

@ -24,6 +24,7 @@ define(function (require) {
isTimeBased: true,
nameIsPattern: false,
notExpandable: false,
sampleCount: 5,
nameIntervalOptions: intervals,
@ -33,6 +34,12 @@ define(function (require) {
index.nameInterval = _.find(index.nameIntervalOptions, { name: 'daily' });
index.timeField = null;
$scope.canExpandIndices = function () {
// to maximize performance in the digest cycle, move from the least
// expensive operation to most
return index.isTimeBased && !index.nameIsPattern && _.includes(index.name, '*');
};
$scope.refreshFieldList = function () {
fetchFieldList().then(updateFieldList);
};
@ -50,6 +57,10 @@ define(function (require) {
}
}
if (index.notExpandable && $scope.canExpandIndices()) {
indexPattern.notExpandable = true;
}
// fetch the fields
return indexPattern.create()
.then(function (id) {

View file

@ -22,6 +22,10 @@
<div ng-if="indexPattern.timeFieldName && indexPattern.intervalName" class="alert alert-info">
This index uses a <strong>Time-based index pattern</strong> which repeats <span ng-bind="::indexPattern.getInterval().display"></span>
</div>
<div ng-if="!indexPattern.canExpandIndices()" class="alert alert-info">
This index pattern is set to be queried directly rather than being
expanded into more performant searches against individual indices.
</div>
<div ng-if="conflictFields.length" class="alert alert-warning">
<strong>Mapping conflict!</strong> {{conflictFields.length > 1 ? conflictFields.length : 'A'}} field{{conflictFields.length > 1 ? 's' : ''}} {{conflictFields.length > 1 ? 'are' : 'is'}} defined as several types (string, integer, etc) across the indices that match this pattern. You may still be able to use these conflict fields in parts of Kibana, but they will be unavailable for functions that require Kibana to know their type. Correcting this issue will require reindexing your data.
</div>
View file
@@ -44,8 +44,8 @@ define(function (require) {
rowScopes.push(rowScope);
return [
field.name,
field.script,
_.escape(field.name),
_.escape(field.script),
_.get($scope.indexPattern, ['fieldFormatMap', field.name, 'type', 'title']),
{
markup: controlsHtml,
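The hunk above wraps field names and scripts in lodash's `_.escape` before they are rendered into table markup, so user-controlled values cannot inject live HTML. A minimal sketch of the idea, with a hand-rolled `escapeHtml` standing in for `_.escape` (hypothetical helper, for illustration only):

```javascript
// Sketch of HTML-escaping untrusted field values before building markup,
// mirroring what lodash's _.escape does for the characters & < > " '.
function escapeHtml(value) {
  return String(value).replace(/[&<>"']/g, function (ch) {
    return { '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;' }[ch];
  });
}

// A field name like this would otherwise be interpolated as live markup.
var row = [
  escapeHtml('<img src=x onerror=alert(1)>'),
  escapeHtml("doc['bytes'].value")
];
console.log(row[0]); // &lt;img src=x onerror=alert(1)&gt;
```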
View file
@@ -101,6 +101,7 @@ define(function (require) {
};
function retrieveAndExportDocs(objs) {
if (!objs.length) return notify.error('No saved objects to export.');
es.mget({
index: kbnIndex,
body: {docs: objs.map(transformToMget)}
View file
@@ -138,7 +138,13 @@
</div>
</div>
<visualize vis="vis" ui-state="uiState" show-spy-panel="chrome.getVisible()" search-source="savedVis.searchSource" editable-vis="editableVis"></visualize>
<visualize
vis="vis"
ui-state="uiState"
show-spy-panel="chrome.getVisible()"
editable-vis="editableVis"
search-source="savedVis.searchSource">
</visualize>
</div>
</div>
View file
@@ -6,6 +6,7 @@ define(function (require) {
require('ui/visualize');
require('ui/collapsible_sidebar');
require('ui/share');
require('ui/routes')
.when('/visualize/create', {
@@ -234,15 +235,6 @@ define(function (require) {
}, notify.fatal);
};
$scope.shareData = function () {
return {
link: $location.absUrl(),
// This sucks, but seems like the cleanest way. Uhg.
embed: '<iframe src="' + $location.absUrl().replace('?', '?embed&') +
'" height="600" width="800"></iframe>'
};
};
$scope.unlink = function () {
if (!$state.linked) return;
View file
@@ -1,22 +1,4 @@
<form role="form" class="vis-share">
<p>
<div class="form-group">
<label>
Embed this visualization.
<small>Add to your html source. Note all clients must still be able to access kibana</small>
</label>
<div class="form-control" disabled>{{conf.shareData().embed}}</div>
</div>
</p>
<p>
<div class="form-group">
<label>
Share a link
</label>
<div class="form-control" disabled>{{conf.shareData().link}}</div>
</div>
</p>
</form>
<share
object-type="visualization"
object-id="{{conf.savedVis.id}}">
</share>
View file
@@ -103,6 +103,7 @@ define(function (require) {
self.visState = Vis.convertOldState(self.typeName, JSON.parse(self.stateJSON));
}
self.visState.title = self.title;
self.vis = new Vis(
self.searchSource.get('index'),
self.visState
@@ -115,6 +116,7 @@ define(function (require) {
var self = this;
self.vis.indexPattern = self.searchSource.get('index');
self.visState.title = self.title;
self.vis.setState(self.visState);
};
View file
@@ -1,6 +1,6 @@
<div ng-controller="KbnMetricVisController" class="metric-vis">
<div class="metric-container" ng-repeat="metric in metrics">
<div class="metric-value" ng-style="{'font-size': vis.params.fontSize+'pt'}">{{metric.value}}</div>
<div>{{metric.label}}</div>
</div>
<div class="metric-container" ng-repeat="metric in metrics">
<div class="metric-value" ng-style="{'font-size': vis.params.fontSize+'pt'}">{{metric.value}}</div>
<div>{{metric.label}}</div>
</div>
</div>
View file
@@ -23,6 +23,7 @@ define(function (require) {
template: require('plugins/metric_vis/metric_vis.html'),
params: {
defaults: {
handleNoResults: true,
fontSize: 60
},
editor: require('plugins/metric_vis/metric_vis_params.html')
View file
@@ -1,4 +1,5 @@
define(function (require) {
var _ = require('lodash');
// get the kibana/metric_vis module, and make sure that it requires the "kibana" module if it
// didn't already
var module = require('ui/modules').get('kibana/metric_vis', ['kibana']);
@@ -8,13 +9,21 @@ define(function (require) {
var metrics = $scope.metrics = [];
function isInvalid(val) {
return _.isUndefined(val) || _.isNull(val) || _.isNaN(val);
}
$scope.processTableGroups = function (tableGroups) {
tableGroups.tables.forEach(function (table) {
table.columns.forEach(function (column, i) {
var fieldFormatter = table.aggConfig(column).fieldFormatter();
var value = table.rows[0][i];
value = isInvalid(value) ? '?' : fieldFormatter(value);
metrics.push({
label: column.title,
value: fieldFormatter(table.rows[0][i])
value: value
});
});
});
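The `isInvalid` guard added above keeps undefined, null, and NaN metric values from reaching the field formatter; they render as `?` instead. A standalone sketch of that behavior, using plain comparisons in place of lodash's `_.isUndefined`/`_.isNull`/`_.isNaN` (stand-in names, not the exact module code):

```javascript
// Sketch of the guard in metric_vis_controller.js: invalid metric values
// render as '?' instead of being passed to the field formatter.
function isInvalid(val) {
  return val === undefined || val === null || (typeof val === 'number' && isNaN(val));
}

function formatMetric(value, fieldFormatter) {
  return isInvalid(value) ? '?' : fieldFormatter(value);
}

console.log(formatMetric(undefined, String)); // ?
console.log(formatMetric(1024, String));      // 1024
```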
View file
@@ -1 +1,5 @@
<kbn-agg-table table="table" per-page="editableVis.params.spyPerPage"></kbn-agg-table>
<kbn-agg-table
table="table"
export-title="vis.title"
per-page="editableVis.params.spyPerPage">
</kbn-agg-table>
View file
@@ -5,6 +5,10 @@
</div>
<div ng-if="tableGroups" class="table-vis-container">
<kbn-agg-table-group group="tableGroups" per-page="vis.params.perPage"></kbn-agg-table-group>
<kbn-agg-table-group
group="tableGroups"
export-title="vis.title"
per-page="vis.params.perPage">
</kbn-agg-table-group>
</div>
</div>
</div>
View file
@@ -32,7 +32,6 @@ define(function (require) {
if (hasSomeRows) {
$scope.tableGroups = tableGroups;
}
});
});
View file
@@ -28,6 +28,7 @@ module.exports = () => Joi.object({
server: Joi.object({
host: Joi.string().hostname().default('0.0.0.0'),
port: Joi.number().default(5601),
maxPayloadBytes: Joi.number().default(1048576),
autoListen: Joi.boolean().default(true),
defaultRoute: Joi.string(),
basePath: Joi.string().default('').allow('').regex(/(^$|^\/.*[^\/]$)/, `start with a slash, don't end with one`),
View file
@@ -10,6 +10,8 @@ module.exports = function (kbnServer, server, config) {
server = kbnServer.server = new Hapi.Server();
const shortUrlLookup = require('./short_url_lookup')(server);
// Create a new connection
var connectionOptions = {
host: config.get('server.host'),
@@ -18,7 +20,10 @@
strictHeader: false
},
routes: {
cors: config.get('server.cors')
cors: config.get('server.cors'),
payload: {
maxBytes: config.get('server.maxPayloadBytes')
}
}
};
@@ -154,5 +159,23 @@ module.exports = function (kbnServer, server, config) {
}
});
server.route({
method: 'GET',
path: '/goto/{urlId}',
handler: async function (request, reply) {
const url = await shortUrlLookup.getUrl(request.params.urlId);
reply().redirect(url);
}
});
server.route({
method: 'POST',
path: '/shorten',
handler: async function (request, reply) {
const urlId = await shortUrlLookup.generateUrlId(request.payload.url);
reply(urlId);
}
});
return kbnServer.mixin(require('./xsrf'));
};
View file
@@ -0,0 +1,101 @@
const crypto = require('crypto');
export default function (server) {
async function updateMetadata(urlId, urlDoc) {
const client = server.plugins.elasticsearch.client;
try {
await client.update({
index: '.kibana',
type: 'url',
id: urlId,
body: {
doc: {
'accessDate': new Date(),
'accessCount': urlDoc._source.accessCount + 1
}
}
});
} catch (err) {
server.log('Warning: Error updating url metadata', err);
//swallow errors. It isn't critical if there is no update.
}
}
async function getUrlDoc(urlId) {
const urlDoc = await new Promise((resolve, reject) => {
const client = server.plugins.elasticsearch.client;
client.get({
index: '.kibana',
type: 'url',
id: urlId
})
.then(response => {
resolve(response);
})
.catch(err => {
resolve();
});
});
return urlDoc;
}
async function createUrlDoc(url, urlId) {
const newUrlId = await new Promise((resolve, reject) => {
const client = server.plugins.elasticsearch.client;
client.index({
index: '.kibana',
type: 'url',
id: urlId,
body: {
url,
'accessCount': 0,
'createDate': new Date(),
'accessDate': new Date()
}
})
.then(response => {
resolve(response._id);
})
.catch(err => {
reject(err);
});
});
return newUrlId;
}
function createUrlId(url) {
const urlId = crypto.createHash('md5')
.update(url)
.digest('hex');
return urlId;
}
return {
async generateUrlId(url) {
const urlId = createUrlId(url);
const urlDoc = await getUrlDoc(urlId);
if (urlDoc) return urlId;
return createUrlDoc(url, urlId);
},
async getUrl(urlId) {
try {
const urlDoc = await getUrlDoc(urlId);
if (!urlDoc) throw new Error('Requested shortened url does not exist in kibana index');
updateMetadata(urlId, urlDoc);
return urlDoc._source.url;
} catch (err) {
return '/';
}
}
};
};
View file
@@ -18,7 +18,7 @@ class UiApp {
this.icon = this.spec.icon;
this.hidden = this.spec.hidden;
this.autoloadOverrides = this.spec.autoload;
this.templateName = this.spec.templateName || 'uiApp';
this.templateName = this.spec.templateName || 'ui_app';
this.url = `${spec.urlBasePath || ''}${this.spec.url || `/app/${this.id}`}`;
// once this resolves, no reason to run it again
@@ -32,6 +32,7 @@ class UiApp {
return _.chain([
this.autoloadOverrides || autoload.require,
this.uiExports.find(_.get(this, 'spec.uses', [])),
this.uiExports.find(['chromeNavControls']),
])
.flatten()
.uniq()
View file
@@ -60,6 +60,7 @@ class UiExports {
case 'visTypes':
case 'fieldFormats':
case 'spyModes':
case 'chromeNavControls':
return (plugin, spec) => {
this.aliases[type] = _.union(this.aliases[type] || [], spec);
};
View file
@@ -63,7 +63,7 @@ exports.reload = function () {
'ui/stringify/register',
'ui/styleCompile',
'ui/timefilter',
'ui/timepicker',
'ui/timepicker', // TODO: remove this for 5.0
'ui/tooltip',
'ui/typeahead',
'ui/url',
src/ui/public/.eslintrc Normal file
View file
@@ -0,0 +1,2 @@
rules:
no-console: 2
View file
@@ -66,6 +66,7 @@ define(function (require) {
Vis.prototype.type = 'histogram';
Vis.prototype.setState = function (state) {
this.title = state.title || '';
this.type = state.type || this.type;
if (_.isString(this.type)) this.type = visTypes.byName[this.type];
@@ -80,6 +81,7 @@
Vis.prototype.getState = function () {
return {
title: this.title,
type: this.type.name,
params: this.params,
aggs: this.aggs.map(function (agg) {
View file
@@ -180,5 +180,22 @@ describe('AggTable Directive', function () {
});
expect(call.args[1]).to.be('somefilename.csv');
});
it('should use the export-title attribute', function () {
var expected = 'export file name';
var $el = $compile(`<kbn-agg-table table="table" export-title="exportTitle">`)($scope);
$scope.$digest();
var $tableScope = $el.isolateScope();
var aggTable = $tableScope.aggTable;
$tableScope.table = {
columns: [],
rows: []
};
$tableScope.exportTitle = expected;
$scope.$digest();
expect(aggTable.csv.filename).to.equal(`${expected}.csv`);
});
});
});
View file
@@ -13,7 +13,8 @@ define(function (require) {
template: require('ui/agg_table/agg_table.html'),
scope: {
table: '=',
perPage: '=?'
perPage: '=?',
exportTitle: '=?'
},
controllerAs: 'aggTable',
compile: function ($el) {
@@ -75,7 +76,7 @@ define(function (require) {
return;
}
self.csv.filename = (table.title() || 'table') + '.csv';
self.csv.filename = ($scope.exportTitle || table.title() || 'table') + '.csv';
$scope.rows = table.rows;
$scope.formattedColumns = table.columns.map(function (col, i) {
var agg = $scope.table.aggConfig(col);
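The filename change above gives the CSV export a fallback chain: the new `export-title` binding first, then the table's own title, then a generic name. Sketched as a standalone function (hypothetical name, extracted from the directive for illustration):

```javascript
// Sketch of the CSV filename fallback added to the agg table directive:
// export-title attribute, then the table's own title, then 'table'.
function csvFilename(exportTitle, tableTitle) {
  return (exportTitle || tableTitle || 'table') + '.csv';
}

console.log(csvFilename('Traffic by host', null)); // Traffic by host.csv
console.log(csvFilename(null, 'split by ip'));     // split by ip.csv
console.log(csvFilename(null, null));              // table.csv
```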
View file
@@ -10,7 +10,12 @@
<tr>
<td>
<kbn-agg-table-group ng-if="table.tables" group="table" per-page="perPage"></kbn-agg-table-group>
<kbn-agg-table ng-if="table.rows" table="table" per-page="perPage"></kbn-agg-table>
<kbn-agg-table
ng-if="table.rows"
table="table"
export-title="exportTitle"
per-page="perPage">
</kbn-agg-table>
</td>
</tr>
</tbody>
@@ -28,7 +33,12 @@
<tr>
<td ng-repeat="table in columns">
<kbn-agg-table-group ng-if="table.tables" group="table" per-page="perPage"></kbn-agg-table-group>
<kbn-agg-table ng-if="table.rows" table="table" per-page="perPage"></kbn-agg-table>
<kbn-agg-table
ng-if="table.rows"
table="table"
export-title="exportTitle"
per-page="perPage">
</kbn-agg-table>
</td>
</tr>
</tbody>
View file
@@ -10,7 +10,8 @@ define(function (require) {
template: require('ui/agg_table/agg_table_group.html'),
scope: {
group: '=',
perPage: '=?'
perPage: '=?',
exportTitle: '=?'
},
compile: function ($el) {
// Use the compile function from the RecursionHelper,
@@ -25,7 +26,7 @@ define(function (require) {
var firstTable = group.tables[0];
var params = firstTable.aggConfig && firstTable.aggConfig.params;
// render groups that have Table children as if they were rows, because itteration is cleaner
// render groups that have Table children as if they were rows, because iteration is cleaner
var childLayout = (params && !params.row) ? 'columns' : 'rows';
$scope[childLayout] = group.tables;
View file
@@ -9,7 +9,7 @@
</span>
<div class="hintbox" ng-show="showAnalyzedFieldWarning && agg.params.field.analyzed">
<p>
<strong>Careful!</strong> The field selected contains analyzed strings. Analyzed strings are highly unique and can use a lot of memory to visualize. Values such as <i>foo-bar</i> will be broken into <i>foo</i> and <i>bar</i>. See <a href="http://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-core-types.html" target="_blank">Mapping Core Types</a> for more information on setting this field as <i>not_analyzed</i>
<strong>Careful!</strong> The field selected contains analyzed strings. Analyzed strings are highly unique and can use a lot of memory to visualize. Values such as <i>foo-bar</i> will be broken into <i>foo</i> and <i>bar</i>. See <a href="http://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-types.html" target="_blank">Mapping Types</a> for more information on setting this field as <i>not_analyzed</i>
</p>
<p ng-show="indexedFields.byName[agg.params.field.name + '.raw'].analyzed == false">
View file
@@ -19,10 +19,17 @@ define(function (require) {
/**
* Read the values for this metric from the
* @param {[type]} bucket [description]
* @return {[type]} [description]
* @return {*} [description]
*/
MetricAggType.prototype.getValue = function (agg, bucket) {
return bucket[agg.id].value;
// Metric types where an empty set equals `zero`
var isSettableToZero = ['cardinality', 'sum'].indexOf(agg.__type.name) !== -1;
// Return proper values when no buckets are present
// `Count` handles empty sets properly
if (!bucket[agg.id] && isSettableToZero) return 0;
return bucket[agg.id] && bucket[agg.id].value;
};
/**
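The `getValue` change above handles empty result sets: cardinality and sum can safely report 0 when no bucket comes back, while other metrics yield `undefined` instead of throwing on the missing key. A sketch of the same logic (here `agg.typeName` is a hypothetical stand-in for the real `agg.__type.name`):

```javascript
// Sketch of the empty-bucket handling added to MetricAggType.getValue:
// cardinality and sum default to 0 on an empty set; other metrics return
// undefined rather than throwing on a missing bucket key.
function getValue(agg, bucket) {
  var isSettableToZero = ['cardinality', 'sum'].indexOf(agg.typeName) !== -1;
  if (!bucket[agg.id] && isSettableToZero) return 0;
  return bucket[agg.id] && bucket[agg.id].value;
}

console.log(getValue({ id: '1', typeName: 'sum' }, {}));                  // 0
console.log(getValue({ id: '1', typeName: 'avg' }, { 1: { value: 7 } })); // 7
```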
View file
@@ -49,7 +49,7 @@ define(function (require) {
getValue: function (agg, bucket) {
// values for 1, 5, and 10 will come back as 1.0, 5.0, and 10.0 so we
// parse the keys and respond with the value that matches
return _.find(bucket[agg.parentId].values, function (value, key) {
return _.find(bucket[agg.parentId] && bucket[agg.parentId].values, function (value, key) {
return agg.key === parseFloat(key);
}) / 100;
}
View file
@@ -44,7 +44,7 @@ define(function (require) {
getValue: function (agg, bucket) {
// percentiles for 1, 5, and 10 will come back as 1.0, 5.0, and 10.0 so we
// parse the keys and respond with the value that matches
return _.find(bucket[agg.parentId].values, function (value, key) {
return _.find(bucket[agg.parentId] && bucket[agg.parentId].values, function (value, key) {
return agg.key === parseFloat(key);
});
}
View file
@@ -0,0 +1,73 @@
import ngMock from 'ngMock';
import $ from 'jquery';
import expect from 'expect.js';
import uiModules from 'ui/modules';
import chromeNavControlsRegistry from 'ui/registry/chrome_nav_controls';
import Registry from 'ui/registry/_registry';
describe('chrome nav controls', function () {
let compile;
let stubRegistry;
beforeEach(ngMock.module('kibana', function (PrivateProvider) {
stubRegistry = new Registry({
order: ['order']
});
PrivateProvider.swap(chromeNavControlsRegistry, stubRegistry);
}));
beforeEach(ngMock.inject(function ($compile, $rootScope) {
compile = function () {
const $el = $('<div kbn-chrome-append-nav-controls>');
$rootScope.$apply();
$compile($el)($rootScope);
return $el;
};
}));
it('injects templates from the ui/registry/chrome_nav_controls registry', function () {
stubRegistry.register(function () {
return {
name: 'control',
order: 100,
template: `<span id="testTemplateEl"></span>`
};
});
var $el = compile();
expect($el.find('#testTemplateEl')).to.have.length(1);
});
it('renders controls in reverse order, assuming that each control will float:right', function () {
stubRegistry.register(function () {
return {
name: 'control2',
order: 2,
template: `<span id="2", class="testControl"></span>`
};
});
stubRegistry.register(function () {
return {
name: 'control1',
order: 1,
template: `<span id="1", class="testControl"></span>`
};
});
stubRegistry.register(function () {
return {
name: 'control3',
order: 3,
template: `<span id="3", class="testControl"></span>`
};
});
var $el = compile();
expect(
$el.find('.testControl')
.toArray()
.map(el => el.id)
).to.eql(['3', '2', '1']);
});
});
View file
@@ -1,13 +1,9 @@
var $ = require('jquery');
var _ = require('lodash');
require('../appSwitcher');
var modules = require('ui/modules');
var ConfigTemplate = require('ui/ConfigTemplate');
require('ui/directives/config');
module.exports = function (chrome, internals) {
chrome.setupAngular = function () {
var modules = require('ui/modules');
var kibana = modules.get('kibana');
_.forOwn(chrome.getInjected(), function (val, name) {
@@ -24,53 +20,9 @@ module.exports = function (chrome, internals) {
a.href = chrome.addBasePath('/elasticsearch');
return a.href;
}()))
.config(chrome.$setupXsrfRequestInterceptor)
.directive('kbnChrome', function ($rootScope) {
return {
template: function ($el) {
var $content = $(require('ui/chrome/chrome.html'));
var $app = $content.find('.application');
.config(chrome.$setupXsrfRequestInterceptor);
if (internals.rootController) {
$app.attr('ng-controller', internals.rootController);
}
if (internals.rootTemplate) {
$app.removeAttr('ng-view');
$app.html(internals.rootTemplate);
}
return $content;
},
controllerAs: 'chrome',
controller: function ($scope, $rootScope, $location, $http) {
// are we showing the embedded version of the chrome?
internals.setVisibleDefault(!$location.search().embed);
// listen for route changes, propagate to tabs
var onRouteChange = function () {
let { href } = window.location;
let persist = chrome.getVisible();
internals.trackPossibleSubUrl(href);
internals.tabs.consumeRouteUpdate(href, persist);
};
$rootScope.$on('$routeChangeSuccess', onRouteChange);
$rootScope.$on('$routeUpdate', onRouteChange);
onRouteChange();
// and some local values
$scope.httpActive = $http.pendingRequests;
$scope.notifList = require('ui/notify')._notifs;
$scope.appSwitcherTemplate = new ConfigTemplate({
switcher: '<app-switcher></app-switcher>'
});
return chrome;
}
};
});
require('../directives')(chrome, internals);
modules.link(kibana);
};
View file
@@ -22,7 +22,7 @@
<!-- /Mobile navbar -->
<!-- Full navbar -->
<div collapse="!showCollapsed" class="navbar-collapse">
<div collapse="!showCollapsed" class="navbar-collapse" kbn-chrome-append-nav-controls>
<ul class="nav navbar-nav" role="navigation">
<li
ng-if="chrome.getBrand('logo')"
@@ -52,50 +52,11 @@
</a>
</li>
</ul>
<ul ng-show="timefilter.enabled" class="nav navbar-nav navbar-right navbar-timepicker">
<li>
<a
ng-click="toggleRefresh()"
ng-show="timefilter.refreshInterval.value > 0">
<i class="fa" ng-class="timefilter.refreshInterval.pause ? 'fa-play' : 'fa-pause'"></i>
</a>
</li>
<li
ng-class="{active: pickerTemplate.is('interval') }"
ng-show="timefilter.refreshInterval.value > 0 || !!pickerTemplate.current"
class="to-body">
<a ng-click="pickerTemplate.toggle('interval')" class="navbar-timepicker-auto-refresh-desc">
<span ng-show="timefilter.refreshInterval.value === 0"><i class="fa fa-repeat"></i> Auto-refresh</span>
<span ng-show="timefilter.refreshInterval.value > 0">{{timefilter.refreshInterval.display}}</span>
</a>
</li>
<li class="to-body" ng-class="{active: pickerTemplate.is('filter')}">
<a
ng-click="pickerTemplate.toggle('filter')"
aria-haspopup="true"
aria-expanded="false"
class="navbar-timepicker-time-desc">
<i aria-hidden="true" class="fa fa-clock-o"></i>
<pretty-duration from="timefilter.time.from" to="timefilter.time.to"></pretty-duration>
</a>
</li>
</ul>
<ul class="nav navbar-nav navbar-right navbar-timepicker" >
<li ng-show="httpActive.length" class="navbar-text hidden-xs">
<div class="spinner"></div>
</li>
</ul>
</div>
<!-- /Full navbar -->
</nav>
<!-- TODO: These config dropdowns shouldn't be hard coded -->
<config
config-template="appSwitcherTemplate"
config-object="chrome"
View file
@@ -4,6 +4,7 @@ define(function (require) {
require('ui/modules')
.get('kibana')
// TODO: all of this really belongs in the timepicker
.directive('chromeContext', function (timefilter, globalState) {
var listenForUpdates = _.once(function ($scope) {
View file
@@ -8,7 +8,7 @@ var cloneDeep = require('lodash').cloneDeep;
var indexBy = require('lodash').indexBy;
require('ui/chrome');
require('ui/chrome/appSwitcher');
require('../app_switcher');
var DomLocationProvider = require('ui/domLocation');
describe('appSwitcher directive', function () {
View file
@@ -1,7 +1,7 @@
var parse = require('url').parse;
var bindKey = require('lodash').bindKey;
require('../appSwitcher/appSwitcher.less');
require('../app_switcher/app_switcher.less');
var DomLocationProvider = require('ui/domLocation');
require('ui/modules')
@@ -9,7 +9,7 @@ require('ui/modules')
.directive('appSwitcher', function () {
return {
restrict: 'E',
template: require('./appSwitcher.html'),
template: require('./app_switcher.html'),
controllerAs: 'switcher',
controller: function ($scope, Private) {
var domLocation = Private(DomLocationProvider);
View file
@@ -0,0 +1,28 @@
import $ from 'jquery';
import chromeNavControlsRegistry from 'ui/registry/chrome_nav_controls';
import UiModules from 'ui/modules';
export default function (chrome, internals) {
UiModules
.get('kibana')
.directive('kbnChromeAppendNavControls', function (Private) {
return {
template: function ($element) {
const parts = [$element.html()];
const controls = Private(chromeNavControlsRegistry);
for (const control of controls.inOrder) {
parts.unshift(
`<!-- nav control ${control.name} -->`,
control.template
);
}
return parts.join('\n');
}
};
});
}
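The directive above builds its template by `unshift`ing each registered control in front of the existing markup, so controls land in the string in reverse registration order, matching the test's expectation that each control will `float: right`. A standalone sketch of that assembly (hypothetical `buildNavTemplate`, extracted for illustration):

```javascript
// Sketch of the template assembly in kbnChromeAppendNavControls: each
// control is unshifted in front of the existing markup, so later
// registrations render earlier in the string (and float right in the DOM).
function buildNavTemplate(existingHtml, controls) {
  const parts = [existingHtml];
  for (const control of controls) {
    parts.unshift(
      `<!-- nav control ${control.name} -->`,
      control.template
    );
  }
  return parts.join('\n');
}

const html = buildNavTemplate('<ul class="nav"></ul>', [
  { name: 'timepicker', template: '<span>time</span>' },
  { name: 'spinner', template: '<span>busy</span>' }
]);
// spinner was registered second, so its markup precedes timepicker's
```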
View file
@@ -0,0 +1,10 @@
import 'ui/directives/config';
import './app_switcher';
import kbnChromeProv from './kbn_chrome';
import kbnChromeNavControlsProv from './append_nav_controls';
export default function (chrome, internals) {
kbnChromeProv(chrome, internals);
kbnChromeNavControlsProv(chrome, internals);
}
View file
@@ -0,0 +1,58 @@
import $ from 'jquery';
import UiModules from 'ui/modules';
import ConfigTemplate from 'ui/ConfigTemplate';
export default function (chrome, internals) {
UiModules
.get('kibana')
.directive('kbnChrome', function ($rootScope) {
return {
template($el) {
const $content = $(require('ui/chrome/chrome.html'));
const $app = $content.find('.application');
if (internals.rootController) {
$app.attr('ng-controller', internals.rootController);
}
if (internals.rootTemplate) {
$app.removeAttr('ng-view');
$app.html(internals.rootTemplate);
}
return $content;
},
controllerAs: 'chrome',
controller($scope, $rootScope, $location, $http) {
// are we showing the embedded version of the chrome?
internals.setVisibleDefault(!$location.search().embed);
// listen for route changes, propagate to tabs
const onRouteChange = function () {
let { href } = window.location;
let persist = chrome.getVisible();
internals.trackPossibleSubUrl(href);
internals.tabs.consumeRouteUpdate(href, persist);
};
$rootScope.$on('$routeChangeSuccess', onRouteChange);
$rootScope.$on('$routeUpdate', onRouteChange);
onRouteChange();
// and some local values
$scope.httpActive = $http.pendingRequests;
$scope.notifList = require('ui/notify')._notifs;
$scope.appSwitcherTemplate = new ConfigTemplate({
switcher: '<app-switcher></app-switcher>'
});
return chrome;
}
};
});
}
View file
@@ -200,7 +200,11 @@ define(function (require) {
},
'dashboard:defaultDarkTheme': {
value: false,
description: 'New dashboards use dark theme by default',
description: 'New dashboards use dark theme by default'
},
'filters:pinnedByDefault': {
value: false,
description: 'Whether the filters should have a global state (be pinned) by default'
}
};
};
Some files were not shown because too many files have changed in this diff.