Merge remote-tracking branch 'upstream/master' into feature/5.0
|
@ -4,16 +4,16 @@
|
|||
At any given time the Kibana team at Elastic is working on dozens of features and enhancements to Kibana and other projects at Elastic. When you file an issue we'll take the time to digest it, consider solutions, and weigh its applicability to both the broad Kibana user base and our own goals for the project. Once we've completed that process we will assign the issue a priority.
|
||||
|
||||
- **P1**: A high-priority issue that affects almost all Kibana users. Bugs that would cause incorrect results, security issues, and features that would vastly improve the user experience for everyone. Workarounds for P1s generally don't exist without a code change.
|
||||
- **P2**: A broadly applicable, high-visibility issue that enhances the usability of Kibana for a majority of users.
|
||||
- **P3**: Nice-to-have bug fixes or functionality. Workarounds for P3 items generally exist.
|
||||
- **P4**: Niche and special interest issues that may not fit our core goals. We would take a high quality pull for this if implemented in such a way that it does not meaningfully impact other functionality or existing code. Issues may also be labeled P4 if they would be better implemented in Elasticsearch.
|
||||
- **P5**: Highly niche or in opposition to our core goals. Should usually be closed. This doesn't mean we wouldn't take a pull for it, but if someone really wanted this they would be better off working on a plugin. The Kibana team will usually not work on P5 issues but may be willing to assist plugin developers on IRC.
|
||||
|
||||
#### How to express the importance of an issue
|
||||
Let's just get this out there: **Feel free to +1 an issue**. That said, a +1 isn't a vote. We keep up on highly commented issues, but comments are but one of many reasons we might, or might not, work on an issue. A solid write-up of your use case is more persuasive than a comment that says *+10000*.
|
||||
|
||||
#### My issue isn't getting enough attention
|
||||
First of all, sorry about that; we want you to have a great time with Kibana! You should join us on IRC (#kibana on freenode) and chat about it. GitHub is terrible for conversations. With that out of the way, there are a number of variables that go into deciding what to work on. These include priority, impact, difficulty, applicability to use cases, and, last but importantly, what we feel like working on.
|
||||
|
||||
### I want to help!
|
||||
**Now we're talking**. If you have a bugfix or new feature that you would like to contribute to Kibana, please **find or open an issue about it before you start working on it.** Talk about what you would like to do. It may be that somebody is already working on it, or that there are particular issues that you should know about before implementing the change.
|
||||
|
@ -74,6 +74,20 @@ optimize:
|
|||
lazyPrebuild: false
|
||||
```
|
||||
|
||||
#### SSL
|
||||
|
||||
When Kibana runs in development mode it will automatically use bundled SSL certificates. These certificates won't be trusted by your OS by default, which will likely cause your browser to complain about the cert. You can deal with this in a few ways:
|
||||
|
||||
1. Supply your own cert using the `config/kibana.dev.yml` file (see the example after this list).
|
||||
1. Configure your OS to trust the cert:
|
||||
- OSX: https://www.accuweaver.com/2014/09/19/make-chrome-accept-a-self-signed-certificate-on-osx/
|
||||
- Windows: http://stackoverflow.com/a/1412118
|
||||
- Linux: http://unix.stackexchange.com/a/90607
|
||||
1. Click through the warning and accept future warnings.
|
||||
1. Disable SSL with the `--no-ssl` flag:
|
||||
- `npm start -- --no-ssl`
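If you go with option 1, a minimal sketch of `config/kibana.dev.yml` might look like the following. The `server.ssl.cert` and `server.ssl.key` settings are the ones the dev server reads when `--dev` is passed (see the serve command changes later in this diff); the paths are placeholders for your own certificate and key.

```yaml
# config/kibana.dev.yml -- development-only overrides (example paths)
server.ssl.cert: /path/to/my-dev-cert.crt
server.ssl.key: /path/to/my-dev-cert.key
```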
|
||||
|
||||
|
||||
#### Linting
|
||||
|
||||
A note about linting: We use [eslint](http://eslint.org) to check that the [styleguide](STYLEGUIDE.md) is being followed. It runs in a pre-commit hook and as a part of the tests, but most contributors integrate it with their code editors for real-time feedback.
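If you want to run the same checks by hand, the `package.json` scripts shown later in this diff expose the linter through npm, for example:

```sh
npm run lint        # "grunt eslint:source" -- report styleguide violations
npm run lintroller  # "grunt eslint:fixSource" -- attempt automatic fixes
```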
|
||||
|
|
|
@ -1,4 +1,4 @@
|
|||
Copyright 2012–2014 Elasticsearch BV
|
||||
Copyright 2012–2015 Elasticsearch BV
|
||||
|
||||
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
|
||||
|
||||
|
|
|
@ -31,6 +31,8 @@ For example, a chart of dates with incident counts can display dates in chronolo
|
|||
priority of the incident-reporting aggregation to show the most active dates first. The chronological order might show
|
||||
a time-dependent pattern in incident count, and sorting by active dates can reveal particular outliers in your data.
|
||||
|
||||
include::color-picker.asciidoc[]
|
||||
|
||||
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
|
||||
|
||||
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.
|
||||
|
|
4
docs/color-picker.asciidoc
Normal file
|
@ -0,0 +1,4 @@
|
|||
You can customize the colors of your visualization by clicking the color dot next to each label to display the
|
||||
_color picker_.
|
||||
|
||||
image::images/color-picker.png[An array of color dots that users can select]
|
|
@ -71,8 +71,13 @@ in your Web page.
|
|||
|
||||
NOTE: A user must have Kibana access in order to view embedded dashboards.
|
||||
|
||||
Click the *Share* button to display HTML code to embed the dashboard in another Web page, along with a direct link to
|
||||
the dashboard. You can select the text in either option to copy the code or the link to your clipboard.
|
||||
To share a dashboard, click the *Share* button image:images/share-dashboard.png[] to display the _Sharing_ panel.
|
||||
|
||||
image:images/sharing-panel.png[]
|
||||
|
||||
Click the *Copy to Clipboard* button image:images/share-link.png[] to copy the native URL or embed HTML to the clipboard.
|
||||
Click the *Generate short URL* button image:images/share-short-link.png[] to create a shortened URL for sharing or
|
||||
embedding.
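For orientation, the embed HTML copied from the _Sharing_ panel is essentially an `<iframe>` pointing at the dashboard URL with an embed parameter appended. The snippet below is only an illustration; the host, dashboard name, parameter name, and dimensions are placeholders, not the exact markup Kibana generates.

----
<iframe src="http://localhost:5601/#/dashboard/My-Dashboard?embed=true" height="600" width="800"></iframe>
----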
|
||||
|
||||
[float]
|
||||
[[embedding-dashboards]]
|
||||
|
|
|
@ -195,7 +195,8 @@ yellow open logstash-2015.05.20 5 1 4750 0 16.4mb
|
|||
[[tutorial-define-index]]
|
||||
=== Defining Your Index Patterns
|
||||
|
||||
Each set of data loaded to Elasticsearch has an <<settings-create-pattern,index pattern>>. In the previous section, the Shakespeare data set has an index named `shakespeare`, and the accounts
|
||||
Each set of data loaded to Elasticsearch has an <<settings-create-pattern,index pattern>>. In the previous section, the
|
||||
Shakespeare data set has an index named `shakespeare`, and the accounts
|
||||
data set has an index named `bank`. An _index pattern_ is a string with optional wildcards that can match multiple
|
||||
indices. For example, in the common logging use case, a typical index name contains the date in YYYY.MM.DD
|
||||
format, and an index pattern for May would look something like `logstash-2015.05*`.
|
||||
|
@ -211,6 +212,9 @@ The Logstash data set does contain time-series data, so after clicking *Add New*
|
|||
set, make sure the *Index contains time-based events* box is checked and select the `@timestamp` field from the
|
||||
*Time-field name* drop-down.
|
||||
|
||||
NOTE: When you define an index pattern, indices that match that pattern must exist in Elasticsearch. Those indices must
|
||||
contain data.
|
||||
|
||||
[float]
|
||||
[[tutorial-discovering]]
|
||||
=== Discovering Your Data
|
||||
|
@ -288,8 +292,10 @@ This shows you what proportion of the 1000 accounts fall in these balance ranges
|
|||
we're going to add another bucket aggregation. We can break down each of the balance ranges further by the account
|
||||
holder's age.
|
||||
|
||||
Click *Add sub-buckets* at the bottom, then select *Split Slices*. Choose the *Terms* aggregation and the *age* field from the drop-downs.
|
||||
Click the green *Apply changes* button image:images/apply-changes-button.png[] to add an external ring with the new results.
|
||||
Click *Add sub-buckets* at the bottom, then select *Split Slices*. Choose the *Terms* aggregation and the *age* field from
|
||||
the drop-downs.
|
||||
Click the green *Apply changes* button image:images/apply-changes-button.png[] to add an external ring with the new
|
||||
results.
|
||||
|
||||
image::images/tutorial-visualize-pie-3.png[]
|
||||
|
||||
|
@ -321,7 +327,8 @@ as well as change many other options for your visualizations, by clicking the *O
|
|||
Now that you have a list of the smallest casts for Shakespeare plays, you might also be curious to see which of these
|
||||
plays makes the greatest demands on an individual actor by showing the maximum number of speeches for a given part. Add
|
||||
a Y-axis aggregation with the *Add metrics* button, then choose the *Max* aggregation for the *speech_number* field. In
|
||||
the *Options* tab, change the *Bar Mode* drop-down to *grouped*, then click the green *Apply changes* button image:images/apply-changes-button.png[]. Your
|
||||
the *Options* tab, change the *Bar Mode* drop-down to *grouped*, then click the green *Apply changes* button
|
||||
image:images/apply-changes-button.png[]. Your
|
||||
chart should now look like this:
|
||||
|
||||
image::images/tutorial-visualize-bar-3.png[]
|
||||
|
@ -371,7 +378,8 @@ Write the following text in the field:
|
|||
The Markdown widget uses **markdown** syntax.
|
||||
> Blockquotes in Markdown use the > character.
|
||||
|
||||
Click the green *Apply changes* button image:images/apply-changes-button.png[] to display the rendered Markdown in the preview pane:
|
||||
Click the green *Apply changes* button image:images/apply-changes-button.png[] to display the rendered Markdown in the
|
||||
preview pane:
|
||||
|
||||
image::images/tutorial-visualize-md-2.png[]
|
||||
|
||||
|
|
BIN docs/images/color-picker.png (new file, 28 KiB)
BIN docs/images/share-dashboard.png (new file, 554 B)
BIN docs/images/share-link.png (new file, 589 B)
BIN docs/images/share-short-link.png (new file, 507 B)
BIN docs/images/sharing-panel.png (new file, 62 KiB)
|
@ -11,6 +11,8 @@ if the splits are displayed in a row or a column by clicking the *Rows | Columns
|
|||
|
||||
include::x-axis-aggs.asciidoc[]
|
||||
|
||||
include::color-picker.asciidoc[]
|
||||
|
||||
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
|
||||
|
||||
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.
|
||||
|
|
|
@ -55,6 +55,8 @@ types.
|
|||
When multiple aggregations are defined on a chart's axis, you can use the up or down arrows to the right of the
|
||||
aggregation's type to change the aggregation's priority.
|
||||
|
||||
include::color-picker.asciidoc[]
|
||||
|
||||
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
|
||||
|
||||
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.
|
||||
|
|
|
@ -42,7 +42,7 @@ kibana_elasticsearch_password: kibana4-password
|
|||
----
|
||||
|
||||
Kibana 4 users also need access to the `.kibana` index so they can save and load searches, visualizations, and dashboards.
|
||||
For more information, see {shield}/kibana.html#kibana4-server-role[Configuring Roles for Kibana 4 Users] in
|
||||
For more information, see {shield}/kibana.html[Using Kibana with Shield] in
|
||||
the Shield documentation.
|
||||
|
||||
TIP: See <<kibana-dynamic-mapping, Kibana and Elasticsearch Dynamic Mapping>> for important information on Kibana and
|
||||
|
|
|
@ -1,27 +1,27 @@
|
|||
[[releasenotes]]
|
||||
== Kibana 4.3 Release Notes
|
||||
== Kibana 4.4 Release Notes
|
||||
|
||||
The 4.3 release of Kibana requires Elasticsearch 2.1 or later.
|
||||
The 4.4 release of Kibana requires Elasticsearch 2.2 or later.
|
||||
|
||||
Using event times to create index names is *deprecated* in this release of Kibana. Support for this functionality will be
|
||||
removed entirely in the next major Kibana release. Elasticsearch 2.1 includes sophisticated date parsing APIs that Kibana
|
||||
uses to determine date information, removing the need to specify dates in the index pattern name.
|
||||
Using event times to create index names is no longer supported as of this release. Current versions of Elasticsearch
|
||||
include sophisticated date parsing APIs that Kibana uses to determine date information, removing the need to specify dates
|
||||
in the index pattern name.
|
||||
|
||||
[float]
|
||||
[[enhancements]]
|
||||
== Enhancements
|
||||
|
||||
* {k4issue}5109[Issue 5109]: Adds custom JSON and filter alias naming for filters.
|
||||
* {k4issue}1726[Issue 1726]: Adds a color field formatter for value ranges in numeric fields.
|
||||
* {k4issue}4342[Issue 4342]: Increased performance for wildcard indices.
|
||||
* {k4issue}1600[Issue 1600]: Support for global time zones.
|
||||
* {k4pull}5275[Pull Request 5275]: Highlighting values in Discover can now be disabled.
|
||||
* {k4issue}5212[Issue 5212]: Adds support for multiple certificate authorities.
|
||||
* {k4issue}2716[Issue 2716]: The open/closed position of the spy panel now persists across UI state changes.
|
||||
// * {k4issue}5109[Issue 5109]: Adds custom JSON and filter alias naming for filters.
|
||||
// * {k4issue}1726[Issue 1726]: Adds a color field formatter for value ranges in numeric fields.
|
||||
// * {k4issue}4342[Issue 4342]: Increased performance for wildcard indices.
|
||||
// * {k4issue}1600[Issue 1600]: Support for global time zones.
|
||||
// * {k4pull}5275[Pull Request 5275]: Highlighting values in Discover can now be disabled.
|
||||
// * {k4issue}5212[Issue 5212]: Adds support for multiple certificate authorities.
|
||||
// * {k4issue}2716[Issue 2716]: The open/closed position of the spy panel now persists across UI state changes.
|
||||
|
||||
[float]
|
||||
[[bugfixes]]
|
||||
== Bug Fixes
|
||||
|
||||
* {k4issue}5165[Issue 5165]: Resolves a display error in embedded views.
|
||||
* {k4issue}5021[Issue 5021]: Improves visualization dimming for dashboards with auto-refresh.
|
||||
// * {k4issue}5165[Issue 5165]: Resolves a display error in embedded views.
|
||||
// * {k4issue}5021[Issue 5021]: Improves visualization dimming for dashboards with auto-refresh.
|
||||
|
|
|
@ -35,11 +35,17 @@ list.
|
|||
contains time-based events* option and select the index field that contains the timestamp. Kibana reads the index
|
||||
mapping to list all of the fields that contain a timestamp.
|
||||
|
||||
. By default, Kibana restricts wildcard expansion of time-based index patterns to indices with data within the currently
|
||||
selected time range. Click *Do not expand index pattern when searching* to disable this behavior.
|
||||
|
||||
. Click *Create* to add the index pattern.
|
||||
|
||||
. To designate the new pattern as the default pattern to load when you view the Discover tab, click the *favorite*
|
||||
button.
|
||||
|
||||
NOTE: When you define an index pattern, indices that match that pattern must exist in Elasticsearch. Those indices must
|
||||
contain data.
|
||||
|
||||
To use an event time in an index name, enclose the static text in the pattern and specify the date format using the
|
||||
tokens described in the following table.
|
||||
|
||||
|
@ -195,6 +201,8 @@ Scripted fields compute data on the fly from the data in your Elasticsearch indi
|
|||
the Discover tab as part of the document data, and you can use scripted fields in your visualizations.
|
||||
Scripted field values are computed at query time so they aren't indexed and cannot be searched.
|
||||
|
||||
NOTE: Kibana cannot query scripted fields.
|
||||
|
||||
WARNING: Computing data on the fly with scripted fields can be very resource intensive and can have a direct impact on
|
||||
Kibana's performance. Keep in mind that there's no built-in validation of a scripted field. If your scripts are
|
||||
buggy, you'll get exceptions whenever you try to view the dynamically generated data.
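As a purely illustrative sketch (the `bytes` field name is hypothetical and the scripting language depends on your Elasticsearch configuration), a scripted field that reports a document's size in kilobytes could be defined with a one-line script such as:

----
doc['bytes'].value / 1024
----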
|
||||
|
@ -449,10 +457,12 @@ To export a set of objects:
|
|||
. Click the selection box for the objects you want to export, or click the *Select All* box.
|
||||
. Click *Export* to select a location to write the exported JSON.
|
||||
|
||||
WARNING: Exported dashboards do not include their associated index patterns. Re-create the index patterns manually before
|
||||
importing saved dashboards to a Kibana instance running on another Elasticsearch cluster.
|
||||
|
||||
To import a set of objects:
|
||||
|
||||
. Go to *Settings > Objects*.
|
||||
. Click *Import* to navigate to the JSON file representing the set of objects to import.
|
||||
. Click *Open* after selecting the JSON file.
|
||||
. If any objects in the set would overwrite objects already present in Kibana, confirm the overwrite.
|
||||
|
||||
|
|
|
@ -34,6 +34,8 @@ if the splits are displayed in a row or a column by clicking the *Rows | Columns
|
|||
|
||||
include::x-axis-aggs.asciidoc[]
|
||||
|
||||
include::color-picker.asciidoc[]
|
||||
|
||||
You can click the *Advanced* link to display more customization options for your metrics or bucket aggregation:
|
||||
|
||||
*Exclude Pattern*:: Specify a pattern in this field to exclude from the results.
|
||||
|
|
|
@ -53,6 +53,7 @@
|
|||
"precommit": "grunt precommit",
|
||||
"karma": "karma start",
|
||||
"elasticsearch": "grunt esvm:dev:keepalive",
|
||||
"elasticsearchWithPlugins": "grunt esvm:withPlugins:keepalive",
|
||||
"lint": "grunt eslint:source",
|
||||
"lintroller": "grunt eslint:fixSource",
|
||||
"mocha": "mocha --compilers js:babel/register",
|
||||
|
@ -103,6 +104,7 @@
|
|||
"good-squeeze": "2.1.0",
|
||||
"gridster": "0.5.6",
|
||||
"hapi": "8.8.1",
|
||||
"httpolyglot": "0.1.1",
|
||||
"imports-loader": "0.6.4",
|
||||
"jade": "1.11.0",
|
||||
"jade-loader": "0.7.1",
|
||||
|
|
98
src/cli/cluster/base_path_proxy.js
Normal file
|
@ -0,0 +1,98 @@
|
|||
import { Server } from 'hapi';
|
||||
import { notFound } from 'boom';
|
||||
import { merge, sample } from 'lodash';
|
||||
import { format as formatUrl } from 'url';
|
||||
import { fromNode } from 'bluebird';
|
||||
import { Agent as HttpsAgent } from 'https';
|
||||
import { readFileSync } from 'fs';
|
||||
|
||||
import Config from '../../server/config/config';
|
||||
import setupConnection from '../../server/http/setup_connection';
|
||||
import setupLogging from '../../server/logging';
|
||||
|
||||
const alphabet = 'abcdefghijklmnopqrstuvwxyz'.split('');
|
||||
|
||||
export default class BasePathProxy {
|
||||
constructor(clusterManager, userSettings) {
|
||||
this.clusterManager = clusterManager;
|
||||
this.server = new Server();
|
||||
|
||||
const config = Config.withDefaultSchema(userSettings);
|
||||
|
||||
this.targetPort = config.get('dev.basePathProxyTarget');
|
||||
this.basePath = config.get('server.basePath');
|
||||
|
||||
const { cert } = config.get('server.ssl');
|
||||
if (cert) {
|
||||
this.proxyAgent = new HttpsAgent({
|
||||
ca: readFileSync(cert)
|
||||
});
|
||||
}
|
||||
|
||||
if (!this.basePath) {
|
||||
this.basePath = `/${sample(alphabet, 3).join('')}`;
|
||||
config.set('server.basePath', this.basePath);
|
||||
}
|
||||
|
||||
setupLogging(null, this.server, config);
|
||||
setupConnection(null, this.server, config);
|
||||
this.setupRoutes();
|
||||
}
|
||||
|
||||
setupRoutes() {
|
||||
const { server, basePath, targetPort } = this;
|
||||
|
||||
server.route({
|
||||
method: 'GET',
|
||||
path: '/',
|
||||
handler(req, reply) {
|
||||
return reply.redirect(basePath);
|
||||
}
|
||||
});
|
||||
|
||||
server.route({
|
||||
method: '*',
|
||||
path: `${basePath}/{kbnPath*}`,
|
||||
handler: {
|
||||
proxy: {
|
||||
passThrough: true,
|
||||
xforward: true,
|
||||
agent: this.proxyAgent,
|
||||
mapUri(req, callback) {
|
||||
callback(null, formatUrl({
|
||||
protocol: server.info.protocol,
|
||||
hostname: server.info.host,
|
||||
port: targetPort,
|
||||
pathname: req.params.kbnPath,
|
||||
query: req.query,
|
||||
}));
|
||||
}
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
server.route({
|
||||
method: '*',
|
||||
path: `/{oldBasePath}/{kbnPath*}`,
|
||||
handler(req, reply) {
|
||||
const {oldBasePath, kbnPath = ''} = req.params;
|
||||
|
||||
const isGet = req.method === 'get';
|
||||
const isBasePath = oldBasePath.length === 3;
|
||||
const isApp = kbnPath.slice(0, 4) === 'app/';
|
||||
|
||||
if (isGet && isBasePath && isApp) {
|
||||
return reply.redirect(`${basePath}/${kbnPath}`);
|
||||
}
|
||||
|
||||
return reply(notFound());
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
async listen() {
|
||||
await fromNode(cb => this.server.start(cb));
|
||||
this.server.log(['listening', 'info'], `basePath Proxy running at ${this.server.info.uri}${this.basePath}`);
|
||||
}
|
||||
|
||||
}
|
|
@ -1,30 +1,52 @@
|
|||
let cluster = require('cluster');
|
||||
let { join } = require('path');
|
||||
let { debounce, compact, invoke, bindAll, once } = require('lodash');
|
||||
const cluster = require('cluster');
|
||||
const { join } = require('path');
|
||||
const { format: formatUrl } = require('url');
|
||||
const Hapi = require('hapi');
|
||||
const { debounce, compact, get, invoke, bindAll, once, sample } = require('lodash');
|
||||
|
||||
let Log = require('../Log');
|
||||
let Worker = require('./worker');
|
||||
const Log = require('../Log');
|
||||
const Worker = require('./worker');
|
||||
const BasePathProxy = require('./base_path_proxy');
|
||||
|
||||
process.env.kbnWorkerType = 'managr';
|
||||
|
||||
module.exports = class ClusterManager {
|
||||
constructor(opts) {
|
||||
constructor(opts, settings) {
|
||||
this.log = new Log(opts.quiet, opts.silent);
|
||||
this.addedCount = 0;
|
||||
|
||||
const serverArgv = [];
|
||||
const optimizerArgv = [
|
||||
'--plugins.initialize=false',
|
||||
'--server.autoListen=false',
|
||||
];
|
||||
|
||||
if (opts.basePath) {
|
||||
this.basePathProxy = new BasePathProxy(this, settings);
|
||||
|
||||
optimizerArgv.push(
|
||||
`--server.basePath=${this.basePathProxy.basePath}`
|
||||
);
|
||||
|
||||
serverArgv.push(
|
||||
`--server.port=${this.basePathProxy.targetPort}`,
|
||||
`--server.basePath=${this.basePathProxy.basePath}`
|
||||
);
|
||||
}
|
||||
|
||||
this.workers = [
|
||||
this.optimizer = new Worker({
|
||||
type: 'optmzr',
|
||||
title: 'optimizer',
|
||||
log: this.log,
|
||||
argv: compact([
|
||||
'--plugins.initialize=false',
|
||||
'--server.autoListen=false'
|
||||
]),
|
||||
argv: optimizerArgv,
|
||||
watch: false
|
||||
}),
|
||||
|
||||
this.server = new Worker({
|
||||
type: 'server',
|
||||
log: this.log
|
||||
log: this.log,
|
||||
argv: serverArgv
|
||||
})
|
||||
];
|
||||
|
||||
|
@ -48,12 +70,15 @@ module.exports = class ClusterManager {
|
|||
startCluster() {
|
||||
this.setupManualRestart();
|
||||
invoke(this.workers, 'start');
|
||||
if (this.basePathProxy) {
|
||||
this.basePathProxy.listen();
|
||||
}
|
||||
}
|
||||
|
||||
setupWatching() {
|
||||
var chokidar = require('chokidar');
|
||||
let utils = require('requirefrom')('src/utils');
|
||||
let fromRoot = utils('fromRoot');
|
||||
const chokidar = require('chokidar');
|
||||
const utils = require('requirefrom')('src/utils');
|
||||
const fromRoot = utils('fromRoot');
|
||||
|
||||
this.watcher = chokidar.watch([
|
||||
'src/plugins',
|
||||
|
@ -81,12 +106,12 @@ module.exports = class ClusterManager {
|
|||
}
|
||||
|
||||
setupManualRestart() {
|
||||
let readline = require('readline');
|
||||
let rl = readline.createInterface(process.stdin, process.stdout);
|
||||
const readline = require('readline');
|
||||
const rl = readline.createInterface(process.stdin, process.stdout);
|
||||
|
||||
let nls = 0;
|
||||
let clear = () => nls = 0;
|
||||
let clearSoon = debounce(clear, 2000);
|
||||
const clear = () => nls = 0;
|
||||
const clearSoon = debounce(clear, 2000);
|
||||
|
||||
rl.setPrompt('');
|
||||
rl.prompt();
|
||||
|
|
|
@ -76,11 +76,11 @@ describe('kibana cli', function () {
|
|||
options = { install: 'dummy/dummy', pluginDir: fromRoot('installedPlugins') };
|
||||
});
|
||||
|
||||
it('should require the user to specify either install and remove', function () {
|
||||
it('should require the user to specify either install, remove, or list', function () {
|
||||
options.install = null;
|
||||
parser = settingParser(options);
|
||||
|
||||
expect(parser.parse).withArgs().to.throwError(/Please specify either --install or --remove./);
|
||||
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
|
||||
});
|
||||
|
||||
it('should not allow the user to specify both install and remove', function () {
|
||||
|
@ -88,7 +88,32 @@ describe('kibana cli', function () {
|
|||
options.install = 'org/package/version';
|
||||
parser = settingParser(options);
|
||||
|
||||
expect(parser.parse).withArgs().to.throwError(/Please specify either --install or --remove./);
|
||||
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
|
||||
});
|
||||
|
||||
it('should not allow the user to specify both install and list', function () {
|
||||
options.list = true;
|
||||
options.install = 'org/package/version';
|
||||
parser = settingParser(options);
|
||||
|
||||
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
|
||||
});
|
||||
|
||||
it('should not allow the user to specify both remove and list', function () {
|
||||
options.list = true;
|
||||
options.remove = 'package';
|
||||
parser = settingParser(options);
|
||||
|
||||
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
|
||||
});
|
||||
|
||||
it('should not allow the user to specify install, remove, and list', function () {
|
||||
options.list = true;
|
||||
options.install = 'org/package/version';
|
||||
options.remove = 'package';
|
||||
parser = settingParser(options);
|
||||
|
||||
expect(parser.parse).withArgs().to.throwError(/Please specify either --install, --remove, or --list./);
|
||||
});
|
||||
|
||||
describe('quiet option', function () {
|
||||
|
@ -293,7 +318,7 @@ describe('kibana cli', function () {
|
|||
describe('remove option', function () {
|
||||
|
||||
it('should set settings.action property to "remove"', function () {
|
||||
options.install = null;
|
||||
delete options.install;
|
||||
options.remove = 'package';
|
||||
parser = settingParser(options);
|
||||
|
||||
|
@ -303,7 +328,7 @@ describe('kibana cli', function () {
|
|||
});
|
||||
|
||||
it('should allow one part to the remove parameter', function () {
|
||||
options.install = null;
|
||||
delete options.install;
|
||||
options.remove = 'test-plugin';
|
||||
parser = settingParser(options);
|
||||
|
||||
|
@ -312,8 +337,8 @@ describe('kibana cli', function () {
|
|||
expect(settings).to.have.property('package', 'test-plugin');
|
||||
});
|
||||
|
||||
it('should not allow more than one part to the install parameter', function () {
|
||||
options.install = null;
|
||||
it('should not allow more than one part to the remove parameter', function () {
|
||||
delete options.install;
|
||||
options.remove = 'kibana/test-plugin';
|
||||
parser = settingParser(options);
|
||||
|
||||
|
@ -322,7 +347,7 @@ describe('kibana cli', function () {
|
|||
});
|
||||
|
||||
it('should populate the pluginPath', function () {
|
||||
options.install = null;
|
||||
delete options.install;
|
||||
options.remove = 'test-plugin';
|
||||
parser = settingParser(options);
|
||||
|
||||
|
@ -334,6 +359,21 @@ describe('kibana cli', function () {
|
|||
|
||||
});
|
||||
|
||||
describe('list option', function () {
|
||||
|
||||
it('should set settings.action property to "list"', function () {
|
||||
delete options.install;
|
||||
delete options.remove;
|
||||
options.list = true;
|
||||
parser = settingParser(options);
|
||||
|
||||
var settings = parser.parse();
|
||||
|
||||
expect(settings).to.have.property('action', 'list');
|
||||
});
|
||||
|
||||
});
|
||||
|
||||
});
|
||||
|
||||
});
|
||||
|
|
|
@ -3,6 +3,7 @@ const fromRoot = utils('fromRoot');
|
|||
const settingParser = require('./setting_parser');
|
||||
const installer = require('./plugin_installer');
|
||||
const remover = require('./plugin_remover');
|
||||
const lister = require('./plugin_lister');
|
||||
const pluginLogger = require('./plugin_logger');
|
||||
|
||||
export default function pluginCli(program) {
|
||||
|
@ -18,11 +19,16 @@ export default function pluginCli(program) {
|
|||
|
||||
const logger = pluginLogger(settings);
|
||||
|
||||
if (settings.action === 'install') {
|
||||
installer.install(settings, logger);
|
||||
}
|
||||
if (settings.action === 'remove') {
|
||||
remover.remove(settings, logger);
|
||||
switch (settings.action) {
|
||||
case 'install':
|
||||
installer.install(settings, logger);
|
||||
break;
|
||||
case 'remove':
|
||||
remover.remove(settings, logger);
|
||||
break;
|
||||
case 'list':
|
||||
lister.list(settings, logger);
|
||||
break;
|
||||
}
|
||||
}
|
||||
|
||||
|
@ -30,6 +36,7 @@ export default function pluginCli(program) {
|
|||
.command('plugin')
|
||||
.option('-i, --install <org>/<plugin>/<version>', 'The plugin to install')
|
||||
.option('-r, --remove <plugin>', 'The plugin to remove')
|
||||
.option('-l, --list', 'List installed plugins')
|
||||
.option('-q, --quiet', 'Disable all process messaging except errors')
|
||||
.option('-s, --silent', 'Disable all process messaging')
|
||||
.option('-u, --url <url>', 'Specify download url')
|
||||
|
|
8
src/cli/plugin/plugin_lister.js
Normal file
|
@ -0,0 +1,8 @@
|
|||
const fs = require('fs');
|
||||
|
||||
export function list(settings, logger) {
|
||||
fs.readdirSync(settings.pluginDir)
|
||||
.forEach(function (pluginFile) {
|
||||
logger.log(pluginFile);
|
||||
});
|
||||
}
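Together with the `-l, --list` option and the `list` case added to the plugin CLI above, this new `plugin_lister.js` simply prints the contents of the plugin directory. Assuming the standard `bin/kibana` entry point, invoking it looks like:

    bin/kibana plugin --list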
|
|
@ -1,5 +1,6 @@
|
|||
const { resolve } = require('path');
|
||||
const expiry = require('expiry-js');
|
||||
import { intersection } from 'lodash';
|
||||
|
||||
export default function createSettingParser(options) {
|
||||
function parseMilliseconds(val) {
|
||||
|
@ -22,6 +23,10 @@ export default function createSettingParser(options) {
|
|||
return 'https://download.elastic.co/' + settings.organization + '/' + settings.package + '/' + filename;
|
||||
}
|
||||
|
||||
function areMultipleOptionsChosen(options, choices) {
|
||||
return intersection(Object.keys(options), choices).length > 1;
|
||||
}
|
||||
|
||||
function parse() {
|
||||
let parts;
|
||||
let settings = {
|
||||
|
@ -84,8 +89,12 @@ export default function createSettingParser(options) {
|
|||
settings.package = parts.shift();
|
||||
}
|
||||
|
||||
if (!settings.action || (options.install && options.remove)) {
|
||||
throw new Error('Please specify either --install or --remove.');
|
||||
if (options.list) {
|
||||
settings.action = 'list';
|
||||
}
|
||||
|
||||
if (!settings.action || areMultipleOptionsChosen(options, [ 'install', 'remove', 'list' ])) {
|
||||
throw new Error('Please specify either --install, --remove, or --list.');
|
||||
}
|
||||
|
||||
settings.pluginDir = options.pluginDir;
|
||||
|
|
|
@ -1,10 +1,10 @@
|
|||
let _ = require('lodash');
|
||||
let { isWorker } = require('cluster');
|
||||
let { resolve } = require('path');
|
||||
const _ = require('lodash');
|
||||
const { isWorker } = require('cluster');
|
||||
const { resolve } = require('path');
|
||||
|
||||
let cwd = process.cwd();
|
||||
let src = require('requirefrom')('src');
|
||||
let fromRoot = src('utils/fromRoot');
|
||||
const cwd = process.cwd();
|
||||
const src = require('requirefrom')('src');
|
||||
const fromRoot = src('utils/fromRoot');
|
||||
|
||||
let canCluster;
|
||||
try {
|
||||
|
@ -14,19 +14,61 @@ try {
|
|||
canCluster = false;
|
||||
}
|
||||
|
||||
let pathCollector = function () {
|
||||
let paths = [];
|
||||
const pathCollector = function () {
|
||||
const paths = [];
|
||||
return function (path) {
|
||||
paths.push(resolve(process.cwd(), path));
|
||||
return paths;
|
||||
};
|
||||
};
|
||||
|
||||
let pluginDirCollector = pathCollector();
|
||||
let pluginPathCollector = pathCollector();
|
||||
const pluginDirCollector = pathCollector();
|
||||
const pluginPathCollector = pathCollector();
|
||||
|
||||
function initServerSettings(opts, extraCliOptions) {
|
||||
const readYamlConfig = require('./read_yaml_config');
|
||||
const settings = readYamlConfig(opts.config);
|
||||
const set = _.partial(_.set, settings);
|
||||
const get = _.partial(_.get, settings);
|
||||
const has = _.partial(_.has, settings);
|
||||
const merge = _.partial(_.merge, settings);
|
||||
|
||||
if (opts.dev) {
|
||||
try { merge(readYamlConfig(fromRoot('config/kibana.dev.yml'))); }
|
||||
catch (e) { null; }
|
||||
}
|
||||
|
||||
if (opts.dev) {
|
||||
set('env', 'development');
|
||||
set('optimize.lazy', true);
|
||||
if (opts.ssl && !has('server.ssl.cert') && !has('server.ssl.key')) {
|
||||
set('server.host', 'localhost');
|
||||
set('server.ssl.cert', fromRoot('test/dev_certs/server.crt'));
|
||||
set('server.ssl.key', fromRoot('test/dev_certs/server.key'));
|
||||
}
|
||||
}
|
||||
|
||||
if (opts.elasticsearch) set('elasticsearch.url', opts.elasticsearch);
|
||||
if (opts.port) set('server.port', opts.port);
|
||||
if (opts.host) set('server.host', opts.host);
|
||||
if (opts.quiet) set('logging.quiet', true);
|
||||
if (opts.silent) set('logging.silent', true);
|
||||
if (opts.verbose) set('logging.verbose', true);
|
||||
if (opts.logFile) set('logging.dest', opts.logFile);
|
||||
|
||||
set('plugins.scanDirs', _.compact([].concat(
|
||||
get('plugins.scanDirs'),
|
||||
opts.pluginDir
|
||||
)));
|
||||
|
||||
set('plugins.paths', [].concat(opts.pluginPath || []));
|
||||
merge(extraCliOptions);
|
||||
|
||||
return settings;
|
||||
}
|
||||
|
||||
module.exports = function (program) {
|
||||
let command = program.command('serve');
|
||||
const command = program.command('serve');
|
||||
|
||||
command
|
||||
.description('Run the kibana server')
|
||||
|
@ -64,59 +106,30 @@ module.exports = function (program) {
|
|||
if (canCluster) {
|
||||
command
|
||||
.option('--dev', 'Run the server with development mode defaults')
|
||||
.option('--no-ssl', 'Don\'t run the dev server using HTTPS')
|
||||
.option('--no-base-path', 'Don\'t put a proxy in front of the dev server, which adds a random basePath')
|
||||
.option('--no-watch', 'Prevents automatic restarts of the server in --dev mode');
|
||||
}
|
||||
|
||||
command
|
||||
.action(async function (opts) {
|
||||
const settings = initServerSettings(opts, this.getUnknownOptions());
|
||||
|
||||
if (canCluster && opts.dev && !isWorker) {
|
||||
// stop processing the action and handoff to cluster manager
|
||||
let ClusterManager = require('../cluster/cluster_manager');
|
||||
new ClusterManager(opts);
|
||||
const ClusterManager = require('../cluster/cluster_manager');
|
||||
new ClusterManager(opts, settings);
|
||||
return;
|
||||
}
|
||||
|
||||
let readYamlConfig = require('./read_yaml_config');
|
||||
let KbnServer = src('server/KbnServer');
|
||||
|
||||
let settings = readYamlConfig(opts.config);
|
||||
|
||||
if (opts.dev) {
|
||||
try { _.merge(settings, readYamlConfig(fromRoot('config/kibana.dev.yml'))); }
|
||||
catch (e) { null; }
|
||||
}
|
||||
|
||||
let set = _.partial(_.set, settings);
|
||||
let get = _.partial(_.get, settings);
|
||||
|
||||
if (opts.dev) {
|
||||
set('env', 'development');
|
||||
set('optimize.lazy', true);
|
||||
}
|
||||
|
||||
if (opts.elasticsearch) set('elasticsearch.url', opts.elasticsearch);
|
||||
if (opts.port) set('server.port', opts.port);
|
||||
if (opts.host) set('server.host', opts.host);
|
||||
if (opts.quiet) set('logging.quiet', true);
|
||||
if (opts.silent) set('logging.silent', true);
|
||||
if (opts.verbose) set('logging.verbose', true);
|
||||
if (opts.logFile) set('logging.dest', opts.logFile);
|
||||
|
||||
set('plugins.scanDirs', _.compact([].concat(
|
||||
get('plugins.scanDirs'),
|
||||
opts.pluginDir
|
||||
)));
|
||||
|
||||
set('plugins.paths', [].concat(opts.pluginPath || []));
|
||||
|
||||
let kbnServer = {};
|
||||
|
||||
const KbnServer = src('server/KbnServer');
|
||||
try {
|
||||
kbnServer = new KbnServer(_.merge(settings, this.getUnknownOptions()));
|
||||
kbnServer = new KbnServer(settings);
|
||||
await kbnServer.ready();
|
||||
}
|
||||
catch (err) {
|
||||
let { server } = kbnServer;
|
||||
const { server } = kbnServer;
|
||||
|
||||
if (server) server.log(['fatal'], err);
|
||||
console.error('FATAL', err);
|
||||
|
|
|
@ -6,6 +6,7 @@ define(function (require) {
|
|||
this.on = _.noop;
|
||||
this.off = _.noop;
|
||||
this.save = sinon.stub();
|
||||
this.replace = sinon.stub();
|
||||
_.assign(this, defaults);
|
||||
}
|
||||
|
||||
|
|
|
@ -1,38 +1,40 @@
|
|||
var _ = require('lodash');
|
||||
let Boom = require('boom');
|
||||
import { trim, trimRight } from 'lodash';
|
||||
import { methodNotAllowed } from 'boom';
|
||||
|
||||
module.exports = function (kibana) {
|
||||
var healthCheck = require('./lib/health_check');
|
||||
var exposeClient = require('./lib/expose_client');
|
||||
var createProxy = require('./lib/create_proxy');
|
||||
import healthCheck from './lib/health_check';
|
||||
import exposeClient from './lib/expose_client';
|
||||
import createProxy, { createPath } from './lib/create_proxy';
|
||||
|
||||
return new kibana.Plugin({
|
||||
module.exports = function ({ Plugin }) {
|
||||
return new Plugin({
|
||||
require: ['kibana'],
|
||||
|
||||
config: function (Joi) {
|
||||
return Joi.object({
|
||||
enabled: Joi.boolean().default(true),
|
||||
url: Joi.string().uri({ scheme: ['http', 'https'] }).default('http://localhost:9200'),
|
||||
preserveHost: Joi.boolean().default(true),
|
||||
username: Joi.string(),
|
||||
password: Joi.string(),
|
||||
shardTimeout: Joi.number().default(0),
|
||||
requestTimeout: Joi.number().default(30000),
|
||||
pingTimeout: Joi.number().default(30000),
|
||||
startupTimeout: Joi.number().default(5000),
|
||||
ssl: Joi.object({
|
||||
verify: Joi.boolean().default(true),
|
||||
ca: Joi.array().single().items(Joi.string()),
|
||||
cert: Joi.string(),
|
||||
key: Joi.string()
|
||||
config(Joi) {
|
||||
const { array, boolean, number, object, string } = Joi;
|
||||
|
||||
return object({
|
||||
enabled: boolean().default(true),
|
||||
url: string().uri({ scheme: ['http', 'https'] }).default('http://localhost:9200'),
|
||||
preserveHost: boolean().default(true),
|
||||
username: string(),
|
||||
password: string(),
|
||||
shardTimeout: number().default(0),
|
||||
requestTimeout: number().default(30000),
|
||||
pingTimeout: number().default(30000),
|
||||
startupTimeout: number().default(5000),
|
||||
ssl: object({
|
||||
verify: boolean().default(true),
|
||||
ca: array().single().items(string()),
|
||||
cert: string(),
|
||||
key: string()
|
||||
}).default(),
|
||||
apiVersion: Joi.string().default('2.0'),
|
||||
engineVersion: Joi.string().valid('^2.1.0').default('^2.1.0')
|
||||
apiVersion: string().default('2.0'),
|
||||
engineVersion: string().valid('^2.1.0').default('^2.1.0')
|
||||
}).default();
|
||||
},
|
||||
|
||||
init: function (server, options) {
|
||||
var config = server.config();
|
||||
init(server, options) {
|
||||
const kibanaIndex = server.config().get('kibana.index');
|
||||
|
||||
// Expose the client to the server
|
||||
exposeClient(server);
|
||||
|
@ -43,8 +45,8 @@ module.exports = function (kibana) {
|
|||
createProxy(server, 'POST', '/_msearch');
|
||||
createProxy(server, 'POST', '/_search/scroll');
|
||||
|
||||
function noBulkCheck(request, reply) {
|
||||
if (/\/_bulk/.test(request.path)) {
|
||||
function noBulkCheck({ path }, reply) {
|
||||
if (/\/_bulk/.test(path)) {
|
||||
return reply({
|
||||
error: 'You can not send _bulk requests to this interface.'
|
||||
}).code(400).takeover();
|
||||
|
@ -52,12 +54,12 @@ module.exports = function (kibana) {
|
|||
return reply.continue();
|
||||
}
|
||||
|
||||
function noCreateIndex(request, reply) {
|
||||
const requestPath = _.trimRight(_.trim(request.path), '/');
|
||||
const matchPath = createProxy.createPath(config.get('kibana.index'));
|
||||
function noCreateIndex({ path }, reply) {
|
||||
const requestPath = trimRight(trim(path), '/');
|
||||
const matchPath = createPath(kibanaIndex);
|
||||
|
||||
if (requestPath === matchPath) {
|
||||
return reply(Boom.methodNotAllowed('You cannot modify the primary kibana index through this interface.'));
|
||||
return reply(methodNotAllowed('You cannot modify the primary kibana index through this interface.'));
|
||||
}
|
||||
|
||||
reply.continue();
|
||||
|
@ -71,16 +73,16 @@ module.exports = function (kibana) {
|
|||
createProxy(
|
||||
server,
|
||||
['PUT', 'POST', 'DELETE'],
|
||||
'/' + config.get('kibana.index') + '/{paths*}',
|
||||
`/${kibanaIndex}/{paths*}`,
|
||||
{
|
||||
pre: [ noCreateIndex, noBulkCheck ]
|
||||
}
|
||||
);
|
||||
|
||||
// Set up the health check service and start it.
|
||||
var hc = healthCheck(this, server);
|
||||
server.expose('waitUntilReady', hc.waitUntilReady);
|
||||
hc.start();
|
||||
const { start, waitUntilReady } = healthCheck(this, server);
|
||||
server.expose('waitUntilReady', waitUntilReady);
|
||||
start();
|
||||
}
|
||||
});
|
||||
|
||||
|
|
|
@ -1,10 +1,10 @@
|
|||
var src = require('requirefrom')('src');
|
||||
var expect = require('expect.js');
|
||||
var util = require('util');
|
||||
var requireFromTest = require('requirefrom')('test');
|
||||
var kbnTestServer = requireFromTest('utils/kbn_server');
|
||||
|
||||
var format = util.format;
|
||||
|
||||
var KbnServer = src('server/KbnServer');
|
||||
var fromRoot = src('utils/fromRoot');
|
||||
|
||||
describe('plugins/elasticsearch', function () {
|
||||
describe('routes', function () {
|
||||
|
@ -12,27 +12,7 @@ describe('plugins/elasticsearch', function () {
|
|||
var kbnServer;
|
||||
|
||||
before(function () {
|
||||
kbnServer = new KbnServer({
|
||||
server: {
|
||||
autoListen: false,
|
||||
xsrf: {
|
||||
disableProtection: true
|
||||
}
|
||||
},
|
||||
logging: { quiet: true },
|
||||
plugins: {
|
||||
scanDirs: [
|
||||
fromRoot('src/plugins')
|
||||
]
|
||||
},
|
||||
optimize: {
|
||||
enabled: false
|
||||
},
|
||||
elasticsearch: {
|
||||
url: 'http://localhost:9210'
|
||||
}
|
||||
});
|
||||
|
||||
kbnServer = kbnTestServer.createServer();
|
||||
return kbnServer.ready()
|
||||
.then(() => kbnServer.server.plugins.elasticsearch.waitUntilReady());
|
||||
});
|
||||
|
@ -51,7 +31,7 @@ describe('plugins/elasticsearch', function () {
|
|||
var statusCode = options.statusCode || 200;
|
||||
describe(format('%s %s', options.method, options.url), function () {
|
||||
it('should return ' + statusCode, function (done) {
|
||||
kbnServer.server.inject(options, function (res) {
|
||||
kbnTestServer.makeRequest(kbnServer, options, function (res) {
|
||||
try {
|
||||
expect(res.statusCode).to.be(statusCode);
|
||||
done();
|
||||
|
|
|
@ -20,7 +20,8 @@ module.exports = function (kibana) {
|
|||
uses: [
|
||||
'visTypes',
|
||||
'spyModes',
|
||||
'fieldFormats'
|
||||
'fieldFormats',
|
||||
'navbarExtensions'
|
||||
],
|
||||
|
||||
injectVars: function (server, options) {
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
<div dashboard-app class="app-container dashboard-container">
|
||||
<navbar ng-show="chrome.getVisible()">
|
||||
<navbar ng-show="chrome.getVisible()" name="dashboard">
|
||||
<span class="name" ng-if="dash.id" ng-bind="::dash.title" tooltip="{{::dash.title}}"></span>
|
||||
|
||||
<form name="queryInput"
|
||||
|
|
|
@ -10,6 +10,7 @@ define(function (require) {
|
|||
require('ui/config');
|
||||
require('ui/notify');
|
||||
require('ui/typeahead');
|
||||
require('ui/navbar');
|
||||
require('ui/share');
|
||||
|
||||
require('plugins/kibana/dashboard/directives/grid');
|
||||
|
|
|
@ -40,7 +40,7 @@ define(function (require) {
|
|||
}
|
||||
|
||||
if (!field.indexed) {
|
||||
warnings.push('This field is not indexed and can not be visualized.');
|
||||
warnings.push('This field is not indexed and might not be usable in visualizations.');
|
||||
}
|
||||
}
|
||||
|
||||
|
|
|
@ -35,7 +35,7 @@
|
|||
</div>
|
||||
</div>
|
||||
|
||||
<a ng-show="field.indexed || field.scripted"
|
||||
<a
|
||||
ng-href="{{vizLocation(field)}}"
|
||||
class="sidebar-item-button primary">
|
||||
Visualize
|
||||
|
@ -43,8 +43,3 @@
|
|||
( {{::warnings.length}} <ng-pluralize count="warnings.length" when="{'1':'warning', 'other':'warnings'}"></ng-pluralize> <i aria-hidden="true" class="fa fa-warning"></i> )
|
||||
</span>
|
||||
</a>
|
||||
|
||||
<div ng-show="!field.indexed && !field.scripted"
|
||||
disabled="disabled"
|
||||
tooltip="This field is not indexed thus unavailable for visualization and search"
|
||||
class="sidebar-item-button primary">Not Indexed</div>
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
<div ng-controller="discover" class="app-container">
|
||||
<navbar>
|
||||
<navbar name="discover">
|
||||
<form role="form" class="fill inline-form" ng-submit="fetch()" name="discoverSearch">
|
||||
<div class="typeahead" kbn-typeahead="discover">
|
||||
<div class="input-group"
|
||||
|
|
|
@ -1,6 +1,7 @@
|
|||
define(function (require, module, exports) {
|
||||
require('plugins/kibana/discover/saved_searches/saved_searches');
|
||||
require('plugins/kibana/discover/directives/timechart');
|
||||
require('ui/navbar');
|
||||
require('ui/collapsible_sidebar');
|
||||
require('plugins/kibana/discover/components/field_chooser/field_chooser');
|
||||
require('plugins/kibana/discover/controllers/discover');
|
||||
|
@ -9,6 +10,6 @@ define(function (require, module, exports) {
|
|||
// preload
|
||||
require('ui/doc_table/components/table_row');
|
||||
|
||||
require('ui/saved_objects/saved_object_registry').register(require('plugins/kibana/discover/saved_searches/saved_search_register'));
|
||||
|
||||
var savedObjectRegistry = require('ui/saved_objects/saved_object_registry');
|
||||
savedObjectRegistry.register(require('plugins/kibana/discover/saved_searches/saved_search_register'));
|
||||
});
|
||||
|
|
|
@ -141,7 +141,6 @@ define(function (require) {
|
|||
|
||||
if (fieldTypes) {
|
||||
fields = $filter('fieldType')(fields, fieldTypes);
|
||||
fields = $filter('filter')(fields, { bucketable: true });
|
||||
fields = $filter('orderBy')(fields, ['type', 'name']);
|
||||
}
|
||||
|
||||
|
|
|
@ -1,6 +1,5 @@
|
|||
<div ng-controller="VisEditor" class="vis-editor vis-type-{{ vis.type.name }}">
|
||||
|
||||
<navbar ng-if="chrome.getVisible()">
|
||||
<navbar ng-if="chrome.getVisible()" name="visualize">
|
||||
<div class="fill bitty-modal-container">
|
||||
<div ng-if="vis.type.requiresSearch && $state.linked && !unlinking"
|
||||
ng-dblclick="unlink()"
|
||||
|
|
|
@ -4,6 +4,7 @@ define(function (require) {
|
|||
require('plugins/kibana/visualize/editor/sidebar');
|
||||
require('plugins/kibana/visualize/editor/agg_filter');
|
||||
|
||||
require('ui/navbar');
|
||||
require('ui/visualize');
|
||||
require('ui/collapsible_sidebar');
|
||||
require('ui/share');
|
||||
|
|
|
@ -1,5 +1,5 @@
|
|||
<kbn-agg-table
|
||||
table="table"
|
||||
export-title="vis.title"
|
||||
per-page="editableVis.params.spyPerPage">
|
||||
per-page="spy.params.spyPerPage">
|
||||
</kbn-agg-table>
|
||||
|
|
|
@ -21,8 +21,8 @@ define(function (require) {
|
|||
if (!$scope.vis || !$scope.esResp) {
|
||||
$scope.table = null;
|
||||
} else {
|
||||
if (!$scope.editableVis.params.spyPerPage) {
|
||||
$scope.editableVis.params.spyPerPage = PER_PAGE_DEFAULT;
|
||||
if (!$scope.spy.params.spyPerPage) {
|
||||
$scope.spy.params.spyPerPage = PER_PAGE_DEFAULT;
|
||||
}
|
||||
|
||||
$scope.table = tabifyAggResponse($scope.vis, $scope.esResp, {
|
||||
|
|
|
@ -1,10 +1,10 @@
|
|||
import { resolve } from 'path';
|
||||
import { fromNode as fn } from 'bluebird';
|
||||
import expect from 'expect.js';
|
||||
import requirefrom from 'requirefrom';
|
||||
|
||||
import KbnServer from '../KbnServer';
|
||||
|
||||
const src = resolve.bind(__dirname, '../../');
|
||||
const requireFromTest = requirefrom('test');
|
||||
const kbnTestServer = requireFromTest('utils/kbn_server');
|
||||
const basePath = '/kibana';
|
||||
|
||||
describe('Server basePath config', function () {
|
||||
|
@ -13,14 +13,8 @@ describe('Server basePath config', function () {
|
|||
|
||||
let kbnServer;
|
||||
before(async function () {
|
||||
kbnServer = new KbnServer({
|
||||
server: { autoListen: false, basePath },
|
||||
plugins: { scanDirs: [src('plugins')] },
|
||||
logging: { quiet: true },
|
||||
optimize: { enabled: false },
|
||||
elasticsearch: {
|
||||
url: 'http://localhost:9210'
|
||||
}
|
||||
kbnServer = kbnTestServer.createServer({
|
||||
server: { basePath }
|
||||
});
|
||||
await kbnServer.ready();
|
||||
return kbnServer;
|
||||
|
@ -30,12 +24,18 @@ describe('Server basePath config', function () {
|
|||
await kbnServer.close();
|
||||
});
|
||||
|
||||
it('appends the basePath to root redirect', async function () {
|
||||
const response = await kbnServer.inject({
|
||||
it('appends the basePath to root redirect', function (done) {
|
||||
const options = {
|
||||
url: '/',
|
||||
method: 'GET'
|
||||
};
|
||||
kbnTestServer.makeRequest(kbnServer, options, function (res) {
|
||||
try {
|
||||
expect(res.payload).to.match(/defaultRoute = '\/kibana\/app\/kibana'/);
|
||||
done();
|
||||
} catch (e) {
|
||||
done(e);
|
||||
}
|
||||
});
|
||||
|
||||
expect(response.payload).to.match(/defaultRoute = '\/kibana\/app\/kibana'/);
|
||||
});
|
||||
});
|
||||
|
|
|
@ -3,6 +3,7 @@ let Joi = require('joi');
|
|||
let _ = require('lodash');
|
||||
let { zipObject } = require('lodash');
|
||||
let override = require('./override');
|
||||
let createDefaultSchema = require('./schema');
|
||||
let pkg = require('requirefrom')('src/utils')('packageJson');
|
||||
const clone = require('./deepCloneWithBuffers');
|
||||
|
||||
|
@ -12,6 +13,10 @@ const vals = Symbol('config values');
|
|||
const pendingSets = Symbol('Pending Settings');
|
||||
|
||||
module.exports = class Config {
|
||||
static withDefaultSchema(settings = {}) {
|
||||
return new Config(createDefaultSchema(), settings);
|
||||
}
|
||||
|
||||
constructor(initialSchema, initialSettings) {
|
||||
this[schemaKeys] = new Map();
|
||||
|
||||
|
|
|
@ -20,6 +20,10 @@ module.exports = () => Joi.object({
|
|||
prod: Joi.boolean().default(Joi.ref('$prod'))
|
||||
}).default(),
|
||||
|
||||
dev: Joi.object({
|
||||
basePathProxyTarget: Joi.number().default(5603),
|
||||
}).default(),
|
||||
|
||||
pid: Joi.object({
|
||||
file: Joi.string(),
|
||||
exclusive: Joi.boolean().default(false)
|
||||
|
|
|
@ -1,6 +1,4 @@
|
|||
module.exports = function (kbnServer) {
|
||||
let Config = require('./config');
|
||||
let schema = require('./schema')();
|
||||
|
||||
kbnServer.config = new Config(schema, kbnServer.settings || {});
|
||||
kbnServer.config = Config.withDefaultSchema(kbnServer.settings);
|
||||
};
|
||||
|
|
|
@ -1,12 +1,13 @@
|
|||
import expect from 'expect.js';
|
||||
import KbnServer from '../../KbnServer';
|
||||
import requirefrom from 'requirefrom';
|
||||
|
||||
const requireFromTest = requirefrom('test');
|
||||
const kbnTestServer = requireFromTest('utils/kbn_server');
|
||||
|
||||
describe('cookie validation', function () {
|
||||
let kbnServer;
|
||||
beforeEach(function () {
|
||||
kbnServer = new KbnServer({
|
||||
server: { autoListen: false }
|
||||
});
|
||||
kbnServer = kbnTestServer.createServer();
|
||||
return kbnServer.ready();
|
||||
});
|
||||
afterEach(function () {
|
||||
|
@ -14,26 +15,28 @@ describe('cookie validation', function () {
|
|||
});
|
||||
|
||||
it('allows non-strict cookies', function (done) {
|
||||
kbnServer.server.inject({
|
||||
const options = {
|
||||
method: 'GET',
|
||||
url: '/',
|
||||
headers: {
|
||||
cookie: 'test:80=value;test_80=value'
|
||||
}
|
||||
}, (res) => {
|
||||
};
|
||||
kbnTestServer.makeRequest(kbnServer, options, (res) => {
|
||||
expect(res.payload).not.to.contain('Invalid cookie header');
|
||||
done();
|
||||
});
|
||||
});
|
||||
|
||||
it('returns an error if the cookie can\'t be parsed', function (done) {
|
||||
kbnServer.server.inject({
|
||||
const options = {
|
||||
method: 'GET',
|
||||
url: '/',
|
||||
headers: {
|
||||
cookie: 'a'
|
||||
}
|
||||
}, (res) => {
|
||||
};
|
||||
kbnTestServer.makeRequest(kbnServer, options, (res) => {
|
||||
expect(res.payload).to.contain('Invalid cookie header');
|
||||
done();
|
||||
});
|
||||
|
|
|
@ -2,7 +2,8 @@ import expect from 'expect.js';
|
|||
import { fromNode as fn } from 'bluebird';
|
||||
import { resolve } from 'path';
|
||||
|
||||
import KbnServer from '../../KbnServer';
|
||||
const requireFromTest = require('requirefrom')('test');
|
||||
const kbnTestServer = requireFromTest('utils/kbn_server');
|
||||
|
||||
const nonDestructiveMethods = ['GET'];
|
||||
const destructiveMethods = ['POST', 'PUT', 'DELETE'];
|
||||
|
@ -14,20 +15,16 @@ const version = require(src('../package.json')).version;
|
|||
describe('xsrf request filter', function () {
|
||||
function inject(kbnServer, opts) {
|
||||
return fn(cb => {
|
||||
kbnServer.server.inject(opts, (resp) => {
|
||||
kbnTestServer.makeRequest(kbnServer, opts, (resp) => {
|
||||
cb(null, resp);
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
const makeServer = async function () {
|
||||
const kbnServer = new KbnServer({
|
||||
server: { autoListen: false },
|
||||
plugins: { scanDirs: [src('plugins')] },
|
||||
logging: { quiet: true },
|
||||
optimize: { enabled: false },
|
||||
elasticsearch: {
|
||||
url: 'http://localhost:9210'
|
||||
const kbnServer = kbnTestServer.createServer({
|
||||
server: {
|
||||
xsrf: { disableProtection: false }
|
||||
}
|
||||
});
|
||||
|
||||
|
|
|
@ -1,13 +1,19 @@
import fs from 'fs';
import { readFileSync } from 'fs';
import { format as formatUrl } from 'url';
import httpolyglot from 'httpolyglot';

import tlsCiphers from './tls_ciphers';

export default function (kbnServer, server, config) {
// this mixin is used outside of the kbn server, so it MUST work without a full kbnServer object.
kbnServer = null;

// Create a new connection
var connectionOptions = {
host: config.get('server.host'),
port: config.get('server.port'),
const host = config.get('server.host');
const port = config.get('server.port');

const connectionOptions = {
host,
port,
state: {
strictHeader: false
},

@ -19,42 +25,39 @@ export default function (kbnServer, server, config) {
}
};

// enable tls if ssl key and cert are defined
if (config.get('server.ssl.key') && config.get('server.ssl.cert')) {
connectionOptions.tls = {
key: fs.readFileSync(config.get('server.ssl.key')),
cert: fs.readFileSync(config.get('server.ssl.cert')),
// The default ciphers in node 0.12.x include insecure ciphers, so until
// we enforce a more recent version of node, we craft our own list
// @see https://github.com/nodejs/node/blob/master/src/node_constants.h#L8-L28
ciphers: [
'ECDHE-RSA-AES128-GCM-SHA256',
'ECDHE-ECDSA-AES128-GCM-SHA256',
'ECDHE-RSA-AES256-GCM-SHA384',
'ECDHE-ECDSA-AES256-GCM-SHA384',
'DHE-RSA-AES128-GCM-SHA256',
'ECDHE-RSA-AES128-SHA256',
'DHE-RSA-AES128-SHA256',
'ECDHE-RSA-AES256-SHA384',
'DHE-RSA-AES256-SHA384',
'ECDHE-RSA-AES256-SHA256',
'DHE-RSA-AES256-SHA256',
'HIGH',
'!aNULL',
'!eNULL',
'!EXPORT',
'!DES',
'!RC4',
'!MD5',
'!PSK',
'!SRP',
'!CAMELLIA'
].join(':'),
// We use the server's cipher order rather than the client's to prevent
// the BEAST attack
honorCipherOrder: true
};
// enable tlsOpts if ssl key and cert are defined
const useSsl = config.get('server.ssl.key') && config.get('server.ssl.cert');

// not using https? well that's easy!
if (!useSsl) {
server.connection(connectionOptions);
return;
}

server.connection(connectionOptions);
server.connection({
...connectionOptions,
tls: true,
listener: httpolyglot.createServer({
key: readFileSync(config.get('server.ssl.key')),
cert: readFileSync(config.get('server.ssl.cert')),

ciphers: tlsCiphers,
// We use the server's cipher order rather than the client's to prevent the BEAST attack
honorCipherOrder: true
})
});

server.ext('onRequest', function (req, reply) {
if (req.raw.req.socket.encrypted) {
reply.continue();
} else {
reply.redirect(formatUrl({
port,
protocol: 'https',
hostname: host,
pathname: req.url.pathname,
search: req.url.search,
}));
}
});
}
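The hunk above moves the TLS options off the hapi connection and onto an httpolyglot listener, which lets a single port accept both plain HTTP and HTTPS and then redirect unencrypted requests. A minimal standalone sketch of that idea, not Kibana code, with hypothetical key/cert paths:

```js
// httpolyglot.createServer() mirrors https.createServer(): one listener speaks
// both protocols, and req.socket.encrypted tells us which one was used.
const httpolyglot = require('httpolyglot');
const { readFileSync } = require('fs');

const port = 5601;
const server = httpolyglot.createServer({
  key: readFileSync('/path/to/server.key'),   // hypothetical path
  cert: readFileSync('/path/to/server.crt'),  // hypothetical path
}, function (req, res) {
  if (!req.socket.encrypted) {
    // plain-HTTP request on the same port: bounce it to HTTPS
    res.writeHead(302, { Location: `https://${req.headers.host}${req.url}` });
    return res.end();
  }
  res.end('hello over TLS\n');
});

server.listen(port);
```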
src/server/http/tls_ciphers.js (new file, 26 lines)
@ -0,0 +1,26 @@
// The default ciphers in node 0.12.x include insecure ciphers, so until
// we enforce a more recent version of node, we craft our own list
// @see https://github.com/nodejs/node/blob/master/src/node_constants.h#L8-L28
export default [
'ECDHE-RSA-AES128-GCM-SHA256',
'ECDHE-ECDSA-AES128-GCM-SHA256',
'ECDHE-RSA-AES256-GCM-SHA384',
'ECDHE-ECDSA-AES256-GCM-SHA384',
'DHE-RSA-AES128-GCM-SHA256',
'ECDHE-RSA-AES128-SHA256',
'DHE-RSA-AES128-SHA256',
'ECDHE-RSA-AES256-SHA384',
'DHE-RSA-AES256-SHA384',
'ECDHE-RSA-AES256-SHA256',
'DHE-RSA-AES256-SHA256',
'HIGH',
'!aNULL',
'!eNULL',
'!EXPORT',
'!DES',
'!RC4',
'!MD5',
'!PSK',
'!SRP',
'!CAMELLIA'
].join(':');
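The exported value is a standard colon-joined OpenSSL cipher list, so it can be passed straight to any Node TLS option bag, not just the hapi/httpolyglot listener above. A minimal sketch, with hypothetical key/cert paths:

```js
import { readFileSync } from 'fs';
import { createServer } from 'https';
import tlsCiphers from './tls_ciphers';

// Pin the cipher list and prefer the server's ordering, as the mixin above does.
createServer({
  key: readFileSync('/path/to/server.key'),   // hypothetical path
  cert: readFileSync('/path/to/server.crt'),  // hypothetical path
  ciphers: tlsCiphers,
  honorCipherOrder: true,
}, (req, res) => res.end('ok\n')).listen(5601);
```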
@ -14,6 +14,7 @@ module.exports = function (kbnServer, server, config) {
else if (config.get('logging.quiet')) {
_.defaults(events, {
log: ['listening', 'error', 'fatal'],
request: ['error'],
error: '*'
});
}

@ -30,6 +31,7 @@ module.exports = function (kbnServer, server, config) {
_.defaults(events, {
log: ['info', 'warning', 'error', 'fatal'],
response: config.get('logging.json') ? '*' : '!',
request: ['info', 'warning', 'error', 'fatal'],
error: '*'
});
}
@ -28,6 +28,7 @@ let typeColors = {
debug: 'brightBlack',
server: 'brightBlack',
optmzr: 'white',
managr: 'green',
optimize: 'magenta',
listening: 'magenta'
};
@ -1,6 +1,6 @@
import expect from 'expect.js';

import UiExports from '../UiExports';
import UiExports from '../ui_exports';

describe('UiExports', function () {
describe('#find()', function () {
@ -6,10 +6,10 @@ module.exports = async (kbnServer, server, config) => {
let readFile = require('fs').readFileSync;

let fromRoot = require('../utils/fromRoot');
let UiExports = require('./UiExports');
let UiBundle = require('./UiBundle');
let UiBundleCollection = require('./UiBundleCollection');
let UiBundlerEnv = require('./UiBundlerEnv');
let UiExports = require('./ui_exports');
let UiBundle = require('./ui_bundle');
let UiBundleCollection = require('./ui_bundle_collection');
let UiBundlerEnv = require('./ui_bundler_env');
let loadingGif = readFile(fromRoot('src/ui/public/loading.gif'), { encoding: 'base64'});

let uiExports = kbnServer.uiExports = new UiExports({
@ -27,6 +27,7 @@ require('ui/storage');
require('ui/stringify/register');
require('ui/styleCompile');
require('ui/timefilter');
require('ui/timepicker');
require('ui/tooltip');
require('ui/typeahead');
require('ui/url');
@ -52,6 +52,13 @@ describe('ui/courier/fetch/_fetch_these', () => {
$rootScope.$apply();
expect(fakeResponses.callCount).to.be(3);
});

it('invokes request failure handler if starting fails', () => {
request.start = sinon.stub().returns(Promise.reject('some error'));
fetchThese(requests);
$rootScope.$apply();
sinon.assert.calledWith(request.handleFailure, 'some error');
});
});

context('when request has already started', () => {

@ -69,6 +76,12 @@ describe('ui/courier/fetch/_fetch_these', () => {
$rootScope.$apply();
expect(fakeResponses.callCount).to.be(3);
});
it('invokes request failure handler if continuing fails', () => {
request.continue = sinon.stub().returns(Promise.reject('some error'));
fetchThese(requests);
$rootScope.$apply();
sinon.assert.calledWith(request.handleFailure, 'some error');
});
});

function mockRequest() {

@ -76,6 +89,7 @@ describe('ui/courier/fetch/_fetch_these', () => {
strategy: 'mock',
started: true,
aborted: false,
handleFailure: sinon.spy(),
retry: sinon.spy(function () { return this; }),
continue: sinon.spy(function () { return this; }),
start: sinon.spy(function () { return this; })

@ -62,7 +62,8 @@ define(function (require) {
return new Promise(function (resolve) {
var action = req.started ? req.continue : req.start;
resolve(action.call(req));
});
})
.catch(err => req.handleFailure(err));
});
}
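The `_fetch_these` change above relies on a small but easy-to-miss property of the Promise constructor: wrapping `action.call(req)` in `resolve(...)` funnels both synchronous throws and returned rejections into the same `.catch`, which is why a single `handleFailure` hook covers both test cases. A minimal sketch of that pattern with a hypothetical request object:

```js
// Both failure modes (sync throw, async rejection) reach handleFailure.
function startOrContinue(req) {
  return new Promise(function (resolve) {
    var action = req.started ? req.continue : req.start;
    resolve(action.call(req)); // may throw synchronously or return a rejected promise
  })
  .catch(err => req.handleFailure(err));
}

startOrContinue({ started: false, start() { throw new Error('sync failure'); }, handleFailure: console.error });
startOrContinue({ started: true, continue: () => Promise.reject(new Error('async failure')), handleFailure: console.error });
```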
@ -39,7 +39,8 @@ describe('get filters', function () {
beforeEach(function () {
filters = [
{ query: { match: { extension: { query: 'jpg', type: 'phrase' } } } },
{ query: { match: { '@tags': { query: 'info', type: 'phrase' } } } }
{ query: { match: { '@tags': { query: 'info', type: 'phrase' } } } },
null
];
});

@ -69,16 +70,40 @@ describe('get filters', function () {
expect(res[1].$state.store).to.be(storeNames.app);
});

it('should return filters from specific states', function () {
it('should return non-null filters from specific states', function () {
var states = [
[ globalState, queryFilter.getGlobalFilters ],
[ appState, queryFilter.getAppFilters ],
];

_.each(states, function (state) {
state[0].filters = filters;
state[0].filters = filters.slice(0);
expect(state[0].filters).to.contain(null);

var res = state[1]();
expect(res.length).to.be(state[0].filters.length);
expect(state[0].filters).to.not.contain(null);
});
});

it('should replace the state, not save it', function () {
var states = [
[ globalState, queryFilter.getGlobalFilters ],
[ appState, queryFilter.getAppFilters ],
];

expect(appState.save.called).to.be(false);
expect(appState.replace.called).to.be(false);

_.each(states, function (state) {
expect(state[0].save.called).to.be(false);
expect(state[0].replace.called).to.be(false);

state[0].filters = filters.slice(0);
var res = state[1]();
expect(state[0].save.called).to.be(false);
expect(state[0].replace.called).to.be(true);
});
});
});
@ -39,7 +39,9 @@ describe('Query Filter', function () {
expect(queryFilter.pinFilter).to.be.a('function');
expect(queryFilter.pinAll).to.be.a('function');
});

});

});

describe('Actions', function () {
@ -23,11 +23,19 @@ define(function (require) {
queryFilter.getAppFilters = function () {
var appState = getAppState();
if (!appState || !appState.filters) return [];

// Work around for https://github.com/elastic/kibana/issues/5896
appState.filters = validateStateFilters(appState);

return (appState.filters) ? _.map(appState.filters, appendStoreType('appState')) : [];
};

queryFilter.getGlobalFilters = function () {
if (!globalState.filters) return [];

// Work around for https://github.com/elastic/kibana/issues/5896
globalState.filters = validateStateFilters(globalState);

return _.map(globalState.filters, appendStoreType('globalState'));
};

@ -213,6 +221,19 @@ define(function (require) {

return queryFilter;

/**
* Rids filter list of null values and replaces state if any nulls are found
*/
function validateStateFilters(state) {
var compacted = _.compact(state.filters);
if (state.filters.length !== compacted.length) {
state.filters = compacted;
state.replace();
}
return state.filters;
}

/**
* Saves both app and global states, ensuring filters are persisted
* @returns {object} Resulting filter list, app and global combined
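The `validateStateFilters` helper above is the whole workaround for issue #5896: it strips `null` entries out of the persisted filter list and rewrites the state in place. A minimal sketch that runs outside Angular, using a stand-in state object with the same `filters`/`replace()` shape (the stand-in and its logging are hypothetical):

```js
const _ = require('lodash');

function validateStateFilters(state) {
  const compacted = _.compact(state.filters);        // drops null/undefined entries
  if (state.filters.length !== compacted.length) {
    state.filters = compacted;
    state.replace();                                 // rewrite the stored state rather than save a new entry
  }
  return state.filters;
}

const state = {
  filters: [{ query: {} }, null, { query: {} }],
  replace: () => console.log('state replaced'),
};
console.log(validateStateFilters(state).length);     // 2, and "state replaced" is logged once
```

Using `replace()` instead of `save()` is what the new 'should replace the state, not save it' test pins down: the cleanup should not create an extra state change on top of the one the user made.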
@ -112,7 +112,6 @@ describe('index pattern', function () {

describe('fields', function () {
it('should have expected properties on fields', function () {
expect(indexPattern.fields[0]).to.have.property('bucketable');
expect(indexPattern.fields[0]).to.have.property('displayName');
expect(indexPattern.fields[0]).to.have.property('filterable');
expect(indexPattern.fields[0]).to.have.property('format');

@ -39,7 +39,6 @@ define(function (require) {
var indexed = !!spec.indexed;
var scripted = !!spec.scripted;
var sortable = spec.name === '_score' || ((indexed || scripted) && type.sortable);
var bucketable = indexed || scripted;
var filterable = spec.name === '_id' || scripted || (indexed && type.filterable);

obj.fact('name');

@ -59,7 +58,6 @@ define(function (require) {
// usage flags, read-only and won't be saved
obj.comp('format', format);
obj.comp('sortable', sortable);
obj.comp('bucketable', bucketable);
obj.comp('filterable', filterable);

// computed values
@ -8,8 +8,8 @@
*
* In the scenario below, require.js would load directive.js first because it is a
* dependency of app.js. This would cause the call to `angular.module('app')` to
* execute before the module is actually created. This causes angular to through an
* error. This effect is magnifies when app.js links off to many different modules.
* execute before the module is actually created. This causes angular to throw an
* error. This effect is magnified when app.js links off to many different modules.
*
* This is normally solved by creating unique modules per file, listed as the 1st
* alternate solution below. Unfortunately this solution would have required that
src/ui/public/navbar/__tests__/navbar.js (new file, 140 lines)
@ -0,0 +1,140 @@
const ngMock = require('ngMock');
const sinon = require('sinon');
const expect = require('expect.js');
const angular = require('angular');
const _ = require('lodash');

require('ui/navbar');
const navbarExtensionsRegistry = require('ui/registry/navbar_extensions');
const Registry = require('ui/registry/_registry');

const defaultMarkup = `
<navbar name="testing">
<div class="button-group" role="toolbar">
<button>
<i aria-hidden="true" class="fa fa-file-new-o"></i>
</button>
<button>
<i aria-hidden="true" class="fa fa-save"></i>
</button>
<button>
<i aria-hidden="true" class="fa fa-folder-open-o"></i>
</button>
</div>
</navbar>`;

describe('navbar directive', function () {
let $rootScope;
let $compile;
let stubRegistry;

beforeEach(function () {
ngMock.module('kibana', function (PrivateProvider) {
stubRegistry = new Registry({
index: ['name'],
group: ['appName'],
order: ['order']
});

PrivateProvider.swap(navbarExtensionsRegistry, stubRegistry);
});

ngMock.module('kibana/navbar');

// Create the scope
ngMock.inject(function ($injector) {
$rootScope = $injector.get('$rootScope');
$compile = $injector.get('$compile');
});
});

function init(markup = defaultMarkup) {
// Give us a scope
const $el = angular.element(markup);
$compile($el)($rootScope);
$el.scope().$digest();
return $el;
}

describe('incorrect use', function () {
it('should throw if missing a name property', function () {
const markup = `<navbar><div class="button-group" role="toolbar"></div></navbar>`;
expect(() => init(markup)).to.throwException(/requires a name attribute/);
});

it('should throw if missing a button group', function () {
const markup = `<navbar name="testing"></navbar>`;
expect(() => init(markup)).to.throwException(/must have exactly 1 button group/);
});

it('should throw if multiple button groups', function () {
const markup = ` <navbar name="testing">
<div class="button-group" role="toolbar">
<button>
<i aria-hidden="true" class="fa fa-file-new-o"></i>
</button>
<button>
<i aria-hidden="true" class="fa fa-save"></i>
</button>
</div>
<div class="button-group" role="toolbar">
<button>
<i aria-hidden="true" class="fa fa-folder-open-o"></i>
</button>
</div>
</navbar>`;
expect(() => init(markup)).to.throwException(/must have exactly 1 button group/);
});

it('should throw if button group not direct child', function () {
const markup = `<navbar><div><div class="button-group" role="toolbar"></div></div></navbar>`;
expect(() => init(markup)).to.throwException(/must have exactly 1 button group/);
});
});

describe('injecting extensions', function () {
function registerExtension(def = {}) {
stubRegistry.register(function () {
return _.defaults(def, {
name: 'exampleButton',
appName: 'testing',
order: 0,
template: `
<button class="test-button">
<i aria-hidden="true" class="fa fa-rocket"></i>
</button>`
});
});
}

it('should use the default markup', function () {
var $el = init();
expect($el.find('.button-group button').length).to.equal(3);
});

it('should append to end then order == 0', function () {
registerExtension({ order: 0 });
var $el = init();

expect($el.find('.button-group button').length).to.equal(4);
expect($el.find('.button-group button').last().hasClass('test-button')).to.be.ok();
});

it('should append to end then order > 0', function () {
registerExtension({ order: 1 });
var $el = init();

expect($el.find('.button-group button').length).to.equal(4);
expect($el.find('.button-group button').last().hasClass('test-button')).to.be.ok();
});

it('should append to end then order < 0', function () {
registerExtension({ order: -1 });
var $el = init();

expect($el.find('.button-group button').length).to.equal(4);
expect($el.find('.button-group button').first().hasClass('test-button')).to.be.ok();
});
});
});
src/ui/public/navbar/navbar.js (new file, 56 lines)
@ -0,0 +1,56 @@
const _ = require('lodash');
const $ = require('jquery');
const navbar = require('ui/modules').get('kibana/navbar');

require('ui/render_directive');

navbar.directive('navbar', function (Private, $compile) {
const navbarExtensions = Private(require('ui/registry/navbar_extensions'));
const getExtensions = _.memoize(function (name) {
if (!name) throw new Error('navbar directive requires a name attribute');
return _.sortBy(navbarExtensions.byAppName[name], 'order');
});

return {
restrict: 'E',
template: function ($el, $attrs) {
const $buttonGroup = $el.children('.button-group');
if ($buttonGroup.length !== 1) throw new Error('navbar must have exactly 1 button group');

const extensions = getExtensions($attrs.name);
const buttons = $buttonGroup.children().detach().toArray();
const controls = [
...buttons.map(function (button) {
return {
order: 0,
$el: $(button),
};
}),
...extensions.map(function (extension, i) {
return {
order: extension.order,
index: i,
extension: extension,
};
}),
];

_.sortBy(controls, 'order').forEach(function (control) {
if (control.$el) {
return $buttonGroup.append(control.$el);
}

const { extension, index } = control;
const $ext = $(`<render-directive definition="navbar.extensions[${index}]"></render-directive>`);
$ext.html(extension.template);
$buttonGroup.append($ext);
});

return $el.html();
},
controllerAs: 'navbar',
controller: function ($attrs) {
this.extensions = getExtensions($attrs.name);
}
};
});
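The new directive reads its buttons from the `navbarExtensions` registry, grouped by `appName` and sorted by `order`. A minimal registration sketch, grounded in the fields the tests above use; the extension name, the `discover` app name, and the button contents are hypothetical:

```js
// A plugin-side module registers a provider function; the spec it returns is
// matched to <navbar name="..."> via appName and slotted into the button group
// by order (negative orders sort before the built-in buttons, which are order 0).
const navbarExtensions = require('ui/registry/navbar_extensions');

navbarExtensions.register(function () {
  return {
    name: 'myPluginButton',   // hypothetical extension name
    appName: 'discover',      // must match the navbar's name attribute
    order: 100,               // larger order appends later in the group
    template: `
      <button>
        <i aria-hidden="true" class="fa fa-rocket"></i>
      </button>`
  };
});
```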
src/ui/public/registry/navbar_extensions.js (new file, 8 lines)
@ -0,0 +1,8 @@
define(function (require) {
return require('ui/registry/_registry')({
name: 'navbarExtensions',
index: ['name'],
group: ['appName'],
order: ['order']
});
});
@ -1,6 +1,6 @@
define(function (require) {
return require('ui/registry/_registry')({
name: 'visTypes',
name: 'spyModes',
index: ['name'],
order: ['order']
});
src/ui/public/render_directive/__tests__/render_directive.js (new file, 65 lines)
@ -0,0 +1,65 @@
const angular = require('angular');
const sinon = require('sinon');
const expect = require('expect.js');
const ngMock = require('ngMock');

require('ui/render_directive');

let $parentScope;
let $elem;
let $directiveScope;

function init(markup = '', definition = {}) {
ngMock.module('kibana/render_directive');

// Create the scope
ngMock.inject(function ($rootScope, $compile) {
$parentScope = $rootScope;

// create the markup
$elem = angular.element('<render-directive>');
$elem.html(markup);
if (definition !== null) {
$parentScope.definition = definition;
$elem.attr('definition', 'definition');
}

// compile the directive
$compile($elem)($parentScope);
$parentScope.$apply();

$directiveScope = $elem.isolateScope();
});
}

describe('render_directive', function () {
describe('directive requirements', function () {
it('should throw if not given a definition', function () {
expect(() => init('', null)).to.throwException(/must have a definition/);
});
});

describe('rendering with definition', function () {
it('should call link method', function () {
const markup = '<p>hello world</p>';
const definition = {
link: sinon.stub(),
};

init(markup, definition);

sinon.assert.callCount(definition.link, 1);
});

it('should call controller method', function () {
const markup = '<p>hello world</p>';
const definition = {
controller: sinon.stub(),
};

init(markup, definition);

sinon.assert.callCount(definition.controller, 1);
});
});
});
src/ui/public/render_directive/render_directive.js (new file, 30 lines)
@ -0,0 +1,30 @@
const _ = require('lodash');
const $ = require('jquery');
const module = require('ui/modules').get('kibana/render_directive');

module.directive('renderDirective', function () {
return {
restrict: 'E',
scope: {
'definition': '='
},
template: function ($el, $attrs) {
return $el.html();
},
controller: function ($scope, $element, $attrs, $transclude) {
if (!$scope.definition) throw new Error('render-directive must have a definition attribute');

const { controller, controllerAs } = $scope.definition;
if (controller) {
if (controllerAs) $scope[controllerAs] = this;
$scope.$eval(controller, { $scope, $element, $attrs, $transclude });
}
},
link: function ($scope, $el, $attrs) {
const { link } = $scope.definition;
if (link) {
link($scope, $el, $attrs);
}
}
};
});
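A minimal usage sketch, grounded in the tests above: the directive renders whatever markup it wraps and calls the optional hooks on the bound `definition` object. The `$scope.def` name and the CSS class are hypothetical:

```js
// On some controller scope that hosts the directive:
$scope.def = {
  // called once the inner markup has been compiled, with the isolate scope,
  // the element, and the attributes
  link: function (scope, $el, $attrs) {
    $el.addClass('example-rendered');  // hypothetical class, just to show the hook ran
  }
};

// and in the template:
// <render-directive definition="def">
//   <p>hello world</p>
// </render-directive>
```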
|
@ -458,3 +458,14 @@ fieldset {
|
|||
overflow-y: hidden;
|
||||
}
|
||||
}
|
||||
|
||||
.list-group {
|
||||
.list-group-item {
|
||||
&.active,
|
||||
&.active:hover,
|
||||
&.active:focus {
|
||||
background-color: @list-group-menu-item-active-bg;
|
||||
cursor: default;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
src/ui/public/styles/fonts.less (new file, 100 lines)
@ -0,0 +1,100 @@
@font-face {
font-family: 'Open Sans';
font-style: normal;
font-weight: 300;
src: local('Open Sans Light'), local('OpenSans-Light'),
url('fonts/open_sans/open_sans_v13_latin_300.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_300.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_300.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_300.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: italic;
font-weight: 300;
src: local('Open Sans Light Italic'), local('OpenSansLight-Italic'),
url('fonts/open_sans/open_sans_v13_latin_300italic.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_300italic.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_300italic.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_300italic.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: normal;
font-weight: 400;
src: local('Open Sans'), local('OpenSans'),
url('fonts/open_sans/open_sans_v13_latin_regular.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_regular.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_regular.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_regular.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: italic;
font-weight: 400;
src: local('Open Sans Italic'), local('OpenSans-Italic'),
url('fonts/open_sans/open_sans_v13_latin_italic.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_italic.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_italic.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_italic.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: normal;
font-weight: 600;
src: local('Open Sans Semibold'), local('OpenSans-Semibold'),
url('fonts/open_sans/open_sans_v13_latin_600.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_600.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_600.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_600.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: italic;
font-weight: 600;
src: local('Open Sans Semibold Italic'), local('OpenSans-SemiboldItalic'),
url('fonts/open_sans/open_sans_v13_latin_600italic.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_600italic.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_600italic.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_600italic.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: normal;
font-weight: 700;
src: local('Open Sans Bold'), local('OpenSans-Bold'),
url('fonts/open_sans/open_sans_v13_latin_700.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_700.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_700.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_700.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: italic;
font-weight: 700;
src: local('Open Sans Bold Italic'), local('OpenSans-BoldItalic'),
url('fonts/open_sans/open_sans_v13_latin_700italic.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_700italic.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_700italic.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_700italic.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: normal;
font-weight: 800;
src: local('Open Sans Extrabold'), local('OpenSans-Extrabold'),
url('fonts/open_sans/open_sans_v13_latin_800.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_800.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_800.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_800.svg#OpenSans') format('svg');
}
@font-face {
font-family: 'Open Sans';
font-style: italic;
font-weight: 800;
src: local('Open Sans Extrabold Italic'), local('OpenSans-ExtraboldItalic'),
url('fonts/open_sans/open_sans_v13_latin_800italic.woff2') format('woff2'),
url('fonts/open_sans/open_sans_v13_latin_800italic.woff') format('woff'),
url('fonts/open_sans/open_sans_v13_latin_800italic.ttf') format('truetype'),
url('fonts/open_sans/open_sans_v13_latin_800italic.svg#OpenSans') format('svg');
}
src/ui/public/styles/fonts/open_sans/LICENSE.txt (new file, 202 lines)
@ -0,0 +1,202 @@

Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/

TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

1. Definitions.

"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.

"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.

"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.

"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.

"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.

"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).

"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.

"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."

"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.

2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.

3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:

(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and

(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and

(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.

You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.

END OF TERMS AND CONDITIONS

APPENDIX: How to apply the Apache License to your work.

To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.

Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_300.svg (new file, 1633 lines, 104 KiB)
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_300.ttf (new binary file, 108 KiB)
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_600.svg (new file, 1637 lines, 104 KiB)
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_600.ttf (new binary file, 108 KiB)
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_700.svg (new file, 1635 lines, 104 KiB)
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_700.ttf (new binary file, 107 KiB)
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_800.svg (new file, 1637 lines, 105 KiB)
src/ui/public/styles/fonts/open_sans/open_sans_v13_latin_800.ttf (new binary file, 107 KiB)