Merge remote-tracking branch 'upstream/6.7' into 6.7
@@ -5,7 +5,6 @@ set -e

 # move to Kibana root
 cd "$(dirname "$0")/.."

-source src/dev/ci_setup/extract_bootstrap_cache.sh
 source src/dev/ci_setup/setup.sh

 case "$JOB" in
@@ -1,22 +1,23 @@
 [[add-sample-data]]
 == Get up and running with sample data

-{kib} has three sample data sets that you can use to explore {kib} before loading your own data
-source. Each set is prepackaged with a dashboard of visualizations and a
-{kibana-ref}/canvas-getting-started.html[Canvas workpad].
+{kib} has several sample data sets that you can use to explore {kib} before loading your own data.
+Sample data sets install prepackaged visualizations, dashboards,
+{kibana-ref}/canvas-getting-started.html[Canvas workpads],
+and {kibana-ref}/maps.html[Maps].

-The sample data sets address common use cases:
+The sample data sets showcase a variety of use cases:

 * *eCommerce orders* includes visualizations for product-related information,
 such as cost, revenue, and price.
 * *Web logs* lets you analyze website traffic.
 * *Flight data* enables you to view and interact with flight routes for four airlines.

 To get started, go to the home page and click the link next to *Add sample data*.

 Once you have loaded a data set, click *View data* to view visualizations in *Dashboard*.

 *Note:* The timestamps in the sample data sets are relative to when they are installed.
+If you uninstall and reinstall a data set, the timestamps will change to reflect the most recent installation.
BIN  docs/maps/images/grid_metrics_both.png        (new file, 113 KiB)
BIN  docs/maps/images/gs_add_cloropeth_layer.png   (new file, 967 KiB)
BIN  docs/maps/images/gs_add_es_layer.png          (new file, 809 KiB)
BIN  docs/maps/images/gs_create_new_map.png        (new file, 731 KiB)
BIN  docs/maps/images/gs_link_icon.png             (new file, 665 B)
BIN  docs/maps/images/gs_plus_icon.png             (new file, 629 B)
BIN  docs/maps/images/sample_data_web_logs.png     (new file, 698 KiB)
@@ -13,6 +13,7 @@ image::maps/images/sample_data_ecommerce.png[]

 --

+include::maps-getting-started.asciidoc[]
 include::heatmap-layer.asciidoc[]
 include::tile-layer.asciidoc[]
 include::vector-layer.asciidoc[]
docs/maps/maps-getting-started.asciidoc (new file, 171 lines)
@@ -0,0 +1,171 @@
[[maps-getting-started]]
== Getting started with Maps

You work with *Maps* by adding layers. The data for a layer can come from
sources such as {es} documents, vector sources, tile map services, web map
services, and more. You can symbolize the data in different ways.
For example, you might show which airports have the longest flight
delays by using circles from small to big. Or,
you might show the amount of web log traffic by shading countries from
light to dark.

[role="screenshot"]
image::maps/images/sample_data_web_logs.png[]

[float]
=== Prerequisites
Before you start this tutorial, <<add-sample-data, add the web logs sample data set>>. Each
sample data set includes a map to go along with the data. Once you've added the data, open *Maps* and
explore the different layers of the *[Logs] Total Requests and Bytes* map.
You'll re-create this map in this tutorial.

[float]
=== Take-away skills
In this tutorial, you'll learn to:

* Create a multi-layer map
* Connect a layer to a data source
* Use symbols, colors, and labels to style a layer
* Create layers for {es} data


=== Creating a new map

The first thing to do is to create a new map.

. If you haven't already, open *Maps*.
. On the maps list page, click *Create map*.
+
A new map is created using a base tile layer.
+
[role="screenshot"]
image::maps/images/gs_create_new_map.png[]


=== Adding a choropleth layer

Now that you have a map, you'll want to add layers to it.
The first layer you'll add is a choropleth layer to shade world countries
by web log traffic. Darker shades symbolize countries with more web log traffic,
and lighter shades symbolize countries with less traffic.

==== Add a vector layer from the Elastic Maps Service source

. In the map legend, click *Add layer*.
. Click the *Vector shapes* data source.
. From the *Layer* dropdown menu, select *World Countries*.
. Click the *Add layer* button.
. Set *Layer name* to `Total Requests by Country`.
. Set *Layer transparency* to 0.5.

===== Join the vector layer with the sample web log index

You must add the web log traffic property to the world countries so
that the property is available for styling.
You'll create a <<terms-join, terms join>> to link the vector source *World Countries* to
the {es} index `kibana_sample_data_logs` on the shared key iso2 = geo.src.
A sketch of what the join computes follows the steps below.

. Click plus image:maps/images/gs_plus_icon.png[] to the right of the *Term Joins* label.
. Click *Join --select--*.
. Set *Left field* to *ISO 3166-1 alpha-2 code*.
. Set *Right source* to *kibana_sample_data_logs*.
. Set *Right field* to *geo.src*.
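
A minimal sketch of what the join computes, with hypothetical `countries`
features and `buckets` from a terms aggregation on `geo.src` (illustrative
only, not Kibana source):

[source,js]
----
// Attach each bucket's doc_count to the country whose iso2 matches the key.
const joined = countries.map(feature => {
  const bucket = buckets.find(b => b.key === feature.properties.iso2);
  return {
    ...feature,
    properties: {
      ...feature.properties,
      requestCount: bucket ? bucket.doc_count : 0, // drives the fill color
    },
  };
});
----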

===== Set the vector style

The final step is to set the vector fill color to shade
the countries by web log traffic.

. Click image:maps/images/gs_link_icon.png[] to the right of *Fill color*.
. Select the grey color ramp.
. In the field select input, select *count of kibana_sample_data_logs:geo.src*.
. Click *Save & close*.
+
Your map now looks like this:
+
[role="screenshot"]
image::maps/images/gs_add_cloropeth_layer.png[]

=== Adding layers for {es} data

You'll add two layers for {es} data. The first layer displays documents, and the
second layer displays aggregated data.
The raw documents appear when you zoom in the map to show smaller regions.
The aggregated data
appears when you zoom out the map to show larger areas of the globe.

==== Add a vector layer from the document source

This layer displays web log documents as points.
The layer is only visible when you zoom in the map past zoom level 9.

. In the map legend, click *Add layer*.
. Click the *Documents* data source.
. Set *Index pattern* to *kibana_sample_data_logs*.
. Click the *Add layer* button.
. Set *Layer name* to `Actual Requests`.
. Set *Min zoom* to 9 and *Max zoom* to 24.
. Set *Layer transparency* to 1.
. Set *Fill color* to *#2200ff*.
. Click *Save & close*.

==== Add a vector layer from the grid aggregation source

Aggregations group {es} documents into grids. You can calculate metrics
for each gridded cell.

You'll create a layer for aggregated data and make it visible only when the map
is zoomed out past zoom level 9. Darker colors will symbolize grids
with more web log traffic, and lighter colors will symbolize grids with less
traffic. Larger circles will symbolize grids with
more total bytes transferred, and smaller circles will symbolize
grids with fewer bytes transferred.

[role="screenshot"]
image::maps/images/grid_metrics_both.png[]

===== Add the layer

. In the map legend, click *Add layer*.
. Click the *Grid aggregation* data source.
. Set *Index pattern* to *kibana_sample_data_logs*.
. Click the *Add layer* button.
. Set *Layer name* to `Total Requests and Bytes`.
. Set *Min zoom* to 0 and *Max zoom* to 9.
. Set *Layer transparency* to 1.

===== Configure the aggregation metrics

. Click plus image:maps/images/gs_plus_icon.png[] to the right of the *Metrics* label.
. Select *Sum* in the aggregation select.
. Select *bytes* in the field select.

===== Set the vector style

. In *Vector style*, change *Symbol size*:
.. Set *Min size* to 1.
.. Set *Max size* to 25.
.. In the field select, select *sum of bytes*.
. Click the *Save & close* button.
+
Your map now looks like this:
+
[role="screenshot"]
image::maps/images/gs_add_es_layer.png[]

=== Saving the map
Now that your map is complete, you'll want to save it so others can use it.

. In the application toolbar, click *Save*.
. Enter `Tutorial web logs map` for the title.
. Click *Confirm Save*.

You're now ready to start creating maps using your own data. You might find
these resources helpful:

* <<heatmap-layer, Heat map layer>>
* <<tile-layer, Tile layer>>
* <<vector-layer, Vector layer>>
@@ -39,6 +39,13 @@ Alternatively, you can download other Docker images that contain only features
 available under the Apache 2.0 license. To download the images, go to
 https://www.docker.elastic.co[www.docker.elastic.co].

+[float]
+=== Running Kibana on Docker for development
+Kibana can be quickly started and connected to a local Elasticsearch container for development
+or testing use with the following command:
+--------------------------------------------
+docker run --link YOUR_ELASTICSEARCH_CONTAINER_NAME_OR_ID:elasticsearch -p 5601:5601 {docker-repo}:{version}
+--------------------------------------------
 endif::[]

 [float]
@@ -166,7 +166,7 @@
     "globby": "^8.0.1",
     "good-squeeze": "2.1.0",
     "h2o2": "^8.1.2",
-    "handlebars": "4.0.5",
+    "handlebars": "4.0.13",
     "hapi": "^17.5.3",
     "hjson": "3.1.0",
     "hoek": "^5.0.4",
@@ -53,7 +53,7 @@ describe('Filter Manager', function () {
       const field = getField(indexPattern, 'script number');
       expected.meta.field = 'script number';
       _.set(expected, 'script.script', {
-        inline: '(' + field.script + ') == value',
+        source: '(' + field.script + ') == value',
         lang: 'expression',
         params: {
           value: 5,

@@ -51,7 +51,7 @@ describe('Filter Manager', function () {
       expected.meta.field = 'script number';
       _.set(expected, 'script.script', {
         lang: 'expression',
-        inline: '(' + field.script + ')>=gte && (' + field.script + ')<=lte',
+        source: '(' + field.script + ')>=gte && (' + field.script + ')<=lte',
         params: {
           value: '>=1 <=3',
           gte: 1,

@@ -68,7 +68,7 @@ describe('Filter Manager', function () {
         `gte(() -> { ${field.script} }, params.gte) && ` +
         `lte(() -> { ${field.script} }, params.lte)`;

-      const inlineScript = buildRangeFilter(field, { gte: 1, lte: 3 }, indexPattern).script.script.inline;
+      const inlineScript = buildRangeFilter(field, { gte: 1, lte: 3 }, indexPattern).script.script.source;
       expect(inlineScript).to.be(expected);
     });

@@ -89,7 +89,7 @@ describe('Filter Manager', function () {
         params[key] = 5;
         const filter = buildRangeFilter(field, params, indexPattern);

-        expect(filter.script.script.inline).to.be(
+        expect(filter.script.script.source).to.be(
           '(' + field.script + ')' + operator + key);
         expect(filter.script.script.params[key]).to.be(5);
         expect(filter.script.script.params.value).to.be(operator + 5);

@@ -120,7 +120,7 @@ describe('Filter Manager', function () {
       it('does not contain a script condition for the infinite side', function () {
         const field = getField(indexPattern, 'script number');
         const script = field.script;
-        expect(filter.script.script.inline).to.equal(`(${script})>=gte`);
+        expect(filter.script.script.source).to.equal(`(${script})>=gte`);
       });
     });
   });

@@ -40,7 +40,7 @@ export function getPhraseScript(field, value) {

   return {
     script: {
-      inline: script,
+      source: script,
       lang: field.lang,
       params: {
        value: convertedValue

@@ -96,7 +96,7 @@ export function getRangeScript(field, params) {

   return {
     script: {
-      inline: script,
+      source: script,
       params: knownParams,
       lang: field.lang
     }
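
Note on the `inline` → `source` renames above and throughout this merge:
Elasticsearch deprecated the `inline` key of script objects in favor of
`source` (deprecated in 5.6, removed in 7.0), so every script body Kibana
builds or asserts on has to switch keys. A minimal sketch of the resulting
request shape, using a hypothetical scripted field:

----
// Hypothetical scripted-field filter after the rename; the script text
// stands in for a real scripted field's source.
const scriptFilter = {
  script: {
    script: {
      source: "doc['price'].value > params.value", // was `inline` before
      lang: 'painless',
      params: { value: 5 },
    },
  },
};
----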
@@ -18,6 +18,7 @@
  */

 import { FUNCTIONS_URL } from './consts';
+import _ from 'lodash';

 /**
  * Create a function which executes an Expression function on the

@@ -51,12 +52,30 @@ export function batchedFetch({ kfetch, serialize, ms = 10 }) {
       timeout = setTimeout(runBatch, ms);
     }

-    const id = nextId();
+    const request = {
+      functionName,
+      args,
+      context: serialize(context),
+    };
+
+    // Check to see if this is a duplicate server function.
+    const duplicate = Object.values(batch).find(batchedRequest =>
+      _.isMatch(batchedRequest.request, request)
+    );
+
+    // If it is, just return the promise of the duplicated request.
+    if (duplicate) {
+      return duplicate.future.promise;
+    }
+
+    // If not, create a new promise, id, and add it to the batched collection.
     const future = createFuture();
+    const id = nextId();
+    request.id = id;

     batch[id] = {
       future,
-      request: { id, functionName, args, context: serialize(context) },
+      request,
     };

     return future.promise;
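
The deduplication above relies on lodash's partial matching, which is why a
stored request (which already carries an `id`) can still match a new, id-less
request. A small sketch of the semantics:

----
// _.isMatch(object, source) checks that `object` contains `source`'s
// properties, so extra keys on the stored request don't block a match.
const stored = { id: 1, functionName: 'hello', args: {}, context: 'ctx' };
const incoming = { functionName: 'hello', args: {}, context: 'ctx' };
_.isMatch(stored, incoming); // true -> the new call reuses the stored promise
----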
@@ -67,7 +67,7 @@ describe('kbn-interpreter/interpreter', () => {

     expect(register).toHaveBeenCalledTimes(2);

-    const [ hello, world ] = register.mock.calls.map(([fn]) => fn());
+    const [hello, world] = register.mock.calls.map(([fn]) => fn());

     expect(hello.name).toEqual('hello');
     expect(typeof hello.fn).toEqual('function');

@@ -85,14 +85,15 @@ describe('kbn-interpreter/interpreter', () => {
       pathname: FUNCTIONS_URL,
       method: 'POST',
       body: JSON.stringify({
-        functions: [{
-          id: 1,
-          functionName: 'hello',
-          args,
-          context,
-        }]
+        functions: [
+          {
+            functionName: 'hello',
+            args,
+            context,
+            id: 1,
+          },
+        ],
       }),
     });
   });

 });
@@ -5,6 +5,6 @@
   ],
   "include": [
     "./src/**/*.ts",
-    "./types/index.d.ts"
+    "./types/**/*.ts"
   ]
 }
@@ -17,9 +17,5 @@
  * under the License.
  */

-import { ToolingLog } from '@kbn/dev-utils';
-
-export function createFailError(msg: string, exitCode?: number): Error;
-export function run(
-  body: (args: { flags: Record<string, any>; log: ToolingLog }) => void
-): Promise<void>;
+require('../src/setup_node_env');
+require('../src/dev/typescript/run_check_ts_projects_cli').runCheckTsProjectsCli();
src/dev/ci_setup/extract_bootstrap_cache.sh (deleted)
@@ -1,15 +0,0 @@
-#!/usr/bin/env bash
-
-set -e
-
-###
-### Extract the bootstrap cache that we create in the packer_cache.sh script
-###
-bootstrapCache="$HOME/.kibana/bootstrap_cache/master.tar"
-if [ -f "$bootstrapCache" ]; then
-  echo "extracting bootstrap_cache from $bootstrapCache";
-  tar -xf "$bootstrapCache";
-else
-  echo "bootstrap_cache missing";
-  exit 1;
-fi
@@ -21,18 +21,23 @@ import { inspect } from 'util';

 const FAIL_TAG = Symbol('fail error');

-export function createFailError(reason, exitCode = 1) {
-  const error = new Error(reason);
-  error.exitCode = exitCode;
-  error[FAIL_TAG] = true;
-  return error;
+interface FailError extends Error {
+  exitCode: number;
+  [FAIL_TAG]: true;
 }

-export function isFailError(error) {
+export function createFailError(reason: string, exitCode = 1): FailError {
+  return Object.assign(new Error(reason), {
+    exitCode,
+    [FAIL_TAG]: true as true,
+  });
+}
+
+export function isFailError(error: any): error is FailError {
   return Boolean(error && error[FAIL_TAG]);
 }

-export function combineErrors(errors) {
+export function combineErrors(errors: Array<Error | FailError>) {
   if (errors.length === 1) {
     return errors[0];
   }
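
A quick sketch of what the conversion buys: `isFailError` is now a type
guard, so TypeScript narrows the error inside the branch (mirroring how
`run` uses it in the next file):

----
// `error is FailError` lets `.exitCode` type-check after the guard.
try {
  throw createFailError('boom', 2);
} catch (err) {
  if (isFailError(err)) {
    process.exitCode = err.exitCode; // narrowed to FailError here
  }
}
----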
@@ -21,8 +21,21 @@ import { relative } from 'path';

 import getopts from 'getopts';

-export function getFlags(argv) {
-  return getopts(argv, {
+import { Options } from './run';
+
+export interface Flags {
+  verbose: boolean;
+  quiet: boolean;
+  silent: boolean;
+  debug: boolean;
+  help: boolean;
+  _: string[];
+
+  [key: string]: undefined | boolean | string | string[];
+}
+
+export function getFlags(argv: string[]): Flags {
+  const { verbose, quiet, silent, debug, help, _, ...others } = getopts(argv, {
     alias: {
       v: 'verbose',
     },

@@ -32,16 +45,25 @@ export function getFlags(argv) {
       silent: false,
       debug: false,
       help: false,
-    }
+    },
   });
+
+  return {
+    verbose,
+    quiet,
+    silent,
+    debug,
+    help,
+    _,
+    ...others,
+  };
 }

-export function getHelp() {
-  return (
-    `
-  node ${relative(process.cwd(), process.argv[1], '.js')}
+export function getHelp(options: Options) {
+  return `
+  node ${relative(process.cwd(), process.argv[1])}

-  Runs a dev task
+  ${options.helpDescription || 'Runs a dev task'}

   Options:
     --verbose, -v      Log verbosely

@@ -50,6 +72,5 @@ export function getHelp() {
     --silent           Don't log anything
     --help             Show this message

-  `
-  );
+  `;
 }
@@ -17,25 +17,31 @@
  * under the License.
  */

-import { ToolingLog, pickLevelFromFlags } from '@kbn/dev-utils';
+import { pickLevelFromFlags, ToolingLog } from '@kbn/dev-utils';
 import { isFailError } from './fail';
-import { getFlags, getHelp } from './flags';
+import { Flags, getFlags, getHelp } from './flags';

-export async function run(body) {
+type RunFn = (args: { log: ToolingLog; flags: Flags }) => Promise<void> | void;
+
+export interface Options {
+  helpDescription?: string;
+}
+
+export async function run(fn: RunFn, options: Options = {}) {
   const flags = getFlags(process.argv.slice(2));

   if (flags.help) {
-    process.stderr.write(getHelp());
+    process.stderr.write(getHelp(options));
     process.exit(1);
   }

   const log = new ToolingLog({
     level: pickLevelFromFlags(flags),
-    writeTo: process.stdout
+    writeTo: process.stdout,
   });

   try {
-    await body({ log, flags });
+    await fn({ log, flags });
   } catch (error) {
     if (isFailError(error)) {
       log.error(error.message);
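
For illustration, a sketch of a task using the new signature (the task body
and description here are hypothetical):

----
// `run` now takes the task fn plus an Options bag; flags are typed via Flags.
run(
  async ({ log, flags }) => {
    log.info('verbose?', flags.verbose);
  },
  { helpDescription: 'Example dev task' }
);
----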
@@ -49,6 +49,16 @@ run(
     );
   }

+  if (typeof path === 'boolean' || typeof includeConfig === 'boolean') {
+    throw createFailError(
+      `${chalk.white.bgRed(' I18N ERROR ')} --path and --include-config require a value`
+    );
+  }
+
+  if (typeof fix !== 'boolean') {
+    throw createFailError(`${chalk.white.bgRed(' I18N ERROR ')} --fix can't have a value`);
+  }
+
   const config = await mergeConfigs(includeConfig);
   const defaultMessages = await extractDefaultMessages({ path, config });
@@ -39,6 +39,12 @@ run(
     );
   }

+  if (typeof path === 'boolean' || typeof includeConfig === 'boolean') {
+    throw createFailError(
+      `${chalk.white.bgRed(' I18N ERROR ')} --path and --include-config require a value`
+    );
+  }
+
   const config = await mergeConfigs(includeConfig);
   const defaultMessages = await extractDefaultMessages({ path, config });
@@ -47,9 +47,30 @@ run(
     );
   }

-  if (Array.isArray(target)) {
+  if (typeof target === 'boolean' || Array.isArray(target)) {
     throw createFailError(
-      `${chalk.white.bgRed(' I18N ERROR ')} --target should be specified only once.`
+      `${chalk.white.bgRed(
+        ' I18N ERROR '
+      )} --target should be specified only once and must have a value.`
     );
   }

+  if (typeof path === 'boolean' || typeof includeConfig === 'boolean') {
+    throw createFailError(
+      `${chalk.white.bgRed(' I18N ERROR ')} --path and --include-config require a value`
+    );
+  }
+
+  if (
+    typeof ignoreIncompatible !== 'boolean' ||
+    typeof ignoreUnused !== 'boolean' ||
+    typeof ignoreMissing !== 'boolean' ||
+    typeof dryRun !== 'boolean'
+  ) {
+    throw createFailError(
+      `${chalk.white.bgRed(
+        ' I18N ERROR '
+      )} --ignore-incompatible, --ignore-unused, --ignore-missing, and --dry-run can't have values`
+    );
+  }
+
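Context for the `typeof ... === 'boolean'` checks added in these three CLIs:
getopts reports a flag passed without a value as `true`, and a repeated flag
as an array, so a shape check is the only way to enforce "exactly one value".
A sketch of the shapes involved:

----
// node script.js --path            -> { path: true }        (rejected)
// node script.js --path a --path b -> { path: ['a', 'b'] }  (rejected)
// node script.js --path a          -> { path: 'a' }         (accepted)
----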
src/dev/typescript/run_check_ts_projects_cli.ts (new file, 91 lines)
@@ -0,0 +1,91 @@
/*
 * Licensed to Elasticsearch B.V. under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch B.V. licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import { resolve } from 'path';

import execa from 'execa';

import { run } from '../run';

const REPO_ROOT = resolve(__dirname, '../../../');

import { File } from '../file';
import { PROJECTS } from './projects';

export async function runCheckTsProjectsCli() {
  run(
    async ({ log }) => {
      const files = await execa.stdout('git', ['ls-tree', '--name-only', '-r', 'HEAD'], {
        cwd: REPO_ROOT,
      });

      const isNotInTsProject: File[] = [];
      const isInMultipleTsProjects: File[] = [];

      for (const lineRaw of files.split('\n')) {
        const line = lineRaw.trim();

        if (!line) {
          continue;
        }

        const file = new File(resolve(REPO_ROOT, line));
        if (!file.isTypescript() || file.isFixture()) {
          continue;
        }

        log.verbose('Checking %s', file.getAbsolutePath());

        const projects = PROJECTS.filter(p => p.isAbsolutePathSelected(file.getAbsolutePath()));
        if (projects.length === 0) {
          isNotInTsProject.push(file);
        }
        if (projects.length > 1) {
          isInMultipleTsProjects.push(file);
        }
      }

      if (!isNotInTsProject.length && !isInMultipleTsProjects.length) {
        log.success('All ts files belong to a single ts project');
        return;
      }

      if (isNotInTsProject.length) {
        log.error(
          `The following files do not belong to a tsconfig.json file, or that tsconfig.json file is not listed in src/dev/typescript/projects.ts\n${isNotInTsProject
            .map(file => ` - ${file.getRelativePath()}`)
            .join('\n')}`
        );
      }

      if (isInMultipleTsProjects.length) {
        log.error(
          `The following files belong to multiple tsconfig.json files listed in src/dev/typescript/projects.ts\n${isInMultipleTsProjects
            .map(file => ` - ${file.getRelativePath()}`)
            .join('\n')}`
        );
      }

      process.exit(1);
    },
    {
      helpDescription:
        'Check that all .ts and .tsx files in the repository are assigned to a tsconfig.json file',
    }
  );
}
@@ -62,9 +62,9 @@ export default function (api) {

   api.addGlobalAutocompleteRules('script', {
     __template: {
-      inline: 'SCRIPT'
+      source: 'SCRIPT'
     },
-    inline: 'SCRIPT',
+    source: 'SCRIPT',
     file: 'FILE_SCRIPT_NAME',
     id: 'SCRIPT_ID',
     lang: '',

@@ -207,7 +207,7 @@ export default function (api) {
   api.addEndpointDescription('render_search_template', {
     data_autocomplete_rules: {
       __one_of: [
-        { inline: { __scope_link: 'search' } },
+        { source: { __scope_link: 'search' } },
         { __scope_link: 'GLOBAL.script' },
       ],
       params: {},
@@ -322,7 +322,7 @@ describe('Input Tokenization', () => {
       ['start', 'json', 'json', 'start'],
       'POST _search\n' +
       '{\n' +
-      '  "script": { "inline": "" }\n' +
+      '  "script": { "source": "" }\n' +
       '}'
     );
@@ -2,7 +2,7 @@
 Scripts in requests
 -------------------------------------
 {
-  "f": { "script" : { "inline": "test\ntest\\2" } },
+  "f": { "script" : { "source": "test\ntest\\2" } },
   "g": { "script" : "second + \"\\\";" },
   "f": "short with \\",
   "h": 1,

@@ -10,7 +10,7 @@ Scripts in requests
 }
 -------------------------------------
 {
-  "f": { "script" : { "inline": """
+  "f": { "script" : { "source": """
     test
     test\2
   """ } },
@@ -45,7 +45,7 @@ const termsAgg = ({ field, size, direction, query }) => {

   if (field.scripted) {
     terms.script = {
-      inline: field.script,
+      source: field.script,
       lang: field.lang
     };
     terms.valueType = field.type === 'number' ? 'float' : field.type;

@@ -31,7 +31,7 @@ const minMaxAgg = (field) => {
   const aggBody = {};
   if (field.scripted) {
     aggBody.script = {
-      inline: field.script,
+      source: field.script,
       lang: field.lang
     };
   } else {
@@ -72,11 +72,15 @@ function runServerFunctions(server) {
       .catch(err => {
         if (Boom.isBoom(err)) {
           return { err, statusCode: err.statusCode, message: err.output.payload };
+        } else if (err instanceof Error) {
+          return { err, statusCode: 500, message: err.message };
         }
+
+        server.log(['interpreter', 'error'], err);
         return { err: 'Internal Server Error', statusCode: 500, message: 'See server logs for details.' };
       });

-    if (result == null) {
+    if (typeof result === 'undefined') {
       const { functionName } = fnCall;
       return {
         id,
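
A note on the `result == null` to `typeof result === 'undefined'` change:
`null` is a legitimate value for an expression function to return, and only
a missing return should be treated as an error. A sketch of the distinction:

----
null == undefined;            // true  -- the old check conflated the two
typeof null === 'undefined';  // false -- the new check lets null through
----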
@@ -318,7 +318,7 @@ export const createJsAgentInstructions = (apmServerUrl = '') => [
     textPre: i18n.translate('kbn.server.tutorials.apm.jsClient.install.textPre', {
       defaultMessage: 'Install the APM agent for JavaScript as a dependency to your application:',
     }),
-    commands: [`npm install elastic-apm-js-base --save`],
+    commands: [`npm install @elastic/apm-rum --save`],
   },
   {
     title: i18n.translate('kbn.server.tutorials.apm.jsClient.configure.title', {

@@ -327,7 +327,7 @@ export const createJsAgentInstructions = (apmServerUrl = '') => [
     textPre: i18n.translate('kbn.server.tutorials.apm.jsClient.configure.textPre', {
       defaultMessage: 'Agents are libraries that run inside of your application.',
     }),
-    commands: `import {curlyOpen} init as initApm {curlyClose} from 'elastic-apm-js-base'
+    commands: `import {curlyOpen} init as initApm {curlyClose} from '@elastic/apm-rum'
 var apm = initApm({curlyOpen}

   // ${i18n.translate('kbn.server.tutorials.apm.jsClient.configure.commands.setRequiredServiceNameComment', {
@@ -143,7 +143,7 @@ describe(filename, () => {
       expect(agg.time_buckets.aggs['avg(scriptedBytes)']).to.eql({
         avg: {
           script: {
-            inline: 'doc["bytes"].value',
+            source: 'doc["bytes"].value',
             lang: 'painless'
           }
         }

@@ -338,14 +338,14 @@ describe(filename, () => {

       expect(aggs.scriptedBeer.meta.type).to.eql('split');
       expect(aggs.scriptedBeer.terms.script).to.eql({
-        inline: 'doc["beer"].value',
+        source: 'doc["beer"].value',
         lang: 'painless'
       });
       expect(aggs.scriptedBeer.terms.size).to.eql(5);

       expect(aggs.scriptedBeer.aggs.scriptedWine.meta.type).to.eql('split');
       expect(aggs.scriptedBeer.aggs.scriptedWine.terms.script).to.eql({
-        inline: 'doc["wine"].value',
+        source: 'doc["wine"].value',
         lang: 'painless'
       });
       expect(aggs.scriptedBeer.aggs.scriptedWine.terms.size).to.eql(10);
@@ -27,7 +27,7 @@ export function buildAggBody(fieldName, scriptedFields) {
   if (scriptedField) {
     return {
       script: {
-        inline: scriptedField.script,
+        source: scriptedField.script,
         lang: scriptedField.lang
       }
     };

@@ -46,7 +46,7 @@ export default function createDateAgg(config, tlConfig, scriptedFields) {
     dateAgg.time_buckets.aggs[metric] = {
       bucket_script: {
         buckets_path: '_count',
-        script: { inline: '_value', lang: 'expression' }
+        script: { source: '_value', lang: 'expression' }
       }
     };
   } else if (metric[0] && metric[1]) {
@@ -50,7 +50,10 @@ export const createListRoute = () => ({
       const dataIndexConfig = sampleDataset.dataIndices[i];
       const index = createIndexName(sampleDataset.id, dataIndexConfig.id);
       try {
-        const indexExists = await callWithRequest(request, 'indices.exists', { index: index });
+        const indexExists = await callWithRequest(request, 'indices.exists', {
+          index: index,
+          include_type_name: false,
+        });
         if (!indexExists) {
           sampleDataset.status = NOT_INSTALLED;
           return;
@@ -40,7 +40,7 @@ export function createSavedObjectsService(server, schema, serializer, migrator)
         name: `kibana_index_template:${index}`,
+        include_type_name: true,
         body: {
           template: index,
           index_patterns: [index],
           settings: {
             number_of_shards: 1,
             auto_expand_replicas: '0-1',
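
The `include_type_name` flags in the two changes above track the mapping-type
removal planned for Elasticsearch 7.0: the 6.7 APIs accept the parameter so
callers can opt out of (or explicitly keep) mapping-type semantics before
upgrading. A sketch of the call after the change, with a hypothetical index:

----
const indexExists = await callWithRequest(request, 'indices.exists', {
  index: 'kibana_sample_data_logs', // hypothetical index name
  include_type_name: false,         // opt out of mapping-type semantics
});
----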
@@ -47,7 +47,8 @@ function getFieldsForTypes(searchFields, types) {

   if (!searchFields || !searchFields.length) {
     return {
-      all_fields: true
+      lenient: true,
+      fields: ['*'],
     };
   }
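
Background for this change and the four test updates that follow: the
`all_fields` option of `simple_query_string` was deprecated in Elasticsearch
6.x, and `fields: ['*']` plus `lenient: true` is the drop-in replacement:

----
// Equivalent query body after the change (sketch):
const query = {
  simple_query_string: {
    query: 'us*',
    lenient: true,   // tolerate type mismatches across all fields
    fields: ['*'],   // replaces the removed `all_fields: true`
  },
};
----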
@@ -237,7 +237,8 @@ describe('searchDsl/queryParams', () => {
             {
               simple_query_string: {
                 query: 'us*',
-                all_fields: true
+                lenient: true,
+                fields: ['*'],
               }
             }
           ]

@@ -267,7 +268,8 @@ describe('searchDsl/queryParams', () => {
             {
               simple_query_string: {
                 query: 'us*',
-                all_fields: true
+                lenient: true,
+                fields: ['*'],
               }
             }
           ]

@@ -296,7 +298,8 @@ describe('searchDsl/queryParams', () => {
             {
               simple_query_string: {
                 query: 'us*',
-                all_fields: true
+                lenient: true,
+                fields: ['*'],
               }
             }
           ]

@@ -325,7 +328,8 @@ describe('searchDsl/queryParams', () => {
             {
               simple_query_string: {
                 query: 'us*',
-                all_fields: true
+                lenient: true,
+                fields: ['*'],
               }
             }
           ]
@@ -75,7 +75,7 @@ export const topHitMetricAgg = new MetricAggType({
     output.params.script_fields = {
       [ field.name ]: {
         script: {
-          inline: field.script,
+          source: field.script,
           lang: field.lang
         }
       }

@@ -203,7 +203,7 @@ export const topHitMetricAgg = new MetricAggType({
       {
         _script: {
           script: {
-            inline: sortField.script,
+            source: sortField.script,
             lang: sortField.lang
           },
           type: sortField.type,
@@ -132,7 +132,7 @@ FieldParamType.prototype.write = function (aggConfig, output) {

   if (field.scripted) {
     output.params.script = {
-      inline: field.script,
+      source: field.script,
       lang: field.lang,
     };
   } else {
@@ -78,7 +78,7 @@ describe('SearchSource#normalizeSortRequest', function () {
     normalizedSort = {
       _script: {
         script: {
-          inline: indexField.script,
+          source: indexField.script,
           lang: indexField.lang
         },
         type: indexField.type,

@@ -52,7 +52,7 @@ export function NormalizeSortRequestProvider(config) {
       sortField = '_script';
       sortValue = {
         script: {
-          inline: indexField.script,
+          source: indexField.script,
           lang: indexField.lang
         },
         type: castSortType(indexField.type),
@@ -38,7 +38,7 @@ export function getComputedFields() {
   _.each(self.getScriptedFields(), function (field) {
     scriptFields[field.name] = {
       script: {
-        inline: field.script,
+        source: field.script,
         lang: field.lang
       }
     };
@@ -106,7 +106,7 @@ export function VislibAxisLabelsProvider() {
       selection.selectAll('.tick text')
         .text(function (d) {
           const par = d3.select(this.parentNode).node();
-          const myPos = scaleStartPad + (config.isHorizontal() ? self.axisScale.scale(d) : maxSize - self.axisScale.scale(d));
+          const myPos = scaleStartPad + self.axisScale.scale(d);
           const mySize = (config.isHorizontal() ? par.getBBox().width : par.getBBox().height) * padding;
           const halfSize = mySize / 2;
@@ -112,6 +112,15 @@ module.exports = function (grunt) {
       ]
     },

+    // used by the test and jenkins:unit tasks
+    //    ensures that all typescript files belong to a typescript project
+    checkTsProjects: {
+      cmd: process.execPath,
+      args: [
+        require.resolve('../../scripts/check_ts_projects')
+      ]
+    },
+
     // used by the test and jenkins:unit tasks
     //    runs the i18n_check script to check i18n engine usage
     i18nCheck: {
@@ -26,6 +26,7 @@ module.exports = function (grunt) {
     'run:eslint',
     'run:tslint',
     'run:sasslint',
+    'run:checkTsProjects',
     'run:typeCheck',
     'run:i18nCheck',
     'run:checkFileCasing',

@@ -72,6 +72,7 @@ module.exports = function (grunt) {
     !grunt.option('quick') && 'run:eslint',
     !grunt.option('quick') && 'run:tslint',
     !grunt.option('quick') && 'run:sasslint',
+    !grunt.option('quick') && 'run:checkTsProjects',
     !grunt.option('quick') && 'run:typeCheck',
     !grunt.option('quick') && 'run:i18nCheck',
     'run:checkFileCasing',
@@ -178,7 +178,7 @@
     "graphql-fields": "^1.0.2",
     "graphql-tag": "^2.9.2",
     "graphql-tools": "^3.0.2",
-    "handlebars": "^4.0.10",
+    "handlebars": "^4.0.13",
     "hapi-auth-cookie": "^9.0.0",
     "history": "4.7.2",
     "history-extra": "^4.0.2",
@@ -9,6 +9,7 @@ import { all } from './all';
 import { any } from './any';
 import { asFn } from './as';
 import { axisConfig } from './axisConfig';
+import { clear } from './clear';
 import { compare } from './compare';
 import { containerStyle } from './containerStyle';
 import { context } from './context';

@@ -64,6 +65,7 @@ export const functions = [
   any,
   asFn,
   axisConfig,
+  clear,
   columns,
   compare,
   containerStyle,
@@ -55,21 +55,21 @@ export const routes = [
       try {
         const fetchedWorkpad = await workpadService.get(params.id);

-        const { assets, ...workpad } = fetchedWorkpad;
-        dispatch(setWorkpad(workpad));
-        dispatch(setAssets(assets));
-
         // tests if user has permissions to write to workpads
         // TODO: remove this and switch to checking user privileges when canvas loads when granular app privileges are introduced
         // https://github.com/elastic/kibana/issues/20277
         if (firstLoad) {
-          workpadService.update(params.id, fetchedWorkpad).catch(err => {
+          await workpadService.update(params.id, fetchedWorkpad).catch(err => {
             if (err.response && err.response.status === 403) {
               dispatch(setCanUserWrite(false));
             }
           });
           dispatch(setFirstLoad(false));
         }
+
+        const { assets, ...workpad } = fetchedWorkpad;
+        dispatch(setWorkpad(workpad));
+        dispatch(setAssets(assets));
       } catch (err) {
         notify.error(err, { title: `Couldn't load workpad with ID` });
         return router.redirectTo('home');
@@ -41,6 +41,10 @@ const getCtrlShortcuts = shortcuts => {
 const refreshShortcut = { ...getAltShortcuts('r'), help: 'Refresh workpad' };
 const previousPageShortcut = { ...getAltShortcuts('['), help: 'Go to previous page' };
 const nextPageShortcut = { ...getAltShortcuts(']'), help: 'Go to next page' };
+const deleteElementShortcuts = ['del', 'backspace'];
+const groupShortcut = ['g'];
+const ungroupShortcut = ['u'];
+const fullscreentExitShortcut = ['esc'];

 export const keymap = {
   ELEMENT: {

@@ -50,10 +54,10 @@ export const keymap = {
     CUT: { ...getCtrlShortcuts('x'), help: 'Cut' },
     PASTE: { ...getCtrlShortcuts('v'), help: 'Paste' },
     DELETE: {
-      osx: ['backspace'],
-      windows: ['del', 'backspace'],
-      linux: ['del', 'backspace'],
-      other: ['del', 'backspace'],
+      osx: deleteElementShortcuts,
+      windows: deleteElementShortcuts,
+      linux: deleteElementShortcuts,
+      other: deleteElementShortcuts,
       help: 'Delete',
     },
     BRING_FORWARD: {

@@ -73,17 +77,17 @@ export const keymap = {
       help: 'Send to back',
     },
     GROUP: {
-      osx: ['g'],
-      windows: ['g'],
-      linux: ['g'],
-      other: ['g'],
+      osx: groupShortcut,
+      windows: groupShortcut,
+      linux: groupShortcut,
+      other: groupShortcut,
       help: 'Group',
     },
     UNGROUP: {
-      osx: ['u'],
-      windows: ['u'],
-      linux: ['u'],
-      other: ['u'],
+      osx: ungroupShortcut,
+      windows: ungroupShortcut,
+      linux: ungroupShortcut,
+      other: ungroupShortcut,
       help: 'Ungroup',
     },
   },

@@ -101,10 +105,10 @@ export const keymap = {
     displayName: 'Presentation mode',
     FULLSCREEN: { ...getAltShortcuts(['p', 'f']), help: 'Enter presentation mode' },
     FULLSCREEN_EXIT: {
-      osx: ['esc'],
-      windows: ['esc'],
-      linux: ['esc'],
-      other: ['esc'],
+      osx: fullscreentExitShortcut,
+      windows: fullscreentExitShortcut,
+      linux: fullscreentExitShortcut,
+      other: fullscreentExitShortcut,
       help: 'Exit presentation mode',
     },
     PREV: mapValues(previousPageShortcut, (osShortcuts, key) =>
@@ -9,7 +9,8 @@ import { createAction } from 'redux-actions';
 export const setLoading = createAction('setResolvedLoading');
 export const setValue = createAction('setResolvedValue');
 export const setValues = createAction('setResolvedValues');
-export const clear = createAction('clearResolvedValue');
+export const clearValue = createAction('clearResolvedValue');
+export const clearValues = createAction('clearResolvedValues');

 export const inFlightActive = createAction('inFlightActive');
 export const inFlightComplete = createAction('inFlightComplete');
@@ -18,7 +18,7 @@ import {
 } from '../actions/elements';
 import { restoreHistory } from '../actions/history';
 import { selectElement } from '../actions/transient';
-import { addPage, removePage, duplicatePage } from '../actions/pages';
+import { addPage, removePage, duplicatePage, gotoPage } from '../actions/pages';
 import { appReady } from '../actions/app';
 import { setWorkpad } from '../actions/workpad';
 import { getNodes, getPages, getSelectedPage, getSelectedElement } from '../selectors/workpad';

@@ -59,6 +59,8 @@ const aeroelasticConfiguration = {

 const isGroupId = id => id.startsWith(aeroelasticConfiguration.groupName);

+const pageChangerActions = [gotoPage.toString(), duplicatePage.toString(), addPage.toString()];
+
 /**
  * elementToShape
  *

@@ -320,6 +322,14 @@ export const aeroelastic = ({ dispatch, getState }) => {
         aero.removeStore(action.payload);
       }

+      if (pageChangerActions.indexOf(action.type) >= 0) {
+        if (getSelectedElement(getState())) {
+          dispatch(selectElement(null)); // ensure sidebar etc. get updated; will update the layout engine too
+        } else {
+          unselectShape(prevPage); // deselect persistent groups as they're not currently selections in Redux
+        }
+      }
+
       next(action);

       switch (action.type) {
@@ -7,7 +7,7 @@
 import { isEqual } from 'lodash';
 import { getWorkpad, getFullWorkpadPersisted, getWorkpadPersisted } from '../selectors/workpad';
 import { getAssetIds } from '../selectors/assets';
-import { setWorkpad } from '../actions/workpad';
+import { setWorkpad, setRefreshInterval } from '../actions/workpad';
 import { setAssets, resetAssets } from '../actions/assets';
 import * as transientActions from '../actions/transient';
 import * as resolvedArgsActions from '../actions/resolved_args';

@@ -31,6 +31,7 @@ export const esPersistMiddleware = ({ getState }) => {
     setWorkpad, // used for loading and creating workpads
     setAssets, // used when loading assets
     resetAssets, // used when creating new workpads
+    setRefreshInterval, // used to set refresh time interval which is a transient value
     ...Object.values(resolvedArgsActions), // no resolved args affect persisted values
     ...Object.values(transientActions), // no transient actions cause persisted state changes
   ].map(a => a.toString());
@@ -16,11 +16,13 @@ import { workpadUpdate } from './workpad_update';
 import { workpadRefresh } from './workpad_refresh';
 import { appReady } from './app_ready';
 import { elementStats } from './element_stats';
+import { resolvedArgs } from './resolved_args';

 const middlewares = [
   applyMiddleware(
     thunkMiddleware,
     elementStats,
+    resolvedArgs,
     esPersistMiddleware,
     historyMiddleware,
     aeroelastic,
@@ -0,0 +1,31 @@ (new file)
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { getAllElements } from '../selectors/workpad';
import { clearValues } from '../actions/resolved_args';

/**
 * This middleware is responsible for keeping the resolved_args collection in transient state
 * synced with the elements represented by the workpad.
 */
export const resolvedArgs = ({ dispatch, getState }) => next => action => {
  // Get the Element IDs that are present before the action.
  const startElementIds = getAllElements(getState()).map(element => element.id);

  // execute the action
  next(action);

  // Get the Element IDs after the action...
  const resolvedElementIds = getAllElements(getState()).map(element => element.id);
  // ...and get a list of IDs that are no longer present.
  const deadIds = startElementIds.filter(id => !resolvedElementIds.includes(id));

  // If we have some dead elements, we need to clear them from resolved_args collection
  // in transient state.
  if (deadIds.length > 0) {
    dispatch(clearValues(deadIds));
  }
};
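
The net effect, as a worked example (IDs hypothetical):

----
// After any action that removes elements, the middleware diffs the ID sets
// and clears the stale resolved args for whatever disappeared.
const before = ['el-1', 'el-2', 'el-3']; // element IDs before the action
const after = ['el-1', 'el-2'];          // element IDs after the action
const dead = before.filter(id => !after.includes(id)); // ['el-3']
// -> dispatch(clearValues(dead))
----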
@@ -119,7 +119,7 @@ describe('resolved args reducer', () => {

   describe('clear', () => {
     it('removed resolved value at path', () => {
-      const action = actionCreator(actions.clear)({
+      const action = actionCreator(actions.clearValue)({
         path: 'element-0.1',
       });

@@ -135,7 +135,7 @@ describe('resolved args reducer', () => {
     });

     it('deeply removes resolved values', () => {
-      const action = actionCreator(actions.clear)({
+      const action = actionCreator(actions.clearValue)({
         path: 'element-0',
       });
@@ -90,11 +90,17 @@ export const resolvedArgsReducer = handleActions(
       }, transientState);
     },

-    [actions.clear]: (transientState, { payload }) => {
+    [actions.clearValue]: (transientState, { payload }) => {
       const { path } = payload;
       return del(transientState, getFullPath(path));
     },

+    [actions.clearValues]: (transientState, { payload }) => {
+      return payload.reduce((transientState, path) => {
+        return del(transientState, getFullPath(path));
+      }, transientState);
+    },
+
     [actions.inFlightActive]: transientState => {
       return set(transientState, 'inFlight', true);
     },
@@ -19,7 +19,7 @@ export interface InfraSource {
   /** The id of the source */
   id: string;
   /** The version number the source configuration was last persisted with */
-  version?: number | null;
+  version?: string | null;
   /** The timestamp the source configuration was last persisted at */
   updatedAt?: number | null;
   /** The raw configuration of the source */

@@ -34,6 +34,8 @@ export interface InfraSource {
   logEntriesBetween: InfraLogEntryInterval;
   /** A consecutive span of summary buckets within an interval */
   logSummaryBetween: InfraLogSummaryInterval;
+
+  logItem: InfraLogItem;
   /** A hierarchy of hosts, pods, containers, services or arbitrary groups */
   map?: InfraResponse | null;

@@ -179,6 +181,22 @@ export interface InfraLogSummaryBucket {
   entriesCount: number;
 }

+export interface InfraLogItem {
+  /** The ID of the document */
+  id: string;
+  /** The index where the document was found */
+  index: string;
+  /** An array of flattened fields and values */
+  fields: InfraLogItemField[];
+}
+
+export interface InfraLogItemField {
+  /** The flattened field name */
+  field: string;
+  /** The value for the Field as a string */
+  value: string;
+}
+
 export interface InfraResponse {
   nodes: InfraNode[];
 }

@@ -199,6 +217,10 @@ export interface InfraNodeMetric {
   name: InfraMetricType;

   value: number;
+
+  avg: number;
+
+  max: number;
 }

 export interface InfraMetricData {

@@ -397,6 +419,9 @@ export interface LogSummaryBetweenInfraSourceArgs {
   /** The query to filter the log entries by */
   filterQuery?: string | null;
 }
+export interface LogItemInfraSourceArgs {
+  id: string;
+}
 export interface MapInfraSourceArgs {
   timerange: InfraTimerangeInput;

@@ -458,6 +483,7 @@ export enum InfraPathType {
   hosts = 'hosts',
   pods = 'pods',
   containers = 'containers',
+  custom = 'custom',
 }

 export enum InfraMetricType {

@@ -523,6 +549,45 @@ export type InfraLogMessageSegment = InfraLogMessageFieldSegment | InfraLogMessa
 // Documents
 // ====================================================

+export namespace FlyoutItemQuery {
+  export type Variables = {
+    sourceId: string;
+    itemId: string;
+  };
+
+  export type Query = {
+    __typename?: 'Query';
+
+    source: Source;
+  };
+
+  export type Source = {
+    __typename?: 'InfraSource';
+
+    id: string;
+
+    logItem: LogItem;
+  };
+
+  export type LogItem = {
+    __typename?: 'InfraLogItem';
+
+    id: string;
+
+    index: string;
+
+    fields: Fields[];
+  };
+
+  export type Fields = {
+    __typename?: 'InfraLogItemField';
+
+    field: string;
+
+    value: string;
+  };
+}
+
 export namespace MetadataQuery {
   export type Variables = {
     sourceId: string;

@@ -660,6 +725,10 @@ export namespace WaffleNodesQuery {
       name: InfraMetricType;

       value: number;
+
+      avg: number;
+
+      max: number;
     };
   }

@@ -847,7 +916,7 @@ export namespace SourceFields {

     id: string;

-    version?: number | null;
+    version?: string | null;

     updatedAt?: number | null;
@@ -89,15 +89,28 @@ const createContainerProps = memoizeLast((sourceId: string, apolloClient: Apollo
       undefined,
   getDerivedIndexPattern: () => getDerivedIndexPattern,
   getVersion: () => state => (state && state.source && state.source.version) || undefined,
-  getExists: () => state =>
-    (state && state.source && typeof state.source.version === 'number') || false,
+  getExists: () => state => (state && state.source && !!state.source.version) || false,
 });

 const effects = inferEffectMap<State>()({
   create: (sourceConfiguration: CreateSourceInput) => ({ setState }) => {
     const variables = {
       sourceId,
-      sourceConfiguration,
+      sourceConfiguration: {
+        name: sourceConfiguration.name,
+        description: sourceConfiguration.description,
+        metricAlias: sourceConfiguration.metricAlias,
+        logAlias: sourceConfiguration.logAlias,
+        fields: sourceConfiguration.fields
+          ? {
+              container: sourceConfiguration.fields.container,
+              host: sourceConfiguration.fields.host,
+              pod: sourceConfiguration.fields.pod,
+              tiebreaker: sourceConfiguration.fields.tiebreaker,
+              timestamp: sourceConfiguration.fields.timestamp,
+            }
+          : undefined,
+      },
     };

     setState(actions.startOperation({ name: 'create', parameters: variables }));

@@ -219,7 +232,7 @@ interface WithSourceProps {
     metricIndicesExist?: boolean;
     sourceId: string;
     update: (changes: UpdateSourceInput[]) => Promise<any>;
-    version?: number;
+    version?: string;
   }>;
 }
@@ -916,7 +916,7 @@ export namespace SourceFields {

     id: string;

-    version?: number | null;
+    version?: string | null;

     updatedAt?: number | null;
@@ -82,7 +82,7 @@ export const watch = {
         start: {
           script: {
             lang: 'painless',
-            inline: `LocalDateTime.ofEpochSecond((doc["timestamp"].date.getMillis()-((doc["bucket_span"].value * 1000)
+            source: `LocalDateTime.ofEpochSecond((doc["timestamp"].date.getMillis()-((doc["bucket_span"].value * 1000)
               * params.padding)) / 1000, 0, ZoneOffset.UTC).toString()+\":00.000Z\"`,
             params: {
               'padding': 10

@@ -92,7 +92,7 @@ export const watch = {
         end: {
           script: {
             lang: 'painless',
-            inline: `LocalDateTime.ofEpochSecond((doc["timestamp"].date.getMillis()+((doc["bucket_span"].value * 1000)
+            source: `LocalDateTime.ofEpochSecond((doc["timestamp"].date.getMillis()+((doc["bucket_span"].value * 1000)
               * params.padding)) / 1000, 0, ZoneOffset.UTC).toString()+\":00.000Z\"`,
             params: {
               'padding': 10

@@ -102,19 +102,19 @@ export const watch = {
         timestamp_epoch: {
           script: {
             lang: 'painless',
-            inline: 'doc["timestamp"].date.getMillis()/1000'
+            source: 'doc["timestamp"].date.getMillis()/1000'
           }
         },
         timestamp_iso8601: {
           script: {
             lang: 'painless',
-            inline: 'doc["timestamp"].date'
+            source: 'doc["timestamp"].date'
           }
         },
         score: {
           script: {
             lang: 'painless',
-            inline: 'Math.round(doc["anomaly_score"].value)'
+            source: 'Math.round(doc["anomaly_score"].value)'
           }
         }
       }

@@ -155,7 +155,7 @@ export const watch = {
         score: {
           script: {
             lang: 'painless',
-            inline: 'Math.round(doc["influencer_score"].value)'
+            source: 'Math.round(doc["influencer_score"].value)'
           }
         }
       }

@@ -199,7 +199,7 @@ export const watch = {
         score: {
           script: {
             lang: 'painless',
-            inline: 'Math.round(doc["record_score"].value)'
+            source: 'Math.round(doc["record_score"].value)'
           }
         }
       }
@@ -96,10 +96,10 @@ export const reporting = (kibana) => {
       concurrency: Joi.number().integer().default(appConfig.concurrency), //deprecated
       browser: Joi.object({
         type: Joi.any().valid('phantom', 'chromium').default(await getDefaultBrowser()), // TODO: make chromium the only valid option in 7.0
-        autoDownload: Joi.boolean().when('$dev', {
+        autoDownload: Joi.boolean().when('$dist', {
           is: true,
-          then: Joi.default(true),
-          otherwise: Joi.default(false),
+          then: Joi.default(false),
+          otherwise: Joi.default(true),
         }),
         chromium: Joi.object({
           disableSandbox: Joi.boolean().default(await getDefaultChromiumSandboxDisabled()),
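
The `$dev` to `$dist` flip above inverts where browsers come from rather than
just renaming a context flag: distributable builds bundle the browser (no
download needed), while source checkouts fetch it on demand. A sketch of the
resulting default:

----
// Equivalent logic to the Joi schema above (sketch):
const autoDownloadDefault = isDistributable ? false : true;
----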
@ -16,13 +16,16 @@ import {
|
|||
EuiTitle,
|
||||
} from '@elastic/eui';
|
||||
import { FormattedMessage } from '@kbn/i18n/react';
|
||||
import { FixDefaultFieldsButton } from './default_fields/button';
|
||||
import { DeleteTasksButton } from './delete_tasks_button';
|
||||
import { ReindexButton } from './reindex';
|
||||
|
||||
interface DeprecationCellProps {
|
||||
items?: Array<{ title?: string; body: string }>;
|
||||
reindexIndexName?: string;
|
||||
deleteIndexName?: string;
|
||||
indexName?: string;
|
||||
reindex?: boolean;
|
||||
deleteIndex?: boolean;
|
||||
needsDefaultFields?: boolean;
|
||||
docUrl?: string;
|
||||
headline?: string;
|
||||
healthColor?: string;
|
||||
|
@ -35,8 +38,10 @@ interface DeprecationCellProps {
|
|||
export const DeprecationCell: StatelessComponent<DeprecationCellProps> = ({
|
||||
headline,
|
||||
healthColor,
|
||||
reindexIndexName,
|
||||
deleteIndexName,
|
||||
indexName,
|
||||
reindex,
|
||||
deleteIndex,
|
||||
needsDefaultFields,
|
||||
docUrl,
|
||||
items = [],
|
||||
children,
|
||||
|
@ -78,17 +83,23 @@ export const DeprecationCell: StatelessComponent<DeprecationCellProps> = ({
      ))}
    </EuiFlexItem>

    {reindexIndexName && (
    {reindex && (
      <EuiFlexItem grow={false}>
        <ReindexButton indexName={reindexIndexName} />
        <ReindexButton indexName={indexName!} />
      </EuiFlexItem>
    )}

    {deleteIndexName && (
    {deleteIndex && (
      <EuiFlexItem grow={false}>
        <DeleteTasksButton />
      </EuiFlexItem>
    )}

    {needsDefaultFields && (
      <EuiFlexItem grow={false}>
        <FixDefaultFieldsButton indexName={indexName!} />
      </EuiFlexItem>
    )}
  </EuiFlexGroup>

  <EuiSpacer size="s" />
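
The prop refactor above replaces the two name-carrying props (`reindexIndexName`, `deleteIndexName`) with a single optional `indexName` plus per-action boolean flags, so the new default-field action did not require yet another name prop. A hypothetical call site under the new shape (all values illustrative):

// Hypothetical usage; headline and index name are made up for illustration.
<DeprecationCell
  headline="Index created before 7.0"
  healthColor="danger"
  indexName="myindex-2019.03.11"
  reindex={true}
  items={[]}
/>
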
@ -0,0 +1,129 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React, { ReactNode } from 'react';

import { EuiButton } from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import { kfetch } from 'ui/kfetch';
import { LoadingState } from '../../../../types';

/**
 * Field types used by Metricbeat to generate the default_field setting.
 * Matches Beats code here:
 * https://github.com/elastic/beats/blob/eee127cb59b56f2ed7c7e317398c3f79c4158216/libbeat/template/processor.go#L104
 */
const METRICBEAT_DEFAULT_FIELD_TYPES: ReadonlySet<string> = new Set(['keyword', 'text', 'ip']);
const METRICBEAT_OTHER_DEFAULT_FIELDS: ReadonlySet<string> = new Set(['fields.*']);

interface FixDefaultFieldsButtonProps {
  indexName: string;
}

interface FixDefaultFieldsButtonState {
  fixLoadingState?: LoadingState;
}
/**
 * Renders a button that adds a default_field setting to the given index, shown
 * only when the index is a valid Metricbeat index.
 */
export class FixDefaultFieldsButton extends React.Component<
  FixDefaultFieldsButtonProps,
  FixDefaultFieldsButtonState
> {
  constructor(props: FixDefaultFieldsButtonProps) {
    super(props);
    this.state = {};
  }

  public render() {
    const { fixLoadingState } = this.state;

    if (!this.isMetricbeatIndex()) {
      return null;
    }

    const buttonProps: any = { size: 's', onClick: this.fixMetricbeatIndex };
    let buttonContent: ReactNode;

    switch (fixLoadingState) {
      case LoadingState.Loading:
        buttonProps.disabled = true;
        buttonProps.isLoading = true;
        buttonContent = (
          <FormattedMessage
            id="xpack.upgradeAssistant.checkupTab.fixMetricbeatIndexButton.fixingLabel"
            defaultMessage="Fixing…"
          />
        );
        break;
      case LoadingState.Success:
        buttonProps.iconSide = 'left';
        buttonProps.iconType = 'check';
        buttonProps.disabled = true;
        buttonContent = (
          <FormattedMessage
            id="xpack.upgradeAssistant.checkupTab.fixMetricbeatIndexButton.fixedLabel"
            defaultMessage="Fixed"
          />
        );
        break;
      case LoadingState.Error:
        buttonProps.color = 'danger';
        buttonProps.iconSide = 'left';
        buttonProps.iconType = 'cross';
        buttonContent = (
          <FormattedMessage
            id="xpack.upgradeAssistant.checkupTab.fixMetricbeatIndexButton.failedLabel"
            defaultMessage="Failed"
          />
        );
        break;
      default:
        buttonContent = (
          <FormattedMessage
            id="xpack.upgradeAssistant.checkupTab.fixMetricbeatIndexButton.reindexLabel"
            defaultMessage="Fix"
          />
        );
    }

    return <EuiButton {...buttonProps}>{buttonContent}</EuiButton>;
  }

  private isMetricbeatIndex = () => {
    return this.props.indexName.startsWith('metricbeat-');
  };

  private fixMetricbeatIndex = async () => {
    if (!this.isMetricbeatIndex()) {
      return;
    }

    this.setState({
      fixLoadingState: LoadingState.Loading,
    });

    try {
      await kfetch({
        pathname: `/api/upgrade_assistant/add_query_default_field/${this.props.indexName}`,
        method: 'POST',
        body: JSON.stringify({
          fieldTypes: [...METRICBEAT_DEFAULT_FIELD_TYPES],
          otherFields: [...METRICBEAT_OTHER_DEFAULT_FIELDS],
        }),
      });

      this.setState({
        fixLoadingState: LoadingState.Success,
      });
    } catch (e) {
      this.setState({
        fixLoadingState: LoadingState.Error,
      });
    }
  };
}
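
Note that the button is self-gating: it renders nothing unless the index name starts with `metricbeat-`, so callers can drop it in unconditionally. A hypothetical usage sketch (index names illustrative):

// Renders a "Fix" button that POSTs the Metricbeat field types on click:
<FixDefaultFieldsButton indexName="metricbeat-6.7.0-2019.03.11" />

// Renders null; not a Metricbeat index:
<FixDefaultFieldsButton indexName="filebeat-6.7.0-2019.03.11" />
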
@ -9,6 +9,7 @@ import React from 'react';

import { EuiBasicTable } from '@elastic/eui';
import { injectI18n } from '@kbn/i18n/react';
import { FixDefaultFieldsButton } from './default_fields/button';
import { DeleteTasksButton } from './delete_tasks_button';
import { ReindexButton } from './reindex';

@ -18,6 +19,7 @@ export interface IndexDeprecationDetails {
  index: string;
  reindex: boolean;
  delete: boolean;
  needsDefaultFields: boolean;
  details?: string;
}
@ -137,7 +139,10 @@ export class IndexDeprecationTableUI extends React.Component<
    // NOTE: this naive implementation assumes all indices in the table
    // should show the reindex button. This should work for known use cases.
    const { indices } = this.props;
    if (!indices.find(i => i.reindex || i.delete)) {
    const showDeleteButton = indices.find(i => i.delete === true);
    const showReindexButton = indices.find(i => i.reindex === true);
    const showNeedsDefaultFieldsButton = indices.find(i => i.needsDefaultFields === true);
    if (!showDeleteButton && !showReindexButton && !showNeedsDefaultFieldsButton) {
      return null;
    }
@ -145,11 +150,13 @@ export class IndexDeprecationTableUI extends React.Component<
      actions: [
        {
          render(indexDep: IndexDeprecationDetails) {
            return indexDep.delete ? (
              <DeleteTasksButton />
            ) : (
              <ReindexButton indexName={indexDep.index} />
            );
            if (showDeleteButton) {
              return <DeleteTasksButton />;
            } else if (showReindexButton) {
              return <ReindexButton indexName={indexDep.index!} />;
            } else {
              return <FixDefaultFieldsButton indexName={indexDep.index!} />;
            }
          },
        },
      ],
@ -74,12 +74,14 @@ describe('DeprecationList', () => {
        "delete": false,
        "details": undefined,
        "index": "0",
        "needsDefaultFields": false,
        "reindex": false,
      },
      Object {
        "delete": false,
        "details": undefined,
        "index": "1",
        "needsDefaultFields": false,
        "reindex": false,
      },
    ]
@ -17,6 +17,7 @@ import { IndexDeprecationDetails, IndexDeprecationTable } from './index_table';

const OLD_INDEX_MESSAGE = `Index created before ${CURRENT_MAJOR_VERSION}.0`;
const DELETE_INDEX_MESSAGE = `.tasks index must be re-created`;
const NEEDS_DEFAULT_FIELD_MESSAGE = 'Number of fields exceeds automatic field expansion limit';

const sortByLevelDesc = (a: DeprecationInfo, b: DeprecationInfo) => {
  return -1 * (LEVEL_MAP[a.level] - LEVEL_MAP[b.level]);
@ -38,10 +39,10 @@ const MessageDeprecation: StatelessComponent<{ deprecation: EnrichedDeprecationI
  <DeprecationCell
    headline={deprecation.message}
    healthColor={COLOR_MAP[deprecation.level]}
    reindexIndexName={deprecation.message === OLD_INDEX_MESSAGE ? deprecation.index! : undefined}
    deleteIndexName={
      deprecation.message === DELETE_INDEX_MESSAGE ? deprecation.index! : undefined
    }
    indexName={deprecation.index}
    reindex={deprecation.message === OLD_INDEX_MESSAGE}
    deleteIndex={deprecation.message === DELETE_INDEX_MESSAGE}
    needsDefaultFields={deprecation.message === NEEDS_DEFAULT_FIELD_MESSAGE}
    docUrl={deprecation.url}
    items={items}
  />
@ -97,6 +98,7 @@ export const DeprecationList: StatelessComponent<{
      details: dep.details,
      reindex: dep.message === OLD_INDEX_MESSAGE,
      delete: dep.message === DELETE_INDEX_MESSAGE,
      needsDefaultFields: dep.message === NEEDS_DEFAULT_FIELD_MESSAGE,
    }));

    return <IndexDeprecation indices={indices} deprecation={deprecations[0]} />;
@ -11,6 +11,7 @@ import { makeUpgradeAssistantUsageCollector } from './lib/telemetry';
import { registerClusterCheckupRoutes } from './routes/cluster_checkup';
import { registerDeleteTasksRoutes } from './routes/delete_tasks';
import { registerDeprecationLoggingRoutes } from './routes/deprecation_logging';
import { registerQueryDefaultFieldRoutes } from './routes/query_default_field';
import { registerReindexIndicesRoutes, registerReindexWorker } from './routes/reindex_indices';
import { registerTelemetryRoutes } from './routes/telemetry';

@ -18,6 +19,7 @@ export function initServer(server: Legacy.Server) {
  registerClusterCheckupRoutes(server);
  registerDeleteTasksRoutes(server);
  registerDeprecationLoggingRoutes(server);
  registerQueryDefaultFieldRoutes(server);

  // The ReindexWorker uses a map of request headers that contain the authentication credentials
  // for a given reindex. We cannot currently store these in the .kibana index b/c we do not
@ -0,0 +1,106 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
import { MappingProperties } from './reindexing/types';

import { addDefaultField, generateDefaultFields } from './query_default_field';

const defaultFieldTypes = new Set(['keyword', 'text', 'ip']);

describe('getDefaultFieldList', () => {
  it('returns dot-delimited flat list', () => {
    const mapping: MappingProperties = {
      nested1: {
        properties: {
          included2: { type: 'ip' },
          ignored2: { type: 'geopoint' },
          nested2: {
            properties: {
              included3: { type: 'keyword' },
              'included4.keyword': { type: 'keyword' },
            },
          },
        },
      },
      ignored1: { type: 'object' },
      included1: { type: 'text' },
    };

    expect(generateDefaultFields(mapping, defaultFieldTypes)).toMatchInlineSnapshot(`
      Array [
        "nested1.included2",
        "nested1.nested2.included3",
        "nested1.nested2.included4.keyword",
        "included1",
      ]
    `);
  });
});

describe('fixMetricbeatIndex', () => {
  const mockMappings = {
    'metricbeat-1': {
      mappings: { _doc: { properties: { field1: { type: 'text' }, field2: { type: 'float' } } } },
    },
  };
  const mockSettings = {
    'metricbeat-1': {
      settings: {},
    },
  };

  it('fails if index already has index.query.default_field setting', async () => {
    const callWithRequest = jest.fn().mockResolvedValueOnce({
      'metricbeat-1': {
        settings: { index: { query: { default_field: [] } } },
      },
    });
    await expect(
      addDefaultField(callWithRequest, {} as any, 'metricbeat-1', defaultFieldTypes)
    ).rejects.toThrowErrorMatchingInlineSnapshot(
      `"Index metricbeat-1 already has index.query.default_field set"`
    );
  });

  it('updates index settings with default_field generated from mappings and otherFields', async () => {
    const callWithRequest = jest
      .fn()
      .mockResolvedValueOnce(mockSettings)
      .mockResolvedValueOnce(mockMappings)
      .mockResolvedValueOnce({ acknowledged: true });

    await expect(
      addDefaultField(
        callWithRequest,
        {} as any,
        'metricbeat-1',
        defaultFieldTypes,
        new Set(['fields.*', 'myCustomField'])
      )
    ).resolves.toEqual({
      acknowledged: true,
    });
    expect(callWithRequest.mock.calls[2]).toMatchInlineSnapshot(`
      Array [
        Object {},
        "indices.putSettings",
        Object {
          "body": Object {
            "index": Object {
              "query": Object {
                "default_field": Array [
                  "field1",
                  "fields.*",
                  "myCustomField",
                ],
              },
            },
          },
          "index": "metricbeat-1",
        },
      ]
    `);
  });
});
@ -0,0 +1,80 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import Boom from 'boom';
import { Request } from 'hapi';
import { get } from 'lodash';

import { CallClusterWithRequest } from 'src/legacy/core_plugins/elasticsearch';
import { MappingProperties } from './reindexing/types';

/**
 * Adds the index.query.default_field setting, generated from the index's mapping.
 *
 * @param callWithRequest
 * @param request
 * @param indexName
 * @param fieldTypes - Elasticsearch field types that should be used to generate the default_field from the index mapping
 * @param otherFields - Other fields that should be included in the generated default_field that do not match `fieldTypes`
 */
export const addDefaultField = async (
  callWithRequest: CallClusterWithRequest,
  request: Request,
  indexName: string,
  fieldTypes: ReadonlySet<string>,
  otherFields: ReadonlySet<string> = new Set()
) => {
  // Verify index.query.default_field is not already set.
  const settings = await callWithRequest(request, 'indices.getSettings', {
    index: indexName,
  });
  if (get(settings, `${indexName}.settings.index.query.default_field`)) {
    throw Boom.badRequest(`Index ${indexName} already has index.query.default_field set`);
  }

  // Get the mapping and generate the default_field based on `fieldTypes`
  const mappingResp = await callWithRequest(request, 'indices.getMapping', {
    index: indexName,
    include_type_name: true,
  });
  const typeName = Object.getOwnPropertyNames(mappingResp[indexName].mappings)[0];
  const mapping = mappingResp[indexName].mappings[typeName].properties as MappingProperties;
  const generatedDefaultFields = new Set(generateDefaultFields(mapping, fieldTypes));

  // Update the setting with the generated default_field
  return await callWithRequest(request, 'indices.putSettings', {
    index: indexName,
    body: {
      index: { query: { default_field: [...generatedDefaultFields, ...otherFields] } },
    },
  });
};

/**
 * Recursively walks an index mapping and returns a flat array of dot-delimited
 * strings representing all fields that are of a type included in `fieldTypes`.
 * @param mapping
 * @param fieldTypes
 */
export const generateDefaultFields = (
  mapping: MappingProperties,
  fieldTypes: ReadonlySet<string>
): string[] =>
  Object.getOwnPropertyNames(mapping).reduce(
    (defaultFields, fieldName) => {
      const { type, properties } = mapping[fieldName];

      if (type && fieldTypes.has(type)) {
        defaultFields.push(fieldName);
      } else if (properties) {
        generateDefaultFields(properties, fieldTypes).forEach(subField =>
          defaultFields.push(`${fieldName}.${subField}`)
        );
      }

      return defaultFields;
    },
    [] as string[]
  );
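
As the unit test earlier in this diff shows, the recursion flattens nested `properties` into dot-delimited paths and drops any field whose type is not in `fieldTypes`. A second, hypothetical input for illustration (field names made up):

const mapping: MappingProperties = {
  labels: { properties: { app: { type: 'keyword' } } }, // nested -> 'labels.app'
  message: { type: 'text' }, // kept: 'text' is in fieldTypes
  bytes: { type: 'long' }, // dropped: 'long' is not in fieldTypes
};

generateDefaultFields(mapping, new Set(['keyword', 'text', 'ip']));
// => ['labels.app', 'message']
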
@ -0,0 +1,87 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { Server } from 'hapi';

jest.mock('../lib/es_version_precheck');

const mockAddDefaultField = jest.fn();
jest.mock('../lib/query_default_field', () => ({
  addDefaultField: mockAddDefaultField,
}));

import { registerQueryDefaultFieldRoutes } from './query_default_field';

const callWithRequest = jest.fn();

const server = new Server();
server.plugins = {
  elasticsearch: {
    getCluster: () => ({ callWithRequest } as any),
  } as any,
} as any;
server.config = () => ({ get: () => '' } as any);

registerQueryDefaultFieldRoutes(server);

describe('add query default field API', () => {
  beforeEach(() => {
    mockAddDefaultField.mockClear();
  });

  it('calls addDefaultField with index, field types, and other fields', async () => {
    mockAddDefaultField.mockResolvedValueOnce({ acknowledged: true });
    const resp = await server.inject({
      method: 'POST',
      url: '/api/upgrade_assistant/add_query_default_field/myIndex',
      payload: {
        fieldTypes: ['text', 'boolean'],
        otherFields: ['myCustomField'],
      },
    });

    expect(mockAddDefaultField).toHaveBeenCalledWith(
      callWithRequest,
      expect.anything(),
      'myIndex',
      new Set(['text', 'boolean']),
      new Set(['myCustomField'])
    );
    expect(resp.statusCode).toEqual(200);
    expect(resp.payload).toMatchInlineSnapshot(`"{\\"acknowledged\\":true}"`);
  });

  it('calls addDefaultField with index and field types if other fields are not specified', async () => {
    mockAddDefaultField.mockResolvedValueOnce({ acknowledged: true });
    const resp = await server.inject({
      method: 'POST',
      url: '/api/upgrade_assistant/add_query_default_field/myIndex',
      payload: {
        fieldTypes: ['text', 'boolean'],
      },
    });

    expect(mockAddDefaultField).toHaveBeenCalledWith(
      callWithRequest,
      expect.anything(),
      'myIndex',
      new Set(['text', 'boolean']),
      undefined
    );
    expect(resp.statusCode).toEqual(200);
    expect(resp.payload).toMatchInlineSnapshot(`"{\\"acknowledged\\":true}"`);
  });

  it('fails if fieldTypes is not specified', async () => {
    const resp = await server.inject({
      method: 'POST',
      url: '/api/upgrade_assistant/add_query_default_field/myIndex',
    });

    expect(mockAddDefaultField).not.toHaveBeenCalled();
    expect(resp.statusCode).toEqual(400);
  });
});
@ -0,0 +1,67 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import Boom from 'boom';
import Joi from 'joi';
import { Legacy } from 'kibana';
import _ from 'lodash';

import { EsVersionPrecheck } from '../lib/es_version_precheck';
import { addDefaultField } from '../lib/query_default_field';

/**
 * Adds routes for detecting and fixing 6.x Metricbeat indices that need the
 * `index.query.default_field` index setting added.
 *
 * @param server
 */
export function registerQueryDefaultFieldRoutes(server: Legacy.Server) {
  const { callWithRequest } = server.plugins.elasticsearch.getCluster('admin');

  server.route({
    path: '/api/upgrade_assistant/add_query_default_field/{indexName}',
    method: 'POST',
    options: {
      pre: [EsVersionPrecheck],
      validate: {
        params: Joi.object({
          indexName: Joi.string().required(),
        }),
        payload: Joi.object({
          fieldTypes: Joi.array()
            .items(Joi.string())
            .required(),
          otherFields: Joi.array().items(Joi.string()),
        }),
      },
    },
    async handler(request) {
      try {
        const { indexName } = request.params;
        const { fieldTypes, otherFields } = request.payload as {
          fieldTypes: string[];
          otherFields?: string[];
        };

        return await addDefaultField(
          callWithRequest,
          request,
          indexName,
          new Set(fieldTypes),
          otherFields ? new Set(otherFields) : undefined
        );
      } catch (e) {
        if (e.status === 403) {
          return Boom.forbidden(e.message);
        }

        return Boom.boomify(e, {
          statusCode: 500,
        });
      }
    },
  });
}
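
For reference, the client invokes this route with the payload shape validated above; the sketch below mirrors the kfetch call in the FixDefaultFieldsButton component earlier in the diff (the index name is illustrative):

import { kfetch } from 'ui/kfetch';

// POSTs to the route registered above; resolves with the putSettings acknowledgement.
const resp = await kfetch({
  pathname: '/api/upgrade_assistant/add_query_default_field/metricbeat-6.7.0-2019.03.11',
  method: 'POST',
  body: JSON.stringify({
    fieldTypes: ['keyword', 'text', 'ip'], // mapping types to include
    otherFields: ['fields.*'], // extra patterns appended verbatim
  }),
});
// resp => { acknowledged: true } on success
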
@ -24,7 +24,9 @@ export const uptime = (kibana: any) =>
      }),
      icon: 'plugins/uptime/icons/heartbeat_white.svg',
      euiIconType: 'uptimeApp',
      title: 'Uptime',
      title: i18n.translate('xpack.uptime.uptimeFeatureCatalogueTitle', {
        defaultMessage: 'Uptime',
      }),
      main: 'plugins/uptime/app',
      order: 8900,
      url: '/app/uptime#/',
@ -34,7 +34,7 @@ exports[`FilterBar component renders without errors 1`] = `
Object {
  "field": "monitor.id",
  "multiSelect": false,
  "name": "Host",
  "name": "ID",
  "options": Array [
    Object {
      "value": "http@http://localhost:12349/",
@ -8,6 +8,7 @@ export default function ({ loadTestFile }) {
  describe('upgrade assistant', function () {
    this.tags('ciGroup5');

    loadTestFile(require.resolve('./query_default_field'));
    loadTestFile(require.resolve('./reindexing'));
  });
}
@ -0,0 +1,46 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import expect from 'expect.js';

export default function ({ getService }) {
  const supertest = getService('supertest');
  const esArchiver = getService('esArchiver');
  const es = getService('es');

  const indexName = `metricbeat-6.7.0-2019.03.11`;

  describe('add default_field setting', () => {
    beforeEach(async () => {
      await esArchiver.load('upgrade_assistant/metricbeat');
    });

    afterEach(async () => {
      await esArchiver.unload('upgrade_assistant/metricbeat');
    });

    it('adds index.query.default_field to metricbeat index', async () => {
      const { body } = await supertest
        .post(`/api/upgrade_assistant/add_query_default_field/${indexName}`)
        .set('kbn-xsrf', 'xxx')
        .send({ fieldTypes: ['text', 'keyword', 'ip'], otherFields: ['fields.*'] })
        .expect(200);
      expect(body.acknowledged).to.be(true);

      // The index.query.default_field setting should now be set
      const settingsResp = await es.indices.getSettings({ index: indexName });
      expect(settingsResp[indexName].settings.index.query.default_field).to.not.be(undefined);

      // Deprecation message should be gone
      const { body: uaBody } = await supertest.get('/api/upgrade_assistant/status').expect(200);
      const depMessage = uaBody.indices.find(
        dep =>
          dep.index === indexName &&
          dep.message === 'Number of fields exceeds automatic field expansion limit'
      );
      expect(depMessage).to.be(undefined);
    });
  });
}
yarn.lock
@ -10806,7 +10806,7 @@ handlebars@4.0.11:
  optionalDependencies:
    uglify-js "^2.6"

handlebars@4.0.12, handlebars@^4.0.1, handlebars@^4.0.10, handlebars@^4.0.3:
handlebars@4.0.12, handlebars@^4.0.1, handlebars@^4.0.3:
  version "4.0.12"
  resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.0.12.tgz#2c15c8a96d46da5e266700518ba8cb8d919d5bc5"
  integrity sha512-RhmTekP+FZL+XNhwS1Wf+bTTZpdLougwt5pcgA1tuz6Jcx0fpH/7z0qd71RKnZHBCxIRBHfBOnio4gViPemNzA==
@ -10817,16 +10817,27 @@ handlebars@4.0.12, handlebars@^4.0.1, handlebars@^4.0.10, handlebars@^4.0.3:
  optionalDependencies:
    uglify-js "^3.1.4"

handlebars@4.0.5:
  version "4.0.5"
  resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.0.5.tgz#92c6ed6bb164110c50d4d8d0fbddc70806c6f8e7"
  integrity sha1-ksbta7FkEQxQ1NjQ+93HCAbG+Oc=
handlebars@4.0.13:
  version "4.0.13"
  resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.0.13.tgz#89fc17bf26f46fd7f6f99d341d92efaae64f997d"
  integrity sha512-uydY0jy4Z3wy/iGXsi64UtLD4t1fFJe16c/NFxsYE4WdQis8ZCzOXUZaPQNG0e5bgtLQV41QTfqBindhEjnpyQ==
  dependencies:
    async "^1.4.0"
    async "^2.5.0"
    optimist "^0.6.1"
    source-map "^0.4.4"
    source-map "^0.6.1"
  optionalDependencies:
    uglify-js "^2.6"
    uglify-js "^3.1.4"

handlebars@^4.0.13:
  version "4.1.0"
  resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.1.0.tgz#0d6a6f34ff1f63cecec8423aa4169827bf787c3a"
  integrity sha512-l2jRuU1NAWK6AW5qqcTATWQJvNPEwkM7NEKSiv/gqOsoSQbVoWyqVEY5GS+XPQ88zLNmqASRpzfdm8d79hJS+w==
  dependencies:
    async "^2.5.0"
    optimist "^0.6.1"
    source-map "^0.6.1"
  optionalDependencies:
    uglify-js "^3.1.4"

hapi-auth-cookie@^9.0.0:
  version "9.0.0"