mirror of
https://github.com/elastic/kibana.git
synced 2025-04-24 01:38:56 -04:00
* [esArchiver] combine elasticDump and ScenarioManager (#10359)
* As a part of bringing functional testing to plugins, esArchiver gives these plugins a way to capture and reload es indexes without needing to write a bunch of custom code. It works similarly to the elasticDump and ScenarioManager tools that it replaces.

  Differences:
  - Streaming implementation allows for much larger archives
  - CLI for creating and using archives
  - Configurable archive location
  - Stores the data in gzipped files (better for source control, searching, large archives)
  - Automatically identifies and upgrades Kibana config documents

  Methods:
  - `#load(name)`: import an archive
  - `#loadIfNeeded(name)`: import an archive, but skip the documents that belong to any existing index
  - `#unload(name)`: delete the indexes stored in an archive

  CLI operations:
  - `./bin/es_archiver save <name> [index patterns...]`: save the mapping and documents in one or more indexes that match the wild-card patterns into the `<name>` archive
  - `./bin/es_archiver load <name>`: load the mapping and documents from the `<name>` archive
* [functional_tests/common/nagivate] check for statusPage
* [es_archiver] move bins into new scripts dir
* [functional_tests/apps/context] use esArchiver
* [esArchiver] general improvements after showing to a few folks
  - remove auto-upgrading config doc logic (until we have better access to kibana version info)
  - export unload command
  - remove preemptive checks in favor of reacting to errors
  - use type "doc" vs "hit" for doc records (consistency)
  - wrote a bunch of pending tests to think through and plan
* [esArchiver] make log a stream that writes to itself
* [esArchiver] fill in stats and archive format tests
* [esArchiver] splitup action logic
* [esArchiver/cli] fix cli --help output and comment
* [esArchiver] remove type-based param coercion
* [esArchiver/log] use strings for log levels
* [esArchvier] remove unused var
* [esArchiver/indexDocRecordsStream] add tests
* [esArchive] fill in remaining tests
* [esArchiver] fix dem tests
* [eslint] remove unused vars
* [esArchiver/loadIfNeeded] fix call to load()
* [esArchiver] remove loadDumpData helpers
* [esArchiver] update archives for 5.x
* [functionalTestRunner] replace intern (#10910)
* [functional_test_runner] replace functional testing tools with custom/pluggable solution
* [functional_test_runner] Convert unit tests to commonjs format
* [functional_test_runner] Fix dashboard test in wrong mode
* [functional_test_runner] Add dashboardLandingPage test subject
* [functional_test_runner] Get Visualize page object
* [functional_test_runner] Fix outdated references
* [functional_test_runner] Fix more outdated refs
* [functional_test_runner] Remove duplicate tests
* [functional_test_runner] Improve test readability
* [functional_test_runner] 😞 So many duplicate methods
* [functional_test_runner] Move mgmt `before` outside toplevel describe
* [functional_test_runner] Settings page obj missing methods
* [functional_test_runner] Add improvements from @gammon
* [functional_test_runner] Fix return statements in async funcs
* [functional_test_runner] Move before() to correct scope
* [functional_test_runner] Add after() hooks to remove index patterns
* [functional_test_runner] Attempt to fix vertical bar chart tests
* [functional_test_runner] Clean up
* [functional_test_runner] Reinstate unit tests
* [functional_test_runner] Set default loglevel back to info
* [functional_test_runner] Replace `context`s with `describe`s
* [functional_test_runner] Better error handling
* [functional_test_runner] Add in new Tile Map tests
* Incorporate changes from master
* [functional_test_runner] validate that every test file has a single top-level suite
* Update contributing doc with link to full doc
* [docs] Spelling and grammar fixes
* docs: writing and running functional tests
* [docs] Move plugin doc to plugin area
* [docs] Housekeeping. Doc in wrong place
* [docs] Remove dup doc file
* [grunt] Only run mocha_setup when running tests, not every grunt task
* [eslint] remove use of context()
This commit is contained in:
parent
420e308797
commit
72f716b5e6
253 changed files with 16194 additions and 68080 deletions
|
@ -308,12 +308,7 @@ npm run test:ui:runner
|
|||
|
||||
##### Browser Automation Notes
|
||||
|
||||
- Using Page Objects pattern (https://theintern.github.io/intern/#writing-functional-test)
|
||||
- At least the initial tests for the Settings, Discover, and Visualize tabs all depend on a very specific set of logstash-type data (generated with makelogs). Since that is a static set of data, all the Discover and Visualize tests use a specific Absolute time range. This guarantees the same results each run.
|
||||
- These tests have been developed and tested with Chrome and Firefox browser. In theory, they should work on all browsers (that's the benefit of Intern using Leadfoot).
|
||||
- These tests should also work with an external testing service like https://saucelabs.com/ or https://www.browserstack.com/ but that has not been tested.
|
||||
- https://theintern.github.io/
|
||||
- https://theintern.github.io/leadfoot/module-leadfoot_Element.html
|
||||
[Read about the `FunctionalTestRunner`](https://www.elastic.co/guide/en/kibana/current/development-functional-tests.html) to learn more about how you can run and develop functional tests for Kibana core and plugins.
|
||||
|
||||
### Building OS packages
|
||||
|
||||
|
@ -374,4 +369,4 @@ Remember, someone is blocked by a pull awaiting review, make it count. Be thorou
|
|||
1. **Hand it off** If you're the first reviewer and everything looks good but the changes are more than a few lines, hand the pull to someone else to take a second look. Again, try to find the right person to assign it to.
|
||||
1. **Merge the code** When everything looks good, put in a `LGTM` (looks good to me) comment. Merge into the target branch. Check the labels on the pull to see if backporting is required, and perform the backport if so.
|
||||
|
||||
Thank you so much for reading our guidelines! :tada:
|
||||
Thank you so much for reading our guidelines! :tada:
|
||||
|
|
|
@ -57,7 +57,7 @@ module.exports = function (grunt) {
|
|||
init: true,
|
||||
config: config,
|
||||
loadGruntTasks: {
|
||||
pattern: ['grunt-*', '@*/grunt-*', 'gruntify-*', '@*/gruntify-*', 'intern']
|
||||
pattern: ['grunt-*', '@*/grunt-*', 'gruntify-*', '@*/gruntify-*']
|
||||
}
|
||||
});
|
||||
|
||||
|
|
|
@ -5,6 +5,7 @@
|
|||
* <<development-dependencies>>
|
||||
* <<development-modules>>
|
||||
* <<development-elasticsearch>>
|
||||
* <<development-functional-tests>>
|
||||
|
||||
include::core/development-basepath.asciidoc[]
|
||||
|
||||
|
@ -12,4 +13,6 @@ include::core/development-dependencies.asciidoc[]
|
|||
|
||||
include::core/development-modules.asciidoc[]
|
||||
|
||||
include::plugin/development-elasticsearch.asciidoc[]
|
||||
include::core/development-elasticsearch.asciidoc[]
|
||||
|
||||
include::core/development-functional-tests.asciidoc[]
|
||||
|
|
383
docs/development/core/development-functional-tests.asciidoc
Normal file
|
@ -0,0 +1,383 @@
|
|||
[[development-functional-tests]]
|
||||
=== Functional Testing
|
||||
|
||||
We use functional tests to make sure the Kibana UI works as expected. It replaces hours of manual testing by automating user interaction. To have better control over our functional test environment, and to make it more accessible to plugin authors, Kibana uses a tool called the `FunctionalTestRunner`.
|
||||
|
||||
[float]
|
||||
==== Running functional tests
|
||||
|
||||
The `FunctionalTestRunner` is very bare bones and gets most of its functionality from its config file, located at {blob}test/functional/config.js[test/functional/config.js]. If you’re writing a plugin you will have your own config file. See <<development-plugin-functional-tests>> for more info.
|
||||
|
||||
Execute the `FunctionalTestRunner`'s script with node.js to run the tests with Kibana's default configuration:
|
||||
|
||||
["source","shell"]
|
||||
-----------
|
||||
node scripts/functional_test_runner
|
||||
-----------
|
||||
|
||||
When run without any arguments the `FunctionalTestRunner` automatically loads the configuration in the standard location, but you can override that behavior with the `--config` flag. There are also command line flags for `--bail` and `--grep`, which behave just like their mocha counterparts. The logging can also be customized with `--quiet`, `--debug`, or `--verbose` flags.
|
||||
|
||||
Use the `--help` flag for more options.
|
||||
|
||||
[float]
|
||||
==== Writing functional tests
|
||||
|
||||
[float]
|
||||
===== Environment
|
||||
|
||||
The tests are written in https://mochajs.org[mocha] using https://github.com/Automattic/expect.js[expect] for assertions.
|
||||
|
||||
We use https://sites.google.com/a/chromium.org/chromedriver/[chromedriver], https://theintern.github.io/leadfoot[leadfoot], and https://github.com/theintern/digdug[digdug] for automating Chrome. When the `FunctionalTestRunner` launches, digdug opens a `Tunnel` which starts chromedriver and a stripped-down instance of Chrome. It also creates an instance of https://theintern.github.io/leadfoot/module-leadfoot_Command.html[Leadfoot's `Command`] class, which is available via the `remote` service. The `remote` communicates to Chrome through the digdug `Tunnel`. See the https://theintern.github.io/leadfoot/module-leadfoot_Command.html[leadfoot/Command API] docs for all the commands you can use with `remote`.
|
||||
|
||||
The `FunctionalTestRunner` automatically transpiles functional tests using babel, so that tests can use the same ECMAScript features that Kibana source code uses. See the {blob}style_guides/js_style_guide.md[JavaScript style guide].
|
||||
|
||||
[float]
|
||||
===== Definitions
|
||||
|
||||
**Provider:**
|
||||
|
||||
Code run by the `FunctionalTestRunner` is wrapped in a function so it can be passed around via config files and be parameterized. Any of these Provider functions may be asynchronous and should return/resolve-to the value they are meant to *provide*. Provider functions will always be called with a single argument: a provider API (see Provider API section).
|
||||
|
||||
A config provider:
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
// config and test files use `export default`
|
||||
export default function (/* { providerAPI } */) {
|
||||
return {
|
||||
// ...
|
||||
}
|
||||
}
|
||||
-----------
|
||||
|
||||
**Services:**
|
||||
|
||||
Services are named singleton values produced by a Service Provider. Tests and other services can retrieve service instances by asking for them by name. All functionality except the mocha API is exposed via services.
|
||||
|
||||
**Page objects:**
|
||||
|
||||
Page objects are a special type of service that encapsulate behaviors common to a particular page or plugin. When you write your own plugin, you’ll likely want to add a page object (or several) that describes the common interactions your tests need to execute.
|
||||
|
||||
**Test Files:**
|
||||
|
||||
The `FunctionalTestRunner`'s primary purpose is to execute test files. These files export a Test Provider that is called with a Provider API but is not expected to return a value. Instead Test Providers define a suite using https://mochajs.org/#bdd[mocha's BDD interface].
|
||||
|
||||
**Test Suite:**
|
||||
|
||||
A test suite is a collection of tests defined by calling `describe()`, and then populated with tests and setup/teardown hooks by calling `it()`, `before()`, `beforeEach()`, etc. Every test file must define only one top level test suite, and test suites can have as many nested test suites as they like.
|
||||
|
||||
[float]
|
||||
===== Anatomy of a test file
|
||||
|
||||
The annotated example file below shows the basic structure every test suite uses. It starts by importing https://github.com/Automattic/expect.js[`expect.js`] and defining its default export: an anonymous Test Provider. The test provider then destructures the Provider API for the `getService()` and `getPageObjects()` functions. It uses these functions to collect the dependencies of this suite. The rest of the test file will look pretty normal to mocha.js users. `describe()`, `it()`, `before()` and the lot are used to define suites that happen to automate a browser via services and objects of type `PageObject`.
|
||||
|
||||
["source","js"]
|
||||
----
|
||||
import expect from 'expect.js';
|
||||
// test files must `export default` a function that defines a test suite
|
||||
export default function ({ getService, getPageObjects }) {
|
||||
|
||||
// most test files will start off by loading some services
|
||||
const retry = getService('retry');
|
||||
const testSubjects = getService('testSubjects');
|
||||
const esArchiver = getService('esArchiver');
|
||||
|
||||
// for historical reasons, PageObjects are loaded in a single API call
|
||||
// and returned on an object with a key/value for each requested PageObject
|
||||
const PageObjects = getPageObjects(['common', 'visualize']);
|
||||
|
||||
// every file must define a top-level suite before defining hooks/tests
|
||||
describe('My Test Suite', () => {
|
||||
|
||||
// most suites start with a before hook that navigates to a specific
|
||||
// app/page and restores some archives into elasticsearch with esArchiver
|
||||
before(async () => {
|
||||
await Promise.all([
|
||||
// start with an empty .kibana index
|
||||
esArchiver.load('empty_kibana'),
|
||||
// load some basic log data only if the index doesn't exist
|
||||
esArchiver.loadIfNeeded('makelogs')
|
||||
]);
|
||||
// go to the page described by `apps.visualize` in the config
|
||||
await PageObjects.common.navigateTo('visualize');
|
||||
});
|
||||
|
||||
// right after the before() hook definition, add the teardown steps
|
||||
// that will tidy up elasticsearch for other test suites
|
||||
after(async () => {
|
||||
// we unload the empty_kibana archive but not the makelogs
|
||||
// archive because we don't make any changes to it, and subsequent
|
||||
// suites could use it if they call `.loadIfNeeded()`.
|
||||
await esArchiver.unload('empty_kibana');
|
||||
});
|
||||
|
||||
// This series of tests illustrate how tests generally verify
|
||||
// one step of a larger process and then move on to the next in
|
||||
// a new test, each step building on top of the previous
|
||||
it('Vis Listing Page is empty');
|
||||
it('Create a new vis');
|
||||
it('Shows new vis in listing page');
|
||||
it('Opens the saved vis');
|
||||
it('Respects time filter changes');
|
||||
it(...
|
||||
});
|
||||
|
||||
}
|
||||
----
|
||||
|
||||
[float]
|
||||
==== Provider API
|
||||
|
||||
The first and only argument to all providers is a Provider API Object. This object can be used to load service/page objects and config/test files.
|
||||
|
||||
Within config files the API has the following properties:
|
||||
|
||||
* `log` - An instance of the {blob}src/utils/tooling_log/tooling_log.js[`ToolingLog`] that is ready for use
|
||||
|
||||
* `readConfigFile(path)` - Returns a promise that will resolve to a Config instance that provides the values from the config file at `path`
|
||||
|
||||
Within service and PageObject Providers the API is:
|
||||
|
||||
* `getService(name)` - Load and return the singleton instance of a service by name
|
||||
|
||||
* `getPageObjects(names)` - Load the singleton instances of `PageObject`s and collect them on an object where each name is the key to the singleton instance of that PageObject
|
||||
|
||||
Within a test Provider the API is exactly the same as the service providers API but with an additional method:
|
||||
|
||||
* `loadTestFile(path)` - Load the test file at path in place. Use this method to nest suites from other files into a higher-level suite
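
For example, a top-level index file can compose suites from several other files. This is only a minimal sketch; the file and suite names are hypothetical:

["source","js"]
-----------
// a hypothetical test/functional/apps/my_app/index.js that nests the
// suites defined in sibling files under a single top-level suite
export default function ({ loadTestFile }) {
  describe('my app', () => {
    loadTestFile(require.resolve('./_loading'));
    loadTestFile(require.resolve('./_navigation'));
  });
}
-----------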
|
||||
|
||||
[float]
|
||||
==== Service Index
|
||||
|
||||
[float]
|
||||
===== Built-in Services
|
||||
|
||||
The `FunctionalTestRunner` comes with three built-in services:
|
||||
|
||||
**config:**
|
||||
|
||||
* Source: {blob}src/functional_test_runner/lib/config/config.js[src/functional_test_runner/lib/config/config.js]
|
||||
|
||||
* Schema: {blob}src/functional_test_runner/lib/config/schema.js[src/functional_test_runner/lib/config/schema.js]
|
||||
|
||||
* Use `config.get(path)` to read any value from the config file
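
As a rough sketch, a provider can read nested config values with dotted paths (the provider and the specific paths shown here are illustrative):

["source","js"]
-----------
// a hypothetical provider that reads values from the config service;
// the config paths used below are illustrative
export function ExampleProvider({ getService }) {
  const config = getService('config');

  // dotted paths reach into nested values from the config file
  const screenshotDir = config.get('screenshots.directory');
  const kibanaServerInfo = config.get('servers.kibana');

  return { screenshotDir, kibanaServerInfo };
}
-----------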
|
||||
|
||||
**log:**
|
||||
|
||||
* Source: {blob}src/utils/tooling_log/tooling_log.js[src/utils/tooling_log/tooling_log.js]
|
||||
* `ToolingLog` instances are readable streams. The instance provided by this service is automatically piped to stdout by the `FunctionalTestRunner` CLI
|
||||
* `log.verbose()`, `log.debug()`, `log.info()`, `log.warning()` all work just like console.log but produce more organized output
|
||||
|
||||
**lifecycle:**
|
||||
|
||||
* Source: {blob}src/functional_test_runner/lib/lifecycle.js[src/functional_test_runner/lib/lifecycle.js]
|
||||
* Designed primarily for use in services
|
||||
* Exposes lifecycle events for basic coordination. Handlers can return a promise and resolve/fail asynchronously
|
||||
* Phases include: `beforeLoadTests`, `beforeTests`, `beforeEachTest`, `cleanup`, `phaseStart`, `phaseEnd`
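
For example, a service might coordinate with one of these phases. This sketch assumes the lifecycle service exposes an `on(phase, handler)` registration method:

["source","js"]
-----------
// sketch of a service provider that registers a cleanup handler;
// assumes lifecycle.on(phase, handler) is how handlers are attached
export function TempFileCleanupProvider({ getService }) {
  const lifecycle = getService('lifecycle');
  const log = getService('log');

  lifecycle.on('cleanup', async () => {
    // handlers may return a promise and resolve/fail asynchronously
    log.debug('removing temporary files created during the test run');
  });

  return {};
}
-----------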
|
||||
|
||||
[float]
|
||||
===== Kibana Services
|
||||
|
||||
The Kibana functional tests define the vast majority of the actual functionality used by tests.
|
||||
|
||||
**retry:**
|
||||
|
||||
* Source: {blob}test/functional/services/retry.js[test/functional/services/retry.js]
|
||||
* Helpers for retrying operations
|
||||
* Popular methods:
|
||||
* `retry.try(fn)` - execute `fn` in a loop until it succeeds or the default try timeout elapses
|
||||
* `retry.tryForTime(ms, fn)` - execute `fn` in a loop until it succeeds or `ms` milliseconds elapse (see the sketch below)
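
`retry.try()` is handy for assertions that only become true once the UI settles. A minimal sketch; the page object method used here is hypothetical:

["source","js"]
-----------
// keep executing the function until it stops throwing or the default
// try timeout elapses; getVisualizationCount() is a hypothetical helper
await retry.try(async () => {
  const count = await PageObjects.visualize.getVisualizationCount();
  expect(count).to.be(1);
});
-----------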
|
||||
|
||||
**testSubjects:**
|
||||
|
||||
* Source: {blob}test/functional/services/test_subjects.js[test/functional/services/test_subjects.js]
|
||||
* Test subjects are elements that are tagged specifically for selecting from tests
|
||||
* Use `testSubjects` over CSS selectors when possible
|
||||
* Usage:
|
||||
* Tag your test subject with a `data-test-subj` attribute:
|
||||
|
||||
["source","html"]
|
||||
-----------
|
||||
<div id="container”>
|
||||
<button id="clickMe” data-test-subj=”containerButton” />
|
||||
</div>
|
||||
-----------
|
||||
|
||||
* Click this button using the `testSubjects` helper:
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
await testSubjects.click('containerButton');
|
||||
-----------
|
||||
|
||||
* Popular methods:
|
||||
* `testSubjects.find(testSubjectSelector)` - Find a test subject in the page; throw if it can't be found after some time
|
||||
* `testSubjects.click(testSubjectSelector)` - Click a test subject in the page; throw if it can't be found after some time
|
||||
|
||||
**find:**
|
||||
|
||||
* Source: {blob}test/functional/services/find.js[test/functional/services/find.js]
|
||||
* Helpers for `remote.findBy*` methods that log and manage timeouts
|
||||
* Popular methods:
|
||||
* `find.byCssSelector()`
|
||||
* `find.allByCssSelector()`
|
||||
|
||||
**kibanaServer:**
|
||||
|
||||
* Source: {blob}test/functional/services/kibana_server/kibana_server.js[test/functional/services/kibana_server/kibana_server.js]
|
||||
* Helpers for interacting with Kibana's server
|
||||
* Commonly used methods:
|
||||
* `kibanaServer.uiSettings.update()`
|
||||
* `kibanaServer.version.get()`
|
||||
* `kibanaServer.status.getOverallState()`
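
For example, tests commonly seed a known UI setting before navigating. A sketch; treating `defaultIndex` as the relevant advanced-setting key is an assumption:

["source","js"]
-----------
// make sure a known default index pattern is configured before the
// tests run; 'defaultIndex' is assumed to be the relevant setting key
await kibanaServer.uiSettings.update({
  'defaultIndex': 'logstash-*'
});
-----------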
|
||||
|
||||
**esArchiver:**
|
||||
|
||||
* Source: {blob}test/functional/services/es_archiver.js[test/functional/services/es_archiver.js]
|
||||
* Load/unload archives created with the `esArchiver`
|
||||
* Popular methods:
|
||||
* `esArchiver.load(name)`
|
||||
* `esArchiver.loadIfNeeded(name)`
|
||||
* `esArchiver.unload(name)`
|
||||
|
||||
**docTable:**
|
||||
|
||||
* Source: {blob}test/functional/services/doc_table.js[test/functional/services/doc_table.js]
|
||||
* Helpers for interacting with doc tables
|
||||
|
||||
**pointSeriesVis:**
|
||||
|
||||
* Source: {blob}test/functional/services/point_series_vis.js[test/functional/services/point_series_vis.js]
|
||||
* Helpers for interacting with point series visualizations
|
||||
|
||||
**Low-level utilities:**
|
||||
|
||||
* es
|
||||
* Source: {blob}test/functional/services/es.js[test/functional/services/es.js]
|
||||
* Elasticsearch client
|
||||
* Higher level options: `kibanaServer.uiSettings` or `esArchiver`
|
||||
|
||||
* remote
|
||||
* Source: {blob}test/functional/services/remote/remote.js[test/functional/services/remote/remote.js]
|
||||
* Instance of https://theintern.github.io/leadfoot/module-leadfoot_Command.html[Leadfoot's `Command`] class
|
||||
* Responsible for all communication with the browser
|
||||
* Higher level options: `testSubjects`, `find`, and `PageObjects.common`
|
||||
* See the https://theintern.github.io/leadfoot/module-leadfoot_Command.html[leadfoot/Command API] for full API
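
As a quick sketch, `remote` is still handy for browser-level setup that the higher-level services don't cover:

["source","js"]
-----------
// direct browser control through the Leadfoot Command API; prefer
// testSubjects, find, and PageObjects.common for page interactions
const remote = getService('remote');
await remote.setWindowSize(1280, 800);
-----------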
|
||||
|
||||
[float]
|
||||
===== Custom Services
|
||||
|
||||
Services are intentionally generic. They can be literally anything (even nothing). Some services have helpers for interacting with specific types of UI elements, like `pointSeriesVis`, and others are more foundational, like `log` or `config`. Whenever you want to provide some functionality in a reusable package, consider making a custom service.
|
||||
|
||||
To create a custom service `somethingUseful`:
|
||||
|
||||
* Create a `test/functional/services/something_useful.js` file that looks like this:
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
// Services are defined by Provider functions that receive the ServiceProviderAPI
|
||||
export function SomethingUsefulProvider({ getService }) {
|
||||
const log = getService('log');
|
||||
|
||||
class SomethingUseful {
|
||||
doSomething() {
|
||||
}
|
||||
}
|
||||
return new SomethingUseful();
|
||||
}
|
||||
-----------
|
||||
|
||||
* Re-export your provider from `services/index.js`
|
||||
|
||||
* Import it into `test/functional/config.js` and add it to the services config:
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
import { SomethingUsefulProvider } from './services';
|
||||
|
||||
export default function () {
|
||||
return {
|
||||
// … truncated ...
|
||||
services: {
|
||||
somethingUseful: SomethingUsefulProvider
|
||||
}
|
||||
}
|
||||
}
|
||||
-----------
|
||||
|
||||
[float]
|
||||
==== PageObjects
|
||||
|
||||
The purpose of each PageObject is pretty self-explanatory. The visualize PageObject provides helpers for interacting with the visualize app, dashboard does the same for the dashboard app, and so on.
|
||||
|
||||
One exception is the "common" PageObject. A holdover from the intern implementation, the common PageObject is a collection of helpers useful across pages. Now that we have shareable services, and those services can be shared with other `FunctionalTestRunner` configurations, we will continue to move functionality out of the common PageObject and into services.
|
||||
|
||||
Please add new methods to existing or new services rather than further expanding the CommonPage class.
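
A PageObject Provider looks just like a Service Provider. Here is a minimal sketch with hypothetical names:

["source","js"]
-----------
// a hypothetical test/functional/page_objects/my_app_page.js
export function MyAppPageProvider({ getService }) {
  const testSubjects = getService('testSubjects');

  class MyAppPage {
    // wait for the listing to appear so callers know the page is ready
    async openListing() {
      await testSubjects.click('myAppListingLink');
      await testSubjects.exists('myAppListing');
    }
  }

  return new MyAppPage();
}
-----------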
|
||||
|
||||
[float]
|
||||
==== Gotchas
|
||||
|
||||
Remember that you can’t run an individual test in the file (`it` block) because the whole `describe` needs to be run in order. There should only be one top level `describe` in a file.
|
||||
|
||||
[float]
|
||||
===== Functional Test Timing
|
||||
|
||||
Another important gotcha is writing stable tests by being mindful of timing. All methods on `remote` run asynchronously. It’s better to write interactions that wait for changes on the UI to appear before moving on to the next step.
|
||||
|
||||
For example, rather than writing an interaction that simply clicks a button, write an interaction with a higher-level purpose in mind:
|
||||
|
||||
Bad example: `PageObjects.app.clickButton()`
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
class AppPage {
|
||||
// what can people who call this method expect from the
|
||||
// UI after the promise resolves? Since the reaction to most
|
||||
// clicks is asynchronous the behavior is dependent on timing
|
||||
// and likely to cause tests that fail unexpectedly
|
||||
async clickButton () {
|
||||
await testSubjects.click('menuButton');
|
||||
}
|
||||
}
|
||||
-----------
|
||||
|
||||
Good example: `PageObjects.app.openMenu()`
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
class AppPage {
|
||||
// unlike `clickButton()`, callers of `openMenu()` know
|
||||
// the state that the UI will be in before they move on to
|
||||
// the next step
|
||||
async openMenu () {
|
||||
await testSubjects.click('menuButton');
|
||||
await testSubjects.exists('menu');
|
||||
}
|
||||
}
|
||||
-----------
|
||||
|
||||
Writing in this way will ensure your test timings are not flaky or based on assumptions about UI updates after interactions.
|
||||
|
||||
[float]
|
||||
==== Debugging
|
||||
|
||||
From the command line run:
|
||||
|
||||
["source","shell"]
|
||||
-----------
|
||||
node --debug-brk --inspect scripts/functional_test_runner
|
||||
-----------
|
||||
|
||||
This prints out a URL that you can visit in Chrome and debug your functional tests in the browser.
|
||||
|
||||
You can also see additional logs in the terminal by running the `FunctionalTestRunner` with the `--debug` or `--verbose` flag. Add more logs with statements in your tests like:
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
// load the log service
|
||||
const log = getService('log');
|
||||
|
||||
// log.debug only writes when using the `--debug` or `--verbose` flag.
|
||||
log.debug('done clicking menu');
|
||||
-----------
|
||||
|
|
@ -8,8 +8,11 @@ The Kibana plugin interfaces are in a state of constant development. We cannot
|
|||
|
||||
* <<development-plugin-resources>>
|
||||
* <<development-uiexports>>
|
||||
* <<development-plugin-functional-tests>>
|
||||
|
||||
|
||||
include::plugin/development-plugin-resources.asciidoc[]
|
||||
|
||||
include::plugin/development-uiexports.asciidoc[]
|
||||
|
||||
include::plugin/development-plugin-functional-tests.asciidoc[]
|
||||
|
|
|
@ -0,0 +1,91 @@
|
|||
[[development-plugin-functional-tests]]
|
||||
=== Functional Tests for Plugins
|
||||
|
||||
Plugins use the `FunctionalTestRunner` by running it out of the Kibana repo. Ensure that your Kibana Development Environment is set up properly before continuing.
|
||||
|
||||
[float]
|
||||
==== Writing your own configuration
|
||||
|
||||
Every project or plugin should have its own `FunctionalTestRunner` config file. Just like Kibana's, this config file will define all of the test files to load, providers for Services and PageObjects, as well as configuration options for certain services.
|
||||
|
||||
To get started copy and paste this example to `test/functional/config.js`:
|
||||
|
||||
["source","js"]
|
||||
-----------
|
||||
import { resolve } from 'path';
|
||||
import { MyServiceProvider } from './services/my_service';
|
||||
import { MyAppPageProvider } from './services/my_app_page';
|
||||
|
||||
// allow overriding the default kibana directory
|
||||
// using the KIBANA_DIR environment variable
|
||||
const KIBANA_CONFIG_PATH = resolve(process.env.KIBANA_DIR || '../kibana', 'test/functional/config.js');
|
||||
|
||||
// the default export of config files must be a config provider
|
||||
// that returns an object with the projects config values
|
||||
export default async function ({ readConfigFile }) {
|
||||
|
||||
// read the Kibana config file so that we can utilize some of
|
||||
// its services and PageObjects
|
||||
const kibanaConfig = await readConfigFile(KIBANA_CONFIG_PATH);
|
||||
|
||||
return {
|
||||
// list paths to the files that contain your plugin's tests
|
||||
testFiles: [
|
||||
resolve(__dirname, './my_test_file.js'),
|
||||
],
|
||||
|
||||
// define the name and providers for services that should be
|
||||
// available to your tests. If you don't specify anything here
|
||||
// only the built-in services will be available
|
||||
services: {
|
||||
...kibanaConfig.get('services'),
|
||||
myService: MyServiceProvider,
|
||||
},
|
||||
|
||||
// just like services, PageObjects are defined as a map of
|
||||
// names to Providers. Merge in Kibana's or pick specific ones
|
||||
pageObjects: {
|
||||
management: kibanaConfig.get('pageObjects.management'),
|
||||
myApp: MyAppPageProvider,
|
||||
},
|
||||
|
||||
// the apps section defines the urls that
|
||||
// `PageObjects.common.navigateTo(appKey)` will use.
|
||||
// Merge urls for your plugin with the urls defined in
|
||||
// Kibana's config in order to use this helper
|
||||
apps: {
|
||||
...kibanaConfig.get('apps'),
|
||||
myApp: {
|
||||
pathname: '/app/my_app',
|
||||
}
|
||||
},
|
||||
|
||||
// choose where esArchiver should load archives from
|
||||
esArchiver: {
|
||||
directory: resolve(__dirname, './es_archives'),
|
||||
},
|
||||
|
||||
// choose where screenshots should be saved
|
||||
screenshots: {
|
||||
directory: resolve(__dirname, './tmp/screenshots'),
|
||||
}
|
||||
|
||||
// more settings, like timeouts, mochaOpts, etc are
|
||||
// defined in the config schema. See {blob}src/functional_test_runner/lib/config/schema.js[src/functional_test_runner/lib/config/schema.js]
|
||||
};
|
||||
}
|
||||
|
||||
-----------
|
||||
|
||||
From the root of your repo you should now be able to run the `FunctionalTestRunner` script from your plugin project.
|
||||
|
||||
["source","shell"]
|
||||
-----------
|
||||
node ../kibana/scripts/functional_test_runner
|
||||
-----------
|
||||
|
||||
[float]
|
||||
==== Using esArchiver
|
||||
|
||||
We're working on documentation for this, but for now the best place to look is the original {pull}10359[pull request].
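
As a rough sketch, a plugin test can load archives from the `esArchiver.directory` configured above (the archive name here is hypothetical):

["source","js"]
-----------
// inside a test suite's before() hook, load
// <plugin>/test/functional/es_archives/my_app_data, a hypothetical
// archive created earlier with the es_archiver CLI
const esArchiver = getService('esArchiver');
await esArchiver.loadIfNeeded('my_app_data');

// and tidy up in the matching after() hook
await esArchiver.unload('my_app_data');
-----------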
|
||||
|
|
@ -16,6 +16,7 @@ release-state can be: released | prerelease | unreleased
|
|||
:issue: {repo}issues/
|
||||
:pull: {repo}pull/
|
||||
:commit: {repo}commit/
|
||||
:blob: {repo}blob/{branch}/
|
||||
:security: https://www.elastic.co/community/security/
|
||||
|
||||
|
||||
|
|
10
package.json
|
@ -126,8 +126,8 @@
|
|||
"expose-loader": "0.7.0",
|
||||
"extract-text-webpack-plugin": "0.8.2",
|
||||
"file-loader": "0.8.4",
|
||||
"font-awesome": "4.4.0",
|
||||
"flot-charts": "0.8.3",
|
||||
"font-awesome": "4.4.0",
|
||||
"glob": "5.0.13",
|
||||
"glob-all": "3.0.1",
|
||||
"good-squeeze": "2.1.0",
|
||||
|
@ -206,12 +206,12 @@
|
|||
"auto-release-sinon": "1.0.3",
|
||||
"babel-eslint": "6.1.2",
|
||||
"chai": "3.5.0",
|
||||
"chance": "1.0.6",
|
||||
"cheerio": "0.22.0",
|
||||
"chokidar": "1.6.0",
|
||||
"chromedriver": "2.24.1",
|
||||
"chromedriver": "2.28.0",
|
||||
"classnames": "2.2.5",
|
||||
"del": "1.2.1",
|
||||
"elasticdump": "2.1.1",
|
||||
"digdug": "1.6.3",
|
||||
"enzyme": "2.7.0",
|
||||
"enzyme-to-json": "1.4.5",
|
||||
"eslint": "3.11.1",
|
||||
|
@ -240,7 +240,6 @@
|
|||
"html-loader": "0.4.3",
|
||||
"husky": "0.8.1",
|
||||
"image-diff": "1.6.0",
|
||||
"intern": "3.2.3",
|
||||
"istanbul-instrumenter-loader": "0.1.3",
|
||||
"jest": "19.0.0",
|
||||
"jest-cli": "19.0.0",
|
||||
|
@ -253,6 +252,7 @@
|
|||
"karma-mocha": "0.2.0",
|
||||
"karma-safari-launcher": "0.1.1",
|
||||
"keymirror": "0.1.1",
|
||||
"leadfoot": "1.7.1",
|
||||
"license-checker": "5.1.2",
|
||||
"load-grunt-config": "0.19.2",
|
||||
"makelogs": "3.2.3",
|
||||
|
|
16
scripts/README.md
Normal file
|
@ -0,0 +1,16 @@
|
|||
# kibana dev scripts
|
||||
|
||||
This directory contains scripts useful for interacting with Kibana tools in development. Use the node executable and `--help` flag to learn about how they work:
|
||||
|
||||
```sh
|
||||
node scripts/{{script name}} --help
|
||||
```
|
||||
|
||||
## for developers
|
||||
|
||||
This directory is excluded from the build and tools within it should help users discover their capabilities. Each script in this directory must:
|
||||
|
||||
- include the `../src/optimize/babel/register` module to bootstrap babel
|
||||
- call out to source code that is in the `src` directory
|
||||
- react to the `--help` flag
|
||||
- run everywhere OR check and fail fast when a required OS or toolchain is not available
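
For example, a new script following these rules would typically be a minimal two-line shim (the module it hands off to is hypothetical):

```js
// scripts/do_something.js (hypothetical)
// bootstrap babel so the source it loads can use ES2015+ syntax
require('../src/optimize/babel/register');
// hand off to real source code that lives under src/
require('../src/dev_tools/do_something_cli');
```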
|
2
scripts/es_archiver.js
Executable file
|
@ -0,0 +1,2 @@
|
|||
require('../src/optimize/babel/register');
|
||||
require('../src/es_archiver/cli');
|
2
scripts/functional_test_runner.js
Normal file
|
@ -0,0 +1,2 @@
|
|||
require('../src/optimize/babel/register');
|
||||
require('../src/functional_test_runner/cli');
|
|
@ -12,7 +12,10 @@
|
|||
</div>
|
||||
</kbn-top-nav>
|
||||
|
||||
<div class="kuiViewContent kuiViewContent--constrainedWidth">
|
||||
<div
|
||||
class="kuiViewContent kuiViewContent--constrainedWidth"
|
||||
data-test-subj="dashboardLandingPage"
|
||||
>
|
||||
<!-- ControlledTable -->
|
||||
<div class="kuiViewContentItem kuiControlledTable kuiVerticalRhythm">
|
||||
<!-- ToolBar -->
|
||||
|
|
|
@ -1,4 +1,4 @@
|
|||
<div class="container overall_state_default overall_state_{{ui.serverState}}">
|
||||
<div data-test-subj="statusPageContainer" class="container overall_state_default overall_state_{{ui.serverState}}">
|
||||
<header>
|
||||
<h1>
|
||||
Status: <span class="overall_state_color">{{ ui.serverStateMessage }}</span>
|
||||
|
|
4
src/es_archiver/actions/index.js
Normal file
|
@ -0,0 +1,4 @@
|
|||
export { saveAction } from './save';
|
||||
export { loadAction } from './load';
|
||||
export { unloadAction } from './unload';
|
||||
export { rebuildAllAction } from './rebuild_all';
|
39
src/es_archiver/actions/load.js
Normal file
|
@ -0,0 +1,39 @@
|
|||
import { resolve } from 'path';
|
||||
import { createReadStream } from 'fs';
|
||||
|
||||
import {
|
||||
createPromiseFromStreams
|
||||
} from '../../utils';
|
||||
|
||||
import {
|
||||
isGzip,
|
||||
createStats,
|
||||
prioritizeMappings,
|
||||
readDirectory,
|
||||
createParseArchiveStreams,
|
||||
createCreateIndexStream,
|
||||
createIndexDocRecordsStream,
|
||||
} from '../lib';
|
||||
|
||||
export async function loadAction({ name, skipExisting, client, dataDir, log }) {
|
||||
const inputDir = resolve(dataDir, name);
|
||||
const stats = createStats(name, log);
|
||||
|
||||
const files = prioritizeMappings(await readDirectory(inputDir));
|
||||
for (const filename of files) {
|
||||
log.info('[%s] Loading %j', name, filename);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createReadStream(resolve(inputDir, filename)),
|
||||
...createParseArchiveStreams({ gzip: isGzip(filename) }),
|
||||
createCreateIndexStream({ client, stats, skipExisting }),
|
||||
createIndexDocRecordsStream(client, stats),
|
||||
]);
|
||||
}
|
||||
|
||||
stats.forEachIndex((index, { docs }) => {
|
||||
log.info('[%s] Indexed %d docs into %j', name, docs.indexed, index);
|
||||
});
|
||||
|
||||
return stats.toJSON();
|
||||
}
|
46
src/es_archiver/actions/rebuild_all.js
Normal file
|
@ -0,0 +1,46 @@
|
|||
import { resolve } from 'path';
|
||||
import {
|
||||
rename,
|
||||
createReadStream,
|
||||
createWriteStream
|
||||
} from 'fs';
|
||||
|
||||
import { fromNode } from 'bluebird';
|
||||
|
||||
import {
|
||||
createPromiseFromStreams
|
||||
} from '../../utils';
|
||||
|
||||
import {
|
||||
prioritizeMappings,
|
||||
readDirectory,
|
||||
isGzip,
|
||||
createParseArchiveStreams,
|
||||
createFormatArchiveStreams,
|
||||
} from '../lib';
|
||||
|
||||
export async function rebuildAllAction({ dataDir, log }) {
|
||||
const archiveNames = await readDirectory(dataDir);
|
||||
|
||||
for (const name of archiveNames) {
|
||||
const inputDir = resolve(dataDir, name);
|
||||
const files = prioritizeMappings(await readDirectory(inputDir));
|
||||
for (const filename of files) {
|
||||
log.info('[%s] Rebuilding %j', name, filename);
|
||||
|
||||
const path = resolve(inputDir, filename);
|
||||
const gzip = isGzip(path);
|
||||
const tempFile = path + (gzip ? '.rebuilding.gz' : '.rebuilding');
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createReadStream(path),
|
||||
...createParseArchiveStreams({ gzip }),
|
||||
...createFormatArchiveStreams({ gzip }),
|
||||
createWriteStream(tempFile),
|
||||
]);
|
||||
|
||||
await fromNode(cb => rename(tempFile, path, cb));
|
||||
log.info('[%s] Rebuilt %j', name, filename);
|
||||
}
|
||||
}
|
||||
}
|
55
src/es_archiver/actions/save.js
Normal file
|
@ -0,0 +1,55 @@
|
|||
import { resolve } from 'path';
|
||||
import { createWriteStream } from 'fs';
|
||||
|
||||
import { fromNode } from 'bluebird';
|
||||
import mkdirp from 'mkdirp';
|
||||
|
||||
import {
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
} from '../../utils';
|
||||
|
||||
import {
|
||||
createStats,
|
||||
createGenerateIndexRecordsStream,
|
||||
createFormatArchiveStreams,
|
||||
createGenerateDocRecordsStream,
|
||||
} from '../lib';
|
||||
|
||||
export async function saveAction({ name, indices, client, dataDir, log }) {
|
||||
const outputDir = resolve(dataDir, name);
|
||||
const stats = createStats(name, log);
|
||||
|
||||
log.info('[%s] Creating archive of %j', name, indices);
|
||||
|
||||
await fromNode(cb => mkdirp(outputDir, cb));
|
||||
const resolvedIndexes = Object.keys(await client.indices.get({
|
||||
index: indices,
|
||||
feature: ['_settings'],
|
||||
filterPath: ['*.settings.index.uuid']
|
||||
}));
|
||||
|
||||
await Promise.all([
|
||||
// export and save the matching indices to mappings.json
|
||||
createPromiseFromStreams([
|
||||
createListStream(resolvedIndexes),
|
||||
createGenerateIndexRecordsStream(client, stats),
|
||||
...createFormatArchiveStreams(),
|
||||
createWriteStream(resolve(outputDir, 'mappings.json')),
|
||||
]),
|
||||
|
||||
// export all documents from matching indexes into data.json.gz
|
||||
createPromiseFromStreams([
|
||||
createListStream(resolvedIndexes),
|
||||
createGenerateDocRecordsStream(client, stats),
|
||||
...createFormatArchiveStreams({ gzip: true }),
|
||||
createWriteStream(resolve(outputDir, 'data.json.gz'))
|
||||
])
|
||||
]);
|
||||
|
||||
stats.forEachIndex((index, { docs }) => {
|
||||
log.info('[%s] Archived %d docs from %j', name, docs.archived, index);
|
||||
});
|
||||
|
||||
return stats.toJSON();
|
||||
}
|
35
src/es_archiver/actions/unload.js
Normal file
|
@ -0,0 +1,35 @@
|
|||
import { resolve } from 'path';
|
||||
import { createReadStream } from 'fs';
|
||||
|
||||
import {
|
||||
createPromiseFromStreams
|
||||
} from '../../utils';
|
||||
|
||||
import {
|
||||
isGzip,
|
||||
createStats,
|
||||
prioritizeMappings,
|
||||
readDirectory,
|
||||
createParseArchiveStreams,
|
||||
createFilterRecordsStream,
|
||||
createDeleteIndexStream
|
||||
} from '../lib';
|
||||
|
||||
export async function unloadAction({ name, client, dataDir, log }) {
|
||||
const inputDir = resolve(dataDir, name);
|
||||
const stats = createStats(name, log);
|
||||
|
||||
const files = prioritizeMappings(await readDirectory(inputDir));
|
||||
for (const filename of files) {
|
||||
log.info('[%s] Unloading indices from %j', name, filename);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createReadStream(resolve(inputDir, filename)),
|
||||
...createParseArchiveStreams({ gzip: isGzip(filename) }),
|
||||
createFilterRecordsStream('index'),
|
||||
createDeleteIndexStream(client, stats)
|
||||
]);
|
||||
}
|
||||
|
||||
return stats.toJSON();
|
||||
}
|
106
src/es_archiver/cli.js
Normal file
|
@ -0,0 +1,106 @@
|
|||
/*************************************************************
|
||||
*
|
||||
* Run `node scripts/es_archiver --help` for usage information
|
||||
*
|
||||
*************************************************************/
|
||||
|
||||
import { resolve } from 'path';
|
||||
import { readFileSync } from 'fs';
|
||||
import { format as formatUrl } from 'url';
|
||||
|
||||
import { Command } from 'commander';
|
||||
import elasticsearch from 'elasticsearch';
|
||||
|
||||
import { EsArchiver } from './es_archiver';
|
||||
import { createToolingLog } from '../utils';
|
||||
import { readConfigFile } from '../functional_test_runner';
|
||||
|
||||
const cmd = new Command('node scripts/es_archiver');
|
||||
|
||||
cmd
|
||||
.description(`CLI to manage archiving/restoring data in elasticsearch`)
|
||||
.option('--es-url [url]', 'url for elasticsearch')
|
||||
.option(`--dir [path]`, 'where archives are stored')
|
||||
.option('--verbose', 'turn on verbose logging')
|
||||
.option('--config [path]', 'path to a functional test config file to use for default values')
|
||||
.on('--help', () => {
|
||||
console.log(readFileSync(resolve(__dirname, './cli_help.txt'), 'utf8'));
|
||||
});
|
||||
|
||||
cmd.command('save <name> <indices...>')
|
||||
.description('archive the <indices ...> into the --dir with <name>')
|
||||
.action((name, indices) => execute('save', name, indices));
|
||||
|
||||
cmd.command('load <name>')
|
||||
.description('load the archive in --dir with <name>')
|
||||
.action(name => execute('load', name));
|
||||
|
||||
cmd.command('unload <name>')
|
||||
.description('remove indices created by the archive in --dir with <name>')
|
||||
.action(name => execute('unload', name));
|
||||
|
||||
cmd.command('rebuild-all')
|
||||
.description('[internal] read and write all archives in --dir to remove any inconsistencies')
|
||||
.action(() => execute('rebuildAll'));
|
||||
|
||||
cmd.parse(process.argv);
|
||||
|
||||
const missingCommand = cmd.args.every(a => !(a instanceof Command));
|
||||
if (missingCommand) {
|
||||
execute();
|
||||
}
|
||||
|
||||
async function execute(operation, ...args) {
|
||||
try {
|
||||
const log = createToolingLog(cmd.verbose ? 'debug' : 'info');
|
||||
log.pipe(process.stdout);
|
||||
|
||||
if (cmd.config) {
|
||||
// load default values from the specified config file
|
||||
const config = await readConfigFile(log, resolve(cmd.config));
|
||||
if (!cmd.esUrl) cmd.esUrl = formatUrl(config.get('servers.elasticsearch'));
|
||||
if (!cmd.dir) cmd.dir = config.get('esArchiver.directory');
|
||||
}
|
||||
|
||||
// log and count all validation errors
|
||||
let errorCount = 0;
|
||||
const error = (msg) => {
|
||||
errorCount++;
|
||||
log.error(msg);
|
||||
};
|
||||
|
||||
if (!operation) error('Missing or invalid command');
|
||||
if (!cmd.esUrl) {
|
||||
error('You must specify either --es-url or --config flags');
|
||||
}
|
||||
if (!cmd.dir) {
|
||||
error('You must specify either --dir or --config flags');
|
||||
}
|
||||
|
||||
// if there was a validation error display the help
|
||||
if (errorCount) {
|
||||
cmd.help();
|
||||
return;
|
||||
}
|
||||
|
||||
// run!
|
||||
|
||||
const client = new elasticsearch.Client({
|
||||
host: cmd.esUrl,
|
||||
log: cmd.verbose ? 'trace' : []
|
||||
});
|
||||
|
||||
try {
|
||||
const esArchiver = new EsArchiver({
|
||||
log,
|
||||
client,
|
||||
dataDir: resolve(cmd.dir),
|
||||
});
|
||||
await esArchiver[operation](...args);
|
||||
} finally {
|
||||
await client.close();
|
||||
}
|
||||
} catch (err) {
|
||||
console.log('FATAL ERROR', err.stack);
|
||||
}
|
||||
}
|
15
src/es_archiver/cli_help.txt
Normal file
|
@ -0,0 +1,15 @@
|
|||
Examples:
|
||||
Dump an index to disk:
|
||||
Save all `logstash-*` indices from http://localhost:9200 to `snapshots/my_test_data` directory
|
||||
|
||||
WARNING: If the `my_test_data` snapshot exists it will be deleted!
|
||||
|
||||
$ node scripts/es_archiver save my_test_data logstash-* --dir snapshots
|
||||
|
||||
Load an index from disk
|
||||
Load the `my_test_data` snapshot from the archive directory and elasticsearch instance defined
|
||||
in the `test/functional/config.js` config file
|
||||
|
||||
WARNING: If the indices exist already they will be deleted!
|
||||
|
||||
$ node scripts/es_archiver load my_test_data --config test/functional/config.js
|
92
src/es_archiver/es_archiver.js
Normal file
|
@ -0,0 +1,92 @@
|
|||
import {
|
||||
saveAction,
|
||||
loadAction,
|
||||
unloadAction,
|
||||
rebuildAllAction,
|
||||
} from './actions';
|
||||
|
||||
export class EsArchiver {
|
||||
constructor({ client, dataDir, log }) {
|
||||
this.client = client;
|
||||
this.dataDir = dataDir;
|
||||
this.log = log;
|
||||
}
|
||||
|
||||
/**
|
||||
* Extract data and mappings from an elasticsearch index and store
|
||||
* it in the dataDir so it can be used later to recreate the index.
|
||||
*
|
||||
* @param {String} name - the name of this archive, used to determine filename
|
||||
* @param {String|Array<String>} indices - the indices to archive
|
||||
* @return Promise<Stats>
|
||||
*/
|
||||
async save(name, indices) {
|
||||
return await saveAction({
|
||||
name,
|
||||
indices,
|
||||
client: this.client,
|
||||
dataDir: this.dataDir,
|
||||
log: this.log,
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Load an index from an archive
|
||||
*
|
||||
* @param {String} name - the name of the archive to load
|
||||
* @param {Object} options
|
||||
* @property {Boolean} options.skipExisting - should existing indices
|
||||
* be ignored or overwritten
|
||||
* @return Promise<Stats>
|
||||
*/
|
||||
async load(name, options = {}) {
|
||||
const { skipExisting } = options;
|
||||
|
||||
return await loadAction({
|
||||
name,
|
||||
skipExisting: !!skipExisting,
|
||||
client: this.client,
|
||||
dataDir: this.dataDir,
|
||||
log: this.log,
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Remove the indexes in elasticsearch that have data in an archive.
|
||||
*
|
||||
* @param {String} name
|
||||
* @return Promise<Stats>
|
||||
*/
|
||||
async unload(name) {
|
||||
return await unloadAction({
|
||||
name,
|
||||
client: this.client,
|
||||
dataDir: this.dataDir,
|
||||
log: this.log,
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Parse and reformat all of the archives. This is primarily helpful
|
||||
* for working on the esArchiver.
|
||||
*
|
||||
* @return Promise<Stats>
|
||||
*/
|
||||
async rebuildAll() {
|
||||
return rebuildAllAction({
|
||||
client: this.client,
|
||||
dataDir: this.dataDir,
|
||||
log: this.log
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* Just like load, but skips any existing index
|
||||
*
|
||||
* @param {String} name
|
||||
* @return Promise<Stats>
|
||||
*/
|
||||
async loadIfNeeded(name) {
|
||||
return this.load(name, { skipExisting: true });
|
||||
}
|
||||
}
|
1
src/es_archiver/index.js
Normal file
|
@ -0,0 +1 @@
|
|||
export { EsArchiver } from './es_archiver';
|
211
src/es_archiver/lib/__tests__/stats.js
Normal file
|
@ -0,0 +1,211 @@
|
|||
import expect from 'expect.js';
|
||||
import { uniq } from 'lodash';
|
||||
import sinon from 'sinon';
|
||||
|
||||
import { createStats } from '../';
|
||||
|
||||
import {
|
||||
createToolingLog,
|
||||
createConcatStream,
|
||||
createPromiseFromStreams
|
||||
} from '../../../utils';
|
||||
|
||||
function drain(log) {
|
||||
log.end();
|
||||
return createPromiseFromStreams([
|
||||
log,
|
||||
createConcatStream('')
|
||||
]);
|
||||
}
|
||||
|
||||
function assertDeepClones(a, b) {
|
||||
const path = [];
|
||||
try {
|
||||
(function recurse(one, two) {
|
||||
if (typeof one !== 'object' || typeof two !== 'object') {
|
||||
expect(one).to.be(two);
|
||||
return;
|
||||
}
|
||||
|
||||
expect(one).to.eql(two);
|
||||
expect(one).to.not.be(two);
|
||||
const keys = uniq(Object.keys(one).concat(Object.keys(two)));
|
||||
keys.forEach(k => {
|
||||
path.push(k);
|
||||
recurse(one[k], two[k]);
|
||||
path.pop();
|
||||
});
|
||||
}(a, b));
|
||||
} catch (err) {
|
||||
throw new Error(
|
||||
`Expected a and b to be deep clones of each other, error at:\n\n` +
|
||||
` "${path.join('.') || '-'}"\n\n` +
|
||||
err.stack
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
describe('esArchiver: Stats', () => {
|
||||
describe('#skippedIndex(index)', () => {
|
||||
it('marks the index as skipped', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.skippedIndex('index-name');
|
||||
const indexStats = stats.toJSON()['index-name'];
|
||||
expect(indexStats).to.have.property('skipped', true);
|
||||
});
|
||||
|
||||
it('logs that the index was skipped', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.skippedIndex('index-name');
|
||||
expect(await drain(log)).to.contain('Skipped');
|
||||
});
|
||||
});
|
||||
|
||||
describe('#deletedIndex(index)', () => {
|
||||
it('marks the index as deleted', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.deletedIndex('index-name');
|
||||
const indexStats = stats.toJSON()['index-name'];
|
||||
expect(indexStats).to.have.property('deleted', true);
|
||||
});
|
||||
it('logs that the index was deleted', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.deletedIndex('index-name');
|
||||
expect(await drain(log)).to.contain('Deleted');
|
||||
});
|
||||
});
|
||||
|
||||
describe('#createdIndex(index, [metadata])', () => {
|
||||
it('marks the index as created', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.createdIndex('index-name');
|
||||
const indexStats = stats.toJSON()['index-name'];
|
||||
expect(indexStats).to.have.property('created', true);
|
||||
});
|
||||
it('logs that the index was created', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.createdIndex('index-name');
|
||||
expect(await drain(log)).to.contain('Created');
|
||||
});
|
||||
describe('with metadata', () => {
|
||||
it('debug-logs each key from the metadata', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.createdIndex('index-name', {
|
||||
foo: 'bar'
|
||||
});
|
||||
const output = await drain(log);
|
||||
expect(output).to.contain('debg');
|
||||
expect(output).to.contain('foo "bar"');
|
||||
});
|
||||
});
|
||||
describe('without metadata', () => {
|
||||
it('no debug logging', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.createdIndex('index-name');
|
||||
const output = await drain(log);
|
||||
expect(output).to.not.contain('debg');
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('#archivedIndex(index, [metadata])', () => {
|
||||
it('marks the index as archived', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.archivedIndex('index-name');
|
||||
const indexStats = stats.toJSON()['index-name'];
|
||||
expect(indexStats).to.have.property('archived', true);
|
||||
});
|
||||
it('logs that the index was archived', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.archivedIndex('index-name');
|
||||
expect(await drain(log)).to.contain('Archived');
|
||||
});
|
||||
describe('with metadata', () => {
|
||||
it('debug-logs each key from the metadata', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.archivedIndex('index-name', {
|
||||
foo: 'bar'
|
||||
});
|
||||
const output = await drain(log);
|
||||
expect(output).to.contain('debg');
|
||||
expect(output).to.contain('foo "bar"');
|
||||
});
|
||||
});
|
||||
describe('without metadata', () => {
|
||||
it('no debug logging', async () => {
|
||||
const log = createToolingLog('debug');
|
||||
const stats = createStats('name', log);
|
||||
stats.archivedIndex('index-name');
|
||||
const output = await drain(log);
|
||||
expect(output).to.not.contain('debg');
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('#indexedDoc(index)', () => {
|
||||
it('increases the docs.indexed count for the index', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.indexedDoc('index-name');
|
||||
expect(stats.toJSON()['index-name'].docs.indexed).to.be(1);
|
||||
stats.indexedDoc('index-name');
|
||||
stats.indexedDoc('index-name');
|
||||
expect(stats.toJSON()['index-name'].docs.indexed).to.be(3);
|
||||
});
|
||||
});
|
||||
|
||||
describe('#archivedDoc(index)', () => {
|
||||
it('increases the docs.archived count for the index', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.archivedDoc('index-name');
|
||||
expect(stats.toJSON()['index-name'].docs.archived).to.be(1);
|
||||
stats.archivedDoc('index-name');
|
||||
stats.archivedDoc('index-name');
|
||||
expect(stats.toJSON()['index-name'].docs.archived).to.be(3);
|
||||
});
|
||||
});
|
||||
|
||||
describe('#toJSON()', () => {
|
||||
it('returns the stats for all indexes', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.archivedIndex('index1');
|
||||
stats.archivedIndex('index2');
|
||||
expect(Object.keys(stats.toJSON())).to.eql(['index1', 'index2']);
|
||||
});
|
||||
it('returns a deep clone of the stats', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.archivedIndex('index1');
|
||||
stats.archivedIndex('index2');
|
||||
stats.deletedIndex('index3');
|
||||
stats.createdIndex('index3');
|
||||
assertDeepClones(stats.toJSON(), stats.toJSON());
|
||||
});
|
||||
});
|
||||
|
||||
describe('#forEachIndex(fn)', () => {
|
||||
it('iterates a clone of the index stats', () => {
|
||||
const stats = createStats('name', createToolingLog());
|
||||
stats.archivedIndex('index1');
|
||||
stats.archivedIndex('index2');
|
||||
stats.deletedIndex('index3');
|
||||
stats.createdIndex('index3');
|
||||
const stub1 = sinon.stub();
|
||||
stats.forEachIndex(stub1);
|
||||
const stub2 = sinon.stub();
|
||||
stats.forEachIndex(stub2);
|
||||
sinon.assert.callCount(stub1, 3);
|
||||
sinon.assert.callCount(stub2, 3);
|
||||
|
||||
for (let i = 0; i < 3; i++) {
|
||||
assertDeepClones(stub1.args[i][0], stub2.args[i][0]);
|
||||
assertDeepClones(stub1.args[i][1], stub2.args[i][1]);
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
89
src/es_archiver/lib/archives/__tests__/format.js
Normal file
|
@ -0,0 +1,89 @@
|
|||
import Stream from 'stream';
|
||||
import { createGunzip } from 'zlib';
|
||||
|
||||
import expect from 'expect.js';
|
||||
|
||||
import {
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
createConcatStream,
|
||||
} from '../../../../utils';
|
||||
|
||||
import { createFormatArchiveStreams } from '../format';
|
||||
|
||||
const INPUTS = [1, 2, { foo: 'bar' }, [1,2]];
|
||||
const INPUT_JSON = INPUTS.map(i => JSON.stringify(i, null, 2)).join('\n\n');
|
||||
|
||||
describe('esArchiver createFormatArchiveStreams', () => {
|
||||
describe('{ gzip: false }', () => {
|
||||
it('returns an array of streams', () => {
|
||||
const streams = createFormatArchiveStreams({ gzip: false });
|
||||
expect(streams).to.be.an('array');
|
||||
expect(streams.length).to.be.greaterThan(0);
|
||||
streams.forEach(s => expect(s).to.be.a(Stream));
|
||||
});
|
||||
|
||||
it('streams consume js values and produce buffers', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream(INPUTS),
|
||||
...createFormatArchiveStreams({ gzip: false }),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(output.length).to.be.greaterThan(0);
|
||||
output.forEach(b => expect(b).to.be.a(Buffer));
|
||||
});
|
||||
|
||||
it('product is pretty-printed JSON separated by two newlines', async () => {
|
||||
const json = await createPromiseFromStreams([
|
||||
createListStream(INPUTS),
|
||||
...createFormatArchiveStreams({ gzip: false }),
|
||||
createConcatStream('')
|
||||
]);
|
||||
|
||||
expect(json).to.be(INPUT_JSON);
|
||||
});
|
||||
});
|
||||
|
||||
describe('{ gzip: true }', () => {
|
||||
it('returns an array of streams', () => {
|
||||
const streams = createFormatArchiveStreams({ gzip: true });
|
||||
expect(streams).to.be.an('array');
|
||||
expect(streams.length).to.be.greaterThan(0);
|
||||
streams.forEach(s => expect(s).to.be.a(Stream));
|
||||
});
|
||||
|
||||
it('streams consume js values and produce buffers', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([1, 2, { foo: 'bar' }, [1,2]]),
|
||||
...createFormatArchiveStreams({ gzip: true }),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(output.length).to.be.greaterThan(0);
|
||||
output.forEach(b => expect(b).to.be.a(Buffer));
|
||||
});
|
||||
|
||||
it('output can be gunzipped', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream(INPUTS),
|
||||
...createFormatArchiveStreams({ gzip: true }),
|
||||
createGunzip(),
|
||||
createConcatStream('')
|
||||
]);
|
||||
expect(output).to.be(INPUT_JSON);
|
||||
});
|
||||
});
|
||||
|
||||
describe('defaults', () => {
|
||||
it('product is not gzipped', async () => {
|
||||
const json = await createPromiseFromStreams([
|
||||
createListStream(INPUTS),
|
||||
...createFormatArchiveStreams(),
|
||||
createConcatStream('')
|
||||
]);
|
||||
|
||||
expect(json).to.be(INPUT_JSON);
|
||||
});
|
||||
});
|
||||
});
|
174
src/es_archiver/lib/archives/__tests__/parse.js
Normal file
|
@ -0,0 +1,174 @@
|
|||
import Stream, { PassThrough, Transform } from 'stream';
|
||||
import { createGzip } from 'zlib';
|
||||
|
||||
import expect from 'expect.js';
|
||||
|
||||
import {
|
||||
createConcatStream,
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
} from '../../../../utils';
|
||||
|
||||
import { createParseArchiveStreams } from '../parse';
|
||||
|
||||
describe('esArchiver createParseArchiveStreams', () => {
|
||||
describe('{ gzip: false }', () => {
|
||||
it('returns an array of streams', () => {
|
||||
const streams = createParseArchiveStreams({ gzip: false });
|
||||
expect(streams).to.be.an('array');
|
||||
expect(streams.length).to.be.greaterThan(0);
|
||||
streams.forEach(s => expect(s).to.be.a(Stream));
|
||||
});
|
||||
|
||||
describe('streams', () => {
|
||||
it('consume buffers of valid JSON', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
Buffer.from('{'),
|
||||
Buffer.from('"'),
|
||||
Buffer.from('a":'),
|
||||
Buffer.from('1}')
|
||||
]),
|
||||
...createParseArchiveStreams({ gzip: false })
|
||||
]);
|
||||
|
||||
expect(output).to.eql({ a: 1 });
|
||||
});
|
||||
it('consume buffers of valid JSON separated by two newlines', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
Buffer.from('{'),
|
||||
Buffer.from('"'),
|
||||
Buffer.from('a":'),
|
||||
Buffer.from('1}'),
|
||||
Buffer.from('\n'),
|
||||
Buffer.from('\n'),
|
||||
Buffer.from('1'),
|
||||
]),
|
||||
...createParseArchiveStreams({ gzip: false }),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(output).to.eql([{ a: 1 }, 1]);
|
||||
});
|
||||
|
||||
it('provides each JSON object as soon as it is parsed', async () => {
|
||||
let onReceived;
|
||||
const receivedPromise = new Promise(resolve => onReceived = resolve);
|
||||
const input = new PassThrough();
|
||||
const check = new Transform({
|
||||
writableObjectMode: true,
|
||||
readableObjectMode: true,
|
||||
transform(chunk, enc, callback) {
|
||||
onReceived(chunk);
|
||||
callback(null, chunk);
|
||||
}
|
||||
});
|
||||
|
||||
const finalPromise = createPromiseFromStreams([
|
||||
input,
|
||||
...createParseArchiveStreams(),
|
||||
check,
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
input.write(Buffer.from('{"a": 1}\n\n'));
|
||||
expect(await receivedPromise).to.eql({ a: 1 });
|
||||
input.write(Buffer.from('{"a": 2}'));
|
||||
input.end();
|
||||
expect(await finalPromise).to.eql([{ a: 1 }, { a: 2 }]);
|
||||
});
|
||||
});
|
||||
|
||||
describe('stream errors', () => {
|
||||
it('stops when any document contains invalid json', async () => {
|
||||
try {
|
||||
await createPromiseFromStreams([
|
||||
createListStream([
|
||||
Buffer.from('{"a": 1}\n\n'),
|
||||
Buffer.from('{1}\n\n'),
|
||||
Buffer.from('{"a": 2}\n\n'),
|
||||
]),
|
||||
...createParseArchiveStreams({ gzip: false }),
|
||||
createConcatStream()
|
||||
]);
|
||||
throw new Error('should have failed');
|
||||
} catch (err) {
|
||||
expect(err.message).to.contain('Unexpected number');
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('{ gzip: true }', () => {
|
||||
it('returns an array of streams', () => {
|
||||
const streams = createParseArchiveStreams({ gzip: true });
|
||||
expect(streams).to.be.an('array');
|
||||
expect(streams.length).to.be.greaterThan(0);
|
||||
streams.forEach(s => expect(s).to.be.a(Stream));
|
||||
});
|
||||
|
||||
describe('streams', () => {
|
||||
it('consumes gzipped buffers of valid JSON', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
Buffer.from('{'),
|
||||
Buffer.from('"'),
|
||||
Buffer.from('a":'),
|
||||
Buffer.from('1}')
|
||||
]),
|
||||
createGzip(),
|
||||
...createParseArchiveStreams({ gzip: true })
|
||||
]);
|
||||
|
||||
expect(output).to.eql({ a: 1 });
|
||||
});
|
||||
|
||||
it('parses valid gzipped JSON strings separated by two newlines', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
'{\n',
|
||||
' "a": 1\n',
|
||||
'}',
|
||||
'\n\n',
|
||||
'{"a":2}'
|
||||
]),
|
||||
createGzip(),
|
||||
...createParseArchiveStreams({ gzip: true }),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(output).to.eql([{ a: 1 }, { a: 2 }]);
|
||||
});
|
||||
});
|
||||
|
||||
describe('stream errors', () => {
|
||||
it('stops when the input is not a valid gzip archive', async () => {
|
||||
try {
|
||||
await createPromiseFromStreams([
|
||||
createListStream([
|
||||
Buffer.from('{"a": 1}'),
|
||||
]),
|
||||
...createParseArchiveStreams({ gzip: true }),
|
||||
createConcatStream()
|
||||
]);
|
||||
throw new Error('should have failed');
|
||||
} catch (err) {
|
||||
expect(err.message).to.contain('incorrect header check');
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('defaults', () => {
|
||||
it('does not try to gunzip the content', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
Buffer.from('{"a": 1}'),
|
||||
]),
|
||||
...createParseArchiveStreams(),
|
||||
]);
|
||||
expect(output).to.eql({ a: 1 });
|
||||
});
|
||||
});
|
||||
});
|
1
src/es_archiver/lib/archives/constants.js
Normal file
|
@ -0,0 +1 @@
|
|||
export const RECORD_SEPARATOR = '\n\n';
|
30
src/es_archiver/lib/archives/filenames.js
Normal file
|
@ -0,0 +1,30 @@
|
|||
import { basename, extname } from 'path';
|
||||
|
||||
export function isGzip(path) {
|
||||
return extname(path) === '.gz';
|
||||
}
|
||||
|
||||
/**
|
||||
* Check if a path is for a, potentially gzipped, mapping file
|
||||
* @param {String} path
|
||||
* @return {Boolean}
|
||||
*/
|
||||
export function isMappingFile(path) {
|
||||
return basename(path, '.gz') === 'mappings.json';
|
||||
}
|
||||
|
||||
/**
|
||||
* Sorts the filenames found in an archive so that
|
||||
* "mappings" files come first, which is the order they
|
||||
* need to be imported in so that each index exists
|
||||
* before its documents are indexed.
|
||||
*
|
||||
* @param {Array<String>} filenames
|
||||
* @return {Array<String>}
|
||||
*/
|
||||
export function prioritizeMappings(filenames) {
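// e.g. prioritizeMappings(['data.json.gz', 'mappings.json'])
//        -> ['mappings.json', 'data.json.gz']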
|
||||
return filenames.slice().sort((fa, fb) => {
|
||||
if (isMappingFile(fa) === isMappingFile(fb)) return 0;
|
||||
return isMappingFile(fb) ? 1 : -1;
|
||||
});
|
||||
}
|
17
src/es_archiver/lib/archives/format.js
Normal file
|
@ -0,0 +1,17 @@
|
|||
import { createGzip, Z_BEST_COMPRESSION } from 'zlib';
|
||||
import { PassThrough } from 'stream';
|
||||
|
||||
import {
|
||||
createIntersperseStream,
|
||||
createJsonStringifyStream
|
||||
} from '../../../utils';
|
||||
|
||||
import { RECORD_SEPARATOR } from './constants';
|
||||
|
||||
export function createFormatArchiveStreams({ gzip = false } = {}) {
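// Stringify each record as pretty-printed JSON, join records with the
// blank-line RECORD_SEPARATOR, then gzip at best compression or pass through.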
|
||||
return [
|
||||
createJsonStringifyStream({ pretty: true }),
|
||||
createIntersperseStream(RECORD_SEPARATOR),
|
||||
gzip ? createGzip({ level: Z_BEST_COMPRESSION }) : new PassThrough(),
|
||||
];
|
||||
}
|
12
src/es_archiver/lib/archives/index.js
Normal file
|
@ -0,0 +1,12 @@
|
|||
export {
|
||||
isGzip,
|
||||
prioritizeMappings,
|
||||
} from './filenames';
|
||||
|
||||
export {
|
||||
createParseArchiveStreams,
|
||||
} from './parse';
|
||||
|
||||
export {
|
||||
createFormatArchiveStreams,
|
||||
} from './format';
|
17
src/es_archiver/lib/archives/parse.js
Normal file
|
@ -0,0 +1,17 @@
|
|||
import { createGunzip } from 'zlib';
|
||||
import { PassThrough } from 'stream';
|
||||
|
||||
import {
|
||||
createSplitStream,
|
||||
createJsonParseStream,
|
||||
} from '../../../utils';
|
||||
|
||||
import { RECORD_SEPARATOR } from './constants';
|
||||
|
||||
export function createParseArchiveStreams({ gzip = false } = {}) {
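// Gunzip when the archive is compressed, split the byte stream on the
// blank-line RECORD_SEPARATOR, then JSON-parse each record.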
|
||||
return [
|
||||
gzip ? createGunzip() : new PassThrough(),
|
||||
createSplitStream(RECORD_SEPARATOR),
|
||||
createJsonParseStream(),
|
||||
];
|
||||
}
|
8
src/es_archiver/lib/directory.js
Normal file
|
@ -0,0 +1,8 @@
|
|||
import { readdir } from 'fs';
|
||||
|
||||
import { fromNode } from 'bluebird';
|
||||
|
||||
export async function readDirectory(path) {
|
||||
const allNames = await fromNode(cb => readdir(path, cb));
|
||||
return allNames.filter(name => !name.startsWith('.'));
|
||||
}
|
|
@ -0,0 +1,124 @@
|
|||
import sinon from 'sinon';
|
||||
import expect from 'expect.js';
|
||||
import { delay } from 'bluebird';
|
||||
|
||||
import {
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
createConcatStream,
|
||||
} from '../../../../utils';
|
||||
|
||||
import { createGenerateDocRecordsStream } from '../generate_doc_records_stream';
|
||||
import {
|
||||
createStubStats,
|
||||
createStubClient,
|
||||
} from './stubs';
|
||||
|
||||
describe('esArchiver: createGenerateDocRecordsStream()', () => {
|
||||
it('scrolls 1000 documents at a time', async () => {
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient([
|
||||
(name, params) => {
|
||||
expect(name).to.be('search');
|
||||
expect(params).to.have.property('index', 'logstash-*');
|
||||
expect(params).to.have.property('size', 1000);
|
||||
return {
|
||||
hits: {
|
||||
total: 0,
|
||||
hits: []
|
||||
}
|
||||
};
|
||||
}
|
||||
]);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(['logstash-*']),
|
||||
createGenerateDocRecordsStream(client, stats)
|
||||
]);
|
||||
});
|
||||
|
||||
it('uses a 1 minute scroll timeout', async () => {
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient([
|
||||
(name, params) => {
|
||||
expect(name).to.be('search');
|
||||
expect(params).to.have.property('index', 'logstash-*');
|
||||
expect(params).to.have.property('scroll', '1m');
|
||||
return {
|
||||
hits: {
|
||||
total: 0,
|
||||
hits: []
|
||||
}
|
||||
};
|
||||
}
|
||||
]);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(['logstash-*']),
|
||||
createGenerateDocRecordsStream(client, stats)
|
||||
]);
|
||||
});
|
||||
|
||||
it('consumes index names and scrolls completely before continuing', async () => {
|
||||
const stats = createStubStats();
|
||||
let checkpoint = Date.now();
|
||||
const client = createStubClient([
|
||||
async (name, params) => {
|
||||
expect(name).to.be('search');
|
||||
expect(params).to.have.property('index', 'index1');
|
||||
await delay(200);
|
||||
return {
|
||||
_scroll_id: 'index1ScrollId',
|
||||
hits: { total: 2, hits: [ { _id: 1 } ] }
|
||||
};
|
||||
},
|
||||
async (name, params) => {
|
||||
expect(name).to.be('scroll');
|
||||
expect(params).to.have.property('scrollId', 'index1ScrollId');
|
||||
expect(Date.now() - checkpoint).to.not.be.lessThan(200);
|
||||
checkpoint = Date.now();
|
||||
await delay(200);
|
||||
return { hits: { total: 2, hits: [ { _id: 2 } ] } };
|
||||
},
|
||||
async (name, params) => {
|
||||
expect(name).to.be('search');
|
||||
expect(params).to.have.property('index', 'index2');
|
||||
expect(Date.now() - checkpoint).to.not.be.lessThan(200);
|
||||
checkpoint = Date.now();
|
||||
await delay(200);
|
||||
return { hits: { total: 0, hits: [] } };
|
||||
}
|
||||
]);
|
||||
|
||||
const docRecords = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
'index1',
|
||||
'index2',
|
||||
]),
|
||||
createGenerateDocRecordsStream(client, stats),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(docRecords).to.eql([
|
||||
{
|
||||
type: 'doc',
|
||||
value: {
|
||||
index: undefined,
|
||||
type: undefined,
|
||||
id: 1,
|
||||
source: undefined
|
||||
}
|
||||
},
|
||||
{
|
||||
type: 'doc',
|
||||
value: {
|
||||
index: undefined,
|
||||
type: undefined,
|
||||
id: 2,
|
||||
source: undefined
|
||||
}
|
||||
},
|
||||
]);
|
||||
sinon.assert.calledTwice(stats.archivedDoc);
|
||||
});
|
||||
});
|
159
src/es_archiver/lib/docs/__tests__/index_doc_records_stream.js
Normal file
|
@ -0,0 +1,159 @@
|
|||
import expect from 'expect.js';
|
||||
import { delay } from 'bluebird';
|
||||
|
||||
import {
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
} from '../../../../utils';
|
||||
|
||||
import { createIndexDocRecordsStream } from '../index_doc_records_stream';
|
||||
import {
|
||||
createStubStats,
|
||||
createStubClient,
|
||||
createPersonDocRecords,
|
||||
} from './stubs';
|
||||
|
||||
const recordsToBulkBody = records => {
|
||||
return records.reduce((acc, record) => {
|
||||
const { index, type, id, source } = record.value;
|
||||
|
||||
return [
|
||||
...acc,
|
||||
{ index: { _index: index, _type: type, _id: id } },
|
||||
source
|
||||
];
|
||||
}, []);
|
||||
};
|
||||
|
||||
describe('esArchiver: createIndexDocRecordsStream()', () => {
|
||||
it('consumes doc records and sends to `_bulk` api', async () => {
|
||||
const records = createPersonDocRecords(1);
|
||||
const client = createStubClient([
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params).to.eql({
|
||||
body: recordsToBulkBody(records)
|
||||
});
|
||||
return { ok: true };
|
||||
}
|
||||
]);
|
||||
const stats = createStubStats();
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(records),
|
||||
createIndexDocRecordsStream(client, stats),
|
||||
]);
|
||||
|
||||
client.assertNoPendingResponses();
|
||||
});
|
||||
|
||||
it('consumes multiple doc records and sends to `_bulk` api together', async () => {
|
||||
const records = createPersonDocRecords(10);
|
||||
const client = createStubClient([
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params).to.eql({
|
||||
body: recordsToBulkBody(records.slice(0, 1))
|
||||
});
|
||||
return { ok: true };
|
||||
},
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params).to.eql({
|
||||
body: recordsToBulkBody(records.slice(1))
|
||||
});
|
||||
return { ok: true };
|
||||
}
|
||||
]);
|
||||
const stats = createStubStats();
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(records),
|
||||
createIndexDocRecordsStream(client, stats),
|
||||
]);
|
||||
|
||||
client.assertNoPendingResponses();
|
||||
});
|
||||
|
||||
it('waits until request is complete before sending more', async () => {
|
||||
const records = createPersonDocRecords(10);
|
||||
const stats = createStubStats();
|
||||
const start = Date.now();
|
||||
const delayMs = 1234;
|
||||
const client = createStubClient([
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params).to.eql({
|
||||
body: recordsToBulkBody(records.slice(0, 1))
|
||||
});
|
||||
await delay(delayMs);
|
||||
return { ok: true };
|
||||
},
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params).to.eql({
|
||||
body: recordsToBulkBody(records.slice(1))
|
||||
});
|
||||
expect(Date.now() - start).to.not.be.lessThan(delayMs);
|
||||
return { ok: true };
|
||||
}
|
||||
]);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(records),
|
||||
createIndexDocRecordsStream(client, stats)
|
||||
]);
|
||||
|
||||
client.assertNoPendingResponses();
|
||||
});
|
||||
|
||||
it('sends a maximum of 1000 documents at a time', async () => {
|
||||
const records = createPersonDocRecords(1001);
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient([
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params.body.length).to.eql(1 * 2);
|
||||
return { ok: true };
|
||||
},
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params.body.length).to.eql(999 * 2);
|
||||
return { ok: true };
|
||||
},
|
||||
async (name, params) => {
|
||||
expect(name).to.be('bulk');
|
||||
expect(params.body.length).to.eql(1 * 2);
|
||||
return { ok: true };
|
||||
},
|
||||
]);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(records),
|
||||
createIndexDocRecordsStream(client, stats),
|
||||
]);
|
||||
|
||||
client.assertNoPendingResponses();
|
||||
});
|
||||
|
||||
it('emits an error if any request fails', async () => {
|
||||
const records = createPersonDocRecords(2);
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient([
|
||||
async () => ({ ok: true }),
|
||||
async () => ({ errors: true, forcedError: true })
|
||||
]);
|
||||
|
||||
try {
|
||||
await createPromiseFromStreams([
|
||||
createListStream(records),
|
||||
createIndexDocRecordsStream(client, stats),
|
||||
]);
|
||||
throw new Error('expected stream to emit error');
|
||||
} catch (err) {
|
||||
expect(err.message).to.match(/"forcedError":\s*true/);
|
||||
}
|
||||
|
||||
client.assertNoPendingResponses();
|
||||
});
|
||||
});
|
46
src/es_archiver/lib/docs/__tests__/stubs.js
Normal file
|
@ -0,0 +1,46 @@
|
|||
import sinon from 'sinon';
|
||||
import Chance from 'chance';
|
||||
import { times } from 'lodash';
|
||||
const chance = new Chance();
|
||||
|
||||
export const createStubStats = () => ({
|
||||
indexedDoc: sinon.stub(),
|
||||
archivedDoc: sinon.stub(),
|
||||
});
|
||||
|
||||
export const createPersonDocRecords = n => times(n, () => ({
|
||||
type: 'doc',
|
||||
value: {
|
||||
index: 'people',
|
||||
type: 'person',
|
||||
id: chance.natural(),
|
||||
source: {
|
||||
name: chance.name(),
|
||||
birthday: chance.birthday(),
|
||||
ssn: chance.ssn(),
|
||||
}
|
||||
}
|
||||
}));
|
||||
|
||||
export const createStubClient = (responses = []) => {
|
||||
const createStubClientMethod = name => sinon.spy(async (params) => {
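// Each stubbed method consumes the next queued response handler in order,
// so tests can assert call order and detect unexpected or missing calls.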
|
||||
if (responses.length === 0) {
|
||||
throw new Error(`unexpected client.${name} call`);
|
||||
}
|
||||
|
||||
const response = responses.shift();
|
||||
return await response(name, params);
|
||||
});
|
||||
|
||||
return {
|
||||
search: createStubClientMethod('search'),
|
||||
scroll: createStubClientMethod('scroll'),
|
||||
bulk: createStubClientMethod('bulk'),
|
||||
|
||||
assertNoPendingResponses() {
|
||||
if (responses.length) {
|
||||
throw new Error(`There are ${responses.length} unsent responses.`);
|
||||
}
|
||||
},
|
||||
};
|
||||
};
|
52
src/es_archiver/lib/docs/generate_doc_records_stream.js
Normal file
|
@ -0,0 +1,52 @@
|
|||
import { Transform } from 'stream';
|
||||
|
||||
const SCROLL_SIZE = 1000;
|
||||
const SCROLL_TIMEOUT = '1m';
|
||||
|
||||
export function createGenerateDocRecordsStream(client, stats) {
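// For each incoming index name, run a scrolling search (SCROLL_SIZE docs per
// page, SCROLL_TIMEOUT scroll) and push one { type: 'doc' } record per hit
// until the reported hit total is consumed.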
|
||||
return new Transform({
|
||||
writableObjectMode: true,
|
||||
readableObjectMode: true,
|
||||
async transform(index, enc, callback) {
|
||||
try {
|
||||
let remainingHits = null;
|
||||
let resp = null;
|
||||
|
||||
while (!resp || remainingHits > 0) {
|
||||
if (!resp) {
|
||||
resp = await client.search({
|
||||
index: index,
|
||||
scroll: SCROLL_TIMEOUT,
|
||||
size: SCROLL_SIZE,
|
||||
_source: true
|
||||
});
|
||||
remainingHits = resp.hits.total;
|
||||
} else {
|
||||
resp = await client.scroll({
|
||||
scroll: SCROLL_TIMEOUT,
|
||||
scrollId: resp._scroll_id,
|
||||
});
|
||||
}
|
||||
|
||||
for (const hit of resp.hits.hits) {
|
||||
remainingHits -= 1;
|
||||
stats.archivedDoc(hit._index);
|
||||
this.push({
|
||||
type: 'doc',
|
||||
value: {
|
||||
index: hit._index,
|
||||
type: hit._type,
|
||||
id: hit._id,
|
||||
source: hit._source,
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
callback(null);
|
||||
} catch (err) {
|
||||
callback(err);
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
2
src/es_archiver/lib/docs/index.js
Normal file
|
@ -0,0 +1,2 @@
|
|||
export { createIndexDocRecordsStream } from './index_doc_records_stream';
|
||||
export { createGenerateDocRecordsStream } from './generate_doc_records_stream';
|
50
src/es_archiver/lib/docs/index_doc_records_stream.js
Normal file
|
@ -0,0 +1,50 @@
|
|||
import { Writable } from 'stream';
|
||||
|
||||
export function createIndexDocRecordsStream(client, stats) {
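// Convert doc records into index-action/source pairs for the _bulk API;
// writev() flushes everything buffered (up to the highWaterMark of 1000
// records) in a single bulk request, and bulk errors become stream errors.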
|
||||
|
||||
async function indexDocs(docs) {
|
||||
const body = [];
|
||||
|
||||
docs.forEach(doc => {
|
||||
stats.indexedDoc(doc.index);
|
||||
body.push(
|
||||
{
|
||||
index: {
|
||||
_index: doc.index,
|
||||
_type: doc.type,
|
||||
_id: doc.id,
|
||||
}
|
||||
},
|
||||
doc.source
|
||||
);
|
||||
});
|
||||
|
||||
const resp = await client.bulk({ body });
|
||||
if (resp.errors) {
|
||||
throw new Error(`Failed to index all documents: ${JSON.stringify(resp, null, 2)}`);
|
||||
}
|
||||
}
|
||||
|
||||
return new Writable({
|
||||
highWaterMark: 1000,
|
||||
objectMode: true,
|
||||
|
||||
async write(record, enc, callback) {
|
||||
try {
|
||||
await indexDocs([record.value]);
|
||||
callback(null);
|
||||
} catch (err) {
|
||||
callback(err);
|
||||
}
|
||||
},
|
||||
|
||||
async writev(chunks, callback) {
|
||||
try {
|
||||
await indexDocs(chunks.map(({ chunk: record }) => record.value));
|
||||
callback(null);
|
||||
} catch (err) {
|
||||
callback(err);
|
||||
}
|
||||
},
|
||||
});
|
||||
}
|
29
src/es_archiver/lib/index.js
Normal file
|
@ -0,0 +1,29 @@
|
|||
export {
|
||||
createIndexDocRecordsStream,
|
||||
createGenerateDocRecordsStream,
|
||||
} from './docs';
|
||||
|
||||
export {
|
||||
createCreateIndexStream,
|
||||
createDeleteIndexStream,
|
||||
createGenerateIndexRecordsStream,
|
||||
} from './indices';
|
||||
|
||||
export {
|
||||
createFilterRecordsStream,
|
||||
} from './records';
|
||||
|
||||
export {
|
||||
createStats,
|
||||
} from './stats';
|
||||
|
||||
export {
|
||||
isGzip,
|
||||
prioritizeMappings,
|
||||
createParseArchiveStreams,
|
||||
createFormatArchiveStreams,
|
||||
} from './archives';
|
||||
|
||||
export {
|
||||
readDirectory
|
||||
} from './directory';
|
167
src/es_archiver/lib/indices/__tests__/create_index_stream.js
Normal file
|
@ -0,0 +1,167 @@
|
|||
import expect from 'expect.js';
|
||||
import sinon from 'sinon';
|
||||
import Chance from 'chance';
|
||||
|
||||
import {
|
||||
createPromiseFromStreams,
|
||||
createConcatStream,
|
||||
createListStream
|
||||
} from '../../../../utils';
|
||||
|
||||
import {
|
||||
createCreateIndexStream
|
||||
} from '../create_index_stream';
|
||||
|
||||
import {
|
||||
createStubStats,
|
||||
createStubIndexRecord,
|
||||
createStubDocRecord,
|
||||
createStubClient
|
||||
} from './stubs';
|
||||
|
||||
const chance = new Chance();
|
||||
|
||||
describe('esArchiver: createCreateIndexStream()', () => {
|
||||
describe('defaults', () => {
|
||||
it('deletes existing indices, creates all', async () => {
|
||||
const client = createStubClient(['existing-index']);
|
||||
const stats = createStubStats();
|
||||
await createPromiseFromStreams([
|
||||
createListStream([
|
||||
createStubIndexRecord('existing-index'),
|
||||
createStubIndexRecord('new-index')
|
||||
]),
|
||||
createCreateIndexStream({ client, stats })
|
||||
]);
|
||||
|
||||
expect(stats.getTestSummary()).to.eql({
|
||||
deletedIndex: 1,
|
||||
createdIndex: 2
|
||||
});
|
||||
sinon.assert.callCount(client.indices.delete, 1);
|
||||
sinon.assert.callCount(client.indices.create, 3); // one failed create because of existing
|
||||
});
|
||||
|
||||
it('passes through "hit" records', async () => {
|
||||
const client = createStubClient();
|
||||
const stats = createStubStats();
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
createStubIndexRecord('index'),
|
||||
createStubDocRecord('index', 1),
|
||||
createStubDocRecord('index', 2),
|
||||
]),
|
||||
createCreateIndexStream({ client, stats }),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(output).to.eql([
|
||||
createStubDocRecord('index', 1),
|
||||
createStubDocRecord('index', 2),
|
||||
]);
|
||||
});
|
||||
|
||||
it('passes through records with unknown types', async () => {
|
||||
const client = createStubClient();
|
||||
const stats = createStubStats();
|
||||
const randoms = [
|
||||
{ type: chance.word(), value: chance.hashtag() },
|
||||
{ type: chance.word(), value: chance.hashtag() },
|
||||
];
|
||||
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
createStubIndexRecord('index'),
|
||||
...randoms
|
||||
]),
|
||||
createCreateIndexStream({ client, stats }),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(output).to.eql(randoms);
|
||||
});
|
||||
|
||||
it('passes through non-record values', async () => {
|
||||
const client = createStubClient();
|
||||
const stats = createStubStats();
|
||||
const nonRecordValues = [
|
||||
undefined,
|
||||
chance.email(),
|
||||
12345,
|
||||
Infinity,
|
||||
/abc/,
|
||||
new Date()
|
||||
];
|
||||
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream(nonRecordValues),
|
||||
createCreateIndexStream({ client, stats }),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(output).to.eql(nonRecordValues);
|
||||
});
|
||||
});
|
||||
|
||||
describe('skipExisting = true', () => {
|
||||
it('ignores preexisting indexes', async () => {
|
||||
const client = createStubClient(['existing-index']);
|
||||
const stats = createStubStats();
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream([
|
||||
createStubIndexRecord('new-index'),
|
||||
createStubIndexRecord('existing-index'),
|
||||
]),
|
||||
createCreateIndexStream({
|
||||
client,
|
||||
stats,
|
||||
skipExisting: true
|
||||
})
|
||||
]);
|
||||
|
||||
expect(stats.getTestSummary()).to.eql({
|
||||
skippedIndex: 1,
|
||||
createdIndex: 1,
|
||||
});
|
||||
sinon.assert.callCount(client.indices.delete, 0);
|
||||
sinon.assert.callCount(client.indices.create, 2); // one failed create because of existing
|
||||
expect(client.indices.create.args[0][0]).to.have.property('index', 'new-index');
|
||||
});
|
||||
|
||||
it('filters documents for skipped indices', async () => {
|
||||
const client = createStubClient(['existing-index']);
|
||||
const stats = createStubStats();
|
||||
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
createStubIndexRecord('new-index'),
|
||||
createStubDocRecord('new-index', 1),
|
||||
createStubDocRecord('new-index', 2),
|
||||
createStubIndexRecord('existing-index'),
|
||||
createStubDocRecord('existing-index', 1),
|
||||
createStubDocRecord('existing-index', 2),
|
||||
]),
|
||||
createCreateIndexStream({
|
||||
client,
|
||||
stats,
|
||||
skipExisting: true
|
||||
}),
|
||||
createConcatStream([])
|
||||
]);
|
||||
|
||||
expect(stats.getTestSummary()).to.eql({
|
||||
skippedIndex: 1,
|
||||
createdIndex: 1
|
||||
});
|
||||
sinon.assert.callCount(client.indices.delete, 0);
|
||||
sinon.assert.callCount(client.indices.create, 2); // one failed create because of existing
|
||||
|
||||
expect(output).to.have.length(2);
|
||||
expect(output).to.eql([
|
||||
createStubDocRecord('new-index', 1),
|
||||
createStubDocRecord('new-index', 2)
|
||||
]);
|
||||
});
|
||||
});
|
||||
});
|
52
src/es_archiver/lib/indices/__tests__/delete_index_stream.js
Normal file
|
@ -0,0 +1,52 @@
|
|||
import sinon from 'sinon';
|
||||
|
||||
import {
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
} from '../../../../utils';
|
||||
|
||||
import {
|
||||
createDeleteIndexStream,
|
||||
} from '../delete_index_stream';
|
||||
|
||||
import {
|
||||
createStubStats,
|
||||
createStubClient,
|
||||
createStubIndexRecord
|
||||
} from './stubs';
|
||||
|
||||
describe('esArchiver: createDeleteIndexStream()', () => {
|
||||
it('deletes the index without checking if it exists', async () => {
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient([]);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream([
|
||||
createStubIndexRecord('index1')
|
||||
]),
|
||||
createDeleteIndexStream(client, stats),
|
||||
]);
|
||||
|
||||
sinon.assert.notCalled(stats.deletedIndex);
|
||||
sinon.assert.notCalled(client.indices.create);
|
||||
sinon.assert.calledOnce(client.indices.delete);
|
||||
sinon.assert.notCalled(client.indices.exists);
|
||||
});
|
||||
|
||||
it('reports the delete when the index existed', async () => {
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient(['index1']);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream([
|
||||
createStubIndexRecord('index1')
|
||||
]),
|
||||
createDeleteIndexStream(client, stats),
|
||||
]);
|
||||
|
||||
sinon.assert.calledOnce(stats.deletedIndex);
|
||||
sinon.assert.notCalled(client.indices.create);
|
||||
sinon.assert.calledOnce(client.indices.delete);
|
||||
sinon.assert.notCalled(client.indices.exists);
|
||||
});
|
||||
});
|
|
@ -0,0 +1,82 @@
|
|||
import sinon from 'sinon';
|
||||
import expect from 'expect.js';
|
||||
|
||||
import {
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
createConcatStream,
|
||||
} from '../../../../utils';
|
||||
|
||||
import {
|
||||
createStubClient,
|
||||
createStubStats,
|
||||
} from './stubs';
|
||||
|
||||
import {
|
||||
createGenerateIndexRecordsStream,
|
||||
} from '../generate_index_records_stream';
|
||||
|
||||
describe('esArchiver: createGenerateIndexRecordsStream()', () => {
|
||||
it('consumes index names and queries for the mapping of each', async () => {
|
||||
const indices = ['index1', 'index2', 'index3', 'index4'];
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient(indices);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(indices),
|
||||
createGenerateIndexRecordsStream(client, stats)
|
||||
]);
|
||||
|
||||
expect(stats.getTestSummary()).to.eql({
|
||||
archivedIndex: 4
|
||||
});
|
||||
|
||||
sinon.assert.callCount(client.indices.get, 4);
|
||||
sinon.assert.notCalled(client.indices.create);
|
||||
sinon.assert.notCalled(client.indices.delete);
|
||||
sinon.assert.notCalled(client.indices.exists);
|
||||
});
|
||||
|
||||
it('filters index metadata from settings', async () => {
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient(['index1']);
|
||||
|
||||
await createPromiseFromStreams([
|
||||
createListStream(['index1']),
|
||||
createGenerateIndexRecordsStream(client, stats)
|
||||
]);
|
||||
|
||||
const params = client.indices.get.args[0][0];
|
||||
expect(params).to.have.property('filterPath');
|
||||
const filters = params.filterPath;
|
||||
expect(filters.some(path => path.includes('index.creation_date'))).to.be(true);
|
||||
expect(filters.some(path => path.includes('index.uuid'))).to.be(true);
|
||||
expect(filters.some(path => path.includes('index.version'))).to.be(true);
|
||||
expect(filters.some(path => path.includes('index.provided_name'))).to.be(true);
|
||||
});
|
||||
|
||||
it('produces one index record for each index name it receives', async () => {
|
||||
const stats = createStubStats();
|
||||
const client = createStubClient(['index1', 'index2', 'index3']);
|
||||
|
||||
const indexRecords = await createPromiseFromStreams([
|
||||
createListStream(['index1', 'index2', 'index3']),
|
||||
createGenerateIndexRecordsStream(client, stats),
|
||||
createConcatStream([]),
|
||||
]);
|
||||
|
||||
expect(indexRecords).to.have.length(3);
|
||||
|
||||
expect(indexRecords[0]).to.have.property('type', 'index');
|
||||
expect(indexRecords[0]).to.have.property('value');
|
||||
expect(indexRecords[0].value).to.have.property('index', 'index1');
|
||||
|
||||
expect(indexRecords[1]).to.have.property('type', 'index');
|
||||
expect(indexRecords[1]).to.have.property('value');
|
||||
expect(indexRecords[1].value).to.have.property('index', 'index2');
|
||||
|
||||
expect(indexRecords[2]).to.have.property('type', 'index');
|
||||
expect(indexRecords[2]).to.have.property('value');
|
||||
expect(indexRecords[2].value).to.have.property('index', 'index3');
|
||||
});
|
||||
});
|
73
src/es_archiver/lib/indices/__tests__/stubs.js
Normal file
|
@ -0,0 +1,73 @@
|
|||
import sinon from 'sinon';
|
||||
|
||||
export const createStubStats = () => ({
|
||||
createdIndex: sinon.stub(),
|
||||
deletedIndex: sinon.stub(),
|
||||
skippedIndex: sinon.stub(),
|
||||
archivedIndex: sinon.stub(),
|
||||
getTestSummary() {
|
||||
const summary = {};
|
||||
Object.keys(this).forEach(key => {
|
||||
if (this[key].callCount) {
|
||||
summary[key] = this[key].callCount;
|
||||
}
|
||||
});
|
||||
return summary;
|
||||
},
|
||||
});
|
||||
|
||||
export const createStubIndexRecord = (index) => ({
|
||||
type: 'index',
|
||||
value: { index }
|
||||
});
|
||||
|
||||
export const createStubDocRecord = (index, id) => ({
|
||||
type: 'doc',
|
||||
value: { index, id }
|
||||
});
|
||||
|
||||
const createEsClientError = (errorType) => {
|
||||
const err = new Error(`ES Client Error Stub "${errorType}"`);
|
||||
err.body = {
|
||||
error: {
|
||||
type: errorType
|
||||
}
|
||||
};
|
||||
return err;
|
||||
};
|
||||
|
||||
export const createStubClient = (existingIndices = []) => ({
|
||||
indices: {
|
||||
get: sinon.spy(async ({ index }) => {
|
||||
if (!existingIndices.includes(index)) {
|
||||
throw createEsClientError('index_not_found_exception');
|
||||
}
|
||||
|
||||
return {
|
||||
[index]: {
|
||||
mappings: {},
|
||||
settings: {},
|
||||
}
|
||||
};
|
||||
}),
|
||||
create: sinon.spy(async ({ index }) => {
|
||||
if (existingIndices.includes(index)) {
|
||||
throw createEsClientError('resource_already_exists_exception');
|
||||
} else {
|
||||
existingIndices.push(index);
|
||||
return { ok: true };
|
||||
}
|
||||
}),
|
||||
delete: sinon.spy(async ({ index }) => {
|
||||
if (existingIndices.includes(index)) {
|
||||
existingIndices.splice(existingIndices.indexOf(index), 1);
|
||||
return { ok: true };
|
||||
} else {
|
||||
throw createEsClientError('index_not_found_exception');
|
||||
}
|
||||
}),
|
||||
exists: sinon.spy(async () => {
|
||||
throw new Error('Do not use indices.exists(). React to errors instead.');
|
||||
})
|
||||
}
|
||||
});
|
73
src/es_archiver/lib/indices/create_index_stream.js
Normal file
|
@ -0,0 +1,73 @@
|
|||
import { Transform } from 'stream';
|
||||
|
||||
import { get } from 'lodash';
|
||||
|
||||
export function createCreateIndexStream({ client, stats, skipExisting }) {
|
||||
const skipDocsFromIndices = new Set();
|
||||
|
||||
async function handleDoc(stream, record) {
|
||||
if (skipDocsFromIndices.has(record.value.index)) {
|
||||
return;
|
||||
}
|
||||
|
||||
stream.push(record);
|
||||
}
|
||||
|
||||
async function handleIndex(stream, record) {
|
||||
const { index, settings, mappings } = record.value;
|
||||
|
||||
async function attemptToCreate(attemptNumber = 1) {
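// Create the index; if it already exists, either record it as skipped
// (skipExisting) or delete it and retry, giving up after three attempts.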
|
||||
try {
|
||||
await client.indices.create({
|
||||
method: 'PUT',
|
||||
index,
|
||||
body: { settings, mappings },
|
||||
});
|
||||
stats.createdIndex(index, { settings });
|
||||
} catch (err) {
|
||||
if (get(err, 'body.error.type') !== 'resource_already_exists_exception' || attemptNumber >= 3) {
|
||||
throw err;
|
||||
}
|
||||
|
||||
if (skipExisting) {
|
||||
skipDocsFromIndices.add(index);
|
||||
stats.skippedIndex(index);
|
||||
return;
|
||||
}
|
||||
|
||||
await client.indices.delete({ index });
|
||||
stats.deletedIndex(index);
|
||||
await attemptToCreate(attemptNumber + 1);
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
await attemptToCreate();
|
||||
}
|
||||
|
||||
return new Transform({
|
||||
readableObjectMode: true,
|
||||
writableObjectMode: true,
|
||||
async transform(record, enc, callback) {
|
||||
try {
|
||||
switch (record && record.type) {
|
||||
case 'index':
|
||||
await handleIndex(this, record);
|
||||
break;
|
||||
|
||||
case 'doc':
|
||||
await handleDoc(this, record);
|
||||
break;
|
||||
|
||||
default:
|
||||
this.push(record);
|
||||
break;
|
||||
}
|
||||
|
||||
callback(null);
|
||||
} catch (err) {
|
||||
callback(err);
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
34
src/es_archiver/lib/indices/delete_index_stream.js
Normal file
|
@ -0,0 +1,34 @@
|
|||
import { Transform } from 'stream';
|
||||
|
||||
import { get } from 'lodash';
|
||||
|
||||
export function createDeleteIndexStream(client, stats) {
|
||||
|
||||
async function deleteIndex(index) {
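// Delete without checking for existence first; index_not_found_exception
// is swallowed so a missing index is not treated as an error.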
|
||||
try {
|
||||
await client.indices.delete({ index });
|
||||
stats.deletedIndex(index);
|
||||
} catch (err) {
|
||||
if (get(err, 'body.error.type') !== 'index_not_found_exception') {
|
||||
throw err;
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return new Transform({
|
||||
readableObjectMode: true,
|
||||
writableObjectMode: true,
|
||||
async transform(record, enc, callback) {
|
||||
try {
|
||||
if (!record || record.type === 'index') {
|
||||
await deleteIndex(record.value.index);
|
||||
} else {
|
||||
this.push(record);
|
||||
}
|
||||
callback();
|
||||
} catch (err) {
|
||||
callback(err);
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
36
src/es_archiver/lib/indices/generate_index_records_stream.js
Normal file
|
@ -0,0 +1,36 @@
|
|||
import { Transform } from 'stream';
|
||||
|
||||
export function createGenerateIndexRecordsStream(client, stats) {
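// For each index name, fetch only its settings and mappings (filtering out
// auto-generated settings) and emit a single { type: 'index' } record.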
|
||||
return new Transform({
|
||||
writableObjectMode: true,
|
||||
readableObjectMode: true,
|
||||
async transform(index, enc, callback) {
|
||||
try {
|
||||
const resp = await client.indices.get({
|
||||
index,
|
||||
feature: ['_settings', '_mappings'],
|
||||
filterPath: [
|
||||
// remove settings that aren't really settings
|
||||
'-*.settings.index.creation_date',
|
||||
'-*.settings.index.uuid',
|
||||
'-*.settings.index.version',
|
||||
'-*.settings.index.provided_name',
|
||||
]
|
||||
});
|
||||
|
||||
const { settings, mappings } = resp[index];
|
||||
stats.archivedIndex(index, { settings, mappings });
|
||||
callback(null, {
|
||||
type: 'index',
|
||||
value: {
|
||||
index,
|
||||
settings,
|
||||
mappings,
|
||||
}
|
||||
});
|
||||
} catch (err) {
|
||||
callback(err);
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
3
src/es_archiver/lib/indices/index.js
Normal file
|
@ -0,0 +1,3 @@
|
|||
export { createCreateIndexStream } from './create_index_stream';
|
||||
export { createDeleteIndexStream } from './delete_index_stream';
|
||||
export { createGenerateIndexRecordsStream } from './generate_index_records_stream';
|
|
@ -0,0 +1,58 @@
|
|||
import Chance from 'chance';
|
||||
import expect from 'expect.js';
|
||||
|
||||
import {
|
||||
createListStream,
|
||||
createPromiseFromStreams,
|
||||
createConcatStream,
|
||||
} from '../../../../utils';
|
||||
|
||||
import {
|
||||
createFilterRecordsStream,
|
||||
} from '../filter_records_stream';
|
||||
|
||||
const chance = new Chance();
|
||||
|
||||
describe('esArchiver: createFilterRecordsStream()', () => {
|
||||
it('consumes any value', async () => {
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
chance.integer(),
|
||||
/test/,
|
||||
{
|
||||
birthday: chance.birthday(),
|
||||
ssn: chance.ssn(),
|
||||
},
|
||||
chance.bool()
|
||||
]),
|
||||
createFilterRecordsStream('type'),
|
||||
createConcatStream([]),
|
||||
]);
|
||||
|
||||
expect(output).to.eql([]);
|
||||
});
|
||||
|
||||
it('produces record values that have a matching type', async () => {
|
||||
const type1 = chance.word();
|
||||
const output = await createPromiseFromStreams([
|
||||
createListStream([
|
||||
{ type: type1, value: {} },
|
||||
{ type: type1, value: {} },
|
||||
{ type: chance.word(), value: {} },
|
||||
{ type: chance.word(), value: {} },
|
||||
{ type: type1, value: {} },
|
||||
{ type: chance.word(), value: {} },
|
||||
{ type: chance.word(), value: {} },
|
||||
]),
|
||||
createFilterRecordsStream(type1),
|
||||
createConcatStream([]),
|
||||
]);
|
||||
|
||||
expect(output).to.have.length(3);
|
||||
expect(output.map(o => o.type)).to.eql([
|
||||
type1,
|
||||
type1,
|
||||
type1,
|
||||
]);
|
||||
});
|
||||
});
|
16
src/es_archiver/lib/records/filter_records_stream.js
Normal file
|
@ -0,0 +1,16 @@
|
|||
import { Transform } from 'stream';
|
||||
|
||||
export function createFilterRecordsStream(type) {
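// Pass through only records whose `type` matches; drop everything else,
// including non-record values.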
|
||||
return new Transform({
|
||||
writableObjectMode: true,
|
||||
readableObjectMode: true,
|
||||
|
||||
transform(record, enc, callback) {
|
||||
if (record && record.type === type) {
|
||||
callback(null, record);
|
||||
} else {
|
||||
callback();
|
||||
}
|
||||
}
|
||||
});
|
||||
}
|
1
src/es_archiver/lib/records/index.js
Normal file
|
@ -0,0 +1 @@
|
|||
export { createFilterRecordsStream } from './filter_records_stream';
|
77
src/es_archiver/lib/stats.js
Normal file
|
@ -0,0 +1,77 @@
|
|||
import { cloneDeep } from 'lodash';
|
||||
|
||||
export function createStats(name, log) {
|
||||
const info = (msg, ...args) => log.info(`[${name}] ${msg}`, ...args);
|
||||
const debug = (msg, ...args) => log.debug(`[${name}] ${msg}`, ...args);
|
||||
|
||||
const indices = {};
|
||||
const getOrCreate = index => {
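// Lazily create the per-index stats entry the first time an index is touched.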
|
||||
if (!indices[index]) {
|
||||
indices[index] = {
|
||||
skipped: false,
|
||||
deleted: false,
|
||||
created: false,
|
||||
archived: false,
|
||||
configDocs: {
|
||||
upgraded: 0,
|
||||
tagged: 0,
|
||||
upToDate: 0,
|
||||
},
|
||||
docs: {
|
||||
indexed: 0,
|
||||
archived: 0,
|
||||
}
|
||||
};
|
||||
}
|
||||
return indices[index];
|
||||
};
|
||||
|
||||
class Stats {
|
||||
skippedIndex(index) {
|
||||
getOrCreate(index).skipped = true;
|
||||
info('Skipped restore for existing index %j', index);
|
||||
}
|
||||
|
||||
deletedIndex(index) {
|
||||
getOrCreate(index).deleted = true;
|
||||
info('Deleted existing index %j', index);
|
||||
}
|
||||
|
||||
createdIndex(index, metadata) {
|
||||
getOrCreate(index).created = true;
|
||||
info('Created index %j', index);
|
||||
Object.keys(metadata || {}).forEach(name => {
|
||||
debug('%j %s %j', index, name, metadata[name]);
|
||||
});
|
||||
}
|
||||
|
||||
archivedIndex(index, metadata) {
|
||||
getOrCreate(index).archived = true;
|
||||
info('Archived %j', index);
|
||||
Object.keys(metadata || {}).forEach(name => {
|
||||
debug('%j %s %j', index, name, metadata[name]);
|
||||
});
|
||||
}
|
||||
|
||||
indexedDoc(index) {
|
||||
getOrCreate(index).docs.indexed += 1;
|
||||
}
|
||||
|
||||
archivedDoc(index) {
|
||||
getOrCreate(index).docs.archived += 1;
|
||||
}
|
||||
|
||||
toJSON() {
|
||||
return cloneDeep(indices);
|
||||
}
|
||||
|
||||
forEachIndex(fn) {
|
||||
const clone = this.toJSON();
|
||||
Object.keys(clone).forEach(index => {
|
||||
fn(index, clone[index]);
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
return new Stats();
|
||||
}
|
BIN
src/fixtures/es_archives/dashboard/data.json.gz
Normal file
Binary file not shown.
194
src/fixtures/es_archives/dashboard/mappings.json
Normal file
|
@ -0,0 +1,194 @@
|
|||
{
|
||||
"type": "index",
|
||||
"value": {
|
||||
"index": ".kibana",
|
||||
"settings": {
|
||||
"index": {
|
||||
"number_of_shards": "5",
|
||||
"number_of_replicas": "1"
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"config": {
|
||||
"properties": {
|
||||
"buildNum": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"visualization": {
|
||||
"properties": {
|
||||
"description": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"kibanaSavedObjectMeta": {
|
||||
"properties": {
|
||||
"searchSourceJSON": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"uiStateJSON": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"version": {
|
||||
"type": "integer"
|
||||
},
|
||||
"visState": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"search": {
|
||||
"properties": {
|
||||
"columns": {
|
||||
"type": "text"
|
||||
},
|
||||
"description": {
|
||||
"type": "text"
|
||||
},
|
||||
"hits": {
|
||||
"type": "integer"
|
||||
},
|
||||
"kibanaSavedObjectMeta": {
|
||||
"properties": {
|
||||
"searchSourceJSON": {
|
||||
"type": "text"
|
||||
}
|
||||
}
|
||||
},
|
||||
"sort": {
|
||||
"type": "text"
|
||||
},
|
||||
"title": {
|
||||
"type": "text"
|
||||
},
|
||||
"version": {
|
||||
"type": "integer"
|
||||
}
|
||||
}
|
||||
},
|
||||
"index-pattern": {
|
||||
"properties": {
|
||||
"fields": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"sourceFilters": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"timeFieldName": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"server": {
|
||||
"properties": {
|
||||
"uuid": {
|
||||
"type": "keyword"
|
||||
}
|
||||
}
|
||||
},
|
||||
"dashboard": {
|
||||
"properties": {
|
||||
"description": {
|
||||
"type": "text"
|
||||
},
|
||||
"hits": {
|
||||
"type": "integer"
|
||||
},
|
||||
"kibanaSavedObjectMeta": {
|
||||
"properties": {
|
||||
"searchSourceJSON": {
|
||||
"type": "text"
|
||||
}
|
||||
}
|
||||
},
|
||||
"optionsJSON": {
|
||||
"type": "text"
|
||||
},
|
||||
"panelsJSON": {
|
||||
"type": "text"
|
||||
},
|
||||
"timeFrom": {
|
||||
"type": "text"
|
||||
},
|
||||
"timeRestore": {
|
||||
"type": "boolean"
|
||||
},
|
||||
"timeTo": {
|
||||
"type": "text"
|
||||
},
|
||||
"title": {
|
||||
"type": "text"
|
||||
},
|
||||
"uiStateJSON": {
|
||||
"type": "text"
|
||||
},
|
||||
"version": {
|
||||
"type": "integer"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
BIN
src/fixtures/es_archives/discover/data.json.gz
Normal file
Binary file not shown.
113
src/fixtures/es_archives/discover/mappings.json
Normal file
|
@ -0,0 +1,113 @@
|
|||
{
|
||||
"type": "index",
|
||||
"value": {
|
||||
"index": ".kibana",
|
||||
"settings": {
|
||||
"index": {
|
||||
"number_of_shards": "5",
|
||||
"number_of_replicas": "1"
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"search": {
|
||||
"properties": {
|
||||
"columns": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"description": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"hits": {
|
||||
"type": "long"
|
||||
},
|
||||
"kibanaSavedObjectMeta": {
|
||||
"properties": {
|
||||
"searchSourceJSON": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"sort": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"version": {
|
||||
"type": "long"
|
||||
}
|
||||
}
|
||||
},
|
||||
"index-pattern": {
|
||||
"properties": {
|
||||
"fields": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"sourceFilters": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"timeFieldName": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
BIN
src/fixtures/es_archives/empty_kibana/data.json.gz
Normal file
Binary file not shown.
30
src/fixtures/es_archives/empty_kibana/mappings.json
Normal file
|
@ -0,0 +1,30 @@
|
|||
{
|
||||
"type": "index",
|
||||
"value": {
|
||||
"index": ".kibana",
|
||||
"settings": {
|
||||
"index": {
|
||||
"number_of_shards": "1",
|
||||
"number_of_replicas": "1"
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"config": {
|
||||
"properties": {
|
||||
"buildNum": {
|
||||
"type": "keyword"
|
||||
},
|
||||
"dateFormat:tz": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
BIN
src/fixtures/es_archives/logstash_functional/data.json.gz
Normal file
Binary file not shown.
2333
src/fixtures/es_archives/logstash_functional/mappings.json
Normal file
File diff suppressed because it is too large
BIN
src/fixtures/es_archives/makelogs/data.json.gz
Normal file
Binary file not shown.
1552
src/fixtures/es_archives/makelogs/mappings.json
Normal file
File diff suppressed because it is too large
BIN
src/fixtures/es_archives/visualize/data.json.gz
Normal file
Binary file not shown.
110
src/fixtures/es_archives/visualize/mappings.json
Normal file
|
@ -0,0 +1,110 @@
|
|||
{
|
||||
"type": "index",
|
||||
"value": {
|
||||
"index": ".kibana",
|
||||
"settings": {
|
||||
"index": {
|
||||
"number_of_shards": "5",
|
||||
"number_of_replicas": "1"
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"index-pattern": {
|
||||
"properties": {
|
||||
"fields": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"sourceFilters": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"timeFieldName": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"visualization": {
|
||||
"properties": {
|
||||
"description": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"kibanaSavedObjectMeta": {
|
||||
"properties": {
|
||||
"searchSourceJSON": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"uiStateJSON": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"version": {
|
||||
"type": "integer"
|
||||
},
|
||||
"visState": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
BIN
src/fixtures/es_archives/visualize_source-filters/data.json.gz
Normal file
Binary file not shown.
|
@ -0,0 +1,54 @@
|
|||
{
|
||||
"type": "index",
|
||||
"value": {
|
||||
"index": ".kibana",
|
||||
"settings": {
|
||||
"index": {
|
||||
"number_of_shards": "5",
|
||||
"number_of_replicas": "1"
|
||||
}
|
||||
},
|
||||
"mappings": {
|
||||
"index-pattern": {
|
||||
"properties": {
|
||||
"fields": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"sourceFilters": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"timeFieldName": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
},
|
||||
"title": {
|
||||
"type": "text",
|
||||
"fields": {
|
||||
"keyword": {
|
||||
"type": "keyword",
|
||||
"ignore_above": 256
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
BIN
src/fixtures/es_archives/visualize_source_filters/data.json.gz
Normal file
Binary file not shown.
@@ -0,0 +1,76 @@
{
  "type": "index",
  "value": {
    "index": ".kibana",
    "settings": {
      "index": {
        "number_of_shards": "1",
        "number_of_replicas": "0"
      }
    },
    "mappings": {
      "config": {
        "properties": {
          "dateFormat:tz": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "defaultIndex": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          }
        }
      },
      "index-pattern": {
        "properties": {
          "fields": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "sourceFilters": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "timeFieldName": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "title": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          }
        }
      }
    }
  }
}
@@ -0,0 +1,7 @@
import { resolve } from 'path';

export default () => ({
  testFiles: [
    resolve(__dirname, 'tests.js')
  ]
});
@@ -0,0 +1,18 @@
import expect from 'expect.js';

export default () => {
  describe('app one', () => {
    before(() => {
      console.log('$BEFORE$');
    });

    it('$TESTNAME$', () => {
      expect(1).to.be(1);
      console.log('$INTEST$');
    });

    after(() => {
      console.log('$AFTER$');
    });
  });
};
Binary file not shown.
@@ -0,0 +1,212 @@
{
  "type": "index",
  "value": {
    "index": ".kibana",
    "settings": {
      "index": {
        "number_of_shards": "1",
        "number_of_replicas": "0"
      }
    },
    "mappings": {
      "server": {
        "properties": {
          "uuid": { "type": "keyword" }
        }
      },
      "search": {
        "properties": {
          "columns": { "type": "text" },
          "description": { "type": "text" },
          "hits": { "type": "integer" },
          "kibanaSavedObjectMeta": {
            "properties": {
              "searchSourceJSON": { "type": "text" }
            }
          },
          "sort": { "type": "text" },
          "title": { "type": "text" },
          "version": { "type": "integer" }
        }
      },
      "config": {
        "properties": {
          "buildNum": { "type": "keyword" },
          "dateFormat:tz": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "defaultIndex": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          }
        }
      },
      "dashboard": {
        "properties": {
          "description": { "type": "text" },
          "hits": { "type": "integer" },
          "kibanaSavedObjectMeta": {
            "properties": {
              "searchSourceJSON": { "type": "text" }
            }
          },
          "optionsJSON": { "type": "text" },
          "panelsJSON": { "type": "text" },
          "timeFrom": { "type": "text" },
          "timeRestore": { "type": "boolean" },
          "timeTo": { "type": "text" },
          "title": { "type": "text" },
          "uiStateJSON": { "type": "text" },
          "version": { "type": "integer" }
        }
      },
      "visualization": {
        "properties": {
          "description": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "kibanaSavedObjectMeta": {
            "properties": {
              "searchSourceJSON": {
                "type": "text",
                "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
              }
            }
          },
          "title": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "uiStateJSON": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "version": { "type": "integer" },
          "visState": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          }
        }
      },
      "index-pattern": {
        "properties": {
          "fields": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "sourceFilters": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "timeFieldName": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          },
          "title": {
            "type": "text",
            "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
          }
        }
      }
    }
  }
}
@@ -0,0 +1,24 @@
import { resolve } from 'path';

export default () => ({
  testFiles: [
    resolve(__dirname, 'tests.js')
  ],

  esArchiver: {
    directory: resolve(__dirname, 'archives')
  },

  servers: {
    elasticsearch: {
      protocol: 'http',
      hostname: 'localhost',
      port: 5700
    },
    kibana: {
      protocol: 'http',
      hostname: 'localhost',
      port: 5701
    }
  }
});
@@ -0,0 +1,23 @@
export default ({ getService }) => {
  const esArchiver = getService('esArchiver');
  const es = getService('es');
  const log = getService('log');

  describe('tests', () => {
    before(async () => {
      log.debug('before load()');
      await esArchiver.load('test_archive');
      log.debug('after load()');
    });

    it('loaded the archive', async () => {
      log.debug('es aliases', await es.indices.getAlias());
    });

    after(async () => {
      log.debug('before unload()');
      await esArchiver.unload('test_archive');
      log.debug('after unload()');
    });
  });
};
20 src/functional_test_runner/__tests__/integration/basic.js Normal file
@@ -0,0 +1,20 @@
import { spawnSync } from 'child_process';
import { resolve } from 'path';

import expect from 'expect.js';

const SCRIPT = resolve(__dirname, '../../../../scripts/functional_test_runner.js');
const BASIC_CONFIG = resolve(__dirname, '../fixtures/simple_project/config.js');

describe('basic config file with a single app and test', function () {
  this.timeout(60 * 1000);

  it('runs and prints expected output', () => {
    const proc = spawnSync(process.execPath, [SCRIPT, '--config', BASIC_CONFIG]);
    const stdout = proc.stdout.toString('utf8');
    expect(stdout).to.contain('$BEFORE$');
    expect(stdout).to.contain('$TESTNAME$');
    expect(stdout).to.contain('$INTEST$');
    expect(stdout).to.contain('$AFTER$');
  });
});
@@ -0,0 +1,71 @@
import { spawn } from 'child_process';
import { resolve } from 'path';
import { format as formatUrl } from 'url';

import { readConfigFile } from '../../lib';
import { createToolingLog, createReduceStream } from '../../../utils';
import { startupEs, startupKibana } from '../lib';

const SCRIPT = resolve(__dirname, '../../../../scripts/functional_test_runner.js');
const CONFIG = resolve(__dirname, '../fixtures/with_es_archiver/config.js');

describe('single test that uses esArchiver', function () {
  this.timeout(60 * 1000);

  let log;
  const cleanupWork = [];

  before(async () => {
    log = createToolingLog('verbose', process.stdout);
    log.indent(6);

    const config = await readConfigFile(log, CONFIG);

    log.info('starting elasticsearch');
    log.indent(2);
    const es = await startupEs({
      log,
      port: config.get('servers.elasticsearch.port'),
      fresh: false
    });
    log.indent(-2);

    log.info('starting kibana');
    log.indent(2);
    const kibana = await startupKibana({
      port: config.get('servers.kibana.port'),
      esUrl: formatUrl(config.get('servers.elasticsearch'))
    });
    log.indent(-2);

    cleanupWork.push(() => es.shutdown());
    cleanupWork.push(() => kibana.close());
  });

  it('test', async () => {
    const proc = spawn(process.execPath, [SCRIPT, '--config', CONFIG], {
      stdio: ['ignore', 'pipe', 'ignore']
    });

    const concatChunks = (acc, chunk) => `${acc}${chunk}`;
    const concatStdout = proc.stdout.pipe(createReduceStream(concatChunks));

    const [stdout] = await Promise.all([
      new Promise((resolve, reject) => {
        concatStdout.on('error', reject);
        concatStdout.on('data', resolve); // reduce streams produce a single value, no need to wait for anything else
      }),

      new Promise((resolve, reject) => {
        proc.on('error', reject);
        proc.on('close', resolve);
      })
    ]);

    log.debug(stdout.toString('utf8'));
  });

  after(() => {
    return Promise.all(cleanupWork.splice(0).map(fn => fn()));
  });
});
43 src/functional_test_runner/__tests__/lib/es.js Normal file
@@ -0,0 +1,43 @@
import { resolve } from 'path';

import { once, merge } from 'lodash';
import libesvm from 'libesvm';

const VERSION = 'master';
const DIRECTORY = resolve(__dirname, '../../../../esvm/functional_test_runner_tests');

const createCluster = (options = {}) => {
  return libesvm.createCluster(merge({
    directory: DIRECTORY,
    branch: VERSION,
  }, options));
};

const install = once(async (fresh) => {
  await createCluster({ fresh }).install();
});

export async function startupEs(opts) {
  const {
    port,
    log,
    fresh = true
  } = opts;

  await install({ fresh });
  const cluster = createCluster({
    config: {
      http: {
        port
      }
    }
  });

  cluster.on('log', (event) => {
    const method = event.level.toLowerCase() === 'info' ? 'verbose' : 'debug';
    log[method](`${event.level}: ${event.type} - ${event.message}`);
  });

  await cluster.start();
  return cluster;
}
2 src/functional_test_runner/__tests__/lib/index.js Normal file
@@ -0,0 +1,2 @@
export { startupEs } from './es';
export { startupKibana } from './kibana';
25 src/functional_test_runner/__tests__/lib/kibana.js Normal file
@@ -0,0 +1,25 @@
import { resolve } from 'path';

import { createServer } from '../../../../test/utils/kbn_server';

export async function startupKibana({ port, esUrl }) {
  const server = createServer({
    server: {
      port,
      autoListen: true,
    },

    plugins: {
      scanDirs: [
        resolve(__dirname, '../../../core_plugins')
      ],
    },

    elasticsearch: {
      url: esUrl
    }
  });

  await server.ready();
  return server;
}
82 src/functional_test_runner/cli.js Normal file
@@ -0,0 +1,82 @@
import { resolve } from 'path';

import { Command } from 'commander';

import { createToolingLog } from '../utils';
import { createFunctionalTestRunner } from './functional_test_runner';

const cmd = new Command('node scripts/functional_test_runner');

const resolveConfigPath = v => resolve(process.cwd(), v);
const defaultConfigPath = resolveConfigPath('test/functional/config.js');

cmd
  .option('--config [path]', 'Path to a config file', resolveConfigPath, defaultConfigPath)
  .option('--bail', 'stop tests after the first failure', false)
  .option('--grep <pattern>', 'pattern used to select which tests to run')
  .option('--verbose', 'Log everything', false)
  .option('--quiet', 'Only log errors', false)
  .option('--silent', 'Log nothing', false)
  .option('--debug', 'Run in debug mode', false)
  .parse(process.argv);

if (!cmd.config) {
  console.log('');
  console.log('  --config is a required parameter');
  console.log('');
  process.exit(1);
}

let logLevel = 'info';
if (cmd.silent) logLevel = 'silent';
if (cmd.quiet) logLevel = 'error';
if (cmd.debug) logLevel = 'debug';
if (cmd.verbose) logLevel = 'verbose';

const log = createToolingLog(logLevel);
log.pipe(process.stdout);

const functionalTestRunner = createFunctionalTestRunner({
  log,
  configFile: cmd.config,
  configOverrides: {
    mochaOpts: {
      bail: cmd.bail,
      grep: cmd.grep,
    }
  }
});

async function run() {
  try {
    const failureCount = await functionalTestRunner.run();
    process.exitCode = failureCount ? 1 : 0;
  } catch (err) {
    await teardown(err);
  } finally {
    await teardown();
  }
}

let teardownRun = false;
async function teardown(err) {
  if (teardownRun) return;

  teardownRun = true;
  if (err) {
    log.indent(-log.indent());
    log.error(err);
    process.exitCode = 1;
  }

  try {
    await functionalTestRunner.close();
  } finally {
    process.exit();
  }
}

process.on('unhandledRejection', err => teardown(err));
process.on('SIGTERM', () => teardown());
process.on('SIGINT', () => teardown());
run();
64 src/functional_test_runner/functional_test_runner.js Normal file
@@ -0,0 +1,64 @@
import {
  createLifecycle,
  readConfigFile,
  createProviderCollection,
  setupMocha,
  runTests,
} from './lib';

export function createFunctionalTestRunner({ log, configFile, configOverrides }) {
  const lifecycle = createLifecycle();

  lifecycle.on('phaseStart', name => {
    log.verbose('starting %j lifecycle phase', name);
  });

  lifecycle.on('phaseEnd', name => {
    log.verbose('ending %j lifecycle phase', name);
  });

  class FunctionalTestRunner {
    async run() {
      let runErrorOccurred = false;

      try {
        const config = await readConfigFile(log, configFile, configOverrides);
        log.info('Config loaded');

        const providers = createProviderCollection(lifecycle, log, config);
        await providers.loadAll();

        const mocha = await setupMocha(lifecycle, log, config, providers);
        await lifecycle.trigger('beforeTests');
        log.info('Starting tests');
        return await runTests(lifecycle, log, mocha);

      } catch (runError) {
        runErrorOccurred = true;
        throw runError;

      } finally {
        try {
          await this.close();

        } catch (closeError) {
          if (runErrorOccurred) {
            log.error('failed to close functional_test_runner');
            log.error(closeError);
          } else {
            throw closeError;
          }
        }
      }
    }

    async close() {
      if (this._closed) return;

      this._closed = true;
      await lifecycle.trigger('cleanup');
    }
  }

  return new FunctionalTestRunner();
}
1 src/functional_test_runner/index.js Normal file
@@ -0,0 +1 @@
export { createFunctionalTestRunner } from './functional_test_runner';
64 src/functional_test_runner/lib/config/config.js Normal file
@@ -0,0 +1,64 @@
import { get, has, cloneDeep } from 'lodash';
import toPath from 'lodash/internal/toPath';

import { schema } from './schema';

const $values = Symbol('values');

export class Config {
  constructor(settings) {
    const { error, value } = schema.validate(settings);
    if (error) throw error;
    this[$values] = value;
  }

  has(key) {
    function recursiveHasCheck(path, values, schema) {
      if (!schema._inner) return false;

      // normalize child and pattern checks so we can iterate the checks in a single loop
      const checks = [].concat(
        // match children first, they have priority
        (schema._inner.children || []).map(child => ({
          test: key => child.key === key,
          schema: child.schema
        })),
        // match patterns on any key that doesn't match an explicit child
        (schema._inner.patterns || []).map(pattern => ({
          test: key => pattern.regex.test(key) && has(values, key),
          schema: pattern.rule
        }))
      );

      for (const check of checks) {
        if (!check.test(path[0])) {
          continue;
        }

        if (path.length > 1) {
          return recursiveHasCheck(path.slice(1), get(values, path[0]), check.schema);
        }

        return true;
      }

      return false;
    }

    const path = toPath(key);
    if (!path.length) return true;
    return recursiveHasCheck(path, this[$values], schema);
  }

  get(key, defaultValue) {
    if (!this.has(key)) {
      throw new Error(`Unknown config key "${key}"`);
    }

    return cloneDeep(get(this[$values], key, defaultValue), (v) => {
      if (typeof v === 'function') {
        return v;
      }
    });
  }
}
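Not part of the diff: a minimal usage sketch of the Config class above. The settings object, key names, and port value are illustrative only, chosen to match the schema that ships with this change.

import { Config } from './config';

// build a Config from a hypothetical settings object that satisfies the schema
const config = new Config({
  testFiles: ['/abs/path/to/tests.js'],
  servers: { kibana: { port: 5601 } }
});

config.has('servers.kibana.port');     // true, the path exists in the schema
config.get('servers.kibana.port');     // 5601
config.get('servers.kibana.protocol'); // undefined, valid key with no value set
config.get('not.in.schema');           // throws: Unknown config key "not.in.schema"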
28 src/functional_test_runner/lib/config/create_config.js Normal file
@@ -0,0 +1,28 @@
import { defaultsDeep } from 'lodash';

import { Config } from './config';

export async function readConfigFile(log, configFile, settingOverrides) {
  log.debug('Loading config file from %j', configFile);

  const configModule = require(configFile);
  const configProvider = configModule.__esModule
    ? configModule.default
    : configModule;

  const settings = defaultsDeep(
    {},
    settingOverrides,
    await configProvider({
      log,

      // give a version of the readConfigFile function to
      // the config file that already has the logger bound
      readConfigFile: async (...args) => (
        await readConfigFile(log, ...args)
      )
    })
  );

  return new Config(settings);
}
1 src/functional_test_runner/lib/config/index.js Normal file
@@ -0,0 +1 @@
export { readConfigFile } from './create_config';
81 src/functional_test_runner/lib/config/schema.js Normal file
@@ -0,0 +1,81 @@
import Joi from 'joi';

import { ConsoleReporterProvider } from '../reporters';

// valid pattern for an ID
// enforces consistent identifiers for services, page objects, users, and apps
const ID_PATTERN = /^[a-zA-Z0-9_]+$/;
const INSPECTING = process.execArgv.includes('--inspect');

const urlPartsSchema = () => Joi.object().keys({
  protocol: Joi.string().valid('http', 'https'),
  hostname: Joi.string().hostname(),
  port: Joi.number(),
  auth: Joi.string().regex(/^[^:]+:.+$/, 'username and password separated by a colon'),
  username: Joi.string(),
  password: Joi.string(),
  pathname: Joi.string().regex(/^\//, 'start with a /'),
  hash: Joi.string().regex(/^\//, 'start with a /')
}).default();

export const schema = Joi.object().keys({
  testFiles: Joi.array().items(Joi.string()).required(),

  services: Joi.object().pattern(
    ID_PATTERN,
    Joi.func().required()
  ).default(),

  pageObjects: Joi.object().pattern(
    ID_PATTERN,
    Joi.func().required()
  ).default(),

  timeouts: Joi.object().keys({
    find: Joi.number().default(10000),
    try: Joi.number().default(40000),
    test: Joi.number().default(INSPECTING ? Infinity : 120000),
    esRequestTimeout: Joi.number().default(30000),
    kibanaStabilize: Joi.number().default(15000),
    navigateStatusPageCheck: Joi.number().default(250),
  }).default(),

  mochaOpts: Joi.object().keys({
    bail: Joi.boolean().default(false),
    grep: Joi.string(),
    slow: Joi.number().default(30000),
    timeout: Joi.number().default(60000),
    ui: Joi.string().default('bdd'),
    reporterProvider: Joi.func().default(ConsoleReporterProvider),
  }).default(),

  users: Joi.object().pattern(
    ID_PATTERN,
    Joi.object().keys({
      username: Joi.string().required(),
      password: Joi.string().required(),
    }).required()
  ),

  servers: Joi.object().keys({
    webdriver: urlPartsSchema(),
    kibana: urlPartsSchema(),
    elasticsearch: urlPartsSchema(),
  }).default(),

  // definition of apps that work with `common.navigateToApp()`
  apps: Joi.object().pattern(
    ID_PATTERN,
    urlPartsSchema()
  ).default(),

  // settings for the esArchiver module
  esArchiver: Joi.object().keys({
    directory: Joi.string().required()
  }),

  // settings for the screenshots module
  screenshots: Joi.object().keys({
    directory: Joi.string().required()
  }),
}).default();
31 src/functional_test_runner/lib/create_provider_collection.js Normal file
@@ -0,0 +1,31 @@
import {
  ProviderCollection,
  readProviderSpec
} from './providers';

/**
 * Create a ProviderCollection that includes the Service
 * providers and PageObject providers from config, as well
 * as providers for the default services: lifecycle, log, and
 * config
 *
 * @param {Lifecycle} lifecycle
 * @param {ToolingLog} log
 * @param {Config} config
 * @return {ProviderCollection}
 */
export function createProviderCollection(lifecycle, log, config) {
  return new ProviderCollection([
    ...readProviderSpec('Service', {
      // base level services that functional_test_runner exposes
      lifecycle: () => lifecycle,
      log: () => log,
      config: () => config,

      ...config.get('services'),
    }),
    ...readProviderSpec('PageObject', {
      ...config.get('pageObjects')
    })
  ]);
}
74 src/functional_test_runner/lib/describe_nesting_validator.js Normal file
@@ -0,0 +1,74 @@
/**
 * Creates an object that enables us to intercept all calls to mocha
 * interface functions `describe()`, `before()`, etc. and ensure that:
 *
 *  - all calls are made within a `describe()`
 *  - there is only one top-level `describe()`
 *
 * To do this we create a proxy to another object, `context`. Mocha
 * interfaces will assign all of their exposed methods on this Proxy
 * which will wrap all functions with checks for the above rules.
 *
 * @return {any} the context that mocha-ui interfaces will assign to
 */
export function createDescribeNestingValidator(context) {
  let describeCount = 0;
  let describeLevel = 0;

  function createContextProxy() {
    return new Proxy(context, {
      set(target, property, value) {
        return Reflect.set(target, property, wrapContextAssignmentValue(property, value));
      }
    });
  }

  function wrapContextAssignmentValue(name, value) {
    if (typeof value !== 'function') {
      return value;
    }

    if (name === 'describe') {
      return createDescribeProxy(value);
    }

    return createNonDescribeProxy(name, value);
  }

  function createDescribeProxy(describe) {
    return new Proxy(describe, {
      apply(target, thisArg, args) {
        try {
          if (describeCount > 0 && describeLevel === 0) {
            throw new Error(`
              Test files must only define a single top-level suite. Please ensure that
              all calls to \`describe()\` are within a single \`describe()\` call in this file.
            `);
          }

          describeCount += 1;
          describeLevel += 1;
          return Reflect.apply(describe, thisArg, args);
        } finally {
          describeLevel -= 1;
        }
      }
    });
  }

  function createNonDescribeProxy(name, nonDescribe) {
    return new Proxy(nonDescribe, {
      apply(target, thisArg, args) {
        if (describeCount === 0) {
          throw new Error(`
            All ${name}() calls in test files must be within a describe() call.
          `);
        }

        return Reflect.apply(nonDescribe, thisArg, args);
      }
    });
  }

  return createContextProxy();
}
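Not part of the diff: an illustrative sketch of what the validator above allows and rejects inside a test file loaded by the runner (the suite and test names are made up; the describe/it globals come from the wrapped mocha context).

describe('suite', () => {
  describe('nested', () => {});   // fine: nesting inside the single top-level suite
  it('works', () => {});          // fine: inside a describe()
});

// Each of the following would throw if uncommented:
// it('outside any suite', () => {});       // all it() calls must be within a describe()
// describe('second top-level', () => {});  // only one top-level describe() per file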
5 src/functional_test_runner/lib/index.js Normal file
@@ -0,0 +1,5 @@
export { createLifecycle } from './lifecycle';
export { readConfigFile } from './config';
export { createProviderCollection } from './create_provider_collection';
export { setupMocha } from './setup_mocha';
export { runTests } from './run_tests';
42 src/functional_test_runner/lib/lifecycle.js Normal file
@@ -0,0 +1,42 @@
export function createLifecycle() {
  const listeners = {
    beforeLoadTests: [],
    beforeTests: [],
    beforeEachTest: [],
    cleanup: [],
    phaseStart: [],
    phaseEnd: [],
  };

  class Lifecycle {
    on(name, fn) {
      if (!listeners[name]) {
        throw new TypeError(`invalid lifecycle event "${name}"`);
      }

      listeners[name].push(fn);
    }

    async trigger(name, ...args) {
      if (!listeners[name]) {
        throw new TypeError(`invalid lifecycle event "${name}"`);
      }

      try {
        if (name !== 'phaseStart' && name !== 'phaseEnd') {
          await this.trigger('phaseStart', name);
        }

        await Promise.all(listeners[name].map(
          async fn => await fn(...args)
        ));
      } finally {
        if (name !== 'phaseStart' && name !== 'phaseEnd') {
          await this.trigger('phaseEnd', name);
        }
      }
    }
  }

  return new Lifecycle();
}
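Not part of the diff: a short usage sketch of the lifecycle object returned above. The handler bodies are placeholders; only the event names come from the listeners map in the source.

import { createLifecycle } from './lifecycle';

const lifecycle = createLifecycle();

lifecycle.on('beforeTests', async () => {
  // e.g. start services before any test runs
});

lifecycle.on('cleanup', async () => {
  // e.g. shut down anything started earlier
});

async function example() {
  // trigger() awaits every registered listener and wraps the
  // phase in matching phaseStart/phaseEnd events
  await lifecycle.trigger('beforeTests');
  await lifecycle.trigger('cleanup');
}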
60 src/functional_test_runner/lib/load_test_files.js Normal file
@@ -0,0 +1,60 @@
import { isAbsolute } from 'path';

import { loadTracer } from './load_tracer';
import { createDescribeNestingValidator } from './describe_nesting_validator';

/**
 * Load an array of test files into a mocha instance
 *
 * @param {Mocha} mocha
 * @param {ToolingLog} log
 * @param {ProviderCollection} providers
 * @param {Array<String>} paths
 * @return {undefined} - mutates mocha, no return value
 */
export const loadTestFiles = (mocha, log, providers, paths) => {
  const innerLoadTestFile = (path) => {
    if (typeof path !== 'string' || !isAbsolute(path)) {
      throw new TypeError('loadTestFile() only accepts absolute paths');
    }

    loadTracer(path, `testFile[${path}]`, () => {
      log.verbose('Loading test file %s', path);

      const testModule = require(path);
      const testProvider = testModule.__esModule
        ? testModule.default
        : testModule;

      runTestProvider(testProvider, path); // eslint-disable-line
    });
  };

  const runTestProvider = (provider, path) => {
    if (typeof provider !== 'function') {
      throw new Error(`Default export of test files must be a function, got ${provider}`);
    }

    loadTracer(provider, `testProvider[${path}]`, () => {
      // mocha.suite hocus-pocus comes from: https://git.io/vDnXO

      mocha.suite.emit('pre-require', createDescribeNestingValidator(global), path, mocha);

      const returnVal = provider({
        loadTestFile: innerLoadTestFile,
        getService: providers.getService,
        getPageObject: providers.getPageObject,
        getPageObjects: providers.getPageObjects,
      });

      if (returnVal && typeof returnVal.then === 'function') {
        throw new TypeError('Default export of test files must not be an async function');
      }

      mocha.suite.emit('require', returnVal, path, mocha);
      mocha.suite.emit('post-require', global, path, mocha);
    });
  };

  paths.forEach(innerLoadTestFile);
};
46 src/functional_test_runner/lib/load_tracer.js Normal file
@@ -0,0 +1,46 @@
const globalLoadPath = [];
function getPath(startAt = 0) {
  return globalLoadPath
    .slice(startAt)
    .map(step => step.description)
    .join(' -> ');
}

function addPathToMessage(message, startAt) {
  const path = getPath(startAt);
  if (!path) return message;
  return `${message} -- from ${path}`;
}

/**
 * Trace the path followed as dependencies are loaded and
 * check for circular dependencies at each step
 *
 * @param {Any} ident identity of this load step, === compared
 *                    to identities of previous steps to find circles
 * @param {String} description description of this step
 * @param {Function} load function that executes this step
 * @return {Any} the value produced by load()
 */
export function loadTracer(ident, description, load) {
  const isCircular = globalLoadPath.find(step => step.ident === ident);
  if (isCircular) {
    throw new Error(addPathToMessage(`Circular reference to "${description}"`));
  }

  try {
    globalLoadPath.unshift({ ident, description });
    return load();
  } catch (err) {
    if (err.__fromLoadTracer) {
      throw err;
    }

    const wrapped = new Error(addPathToMessage(`Failure to load ${description}`, 1));
    wrapped.stack = `${wrapped.message}\n\n Original Error: ${err.stack}`;
    wrapped.__fromLoadTracer = true;
    throw wrapped;
  } finally {
    globalLoadPath.shift();
  }
}
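Not part of the diff: a hypothetical nesting that shows how loadTracer reports a circular load. The identities and descriptions are made up for illustration.

import { loadTracer } from './load_tracer';

const moduleA = {};
const moduleB = {};

try {
  loadTracer(moduleA, 'Service(a)', () => {
    loadTracer(moduleB, 'Service(b)', () => {
      // "a" is still on the load path, so this inner call throws
      loadTracer(moduleA, 'Service(a)', () => {});
    });
  });
} catch (err) {
  // the message names the circular reference and includes the load path
  console.log(err.message);
}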
99 src/functional_test_runner/lib/providers/async_instance.js Normal file
@@ -0,0 +1,99 @@
const createdInstanceProxies = new WeakSet();

export const isAsyncInstance = val => (
  createdInstanceProxies.has(val)
);

export const createAsyncInstance = (type, name, promiseForValue) => {
  let finalValue;

  const initPromise = promiseForValue.then(v => finalValue = v);
  const initFn = () => initPromise;

  const assertReady = desc => {
    if (!finalValue) {
      throw new Error(`
        ${type} \`${desc}\` is loaded asynchronously but isn't available yet. Either await the
        promise returned from ${name}.init(), or move this access into a test hook
        like \`before()\` or \`beforeEach()\`.
      `);
    }
  };

  const proxy = new Proxy({}, {
    apply(target, context, args) {
      assertReady(`${name}()`);
      return Reflect.apply(finalValue, context, args);
    },

    construct(target, args, newTarget) {
      assertReady(`new ${name}()`);
      return Reflect.construct(finalValue, args, newTarget);
    },

    defineProperty(target, prop, descriptor) {
      assertReady(`${name}.${prop}`);
      return Reflect.defineProperty(finalValue, prop, descriptor);
    },

    deleteProperty(target, prop) {
      assertReady(`${name}.${prop}`);
      return Reflect.deleteProperty(finalValue, prop);
    },

    get(target, prop, receiver) {
      if (prop === 'init') return initFn;

      assertReady(`${name}.${prop}`);
      return Reflect.get(finalValue, prop, receiver);
    },

    getOwnPropertyDescriptor(target, prop) {
      assertReady(`${name}.${prop}`);
      return Reflect.getOwnPropertyDescriptor(finalValue, prop);
    },

    getPrototypeOf() {
      assertReady(`${name}`);
      return Reflect.getPrototypeOf(finalValue);
    },

    has(target, prop) {
      if (prop === 'init') return true;

      assertReady(`${name}.${prop}`);
      return Reflect.has(finalValue, prop);
    },

    isExtensible() {
      assertReady(`${name}`);
      return Reflect.isExtensible(finalValue);
    },

    ownKeys() {
      assertReady(`${name}`);
      return Reflect.ownKeys(finalValue);
    },

    preventExtensions() {
      assertReady(`${name}`);
      return Reflect.preventExtensions(finalValue);
    },

    set(target, prop, value, receiver) {
      assertReady(`${name}.${prop}`);
      return Reflect.set(finalValue, prop, value, receiver);
    },

    setPrototypeOf(target, prototype) {
      assertReady(`${name}`);
      return Reflect.setPrototypeOf(finalValue, prototype);
    }
  });

  // add the created proxy to the WeakSet so we can
  // check for it later in `isAsyncInstance()`
  createdInstanceProxies.add(proxy);

  return proxy;
};
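Not part of the diff: a minimal sketch of how a promise-returning provider behaves once wrapped by createAsyncInstance. The 'es' name and the ping() method are hypothetical stand-ins for a real async service.

import { createAsyncInstance } from './async_instance';

const instance = createAsyncInstance('Service', 'es', Promise.resolve({ ping: () => 'pong' }));

// touching the proxy before init() resolves throws the "loaded asynchronously" error:
// instance.ping();

async function example() {
  await instance.init();  // wait for the wrapped promise to settle
  return instance.ping(); // now proxied straight through to the real value, returns 'pong'
}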
2 src/functional_test_runner/lib/providers/index.js Normal file
@@ -0,0 +1,2 @@
export { ProviderCollection } from './provider_collection';
export { readProviderSpec } from './read_provider_spec';
@@ -0,0 +1,84 @@
import { loadTracer } from '../load_tracer';
import { createAsyncInstance, isAsyncInstance } from './async_instance';

export class ProviderCollection {
  constructor(providers) {
    this._instances = new Map();
    this._providers = providers;
  }

  getService = name => (
    this._getInstance('Service', name)
  )

  getPageObject = name => (
    this._getInstance('PageObject', name)
  )

  getPageObjects = names => {
    const pageObjects = {};
    names.forEach(name => pageObjects[name] = this.getPageObject(name));
    return pageObjects;
  }

  loadExternalService(name, provider) {
    return this._getInstance('Service', name, provider);
  }

  async loadAll() {
    const asyncInitErrors = [];
    await Promise.all(
      this._providers.map(async ({ type, name }) => {
        try {
          const instance = this._getInstance(type, name);
          if (isAsyncInstance(instance)) {
            await instance.init();
          }
        } catch (err) {
          asyncInitErrors.push(err);
        }
      })
    );

    if (asyncInitErrors.length) {
      // just throw the first, it probably caused the others and if not they
      // will show up once we fix the first, but creating an AggregateError or
      // something seems like overkill
      throw asyncInitErrors[0];
    }
  }

  _getProvider(type, name) {
    const providerDef = this._providers.find(p => p.type === type && p.name === name);
    if (!providerDef) {
      throw new Error(`Unknown ${type} "${name}"`);
    }
    return providerDef.fn;
  }

  _getInstance(type, name, provider = this._getProvider(type, name)) {
    const instances = this._instances;

    return loadTracer(provider, `${type}(${name})`, () => {
      if (!provider) {
        throw new Error(`Unknown ${type} "${name}"`);
      }

      if (!instances.has(provider)) {
        let instance = provider({
          getService: this.getService,
          getPageObject: this.getPageObject,
          getPageObjects: this.getPageObjects,
        });

        if (instance && typeof instance.then === 'function') {
          instance = createAsyncInstance(type, name, instance);
        }

        instances.set(provider, instance);
      }

      return instances.get(provider);
    });
  }
}
@@ -0,0 +1,9 @@
export function readProviderSpec(type, providers) {
  return Object.keys(providers).map(name => {
    return {
      type,
      name,
      fn: providers[name],
    };
  });
}
@@ -0,0 +1,19 @@
import { brightBlack, green, yellow, red, brightWhite, brightCyan } from 'ansicolors';

export const suite = brightWhite;
export const pending = brightCyan;
export const pass = green;
export const fail = red;

export function speed(name, txt) {
  switch (name) {
    case 'fast':
      return green(txt);
    case 'medium':
      return yellow(txt);
    case 'slow':
      return red(txt);
    default:
      return brightBlack(txt);
  }
}
@@ -0,0 +1,122 @@
import { format } from 'util';

import Mocha from 'mocha';

import * as colors from './colors';
import * as symbols from './symbols';
import { ms } from './ms';
import { writeEpilogue } from './write_epilogue';

export function ConsoleReporterProvider({ getService }) {
  const log = getService('log');

  return class MochaReporter extends Mocha.reporters.Base {
    constructor(runner) {
      super(runner);
      runner.on('start', this.onStart);
      runner.on('hook', this.onHookStart);
      runner.on('hook end', this.onHookEnd);
      runner.on('test', this.onTestStart);
      runner.on('suite', this.onSuiteStart);
      runner.on('pending', this.onPending);
      runner.on('pass', this.onPass);
      runner.on('fail', this.onFail);
      runner.on('test end', this.onTestEnd);
      runner.on('suite end', this.onSuiteEnd);
      runner.on('end', this.onEnd);
    }

    onStart = () => {
      log.write('');
    }

    onHookStart = hook => {
      log.write('-> ' + colors.suite(hook.title));
      log.indent(2);
    }

    onHookEnd = () => {
      log.indent(-2);
    }

    onSuiteStart = suite => {
      if (!suite.root) {
        log.write('-: ' + colors.suite(suite.title));
      }

      log.indent(2);
    }

    onSuiteEnd = () => {
      if (log.indent(-2) === '') {
        log.write();
      }
    }

    onTestStart = test => {
      log.write(`-> ${test.title}`);
      log.indent(2);
    }

    onTestEnd = () => {
      log.indent(-2);
    }

    onPending = test => {
      log.write('-> ' + colors.pending(test.title));
    }

    onPass = test => {

      let time = '';
      if (test.speed !== 'fast') {
        time = colors.speed(test.speed, ` (${ms(test.duration)})`);
      }

      const pass = colors.pass(`${symbols.ok} pass`);
      log.write(`- ${pass} ${time}`);
    }

    onFail = test => {
      // NOTE: this is super gross
      //
      //  - I started by trying to extract the Base.list() logic from mocha
      //    but it's a lot more complicated than it looks, and this is horrible anyway.
      //  - In order to fix the numbering and indentation we monkey-patch
      //    console.log and parse the logged output.
      //
      let output = '';
      const realLog = console.log;
      console.log = (...args) => output += `${format(...args)}\n`;
      try {
        Mocha.reporters.Base.list([test]);
      } finally {
        console.log = realLog;
      }

      log.indent(-2);
      log.write(
        `- ${symbols.err} ` +
        colors.fail(`fail: "${test.fullTitle()}"`) +
        '\n' +
        output
          .split('\n')
          .slice(2) // drop the first two lines, (empty + test title)
          .map(line => {
            // move leading colors behind leading spaces
            return line.replace(/^((?:\[.+m)+)(\s+)/, '$2$1');
          })
          .map(line => {
            // shrink mocha's indentation
            return line.replace(/^\s{5,5}/, ' ');
          })
          .join('\n')
      );
      log.indent(2);
    }

    onEnd = () => {
      writeEpilogue(log, this.stats);
    }
  };
}
@@ -0,0 +1 @@
export { ConsoleReporterProvider } from './console_reporter';
@@ -0,0 +1,28 @@
import moment from 'moment';

/**
 * Format a milliseconds value as a string
 *
 * @param {number} val
 * @return {string}
 */
export function ms(val) {
  const duration = moment.duration(val);
  if (duration.days() >= 1) {
    return duration.days().toFixed(1) + 'd';
  }

  if (duration.hours() >= 1) {
    return duration.hours().toFixed(1) + 'h';
  }

  if (duration.minutes() >= 1) {
    return duration.minutes().toFixed(1) + 'm';
  }

  if (duration.seconds() >= 1) {
    return duration.as('seconds').toFixed(1) + 's';
  }

  return val + 'ms';
}
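Not part of the diff: the formatting thresholds of ms() above, shown with illustrative values.

import { ms } from './ms';

ms(250);                 // '250ms', below one second the raw value is returned
ms(2500);                // '2.5s', seconds use duration.as('seconds')
ms(90 * 1000);           // '1.0m'
ms(2 * 60 * 60 * 1000);  // '2.0h'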
@@ -0,0 +1,9 @@
// originally extracted from mocha https://git.io/v1PGh

export const ok = process.platform === 'win32'
  ? '\u221A'
  : '✓';

export const err = process.platform === 'win32'
  ? '\u00D7'
  : '✖';
Some files were not shown because too many files have changed in this diff.