mirror of
https://github.com/elastic/kibana.git
synced 2025-04-23 09:19:04 -04:00
[File upload] New plugin: file upload (#36404)

* Add file upload x-pack plugin
* Clean up
* Remove unneeded cluster config
* Remove unneeded test
* First pass basic telemetry (not connected)
* Basic telemetry connected
* Review feedback
* Revise telemetry to use savedObjectRepository. Capture metrics on app and file types
* Lots of cleanup, consolidation of logic
* Clean up, reorg
* Update telem tests and telem functions
* Add back import data model
* Clean up and update telemetry tests
* Fix telemetry test issues and update corresponding code
* Up chunk limit to 30 MB
* Add file upload telemetry to saved objects management builder
* Missing space
* Add descriptive comments to dynamic keys in telemetry fields
* [Maps] [File upload] Geojson upload (#36410)
* Client side basics
* File added and default named correctly
* Connect transient layer removal to file import component
* Simplify transient layer removal
* Move file import ui over to new file upload plugin and make more generic
* Add post-processing option to file upload. Make component json-specific
* Add flag for source indexing
* Revise import layer workflow to be separate from add layer workflow
* Differentiate between normal sources and import sources. Add back layer add/next button
* Update indexing boolean in component after file upload & parse
* First pass rough indexing from maps working on specific use cases
* Update parsing logic to handle geojson formats
* Index pattern added following index creation
* Pass onsuccess callback to file_upload to add layer to maps app
* Handle multipolygon type in es geo utils
* Add functionality to make es layer permanent and switch to edit panel
* Add to index if exists or create new
* Make plugin API more intuitive. Set up to handle remove action
* Pass transient removal call through to file import
* Clean up layer viewing logic for temp and perm layers
* Remove change source from import screen
* Add option to provide mappings array and pass geo_point and geo_shape to array
* Add support for multiple mappings select and index naming
* Match style of import file button to add layer for now
* Remove duplicate case handling resulting from merge
* Move geo processing logic over to file upload plugin for reuse
* Remove old geo_shape formatting from geo_point code
* Set default index data type. Remove unneeded stringify request logic
* Check for custom processor object which contains function
* Move file picker to separate component
* Some cleaning. Add geojson clean & validate code
* Catch file parsing errors and notify user
* Disable index type if valid file not referenced
* Set error messaging on invalid index name used
* Add index pattern checking logic and error handling
* Dynamically populate geo index options
* Set index data type earlier in the workflow. Don't duplicate requests
* Pass back index ready status from plugin and connect to layer next button
* Increase max bytes to ~50MB
* Don't parse files over max size & warn user. Also, remove toasts and warn similarly to other components
* Uploaded file default label: 'fileToImport' -> 'Import File'
* Expand out feature properties for mapping
* Pass through telemetry data to plugin back-end
* Clean up indexing flow. Separate creation of index patterns. Add new index pattern create callbacks
* Pass back info for indexing failures but don't connect to UI yet
* Fix telemetry test issues and update corresponding code
  # Conflicts:
  # x-pack/plugins/file_upload/server/telemetry/telemetry.test.ts
* Add file upload telemetry to saved objects management builder
* Missing space
* Add descriptive comments to dynamic keys in telemetry fields
* Divide up the import layer add workflow on the maps side and tweak to avoid layer color change
* Fix bug affecting file preview replacement not updating coordinate index type
* Remove index pattern. Organize effects. General clean up
* preIndexTransform -> transformDetails
* Update proptypes
* Cleaning, organizing
* Add index name guidelines. Show conditionally
* Add file size, type guidance. Filter on file size, not chunk size. Small tweaks to file/index tips
* Zoom to layer extent on preview of imported data
* Revert "Revise import layer workflow to be separate from add layer workflow" (reverts commit 3b35f5371d)
* Handle import file determination to accommodate add layer flow. Update card wording
* Decompose addLayer component into smaller pieces. Add placeholder import progress component
* Dynamically change footer button text for context
* Move import card to top of sources
* Get basic progress tracking in place
* Allow second layer (indexed layer) to get removed. Reconnect file remove. Small UI tweaks
* Add link to management for further index mods
* Fix i18n failures
* Add file parsing progress indicator & text
* Reset importView on source reset
* Add dynamic chunking to handle fluctuating data sizes common to geo features
* Don't duplicate index request if request in flight
* Modify json upload and import to use JS classes & react class component state
* Transition remaining file_upload components over to class/component structure with state
* Move functions into index_settings class
* Review feedback
* Add fetch with timeout
* Split out import editor into separate component and clean up logic
* Clean up add layer panel/button title logic
* More cleanup
* Tweaks to success and error handling flow
* Handle success/error handling on add/view indexed data
* Jump to indexing complete on error. Handle nothing returned to client app
* Update name/location of source select css file
* Update import source card border color and icon
* Suggest name for index based upon file name
* Add validation to auto-suggested index name/pattern
* Use constants for geo_point and geo_shape in mappings
* Update geojson upload card/description
* Catch-all review feedback
* Fix internationalization syntax errors
* Review feedback
* Get index names and patterns only when needed
* Make addLayer async again to fix zoom to extent issue, waits on just the async syncDataForLayer function (which is async)
* Remove panel description as derived state and shift to function
* Remove geojson fit to extent for now
* Remove unused class selector
* Remove unneeded i18 wrapper for what's already passed in as an i18 element
* Revise import state to be handled via redux. Some json upload args changes
* Review feedback. Some cleanup and bug fixes
* Roll back store actions changes and layer_addpanel changes related to color change
* Follow scss file naming conventions for source_select
* Review feedback
* Restore clobbered layer_control view to master state. Add source_select css
* Update import to use plugin local indexPatternService, not maps'
* Review feedback, mostly i18n. Also add index to scss path
* i18n translation updates
* Assign error message to values rather than error object
* Update getMapColors to filter out transient layer
* Wrap Feature as FeatureCollection in Maps
* Add jest tests for geo processing functions. Add fixes for single feature handling
* i18n
* Review feedback. Test cleanup/fixes
* Update layer add panel footer logic to still show when source not selected
* Fix issue of not recognizing MultiPoint type. Remove throw logic for now
* Update telemetry with newly required placeholder function
* Prevent external modification of nested geojson objects
* i18n translation updates
* Revert "Fix issue of not recognizing MultiPoint type. Remove throw logic for now" (reverts commit d692f913f8)
* Revert "Prevent external modification of nested geojson objects" (reverts commit 0ea9fd3336)
* yarn.lock update
* [File upload] Remove dynamic fields from mappings, code and telemetry test (#38902)
* Remove dynamic fields from mappings, code and telemetry test
* Add file-upload-telemetry to spaces and es archiver test mappings
* Don't create telemetry saved object if none exists, create on first update instead
* Back out es archiver mappings update
* Update zh-CN translations
parent 8eaaf3ca60
commit 1b67eb03b2
59 changed files with 2997 additions and 171 deletions
@@ -29,6 +29,7 @@
   "xpack.code": "x-pack/plugins/code",
   "xpack.crossClusterReplication": "x-pack/plugins/cross_cluster_replication",
   "xpack.dashboardMode": "x-pack/plugins/dashboard_mode",
+  "xpack.fileUpload": "x-pack/plugins/file_upload",
   "xpack.graph": "x-pack/plugins/graph",
   "xpack.grokDebugger": "x-pack/plugins/grokdebugger",
   "xpack.idxMgmt": "x-pack/plugins/index_management",
@@ -39,6 +39,7 @@ import { translations } from './plugins/translations';
 import { upgradeAssistant } from './plugins/upgrade_assistant';
 import { uptime } from './plugins/uptime';
 import { ossTelemetry } from './plugins/oss_telemetry';
+import { fileUpload } from './plugins/file_upload';
 import { telemetry } from './plugins/telemetry';
 import { encryptedSavedObjects } from './plugins/encrypted_saved_objects';
 import { snapshotRestore } from './plugins/snapshot_restore';
@@ -81,6 +82,7 @@ module.exports = function (kibana) {
     upgradeAssistant(kibana),
     uptime(kibana),
     ossTelemetry(kibana),
+    fileUpload(kibana),
     encryptedSavedObjects(kibana),
     snapshotRestore(kibana),
   ];
@@ -223,6 +223,7 @@
   "file-type": "^10.9.0",
   "font-awesome": "4.7.0",
   "formsy-react": "^1.1.5",
+  "geojson-rewind": "^0.3.1",
   "get-port": "4.2.0",
   "getos": "^3.1.0",
   "git-url-parse": "11.1.2",
@@ -249,6 +250,7 @@
   "js-yaml": "3.13.1",
   "json-stable-stringify": "^1.0.1",
   "jsonwebtoken": "^8.3.0",
+  "jsts": "^2.0.4",
   "lodash": "npm:@elastic/lodash@3.10.1-kibana1",
   "lodash.keyby": "^4.6.0",
   "lodash.lowercase": "^4.3.0",
18 x-pack/plugins/file_upload/common/constants/file_import.ts (Normal file)
@@ -0,0 +1,18 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

export const MAX_BYTES = 31457280;

export const MAX_FILE_SIZE = 52428800;

// Value to use in the Elasticsearch index mapping metadata to identify the
// index as having been created by the File Upload Plugin.
export const INDEX_META_DATA_CREATED_BY = 'file-upload-plugin';

export const ES_GEO_FIELD_TYPE = {
  GEO_POINT: 'geo_point',
  GEO_SHAPE: 'geo_shape',
};
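The two magic numbers above are the sizes the commit message refers to: MAX_BYTES is the 30 MB per-request chunk limit and MAX_FILE_SIZE is the ~50 MB upload cap.

```javascript
// The constants are exact power-of-two megabyte counts.
const MAX_BYTES = 31457280;      // per-request chunk limit
const MAX_FILE_SIZE = 52428800;  // max accepted upload size

console.log(MAX_BYTES === 30 * 1024 * 1024);      // true (30 MB)
console.log(MAX_FILE_SIZE === 50 * 1024 * 1024);  // true (50 MB)
```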
33 x-pack/plugins/file_upload/index.js (Normal file)
@@ -0,0 +1,33 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
import { mirrorPluginStatus } from '../../server/lib/mirror_plugin_status';
import { fileUploadRoutes } from './server/routes/file_upload';
import { makeUsageCollector } from './server/telemetry/';
import mappings from './mappings';

export const fileUpload = kibana => {
  return new kibana.Plugin({
    require: ['elasticsearch', 'xpack_main'],
    name: 'file_upload',
    id: 'file_upload',
    uiExports: {
      mappings,
    },
    savedObjectSchemas: {
      'file-upload-telemetry': {
        isNamespaceAgnostic: true
      }
    },

    init(server) {
      const { xpack_main: xpackMainPlugin } = server.plugins;

      mirrorPluginStatus(xpackMainPlugin, this);
      fileUploadRoutes(server);
      makeUsageCollector(server);
    }
  });
};
9 x-pack/plugins/file_upload/mappings.json (Normal file)
@@ -0,0 +1,9 @@
{
  "file-upload-telemetry": {
    "properties": {
      "filesUploadedTotalCount": {
        "type": "long"
      }
    }
  }
}
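The mapping above holds a single counter. The commit message notes the telemetry saved object is not created up front but on first update. A minimal sketch of that behavior, abstracted away from Kibana's saved-object client (the `load`/`save` callbacks are hypothetical stand-ins, not the plugin's API):

```javascript
// Hypothetical "create on first update" counter: `load` resolves to the
// stored attributes or null, `save` persists them. The document only
// comes into existence when the first increment happens.
async function incrementFilesUploaded(load, save, count = 1) {
  const existing = (await load()) || { filesUploadedTotalCount: 0 };
  const updated = {
    filesUploadedTotalCount: existing.filesUploadedTotalCount + count,
  };
  await save(updated);
  return updated;
}
```

Two uploads against an initially empty store leave `filesUploadedTotalCount` at 2, with no document having been created at startup.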
204 x-pack/plugins/file_upload/public/components/index_settings.js (Normal file)
@@ -0,0 +1,204 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React, { Fragment, Component } from 'react';
import { i18n } from '@kbn/i18n';
import {
  EuiFormRow,
  EuiFieldText,
  EuiSpacer,
  EuiSelect,
  EuiCallOut
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import { getExistingIndices, getExistingIndexPatterns }
  from '../util/indexing_service';

export class IndexSettings extends Component {

  state = {
    indexNameError: '',
    indexDisabled: true,
    indexPatterns: null,
    indexNames: null,
    indexName: '',
  };

  componentDidUpdate(prevProps, prevState) {
    const { indexNameError, indexName } = this.state;
    if (prevState.indexNameError !== indexNameError) {
      this.props.setHasIndexErrors(!!indexNameError);
    }
    const { disabled, indexTypes } = this.props;
    const indexDisabled = disabled || !indexTypes || !indexTypes.length;
    if (indexDisabled !== this.state.indexDisabled) {
      this.setState({ indexDisabled });
    }
    if (this.props.indexName !== indexName) {
      this._setIndexName(this.props.indexName);
    }
  }

  async _getIndexNames() {
    if (this.state.indexNames) {
      return this.state.indexNames;
    }
    const indices = await getExistingIndices();
    const indexNames = indices
      ? indices.map(({ name }) => name)
      : [];
    this.setState({ indexNames });
    return indexNames;
  }

  async _getIndexPatterns() {
    if (this.state.indexPatterns) {
      return this.state.indexPatterns;
    }
    const patterns = await getExistingIndexPatterns();
    const indexPatterns = patterns
      ? patterns.map(({ name }) => name)
      : [];
    this.setState({ indexPatterns });
    return indexPatterns;
  }

  _setIndexName = async name => {
    const errorMessage = await this._isIndexNameAndPatternValid(name);
    return this.setState({
      indexName: name,
      indexNameError: errorMessage
    });
  }

  _onIndexChange = async ({ target }) => {
    const name = target.value;
    await this._setIndexName(name);
    this.props.setIndexName(name);
  }

  _isIndexNameAndPatternValid = async name => {
    const indexNames = await this._getIndexNames();
    const indexPatterns = await this._getIndexPatterns();
    if (indexNames.find(i => i === name) || indexPatterns.find(i => i === name)) {
      return (
        <FormattedMessage
          id="xpack.fileUpload.indexSettings.indexNameAlreadyExistsErrorMessage"
          defaultMessage="Index name or pattern already exists."
        />
      );
    }

    const reg = new RegExp('[\\\\/\*\?\"\<\>\|\\s\,\#]+');
    if (
      (name !== name.toLowerCase()) || // name should be lowercase
      (name === '.' || name === '..') || // name can't be . or ..
      name.match(/^[-_+]/) !== null || // name can't start with these chars
      name.match(reg) !== null // name can't contain these chars
    ) {
      return (
        <FormattedMessage
          id="xpack.fileUpload.indexSettings.indexNameContainsIllegalCharactersErrorMessage"
          defaultMessage="Index name contains illegal characters."
        />
      );
    }
    return '';
  }

  render() {
    const { setSelectedIndexType, indexTypes } = this.props;
    const { indexNameError, indexDisabled, indexName } = this.state;

    return (
      <Fragment>
        <EuiSpacer size="m"/>
        <EuiFormRow
          label={
            <FormattedMessage
              id="xpack.fileUpload.indexSettings.enterIndexTypeLabel"
              defaultMessage="Index type"
            />
          }
        >
          <EuiSelect
            disabled={indexDisabled}
            options={indexTypes.map(indexType => ({
              text: indexType,
              value: indexType,
            }))}
            onChange={({ target }) => setSelectedIndexType(target.value)}
          />
        </EuiFormRow>
        <EuiSpacer size="m"/>
        {indexDisabled
          ? null
          : (
            <EuiCallOut
              title={i18n.translate('xpack.fileUpload.indexSettings.indexNameGuidelines',
                { defaultMessage: 'Index name guidelines' })}
              iconType="pin"
            >
              <div>
                <ul>
                  <li>{i18n.translate('xpack.fileUpload.indexSettings.guidelines.mustBeNewIndex',
                    { defaultMessage: 'Must be a new index' })}
                  </li>
                  <li>{i18n.translate('xpack.fileUpload.indexSettings.guidelines.lowercaseOnly',
                    { defaultMessage: 'Lowercase only' })}
                  </li>
                  <li>{i18n.translate('xpack.fileUpload.indexSettings.guidelines.cannotInclude',
                    { defaultMessage: 'Cannot include \\\\, /, *, ?, ", <, >, |, \
                      " " (space character), , (comma), #'
                    })}
                  </li>
                  <li>{i18n.translate('xpack.fileUpload.indexSettings.guidelines.cannotStartWith',
                    { defaultMessage: 'Cannot start with -, _, +' })}
                  </li>
                  <li>{i18n.translate('xpack.fileUpload.indexSettings.guidelines.cannotBe',
                    { defaultMessage: 'Cannot be . or ..' })}
                  </li>
                  <li>{i18n.translate('xpack.fileUpload.indexSettings.guidelines.length',
                    { defaultMessage:
                      'Cannot be longer than 255 bytes (note it is bytes, \
                      so multi-byte characters will count towards the 255 \
                      limit faster)'
                    })}
                  </li>
                </ul>
              </div>
            </EuiCallOut>
          )}
        <EuiSpacer size="s"/>
        <EuiFormRow
          label={
            <FormattedMessage
              id="xpack.fileUpload.indexSettings.enterIndexNameLabel"
              defaultMessage="Index name"
            />
          }
          isInvalid={indexNameError !== ''}
          error={[indexNameError]}
        >
          <EuiFieldText
            disabled={indexDisabled}
            placeholder={i18n.translate('xpack.fileUpload.enterIndexName',
              { defaultMessage: 'Enter Index Name' })}
            value={indexName}
            onChange={this._onIndexChange}
            isInvalid={indexNameError !== ''}
            aria-label={i18n.translate('xpack.fileUpload.indexNameReqField',
              { defaultMessage: 'Index name, required field' })}
          />
        </EuiFormRow>

        <EuiSpacer size="s"/>

      </Fragment>
    );
  }
}
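The naming rules enforced by `_isIndexNameAndPatternValid` (plus the 255-byte limit stated in the guidelines callout) can be expressed as a plain predicate. This is a hypothetical standalone helper for illustration, not part of the plugin; the existing-name check, which needs network calls, is omitted.

```javascript
// Hypothetical standalone version of the index-name rules shown in
// IndexSettings: lowercase only, no reserved names, no bad leading
// character, no forbidden characters, at most 255 bytes.
const ILLEGAL_CHARS = /[\\\/*?"<>|\s,#]+/;

function isValidIndexName(name) {
  if (name !== name.toLowerCase()) return false;           // must be lowercase
  if (name === '.' || name === '..') return false;         // reserved names
  if (/^[-_+]/.test(name)) return false;                   // can't start with -, _, +
  if (ILLEGAL_CHARS.test(name)) return false;              // forbidden characters
  if (Buffer.byteLength(name, 'utf8') > 255) return false; // 255-byte cap
  return true;
}

console.log(isValidIndexName('my-geo-data')); // true
console.log(isValidIndexName('MyIndex'));     // false (uppercase)
console.log(isValidIndexName('_hidden'));     // false (leading underscore)
console.log(isValidIndexName('a b'));         // false (whitespace)
```

Returning a boolean rather than a React element keeps the rules testable; the component layers the translated error messages on top.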
@@ -0,0 +1,171 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */


import React, { Fragment, Component } from 'react';
import { i18n } from '@kbn/i18n';
import {
  EuiCodeBlock,
  EuiSpacer,
  EuiFormRow,
  EuiText,
  EuiProgress,
  EuiFlexItem,
  EuiCallOut,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import chrome from 'ui/chrome';

export class JsonImportProgress extends Component {

  state = {
    indexDataJson: null,
    indexPatternJson: null,
    indexName: '',
    importStage: '',
  };

  componentDidUpdate(prevProps, prevState) {
    this._setIndex(this.props);
    this._formatIndexDataResponse({ ...this.state, ...this.props });
    this._formatIndexPatternResponse({ ...this.state, ...this.props });
    if (prevState.importStage !== this.props.importStage) {
      this.setState({
        importStage: this.props.importStage
      });
    }
  }

  // Retain last index for UI purposes
  _setIndex = ({ indexName }) => {
    if (indexName && !this.state.indexName) {
      this.setState({ indexName });
    }
  }

  // Format json responses
  _formatIndexDataResponse = ({ indexDataResp, indexDataJson }) => {
    if (indexDataResp && !indexDataJson) {
      this.setState({ indexDataJson: JSON.stringify(indexDataResp, null, 2) });
    }
  }

  _formatIndexPatternResponse = ({ indexPatternResp, indexPatternJson }) => {
    if (indexPatternResp && !indexPatternJson) {
      this.setState(
        { indexPatternJson: JSON.stringify(indexPatternResp, null, 2) }
      );
    }
  };

  render() {
    const { complete } = this.props;
    const { indexPatternJson, indexDataJson, indexName, importStage } = this.state;
    const importMessage = complete
      ? importStage
      : `${importStage}: ${indexName}`;

    return (
      <Fragment>
        {!complete ?
          <EuiProgress size="xs" color="accent" position="absolute"/> : null}
        <EuiSpacer size="m"/>
        <EuiFormRow
          label={
            <FormattedMessage
              id="xpack.fileUpload.jsonImport.indexingStatus"
              defaultMessage="Indexing status"
            />
          }
        >
          <EuiText>
            {importMessage}
          </EuiText>
        </EuiFormRow>
        <EuiSpacer size="m"/>
        {complete
          ? (
            <Fragment>
              {
                indexDataJson
                  ? (
                    <EuiFormRow
                      label={
                        <FormattedMessage
                          id="xpack.fileUpload.jsonImport.indexingResponse"
                          defaultMessage="Indexing response"
                        />
                      }
                    >
                      <EuiCodeBlock
                        paddingSize="s"
                        overflowHeight={200}
                      >
                        {indexDataJson}
                      </EuiCodeBlock>
                    </EuiFormRow>
                  )
                  : null
              }
              {
                indexPatternJson
                  ? (
                    <EuiFormRow
                      label={
                        <FormattedMessage
                          id="xpack.fileUpload.jsonImport.indexPatternResponse"
                          defaultMessage="Index pattern response"
                        />
                      }
                    >
                      <EuiCodeBlock
                        paddingSize="s"
                        overflowHeight={200}
                      >
                        {indexPatternJson}
                      </EuiCodeBlock>
                    </EuiFormRow>
                  )
                  : null
              }
              <EuiFormRow>
                <EuiFlexItem>
                  <EuiCallOut
                    title={
                      i18n.translate('xpack.fileUpload.jsonImport.indexModsTitle',
                        { defaultMessage: 'Index modifications' })}
                    iconType="pin"
                  >
                    <div>
                      {
                        i18n.translate('xpack.fileUpload.jsonImport.indexModsMsg',
                          { defaultMessage: 'Further index modifications can be made using\n'
                          })
                      }
                      <a
                        target="_blank"
                        href={`${chrome.getBasePath()}/app/kibana#/
                          management/elasticsearch/index_management/indices/
                          filter/${indexName}`.replace(/\s/g, '')}
                      >
                        {
                          i18n.translate('xpack.fileUpload.jsonImport.indexMgmtLink',
                            { defaultMessage: 'Index Management' })
                        }
                      </a>
                    </div>
                  </EuiCallOut>
                </EuiFlexItem>
              </EuiFormRow>
            </Fragment>
          )
          : null
        }
        <EuiSpacer size="s"/>
      </Fragment>
    );
  }
}
@@ -0,0 +1,221 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */


import React, { Fragment, Component } from 'react';
import {
  EuiFilePicker,
  EuiFormRow,
  EuiSpacer,
  EuiCallOut,
  EuiProgress,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import { i18n } from '@kbn/i18n';
import { parseFile } from '../util/file_parser';
import { MAX_FILE_SIZE } from '../../common/constants/file_import';

const ACCEPTABLE_FILETYPES = [
  'json',
  'geojson',
];

export class JsonIndexFilePicker extends Component {

  state = {
    fileUploadError: '',
    fileParsingProgress: '',
    fileRef: null
  };

  componentDidUpdate(prevProps, prevState) {
    if (prevState.fileRef !== this.props.fileRef) {
      this.setState({ fileRef: this.props.fileRef });
    }
  }

  _fileHandler = async fileList => {
    const {
      resetFileAndIndexSettings, setParsedFile, onFileRemove, onFileUpload,
      transformDetails, setFileRef, setIndexName
    } = this.props;

    const { fileRef } = this.state;

    resetFileAndIndexSettings();
    this.setState({ fileUploadError: '' });
    if (fileList.length === 0) { // Remove
      setParsedFile(null);
      if (onFileRemove) {
        onFileRemove(fileRef);
      }
    } else if (fileList.length === 1) { // Parse & index file
      const file = fileList[0];
      if (!file.name) {
        this.setState({
          fileUploadError: i18n.translate(
            'xpack.fileUpload.jsonIndexFilePicker.noFileNameError',
            { defaultMessage: 'No file name provided' })
        });
        return;
      }

      // Check file type, assign default index name
      const splitNameArr = file.name.split('.');
      const fileType = splitNameArr.pop();
      const types = ACCEPTABLE_FILETYPES.reduce((accu, type) => {
        accu = accu ? `${accu}, ${type}` : type;
        return accu;
      }, '');
      if (!ACCEPTABLE_FILETYPES.includes(fileType)) {
        this.setState({
          fileUploadError: (
            <FormattedMessage
              id="xpack.fileUpload.jsonIndexFilePicker.acceptableTypesError"
              defaultMessage="File is not one of acceptable types: {types}"
              values={{ types }}
            />
          )
        });
        return;
      }
      const initIndexName = splitNameArr[0];
      setIndexName(initIndexName);

      // Check valid size
      const { size } = file;
      if (size > MAX_FILE_SIZE) {
        this.setState({
          fileUploadError: (
            <FormattedMessage
              id="xpack.fileUpload.jsonIndexFilePicker.acceptableFileSize"
              defaultMessage="File size {fileSize} bytes exceeds max file size of {maxFileSize}"
              values={{
                fileSize: size,
                maxFileSize: MAX_FILE_SIZE
              }}
            />
          )
        });
        return;
      }

      // Parse file
      this.setState({ fileParsingProgress: i18n.translate(
        'xpack.fileUpload.jsonIndexFilePicker.parsingFile',
        { defaultMessage: 'Parsing file...' })
      });
      const parsedFileResult = await parseFile(
        file, onFileUpload, transformDetails
      ).catch(err => {
        this.setState({
          fileUploadError: (
            <FormattedMessage
              id="xpack.fileUpload.jsonIndexFilePicker.unableParseFile"
              defaultMessage="Unable to parse file: {error}"
              values={{
                error: err.message
              }}
            />
          )
        });
      });
      this.setState({ fileParsingProgress: '' });
      if (!parsedFileResult) {
        if (fileRef) {
          if (onFileRemove) {
            onFileRemove(fileRef);
          }
          setFileRef(null);
        }
        return;
      }
      setFileRef(file);
      setParsedFile(parsedFileResult);

    }
  }

  render() {
    const { fileParsingProgress, fileUploadError, fileRef } = this.state;

    return (
      <Fragment>
        { fileParsingProgress
          ? <EuiProgress size="xs" color="accent" position="absolute" />
          : null
        }
        {
          fileRef && !fileUploadError
            ? null
            : (
              <EuiCallOut
                title={i18n.translate(
                  'xpack.fileUpload.jsonIndexFilePicker.fileUploadGuidelines',
                  { defaultMessage: 'File upload guidelines' }
                )}
                iconType="pin"
              >
                <div>
                  <ul>
                    <li>
                      {
                        i18n.translate(
                          'xpack.fileUpload.jsonIndexFilePicker.formatsAccepted',
                          { defaultMessage: 'Formats accepted: .json, .geojson' }
                        )
                      }
                    </li>
                    <li>
                      <FormattedMessage
                        id="xpack.fileUpload.jsonIndexFilePicker.maxSize"
                        defaultMessage="Max size: {maxFileSize}"
                        values={{
                          maxFileSize: bytesToSize(MAX_FILE_SIZE)
                        }}
                      />
                    </li>
                  </ul>
                </div>
              </EuiCallOut>
            )
        }
        <EuiSpacer size="m" />
        <EuiFormRow
          label={(
            <FormattedMessage
              id="xpack.fileUpload.jsonIndexFilePicker.filePickerLabel"
              defaultMessage="Select a file to upload"
            />
          )}
          isInvalid={fileUploadError !== ''}
          error={[fileUploadError]}
          helpText={fileParsingProgress}
        >
          <EuiFilePicker
            initialPromptText={(
              <FormattedMessage
                id="xpack.fileUpload.jsonIndexFilePicker.filePicker"
                defaultMessage="Upload file"
              />
            )}
            onChange={this._fileHandler}
          />
        </EuiFormRow>
      </Fragment>
    );
  }
}

function bytesToSize(bytes) {
  const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
  if (bytes === 0) return 'n/a';
  const i = parseInt(Math.floor(Math.log(bytes) / Math.log(1024)), 10);
  if (i === 0) return `${bytes} ${sizes[i]}`; // note: stray ')' in original removed
  return `${(bytes / (1024 ** i)).toFixed(1)} ${sizes[i]}`;
}
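The `bytesToSize` helper can be checked in isolation against the plugin's size constants. A standalone copy for illustration (the committed `i === 0` branch appends a stray closing paren, which this copy omits):

```javascript
// Standalone copy of bytesToSize: human-readable file sizes in steps of 1024.
function bytesToSize(bytes) {
  const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB'];
  if (bytes === 0) return 'n/a';
  // Which power of 1024 the value falls under picks the unit.
  const i = parseInt(Math.floor(Math.log(bytes) / Math.log(1024)), 10);
  if (i === 0) return `${bytes} ${sizes[i]}`;
  return `${(bytes / (1024 ** i)).toFixed(1)} ${sizes[i]}`;
}

console.log(bytesToSize(512));      // 512 Bytes
console.log(bytesToSize(31457280)); // 30.0 MB (MAX_BYTES)
console.log(bytesToSize(52428800)); // 50.0 MB (MAX_FILE_SIZE)
```

So the guidelines callout renders "Max size: 50.0 MB" for MAX_FILE_SIZE.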
@@ -0,0 +1,274 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React, { Component, Fragment } from 'react';
import { i18n } from '@kbn/i18n';
import {
  EuiForm,
} from '@elastic/eui';
import PropTypes from 'prop-types';
import { indexData, createIndexPattern } from '../util/indexing_service';
import { getGeoIndexTypesForFeatures } from '../util/geo_processing';
import { IndexSettings } from './index_settings';
import { JsonIndexFilePicker } from './json_index_file_picker';
import { JsonImportProgress } from './json_import_progress';
import _ from 'lodash';

const INDEXING_STAGE = {
  INDEXING_STARTED: i18n.translate(
    'xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted',
    { defaultMessage: 'Data indexing started' }),
  WRITING_TO_INDEX: i18n.translate(
    'xpack.fileUpload.jsonUploadAndParse.writingToIndex',
    { defaultMessage: 'Writing to index' }),
  CREATING_INDEX_PATTERN: i18n.translate(
    'xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern',
    { defaultMessage: 'Creating index pattern' }),
  INDEX_PATTERN_COMPLETE: i18n.translate(
    'xpack.fileUpload.jsonUploadAndParse.indexPatternComplete',
    { defaultMessage: 'Index pattern complete' }),
  DATA_INDEXING_ERROR: i18n.translate(
    'xpack.fileUpload.jsonUploadAndParse.dataIndexingError',
    { defaultMessage: 'Data indexing error' }),
  INDEX_PATTERN_ERROR: i18n.translate(
    'xpack.fileUpload.jsonUploadAndParse.indexPatternError',
    { defaultMessage: 'Index pattern error' }),
};

export class JsonUploadAndParse extends Component {

  state = {
    // File state
    fileRef: null,
    parsedFile: null,
    indexedFile: null,

    // Index state
    indexTypes: [],
    selectedIndexType: '',
    indexName: '',
    indexRequestInFlight: false,
    indexPatternRequestInFlight: false,
    hasIndexErrors: false,
    isIndexReady: false,

    // Progress-tracking state
    showImportProgress: false,
    currentIndexingStage: INDEXING_STAGE.INDEXING_STARTED,
    indexDataResp: '',
    indexPatternResp: '',
  };

  _resetFileAndIndexSettings = () => {
    this.setState({
      indexTypes: [],
      selectedIndexType: '',
      indexName: '',
      indexedFile: null,
      parsedFile: null,
      fileRef: null,
    });
  };

  componentDidUpdate(prevProps, prevState) {
    if (!_.isEqual(prevState.parsedFile, this.state.parsedFile)) {
      this._setIndexTypes({ ...this.state, ...this.props });
    }
    this._setSelectedType(this.state);
    this._setIndexReady({ ...this.state, ...this.props });
    this._indexData({ ...this.state, ...this.props });
    if (this.props.isIndexingTriggered && !this.state.showImportProgress) {
      this.setState({ showImportProgress: true });
    }
  }

  _setSelectedType = ({ selectedIndexType, indexTypes }) => {
    if (!selectedIndexType && indexTypes.length) {
      this.setState({ selectedIndexType: indexTypes[0] });
    }
  }

  _setIndexReady = ({
    parsedFile, selectedIndexType, indexName, hasIndexErrors,
    indexRequestInFlight, onIndexReady
  }) => {
    const isIndexReady = !!parsedFile && !!selectedIndexType &&
      !!indexName && !hasIndexErrors && !indexRequestInFlight;
    if (isIndexReady !== this.state.isIndexReady) {
      this.setState({ isIndexReady });
      if (onIndexReady) {
        onIndexReady(isIndexReady);
      }
    }
  }

  _indexData = async ({
    indexedFile, parsedFile, indexRequestInFlight, transformDetails,
|
||||
indexName, appName, selectedIndexType, isIndexingTriggered, isIndexReady,
|
||||
onIndexingComplete, boolCreateIndexPattern
|
||||
}) => {
|
||||
// Check index ready
|
||||
const filesAreEqual = _.isEqual(indexedFile, parsedFile);
|
||||
if (!isIndexingTriggered || filesAreEqual || !isIndexReady || indexRequestInFlight) {
|
||||
return;
|
||||
}
|
||||
this.setState({
|
||||
indexRequestInFlight: true,
|
||||
currentIndexingStage: INDEXING_STAGE.WRITING_TO_INDEX
|
||||
});
|
||||
|
||||
// Index data
|
||||
const indexDataResp = await indexData(
|
||||
parsedFile, transformDetails, indexName, selectedIndexType, appName
|
||||
);
|
||||
|
||||
// Index error
|
||||
if (!indexDataResp.success) {
|
||||
this.setState({
|
||||
indexedFile: null,
|
||||
indexDataResp,
|
||||
indexRequestInFlight: false,
|
||||
currentIndexingStage: INDEXING_STAGE.INDEXING_COMPLETE,
|
||||
});
|
||||
this._resetFileAndIndexSettings();
|
||||
if (onIndexingComplete) {
|
||||
onIndexingComplete();
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
// Index data success. Update state & create index pattern
|
||||
this.setState({
|
||||
indexDataResp,
|
||||
indexedFile: parsedFile,
|
||||
});
|
||||
let indexPatternResp;
|
||||
if (boolCreateIndexPattern) {
|
||||
indexPatternResp = await this._createIndexPattern(this.state);
|
||||
}
|
||||
|
||||
// Indexing complete, update state & callback (if any)
|
||||
this.setState({ currentIndexingStage: INDEXING_STAGE.INDEXING_COMPLETE });
|
||||
if (onIndexingComplete) {
|
||||
onIndexingComplete({
|
||||
indexDataResp,
|
||||
...(boolCreateIndexPattern ? { indexPatternResp } : {})
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
_createIndexPattern = async ({ indexName }) => {
|
||||
this.setState({
|
||||
indexPatternRequestInFlight: true,
|
||||
currentIndexingStage: INDEXING_STAGE.CREATING_INDEX_PATTERN
|
||||
});
|
||||
const indexPatternResp = await createIndexPattern(indexName);
|
||||
|
||||
this.setState({
|
||||
indexPatternResp,
|
||||
indexPatternRequestInFlight: false,
|
||||
});
|
||||
this._resetFileAndIndexSettings();
|
||||
|
||||
return indexPatternResp;
|
||||
}
|
||||
|
||||
// This is mostly for geo. Some data have multiple valid index types that can
|
||||
// be chosen from, such as 'geo_point' vs. 'geo_shape' for point data
|
||||
_setIndexTypes = ({ transformDetails, parsedFile }) => {
|
||||
if (parsedFile) {
|
||||
// User-provided index types
|
||||
if (typeof transformDetails === 'object') {
|
||||
this.setState({ indexTypes: transformDetails.indexTypes });
|
||||
} else {
|
||||
// Included index types
|
||||
switch (transformDetails) {
|
||||
case 'geo':
|
||||
const featureTypes = _.uniq(
|
||||
parsedFile.features
|
||||
? parsedFile.features.map(({ geometry }) => geometry.type)
|
||||
: [ parsedFile.geometry.type ]
|
||||
);
|
||||
this.setState({
|
||||
indexTypes: getGeoIndexTypesForFeatures(featureTypes)
|
||||
});
|
||||
break;
|
||||
default:
|
||||
this.setState({ indexTypes: [] });
|
||||
return;
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
render() {
|
||||
const {
|
||||
currentIndexingStage, indexDataResp, indexPatternResp, fileRef,
|
||||
indexName, indexTypes, showImportProgress
|
||||
} = this.state;
|
||||
const { onFileUpload, onFileRemove, transformDetails } = this.props;
|
||||
|
||||
return (
|
||||
<EuiForm>
|
||||
{showImportProgress
|
||||
? <JsonImportProgress
|
||||
importStage={currentIndexingStage}
|
||||
indexDataResp={indexDataResp}
|
||||
indexPatternResp={indexPatternResp}
|
||||
complete={currentIndexingStage === INDEXING_STAGE.INDEXING_COMPLETE}
|
||||
indexName={indexName}
|
||||
/>
|
||||
: (
|
||||
<Fragment>
|
||||
<JsonIndexFilePicker
|
||||
{...{
|
||||
onFileUpload,
|
||||
onFileRemove,
|
||||
fileRef,
|
||||
setIndexName: indexName => this.setState({ indexName }),
|
||||
setFileRef: fileRef => this.setState({ fileRef }),
|
||||
setParsedFile: parsedFile => this.setState({ parsedFile }),
|
||||
transformDetails,
|
||||
resetFileAndIndexSettings: this._resetFileAndIndexSettings,
|
||||
}}
|
||||
/>
|
||||
<IndexSettings
|
||||
disabled={!fileRef}
|
||||
indexName={indexName}
|
||||
setIndexName={indexName => this.setState({ indexName })}
|
||||
indexTypes={indexTypes}
|
||||
setSelectedIndexType={selectedIndexType =>
|
||||
this.setState({ selectedIndexType })
|
||||
}
|
||||
setHasIndexErrors={hasIndexErrors =>
|
||||
this.setState({ hasIndexErrors })
|
||||
}
|
||||
/>
|
||||
</Fragment>
|
||||
)
|
||||
}
|
||||
</EuiForm>
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
JsonUploadAndParse.defaultProps = {
|
||||
isIndexingTriggered: false,
|
||||
boolCreateIndexPattern: true,
|
||||
};
|
||||
|
||||
JsonUploadAndParse.propTypes = {
|
||||
appName: PropTypes.string,
|
||||
isIndexingTriggered: PropTypes.bool,
|
||||
boolCreateIndexPattern: PropTypes.bool,
|
||||
transformDetails: PropTypes.oneOfType([
|
||||
PropTypes.string,
|
||||
PropTypes.object,
|
||||
]),
|
||||
onIndexReadyStatusChange: PropTypes.func,
|
||||
onIndexingComplete: PropTypes.func,
|
||||
onFileUpload: PropTypes.func
|
||||
};
|
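The component above gates indexing on a derived `isIndexReady` flag. A minimal sketch of that readiness rule as a standalone predicate (the function and sample state names here are illustrative, not part of the plugin's API):

```javascript
// Sketch of the readiness rule used by _setIndexReady: a file must be parsed,
// an index type and name chosen, and no errors or in-flight requests pending.
function isIndexReady({ parsedFile, selectedIndexType, indexName, hasIndexErrors, indexRequestInFlight }) {
  return !!parsedFile && !!selectedIndexType &&
    !!indexName && !hasIndexErrors && !indexRequestInFlight;
}

// Ready only once every prerequisite is met
const base = {
  parsedFile: { type: 'FeatureCollection', features: [] },
  selectedIndexType: 'geo_point',
  indexName: 'my_geo_index',
  hasIndexErrors: false,
  indexRequestInFlight: false,
};
console.log(isIndexReady(base)); // true
console.log(isIndexReady({ ...base, indexName: '' })); // false
```

Because the flag only changes via `componentDidUpdate`, the parent is notified through the `onIndexReady` callback exactly when the boolean flips.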
7 x-pack/plugins/file_upload/public/index.js Normal file
@@ -0,0 +1,7 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

export { JsonUploadAndParse } from './components/json_upload_and_parse';
13 x-pack/plugins/file_upload/public/kibana_services.js Normal file
@@ -0,0 +1,13 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { uiModules } from 'ui/modules';

export let indexPatternService;

uiModules.get('app/file_upload').run(($injector) => {
  indexPatternService = $injector.get('indexPatterns');
});
55 x-pack/plugins/file_upload/public/util/file_parser.js Normal file
@@ -0,0 +1,55 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import _ from 'lodash';
import { geoJsonCleanAndValidate } from './geo_json_clean_and_validate';
import { i18n } from '@kbn/i18n';

export async function parseFile(file, previewCallback = null, transformDetails,
  FileReader = window.FileReader) {

  let cleanAndValidate;
  if (typeof transformDetails === 'object') {
    cleanAndValidate = transformDetails.cleanAndValidate;
  } else {
    switch (transformDetails) {
      case 'geo':
        cleanAndValidate = geoJsonCleanAndValidate;
        break;
      default:
        throw (
          i18n.translate(
            'xpack.fileUpload.fileParser.transformDetailsNotDefined', {
              defaultMessage: 'Index options for {transformDetails} not defined',
              values: { transformDetails }
            })
        );
    }
  }

  return new Promise((resolve, reject) => {
    const fr = new FileReader();
    fr.onload = ({ target: { result } }) => {
      try {
        const parsedJson = JSON.parse(result);
        // Clean & validate
        const cleanAndValidJson = cleanAndValidate(parsedJson);
        if (!cleanAndValidJson) {
          return;
        }
        if (previewCallback) {
          const defaultName = _.get(cleanAndValidJson, 'name', 'Import File');
          previewCallback(cleanAndValidJson, defaultName);
        }
        resolve(cleanAndValidJson);
      } catch (e) {
        reject(e);
      }
    };
    fr.readAsText(file);
  });
}
x-pack/plugins/file_upload/public/util/geo_json_clean_and_validate.js Normal file
@@ -0,0 +1,50 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

const jsts = require('jsts');
import rewind from 'geojson-rewind';

export function geoJsonCleanAndValidate(parsedFile) {

  const reader = new jsts.io.GeoJSONReader();
  const geoJson = reader.read(parsedFile);
  const isSingleFeature = parsedFile.type === 'Feature';
  const features = isSingleFeature
    ? [{ ...geoJson }]
    : geoJson.features;

  // Pass features for cleaning
  const cleanedFeatures = cleanFeatures(features);

  // Put clean features back in geoJson object
  const cleanGeoJson = {
    ...parsedFile,
    ...(isSingleFeature
      ? cleanedFeatures[0]
      : { features: cleanedFeatures }
    ),
  };

  // Pass entire geoJson object for winding
  // JSTS does not enforce winding order, wind in clockwise order
  const correctlyWindedGeoJson = rewind(cleanGeoJson, false);
  return correctlyWindedGeoJson;
}

export function cleanFeatures(features) {
  const writer = new jsts.io.GeoJSONWriter();
  return features.map(({ id, geometry, properties }) => {
    const geojsonGeometry = (geometry.isSimple() || geometry.isValid())
      ? writer.write(geometry)
      : writer.write(geometry.buffer(0));
    return ({
      type: 'Feature',
      geometry: geojsonGeometry,
      ...(id ? { id } : {}),
      ...(properties ? { properties } : {}),
    });
  });
}
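The winding step above delegates to `geojson-rewind`. The underlying check is the sign of a ring's signed area via the shoelace formula; a small illustration of that idea (not the library's implementation, function name is ours):

```javascript
// Signed area of a closed linear ring via the shoelace formula.
// With this sign convention, a positive result means the ring is wound
// counter-clockwise in standard x/y (lon/lat) coordinates.
function ringSignedArea(ring) {
  let sum = 0;
  for (let i = 0; i < ring.length - 1; i++) {
    const [x1, y1] = ring[i];
    const [x2, y2] = ring[i + 1];
    sum += (x2 - x1) * (y2 + y1);
  }
  return -sum / 2;
}

const ccwRing = [[100, 0], [101, 0], [101, 1], [100, 1], [100, 0]];
console.log(ringSignedArea(ccwRing) > 0); // true: counter-clockwise
console.log(ringSignedArea([...ccwRing].reverse()) > 0); // false: clockwise
```

A rewinding pass simply reverses any ring whose sign disagrees with the desired orientation, which is why running `geoJsonCleanAndValidate` twice is a no-op (as the test below verifies).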
x-pack/plugins/file_upload/public/util/geo_json_clean_and_validate.test.js Normal file
@@ -0,0 +1,144 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import {
  cleanFeatures,
  geoJsonCleanAndValidate,
} from './geo_json_clean_and_validate';
const jsts = require('jsts');

describe('geo_json_clean_and_validate', () => {

  const reader = new jsts.io.GeoJSONReader();

  it('should not modify valid features', () => {
    const goodFeatureGeoJson = {
      type: 'Feature',
      geometry: {
        type: 'Polygon',
        coordinates: [[
          [-104.05, 78.99],
          [-87.22, 78.98],
          [-86.58, 75.94],
          [-104.03, 75.94],
          [-104.05, 78.99]
        ]]
      },
    };

    // Confirm valid geometry
    const geoJson = reader.read(goodFeatureGeoJson);
    const isSimpleOrValid = (geoJson.geometry.isSimple()
      || geoJson.geometry.isValid());
    expect(isSimpleOrValid).toEqual(true);

    // Confirm no change to features
    const cleanedFeatures = cleanFeatures([geoJson]);
    expect(cleanedFeatures[0]).toEqual(goodFeatureGeoJson);
  });

  it('should modify incorrect features', () => {
    // This feature collection contains polygons which cross over themselves,
    // which is invalid for geojson
    const badFeaturesGeoJson = {
      type: 'FeatureCollection',
      features: [
        {
          type: 'Feature',
          geometry: {
            type: 'Polygon',
            coordinates: [[
              [0, 0],
              [2, 2],
              [0, 2],
              [2, 0],
              [0, 0]
            ]]
          }
        },
        {
          type: 'Feature',
          geometry: {
            type: 'Polygon',
            coordinates: [[
              [2, 2],
              [4, 0],
              [2, 0],
              [4, 2],
              [2, 2]
            ]]
          }
        }
      ]
    };

    // Confirm invalid geometry
    let geoJson = reader.read(badFeaturesGeoJson);
    let isSimpleOrValid;
    geoJson.features.forEach(feature => {
      isSimpleOrValid = (feature.geometry.isSimple()
        || feature.geometry.isValid());
      expect(isSimpleOrValid).toEqual(false);
    });

    // Confirm changes to object
    const cleanedFeatures = cleanFeatures(geoJson.features);
    expect(cleanedFeatures).not.toEqual(badFeaturesGeoJson.features);

    // Confirm now valid features geometry
    geoJson = reader.read({ ...badFeaturesGeoJson, features: cleanedFeatures });
    geoJson.features.forEach(feature => {
      isSimpleOrValid = (feature.geometry.isSimple()
        || feature.geometry.isValid());
      expect(isSimpleOrValid).toEqual(true);
    });
  });

  it('should reverse counter-clockwise winding order', () => {
    const counterClockwiseGeoJson = {
      type: 'Feature',
      geometry: {
        type: 'Polygon',
        coordinates: [[
          [100, 0],
          [101, 0],
          [101, 1],
          [100, 1],
          [100, 0]
        ], [
          [100.2, 0.2],
          [100.8, 0.2],
          [100.8, 0.8],
          [100.2, 0.8],
          [100.2, 0.2]
        ]]
      }
    };

    // Confirm changes to object
    const clockwiseGeoJson = geoJsonCleanAndValidate(counterClockwiseGeoJson);
    expect(clockwiseGeoJson).not.toEqual(counterClockwiseGeoJson);

    // Run it through again, expect it not to change
    const clockwiseGeoJson2 = geoJsonCleanAndValidate(clockwiseGeoJson);
    expect(clockwiseGeoJson).toEqual(clockwiseGeoJson2);
  });

  it('should error out on an invalid object', () => {
    const invalidGeoJson = {
      type: 'notMyType',
      geometry: 'shmeometry'
    };

    const notEvenCloseToGeoJson = [1, 2, 3, 4];

    const badObjectPassed = () => geoJsonCleanAndValidate(invalidGeoJson);
    expect(badObjectPassed).toThrow();

    const worseObjectPassed = () => geoJsonCleanAndValidate(notEvenCloseToGeoJson);
    expect(worseObjectPassed).toThrow();
  });
});
95 x-pack/plugins/file_upload/public/util/geo_processing.js Normal file
@@ -0,0 +1,95 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import _ from 'lodash';
import { ES_GEO_FIELD_TYPE } from '../../common/constants/file_import';
import { i18n } from '@kbn/i18n';

const DEFAULT_SETTINGS = {
  number_of_shards: 1
};

const DEFAULT_GEO_SHAPE_MAPPINGS = {
  'coordinates': {
    'type': ES_GEO_FIELD_TYPE.GEO_SHAPE
  }
};

const DEFAULT_GEO_POINT_MAPPINGS = {
  'coordinates': {
    'type': ES_GEO_FIELD_TYPE.GEO_POINT
  }
};

const DEFAULT_INGEST_PIPELINE = {};

export function getGeoIndexTypesForFeatures(featureTypes) {
  if (!featureTypes || !featureTypes.length) {
    return [];
  } else if (!featureTypes.includes('Point')) {
    return [ES_GEO_FIELD_TYPE.GEO_SHAPE];
  } else if (featureTypes.includes('Point') && featureTypes.length === 1) {
    return [ ES_GEO_FIELD_TYPE.GEO_POINT, ES_GEO_FIELD_TYPE.GEO_SHAPE ];
  } else {
    return [ ES_GEO_FIELD_TYPE.GEO_SHAPE ];
  }
}

// Reduces & flattens geojson to coordinates and properties (if any)
export function geoJsonToEs(parsedGeojson, datatype) {
  if (!parsedGeojson) {
    return [];
  }
  const features = parsedGeojson.type === 'Feature'
    ? [ parsedGeojson ]
    : parsedGeojson.features;

  if (datatype === ES_GEO_FIELD_TYPE.GEO_SHAPE) {
    return features.reduce((accu, { geometry, properties }) => {
      const { coordinates } = geometry;
      accu.push({
        coordinates: {
          'type': geometry.type.toLowerCase(),
          'coordinates': coordinates
        },
        ...(!_.isEmpty(properties) ? { ...properties } : {})
      });
      return accu;
    }, []);
  } else if (datatype === ES_GEO_FIELD_TYPE.GEO_POINT) {
    return features.reduce((accu, { geometry, properties }) => {
      const { coordinates } = geometry;
      if (Array.isArray(coordinates[0])) {
        throw (
          i18n.translate(
            'xpack.fileUpload.geoProcessing.notPointError', {
              defaultMessage: 'Coordinates {coordinates} does not contain point datatype',
              values: { coordinates: coordinates.toString() }
            })
        );
      }
      accu.push({
        coordinates,
        ...(!_.isEmpty(properties) ? { ...properties } : {})
      });
      return accu;
    }, []);
  } else {
    return [];
  }
}

export function getGeoJsonIndexingDetails(parsedGeojson, dataType) {
  return {
    data: geoJsonToEs(parsedGeojson, dataType),
    ingestPipeline: DEFAULT_INGEST_PIPELINE,
    mappings: (dataType === ES_GEO_FIELD_TYPE.GEO_POINT)
      ? DEFAULT_GEO_POINT_MAPPINGS
      : DEFAULT_GEO_SHAPE_MAPPINGS,
    settings: DEFAULT_SETTINGS
  };
}
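The core of `geoJsonToEs` is flattening: each GeoJSON feature becomes one flat Elasticsearch document of coordinates plus expanded properties. A standalone sketch of the point case (the helper name is ours, for illustration only):

```javascript
// Sketch of the geo_point flattening step: the geometry's coordinate pair is
// kept under 'coordinates' and feature properties are spread alongside it.
function flattenPointFeature({ geometry, properties }) {
  return { coordinates: geometry.coordinates, ...properties };
}

const doc = flattenPointFeature({
  geometry: { type: 'Point', coordinates: [105.7, 18.9] },
  properties: { name: 'Dogeville' },
});
console.log(doc); // { coordinates: [ 105.7, 18.9 ], name: 'Dogeville' }
```

The real function additionally rejects nested coordinate arrays (non-point geometries) for the `geo_point` datatype, and for `geo_shape` wraps the geometry as `{ type, coordinates }` instead.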
147 x-pack/plugins/file_upload/public/util/geo_processing.test.js Normal file
@@ -0,0 +1,147 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { geoJsonToEs } from './geo_processing';
import { ES_GEO_FIELD_TYPE } from '../../common/constants/file_import';

describe('geo_processing', () => {
  describe('getGeoJsonToEs', () => {

    const parsedPointFeature = {
      type: 'Feature',
      geometry: {
        type: 'Point',
        coordinates: [105.7, 18.9]
      },
      properties: {
        name: 'Dogeville'
      }
    };

    it('should convert point feature to flattened ES compatible feature', () => {
      const esFeatureArr = geoJsonToEs(parsedPointFeature, ES_GEO_FIELD_TYPE.GEO_POINT);
      expect(esFeatureArr).toEqual([{
        coordinates: [
          105.7,
          18.9
        ],
        name: 'Dogeville',
      }]);
    });

    it('should convert point feature collection to flattened ES compatible feature', () => {
      const parsedPointFeatureCollection = {
        type: 'FeatureCollection',
        features: [
          {
            type: 'Feature',
            geometry: {
              type: 'Point',
              coordinates: [34.1, 15.3]
            },
            properties: {
              name: 'Meowsers City'
            }
          }
        ]
      };

      const esFeatureArr = geoJsonToEs(
        parsedPointFeatureCollection,
        ES_GEO_FIELD_TYPE.GEO_POINT
      );
      expect(esFeatureArr).toEqual([{
        coordinates: [
          34.1,
          15.3,
        ],
        name: 'Meowsers City',
      }]);
    });

    it('should convert shape feature to flattened ES compatible feature', () => {
      const parsedShapeFeature = {
        type: 'Feature',
        geometry: {
          type: 'Polygon',
          coordinates: [[
            [-104.05, 78.99],
            [-87.22, 78.98],
            [-86.58, 75.94],
            [-104.03, 75.94],
            [-104.05, 78.99]
          ]]
        },
        properties: {
          name: 'Whiskers City'
        }
      };

      const esFeatureArr = geoJsonToEs(parsedShapeFeature, ES_GEO_FIELD_TYPE.GEO_SHAPE);
      expect(esFeatureArr).toEqual([{
        coordinates: {
          coordinates: [[
            [-104.05, 78.99],
            [-87.22, 78.98],
            [-86.58, 75.94],
            [-104.03, 75.94],
            [-104.05, 78.99],
          ]],
          type: 'polygon'
        },
        name: 'Whiskers City',
      }]);
    });

    it('should convert shape feature collection to flattened ES compatible feature', () => {

      const parsedShapeFeatureCollection = {
        type: 'FeatureCollection',
        features: [
          {
            type: 'Feature',
            geometry: {
              type: 'Polygon',
              coordinates: [[
                [-104.05, 79.89],
                [-87.22, 79.88],
                [-86.58, 74.84],
                [-104.03, 75.84],
                [-104.05, 78.89]
              ]]
            },
            properties: {
              name: 'Woof Crossing'
            }
          }
        ]
      };

      const esFeatureArr = geoJsonToEs(
        parsedShapeFeatureCollection,
        ES_GEO_FIELD_TYPE.GEO_SHAPE
      );
      expect(esFeatureArr).toEqual([{
        coordinates: {
          coordinates: [[
            [-104.05, 79.89],
            [-87.22, 79.88],
            [-86.58, 74.84],
            [-104.03, 75.84],
            [-104.05, 78.89]
          ]],
          type: 'polygon',
        },
        name: 'Woof Crossing',
      }]);
    });

    it('should return an empty array for an unhandled datatype', () => {
      const esFeatureArr = geoJsonToEs(parsedPointFeature, 'different datatype');
      expect(esFeatureArr).toEqual([]);
    });
  });
});
77 x-pack/plugins/file_upload/public/util/http_service.js Normal file
@@ -0,0 +1,77 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

// Service for interacting with the server

import chrome from 'ui/chrome';
import { addSystemApiHeader } from 'ui/system_api';
import { i18n } from '@kbn/i18n';

const FETCH_TIMEOUT = 10000;

export async function http(options) {
  if (!(options && options.url)) {
    throw (
      i18n.translate('xpack.fileUpload.httpService.noUrl',
        { defaultMessage: 'No URL provided' })
    );
  }
  const url = options.url || '';
  const headers = addSystemApiHeader({
    'Content-Type': 'application/json',
    'kbn-version': chrome.getXsrfToken(),
    ...options.headers
  });

  const allHeaders = (options.headers === undefined) ? headers : { ...options.headers, ...headers };
  const body = (options.data === undefined) ? null : JSON.stringify(options.data);

  const payload = {
    method: (options.method || 'GET'),
    headers: allHeaders,
    credentials: 'same-origin'
  };

  if (body !== null) {
    payload.body = body;
  }
  return await fetchWithTimeout(url, payload);
}

async function fetchWithTimeout(url, payload) {
  let timedOut = false;

  return new Promise(function (resolve, reject) {
    const timeout = setTimeout(function () {
      timedOut = true;
      reject(new Error(
        i18n.translate('xpack.fileUpload.httpService.requestTimedOut',
          { defaultMessage: 'Request timed out' }))
      );
    }, FETCH_TIMEOUT);

    fetch(url, payload)
      .then(resp => {
        clearTimeout(timeout);
        if (!timedOut) {
          resolve(resp);
        }
      })
      .catch(function (err) {
        // After a timeout the promise has already been rejected
        if (timedOut) return;
        reject(err);
      });
  }).then(resp => resp.json())
    .catch(function (err) {
      console.error(
        i18n.translate('xpack.fileUpload.httpService.fetchError', {
          defaultMessage: 'Error performing fetch: {error}',
          values: { error: err.message }
        }));
    });
}
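`fetchWithTimeout` above hand-rolls the race between a timer and the fetch. The same shape can be expressed more compactly with `Promise.race`; a sketch under the same 'first to settle wins' semantics (not the plugin's code, names are illustrative):

```javascript
// Race any promise against a timer; whichever settles first wins,
// and the timer is always cleared afterwards.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('Request timed out')), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// A task that resolves after 50 ms loses a 10 ms race
withTimeout(new Promise(resolve => setTimeout(() => resolve('ok'), 50)), 10)
  .catch(err => console.log(err.message)); // Request timed out
```

One advantage of the `timedOut` flag in the original is that a late fetch resolution is explicitly ignored; `Promise.race` gives the same observable behavior because a settled promise ignores later settlements.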
257 x-pack/plugins/file_upload/public/util/indexing_service.js Normal file
@@ -0,0 +1,257 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { http } from './http_service';
import chrome from 'ui/chrome';
import { i18n } from '@kbn/i18n';
import { indexPatternService } from '../kibana_services';
import { getGeoJsonIndexingDetails } from './geo_processing';
import { sizeLimitedChunking } from './size_limited_chunking';

const basePath = chrome.addBasePath('/api/fileupload');
const fileType = 'json';

export async function indexData(parsedFile, transformDetails, indexName, dataType, appName) {
  if (!parsedFile) {
    throw (i18n.translate('xpack.fileUpload.indexingService.noFileImported', {
      defaultMessage: 'No file imported.'
    }));
  }

  // Perform any processing required on file prior to indexing
  const transformResult = transformDataByFormatForIndexing(transformDetails, parsedFile, dataType);
  if (!transformResult.success) {
    throw (i18n.translate('xpack.fileUpload.indexingService.transformResultError', {
      defaultMessage: 'Error transforming data: {error}',
      values: { error: transformResult.error }
    }));
  }

  // Create new index
  const { indexingDetails } = transformResult;
  const createdIndex = await writeToIndex({
    appName,
    ...indexingDetails,
    id: undefined,
    data: [],
    index: indexName,
  });
  let id;
  try {
    if (createdIndex && createdIndex.id) {
      id = createdIndex.id;
    } else {
      throw i18n.translate('xpack.fileUpload.indexingService.errorCreatingIndex', {
        defaultMessage: 'Error creating index',
      });
    }
  } catch (error) {
    return {
      error,
      success: false
    };
  }

  // Write to index
  const indexWriteResults = await chunkDataAndWriteToIndex({
    id,
    index: indexName,
    ...indexingDetails,
    settings: {},
    mappings: {},
  });
  return indexWriteResults;
}


function transformDataByFormatForIndexing(transform, parsedFile, dataType) {
  let indexingDetails;
  if (!transform) {
    return {
      success: false,
      error: i18n.translate('xpack.fileUpload.indexingService.noTransformDefined', {
        defaultMessage: 'No transform defined',
      })
    };
  }
  if (typeof transform !== 'object') {
    switch (transform) {
      case 'geo':
        indexingDetails = getGeoJsonIndexingDetails(parsedFile, dataType);
        break;
      default:
        return {
          success: false,
          error: i18n.translate('xpack.fileUpload.indexingService.noHandlingForTransform', {
            defaultMessage: 'No handling defined for transform: {transform}',
            values: { transform }
          })
        };
    }
  } else { // Custom transform
    indexingDetails = transform.getIndexingDetails(parsedFile);
  }
  if (indexingDetails && indexingDetails.data && indexingDetails.data.length) {
    return {
      success: true,
      indexingDetails
    };
  } else if (indexingDetails && indexingDetails.data) {
    return {
      success: false,
      error: i18n.translate('xpack.fileUpload.indexingService.noIndexingDetailsForDatatype', {
        defaultMessage: `No indexing details defined for datatype: {dataType}`,
        values: { dataType }
      })
    };
  } else {
    return {
      success: false,
      error: i18n.translate('xpack.fileUpload.indexingService.unknownTransformError', {
        defaultMessage: 'Unknown error performing transform: {transform}',
        values: { transform }
      })
    };
  }
}

async function writeToIndex(indexingDetails) {
  const paramString = (indexingDetails.id !== undefined) ? `?id=${indexingDetails.id}` : '';
  const {
    appName,
    index,
    data,
    settings,
    mappings,
    ingestPipeline
  } = indexingDetails;

  return await http({
    url: `${basePath}/import${paramString}`,
    method: 'POST',
    data: {
      index,
      data,
      settings,
      mappings,
      ingestPipeline,
      fileType,
      ...(appName ? { app: appName } : {})
    },
  });
}

async function chunkDataAndWriteToIndex({ id, index, data, mappings, settings }) {
  if (!index) {
    return {
      success: false,
      error: i18n.translate('xpack.fileUpload.noIndexSuppliedErrorMessage', {
        defaultMessage: 'No index provided.'
      })
    };
  }

  const chunks = sizeLimitedChunking(data);

  let success = true;
  let failures = [];
  let error;
  let docCount = 0;

  for (let i = 0; i < chunks.length; i++) {
    const aggs = {
      id,
      index,
      data: chunks[i],
      settings,
      mappings,
      ingestPipeline: {} // TODO: Support custom ingest pipelines
    };

    let resp = {
      success: false,
      failures: [],
      docCount: 0,
    };
    resp = await writeToIndex(aggs);

    failures = [ ...failures, ...resp.failures ];
    if (resp.success) {
      ({ success } = resp);
      docCount = docCount + resp.docCount;
    } else {
      success = false;
      error = resp.error;
      docCount = 0;
      break;
    }
  }

  return {
    success,
    failures,
    docCount,
    ...(error ? { error } : {})
  };
}

export async function createIndexPattern(indexPatternName) {
  const indexPatterns = await indexPatternService.get();
  try {
    Object.assign(indexPatterns, {
      id: '',
      title: indexPatternName,
    });

    await indexPatterns.create(true);
    const id = await getIndexPatternId(indexPatternName);
    const indexPattern = await indexPatternService.get(id);
    return {
      success: true,
      id,
      fields: indexPattern.fields
    };
  } catch (error) {
    return {
      success: false,
      error,
    };
  }
}

async function getIndexPatternId(name) {
  const savedObjectsClient = chrome.getSavedObjectsClient();
  const savedObjectSearch =
    await savedObjectsClient.find({ type: 'index-pattern', perPage: 1000 });
  const indexPatternSavedObjects = savedObjectSearch.savedObjects;

  if (indexPatternSavedObjects) {
    const ip = indexPatternSavedObjects.find(i => i.attributes.title === name);
    return (ip !== undefined) ? ip.id : undefined;
  } else {
    return undefined;
  }
}

export async function getExistingIndices() {
  const basePath = chrome.addBasePath('/api');
  return await http({
    url: `${basePath}/index_management/indices`,
    method: 'GET',
  });
}

export async function getExistingIndexPatterns() {
  const savedObjectsClient = chrome.getSavedObjectsClient();
  return savedObjectsClient.find({
    type: 'index-pattern',
    fields: ['id', 'title', 'type', 'fields'],
    perPage: 10000
  }).then(({ savedObjects }) =>
    savedObjects.map(savedObject => savedObject.get('title'))
  );
}
|
@@ -0,0 +1,29 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */
import { MAX_BYTES } from '../../common/constants/file_import';

// Add data elements to chunk until limit is met
export function sizeLimitedChunking(dataArr, maxChunkSize = MAX_BYTES) {
  let chunkSize = 0;
  return dataArr.reduce((accu, el) => {
    const featureByteSize = (
      new Blob([JSON.stringify(el)], { type: 'application/json' })
    ).size;
    if (featureByteSize > maxChunkSize) {
      throw `Some features exceed maximum chunk size of ${maxChunkSize}`;
    } else if (chunkSize + featureByteSize < maxChunkSize) {
      const lastChunkRef = accu.length - 1;
      chunkSize += featureByteSize;
      accu[lastChunkRef].push(el);
    } else {
      chunkSize = featureByteSize;
      accu.push([el]);
    }
    return accu;
  }, [[]]);
}
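The chunking above measures each element's serialized byte size and starts a new chunk whenever the running total would reach the limit. A minimal sketch of the same logic that runs under plain Node (using `Buffer.byteLength` in place of the browser `Blob` API; not the plugin's code):

```javascript
// Sketch of size-limited chunking: accumulate elements into the current
// chunk until adding one more would reach maxChunkSize, then start a new one.
function sizeLimitedChunkingSketch(dataArr, maxChunkSize) {
  let chunkSize = 0;
  return dataArr.reduce((accu, el) => {
    const byteSize = Buffer.byteLength(JSON.stringify(el), 'utf8');
    if (byteSize > maxChunkSize) {
      throw new Error(`Some features exceed maximum chunk size of ${maxChunkSize}`);
    } else if (chunkSize + byteSize < maxChunkSize) {
      chunkSize += byteSize;
      accu[accu.length - 1].push(el); // fits: append to the current chunk
    } else {
      chunkSize = byteSize;
      accu.push([el]); // would overflow: start a new chunk
    }
    return accu;
  }, [[]]);
}

// Ten 1-byte elements with a 7-byte limit split into two chunks (6 + 4).
const chunks = sizeLimitedChunkingSketch(Array.from(Array(10), (_, x) => x), 7);
console.log(chunks.length); // 2
```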
@@ -0,0 +1,32 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import {
  sizeLimitedChunking
} from './size_limited_chunking';

describe('size_limited_chunking', () => {

  // 1000 elements where element value === index
  const testArr = Array.from(Array(1000), (_, x) => x);

  it('should limit each sub-array to the max chunk size', () => {
    // Confirm valid geometry
    const chunkLimit = 100;
    const chunkedArr = sizeLimitedChunking(testArr, chunkLimit);
    chunkedArr.forEach(sizeLimitedArr => {
      const arrByteSize = (
        new Blob(sizeLimitedArr, { type: 'application/json' })
      ).size;

      // Chunk size should be less than chunk limit
      expect(arrByteSize).toBeLessThan(chunkLimit);
      // # of arrays generated should be greater than original array length
      // divided by chunk limit
      expect(chunkedArr.length).toBeGreaterThanOrEqual(testArr.length / chunkLimit);
    });
  });
});
x-pack/plugins/file_upload/server/client/call_with_internal_user_factory.d.ts (new file, 9 lines)
@@ -0,0 +1,9 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { Server } from 'hapi';

export function callWithInternalUserFactory(server: Server): any;
@@ -0,0 +1,20 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { once } from 'lodash';

const _callWithInternalUser = once((server) => {
  const { callWithInternalUser } = server.plugins.elasticsearch.getCluster('admin');
  return callWithInternalUser;
});

export const callWithInternalUserFactory = (server) => {
  return (...args) => {
    return _callWithInternalUser(server)(...args);
  };
};
@@ -0,0 +1,32 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { callWithInternalUserFactory } from './call_with_internal_user_factory';

describe('call_with_internal_user_factory', () => {
  describe('callWithInternalUserFactory', () => {
    let server: any;
    let callWithInternalUser: any;

    beforeEach(() => {
      callWithInternalUser = jest.fn();
      server = {
        plugins: {
          elasticsearch: {
            getCluster: jest.fn(() => ({ callWithInternalUser })),
          },
        },
      };
    });

    it('should use internal user "admin"', () => {
      const callWithInternalUserInstance = callWithInternalUserFactory(server);
      callWithInternalUserInstance();

      expect(server.plugins.elasticsearch.getCluster).toHaveBeenCalledWith('admin');
    });
  });
});
@@ -0,0 +1,20 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { once } from 'lodash';

const callWithRequest = once((server) => {
  const cluster = server.plugins.elasticsearch.getCluster('data');
  return cluster.callWithRequest;
});

export const callWithRequestFactory = (server, request) => {
  return (...args) => {
    return callWithRequest(server)(request, ...args);
  };
};
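Both factories above lean on lodash's `once` so the cluster lookup happens a single time and every later call reuses the cached function. A minimal sketch of that memoized-factory pattern (the `once` helper and names here are illustrative, not the plugin's code):

```javascript
// once(fn): run fn on the first call, cache its result, and return the
// cached result on every later call (later arguments are ignored,
// matching lodash's behavior).
function once(fn) {
  let called = false;
  let result;
  return (...args) => {
    if (!called) {
      called = true;
      result = fn(...args);
    }
    return result;
  };
}

let lookups = 0;
const getCaller = once(server => {
  lookups += 1; // the expensive lookup body
  return endpoint => `${server}:${endpoint}`;
});

getCaller('es')('bulk');
getCaller('es')('search');
console.log(lookups); // 1: the factory body ran only once
```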
x-pack/plugins/file_upload/server/client/errors.js (new file, 13 lines)
@@ -0,0 +1,13 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { boomify } from 'boom';

export function wrapError(error) {
  return boomify(error, { statusCode: error.status });
}
@@ -0,0 +1,167 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { INDEX_META_DATA_CREATED_BY } from '../../../common/constants/file_import';
import uuid from 'uuid';

export function importDataProvider(callWithRequest) {
  async function importData(id, index, settings, mappings, ingestPipeline, data) {
    let createdIndex;
    let createdPipelineId;
    const docCount = data.length;

    try {
      const {
        id: pipelineId,
        pipeline,
      } = ingestPipeline;

      if (id === undefined) {
        // first chunk of data, create the index and id to return
        id = uuid.v1();

        await createIndex(index, settings, mappings);
        createdIndex = index;

        // create the pipeline if one has been supplied
        if (pipelineId !== undefined) {
          const success = await createPipeline(pipelineId, pipeline);
          if (success.acknowledged !== true) {
            throw success;
          }
        }
        createdPipelineId = pipelineId;
      } else {
        createdIndex = index;
        createdPipelineId = pipelineId;
      }

      let failures = [];
      if (data.length) {
        const resp = await indexData(index, createdPipelineId, data);
        if (resp.success === false) {
          if (resp.ingestError) {
            // all docs failed, abort
            throw resp;
          } else {
            // some docs failed.
            // still report success but with a list of failures
            failures = (resp.failures || []);
          }
        }
      }

      return {
        success: true,
        id,
        index: createdIndex,
        pipelineId: createdPipelineId,
        docCount,
        failures,
      };
    } catch (error) {
      return {
        success: false,
        id,
        index: createdIndex,
        pipelineId: createdPipelineId,
        error: (error.error !== undefined) ? error.error : error,
        docCount,
        ingestError: error.ingestError,
        failures: (error.failures || [])
      };
    }
  }

  async function createIndex(index, settings, mappings) {
    const body = {
      mappings: {
        _meta: {
          created_by: INDEX_META_DATA_CREATED_BY
        },
        properties: mappings
      }
    };

    if (settings && Object.keys(settings).length) {
      body.settings = settings;
    }

    await callWithRequest('indices.create', { index, body });
  }

  async function indexData(index, pipelineId, data) {
    try {
      const body = [];
      for (let i = 0; i < data.length; i++) {
        body.push({ index: {} });
        body.push(data[i]);
      }

      const settings = { index, body };
      if (pipelineId !== undefined) {
        settings.pipeline = pipelineId;
      }

      const resp = await callWithRequest('bulk', settings);
      if (resp.errors) {
        throw resp;
      } else {
        return {
          success: true,
          docs: data.length,
          failures: [],
        };
      }
    } catch (error) {
      let failures = [];
      let ingestError = false;
      if (error.errors !== undefined && Array.isArray(error.items)) {
        // an expected error where some or all of the bulk request
        // docs have failed to be ingested.
        failures = getFailures(error.items, data);
      } else {
        // some other error has happened.
        ingestError = true;
      }

      return {
        success: false,
        error,
        docCount: data.length,
        failures,
        ingestError,
      };
    }
  }

  async function createPipeline(id, pipeline) {
    return await callWithRequest('ingest.putPipeline', { id, body: pipeline });
  }

  function getFailures(items, data) {
    const failures = [];
    for (let i = 0; i < items.length; i++) {
      const item = items[i];
      if (item.index && item.index.error) {
        failures.push({
          item: i,
          reason: item.index.error.reason,
          doc: data[i],
        });
      }
    }
    return failures;
  }

  return {
    importData,
  };
}
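The `indexData` helper above serializes documents into Elasticsearch's bulk format, where an action entry precedes each document. A minimal sketch of just that body-building step (the helper name is illustrative):

```javascript
// Build a bulk request body: the action entry ({ index: {} }) tells
// Elasticsearch to index the following document into the request's
// default index; entries therefore alternate action, document, action, ...
function buildBulkBody(docs) {
  const body = [];
  for (const doc of docs) {
    body.push({ index: {} }); // action entry
    body.push(doc);           // document source
  }
  return body;
}

const body = buildBulkBody([{ name: 'a' }, { name: 'b' }]);
console.log(body.length); // 4: two action entries + two documents
```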
@@ -0,0 +1,8 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

export { importDataProvider } from './import_data';
x-pack/plugins/file_upload/server/routes/file_upload.js (new file, 45 lines)
@@ -0,0 +1,45 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { callWithRequestFactory } from '../client/call_with_request_factory';
import { wrapError } from '../client/errors';
import { importDataProvider } from '../models/import_data';
import { MAX_BYTES } from '../../common/constants/file_import';
import { updateTelemetry } from '../telemetry/telemetry';

function importData({
  callWithRequest, id, index, settings, mappings, ingestPipeline, data
}) {
  const { importData: importDataFunc } = importDataProvider(callWithRequest);
  return importDataFunc(id, index, settings, mappings, ingestPipeline, data);
}

export function fileUploadRoutes(server, commonRouteConfig) {

  server.route({
    method: 'POST',
    path: '/api/fileupload/import',
    async handler(request) {

      // An `undefined` `id` tells us this is a new import that will create a new index.
      // Follow-up import calls that just add additional data include the `id` of the
      // created index; we ignore those and don't increment the counter.
      const { id } = request.query;
      if (id === undefined) {
        await updateTelemetry({ server, ...request.payload });
      }

      const callWithRequest = callWithRequestFactory(server, request);
      return importData({ callWithRequest, id, ...request.payload })
        .catch(wrapError);
    },
    config: {
      ...commonRouteConfig,
      payload: { maxBytes: MAX_BYTES },
    }
  });
}
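The route above gates telemetry on the query-string `id`: only the first chunk of an upload (no `id` yet) bumps the upload counter, while follow-up chunks carrying the created index's `id` do not. A minimal sketch of that gate (the handler shape and counter are illustrative, not the plugin's code):

```javascript
// Sketch of the telemetry gate: a request without an id is a new import
// and is counted once; follow-up chunks that carry an id are not counted.
let filesUploadedTotalCount = 0;

function handleImport(query) {
  if (query.id === undefined) {
    filesUploadedTotalCount += 1; // new import: count it
  }
  return { id: query.id || 'generated-id' }; // id returned for follow-up chunks
}

handleImport({});            // first chunk: counted
handleImport({ id: 'abc' }); // follow-up chunk: not counted
console.log(filesUploadedTotalCount); // 1
```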
x-pack/plugins/file_upload/server/telemetry/index.ts (new file, 8 lines)
@@ -0,0 +1,8 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

export * from './telemetry';
export { makeUsageCollector } from './make_usage_collector';
@@ -0,0 +1,27 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { Server } from 'hapi';
import { getTelemetry, initTelemetry, Telemetry } from './telemetry';

// TODO this type should be defined by the platform
interface KibanaHapiServer extends Server {
  usage: {
    collectorSet: {
      makeUsageCollector: any;
      register: any;
    };
  };
}

export function makeUsageCollector(server: KibanaHapiServer): void {
  const fileUploadUsageCollector = server.usage.collectorSet.makeUsageCollector({
    type: 'fileUploadTelemetry',
    isReady: () => true,
    fetch: async (): Promise<Telemetry> => (await getTelemetry(server)) || initTelemetry(),
  });
  server.usage.collectorSet.register(fileUploadUsageCollector);
}
@@ -0,0 +1,57 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { getTelemetry, updateTelemetry } from './telemetry';

const internalRepository = () => ({
  get: jest.fn(() => null),
  create: jest.fn(() => ({ attributes: 'test' })),
  update: jest.fn(() => ({ attributes: 'test' })),
});
const server: any = {
  savedObjects: {
    getSavedObjectsRepository: jest.fn(() => internalRepository()),
  },
  plugins: {
    elasticsearch: {
      getCluster: jest.fn(() => ({ callWithInternalUser })),
    },
  },
};
const callWithInternalUser = jest.fn();

function mockInit(getVal: any = { attributes: {} }): any {
  return {
    ...internalRepository(),
    get: jest.fn(() => getVal),
  };
}

describe('file upload plugin telemetry', () => {
  describe('getTelemetry', () => {
    it('should get existing telemetry', async () => {
      const internalRepo = mockInit();
      await getTelemetry(server, internalRepo);
      expect(internalRepo.update.mock.calls.length).toBe(0);
      expect(internalRepo.get.mock.calls.length).toBe(1);
      expect(internalRepo.create.mock.calls.length).toBe(0);
    });
  });

  describe('updateTelemetry', () => {
    it('should update existing telemetry', async () => {
      const internalRepo = mockInit({
        attributes: {
          filesUploadedTotalCount: 2,
        },
      });
      await updateTelemetry({ server, internalRepo });
      expect(internalRepo.update.mock.calls.length).toBe(1);
      expect(internalRepo.get.mock.calls.length).toBe(1);
      expect(internalRepo.create.mock.calls.length).toBe(0);
    });
  });
});
x-pack/plugins/file_upload/server/telemetry/telemetry.ts (new file, 73 lines)
@@ -0,0 +1,73 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { Server } from 'hapi';
import _ from 'lodash';
import { callWithInternalUserFactory } from '../client/call_with_internal_user_factory';

export const TELEMETRY_DOC_ID = 'file-upload-telemetry';

export interface Telemetry {
  filesUploadedTotalCount: number;
}

export interface TelemetrySavedObject {
  attributes: Telemetry;
}

export function getInternalRepository(server: Server): any {
  const { getSavedObjectsRepository } = server.savedObjects;
  const callWithInternalUser = callWithInternalUserFactory(server);
  return getSavedObjectsRepository(callWithInternalUser);
}

export function initTelemetry(): Telemetry {
  return {
    filesUploadedTotalCount: 0,
  };
}

export async function getTelemetry(server: Server, internalRepo?: object): Promise<Telemetry> {
  const internalRepository = internalRepo || getInternalRepository(server);
  let telemetrySavedObject;

  try {
    telemetrySavedObject = await internalRepository.get(TELEMETRY_DOC_ID, TELEMETRY_DOC_ID);
  } catch (e) {
    // Fail silently
  }

  return telemetrySavedObject ? telemetrySavedObject.attributes : null;
}

export async function updateTelemetry({
  server,
  internalRepo,
}: {
  server: any;
  internalRepo?: any;
}) {
  const internalRepository = internalRepo || getInternalRepository(server);
  let telemetry = await getTelemetry(server, internalRepository);
  // Create if doesn't exist
  if (!telemetry || _.isEmpty(telemetry)) {
    const newTelemetrySavedObject = await internalRepository.create(
      TELEMETRY_DOC_ID,
      initTelemetry(),
      { id: TELEMETRY_DOC_ID }
    );
    telemetry = newTelemetrySavedObject.attributes;
  }

  await internalRepository.update(TELEMETRY_DOC_ID, TELEMETRY_DOC_ID, incrementCounts(telemetry));
}

export function incrementCounts({ filesUploadedTotalCount }: { filesUploadedTotalCount: number }) {
  return {
    // TODO: get telemetry for app, total file counts, file type
    filesUploadedTotalCount: filesUploadedTotalCount + 1,
  };
}
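`updateTelemetry` above follows a get-or-create-then-increment flow: read the telemetry saved object, create it with zeroed counters on first use, then write back the incremented count. A minimal sketch of that flow against an in-memory stand-in for the saved-objects repository (repository shape is an assumption for illustration):

```javascript
// In-memory stand-in for a saved-objects repository, keyed by doc id.
const DOC_ID = 'file-upload-telemetry';
const store = new Map();

const repo = {
  get: (type, id) => store.get(id) || null,
  create: (type, attributes, { id }) => {
    store.set(id, { attributes });
    return { attributes };
  },
  update: (type, id, attributes) => {
    store.set(id, { attributes });
    return { attributes };
  },
};

// Get-or-create, then write back the incremented counter.
function updateTelemetrySketch() {
  let saved = repo.get(DOC_ID, DOC_ID);
  if (!saved) {
    saved = repo.create(DOC_ID, { filesUploadedTotalCount: 0 }, { id: DOC_ID });
  }
  repo.update(DOC_ID, DOC_ID, {
    filesUploadedTotalCount: saved.attributes.filesUploadedTotalCount + 1,
  });
}

updateTelemetrySketch();
updateTelemetrySketch();
console.log(store.get(DOC_ID).attributes.filesUploadedTotalCount); // 2
```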
@@ -24,6 +24,8 @@ export const ES_GEO_GRID = 'ES_GEO_GRID';
 export const ES_SEARCH = 'ES_SEARCH';
 export const SOURCE_DATA_ID_ORIGIN = 'source';

+export const GEOJSON_FILE = 'GEOJSON_FILE';
+
 export const DECIMAL_DEGREES_PRECISION = 5; // meters precision
 export const ZOOM_PRECISION = 2;
 export const DEFAULT_ES_DOC_LIMIT = 2048;
@@ -1,5 +1,5 @@
 @import './gis_map/gis_map';
-@import './layer_addpanel/layer_addpanel';
+@import './layer_addpanel/source_select/index';
 @import './layer_panel/index';
 @import './widget_overlay/index';
 @import './toolbar_overlay/index';
@@ -0,0 +1,32 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { connect } from 'react-redux';
import { FlyoutFooter } from './view';
import { getSelectedLayer } from '../../../selectors/map_selectors';
import {
  clearTransientLayerStateAndCloseFlyout,
} from '../../../actions/store_actions';

function mapStateToProps(state = {}) {
  const selectedLayer = getSelectedLayer(state);
  return {
    hasLayerSelected: !!selectedLayer,
    isLoading: selectedLayer && selectedLayer.isLayerLoading(),
  };
}

function mapDispatchToProps(dispatch) {
  return {
    closeFlyout: () => dispatch(clearTransientLayerStateAndCloseFlyout()),
  };
}

const connectedFlyOut = connect(mapStateToProps, mapDispatchToProps)(FlyoutFooter);
export { connectedFlyOut as FlyoutFooter };
@@ -0,0 +1,58 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React from 'react';
import {
  EuiFlexGroup,
  EuiFlexItem,
  EuiFlyoutFooter,
  EuiButtonEmpty,
  EuiButton,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';

export const FlyoutFooter = ({
  onClick, showNextButton, disableNextButton, nextButtonText, closeFlyout,
  hasLayerSelected, isLoading
}) => {

  const nextButton = showNextButton
    ? (
      <EuiButton
        disabled={!hasLayerSelected || disableNextButton || isLoading}
        isLoading={hasLayerSelected && isLoading}
        iconSide="right"
        iconType={'sortRight'}
        onClick={onClick}
        fill
      >
        {nextButtonText}
      </EuiButton>
    )
    : null;

  return (
    <EuiFlyoutFooter className="mapLayerPanel__footer">
      <EuiFlexGroup justifyContent="spaceBetween" responsive={false}>
        <EuiFlexItem grow={false}>
          <EuiButtonEmpty
            onClick={closeFlyout}
            flush="left"
            data-test-subj="layerAddCancelButton"
          >
            <FormattedMessage
              id="xpack.maps.addLayerPanel.footer.cancelButtonLabel"
              defaultMessage="Cancel"
            />
          </EuiButtonEmpty>
        </EuiFlexItem>
        <EuiFlexItem grow={false}>
          {nextButton}
        </EuiFlexItem>
      </EuiFlexGroup>
    </EuiFlyoutFooter>
  );
};
@@ -0,0 +1,31 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { connect } from 'react-redux';
import { ImportEditor } from './view';
import { getInspectorAdapters } from '../../../store/non_serializable_instances';
import { INDEXING_STAGE, updateIndexingStage, getIndexingStage } from '../../../store/ui';

function mapStateToProps(state = {}) {
  return {
    inspectorAdapters: getInspectorAdapters(state),
    isIndexingTriggered: getIndexingStage(state) === INDEXING_STAGE.TRIGGERED,
  };
}

const mapDispatchToProps = {
  onIndexReady: indexReady => indexReady
    ? updateIndexingStage(INDEXING_STAGE.READY)
    : updateIndexingStage(null),
  importSuccessHandler: () => updateIndexingStage(INDEXING_STAGE.SUCCESS),
  importErrorHandler: () => updateIndexingStage(INDEXING_STAGE.ERROR),
};

const connectedFlyOut = connect(mapStateToProps, mapDispatchToProps)(ImportEditor);
export { connectedFlyOut as ImportEditor };
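The dispatch handlers above step a small indexing-stage state machine (null → READY → TRIGGERED → SUCCESS or ERROR). A minimal reducer-style sketch of those transitions, with stage names mirroring the `INDEXING_STAGE` constants (the reducer shape itself is an assumption for illustration, not the plugin's store code):

```javascript
// Stage names mirror the INDEXING_STAGE constants used by the editor.
const INDEXING_STAGE = {
  READY: 'READY',
  TRIGGERED: 'TRIGGERED',
  SUCCESS: 'SUCCESS',
  ERROR: 'ERROR',
};

// Action creator matching the updateIndexingStage calls above.
const updateIndexingStage = stage => ({ type: 'UPDATE_INDEXING_STAGE', stage });

// Reducer: the stage is simply replaced by whatever the action carries.
function indexingStageReducer(state = null, action) {
  return action.type === 'UPDATE_INDEXING_STAGE' ? action.stage : state;
}

let stage = indexingStageReducer(null, updateIndexingStage(INDEXING_STAGE.READY));
stage = indexingStageReducer(stage, updateIndexingStage(INDEXING_STAGE.TRIGGERED));
stage = indexingStageReducer(stage, updateIndexingStage(INDEXING_STAGE.SUCCESS));
console.log(stage); // 'SUCCESS'
```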
@@ -0,0 +1,64 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React, { Fragment } from 'react';
import { GeojsonFileSource } from '../../../shared/layers/sources/client_file_source';
import {
  EuiSpacer,
  EuiPanel,
  EuiButtonEmpty,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';

export const ImportEditor = ({
  clearSource, isIndexingTriggered, ...props
}) => {
  const editorProperties = getEditorProperties({ isIndexingTriggered, ...props });
  const editor = GeojsonFileSource.renderEditor(editorProperties);
  return (
    <Fragment>
      {
        isIndexingTriggered
          ? null
          : (
            <Fragment>
              <EuiButtonEmpty
                size="xs"
                flush="left"
                onClick={clearSource}
                iconType="arrowLeft"
              >
                <FormattedMessage
                  id="xpack.maps.addLayerPanel.changeDataSourceButtonLabel"
                  defaultMessage="Change data source"
                />
              </EuiButtonEmpty>
              <EuiSpacer size="s" />
            </Fragment>
          )
      }
      <EuiPanel>
        {editor}
      </EuiPanel>
    </Fragment>
  );
};

function getEditorProperties({
  inspectorAdapters, onRemove, viewLayer,
  isIndexingTriggered, onIndexReady, importSuccessHandler, importErrorHandler
}) {
  return {
    onPreviewSource: viewLayer,
    inspectorAdapters,
    onRemove,
    importSuccessHandler,
    importErrorHandler,
    isIndexingTriggered,
    addAndViewSource: viewLayer,
    onIndexReady,
  };
}
@@ -6,37 +6,32 @@
 import { connect } from 'react-redux';
 import { AddLayerPanel } from './view';
-import { getFlyoutDisplay, updateFlyout, FLYOUT_STATE } from '../../store/ui';
-import { getSelectedLayer, getMapColors } from '../../selectors/map_selectors';
+import { getFlyoutDisplay, updateFlyout, FLYOUT_STATE, updateIndexingStage,
+  getIndexingStage, INDEXING_STAGE } from '../../store/ui';
+import { getMapColors } from '../../selectors/map_selectors';
 import { getInspectorAdapters } from '../../store/non_serializable_instances';
 import {
   clearTransientLayerStateAndCloseFlyout,
   setTransientLayer,
   addLayer,
   setSelectedLayer,
-  removeTransientLayer
+  removeTransientLayer,
 } from '../../actions/store_actions';

 function mapStateToProps(state = {}) {
-  const selectedLayer = getSelectedLayer(state);
+  const indexingStage = getIndexingStage(state);
   return {
     inspectorAdapters: getInspectorAdapters(state),
     flyoutVisible: getFlyoutDisplay(state) !== FLYOUT_STATE.NONE,
-    hasLayerSelected: !!selectedLayer,
-    isLoading: selectedLayer && selectedLayer.isLayerLoading(),
     mapColors: getMapColors(state),
+    isIndexingTriggered: indexingStage === INDEXING_STAGE.TRIGGERED,
+    isIndexingSuccess: indexingStage === INDEXING_STAGE.SUCCESS,
+    isIndexingReady: indexingStage === INDEXING_STAGE.READY,
   };
 }

 function mapDispatchToProps(dispatch) {
   return {
     closeFlyout: () => {
       dispatch(clearTransientLayerStateAndCloseFlyout());
     },
-    previewLayer: async (layer) => {
-      //this removal always needs to happen prior to adding the new layer
-      //many source editors allow users to modify the settings in the add-source wizard
-      //this triggers a new request for preview. Any existing transient layers need to be cleared before the new one can be added.
+    viewLayer: async layer => {
       await dispatch(setSelectedLayer(null));
       await dispatch(removeTransientLayer());
       dispatch(addLayer(layer.toLayerDescriptor()));

@@ -51,6 +46,8 @@ function mapDispatchToProps(dispatch) {
       dispatch(setTransientLayer(null));
       dispatch(updateFlyout(FLYOUT_STATE.LAYER_PANEL));
     },
+    setIndexingTriggered: () => dispatch(updateIndexingStage(INDEXING_STAGE.TRIGGERED)),
+    resetIndexing: () => dispatch(updateIndexingStage(null)),
   };
 }
@@ -0,0 +1,21 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { connect } from 'react-redux';
import { SourceEditor } from './view';
import { getInspectorAdapters } from '../../../store/non_serializable_instances';

function mapStateToProps(state = {}) {
  return {
    inspectorAdapters: getInspectorAdapters(state),
  };
}

const connectedFlyOut = connect(mapStateToProps)(SourceEditor);
export { connectedFlyOut as SourceEditor };
@@ -0,0 +1,58 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React, { Fragment } from 'react';
import { ALL_SOURCES } from '../../../shared/layers/sources/all_sources';
import {
  EuiSpacer,
  EuiPanel,
  EuiButtonEmpty,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';

export const SourceEditor = ({
  clearSource, sourceType, isIndexingTriggered, inspectorAdapters, previewLayer
}) => {
  const editorProperties = {
    onPreviewSource: previewLayer,
    inspectorAdapters,
  };
  const Source = ALL_SOURCES.find(Source => {
    return Source.type === sourceType;
  });
  if (!Source) {
    throw new Error(`Unexpected source type: ${sourceType}`);
  }
  const editor = Source.renderEditor(editorProperties);
  return (
    <Fragment>
      {
        isIndexingTriggered
          ? null
          : (
            <Fragment>
              <EuiButtonEmpty
                size="xs"
                flush="left"
                onClick={clearSource}
                iconType="arrowLeft"
              >
                <FormattedMessage
                  id="xpack.maps.addLayerPanel.changeDataSourceButtonLabel"
                  defaultMessage="Change data source"
                />
              </EuiButtonEmpty>
              <EuiSpacer size="s" />
            </Fragment>
          )
      }
      <EuiPanel>
        {editor}
      </EuiPanel>
    </Fragment>
  );
};
|
@@ -0,0 +1 @@
@import './source_select';
@@ -0,0 +1,58 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React, { Fragment } from 'react';
import { ALL_SOURCES } from '../../../shared/layers/sources/all_sources';
import {
  EuiTitle,
  EuiSpacer,
  EuiCard,
  EuiIcon,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import _ from 'lodash';

export function SourceSelect({
  updateSourceSelection
}) {

  const sourceCards = ALL_SOURCES.map(Source => {
    const icon = Source.icon
      ? <EuiIcon type={Source.icon} size="l" />
      : null;

    return (
      <Fragment key={Source.type}>
        <EuiSpacer size="s" />
        <EuiCard
          className="mapLayerAddpanel__card"
          title={Source.title}
          icon={icon}
          onClick={() => updateSourceSelection(
            { type: Source.type, isIndexingSource: Source.isIndexingSource })
          }
          description={Source.description}
          layout="horizontal"
          data-test-subj={_.camelCase(Source.title)}
        />
      </Fragment>
    );
  });

  return (
    <Fragment>
      <EuiTitle size="xs">
        <h2>
          <FormattedMessage
            id="xpack.maps.addLayerPanel.chooseDataSourceTitle"
            defaultMessage="Choose data source"
          />
        </h2>
      </EuiTitle>
      {sourceCards}
    </Fragment>
  );
}
@@ -4,167 +4,145 @@
 * you may not use this file except in compliance with the Elastic License.
 */

import React, { Component, Fragment } from 'react';
import { ALL_SOURCES } from '../../shared/layers/sources/all_sources';
import React, { Component } from 'react';
import { SourceSelect } from './source_select/source_select';
import { FlyoutFooter } from './flyout_footer';
import { SourceEditor } from './source_editor';
import { ImportEditor } from './import_editor';
import {
  EuiButton,
  EuiButtonEmpty,
  EuiFlexGroup,
  EuiFlexItem,
  EuiTitle,
  EuiPanel,
  EuiSpacer,
  EuiCard,
  EuiIcon,
  EuiFlyoutHeader,
  EuiFlyoutFooter,
} from '@elastic/eui';
import { FormattedMessage } from '@kbn/i18n/react';
import _ from 'lodash';
import { i18n } from '@kbn/i18n';

export class AddLayerPanel extends Component {

  state = {
    sourceType: null,
    layer: null
    layer: null,
    importView: false,
    layerImportAddReady: false,
  }

  _previewLayer = (source) => {
  componentDidUpdate() {
    if (!this.state.layerImportAddReady && this.props.isIndexingSuccess) {
      this.setState({ layerImportAddReady: true });
    }
  }

  _getPanelDescription() {
    const { sourceType, importView, layerImportAddReady } = this.state;
    let panelDescription;
    if (!sourceType) {
      panelDescription = i18n.translate('xpack.maps.addLayerPanel.selectSource',
        { defaultMessage: 'Select source' });
    } else if (layerImportAddReady || !importView) {
      panelDescription = i18n.translate('xpack.maps.addLayerPanel.addLayer',
        { defaultMessage: 'Add layer' });
    } else {
      panelDescription = i18n.translate('xpack.maps.addLayerPanel.importFile',
        { defaultMessage: 'Import file' });
    }
    return panelDescription;
  }
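The `_getPanelDescription()` branching above can be followed in isolation. A minimal sketch, assuming the `i18n.translate` calls simply yield their `defaultMessage` strings (the function name and plain-object argument here are illustrative, not part of the plugin API):

```javascript
// Hypothetical standalone version of the panel-title decision:
// no source yet -> prompt for one; import source still indexing -> "Import file";
// otherwise -> "Add layer".
function getPanelDescription({ sourceType, importView, layerImportAddReady }) {
  if (!sourceType) {
    return 'Select source';            // nothing chosen yet
  }
  if (layerImportAddReady || !importView) {
    return 'Add layer';                // regular source, or import finished
  }
  return 'Import file';                // import source awaiting the file
}

console.log(getPanelDescription({ sourceType: null }));
// → Select source
console.log(getPanelDescription({
  sourceType: 'GEOJSON_FILE', importView: true, layerImportAddReady: false
}));
// → Import file
```

Note how `layerImportAddReady` flips an import flow back to the ordinary "Add layer" label once indexing succeeds.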

  _viewLayer = async source => {
    if (!source) {
      this.setState({ layer: null });
      this.props.removeTransientLayer();
      return;
    }

    const layerOptions = this.state.layer
      ? { style: this.state.layer.getCurrentStyle().getDescriptor() }
      : {};
    this.setState({
      layer: source.createDefaultLayer(layerOptions, this.props.mapColors)
    },
    () => this.props.previewLayer(this.state.layer));
    const newLayer = source.createDefaultLayer(layerOptions, this.props.mapColors);
    this.setState({ layer: newLayer }, () =>
      this.props.viewLayer(this.state.layer));
  };

  _clearSource = () => {
  _clearLayerData = ({ keepSourceType = false }) => {
    this.setState({
      layer: null,
      sourceType: null
      ...(
        !keepSourceType
          ? { sourceType: null, importView: false }
          : {}
      ),
    });
    this.props.removeTransientLayer();
  }

  _onSourceTypeChange = (sourceType) => {
    this.setState({ sourceType });
  _onSourceSelectionChange = ({ type, isIndexingSource }) => {
    this.setState({ sourceType: type, importView: isIndexingSource });
  }

  _renderNextBtn() {
    if (!this.state.sourceType) {
      return null;
  _layerAddHandler = () => {
    const { isIndexingTriggered, setIndexingTriggered, selectLayerAndAdd,
      resetIndexing } = this.props;
    const layerSource = this.state.layer.getSource();
    const boolIndexLayer = layerSource.shouldBeIndexed();
    this.setState({ layer: null });
    if (boolIndexLayer && !isIndexingTriggered) {
      setIndexingTriggered();
    } else {
      selectLayerAndAdd();
      if (this.state.importView) {
        this.setState({
          layerImportAddReady: false,
        });
        resetIndexing();
      }
    }

    const { hasLayerSelected, isLoading, selectLayerAndAdd } = this.props;
    return (
      <EuiButton
        disabled={!hasLayerSelected}
        isLoading={hasLayerSelected && isLoading}
        iconSide="right"
        iconType={'sortRight'}
        onClick={() => {
          this.setState({ layer: null });
          selectLayerAndAdd();
        }}
        fill
      >
        <FormattedMessage
          id="xpack.maps.addLayerPanel.addLayerButtonLabel"
          defaultMessage="Add layer"
        />
      </EuiButton>
    );
  }

  _renderSourceCards() {
    return ALL_SOURCES.map(Source => {
      const icon = Source.icon
        ? <EuiIcon type={Source.icon} size="l" />
        : null;
  _renderAddLayerPanel() {
    const { sourceType, importView } = this.state;
    if (!sourceType) {
      return (
        <Fragment key={Source.type}>
          <EuiSpacer size="s" />
          <EuiCard
            className="mapLayerAddpanel__card"
            title={Source.title}
            icon={icon}
            onClick={() => this._onSourceTypeChange(Source.type)}
            description={Source.description}
            layout="horizontal"
            data-test-subj={_.camelCase(Source.title)}
          />
        </Fragment>
        <SourceSelect updateSourceSelection={this._onSourceSelectionChange} />
      );
    });
  }

  _renderSourceSelect() {
    }
    if (importView) {
      return (
        <ImportEditor
          clearSource={this._clearLayerData}
          viewLayer={this._viewLayer}
          onRemove={() => this._clearLayerData({ keepSourceType: true })}
        />
      );
    }
    return (
      <Fragment>
        <EuiTitle size="xs">
          <h2>
            <FormattedMessage
              id="xpack.maps.addLayerPanel.chooseDataSourceTitle"
              defaultMessage="Choose data source"
            />
          </h2>
        </EuiTitle>
        {this._renderSourceCards()}
      </Fragment>
      <SourceEditor
        clearSource={this._clearLayerData}
        sourceType={sourceType}
        previewLayer={this._viewLayer}
      />
    );
  }

  _renderSourceEditor() {
    const editorProperties = {
      onPreviewSource: this._previewLayer,
      inspectorAdapters: this.props.inspectorAdapters,
    };
  _renderFooter(buttonDescription) {
    const { importView, layer } = this.state;
    const { isIndexingReady, isIndexingSuccess } = this.props;

    const Source = ALL_SOURCES.find((Source) => {
      return Source.type === this.state.sourceType;
    });
    if (!Source) {
      throw new Error(`Unexpected source type: ${this.state.sourceType}`);
    }
    const buttonEnabled = importView
      ? isIndexingReady || isIndexingSuccess
      : !!layer;

    return (
      <Fragment>
        <EuiButtonEmpty
          size="xs"
          flush="left"
          onClick={this._clearSource}
          iconType="arrowLeft"
        >
          <FormattedMessage
            id="xpack.maps.addLayerPanel.changeDataSourceButtonLabel"
            defaultMessage="Change data source"
          />
        </EuiButtonEmpty>
        <EuiSpacer size="s" />
        <EuiPanel>
          {Source.renderEditor(editorProperties)}
        </EuiPanel>
      </Fragment>
      <FlyoutFooter
        showNextButton={!!this.state.sourceType}
        disableNextButton={!buttonEnabled}
        onClick={this._layerAddHandler}
        nextButtonText={buttonDescription}
      />
    );
  }

  _renderAddLayerForm() {
    if (!this.state.sourceType) {
      return this._renderSourceSelect();
    }

    return this._renderSourceEditor();
  }

  _renderFlyout() {
    const panelDescription = this._getPanelDescription();

    return (
      <EuiFlexGroup
        direction="column"
@@ -173,39 +151,17 @@ export class AddLayerPanel extends Component {
        <EuiFlyoutHeader hasBorder className="mapLayerPanel__header">
          <EuiTitle size="s">
            <h2>
              <FormattedMessage
                id="xpack.maps.addLayerPanel.panelTitle"
                defaultMessage="Add layer"
              />
              {panelDescription}
            </h2>
          </EuiTitle>
        </EuiFlyoutHeader>

        <div className="mapLayerPanel__body" data-test-subj="layerAddForm">
          <div className="mapLayerPanel__bodyOverflow">
            {this._renderAddLayerForm()}
            { this._renderAddLayerPanel() }
          </div>
        </div>

        <EuiFlyoutFooter className="mapLayerPanel__footer">
          <EuiFlexGroup justifyContent="spaceBetween" responsive={false}>
            <EuiFlexItem grow={false}>
              <EuiButtonEmpty
                onClick={this.props.closeFlyout}
                flush="left"
                data-test-subj="layerAddCancelButton"
              >
                <FormattedMessage
                  id="xpack.maps.addLayerPanel.cancelButtonLabel"
                  defaultMessage="Cancel"
                />
              </EuiButtonEmpty>
            </EuiFlexItem>
            <EuiFlexItem grow={false}>
              {this._renderNextBtn()}
            </EuiFlexItem>
          </EuiFlexGroup>
        </EuiFlyoutFooter>
        { this._renderFooter(panelDescription) }
      </EuiFlexGroup>
    );
  }
@@ -99,15 +99,6 @@ export const getMapCenter = ({ map }) => map.mapState.center ?

export const getMouseCoordinates = ({ map }) => map.mapState.mouseCoordinates;

export const getMapColors = ({ map }) => {
  return map.layerList.reduce((accu, layer) => {
    // This will evolve as color options are expanded
    const color = _.get(layer, 'style.properties.fillColor.options.color');
    if (color) accu.push(color);
    return accu;
  }, []);
};

export const getTimeFilters = ({ map }) => map.mapState.timeFilters ?
  map.mapState.timeFilters : timefilter.getTime();
@@ -167,6 +158,19 @@ export const getSelectedLayer = createSelector(
    return layerList.find(layer => layer.getId() === selectedLayerId);
  });

export const getMapColors = createSelector(
  getTransientLayerId,
  getLayerListRaw,
  (transientLayerId, layerList) => layerList.reduce((accu, layer) => {
    if (layer.id === transientLayerId) {
      return accu;
    }
    const color = _.get(layer, 'style.properties.fillColor.options.color');
    if (color) accu.push(color);
    return accu;
  }, [])
);
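The reduce inside the `getMapColors` selector can be exercised without reselect or lodash. A sketch of the same logic, with `_.get` replaced by optional chaining and illustrative layer objects (the helper name and sample data are assumptions, not plugin code):

```javascript
// Collect the fill colors already in use on the map, skipping the
// transient (preview) layer so its color isn't treated as taken.
function collectMapColors(transientLayerId, layerList) {
  return layerList.reduce((accu, layer) => {
    if (layer.id === transientLayerId) {
      return accu; // transient layer shouldn't reserve a color
    }
    const color = layer?.style?.properties?.fillColor?.options?.color;
    if (color) accu.push(color);
    return accu;
  }, []);
}

const layers = [
  { id: 'a', style: { properties: { fillColor: { options: { color: '#e6194b' } } } } },
  { id: 'tmp', style: { properties: { fillColor: { options: { color: '#3cb44b' } } } } },
  { id: 'b', style: {} }, // no fill color configured
];
console.log(collectMapColors('tmp', layers)); // → [ '#e6194b' ]
```

Excluding the transient layer is the point of the refactor in this hunk: a layer still being previewed should not influence the default color chosen for the next layer.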

export const getSelectedLayerJoinDescriptors = createSelector(
  getSelectedLayer,
  (selectedLayer) => {
@@ -6,6 +6,7 @@

import { EMSFileSource } from './ems_file_source';
import { GeojsonFileSource } from './client_file_source';
import { KibanaRegionmapSource } from './kibana_regionmap_source';
import { XYZTMSSource } from './xyz_tms_source';
import { EMSTMSSource } from './ems_tms_source';
@@ -16,6 +17,7 @@ import { ESSearchSource } from './es_search_source';

export const ALL_SOURCES = [
  GeojsonFileSource,
  ESSearchSource,
  ESGeoGridSource,
  EMSFileSource,
@@ -0,0 +1,29 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import React from 'react';
import { JsonUploadAndParse } from '../../../../../../file_upload/public';

export function ClientFileCreateSourceEditor({
  previewGeojsonFile,
  isIndexingTriggered = false,
  onIndexingComplete,
  onRemove,
  onIndexReady,
}) {
  return (
    <JsonUploadAndParse
      appName={'Maps'}
      isIndexingTriggered={isIndexingTriggered}
      onFileUpload={previewGeojsonFile}
      onFileRemove={onRemove}
      onIndexReady={onIndexReady}
      transformDetails={'geo'}
      onIndexingComplete={onIndexingComplete}
    />
  );
}
@@ -0,0 +1,127 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

import { AbstractVectorSource } from '../vector_source';
import React from 'react';
import { ES_GEO_FIELD_TYPE, GEOJSON_FILE } from '../../../../../common/constants';
import { ClientFileCreateSourceEditor } from './create_client_file_source_editor';
import { ESSearchSource } from '../es_search_source';
import uuid from 'uuid/v4';
import _ from 'lodash';

export class GeojsonFileSource extends AbstractVectorSource {

  static type = GEOJSON_FILE;
  static title = 'Upload GeoJSON vector file';
  static description = 'Upload a GeoJSON file and index in Elasticsearch';
  static icon = 'importAction';
  static isIndexingSource = true;

  static createDescriptor(geoJson, name) {
    // Wrap feature as feature collection if needed
    const featureCollection = (geoJson.type === 'Feature')
      ? {
        type: 'FeatureCollection',
        features: [{ ...geoJson }]
      }
      : geoJson;

    return {
      type: GeojsonFileSource.type,
      featureCollection,
      name
    };
  }
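The Feature-wrapping in `createDescriptor` normalizes whatever the user uploads so the rest of the source only ever sees a `FeatureCollection`. A standalone sketch, with the string `'GEOJSON_FILE'` standing in for the imported constant:

```javascript
// Hypothetical free-function version of GeojsonFileSource.createDescriptor:
// a bare Feature is wrapped in a one-element FeatureCollection, while an
// existing FeatureCollection passes through unchanged.
function createDescriptor(geoJson, name) {
  const featureCollection = (geoJson.type === 'Feature')
    ? { type: 'FeatureCollection', features: [{ ...geoJson }] }
    : geoJson;
  return { type: 'GEOJSON_FILE', featureCollection, name };
}

const single = { type: 'Feature', geometry: { type: 'Point', coordinates: [0, 0] } };
const fromFeature = createDescriptor(single, 'my-layer');
console.log(fromFeature.featureCollection.type); // → FeatureCollection

const collection = { type: 'FeatureCollection', features: [single] };
const fromCollection = createDescriptor(collection, 'my-layer');
console.log(fromCollection.featureCollection === collection); // → true
```

Spreading the feature (`{ ...geoJson }`) keeps the descriptor from sharing the caller's object when wrapping.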

  static viewIndexedData = (
    addAndViewSource, inspectorAdapters, importSuccessHandler, importErrorHandler
  ) => {
    return (indexResponses = {}) => {
      const { indexDataResp, indexPatternResp } = indexResponses;
      if (!(indexDataResp && indexDataResp.success) ||
        !(indexPatternResp && indexPatternResp.success)) {
        importErrorHandler(indexResponses);
        return;
      }
      const { fields, id } = indexPatternResp;
      const geoFieldArr = fields.filter(
        field => Object.values(ES_GEO_FIELD_TYPE).includes(field.type)
      );
      const geoField = _.get(geoFieldArr, '[0].name');
      const indexPatternId = id;
      if (!indexPatternId || !geoField) {
        addAndViewSource(null);
      } else {
        const source = new ESSearchSource({
          id: uuid(),
          indexPatternId,
          geoField,
        }, inspectorAdapters);
        addAndViewSource(source);
        importSuccessHandler(indexResponses);
      }
    };
  };

  static previewGeojsonFile = (onPreviewSource, inspectorAdapters) => {
    return (geojsonFile, name) => {
      if (!geojsonFile) {
        onPreviewSource(null);
        return;
      }
      const sourceDescriptor = GeojsonFileSource.createDescriptor(geojsonFile, name);
      const source = new GeojsonFileSource(sourceDescriptor, inspectorAdapters);
      onPreviewSource(source);
    };
  };

  static renderEditor({
    onPreviewSource, inspectorAdapters, addAndViewSource, isIndexingTriggered,
    onRemove, onIndexReady, importSuccessHandler, importErrorHandler
  }) {
    return (
      <ClientFileCreateSourceEditor
        previewGeojsonFile={
          GeojsonFileSource.previewGeojsonFile(
            onPreviewSource,
            inspectorAdapters
          )
        }
        isIndexingTriggered={isIndexingTriggered}
        onIndexingComplete={
          GeojsonFileSource.viewIndexedData(
            addAndViewSource,
            inspectorAdapters,
            importSuccessHandler,
            importErrorHandler,
          )
        }
        onRemove={onRemove}
        onIndexReady={onIndexReady}
      />
    );
  }

  async getGeoJsonWithMeta() {
    return {
      data: this._descriptor.featureCollection,
      meta: {}
    };
  }

  async getDisplayName() {
    return this._descriptor.name;
  }

  canFormatFeatureProperties() {
    return true;
  }

  shouldBeIndexed() {
    return GeojsonFileSource.isIndexingSource;
  }

}
@@ -0,0 +1,7 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

export { GeojsonFileSource } from './geojson_file_source';
@@ -8,6 +8,8 @@ import { copyPersistentState } from '../../../store/util';

export class AbstractSource {

  static isIndexingSource = false;

  static renderEditor() {
    throw new Error('Must implement Source.renderEditor');
  }
@@ -109,6 +111,10 @@ export class AbstractSource {
    return false;
  }

  shouldBeIndexed() {
    return AbstractSource.isIndexingSource;
  }

  supportsElasticsearchFilters() {
    return false;
  }
19
x-pack/plugins/maps/public/shared/layers/util/import_file.js
Normal file
@@ -0,0 +1,19 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the Elastic License;
 * you may not use this file except in compliance with the Elastic License.
 */

export async function importFile(file, FileReader = window.FileReader) {
  return new Promise((resolve, reject) => {
    const fr = new FileReader();
    fr.onload = ({ target: { result } }) => {
      try {
        resolve(JSON.parse(result));
      } catch (e) {
        reject(e);
      }
    };
    fr.readAsText(file);
  });
}
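Because `importFile` takes the `FileReader` constructor as an injectable parameter, it can be exercised outside a browser. A sketch using a hypothetical synchronous stub (the stub class is an assumption for illustration, not part of the plugin):

```javascript
// Same Promise wrapper as importFile above, reproduced so the snippet
// is self-contained.
async function importFile(file, FileReader) {
  return new Promise((resolve, reject) => {
    const fr = new FileReader();
    fr.onload = ({ target: { result } }) => {
      try {
        resolve(JSON.parse(result)); // resolve with the parsed JSON
      } catch (e) {
        reject(e);                   // reject on malformed JSON
      }
    };
    fr.readAsText(file);
  });
}

// Hypothetical stub: "reads" the string it is handed and fires onload
// immediately. onload is assigned before readAsText is called, so a
// synchronous stub is safe here.
class FileReaderStub {
  readAsText(text) {
    this.onload({ target: { result: text } });
  }
}

importFile('{"type":"FeatureCollection","features":[]}', FileReaderStub)
  .then(geojson => console.log(geojson.type)); // → FeatureCollection
```

The injection point also explains the `FileReader = window.FileReader` default: production code passes nothing, tests pass a stub.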
@@ -110,7 +110,7 @@ const INITIAL_STATE = {
  selectedLayerId: null,
  __transientLayerId: null,
  layerList: [],
  waitingForMapReadyLayerList: []
  waitingForMapReadyLayerList: [],
};
@@ -13,6 +13,7 @@ export const SET_FILTERABLE = 'IS_FILTERABLE';
export const SET_OPEN_TOC_DETAILS = 'SET_OPEN_TOC_DETAILS';
export const SHOW_TOC_DETAILS = 'SHOW_TOC_DETAILS';
export const HIDE_TOC_DETAILS = 'HIDE_TOC_DETAILS';
export const UPDATE_INDEXING_STAGE = 'UPDATE_INDEXING_STAGE';

export const FLYOUT_STATE = {
  NONE: 'NONE',
@@ -20,6 +21,13 @@ export const FLYOUT_STATE = {
  ADD_LAYER_WIZARD: 'ADD_LAYER_WIZARD'
};

export const INDEXING_STAGE = {
  READY: 'READY',
  TRIGGERED: 'TRIGGERED',
  SUCCESS: 'SUCCESS',
  ERROR: 'ERROR',
};

export const DEFAULT_IS_LAYER_TOC_OPEN = true;

const INITIAL_STATE = {
@@ -31,6 +39,7 @@ const INITIAL_STATE = {
  // storing TOC detail visibility outside of map.layerList because its UI state and not map rendering state.
  // This also makes for easy read/write access for embeddables.
  openTOCDetails: [],
  importIndexingStage: null
};

// Reducer
@@ -67,6 +76,8 @@ export function ui(state = INITIAL_STATE, action) {
        return layerId !== action.layerId;
      })
    };
    case UPDATE_INDEXING_STAGE:
      return { ...state, importIndexingStage: action.stage };
    default:
      return state;
  }
@@ -142,6 +153,13 @@ export function hideTOCDetails(layerId) {
  };
}

export function updateIndexingStage(stage) {
  return {
    type: UPDATE_INDEXING_STAGE,
    stage,
  };
}
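The new `UPDATE_INDEXING_STAGE` action, the `INDEXING_STAGE` constants, and the reducer case together form a small state machine. A minimal self-contained sketch of just that slice of the `ui` reducer, isolated so the TRIGGERED → SUCCESS transition can be followed:

```javascript
// Cut-down ui reducer: only the indexing-stage slice from the hunks above.
const UPDATE_INDEXING_STAGE = 'UPDATE_INDEXING_STAGE';
const INDEXING_STAGE = {
  READY: 'READY',
  TRIGGERED: 'TRIGGERED',
  SUCCESS: 'SUCCESS',
  ERROR: 'ERROR',
};

function ui(state = { importIndexingStage: null }, action = {}) {
  switch (action.type) {
    case UPDATE_INDEXING_STAGE:
      return { ...state, importIndexingStage: action.stage };
    default:
      return state;
  }
}

const updateIndexingStage = stage => ({ type: UPDATE_INDEXING_STAGE, stage });

let state = ui(undefined, {});                                  // stage: null
state = ui(state, updateIndexingStage(INDEXING_STAGE.TRIGGERED)); // indexing kicked off
state = ui(state, updateIndexingStage(INDEXING_STAGE.SUCCESS));   // import finished
console.log(state.importIndexingStage); // → SUCCESS
```

In the plugin, `setIndexingTriggered` and `resetIndexing` in the add-layer container dispatch exactly this action (with `INDEXING_STAGE.TRIGGERED` and `null` respectively).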

// Selectors
export const getFlyoutDisplay = ({ ui }) => ui && ui.flyoutDisplay
  || INITIAL_STATE.flyoutDisplay;
@@ -151,3 +169,4 @@ export const getOpenTOCDetails = ({ ui }) => ui.openTOCDetails;
export const getIsFullScreen = ({ ui }) => ui.isFullScreen;
export const getIsReadOnly = ({ ui }) => ui.isReadOnly;
export const getIsFilterable = ({ ui }) => ui.isFilterable;
export const getIndexingStage = ({ ui }) => ui.importIndexingStage;
@@ -5244,11 +5244,8 @@
  "xpack.telemetry.welcomeBanner.telemetryConfigDetailsDescription.exampleLinkText": "例",
  "xpack.telemetry.welcomeBanner.telemetryConfigDetailsDescription.telemetryPrivacyStatementLinkText": "遠隔測定に関するプライバシーステートメント",
  "xpack.telemetry.welcomeBanner.yesButtonLabel": "はい",
  "xpack.maps.addLayerPanel.addLayerButtonLabel": "レイヤーを追加",
  "xpack.maps.addLayerPanel.cancelButtonLabel": "キャンセル",
  "xpack.maps.addLayerPanel.changeDataSourceButtonLabel": "データソースを変更",
  "xpack.maps.addLayerPanel.chooseDataSourceTitle": "データソースの選択",
  "xpack.maps.addLayerPanel.panelTitle": "レイヤーの追加",
  "xpack.maps.appDescription": "マップアプリケーション",
  "xpack.maps.appTitle": "マップ",
  "xpack.maps.badge.readOnly.text": "読み込み専用",
@@ -5271,7 +5268,6 @@
  "xpack.maps.inspector.mapDetailsViewHelpText": "マップステータスを表示します",
  "xpack.maps.inspector.mapDetailsViewTitle": "マップの詳細",
  "xpack.maps.inspector.zoomLabel": "ズーム:",
  "xpack.maps.layerControl.addLayerButtonLabel": "レイヤーを追加",
  "xpack.maps.layerControl.closeLayerTOCButtonAriaLabel": "レイヤーパネルを畳む",
  "xpack.maps.layerControl.layersTitle": "レイヤー",
  "xpack.maps.layerControl.openLayerTOCButtonAriaLabel": "レイヤーパネルを拡張",
@@ -5246,11 +5246,8 @@
  "xpack.main.welcomeBanner.licenseIsExpiredDescription": "联系您的管理员或直接{updateYourLicenseLinkText}。",
  "xpack.main.welcomeBanner.licenseIsExpiredDescription.updateYourLicenseLinkText": "更新您的许可",
  "xpack.main.welcomeBanner.licenseIsExpiredTitle": "您的{licenseType}许可已过期",
  "xpack.maps.addLayerPanel.addLayerButtonLabel": "添加图层",
  "xpack.maps.addLayerPanel.cancelButtonLabel": "取消",
  "xpack.maps.addLayerPanel.changeDataSourceButtonLabel": "更改数据源",
  "xpack.maps.addLayerPanel.chooseDataSourceTitle": "选择数据源",
  "xpack.maps.addLayerPanel.panelTitle": "添加图层",
  "xpack.maps.appDescription": "地图应用程序",
  "xpack.maps.appTitle": "Maps",
  "xpack.maps.badge.readOnly.text": "只读",
@@ -81,6 +81,13 @@
      }
    }
  },
  "file-upload-telemetry": {
    "properties": {
      "filesUploadedTotalCount": {
        "type": "long"
      }
    }
  },
  "graph-workspace": {
    "properties": {
      "description": {
15
yarn.lock
@@ -12867,6 +12867,16 @@ geojson-random@^0.2.2:
  resolved "https://registry.yarnpkg.com/geojson-random/-/geojson-random-0.2.2.tgz#ab4838f126adc5e16f8f94e655def820f9119dbc"
  integrity sha1-q0g48SatxeFvj5TmVd74IPkRnbw=

geojson-rewind@^0.3.1:
  version "0.3.1"
  resolved "https://registry.yarnpkg.com/geojson-rewind/-/geojson-rewind-0.3.1.tgz#22240797c847cc2f0c1d313e4aa0c915afa7f29d"
  integrity sha1-IiQHl8hHzC8MHTE+SqDJFa+n8p0=
  dependencies:
    "@mapbox/geojson-area" "0.2.2"
    concat-stream "~1.6.0"
    minimist "1.2.0"
    sharkdown "^0.1.0"

geojson-vt@^3.2.1:
  version "3.2.1"
  resolved "https://registry.yarnpkg.com/geojson-vt/-/geojson-vt-3.2.1.tgz#f8adb614d2c1d3f6ee7c4265cad4bbf3ad60c8b7"
@@ -16963,6 +16973,11 @@ jsts@1.1.2:
  resolved "https://registry.yarnpkg.com/jsts/-/jsts-1.1.2.tgz#d205d2cc8393081d9e484ae36282110695edc230"
  integrity sha1-0gXSzIOTCB2eSErjYoIRBpXtwjA=

jsts@^2.0.4:
  version "2.0.4"
  resolved "https://registry.yarnpkg.com/jsts/-/jsts-2.0.4.tgz#5fb91aaea070ebedf0121454ed90b074027e8241"
  integrity sha512-YCfCuEgG9ynMFazjIH0YAtliFlaIcYmRqBY6EQP+VjNDEjuu4Il+91RDQWP4hAS7TXOeN/NYF/OL7Fmvg69pKg==

jsx-ast-utils@^2.0.1:
  version "2.0.1"
  resolved "https://registry.yarnpkg.com/jsx-ast-utils/-/jsx-ast-utils-2.0.1.tgz#e801b1b39985e20fffc87b40e3748080e2dcac7f"