[maps] shapefile import (#123764)

* [maps] shapefile importer

* render importer editor

* shapefile_editor

* break geojson_importer into abstract class

* start shapefile_importer _readNext

* parse shapefile

* getProgress

* comments

* set table row count

* hide shapefile editor once all files selected

* i18n fixes

* validate sidecar file names match .shp name

* rename JsonUploadAndParse to GeoUploadWizard

* eslint

* tslint

* fix getFileNameWithoutExt

* backout i18n changes

* reset i18n files to main

* revert to main zh-CN.json

* i18n updates

* use geoJsonCleanAndValidate to clean features from all geo importers

* update docs

* doc refinement

* tslint fix for loaders

* eslint

* shapefile upload functional test

* geoFileUploadPageObject

* add geojson functional test with geoFileUpload page object

* replace old workflow functional tests with new ones based on geoFileUpload page object

* tslint and eslint

* eslint

* fix jest test

* use smaller chunks to avoid timeout in CI

* try without geojson test

* try small geojson file in CI

* point.json

* do not use copy to clipboard, does not work in CI

* eslint

* more linting

* add retry

* one more thing for functional tests

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>
Nathan Reese 2022-02-01 11:06:49 -07:00 committed by GitHub
parent 6e7fcfe181
commit 790089e18f
61 changed files with 1352 additions and 806 deletions


@@ -6,7 +6,7 @@ To import geospatial data into the Elastic Stack, the data must be indexed as {
Geospatial data comes in many formats.
Choose an import tool based on the format of your geospatial data.
TIP: When you upload GeoJSON or delimited files in {kib}, there is a file size
TIP: When you upload files in {kib}, there is a file size
limit, which is configurable in <<fileupload-maxfilesize,Advanced Settings>>.
[discrete]
@@ -19,7 +19,7 @@ spaces in **{stack-manage-app}** in {kib}. For more information, see
{ref}/security-privileges.html[Security privileges],
<<kibana-privileges, {kib} privileges>>, and <<kibana-role-management, {kib} role management>>.
To upload GeoJSON files and draw features in {kib} with *Maps*, you must have:
To upload GeoJSON files and shapefiles, and to draw features in {kib} with *Maps*, you must have:
* The `all` {kib} privilege for *Maps*
* The `all` {kib} privilege for *{ipm-app}*
@@ -50,12 +50,28 @@ On the {kib} home page, you can upload a file and import it into an {es} index w
[discrete]
=== Upload a GeoJSON file
*Upload GeoJSON* indexes GeoJSON features as a `geo_point` or `geo_shape`.
*Upload file* indexes GeoJSON features in {es}, creating a document for each feature.
NOTE: GeoJSON feature coordinates must be in EPSG:4326 coordinate reference system.
. <<maps-create, Create a new map>>.
. Click *Add layer*.
. Select *Upload GeoJSON*.
. Use the file chooser to select a GeoJSON file.
. Select *Upload file*.
. Use the file chooser to select a GeoJSON file with the extension `.json` or `.geojson`.
. Click *Import file*.
[discrete]
=== Upload a shapefile
*Upload file* indexes shapefile features in {es}, creating a document for each feature.
. <<maps-create, Create a new map>>.
. Click *Add layer*.
. Select *Upload file*.
. Use the file chooser to select the `.shp` file from your shapefile folder.
. Use the `.dbf` file chooser to select the `.dbf` file from your shapefile folder.
. Use the `.prj` file chooser to select the `.prj` file from your shapefile folder.
. Use the `.shx` file chooser to select the `.shx` file from your shapefile folder.
. Click *Import file*.
[discrete]


@@ -12,7 +12,7 @@ Create beautiful maps from your geographical data. With **Maps**, you can:
* Build maps with multiple layers and indices.
* Animate spatial temporal data.
* Upload GeoJSON.
* Upload GeoJSON files and shapefiles.
* Embed your map in dashboards.
* Symbolize features using data values.
* Focus on only the data that's important to you.
@@ -50,8 +50,8 @@ This animated map uses the time slider to show Portland buses over a period of 1
image::maps/images/timeslider.gif[]
[float]
=== Upload GeoJSON
Use **Maps** to drag and drop your GeoJSON points, lines, and polygons into Elasticsearch, and then use them as layers in your map.
=== Upload GeoJSON files and shapefiles
Use **Maps** to drag and drop your GeoJSON and shapefile data into Elasticsearch, and then use them as layers in your map.
[float]
=== Embed your map in dashboards


@@ -47,7 +47,7 @@ image::maps/images/fu_gs_new_england_map.png[]
For each GeoJSON file you downloaded, complete the following steps:
. Click *Add layer*.
. From the list of layer types, click *Upload GeoJSON*.
. From the list of layer types, click *Upload file*.
. Using the File Picker, upload the GeoJSON file.
+
Depending on the geometry type of your features, this will


@@ -42,16 +42,16 @@ CSAs generally share the same telecom providers and ad networks. New fast food f
To get the CSA boundary data:
. Download the https://www.census.gov/geographies/mapping-files/time-series/geo/carto-boundary-file.html[Cartographic Boundary shapefile (.shp)] from the Census Bureau's website.
. To use the data in Kibana, convert it to GeoJSON format. Follow this https://gist.github.com/YKCzoli/b7f5ff0e0f641faba0f47fa5d16c4d8d[helpful tutorial] to use QGIS to convert the Cartographic Boundary shapefile to GeoJSON. Or, download a https://raw.githubusercontent.com/elastic/examples/master/blog/reverse-geocoding/csba.json[prebuilt GeoJSON version].
Once you have your GeoJSON file:
. Open the main menu, and click *Maps*.
. Download the "Combined Statistical Areas (CSAs)" zip file from the https://www.census.gov/geographies/mapping-files/time-series/geo/carto-boundary-file.html[Census Bureau's website].
. Uncompress the zip file.
. In Kibana, open the main menu, and click *Maps*.
. Click *Create map*.
. Click *Add layer*.
. Click *Upload GeoJSON*.
. Use the file chooser to import the CSA GeoJSON file.
. Click *Upload file*.
. Use the file chooser to select the `.shp` file from the CSA shapefile folder.
. Use the `.dbf` file chooser to select the `.dbf` file from the CSA shapefile folder.
. Use the `.prj` file chooser to select the `.prj` file from the CSA shapefile folder.
. Use the `.shx` file chooser to select the `.shx` file from the CSA shapefile folder.
. Set index name to *csa* and click *Import file*.
. When importing is complete, click *Add as document layer*.
. Add Tooltip fields:


@@ -176,6 +176,7 @@
"@kbn/utils": "link:bazel-bin/packages/kbn-utils",
"@loaders.gl/core": "^2.3.1",
"@loaders.gl/json": "^2.3.1",
"@loaders.gl/shapefile": "^2.3.1",
"@mapbox/geojson-rewind": "^0.5.0",
"@mapbox/mapbox-gl-draw": "1.3.0",
"@mapbox/mapbox-gl-rtl-text": "0.2.3",


@@ -10,15 +10,12 @@ import { EuiLoadingContent } from '@elastic/eui';
import { FileUploadComponentProps, lazyLoadModules } from '../lazy_load_bundle';
interface State {
JsonUploadAndParse: React.ComponentType<FileUploadComponentProps> | null;
GeoUploadWizard: React.ComponentType<FileUploadComponentProps> | null;
}
export class JsonUploadAndParseAsyncWrapper extends React.Component<
FileUploadComponentProps,
State
> {
export class GeoUploadWizardAsyncWrapper extends React.Component<FileUploadComponentProps, State> {
state: State = {
JsonUploadAndParse: null,
GeoUploadWizard: null,
};
private _isMounted = false;
@@ -27,7 +24,7 @@ export class JsonUploadAndParseAsyncWrapper extends React.Component<
lazyLoadModules().then((modules) => {
if (this._isMounted) {
this.setState({
JsonUploadAndParse: modules.JsonUploadAndParse,
GeoUploadWizard: modules.GeoUploadWizard,
});
}
});
@@ -38,11 +35,7 @@
}
render() {
const { JsonUploadAndParse } = this.state;
return JsonUploadAndParse ? (
<JsonUploadAndParse {...this.props} />
) : (
<EuiLoadingContent lines={3} />
);
const { GeoUploadWizard } = this.state;
return GeoUploadWizard ? <GeoUploadWizard {...this.props} /> : <EuiLoadingContent lines={3} />;
}
}


@@ -9,11 +9,11 @@ import { lazyLoadModules } from '../lazy_load_bundle';
import type { IImporter, ImportFactoryOptions } from '../importer';
import type { HasImportPermission, FindFileStructureResponse } from '../../common/types';
import type { getMaxBytes, getMaxBytesFormatted } from '../importer/get_max_bytes';
import { JsonUploadAndParseAsyncWrapper } from './json_upload_and_parse_async_wrapper';
import { GeoUploadWizardAsyncWrapper } from './geo_upload_wizard_async_wrapper';
import { IndexNameFormAsyncWrapper } from './index_name_form_async_wrapper';
export interface FileUploadStartApi {
FileUploadComponent: typeof JsonUploadAndParseAsyncWrapper;
FileUploadComponent: typeof GeoUploadWizardAsyncWrapper;
IndexNameFormComponent: typeof IndexNameFormAsyncWrapper;
importerFactory: typeof importerFactory;
getMaxBytes: typeof getMaxBytes;
@@ -30,7 +30,7 @@ export interface GetTimeFieldRangeResponse {
end: { epoch: number; string: string };
}
export const FileUploadComponent = JsonUploadAndParseAsyncWrapper;
export const FileUploadComponent = GeoUploadWizardAsyncWrapper;
export const IndexNameFormComponent = IndexNameFormAsyncWrapper;
export async function importerFactory(


@@ -0,0 +1,173 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React, { Component } from 'react';
import { EuiFilePicker, EuiFormRow } from '@elastic/eui';
import { i18n } from '@kbn/i18n';
import { MB } from '../../../common/constants';
import { getMaxBytesFormatted } from '../../importer/get_max_bytes';
import { GEO_FILE_TYPES, geoImporterFactory } from '../../importer/geo';
import type { GeoFileImporter, GeoFilePreview } from '../../importer/geo';
export type OnFileSelectParameters = GeoFilePreview & {
indexName: string;
importer: GeoFileImporter;
};
interface Props {
onSelect: (onFileSelectParameters: OnFileSelectParameters) => void;
onClear: () => void;
}
interface State {
defaultIndexName: string | null;
error: string | null;
isLoadingPreview: boolean;
importer: GeoFileImporter | null;
previewSummary: string | null;
}
export class GeoFilePicker extends Component<Props, State> {
private _isMounted = false;
state: State = {
defaultIndexName: null,
error: null,
isLoadingPreview: false,
importer: null,
previewSummary: null,
};
async componentDidMount() {
this._isMounted = true;
}
componentWillUnmount() {
this._isMounted = false;
}
_onFileSelect = (files: FileList | null) => {
this.props.onClear();
this.setState({
defaultIndexName: null,
error: null,
isLoadingPreview: false,
importer: null,
previewSummary: null,
});
if (files && files.length) {
const file = files[0];
try {
const importer = geoImporterFactory(file);
this.setState(
{
defaultIndexName: file.name.split('.')[0].toLowerCase(),
importer,
},
this._loadFilePreview
);
} catch (error) {
this.setState({ error: error.message });
}
}
};
_loadFilePreview = async () => {
if (!this.state.importer || !this.state.importer.canPreview()) {
return;
}
this.setState({ isLoadingPreview: true });
let previewError: string | null = null;
let preview: GeoFilePreview | null = null;
try {
preview = await this.state.importer.previewFile(10000, MB * 3);
if (preview.features.length === 0) {
previewError = i18n.translate('xpack.fileUpload.geoFilePicker.noFeaturesDetected', {
defaultMessage: 'No features found in selected file.',
});
}
} catch (error) {
previewError = error.message;
}
if (!this._isMounted) {
return;
}
this.setState({
error: previewError,
isLoadingPreview: false,
previewSummary:
!previewError && preview
? i18n.translate('xpack.fileUpload.geoFilePicker.previewSummary', {
defaultMessage: 'Previewing {numFeatures} features, {previewCoverage}% of file.',
values: {
numFeatures: preview.features.length,
previewCoverage: preview.previewCoverage,
},
})
: null,
});
if (preview) {
this.props.onSelect({
...preview,
importer: this.state.importer,
indexName: this.state.defaultIndexName ? this.state.defaultIndexName : 'features',
});
}
};
_renderHelpText() {
return this.state.previewSummary !== null ? (
this.state.previewSummary
) : (
<span>
{i18n.translate('xpack.fileUpload.geoFilePicker.acceptedFormats', {
defaultMessage: 'Formats accepted: {fileTypes}',
values: { fileTypes: GEO_FILE_TYPES.join(', ') },
})}
<br />
{i18n.translate('xpack.fileUpload.geoFilePicker.maxSize', {
defaultMessage: 'Max size: {maxFileSize}',
values: { maxFileSize: getMaxBytesFormatted() },
})}
</span>
);
}
_renderImporterEditor() {
return this.state.importer ? this.state.importer.renderEditor(this._loadFilePreview) : null;
}
render() {
return (
<>
<EuiFormRow
isInvalid={!!this.state.error}
error={!!this.state.error ? [this.state.error] : []}
helpText={this._renderHelpText()}
>
<EuiFilePicker
initialPromptText={i18n.translate('xpack.fileUpload.geoFilePicker.filePicker', {
defaultMessage: 'Select or drag and drop a file',
})}
onChange={this._onFileSelect}
accept={GEO_FILE_TYPES.join(',')}
isLoading={this.state.isLoadingPreview}
data-test-subj="geoFilePicker"
/>
</EuiFormRow>
{this._renderImporterEditor()}
</>
);
}
}


@@ -8,7 +8,7 @@
import React, { ChangeEvent, Component } from 'react';
import { EuiForm, EuiFormRow, EuiSelect } from '@elastic/eui';
import { i18n } from '@kbn/i18n';
import { GeoJsonFilePicker, OnFileSelectParameters } from './geojson_file_picker';
import { GeoFilePicker, OnFileSelectParameters } from './geo_file_picker';
import { ES_FIELD_TYPES } from '../../../../../../src/plugins/data/public';
import { IndexNameForm } from './index_name_form';
import { validateIndexName } from '../../validate_index_name';
@@ -41,7 +41,7 @@ interface State {
isPointsOnly: boolean;
}
export class GeoJsonUploadForm extends Component<Props, State> {
export class GeoUploadForm extends Component<Props, State> {
private _isMounted = false;
state: State = {
hasFile: false,
@@ -116,7 +116,7 @@ export class GeoJsonUploadForm extends Component<Props, State> {
render() {
return (
<EuiForm>
<GeoJsonFilePicker onSelect={this._onFileSelect} onClear={this._onFileClear} />
<GeoFilePicker onSelect={this._onFileSelect} onClear={this._onFileClear} />
{this._renderGeoFieldTypeSelect()}
{this.state.hasFile ? (
<IndexNameForm


@@ -5,5 +5,5 @@
* 2.0.
*/
export { GeoJsonUploadForm } from './geojson_upload_form';
export type { OnFileSelectParameters } from './geojson_file_picker';
export { GeoUploadForm } from './geo_upload_form';
export type { OnFileSelectParameters } from './geo_file_picker';


@@ -9,12 +9,12 @@ import React, { Component, Fragment } from 'react';
import { i18n } from '@kbn/i18n';
import { EuiProgress, EuiText } from '@elastic/eui';
import { getIndexPatternService } from '../kibana_services';
import { GeoJsonUploadForm, OnFileSelectParameters } from './geojson_upload_form';
import { GeoUploadForm, OnFileSelectParameters } from './geo_upload_form';
import { ImportCompleteView } from './import_complete_view';
import { ES_FIELD_TYPES } from '../../../../../src/plugins/data/public';
import type { FileUploadComponentProps, FileUploadGeoResults } from '../lazy_load_bundle';
import { ImportResults } from '../importer';
import { GeoJsonImporter } from '../importer/geojson_importer';
import { GeoFileImporter } from '../importer/geo';
import type { Settings } from '../../common/types';
import { hasImportPermission } from '../api';
@@ -25,7 +25,7 @@ enum PHASE {
}
function getWritingToIndexMsg(progress: number) {
return i18n.translate('xpack.fileUpload.jsonUploadAndParse.writingToIndex', {
return i18n.translate('xpack.fileUpload.geoUploadWizard.writingToIndex', {
defaultMessage: 'Writing to index: {progress}% complete',
values: { progress },
});
@@ -42,8 +42,8 @@ interface State {
phase: PHASE;
}
export class JsonUploadAndParse extends Component<FileUploadComponentProps, State> {
private _geojsonImporter?: GeoJsonImporter;
export class GeoUploadWizard extends Component<FileUploadComponentProps, State> {
private _geoFileImporter?: GeoFileImporter;
private _isMounted = false;
state: State = {
@@ -60,9 +60,9 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
componentWillUnmount() {
this._isMounted = false;
if (this._geojsonImporter) {
this._geojsonImporter.destroy();
this._geojsonImporter = undefined;
if (this._geoFileImporter) {
this._geoFileImporter.destroy();
this._geoFileImporter = undefined;
}
}
@@ -73,7 +73,7 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
}
_import = async () => {
if (!this._geojsonImporter) {
if (!this._geoFileImporter) {
return;
}
@@ -115,14 +115,14 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
processors: [],
};
this.setState({
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted', {
importStatus: i18n.translate('xpack.fileUpload.geoUploadWizard.dataIndexingStarted', {
defaultMessage: 'Creating index: {indexName}',
values: { indexName: this.state.indexName },
}),
phase: PHASE.IMPORT,
});
this._geojsonImporter.setGeoFieldType(this.state.geoFieldType);
const initializeImportResp = await this._geojsonImporter.initializeImport(
this._geoFileImporter.setGeoFieldType(this.state.geoFieldType);
const initializeImportResp = await this._geoFileImporter.initializeImport(
this.state.indexName,
settings,
mappings,
@@ -146,7 +146,7 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
this.setState({
importStatus: getWritingToIndexMsg(0),
});
const importResults = await this._geojsonImporter.import(
const importResults = await this._geoFileImporter.import(
initializeImportResp.id,
this.state.indexName,
initializeImportResp.pipelineId,
@@ -165,7 +165,7 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
if (!importResults.success) {
this.setState({
importResults,
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.dataIndexingError', {
importStatus: i18n.translate('xpack.fileUpload.geoUploadWizard.dataIndexingError', {
defaultMessage: 'Data indexing error',
}),
phase: PHASE.COMPLETE,
@@ -179,7 +179,7 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
//
this.setState({
importResults,
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern', {
importStatus: i18n.translate('xpack.fileUpload.geoUploadWizard.creatingIndexPattern', {
defaultMessage: 'Creating data view: {indexName}',
values: { indexName: this.state.indexName },
}),
@@ -213,7 +213,7 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
} catch (error) {
if (this._isMounted) {
this.setState({
importStatus: i18n.translate('xpack.fileUpload.jsonUploadAndParse.indexPatternError', {
importStatus: i18n.translate('xpack.fileUpload.geoUploadWizard.indexPatternError', {
defaultMessage: 'Unable to create data view',
}),
phase: PHASE.COMPLETE,
@@ -242,7 +242,7 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
};
_onFileSelect = ({ features, importer, indexName, previewCoverage }: OnFileSelectParameters) => {
this._geojsonImporter = importer;
this._geoFileImporter = importer;
this.props.onFileSelect(
{
@@ -255,9 +255,9 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
};
_onFileClear = () => {
if (this._geojsonImporter) {
this._geojsonImporter.destroy();
this._geojsonImporter = undefined;
if (this._geoFileImporter) {
this._geoFileImporter.destroy();
this._geoFileImporter = undefined;
}
this.props.onFileClear();
@@ -305,7 +305,7 @@ export class JsonUploadAndParse extends Component<FileUploadComponentProps, Stat
}
return (
<GeoJsonUploadForm
<GeoUploadForm
geoFieldType={this.state.geoFieldType}
indexName={this.state.indexName}
indexNameError={this.state.indexNameError}

View file

@@ -1,154 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React, { Component } from 'react';
import { EuiFilePicker, EuiFormRow } from '@elastic/eui';
import { i18n } from '@kbn/i18n';
import { MB } from '../../../common/constants';
import { getMaxBytesFormatted } from '../../importer/get_max_bytes';
import { validateFile } from '../../importer';
import {
GeoJsonImporter,
GeoJsonPreview,
GEOJSON_FILE_TYPES,
} from '../../importer/geojson_importer';
export type OnFileSelectParameters = GeoJsonPreview & {
indexName: string;
importer: GeoJsonImporter;
};
interface Props {
onSelect: (onFileSelectParameters: OnFileSelectParameters) => void;
onClear: () => void;
}
interface State {
error: string | null;
isLoadingPreview: boolean;
previewSummary: string | null;
}
export class GeoJsonFilePicker extends Component<Props, State> {
private _isMounted = false;
state: State = {
error: null,
isLoadingPreview: false,
previewSummary: null,
};
async componentDidMount() {
this._isMounted = true;
}
componentWillUnmount() {
this._isMounted = false;
}
_onFileSelect = (files: FileList | null) => {
this.props.onClear();
this.setState({
error: null,
isLoadingPreview: false,
previewSummary: null,
});
if (files && files.length) {
this._loadFilePreview(files[0]);
}
};
async _loadFilePreview(file: File) {
this.setState({ isLoadingPreview: true });
let importer: GeoJsonImporter | null = null;
let previewError: string | null = null;
let preview: GeoJsonPreview | null = null;
try {
validateFile(file, GEOJSON_FILE_TYPES);
importer = new GeoJsonImporter(file);
preview = await importer.previewFile(10000, MB * 3);
if (preview.features.length === 0) {
previewError = i18n.translate('xpack.fileUpload.geojsonFilePicker.noFeaturesDetected', {
defaultMessage: 'No GeoJson features found in selected file.',
});
}
} catch (error) {
previewError = error.message;
}
if (!this._isMounted) {
return;
}
this.setState({
error: previewError,
isLoadingPreview: false,
previewSummary:
!previewError && preview
? i18n.translate('xpack.fileUpload.geojsonFilePicker.previewSummary', {
defaultMessage: 'Previewing {numFeatures} features, {previewCoverage}% of file.',
values: {
numFeatures: preview.features.length,
previewCoverage: preview.previewCoverage,
},
})
: null,
});
if (importer && preview) {
this.props.onSelect({
...preview,
importer,
indexName: file.name.split('.')[0].toLowerCase(),
});
}
}
_renderHelpText() {
return this.state.previewSummary !== null ? (
this.state.previewSummary
) : (
<span>
{i18n.translate('xpack.fileUpload.geojsonFilePicker.acceptedFormats', {
defaultMessage: 'Formats accepted: {fileTypes}',
values: { fileTypes: GEOJSON_FILE_TYPES.join(', ') },
})}
<br />
{i18n.translate('xpack.fileUpload.geojsonFilePicker.maxSize', {
defaultMessage: 'Max size: {maxFileSize}',
values: { maxFileSize: getMaxBytesFormatted() },
})}
<br />
{i18n.translate('xpack.fileUpload.geojsonFilePicker.acceptedCoordinateSystem', {
defaultMessage: 'Coordinates must be in EPSG:4326 coordinate reference system.',
})}
</span>
);
}
render() {
return (
<EuiFormRow
isInvalid={!!this.state.error}
error={!!this.state.error ? [this.state.error] : []}
helpText={this._renderHelpText()}
>
<EuiFilePicker
initialPromptText={i18n.translate('xpack.fileUpload.geojsonFilePicker.filePicker', {
defaultMessage: 'Select or drag and drop a file',
})}
onChange={this._onFileSelect}
accept={GEOJSON_FILE_TYPES.join(',')}
isLoading={this.state.isLoadingPreview}
/>
</EuiFormRow>
);
}
}


@@ -34,6 +34,8 @@ interface Props {
indexName: string;
}
const STATUS_CALLOUT_DATA_TEST_SUBJ = 'fileUploadStatusCallout';
export class ImportCompleteView extends Component<Props, {}> {
_renderCodeEditor(json: object | undefined, title: string, copyButtonDataTestSubj: string) {
if (!json) {
@@ -103,6 +105,7 @@ export class ImportCompleteView extends Component<Props, {}> {
})}
color="danger"
iconType="alert"
data-test-subj={STATUS_CALLOUT_DATA_TEST_SUBJ}
>
<p>
{i18n.translate('xpack.fileUpload.importComplete.permissionFailureMsg', {
@@ -139,6 +142,7 @@ export class ImportCompleteView extends Component<Props, {}> {
})}
color="danger"
iconType="alert"
data-test-subj={STATUS_CALLOUT_DATA_TEST_SUBJ}
>
<p>{errorMsg}</p>
</EuiCallOut>
@@ -166,6 +170,7 @@ export class ImportCompleteView extends Component<Props, {}> {
title={i18n.translate('xpack.fileUpload.importComplete.uploadSuccessTitle', {
defaultMessage: 'File upload complete',
})}
data-test-subj={STATUS_CALLOUT_DATA_TEST_SUBJ}
>
<p>{`${successMsg} ${failedFeaturesMsg}`}</p>
</EuiCallOut>


@@ -5,32 +5,24 @@
* 2.0.
*/
import { Feature, Point } from 'geojson';
import { ReactNode } from 'react';
import { Feature } from 'geojson';
import { i18n } from '@kbn/i18n';
// @ts-expect-error
import { JSONLoader, loadInBatches } from './loaders';
import { GeoFileImporter, GeoFilePreview } from './types';
import { CreateDocsResponse, ImportResults } from '../types';
import { callImportRoute, Importer, IMPORT_RETRIES, MAX_CHUNK_CHAR_COUNT } from '../importer';
import { ES_FIELD_TYPES } from '../../../../../../src/plugins/data/public';
// @ts-expect-error
import { geoJsonCleanAndValidate } from './geojson_clean_and_validate';
import { MB } from '../../../common/constants';
import type { ImportDoc, ImportFailure, ImportResponse } from '../../../common/types';
// @ts-expect-error
import { geoJsonCleanAndValidate } from './geojson_clean_and_validate';
import { createChunks } from './create_chunks';
const BLOCK_SIZE_MB = 5 * MB;
export const GEOJSON_FILE_TYPES = ['.json', '.geojson'];
export interface GeoJsonPreview {
features: Feature[];
hasPoints: boolean;
hasShapes: boolean;
previewCoverage: number;
}
export class GeoJsonImporter extends Importer {
export class AbstractGeoFileImporter extends Importer implements GeoFileImporter {
private _file: File;
private _isActive = true;
private _iterator?: Iterator<unknown>;
private _hasNext = true;
private _features: Feature[] = [];
private _totalBytesRead = 0;
@@ -40,7 +32,6 @@ export class GeoJsonImporter extends Importer {
private _totalFeaturesImported = 0;
private _geometryTypesMap = new Map<string, boolean>();
private _invalidFeatures: ImportFailure[] = [];
private _prevBatchLastFeature?: Feature;
private _geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE =
ES_FIELD_TYPES.GEO_SHAPE;
@@ -54,12 +45,20 @@
this._isActive = false;
}
public async previewFile(rowLimit?: number, sizeLimit?: number): Promise<GeoJsonPreview> {
public canPreview() {
return true;
}
public renderEditor(onChange: () => void): ReactNode {
return null;
}
public async previewFile(rowLimit?: number, sizeLimit?: number): Promise<GeoFilePreview> {
await this._readUntil(rowLimit, sizeLimit);
return {
features: [...this._features],
previewCoverage: this._hasNext
? Math.round((this._blockSizeInBytes / this._file.size) * 100)
? Math.round(this._getProgress(this._features.length, this._blockSizeInBytes))
: 100,
hasPoints: this._geometryTypesMap.has('Point') || this._geometryTypesMap.has('MultiPoint'),
hasShapes:
@@ -236,8 +235,8 @@ export class GeoJsonImporter extends Importer {
// because features are converted to elasticsearch documents which changes the size.
const chunkProgress = (i + 1) / chunks.length;
const totalBytesImported = this._totalBytesImported + blockSizeInBytes * chunkProgress;
const progressPercent = (totalBytesImported / this._file.size) * 100;
setImportProgress(Math.round(progressPercent * 10) / 10);
const importPercent = this._getProgress(this._totalFeaturesImported, totalBytesImported);
setImportProgress(Math.round(importPercent * 10) / 10);
} else {
success = false;
error = resp.error;
@@ -261,125 +260,55 @@
(rowLimit === undefined || this._features.length < rowLimit) &&
(sizeLimit === undefined || this._blockSizeInBytes < sizeLimit)
) {
await this._next();
const results = await this._readNext(this._totalFeaturesRead, this._totalBytesRead);
this._hasNext = results.hasNext;
this._blockSizeInBytes = this._blockSizeInBytes + results.bytesRead;
this._features = [
...this._features,
...results.features.map((feature) => {
return geoJsonCleanAndValidate(feature);
}),
];
results.geometryTypesMap.forEach((value, key) => {
this._geometryTypesMap.set(key, value);
});
this._invalidFeatures = [...this._invalidFeatures, ...results.invalidFeatures];
this._totalBytesRead = this._totalBytesRead + results.bytesRead;
this._totalFeaturesRead =
this._totalFeaturesRead + results.features.length + results.invalidFeatures.length;
}
}
private async _next() {
if (this._iterator === undefined) {
this._iterator = await loadInBatches(this._file, JSONLoader, {
json: {
jsonpaths: ['$.features'],
_rootObjectBatches: true,
},
});
}
protected _readNext(
prevFeaturesRead: number,
prevBytesRead: number
): Promise<{
bytesRead: number;
features: Feature[];
geometryTypesMap: Map<string, boolean>;
invalidFeatures: ImportFailure[];
hasNext: boolean;
}> {
throw new Error('Should implement AbstractGeoFileImporter._readNext');
}
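The abstract `_readNext` above is the extension point each concrete importer (GeoJSON batches, shapefile records) overrides. A simplified standalone model of the pattern — the real `AbstractGeoFileImporter` extends Kibana's `Importer` base class and also tracks geometry types and invalid features, which this sketch omits:

```typescript
// Simplified, illustrative model of the AbstractGeoFileImporter contract.
// Only the batch-reading loop is shown; names mirror the PR loosely.
interface ReadNextResult {
  bytesRead: number;
  features: Array<Record<string, unknown>>;
  hasNext: boolean;
}

abstract class GeoImporterModel {
  protected features: Array<Record<string, unknown>> = [];
  protected totalBytesRead = 0;

  // Concrete importers read the next batch from their file format and
  // report how many bytes were consumed and whether more batches remain.
  protected abstract readNext(prevBytesRead: number): Promise<ReadNextResult>;

  public async readAll(): Promise<number> {
    let hasNext = true;
    while (hasNext) {
      const result = await this.readNext(this.totalBytesRead);
      this.features.push(...result.features);
      this.totalBytesRead += result.bytesRead;
      hasNext = result.hasNext;
    }
    return this.features.length;
  }
}

// Toy concrete importer: one feature per batch, two batches total.
class TwoBatchImporter extends GeoImporterModel {
  private batchCount = 0;

  protected async readNext(): Promise<ReadNextResult> {
    this.batchCount += 1;
    return {
      bytesRead: 100,
      features: [{ type: 'Feature' }],
      hasNext: this.batchCount < 2,
    };
  }
}
```

With this split, previewing, progress reporting, and import stay format-agnostic in the base class; each format only supplies batch reading.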
if (!this._isActive || !this._iterator) {
return;
}
protected _getProgress(featuresProcessed: number, bytesProcessed: number) {
return (bytesProcessed / this._file.size) * 100;
}
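`_getProgress` receives both a feature count and a byte count but uses only bytes by default; the two-argument signature leaves room for a subclass to estimate progress differently. A hypothetical feature-count-based variant (an illustration, not code from this PR):

```typescript
// Hypothetical progress estimator keyed on feature count rather than bytes,
// matching the _getProgress(featuresProcessed, bytesProcessed) shape above.
class FeatureCountProgress {
  constructor(private readonly totalFeatures: number) {}

  public getProgress(featuresProcessed: number, _bytesProcessed: number): number {
    if (this.totalFeatures <= 0) {
      return 0;
    }
    // Clamp to 100 in case the totalFeatures estimate was low.
    return Math.min(100, (featuresProcessed / this.totalFeatures) * 100);
  }
}
```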
const { value: batch, done } = await this._iterator.next();
protected _getIsActive() {
return this._isActive;
}
if (!this._isActive || done) {
this._hasNext = false;
return;
}
if ('bytesUsed' in batch) {
const bytesRead = batch.bytesUsed - this._totalBytesRead;
this._blockSizeInBytes += bytesRead;
this._totalBytesRead = batch.bytesUsed;
}
const rawFeatures: unknown[] = this._prevBatchLastFeature ? [this._prevBatchLastFeature] : [];
this._prevBatchLastFeature = undefined;
const isLastBatch = batch.batchType === 'root-object-batch-complete';
if (isLastBatch) {
// Handle single feature geoJson
if (this._totalFeaturesRead === 0) {
rawFeatures.push(batch.container);
}
} else {
rawFeatures.push(...batch.data);
}
for (let i = 0; i < rawFeatures.length; i++) {
const rawFeature = rawFeatures[i] as Feature;
if (!isLastBatch && i === rawFeatures.length - 1) {
// Do not process last feature until next batch is read, features on batch boundary may be incomplete.
this._prevBatchLastFeature = rawFeature;
continue;
}
this._totalFeaturesRead++;
if (!rawFeature.geometry || !rawFeature.geometry.type) {
this._invalidFeatures.push({
item: this._totalFeaturesRead,
reason: i18n.translate('xpack.fileUpload.geojsonImporter.noGeometry', {
defaultMessage: 'Feature does not contain required field "geometry"',
}),
doc: rawFeature,
});
} else {
if (!this._geometryTypesMap.has(rawFeature.geometry.type)) {
this._geometryTypesMap.set(rawFeature.geometry.type, true);
}
this._features.push(geoJsonCleanAndValidate(rawFeature));
}
}
protected _getFile() {
return this._file;
}
public read(data: ArrayBuffer): { success: boolean } {
throw new Error('read(data: ArrayBuffer) not supported, use readFile instead.');
throw new Error('read(data: ArrayBuffer) not supported, use previewFile and import instead.');
}
protected _createDocs(text: string): CreateDocsResponse {
throw new Error('_createDocs not implemented.');
}
}
export function createChunks(
features: Feature[],
geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE,
maxChunkCharCount: number
): ImportDoc[][] {
const chunks: ImportDoc[][] = [];
let chunk: ImportDoc[] = [];
let chunkChars = 0;
for (let i = 0; i < features.length; i++) {
const doc = toEsDoc(features[i], geoFieldType);
const docChars = JSON.stringify(doc).length + 1; // +1 adds CHAR for comma once document is in list
if (chunk.length === 0 || chunkChars + docChars < maxChunkCharCount) {
// add ES document to current chunk
chunk.push(doc);
chunkChars += docChars;
} else {
// chunk boundary found, start new chunk
chunks.push(chunk);
chunk = [doc];
chunkChars = docChars;
}
}
if (chunk.length) {
chunks.push(chunk);
}
return chunks;
}
export function toEsDoc(
feature: Feature,
geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE
) {
const properties = feature.properties ? feature.properties : {};
return {
geometry:
geoFieldType === ES_FIELD_TYPES.GEO_SHAPE
? feature.geometry
: (feature.geometry as Point).coordinates,
...properties,
};
}


@@ -0,0 +1,126 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { Feature } from 'geojson';
import { createChunks, toEsDoc } from './create_chunks';
import { ES_FIELD_TYPES } from '../../../../../../src/plugins/data/public';
const FEATURE_COLLECTION = {
type: 'FeatureCollection',
features: [
{
type: 'Feature',
properties: {
population: 200,
},
geometry: {
type: 'Point',
coordinates: [-112.0372, 46.608058],
},
} as Feature,
],
};
const GEOMETRY_COLLECTION_FEATURE = {
type: 'Feature',
properties: {
population: 200,
},
geometry: {
type: 'GeometryCollection',
geometries: [
{
type: 'Point',
coordinates: [100.0, 0.0],
},
{
type: 'LineString',
coordinates: [
[101.0, 0.0],
[102.0, 1.0],
],
},
],
},
} as Feature;
describe('toEsDoc', () => {
test('should convert feature to geo_point ES document', () => {
const esDoc = toEsDoc(FEATURE_COLLECTION.features[0], ES_FIELD_TYPES.GEO_POINT);
expect(esDoc).toEqual({
geometry: [-112.0372, 46.608058],
population: 200,
});
});
test('should convert feature to geo_shape ES document', () => {
const esDoc = toEsDoc(FEATURE_COLLECTION.features[0], ES_FIELD_TYPES.GEO_SHAPE);
expect(esDoc).toEqual({
geometry: {
type: 'Point',
coordinates: [-112.0372, 46.608058],
},
population: 200,
});
});
test('should convert GeometryCollection feature to geo_shape ES document', () => {
const esDoc = toEsDoc(GEOMETRY_COLLECTION_FEATURE, ES_FIELD_TYPES.GEO_SHAPE);
expect(esDoc).toEqual({
geometry: {
type: 'GeometryCollection',
geometries: [
{
type: 'Point',
coordinates: [100.0, 0.0],
},
{
type: 'LineString',
coordinates: [
[101.0, 0.0],
[102.0, 1.0],
],
},
],
},
population: 200,
});
});
});
describe('createChunks', () => {
const GEOMETRY_COLLECTION_DOC_CHARS = JSON.stringify(
toEsDoc(GEOMETRY_COLLECTION_FEATURE, ES_FIELD_TYPES.GEO_SHAPE)
).length;
const features = [
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
];
test('should break features into chunks', () => {
const maxChunkCharCount = GEOMETRY_COLLECTION_DOC_CHARS * 3.5;
const chunks = createChunks(features, ES_FIELD_TYPES.GEO_SHAPE, maxChunkCharCount);
expect(chunks.length).toBe(2);
expect(chunks[0].length).toBe(3);
expect(chunks[1].length).toBe(2);
});
test('should break features into chunks containing only single feature when feature size is greater than maxChunkCharCount', () => {
const maxChunkCharCount = GEOMETRY_COLLECTION_DOC_CHARS * 0.8;
const chunks = createChunks(features, ES_FIELD_TYPES.GEO_SHAPE, maxChunkCharCount);
expect(chunks.length).toBe(5);
expect(chunks[0].length).toBe(1);
expect(chunks[1].length).toBe(1);
expect(chunks[2].length).toBe(1);
expect(chunks[3].length).toBe(1);
expect(chunks[4].length).toBe(1);
});
});


@@ -0,0 +1,55 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { Feature, Point } from 'geojson';
import { ES_FIELD_TYPES } from '../../../../../../src/plugins/data/public';
import type { ImportDoc } from '../../../common/types';
export function createChunks(
features: Feature[],
geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE,
maxChunkCharCount: number
): ImportDoc[][] {
const chunks: ImportDoc[][] = [];
let chunk: ImportDoc[] = [];
let chunkChars = 0;
for (let i = 0; i < features.length; i++) {
const doc = toEsDoc(features[i], geoFieldType);
const docChars = JSON.stringify(doc).length + 1; // +1 accounts for the comma separator once the document is in the list
if (chunk.length === 0 || chunkChars + docChars < maxChunkCharCount) {
// add ES document to current chunk
chunk.push(doc);
chunkChars += docChars;
} else {
// chunk boundary found, start new chunk
chunks.push(chunk);
chunk = [doc];
chunkChars = docChars;
}
}
if (chunk.length) {
chunks.push(chunk);
}
return chunks;
}
export function toEsDoc(
feature: Feature,
geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE
) {
const properties = feature.properties ? feature.properties : {};
return {
geometry:
geoFieldType === ES_FIELD_TYPES.GEO_SHAPE
? feature.geometry
: (feature.geometry as Point).coordinates,
...properties,
};
}
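For orientation, here is a self-contained sketch of how `toEsDoc` and `createChunks` behave (local restatements for illustration only; plain strings stand in for `ES_FIELD_TYPES`, and the sample feature is hypothetical):

```typescript
// Simplified restatement of toEsDoc/createChunks above, for illustration only.
type GeoFieldType = 'geo_point' | 'geo_shape';

interface SketchFeature {
  type: 'Feature';
  properties: Record<string, unknown> | null;
  geometry: { type: string; coordinates: unknown };
}

function toEsDocSketch(feature: SketchFeature, geoFieldType: GeoFieldType): Record<string, unknown> {
  const properties = feature.properties ? feature.properties : {};
  return {
    // geo_point documents carry bare coordinates; geo_shape keeps the full geometry object
    geometry: geoFieldType === 'geo_shape' ? feature.geometry : feature.geometry.coordinates,
    ...properties,
  };
}

function createChunksSketch(
  features: SketchFeature[],
  geoFieldType: GeoFieldType,
  maxChunkCharCount: number
): Array<Array<Record<string, unknown>>> {
  const chunks: Array<Array<Record<string, unknown>>> = [];
  let chunk: Array<Record<string, unknown>> = [];
  let chunkChars = 0;
  for (const feature of features) {
    const doc = toEsDocSketch(feature, geoFieldType);
    const docChars = JSON.stringify(doc).length + 1; // +1 for the separating comma
    if (chunk.length === 0 || chunkChars + docChars < maxChunkCharCount) {
      chunk.push(doc);
      chunkChars += docChars;
    } else {
      // chunk is full, start a new one
      chunks.push(chunk);
      chunk = [doc];
      chunkChars = docChars;
    }
  }
  if (chunk.length) {
    chunks.push(chunk);
  }
  return chunks;
}

const samplePoint: SketchFeature = {
  type: 'Feature',
  properties: { population: 200 },
  geometry: { type: 'Point', coordinates: [-112.0372, 46.608058] },
};

console.log(JSON.stringify(toEsDocSketch(samplePoint, 'geo_point')));
console.log(createChunksSketch([samplePoint, samplePoint, samplePoint], 'geo_point', 120).length);
```

Chunking by serialized character count keeps each bulk request under a predictable payload size regardless of how large individual features are.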


@@ -5,8 +5,7 @@
* 2.0.
*/
import { GeoJsonImporter } from './geojson_importer';
import '@loaders.gl/polyfills';
const FEATURE_COLLECTION = {
@@ -220,80 +219,3 @@ describe('previewFile', () => {
});
});
});
describe('toEsDoc', () => {
test('should convert feature to geo_point ES document', () => {
const esDoc = toEsDoc(FEATURE_COLLECTION.features[0], ES_FIELD_TYPES.GEO_POINT);
expect(esDoc).toEqual({
geometry: [-112.0372, 46.608058],
population: 200,
});
});
test('should convert feature to geo_shape ES document', () => {
const esDoc = toEsDoc(FEATURE_COLLECTION.features[0], ES_FIELD_TYPES.GEO_SHAPE);
expect(esDoc).toEqual({
geometry: {
type: 'Point',
coordinates: [-112.0372, 46.608058],
},
population: 200,
});
});
test('should convert GeometryCollection feature to geo_shape ES document', () => {
const esDoc = toEsDoc(GEOMETRY_COLLECTION_FEATURE, ES_FIELD_TYPES.GEO_SHAPE);
expect(esDoc).toEqual({
geometry: {
type: 'GeometryCollection',
geometries: [
{
type: 'Point',
coordinates: [100.0, 0.0],
},
{
type: 'LineString',
coordinates: [
[101.0, 0.0],
[102.0, 1.0],
],
},
],
},
population: 200,
});
});
});
describe('createChunks', () => {
const GEOMETRY_COLLECTION_DOC_CHARS = JSON.stringify(
toEsDoc(GEOMETRY_COLLECTION_FEATURE, ES_FIELD_TYPES.GEO_SHAPE)
).length;
const features = [
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
GEOMETRY_COLLECTION_FEATURE,
];
test('should break features into chunks', () => {
const maxChunkCharCount = GEOMETRY_COLLECTION_DOC_CHARS * 3.5;
const chunks = createChunks(features, ES_FIELD_TYPES.GEO_SHAPE, maxChunkCharCount);
expect(chunks.length).toBe(2);
expect(chunks[0].length).toBe(3);
expect(chunks[1].length).toBe(2);
});
test('should break features into chunks containing only single feature when feature size is greater than maxChunkCharCount', () => {
const maxChunkCharCount = GEOMETRY_COLLECTION_DOC_CHARS * 0.8;
const chunks = createChunks(features, ES_FIELD_TYPES.GEO_SHAPE, maxChunkCharCount);
expect(chunks.length).toBe(5);
expect(chunks[0].length).toBe(1);
expect(chunks[1].length).toBe(1);
expect(chunks[2].length).toBe(1);
expect(chunks[3].length).toBe(1);
expect(chunks[4].length).toBe(1);
});
});


@@ -0,0 +1,95 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { Feature } from 'geojson';
import { i18n } from '@kbn/i18n';
// @ts-expect-error
import { JSONLoader, loadInBatches } from '../loaders';
import type { ImportFailure } from '../../../../common/types';
import { AbstractGeoFileImporter } from '../abstract_geo_file_importer';
export const GEOJSON_FILE_TYPES = ['.json', '.geojson'];
export class GeoJsonImporter extends AbstractGeoFileImporter {
private _iterator?: Iterator<unknown>;
private _prevBatchLastFeature?: Feature;
protected async _readNext(prevTotalFeaturesRead: number, prevTotalBytesRead: number) {
let featureIndex = prevTotalFeaturesRead;
const results = {
bytesRead: 0,
features: [] as Feature[],
geometryTypesMap: new Map<string, boolean>(),
invalidFeatures: [] as ImportFailure[],
hasNext: true,
};
if (this._iterator === undefined) {
this._iterator = await loadInBatches(this._getFile(), JSONLoader, {
json: {
jsonpaths: ['$.features'],
_rootObjectBatches: true,
},
});
}
if (!this._getIsActive() || !this._iterator) {
results.hasNext = false;
return results;
}
const { value: batch, done } = await this._iterator.next();
if (!this._getIsActive() || done) {
results.hasNext = false;
return results;
}
if ('bytesUsed' in batch) {
results.bytesRead = batch.bytesUsed - prevTotalBytesRead;
}
const features: unknown[] = this._prevBatchLastFeature ? [this._prevBatchLastFeature] : [];
this._prevBatchLastFeature = undefined;
const isLastBatch = batch.batchType === 'root-object-batch-complete';
if (isLastBatch) {
// Handle single feature geoJson
if (featureIndex === 0) {
features.push(batch.container);
}
} else {
features.push(...batch.data);
}
for (let i = 0; i < features.length; i++) {
const feature = features[i] as Feature;
if (!isLastBatch && i === features.length - 1) {
// Do not process last feature until next batch is read, features on batch boundary may be incomplete.
this._prevBatchLastFeature = feature;
continue;
}
featureIndex++;
if (!feature.geometry || !feature.geometry.type) {
results.invalidFeatures.push({
item: featureIndex,
reason: i18n.translate('xpack.fileUpload.geojsonImporter.noGeometry', {
defaultMessage: 'Feature does not contain required field "geometry"',
}),
doc: feature,
});
} else {
if (!results.geometryTypesMap.has(feature.geometry.type)) {
results.geometryTypesMap.set(feature.geometry.type, true);
}
results.features.push(feature);
}
}
return results;
}
}


@@ -5,5 +5,4 @@
* 2.0.
*/
export type { GeoJsonPreview } from './geojson_importer';
export { GeoJsonImporter, GEOJSON_FILE_TYPES } from './geojson_importer';


@@ -0,0 +1,24 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import type { GeoFileImporter } from './types';
import { GeoJsonImporter, GEOJSON_FILE_TYPES } from './geojson_importer';
import { ShapefileImporter, SHAPEFILE_TYPES } from './shapefile_importer';
import { getFileExtension, validateFile } from '../validate_file';
export const GEO_FILE_TYPES = [...GEOJSON_FILE_TYPES, ...SHAPEFILE_TYPES];
export function geoImporterFactory(file: File): GeoFileImporter {
validateFile(file, GEO_FILE_TYPES);
const extension = getFileExtension(file);
return GEOJSON_FILE_TYPES.includes(extension)
? new GeoJsonImporter(file)
: new ShapefileImporter(file);
}
export type { GeoFileImporter, GeoFilePreview } from './types';


@@ -7,4 +7,5 @@
// Load @loaders.gl from a JavaScript file to avoid TypeScript compilation failures within @loaders.gl.
export { JSONLoader } from '@loaders.gl/json';
export { _BrowserFileSystem as BrowserFileSystem, loadInBatches } from '@loaders.gl/core';
export { DBFLoader, ShapefileLoader } from '@loaders.gl/shapefile';


@@ -0,0 +1,8 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
export { ShapefileImporter, SHAPEFILE_TYPES } from './shapefile_importer';


@@ -0,0 +1,38 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React from 'react';
import { SideCarFilePicker } from './side_car_file_picker';
interface Props {
shapefileName: string;
onDbfSelect: (file: File | null) => void;
onPrjSelect: (file: File | null) => void;
onShxSelect: (file: File | null) => void;
}
export function ShapefileEditor(props: Props) {
return (
<>
<SideCarFilePicker
ext=".dbf"
onSelect={props.onDbfSelect}
shapefileName={props.shapefileName}
/>
<SideCarFilePicker
ext=".prj"
onSelect={props.onPrjSelect}
shapefileName={props.shapefileName}
/>
<SideCarFilePicker
ext=".shx"
onSelect={props.onShxSelect}
shapefileName={props.shapefileName}
/>
</>
);
}


@@ -0,0 +1,131 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React from 'react';
import { Feature } from 'geojson';
// @ts-expect-error
import { BrowserFileSystem, DBFLoader, loadInBatches, ShapefileLoader } from '../loaders';
import type { ImportFailure } from '../../../../common/types';
import { ShapefileEditor } from './shapefile_editor';
import { AbstractGeoFileImporter } from '../abstract_geo_file_importer';
export const SHAPEFILE_TYPES = ['.shp'];
export class ShapefileImporter extends AbstractGeoFileImporter {
private _tableRowCount: number | null = null;
private _dbfFile: File | null = null;
private _prjFile: File | null = null;
private _shxFile: File | null = null;
private _iterator?: Iterator<{ data: Feature[] }>;
public canPreview() {
return this._dbfFile !== null && this._prjFile !== null && this._shxFile !== null;
}
public renderEditor(onChange: () => void) {
return !this.canPreview() ? (
<ShapefileEditor
shapefileName={this._getFile().name}
onDbfSelect={(file) => {
this._dbfFile = file;
onChange();
}}
onPrjSelect={(file) => {
this._prjFile = file;
onChange();
}}
onShxSelect={(file) => {
this._shxFile = file;
onChange();
}}
/>
) : null;
}
private async _setTableRowCount() {
if (!this._dbfFile) {
return;
}
// read header from dbf file to get number of records in data file
const dbfIterator = (await loadInBatches(this._dbfFile, DBFLoader, {
metadata: false,
dbf: { encoding: 'latin1' },
})) as unknown as Iterator<{ nRecords: number }>;
const { value } = await dbfIterator.next();
if (value.nRecords && typeof value.nRecords === 'number') {
this._tableRowCount = value.nRecords;
}
}
protected _getProgress(featuresProcessed: number, bytesProcessed: number) {
if (this._tableRowCount === null || this._tableRowCount === 0) {
return 0;
}
if (featuresProcessed > this._tableRowCount) {
return 100;
}
return (featuresProcessed / this._tableRowCount) * 100;
}
protected async _readNext(prevTotalFeaturesRead: number, prevTotalBytesRead: number) {
const results = {
bytesRead: 0,
features: [] as Feature[],
geometryTypesMap: new Map<string, boolean>(),
invalidFeatures: [] as ImportFailure[],
hasNext: true,
};
if (this._iterator === undefined) {
const sideCarFiles: File[] = [];
if (this._dbfFile) {
sideCarFiles.push(this._dbfFile);
}
if (this._prjFile) {
sideCarFiles.push(this._prjFile);
}
if (this._shxFile) {
sideCarFiles.push(this._shxFile);
}
const fileSystem = new BrowserFileSystem([this._getFile(), ...sideCarFiles]);
this._iterator = (await loadInBatches(this._getFile().name, ShapefileLoader, {
fetch: fileSystem.fetch,
// Reproject shapefiles to WGS84
gis: { reproject: true, _targetCrs: 'EPSG:4326' },
// Only parse the X and Y coordinates; additional dimensions are not supported by Elasticsearch.
shp: { _maxDimensions: 2 },
metadata: false,
})) as unknown as Iterator<{ data: Feature[] }>;
await this._setTableRowCount();
}
const { value: batch, done } = await this._iterator.next();
if (!this._getIsActive() || done) {
results.hasNext = false;
return results;
}
for (let i = 0; i < batch.data.length; i++) {
const feature = batch.data[i];
if (!results.geometryTypesMap.has(feature.geometry.type)) {
results.geometryTypesMap.set(feature.geometry.type, true);
}
results.features.push(feature);
// Tracking bytes read is difficult when reading from multiple binary files,
// so approximate the amount processed by the serialized length of each feature.
const featureChars = JSON.stringify(feature).length;
results.bytesRead = results.bytesRead + featureChars;
}
return results;
}
}


@@ -0,0 +1,14 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { getFileNameWithoutExt } from './side_car_file_picker';
test('getFileNameWithoutExt', () => {
expect(getFileNameWithoutExt('foo')).toBe('foo');
expect(getFileNameWithoutExt('foo.shp')).toBe('foo');
expect(getFileNameWithoutExt('foo.bar.shp')).toBe('foo.bar');
});


@@ -0,0 +1,91 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import React, { Component } from 'react';
import { i18n } from '@kbn/i18n';
import { EuiFilePicker, EuiFormRow } from '@elastic/eui';
export function getFileNameWithoutExt(fileName: string) {
const splits = fileName.split('.');
if (splits.length > 1) {
splits.pop();
}
return splits.join('.');
}
interface Props {
ext: '.dbf' | '.prj' | '.shx';
onSelect: (file: File | null) => void;
shapefileName: string;
}
interface State {
error: string;
isInvalid: boolean;
}
export class SideCarFilePicker extends Component<Props, State> {
state: State = {
error: '',
isInvalid: false,
};
_isSideCarFileValid(sideCarFile: File) {
return (
getFileNameWithoutExt(this.props.shapefileName) === getFileNameWithoutExt(sideCarFile.name)
);
}
_getSideCarFileNameError() {
return i18n.translate('xpack.fileUpload.shapefile.sideCarFilePicker.error', {
defaultMessage: '{ext} expected to be {shapefileName}{ext}',
values: {
ext: this.props.ext,
shapefileName: getFileNameWithoutExt(this.props.shapefileName),
},
});
}
_onSelect = (files: FileList | null) => {
if (!files || files.length === 0) {
this.setState({ error: '', isInvalid: false });
this.props.onSelect(null);
return;
}
const file = files[0];
if (!this._isSideCarFileValid(file)) {
this.setState({ error: this._getSideCarFileNameError(), isInvalid: true });
this.props.onSelect(null);
return;
}
this.setState({ error: '', isInvalid: false });
this.props.onSelect(file);
};
render() {
return (
<EuiFormRow isInvalid={this.state.isInvalid} error={this.state.error}>
<EuiFilePicker
initialPromptText={i18n.translate(
'xpack.fileUpload.shapefile.sideCarFilePicker.promptText',
{
defaultMessage: `Select '{ext}' file`,
values: { ext: this.props.ext },
}
)}
onChange={this._onSelect}
accept={this.props.ext}
display="default"
isInvalid={this.state.isInvalid}
data-test-subj={`shapefileSideCarFilePicker${this.props.ext.replace('.', '_')}`}
/>
</EuiFormRow>
);
}
}


@@ -0,0 +1,26 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { Feature } from 'geojson';
import { ReactNode } from 'react';
import { IImporter } from '../types';
import { ES_FIELD_TYPES } from '../../../../../../src/plugins/data/public';
export interface GeoFilePreview {
features: Feature[];
hasPoints: boolean;
hasShapes: boolean;
previewCoverage: number;
}
export interface GeoFileImporter extends IImporter {
destroy(): void;
canPreview(): boolean;
previewFile(rowLimit?: number, sizeLimit?: number): Promise<GeoFilePreview>;
renderEditor(onChange: () => void): ReactNode;
setGeoFieldType(geoFieldType: ES_FIELD_TYPES.GEO_POINT | ES_FIELD_TYPES.GEO_SHAPE): void;
}


@@ -8,6 +8,14 @@
import { i18n } from '@kbn/i18n';
import { getMaxBytes, getMaxBytesFormatted } from './get_max_bytes';
/*
* Extract file extension from file. Example: file.name "points.geojson" returns ".geojson"
*/
export function getFileExtension(file: File) {
const extension = file.name.split('.').pop();
return '.' + extension;
}
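As a quick illustration of the helper above, a standalone restatement (operating on the name string rather than a `File` object) behaves like this:

```typescript
// Standalone restatement of getFileExtension, for illustration only.
function getExtensionSketch(fileName: string): string {
  const extension = fileName.split('.').pop();
  return '.' + extension;
}

console.log(getExtensionSketch('points.geojson')); // .geojson
console.log(getExtensionSketch('archive.tar.gz')); // .gz
// A name without a dot yields '.' plus the whole name, which the
// accepted-types check in validateFile will then reject.
console.log(getExtensionSketch('noextension')); // .noextension
```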
export function validateFile(file: File, types: string[]) {
if (file.size > getMaxBytes()) {
throw new Error(
@@ -29,9 +37,7 @@ export function validateFile(file: File, types: string[]) {
);
}
if (!types.includes(getFileExtension(file))) {
throw new Error(
i18n.translate('xpack.fileUpload.fileTypeError', {
defaultMessage: 'File is not one of acceptable types: {types}',


@@ -11,7 +11,7 @@ export function plugin() {
return new FileUploadPlugin();
}
export type { Props as IndexNameFormProps } from './components/geo_upload_form/index_name_form';
export type { FileUploadPluginStart } from './plugin';
export type { FileUploadComponentProps, FileUploadGeoResults } from './lazy_load_bundle';


@@ -33,7 +33,7 @@ export interface FileUploadComponentProps {
let loadModulesPromise: Promise<LazyLoadedFileUploadModules>;
export interface LazyLoadedFileUploadModules {
GeoUploadWizard: React.ComponentType<FileUploadComponentProps>;
IndexNameForm: React.ComponentType<IndexNameFormProps>;
importerFactory: (format: string, options: ImportFactoryOptions) => IImporter | undefined;
getHttp: () => HttpStart;
@@ -46,9 +46,9 @@ export async function lazyLoadModules(): Promise<LazyLoadedFileUploadModules> {
loadModulesPromise = new Promise(async (resolve, reject) => {
try {
const { GeoUploadWizard, importerFactory, IndexNameForm } = await import('./lazy');
resolve({
GeoUploadWizard,
importerFactory,
getHttp,
IndexNameForm,


@@ -5,6 +5,6 @@
* 2.0.
*/
export { GeoUploadWizard } from '../../components/geo_upload_wizard';
export { IndexNameForm } from '../../components/geo_upload_form/index_name_form';
export { importerFactory } from '../../importer';


@@ -15,7 +15,7 @@ export const uploadLayerWizardConfig: LayerWizard = {
order: 10,
categories: [],
description: i18n.translate('xpack.maps.fileUploadWizard.description', {
defaultMessage: 'Index GeoJSON and Shapefile data in Elasticsearch',
}),
disabledReason: i18n.translate('xpack.maps.fileUploadWizard.disabledDesc', {
defaultMessage:
@@ -53,6 +53,6 @@ export const uploadLayerWizardConfig: LayerWizard = {
return <ClientFileCreateSourceEditor {...renderWizardArguments} />;
},
title: i18n.translate('xpack.maps.fileUploadWizard.title', {
defaultMessage: 'Upload file',
}),
};


@@ -10982,12 +10982,6 @@
"xpack.features.visualizeFeatureName": "Visualizeライブラリ",
"xpack.fileUpload.fileSizeError": "ファイルサイズ{fileSize}は最大ファイルサイズの{maxFileSize}を超えています",
"xpack.fileUpload.fileTypeError": "ファイルは使用可能なタイプのいずれかではありません。{types}",
"xpack.fileUpload.geojsonFilePicker.acceptedCoordinateSystem": "座標は EPSG:4326 座標参照系でなければなりません。",
"xpack.fileUpload.geojsonFilePicker.acceptedFormats": "使用可能な形式:{fileTypes}",
"xpack.fileUpload.geojsonFilePicker.filePicker": "ファイルを選択するかドラッグ &amp;amp; ドロップしてください",
"xpack.fileUpload.geojsonFilePicker.maxSize": "最大サイズ:{maxFileSize}",
"xpack.fileUpload.geojsonFilePicker.noFeaturesDetected": "選択したファイルにはGeoJson機能がありません。",
"xpack.fileUpload.geojsonFilePicker.previewSummary": "{numFeatures}個の特徴量。ファイルの{previewCoverage}%。",
"xpack.fileUpload.geojsonImporter.noGeometry": "特長量には必須フィールド「ジオメトリ」が含まれていません",
"xpack.fileUpload.import.noIdOrIndexSuppliedErrorMessage": "ID またはインデックスが提供されていません",
"xpack.fileUpload.importComplete.copyButtonAriaLabel": "クリップボードにコピー",
@@ -11016,11 +11010,6 @@
"xpack.fileUpload.indexNameRequired": "インデックス名は必須です",
"xpack.fileUpload.indexPatternAlreadyExistsErrorMessage": "データビューはすでに存在します。",
"xpack.fileUpload.indexSettings.enterIndexTypeLabel": "インデックスタイプ",
"xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern": "データビュー{indexName}を作成しています",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingError": "データインデックスエラー",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted": "インデックスを作成中:{indexName}",
"xpack.fileUpload.jsonUploadAndParse.indexPatternError": "データビューを作成できません",
"xpack.fileUpload.jsonUploadAndParse.writingToIndex": "インデックスに書き込み中:{progress}%完了",
"xpack.fileUpload.maxFileSizeUiSetting.description": "ファイルのインポート時にファイルサイズ上限を設定します。この設定でサポートされている最大値は1 GBです。",
"xpack.fileUpload.maxFileSizeUiSetting.error": "200 MB、1 GBなどの有効なデータサイズにしてください。",
"xpack.fileUpload.maxFileSizeUiSetting.name": "最大ファイルアップロードサイズ",
@@ -17801,7 +17790,7 @@
"xpack.ml.ruleEditor.scopeSection.noPermissionToViewFilterListsTitle": "フィルターリストを表示するパーミッションがありません",
"xpack.ml.ruleEditor.scopeSection.scopeTitle": "範囲",
"xpack.ml.ruleEditor.selectRuleAction.createRuleLinkText": "ルールを作成",
"xpack.ml.ruleEditor.selectRuleAction.orText": "OR ",
"xpack.ml.ruleEditor.selectRuleAction.orText": "OR ",
"xpack.ml.ruleEditor.typicalAppliesTypeText": "通常",
"xpack.ml.sampleDataLinkLabel": "ML ジョブ",
"xpack.ml.settings.anomalyDetection.anomalyDetectionTitle": "異常検知",
@@ -23912,9 +23901,9 @@
"xpack.securitySolution.open.timeline.showingLabel": "表示中:",
"xpack.securitySolution.open.timeline.singleTemplateLabel": "テンプレート",
"xpack.securitySolution.open.timeline.singleTimelineLabel": "タイムライン",
"xpack.securitySolution.open.timeline.successfullyDeletedTimelinesTitle": "{totalTimelines, plural, =0 {すべてのタイムライン} other {{totalTimelines} 個のタイムライン}}の削除が正常に完了しました",
"xpack.securitySolution.open.timeline.successfullyDeletedTimelinesTitle": "{totalTimelines, plural, =0 {すべてのタイムライン} other {{totalTimelines} 個のタイムライン}}の削除が正常に完了しました",
"xpack.securitySolution.open.timeline.successfullyDeletedTimelineTemplatesTitle": "{totalTimelineTemplates, plural, =0 {すべてのタイムライン} other {{totalTimelineTemplates}個のタイムラインテンプレート}}が正常に削除されました",
"xpack.securitySolution.open.timeline.successfullyExportedTimelinesTitle": "{totalTimelines, plural, =0 {すべてのタイムライン} other {{totalTimelines} 個のタイムライン}}のエクスポートが正常に完了しました",
"xpack.securitySolution.open.timeline.successfullyExportedTimelinesTitle": "{totalTimelines, plural, =0 {すべてのタイムライン} other {{totalTimelines} 個のタイムライン}}のエクスポートが正常に完了しました",
"xpack.securitySolution.open.timeline.successfullyExportedTimelineTemplatesTitle": "{totalTimelineTemplates, plural, =0 {すべてのタイムライン} other {{totalTimelineTemplates} タイムラインテンプレート}}が正常にエクスポートされました",
"xpack.securitySolution.open.timeline.timelineNameTableHeader": "タイムライン名",
"xpack.securitySolution.open.timeline.timelineTemplateNameTableHeader": "テンプレート名",


@@ -11089,12 +11089,6 @@
"xpack.features.visualizeFeatureName": "Visualize 库",
"xpack.fileUpload.fileSizeError": "文件大小 {fileSize} 超过最大文件大小 {maxFileSize}",
"xpack.fileUpload.fileTypeError": "文件不是可接受类型之一:{types}",
"xpack.fileUpload.geojsonFilePicker.acceptedCoordinateSystem": "坐标必须在 EPSG:4326 坐标参考系中。",
"xpack.fileUpload.geojsonFilePicker.acceptedFormats": "接受的格式:{fileTypes}",
"xpack.fileUpload.geojsonFilePicker.filePicker": "选择或拖放文件",
"xpack.fileUpload.geojsonFilePicker.maxSize": "最大大小:{maxFileSize}",
"xpack.fileUpload.geojsonFilePicker.noFeaturesDetected": "选定文件中未找到 GeoJson 特征。",
"xpack.fileUpload.geojsonFilePicker.previewSummary": "正在预览 {numFeatures} 个特征、{previewCoverage}% 的文件。",
"xpack.fileUpload.geojsonImporter.noGeometry": "特征不包含必需的字段“geometry”",
"xpack.fileUpload.import.noIdOrIndexSuppliedErrorMessage": "未提供任何 ID 或索引",
"xpack.fileUpload.importComplete.copyButtonAriaLabel": "复制到剪贴板",
@@ -11123,11 +11117,6 @@
"xpack.fileUpload.indexNameRequired": "需要索引名称",
"xpack.fileUpload.indexPatternAlreadyExistsErrorMessage": "数据视图已存在。",
"xpack.fileUpload.indexSettings.enterIndexTypeLabel": "索引类型",
"xpack.fileUpload.jsonUploadAndParse.creatingIndexPattern": "正在创建数据视图:{indexName}",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingError": "数据索引错误",
"xpack.fileUpload.jsonUploadAndParse.dataIndexingStarted": "正在创建索引:{indexName}",
"xpack.fileUpload.jsonUploadAndParse.indexPatternError": "无法创建数据视图",
"xpack.fileUpload.jsonUploadAndParse.writingToIndex": "正在写入索引:已完成 {progress}%",
"xpack.fileUpload.maxFileSizeUiSetting.description": "设置导入文件时的文件大小限制。此设置支持的最高值为 1GB。",
"xpack.fileUpload.maxFileSizeUiSetting.error": "应为有效的数据大小。如 200MB、1GB",
"xpack.fileUpload.maxFileSizeUiSetting.name": "最大文件上传大小",


@@ -0,0 +1 @@
GEOGCS["GCS_North_American_1983",DATUM["D_North_American_1983",SPHEROID["GRS_1980",6378137,298.257222101]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]]

File diff suppressed because one or more lines are too long


@@ -0,0 +1,70 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import expect from '@kbn/expect';
import path from 'path';
import uuid from 'uuid/v4';
export default function ({ getPageObjects, getService }) {
const PageObjects = getPageObjects(['geoFileUpload', 'maps']);
const security = getService('security');
const retry = getService('retry');
describe('geojson file upload', () => {
let indexName = '';
before(async () => {
await security.testUser.setRoles([
'global_maps_all',
'geoall_data_writer',
'global_index_pattern_management_all',
]);
await PageObjects.maps.openNewMap();
});
after(async () => {
await security.testUser.restoreDefaults();
});
it('should preview part of geojson file', async () => {
await PageObjects.maps.clickAddLayer();
await PageObjects.maps.selectFileUploadCard();
await PageObjects.geoFileUpload.previewGeoJsonFile(
path.join(__dirname, 'files', 'world_countries_v7.geo.json')
);
await PageObjects.maps.waitForLayersToLoad();
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(2);
const tooltipText = await PageObjects.maps.getLayerTocTooltipMsg('world_countries_v7');
expect(tooltipText).to.be('world_countries_v7\nResults limited to 76 features, 41% of file.');
});
it('should import geojson', async () => {
indexName = uuid();
await PageObjects.geoFileUpload.setIndexName(indexName);
await PageObjects.geoFileUpload.uploadFile();
const statusText = await PageObjects.geoFileUpload.getFileUploadStatusCalloutMsg();
expect(statusText).to.be('File upload complete\nIndexed 250 features.');
});
it('should add as document layer', async () => {
await PageObjects.geoFileUpload.addFileAsDocumentLayer();
await PageObjects.maps.waitForLayersToLoad();
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(2);
await retry.try(async () => {
await PageObjects.maps.waitForLayersToLoad();
const tooltipText = await PageObjects.maps.getLayerTocTooltipMsg(indexName);
expect(tooltipText).to.be(`${indexName}\nFound ~281 documents. This count is approximate.`);
});
});
});
}


@@ -6,8 +6,9 @@
*/
export default function ({ loadTestFile }) {
describe('geo file upload', function () {
loadTestFile(require.resolve('./wizard'));
loadTestFile(require.resolve('./geojson'));
loadTestFile(require.resolve('./shapefile'));
});
}


@@ -0,0 +1,72 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import expect from '@kbn/expect';
import path from 'path';
import uuid from 'uuid/v4';
export default function ({ getPageObjects, getService }) {
const PageObjects = getPageObjects(['geoFileUpload', 'maps']);
const security = getService('security');
const retry = getService('retry');
describe('shapefile upload', () => {
let indexName = '';
before(async () => {
await security.testUser.setRoles([
'global_maps_all',
'geoall_data_writer',
'global_index_pattern_management_all',
]);
await PageObjects.maps.openNewMap();
});
after(async () => {
await security.testUser.restoreDefaults();
});
it('should preview part of shapefile', async () => {
await PageObjects.maps.clickAddLayer();
await PageObjects.maps.selectFileUploadCard();
await PageObjects.geoFileUpload.previewShapefile(
path.join(__dirname, 'files', 'cb_2018_us_csa_500k.shp')
);
await PageObjects.maps.waitForLayersToLoad();
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(2);
const tooltipText = await PageObjects.maps.getLayerTocTooltipMsg('cb_2018_us_csa_500k');
expect(tooltipText).to.be(
'cb_2018_us_csa_500k\nResults limited to 141 features, 81% of file.'
);
});
it('should import shapefile', async () => {
indexName = uuid();
await PageObjects.geoFileUpload.setIndexName(indexName);
await PageObjects.geoFileUpload.uploadFile();
const statusText = await PageObjects.geoFileUpload.getFileUploadStatusCalloutMsg();
expect(statusText).to.be('File upload complete\nIndexed 174 features.');
});
it('should add as document layer', async () => {
await PageObjects.geoFileUpload.addFileAsDocumentLayer();
await PageObjects.maps.waitForLayersToLoad();
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(2);
await retry.try(async () => {
await PageObjects.maps.waitForLayersToLoad();
const tooltipText = await PageObjects.maps.getLayerTocTooltipMsg(indexName);
expect(tooltipText).to.be(`${indexName}\nFound 174 documents.`);
});
});
});
}
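The tooltip assertions above ("Results limited to 141 features, 81% of file") encode the importer's preview progress. A minimal sketch of how such a message could be assembled, assuming (hypothetically) that the percentage reflects bytes read relative to total file size rather than a ratio of feature counts:

```javascript
// Hypothetical sketch: build the preview tooltip message asserted in the
// tests above. Assumes percent = bytes read / total file size, rounded.
function previewTooltip(layerName, featureCount, bytesRead, totalBytes) {
  const percent = Math.round((bytesRead / totalBytes) * 100);
  return `${layerName}\nResults limited to ${featureCount} features, ${percent}% of file.`;
}
```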


@ -0,0 +1,90 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import expect from '@kbn/expect';
import path from 'path';
export default function ({ getPageObjects, getService }) {
const PageObjects = getPageObjects(['geoFileUpload', 'maps']);
const security = getService('security');
const retry = getService('retry');
describe('geo file upload wizard', () => {
before(async () => {
await security.testUser.setRoles([
'global_maps_all',
'geoall_data_writer',
'global_index_pattern_management_all',
]);
});
after(async () => {
await security.testUser.restoreDefaults();
});
// this describe block tests a complete workflow; the individual tests are order-dependent and are not designed to run out of order
describe('preview layer workflow', () => {
before(async () => {
await PageObjects.maps.openNewMap();
await PageObjects.maps.clickAddLayer();
await PageObjects.maps.selectFileUploadCard();
await PageObjects.geoFileUpload.previewGeoJsonFile(
path.join(__dirname, 'files', 'point.json')
);
await PageObjects.maps.waitForLayersToLoad();
});
it('should add preview layer to map', async () => {
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(2);
const hasLayer = await PageObjects.maps.doesLayerExist('point');
expect(hasLayer).to.be(true);
});
it('should replace preview layer on file change', async () => {
await PageObjects.geoFileUpload.previewGeoJsonFile(
path.join(__dirname, 'files', 'polygon.json')
);
await PageObjects.maps.waitForLayersToLoad();
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(2);
const hasLayer = await PageObjects.maps.doesLayerExist('polygon');
expect(hasLayer).to.be(true);
});
it('should disable next button when index already exists', async () => {
await retry.try(async () => {
expect(await PageObjects.geoFileUpload.isNextButtonEnabled()).to.be(true);
});
// the "geo_shapes" index already exists; it's added by es_archive
await PageObjects.geoFileUpload.setIndexName('geo_shapes');
await retry.try(async () => {
expect(await PageObjects.geoFileUpload.isNextButtonEnabled()).to.be(false);
});
});
it('should enable next button when index name is changed', async () => {
await PageObjects.geoFileUpload.setIndexName('polygon');
await retry.try(async () => {
expect(await PageObjects.geoFileUpload.isNextButtonEnabled()).to.be(true);
});
});
it('should remove preview layer on cancel', async () => {
await PageObjects.maps.cancelLayerAdd();
await PageObjects.maps.waitForLayerDeleted('polygon');
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(1);
});
});
});
}


@ -1,114 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import expect from '@kbn/expect';
import path from 'path';
export default function ({ getPageObjects, getService }) {
const PageObjects = getPageObjects(['maps', 'common']);
const FILE_LOAD_DIR = 'test_upload_files';
const DEFAULT_LOAD_FILE_NAME = 'point.json';
const security = getService('security');
const retry = getService('retry');
describe('GeoJSON import layer panel', () => {
before(async () => {
await security.testUser.setRoles([
'global_maps_all',
'geoall_data_writer',
'global_index_pattern_management_all',
]);
await PageObjects.maps.openNewMap();
});
after(async () => {
await security.testUser.restoreDefaults();
});
beforeEach(async () => {
await PageObjects.maps.clickAddLayer();
await PageObjects.maps.selectGeoJsonUploadSource();
await PageObjects.maps.uploadJsonFileForIndexing(
path.join(__dirname, FILE_LOAD_DIR, DEFAULT_LOAD_FILE_NAME)
);
});
afterEach(async () => {
await PageObjects.maps.closeOrCancelLayer();
});
it('should add GeoJSON file to map', async () => {
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(2);
const filePickerLoadedFile = await PageObjects.maps.hasFilePickerLoadedFile(
DEFAULT_LOAD_FILE_NAME
);
expect(filePickerLoadedFile).to.be(true);
});
it('should remove layer on cancel', async () => {
await PageObjects.maps.cancelLayerAdd();
await PageObjects.maps.waitForLayerDeleted('point');
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(1);
});
it('should replace layer on input change', async () => {
// Upload second file
const secondLoadFileName = 'polygon.json';
await PageObjects.maps.uploadJsonFileForIndexing(
path.join(__dirname, FILE_LOAD_DIR, secondLoadFileName)
);
await PageObjects.maps.waitForLayersToLoad();
// Check second file is loaded in file picker
const filePickerLoadedFile = await PageObjects.maps.hasFilePickerLoadedFile(
secondLoadFileName
);
expect(filePickerLoadedFile).to.be(true);
});
it('should clear layer on replacement layer load error', async () => {
// Upload second file
const secondLoadFileName = 'not_json.txt';
await PageObjects.maps.uploadJsonFileForIndexing(
path.join(__dirname, FILE_LOAD_DIR, secondLoadFileName)
);
await PageObjects.maps.waitForLayersToLoad();
// Check second file is loaded in file picker
const filePickerLoadedFile = await PageObjects.maps.hasFilePickerLoadedFile(
secondLoadFileName
);
expect(filePickerLoadedFile).to.be(true);
// Check that no file is loaded in layer preview
const numberOfLayers = await PageObjects.maps.getNumberOfLayers();
expect(numberOfLayers).to.be(1);
});
it('should prevent import button from activating unless valid index name provided', async () => {
await PageObjects.maps.setIndexName('NoCapitalLetters');
await retry.try(async () => {
const importButtonActive = await PageObjects.maps.importFileButtonEnabled();
expect(importButtonActive).to.be(false);
});
await PageObjects.maps.setIndexName('validindexname');
await retry.try(async () => {
const importButtonActive = await PageObjects.maps.importFileButtonEnabled();
expect(importButtonActive).to.be(true);
});
await PageObjects.maps.setIndexName('?noquestionmarks?');
await retry.try(async () => {
const importButtonActive = await PageObjects.maps.importFileButtonEnabled();
expect(importButtonActive).to.be(false);
});
});
});
}
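The assertions above ('NoCapitalLetters', '?noquestionmarks?') track Elasticsearch index-naming rules, which gate the import button. A minimal sketch of such a validator (hypothetical, covering only a subset of the real Elasticsearch restrictions):

```javascript
// Hypothetical validator for a subset of Elasticsearch index-name rules:
// lowercase only, no illegal characters, no leading -, _, or +.
function isValidIndexName(name) {
  return (
    name.length > 0 &&
    name === name.toLowerCase() &&
    !/[\\\/*?"<>| ,#:]/.test(name) &&
    !/^[-_+]/.test(name) &&
    name !== '.' &&
    name !== '..'
  );
}
```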


@ -1,97 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import expect from '@kbn/expect';
import path from 'path';
import uuid from 'uuid/v4';
export default function ({ getService, getPageObjects }) {
const PageObjects = getPageObjects(['maps', 'common']);
const log = getService('log');
const security = getService('security');
const retry = getService('retry');
async function loadFileAndIndex(loadFileName) {
log.debug(`Uploading ${loadFileName} for indexing`);
await PageObjects.maps.uploadJsonFileForIndexing(
path.join(__dirname, 'test_upload_files', loadFileName)
);
await PageObjects.maps.waitForLayersToLoad();
await PageObjects.maps.doesLayerExist('Import File');
await PageObjects.maps.hasFilePickerLoadedFile(loadFileName);
const indexName = uuid();
await PageObjects.maps.setIndexName(indexName);
await retry.try(async () => {
const importButtonActive = await PageObjects.maps.importFileButtonEnabled();
expect(importButtonActive).to.be(true);
});
await PageObjects.maps.clickImportFileButton();
return indexName;
}
describe('geojson upload', () => {
before(async () => {
await security.testUser.setRoles(
['global_maps_all', 'geoall_data_writer', 'global_index_pattern_management_all'],
false
);
await PageObjects.maps.openNewMap();
});
after(async () => {
await security.testUser.restoreDefaults();
});
beforeEach(async () => {
await PageObjects.maps.clickAddLayer();
await PageObjects.maps.selectGeoJsonUploadSource();
});
afterEach(async () => {
await PageObjects.maps.closeOrCancelLayer();
await PageObjects.maps.waitForLayerAddPanelClosed();
});
it('should not activate add layer button until indexing succeeds', async () => {
await loadFileAndIndex('point.json');
let layerAddReady = await PageObjects.maps.importFileButtonEnabled();
expect(layerAddReady).to.be(false);
layerAddReady = await PageObjects.maps.importLayerReadyForAdd();
expect(layerAddReady).to.be(true);
});
it('should load geojson file as ES document source layer', async () => {
const indexName = await loadFileAndIndex('point.json');
const layerAddReady = await PageObjects.maps.importLayerReadyForAdd();
expect(layerAddReady).to.be(true);
await PageObjects.maps.clickImportFileButton();
const geojsonTempLayerExists = await PageObjects.maps.doesLayerExist('Import File');
expect(geojsonTempLayerExists).to.be(false);
const newIndexedLayerExists = await PageObjects.maps.doesLayerExist(indexName);
expect(newIndexedLayerExists).to.be(true);
});
it('should remove index layer on cancel', async () => {
const indexName = await loadFileAndIndex('point.json');
const layerAddReady = await PageObjects.maps.importLayerReadyForAdd();
expect(layerAddReady).to.be(true);
await PageObjects.maps.cancelLayerAdd();
const geojsonTempLayerExists = await PageObjects.maps.doesLayerExist('Import File');
expect(geojsonTempLayerExists).to.be(false);
const newIndexedLayerExists = await PageObjects.maps.doesLayerExist(indexName);
expect(newIndexedLayerExists).to.be(false);
});
});
}


@ -1,12 +0,0 @@
{
"type": "Feature",
"geometry": {
"type": "LineString",
"coordinates": [
[30, 10], [10, 30], [40, 40]
]
},
"properties": {
"name": "Line time"
}
}


@ -1,13 +0,0 @@
{
"type": "Feature",
"geometry": {
"type": "MultiLineString",
"coordinates": [
[[10, 10], [20, 20], [10, 40]],
[[40, 40], [30, 30], [40, 20], [30, 10]]
]
},
"properties": {
"name": "MultiLine time"
}
}


@ -1,12 +0,0 @@
{
"type": "Feature",
"geometry": {
"type": "MultiPoint",
"coordinates": [
[10, 40], [40, 30], [20, 20], [30, 10]
]
},
"properties": {
"name": "MultiPoint islands"
}
}


@ -1,18 +0,0 @@
{
"type": "Feature",
"geometry": {
"type": "MultiPolygon",
"coordinates": [
[
[[40, 40], [20, 45], [45, 30], [40, 40]]
],
[
[[20, 35], [10, 30], [10, 10], [30, 5], [45, 20], [20, 35]],
[[30, 20], [20, 15], [20, 25], [30, 20]]
]
]
},
"properties": {
"name": "Polys want features"
}
}


@ -1 +0,0 @@
This is not a json file.


@ -91,7 +91,7 @@ export default function ({ loadTestFile, getService }) {
loadTestFile(require.resolve('./mvt_scaling'));
loadTestFile(require.resolve('./mvt_geotile_grid'));
loadTestFile(require.resolve('./add_layer_panel'));
loadTestFile(require.resolve('./import_geojson'));
loadTestFile(require.resolve('./file_upload'));
loadTestFile(require.resolve('./layer_errors'));
loadTestFile(require.resolve('./visualize_create_menu'));
loadTestFile(require.resolve('./discover'));


@ -0,0 +1,88 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import _ from 'lodash';
import { FtrService } from '../ftr_provider_context';
export class GeoFileUploadPageObject extends FtrService {
private readonly header = this.ctx.getPageObject('header');
private readonly log = this.ctx.getService('log');
private readonly testSubjects = this.ctx.getService('testSubjects');
private readonly retry = this.ctx.getService('retry');
async isNextButtonEnabled(): Promise<boolean> {
const importFileButton = await this.testSubjects.find('importFileButton');
const isDisabled = await importFileButton.getAttribute('disabled');
const isEnabled = !isDisabled;
this.log.debug(`isNextButtonEnabled: ${isEnabled}`);
return isEnabled;
}
async clickNextButton() {
this.log.debug(`Clicking next button to advance to next step in wizard`);
await this.testSubjects.click('importFileButton');
}
async selectFile(selector: string, path: string) {
this.log.debug(`selectFile; selector: ${selector}, path: ${path}`);
const input = await this.testSubjects.find(selector);
await input.type(path);
}
async waitForFilePreview() {
await this.retry.waitFor('Wait for file preview', async () => {
return await this.isNextButtonEnabled();
});
}
async previewGeoJsonFile(path: string) {
await this.selectFile('geoFilePicker', path);
await this.waitForFilePreview();
await this.header.waitUntilLoadingHasFinished();
}
async previewShapefile(path: string) {
await this.selectFile('geoFilePicker', path);
await this.selectFile('shapefileSideCarFilePicker_dbf', path.replace('.shp', '.dbf'));
await this.selectFile('shapefileSideCarFilePicker_prj', path.replace('.shp', '.prj'));
await this.selectFile('shapefileSideCarFilePicker_shx', path.replace('.shp', '.shx'));
await this.waitForFilePreview();
await this.header.waitUntilLoadingHasFinished();
}
async setIndexName(indexName: string) {
this.log.debug(`Set index name: ${indexName}`);
await this.testSubjects.setValue('fileUploadIndexNameInput', indexName);
}
async uploadFile(): Promise<void> {
// next button is disabled while checking index name
// make sure next button is enabled before clicking it
await this.retry.waitFor('Wait for import button to be enabled', async () => {
return await this.isNextButtonEnabled();
});
await this.clickNextButton();
await this.retry.waitFor('wait for file upload status', async () => {
return await this.testSubjects.exists('fileUploadStatusCallout');
});
}
async addFileAsDocumentLayer() {
await this.clickNextButton();
await this.header.waitUntilLoadingHasFinished();
}
async getFileUploadStatusCalloutMsg() {
return await this.testSubjects.getVisibleText('fileUploadStatusCallout');
}
}
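`previewShapefile` relies on the shapefile sidecar convention: the `.dbf`, `.prj`, and `.shx` files must share the `.shp` file's base name. The same path derivation can be sketched as a helper (hypothetical; the method above inlines the equivalent `replace` calls):

```javascript
// Hypothetical helper: derive shapefile sidecar paths from a .shp path.
// Anchors the replacement to the file extension at the end of the path.
function getSidecarPaths(shpPath) {
  return ['dbf', 'prj', 'shx'].map((ext) => shpPath.replace(/\.shp$/, `.${ext}`));
}
```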


@ -393,14 +393,6 @@ export class GisPageObject extends FtrService {
);
}
async hasFilePickerLoadedFile(fileName: string) {
this.log.debug(`Has file picker loaded file ${fileName}`);
const filePickerText = await this.find.byCssSelector('.euiFilePicker__promptText');
const filePickerTextContent = await filePickerText.getVisibleText();
return fileName === filePickerTextContent;
}
/*
* Layer panel utility functions
*/
@ -461,62 +453,6 @@ export class GisPageObject extends FtrService {
}
}
async importFileButtonEnabled() {
this.log.debug(`Check "Import file" button enabled`);
const importFileButton = await this.testSubjects.find('importFileButton');
const isDisabled = await importFileButton.getAttribute('disabled');
return !isDisabled;
}
async importLayerReadyForAdd() {
this.log.debug(`Wait until import complete`);
await this.testSubjects.find('indexRespCopyButton', 5000);
let layerAddReady = false;
await this.retry.waitForWithTimeout('Add layer button ready', 2000, async () => {
layerAddReady = await this.importFileButtonEnabled();
return layerAddReady;
});
return layerAddReady;
}
async clickImportFileButton() {
this.log.debug(`Click "Import file" button`);
await this.testSubjects.click('importFileButton');
}
async setIndexName(indexName: string) {
this.log.debug(`Set index name to: ${indexName}`);
await this.testSubjects.setValue('fileUploadIndexNameInput', indexName);
}
async setIndexType(indexType: string) {
this.log.debug(`Set index type to: ${indexType}`);
await this.testSubjects.selectValue('fileImportIndexSelect', indexType);
}
async indexTypeOptionExists(indexType: string) {
this.log.debug(`Check index type "${indexType}" available`);
return await this.find.existsByCssSelector(
`select[data-test-subj="fileImportIndexSelect"] > option[value="${indexType}"]`
);
}
async clickCopyButton(dataTestSubj: string): Promise<string> {
this.log.debug(`Click ${dataTestSubj} copy button`);
await this.testSubjects.click(dataTestSubj);
return await this.browser.getClipboardValue();
}
async getIndexResults() {
return JSON.parse(await this.clickCopyButton('indexRespCopyButton'));
}
async getIndexPatternResults() {
return JSON.parse(await this.clickCopyButton('indexPatternRespCopyButton'));
}
async setLayerQuery(layerName: string, query: string) {
await this.openLayerPanel(layerName);
await this.testSubjects.click('mapLayerPanelOpenFilterEditorButton');
@ -571,17 +507,9 @@ export class GisPageObject extends FtrService {
await this.testSubjects.click('emsBoundaries');
}
async selectGeoJsonUploadSource() {
this.log.debug(`Select upload geojson source`);
await this.testSubjects.click('uploadGeoJson');
}
async uploadJsonFileForIndexing(path: string) {
await this.common.setFileInputPath(path);
this.log.debug(`File selected`);
await this.header.waitUntilLoadingHasFinished();
await this.waitForLayersToLoad();
async selectFileUploadCard() {
this.log.debug(`Select upload file card`);
await this.testSubjects.click('uploadFile');
}
async selectVectorLayer(vectorLayerName: string) {


@ -20,6 +20,7 @@ import { ObservabilityPageProvider } from './observability_page';
import { InfraHomePageProvider } from './infra_home_page';
import { InfraLogsPageProvider } from './infra_logs_page';
import { GisPageObject } from './gis_page';
import { GeoFileUploadPageObject } from './geo_file_upload';
import { StatusPageObject } from './status_page';
import { UpgradeAssistantPageObject } from './upgrade_assistant_page';
import { RollupPageObject } from './rollup_page';
@ -63,6 +64,7 @@ export const pageObjects = {
infraLogs: InfraLogsPageProvider,
infraSavedViews: InfraSavedViewsProvider,
maps: GisPageObject,
geoFileUpload: GeoFileUploadPageObject,
statusPage: StatusPageObject,
upgradeAssistant: UpgradeAssistantPageObject,
uptime: UptimePageObject,


@ -1992,7 +1992,7 @@
core-js-pure "^3.0.0"
regenerator-runtime "^0.13.4"
"@babel/runtime@^7.0.0", "@babel/runtime@^7.1.2", "@babel/runtime@^7.10.2", "@babel/runtime@^7.11.2", "@babel/runtime@^7.12.5", "@babel/runtime@^7.13.10", "@babel/runtime@^7.14.0", "@babel/runtime@^7.16.0", "@babel/runtime@^7.16.7", "@babel/runtime@^7.3.1", "@babel/runtime@^7.4.4", "@babel/runtime@^7.4.5", "@babel/runtime@^7.5.0", "@babel/runtime@^7.5.5", "@babel/runtime@^7.6.2", "@babel/runtime@^7.6.3", "@babel/runtime@^7.7.2", "@babel/runtime@^7.7.6", "@babel/runtime@^7.8.4", "@babel/runtime@^7.8.7", "@babel/runtime@^7.9.2":
"@babel/runtime@^7.0.0", "@babel/runtime@^7.1.2", "@babel/runtime@^7.10.2", "@babel/runtime@^7.11.2", "@babel/runtime@^7.12.0", "@babel/runtime@^7.12.5", "@babel/runtime@^7.13.10", "@babel/runtime@^7.14.0", "@babel/runtime@^7.16.0", "@babel/runtime@^7.16.7", "@babel/runtime@^7.3.1", "@babel/runtime@^7.4.4", "@babel/runtime@^7.4.5", "@babel/runtime@^7.5.0", "@babel/runtime@^7.5.5", "@babel/runtime@^7.6.2", "@babel/runtime@^7.6.3", "@babel/runtime@^7.7.2", "@babel/runtime@^7.7.6", "@babel/runtime@^7.8.4", "@babel/runtime@^7.8.7", "@babel/runtime@^7.9.2":
version "7.16.7"
resolved "https://registry.yarnpkg.com/@babel/runtime/-/runtime-7.16.7.tgz#03ff99f64106588c9c403c6ecb8c3bafbbdff1fa"
integrity sha512-9E9FJowqAsytyOY6LG+1KuueckRL+aQW+mKvXRXnuFGyRAyepJPmEo9vgMfXUA6O9u3IeEdv9MAkppFcaQwogQ==
@ -3994,6 +3994,14 @@
"@babel/runtime" "^7.3.1"
"@loaders.gl/loader-utils" "2.3.1"
"@loaders.gl/core@2.3.13":
version "2.3.13"
resolved "https://registry.yarnpkg.com/@loaders.gl/core/-/core-2.3.13.tgz#093fe965cfab0a72c902a63d461282ae1ed55dc2"
integrity sha512-Hjm8eJjS/OUnaHrOSgXtE+qDg5V4Do0jIpp2u0Dv3CMxPrtd2TpwkDfAyZWmmbZew9rzqPoAVMINejS/ItWUeg==
dependencies:
"@babel/runtime" "^7.3.1"
"@loaders.gl/loader-utils" "2.3.13"
"@loaders.gl/gis@2.3.1":
version "2.3.1"
resolved "https://registry.yarnpkg.com/@loaders.gl/gis/-/gis-2.3.1.tgz#f867dbd32e1287b81da888f5dca6c4fe5c1d2c64"
@ -4003,6 +4011,15 @@
"@mapbox/vector-tile" "^1.3.1"
pbf "^3.2.1"
"@loaders.gl/gis@2.3.13":
version "2.3.13"
resolved "https://registry.yarnpkg.com/@loaders.gl/gis/-/gis-2.3.13.tgz#b80cda7e8f709efd0a5a9badf7daf9c68b6b0409"
integrity sha512-i+hot7QeW53GhRwnvF5H65lsZYv4/ESbFuGtNy5TKivPaTIqn1oIFtLOku9Ntw5xTfky9qNNlbMPcsDMoniavQ==
dependencies:
"@loaders.gl/loader-utils" "2.3.13"
"@mapbox/vector-tile" "^1.3.1"
pbf "^3.2.1"
"@loaders.gl/json@^2.3.1":
version "2.3.1"
resolved "https://registry.yarnpkg.com/@loaders.gl/json/-/json-2.3.1.tgz#d1707f95ca3a1413c37cfaaefb103cc14b2632ff"
@ -4020,6 +4037,14 @@
"@babel/runtime" "^7.3.1"
"@probe.gl/stats" "^3.3.0"
"@loaders.gl/loader-utils@2.3.13":
version "2.3.13"
resolved "https://registry.yarnpkg.com/@loaders.gl/loader-utils/-/loader-utils-2.3.13.tgz#5cf6403b1c19b2fe5abacbd89e6252b8ca50db96"
integrity sha512-vXzH5CWG8pWjUEb7hUr6CM4ERj4NVRpA60OxvVv/OaZZ7hNN63+9/tSUA5IXD9QArWPWrFBnKnvE+5gg4WNqTg==
dependencies:
"@babel/runtime" "^7.3.1"
"@probe.gl/stats" "^3.3.0"
"@loaders.gl/polyfills@^2.3.5":
version "2.3.5"
resolved "https://registry.yarnpkg.com/@loaders.gl/polyfills/-/polyfills-2.3.5.tgz#5f32b732c8f2a7e80c221f19ee2d01725f09b5c3"
@ -4033,6 +4058,16 @@
through "^2.3.8"
web-streams-polyfill "^3.0.0"
"@loaders.gl/shapefile@^2.3.1":
version "2.3.13"
resolved "https://registry.yarnpkg.com/@loaders.gl/shapefile/-/shapefile-2.3.13.tgz#5c6c9cc4a113c2f6739c0eb553ae9ceb9d1840ae"
integrity sha512-WPeWZsPBmr0VnivmJ4ELBnVefxl1IpoTctd6ODAr2YTU7t2I/5UJZymgXlYxazoo1eEpodCEyCGaZWK1i/9g2Q==
dependencies:
"@loaders.gl/gis" "2.3.13"
"@loaders.gl/loader-utils" "2.3.13"
"@loaders.gl/tables" "2.3.13"
"@math.gl/proj4" "^3.3.0"
"@loaders.gl/tables@2.3.1":
version "2.3.1"
resolved "https://registry.yarnpkg.com/@loaders.gl/tables/-/tables-2.3.1.tgz#2095dba9b1ceeb633aa0a0a4f282c7c27d02db80"
@ -4041,6 +4076,14 @@
"@loaders.gl/core" "2.3.1"
d3-dsv "^1.2.0"
"@loaders.gl/tables@2.3.13":
version "2.3.13"
resolved "https://registry.yarnpkg.com/@loaders.gl/tables/-/tables-2.3.13.tgz#df30cbb02110e4857e2f3e2a07b64fcf8e0e5b78"
integrity sha512-NHEEUJ/08d3QjMl71CL8MDFUdbKclsjbAVVl9Hiud47qsS7eWwU5g7bayEW0dSawMhZVu+dbbpOyziv/QgDizw==
dependencies:
"@loaders.gl/core" "2.3.13"
d3-dsv "^1.2.0"
"@mapbox/extent@0.4.0":
version "0.4.0"
resolved "https://registry.yarnpkg.com/@mapbox/extent/-/extent-0.4.0.tgz#3e591f32e1f0c3981c864239f7b0ac06e610f8a9"
@ -4151,6 +4194,24 @@
resolved "https://registry.yarnpkg.com/@mapbox/whoots-js/-/whoots-js-3.1.0.tgz#497c67a1cef50d1a2459ba60f315e448d2ad87fe"
integrity sha512-Es6WcD0nO5l+2BOQS4uLfNPYQaNDfbot3X1XUoloz+x0mPDS3eeORZJl06HXjwBG1fOGwCRnzK88LMdxKRrd6Q==
"@math.gl/core@3.5.6":
version "3.5.6"
resolved "https://registry.yarnpkg.com/@math.gl/core/-/core-3.5.6.tgz#d849db978d7d4a4984bb63868adc693975d97ce7"
integrity sha512-UGCKtJUBA9MBswclK6l8DuZqLcEnqlSfI57WzSDUB/Nki4tHfmdImJyhp8ky9W4cIahV1YLMhHRK1oRpLeC1sw==
dependencies:
"@babel/runtime" "^7.12.0"
gl-matrix "~3.3.0"
"@math.gl/proj4@^3.3.0":
version "3.5.6"
resolved "https://registry.yarnpkg.com/@math.gl/proj4/-/proj4-3.5.6.tgz#617aba4cc2a20621550eeaca019bdb35a789d5e0"
integrity sha512-X04gccjujg3SpHUM+0l127jY8MuZch33Q8KBBTswrUNq08SRmnH2zCC6PtWAFFmszRmkcAbJlo6SlDBHYPu00g==
dependencies:
"@babel/runtime" "^7.12.0"
"@math.gl/core" "3.5.6"
"@types/proj4" "^2.5.0"
proj4 "2.6.2"
"@mdx-js/loader@^1.6.22":
version "1.6.22"
resolved "https://registry.yarnpkg.com/@mdx-js/loader/-/loader-1.6.22.tgz#d9e8fe7f8185ff13c9c8639c048b123e30d322c4"
@ -7184,6 +7245,11 @@
resolved "https://registry.yarnpkg.com/@types/prismjs/-/prismjs-1.16.3.tgz#73ae78b3e339777a1a1b7a8df89dcd6b8fe750c5"
integrity sha512-7lbX0Odbg9rnzXRdYdgPQZFkjd38QHpD6tvWxbLi6VXGQbXr054doixIS+TwftHP6afffA1zxCZrIcJRS/MkYQ==
"@types/proj4@^2.5.0":
version "2.5.2"
resolved "https://registry.yarnpkg.com/@types/proj4/-/proj4-2.5.2.tgz#e3afa4e09e5cf08d8bc74e1b3de3b2111324ee33"
integrity sha512-/Nmfn9p08yaYw6xo5f2b0L+2oHk2kZeOkp5v+4VCeNfq+ETlLQbmHmC97/pjDIEZy8jxwz7pdPpwNzDHM5cuJw==
"@types/prop-types@*":
version "15.7.1"
resolved "https://registry.yarnpkg.com/@types/prop-types/-/prop-types-15.7.1.tgz#f1a11e7babb0c3cad68100be381d1e064c68f1f6"
@ -15738,7 +15804,7 @@ github-slugger@^1.0.0:
dependencies:
emoji-regex ">=6.0.0 <=6.1.1"
gl-matrix@^3.2.1:
gl-matrix@^3.2.1, gl-matrix@~3.3.0:
version "3.3.0"
resolved "https://registry.yarnpkg.com/gl-matrix/-/gl-matrix-3.3.0.tgz#232eef60b1c8b30a28cbbe75b2caf6c48fd6358b"
integrity sha512-COb7LDz+SXaHtl/h4LeaFcNdJdAQSDeVqjiIihSXNrkWObZLhDI4hIkZC11Aeqp7bcE72clzB0BnDXr2SmslRA==
@ -20539,6 +20605,11 @@ methods@^1.1.1, methods@~1.1.2:
resolved "https://registry.yarnpkg.com/methods/-/methods-1.1.2.tgz#5529a4d67654134edcc5266656835b0f851afcee"
integrity sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=
mgrs@1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/mgrs/-/mgrs-1.0.0.tgz#fb91588e78c90025672395cb40b25f7cd6ad1829"
integrity sha1-+5FYjnjJACVnI5XLQLJffNatGCk=
microevent.ts@~0.1.1:
version "0.1.1"
resolved "https://registry.yarnpkg.com/microevent.ts/-/microevent.ts-0.1.1.tgz#70b09b83f43df5172d0205a63025bce0f7357fa0"
@ -23624,6 +23695,14 @@ progress@^2.0.0, progress@^2.0.1, progress@^2.0.3:
resolved "https://registry.yarnpkg.com/progress/-/progress-2.0.3.tgz#7e8cf8d8f5b8f239c1bc68beb4eb78567d572ef8"
integrity sha512-7PiHtLll5LdnKIMw100I+8xJXR5gW2QwWYkT6iJva0bXitZKa/XMrSbdmg3r2Xnaidz9Qumd0VPaMrZlF9V9sA==
proj4@2.6.2:
version "2.6.2"
resolved "https://registry.yarnpkg.com/proj4/-/proj4-2.6.2.tgz#4665d7cbc30fd356375007c2fed53b07dbda1d67"
integrity sha512-Pn0+HZtXb4JzuN8RR0VM7yyseegiYHbXkF+2FOdGpzRojcZ1BTjWxOh7qfp2vH0EyLu8pvcrhLxidwzgyUy/Gw==
dependencies:
mgrs "1.0.0"
wkt-parser "^1.2.4"
promise-inflight@^1.0.1:
version "1.0.1"
resolved "https://registry.yarnpkg.com/promise-inflight/-/promise-inflight-1.0.1.tgz#98472870bf228132fcbdd868129bad12c3c029e3"
@ -30645,6 +30724,11 @@ winston@^3.0.0, winston@^3.3.3:
triple-beam "^1.3.0"
winston-transport "^4.4.0"
wkt-parser@^1.2.4:
version "1.3.2"
resolved "https://registry.yarnpkg.com/wkt-parser/-/wkt-parser-1.3.2.tgz#deeff04a21edc5b170a60da418e9ed1d1ab0e219"
integrity sha512-A26BOOo7sHAagyxG7iuRhnKMO7Q3mEOiOT4oGUmohtN/Li5wameeU4S6f8vWw6NADTVKljBs8bzA8JPQgSEMVQ==
word-wrap@^1.2.3, word-wrap@~1.2.3:
version "1.2.3"
resolved "https://registry.yarnpkg.com/word-wrap/-/word-wrap-1.2.3.tgz#610636f6b1f703891bd34771ccb17fb93b47079c"