Merge branch 'master' of github.com:elastic/kibana into feature-secops

Xavier Mouligneau 2019-01-08 16:13:46 -05:00
commit a44371b360
927 changed files with 27463 additions and 9441 deletions

View file

@@ -1,7 +1,8 @@
---
name: Accessibility Issue
about: Issues to help Elastic meet WCAG / Section 508 compliance
about: Issues to help Kibana be as keyboard-navigable, screen-readable, and accessible to all, with a focus on WCAG / Section 508 compliance
labels: accessibility
title: (Accessibility)
---
**Steps to reproduce (assumes [ChromeVox](https://chrome.google.com/webstore/detail/chromevox/kgejglhpjiefppelpmljglcjbhoiplfn) or similar)**
@@ -9,17 +10,20 @@ about: Issues to help Elastic meet WCAG / Section 508 compliance
1.
2.
3.
4.
[Screenshot here]
**Actual Result**
5.
4.
**Expected Result**
5.
[Link to meta issues here]
4.
**Meta Issue**
**Kibana Version:**
**Relevant WCAG Criteria:** (link to https://www.w3.org/WAI/WCAG21/quickref/?versions=2.0)
**Relevant WCAG Criteria:** [#.#.# WCAG Criterion](link to https://www.w3.org/WAI/WCAG21/quickref/?versions=2.0)

.sass-lint.yml Normal file
View file

@@ -0,0 +1,80 @@
files:
  include:
    - 'x-pack/plugins/rollup/**/*.s+(a|c)ss'
    - 'x-pack/plugins/security/**/*.s+(a|c)ss'
rules:
  quotes:
    - 2
    - style: 'single'
  # } else { style on one line, like our JS
  brace-style:
    - 2
    - style: '1tbs'
  variable-name-format:
    - 2
    - convention: 'camelcase'
  # Needs regex, right now we ignore
  class-name-format: 0
  # Order how you please
  property-sort-order: 0
  hex-notation:
    - 2
    - style: 'uppercase'
  mixin-name-format:
    - 2
    - allow-leading-underscore: false
      convention: 'camelcase'
  # Use none instead of 0 for no border
  border-zero:
    - 2
    - convention: 'none'
  force-element-nesting: 0
  # something { not something{
  space-before-brace:
    - 2
  force-pseudo-nesting: 0
  # 2 spaces for indentation
  indentation: 2
  function-name-format:
    - 2
    - allow-leading-underscore: false
      convention: 'camelcase'
  # This removes the need for ::hover
  pseudo-element: 0
  # ($var / 2) rather than ($var/2)
  space-around-operator: 2
  # We minify css, so this doesn't apply
  no-css-comments: 0
  # We use _ (underscores) for import paths that don't directly compile
  clean-import-paths: 0
  # Allows input[type=search]
  force-attribute-nesting: 0
  no-qualifying-elements:
    - 2
    # Allows input[type=search]
    - allow-element-with-attribute: 1
  # Files can end without a newline
  final-newline: 0
  # We use some rare duplicate property values for browser variance
  no-duplicate-properties:
    - 2
    - exclude:
        - 'font-size'
        - 'word-break'
  # Put a line-break between sections of CSS, but allow quick one-liners for legibility
  empty-line-between-blocks:
    - 2
    - allow-single-line-rulesets: 1
  # Warns are nice for deprecations and development
  no-warn: 0
  # Transition all is useful in certain situations and there's no recent info to suggest slowdown
  no-transition-all: 0
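Taken together, a stylesheet that satisfies these rules might look like this (a hand-written illustration, not a file from the repo):

```scss
// camelCase variables, single quotes, uppercase hex, 2-space indentation
$panelBackground: #F5F7FA;

.chartPanel {
  background-color: $panelBackground;
  border: none; // border-zero: use none instead of 0
  font-family: 'Inter', sans-serif;
}
```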

View file

@@ -1,5 +1,5 @@
Kibana source code with Kibana X-Pack source code
Copyright 2012-2018 Elasticsearch B.V.
Copyright 2012-2019 Elasticsearch B.V.
---
This product has relied on ASTExplorer that is licensed under MIT.

View file

@@ -22,7 +22,6 @@ This section summarizes the changes in each release.
[[release-notes-7.0.0-alpha2]]
== {kib} 7.0.0-alpha2
coming[7.0.0-alpha2]
[float]
[[breaking-7.0.0-alpha2]]

View file

@@ -7,7 +7,7 @@
Elastic Application Performance Monitoring (APM) automatically collects in-depth
performance metrics and errors from inside your applications.
The **APM** page in {kib} is provided with the {xpack} basic license. It
The **APM** page in {kib} is provided with the basic license. It
enables developers to drill down into the performance data for their applications
and quickly locate the performance bottlenecks.

View file

@@ -52,19 +52,11 @@ triangle that appears next to the URL line of the request. Notice that as you mo
.The Action Menu
image::dev-tools/console/images/introduction_action_menu.png["The Action Menu",width=400,align="center"]
When the response come back, you should see it in the left hand panel:
When the response comes back, you should see it in the right-hand panel:
.The Output Pane
image::dev-tools/console/images/introduction_output.png[Screenshot]
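For reference, a minimal Console request of the kind shown above looks like the following (the `match_all` query here is only an illustration):

```console
GET _search
{
  "query": {
    "match_all": {}
  }
}
```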
[float]
[[console-ui]]
=== The Console UI
In this section you will find a more detailed description of the Console UI. The basic aspects of the UI are explained
in the <<console-kibana>> section.
include::multi-requests.asciidoc[]
include::auto-formatting.asciidoc[]

View file

@@ -14,8 +14,8 @@ with over 120 reusable grok patterns. See
https://github.com/elastic/elasticsearch/tree/master/modules/ingest-common/src/main/resources/patterns[Ingest node grok patterns] and https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns[Logstash grok patterns]
for the full list of patterns.
{xpack} includes a Grok Debugger tool that you can use to build and debug
grok patterns before you use them in your data processing pipelines. Because
You can build and debug grok patterns in the Grok Debugger tool in {kib}
before you use them in your data processing pipelines. Because
ingest node and Logstash share the same grok implementation and pattern
libraries, any grok pattern that you create in the Grok Debugger will work
in ingest node and Logstash.
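As an illustration (the log line and pattern below are sample data for this example, not taken from this document), the Grok Debugger accepts a sample message and a pattern such as:

```text
Sample data:  55.3.244.1 GET /index.html 15824 0.043
Grok pattern: %{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}
```

Once the pattern matches in the debugger, it can be reused unchanged in an ingest node grok processor or a Logstash grok filter.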

View file

@@ -9,7 +9,7 @@ ifdef::gs-mini[]
== Getting Started
endif::gs-mini[]
The Search Profiler is automatically enabled in {kib}. It is located under the
The {searchprofiler} is automatically enabled in {kib}. It is located under the
*Dev Tools* tab in {kib}.
[[first-profile]]
@@ -18,24 +18,24 @@ To start profiling queries:
. Open Kibana in your web browser and log in. If you are running Kibana
locally, go to `http://localhost:5601/`.
. Click **DevTools** in the side navigation to open the Search Profiler.
. Click **DevTools** in the side navigation to open the {searchprofiler}.
Console is the default tool to open when first accessing DevTools.
+
image::dev-tools/searchprofiler/images/gs1.png["Opening DevTools"]
+
On the top navigation bar, click the second item: *Search Profiler*
+
image::dev-tools/searchprofiler/images/gs2.png["Opening the Search Profiler"]
image::dev-tools/searchprofiler/images/gs2.png["Opening the {searchprofiler}"]
. This opens the Search Profiler interface.
. This opens the {searchprofiler} interface.
+
image::dev-tools/searchprofiler/images/gs3.png["Search Profiler Interface"]
image::dev-tools/searchprofiler/images/gs3.png["{searchprofiler} Interface"]
. Replace the default `match_all` query with the query you want to profile and click *Profile*.
+
image::dev-tools/searchprofiler/images/gs4.png["Profiling the match_all query"]
+
Search Profiler displays the names of the indices searched, the shards in each index,
{searchprofiler} displays the names of the indices searched, the shards in each index,
and how long the query took. The following example shows the results of profiling
the match_all query. Three indices were searched: `.monitoring-kibana-2-2016.11.30`,
`.monitoring-data-2` and `test`.
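Under the hood, the tool visualizes the output of the {es} profile API; a raw equivalent of the profiled `match_all` request above (a sketch, not necessarily the UI's exact payload) is:

```console
GET /_search
{
  "profile": true,
  "query": {
    "match_all": {}
  }
}
```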
@@ -64,7 +64,7 @@ This displays details about the query component(s) that ran on the shard.
+
In this example, there is a single `"MatchAllDocsQuery"` that ran on the shard.
Since it was the only query run, it took 100% of the time. When you mouse over
a row, the Search Profiler displays additional information about the query component."
a row, the {searchprofiler} displays additional information about the query component.
+
image::dev-tools/searchprofiler/images/gs6.png["Profile details for the first shard"]
+

View file

@@ -1,3 +1,4 @@
[role="xpack"]
[[xpack-profiler]]
= Profiling your Queries and Aggregations
@@ -7,12 +8,12 @@ Elasticsearch has a powerful profiler API which can be used to inspect and analy
your search queries. The response, however, is a very large JSON blob which is
difficult to analyze by hand.
{xpack} includes the Search Profiler tool which can transform this JSON output
The {searchprofiler} tool can transform this JSON output
into a visualization that is easy to navigate, allowing you to diagnose and debug
poorly performing queries much faster.
image::dev-tools/searchprofile/images/overview.png["Search Profiler Visualization"]
image::dev-tools/searchprofile/images/overview.png["{searchprofiler} Visualization"]
--

View file

@@ -6,12 +6,12 @@ Elasticsearch has a powerful profiler API which can be used to inspect and analy
your search queries. The response, however, is a very large JSON blob which is
difficult to analyze by hand.
{xpack} includes the Search Profiler tool which can transform this JSON output
The {searchprofiler} tool can transform this JSON output
into a visualization that is easy to navigate, allowing you to diagnose and debug
poorly performing queries much faster.
image::dev-tools/searchprofiler/images/overview.png["Search Profiler Visualization"]
image::dev-tools/searchprofiler/images/overview.png["{searchprofiler} Visualization"]
include::getting-started.asciidoc[]

View file

@@ -2,7 +2,7 @@
[[profiler-index]]
=== Index and Type filtering
By default, all queries executed by the Search Profiler are sent
By default, all queries executed by the {searchprofiler} are sent
to `GET /_search`. It searches across your entire cluster (all indices, all types).
If you need to query a specific index or type (or several), you can use the Index

View file

@@ -2,7 +2,7 @@
[[profiler-complicated]]
=== Profiling a more complicated query
To understand how the query trees are displayed inside the Search Profiler,
To understand how the query trees are displayed inside the {searchprofiler},
let's look at a more complicated query.
. Index the following data:
@@ -118,6 +118,6 @@ image::dev-tools/searchprofiler/images/gs10.png["Drilling into the first shard's
Click a shard's Expand button to view the aggregation details. Hover over an
aggregation row to view the timing breakdown.
For more information about how the Search Profiler works, how timings are calculated, and
For more information about how the {searchprofiler} works, how timings are calculated, and
how to interpret various results, see
{ref}/search-profile-queries.html[Profiling queries].

View file

@@ -2,7 +2,7 @@
[[profiler-render]]
=== Rendering pre-captured profiler JSON
The Search Profiler queries the cluster that the Kibana node is attached to.
The {searchprofiler} queries the cluster that the Kibana node is attached to.
It does this by executing the query against the cluster and collecting the results.
This is convenient, but sometimes performance problems are temporal in nature. For example,
@@ -10,7 +10,7 @@ a query might only be slow at certain times of day when many customers are using
You can setup a process to automatically profile slow queries when they occur and then
save those profile responses for later analysis.
The Search Profiler supports this workflow by enabling you to paste the
The {searchprofiler} supports this workflow by enabling you to paste the
pre-captured JSON. The tool will detect that this is a profiler response JSON
rather than a query, and render the visualization rather than querying the cluster.


View file

@@ -9,8 +9,6 @@ patterns, advanced settings that tweak the behaviors of Kibana itself, and
the various "objects" that you can save throughout Kibana such as searches,
visualizations, and dashboards.
This section is pluginable, so in addition to the out of the box capabilities,
packs such as {xpack} can add additional management capabilities to Kibana.
--
include::management/managing-licenses.asciidoc[]

View file

@@ -78,3 +78,12 @@ You must first stop a rollup job before deleting it.
[role="screenshot"]
image::images/management_rollup_job_details.png["Rollup job details"]
You can start, stop, and delete an existing rollup job, but edits are not supported.
If you want to make any changes, delete the existing job and create a new one with
the updated specifications. Be sure to use a different name for the new rollup job;
reusing the same name could lead to problems with mismatched job configurations.
More details about the {ref}/rollup-job-config.html[rollup job configuration]
can be found in the {es} documentation.

View file

@@ -15,10 +15,15 @@ an item for creating a rollup index pattern, if a rollup index is detected in th
image::images/management_create_rollup_menu.png[Create index pattern menu]
You can match an index pattern to only rolled up data, or mix both rolled up
and raw data to visualize all data together. An index
pattern can match only one rolled up index, not multiple. There is no restriction
on the number of standard indices that an index pattern can match. To match multiple indices, use a comma
to separate the names, with no space after the comma.
and raw data to visualize all data together. An index pattern can match only one
rolled up index, not multiple. There is no restriction on the number of standard
indices that an index pattern can match.
Combination index patterns use the same
notation as other multiple indices in {es}. To match multiple indices to create a
combination index pattern, use a comma to separate the names, with no space after the comma.
The notation for wildcards (`*`) and the ability to "exclude" (`-`) also apply
(for example, `test*,-test3`).
When creating an index pattern, you're asked to set a time field for filtering.
With a rollup index, the time filter field is the same field used for

View file

@@ -1,70 +1,14 @@
[role="xpack"]
[[xpack-upgrade-assistant]]
[[upgrade-assistant]]
== Upgrade Assistant
The Upgrade Assistant helps you prepare to upgrade from Elasticsearch 5.x to
Elasticsearch 6.0. It identifies deprecated settings, simplifies reindexing
your pre-5.x indices, and upgrades the internal indices used by Kibana and
X-Pack to the format required in 6.0.
The Upgrade Assistant helps you prepare for your upgrade to {es} 8.0.
To access the assistant, go to *Management > 8.0 Upgrade Assistant*.
To access the Upgrade Assistant, go to **Management** and click the **Upgrade
Assistant** link in the Elasticsearch section.
The assistant identifies the deprecated settings in your cluster and indices
and guides you through the process of resolving issues, including reindexing.
[float]
[[cluster-checkup]]
=== Cluster Checkup
Before upgrading to Elasticsearch 8.0, make sure that you are using the final
7.x minor release to see the most up-to-date deprecation issues.
The first step in preparing to upgrade is to identify any deprecated settings
or features you are using. The Cluster Checkup runs a series of checks
against your cluster and indices and generates a report identifying
any issues that you need to resolve.
To run the Cluster Checkup, go to the **Cluster Checkup** tab in the
Upgrade Assistant. Issues that **must** be resolved before you can upgrade to
Elasticsearch 6.0 appear in red as errors.
If the checkup finds indices that need to be reindexed, you can
manage that process with the Reindex Helper. You can also manually reindex or
simply delete old indices if you are sure you no longer need them.
[float]
[[reindex-helper]]
=== Reindex Helper
If you have indices created in 2.x, you must reindex them before
upgrading to Elasticsearch 6.0. In addition, the internal Kibana and X-Pack
indices must be reindexed to upgrade them to the format required in 6.0.
To reindex indices with the Reindex Helper:
. **Back up your indices using {ref}/modules-snapshots.html[Snapshot and Restore].**
. Go to the **Reindex Helper** tab in the Upgrade Assistant.
. Click the **Reindex** button to reindex an index.
. Monitor the reindex task that is kicked off.
You can run any number of reindex tasks simultaneously. The processing status
is displayed for each index. You can stop a task by clicking **Cancel**. If
any step in the reindex process fails, you can reset the index by clicking
**Reset**.
You can also click **Refresh Indices** to remove any indices that have been
successfully reindexed from the list.
Reindexing tasks continue to run when you leave the Reindex Helper. When you
come back, click **Refresh Indices** to get the latest status of your indices.
NOTE: When you come back to the Reindex Helper, it shows any tasks that are
still running. You can cancel those tasks if you need to, but you won't have
access to the task progress and are blocked from resetting indices that fail
reindexing. This is because the index might have been modified outside of
your Kibana instance.
[float]
[[toggle-deprecation-logger]]
=== Toggle Deprecation Logger
To see the current deprecation logging status, go to the **Toggle Deprecation
Logging** tab in the Upgrade Assistant. Deprecation Logging is enabled by
default from Elasticsearch 5.x onward. If you have disabled deprecation logging, you
can click **Toggle Deprecation Logging** to re-enable it. This logs any
deprecated actions to your log directory so you can see what changes you need
to make to your code before upgrading.
[role="screenshot"]
image::images/management-upgrade-assistant-8.0.png[]

View file

@@ -23,6 +23,30 @@ Kibana 7.0 will only use the Node.js distribution included in the package.
*Impact:* There is no expected impact unless Kibana is installed in a non-standard way.
[float]
=== Removed support for using PhantomJS browser for screenshots in Reporting
*Details:* Since the first release of Kibana Reporting, PhantomJS was used as
the headless browser to capture screenshots of Kibana dashboards and
visualizations. In that short time, Chromium has started offering a new
headless browser library and the PhantomJS maintainers abandoned their project.
We started planning for a transition in 6.5.0, when we made Chromium the
default option, but allowed users to continue using Phantom with the
`xpack.reporting.capture.browser.type: phantom` setting. In 7.0, that setting
will still exist for compatibility, but the only valid option will be
`chromium`.
*Impact:* Before upgrading to 7.0, if you have `xpack.reporting.capture.browser.type`
set in kibana.yml, make sure it is set to `chromium`.
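For example, the only 7.0-compatible value for this kibana.yml entry is:

```yaml
xpack.reporting.capture.browser.type: chromium
```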
[NOTE]
============
Reporting 7.0 uses a version of the Chromium headless browser that does not
support RHEL 6, CentOS 6.x, and other old versions of Linux derived from RHEL 6.
This change effectively removes RHEL 6 OS server support from Kibana Reporting.
Users with RHEL 6 must upgrade to RHEL 7 to use Kibana Reporting starting with
version 7.0.0 of the Elastic stack.
============
[float]
=== Advanced setting query:queryString:options no longer applies to filters
*Details:* In previous versions of Kibana the Advanced Setting `query:queryString:options` was applied to both queries

View file

@@ -13,17 +13,10 @@ However, these docs will be kept up-to-date to reflect the current implementatio
[float]
[[reporting-nav-bar-extensions]]
=== Nav Bar Extensions
X-Pack uses the `NavBarExtensionsRegistryProvider` to register a navigation bar item for the
Dashboard/Discover/Visualize applications. These all use the `export-config` AngularJS directive to display the
Reporting options. Each of the different exportTypes requires the AngularJS controller that contains the
`export-config` directive to implement certain methods/properties that are used when creating the {reporting} job.
=== Share Menu Extensions
X-Pack uses the `ShareContextMenuExtensionsRegistryProvider` to register actions in the share menu.
If you wish to add Reporting integration via the navigation bar that emulates the way these options are chosen for
Dashboards, Visualize and Discover, you can reuse the `export-config` directive for the time being. Otherwise, you can
provide a custom UI that generates the URL that is used to generate the {reporting} job.
This integration will likely be changing in the near future as we move away from AngularJS towards React.
This integration will likely be changing in the near future as we move towards a unified actions abstraction across {kib}.
[float]
=== Generate Job URL

View file

@@ -2,7 +2,7 @@
[[reporting-getting-started]]
== Getting Started
{reporting} is automatically enabled in {kib}.
{reporting} is automatically enabled in {kib}.
To manually generate a report:
@@ -14,20 +14,24 @@ information, see <<secure-reporting>>.
. Open the dashboard, visualization, or saved search you want to include
in the report.
. Click *Reporting* in the {kib} toolbar:
. Click *Share* in the {kib} toolbar:
+
--
[role="screenshot"]
image:reporting/images/reporting-button.png["Reporting Button",link="reporting-button.png"]
image:reporting/images/share-button.png["Reporting Button",link="share-button.png"]
--
. Depending on the {kib} application, choose the appropriate options:
.. If you're on Discover, click the *Generate CSV* button.
.. If you're on Discover:
... Select *CSV Reports*
... Click the *Generate CSV* button.
.. If you're on Visualize or Dashboard:
... Select *PDF Reports*
... Select either *Optimize PDF for printing* or *Preserve existing layout in PDF*. For an explanation of the different layout modes, see <<pdf-layout-modes, PDF Layout Modes>>.
... Choose to enable *Optimize for printing* layout mode. For an explanation of the different layout modes, see <<pdf-layout-modes, PDF Layout Modes>>.
... Click the *Generate PDF* button.

View file

@@ -1,9 +1,10 @@
[role="xpack"]
[[xpack-reporting]]
= Reporting from Kibana
[partintro]
--
{xpack} enables you to generate reports that contain {kib} dashboards,
You can generate reports that contain {kib} dashboards,
visualizations, and saved searches. The reports are exported as
print-optimized PDF documents.
@@ -15,12 +16,12 @@ The following Reporting button appears in the {kib} toolbar:
image:images/reporting.jpg["Reporting",link="reporting.jpg"]
You can also {xpack-ref}/automating-report-generation.html[generate reports automatically].
You can also {stack-ov}/automating-report-generation.html[generate reports automatically].
IMPORTANT: Reports are stored in the `.reporting-*` indices. Any user with
access to these indices has access to every report generated by all users.
To use {reporting} in a production environment, {xpack-ref}/securing-reporting.html[secure
To use {reporting} in a production environment, {stack-ov}/securing-reporting.html[secure
the Reporting endpoints].
--


View file

@@ -4,7 +4,7 @@
[partintro]
--
{xpack} enables you to generate reports that contain {kib} dashboards,
You can generate reports that contain {kib} dashboards,
visualizations, and saved searches. Dashboards and visualizations are
exported as PDF documents, while saved searches in Discover
are exported to CSV.
@@ -13,10 +13,10 @@ NOTE: On Linux, the `libfontconfig` and `libfreetype6` packages and system
fonts are required to generate PDF reports. If no system fonts are available,
labels are not rendered correctly in the reports.
The following Reporting button is found in the {kib} toolbar:
Reporting is located in the share menu from the {kib} toolbar:
[role="screenshot"]
image::reporting/images/reporting-button.png["Reporting"]
image::reporting/images/share-button.png["Share"]
You can also <<automating-report-generation, generate reports automatically>>.
@@ -30,4 +30,4 @@ include::pdf-layout-modes.asciidoc[]
include::configuring-reporting.asciidoc[]
include::chromium-sandbox.asciidoc[]
include::reporting-troubleshooting.asciidoc[]
include::development/index.asciidoc[]
include::development/index.asciidoc[]

View file

@@ -5,7 +5,7 @@ When creating a PDF report, there are two layout modes *Optimize PDF for printin
--
[role="screenshot"]
image:reporting/images/pdf-reporting.png["PDF Reporting",link="pdf-reporting.png"]
image:reporting/images/preserve-layout-switch.png["PDF Reporting",link="preserve-layout-switch.png"]
--
[float]
@@ -26,4 +26,4 @@ This will create a PDF preserving the existing layout and size of the Visualizat
--
[role="screenshot"]
image:reporting/images/preserve-layout.png["Preserve existing layout in PDF",link="preserve-layout.png"]
--
--

View file

@@ -22,8 +22,9 @@ There is currently a known limitation with the Data Table visualization that onl
[float]
==== `You must install fontconfig and freetype for Reporting to work`
Reporting using PhantomJS, the default browser, relies on system packages. Install the appropriate fontconfig and freetype
packages for your distribution.
Reporting uses a headless browser on the Kibana server, which relies on some
system packages. Install the appropriate fontconfig and freetype packages for
your distribution.
[float]
==== `Max attempts reached (3)`
@@ -54,11 +55,6 @@ the CAP_SYS_ADMIN capability.
Elastic recommends that you research the feasibility of enabling unprivileged user namespaces before disabling the sandbox. An exception
is if you are running Kibana in Docker because the container runs in a user namespace with the built-in seccomp/bpf filters.
[float]
==== `spawn EACCES`
Ensure that the `phantomjs` binary in your Kibana data directory is owned by the user who is running Kibana, that the user has the execute permission,
and if applicable, that the filesystem is mounted with the `exec` option.
[float]
==== `Caught error spawning Chromium`
Ensure that the `headless_shell` binary located in your Kibana data directory is owned by the user who is running Kibana, that the user has the execute permission,

View file

@@ -2,11 +2,12 @@
[[xpack-security]]
== Security
{xpack} security enables you to easily secure a cluster. With security, you can
The {stack} {security-features} enable you to easily secure a cluster. With
security, you can
password-protect your data as well as implement more advanced security measures
such as encrypting communications, role-based access control, IP filtering, and
auditing. For more information, see
{xpack-ref}/elasticsearch-security.html[Securing {es} and {kib}] and
{stack-ov}/elasticsearch-security.html[Securing {es} and {kib}] and
<<using-kibana-with-security,Configuring Security in {kib}>>.
[float]
@@ -15,7 +16,7 @@ auditing. For more information, see
You can create and manage users on the *Management* / *Security* / *Users* page.
You can also change their passwords and roles. For more information about
authentication and built-in users, see
{xpack-ref}/setting-up-authentication.html[Setting Up User Authentication].
{stack-ov}/setting-up-authentication.html[Setting up user authentication].
[float]
=== Roles
@@ -25,7 +26,7 @@ You can manage roles on the *Management* / *Security* / *Roles* page, or use
{kib} see <<xpack-security-authorization, {kib} Authorization>>.
For a more holistic overview of configuring roles for the entire stack,
see {xpack-ref}/authorization.html[Configuring Role-based Access Control].
see {stack-ov}/authorization.html[Configuring role-based access control].
[NOTE]
============================================================================

View file

@@ -17,7 +17,7 @@ Set to `true` (default) to enable the <<xpack-grokdebugger,Grok Debugger>>.
[float]
[[profiler-settings]]
==== Search Profiler Settings
==== {searchprofiler} Settings
`xpack.searchprofiler.enabled`::
Set to `true` (default) to enable the <<xpack-profiler,Search Profiler>>.
Set to `true` (default) to enable the <<xpack-profiler,{searchprofiler}>>.

View file

@@ -91,16 +91,8 @@ visualizations, try increasing this value.
Defaults to `3000` (3 seconds).
[[xpack-reporting-browser]]`xpack.reporting.capture.browser.type`::
Specifies the browser to use to capture screenshots. Valid options are `phantom`
and `chromium`. When `chromium` is set, the settings specified in the <<reporting-chromium-settings, Chromium settings>>
are respected. Defaults to `chromium`.
[NOTE]
============
Starting in 7.0, Phantom support will be removed from Kibana, and `chromium`
will be the only valid option for the `xpack.reporting.capture.browser.type` setting.
============
Specifies the browser to use to capture screenshots. This setting exists for
backward compatibility. The only valid option is `chromium`.
[float]
[[reporting-chromium-settings]]

View file

@@ -1,9 +1,8 @@
[float]
=== {component} TLS/SSL Settings
You can configure the following TLS/SSL settings. If the settings are not configured,
the {xpack} defaults will be used. See
{xpack-ref}/security-settings.html[Default TLS/SSL Settings].
//<<ssl-tls-settings, {xpack} defaults>> will be used.
You can configure the following TLS/SSL settings. If the settings are not
configured, the default values are used. See
{stack-ov}/security-settings.html[Default TLS/SSL Settings].
ifdef::server[]
+{ssl-prefix}.ssl.enabled+::
@@ -45,9 +44,9 @@ Java Cryptography Architecture documentation]. Defaults to the value of
The following settings are used to specify a private key, certificate, and the
trusted certificates that should be used when communicating over an SSL/TLS connection.
If none of the settings below are specified, this will default to the {xpack} defaults.
See {xpack-ref}/security-settings.html[Default TLS/SSL Settings].
//<<ssl-tls-settings, {xpack} defaults>>.
If none of the settings below are specified, the default values are used.
See {stack-ov}/security-settings.html[Default TLS/SSL settings].
ifdef::server[]
A private key and certificate must be configured.
endif::server[]
@@ -55,9 +54,8 @@ ifndef::server[]
A private key and certificate are optional and would be used if the server requires client authentication for PKI
authentication.
endif::server[]
If none of the settings below are specified, the {xpack} defaults will be used.
See {xpack-ref}/security-settings.html[Default TLS/SSL Settings].
//<<ssl-tls-settings, {xpack} defaults>> will be used.
If none of the settings below are specified, the default values are used.
See {stack-ov}/security-settings.html[Default TLS/SSL settings].
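As a sketch, a PEM-based configuration might look like the following (the +{ssl-prefix}+ placeholder above resolves to a concrete prefix per component; the `xpack.security.http` prefix and the file paths below are assumptions for illustration):

```yaml
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.key: /path/to/server.key                        # private key (PEM)
xpack.security.http.ssl.certificate: /path/to/server.crt                # server certificate
xpack.security.http.ssl.certificate_authorities: ["/path/to/ca.crt"]    # trusted CAs
```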
[float]
===== PEM Encoded Files

View file

@@ -1,9 +1,9 @@
[[production]]
== Using Kibana in a production environment
* <<configuring-kibana-shield, Using Kibana with {xpack}>>
* <<enabling-ssl, Enabling SSL>>
* <<load-balancing, Load balancing across multiple {es} nodes>>
* <<configuring-kibana-shield>>
* <<enabling-ssl>>
* <<load-balancing>>
How you deploy Kibana largely depends on your use case. If you are the only user,
you can run Kibana on your local machine and configure it to point to whatever
@@ -19,19 +19,19 @@ and an Elasticsearch client node on the same machine. For more information, see
[float]
[[configuring-kibana-shield]]
=== Using Kibana with {security}
=== Using {stack} {security-features}
You can use {stack-ov}/elasticsearch-security.html[{security}] to control what
Elasticsearch data users can access through Kibana.
You can use {stack-ov}/elasticsearch-security.html[{stack} {security-features}]
to control what {es} data users can access through Kibana.
When {security} is enabled, Kibana users have to log in. They need to
When {security-features} are enabled, Kibana users have to log in. They need to
have the `kibana_user` role as well as access to the indices they
will be working with in Kibana.
If a user loads a Kibana dashboard that accesses data in an index that they
are not authorized to view, they get an error that indicates the index does
not exist. {security} does not currently provide a way to control which
users can load which dashboards.
not exist. The {security-features} do not currently provide a way to control
which users can load which dashboards.
For information about setting up Kibana users, see
{kibana-ref}/using-kibana-with-security.html[Configuring security in Kibana].

View file

@@ -37,14 +37,15 @@
"preinstall": "node ./preinstall_check",
"kbn": "node scripts/kbn",
"es": "node scripts/es",
"elasticsearch": "echo 'use `yarn es snapshot -E path.data=../data/`'",
"test": "grunt test",
"test:dev": "grunt test:dev",
"test:quick": "grunt test:quick",
"test:browser": "grunt test:browser",
"test:ui": "echo 'use `node scripts/functional_tests`' && false",
"test:ui:server": "echo 'use `node scripts/functional_tests_server`' && false",
"test:ui:runner": "echo 'use `node scripts/functional_test_runner`' && false",
"test:jest": "node scripts/jest",
"test:mocha": "grunt test:mocha",
"test:ui": "node scripts/functional_tests",
"test:ui:server": "node scripts/functional_tests_server",
"test:ui:runner": "node scripts/functional_test_runner",
"test:server": "grunt test:server",
"test:coverage": "grunt test:coverage",
"checkLicenses": "grunt licenses --dev",
@@ -54,11 +55,12 @@
"debug-break": "node --nolazy --inspect-brk scripts/kibana --dev",
"precommit": "node scripts/precommit_hook",
"karma": "karma start",
"lint": "echo 'use `node scripts/eslint` and/or `node scripts/tslint`' && false",
"lintroller": "echo 'use `node scripts/eslint --fix` and/or `node scripts/tslint --fix`' && false",
"makelogs": "echo 'use `node scripts/makelogs`' && false",
"mocha": "echo 'use `node scripts/mocha`' && false",
"sterilize": "grunt sterilize",
"lint": "yarn run lint:es && yarn run lint:ts && yarn run lint:sass",
"lint:es": "node scripts/eslint",
"lint:ts": "node scripts/tslint",
"lint:sass": "node scripts/sasslint",
"makelogs": "node scripts/makelogs",
"mocha": "node scripts/mocha",
"uiFramework:start": "cd packages/kbn-ui-framework && yarn docSiteStart",
"uiFramework:build": "cd packages/kbn-ui-framework && yarn docSiteBuild",
"uiFramework:createComponent": "cd packages/kbn-ui-framework && yarn createComponent",
@@ -93,7 +95,7 @@
},
"dependencies": {
"@elastic/datemath": "5.0.2",
"@elastic/eui": "5.8.1",
"@elastic/eui": "6.0.1",
"@elastic/filesaver": "1.1.2",
"@elastic/good": "8.1.1-kibana2",
"@elastic/numeral": "2.3.2",
@@ -107,6 +109,7 @@
"@kbn/pm": "1.0.0",
"@kbn/test-subj-selector": "0.2.1",
"@kbn/ui-framework": "1.0.0",
"JSONStream": "1.1.1",
"abortcontroller-polyfill": "^1.1.9",
"angular": "1.6.9",
"angular-aria": "1.6.6",
@@ -238,7 +241,7 @@
"vision": "^5.3.3",
"webpack": "4.23.1",
"webpack-merge": "4.1.4",
"whatwg-fetch": "^2.0.3",
"whatwg-fetch": "^3.0.0",
"wreck": "^14.0.2",
"x-pack": "7.0.0",
"yauzl": "2.7.0"
@@ -270,7 +273,7 @@
"@types/enzyme": "^3.1.12",
"@types/eslint": "^4.16.2",
"@types/execa": "^0.9.0",
"@types/fetch-mock": "^5.12.2",
"@types/fetch-mock": "7.2.1",
"@types/getopts": "^2.0.0",
"@types/glob": "^5.0.35",
"@types/globby": "^8.0.0",
@@ -337,7 +340,7 @@
"eslint-plugin-react": "^7.11.1",
"expect.js": "0.3.1",
"faker": "1.1.0",
"fetch-mock": "^5.13.1",
"fetch-mock": "7.3.0",
"geckodriver": "1.12.2",
"getopts": "2.0.0",
"grunt": "1.0.1",
@@ -382,6 +385,7 @@
"prettier": "^1.14.3",
"proxyquire": "1.7.11",
"regenerate": "^1.4.0",
"sass-lint": "^1.12.1",
"simple-git": "1.37.0",
"sinon": "^5.0.7",
"strip-ansi": "^3.0.1",
@@ -397,8 +401,8 @@
"ts-node": "^7.0.1",
"tslint": "^5.11.0",
"tslint-config-prettier": "^1.15.0",
"tslint-plugin-prettier": "^2.0.0",
"tslint-microsoft-contrib": "^6.0.0",
"tslint-plugin-prettier": "^2.0.0",
"typescript": "^3.0.3",
"vinyl-fs": "^3.0.2",
"xml2js": "^0.4.19",


@@ -172,6 +172,7 @@ module.exports = {
'import/no-named-as-default': 'error',
'import/no-named-as-default-member': 'error',
'import/no-duplicates': 'error',
'import/no-dynamic-require': 'error',
'prefer-object-spread/prefer-object-spread': 'error',
}


@@ -26,7 +26,7 @@ module.exports = {
browsers: [
'last 2 versions',
'> 5%',
'Safari 7', // for PhantomJS support
'Safari 7', // for PhantomJS support: https://github.com/elastic/kibana/issues/27136
],
},
useBuiltIns: true,


@@ -52,6 +52,8 @@ function fromExpression(expression, parseOptions = {}, parse = parseKuery) {
return parse(expression, parseOptions);
}
// indexPattern isn't required, but if you pass one in, we can be more intelligent
// about how we craft the queries (e.g. scripted fields)
export function toElasticsearchQuery(node, indexPattern) {
if (!node || !node.type || !nodeTypes[node.type]) {
return toElasticsearchQuery(nodeTypes.function.buildNode('and', []));


@@ -57,12 +57,21 @@ describe('kuery functions', function () {
expect(_.isEqual(expected, result)).to.be(true);
});
it('should return an ES exists query without an index pattern', function () {
const expected = {
exists: { field: 'response' }
};
const existsNode = nodeTypes.function.buildNode('exists', 'response');
const result = exists.toElasticsearchQuery(existsNode);
expect(_.isEqual(expected, result)).to.be(true);
});
it('should throw an error for scripted fields', function () {
const existsNode = nodeTypes.function.buildNode('exists', 'script string');
expect(exists.toElasticsearchQuery)
.withArgs(existsNode, indexPattern).to.throwException(/Exists query does not support scripted fields/);
});
});
});
});


@@ -83,6 +83,14 @@ describe('kuery functions', function () {
expect(result.geo_bounding_box.geo).to.have.property('bottom_right', '50.73, -135.35');
});
it('should return an ES geo_bounding_box query without an index pattern', function () {
const node = nodeTypes.function.buildNode('geoBoundingBox', 'geo', params);
const result = geoBoundingBox.toElasticsearchQuery(node);
expect(result).to.have.property('geo_bounding_box');
expect(result.geo_bounding_box.geo).to.have.property('top_left', '73.12, -174.37');
expect(result.geo_bounding_box.geo).to.have.property('bottom_right', '50.73, -135.35');
});
it('should use the ignore_unmapped parameter', function () {
const node = nodeTypes.function.buildNode('geoBoundingBox', 'geo', params);
const result = geoBoundingBox.toElasticsearchQuery(node, indexPattern);


@@ -91,6 +91,17 @@ describe('kuery functions', function () {
});
});
it('should return an ES geo_polygon query without an index pattern', function () {
const node = nodeTypes.function.buildNode('geoPolygon', 'geo', points);
const result = geoPolygon.toElasticsearchQuery(node);
expect(result).to.have.property('geo_polygon');
expect(result.geo_polygon.geo).to.have.property('points');
result.geo_polygon.geo.points.forEach((point, index) => {
const expectedLatLon = `${points[index].lat}, ${points[index].lon}`;
expect(point).to.be(expectedLatLon);
});
});
it('should use the ignore_unmapped parameter', function () {
const node = nodeTypes.function.buildNode('geoPolygon', 'geo', points);


@@ -143,6 +143,21 @@ describe('kuery functions', function () {
expect(result).to.eql(expected);
});
it('should return an ES match query when a concrete fieldName and value are provided without an index pattern', function () {
const expected = {
bool: {
should: [
{ match: { extension: 'jpg' } },
],
minimum_should_match: 1
}
};
const node = nodeTypes.function.buildNode('is', 'extension', 'jpg');
const result = is.toElasticsearchQuery(node);
expect(result).to.eql(expected);
});
it('should support creation of phrase queries', function () {
const expected = {
bool: {


@@ -86,6 +86,28 @@ describe('kuery functions', function () {
expect(result).to.eql(expected);
});
it('should return an ES range query without an index pattern', function () {
const expected = {
bool: {
should: [
{
range: {
bytes: {
gt: 1000,
lt: 8000
}
}
}
],
minimum_should_match: 1
}
};
const node = nodeTypes.function.buildNode('range', 'bytes', { gt: 1000, lt: 8000 });
const result = range.toElasticsearchQuery(node);
expect(result).to.eql(expected);
});
it('should support wildcard field names', function () {
const expected = {
bool: {


@@ -17,6 +17,7 @@
* under the License.
*/
import { get } from 'lodash';
import * as literal from '../node_types/literal';
export function buildNodeParams(fieldName) {
@@ -28,7 +29,7 @@ export function buildNodeParams(fieldName) {
export function toElasticsearchQuery(node, indexPattern) {
const { arguments: [ fieldNameArg ] } = node;
const fieldName = literal.toElasticsearchQuery(fieldNameArg);
const field = indexPattern.fields.find(field => field.name === fieldName);
const field = get(indexPattern, 'fields', []).find(field => field.name === fieldName);
if (field && field.scripted) {
throw new Error(`Exists query does not support scripted fields`);
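The change above swaps a direct `indexPattern.fields.find(...)` for lodash's `get` with a default, so the lookup tolerates a missing `indexPattern` instead of throwing. A standalone sketch of the same defensive-lookup pattern (plain JavaScript, with a tiny single-key `get` stand-in so it runs without lodash; `findField` is an illustrative name, not the Kibana code):

```javascript
// Hypothetical stand-in for lodash's get(obj, path, default) — handles only
// a single-segment path, which is enough to illustrate the defensive lookup.
function get(obj, path, defaultValue) {
  const value = obj == null ? undefined : obj[path];
  return value === undefined ? defaultValue : value;
}

function findField(indexPattern, fieldName) {
  // Before: indexPattern.fields.find(...) — throws if indexPattern is undefined.
  // After:  get(indexPattern, 'fields', []).find(...) — returns undefined instead.
  return get(indexPattern, 'fields', []).find(field => field.name === fieldName);
}

const pattern = { fields: [{ name: 'response', scripted: false }] };
console.log(findField(pattern, 'response')); // → { name: 'response', scripted: false }
console.log(findField(undefined, 'response')); // → undefined
```

Falling back to an empty array keeps the call-site shape unchanged while making the index pattern argument genuinely optional.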


@@ -37,7 +37,7 @@ export function buildNodeParams(fieldName, params) {
export function toElasticsearchQuery(node, indexPattern) {
const [ fieldNameArg, ...args ] = node.arguments;
const fieldName = nodeTypes.literal.toElasticsearchQuery(fieldNameArg);
const field = indexPattern.fields.find(field => field.name === fieldName);
const field = _.get(indexPattern, 'fields', []).find(field => field.name === fieldName);
const queryParams = args.reduce((acc, arg) => {
const snakeArgName = _.snakeCase(arg.name);
return {


@@ -17,6 +17,7 @@
* under the License.
*/
import { get } from 'lodash';
import { nodeTypes } from '../node_types';
import * as ast from '../ast';
@@ -35,7 +36,7 @@ export function buildNodeParams(fieldName, points) {
export function toElasticsearchQuery(node, indexPattern) {
const [ fieldNameArg, ...points ] = node.arguments;
const fieldName = nodeTypes.literal.toElasticsearchQuery(fieldNameArg);
const field = indexPattern.fields.find(field => field.name === fieldName);
const field = get(indexPattern, 'fields', []).find(field => field.name === fieldName);
const queryParams = {
points: points.map(ast.toElasticsearchQuery)
};


@@ -44,6 +44,7 @@ export function buildNodeParams(fieldName, value, isPhrase = false) {
export function toElasticsearchQuery(node, indexPattern) {
const { arguments: [ fieldNameArg, valueArg, isPhraseArg ] } = node;
const fieldName = ast.toElasticsearchQuery(fieldNameArg);
const value = !_.isUndefined(valueArg) ? ast.toElasticsearchQuery(valueArg) : valueArg;
const type = isPhraseArg.value ? 'phrase' : 'best_fields';
@@ -65,7 +66,7 @@ export function toElasticsearchQuery(node, indexPattern) {
};
}
const fields = getFields(fieldNameArg, indexPattern);
const fields = indexPattern ? getFields(fieldNameArg, indexPattern) : [];
// If no fields are found in the index pattern we send through the given field name as-is. We do this to preserve
// the behaviour of lucene on dashboards where there are panels based on different index patterns that have different
@@ -80,7 +81,10 @@ export function toElasticsearchQuery(node, indexPattern) {
}
const isExistsQuery = valueArg.type === 'wildcard' && value === '*';
const isMatchAllQuery = isExistsQuery && fields && fields.length === indexPattern.fields.length;
const isAllFieldsQuery =
(fieldNameArg.type === 'wildcard' && fieldName === '*')
|| (fields && indexPattern && fields.length === indexPattern.fields.length);
const isMatchAllQuery = isExistsQuery && isAllFieldsQuery;
if (isMatchAllQuery) {
return { match_all: {} };
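The hunk above broadens the `match_all` shortcut: it now fires when the value is the wildcard `*` and either the field name itself is the wildcard `*` or the matched fields cover every field in the index pattern. A minimal sketch of that detection logic, with plain objects standing in for the real kuery node types (names are illustrative):

```javascript
// Sketch of the match-all detection above. fieldNameArg/valueArg mimic kuery
// literal/wildcard nodes: { type, value }.
function isMatchAll(fieldNameArg, valueArg, fields, indexPattern) {
  const isExistsQuery = valueArg.type === 'wildcard' && valueArg.value === '*';
  const isAllFieldsQuery =
    (fieldNameArg.type === 'wildcard' && fieldNameArg.value === '*') ||
    (fields && indexPattern && fields.length === indexPattern.fields.length);
  return Boolean(isExistsQuery && isAllFieldsQuery);
}

const wildcard = { type: 'wildcard', value: '*' };
// "*: *" matches everything even without an index pattern:
console.log(isMatchAll(wildcard, wildcard, [], undefined)); // → true
// "extension: *" is only an exists query, not match_all:
console.log(isMatchAll({ type: 'literal', value: 'extension' }, wildcard, [], undefined)); // → false
```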


@@ -37,7 +37,7 @@ export function buildNodeParams(fieldName, params) {
export function toElasticsearchQuery(node, indexPattern) {
const [ fieldNameArg, ...args ] = node.arguments;
const fields = getFields(fieldNameArg, indexPattern);
const fields = indexPattern ? getFields(fieldNameArg, indexPattern) : [];
const namedArgs = extractArguments(args);
const queryParams = _.mapValues(namedArgs, ast.toElasticsearchQuery);


@@ -55,7 +55,7 @@ exports.getPlugins = function(config, kibanaPath, projectRoot) {
return pluginsFromMap.concat(
glob.sync(globPatterns).map(pkgJsonPath => {
const path = dirname(pkgJsonPath);
const pkg = require(pkgJsonPath);
const pkg = require(pkgJsonPath); // eslint-disable-line import/no-dynamic-require
return {
name: pkg.name,
directory: path,


@@ -393,29 +393,29 @@ For example, there is a component that is wrapped by `injectI18n`, like in the `
```js
// ...
class AddFilterUi extends Component {
export const AddFilter = injectI18n(
class AddFilterUi extends Component {
// ...
render() {
const { filter } = this.state;
return (
<EuiFlexGroup>
<EuiFlexItem grow={10}>
<EuiFieldText
fullWidth
value={filter}
onChange={e => this.setState({ filter: e.target.value.trim() })}
placeholder={this.props.intl.formatMessage({
id: 'kbn.management.indexPattern.edit.source.placeholder',
defaultMessage: 'source filter, accepts wildcards (e.g., `user*` to filter fields starting with \'user\')'
})}
/>
</EuiFlexItem>
</EuiFlexGroup>
);
render() {
const { filter } = this.state;
return (
<EuiFlexGroup>
<EuiFlexItem grow={10}>
<EuiFieldText
fullWidth
value={filter}
onChange={e => this.setState({ filter: e.target.value.trim() })}
placeholder={this.props.intl.formatMessage({
id: 'kbn.management.indexPattern.edit.source.placeholder',
defaultMessage: 'source filter, accepts wildcards (e.g., `user*` to filter fields starting with \'user\')'
})}
/>
</EuiFlexItem>
</EuiFlexGroup>
);
}
}
}
export const AddFilter = injectI18n(AddFilterUi);
);
```
To test the `AddFilter` component, render its `WrappedComponent` property with the `shallowWithIntl` helper so that the `intl` object is passed into its `props`.


@@ -298,7 +298,7 @@ React component as a pure function:
import React from 'react';
import { injectI18n, intlShape } from '@kbn/i18n/react';
const MyComponentContent = ({ intl }) => (
export const MyComponent = injectI18n({ intl }) => (
<input
type="text"
placeholder={intl.formatMessage(
@@ -311,13 +311,11 @@ const MyComponentContent = ({ intl }) => (
{ name, unreadCount }
)}
/>
);
));
MyComponentContent.propTypes = {
MyComponent.WrappedComponent.propTypes = {
intl: intlShape.isRequired,
};
export const MyComponent = injectI18n(MyComponentContent);
```
React component as a class:
@@ -326,27 +324,27 @@ React component as a class:
import React from 'react';
import { injectI18n, intlShape } from '@kbn/i18n/react';
class MyComponentContent extends React.Component {
static propTypes = {
intl: intlShape.isRequired,
};
export const MyComponent = injectI18n(
class MyComponent extends React.Component {
static propTypes = {
intl: intlShape.isRequired,
};
render() {
const { intl } = this.props;
render() {
const { intl } = this.props;
return (
<input
type="text"
placeholder={intl.formatMessage({
id: 'kbn.management.objects.searchPlaceholder',
defaultMessage: 'Search',
})}
/>
);
return (
<input
type="text"
placeholder={intl.formatMessage({
id: 'kbn.management.objects.searchPlaceholder',
defaultMessage: 'Search',
})}
/>
);
}
}
}
export const MyComponent = injectI18n(MyComponentContent);
);
```
## AngularJS


@@ -9,6 +9,7 @@
"kbn:watch": "node scripts/build --dev --watch"
},
"dependencies": {
"@kbn/i18n": "1.0.0",
"lodash": "npm:@elastic/lodash@3.10.1-kibana1",
"lodash.clone": "^4.5.0",
"scriptjs": "^2.5.8",
@@ -24,6 +25,7 @@
"babel-plugin-transform-runtime": "^6.23.0",
"babel-polyfill": "6.20.0",
"css-loader": "1.0.0",
"copy-webpack-plugin": "^4.6.0",
"del": "^3.0.0",
"getopts": "^2.2.3",
"pegjs": "0.9.0",
@@ -34,4 +36,4 @@
"webpack": "4.23.1",
"webpack-cli": "^3.1.2"
}
}
}


@@ -17,10 +17,4 @@
* under the License.
*/
import chrome from 'ui/chrome';
const apiPrefix = chrome.addBasePath('/api/kibana');
export async function getRemoteClusters($http) {
const response = await $http.get(`${apiPrefix}/clusters`);
return response.data;
}
import '../common/register';


@@ -17,6 +17,7 @@
* under the License.
*/
import { i18n } from '@kbn/i18n';
import $script from 'scriptjs';
let resolvePromise = null;
@@ -43,6 +44,7 @@ const loadBrowserRegistries = (registries, basePath) => {
const type = remainingTypes.pop();
window.canvas = window.canvas || {};
window.canvas.register = d => registries[type].register(d);
window.canvas.i18n = i18n;
// Load plugins one at a time because each needs a different loader function
// $script will only load each of these once, so we can call this as many times as we need?


@@ -31,10 +31,16 @@ const canvasPluginDirectoryName = 'canvas_plugin';
const isDirectory = path =>
lstat(path)
.then(stat => stat.isDirectory())
.catch(() => false);
.catch(() => false); // if lstat fails, it doesn't exist and is not a directory
const isDirname = (p, name) => path.basename(p) === name;
const filterDirectories = (paths, { exclude = false } = {}) => {
return Promise.all(paths.map(p => isDirectory(p))).then(directories => {
return paths.filter((p, i) => (exclude ? !directories[i] : directories[i]));
});
};
const getPackagePluginPath = () => {
let basePluginPath = path.resolve(__dirname, '..');
@@ -90,19 +96,15 @@ export const getPluginPaths = type => {
return list.concat(dir);
}, [])
)
.then(possibleCanvasPlugins => {
// Check how many are directories. If lstat fails it doesn't exist anyway.
return Promise.all(
// An array
possibleCanvasPlugins.map(pluginPath => isDirectory(pluginPath))
).then(isDirectory => possibleCanvasPlugins.filter((pluginPath, i) => isDirectory[i]));
})
.then(possibleCanvasPlugins => filterDirectories(possibleCanvasPlugins, { exclude: false }))
.then(canvasPluginDirectories => {
return Promise.all(
canvasPluginDirectories.map(dir =>
// Get the full path of all files in the directory
readdir(dir).then(files => files.map(file => path.resolve(dir, file)))
)
).then(flatten);
)
.then(flatten)
.then(files => filterDirectories(files, { exclude: true }));
});
};
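The refactor above extracts a reusable `filterDirectories` helper that resolves a predicate over every path in parallel and then filters by the results, optionally inverting the test with `exclude`. The underlying async-filter idiom can be sketched standalone (the predicate below is a toy stand-in, not the Canvas `lstat` check):

```javascript
// Generic async filter: run an async predicate over all items in parallel,
// then keep (or, with exclude, drop) the items whose predicate resolved true.
function filterAsync(items, predicate, { exclude = false } = {}) {
  return Promise.all(items.map(item => predicate(item))).then(flags =>
    items.filter((_, i) => (exclude ? !flags[i] : flags[i]))
  );
}

// Toy predicate: pretend paths ending in '/' are directories.
const isDirectory = path => Promise.resolve(path.endsWith('/'));

filterAsync(['a/', 'b.js', 'c/'], isDirectory).then(dirs => {
  console.log(dirs); // → ['a/', 'c/']
});
filterAsync(['a/', 'b.js', 'c/'], isDirectory, { exclude: true }).then(files => {
  console.log(files); // → ['b.js']
});
```

Because `Promise.all` preserves input order, the boolean flags line up index-for-index with the original paths, which is what makes the `filter((_, i) => ...)` pairing safe.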


@@ -17,6 +17,7 @@
* under the License.
*/
import { i18n } from '@kbn/i18n';
import { typesRegistry } from '../common/lib/types_registry';
import { functionsRegistry as serverFunctions } from '../common/lib/functions_registry';
import { getPluginPaths } from './get_plugin_paths';
@@ -48,23 +49,18 @@ export const populateServerRegistries = types => {
const remainingTypes = types;
const populatedTypes = {};
const globalKeys = Object.keys(global);
const loadType = () => {
const type = remainingTypes.pop();
getPluginPaths(type).then(paths => {
global.canvas = global.canvas || {};
global.canvas.register = d => registries[type].register(d);
global.canvas.i18n = i18n;
paths.forEach(path => {
require(path);
require(path); // eslint-disable-line import/no-dynamic-require
});
Object.keys(global).forEach(key => {
if (!globalKeys.includes(key)) {
delete global[key];
}
});
delete global.canvas;
populatedTypes[type] = registries[type];
if (remainingTypes.length) loadType();
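`loadType` above pops one registry type, loads its plugins, and recurses while types remain — sequential async processing of a work list, so only one type's plugins are on the global at a time. A self-contained sketch of that "pop one, process, recurse" shape (`processType` is a hypothetical async step standing in for `getPluginPaths` + `require`):

```javascript
// Process items one at a time; each step starts only after the previous
// promise resolves. Note: pop() consumes the input array from the end.
function processSequentially(remaining, processType, results = []) {
  if (remaining.length === 0) return Promise.resolve(results);
  const type = remaining.pop();
  return processType(type).then(value => {
    results.push(value);
    return processSequentially(remaining, processType, results);
  });
}

processSequentially(['types', 'functions'], t => Promise.resolve(`${t}:loaded`))
  .then(loaded => console.log(loaded)); // → ['functions:loaded', 'types:loaded']
```

Serializing the loads matters here because each load mutates shared global state (`global.canvas.register`) that must point at the right registry while that type's plugins execute.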


@@ -0,0 +1,3 @@
/* eslint-disable */
import util from 'util';
console.log(util.format('hello world'));


@@ -17,29 +17,27 @@
* under the License.
*/
import { callWithRequestFactory } from './call_with_request_factory';
import handleEsError from '../../../lib/handle_es_error';
const { extname } = require('path');
async function fetchRemoteClusters(callWithRequest) {
const options = {
method: 'GET',
path: '_remote/info'
};
const remoteInfo = await callWithRequest('transport.request', options);
return Object.keys(remoteInfo);
}
const { transform } = require('babel-core');
export function registerClustersRoute(server) {
server.route({
path: '/api/kibana/clusters',
method: 'GET',
handler: async request => {
const callWithRequest = callWithRequestFactory(server, request);
try {
return await fetchRemoteClusters(callWithRequest);
} catch (error) {
throw handleEsError(error);
}
exports.createServerCodeTransformer = (sourceMaps) => {
return (content, path) => {
switch (extname(path)) {
case '.js':
const { code = '' } = transform(content.toString('utf8'), {
filename: path,
ast: false,
code: true,
sourceMaps: sourceMaps ? 'inline' : false,
babelrc: false,
presets: [require.resolve('@kbn/babel-preset/webpack_preset')],
});
return code;
default:
return content.toString('utf8');
}
});
}
};
};


@@ -0,0 +1,49 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
import { readFileSync } from 'fs';
import { resolve } from 'path';
import { createServerCodeTransformer } from './server_code_transformer';
const JS_FIXTURE_PATH = resolve(__dirname, '__fixtures__/sample.js');
const JS_FIXTURE = readFileSync(JS_FIXTURE_PATH);
describe('js support', () => {
it('transpiles js file', () => {
const transformer = createServerCodeTransformer();
expect(transformer(JS_FIXTURE, JS_FIXTURE_PATH)).toMatchInlineSnapshot(`
"'use strict';
var _util = require('util');
var _util2 = _interopRequireDefault(_util);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
console.log(_util2.default.format('hello world')); /* eslint-disable */"
`);
});
it('throws errors for js syntax errors', () => {
const transformer = createServerCodeTransformer();
expect(() => transformer(Buffer.from(`export default 'foo`), JS_FIXTURE_PATH)).toThrowError(
/Unterminated string constant/
);
});
});


@@ -18,6 +18,10 @@
*/
const { resolve } = require('path');
const CopyWebpackPlugin = require('copy-webpack-plugin');
const { createServerCodeTransformer } = require('./server_code_transformer');
const {
PLUGIN_SOURCE_DIR,
PLUGIN_BUILD_DIR,
@@ -31,7 +35,7 @@ module.exports = function ({ sourceMaps }, { watch }) {
mode: 'none',
entry: {
'types/all': resolve(PLUGIN_SOURCE_DIR, 'types/register.js'),
'functions/common/all': resolve(PLUGIN_SOURCE_DIR, 'functions/common/register.js'),
'functions/browser/all': resolve(PLUGIN_SOURCE_DIR, 'functions/browser/register.js'),
},
// there were problems with the node and web targets since this code is actually
@@ -95,6 +99,15 @@ module.exports = function ({ sourceMaps }, { watch }) {
stats: 'errors-only',
plugins: [
new CopyWebpackPlugin([
{
from: resolve(PLUGIN_SOURCE_DIR, 'functions/common'),
to: resolve(PLUGIN_BUILD_DIR, 'functions/common'),
ignore: '**/__tests__/**',
transform: createServerCodeTransformer(sourceMaps)
},
]),
function LoaderFailHandlerPlugin() {
if (!watch) {
return;


@@ -27,7 +27,7 @@ const chalk = require('chalk');
const pkg = require('../package.json');
const kibanaPkgPath = require.resolve('../../../package.json');
const kibanaPkg = require(kibanaPkgPath);
const kibanaPkg = require(kibanaPkgPath); // eslint-disable-line import/no-dynamic-require
const KBN_DIR = dirname(kibanaPkgPath);
@@ -72,6 +73,7 @@ module.exports = function({ name }) {
filters: {
'public/**/*': 'generateApp',
'translations/**/*': 'generateTranslations',
'.i18nrc.json': 'generateTranslations',
'public/hack.js': 'generateHack',
'server/**/*': 'generateApi',
'public/app.scss': 'generateScss',
@@ -80,6 +81,7 @@
move: {
gitignore: '.gitignore',
eslintrc: '.eslintrc',
'package_template.json': 'package.json',
},
data: answers =>
Object.assign(


@@ -0,0 +1,5 @@
{
"paths": {
"<%= camelCase(name) %>": "./"
}
}


@@ -17,6 +17,11 @@
"test:browser": "plugin-helpers test:browser",
"build": "plugin-helpers build"
},
<%_ if (generateTranslations) { _%>
"dependencies": {
"@kbn/i18n": "link:../../kibana/packages/kbn-i18n"
},
<%_ } _%>
"devDependencies": {
"@elastic/eslint-config-kibana": "link:../../kibana/packages/eslint-config-kibana",
"@elastic/eslint-import-resolver-kibana": "link:../../kibana/packages/kbn-eslint-import-resolver-kibana",


@@ -2,6 +2,9 @@ import React from 'react';
import { uiModules } from 'ui/modules';
import chrome from 'ui/chrome';
import { render, unmountComponentAtNode } from 'react-dom';
<%_ if (generateTranslations) { _%>
import { I18nProvider } from '@kbn/i18n/react';
<%_ } _%>
import 'ui/autoload/styles';
import './less/main.less';
@@ -24,7 +27,16 @@ function RootController($scope, $element, $http) {
const domNode = $element[0];
// render react to DOM
<%_ if (generateTranslations) { _%>
render(
<I18nProvider>
<Main title="<%= name %>" httpClient={$http} />
</I18nProvider>,
domNode
);
<%_ } else { _%>
render(<Main title="<%= name %>" httpClient={$http} />, domNode);
<%_ } _%>
// unmount react on controller destroy
$scope.$on('$destroy', () => {


@@ -9,6 +9,9 @@ import {
EuiPageContentBody,
EuiText
} from '@elastic/eui';
<%_ if (generateTranslations) { _%>
import { FormattedMessage } from '@kbn/i18n/react';
<%_ } _%>
export class Main extends React.Component {
constructor(props) {
@@ -33,19 +36,57 @@ export class Main extends React.Component {
<EuiPageBody>
<EuiPageHeader>
<EuiTitle size="l">
<h1>{title} Hello World!</h1>
<h1>
<%_ if (generateTranslations) { _%>
<FormattedMessage
id="<%= camelCase(name) %>.helloWorldText"
defaultMessage="{title} Hello World!"
values={{ title }}
/>
<%_ } else { _%>
{title} Hello World!
<%_ } _%>
</h1>
</EuiTitle>
</EuiPageHeader>
<EuiPageContent>
<EuiPageContentHeader>
<EuiTitle>
<h2>Congratulations</h2>
<h2>
<%_ if (generateTranslations) { _%>
<FormattedMessage
id="<%= camelCase(name) %>.congratulationsTitle"
defaultMessage="Congratulations"
/>
<%_ } else { _%>
Congratulations
<%_ } _%>
</h2>
</EuiTitle>
</EuiPageContentHeader>
<EuiPageContentBody>
<EuiText>
<h3>You have successfully created your first Kibana Plugin!</h3>
<p>The server time (via API call) is {this.state.time || 'NO API CALL YET'}</p>
<h3>
<%_ if (generateTranslations) { _%>
<FormattedMessage
id="<%= camelCase(name) %>.congratulationsText"
defaultMessage="You have successfully created your first Kibana Plugin!"
/>
<%_ } else { _%>
You have successfully created your first Kibana Plugin!
<%_ } _%>
</h3>
<p>
<%_ if (generateTranslations) { _%>
<FormattedMessage
id="<%= camelCase(name) %>.serverTimeText"
defaultMessage="The server time (via API call) is {time}"
values={{ time: this.state.time || 'NO API CALL YET' }}
/>
<%_ } else { _%>
The server time (via API call) is {this.state.time || 'NO API CALL YET'}
<%_ } _%>
</p>
</EuiText>
</EuiPageContentBody>
</EuiPageContent>


@@ -0,0 +1,84 @@
{
"formats": {
"number": {
"currency": {
"style": "currency"
},
"percent": {
"style": "percent"
}
},
"date": {
"short": {
"month": "numeric",
"day": "numeric",
"year": "2-digit"
},
"medium": {
"month": "short",
"day": "numeric",
"year": "numeric"
},
"long": {
"month": "long",
"day": "numeric",
"year": "numeric"
},
"full": {
"weekday": "long",
"month": "long",
"day": "numeric",
"year": "numeric"
}
},
"time": {
"short": {
"hour": "numeric",
"minute": "numeric"
},
"medium": {
"hour": "numeric",
"minute": "numeric",
"second": "numeric"
},
"long": {
"hour": "numeric",
"minute": "numeric",
"second": "numeric",
"timeZoneName": "short"
},
"full": {
"hour": "numeric",
"minute": "numeric",
"second": "numeric",
"timeZoneName": "short"
}
},
"relative": {
"years": {
"units": "year"
},
"months": {
"units": "month"
},
"days": {
"units": "day"
},
"hours": {
"units": "hour"
},
"minutes": {
"units": "minute"
},
"seconds": {
"units": "second"
}
}
},
"messages": {
"<%= camelCase(name) %>.congratulationsText": "您已经成功创建第一个 Kibana 插件。",
"<%= camelCase(name) %>.congratulationsTitle": "恭喜!",
"<%= camelCase(name) %>.helloWorldText": "{title} 您好,世界!",
"<%= camelCase(name) %>.serverTimeText": "服务器时间(通过 API 调用)为 {time}"
}
}


@@ -26,10 +26,10 @@ function babelRegister() {
try {
// add support for moved babel-register source: https://github.com/elastic/kibana/pull/13973
require(resolve(plugin.kibanaRoot, 'src/setup_node_env/babel_register'));
require(resolve(plugin.kibanaRoot, 'src/setup_node_env/babel_register')); // eslint-disable-line import/no-dynamic-require
} catch (error) {
if (error.code === 'MODULE_NOT_FOUND') {
require(resolve(plugin.kibanaRoot, 'src/optimize/babel/register'));
require(resolve(plugin.kibanaRoot, 'src/optimize/babel/register')); // eslint-disable-line import/no-dynamic-require
} else {
throw error;
}
@@ -42,11 +42,8 @@ function resolveKibanaPath(path) {
}
function readFtrConfigFile(log, path, settingOverrides) {
return require(resolveKibanaPath('src/functional_test_runner')).readConfigFile(
log,
path,
settingOverrides
);
return require(resolveKibanaPath('src/functional_test_runner')) // eslint-disable-line import/no-dynamic-require
.readConfigFile(log, path, settingOverrides);
}
module.exports = {


@@ -51,7 +51,7 @@ function removeSymlinkDependencies(root) {
// parse a ts config file
function parseTsconfig(pluginSourcePath, configPath) {
const ts = require(path.join(pluginSourcePath, 'node_modules', 'typescript'));
const ts = require(path.join(pluginSourcePath, 'node_modules', 'typescript')); // eslint-disable-line import/no-dynamic-require
const { error, config } = ts.parseConfigFileTextToJson(
configPath,


@@ -42,7 +42,7 @@ describe('creating the build', () => {
await createBuild(PLUGIN, buildTarget, buildVersion, kibanaVersion, buildFiles);
const pkg = require(resolve(PLUGIN_BUILD_TARGET, 'package.json'));
const pkg = require(resolve(PLUGIN_BUILD_TARGET, 'package.json')); // eslint-disable-line import/no-dynamic-require
expect(pkg).not.toHaveProperty('scripts');
expect(pkg).not.toHaveProperty('devDependencies');
});
@@ -52,7 +52,7 @@ describe('creating the build', () => {
await createBuild(PLUGIN, buildTarget, buildVersion, kibanaVersion, buildFiles);
const pkg = require(resolve(PLUGIN_BUILD_TARGET, 'package.json'));
const pkg = require(resolve(PLUGIN_BUILD_TARGET, 'package.json')); // eslint-disable-line import/no-dynamic-require
expect(pkg).toHaveProperty('build');
expect(pkg.build.git).not.toBeUndefined();
expect(pkg.build.date).not.toBeUndefined();


@@ -2487,17 +2487,17 @@ main {
* 1. Make seamless transition from ToolBar to Table header and contained Menu.
* 1. Make seamless transition from Table to ToolBarFooter header.
*/
.kuiControlledTable .kuiTable {
border-top: none;
/* 1 */ }
.kuiControlledTable .kuiToolBarFooter {
border-top: none;
/* 2 */ }
.kuiControlledTable .kuiMenu--contained {
border-top: none;
/* 1 */ }
.kuiControlledTable {
background: #FFF; }
.kuiControlledTable .kuiTable {
border-top: none;
/* 1 */ }
.kuiControlledTable .kuiToolBarFooter {
border-top: none;
/* 2 */ }
.kuiControlledTable .kuiMenu--contained {
border-top: none;
/* 1 */ }
/**
* 1. Prevent cells from expanding based on content size. This substitutes for table-layout: fixed.


@@ -3,6 +3,7 @@
* 1. Make seamless transition from Table to ToolBarFooter header.
*/
.kuiControlledTable {
background: $tableBackgroundColor;
.kuiTable {
border-top: none; /* 1 */
}


@@ -4,7 +4,6 @@
padding: 10px;
height: 40px;
background-color: #ffffff;
border: $kuiBorderThin;
}
.kuiToolBarFooterSection {

scripts/i18n_integrate.js Normal file

@@ -0,0 +1,21 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
require('../src/setup_node_env');
require('../src/dev/run_i18n_integrate');

scripts/sasslint.js Normal file

@@ -0,0 +1,21 @@
/*
* Licensed to Elasticsearch B.V. under one or more contributor
* license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright
* ownership. Elasticsearch B.V. licenses this file to you under
* the Apache License, Version 2.0 (the "License"); you may
* not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
require('../src/setup_node_env');
require('../src/dev/run_sasslint');

View file

@@ -1,126 +1,114 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`#batchSet Buffers are always clear of previously buffered changes: two requests, second only sends bar, not foo 1`] = `
Object {
"matched": Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
],
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"bar\\":\\"box\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
},
],
"method": "POST",
},
],
"unmatched": Array [],
}
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"bar\\":\\"box\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
},
],
]
`;
exports[`#batchSet Overwrites previously buffered values with new values for the same key: two requests, foo=d in final 1`] = `
Object {
"matched": Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"a\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"a\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
],
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"d\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
},
],
"method": "POST",
},
],
"unmatched": Array [],
}
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"d\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
},
],
]
`;
exports[`#batchSet buffers changes while first request is in progress, sends buffered changes after first request completes: final, includes both requests 1`] = `
Object {
"matched": Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
],
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"box\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
},
],
"method": "POST",
},
],
"unmatched": Array [],
}
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"box\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
},
],
]
`;
exports[`#batchSet buffers changes while first request is in progress, sends buffered changes after first request completes: initial, only one request 1`] = `
Object {
"matched": Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
],
"method": "POST",
},
],
"unmatched": Array [],
}
]
`;
exports[`#batchSet rejects all promises for batched requests that fail: promise rejections 1`] = `
@@ -147,22 +135,19 @@ exports[`#batchSet rejects on 404 response 1`] = `"Request failed with status co
exports[`#batchSet rejects on 500 1`] = `"Request failed with status code: 500"`;
exports[`#batchSet sends a single change immediately: synchronous fetch 1`] = `
Object {
"matched": Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
"method": "POST",
Array [
Array [
"/foo/bar/api/kibana/settings",
Object {
"body": "{\\"changes\\":{\\"foo\\":\\"bar\\"}}",
"credentials": "same-origin",
"headers": Object {
"accept": "application/json",
"content-type": "application/json",
"kbn-version": "v9.9.9",
},
],
"method": "POST",
},
],
"unmatched": Array [],
}
]
`;

View file

@@ -17,7 +17,8 @@
* under the License.
*/
import fetchMock from 'fetch-mock';
// @ts-ignore
import fetchMock from 'fetch-mock/es5/client';
import * as Rx from 'rxjs';
import { takeUntil, toArray } from 'rxjs/operators';
@@ -142,10 +143,16 @@ describe('#batchSet', () => {
fetchMock.once('*', {
body: { settings: {} },
});
fetchMock.once('*', {
status: 400,
body: 'invalid',
});
fetchMock.once(
'*',
{
status: 400,
body: 'invalid',
},
{
overwriteRoutes: false,
}
);
const { uiSettingsApi } = setup();
// trigger the initial sync request, which enables buffering
@@ -161,7 +168,7 @@ describe('#batchSet', () => {
).resolves.toMatchSnapshot('promise rejections');
// ensure only two requests were sent
expect(fetchMock.calls().matched).toHaveLength(2);
expect(fetchMock.calls()).toHaveLength(2);
});
});
@@ -191,10 +198,16 @@ describe('#getLoadingCount$()', () => {
fetchMock.once('*', {
body: { settings: {} },
});
fetchMock.once('*', {
status: 400,
body: 'invalid',
});
fetchMock.once(
'*',
{
status: 400,
body: 'invalid',
},
{
overwriteRoutes: false,
}
);
const { uiSettingsApi } = setup();
const done$ = new Rx.Subject();
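The repeated `fetchMock.once('*', …)` calls above rely on the newly added `{ overwriteRoutes: false }` option: with two one-shot routes sharing the `'*'` matcher, fetch-mock would otherwise treat the second registration as a conflicting overwrite. A minimal plain-JavaScript sketch of that queueing behavior (not fetch-mock's actual implementation; `MockQueue` is a hypothetical name):

```javascript
// Hypothetical sketch of one-shot mock routes sharing a matcher.
// With overwriteRoutes: false, a duplicate matcher is queued instead
// of replacing the existing route, so responses are consumed in
// registration order.
class MockQueue {
  constructor() {
    this.routes = [];
  }
  once(matcher, response, { overwriteRoutes = true } = {}) {
    const existing = this.routes.find(r => r.matcher === matcher);
    if (existing && overwriteRoutes) {
      existing.responses = [response]; // replace the conflicting route
    } else if (existing) {
      existing.responses.push(response); // queue behind the same matcher
    } else {
      this.routes.push({ matcher, responses: [response] });
    }
  }
  fetch(url) {
    const route = this.routes.find(
      r => r.matcher === '*' || r.matcher === url
    );
    if (!route || route.responses.length === 0) {
      throw new Error(`no mock for ${url}`);
    }
    return route.responses.shift(); // each response is used once
  }
}

const mock = new MockQueue();
mock.once('*', { status: 200, body: { settings: {} } });
mock.once('*', { status: 400, body: 'invalid' }, { overwriteRoutes: false });

console.log(mock.fetch('/api/kibana/settings').status); // first queued response
console.log(mock.fetch('/api/kibana/settings').status); // second queued response
```

This matches the test flow in the hunk above: the first sync request gets the `200` settings response, and the batched follow-up hits the queued `400`.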

View file

@@ -25,7 +25,7 @@ import { createPlatform } from './platform';
export async function getConfig({ isRelease, targetAllPlatforms, versionQualifier }) {
const pkgPath = resolve(__dirname, '../../../../package.json');
const pkg = require(pkgPath);
const pkg = require(pkgPath); // eslint-disable-line import/no-dynamic-require
const repoRoot = dirname(pkgPath);
const nodeVersion = pkg.engines.node;

View file

@@ -195,27 +195,21 @@ export const CleanExtraBrowsersTask = {
async run(config, log, build) {
const getBrowserPathsForPlatform = platform => {
const reportingDir = 'node_modules/x-pack/plugins/reporting';
const phantomDir = '.phantom';
const chromiumDir = '.chromium';
const phantomPath = p =>
build.resolvePathForPlatform(platform, reportingDir, phantomDir, p);
const chromiumPath = p =>
build.resolvePathForPlatform(platform, reportingDir, chromiumDir, p);
return platforms => {
const paths = [];
if (platforms.windows) {
paths.push(phantomPath('phantomjs-*-windows.zip'));
paths.push(chromiumPath('chromium-*-win32.zip'));
paths.push(chromiumPath('chromium-*-windows.zip'));
}
if (platforms.darwin) {
paths.push(phantomPath('phantomjs-*-macosx.zip'));
paths.push(chromiumPath('chromium-*-darwin.zip'));
}
if (platforms.linux) {
paths.push(phantomPath('phantomjs-*-linux-x86_64.tar.bz2'));
paths.push(chromiumPath('chromium-*-linux.zip'));
}
return paths;

View file

@@ -34,7 +34,7 @@ export const TranspileScssTask = {
const uiExports = collectUiExports(enabledPlugins);
try {
const bundles = await buildAll(uiExports.styleSheetPaths);
const bundles = await buildAll(uiExports.styleSheetPaths, log);
bundles.forEach(bundle => log.info(`Compiled SCSS: ${bundle.source}`));
} catch (error) {
const { message, line, file } = error;

View file

@@ -48,6 +48,10 @@ export class File {
return this.ext === '.ts' || this.ext === '.tsx';
}
public isSass() {
return this.ext === '.sass' || this.ext === '.scss';
}
public isFixture() {
return this.relativePath.split(sep).includes('__fixtures__');
}

View file

@@ -0,0 +1,63 @@
{
"formats": {
"number": {
"currency": {
"style": "currency"
},
"percent": {
"style": "percent"
}
},
"date": {
"short": {
"month": "numeric",
"day": "numeric",
"year": "2-digit"
},
"medium": {
"month": "short",
"day": "numeric",
"year": "numeric"
},
"long": {
"month": "long",
"day": "numeric",
"year": "numeric"
},
"full": {
"weekday": "long",
"month": "long",
"day": "numeric",
"year": "numeric"
}
},
"time": {
"short": {
"hour": "numeric",
"minute": "numeric"
},
"medium": {
"hour": "numeric",
"minute": "numeric",
"second": "numeric"
},
"long": {
"hour": "numeric",
"minute": "numeric",
"second": "numeric",
"timeZoneName": "short"
},
"full": {
"hour": "numeric",
"minute": "numeric",
"second": "numeric",
"timeZoneName": "short"
}
}
},
"messages": {
"plugin-1.message-id-1": "Translated text 1",
"plugin-1.message-id-2": "Translated text 2",
"plugin-2.message-id": "Translated text"
}
}

View file

@@ -55,9 +55,17 @@ Array [
`;
exports[`dev/i18n/extract_default_translations throws on id collision 1`] = `
" I18N ERROR  Error in src/dev/i18n/__fixtures__/extract_default_translations/test_plugin_3/test_file.jsx
Array [
" I18N ERROR  Error in src/dev/i18n/__fixtures__/extract_default_translations/test_plugin_3/test_file.jsx
Error: There is more than one default message for the same id \\"plugin_3.duplicate_id\\":
\\"Message 1\\" and \\"Message 2\\""
\\"Message 1\\" and \\"Message 2\\"",
]
`;
exports[`dev/i18n/extract_default_translations throws on wrong message namespace 1`] = `"Expected \\"wrong_plugin_namespace.message-id\\" id to have \\"plugin_2\\" namespace. See .i18nrc.json for the list of supported namespaces."`;
exports[`dev/i18n/extract_default_translations throws on wrong message namespace 1`] = `
Array [
Array [
[Error: Expected "wrong_plugin_namespace.message-id" id to have "plugin_2" namespace. See .i18nrc.json for the list of supported namespaces.],
],
]
`;

View file

@@ -0,0 +1,163 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`dev/i18n/integrate_locale_files integrateLocaleFiles splits locale file by plugins and writes them into the right folders 1`] = `
Array [
"src/dev/i18n/__fixtures__/integrate_locale_files/test_plugin_1/translations/fr.json",
"{
\\"formats\\": {
\\"number\\": {
\\"currency\\": {
\\"style\\": \\"currency\\"
},
\\"percent\\": {
\\"style\\": \\"percent\\"
}
},
\\"date\\": {
\\"short\\": {
\\"month\\": \\"numeric\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"2-digit\\"
},
\\"medium\\": {
\\"month\\": \\"short\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"numeric\\"
},
\\"long\\": {
\\"month\\": \\"long\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"numeric\\"
},
\\"full\\": {
\\"weekday\\": \\"long\\",
\\"month\\": \\"long\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"numeric\\"
}
},
\\"time\\": {
\\"short\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\"
},
\\"medium\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\",
\\"second\\": \\"numeric\\"
},
\\"long\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\",
\\"second\\": \\"numeric\\",
\\"timeZoneName\\": \\"short\\"
},
\\"full\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\",
\\"second\\": \\"numeric\\",
\\"timeZoneName\\": \\"short\\"
}
}
},
\\"messages\\": {
\\"plugin-1.message-id-1\\": \\"Translated text 1\\",
\\"plugin-1.message-id-2\\": \\"Translated text 2\\"
}
}",
]
`;
exports[`dev/i18n/integrate_locale_files integrateLocaleFiles splits locale file by plugins and writes them into the right folders 2`] = `
Array [
"src/dev/i18n/__fixtures__/integrate_locale_files/test_plugin_2/translations/fr.json",
"{
\\"formats\\": {
\\"number\\": {
\\"currency\\": {
\\"style\\": \\"currency\\"
},
\\"percent\\": {
\\"style\\": \\"percent\\"
}
},
\\"date\\": {
\\"short\\": {
\\"month\\": \\"numeric\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"2-digit\\"
},
\\"medium\\": {
\\"month\\": \\"short\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"numeric\\"
},
\\"long\\": {
\\"month\\": \\"long\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"numeric\\"
},
\\"full\\": {
\\"weekday\\": \\"long\\",
\\"month\\": \\"long\\",
\\"day\\": \\"numeric\\",
\\"year\\": \\"numeric\\"
}
},
\\"time\\": {
\\"short\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\"
},
\\"medium\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\",
\\"second\\": \\"numeric\\"
},
\\"long\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\",
\\"second\\": \\"numeric\\",
\\"timeZoneName\\": \\"short\\"
},
\\"full\\": {
\\"hour\\": \\"numeric\\",
\\"minute\\": \\"numeric\\",
\\"second\\": \\"numeric\\",
\\"timeZoneName\\": \\"short\\"
}
}
},
\\"messages\\": {
\\"plugin-2.message-id\\": \\"Translated text\\"
}
}",
]
`;
exports[`dev/i18n/integrate_locale_files integrateLocaleFiles splits locale file by plugins and writes them into the right folders 3`] = `
Array [
"src/dev/i18n/__fixtures__/integrate_locale_files/test_plugin_1/translations",
"src/dev/i18n/__fixtures__/integrate_locale_files/test_plugin_2/translations",
]
`;
exports[`dev/i18n/integrate_locale_files verifyMessages throws an error for unused id and missing id 1`] = `
"
Missing translations:
plugin-1.message-id-2"
`;
exports[`dev/i18n/integrate_locale_files verifyMessages throws an error for unused id and missing id 2`] = `
"
Unused translations:
plugin-1.message-id-3"
`;
exports[`dev/i18n/integrate_locale_files verifyMessages throws an error for unused id and missing id 3`] = `
"
Unused translations:
plugin-2.message
Missing translations:
plugin-2.message-id"
`;

View file

@@ -9,6 +9,8 @@ exports[`i18n utils should create verbose parser error message 1`] = `
"
`;
exports[`i18n utils should normalizePath 1`] = `"src/dev/i18n"`;
exports[`i18n utils should not escape linebreaks 1`] = `
"Text
with

View file

@@ -18,8 +18,6 @@
*/
import path from 'path';
import normalize from 'normalize-path';
import chalk from 'chalk';
import {
extractHtmlMessages,
@@ -27,26 +25,24 @@ import {
extractPugMessages,
extractHandlebarsMessages,
} from './extractors';
import { globAsync, readFileAsync } from './utils';
import { paths, exclude } from '../../../.i18nrc.json';
import { globAsync, readFileAsync, normalizePath } from './utils';
import { createFailError, isFailError } from '../run';
function addMessageToMap(targetMap, key, value) {
function addMessageToMap(targetMap, key, value, reporter) {
const existingValue = targetMap.get(key);
if (targetMap.has(key) && existingValue.message !== value.message) {
throw createFailError(`There is more than one default message for the same id "${key}":
"${existingValue.message}" and "${value.message}"`);
reporter.report(
createFailError(`There is more than one default message for the same id "${key}":
"${existingValue.message}" and "${value.message}"`)
);
} else {
targetMap.set(key, value);
}
targetMap.set(key, value);
}
function normalizePath(inputPath) {
return normalize(path.relative('.', inputPath));
}
export function filterPaths(inputPaths) {
export function filterPaths(inputPaths, paths) {
const availablePaths = Object.values(paths);
const pathsForExtraction = new Set();
@@ -70,26 +66,28 @@ export function filterPaths(inputPaths) {
return [...pathsForExtraction];
}
function filterEntries(entries) {
function filterEntries(entries, exclude) {
return entries.filter(entry =>
exclude.every(excludedPath => !normalizePath(entry).startsWith(excludedPath))
);
}
export function validateMessageNamespace(id, filePath) {
export function validateMessageNamespace(id, filePath, allowedPaths, reporter) {
const normalizedPath = normalizePath(filePath);
const [expectedNamespace] = Object.entries(paths).find(([, pluginPath]) =>
const [expectedNamespace] = Object.entries(allowedPaths).find(([, pluginPath]) =>
normalizedPath.startsWith(`${pluginPath}/`)
);
if (!id.startsWith(`${expectedNamespace}.`)) {
throw createFailError(`Expected "${id}" id to have "${expectedNamespace}" namespace. \
See .i18nrc.json for the list of supported namespaces.`);
reporter.report(
createFailError(`Expected "${id}" id to have "${expectedNamespace}" namespace. \
See .i18nrc.json for the list of supported namespaces.`)
);
}
}
export async function extractMessagesFromPathToMap(inputPath, targetMap) {
export async function extractMessagesFromPathToMap(inputPath, targetMap, config, reporter) {
const entries = await globAsync('*.{js,jsx,pug,ts,tsx,html,hbs,handlebars}', {
cwd: inputPath,
matchBase: true,
@@ -123,7 +121,7 @@ export async function extractMessagesFromPathToMap(inputPath, targetMap) {
[hbsEntries, extractHandlebarsMessages],
].map(async ([entries, extractFunction]) => {
const files = await Promise.all(
filterEntries(entries).map(async entry => {
filterEntries(entries, config.exclude).map(async entry => {
return {
name: entry,
content: await readFileAsync(entry),
@@ -132,21 +130,31 @@ export async function extractMessagesFromPathToMap(inputPath, targetMap) {
);
for (const { name, content } of files) {
const reporterWithContext = reporter.withContext({ name });
try {
for (const [id, value] of extractFunction(content)) {
validateMessageNamespace(id, name);
addMessageToMap(targetMap, id, value);
for (const [id, value] of extractFunction(content, reporterWithContext)) {
validateMessageNamespace(id, name, config.paths, reporterWithContext);
addMessageToMap(targetMap, id, value, reporterWithContext);
}
} catch (error) {
if (isFailError(error)) {
throw createFailError(
`${chalk.white.bgRed(' I18N ERROR ')} Error in ${normalizePath(name)}\n${error}`
);
if (!isFailError(error)) {
throw error;
}
throw error;
reporterWithContext.report(error);
}
}
})
);
}
export async function getDefaultMessagesMap(inputPaths, config, reporter) {
const defaultMessagesMap = new Map();
for (const inputPath of filterPaths(inputPaths, config.paths)) {
await extractMessagesFromPathToMap(inputPath, defaultMessagesMap, config, reporter);
}
return defaultMessagesMap;
}
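The `reporter` threaded through these functions replaces the old throw-on-first-error flow: each recoverable problem is recorded and extraction continues, so a single run can surface every broken message id. A minimal sketch consistent with the `withContext`, `report`, and `errors` usages in this diff (the real `ErrorReporter` lives in `src/dev/i18n/utils` and may differ in detail):

```javascript
// Minimal error-collecting reporter, sketched from how it is used in
// the diff: report() records instead of throwing, withContext() tags
// subsequent errors with e.g. the file currently being processed.
class ErrorReporter {
  constructor({ context, errors } = {}) {
    this.context = context;
    this.errors = errors || []; // shared across withContext() copies
  }
  withContext(context) {
    return new ErrorReporter({ context, errors: this.errors });
  }
  report(error) {
    const prefix = this.context ? `Error in ${this.context.name}\n` : '';
    this.errors.push(`${prefix}${error.message || error}`);
  }
}

const reporter = new ErrorReporter();
const fileReporter = reporter.withContext({ name: 'test_file.jsx' });
fileReporter.report(new Error('Empty "id" value is not allowed.'));

console.log(reporter.errors.length); // contexts share one error list
```

Because `withContext` copies share the parent's `errors` array, the top-level runner can inspect all failures at the end, as `reporter.errors` is snapshotted in the tests below.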

View file

@@ -23,6 +23,7 @@ import {
extractMessagesFromPathToMap,
validateMessageNamespace,
} from './extract_default_translations';
import { ErrorReporter } from './utils';
const fixturesPath = path.resolve(__dirname, '__fixtures__', 'extract_default_translations');
const pluginsPaths = [
@@ -31,31 +32,32 @@ const pluginsPaths = [
path.join(fixturesPath, 'test_plugin_3'),
];
jest.mock('../../../.i18nrc.json', () => ({
const config = {
paths: {
plugin_1: 'src/dev/i18n/__fixtures__/extract_default_translations/test_plugin_1',
plugin_2: 'src/dev/i18n/__fixtures__/extract_default_translations/test_plugin_2',
plugin_3: 'src/dev/i18n/__fixtures__/extract_default_translations/test_plugin_3',
},
exclude: [],
}));
};
describe('dev/i18n/extract_default_translations', () => {
test('extracts messages from path to map', async () => {
const [pluginPath] = pluginsPaths;
const resultMap = new Map();
await extractMessagesFromPathToMap(pluginPath, resultMap);
await extractMessagesFromPathToMap(pluginPath, resultMap, config, new ErrorReporter());
expect([...resultMap].sort()).toMatchSnapshot();
});
test('throws on id collision', async () => {
const [, , pluginPath] = pluginsPaths;
const reporter = new ErrorReporter();
await expect(
extractMessagesFromPathToMap(pluginPath, new Map())
).rejects.toThrowErrorMatchingSnapshot();
extractMessagesFromPathToMap(pluginPath, new Map(), config, reporter)
).resolves.not.toThrow();
expect(reporter.errors).toMatchSnapshot();
});
test('validates message namespace', () => {
@@ -64,15 +66,18 @@ describe('dev/i18n/extract_default_translations', () => {
__dirname,
'__fixtures__/extract_default_translations/test_plugin_2/test_file.html'
);
expect(() => validateMessageNamespace(id, filePath)).not.toThrow();
expect(() => validateMessageNamespace(id, filePath, config.paths)).not.toThrow();
});
test('throws on wrong message namespace', () => {
const report = jest.fn();
const id = 'wrong_plugin_namespace.message-id';
const filePath = path.resolve(
__dirname,
'__fixtures__/extract_default_translations/test_plugin_2/test_file.html'
);
expect(() => validateMessageNamespace(id, filePath)).toThrowErrorMatchingSnapshot();
expect(() => validateMessageNamespace(id, filePath, config.paths, { report })).not.toThrow();
expect(report.mock.calls).toMatchSnapshot();
});
});

View file

@@ -26,6 +26,18 @@ Array [
]
`;
exports[`dev/i18n/extractors/code throws on empty id 1`] = `"Empty \\"id\\" value in i18n() or i18n.translate() is not allowed."`;
exports[`dev/i18n/extractors/code throws on empty id 1`] = `
Array [
Array [
[Error: Empty "id" value in i18n() or i18n.translate() is not allowed.],
],
]
`;
exports[`dev/i18n/extractors/code throws on missing defaultMessage 1`] = `"Empty defaultMessage in intl.formatMessage() is not allowed (\\"message-id\\")."`;
exports[`dev/i18n/extractors/code throws on missing defaultMessage 1`] = `
Array [
Array [
[Error: Empty defaultMessage in intl.formatMessage() is not allowed ("message-id").],
],
]
`;

View file

@@ -12,10 +12,34 @@ Array [
]
`;
exports[`dev/i18n/extractors/handlebars throws on empty id 1`] = `"Empty id argument in Handlebars i18n is not allowed."`;
exports[`dev/i18n/extractors/handlebars throws on empty id 1`] = `
Array [
Array [
[Error: Empty id argument in Handlebars i18n is not allowed.],
],
]
`;
exports[`dev/i18n/extractors/handlebars throws on missing defaultMessage property 1`] = `"defaultMessage value in Handlebars i18n should be a string (\\"message-id\\")."`;
exports[`dev/i18n/extractors/handlebars throws on missing defaultMessage property 1`] = `
Array [
Array [
[Error: defaultMessage value in Handlebars i18n should be a string ("message-id").],
],
]
`;
exports[`dev/i18n/extractors/handlebars throws on wrong number of arguments 1`] = `"Wrong number of arguments for handlebars i18n call."`;
exports[`dev/i18n/extractors/handlebars throws on wrong number of arguments 1`] = `
Array [
Array [
[Error: Wrong number of arguments for handlebars i18n call.],
],
]
`;
exports[`dev/i18n/extractors/handlebars throws on wrong properties argument type 1`] = `"Properties string in Handlebars i18n should be a string literal (\\"ui.id-1\\")."`;
exports[`dev/i18n/extractors/handlebars throws on wrong properties argument type 1`] = `
Array [
Array [
[Error: Properties string in Handlebars i18n should be a string literal ("ui.id-1").],
],
]
`;

View file

@@ -50,12 +50,28 @@ Array [
]
`;
exports[`dev/i18n/extractors/html throws on empty i18n-id 1`] = `"Empty \\"i18n-id\\" value in angular directive is not allowed."`;
exports[`dev/i18n/extractors/html throws on i18n filter usage in complex angular expression 1`] = `
"Couldn't parse angular i18n expression:
Unexpected token, expected \\";\\" (1:6):
mode as ('metricVis.colorModes.' + mode"
exports[`dev/i18n/extractors/html throws on empty i18n-id 1`] = `
Array [
Array [
[Error: Empty "i18n-id" value in angular directive is not allowed.],
],
]
`;
exports[`dev/i18n/extractors/html throws on missing i18n-default-message attribute 1`] = `"Empty defaultMessage in angular directive is not allowed (\\"message-id\\")."`;
exports[`dev/i18n/extractors/html throws on i18n filter usage in complex angular expression 1`] = `
Array [
Array [
[Error: Couldn't parse angular i18n expression:
Unexpected token, expected ";" (1:6):
mode as ('metricVis.colorModes.' + mode],
],
]
`;
exports[`dev/i18n/extractors/html throws on missing i18n-default-message attribute 1`] = `
Array [
Array [
[Error: Empty defaultMessage in angular directive is not allowed ("message-id").],
],
]
`;

View file

@@ -20,6 +20,18 @@ Array [
]
`;
exports[`dev/i18n/extractors/pug throws on empty id 1`] = `"Empty \\"id\\" value in i18n() or i18n.translate() is not allowed."`;
exports[`dev/i18n/extractors/pug throws on empty id 1`] = `
Array [
Array [
[Error: Empty "id" value in i18n() or i18n.translate() is not allowed.],
],
]
`;
exports[`dev/i18n/extractors/pug throws on missing default message 1`] = `"Empty defaultMessage in i18n() or i18n.translate() is not allowed (\\"message-id\\")."`;
exports[`dev/i18n/extractors/pug throws on missing default message 1`] = `
Array [
Array [
[Error: Empty defaultMessage in i18n() or i18n.translate() is not allowed ("message-id").],
],
]
`;

View file

@@ -29,7 +29,7 @@ import {
import { extractI18nCallMessages } from './i18n_call';
import { createParserErrorMessage, isI18nTranslateFunction, traverseNodes } from '../utils';
import { extractIntlMessages, extractFormattedMessages } from './react';
import { createFailError } from '../../run';
import { createFailError, isFailError } from '../../run';
/**
* Detect Intl.formatMessage() function call (React).
@@ -61,7 +61,7 @@ export function isFormattedMessageElement(node) {
return isJSXOpeningElement(node) && isJSXIdentifier(node.name, { name: 'FormattedMessage' });
}
export function* extractCodeMessages(buffer) {
export function* extractCodeMessages(buffer, reporter) {
let ast;
try {
@@ -72,19 +72,26 @@ export function* extractCodeMessages(buffer) {
} catch (error) {
if (error instanceof SyntaxError) {
const errorWithContext = createParserErrorMessage(buffer.toString(), error);
throw createFailError(errorWithContext);
reporter.report(createFailError(errorWithContext));
return;
}
throw error;
}
for (const node of traverseNodes(ast.program.body)) {
if (isI18nTranslateFunction(node)) {
yield extractI18nCallMessages(node);
} else if (isIntlFormatMessageFunction(node)) {
yield extractIntlMessages(node);
} else if (isFormattedMessageElement(node)) {
yield extractFormattedMessages(node);
try {
if (isI18nTranslateFunction(node)) {
yield extractI18nCallMessages(node);
} else if (isIntlFormatMessageFunction(node)) {
yield extractIntlMessages(node);
} else if (isFormattedMessageElement(node)) {
yield extractFormattedMessages(node);
}
} catch (error) {
if (!isFailError(error)) {
throw error;
}
reporter.report(error);
}
}
}
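The rewritten `extractCodeMessages` wraps each node's extraction in its own try/catch: a "fail" error (the expected, user-facing kind) is reported and the loop moves on, while unexpected errors still propagate. The same pattern in isolation, with hypothetical `extractOne` and a simplified fail-error marker standing in for `createFailError`/`isFailError` from `../../run`:

```javascript
// Sketch of the report-and-continue generator pattern from the diff.
// "Fail" errors (expected, user-facing) are reported; anything else
// is a genuine bug and is rethrown.
const FAIL = Symbol('fail');
const createFailError = message =>
  Object.assign(new Error(message), { [FAIL]: true });
const isFailError = error => Boolean(error && error[FAIL]);

// Hypothetical per-item extractor: rejects empty ids.
const extractOne = node => {
  if (!node.id) {
    throw createFailError('Empty "id" value is not allowed.');
  }
  return [node.id, { message: node.message }];
};

function* extractMessages(nodes, reporter) {
  for (const node of nodes) {
    try {
      yield extractOne(node);
    } catch (error) {
      if (!isFailError(error)) {
        throw error; // unexpected errors still abort the run
      }
      reporter.report(error); // expected failures are collected
    }
  }
}

const errors = [];
const extracted = [
  ...extractMessages(
    [{ id: 'plugin.ok', message: 'OK' }, { id: '', message: 'bad' }],
    { report: e => errors.push(e.message) }
  ),
];
console.log(extracted.length, errors.length); // 1 message, 1 reported error
```

This is why the updated tests assert that `extractCodeMessages(source, { report }).next()` does not throw and instead snapshot `report.mock.calls`.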

View file

@@ -65,7 +65,13 @@ function f() {
}
`;
const report = jest.fn();
describe('dev/i18n/extractors/code', () => {
beforeEach(() => {
report.mockClear();
});
test('extracts React, server-side and angular service default messages', () => {
const actual = Array.from(extractCodeMessages(extractCodeMessagesSource));
expect(actual.sort()).toMatchSnapshot();
@@ -73,12 +79,14 @@ describe('dev/i18n/extractors/code', () => {
test('throws on empty id', () => {
const source = Buffer.from(`i18n.translate('', { defaultMessage: 'Default message' });`);
expect(() => extractCodeMessages(source).next()).toThrowErrorMatchingSnapshot();
expect(() => extractCodeMessages(source, { report }).next()).not.toThrow();
expect(report.mock.calls).toMatchSnapshot();
});
test('throws on missing defaultMessage', () => {
const source = Buffer.from(`intl.formatMessage({ id: 'message-id' });`);
expect(() => extractCodeMessages(source).next()).toThrowErrorMatchingSnapshot();
expect(() => extractCodeMessages(source, { report }).next()).not.toThrow();
expect(report.mock.calls).toMatchSnapshot();
});
});

View file

@@ -18,7 +18,7 @@
*/
import { formatJSString, checkValuesProperty } from '../utils';
import { createFailError } from '../../run';
import { createFailError, isFailError } from '../../run';
import { DEFAULT_MESSAGE_KEY, DESCRIPTION_KEY } from '../constants';
const HBS_REGEX = /(?<=\{\{)([\s\S]*?)(?=\}\})/g;
@@ -27,7 +27,7 @@ const TOKENS_REGEX = /[^'\s]+|(?:'([^'\\]|\\[\s\S])*')/g;
/**
* Example: `'{{i18n 'message-id' '{"defaultMessage": "Message text"}'}}'`
*/
export function* extractHandlebarsMessages(buffer) {
export function* extractHandlebarsMessages(buffer, reporter) {
for (const expression of buffer.toString().match(HBS_REGEX) || []) {
const tokens = expression.match(TOKENS_REGEX);
@@ -38,58 +38,78 @@
}
if (tokens.length !== 3) {
throw createFailError(`Wrong number of arguments for handlebars i18n call.`);
reporter.report(createFailError(`Wrong number of arguments for handlebars i18n call.`));
continue;
}
if (!idString.startsWith(`'`) || !idString.endsWith(`'`)) {
throw createFailError(`Message id should be a string literal.`);
reporter.report(createFailError(`Message id should be a string literal.`));
continue;
}
const messageId = formatJSString(idString.slice(1, -1));
if (!messageId) {
throw createFailError(`Empty id argument in Handlebars i18n is not allowed.`);
reporter.report(createFailError(`Empty id argument in Handlebars i18n is not allowed.`));
continue;
}
if (!propertiesString.startsWith(`'`) || !propertiesString.endsWith(`'`)) {
throw createFailError(
`Properties string in Handlebars i18n should be a string literal ("${messageId}").`
reporter.report(
createFailError(
`Properties string in Handlebars i18n should be a string literal ("${messageId}").`
)
);
continue;
}
const properties = JSON.parse(propertiesString.slice(1, -1));
if (typeof properties.defaultMessage !== 'string') {
throw createFailError(
`defaultMessage value in Handlebars i18n should be a string ("${messageId}").`
reporter.report(
createFailError(
`defaultMessage value in Handlebars i18n should be a string ("${messageId}").`
)
);
continue;
}
if (properties[DESCRIPTION_KEY] != null && typeof properties[DESCRIPTION_KEY] !== 'string') {
throw createFailError(
`Description value in Handlebars i18n should be a string ("${messageId}").`
reporter.report(
createFailError(`Description value in Handlebars i18n should be a string ("${messageId}").`)
);
continue;
}
const message = formatJSString(properties[DEFAULT_MESSAGE_KEY]);
const description = formatJSString(properties[DESCRIPTION_KEY]);
if (!message) {
throw createFailError(
`Empty defaultMessage in Handlebars i18n is not allowed ("${messageId}").`
reporter.report(
createFailError(`Empty defaultMessage in Handlebars i18n is not allowed ("${messageId}").`)
);
continue;
}
const valuesObject = properties.values;
if (valuesObject != null && typeof valuesObject !== 'object') {
throw createFailError(
`"values" value should be an object in Handlebars i18n ("${messageId}").`
reporter.report(
createFailError(`"values" value should be an object in Handlebars i18n ("${messageId}").`)
);
continue;
}
checkValuesProperty(Object.keys(valuesObject || {}), message, messageId);
try {
checkValuesProperty(Object.keys(valuesObject || {}), message, messageId);
yield [messageId, { message, description }];
yield [messageId, { message, description }];
} catch (error) {
if (!isFailError(error)) {
throw error;
}
reporter.report(error);
}
}
}
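The `HBS_REGEX` defined above uses lookbehind and lookahead assertions to pull out just the text between `{{` and `}}`, without capturing the braces themselves. A quick self-contained illustration (Node 10+, where regexp lookbehind is available by default):

```javascript
// Extract handlebars expressions without the surrounding braces,
// mirroring HBS_REGEX from the diff above.
const HBS_REGEX = /(?<=\{\{)([\s\S]*?)(?=\}\})/g;

const template = `
<h1>{{i18n 'app.title' '{"defaultMessage": "Dashboard"}'}}</h1>
<p>{{i18n 'app.greeting' '{"defaultMessage": "Hello"}'}}</p>
`;

const expressions = template.match(HBS_REGEX) || [];
console.log(expressions.length); // one expression per {{...}} block
console.log(expressions[0]); // starts with: i18n 'app.title'
```

Each matched expression is then split into tokens with `TOKENS_REGEX`, which is why `extractHandlebarsMessages` expects exactly three tokens: the `i18n` helper name, the id string, and the properties string.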

View file

@@ -19,7 +19,13 @@
import { extractHandlebarsMessages } from './handlebars';
const report = jest.fn();
describe('dev/i18n/extractors/handlebars', () => {
beforeEach(() => {
report.mockClear();
});
test('extracts handlebars default messages', () => {
const source = Buffer.from(`\
window.onload = function () {
@@ -49,7 +55,8 @@ window.onload = function () {
};
`);
expect(() => extractHandlebarsMessages(source).next()).toThrowErrorMatchingSnapshot();
expect(() => extractHandlebarsMessages(source, { report }).next()).not.toThrow();
expect(report.mock.calls).toMatchSnapshot();
});
test('throws on wrong properties argument type', () => {
@ -59,7 +66,8 @@ window.onload = function () {
};
`);
expect(() => extractHandlebarsMessages(source).next()).toThrowErrorMatchingSnapshot();
expect(() => extractHandlebarsMessages(source, { report }).next()).not.toThrow();
expect(report.mock.calls).toMatchSnapshot();
});
test('throws on empty id', () => {
@ -69,7 +77,8 @@ window.onload = function () {
};
`);
expect(() => extractHandlebarsMessages(source).next()).toThrowErrorMatchingSnapshot();
expect(() => extractHandlebarsMessages(source, { report }).next()).not.toThrow();
expect(report.mock.calls).toMatchSnapshot();
});
test('throws on missing defaultMessage property', () => {
@ -79,6 +88,7 @@ window.onload = function () {
};
`);
expect(() => extractHandlebarsMessages(source).next()).toThrowErrorMatchingSnapshot();
expect(() => extractHandlebarsMessages(source, { report }).next()).not.toThrow();
expect(report.mock.calls).toMatchSnapshot();
});
});

View file

@@ -33,7 +33,7 @@ import {
   extractDescriptionValueFromNode,
 } from '../utils';
 import { DEFAULT_MESSAGE_KEY, DESCRIPTION_KEY, VALUES_KEY } from '../constants';
-import { createFailError } from '../../run';
+import { createFailError, isFailError } from '../../run';

 /**
  * Find all substrings of "{{ any text }}" pattern allowing '{' and '}' chars in single quote strings
@@ -152,41 +152,49 @@ function* extractExpressions(htmlContent) {
   }
 }

-function* getFilterMessages(htmlContent) {
+function* getFilterMessages(htmlContent, reporter) {
   for (const expression of extractExpressions(htmlContent)) {
     const filterStart = expression.indexOf(I18N_FILTER_MARKER);
     const idExpression = trimOneTimeBindingOperator(expression.slice(0, filterStart).trim());
     const filterObjectExpression = expression.slice(filterStart + I18N_FILTER_MARKER.length).trim();

-    if (!filterObjectExpression || !idExpression) {
-      throw createFailError(`Cannot parse i18n filter expression: ${expression}`);
-    }
+    try {
+      if (!filterObjectExpression || !idExpression) {
+        throw createFailError(`Cannot parse i18n filter expression: ${expression}`);
+      }

-    const messageId = parseIdExpression(idExpression);
+      const messageId = parseIdExpression(idExpression);

-    if (!messageId) {
-      throw createFailError(`Empty "id" value in angular filter expression is not allowed.`);
-    }
+      if (!messageId) {
+        throw createFailError('Empty "id" value in angular filter expression is not allowed.');
+      }

-    const { message, description, valuesKeys } = parseFilterObjectExpression(
-      filterObjectExpression,
-      messageId
-    );
+      const { message, description, valuesKeys } = parseFilterObjectExpression(
+        filterObjectExpression,
+        messageId
+      );

-    if (!message) {
-      throw createFailError(
-        `Empty defaultMessage in angular filter expression is not allowed ("${messageId}").`
-      );
-    }
+      if (!message) {
+        throw createFailError(
+          `Empty defaultMessage in angular filter expression is not allowed ("${messageId}").`
+        );
+      }

-    checkValuesProperty(valuesKeys, message, messageId);
-    yield [messageId, { message, description }];
+      checkValuesProperty(valuesKeys, message, messageId);
+      yield [messageId, { message, description }];
+    } catch (error) {
+      if (!isFailError(error)) {
+        throw error;
+      }
+      reporter.report(error);
+    }
   }
 }

-function* getDirectiveMessages(htmlContent) {
+function* getDirectiveMessages(htmlContent, reporter) {
   const $ = cheerio.load(htmlContent);

   const elements = $('[i18n-id]')
@@ -205,34 +213,51 @@ function* getDirectiveMessages(htmlContent) {
   for (const element of elements) {
     const messageId = formatHTMLString(element.id);
     if (!messageId) {
-      throw createFailError(`Empty "i18n-id" value in angular directive is not allowed.`);
+      reporter.report(
+        createFailError('Empty "i18n-id" value in angular directive is not allowed.')
+      );
+      continue;
     }

     const message = formatHTMLString(element.defaultMessage);
     if (!message) {
-      throw createFailError(
-        `Empty defaultMessage in angular directive is not allowed ("${messageId}").`
-      );
+      reporter.report(
+        createFailError(
+          `Empty defaultMessage in angular directive is not allowed ("${messageId}").`
+        )
+      );
+      continue;
     }

-    if (element.values) {
-      const ast = parseExpression(element.values);
-      const valuesObjectNode = [...traverseNodes(ast.program.body)].find(node =>
-        isObjectExpression(node)
-      );
-      const valuesKeys = extractValuesKeysFromNode(valuesObjectNode);
+    try {
+      if (element.values) {
+        const ast = parseExpression(element.values);
+        const valuesObjectNode = [...traverseNodes(ast.program.body)].find(node =>
+          isObjectExpression(node)
+        );
+        const valuesKeys = extractValuesKeysFromNode(valuesObjectNode);

-      checkValuesProperty(valuesKeys, message, messageId);
-    } else {
-      checkValuesProperty([], message, messageId);
-    }
+        checkValuesProperty(valuesKeys, message, messageId);
+      } else {
+        checkValuesProperty([], message, messageId);
+      }

-    yield [messageId, { message, description: formatHTMLString(element.description) || undefined }];
+      yield [
+        messageId,
+        { message, description: formatHTMLString(element.description) || undefined },
+      ];
+    } catch (error) {
+      if (!isFailError(error)) {
+        throw error;
+      }
+      reporter.report(error);
+    }
   }
 }

-export function* extractHtmlMessages(buffer) {
+export function* extractHtmlMessages(buffer, reporter) {
   const content = buffer.toString();
-  yield* getDirectiveMessages(content);
-  yield* getFilterMessages(content);
+  yield* getDirectiveMessages(content, reporter);
+  yield* getFilterMessages(content, reporter);
 }
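The first few lines of `getFilterMessages` split an angular filter expression into an id part and an options-object part around `I18N_FILTER_MARKER`, trimming angular's `::` one-time-binding operator from the id. A self-contained sketch of that split — the marker value and helper name here are assumptions, the real constant lives in `../constants`:

```javascript
// Hypothetical marker; in Kibana it is imported from ../constants.
const I18N_FILTER_MARKER = '| i18n: ';

// Split "<id expression> | i18n: { ...options }" into its two halves,
// in the spirit of the slicing done in getFilterMessages above.
function splitFilterExpression(expression) {
  const filterStart = expression.indexOf(I18N_FILTER_MARKER);
  // '::' is angular's one-time-binding operator; strip it from the id part,
  // as trimOneTimeBindingOperator does in the real code.
  const idExpression = expression
    .slice(0, filterStart)
    .trim()
    .replace(/^::/, '');
  const filterObjectExpression = expression
    .slice(filterStart + I18N_FILTER_MARKER.length)
    .trim();
  return { idExpression, filterObjectExpression };
}

const { idExpression, filterObjectExpression } = splitFilterExpression(
  "::'management.title' | i18n: { defaultMessage: 'Management' }"
);
console.log(idExpression); // 'management.title' (still quoted; parseIdExpression handles that)
console.log(filterObjectExpression); // { defaultMessage: 'Management' }
```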

View file

@@ -41,7 +41,13 @@ const htmlSourceBuffer = Buffer.from(`
 </div>
 `);

+const report = jest.fn();
+
 describe('dev/i18n/extractors/html', () => {
+  beforeEach(() => {
+    report.mockClear();
+  });
+
   test('extracts default messages from HTML', () => {
     const actual = Array.from(extractHtmlMessages(htmlSourceBuffer));
     expect(actual.sort()).toMatchSnapshot();
@@ -67,7 +73,8 @@ describe('dev/i18n/extractors/html', () => {
     ></p>
 `);

-    expect(() => extractHtmlMessages(source).next()).toThrowErrorMatchingSnapshot();
+    expect(() => extractHtmlMessages(source, { report }).next()).not.toThrow();
+    expect(report.mock.calls).toMatchSnapshot();
   });

   test('throws on missing i18n-default-message attribute', () => {
@@ -77,7 +84,8 @@ describe('dev/i18n/extractors/html', () => {
     ></p>
 `);

-    expect(() => extractHtmlMessages(source).next()).toThrowErrorMatchingSnapshot();
+    expect(() => extractHtmlMessages(source, { report }).next()).not.toThrow();
+    expect(report.mock.calls).toMatchSnapshot();
   });

   test('throws on i18n filter usage in complex angular expression', () => {
@@ -87,7 +95,8 @@ describe('dev/i18n/extractors/html', () => {
     ></div>
 `);

-    expect(() => extractHtmlMessages(source).next()).toThrowErrorMatchingSnapshot();
+    expect(() => extractHtmlMessages(source, { report }).next()).not.toThrow();
+    expect(report.mock.calls).toMatchSnapshot();
   });

   test('extracts message from i18n filter in interpolating directive', () => {

View file

@@ -21,40 +21,52 @@ import { parse } from '@babel/parser';
 import { extractI18nCallMessages } from './i18n_call';
 import { isI18nTranslateFunction, traverseNodes, createParserErrorMessage } from '../utils';
-import { createFailError } from '../../run';
+import { createFailError, isFailError } from '../../run';

 /**
  * Matches `i18n(...)` in `#{i18n('id', { defaultMessage: 'Message text' })}`
  */
 const PUG_I18N_REGEX = /i18n\((([^)']|'([^'\\]|\\.)*')*)\)/g;

+function parsePugExpression(expression) {
+  let ast;
+
+  try {
+    ast = parse(expression);
+  } catch (error) {
+    if (error instanceof SyntaxError) {
+      const errorWithContext = createParserErrorMessage(expression, error);
+      throw createFailError(
+        `Couldn't parse Pug expression with i18n(...) call:\n${errorWithContext}`
+      );
+    }
+
+    throw error;
+  }
+
+  return ast;
+}
+
 /**
  * Example: `#{i18n('message-id', { defaultMessage: 'Message text' })}`
  */
-export function* extractPugMessages(buffer) {
+export function* extractPugMessages(buffer, reporter) {
   const expressions = buffer.toString().match(PUG_I18N_REGEX) || [];

   for (const expression of expressions) {
-    let ast;
-
-    try {
-      ast = parse(expression);
-    } catch (error) {
-      if (error instanceof SyntaxError) {
-        const errorWithContext = createParserErrorMessage(expression, error);
-        throw createFailError(
-          `Couldn't parse Pug expression with i18n(...) call:\n${errorWithContext}`
-        );
-      }
-
-      throw error;
-    }
-
-    for (const node of traverseNodes(ast.program.body)) {
-      if (isI18nTranslateFunction(node)) {
-        yield extractI18nCallMessages(node);
-        break;
-      }
-    }
+    try {
+      const ast = parsePugExpression(expression);
+      const node = [...traverseNodes(ast.program.body)].find(node => isI18nTranslateFunction(node));
+
+      if (node) {
+        yield extractI18nCallMessages(node);
+      }
+    } catch (error) {
+      if (!isFailError(error)) {
+        throw error;
+      }
+
+      reporter.report(error);
+    }
   }
 }
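`PUG_I18N_REGEX` pulls every `i18n(...)` call out of a Pug template as raw text, allowing `)` and escaped quotes inside single-quoted strings. A quick demonstration against the regex exactly as it appears in the diff (the sample template lines are made up for illustration):

```javascript
// The regex from pug.js above: matches i18n(...) where the argument list
// may contain ')' only inside single-quoted strings, and quotes may be
// backslash-escaped inside those strings.
const PUG_I18N_REGEX = /i18n\((([^)']|'([^'\\]|\\.)*')*)\)/g;

// Illustrative Pug-like template, not from the Kibana codebase.
const template = [
  "h1= i18n('message-id', { defaultMessage: 'Message text' })",
  "p #{i18n('other-id', { defaultMessage: 'It\\'s escaped' })}",
].join('\n');

const matches = template.match(PUG_I18N_REGEX) || [];
console.log(matches.length); // 2 — both calls found, escaped quote handled
console.log(matches[0]); // i18n('message-id', { defaultMessage: 'Message text' })
```

Each matched string is then handed to the Babel parser (`parsePugExpression`), which is why a malformed call surfaces as a parse error rather than a regex miss.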

View file

@@ -19,7 +19,13 @@
 import { extractPugMessages } from './pug';

+const report = jest.fn();
+
 describe('dev/i18n/extractors/pug', () => {
+  beforeEach(() => {
+    report.mockClear();
+  });
+
   test('extracts messages from pug template with interpolation', () => {
     const source = Buffer.from(`\
 #{i18n('message-id', { defaultMessage: 'Default message', description: 'Message description' })}
@@ -43,7 +49,8 @@ describe('dev/i18n/extractors/pug', () => {
 h1= i18n('', { defaultMessage: 'Default message', description: 'Message description' })
 `);

-    expect(() => extractPugMessages(source).next()).toThrowErrorMatchingSnapshot();
+    expect(() => extractPugMessages(source, { report }).next()).not.toThrow();
+    expect(report.mock.calls).toMatchSnapshot();
   });

   test('throws on missing default message', () => {
@@ -51,6 +58,7 @@ h1= i18n('', { defaultMessage: 'Default message', description: 'Message descript
 #{i18n('message-id', { description: 'Message description' })}
 `);

-    expect(() => extractPugMessages(source).next()).toThrowErrorMatchingSnapshot();
+    expect(() => extractPugMessages(source, { report }).next()).not.toThrow();
+    expect(report.mock.calls).toMatchSnapshot();
   });
 });

View file

@@ -18,5 +18,5 @@
  */

 export { filterPaths, extractMessagesFromPathToMap } from './extract_default_translations';
-export { writeFileAsync } from './utils';
+export { writeFileAsync, readFileAsync, normalizePath, ErrorReporter } from './utils';
 export { serializeToJson, serializeToJson5 } from './serializers';

Some files were not shown because too many files have changed in this diff.