mirror of
https://github.com/elastic/logstash.git
synced 2025-04-21 05:08:07 -04:00
* Provision automatic test runs for ruby/java unit tests and integration tests with fips mode (#17029)
  * Run ruby unit tests under FIPS mode. This commit shows a proposed pattern for running automated tests for logstash in FIPS mode. It uses a new identifier in gradle for conditionally setting properties to configure fips mode. The tests are run in a container representative of the base image the final artifacts will be built from.
  * Move everything from qa/fips -> x-pack. This commit moves test setup/config under the x-pack dir.
  * Extend test pipelines for fips mode to java unit tests and integration
  * Add git to container for gradle
  * Move fips-mode gradle hooks to x-pack
  * Skip license check for now
  Co-authored-by: Ry Biesemeyer <ry.biesemeyer@elastic.co>
* Split fips integration tests into two steps (#17038)
  * The integration tests suite takes about 40 minutes, which is far too slow for reasonable feedback on a PR. This commit follows the pattern for the non-fips integration tests, whereby the tests are split into two sections that can run in parallel across two steps. This should halve the feedback time. The logic for getting a list of spec files to run has been extracted to a shared shell script for use here and in the integration tests shell script.
  * Use shared function for splitting integration tests. The logic for getting a list of specs to run has been extracted so that it can be shared across fips and non-fips integration test modes. This commit updates the non-fips integration tests to use the shared function.
  * Fix typo in helper name (kebab case, not snake)
  * Escape $ so buildkite upload does not try to interpolate
  * Wrap integration tests in a shell script to avoid BK interpolation
  * Move entrypoint for running integration tests inside docker
* Skip offline pack manager tests when running in fips mode (#17160). This commit introduces a pattern for skipping tests we do not want to run in fips mode. In this case the plugin manager tests rely on using bundler/net-http/openssl, which is not configured to be run with bouncycastle fips providers.
* Get tests running in FIPS environment (#17096)
  * Modify FIPS test runner environment for integration tests. This commit makes a few small changes to the dockerfile used to define the fips test environment. Specifically, it adds curl (required by integration tests), make (required by test setup), and a C compiler (gcc and glibc, for integration tests which compile a small C program), and turns off debug ssl logging, as it is extremely noisy in logs and was breaking some assumptions in tests about logfile content. Closes https://github.com/elastic/ingest-dev/issues/5074
  * Do not run test env as root. The elastic stack is not meant to be run as root. This commit updates the test environment to provision a non-root user and have the container context execute under that provisioned user. Closes https://github.com/elastic/ingest-dev/issues/5088
  * Skip unit tests that reach out to rubygems for fips mode. The `update` test setup reaches out to rubygems with net/http, which is incompatible with our use of openssl in fips mode. This commit skips those tests when running under fips. See https://github.com/elastic/ingest-dev/issues/5071
  * Work around random data request limits in BCFIPS. This commit changes test setup to make chunked calls to random data generation in order to work around a limit in fips mode. See https://github.com/elastic/ingest-dev/issues/5072 for details.
  * Skip tests validating openssl defaults. Openssl will not be used when running under FIPS mode; the test setup and the tests themselves were failing in FIPS mode. This commit skips the tests that cover behavior that will be disabled. See https://github.com/elastic/ingest-dev/issues/5069
  * Skip tests that require pluginmanager to install plugins. This commit skips tests that rely on using the pluginmanager to install plugins during tests, which requires reaching out to rubygems. See https://github.com/elastic/ingest-dev/issues/5108
  * Skip prepare offline pack integration tests in fips mode. The offline pack tests rely on the pluginmanager using the net-http library for resolving deps, which will not operate under fips mode. Skip when running in fips mode. See https://github.com/elastic/ingest-dev/issues/5109
  * Ensure a gem executable is on the path for test setup. This commit modifies the generate-gems script to ensure that a `gem` executable is on the path; if there is not one on the test runner, use the one bundled with vendored jruby.
  * Skip webserver specs when running in FIPS mode. This commit skips the existing webserver tests. We have some options and need to understand some requirements for the webserver functionality in fips mode; the https://github.com/elastic/ingest-dev/issues/5110 issue has a ton of details.
  * Skip cli `remove` integration tests for FIPS. This commit skips tests that run the `remove` action for the pluginmanager. These require reaching out to rubygems, which is not available in FIPS mode. These tests were added after the initial integration-test scoping work but clearly require skips for FIPS mode.
  * Add openssl package to FIPS testing env container. The setup script for filebeats requires an openssl executable. This commit updates the testing container with this tool. See https://github.com/elastic/ingest-dev/issues/5107
  * Re-introduce retries for FIPS tests now that we are in a passing state.
* Backport 17203 and 17267 fedramp8x (#17271)
  * Pluginmanager clean after mutate (#17203): always clean after mutate; don't skip updating plugins installed with --version. (cherry picked from commit 8c96913807)
  * Pluginmanager install preserve (#17267): integration tests for pluginmanager install --preserve; fix regression where pluginmanager's install --preserve flag didn't work. Add :skip_fips to update_spec.rb.
* Run x-pack tests under FIPS mode (#17254). This commit adds two new CI cells to cover x-pack tests running in FIPS mode, ensuring we have coverage of these features when running existing x-pack tests.
* observabilitySRE: docker rake tasks (#17272)
  Co-authored-by: Cas Donoghue <cas.donoghue@gmail.com>
* Ensure env2yaml dep is properly expressed in observabilitySRE task (#17305). The `build-from-local-observability-sre-artifacts` task depends on the `env2yaml` task. This was easy to miss in local development if other images had been built. This commit updates the makefile to properly define that dependency.
* Add a smoke test for observability SRE container (#17298)
  * Add a CI cell to ensure the observability container is building successfully. In order to show success, run a quick smoke test to point out any glaring issues. This adds some general, low-risk plugins for doing quick testing, which will help developers in debugging as we work on this image.
  * Show what is happening when rake fails; debug deeper in the stack by showing stdout/stderr when shelling out fails. Open3 was not capturing stdout for some reason; capture it and print it to see what is wrong in CI.
  * Actually run the ls command in the docker container 🤦
  * Update safe_system based on code review suggestion
  * Dynamically generate version for container invocation (Co-authored-by: Ry Biesemeyer <yaauie@users.noreply.github.com>)
  * Refactor smoke test setup to a script, avoiding interpolation backflips with buildkite.
  * Split out message-surfacing improvement to a separate PR: https://github.com/elastic/logstash/pull/17310
  * Extract version qualifier into a standalone script. Use https://github.com/elastic/logstash/pull/17311 once it lands and gets backported to 8.x; for now just hard-code the version.
* Configure observability SRE container for FIPS (#17297). This commit establishes a pattern for configuring the container to run in fips mode:
  - Use chainguard-fips
  - Copy over java properties from ls tar archive
  - Convert default jks to BC keystore
  - Configure logstash to use java properties and FIPS config
  NOTE: this assumes bouncycastle jars are in the tarball. The https://github.com/elastic/ingest-dev/issues/5049 ticket will address that.
* Exclude plugin manager and keystore cli from observabilitySRE artifact (#17375)
* Conditionally install bcfips jars when building/testing observabilitySRE (#17359)
  * This commit implements a pattern for performing specific gradle tasks based on a newly named "fedrampHighMode" option. This option is used to configure tests to run with additional configuration specific to the observabilitySRE use case. Similarly, the additional jar dependencies for the bouncycastle fips providers are conditionally installed, gated on the "fedrampHighMode" option.
In order to ensure that the "fedrampHighMode" option persists through the layers of sub-processes spawned between gradle and rake, we store and respect an environment variable FEDRAMP_HIGH_MODE. This may be useful generally in building the docker image.
  * Use gradle pattern for setting properties with env vars. Gradle has a mechanism for setting project properties via environment variables prefixed with `ORG_GRADLE_PROJECT`. This commit updates the gradle tasks to use that pattern. See https://docs.gradle.org/current/userguide/build_environment.html#setting_a_project_property for details.
* Pull in latest commits from 8.x and update based on new patterns (#17385)
  * Fix empty node stats pipelines (#17185) (#17197). Fixed an issue where the `/_node/stats` API displayed empty pipeline metrics when X-Pack monitoring was enabled. (cherry picked from commit 86785815bd) Co-authored-by: kaisecheng <69120390+kaisecheng@users.noreply.github.com>
  * Update z_rubycheck.rake to no longer inject Xmx1g (#17211). This allows the environment variable JRUBY_OPTS to be used for setting properties like Xmx. Original PR: #16420. (cherry picked from commit f562f37df2) Co-authored-by: kaisecheng
  * Improve warning for insufficient file resources for PQ max_bytes (#16656) (#17222). This commit refactors the `PersistedQueueConfigValidator` class to provide a more detailed, accurate and actionable warning when a pipeline's PQ configs are at risk of running out of disk space. See https://github.com/elastic/logstash/issues/14839 for design considerations. The highlights include accurately determining the free resources on a filesystem disk and then providing a breakdown of the usage for each of the paths configured for a queue. (cherry picked from commit 062154494a) Co-authored-by: Cas Donoghue
  * gradle task migrate to the new artifacts-api (#17232) (#17236). This commit migrates the gradle task to the new artifacts-api: remove the dependency on staging artifacts; all builds use snapshot artifacts; resolve version from current branch, major.x, previous minor, with priority given in that order. (cherry picked from commit 0a745686f6) Co-authored-by: Andrea Selva, kaisecheng
  * tests: ls2ls delay checking until events have been processed (#17167) (#17252). Make sure upstream sends the expected number of events before checking the expectation with downstream; remove unnecessary or duplicated logic from the spec; add exception handling in `wait_for_rest_api` to make waiting for the LS REST API retriable. (cherry picked from commit 73ffa243bf) Co-authored-by: Mashhur, Ry Biesemeyer
  * Additional cleanify changes to ls2ls integ tests (#17246) (#17255): replace heartbeat-input with the reload option, set queue drain to get consistent results. (cherry picked from commit 1e06eea86e) Co-authored-by: Mashhur
  * [8.x] Reimplement LogStash::Numeric setting in Java (backport #17127) (#17273). Reimplements the `LogStash::Setting::Numeric` Ruby setting class as `org.logstash.settings.NumericSetting` and exposes it through `java_import` as `LogStash::Setting::NumericSetting`. Updates the rspec tests to verify `java.lang.IllegalArgumentException` instead of `ArgumentError`, since that is the kind of exception thrown by the Java code during verification. (cherry picked from commit 07a3c8e73b) Also fixed the reference to the SettingNumeric class (on main, modules were removed). Co-authored-by: Andrea Selva
  * [CI] Health report integration tests use the new artifacts-api (#17274) (#17277). (cherry picked from commit feb2b92ba2) Co-authored-by: kaisecheng
  * Backport 17203 and 17267 8.x (#17270): Pluginmanager clean after mutate (#17203), always clean after mutate and don't skip updating plugins installed with --version (cherry picked from commit 8c96913807); Pluginmanager install preserve (#17267), integration tests for pluginmanager install --preserve and a fix for the regression where the install --preserve flag didn't work.
  * [Backport 8.x] benchmark script (#17283). This commit cherry-picked the missing benchmark script PRs; the deprecated artifacts-api is removed. [CI] benchmark uses the new artifacts-api (#17224); [CI] benchmark readme (#16783); Introduce a new flag to explicitly permit legacy monitoring (#16586) (only the benchmark script); [ci] fix wrong queue type in benchmark marathon (#16465); [CI] fix benchmark marathon (#16447); [CI] benchmark dashboard and pipeline for testing against multiple versions (#16421)
  * Fix pqcheck and pqrepair on Windows (#17210) (#17259). A recent change to pqcheck attempted to address an issue where pqcheck would not run on Windows machines when located in a folder containing a space, such as "C:\program files\elastic\logstash". While this fixed the issue with spaces in folders, it introduced a new issue related to Java options, and pqcheck was still unable to run on Windows. This PR addresses the issue by removing the quotes around the Java options, which caused option parsing to fail, and instead removes the explicit setting of the classpath; the use of `set CLASSPATH=` in the `:concat` function is sufficient to set the classpath, and should also fix the spaces issue. Fixes: #17209 (cherry picked from commit ba5f21576c) Co-authored-by: Rob Bavey
  * Shareable function for partitioning integration tests (#17223) (#17303). For the fedramp high work (https://github.com/elastic/logstash/pull/17038/files) a use case for multiple scripts consuming the partitioning functionality emerged. As we look to more advanced partitioning we want to ensure that the functionality will be consumable from multiple scripts. See https://github.com/elastic/logstash/pull/17219#issuecomment-2698650296 (cherry picked from commit d916972877) Co-authored-by: Cas Donoghue
  * [8.x] Surface failures from nested rake/shell tasks (backport #17310) (#17317). Previously, when rake would shell out, the output would be lost, making CI logs difficult to debug. This commit updates the stack with improved message surfacing on error. (cherry picked from commit 0d931a502a; conflicts in rubyUtils.gradle) Also: extend ruby linting tasks to handle file inputs (#16660), passing a list of files through to rubocop so consumers can lint a select few files; and ensure the shellwords library is loaded, since depending on task load order `Shellwords` may not be available. Co-authored-by: Cas Donoghue
  * Forward Port of Release notes for `8.16.5` and `8.17.3` (#17187), (#17188) (#17266) (#17321). Co-authored-by: logstashmachine, Rob Bavey, github-actions[bot] (cherry picked from commit 63e8fd1d21)
  * Add Deprecation tag to arcsight module (#17331)
  * [8.x] Upgrade elasticsearch-ruby client (backport #17161) (#17306). Fix Faraday's removed basic auth option and apply the ES client module name change (cherry picked from commit e748488e4a); apply the required changes in elasticsearch_client.rb after upgrading the elasticsearch-ruby client to 8.x; swallow the exception and make a non-connectable client when the ES client raises a connection-refused exception. Co-authored-by: Mashhur
  * Removed unused configHash computation that can be replaced by PipelineConfig.configHash() (#17336) (#17345). Removed the unused configHash computation happening in AbstractPipeline and used only in tests, replaced by a PipelineConfig.configHash() invocation. (cherry picked from commit 787fd2c62f) Co-authored-by: Andrea Selva
  * Use org.logstash.common.Util for hashing by default to SHA256 (#17346) (#17352). Removes the usage of the Apache Commons Codec MessageDigest in favor of the internal Util class, which embodies the hashing methods. (cherry picked from commit 9c0e50faac) Co-authored-by: Andrea Selva
  * Added test to verify the int overflow can happen (#17353) (#17354). Use a long instead of an int to keep the length of the first token. The size limit validation requires summing two integers: the length of the accumulated chars so far plus the length of the next fragment's head part. If either of the two sizes is close to the max integer, the sum overflows; see the test at 9c0e50faac/logstash-core/src/main/java/org/logstash/common/BufferedTokenizerExt.java (L123). To hit this case, sizeLimit must be bigger than 2^31 bytes (2GB) and data fragments without any line delimiter must be pushed to the tokenizer with a total size close to that limit. (cherry picked from commit afde43f918) Co-authored-by: Andrea Selva
  * [8.x] add ci shared qualified-version script (backport #17311) (#17348): add a shareable script for generating a qualified version and use it in CI (cherry picked from commit 10b5a84f84; merge conflict in .buildkite/scripts/dra/build_docker.sh resolved). Co-authored-by: Rye Biesemeyer
  * tests: make integration split quantity configurable (#17219) (#17367). Refactors the shared splitter bash function to take a list of files on stdin and split it into a configurable number of partitions, emitting only those from the currently-selected partition to stdout. Also refactors the only caller, in the integration_tests launcher script, to accept an optional partition_count parameter (defaulting to `2` for backward compatibility), to provide the list of specs on the function's stdin, and to output relevant information about the quantity of partition splits and which was selected. Also: run integration tests in 3 parts in CI. (cherry picked from commit 3e0f488df2) Co-authored-by: Rye Biesemeyer
  * Update buildkite with new patterns from 8.x. This commit updates the buildkite definitions to be compatible with the upstream 8.x branch. Specifically: split integration tests for fips into 3 runners, and use the new shared bash helper for computing QUALIFIED_VERSION. It also continues standardization on "fedrampHighMode" for indicating that tests should run in the context of our custom image for the SRE team.
  * Bug fix: actually use the shared integration_tests.sh file. After refactoring to use the same script, I forgot to actually use it in the buildkite definition.
  Co-authored-by: mergify[bot], kaisecheng, github-actions[bot], Ry Biesemeyer, Mashhur, Andrea Selva, Rob Bavey, Mashhur
* Pin rubocop-ast development gem due to new dep on prism (#17407) (#17433). The rubocop-ast gem just introduced a new dependency on prism (https://rubygems.org/gems/rubocop-ast/versions/1.43.0). In our install-default-gems rake task we are seeing issues trying to build native extensions; upstream jruby is seeing a similar problem (at least the same failure mode: https://github.com/jruby/jruby/pull/8415). This commit pins rubocop-ast to 1.42.0, the last version without an explicit prism dependency. (cherry picked from commit 6de59f2c02) Co-authored-by: Cas Donoghue
* Add age filter fedramp (#17434): net-zero-change refactor; add logstash-filter-age to the observabilitySRE artifact.
* Add licenses for bouncycastle fips jars (#17406). This commit adds licenses for the bouncycastle jars that are added for the observability SRE container artifact. It re-enables the previously disabled license check and adds a new one running in fips mode.
* Publish Observability SRE images to internal container registry (#17401)
  * POC for publishing observability SRE images. This commit adds a step to the pull_request_pipeline buildkite definition to push a docker image to the elastic container registry, showing that we have the proper creds etc. in CI to push the container where it needs to go. We will likely move this into the DRA pipeline once we are confident it is pushing to the correct place with a naming convention that works for all consumers/producers. The general idea is to build the container with our gradle task, then tag the image with the git sha and a "latest" identifier. This allows consumers to choose between an exact sha for a stream like 8.19.0 or the "latest". We will also need to factor in the case where we have the tag *without* the sha postfix. Obviously we will want to fold this into the existing DRA pipeline for building/staging images, but for now it seems reasonable to handle this separately.
  * Move POC code into the DRA pipeline. Notably, we take care not to disrupt anything about the existing DRA pipeline: the new step waits until after the artifacts are published and sets a soft_fail. While this is being introduced and stabilized we want to ensure the existing DRA pipeline continues to work without interruption. As we get more stability we can look at a tighter integration.
  * Disambiguate architectures. Eventually we will want to do proper annotations with manifests, but for now just add the arch to the tag.
  * Use docker manifest for multi-architecture builds. This refactors the POC pipeline for pushing observability SRE containers to handle tag conflicts between target architectures. Cells with the respective architectures build containers and push to the container registry with a unique identifier. Once those exist, a separate step uses the docker manifest command to annotate those images so that a container client can download the correct image based on its architecture. As a result, for every artifact there will be 2 images pushed (one per arch) and N manifests pushed; the manifests handle the final naming the consumer expects.
  * Refactor docker naming scheme. To follow the existing tagging scheme more closely, this refactors image names to include the build sha BEFORE the SNAPSHOT identifier. While this does not exactly follow the whole system that exists today for container images in DRA, it follows a similar pattern. Ideally we can iterate to fold handling of this container into DRA, in which case consumers would not need to update their patterns for identifying images.
  * Code review refactor: rename INCLUDE_SHA to INCLUDE_COMMIT_ID in the qualified-version script; confine use of this argument to individual invocations instead of at the top level in scripts.
  * Build observabilitySRE containers after DRA is published, gating build/push on success of the DRA pipeline.
* x-pack: add fips validation plugin from x-pack (#16940)
  * x-pack: add the fips_validation plugin to be included in fips builds. The `logstash-integration-fips_validation` plugin provides no runtime pipeline plugins, but instead provides hooks to ensure that the logstash process is correctly configured for compliance with FIPS 140-3. It is installed while building the observabilitySRE artifacts.
  * fips validation: ensure BCFIPS, BCJSSE, SUN are the first 3 security providers
  * remove re-injection of BCFIPS jars
  * add integration spec for the fips_validation plugin; add the missing logstash_plugin helper; skip the non-fips spec on fips-configured artifacts; add spec details
* Improve smoke tests for observability SRE image (#17486)
  * This commit adds a new rspec test to run the observability SRE container in a docker compose network with filebeat and elasticsearch. It pushes some simple test data through a pipeline with plugins we expect to be used in production. The rspec tests ensure the test data flows from filebeat to logstash to elasticsearch by querying elasticsearch for the expected transformed data.
  * REVERT ME: debug what's going on in CI :(
  * Run the filebeat container as root, and work around strict file ownership perms for filebeat. We add the filebeat config in a volume, and the permissions checks fail because the test runner is not a root user; this commit disables that check in filebeat, which seems to be the consensus solution (for example: https://event-driven.io/en/tricks_on_how_to_set_up_related_docker_images/).
  * Dynamically generate PKI instead of checking it in. Instead of checking in PKI, dynamically generate it with a gradle task for starting containers and running the tests. This avoids github warnings about checked-in keys and avoids expiration headaches. Generation is very fast and does not add any significant overhead to test setup.
  * Remove use of "should" in rspec docstrings (see https://github.com/rubocop/rspec-style-guide?tab=readme-ov-file#should-in-example-docstrings)
  * Ensure permissions are readable for the volume: now that certs are dynamically generated, ensure they can be read in the container
  * Use the elasticsearch-fips image for smoke testing
  * Add a git ignore for temp certs
  * Fix naming convention for integration tests (Co-authored-by: Rye Biesemeyer <yaauie@users.noreply.github.com>)
  * Use parameter expansion for FEDRAMP_HIGH_MODE (Co-authored-by: Rye Biesemeyer)
Co-authored-by: Ry Biesemeyer <ry.biesemeyer@elastic.co>, Ry Biesemeyer <yaauie@users.noreply.github.com>, mergify[bot], kaisecheng, github-actions[bot], Mashhur, Andrea Selva, Rob Bavey

NOTE: we decided to squash these commits because the feature branch had cherry-picks (and squashed change sets 182f15ebde) from 8.x which would potentially make the commit history confusing. We determined that the benefit of having individual commits from the feature branch was outweighed by the potentially confusing git history. This will also make porting this bit of work to other streams simpler.
440 lines
18 KiB
Groovy
/*
 * Licensed to Elasticsearch B.V. under one or more contributor
 * license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright
 * ownership. Elasticsearch B.V. licenses this file to you under
 * the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *	http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath "org.yaml:snakeyaml:${snakeYamlVersion}"
        classpath "de.undercouch:gradle-download-task:4.0.4"
        classpath "org.jruby:jruby-core:9.4.9.0"
    }
}

import de.undercouch.gradle.tasks.download.Download
import de.undercouch.gradle.tasks.download.Verify
import org.yaml.snakeyaml.Yaml

import org.jruby.Ruby
import org.jruby.embed.PathType
import org.jruby.embed.ScriptingContainer

import java.lang.annotation.Annotation
import java.nio.file.Files
import java.nio.file.Paths

ext {
    bundle = this.&bundle
    bundleWithEnv = this.&bundleWithEnv
    bundleQAGems = this.&bundleQAGems
    gem = this.&gem
    buildGem = this.&buildGem
    rake = this.&rake
    setupJruby = this.&setupJruby
    generateRubySupportFilesForPlugin = this.&generateRubySupportFilesForPlugin
    validatePluginJar = this.&validatePluginJar
    versionMap = new HashMap()
    pluginInfo = new PluginInfo()
}

/**
 * Executes a bundler bin script with given parameters.
 * @param projectDir Gradle projectDir
 * @param buildDir Gradle buildDir
 * @param pwd Current worker directory to execute in
 * @param bundleBin Bundler Bin Script
 * @param args CLI Args to Use with Bundler
 */
void bundle(File projectDir, File buildDir, String pwd, String bundleBin, Iterable<String> args) {
    bundleWithEnv(projectDir, buildDir, pwd, bundleBin, args, Collections.emptyMap())
}

/**
 * Executes a bundler bin script with given parameters.
 * @param projectDir Gradle projectDir
 * @param buildDir Gradle buildDir
 * @param pwd Current worker directory to execute in
 * @param bundleBin Bundler Bin Script
 * @param args CLI Args to Use with Bundler
 * @param env Environment Variables to Set
 */
void bundleWithEnv(File projectDir, File buildDir, String pwd, String bundleBin, Iterable<String> args, Map<String, String> env) {
    executeJruby projectDir, buildDir, { ScriptingContainer jruby ->
        jruby.environment.putAll(env)
        jruby.currentDirectory = pwd
        jruby.argv = args.toList().toArray()
        jruby.runScriptlet(PathType.ABSOLUTE, bundleBin)
    }
}

void bundleQAGems(File projectDir, String qaBuildPath) {
    def jruby = new ScriptingContainer()
    jruby.setLoadPaths(["${projectDir}/vendor/jruby/lib/ruby/stdlib".toString()])
    try {
        jruby.currentDirectory = qaBuildPath
        jruby.runScriptlet("""
                require "bundler"
                require "bundler/cli"
                Bundler::CLI.start(['install', '--path', "${qaBuildPath}/vendor", '--gemfile', "${projectDir}/qa/integration/Gemfile"])
        """)
    } finally {
        jruby.terminate()
        Ruby.clearGlobalRuntime()
    }
}

/**
 * Installs a Gem with the given version to the given path.
 * @param projectDir Gradle projectDir
 * @param buildDir Gradle buildDir
 * @param gem Gem Name
 * @param version Version to Install
 * @param path Path to Install to
 */
void gem(File projectDir, File buildDir, String gem, String version, String path) {
    executeJruby projectDir, buildDir, { ScriptingContainer jruby ->
        jruby.currentDirectory = projectDir
        jruby.runScriptlet("""
                require 'rubygems/commands/install_command'
                cmd = Gem::Commands::InstallCommand.new
                cmd.handle_options ['--no-document', '${gem}', '-v', '${version}', '-i', '${path}']
                begin
                    cmd.execute
                rescue Gem::SystemExitException => e
                    raise e unless e.exit_code == 0
                end
        """)
    }
}

void buildGem(File projectDir, File buildDir, String gemspec) {
    executeJruby projectDir, buildDir, { ScriptingContainer jruby ->
        jruby.currentDirectory = projectDir
        jruby.runScriptlet("""
                require 'rubygems/commands/build_command'
                cmd = Gem::Commands::BuildCommand.new
                cmd.handle_options ['${gemspec}']
                begin
                    cmd.execute
                rescue Gem::SystemExitException => e
                    raise e unless e.exit_code == 0
                end
        """)
    }
}

/**
 * Invokes a Rake task from the project's Rakefile.
 * @param projectDir Gradle projectDir
 * @param buildDir Gradle buildDir
 * @param task Rake task to invoke
 * @param args Optional arguments to pass to the rake task
 */
void rake(File projectDir, File buildDir, String task, String... args) {
    executeJruby projectDir, buildDir, { ScriptingContainer jruby ->
        jruby.currentDirectory = projectDir
        jruby.runScriptlet("require 'rake'; require 'time'")
        def rakeArgs = args ? "'${args.join("','")}'" : ""
        jruby.runScriptlet("""
                begin
                    rake = Rake.application
                    rake.init
                    rake.load_rakefile
                    rake['${task}'].invoke(${rakeArgs})
                rescue => e
                    puts "Rake task error: #{e.class}: #{e.message}"
                    puts "Backtrace: #{e.backtrace.join("\\n")}"
                    raise e
                end
        """)
    }
}

void setupJruby(File projectDir, File buildDir) {
    executeJruby projectDir, buildDir, { ScriptingContainer jruby ->
        jruby.currentDirectory = projectDir
        jruby.runScriptlet("require '${projectDir}/lib/bootstrap/environment'")
        jruby.runScriptlet("LogStash::Bundler.invoke!")
        jruby.runScriptlet("LogStash::Bundler.genericize_platform")
    }
}

/**
 * Executes a Closure using a fresh JRuby environment, safely tearing it down afterwards.
 * @param projectDir Gradle projectDir
 * @param buildDir Gradle buildDir
 * @param block Closure to run
 */
Object executeJruby(File projectDir, File buildDir, Closure<?> block) {
    def jruby = new ScriptingContainer()
    def env = jruby.environment
    def gemDir = "${projectDir}/vendor/bundle/jruby/3.1.0".toString()
    jruby.setLoadPaths(["${projectDir}/vendor/jruby/lib/ruby/stdlib".toString()])
    env.put "USE_RUBY", "1"
    env.put "GEM_HOME", gemDir
    env.put "GEM_SPEC_CACHE", "${buildDir}/cache".toString()
    env.put "GEM_PATH", gemDir
    // Pass through ORG_GRADLE_PROJECT_fedrampHighMode if the fedrampHighMode project property is set.
    // See https://docs.gradle.org/current/userguide/build_environment.html#setting_a_project_property
    // for more information about setting properties via env vars prefixed with ORG_GRADLE_PROJECT.
    if (project.hasProperty('fedrampHighMode') && project.property('fedrampHighMode').toBoolean()) {
        env.put "ORG_GRADLE_PROJECT_fedrampHighMode", "true"
    }
    try {
        block(jruby)
    } finally {
        jruby.terminate()
        Ruby.clearGlobalRuntime()
    }
}

//===============================================================================
// Ruby variables
//===============================================================================

def versionsPath = project.hasProperty("LOGSTASH_CORE_PATH") ? LOGSTASH_CORE_PATH + "/../versions.yml" : "${projectDir}/versions.yml"
versionMap = (Map) (new Yaml()).load(new File("${versionsPath}").text)

String jRubyURL
String jRubyVersion
String jRubySha1
Boolean doChecksum

if (versionMap["jruby-runtime-override"]) {
    jRubyVersion = versionMap["jruby-runtime-override"]["version"]
    jRubyURL = versionMap["jruby-runtime-override"]["url"]
    doChecksum = false
} else {
    jRubyVersion = versionMap["jruby"]["version"]
    jRubySha1 = versionMap["jruby"]["sha1"]
    jRubyURL = "https://repo1.maven.org/maven2/org/jruby/jruby-dist/${jRubyVersion}/jruby-dist-${jRubyVersion}-bin.tar.gz"
    doChecksum = true
}
def jrubyTarPath = "${projectDir}/vendor/_/jruby-dist-${jRubyVersion}-bin.tar.gz"

def customJRubyDir = project.hasProperty("custom.jruby.path") ? project.property("custom.jruby.path") : ""
def customJRubyVersion = customJRubyDir == "" ? "" : Files.readAllLines(Paths.get(customJRubyDir, "VERSION")).get(0).trim()
def customJRubyTar = customJRubyDir == "" ? "" : (customJRubyDir + "/maven/jruby-dist/target/jruby-dist-${customJRubyVersion}-bin.tar.gz")

tasks.register("downloadJRuby", Download) {
    description "Download JRuby artifact from this specific URL: ${jRubyURL}"
    src jRubyURL
    onlyIfNewer true
    inputs.file(versionsPath)
    outputs.file(jrubyTarPath)
    dest new File("${projectDir}/vendor/_", "jruby-dist-${jRubyVersion}-bin.tar.gz")
}

downloadJRuby.onlyIf { customJRubyDir == "" }

tasks.register("verifyFile", Verify) {
    dependsOn downloadJRuby
    description "Verify the SHA1 of the downloaded JRuby artifact"
    inputs.file(jrubyTarPath)
    outputs.file(jrubyTarPath)
    src new File(jrubyTarPath)
    algorithm 'SHA-1'
    checksum jRubySha1
}

verifyFile.onlyIf { customJRubyDir == "" }
verifyFile.onlyIf { doChecksum }

tasks.register("buildCustomJRuby", Exec) {
    description "Build tar.gz and .jar artifacts from JRuby source directory"
    workingDir (customJRubyDir == "" ? "./" : customJRubyDir)
    commandLine './mvnw', 'clean', 'install', '-Pdist', '-Pcomplete'
    standardOutput = new ByteArrayOutputStream()
    errorOutput = new ByteArrayOutputStream()
    ext.output = {
        standardOutput.toString() + errorOutput.toString()
    }
}

buildCustomJRuby.onlyIf { customJRubyDir != "" }

tasks.register("installCustomJRuby", Copy) {
    dependsOn buildCustomJRuby
    description "Install custom built JRuby in the vendor directory"
    inputs.file(customJRubyTar)
    outputs.dir("${projectDir}/vendor/jruby")
    from tarTree(customJRubyTar == "" ? jrubyTarPath : customJRubyTar)
    eachFile { f ->
        f.path = f.path.replaceFirst("^jruby-${customJRubyVersion}", '')
    }
    includeEmptyDirs = false
    into "${projectDir}/vendor/jruby"
}

installCustomJRuby.onlyIf { customJRubyDir != "" }

tasks.register("downloadAndInstallJRuby", Copy) {
    dependsOn = [verifyFile, installCustomJRuby]
    description "Install JRuby in the vendor directory"
    inputs.file(jrubyTarPath)
    outputs.dir("${projectDir}/vendor/jruby")
    from tarTree(downloadJRuby.dest)
    eachFile { f ->
        f.path = f.path.replaceFirst("^jruby-${jRubyVersion}", '')
    }
    exclude "**/did_you_mean-*/evaluation/**" // licensing issue https://github.com/jruby/jruby/issues/6471
    exclude "vendor/bundle/jruby/**/gems/ruby-maven-libs-3.3.9/**/*"
    exclude "**/lib/jni/**/**"

    includeEmptyDirs = false
    into "${projectDir}/vendor/jruby"
}

downloadAndInstallJRuby.onlyIf { customJRubyDir == "" }

//===============================================================================
// Ruby auto-gen utilities for Java plugins
//===============================================================================

class PluginInfo {
    public String[] licenses
    public String longDescription
    public String[] authors
    public String[] email
    public String homepage
    public String pluginType
    public String pluginClass
    public String pluginName

    String pluginFullName() {
        return "logstash-" + pluginType + "-" + pluginName
    }
}

void generateRubySupportFilesForPlugin(String projectDescription, String projectGroup, String version) {
    File gemFile = file("Gemfile")
    gemFile.write("# AUTOGENERATED BY THE GRADLE SCRIPT. EDITS WILL BE OVERWRITTEN.\n")
    gemFile.append("source 'https://rubygems.org'\n")
    gemFile.append("\n")
    gemFile.append("gemspec\n")
    gemFile.append("\n")
    gemFile.append("logstash_path = ENV[\"LOGSTASH_PATH\"] || \"../../logstash\"\n")
    gemFile.append("use_logstash_source = ENV[\"LOGSTASH_SOURCE\"] && ENV[\"LOGSTASH_SOURCE\"].to_s == \"1\"\n")
    gemFile.append("\n")
    gemFile.append("if Dir.exist?(logstash_path) && use_logstash_source\n")
    gemFile.append("  gem 'logstash-core', :path => \"#{logstash_path}/logstash-core\"\n")
    gemFile.append("  gem 'logstash-core-plugin-api', :path => \"#{logstash_path}/logstash-core-plugin-api\"\n")
    gemFile.append("end\n")

    File gemspecFile = file(pluginInfo.pluginFullName() + ".gemspec")
    gemspecFile.write("# AUTOGENERATED BY THE GRADLE SCRIPT. EDITS WILL BE OVERWRITTEN.\n")
    gemspecFile.append("Gem::Specification.new do |s|\n")
    gemspecFile.append("  s.name = '" + pluginInfo.pluginFullName() + "'\n")
    gemspecFile.append("  s.version = ::File.read('VERSION').split('\\n').first\n")
    gemspecFile.append("  s.licenses = ['" + String.join("', '", pluginInfo.licenses) + "']\n")
    gemspecFile.append("  s.summary = '" + projectDescription + "'\n")
    gemspecFile.append("  s.description = '" + pluginInfo.longDescription + "'\n")
    gemspecFile.append("  s.authors = ['" + String.join("', '", pluginInfo.authors) + "']\n")
    gemspecFile.append("  s.email = ['" + String.join("', '", pluginInfo.email) + "']\n")
    gemspecFile.append("  s.homepage = '" + pluginInfo.homepage + "'\n")
    gemspecFile.append("  s.platform = 'java'\n")
    gemspecFile.append("  s.require_paths = ['lib', 'vendor/jar-dependencies']\n")
    gemspecFile.append("\n")
    gemspecFile.append("  s.files = Dir[\"lib/**/*\",\"*.gemspec\",\"*.md\",\"CONTRIBUTORS\",\"Gemfile\",\"LICENSE\",\"NOTICE.TXT\", \"vendor/jar-dependencies/**/*.jar\", \"vendor/jar-dependencies/**/*.rb\", \"VERSION\", \"docs/**/*\"]\n")
    gemspecFile.append("\n")
    gemspecFile.append("  # Special flag to let us know this is actually a logstash plugin\n")
    gemspecFile.append("  s.metadata = { 'logstash_plugin' => 'true', 'logstash_group' => '" + pluginInfo.pluginType + "', 'java_plugin' => 'true'}\n")
    gemspecFile.append("\n")
    gemspecFile.append("  # Gem dependencies\n")
    gemspecFile.append("  s.add_runtime_dependency \"logstash-core-plugin-api\", \">= 1.60\", \"<= 2.99\"\n")
    gemspecFile.append("  s.add_runtime_dependency 'jar-dependencies'\n")
    gemspecFile.append("  s.add_development_dependency 'logstash-devutils'\n")
    gemspecFile.append("end\n")

    String moduleName = pluginInfo.pluginType.substring(0, 1).toUpperCase() + pluginInfo.pluginType.substring(1) + "s"
    File pluginRb = file("lib/logstash/" + pluginInfo.pluginType + "s/" + pluginInfo.pluginName + ".rb")
    Files.createDirectories(pluginRb.toPath().getParent())
    pluginRb.write("# AUTOGENERATED BY THE GRADLE SCRIPT. EDITS WILL BE OVERWRITTEN.\n")
    pluginRb.append("# encoding: utf-8\n")
    pluginRb.append("require \"logstash/" + pluginInfo.pluginType + "s/base\"\n")
    pluginRb.append("require \"logstash/namespace\"\n")
    pluginRb.append("require \"" + pluginInfo.pluginFullName() + "_jars\"\n")
    pluginRb.append("require \"java\"\n")
    pluginRb.append("\n")
    pluginRb.append("class LogStash::" + moduleName + "::" + pluginInfo.pluginClass + " < LogStash::" + moduleName + "::Base\n")
    pluginRb.append("  config_name \"" + pluginInfo.pluginName + "\"\n")
    pluginRb.append("\n")
    pluginRb.append("  def self.javaClass() Java::" + projectGroup + "." + pluginInfo.pluginClass + ".java_class; end\n")
    pluginRb.append("end\n")

    File pluginJarsRb = file("lib/" + pluginInfo.pluginFullName() + "_jars.rb")
    pluginJarsRb.write("# AUTOGENERATED BY THE GRADLE SCRIPT. EDITS WILL BE OVERWRITTEN.\n")
    pluginJarsRb.append("# encoding: utf-8\n")
    pluginJarsRb.append("\n")
    pluginJarsRb.append("require 'jar_dependencies'\n")
    pluginJarsRb.append("require_jar('" + projectGroup + "', '" + pluginInfo.pluginFullName() + "', '" + version + "')\n")
}

void validatePluginJar(File pluginJar, String group) {
    List<String> validationErrors = new ArrayList<>()

    if (group.equals('org.logstash') || group.startsWith('org.logstash.') || group.equals('co.elastic.logstash') || group.startsWith('co.elastic.logstash.')) {
        validationErrors.add("The plugin should not be placed in the 'org.logstash' or 'co.elastic.logstash' packages")
        throw new GradleScriptException("Plugin validation errors:" + System.lineSeparator() +
                String.join(System.lineSeparator(), validationErrors), null)
    }

    URLClassLoader cl = URLClassLoader.newInstance([pluginJar.toURI().toURL()] as URL[])
    String pluginClassName = group + "." + pluginInfo.pluginClass

    Class<?> pluginClass = null
    try {
        pluginClass = cl.loadClass(pluginClassName)
    } catch (ClassNotFoundException ex) {
        validationErrors.add(String.format("Unable to locate plugin class defined in build.gradle as '%s' in jar '%s'", pluginClassName, pluginJar))
        throw new GradleScriptException("Plugin validation errors:" + System.lineSeparator() +
                String.join(System.lineSeparator(), validationErrors), null)
    }

    if (pluginClass != null) {
        Annotation[] logstashPlugin = pluginClass.getAnnotations().findAll({ x -> x.annotationType().toString().equals("interface co.elastic.logstash.api.LogstashPlugin") })
        if (logstashPlugin.length != 1) {
            validationErrors.add("There must be a single @LogstashPlugin annotation on the plugin class")
        } else {
            String pluginAnnotation = logstashPlugin[0].name()

            if (pluginAnnotation != pluginInfo.pluginName) {
                validationErrors.add("The 'name' property on the @LogstashPlugin (which is '" + pluginAnnotation + "') must match the 'pluginName' property which is defined as '" + pluginInfo.pluginName + "' in the build.gradle file")
            }

            if (pluginAnnotation.replace("_", "").toLowerCase() != pluginInfo.pluginClass.toLowerCase()) {
                validationErrors.add("The 'name' property on the @LogstashPlugin (which is '" + pluginAnnotation + "') must match the plugin class name '" + pluginInfo.pluginClass + "' excluding casing and underscores")
            }
        }
    }

    if (validationErrors.size() > 0) {
        throw new GradleScriptException("Plugin validation errors:" + System.lineSeparator() +
                String.join(System.lineSeparator(), validationErrors), null)
    }
}