## Summary

Closes https://github.com/elastic/kibana/issues/38277

This PR automates the process for building the Linux headless Chromium used for reporting within Kibana.

The idea is that one only needs to create an issue in the Kibana project to trigger a Chromium build. We leverage GitHub Actions for this: the workflow defined here only kicks in when an issue is created with the label `trigger-chromium-build`. When such an issue is created, its body is analysed; the body is expected to contain a fenced `json` code block that specifies the property `puppeteer_version`. If the property is found, its value is read and a build is triggered on Buildkite with that value (a rough sketch of this parsing step is shown after this description). If multiple fenced code blocks specify `puppeteer_version`, a comment is left asking that the issue body be modified; likewise, if the expected label was added but the property is not found, a comment is left prompting for it.

Once the build has commenced, a comment containing a link to the build is posted as initial feedback to the user. On completion, another comment provides a link to the PR containing all the required changes to update Puppeteer to the specified version.

~~It's also worth pointing out that this PR also modifies the source for the `win` and `mac` Chromium binaries to leverage the [JSON API endpoints](https://github.com/GoogleChromeLabs/chrome-for-testing#json-api-endpoints) provided by Google to fetch the required headless Chromium binaries in a deterministic way, which in turn is part of what makes this automation possible.~~

## How to test this

- If you'd like to test the GitHub Action without this PR being merged just yet, consider setting up [act](https://github.com/nektos/act) alongside its companion [VS Code extension](https://sanjulaganepola.github.io/github-local-actions-docs/). You'll then want to create a payload file providing data similar to what GitHub would send for our workflow trigger; more info about setting this up is [here](https://sanjulaganepola.github.io/github-local-actions-docs/usage/settings/#payloads). The payload file would look something like this:

  ```json
  {
    "action": "labeled",
    "label": {
      "name": "trigger-chromium-build"
    },
    "issue": {
      "number": 1,
      "title": "Issue 1",
      "author_association": "MEMBER",
      "labels": [
        {
          "name": "trigger-chromium-build"
        }
      ],
      "body": "\n## Random text \n\n ~~~json\n{\n \"puppeteer_version\": \"24.6.1\" \n}\n~~~\n~~~json\n{\n \"some_random_value\": \"23.0.1\" \n}\n~~~"
    }
  }
  ```

- To test the actual build process, initiate it through the https://buildkite.com/elastic/kibana-migration-pipeline-staging pipeline by creating a custom build on `pull/212674/head` (this pull request) with env variables similar to this:

  ```
  TESTED_PIPELINE_PATH=.buildkite/pipelines/chromium_linux_build/build_chromium.yml
  PUPPETEER_VERSION=24.6.1
  GITHUB_ISSUE_NUMBER=212732
  GITHUB_ISSUE_BASE_OWNER=elastic
  GITHUB_ISSUE_BASE_REPO=kibana
  GITHUB_ISSUE_TRIGGER_USER=eokoneyo
  ```

PS: Issue #212732 was created to test providing feedback to the user about the build process; re-triggering a build on an existing issue updates the comments in place.

<!--
### Where to go from here?

- Ideas and thoughts welcome
-->

<!--
### Checklist

Check the PR satisfies following conditions. Reviewers should verify this PR satisfies this list as well.

- [ ] Any text added follows [EUI's writing guidelines](https://elastic.github.io/eui/#/guidelines/writing), uses sentence case text and includes [i18n support](https://github.com/elastic/kibana/blob/main/src/platform/packages/shared/kbn-i18n/README.md)
- [ ] [Documentation](https://www.elastic.co/guide/en/kibana/master/development-documentation.html) was added for features that require explanation or tutorials
- [ ] [Unit or functional tests](https://www.elastic.co/guide/en/kibana/master/development-tests.html) were updated or added to match the most common scenarios
- [ ] If a plugin configuration key changed, check if it needs to be allowlisted in the cloud and added to the [docker list](https://github.com/elastic/kibana/blob/main/src/dev/build/tasks/os_packages/docker_generator/resources/base/bin/kibana-docker)
- [ ] This was checked for breaking HTTP API changes, and any breaking changes have been approved by the breaking-change committee. The `release_note:breaking` label should be applied in these situations.
- [ ] [Flaky Test Runner](https://ci-stats.kibana.dev/trigger_flaky_test_runner/1) was used on any tests changed
- [ ] The PR description includes the appropriate Release Notes section, and the correct `release_note:*` label is applied per the [guidelines](https://www.elastic.co/guide/en/kibana/master/contributing.html#kibana-release-notes-process)

### Identify risks

Does this PR introduce any risks? For example, consider risks like hard to test bugs, performance regression, potential of data loss.

Describe the risk, its severity, and mitigation for each identified risk. Invite stakeholders and evaluate how to proceed before merging.

- [ ] [See some risk examples](https://github.com/elastic/kibana/blob/main/RISK_MATRIX.mdx)
- [ ] ...
-->

---------

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
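As referenced in the Summary, here is a minimal sketch of how the `puppeteer_version` value could be pulled out of an issue body. This is not the workflow's actual implementation; it assumes the body has already been fetched (e.g. from the event payload) and is passed in as an argument, and that `awk` and `jq` are available:

```bash
#!/usr/bin/env bash
# Hypothetical sketch only: extract puppeteer_version from the ~~~json fenced
# block(s) of an issue body passed as the first argument.
set -euo pipefail

ISSUE_BODY="$1"

# Emit the contents of each ~~~json fenced block as a single line of JSON,
# then keep only the puppeteer_version values.
versions=$(printf '%s\n' "$ISSUE_BODY" \
  | awk '/^[[:space:]]*~~~json/            {inside=1; block=""; next}
         /^[[:space:]]*~~~[[:space:]]*$/   {if (inside) print block; inside=0; next}
         inside                            {block = block $0}' \
  | jq -r '.puppeteer_version // empty' 2>/dev/null || true)

count=$(printf '%s\n' "$versions" | grep -c . || true)

if [[ "$count" -ne 1 ]]; then
  echo "Expected exactly one json code block with puppeteer_version, found $count" >&2
  exit 1
fi

echo "PUPPETEER_VERSION=$versions"
```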
#!/usr/bin/env bash

set -euo pipefail
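# The chromium_commit_hash meta-data is assumed to be set by an earlier step in the
# pipeline (e.g. the step that resolves the Chromium commit for the requested
# Puppeteer version), along the lines of:
#   buildkite-agent meta-data set "chromium_commit_hash" "<full commit sha>"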
CHROMIUM_COMMIT_HASH=$(buildkite-agent meta-data get "chromium_commit_hash")

echo "---Preparing to build Chromium of commit hash: $CHROMIUM_COMMIT_HASH"

BUILD_ROOT_DIR="$HOME/chromium"

KIBANA_CHECKOUT_DIR="$(pwd)"

BUILD_SCRIPT="$KIBANA_CHECKOUT_DIR/x-pack/build_chromium"

# Create a dedicated working directory outside of the default buildkite working directory.
mkdir "$BUILD_ROOT_DIR" && cd "$BUILD_ROOT_DIR"

ARTIFACT_STAGING_STORAGE_BUCKET="gs://headless_shell_staging"
ARTIFACT_PROD_STORAGE_BUCKET="gs://headless_shell"
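# PLATFORM_VARIANT is expected to be supplied by the pipeline environment for this
# step (e.g. x64 or arm64); with `set -u` the script aborts early if it is missing.
# Staging artifacts are assumed to follow a chromium-<7-char commit hash>-..._<platform variant>
# naming scheme, which is what the query below matches against.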
ARTIFACT_QUERY="chromium-${CHROMIUM_COMMIT_HASH:0:7}-.*_$PLATFORM_VARIANT"
# Impersonate the service account that has access to our storage buckets.
"$KIBANA_CHECKOUT_DIR/.buildkite/scripts/common/activate_service_account.sh" "kibana-ci-access-chromium-blds"

# Check whether the expected build artifact from a prior build already exists;
# the build.py script uploads its build artifact to the staging bucket.
artifacts=$(gsutil ls "$ARTIFACT_STAGING_STORAGE_BUCKET" | grep "$ARTIFACT_QUERY" || true)

if [[ -z "$artifacts" ]]; then
  echo "No files found matching the query: $ARTIFACT_QUERY"

  echo "---Chromium build does not exist in the bucket, proceeding with the build"

  # Install the required packages for building chromium.
  sudo apt-get update && \
    sudo apt-get install -y curl git build-essential pkg-config gcc gperf python3 && \
    sudo rm -rf /var/lib/apt/lists/*

  BUILD_SCRIPT_SYMLINK="$BUILD_ROOT_DIR/build_scripts"

  # Link the existing build_chromium directory from the Kibana checkout.
  ln -s "$BUILD_SCRIPT" "$BUILD_SCRIPT_SYMLINK"

  # Allow our scripts to use depot_tools commands.
  export PATH="$BUILD_ROOT_DIR/depot_tools:$PATH"
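  # depot_tools itself is assumed to be fetched into $BUILD_ROOT_DIR/depot_tools by
  # the init.py step below; exporting PATH first makes its commands available to the
  # build scripts.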
  # Install the OS packages, configure the environment, and download the chromium source (56GB).
  python3 "$BUILD_SCRIPT_SYMLINK/init.py"

  echo "---Building $PLATFORM_VARIANT Chromium of commit hash: $CHROMIUM_COMMIT_HASH"

  # Run the build script with the chromium git commit hash and the platform variant to build for.
  python3 "$BUILD_SCRIPT_SYMLINK/build.py" "$CHROMIUM_COMMIT_HASH" "$PLATFORM_VARIANT"
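  # build.py is expected to leave its packaged output (chromium-*) under
  # chromium/src/out/headless; the copy below promotes that output to the production
  # bucket, in addition to the staging upload build.py performs itself.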
echo "---Upload build artefact to prod storage bucket"
|
|
|
|
gsutil cp "$BUILD_ROOT_DIR/chromium/src/out/headless/chromium-*" "$ARTIFACT_PROD_STORAGE_BUCKET"
|
|
|
|
echo "---Persisting build artefact to buildkite for following steps"
|
|
|
|
buildkite-agent artifact upload "$BUILD_ROOT_DIR/chromium/src/out/headless/chromium-*"
|
|
|
|
else
|
|
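  # A matching artifact already exists in the staging bucket: download each match
  # into the working directory instead of rebuilding.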
echo "$artifacts" | while read -r file; do
|
|
gsutil cp "$file" .
|
|
done
|
|
|
|
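  # With nullglob enabled, an unmatched chromium-* glob expands to nothing, so the
  # array below is empty unless the downloads above actually produced files.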
  shopt -s nullglob
  files=(chromium-*)
  shopt -u nullglob

  if [ ${#files[@]} -gt 0 ]; then
    echo "---Chromium build already exists in the bucket, skipping build"
    # Upload the existing build artifact to buildkite so it can be used in the next steps
    # and is accessible without necessarily having access to the storage bucket itself.
    buildkite-agent artifact upload "chromium-*"
  fi
fi

echo "---Build completed"