[On-Week] Automate chromium linux builds for reporting in Kibana (#212674)

## Summary

Closes https://github.com/elastic/kibana/issues/38277

This PR ~~attempts to take a stab at automating~~ automates the process
of building the linux headless chromium used for reporting within
Kibana.

The idea here is that one would only need to create an issue within the
Kibana project to trigger a chromium build; we leverage GitHub actions
for this. The workflow defined here only kicks in when an issue is
created with the label `trigger-chromium-build`. When such an issue is
created, its body is analysed; the expectation is that the body contains
a fenced code block of type `json` specifying the property
`puppeteer_version`. If this property is found, its value is read and we
trigger a build on buildkite, passing the value along. If multiple
fenced code blocks specify the expected `puppeteer_version` property, we
leave a comment asking that the issue body be modified; likewise, if the
expected label was added but the expected property is not found, we
leave a comment prompting that it be provided. A sample issue body is
shown below.
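
For example (the issue title and version number here are illustrative), an issue labelled `trigger-chromium-build` with the following body would request a build for puppeteer `24.6.1`:

```markdown
## Update puppeteer to 24.6.1

~~~json
{
  "puppeteer_version": "24.6.1"
}
~~~
```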

Once the build has commenced, a comment containing a link to the build
is posted on the issue as initial feedback to the user. On completion,
another comment is posted linking to the PR that contains all the
required changes to update puppeteer to the specified version.

~~It's also worth pointing out that this PR modifies the source for the
`win` and `mac` chromium binaries to leverage the [JSON API
endpoint](https://github.com/GoogleChromeLabs/chrome-for-testing#json-api-endpoints)
provided by Google, so the required chromium headless binaries are
fetched in a deterministic way, which in turn is part of what makes this
automation possible.~~
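
For reference, resolving a headless-shell download from that endpoint amounts to something like the following minimal sketch (the shipped logic lives in `issue_feedback/entry.js` further down; the helper name here is ours):

```js
// Minimal sketch (not the shipped implementation): resolve a deterministic
// chrome-headless-shell download URL from Google's known-good-versions JSON API.
const axios = require('axios');

const KNOWN_GOOD_VERSIONS_URL =
  'https://googlechromelabs.github.io/chrome-for-testing/known-good-versions-with-downloads.json';

// `version` and `revision` are the values computed from the puppeteer version;
// `platform` is one of 'linux64' | 'mac-arm64' | 'mac-x64' | 'win32' | 'win64'.
async function findHeadlessShellDownloadUrl(version, revision, platform) {
  const { data } = await axios.get(KNOWN_GOOD_VERSIONS_URL);
  // Each entry carries a version, a revision and per-platform download URLs.
  const match = data.versions.find((v) => v.version === version && v.revision === revision);
  return match?.downloads?.['chrome-headless-shell']?.find((d) => d.platform === platform)?.url;
}
```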

## How to test this

- If you'd like to test the GitHub action without the PR being merged in
just yet, consider setting up [act](https://github.com/nektos/act)
alongside its companion [vscode
extension](https://sanjulaganepola.github.io/github-local-actions-docs/).
We then want to create a payload file providing data similar to what
GitHub would return for our workflow trigger; more info about setting
this up is
[here](https://sanjulaganepola.github.io/github-local-actions-docs/usage/settings/#payloads).
The payload file we want would be something along the following lines
(see the `act` invocation after the payload):

```json
{
    "action": "labeled",
    "label":{
        "name": "trigger-chromium-build"
    },
    "issue": {
        "number": 1,
        "title": "Issue 1",
        "author_association": "MEMBER",
        "labels": [
            {
                "name": "trigger-chromium-build"
            }
        ],
        "body": "\n## Random text \n\n ~~~json\n{\n  \"puppeteer_version\": \"24.6.1\" \n}\n~~~\n~~~json\n{\n  \"some_random_value\": \"23.0.1\" \n}\n~~~"
    }
}
```
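
With act set up, the workflow can then be exercised locally with something along these lines (the `-W` path is illustrative; point it at wherever the workflow file lives in your checkout):

```
act issues -e payload.json -W .github/workflows/trigger_chromium_build.yml
```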

- To test the actual build process, a build can be initiated through
this specific pipeline
https://buildkite.com/elastic/kibana-migration-pipeline-staging by
creating a custom build on `pull/212674/head` (this pull request) with
env variables similar to these:

```
TESTED_PIPELINE_PATH=.buildkite/pipelines/chromium_linux_build/build_chromium.yml
PUPPETEER_VERSION=24.6.1
GITHUB_ISSUE_NUMBER=212732
GITHUB_ISSUE_BASE_OWNER=elastic
GITHUB_ISSUE_BASE_REPO=kibana
GITHUB_ISSUE_TRIGGER_USER=eokoneyo
```

PS: Issue #212732 was created to test providing feedback to the user
about the build process; re-triggering a build on an existing issue
updates the comments in place.


---------

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
Eyo O. Eyo, 2025-05-06 22:04:08 +02:00, committed by GitHub
commit 2fd65fba64 (parent 0acd88e3b9)
14 changed files with 3629 additions and 20 deletions

File diff suppressed because it is too large.


@@ -11,6 +11,7 @@
  },
  "dependencies": {
    "@octokit/rest": "^18.10.0",
    "adm-zip": "^0.5.16",
    "axios": "^1.8.3",
    "globby": "^11.1.0",
    "js-yaml": "^4.1.0",
@@ -21,11 +22,14 @@
  "devDependencies": {
    "@types/chai": "^4.3.3",
    "@types/js-yaml": "^4.0.9",
    "@types/jscodeshift": "^0.12.0",
    "@types/minimatch": "^3.0.5",
    "@types/minimist": "^1.2.5",
    "@types/mocha": "^10.0.1",
    "@types/node": "^15.12.2",
    "chai": "^4.3.10",
    "chai-snapshot-tests": "^0.6.0",
    "jscodeshift": "^17.1.2",
    "mocha": "^11.0.1",
    "nock": "^12.0.2",
    "ts-node": "^10.9.2",


@@ -0,0 +1,40 @@
# yaml-language-server: $schema=https://gist.githubusercontent.com/elasticmachine/988b80dae436cafea07d9a4a460a011d/raw/rre.schema.json
apiVersion: backstage.io/v1alpha1
kind: Resource
metadata:
  name: bk-kibana-chromium-linux-build
  description: Create chromium linux builds for the Kibana Reporting feature
  links:
    - title: Pipeline link
      url: https://buildkite.com/elastic/kibana-chromium-linux-build
spec:
  type: buildkite-pipeline
  owner: group:response-ops
  system: buildkite
  implementation:
    apiVersion: buildkite.elastic.dev/v1
    kind: Pipeline
    metadata:
      name: kibana / Chromium Linux Build
      description: Creates headless Chromium Linux builds
    spec:
      env:
        SLACK_NOTIFICATIONS_CHANNEL: "#kibana-alerting"
        ELASTIC_SLACK_NOTIFICATIONS_ENABLED: "false"
      repository: elastic/kibana
      branch_configuration: main
      default_branch: main
      pipeline_file: ".buildkite/pipelines/chromium_linux_build/build_chromium.yml"
      provider_settings:
        trigger_mode: none
      teams:
        kibana-operations:
          access_level: MANAGE_BUILD_AND_READ
        appex-sharedux:
          access_level: MANAGE_BUILD_AND_READ
        response-ops:
          access_level: MANAGE_BUILD_AND_READ
        everyone:
          access_level: READ_ONLY
      tags:
        - kibana


@@ -14,6 +14,7 @@ spec:
    - https://github.com/elastic/kibana/blob/main/.buildkite/pipeline-resource-definitions/kibana-artifacts-staging.yml
    - https://github.com/elastic/kibana/blob/main/.buildkite/pipeline-resource-definitions/kibana-artifacts-trigger.yml
    - https://github.com/elastic/kibana/blob/main/.buildkite/pipeline-resource-definitions/kibana-chrome-forward-testing.yml
    - https://github.com/elastic/kibana/blob/main/.buildkite/pipeline-resource-definitions/kibana-chromium-linux-build.yml
    - https://github.com/elastic/kibana/blob/main/.buildkite/pipeline-resource-definitions/kibana-codeql.yml
    - https://github.com/elastic/kibana/blob/main/.buildkite/pipeline-resource-definitions/kibana-console-definitions-sync.yml
    - https://github.com/elastic/kibana/blob/main/.buildkite/pipeline-resource-definitions/kibana-coverage-daily.yml


@@ -0,0 +1,59 @@
env:
  GITHUB_PR_TARGET_BRANCH: 'main' # common utils needs this value to be set
steps:
  - command: |
      ts-node .buildkite/scripts/lifecycle/comment_on_pr.ts \
        --message "Linux headless chromium build started at: $BUILDKITE_BUILD_URL" \
        --context "chromium-linux-build-job-start" \
        --issue-number $GITHUB_ISSUE_NUMBER \
        --repository $GITHUB_ISSUE_BASE_REPO \
        --repository-owner $GITHUB_ISSUE_BASE_OWNER \
        --clear-previous
    label: Comment with job URL on issue
    agents:
      provider: gcp
      image: family/kibana-ubuntu-2004
      imageProject: elastic-images-prod
      machineType: n2-standard-2
    timeout_in_minutes: 5
  - command: |
      .buildkite/scripts/pipelines/chromium_linux_build/chromium_version.sh
    label: Infer chromium version for puppeteer version
    key: infer_chromium_version
    agents:
      provider: gcp
      image: family/kibana-ubuntu-2004
      imageProject: elastic-images-prod
      machineType: n2-standard-2
    timeout_in_minutes: 5
  - command: "PLATFORM_VARIANT={{matrix}} .buildkite/scripts/pipelines/chromium_linux_build/build.sh"
    label: "Build {{matrix}} linux headless chromium"
    depends_on: "infer_chromium_version"
    key: build_chromium
    env:
      DEBIAN_FRONTEND: noninteractive
    agents:
      image: family/kibana-ubuntu-2004
      imageProject: elastic-images-prod
      provider: gcp
      machineType: c2d-highcpu-112
      diskSizeGb: 275
    matrix:
      - "x64"
      - "arm64"
  - command: node .buildkite/scripts/pipelines/chromium_linux_build/issue_feedback/entry.js
    label: Provide feedback on the issue created to update puppeteer
    depends_on: "build_chromium"
    env:
      KIBANA_MACHINE_USERNAME: kibanamachine
      KIBANA_MACHINE_EMAIL: 42973632+kibanamachine@users.noreply.github.com
    agents:
      provider: gcp
      image: family/kibana-ubuntu-2004
      imageProject: elastic-images-prod
      machineType: n2-standard-2
    timeout_in_minutes: 30


@@ -76,6 +76,9 @@ if [[ -z "$EMAIL" ]]; then
    "ci-artifacts.kibana.dev")
      EMAIL="kibana-ci-access-artifacts@$GCLOUD_EMAIL_POSTFIX"
      ;;
    "kibana-ci-access-chromium-blds")
      EMAIL="kibana-ci-access-chromium-blds@$GCLOUD_EMAIL_POSTFIX"
      ;;
    *)
      EMAIL="$BUCKET_NAME@$GCLOUD_EMAIL_POSTFIX"
      ;;


@@ -0,0 +1,81 @@
#!/usr/bin/env bash

set -euo pipefail

CHROMIUM_COMMIT_HASH=$(buildkite-agent meta-data get "chromium_commit_hash")

echo "---Preparing to build Chromium of commit hash: $CHROMIUM_COMMIT_HASH"

BUILD_ROOT_DIR="$HOME/chromium"
KIBANA_CHECKOUT_DIR="$(pwd)"
BUILD_SCRIPT="$KIBANA_CHECKOUT_DIR/x-pack/build_chromium"

# Create a dedicated working directory outside of the default buildkite working directory.
mkdir "$BUILD_ROOT_DIR" && cd "$BUILD_ROOT_DIR"

ARTIFACT_STAGING_STORAGE_BUCKET="gs://headless_shell_staging"
ARTIFACT_PROD_STORAGE_BUCKET="gs://headless_shell"
ARTIFACT_QUERY="chromium-${CHROMIUM_COMMIT_HASH:0:7}-.*_$PLATFORM_VARIANT"

## Impersonate the service account that has access to our storage bucket
"$KIBANA_CHECKOUT_DIR/.buildkite/scripts/common/activate_service_account.sh" "kibana-ci-access-chromium-blds"

# Query to determine if the expected build artifact from a prior build exists;
# the build.py script uploads the build artifact to the staging bucket
artifacts=$(gsutil ls "$ARTIFACT_STAGING_STORAGE_BUCKET" | grep "$ARTIFACT_QUERY" || true)

if [[ -z "$artifacts" ]]; then
  echo "No files found matching the query: $ARTIFACT_QUERY"
  echo "---Chromium build does not exist in the bucket, proceeding with the build"

  # Install the required packages for building chromium
  sudo apt-get update && \
    sudo apt-get install -y curl git build-essential pkg-config gcc gperf python3 && \
    sudo rm -rf /var/lib/apt/lists/*

  BUILD_SCRIPT_SYMLINK="$BUILD_ROOT_DIR/build_scripts"

  # Link the existing build_chromium directory from kibana
  ln -s "$BUILD_SCRIPT" "$BUILD_SCRIPT_SYMLINK"

  # Allow our scripts to use depot_tools commands
  export PATH=$BUILD_ROOT_DIR/depot_tools:$PATH

  # Install the OS packages, configure the environment, download the chromium source (56GB)
  python3 "$BUILD_SCRIPT_SYMLINK/init.py"

  echo "---Building $PLATFORM_VARIANT Chromium of commit hash: $CHROMIUM_COMMIT_HASH"

  # Run the build script with the path to the chromium src directory and the git commit hash
  python3 "$BUILD_SCRIPT_SYMLINK/build.py" "$CHROMIUM_COMMIT_HASH" "$PLATFORM_VARIANT"

  echo "---Upload build artifact to prod storage bucket"
  gsutil cp "$BUILD_ROOT_DIR/chromium/src/out/headless/chromium-*" "$ARTIFACT_PROD_STORAGE_BUCKET"

  echo "---Persisting build artifact to buildkite for following steps"
  buildkite-agent artifact upload "$BUILD_ROOT_DIR/chromium/src/out/headless/chromium-*"
else
  echo "$artifacts" | while read -r file; do
    gsutil cp "$file" .
  done

  shopt -s nullglob
  files=(chromium-*)
  shopt -u nullglob

  if [ ${#files[@]} -gt 0 ]; then
    echo "---Chromium build already exists in the bucket, skipping build"
    # Upload the existing build artifact to buildkite so it can be used in the next steps,
    # and be accessible without necessarily having access to the storage bucket itself
    buildkite-agent artifact upload "chromium-*"
  fi
fi

echo "---Build completed"


@@ -0,0 +1,14 @@
#!/usr/bin/env bash

set -euo pipefail

source .buildkite/scripts/common/util.sh
.buildkite/scripts/bootstrap.sh

echo "---Attempting to compute chromium version for provided puppeteer version"

CHROMIUM_VERSION_OUTPUT=$(node scripts/chromium_version "$PUPPETEER_VERSION")

echo "$CHROMIUM_VERSION_OUTPUT" | grep -i "chromium commit" | awk '{print $5}' | buildkite-agent meta-data set "chromium_commit_hash"
echo "$CHROMIUM_VERSION_OUTPUT" | grep -i "chrome version" | awk '{print $5}' | buildkite-agent meta-data set "chromium_version"
echo "$CHROMIUM_VERSION_OUTPUT" | grep -i "chromium revision" | awk '{print $5}' | buildkite-agent meta-data set "chromium_revision"

File diff suppressed because one or more lines are too long.


@@ -0,0 +1,375 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the "Elastic License
 * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side
 * Public License v 1"; you may not use this file except in compliance with, at
 * your election, the "Elastic License 2.0", the "GNU Affero General Public
 * License v3.0 only", or the "Server Side Public License, v 1".
 */

const { parse, resolve } = require('path');
const { readFileSync, createWriteStream } = require('fs');
const assert = require('assert');
const { execFile } = require('child_process');
const { finished } = require('stream').promises;

const axios = require('axios');
const fg = require('fast-glob');
const AdmZip = require('adm-zip');

/**
 * @typedef DownloadDefinition
 * @property {'linux64' | 'mac-arm64' | 'mac-x64' | 'win32' | 'win64'} platform
 * @property {string} url
 */

/**
 * @description defines the structure of the response returned from https://googlechromelabs.github.io/chrome-for-testing/known-good-versions-with-downloads.json
 * @typedef VersionDefinition
 * @property {string} version
 * @property {string} revision
 * @property {{
 *  chrome: DownloadDefinition[]
 *  'chrome-headless-shell': DownloadDefinition[] | undefined
 * }} downloads
 */

/**
 * @description helper to invoke binaries on the machine running this script
 * @param {string} file
 * @param {string[]} fileArguments
 * @param {import('child_process').ExecFileOptions & { printToScreen?: boolean }} options
 */
const $ = async (file, fileArguments, { printToScreen, ...options } = {}) => {
  const { stdout } = await new Promise((resolve, reject) => {
    const process = execFile(file, fileArguments, options, (error, stdout, stderr) => {
      if (error) {
        reject(error);
      }
      resolve({ stdout, stderr });
    });

    if (printToScreen) {
      process.stdout?.on('data', console.log);
      process.stderr?.on('data', console.error);
    }
  });

  return stdout.trim();
};

/**
 * @param {string} filePath
 * @returns {Promise<string>}
 */
const getSha256Hash = async (filePath) => {
  // sha256sum returns the hash and the file name, we only need the hash
  const [hash] = (await $('sha256sum', [filePath])).split(/\s/);
  return hash;
};

(async () => {
  assert.ok(process.env.PUPPETEER_VERSION, 'PUPPETEER_VERSION is not defined');
  assert.ok(process.env.KIBANA_MACHINE_USERNAME, 'KIBANA_MACHINE_USERNAME is not defined');
  assert.ok(process.env.KIBANA_MACHINE_EMAIL, 'KIBANA_MACHINE_EMAIL is not defined');
  assert.ok(process.env.GITHUB_ISSUE_NUMBER, 'GITHUB_ISSUE_NUMBER is not defined');
  assert.ok(process.env.GITHUB_ISSUE_BASE_REPO, 'GITHUB_ISSUE_BASE_REPO is not defined');
  assert.ok(process.env.GITHUB_ISSUE_BASE_OWNER, 'GITHUB_ISSUE_BASE_OWNER is not defined');
  assert.ok(process.env.GITHUB_ISSUE_TRIGGER_USER, 'GITHUB_ISSUE_TRIGGER_USER is not defined');
  assert.ok(process.env.BUILDKITE_BUILD_ID, 'BUILDKITE_BUILD_ID is not defined');
  assert.ok(process.env.BUILDKITE_BUILD_URL, 'BUILDKITE_BUILD_URL is not defined');

  const buildRoot = process.cwd();
  const prTitle = `[Reporting] Update puppeteer to version ${process.env.PUPPETEER_VERSION}`;
  // branch name which is unique to the current puppeteer version
  const branchName = `chore/puppeteer_${process.env.PUPPETEER_VERSION}_update`;

  let openPRForVersionUpgrade = await $(
    'gh',
    [
      'pr',
      'list',
      '--search',
      String(`is:open is:pr head:${branchName} base:main`),
      '--author',
      process.env.KIBANA_MACHINE_USERNAME,
      '--limit',
      '1',
      '--json',
      'url',
      '-q',
      '.[].url',
    ],
    { printToScreen: true }
  );

  if (openPRForVersionUpgrade) {
    console.log(
      '---Found existing PR for version upgrade, skipping PR creation to leave a comment \n'
    );

    await $(
      'ts-node',
      [
        resolve(buildRoot, '.buildkite/scripts/lifecycle/comment_on_pr.ts'),
        '--message',
        String(
          `An existing PR has already been created to upgrade puppeteer to the requested version at ${openPRForVersionUpgrade}`
        ),
        '--issue-number',
        process.env.GITHUB_ISSUE_NUMBER,
        '--repository',
        process.env.GITHUB_ISSUE_BASE_REPO,
        '--repository-owner',
        process.env.GITHUB_ISSUE_BASE_OWNER,
        '--context',
        'chromium-linux-build-pr-exists',
        '--clear-previous',
      ],
      { printToScreen: true }
    );

    return;
  }

  // configure git
  await $('git', ['config', '--global', 'user.name', process.env.KIBANA_MACHINE_USERNAME]);
  await $('git', ['config', '--global', 'user.email', process.env.KIBANA_MACHINE_EMAIL]);

  // create a new branch to update puppeteer
  await $('git', ['checkout', '-b', `${branchName}_temp`]);

  console.log('---Updating puppeteer package to version %s', process.env.PUPPETEER_VERSION);

  await $('yarn', ['add', `puppeteer@${process.env.PUPPETEER_VERSION}`]);
  await $('yarn', ['kbn', 'bootstrap']);

  await $('git', ['add', 'package.json', 'yarn.lock']);
  await $('git', [
    'commit',
    '-m',
    `chore: update puppeteer to version ${process.env.PUPPETEER_VERSION}`,
  ]);

  // switch working dir to the screenshotting server directory
  process.chdir('src/platform/packages/private/kbn-screenshotting-server/src');
  await $('mkdir', ['-p', 'chromium']);
  process.chdir('chromium');

  const response = await axios.get(
    'https://googlechromelabs.github.io/chrome-for-testing/known-good-versions-with-downloads.json'
  );

  /**
   * @description list of known good versions of chromium provided by google
   * @type {{
   *  timestamp: string
   *  versions: VersionDefinition[]
   * }}
   */
  const { versions } = response.data;

  const chromiumVersion = await $('buildkite-agent', ['meta-data', 'get', 'chromium_version'], {
    printToScreen: true,
  });
  const chromiumRevision = await $('buildkite-agent', ['meta-data', 'get', 'chromium_revision'], {
    printToScreen: true,
  });

  const matchedChromeConfig = versions.find(
    (v) => v.version === chromiumVersion && v.revision === chromiumRevision
  );

  assert.notStrictEqual(
    matchedChromeConfig?.downloads?.['chrome-headless-shell']?.length,
    0,
    `Failed to find a known good version for chromium ${chromiumVersion}`
  );

  /**
   * @type {import('./transform_path_file').ChromiumUpdateConfigMap}
   */
  const config = {};

  await Promise.all(
    (matchedChromeConfig?.downloads['chrome-headless-shell'] ?? []).map(async (download) => {
      if (
        download.platform === 'win64' ||
        download.platform === 'mac-x64' ||
        download.platform === 'mac-arm64'
      ) {
        console.log(`---Attempting download for ${download.platform} chrome-headless-shell \n`);

        const url = new URL(download.url);
        const downloadResponse = await axios.get(url.toString(), { responseType: 'stream' });
        const downloadFileName = parse(url.pathname).base;

        downloadResponse.data.pipe(createWriteStream(downloadFileName));
        await finished(downloadResponse.data);

        console.log(`---Extracting and computing checksum for ${downloadFileName}\n`);

        const archiveChecksum = await getSha256Hash(downloadFileName);

        const zip = new AdmZip(downloadFileName);
        zip.extractAllTo('.', true);

        const binaryChecksum = await getSha256Hash(
          `${parse(downloadFileName).name}/chrome-headless-shell${
            /^win/.test(download.platform) ? '.exe' : ''
          }`
        );

        config[download.platform.replace('-', '_')] = {
          archiveChecksum,
          binaryChecksum,
        };
      }
    })
  );

  console.log('--Attempting download of persisted artifacts for linux chromium from prior step');

  await Promise.all(
    ['arm64', 'x64'].map(async (arch) => {
      await $(
        'buildkite-agent',
        [
          'artifact',
          'download',
          `*${arch}.*`,
          '.',
          '--build',
          String(process.env.BUILDKITE_BUILD_ID),
        ],
        { printToScreen: true }
      );

      const linuxVariantBuildArtifact = await fg(`chromium-*_${arch}.*`);

      assert(
        linuxVariantBuildArtifact.length,
        'linux build artifacts files from prior step not found...'
      );

      const match = linuxVariantBuildArtifact.find((artifact) =>
        RegExp(String.raw`${arch}\.zip`).test(artifact)
      );

      assert.ok(match, `No artifacts containing linux headless shell binaries found for ${arch}`);

      const archiveChecksum = await getSha256Hash(match);

      assert.strictEqual(
        archiveChecksum,
        readFileSync(`${parse(match).name}.sha256`, 'utf-8').split(/\s/)[0],
        'Checksum mismatch'
      );

      const zip = new AdmZip(match);
      zip.extractAllTo('.', true);

      const binaryChecksum = await getSha256Hash(`headless_shell-linux_${arch}/headless_shell`);

      config[`linux_${arch}`] = {
        archiveFilename: match,
        archiveChecksum,
        binaryChecksum,
      };
    })
  );

  console.log('--Modifying paths.ts file\n');

  await $(
    'npx',
    [
      '--',
      'jscodeshift',
      '--extensions=ts',
      '--parser=tsx',
      '--fail-on-error',
      `--transform`,
      resolve(
        buildRoot,
        '.buildkite/scripts/pipelines/chromium_linux_build/issue_feedback/transform_path_file.js'
      ),
      '../paths.ts',
      `--updateConfig=${JSON.stringify(config)}`,
      `--chromiumVersion=${chromiumVersion}`,
    ],
    { printToScreen: true }
  );

  await $('git', ['add', '../paths.ts']);
  await $('git', ['commit', '-m', 'chore: update chromium paths']);

  // create clean branch based off of main
  await $('git', ['checkout', 'main']);
  await $('git', ['checkout', '-b', branchName]);

  // cherry-pick the two commits we made to the temp branch
  await $('git', ['cherry-pick', `${branchName}_temp~1`, `${branchName}_temp~0`], {
    printToScreen: true,
  });

  await $('git', ['push', 'origin', branchName], {
    printToScreen: true,
  });

  openPRForVersionUpgrade = await $(
    'gh',
    [
      'pr',
      'create',
      '--title',
      prTitle,
      '--body',
      `Closes #${process.env.GITHUB_ISSUE_NUMBER} \n\n This PR updates puppeteer to version \`${process.env.PUPPETEER_VERSION}\` and updates the chromium paths to the latest known good version for windows and mac where the chromium revision is \`${chromiumRevision}\` and version is \`${chromiumVersion}\`.\nFor linux a custom build was triggered to build chromium binaries for both x64 and arm64. \n\n\n **NB** This PR should be tested before merging it in as puppeteer might have breaking changes we are not aware of`,
      '--base',
      'main',
      '--head',
      branchName,
      '--label',
      'release_note:skip',
      '--label',
      'Team:SharedUX',
      '--label',
      'Team:ResponseOps',
      '--label',
      'backport',
      '--label',
      'Feature:Reporting:Screenshot',
      '--assignee',
      process.env.GITHUB_ISSUE_TRIGGER_USER,
    ],
    {
      printToScreen: true,
    }
  );

  console.log('---Providing feedback to issue \n');

  await $(
    'ts-node',
    [
      resolve(buildRoot, '.buildkite/scripts/lifecycle/comment_on_pr.ts'),
      '--message',
      String.raw`@${process.env.GITHUB_ISSUE_TRIGGER_USER} a PR has been created to upgrade puppeteer to the requested version at ${openPRForVersionUpgrade}`,
      '--issue-number',
      process.env.GITHUB_ISSUE_NUMBER,
      '--repository',
      process.env.GITHUB_ISSUE_BASE_REPO,
      '--repository-owner',
      process.env.GITHUB_ISSUE_BASE_OWNER,
      '--context',
      'chromium-linux-build-diff',
      '--clear-previous',
    ],
    { printToScreen: true }
  );
})();


@@ -0,0 +1,208 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the "Elastic License
 * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side
 * Public License v 1"; you may not use this file except in compliance with, at
 * your election, the "Elastic License 2.0", the "GNU Affero General Public
 * License v3.0 only", or the "Server Side Public License, v 1".
 */

// @ts-check
const assert = require('assert');

/**
 * @typedef ChromiumUpdateConfig
 * @property {string} archiveChecksum
 * @property {string} binaryChecksum
 */

/**
 * @typedef ExtraChromiumUpdateConfig
 * @property {string} archiveFilename
 */

/**
 * @typedef ChromiumUpdateConfigMap
 * @property {ChromiumUpdateConfig} mac_arm64
 * @property {ChromiumUpdateConfig} mac_x64
 * @property {ChromiumUpdateConfig} win64
 * @property {ChromiumUpdateConfig & ExtraChromiumUpdateConfig} linux_arm64
 * @property {ChromiumUpdateConfig & ExtraChromiumUpdateConfig} linux_x64
 */

/**
 * @typedef transformOptions
 * @property {string} chromiumVersion
 * @property {ChromiumUpdateConfigMap} updateConfig
 */

/**
 * @param {*} file
 * @param {import('jscodeshift').API} api
 * @param {transformOptions} options
 * @returns
 */
module.exports = function transformer(file, api, options) {
  const j = api.jscodeshift;

  assert.ok(Object.values(options).length, 'Expected options to be defined');
  assert.ok(options.chromiumVersion, 'Expected version to be defined');
  assert.ok(options.updateConfig, 'Expected updateConfig to be defined');
  assert.ok(options.updateConfig.linux_arm64, 'Expected linux_arm64 update config to be defined');
  assert.ok(options.updateConfig.linux_x64, 'Expected linux_x64 update config to be defined');
  assert.ok(options.updateConfig.mac_arm64, 'Expected mac_arm64 update config to be defined');
  assert.ok(options.updateConfig.mac_x64, 'Expected mac_x64 update config to be defined');
  assert.ok(options.updateConfig.win64, 'Expected win64 update config to be defined');

  const root = j(file.source);

  const packagesPropertyDefinition = root
    .find(j.ClassDeclaration, {
      id: { name: 'ChromiumArchivePaths' },
    })
    .find(j.ClassProperty, {
      key: { name: 'packages' },
    });

  assert(
    packagesPropertyDefinition.size() === 1,
    'Expected to find a single packages definition on ChromiumArchivePaths'
  );

  const packagesArray = packagesPropertyDefinition.find(j.ArrayExpression);

  /**
   * @param {import('jscodeshift').ObjectExpression['properties'][number]} property
   * @param {string} propertyKey
   * @param {string} propertyValue
   */
  const setPropertyValue = (property, propertyKey, propertyValue) => {
    if (
      property.type === 'ObjectProperty' &&
      property.key.type === 'Identifier' &&
      property.key.name === propertyKey
    ) {
      property.value = j.literal(propertyValue);
    }
  };

  // Traverse each package object in the array
  packagesArray.find(j.ObjectExpression).forEach((path) => {
    path.node.properties.forEach((property, _, _properties) => {
      if (
        property.type === 'ObjectProperty' &&
        property.key.type === 'Identifier' &&
        property.key.name === 'version'
      ) {
        property.value = j.literal(options.chromiumVersion);
      }

      if (
        property.type === 'ObjectProperty' &&
        property.key.type === 'Identifier' &&
        property.key.name === 'archivePath' &&
        property.value.type === 'StringLiteral' &&
        property.value.value === 'mac-x64'
      ) {
        _properties.forEach((property) => {
          setPropertyValue(
            property,
            'archiveChecksum',
            options.updateConfig.mac_x64.archiveChecksum
          );
          setPropertyValue(property, 'binaryChecksum', options.updateConfig.mac_x64.binaryChecksum);
        });
      }

      if (
        property.type === 'ObjectProperty' &&
        property.key.type === 'Identifier' &&
        property.key.name === 'archivePath' &&
        property.value.type === 'StringLiteral' &&
        property.value.value === 'mac-arm64'
      ) {
        _properties.forEach((property) => {
          setPropertyValue(
            property,
            'archiveChecksum',
            options.updateConfig.mac_arm64.archiveChecksum
          );
          setPropertyValue(
            property,
            'binaryChecksum',
            options.updateConfig.mac_arm64.binaryChecksum
          );
        });
      }

      if (
        property.type === 'ObjectProperty' &&
        property.key.type === 'Identifier' &&
        property.key.name === 'archivePath' &&
        property.value.type === 'StringLiteral' &&
        property.value.value === 'win64'
      ) {
        _properties.forEach((property) => {
          setPropertyValue(property, 'archiveChecksum', options.updateConfig.win64.archiveChecksum);
          setPropertyValue(property, 'binaryChecksum', options.updateConfig.win64.binaryChecksum);
        });
      }

      if (
        property.type === 'ObjectProperty' &&
        property.key.type === 'Identifier' &&
        property.key.name === 'binaryRelativePath' &&
        property.value.type === 'StringLiteral' &&
        /linux_x64/.test(property.value.value)
      ) {
        _properties.forEach((property) => {
          setPropertyValue(
            property,
            'archiveChecksum',
            options.updateConfig.linux_x64.archiveChecksum
          );
          setPropertyValue(
            property,
            'binaryChecksum',
            options.updateConfig.linux_x64.binaryChecksum
          );
          setPropertyValue(
            property,
            'archiveFilename',
            options.updateConfig.linux_x64.archiveFilename
          );
        });
      }

      if (
        property.type === 'ObjectProperty' &&
        property.key.type === 'Identifier' &&
        property.key.name === 'binaryRelativePath' &&
        property.value.type === 'StringLiteral' &&
        /linux_arm64/.test(property.value.value)
      ) {
        _properties.forEach((property) => {
          setPropertyValue(
            property,
            'archiveChecksum',
            options.updateConfig.linux_arm64.archiveChecksum
          );
          setPropertyValue(
            property,
            'binaryChecksum',
            options.updateConfig.linux_arm64.binaryChecksum
          );
          setPropertyValue(
            property,
            'archiveFilename',
            options.updateConfig.linux_arm64.archiveFilename
          );
        });
      }
    });
  });

  return root.toSource();
};


@@ -0,0 +1,111 @@
/*
 * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
 * or more contributor license agreements. Licensed under the "Elastic License
 * 2.0", the "GNU Affero General Public License v3.0 only", and the "Server Side
 * Public License v 1"; you may not use this file except in compliance with, at
 * your election, the "Elastic License 2.0", the "GNU Affero General Public
 * License v3.0 only", or the "Server Side Public License, v 1".
 */

import { resolve } from 'path';
import { readFileSync } from 'fs';

import chai, { expect, assert } from 'chai';
import snapshots from 'chai-snapshot-tests';
// @ts-expect-error test utils is defined and exists, see https://github.com/facebook/jscodeshift#applytransform
import { applyTransform } from 'jscodeshift/dist/testUtils';

import pathFileTransform from './transform_path_file';

// read contents of the path file our transform is built to update
const pathFileContents = readFileSync(
  resolve(process.cwd(), '../src/platform/packages/private/kbn-screenshotting-server/src/paths.ts'),
  'utf8'
);

/**
 * @type {import('./transform_path_file').transformOptions}
 */
const transformOptions = {
  chromiumVersion: '130.6943.126',
  updateConfig: {
    mac_x64: {
      archiveChecksum: '2e64a158419165ceee5db0b57703777bf21470f2d9656bbf100f54ebe059f695',
      binaryChecksum: '53dbb5e3d4327c980d7bb6dbcb6bd6f73b1de573925a2d4dab010d6cafcc3bbc',
    },
    mac_arm64: {
      archiveChecksum: '51645431ecc1d843d4fdc34f3817ca2a4ac7c3b4450eb9f3117f806ebaa78487',
      binaryChecksum: '35f42c93856df90bd01bc809e8a32bffb25a48c83d7cc2feb9af6e2376f7fc65',
    },
    win64: {
      archiveChecksum: '4fd9484cf67790b5bbff39be62d5835f6848a326a68b4be1b83dc22a4336efa1',
      binaryChecksum: '46054cfc2be47f7822008e29674baefd82912cdae107fbe07027cbe84622c0b9',
    },
    linux_x64: {
      archiveFilename: 'chromium-cffa127-locales-linux_x64.zip',
      archiveChecksum: '082d3bcabe0a04c4ec7f90d8e425f9c63147015964aa0d3b59a1cccd66571939',
      binaryChecksum: 'a22ecc374131998d7ed05b2f433a1a8a819e3ae3b9c4dfa92311cf11ac9e34e1',
    },
    linux_arm64: {
      archiveFilename: 'chromium-cffa127-locales-linux_arm64.zip',
      archiveChecksum: '571437335b3b867207650390ca8827ea71a58a842f7bb22bbb497a1266324431',
      binaryChecksum: '68dafc4ae03cc4c2812e94f61f62db72a7dcde95754d817594bf25e3862647be',
    },
  },
};

const runnerOptions = {
  parser: 'tsx',
  extensions: 'ts',
};

describe('transform_path_file', () => {
  before(() => {
    chai.use(snapshots(__filename));
  });

  it('throws an error if options are missing', () => {
    expect(() => {
      applyTransform(pathFileTransform, {}, pathFileContents, runnerOptions);
    }).to.throw('Expected options to be defined');
  });

  it('throws an error if chromiumVersion is missing', () => {
    expect(() => {
      applyTransform(
        pathFileTransform,
        {
          updateConfig: transformOptions.updateConfig,
        },
        { source: pathFileContents },
        runnerOptions
      );
    }).to.throw('Expected version to be defined');
  });

  it('throws an error if updateConfig is missing', () => {
    expect(() => {
      applyTransform(
        pathFileTransform,
        {
          chromiumVersion: transformOptions.chromiumVersion,
        },
        { source: pathFileContents },
        runnerOptions
      );
    }).to.throw('Expected updateConfig to be defined');
  });

  // This test fails because a change was made to the `kbn-screenshotting-server/src/paths.ts` file that is not reflected
  // in the transform being tested.
  // See entry.js for an example of how you might verify that the script creates the right modification in the paths file;
  // when you've made changes that match the expected output, update the test snapshot to match the new output.
  it('transform output matches our expectation', () => {
    const output = applyTransform(
      pathFileTransform,
      transformOptions,
      { source: pathFileContents },
      runnerOptions
    );

    assert.snapshot('updated_paths_file', output);
  });
});


@@ -0,0 +1,133 @@
name: Trigger Chromium Build

on:
  issues:
    types: [labeled, opened, reopened, edited, closed]

permissions:
  contents: read
  issues: write

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  matches_label:
    if: |
      github.event.label.name == 'trigger-chromium-build'
      && github.event.issue.author_association == 'MEMBER' && github.event.action != 'closed'
    runs-on: ubuntu-latest
    outputs:
      version_bump_config: ${{ steps.extract_version_bump_config.outputs.result }}
    steps:
      - name: Install Dependencies
        run: |
          apt-get update && apt-get install jq wget -y
      - name: Install mdsh
        run: |
          wget https://github.com/bashup/mdsh/raw/master/bin/mdsh && \
          chmod +x mdsh && \
          export PATH=$PATH:mdsh
      - name: Extract version bump configuration
        id: extract_version_bump_config
        run: |
          # mdsh-* functions are informed from https://github.com/bashup/mdsh/blob/master/mdsh.md

          # Function to parse markdown and extract fenced code blocks
          mdsh-parse() {
            local cmd=$1 lno=0 block_start lang mdsh_block ln indent fence close_fence indent_remove
            local mdsh_fence=$'^( {0,3})(~~~+|```+) *([^`]*)$'
            while mdsh-find-block; do
              indent=${BASH_REMATCH[1]} fence=${BASH_REMATCH[2]} lang=${BASH_REMATCH[3]} mdsh_block=
              block_start=$lno close_fence="^( {0,3})$fence+ *\$" indent_remove="^${indent// / ?}"
              while ((lno++)); IFS= read -r ln && ! [[ $ln =~ $close_fence ]]; do
                ! [[ $ln =~ $indent_remove ]] || ln=${ln#${BASH_REMATCH[0]}}; mdsh_block+=$ln$'\n'
              done
              lang="${lang%\"${lang##*[![:space:]]}\"}"; "$cmd" fenced "$lang" "$mdsh_block"
            done
          }

          # Function to find fenced code blocks
          mdsh-find-block() {
            while ((lno++)); IFS= read -r ln; do if [[ $ln =~ $mdsh_fence ]]; then return; fi; done; false
          }

          # Variable to hold the upgrade config if found
          UPGRADE_CONFIG=""
          # Variable to track if multiple puppeteer_version properties are found
          PUPPETEER_VERSION_COUNT=0

          # Handler function to process each code block
          handle_code_block() {
            local block_type=$1
            local lang=$2
            local block_content=$3

            if [[ $lang == "json" ]]; then
              # Remove leading and trailing whitespace from the block content
              block_content=$(echo "$block_content" | sed -e 's/^[[:space:]]*//' -e 's/[[:space:]]*$//')

              # Check for the puppeteer_version property
              if echo "$block_content" | jq -e 'has("puppeteer_version")' > /dev/null; then
                PUPPETEER_VERSION_COUNT=$((PUPPETEER_VERSION_COUNT + 1))
                if [[ $PUPPETEER_VERSION_COUNT -gt 1 ]]; then
                  # Mark multiple puppeteer_version properties found, exit with error
                  echo "multiple_puppeteer_version_properties=true" >> $GITHUB_OUTPUT
                  exit 0
                fi
                UPGRADE_CONFIG="$block_content"
              fi
            fi
          }

          # Main function to extract code blocks from a markdown file
          extract_code_blocks() {
            mdsh-parse handle_code_block

            # Output the upgrade config if found
            if [[ -n "$UPGRADE_CONFIG" ]]; then
              echo "version_bump_config=$UPGRADE_CONFIG" >> $GITHUB_OUTPUT
            fi
          }

          # Attempt extracting the puppeteer version bump config from the issue body;
          # the single quote is very much intentional here so it's not escaped by bash
          echo '${{ github.event.issue.body }}' | extract_code_blocks
      - name: Report multiple puppeteer_version properties
        if: ${{ steps.extract_version_bump_config.outputs.multiple_puppeteer_version_properties == 'true' }}
        uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea #v7.0.1
        with:
          script: |
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: 'Multiple JSON blocks with puppeteer_version found. Please provide only one JSON block with the puppeteer_version property.'
            })
      - name: Report missing puppeteer_version properties
        if: ${{ steps.extract_version_bump_config.outputs.result == '' }}
        uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea #v7.0.1
        with:
          script: |
            github.rest.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: 'No puppeteer_version property found in the issue body. Please provide the puppeteer_version property in a JSON code block.'
            })
  trigger_build:
    runs-on: ubuntu-latest
    needs: [matches_label]
    if: ${{ needs.matches_label.outputs.version_bump_config }}
    steps:
      - uses: elastic/oblt-actions/buildkite/run@v1
        with:
          token: ${{ secrets.SHAREDUX_BUILDKITE_TOKEN }}
          pipeline: 'kibana-chromium-linux-build'
          env-vars: |
            GITHUB_ISSUE_BASE_OWNER=${{ github.event.repository.owner.name }}
            GITHUB_ISSUE_BASE_REPO=${{ github.event.repository.name }}
            GITHUB_ISSUE_NUMBER=${{ github.event.issue.number }}
            GITHUB_ISSUE_LABELS=${{ github.event.issue.labels }} | jq -r '. | map(.name) | join(",")'
            GITHUB_ISSUE_TRIGGER_USER=${{ github.event.issue.user.login }}
            PUPPETEER_VERSION=$(echo ${{ needs.matches_label.outputs.version_bump_config }} | jq -r '.puppeteer_version')


@@ -38,6 +38,7 @@ if arch_name != 'x64' and arch_name != 'arm64':
  raise Exception('Unexpected architecture: ' + arch_name + '. `x64` and `arm64` are supported.')

print('Fetching locale files')
# TODO: move this into the repo itself, so we are only writing the build output to the bucket
runcmd('gsutil cp gs://headless_shell_staging/en-US.pak .')

print('Building Chromium ' + source_version + ' for ' + arch_name + ' from ' + src_path)