kibana/x-pack/build_chromium/build.py
Eyo O. Eyo 2fd65fba64
[On-Week] Automate chromium linux builds for reporting in Kibana (#212674)
## Summary

Closes https://github.com/elastic/kibana/issues/38277

This PR ~attempts to take a stab at automating~ automates the process
of building the headless Chromium Linux binary used for reporting within
Kibana.

The idea here is that one would only need to create an issue in the
Kibana repository to trigger a Chromium build; we leverage GitHub
Actions for this. The workflow defined here only kicks in when an issue
is created with the label `trigger-chromium-build`. When such an issue
is created, its body is analysed: it is expected to contain a fenced
JSON code block specifying a `puppeteer_version` property. If that
property is found, its value is read and a Buildkite build is triggered
with it. If multiple fenced code blocks specify `puppeteer_version`, a
comment is left asking that the issue body be modified; likewise, if the
label was added but the expected property is not found, a comment is
left prompting that it be provided.
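
The parsing step described above can be sketched roughly as follows. This is a hypothetical helper, not the workflow's actual implementation; it assumes `~~~`-fenced JSON blocks as in the sample payload shown later in this description:

```python
import json
import re

# Hypothetical sketch of the issue-body parsing described above -- the
# workflow's actual implementation may differ. Matches ~~~-fenced json
# blocks and collects any that define "puppeteer_version".
FENCED_JSON = re.compile(r"~~~json\s*(\{.*?\})\s*~~~", re.DOTALL)

def extract_puppeteer_version(issue_body):
    versions = []
    for block in FENCED_JSON.findall(issue_body):
        try:
            data = json.loads(block)
        except ValueError:
            # Ignore fenced blocks that are not valid JSON
            continue
        if "puppeteer_version" in data:
            versions.append(data["puppeteer_version"])
    if not versions:
        return None, "please provide a `puppeteer_version` JSON block"
    if len(versions) > 1:
        return None, "multiple `puppeteer_version` blocks found, please keep only one"
    return versions[0], None
```

A body with exactly one qualifying block yields the version; zero or multiple blocks yield an error message that the workflow can post back as a comment.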

Once the build has commenced, a comment containing a link to the build
is posted as initial feedback to the user. On completion, another
comment is posted with a link to the PR containing all the changes
required to update Puppeteer to the specified version.

~It's also worth pointing out that this PR also modifies the source for
the `win` and `mac` chromium binaries to leverage the [JSON API
endpoint](https://github.com/GoogleChromeLabs/chrome-for-testing#json-api-endpoints)
provided by Google, so the required headless chromium binaries are
fetched in a deterministic way, which in turn is part of what makes this
automation possible.~

## How to test this

- If you'd like to test the GitHub Action without the PR being merged
just yet, consider setting up [act](https://github.com/nektos/act),
alongside its companion [vscode
extension](https://sanjulaganepola.github.io/github-local-actions-docs/).
We'll then want to create a payload file providing data similar to what
GitHub would return for our workflow trigger; more info about setting
this up is
[here](https://sanjulaganepola.github.io/github-local-actions-docs/usage/settings/#payloads).
The payload file we'd want would be something along the following lines:

```json
{
    "action": "labeled",
    "label":{
        "name": "trigger-chromium-build"
    },
    "issue": {
        "number": 1,
        "title": "Issue 1",
        "author_association": "MEMBER",
        "labels": [
            {
                "name": "trigger-chromium-build"
            }
        ],
        "body": "\n## Random text \n\n ~~~json\n{\n  \"puppeteer_version\": \"24.6.1\" \n}\n~~~\n~~~json\n{\n  \"some_random_value\": \"23.0.1\" \n}\n~~~"
    }
}
```

- To test the actual build process, it can be initiated through this
specific pipeline
https://buildkite.com/elastic/kibana-migration-pipeline-staging by
creating a custom build on `pull/212674/head` (this pull request) with
env variables similar to this:
```
TESTED_PIPELINE_PATH=.buildkite/pipelines/chromium_linux_build/build_chromium.yml
PUPPETEER_VERSION=24.6.1
GITHUB_ISSUE_NUMBER=212732
GITHUB_ISSUE_BASE_OWNER=elastic
GITHUB_ISSUE_BASE_REPO=kibana
GITHUB_ISSUE_TRIGGER_USER=eokoneyo
```

PS: Issue #212732 is an issue created to test providing feedback to the
user about the build process; re-triggering a build on an existing issue
updates the comments in place.


---------

Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
2025-05-06 22:04:08 +02:00


import os, subprocess, sys, platform, zipfile, hashlib, shutil
from os import path
from build_util import (
    runcmd,
    runcmdsilent,
    sha256_file,
)
# This file builds Chromium headless on Linux.
# Verify that we have an argument, and if not print instructions
if len(sys.argv) < 2:
    print('Usage:')
    print('python build.py {chromium_version} {arch_name}')
    print('Example:')
    print('python build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479 x64')
    print('python build.py 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479 arm64  # cross-compile for ARM architecture')
    print()
    sys.exit(1)
src_path = path.abspath(path.join(os.curdir, 'chromium', 'src'))
build_path = path.abspath(path.join(src_path, '..', '..'))
en_us_locale_pak_file_name = 'en-US.pak'
en_us_locale_file_path = path.abspath(en_us_locale_pak_file_name)
build_chromium_path = path.abspath(path.dirname(__file__))
argsgn_file = path.join(build_chromium_path, platform.system().lower(), 'args.gn')
# The version of Chromium we wish to build. This can be any valid git
# commit, tag, or branch, so: 68.0.3440.106 or
# 4747cc23ae334a57a35ed3c8e6adcdbc8a50d479
source_version = sys.argv[1]
base_version = source_version[:7].strip('.')
# Set to 'arm64' to cross-compile for ARM on Linux
arch_name = sys.argv[2] if len(sys.argv) >= 3 else 'unknown'
if arch_name != 'x64' and arch_name != 'arm64':
    raise Exception('Unexpected architecture: ' + arch_name + '. `x64` and `arm64` are supported.')
print('Fetching locale files')
# TODO: move this into the repo itself, so we are only writing the build output to the bucket
runcmd('gsutil cp gs://headless_shell_staging/en-US.pak .')
print('Building Chromium ' + source_version + ' for ' + arch_name + ' from ' + src_path)
print('src path: ' + src_path)
print('depot_tools path: ' + path.join(build_path, 'depot_tools'))
print('build_chromium_path: ' + build_chromium_path)
print('args.gn file: ' + argsgn_file)
print()
# Sync the codebase to the correct version
print('Setting local tracking branch')
print(' > cd ' + src_path)
os.chdir(src_path)
checked_out = runcmdsilent('git checkout build-' + base_version)
if checked_out != 0:
    print('Syncing remote version')
    runcmd('git fetch origin ' + source_version)
    print('Creating a new branch for tracking the source version')
    runcmd('git checkout -b build-' + base_version + ' ' + source_version)
# configure environment: environment path
depot_tools_path = os.path.join(build_path, 'depot_tools')
full_path = depot_tools_path + os.pathsep + os.environ['PATH']
print('Updating PATH for depot_tools: ' + full_path)
os.environ['PATH'] = full_path
# configure environment: build dependencies
print('Running sysroot install script...')
runcmd(src_path + '/build/linux/sysroot_scripts/install-sysroot.py --arch=' + arch_name)
print('Updating all modules')
runcmd('gclient sync -D')
print('Setting up build directory')
runcmd('rm -rf out/headless')
runcmd('mkdir out/headless')
# Copy args.gn from the root of our directory to out/headless/args.gn,
# add the target_cpu for cross-compilation
print('Adding target_cpu to args')
argsgn_file_out = path.abspath('out/headless/args.gn')
runcmd('cp ' + argsgn_file + ' ' + argsgn_file_out)
runcmd('echo \'target_cpu="' + arch_name + '"\' >> ' + argsgn_file_out)
runcmd('gn gen out/headless')
# Build Chromium... this takes *forever* on underpowered VMs
print('Compiling... this will take a while')
runcmd('autoninja -C out/headless headless_shell')
# Optimize the output on Linux x64 by stripping inessentials from the binary.
# ARM is cross-compiled from Linux, and the x64 build host cannot strip the ARM binary.
if arch_name != 'arm64':
    print('Optimizing headless_shell')
    shutil.move('out/headless/headless_shell', 'out/headless/headless_shell_raw')
    runcmd('strip -o out/headless/headless_shell out/headless/headless_shell_raw')
# Create the zip and generate the sha256 hash using filenames like:
# chromium-4747cc2-linux_x64.zip
base_filename = 'out/headless/chromium-' + base_version + '-locales-' + platform.system().lower() + '_' + arch_name
zip_filename = base_filename + '.zip'
sha256_filename = base_filename + '.sha256'
print('Creating ' + path.join(src_path, zip_filename))
archive = zipfile.ZipFile(zip_filename, mode='w', compression=zipfile.ZIP_DEFLATED)
path_prefix = 'headless_shell-' + platform.system().lower() + '_' + arch_name
# Add dependencies that must be bundled with the Chromium executable.
archive.write('out/headless/headless_shell', path.join(path_prefix, 'headless_shell'))
archive.write('out/headless/libEGL.so', path.join(path_prefix, 'libEGL.so'))
archive.write('out/headless/libGLESv2.so', path.join(path_prefix, 'libGLESv2.so'))
archive.write('out/headless/libvk_swiftshader.so', path.join(path_prefix, 'libvk_swiftshader.so'))
archive.write('out/headless/libvulkan.so.1', path.join(path_prefix, 'libvulkan.so.1'))
archive.write('out/headless/vk_swiftshader_icd.json', path.join(path_prefix, 'vk_swiftshader_icd.json'))
archive.write(en_us_locale_file_path, path.join(path_prefix, 'locales', en_us_locale_pak_file_name))
archive.close()
print('Creating ' + path.join(src_path, sha256_filename))
with open(sha256_filename, 'w') as f:
    f.write(sha256_file(zip_filename))
runcmd('gsutil cp ' + path.join(src_path, zip_filename) + ' gs://headless_shell_staging')
runcmd('gsutil cp ' + path.join(src_path, sha256_filename) + ' gs://headless_shell_staging')