[Screenshotting] Darwin downloads Chromium at startup, add ARM support for Darwin (#122057)

* update puppeteer

* [Reporting] Use downloadable chromium build for the Mac OS headless browser

* fix test

* [Screenshotting] add new paths for mac browsers

* add bundled field to chromium download paths interface

* update build scripts

* polish

* log when skipping

* stop downloading all packages every time

* pass the PackageInfo object to the install function

* remove eslint-disable-line

* fix call to getBinaryPath

* polish

* simplify plugin.ts

* fix unit test

* fix lint

* tweak gulp task to explicitly download chromium for all platforms

* ignore chromium when copying x-pack source

* simplify

* update documentation with steps to manually download chromium

* Apply Documentation suggestions from code review

Co-authored-by: gchaps <33642766+gchaps@users.noreply.github.com>

* update docs to cut redundancy

* clean up link to manual browser download

* wording choice adjustment

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>
Co-authored-by: gchaps <33642766+gchaps@users.noreply.github.com>
Tim Sullivan 2022-01-05 21:44:29 -07:00 committed by GitHub
parent 6dc463da70
commit 4ba843c267
17 changed files with 206 additions and 187 deletions

View file

@@ -50,7 +50,7 @@ xpack.reporting.encryptionKey: "something_secret"
[[reporting-kibana-server-settings]]
==== {kib} server settings
Reporting opens the {kib} web interface in a server process to generate
For PNG and PDF reports, Reporting opens the {kib} web interface in a headless server process to generate
screenshots of {kib} visualizations. In most cases, the default settings
work and you don't need to configure the {report-features} to communicate with {kib}.
@@ -129,14 +129,11 @@ If capturing a report fails for any reason, {kib} will re-attempt other reportin
`xpack.reporting.capture.loadDelay`::
Specify the {time-units}[amount of time] before taking a screenshot when visualizations are not evented. All visualizations that ship with {kib} are evented, so this setting should not have much effect. If you are seeing empty images instead of visualizations, try increasing this value. Defaults to `3s`.
[[xpack-reporting-browser]] `xpack.reporting.capture.browser.type` {ess-icon}::
Specifies the browser to use to capture screenshots. This setting exists for backward compatibility. The only valid option is `chromium`.
[float]
[[reporting-chromium-settings]]
==== Chromium settings
When <<xpack-reporting-browser, `xpack.reporting.capture.browser.type`>> is set to `chromium` (default) you can also specify the following settings.
For PDF and PNG reports, Reporting spawns a headless Chromium browser process on the server to load and capture a screenshot of the {kib} app. When installing {kib} on Linux and Windows platforms, the Chromium binary comes bundled with the {kib} download. For Mac platforms, the Chromium binary is downloaded the first time {kib} is started.
`xpack.reporting.capture.browser.chromium.disableSandbox`::
It is recommended that you research the feasibility of enabling unprivileged user namespaces. An exception is if you are running {kib} in Docker because the container runs in a user namespace with the built-in seccomp/bpf filters. For more information, refer to <<reporting-chromium-sandbox>>. Defaults to `false` for all operating systems except Debian and Red Hat Linux, which use `true`.

View file

@@ -8,7 +8,15 @@
:keywords: administrator, analyst, concept, setup, reporting
:description: Consider the production components that are used to generate reports.
To generate reports, {kib} uses a custom build of the Chromium web browser, which runs on the {kib} server in headless mode to load {kib} and capture the rendered {kib} visualizations as images. Chromium is an open-source project not related to Elastic, but the Chromium binary for {kib} has been custom-built by Elastic to make sure it works with minimal setup. The operating system that the {kib} server uses can require additional dependencies for Chromium.
To generate reports, {kib} uses the Chromium web browser, which runs on the server in headless mode. Chromium is an open-source project not related to Elastic, and is embedded into {kib}. Chromium may require additional OS dependencies to run properly.
[NOTE]
============
Chromium is not embedded into {kib} for the Darwin (Mac OS) architecture. When
running on Darwin, Reporting will download Chromium into the proper area of
the {kib} installation path the first time the server starts. To separately
download and install the browser, see <<reporting-manual-chromium-install>>.
============
[float]
[[reporting-chromium-sandbox]]

View file

@@ -14,8 +14,8 @@ Having trouble? Here are solutions to common problems you might encounter while
* <<reporting-troubleshooting-error-messages>>
* <<reporting-troubleshooting-puppeteer-debug-logs>>
* <<reporting-troubleshooting-system-requirements>>
* <<reporting-troubleshooting-arm-systems>>
* <<reporting-troubleshooting-maps-ems>>
* <<reporting-manual-chromium-install>>
[float]
[[reporting-diagnostics]]
@@ -150,12 +150,6 @@ requests to render.
If the {kib} instance doesn't have enough memory to run the report, the report fails with an error such as `Error: Page crashed!`
In this case, try increasing the memory for the {kib} instance to 2GB.
[float]
[[reporting-troubleshooting-arm-systems]]
=== ARM systems
Chromium is not compatible with ARM RHEL.
[float]
[[reporting-troubleshooting-maps-ems]]
=== Unable to connect to Elastic Maps Service
@@ -164,3 +158,24 @@ https://www.elastic.co/elastic-maps-service[{ems} ({ems-init})] is a service tha
tile layers and vector shapes of administrative boundaries.
If a report contains a map with a missing basemap layer or administrative boundary, the {kib} server does not have access to {ems-init}.
See <<maps-connect-to-ems>> for information on how to connect your {kib} server to {ems-init}.
[float]
[[reporting-manual-chromium-install]]
=== Manually install the Chromium browser for Darwin
Chromium is not embedded into {kib} for the Darwin (Mac OS) architecture. When
running {kib} on Darwin, Reporting will download Chromium into the proper area of
the {kib} installation path the first time the server starts. If the server
does not have access to the Internet, you must download the
Chromium browser and install it into the {kib} installation path.
1. Download the Chromium zip file:
** For https://commondatastorage.googleapis.com/chromium-browser-snapshots/Mac/901912/chrome-mac.zip[x64] systems
** For https://commondatastorage.googleapis.com/chromium-browser-snapshots/Mac_Arm/901913/chrome-mac.zip[ARM] systems
2. Copy the zip file into the holding area. Relative to the root directory of {kib}, the path is:
** `.chromium/x64` for x64 systems
** `.chromium/arm64` for ARM systems
When {kib} starts, it automatically extracts the browser from the zip file and is then ready to generate PNG and PDF reports.
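For reference, a minimal sketch (not part of the docs file or this commit) of how the holding-area path is resolved, mirroring the `.chromium/{architecture}` layout described above; `KIBANA_ROOT` and `expectedArchivePath` are illustrative names and values only:

```
// Illustrative only: mirrors the {kibana root}/.chromium/{architecture}/{archiveFilename}
// layout described above. KIBANA_ROOT and expectedArchivePath are assumptions for the example.
import path from 'path';

const KIBANA_ROOT = '/usr/share/kibana';
const archivesPath = path.join(KIBANA_ROOT, '.chromium');

function expectedArchivePath(architecture: 'x64' | 'arm64', archiveFilename: string): string {
  return path.join(archivesPath, architecture, archiveFilename);
}

// On an ARM Mac the zip should end up at /usr/share/kibana/.chromium/arm64/chrome-mac.zip
console.log(expectedArchivePath('arm64', 'chrome-mac.zip'));
```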

View file

@@ -7,18 +7,27 @@
*/
// eslint-disable-next-line @kbn/eslint/no-restricted-paths
import { install } from '../../../../x-pack/plugins/screenshotting/server/utils';
import { install, paths } from '../../../../x-pack/plugins/screenshotting/server/utils';
export const InstallChromium = {
description: 'Installing Chromium',
async run(config, log, build) {
for (const platform of config.getNodePlatforms()) {
const target = `${platform.getName()}-${platform.getArchitecture()}`;
log.info(`Installing Chromium for ${target}`);
const preInstalledPackages = paths.packages.filter((p) => p.isPreInstalled);
// revert after https://github.com/elastic/kibana/issues/109949
if (target === 'darwin-arm64') continue;
for (const platform of config.getNodePlatforms()) {
const pkg = paths.find(platform.getName(), platform.getArchitecture(), preInstalledPackages);
const target = `${platform.getName()}-${platform.getArchitecture()}`;
if (!pkg) {
log.info(`Skipping Chromium install for ${target}`);
// Unbundled chromium packages (for Darwin): Chromium is downloaded at
// server startup, rather than being pre-installed
continue;
}
log.info(`Installing Chromium for ${target}`);
const logger = {
get: log.withType.bind(log),
@@ -31,12 +40,8 @@ export const InstallChromium = {
log: log.write.bind(log),
};
await install(
logger,
build.resolvePathForPlatform(platform, 'x-pack/plugins/screenshotting/chromium'),
platform.getName(),
platform.getArchitecture()
);
const path = build.resolvePathForPlatform(platform, 'x-pack/plugins/screenshotting/chromium');
await install(logger, pkg, path);
}
},
};

View file

@@ -1,29 +1,26 @@
# Chromium build
We ship our own headless build of Chromium for Linux and Mac OS, using a
version of the source that corresponds to the requirements of the Puppeteer
node module. The scripts in this folder can be used to accept a commit hash
from the Chromium repository, and initialize the build in a workspace.
We ship our own headless build of Chromium which is significantly smaller than
the standard binaries shipped by Google. The scripts in this folder can be used
to accept a commit hash from the Chromium repository, and initialize the build
on Ubuntu Linux.
## Why do we do this
**Linux**: By default, Puppeteer will download a zip file containing the
Chromium browser for any OS. This creates problems on Linux, because Chromium
has a dependency on X11, which is often not installed for a server environment.
We don't want to make a requirement for Linux that you need X11 to run Kibana.
To work around this, we create our own Chromium build, using the
By default, Puppeteer will download a zip file containing the Chromium browser for any
OS. This creates problems on Linux, because Chromium has a dependency on X11, which
is often not installed for a server environment. We don't want to make a requirement
for Linux that you need X11 to run Kibana. To work around this, we create our own Chromium
build, using the
[`headless_shell`](https://chromium.googlesource.com/chromium/src/+/5cf4b8b13ed518472038170f8de9db2f6c258fe4/headless)
build target. There are no (trustworthy) sources of these builds available
elsewhere.
build target. There are no (trustworthy) sources of these builds available elsewhere.
**Mac**: We do this on Mac because Elastic signs the Kibanna release artifact
with Apple to work with Gatekeeper on Mac OS. Having our own binary of Chromium
and bundling it with Kibana is integral to the artifact signing process.
**Windows**: No custom build is necessary for Windows. We are able to use the
full build of Chromium, downloaded from the main [Chromium download
location](https://commondatastorage.googleapis.com/chromium-browser-snapshots/index.html),
using the revision that corresponds with the Puppeteer dependency.
Fortunately, creating the custom builds is only necessary for Linux. When you have a build
of Kibana for Linux, or if you use a Linux desktop to develop Kibana, you have a copy of
`headless_shell` bundled inside. When you have a Windows or Mac build of Kibana, or use
either of those for development, you have a copy of the full build of Chromium, which
was downloaded from the main [Chromium download
location](https://commondatastorage.googleapis.com/chromium-browser-snapshots/index.html).
## Build Script Usage
@@ -42,10 +39,10 @@ gsutil cp -r gs://headless_shell_staging/build_chromium .
python ./build_chromium/init.py [arch_name]
# Run the build script with the path to the chromium src directory, the git commit hash
python ./build_chromium/build.py <commit_id> x64
python ./build_chromium/build.py 70f5d88ea95298a18a85c33c98ea00e02358ad75 x64
# OR You can build for ARM
python ./build_chromium/build.py <commit_id> arm64
python ./build_chromium/build.py 70f5d88ea95298a18a85c33c98ea00e02358ad75 arm64
```
**NOTE:** The `init.py` script updates git config to make it more possible for
@@ -56,7 +53,7 @@ with "early EOF" errors, the instance could be low on memory or disk space.
If you need to bump the version of Puppeteer, you need to get a new git commit hash for Chromium that corresponds to the Puppeteer version.
```
node x-pack/dev-tools/chromium_version.js [PuppeteerVersion]
node scripts/chromium_version.js [PuppeteerVersion]
```
When bumping the Puppeteer version, make sure you also update the `ChromiumArchivePaths.revision` variable in

View file

@@ -6,7 +6,7 @@ from build_util import (
md5_file,
)
# This file builds Chromium headless on Mac and Linux.
# This file builds Chromium headless on Linux.
# Verify that we have an argument, and if not print instructions
if (len(sys.argv) < 2):
@@ -88,7 +88,7 @@ runcmd('gn gen out/headless')
print('Compiling... this will take a while')
runcmd('autoninja -C out/headless headless_shell')
# Optimize the output on Linux x64 and Mac by stripping inessentials from the binary
# Optimize the output on Linux x64 by stripping inessentials from the binary
# ARM must be cross-compiled from Linux and can not read the ARM binary in order to strip
if platform.system() != 'Windows' and arch_name != 'arm64':
print('Optimizing headless_shell')
@@ -111,18 +111,10 @@ def archive_file(name):
archive.write(from_path, to_path)
return to_path
# Each platform has slightly different requirements for what dependencies
# must be bundled with the Chromium executable.
# Add dependencies that must be bundled with the Chromium executable.
archive_file('headless_shell')
if platform.system() == 'Linux':
archive_file(path.join('swiftshader', 'libEGL.so'))
archive_file(path.join('swiftshader', 'libGLESv2.so'))
elif platform.system() == 'Darwin':
archive_file('headless_shell')
archive_file('libswiftshader_libEGL.dylib')
archive_file('libswiftshader_libGLESv2.dylib')
archive_file(path.join('Helpers', 'chrome_crashpad_handler'))
archive_file(path.join('swiftshader', 'libEGL.so'))
archive_file(path.join('swiftshader', 'libGLESv2.so'))
archive.close()

View file

@@ -1,30 +0,0 @@
# Based on //build/headless.gn
# Embed resource.pak into binary to simplify deployment.
headless_use_embedded_resources = true
# In order to simplify deployment we build ICU data file
# into binary.
icu_use_data_file = false
# Use embedded data instead external files for headless in order
# to simplify deployment.
v8_use_external_startup_data = false
enable_nacl = false
enable_print_preview = false
enable_basic_printing = false
enable_remoting = false
use_alsa = false
use_cups = false
use_dbus = false
use_gio = false
# Please, consult @elastic/kibana-security before changing/removing this option.
use_kerberos = false
use_libpci = false
use_pulseaudio = false
use_udev = false
is_debug = false
symbol_level = 0
is_component_build = false

View file

@@ -9,3 +9,14 @@ The plugin exposes most of the functionality in the start contract.
The Chromium download and setup is happening during the setup stage.
To learn more about the public API, please use automatically generated API reference or generated TypeDoc comments.
## Testing Chromium downloads
To download all Chromium browsers for all platforms and architectures:
```
cd x-pack
npx gulp downloadChromium
```
This command is used to provision CI workspaces so that Chromium does not need to be downloaded for every CI run.
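For orientation, a condensed sketch of what the gulp task does, based on the `download_chromium` task change later in this commit; the relative import path is indicative and logger wiring is omitted:

```
// Sketch of the task body; `download` and `paths` are the helpers exported from
// x-pack/plugins/screenshotting/server/utils in this commit.
import { download, paths } from '../plugins/screenshotting/server/utils';

export const downloadChromium = async () => {
  // One download per platform/architecture entry in ChromiumArchivePaths.packages.
  await Promise.all(paths.packages.map((pkg) => download(pkg)));
};
```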

View file

@@ -14,8 +14,9 @@ export const getChromiumDisconnectedError = () =>
})
);
export { ChromiumArchivePaths } from './paths';
export type { ConditionalHeaders } from './driver';
export { HeadlessChromiumDriver } from './driver';
export type { PerformanceMetrics } from './driver_factory';
export type { ConditionalHeaders } from './driver';
export { DEFAULT_VIEWPORT, HeadlessChromiumDriverFactory } from './driver_factory';
export type { PerformanceMetrics } from './driver_factory';
export { ChromiumArchivePaths } from './paths';
export type { PackageInfo } from './paths';

View file

@@ -7,14 +7,16 @@
import path from 'path';
interface PackageInfo {
platform: string;
architecture: string;
export interface PackageInfo {
platform: 'linux' | 'darwin' | 'win32';
architecture: 'x64' | 'arm64';
archiveFilename: string;
archiveChecksum: string;
binaryChecksum: string;
binaryRelativePath: string;
revision: number;
revision: 901912 | 901913;
isPreInstalled: boolean;
location: 'custom' | 'common';
}
enum BaseUrl {
@@ -32,17 +34,35 @@ interface CommonPackageInfo extends PackageInfo {
archivePath: string;
}
function isCommonPackage(p: PackageInfo): p is CommonPackageInfo {
return p.location === 'common';
}
export class ChromiumArchivePaths {
public readonly packages: Array<CustomPackageInfo | CommonPackageInfo> = [
{
platform: 'darwin',
architecture: 'x64',
archiveFilename: 'chromium-d163fd7-darwin_x64.zip',
archiveChecksum: '19aa88bd59e2575816425bf72786c53f',
binaryChecksum: 'dfcd6e007214175997663c50c8d871ea',
binaryRelativePath: 'headless_shell-darwin_x64/headless_shell',
location: 'custom',
revision: 856583,
archiveFilename: 'chrome-mac.zip',
archiveChecksum: '229fd88c73c5878940821875f77578e4',
binaryChecksum: 'b0e5ca009306b14e41527000139852e5',
binaryRelativePath: 'chrome-mac/Chromium.app/Contents/MacOS/Chromium',
location: 'common',
archivePath: 'Mac',
revision: 901912,
isPreInstalled: false,
},
{
platform: 'darwin',
architecture: 'arm64',
archiveFilename: 'chrome-mac.zip',
archiveChecksum: 'ecf7aa509c8e2545989ebb9711e35384',
binaryChecksum: 'b5072b06ffd2d2af4fea7012914da09f',
binaryRelativePath: 'chrome-mac/Chromium.app/Contents/MacOS/Chromium',
location: 'common',
archivePath: 'Mac_Arm',
revision: 901913, // NOTE: 901912 is not available
isPreInstalled: false,
},
{
platform: 'linux',
@@ -53,6 +73,7 @@ export class ChromiumArchivePaths {
binaryRelativePath: 'headless_shell-linux_x64/headless_shell',
location: 'custom',
revision: 901912,
isPreInstalled: true,
},
{
platform: 'linux',
@@ -63,6 +84,7 @@
binaryRelativePath: 'headless_shell-linux_arm64/headless_shell',
location: 'custom',
revision: 901912,
isPreInstalled: true,
},
{
platform: 'win32',
@@ -70,18 +92,19 @@
archiveFilename: 'chrome-win.zip',
archiveChecksum: '861bb8b7b8406a6934a87d3cbbce61d9',
binaryChecksum: 'ffa0949471e1b9a57bc8f8633fca9c7b',
binaryRelativePath: 'chrome-win\\chrome.exe',
binaryRelativePath: path.join('chrome-win', 'chrome.exe'),
location: 'common',
archivePath: 'Win',
revision: 901912,
isPreInstalled: true,
},
];
// zip files get downloaded to a .chromium directory in the kibana root
public readonly archivesPath = path.resolve(__dirname, '../../../../../../.chromium');
public find(platform: string, architecture: string) {
return this.packages.find((p) => p.platform === platform && p.architecture === architecture);
public find(platform: string, architecture: string, packages: PackageInfo[] = this.packages) {
return packages.find((p) => p.platform === platform && p.architecture === architecture);
}
public resolvePath(p: PackageInfo) {
@@ -93,15 +116,14 @@
return this.packages.map((p) => this.resolvePath(p));
}
public getDownloadUrl(p: CustomPackageInfo | CommonPackageInfo) {
if (p.location === 'common') {
public getDownloadUrl(p: PackageInfo) {
if (isCommonPackage(p)) {
return `${BaseUrl.common}/${p.archivePath}/${p.revision}/${p.archiveFilename}`;
}
return BaseUrl.custom + '/' + p.archiveFilename; // revision is not used for URL if package is a custom build
}
public getBinaryPath(p: PackageInfo) {
const chromiumPath = path.resolve(__dirname, '../../../chromium');
public getBinaryPath(p: PackageInfo, chromiumPath: string) {
return path.join(chromiumPath, p.binaryRelativePath);
}
}
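Taken together, the reworked `ChromiumArchivePaths` API can be exercised roughly as follows; this is a sketch, and the `chromiumPath` value is only an example of what callers now supply:

```
import path from 'path';
import { ChromiumArchivePaths } from './paths';

const paths = new ChromiumArchivePaths();

// Look up the package descriptor for an ARM Mac.
const pkg = paths.find('darwin', 'arm64');
if (!pkg) {
  throw new Error('Unsupported platform: darwin-arm64');
}

// Darwin packages come from the common snapshot bucket and are not pre-installed:
// https://commondatastorage.googleapis.com/chromium-browser-snapshots/Mac_Arm/901913/chrome-mac.zip
console.log(pkg.isPreInstalled, paths.getDownloadUrl(pkg));

// getBinaryPath now resolves against a caller-supplied install directory.
const chromiumPath = path.resolve(__dirname, '../../../chromium'); // example location
console.log(paths.getBinaryPath(pkg, chromiumPath));
```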

View file

@@ -8,7 +8,7 @@
import path from 'path';
import mockFs from 'mock-fs';
import { existsSync, readdirSync } from 'fs';
import { ChromiumArchivePaths } from '../chromium';
import { ChromiumArchivePaths, PackageInfo } from '../chromium';
import { fetch } from './fetch';
import { md5 } from './checksum';
import { download } from '.';
@@ -18,9 +18,11 @@ jest.mock('./fetch');
describe('ensureDownloaded', () => {
let paths: ChromiumArchivePaths;
let pkg: PackageInfo;
beforeEach(() => {
paths = new ChromiumArchivePaths();
pkg = paths.find('linux', 'x64') as PackageInfo;
(md5 as jest.MockedFunction<typeof md5>).mockImplementation(
async (packagePath) =>
@@ -51,7 +53,7 @@ describe('ensureDownloaded', () => {
[unexpectedPath2]: 'test',
});
await download(paths);
await download(paths, pkg);
expect(existsSync(unexpectedPath1)).toBe(false);
expect(existsSync(unexpectedPath2)).toBe(false);
@@ -60,13 +62,13 @@
it('should reject when download fails', async () => {
(fetch as jest.MockedFunction<typeof fetch>).mockRejectedValueOnce(new Error('some error'));
await expect(download(paths)).rejects.toBeInstanceOf(Error);
await expect(download(paths, pkg)).rejects.toBeInstanceOf(Error);
});
it('should reject when downloaded md5 hash is different', async () => {
(fetch as jest.MockedFunction<typeof fetch>).mockResolvedValue('random-md5');
await expect(download(paths)).rejects.toBeInstanceOf(Error);
await expect(download(paths, pkg)).rejects.toBeInstanceOf(Error);
});
describe('when archives are already present', () => {
@@ -79,24 +81,24 @@ });
});
it('should not download again', async () => {
await download(paths);
await download(paths, pkg);
expect(fetch).not.toHaveBeenCalled();
expect(readdirSync(path.resolve(`${paths.archivesPath}/x64`))).toEqual(
expect.arrayContaining([
'chrome-mac.zip',
'chrome-win.zip',
'chromium-70f5d88-linux_x64.zip',
'chromium-d163fd7-darwin_x64.zip',
])
);
expect(readdirSync(path.resolve(`${paths.archivesPath}/arm64`))).toEqual(
expect.arrayContaining(['chromium-70f5d88-linux_arm64.zip'])
expect.arrayContaining(['chrome-mac.zip', 'chromium-70f5d88-linux_arm64.zip'])
);
});
it('should download again if md5 hash different', async () => {
(md5 as jest.MockedFunction<typeof md5>).mockResolvedValueOnce('random-md5');
await download(paths);
await download(paths, pkg);
expect(fetch).toHaveBeenCalledTimes(1);
});

View file

@@ -8,7 +8,7 @@
import { existsSync } from 'fs';
import del from 'del';
import type { Logger } from 'src/core/server';
import type { ChromiumArchivePaths } from '../chromium';
import type { ChromiumArchivePaths, PackageInfo } from '../chromium';
import { md5 } from './checksum';
import { fetch } from './fetch';
@@ -19,7 +19,7 @@ import { fetch } from './fetch';
* @param {BrowserSpec} browsers
* @return {Promise<undefined>}
*/
export async function download(paths: ChromiumArchivePaths, logger?: Logger) {
export async function download(paths: ChromiumArchivePaths, pkg: PackageInfo, logger?: Logger) {
const removedFiles = await del(`${paths.archivesPath}/**/*`, {
force: true,
onlyFiles: true,
@@ -29,57 +29,49 @@ export async function download(paths: ChromiumArchivePaths, logger?: Logger) {
removedFiles.forEach((path) => logger?.warn(`Deleting unexpected file ${path}`));
const invalidChecksums: string[] = [];
await Promise.all(
paths.packages.map(async (path) => {
const { archiveFilename, archiveChecksum } = path;
if (!archiveFilename || !archiveChecksum) {
return;
}
const resolvedPath = paths.resolvePath(path);
const pathExists = existsSync(resolvedPath);
const { archiveFilename, archiveChecksum } = pkg;
if (!archiveFilename || !archiveChecksum) {
return;
}
let foundChecksum = 'MISSING';
try {
foundChecksum = await md5(resolvedPath);
// eslint-disable-next-line no-empty
} catch {}
const resolvedPath = paths.resolvePath(pkg);
const foundChecksum = await md5(resolvedPath).catch(() => 'MISSING');
if (pathExists && foundChecksum === archiveChecksum) {
logger?.debug(
`Browser archive for ${path.platform}/${path.architecture} found in ${resolvedPath}.`
);
return;
}
const pathExists = existsSync(resolvedPath);
if (pathExists && foundChecksum === archiveChecksum) {
logger?.debug(
`Browser archive for ${pkg.platform}/${pkg.architecture} already found in ${resolvedPath}.`
);
return;
}
if (!pathExists) {
logger?.warn(
`Browser archive for ${path.platform}/${path.architecture} not found in ${resolvedPath}.`
);
}
if (!pathExists) {
logger?.warn(
`Browser archive for ${pkg.platform}/${pkg.architecture} not found in ${resolvedPath}.`
);
}
if (foundChecksum !== archiveChecksum) {
logger?.warn(
`Browser archive checksum for ${path.platform}/${path.architecture} ` +
`is ${foundChecksum} but ${archiveChecksum} was expected.`
);
}
if (foundChecksum !== archiveChecksum) {
logger?.warn(
`Browser archive checksum for ${pkg.platform}/${pkg.architecture} ` +
`is ${foundChecksum} but ${archiveChecksum} was expected.`
);
}
const url = paths.getDownloadUrl(path);
try {
const downloadedChecksum = await fetch(url, resolvedPath, logger);
if (downloadedChecksum !== archiveChecksum) {
logger?.warn(
`Invalid checksum for ${path.platform}/${path.architecture}: ` +
`expected ${archiveChecksum} got ${downloadedChecksum}`
);
invalidChecksums.push(`${url} => ${resolvedPath}`);
}
} catch (error) {
throw new Error(`Failed to download ${url}: ${error}`);
}
})
);
const url = paths.getDownloadUrl(pkg);
try {
const downloadedChecksum = await fetch(url, resolvedPath, logger);
if (downloadedChecksum !== archiveChecksum) {
logger?.warn(
`Invalid checksum for ${pkg.platform}/${pkg.architecture}: ` +
`expected ${archiveChecksum} got ${downloadedChecksum}`
);
invalidChecksums.push(`${url} => ${resolvedPath}`);
}
} catch (error) {
throw new Error(`Failed to download ${url}: ${error}`);
}
if (invalidChecksums.length) {
const error = new Error(

View file

@@ -6,10 +6,9 @@
*/
import del from 'del';
import os from 'os';
import path from 'path';
import type { Logger } from 'src/core/server';
import { ChromiumArchivePaths } from './chromium';
import { ChromiumArchivePaths, PackageInfo } from './chromium';
import { download } from './download';
import { md5 } from './download/checksum';
import { extract } from './extract';
@@ -21,23 +20,16 @@ import { extract } from './extract';
export async function install(
paths: ChromiumArchivePaths,
logger: Logger,
chromiumPath: string = path.resolve(__dirname, '../../chromium'),
platform: string = process.platform,
architecture: string = os.arch()
pkg: PackageInfo,
chromiumPath: string = path.resolve(__dirname, '../../chromium')
): Promise<string> {
const pkg = paths.find(platform, architecture);
if (!pkg) {
throw new Error(`Unsupported platform: ${platform}-${architecture}`);
}
const binaryPath = paths.getBinaryPath(pkg);
const binaryChecksum = await md5(binaryPath).catch(() => '');
const binaryPath = paths.getBinaryPath(pkg, chromiumPath);
const binaryChecksum = await md5(binaryPath).catch(() => 'MISSING');
if (binaryChecksum !== pkg.binaryChecksum) {
logger?.warn(
`Found browser binary checksum for ${pkg.platform}/${pkg.architecture} ` +
`is ${binaryChecksum} but ${pkg.binaryChecksum} was expected. Re-installing...`
`Found browser binary checksum for ${pkg.platform}/${pkg.architecture} in ${binaryPath}` +
` is ${binaryChecksum} but ${pkg.binaryChecksum} was expected. Re-installing...`
);
try {
await del(chromiumPath);
@@ -46,7 +38,7 @@ }
}
try {
await download(paths, logger);
await download(paths, pkg, logger);
const archive = path.join(paths.archivesPath, pkg.architecture, pkg.archiveFilename);
logger.info(`Extracting [${archive}] to [${chromiumPath}]`);
await extract(archive, chromiumPath);

View file

@@ -16,8 +16,9 @@ import type {
} from 'src/core/server';
import type { ScreenshotModePluginSetup } from 'src/plugins/screenshot_mode/server';
import { ChromiumArchivePaths, HeadlessChromiumDriverFactory, install } from './browsers';
import { createConfig, ConfigType } from './config';
import { ConfigType, createConfig } from './config';
import { getScreenshots, ScreenshotOptions } from './screenshots';
import { getChromiumPackage } from './utils';
interface SetupDeps {
screenshotMode: ScreenshotModePluginSetup;
@@ -59,7 +60,7 @@ export class ScreenshottingPlugin implements Plugin<void, ScreenshottingStart, S
const logger = this.logger.get('chromium');
const [config, binaryPath] = await Promise.all([
createConfig(this.logger, this.config),
install(paths, logger),
install(paths, logger, getChromiumPackage()),
]);
return new HeadlessChromiumDriverFactory(this.screenshotMode, config, logger, binaryPath);

View file

@@ -5,9 +5,23 @@
* 2.0.
*/
import os from 'os';
import { ChromiumArchivePaths, download as baseDownload, install as baseInstall } from './browsers';
const paths = new ChromiumArchivePaths();
export const getChromiumPackage = () => {
const platform = process.platform;
const architecture = os.arch();
const chromiumPackageInfo = paths.find(process.platform, architecture);
if (!chromiumPackageInfo) {
throw new Error(`Unsupported platform: ${platform}-${architecture}`);
}
return chromiumPackageInfo;
};
export const download = baseDownload.bind(undefined, paths);
export const install = baseInstall.bind(undefined, paths);
export { paths };
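For context, a condensed sketch of how these bound helpers are consumed elsewhere in the commit (plugin startup and the gulp task); `provisionChromium` is a hypothetical wrapper and the core `Logger` is assumed to be supplied by the caller:

```
// Condensed sketch combining the plugin.ts and gulp download_chromium usage in this commit.
import type { Logger } from 'src/core/server';
import { download, getChromiumPackage, install, paths } from './utils';

export async function provisionChromium(logger: Logger): Promise<string> {
  // Server startup: install only the package matching the current platform/architecture.
  const binaryPath = await install(logger, getChromiumPackage());

  // CI provisioning: fetch archives for every platform/architecture up front so
  // Chromium is not re-downloaded on each run.
  await Promise.all(paths.packages.map((pkg) => download(pkg, logger)));

  return binaryPath;
}
```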

View file

@@ -60,8 +60,6 @@ async function copySourceAndBabelify() {
'index.js',
'.i18nrc.json',
'plugins/**/*',
'plugins/reporting/.phantom/*',
'plugins/reporting/.chromium/*',
'typings/**/*',
],
{
@@ -79,6 +77,7 @@ async function copySourceAndBabelify() {
'**/{__tests__,__mocks__,__snapshots__,__fixtures__,__jest__,cypress}/**',
'plugins/*/target/**',
'plugins/canvas/shareable_runtime/test/**',
'plugins/screenshotting/chromium/**',
'plugins/telemetry_collection_xpack/schema/**', // Skip telemetry schemas
],
allowEmpty: true,

View file

@@ -5,7 +5,7 @@
* 2.0.
*/
import { download } from '../plugins/screenshotting/server/utils';
import { download, paths } from '../plugins/screenshotting/server/utils';
export const downloadChromium = async () => {
// eslint-disable-next-line no-console
@@ -21,5 +21,6 @@ export const downloadChromium = async () => {
log: consoleLogger('log'),
};
await download(logger);
// Download Chromium for all platforms
await Promise.all(paths.packages.map((pkg) => download(pkg, logger)));
};