Mirror of https://github.com/elastic/kibana.git, synced 2025-04-25 02:09:32 -04:00
[Reporting] Update Puppeteer to version 8.0.0 and Chromium to r856583 (#98688)
* Update Puppeteer to 8.0.0

  Updates Chromium to r856583. Links to a new build of Linux headless_shell in the Kibana team GCS bucket. Links to the main download site of Chromium for Mac and Windows. Removes Mac and Windows compatibility from the Chromium build scripts.

* add functional tests for large dashboard
* ensure png comparison is working
* add test for large dashboard pdf
* update arm64 binary checksum
* update README
* more readme update
* Update x-pack/build_chromium/README.md

Co-authored-by: Jean-Louis Leysens <jloleysens@gmail.com>
Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>
Co-authored-by: Jean-Louis Leysens <jloleysens@gmail.com>
This commit is contained in:
parent
518da4daa1
commit
f73da420ff
25 changed files with 519 additions and 434 deletions
|
@@ -314,7 +314,7 @@
|
||||||
"proxy-from-env": "1.0.0",
|
"proxy-from-env": "1.0.0",
|
||||||
"proxyquire": "1.8.0",
|
"proxyquire": "1.8.0",
|
||||||
"puid": "1.0.7",
|
"puid": "1.0.7",
|
||||||
"puppeteer": "npm:@elastic/puppeteer@5.4.1-patch.1",
|
"puppeteer": "^8.0.0",
|
||||||
"query-string": "^6.13.2",
|
"query-string": "^6.13.2",
|
||||||
"raw-loader": "^3.1.0",
|
"raw-loader": "^3.1.0",
|
||||||
"rbush": "^3.0.1",
|
"rbush": "^3.0.1",
|
||||||
|
@@ -586,7 +586,6 @@
|
||||||
"@types/pretty-ms": "^5.0.0",
|
"@types/pretty-ms": "^5.0.0",
|
||||||
"@types/prop-types": "^15.7.3",
|
"@types/prop-types": "^15.7.3",
|
||||||
"@types/proper-lockfile": "^3.0.1",
|
"@types/proper-lockfile": "^3.0.1",
|
||||||
"@types/puppeteer": "^5.4.1",
|
|
||||||
"@types/rbush": "^3.0.0",
|
"@types/rbush": "^3.0.0",
|
||||||
"@types/reach__router": "^1.2.6",
|
"@types/reach__router": "^1.2.6",
|
||||||
"@types/react": "^16.9.36",
|
"@types/react": "^16.9.36",
|
||||||
|
|
|
@@ -3,37 +3,24 @@
|
||||||
We ship our own headless build of Chromium which is significantly smaller than
|
We ship our own headless build of Chromium which is significantly smaller than
|
||||||
the standard binaries shipped by Google. The scripts in this folder can be used
|
the standard binaries shipped by Google. The scripts in this folder can be used
|
||||||
to accept a commit hash from the Chromium repository, and initialize the build
|
to accept a commit hash from the Chromium repository, and initialize the build
|
||||||
environments and run the build on Mac, Windows, and Linux.
|
on Ubuntu Linux.
|
||||||
|
|
||||||
## Before you begin
|
## Why do we do this
|
||||||
|
|
||||||
If you wish to use a remote VM to build, you'll need access to our GCP account,
|
By default, Puppeteer will download a zip file containing the Chromium browser for any
|
||||||
which is where we have two machines provisioned for the Linux and Windows
|
OS. This creates problems on Linux, because Chromium has a dependency on X11, which
|
||||||
builds. Mac builds can be achieved locally, and are a great place to start to
|
is often not installed in server environments. We don't want to require X11 to
|
||||||
gain familiarity.
|
run Kibana on Linux. To work around this, we create our own Chromium
|
||||||
|
build, using the
|
||||||
|
[`headless_shell`](https://chromium.googlesource.com/chromium/src/+/5cf4b8b13ed518472038170f8de9db2f6c258fe4/headless)
|
||||||
|
build target. There are no (trustworthy) sources of these builds available elsewhere.
|
||||||
|
|
||||||
**NOTE:** Linux builds should be done in Ubuntu on x64 architecture. ARM builds
|
Fortunately, creating the custom builds is only necessary for Linux. When you have a build
|
||||||
are created in x64 using cross-compiling. CentOS is not supported for building Chromium.
|
of Kibana for Linux, or if you use a Linux desktop to develop Kibana, you have a copy of
|
||||||
|
`headless_shell` bundled inside. When you have a Windows or Mac build of Kibana, or use
|
||||||
1. Login to our GCP instance [here using your okta credentials](https://console.cloud.google.com/).
|
either of those for development, you have a copy of the full build of Chromium, which
|
||||||
2. Click the "Compute Engine" tab.
|
was downloaded from the main [Chromium download
|
||||||
3. Find `chromium-build-linux` or `chromium-build-windows-12-beefy` and start the instance.
|
location](https://commondatastorage.googleapis.com/chromium-browser-snapshots/index.html).
|
||||||
4. Install [Google Cloud SDK](https://cloud.google.com/sdk) locally to ssh into the GCP instance
|
|
||||||
5. System dependencies:
|
|
||||||
- 8 CPU
|
|
||||||
- 30GB memory
|
|
||||||
- 80GB free space on disk (Try `ncdu /home` to see where space is used.)
|
|
||||||
- git
|
|
||||||
- python2 (`python` must link to `python2`)
|
|
||||||
- lsb_release
|
|
||||||
- tmux is recommended in case your ssh session is interrupted
|
|
||||||
- "Cloud API access scopes": must have **read / write** scope for the Storage API
|
|
||||||
6. Copy the entire `build_chromium` directory to the `headless_shell_staging` bucket. To do this, use `gsutil rsync`:
|
|
||||||
```sh
|
|
||||||
# This shows a preview of what would change by synchronizing the source scripts with the destination GCS bucket.
|
|
||||||
# Remove the `-n` flag to enact the changes
|
|
||||||
gsutil -m rsync -n -r x-pack/build_chromium gs://headless_shell_staging/build_chromium
|
|
||||||
```
|
|
||||||
|
|
||||||
## Build Script Usage
|
## Build Script Usage
|
||||||
|
|
||||||
|
@@ -63,161 +50,65 @@ the Chromium repo to be cloned successfully. If checking out the Chromium fails
|
||||||
with "early EOF" errors, the instance could be low on memory or disk space.
|
with "early EOF" errors, the instance could be low on memory or disk space.
|
||||||
|
|
||||||
## Getting the Commit Hash
|
## Getting the Commit Hash
|
||||||
|
|
||||||
If you need to bump the version of Puppeteer, you need to get a new git commit hash for Chromium that corresponds to the Puppeteer version.
|
If you need to bump the version of Puppeteer, you need to get a new git commit hash for Chromium that corresponds to the Puppeteer version.
|
||||||
```
|
```
|
||||||
node x-pack/dev-tools/chromium_version.js [PuppeteerVersion]
|
node x-pack/dev-tools/chromium_version.js [PuppeteerVersion]
|
||||||
```
|
```
|
||||||
|
|
||||||
When bumping the Puppeteer version, make sure you also update the `.chromium-commit` file with the commit hash
|
When bumping the Puppeteer version, make sure you also update the `ChromiumArchivePaths.revision` variable in
|
||||||
for the current Chromium build, so we'll be able to construct a build pipeline for each OS (coming soon!).
|
`x-pack/plugins/reporting/server/browsers/chromium/paths.ts`.
|
||||||
|
|
||||||
## Build args
|
## Build args
|
||||||
|
|
||||||
A good how-to on building Chromium from source is
|
A good how-to on building Chromium from source is
|
||||||
[here](https://chromium.googlesource.com/chromium/src/+/master/docs/get_the_code.md).
|
[here](https://chromium.googlesource.com/chromium/src/+/master/docs/get_the_code.md).
|
||||||
|
|
||||||
There are documents for each OS that will explain how to customize arguments
|
We have a `linux/args.gn` file that is automatically copied to the build target directory.
|
||||||
for the build using the `gn` tool. Those instructions do not apply for the
|
|
||||||
Kibana Chromium build. Our `build.py` script ensures the correct `args.gn`
|
|
||||||
file gets used for build arguments.
|
|
||||||
|
|
||||||
We have an `args.gn` file per platform:
|
|
||||||
|
|
||||||
- mac: `darwin/args.gn`
|
|
||||||
- linux 64bit: `linux-x64/args.gn`
|
|
||||||
- windows: `windows/args.gn`
|
|
||||||
- ARM 64bit: `linux-aarch64/args.gn`
|
|
||||||
|
|
||||||
To get a list of the build arguments that are enabled, install `depot_tools` and run
|
To get a list of the build arguments that are enabled, install `depot_tools` and run
|
||||||
`gn args out/headless --list`. It prints out all of the flags and their
|
`gn args out/headless --list`. It prints out all of the flags and their
|
||||||
settings, including the defaults.
|
settings, including the defaults. Some build flags are documented
|
||||||
|
|
||||||
The various build flags are not well documented. Some are documented
|
|
||||||
[here](https://www.chromium.org/developers/gn-build-configuration).
|
[here](https://www.chromium.org/developers/gn-build-configuration).
|
||||||
|
|
||||||
As of this writing, there is an officially supported headless Chromium build
|
|
||||||
args file for Linux: `build/args/headless.gn`. This does not work on Windows or
|
|
||||||
Mac, so we have taken that as our starting point, and modified it until the
|
|
||||||
Windows / Mac builds succeeded.
|
|
||||||
|
|
||||||
**NOTE:** Please, make sure you consult @elastic/kibana-security before you change, remove or add any of the build flags.
|
**NOTE:** Please, make sure you consult @elastic/kibana-security before you change, remove or add any of the build flags.
|
||||||
|
|
||||||
## Building locally
|
## Directions for Elasticians
|
||||||
|
|
||||||
You can skip the step of running `<os_name>/init.sh` for your OS if you already
|
If you wish to use a remote VM to build, you'll need access to our GCP account.
|
||||||
have your environment set up, and the chromium source cloned.
|
|
||||||
|
|
||||||
To get the Chromium code, refer to the [documentation](https://chromium.googlesource.com/chromium/src/+/master/docs/get_the_code.md).
|
**NOTE:** The builds should be done in Ubuntu on x64 architecture. ARM builds
|
||||||
Install `depot_tools` as suggested, since it comes with useful scripts. Use the
|
are created in x64 using cross-compiling. CentOS is not supported for building Chromium.
|
||||||
`fetch` command to clone the chromium repository. To set up and run the build,
|
|
||||||
use the Kibana `build.py` script (in this directory).
|
|
||||||
|
|
||||||
It's recommended that you create a working directory for the chromium source
|
1. Login to Google Cloud Console
|
||||||
code and all the build tools, and run the commands from there:
|
2. Click the "Compute Engine" tab.
|
||||||
```sh
|
3. Create a Linux VM:
|
||||||
mkdir ~/chromium && cd ~/chromium
|
- 8 CPU
|
||||||
cp -r ~/path/to/kibana/x-pack/build_chromium .
|
- 30GB memory
|
||||||
python ./build_chromium/init.py [arch_name]
|
- 80GB free space on disk (Try `ncdu /home` to see where space is used.)
|
||||||
python ./build_chromium/build.py <commit_id>
|
- git
|
||||||
```
|
- python2 (`python` must link to `python2`)
|
||||||
|
- lsb_release
|
||||||
## VMs
|
- tmux is recommended in case your ssh session is interrupted
|
||||||
|
- "Cloud API access scopes": must have **read / write** scope for the Storage API
|
||||||
I ran Linux and Windows VMs in GCP with the following specs:
|
4. Install [Google Cloud SDK](https://cloud.google.com/sdk) locally to ssh into the GCP instance
|
||||||
|
|
||||||
- 8 core vCPU
|
|
||||||
- 30GB RAM
|
|
||||||
- 128GB hard drive
|
|
||||||
- Ubuntu 18.04 LTS (not minimal)
|
|
||||||
- Windows Server 2016 (full, with desktop)
|
|
||||||
|
|
||||||
The more cores the better, as the build makes effective use of each. For Linux, Ubuntu is the only officially supported build target.
|
|
||||||
|
|
||||||
- Linux:
|
|
||||||
- SSH in using [gcloud](https://cloud.google.com/sdk/)
|
|
||||||
- Get the ssh command in the [GCP console](https://console.cloud.google.com/) -> VM instances -> your-vm-name -> SSH -> "View gcloud command"
|
|
||||||
- The in-browser UI is sluggish, so use the command-line tool (Google Cloud SDK is required)
|
|
||||||
|
|
||||||
- Windows:
|
|
||||||
- Install Microsoft's Remote Desktop tools
|
|
||||||
- Get the RDP file in the [GCP console](https://console.cloud.google.com/) -> VM instances -> your-vm-name -> RDP -> Download the RDP file
|
|
||||||
- Edit it in Microsoft Remote Desktop:
|
|
||||||
- Display -> Resolution (1280 x 960 or something reasonable)
|
|
||||||
- Local Resources -> Folders, then select the folder(s) you want to share; at minimum, the `build_chromium` folder
|
|
||||||
- Save
|
|
||||||
|
|
||||||
## Initializing each VM / environment
|
|
||||||
|
|
||||||
In a VM, you'll want to use the init scripts to initialize each environment.
|
|
||||||
On macOS you'll need to install Xcode and accept the license agreement.
|
|
||||||
|
|
||||||
Create the build folder:
|
|
||||||
|
|
||||||
- Mac / Linux: `mkdir -p ~/chromium`
|
|
||||||
- Windows: `mkdir c:\chromium`
|
|
||||||
|
|
||||||
Copy the `x-pack/build_chromium` folder to each. Replace `you@your-machine` with the correct username and VM name:
|
|
||||||
|
|
||||||
- Mac: `cp -r x-pack/build_chromium ~/chromium/build_chromium`
|
|
||||||
- Linux: `gcloud compute scp --recurse x-pack/build_chromium you@your-machine:~/chromium/ --zone=us-east1-b --project "XXXXXXXX"`
|
|
||||||
- Windows: Copy the `build_chromium` folder via the RDP GUI into `c:\chromium\build_chromium`
|
|
||||||
|
|
||||||
There is an init script for each platform. This downloads and installs the necessary prerequisites, sets environment variables, etc.
|
|
||||||
|
|
||||||
- Mac x64: `~/chromium/build_chromium/darwin/init.sh`
|
|
||||||
- Linux x64: `~/chromium/build_chromium/linux/init.sh`
|
|
||||||
- Linux arm64: `~/chromium/build_chromium/linux/init.sh arm64`
|
|
||||||
- Windows x64: `c:\chromium\build_chromium\windows\init.bat`
|
|
||||||
|
|
||||||
On Windows, at least, you will need to do a number of extra steps:
|
|
||||||
|
|
||||||
- Follow the prompts in the Visual Studio installation process, click "Install" and wait a while
|
|
||||||
- Once it's installed, open Control Panel and turn on Debugging Tools for Windows:
|
|
||||||
- Control Panel → Programs → Programs and Features → Select the “Windows Software Development Kit” → Change → Change → Check “Debugging Tools For Windows” → Change
|
|
||||||
- Press enter in the terminal to continue running the init
|
|
||||||
|
|
||||||
## Building
|
|
||||||
|
|
||||||
Note: In Linux, you should run the build command in tmux so that if your ssh session disconnects, the build can keep going. To do this, just type `tmux` into your terminal to hop into a tmux session. If you get disconnected, you can hop back in like so:
|
|
||||||
|
|
||||||
- SSH into the server
|
|
||||||
- Run `tmux list-sessions`
|
|
||||||
- Run `tmux switch -t {session_id}`, replacing {session_id} with the value from the list-sessions output
|
|
||||||
|
|
||||||
To run the build, replace the sha in the following commands with the sha that you wish to build:
|
|
||||||
|
|
||||||
- Mac x64: `python ~/chromium/build_chromium/build.py 312d84c8ce62810976feda0d3457108a6dfff9e6`
|
|
||||||
- Linux x64: `python ~/chromium/build_chromium/build.py 312d84c8ce62810976feda0d3457108a6dfff9e6`
|
|
||||||
- Linux arm64: `python ~/chromium/build_chromium/build.py 312d84c8ce62810976feda0d3457108a6dfff9e6 arm64`
|
|
||||||
- Windows x64: `python c:\chromium\build_chromium\build.py 312d84c8ce62810976feda0d3457108a6dfff9e6`
|
|
||||||
|
|
||||||
## Artifacts
|
## Artifacts
|
||||||
|
|
||||||
After the build completes, there will be a .zip file and a .md5 file in `~/chromium/chromium/src/out/headless`. These are named like so: `chromium-{first_7_of_SHA}-{platform}-{arch}`, for example: `chromium-4747cc2-linux-x64`.
|
After the build completes, there will be a .zip file and a .md5 file in `~/chromium/chromium/src/out/headless`. These are named like so: `chromium-{first_7_of_SHA}-{platform}-{arch}`, for example: `chromium-4747cc2-linux-x64`.
|
||||||
|
The zip files and md5 files are copied to a staging bucket in GCP storage.
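The artifact naming scheme above can be sketched as a small helper (the function name and example SHA are illustrative assumptions, not code from `build.py`):

```python
def archive_name(commit_sha: str, platform: str, arch: str) -> str:
    # Build the artifact name from the first 7 characters of the Chromium
    # commit SHA, e.g. chromium-4747cc2-linux-x64.
    return 'chromium-{}-{}-{}'.format(commit_sha[:7], platform, arch)
```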
|
||||||
|
|
||||||
The zip files need to be deployed to GCP Storage. For testing, I drop them into `headless-shell-dev`, but for production they need to be in `headless-shell`. The `x-pack/plugins/reporting/server/browsers/chromium/paths.ts` file needs to be updated with the correct `archiveChecksum`, `archiveFilename`, `binaryChecksum` and `baseUrl`. Below is a description of these values:
|
## Testing
|
||||||
|
Search the Puppeteer GitHub repo for known issues that could affect our use case, and make sure to test anywhere that is affected.
|
||||||
|
|
||||||
- `archiveChecksum`: The contents of the `.md5` file, which is the `md5` checksum of the zip file.
|
Here are the steps for testing a Puppeteer upgrade; run these tests on Mac, Windows, Linux x64, and Linux arm64:
|
||||||
- `binaryChecksum`: The `md5` checksum of the `headless_shell` binary itself.
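Both checksums can be computed the same way; here is a minimal sketch (the name mirrors the `md5_file` helper imported in `build.py`, but this body is an assumption, not the exact implementation):

```python
import hashlib

def md5_file(filename: str) -> str:
    # Stream the file in chunks so multi-hundred-MB Chromium zips are
    # not loaded into memory at once.
    md5 = hashlib.md5()
    with open(filename, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            md5.update(chunk)
    return md5.hexdigest()
```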
|
|
||||||
|
|
||||||
*If you're building in the cloud, don't forget to turn off your VM after retrieving the build artifacts!*
|
- Make sure the Reporting plugin is fetching the correct version of the browser
|
||||||
|
at start-up time, and that it can successfully unzip it and copy the files to
|
||||||
## Diagnosing runtime failures
|
`x-pack/plugins/reporting/chromium`
|
||||||
|
- Make sure there are no errors when using the **Reporting diagnostic tool**
|
||||||
After getting the build to pass, the resulting binaries often failed to run or would hang.
|
- All functional and API tests that generate PDF and PNG files should pass.
|
||||||
|
- Use a VM to run Kibana in a low-memory environment and try to generate a PNG of a dashboard that outputs as a 4MB file. Document the minimum requirements in the PR.
|
||||||
You can run the headless browser manually to see what errors it is generating (replace `c:\dev\data` with the path to a dummy folder you've created on your system):
|
|
||||||
|
|
||||||
**Mac**
|
|
||||||
`headless_shell --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`
|
|
||||||
|
|
||||||
**Linux**
|
|
||||||
`headless_shell --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`
|
|
||||||
|
|
||||||
**Windows**
|
|
||||||
`headless_shell.exe --disable-translate --disable-extensions --disable-background-networking --safebrowsing-disable-auto-update --disable-sync --metrics-recording-only --disable-default-apps --mute-audio --no-first-run --disable-gpu --no-sandbox --headless --hide-scrollbars --window-size=400,400 --remote-debugging-port=9221 https://example.com/`
|
|
||||||
|
|
||||||
On Windows, you can use IE to open `http://localhost:9221` and see if the page loads. On Mac/Linux you can just curl the JSON endpoints: `curl http://localhost:9221/json/list`.
|
|
||||||
|
|
||||||
## Resources
|
## Resources
|
||||||
|
|
||||||
|
@@ -225,8 +116,6 @@ The following links provide helpful context about how the Chromium build works,
|
||||||
|
|
||||||
- Tools for Chromium version information: https://omahaproxy.appspot.com/
|
- Tools for Chromium version information: https://omahaproxy.appspot.com/
|
||||||
- https://www.chromium.org/developers/how-tos/get-the-code/working-with-release-branches
|
- https://www.chromium.org/developers/how-tos/get-the-code/working-with-release-branches
|
||||||
- https://chromium.googlesource.com/chromium/src/+/HEAD/docs/windows_build_instructions.md
|
|
||||||
- https://chromium.googlesource.com/chromium/src/+/HEAD/docs/mac_build_instructions.md
|
|
||||||
- https://chromium.googlesource.com/chromium/src/+/HEAD/docs/linux/build_instructions.md
|
- https://chromium.googlesource.com/chromium/src/+/HEAD/docs/linux/build_instructions.md
|
||||||
- Some build-flag descriptions: https://www.chromium.org/developers/gn-build-configuration
|
- Some build-flag descriptions: https://www.chromium.org/developers/gn-build-configuration
|
||||||
- The serverless Chromium project was indispensable: https://github.com/adieuadieu/serverless-chrome/blob/b29445aa5a96d031be2edd5d1fc8651683bf262c/packages/lambda/builds/chromium/build/build.sh
|
- The serverless Chromium project was indispensable: https://github.com/adieuadieu/serverless-chrome/blob/b29445aa5a96d031be2edd5d1fc8651683bf262c/packages/lambda/builds/chromium/build/build.sh
|
||||||
|
|
|
@@ -6,7 +6,7 @@ from build_util import (
|
||||||
md5_file,
|
md5_file,
|
||||||
)
|
)
|
||||||
|
|
||||||
# This file builds Chromium headless on Windows, Mac, and Linux.
|
# This file builds Chromium headless on Linux.
|
||||||
|
|
||||||
# Verify that we have an argument, and if not print instructions
|
# Verify that we have an argument, and if not print instructions
|
||||||
if (len(sys.argv) < 2):
|
if (len(sys.argv) < 2):
|
||||||
|
@@ -76,7 +76,7 @@ print('Setting up build directory')
|
||||||
runcmd('rm -rf out/headless')
|
runcmd('rm -rf out/headless')
|
||||||
runcmd('mkdir out/headless')
|
runcmd('mkdir out/headless')
|
||||||
|
|
||||||
# Copy build args/{Linux | Darwin | Windows}.gn from the root of our directory to out/headless/args.gn,
|
# Copy args.gn from the root of our directory to out/headless/args.gn,
|
||||||
# add the target_cpu for cross-compilation
|
# add the target_cpu for cross-compilation
|
||||||
print('Adding target_cpu to args')
|
print('Adding target_cpu to args')
|
||||||
argsgn_file_out = path.abspath('out/headless/args.gn')
|
argsgn_file_out = path.abspath('out/headless/args.gn')
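The copy-and-append step can be sketched as follows (a simplified assumption of what `build.py` does, not its exact code):

```python
from os import path

def write_build_args(source_args: str, out_dir: str, arch_name: str) -> None:
    # Copy the checked-in args.gn into the build directory and append
    # target_cpu so cross-compilation (e.g. arm64 on an x64 host) works.
    with open(source_args) as f:
        content = f.read()
    with open(path.join(out_dir, 'args.gn'), 'w') as f:
        f.write(content)
        f.write('\ntarget_cpu = "{}"\n'.format(arch_name))
```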
|
||||||
|
@@ -89,7 +89,7 @@ runcmd('gn gen out/headless')
|
||||||
print('Compiling... this will take a while')
|
print('Compiling... this will take a while')
|
||||||
runcmd('autoninja -C out/headless headless_shell')
|
runcmd('autoninja -C out/headless headless_shell')
|
||||||
|
|
||||||
# Optimize the output on Linux x64 and Mac by stripping inessentials from the binary
|
# Optimize the output on Linux x64 by stripping inessentials from the binary
|
||||||
# ARM must be cross-compiled from Linux and cannot read the ARM binary in order to strip it
|
# ARM must be cross-compiled from Linux and cannot read the ARM binary in order to strip it
|
||||||
if platform.system() != 'Windows' and arch_name != 'arm64':
|
if platform.system() != 'Windows' and arch_name != 'arm64':
|
||||||
print('Optimizing headless_shell')
|
print('Optimizing headless_shell')
|
||||||
|
@@ -112,30 +112,10 @@ def archive_file(name):
|
||||||
archive.write(from_path, to_path)
|
archive.write(from_path, to_path)
|
||||||
return to_path
|
return to_path
|
||||||
|
|
||||||
# Each platform has slightly different requirements for what dependencies
|
# Add dependencies that must be bundled with the Chromium executable.
|
||||||
# must be bundled with the Chromium executable.
|
archive_file('headless_shell')
|
||||||
if platform.system() == 'Linux':
|
archive_file(path.join('swiftshader', 'libEGL.so'))
|
||||||
archive_file('headless_shell')
|
archive_file(path.join('swiftshader', 'libGLESv2.so'))
|
||||||
archive_file(path.join('swiftshader', 'libEGL.so'))
|
|
||||||
archive_file(path.join('swiftshader', 'libGLESv2.so'))
|
|
||||||
|
|
||||||
if arch_name == 'arm64':
|
|
||||||
archive_file(path.join('swiftshader', 'libEGL.so'))
|
|
||||||
|
|
||||||
elif platform.system() == 'Windows':
|
|
||||||
archive_file('headless_shell.exe')
|
|
||||||
archive_file('dbghelp.dll')
|
|
||||||
archive_file('icudtl.dat')
|
|
||||||
archive_file(path.join('swiftshader', 'libEGL.dll'))
|
|
||||||
archive_file(path.join('swiftshader', 'libEGL.dll.lib'))
|
|
||||||
archive_file(path.join('swiftshader', 'libGLESv2.dll'))
|
|
||||||
archive_file(path.join('swiftshader', 'libGLESv2.dll.lib'))
|
|
||||||
|
|
||||||
elif platform.system() == 'Darwin':
|
|
||||||
archive_file('headless_shell')
|
|
||||||
archive_file('libswiftshader_libEGL.dylib')
|
|
||||||
archive_file('libswiftshader_libGLESv2.dylib')
|
|
||||||
archive_file(path.join('Helpers', 'chrome_crashpad_handler'))
|
|
||||||
|
|
||||||
archive.close()
|
archive.close()
|
||||||
|
|
||||||
|
|
|
@@ -1,33 +0,0 @@
|
||||||
# Based on //build/headless.gn
|
|
||||||
|
|
||||||
# Embed resource.pak into binary to simplify deployment.
|
|
||||||
headless_use_embedded_resources = true
|
|
||||||
|
|
||||||
# In order to simplify deployment we build the ICU data file
|
|
||||||
# into the binary.
|
|
||||||
icu_use_data_file = false
|
|
||||||
|
|
||||||
# Use embedded data instead of external files for headless in order
|
|
||||||
# to simplify deployment.
|
|
||||||
v8_use_external_startup_data = false
|
|
||||||
|
|
||||||
enable_nacl = false
|
|
||||||
enable_print_preview = false
|
|
||||||
enable_basic_printing = false
|
|
||||||
enable_remoting = false
|
|
||||||
use_alsa = false
|
|
||||||
use_cups = false
|
|
||||||
use_dbus = false
|
|
||||||
use_gio = false
|
|
||||||
use_libpci = false
|
|
||||||
use_pulseaudio = false
|
|
||||||
use_udev = false
|
|
||||||
|
|
||||||
is_debug = false
|
|
||||||
symbol_level = 0
|
|
||||||
is_component_build = false
|
|
||||||
|
|
||||||
# Please, consult @elastic/kibana-security before changing/removing this option.
|
|
||||||
use_kerberos = false
|
|
||||||
|
|
||||||
# target_cpu is appended before build: "x64" or "arm64"
|
|
|
@@ -1,5 +0,0 @@
|
||||||
#!/bin/bash
|
|
||||||
|
|
||||||
# Launch the cross-platform init script using a relative path
|
|
||||||
# from this script's location.
|
|
||||||
python "`dirname "$0"`/../init.py"
|
|
|
@@ -3,11 +3,9 @@ from os import path
|
||||||
from build_util import runcmd, mkdir
|
from build_util import runcmd, mkdir
|
||||||
|
|
||||||
# This is a cross-platform initialization script which should only be run
|
# This is a cross-platform initialization script which should only be run
|
||||||
# once per environment, and isn't intended to be run directly. You should
|
# once per environment.
|
||||||
# run the appropriate platform init script (e.g. Linux/init.sh) which will
|
|
||||||
# call this once the platform-specific initialization has completed.
|
|
||||||
|
|
||||||
# Set to "arm64" to build for ARM on Linux
|
# Set to "arm64" to build for ARM
|
||||||
arch_name = sys.argv[1] if len(sys.argv) >= 2 else 'undefined'
|
arch_name = sys.argv[1] if len(sys.argv) >= 2 else 'undefined'
|
||||||
build_path = path.abspath(os.curdir)
|
build_path = path.abspath(os.curdir)
|
||||||
src_path = path.abspath(path.join(build_path, 'chromium', 'src'))
|
src_path = path.abspath(path.join(build_path, 'chromium', 'src'))
|
||||||
|
@@ -23,7 +21,6 @@ runcmd('git config --global branch.autosetuprebase always')
|
||||||
runcmd('git config --global core.compression 0')
|
runcmd('git config --global core.compression 0')
|
||||||
|
|
||||||
# Grab Chromium's custom build tools, if they aren't already installed
|
# Grab Chromium's custom build tools, if they aren't already installed
|
||||||
# (On Windows, they are installed before this Python script is run)
|
|
||||||
# Put depot_tools on the path so we can properly run the fetch command
|
# Put depot_tools on the path so we can properly run the fetch command
|
||||||
if not path.isdir('depot_tools'):
|
if not path.isdir('depot_tools'):
|
||||||
print('Installing depot_tools...')
|
print('Installing depot_tools...')
|
||||||
|
|
|
@@ -1,13 +0,0 @@
|
||||||
#!/bin/bash
|
|
||||||
|
|
||||||
# Initializes a Linux environment. This need only be done once per
|
|
||||||
# machine. The OS needs to be a flavor that supports apt get, such as Ubuntu.
|
|
||||||
|
|
||||||
if ! [ -x "$(command -v python)" ]; then
|
|
||||||
echo "Installing Python"
|
|
||||||
sudo apt-get --assume-yes install python
|
|
||||||
fi
|
|
||||||
|
|
||||||
# Launch the cross-platform init script using a relative path
|
|
||||||
# from this script's location.
|
|
||||||
python "`dirname "$0"`/../init.py" $1
|
|
|
@@ -1,29 +0,0 @@
|
||||||
# Based on //build/headless.gn
|
|
||||||
|
|
||||||
# Embed resource.pak into binary to simplify deployment.
|
|
||||||
headless_use_embedded_resources = true
|
|
||||||
|
|
||||||
# Use embedded data instead of external files for headless in order
|
|
||||||
# to simplify deployment.
|
|
||||||
v8_use_external_startup_data = false
|
|
||||||
|
|
||||||
enable_nacl = false
|
|
||||||
enable_print_preview = false
|
|
||||||
enable_basic_printing = false
|
|
||||||
enable_remoting = false
|
|
||||||
use_alsa = false
|
|
||||||
use_cups = false
|
|
||||||
use_dbus = false
|
|
||||||
use_gio = false
|
|
||||||
use_libpci = false
|
|
||||||
use_pulseaudio = false
|
|
||||||
use_udev = false
|
|
||||||
|
|
||||||
is_debug = false
|
|
||||||
symbol_level = 0
|
|
||||||
is_component_build = false
|
|
||||||
|
|
||||||
# Please, consult @elastic/kibana-security before changing/removing this option.
|
|
||||||
use_kerberos = false
|
|
||||||
|
|
||||||
# target_cpu is appended before build: "x64" or "arm64"
|
|
|
@@ -1,32 +0,0 @@
|
||||||
: This only needs to be run once per environment to set it up.
|
|
||||||
: This requires a GUI, as the VS installation is graphical.
|
|
||||||
: If initialization fails, you can simply run the `install_vs.exe`
|
|
||||||
|
|
||||||
@echo off
|
|
||||||
|
|
||||||
: Install Visual Studio (this requires user interaction, and takes quite a while)
|
|
||||||
: Most of the subsequent commands can be run in parallel with this (downloading, unzipping,
|
|
||||||
: grabbing the source, etc). This must be completed before building, though.
|
|
||||||
@echo "Installing Visual Studio"
|
|
||||||
|
|
||||||
powershell -command "& {iwr -outf c:\chromium\install_vs.exe https://download.visualstudio.microsoft.com/download/pr/f9c35424-ffad-4b44-bb8f-d4e3968e90ce/f75403c967456e32e758ef558957f345/vs_community.exe}"
|
|
||||||
|
|
||||||
install_vs.exe --add Microsoft.VisualStudio.Workload.NativeDesktop --add Microsoft.VisualStudio.Component.VC.ATLMFC --includeRecommended
|
|
||||||
|
|
||||||
: Install Chromium's custom build tools
|
|
||||||
@echo "Installing Chromium build tools"
|
|
||||||
|
|
||||||
powershell -command "& {iwr -outf %~dp0../../depot_tools.zip https://storage.googleapis.com/chrome-infra/depot_tools.zip}"
|
|
||||||
powershell -command "& {Expand-Archive %~dp0../../depot_tools.zip -DestinationPath %~dp0../../depot_tools}"
|
|
||||||
|
|
||||||
: Set the environment variables required by depot_tools
|
|
||||||
@echo "When Visual Studio is installed, you need to enable the Windows SDK in Control Panel. After that, press <enter> here to continue initialization"
|
|
||||||
|
|
||||||
pause
|
|
||||||
|
|
||||||
SETX PATH "%~dp0..\..\depot_tools;%path%"
|
|
||||||
SETX DEPOT_TOOLS_WIN_TOOLCHAIN 0
|
|
||||||
|
|
||||||
call gclient
|
|
||||||
|
|
||||||
python %~dp0../init.py
|
|
|
@@ -302,7 +302,7 @@ export class HeadlessChromiumDriver {
|
||||||
// Even though 3xx redirects go through our request
|
// Even though 3xx redirects go through our request
|
||||||
// handler, we should probably inspect responses just to
|
// handler, we should probably inspect responses just to
|
||||||
// avoid being bamboozled by some malicious request
|
// avoid being bamboozled by some malicious request
|
||||||
this.page.on('response', (interceptedResponse: puppeteer.Response) => {
|
this.page.on('response', (interceptedResponse: puppeteer.HTTPResponse) => {
|
||||||
const interceptedUrl = interceptedResponse.url();
|
const interceptedUrl = interceptedResponse.url();
|
||||||
const allowed = !interceptedUrl.startsWith('file://');
|
const allowed = !interceptedUrl.startsWith('file://');
|
||||||
|
|
||||||
|
|
|
@@ -46,6 +46,8 @@ export const args = ({ userDataDir, viewport, disableSandbox, proxy: proxyConfig
|
||||||
// The viewport may later need to be resized depending on the position of the clip area.
|
// The viewport may later need to be resized depending on the position of the clip area.
|
||||||
// These numbers come from the job parameters, so this is a close guess.
|
// These numbers come from the job parameters, so this is a close guess.
|
||||||
`--window-size=${Math.floor(viewport.width)},${Math.floor(viewport.height)}`,
|
`--window-size=${Math.floor(viewport.width)},${Math.floor(viewport.height)}`,
|
||||||
|
// allow screenshot clip region to go outside of the viewport
|
||||||
|
`--mainFrameClipsContent=false`,
|
||||||
];
|
];
|
||||||
|
|
||||||
if (proxyConfig.enabled) {
|
if (proxyConfig.enabled) {
|
||||||
|
|
|
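The hunk above floors the viewport dimensions so the `--window-size` flag gets integer pixel values, and adds the new flag that lets the screenshot clip region extend past the viewport. A small sketch of that arg-building step (the viewport shape is assumed from the job parameters; this is an illustration, not the module itself):

```typescript
// Build the two launch flags added/shown in the diff above.
function windowSizeArgs(viewport: { width: number; height: number }): string[] {
  return [
    // floor because the job parameters may carry fractional dimensions
    `--window-size=${Math.floor(viewport.width)},${Math.floor(viewport.height)}`,
    // allow the screenshot clip region to go outside of the viewport
    '--mainFrameClipsContent=false',
  ];
}
```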
@@ -89,7 +89,7 @@ export class HeadlessChromiumDriverFactory {
     const versionInfo = await client.send('Browser.getVersion');
     logger.debug(`Browser version: ${JSON.stringify(versionInfo)}`);

-    await page.emulateTimezone(browserTimezone ?? null);
+    await page.emulateTimezone(browserTimezone);

     // Set the default timeout for all navigation methods to the openUrl timeout (30 seconds)
     // All waitFor methods have their own timeout config passed in to them
@@ -173,7 +173,7 @@ export class HeadlessChromiumDriverFactory {
       })
     );

-    const pageRequestFailed$ = Rx.fromEvent<puppeteer.Request>(page, 'requestfailed').pipe(
+    const pageRequestFailed$ = Rx.fromEvent<puppeteer.HTTPRequest>(page, 'requestfailed').pipe(
       map((req) => {
         const failure = req.failure && req.failure();
         if (failure) {
@@ -99,8 +99,9 @@ export const browserStartLogs = (
   );

   const error$ = fromEvent(browserProcess, 'error').pipe(
-    map(() => {
+    map((err) => {
       logger.error(`Browser process threw an error on startup`);
+      logger.error(err as string | Error);
       return i18n.translate('xpack.reporting.diagnostic.browserErrored', {
         defaultMessage: `Browser process threw an error on startup`,
       });
@@ -16,44 +16,62 @@ interface PackageInfo {
   binaryRelativePath: string;
 }

-// We download zip files from a Kibana team GCS bucket named `headless_shell`
 enum BaseUrl {
+  // see https://www.chromium.org/getting-involved/download-chromium
+  common = 'https://commondatastorage.googleapis.com/chromium-browser-snapshots',
+  // A GCS bucket under the Kibana team
   custom = 'https://storage.googleapis.com/headless_shell',
 }

+interface CustomPackageInfo extends PackageInfo {
+  location: 'custom';
+}
+interface CommonPackageInfo extends PackageInfo {
+  location: 'common';
+  archivePath: string;
+}
+
 export class ChromiumArchivePaths {
-  public readonly packages: PackageInfo[] = [
+  public readonly revision = '856583';
+
+  public readonly packages: Array<CustomPackageInfo | CommonPackageInfo> = [
     {
       platform: 'darwin',
       architecture: 'x64',
-      archiveFilename: 'chromium-ef768c9-darwin_x64.zip',
-      archiveChecksum: 'd87287f6b2159cff7c64babac873cc73',
-      binaryChecksum: '8d777b3380a654e2730fc36afbfb11e1',
-      binaryRelativePath: 'headless_shell-darwin_x64/headless_shell',
+      archiveFilename: 'chrome-mac.zip',
+      archiveChecksum: '6aad6fa5a26d83e24e2f0d52de5230bf',
+      binaryChecksum: '2dc7a7250d849df4cab60f3b4a70c1ea',
+      binaryRelativePath: 'chrome-mac/Chromium.app/Contents/MacOS/Chromium',
+      location: 'common',
+      archivePath: 'Mac',
     },
     {
       platform: 'linux',
       architecture: 'x64',
-      archiveFilename: 'chromium-ef768c9-linux_x64.zip',
-      archiveChecksum: '85575e8fd56849f4de5e3584e05712c0',
-      binaryChecksum: '38c4d849c17683def1283d7e5aa56fe9',
+      archiveFilename: 'chromium-d163fd7-linux_x64.zip',
+      archiveChecksum: 'fba0a240d409228a3494aef415c300fc',
+      binaryChecksum: '99cfab472d516038b94ef86649e52871',
       binaryRelativePath: 'headless_shell-linux_x64/headless_shell',
+      location: 'custom',
     },
     {
       platform: 'linux',
       architecture: 'arm64',
-      archiveFilename: 'chromium-ef768c9-linux_arm64.zip',
-      archiveChecksum: '20b09b70476bea76a276c583bf72eac7',
-      binaryChecksum: 'dcfd277800c1a5c7d566c445cbdc225c',
+      archiveFilename: 'chromium-d163fd7-linux_arm64.zip',
+      archiveChecksum: '29834735bc2f0e0d9134c33bc0580fb6',
+      binaryChecksum: '13baccf2e5c8385cb9d9588db6a9e2c2',
       binaryRelativePath: 'headless_shell-linux_arm64/headless_shell',
+      location: 'custom',
     },
     {
       platform: 'win32',
       architecture: 'x64',
-      archiveFilename: 'chromium-ef768c9-windows_x64.zip',
-      archiveChecksum: '33301c749b5305b65311742578c52f15',
-      binaryChecksum: '9f28dd56c7a304a22bf66f0097fa4de9',
-      binaryRelativePath: 'headless_shell-windows_x64\\headless_shell.exe',
+      archiveFilename: 'chrome-win.zip',
+      archiveChecksum: '64999a384bfb6c96c50c4cb6810dbc05',
+      binaryChecksum: '13b8bbb4a12f9036b8cc3b57b3a71fec',
+      binaryRelativePath: 'chrome-win\\chrome.exe',
+      location: 'common',
+      archivePath: 'Win',
     },
   ];

@@ -72,8 +90,11 @@ export class ChromiumArchivePaths {
     return this.packages.map((p) => this.resolvePath(p));
   }

-  public getDownloadUrl(p: PackageInfo) {
-    return BaseUrl.custom + `/${p.archiveFilename}`;
+  public getDownloadUrl(p: CustomPackageInfo | CommonPackageInfo) {
+    if (p.location === 'common') {
+      return `${BaseUrl.common}/${p.archivePath}/${this.revision}/${p.archiveFilename}`;
+    }
+    return BaseUrl.custom + '/' + p.archiveFilename;
   }

   public getBinaryPath(p: PackageInfo) {
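After this change, download URLs come from two buckets: the public Chromium snapshot bucket (Mac and Windows) and the Kibana team's `headless_shell` bucket (Linux). A standalone sketch of that resolution, using the bucket URLs and package fields shown in the diff:

```typescript
// Bucket base URLs as shown in the diff above.
const BaseUrl = {
  common: 'https://commondatastorage.googleapis.com/chromium-browser-snapshots',
  custom: 'https://storage.googleapis.com/headless_shell',
};

interface Pkg {
  archiveFilename: string;
  location: 'common' | 'custom';
  archivePath?: string; // only present for 'common' packages
}

function getDownloadUrl(pkg: Pkg, revision: string): string {
  if (pkg.location === 'common') {
    // Mac/Windows: public Chromium snapshot bucket, keyed by platform and revision
    return `${BaseUrl.common}/${pkg.archivePath}/${revision}/${pkg.archiveFilename}`;
  }
  // Linux: custom headless_shell build from the Kibana team bucket
  return `${BaseUrl.custom}/${pkg.archiveFilename}`;
}
```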
@@ -40,16 +40,14 @@ export function installBrowser(

   if (binaryChecksum !== pkg.binaryChecksum) {
     await ensureBrowserDownloaded(logger);
+    await del(chromiumPath);

     const archive = path.join(paths.archivesPath, pkg.archiveFilename);
-    logger.info(`Extracting [${archive}] to [${binaryPath}]`);
+    logger.info(`Extracting [${archive}] to [${chromiumPath}]`);
-
-    await del(chromiumPath);
     await extract(archive, chromiumPath);
   }

-  logger.debug(`Browser executable: ${binaryPath}`);
+  logger.info(`Browser executable: ${binaryPath}`);

   binaryPath$.next(binaryPath); // subscribers wait for download and extract to complete
 };
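The install step above re-downloads and re-extracts the archive whenever the on-disk binary's checksum differs from the expected one for the new Chromium revision. The checksums in this PR are md5 digests; a minimal sketch of that comparison (helper names are illustrative):

```typescript
import { createHash } from 'crypto';

// Compute an md5 hex digest, as used for the archive/binary checksums above.
function md5(data: Buffer | string): string {
  return createHash('md5').update(data).digest('hex');
}

// The browser needs (re)installing when the digest does not match.
function needsReinstall(binaryData: Buffer | string, expectedChecksum: string): boolean {
  return md5(binaryData) !== expectedChecksum;
}
```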
@@ -9,7 +9,6 @@ import { APP_WRAPPER_CLASS } from '../../../../../../src/core/server';
 export const DEFAULT_PAGELOAD_SELECTOR = `.${APP_WRAPPER_CLASS}`;

 export const CONTEXT_GETNUMBEROFITEMS = 'GetNumberOfItems';
-export const CONTEXT_GETBROWSERDIMENSIONS = 'GetBrowserDimensions';
 export const CONTEXT_INJECTCSS = 'InjectCss';
 export const CONTEXT_WAITFORRENDER = 'WaitForRender';
 export const CONTEXT_GETTIMERANGE = 'GetTimeRange';
@@ -10,54 +10,6 @@ import { LevelLogger, startTrace } from '../';
 import { HeadlessChromiumDriver } from '../../browsers';
 import { LayoutInstance } from '../layouts';
 import { ElementsPositionAndAttribute, Screenshot } from './';
-import { CONTEXT_GETBROWSERDIMENSIONS } from './constants';
-
-// In Puppeteer 5.4+, the viewport size limits what the screenshot can take, even if a clip is specified. The clip area must
-// be visible in the viewport. This workaround resizes the viewport to the actual content height and width.
-// NOTE: this will fire a window resize event
-const resizeToClipArea = async (
-  item: ElementsPositionAndAttribute,
-  browser: HeadlessChromiumDriver,
-  zoom: number,
-  logger: LevelLogger
-) => {
-  // Check current viewport size
-  const { width, height, left, top } = item.position.boundingClientRect; // the "unscaled" pixel sizes
-  const [viewWidth, viewHeight] = await browser.evaluate(
-    {
-      fn: () => [document.body.clientWidth, document.body.clientHeight],
-      args: [],
-    },
-    { context: CONTEXT_GETBROWSERDIMENSIONS },
-    logger
-  );
-
-  logger.debug(`Browser viewport: width=${viewWidth} height=${viewHeight}`);
-
-  // Resize the viewport if the clip area is not visible
-  if (viewWidth < width + left || viewHeight < height + top) {
-    logger.debug(`Item's position is not within the viewport.`);
-
-    // add left and top margin to unscaled measurements
-    const newWidth = width + left;
-    const newHeight = height + top;
-
-    logger.debug(
-      `Resizing browser viewport to: width=${newWidth} height=${newHeight} zoom=${zoom}`
-    );
-
-    await browser.setViewport(
-      {
-        width: newWidth,
-        height: newHeight,
-        zoom,
-      },
-      logger
-    );
-  }
-
-  logger.debug(`Capturing item: width=${width} height=${height} left=${left} top=${top}`);
-};
-
 export const getScreenshots = async (
   browser: HeadlessChromiumDriver,
@@ -77,7 +29,6 @@ export const getScreenshots = async (
     const endTrace = startTrace('get_screenshots', 'read');
     const item = elementsPositionAndAttributes[i];

-    await resizeToClipArea(item, browser, layout.getBrowserZoom(), logger);
     const base64EncodedData = await browser.screenshot(item.position);

     if (!base64EncodedData) {
@@ -341,8 +341,6 @@ describe('Screenshot Observable Pipeline', () => {

         if (mockCall === contexts.CONTEXT_ELEMENTATTRIBUTES) {
           return Promise.resolve(null);
-        } else if (mockCall === contexts.CONTEXT_GETBROWSERDIMENSIONS) {
-          return Promise.resolve([800, 600]);
         } else {
           return Promise.resolve();
         }
@@ -65,9 +65,6 @@ mockBrowserEvaluate.mockImplementation(() => {
   if (mockCall === contexts.CONTEXT_GETNUMBEROFITEMS) {
     return Promise.resolve(1);
   }
-  if (mockCall === contexts.CONTEXT_GETBROWSERDIMENSIONS) {
-    return Promise.resolve([600, 800]);
-  }
   if (mockCall === contexts.CONTEXT_INJECTCSS) {
     return Promise.resolve();
   }
@@ -40,8 +40,7 @@ export async function checkIfPngsMatch(
     log.debug(`writeFile: ${baselineCopyPath}`);
     await fs.writeFile(baselineCopyPath, await fs.readFile(baselinepngPath));
   } catch (error) {
-    log.error(`No baseline png found at ${baselinepngPath}`);
-    return 0;
+    throw new Error(`No baseline png found at ${baselinepngPath}`);
   }
   log.debug(`writeFile: ${actualCopyPath}`);
   await fs.writeFile(actualCopyPath, await fs.readFile(actualpngPath));
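`checkIfPngsMatch` now throws when the baseline is missing rather than silently reporting a 0% difference; the functional tests then assert the returned percent difference against a threshold (e.g. `lessThan(0.09)`). A toy sketch of a percent-difference computation over two equally sized pixel buffers (illustrative only — the real comparison is a perceptual image diff, not a byte diff):

```typescript
// Fraction of mismatched bytes between two equally sized buffers.
// Assumes a.length === b.length; returns a value in [0, 1].
function percentDifferent(a: Uint8Array, b: Uint8Array): number {
  let mismatched = 0;
  for (let i = 0; i < a.length; i++) {
    if (a[i] !== b[i]) mismatched++;
  }
  return mismatched / a.length;
}

// A report would pass the 0.09 threshold if percentDifferent(...) < 0.09.
```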
Binary file not shown. (After: 6.2 MiB)
Binary file not shown. (Before: 752 KiB | After: 752 KiB)
|
@ -120,7 +120,6 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
|
||||||
});
|
});
|
||||||
|
|
||||||
describe('PNG Layout', () => {
|
describe('PNG Layout', () => {
|
||||||
it('downloads a PNG file', async function () {
|
|
||||||
const writeSessionReport = async (name: string, rawPdf: Buffer, reportExt: string) => {
|
const writeSessionReport = async (name: string, rawPdf: Buffer, reportExt: string) => {
|
||||||
const sessionDirectory = path.resolve(REPORTS_FOLDER, 'session');
|
const sessionDirectory = path.resolve(REPORTS_FOLDER, 'session');
|
||||||
await mkdirAsync(sessionDirectory, { recursive: true });
|
await mkdirAsync(sessionDirectory, { recursive: true });
|
||||||
|
@ -135,6 +134,7 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
|
||||||
return fullPath;
|
return fullPath;
|
||||||
};
|
};
|
||||||
|
|
||||||
|
it('downloads a PNG file: small dashboard', async function () {
|
||||||
this.timeout(300000);
|
this.timeout(300000);
|
||||||
|
|
||||||
await PageObjects.common.navigateToApp('dashboard');
|
await PageObjects.common.navigateToApp('dashboard');
|
||||||
|
@@ -146,7 +146,31 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {

       const url = await PageObjects.reporting.getReportURL(60000);
       const reportData = await PageObjects.reporting.getRawPdfReportData(url);
-      const reportFileName = 'dashboard_preserve_layout';
+      const reportFileName = 'small_dashboard_preserve_layout';
+      const sessionReportPath = await writeSessionReport(reportFileName, reportData, 'png');
+      const percentDiff = await checkIfPngsMatch(
+        sessionReportPath,
+        getBaselineReportPath(reportFileName, 'png'),
+        config.get('screenshots.directory'),
+        log
+      );
+
+      expect(percentDiff).to.be.lessThan(0.09);
+    });
+
+    it('downloads a PNG file: large dashboard', async function () {
+      this.timeout(300000);
+
+      await PageObjects.common.navigateToApp('dashboard');
+      await PageObjects.dashboard.loadSavedDashboard('Large Dashboard');
+      await PageObjects.reporting.openPngReportingPanel();
+      await PageObjects.reporting.forceSharedItemsContainerSize({ width: 1405 });
+      await PageObjects.reporting.clickGenerateReportButton();
+      await PageObjects.reporting.removeForceSharedItemsContainerSize();
+
+      const url = await PageObjects.reporting.getReportURL(200000);
+      const reportData = await PageObjects.reporting.getRawPdfReportData(url);
+      const reportFileName = 'large_dashboard_preserve_layout';
       const sessionReportPath = await writeSessionReport(reportFileName, reportData, 'png');
       const percentDiff = await checkIfPngsMatch(
         sessionReportPath,
@@ -160,9 +184,7 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
     });

   describe('Preserve Layout', () => {
-    it('downloads a PDF file', async function () {
-      // Generating and then comparing reports can take longer than the default 60s timeout because the comparePngs
-      // function is taking about 15 seconds per comparison in jenkins.
+    it('downloads a PDF file: small dashboard', async function () {
       this.timeout(300000);
       await PageObjects.common.navigateToApp('dashboard');
       await PageObjects.dashboard.loadSavedDashboard('Ecom Dashboard');
@@ -176,10 +198,22 @@ export default function ({ getPageObjects, getService }: FtrProviderContext) {
       expect(res.get('content-type')).to.equal('application/pdf');
     });

+    it('downloads a PDF file: large dashboard', async function () {
+      this.timeout(300000);
+      await PageObjects.common.navigateToApp('dashboard');
+      await PageObjects.dashboard.loadSavedDashboard('Large Dashboard');
+      await PageObjects.reporting.openPdfReportingPanel();
+      await PageObjects.reporting.clickGenerateReportButton();
+
+      const url = await PageObjects.reporting.getReportURL(60000);
+      const res = await PageObjects.reporting.getResponse(url);
+
+      expect(res.status).to.equal(200);
+      expect(res.get('content-type')).to.equal('application/pdf');
+    });
+
     it('downloads a PDF file with saved search given EuiDataGrid enabled', async function () {
       await kibanaServer.uiSettings.replace({ 'doc_table:legacy': false });
-      // Generating and then comparing reports can take longer than the default 60s timeout because the comparePngs
-      // function is taking about 15 seconds per comparison in jenkins.
       this.timeout(300000);
       await PageObjects.common.navigateToApp('dashboard');
       await PageObjects.dashboard.loadSavedDashboard('Ecom Dashboard');
File diff suppressed because one or more lines are too long

yarn.lock (31 lines changed)
@@ -5599,13 +5599,6 @@
   resolved "https://registry.yarnpkg.com/@types/proper-lockfile/-/proper-lockfile-3.0.1.tgz#dd770a2abce3adbcce3bd1ed892ce2f5f17fbc86"
   integrity sha512-ODOjqxmaNs0Zkij+BJovsNJRSX7BJrr681o8ZnNTNIcTermvVFzLpz/XFtfg3vNrlPVTJY1l4e9h2LvHoxC1lg==

-"@types/puppeteer@^5.4.1":
-  version "5.4.1"
-  resolved "https://registry.yarnpkg.com/@types/puppeteer/-/puppeteer-5.4.1.tgz#8d0075ad7705e8061b06df6a9a3abc6ca5fb7cd9"
-  integrity sha512-mEytIRrqvsFgs16rHOa5jcZcoycO/NSjg1oLQkFUegj3HOHeAP1EUfRi+eIsJdGrx2oOtfN39ckibkRXzs+qXA==
-  dependencies:
-    "@types/node" "*"
-
 "@types/q@^1.5.1":
   version "1.5.4"
   resolved "https://registry.yarnpkg.com/@types/q/-/q-1.5.4.tgz#15925414e0ad2cd765bfef58842f7e26a7accb24"
@@ -11654,16 +11647,16 @@ detective@^5.0.2, detective@^5.2.0:
     defined "^1.0.0"
     minimist "^1.1.1"

-devtools-protocol@0.0.809251:
-  version "0.0.809251"
-  resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.809251.tgz#300b3366be107d5c46114ecb85274173e3999518"
-  integrity sha512-pf+2OY6ghMDPjKkzSWxHMq+McD+9Ojmq5XVRYpv/kPd9sTMQxzEt21592a31API8qRjro0iYYOc3ag46qF/1FA==
-
 devtools-protocol@0.0.818844:
   version "0.0.818844"
   resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.818844.tgz#d1947278ec85b53e4c8ca598f607a28fa785ba9e"
   integrity sha512-AD1hi7iVJ8OD0aMLQU5VK0XH9LDlA1+BcPIgrAxPfaibx2DbWucuyOhc4oyQCbnvDDO68nN6/LcKfqTP343Jjg==

+devtools-protocol@0.0.854822:
+  version "0.0.854822"
+  resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.854822.tgz#eac3a5260a6b3b4e729a09fdc0c77b0d322e777b"
+  integrity sha512-xd4D8kHQtB0KtWW0c9xBZD5LVtm9chkMOfs/3Yn01RhT/sFIsVtzTtypfKoFfWBaL+7xCYLxjOLkhwPXaX/Kcg==
+
 dezalgo@^1.0.0:
   version "1.0.3"
   resolved "https://registry.yarnpkg.com/dezalgo/-/dezalgo-1.0.3.tgz#7f742de066fc748bc8db820569dddce49bf0d456"
@@ -22454,19 +22447,19 @@ puppeteer@^5.3.1:
     unbzip2-stream "^1.3.3"
     ws "^7.2.3"

-"puppeteer@npm:@elastic/puppeteer@5.4.1-patch.1":
-  version "5.4.1-patch.1"
-  resolved "https://registry.yarnpkg.com/@elastic/puppeteer/-/puppeteer-5.4.1-patch.1.tgz#61af43ec7df47d1042c8708c386cfa7af76e08f7"
-  integrity sha512-I4JbNmQHZkE72TPNdipND8GnsEBnqzuksxPSAT25qvudShuuzdY9TwNBQ65IJwPD/pjlpx7fUIUmFyvTHwlxhQ==
+puppeteer@^8.0.0:
+  version "8.0.0"
+  resolved "https://registry.yarnpkg.com/puppeteer/-/puppeteer-8.0.0.tgz#a236669118aa795331c2d0ca19877159e7664705"
+  integrity sha512-D0RzSWlepeWkxPPdK3xhTcefj8rjah1791GE82Pdjsri49sy11ci/JQsAO8K2NRukqvwEtcI+ImP5F4ZiMvtIQ==
   dependencies:
     debug "^4.1.0"
-    devtools-protocol "0.0.809251"
+    devtools-protocol "0.0.854822"
     extract-zip "^2.0.0"
-    https-proxy-agent "^4.0.0"
+    https-proxy-agent "^5.0.0"
     node-fetch "^2.6.1"
     pkg-dir "^4.2.0"
     progress "^2.0.1"
-    proxy-from-env "^1.0.0"
+    proxy-from-env "^1.1.0"
     rimraf "^3.0.2"
     tar-fs "^2.0.0"
     unbzip2-stream "^1.3.3"