Puppeteer update v21.5.2 (#172332)

## Summary

This PR updates Puppeteer to v21.5.2.

It also switches the hashing algorithm used to verify the accompanying Chromium
binaries from MD5 to SHA-256, so that the download and install verification
steps are FIPS compliant.
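
For context, verification now amounts to streaming each downloaded archive (and extracted binary) through Node's built-in `crypto` module and comparing the hex digest against the SHA-256 values pinned in `ChromiumArchivePaths`. Below is a minimal sketch of that flow; the helper names `sha256Hex` and `verifyArchive` are illustrative only (the real helper lives in the reporting plugin's `checksum.ts`, shown in the diff below):

```ts
import { createHash } from 'crypto';
import { createReadStream } from 'fs';
import { finished } from 'stream/promises';

// Stream a file through SHA-256 and return the hex digest.
async function sha256Hex(filePath: string): Promise<string> {
  const hash = createHash('sha256');
  const stream = createReadStream(filePath);
  stream.on('data', (chunk) => hash.update(chunk));
  await finished(stream);
  return hash.digest('hex');
}

// Illustrative only: reject an archive whose digest does not match the pinned checksum.
async function verifyArchive(archivePath: string, expectedChecksum: string): Promise<void> {
  const actual = await sha256Hex(archivePath);
  if (actual !== expectedChecksum) {
    throw new Error(`Checksum mismatch for ${archivePath}: expected ${expectedChecksum}, got ${actual}`);
  }
}
```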

Authored by Eyo O. Eyo on 2023-12-05 16:45:51 +01:00, committed by GitHub
Parent: 06b026c114
Commit: 5d85f655b0
14 changed files with 96 additions and 120 deletions

View file

@@ -1018,7 +1018,7 @@
     "prop-types": "^15.8.1",
     "proxy-from-env": "1.0.0",
     "puid": "1.0.7",
-    "puppeteer": "21.3.6",
+    "puppeteer": "21.5.2",
     "query-string": "^6.13.2",
     "rbush": "^3.0.1",
     "re-resizable": "^6.9.9",

View file

@@ -3,7 +3,7 @@ from os import path
 from build_util import (
   runcmd,
   runcmdsilent,
-  md5_file,
+  sha256_file,
 )

 # This file builds Chromium headless on Linux.
@@ -96,11 +96,11 @@ if arch_name != 'arm64':
   shutil.move('out/headless/headless_shell', 'out/headless/headless_shell_raw')
   runcmd('strip -o out/headless/headless_shell out/headless/headless_shell_raw')

-# Create the zip and generate the md5 hash using filenames like:
+# Create the zip and generate the sha256 hash using filenames like:
 # chromium-4747cc2-linux_x64.zip
 base_filename = 'out/headless/chromium-' + base_version + '-locales-' + platform.system().lower() + '_' + arch_name
 zip_filename = base_filename + '.zip'
-md5_filename = base_filename + '.md5'
+sha256_filename = base_filename + '.sha256'

 print('Creating ' + path.join(src_path, zip_filename))
 archive = zipfile.ZipFile(zip_filename, mode='w', compression=zipfile.ZIP_DEFLATED)
@@ -118,9 +118,9 @@ archive.write(en_us_locale_file_path, path.join(path_prefix, 'locales', en_us_lo
 archive.close()

-print('Creating ' + path.join(src_path, md5_filename))
-with open (md5_filename, 'w') as f:
-  f.write(md5_file(zip_filename))
+print('Creating ' + path.join(src_path, sha256_filename))
+with open (sha256_filename, 'w') as f:
+  f.write(sha256_file(zip_filename))

 runcmd('gsutil cp ' + path.join(src_path, zip_filename) + ' gs://headless_shell_staging')
-runcmd('gsutil cp ' + path.join(src_path, md5_filename) + ' gs://headless_shell_staging')
+runcmd('gsutil cp ' + path.join(src_path, sha256_filename) + ' gs://headless_shell_staging')

View file

@@ -20,10 +20,10 @@ def mkdir(dir):
   if not os.path.exists(dir):
     return os.makedirs(dir)

-def md5_file(filename):
-  """Builds a hex md5 hash of the given file"""
-  md5 = hashlib.md5()
+def sha256_file(filename):
+  """Builds a hex sha256 hash of the given file"""
+  sha256 = hashlib.sha256()
   with open(filename, 'rb') as f:
-    for chunk in iter(lambda: f.read(128 * md5.block_size), b''):
-      md5.update(chunk)
-    return md5.hexdigest()
+    for chunk in iter(lambda: f.read(128 * sha256.block_size), b''):
+      sha256.update(chunk)
+    return sha256.hexdigest()

View file

@@ -50,6 +50,8 @@ describe('headless webgl arm mac workaround', () => {
     // if you're updating this, then you're likely updating chromium
     // please double-check that the --use-angle flag is still needed for arm macs
     // instead of --use-angle you may need --enable-gpu
-    expect(getChromiumPackage().binaryChecksum).toBe('361f7cbac5bcac1d9974a43e29bf4bf5'); // just putting this here so that someone updating the chromium version will see this comment
+    expect(getChromiumPackage().binaryChecksum).toBe(
+      'a8556ed7ac2a669fa81f752f7d18a9d1e9b99b05d3504f6bbc08e3e0b02ff71e'
+    ); // just putting this here so that someone updating the chromium version will see this comment
   });
 });

View file

@@ -5,26 +5,12 @@
  * 2.0.
  */

+import { type Protocol } from 'devtools-protocol';
 import type { Metrics as PuppeteerMetrics } from 'puppeteer';
 import { cpus } from 'os';
 import { PerformanceMetrics } from '../../../../common/types';

-declare module 'puppeteer' {
-  interface CDPSession {
-    send(command: 'Performance.getMetrics'): Promise<RawMetrics>;
-  }
-}
-
-type RawMetrics = Metrics;
-
-export interface Metrics {
-  metrics: Metric[];
-}
-
-interface Metric {
-  name: keyof NormalizedMetrics;
-  value: unknown;
-}
+export type Metrics = Protocol.Performance.GetMetricsResponse;

 interface NormalizedMetrics extends Required<PuppeteerMetrics> {
   ProcessTime: number;

View file

@@ -44,10 +44,10 @@ export class ChromiumArchivePaths {
       platform: 'darwin',
       architecture: 'x64',
       archiveFilename: 'chrome-mac.zip',
-      archiveChecksum: '086ffb9d1e248f41f1e385aaea1bb568',
-      binaryChecksum: '58ed6d2bba7773b85aaec1d78b9c1a7b',
+      archiveChecksum: '35261c7a88f1797d27646c340eeaf7d7d70727f0c4ae884e8400240ed66d7192',
+      binaryChecksum: 'ca90fe7573ddb0723d633fe526acf0fdefdda570a549f35e15c111d10f3ffc0d',
       binaryRelativePath: 'chrome-mac/Chromium.app/Contents/MacOS/Chromium',
-      revision: 1181205,
+      revision: 1204244, // 1204232 is not available for Mac Intel
       location: 'common',
       archivePath: 'Mac',
       isPreInstalled: false,
@@ -56,10 +56,10 @@
       platform: 'darwin',
       architecture: 'arm64',
       archiveFilename: 'chrome-mac.zip',
-      archiveChecksum: 'f80b2cb14025e283a740836aa66e46d4',
-      binaryChecksum: '361f7cbac5bcac1d9974a43e29bf4bf5',
+      archiveChecksum: '1ed375086a9505ee6bc9bc1373bebd79e87e5b27af5a93258ea25ffb6f71f03c',
+      binaryChecksum: 'a8556ed7ac2a669fa81f752f7d18a9d1e9b99b05d3504f6bbc08e3e0b02ff71e',
       binaryRelativePath: 'chrome-mac/Chromium.app/Contents/MacOS/Chromium',
-      revision: 1181286, // 1181205 is not available for Mac_Arm
+      revision: 1204255, // 1204232 is not available for Mac_Arm
       location: 'common',
       archivePath: 'Mac_Arm',
       isPreInstalled: false,
@@ -67,22 +67,22 @@
     {
       platform: 'linux',
       architecture: 'x64',
-      archiveFilename: 'chromium-67649b1-locales-linux_x64.zip',
-      archiveChecksum: '21bd8a1e06f236fa405c74d92a7ccd63',
-      binaryChecksum: 'b75d45d3044cc320bb09ce7356003d24',
+      archiveFilename: 'chromium-38c7255-locales-linux_x64.zip',
+      archiveChecksum: 'bf07734366ece771a85b2452fd63e5981b1abc234ef0ed1c7d0774b8a7b5c6a9',
+      binaryChecksum: '87a991c412ad333549a58524b6be23f2a1ff56af61bb1a1b10c1f4a0206edc2a',
       binaryRelativePath: 'headless_shell-linux_x64/headless_shell',
-      revision: 1181205,
+      revision: 1204232,
       location: 'custom',
       isPreInstalled: true,
     },
     {
       platform: 'linux',
       architecture: 'arm64',
-      archiveFilename: 'chromium-67649b1-locales-linux_arm64.zip',
-      archiveChecksum: '0c3b42ada934258b4596f3e984d011e3',
-      binaryChecksum: 'ac521fbc52fb1589416a214ce7b299ee',
+      archiveFilename: 'chromium-38c7255-locales-linux_arm64.zip',
+      archiveChecksum: '11c1cd2398ae3b57a72e7746e1f1cbbd2c2d18d1b83dec949dc81a3c690688f0',
+      binaryChecksum: '4d914034d466b97c438283dbc914230e087217c25028f403dfa3c933ea755e94',
       binaryRelativePath: 'headless_shell-linux_arm64/headless_shell',
-      revision: 1181205,
+      revision: 1204232,
       location: 'custom',
       isPreInstalled: true,
     },
@@ -90,10 +90,10 @@
       platform: 'win32',
       architecture: 'x64',
       archiveFilename: 'chrome-win.zip',
-      archiveChecksum: '08186d7494e75c2cca03270d9a4ff589',
-      binaryChecksum: '1623fed921c9acee7221b2de98abe54e',
+      archiveChecksum: 'd6f5a21973867115435814c2c46d49edd9a0a2ad6da14b4724746374cad80e47',
+      binaryChecksum: '9c0d2404004bd7c4ada649049422de6958460ecf6cec53460a478c6d8c33e444',
       binaryRelativePath: path.join('chrome-win', 'chrome.exe'),
-      revision: 1181280, // 1181205 is not available for win
+      revision: 1204234, // 1204232 is not available for win
       location: 'common',
       archivePath: 'Win',
       isPreInstalled: true,

View file

@@ -9,9 +9,9 @@ jest.mock('fs');
 import { createReadStream, ReadStream } from 'fs';
 import { Readable } from 'stream';
-import { md5 } from './checksum';
+import { sha256 } from './checksum';

-describe('md5', () => {
+describe('sha256', () => {
   let stream: ReadStream;

   beforeEach(() => {
@@ -25,14 +25,16 @@ describe('md5', () => {
     (createReadStream as jest.MockedFunction<typeof createReadStream>).mockReturnValue(stream);
   });

-  it('should return an md5 hash', async () => {
-    await expect(md5('path')).resolves.toBe('437b930db84b8079c2dd804a71936b5f');
+  it('should return a sha256 hash', async () => {
+    expect(await sha256('path')).toMatchInlineSnapshot(
+      `"3fc9b689459d738f8c88a3a48aa9e33542016b7a4052e001aaa536fca74813cb"`
+    );
   });

   it('should reject on stream error', async () => {
     const error = new Error('Some error');
     stream.destroy(error);

-    await expect(md5('path')).rejects.toEqual(error);
+    await expect(sha256('path')).rejects.toEqual(error);
   });
 });

View file

@@ -7,15 +7,14 @@
 import { createHash } from 'crypto';
 import { createReadStream } from 'fs';
-import { finished } from 'stream';
-import { promisify } from 'util';
+import { finished } from 'stream/promises';

-export async function md5(path: string) {
-  const hash = createHash('md5');
+export async function sha256(path: string) {
+  const hash = createHash('sha256');
   const stream = createReadStream(path);

   stream.on('data', (chunk) => hash.update(chunk));
-  await promisify(finished)(stream, { writable: false });
+  await finished(stream);

   return hash.digest('hex');
 }

View file

@@ -41,8 +41,8 @@ describe('fetch', () => {
     expect(readFileSync(TEMP_FILE, 'utf8')).toEqual('foobar');
   });

-  test('returns the md5 hex hash of the http body', async () => {
-    const hash = createHash('md5').update('foobar').digest('hex');
+  test('returns the sha256 hex hash of the http body', async () => {
+    const hash = createHash('sha256').update('foobar').digest('hex');
     await expect(fetch('url', TEMP_FILE)).resolves.toEqual(hash);
   });

View file

@@ -9,8 +9,8 @@ import Axios from 'axios';
 import { createHash } from 'crypto';
 import { closeSync, mkdirSync, openSync, writeSync } from 'fs';
 import { dirname } from 'path';
-import { finished, Readable } from 'stream';
-import { promisify } from 'util';
+import { Readable } from 'stream';
+import { finished } from 'stream/promises';
 import type { Logger } from '@kbn/core/server';

 /**
@@ -19,7 +19,7 @@ import type { Logger } from '@kbn/core/server';
 export async function fetch(url: string, path: string, logger?: Logger): Promise<string> {
   logger?.info(`Downloading ${url} to ${path}`);

-  const hash = createHash('md5');
+  const hash = createHash('sha256');
   mkdirSync(dirname(path), { recursive: true });
   const handle = openSync(path, 'w');
@@ -36,7 +36,7 @@ export async function fetch(url: string, path: string, logger?: Logger): Promise
       hash.update(chunk);
     });

-    await promisify(finished)(response.data, { writable: false });
+    await finished(response.data);
     logger?.info(`Downloaded ${url}`);
   } catch (error) {
     logger?.error(error);

View file

@@ -10,7 +10,7 @@ import mockFs from 'mock-fs';
 import { existsSync, readdirSync } from 'fs';
 import { ChromiumArchivePaths, PackageInfo } from '../chromium';
 import { fetch } from './fetch';
-import { md5 } from './checksum';
+import { sha256 } from './checksum';
 import { download } from '.';

 jest.mock('./checksum');
@@ -24,16 +24,16 @@ describe('ensureDownloaded', () => {
     paths = new ChromiumArchivePaths();
     pkg = paths.find('linux', 'x64') as PackageInfo;

-    (md5 as jest.MockedFunction<typeof md5>).mockImplementation(
+    (sha256 as jest.MockedFunction<typeof sha256>).mockImplementation(
       async (packagePath) =>
         paths.packages.find((packageInfo) => paths.resolvePath(packageInfo) === packagePath)
-          ?.archiveChecksum ?? 'some-md5'
+          ?.archiveChecksum ?? 'some-sha256'
     );

     (fetch as jest.MockedFunction<typeof fetch>).mockImplementation(
       async (_url, packagePath) =>
         paths.packages.find((packageInfo) => paths.resolvePath(packageInfo) === packagePath)
-          ?.archiveChecksum ?? 'some-md5'
+          ?.archiveChecksum ?? 'some-sha256'
     );

     mockFs();
@@ -65,8 +65,8 @@ describe('ensureDownloaded', () => {
     await expect(download(paths, pkg)).rejects.toBeInstanceOf(Error);
   });

-  it('should reject when downloaded md5 hash is different', async () => {
-    (fetch as jest.MockedFunction<typeof fetch>).mockResolvedValue('random-md5');
+  it('should reject when downloaded sha256 hash is different', async () => {
+    (fetch as jest.MockedFunction<typeof fetch>).mockResolvedValue('random-sha256');
     await expect(download(paths, pkg)).rejects.toBeInstanceOf(Error);
   });
@@ -99,8 +99,8 @@ describe('ensureDownloaded', () => {
     );
   });

-  it('should download again if md5 hash different', async () => {
-    (md5 as jest.MockedFunction<typeof md5>).mockResolvedValueOnce('random-md5');
+  it('should download again if sha256 hash different', async () => {
+    (sha256 as jest.MockedFunction<typeof sha256>).mockResolvedValueOnce('random-sha256');
     await download(paths, pkg);

     expect(fetch).toHaveBeenCalledTimes(1);

View file

@@ -9,7 +9,7 @@ import { existsSync } from 'fs';
 import del from 'del';
 import type { Logger } from '@kbn/core/server';
 import type { ChromiumArchivePaths, PackageInfo } from '../chromium';
-import { md5 } from './checksum';
+import { sha256 } from './checksum';
 import { fetch } from './fetch';

 type ValidChecksum = string;
@@ -40,7 +40,7 @@ export async function download(
   }

   const resolvedPath = paths.resolvePath(pkg);
-  const foundChecksum = await md5(resolvedPath).catch(() => 'MISSING');
+  const foundChecksum = await sha256(resolvedPath).catch(() => 'MISSING');

   const pathExists = existsSync(resolvedPath);
   if (pathExists && foundChecksum === archiveChecksum) {

View file

@@ -10,7 +10,7 @@ import path from 'path';
 import type { Logger } from '@kbn/core/server';
 import { ChromiumArchivePaths, PackageInfo } from './chromium';
 import { download } from './download';
-import { md5 } from './download/checksum';
+import { sha256 } from './download/checksum';
 import { extract } from './extract';

 type BinaryPath = string;
@@ -27,7 +27,7 @@ export async function install(
 ): Promise<BinaryPath> {
   const { architecture, platform } = pkg;
   const binaryPath = paths.getBinaryPath(pkg, chromiumPath);
-  const binaryChecksum = await md5(binaryPath).catch(() => 'MISSING');
+  const binaryChecksum = await sha256(binaryPath).catch(() => 'MISSING');

   if (binaryChecksum === pkg.binaryChecksum) {
     // validated a previously extracted browser binary
@@ -55,7 +55,7 @@ export async function install(
   }

   // check the newly extracted browser binary
-  const downloadedBinaryChecksum = await md5(binaryPath).catch(() => 'MISSING');
+  const downloadedBinaryChecksum = await sha256(binaryPath).catch(() => 'MISSING');
   if (downloadedBinaryChecksum !== pkg.binaryChecksum) {
     const error = new Error(
       `Error installing browsers, binary checksums incorrect for [${architecture}/${platform}]`

View file

@@ -7295,10 +7295,10 @@
   resolved "https://registry.yarnpkg.com/@protobufjs/utf8/-/utf8-1.1.0.tgz#a777360b5b39a1a2e5106f8e858f2fd2d060c570"
   integrity sha1-p3c2C1s5oaLlEG+OhY8v0tBgxXA=

-"@puppeteer/browsers@1.7.1":
-  version "1.7.1"
-  resolved "https://registry.yarnpkg.com/@puppeteer/browsers/-/browsers-1.7.1.tgz#04f1e3aec4b87f50a7acc8f64be2149bda014f0a"
-  integrity sha512-nIb8SOBgDEMFY2iS2MdnUZOg2ikcYchRrBoF+wtdjieRFKR2uGRipHY/oFLo+2N6anDualyClPzGywTHRGrLfw==
+"@puppeteer/browsers@1.8.0":
+  version "1.8.0"
+  resolved "https://registry.yarnpkg.com/@puppeteer/browsers/-/browsers-1.8.0.tgz#fb6ee61de15e7f0e67737aea9f9bab1512dbd7d8"
+  integrity sha512-TkRHIV6k2D8OlUe8RtG+5jgOF/H98Myx0M6AOafC8DdNVOFiBSFa5cpRDtpm8LXOa9sVwe0+e6Q3FC56X/DZfg==
   dependencies:
     debug "4.3.4"
     extract-zip "2.0.1"
@@ -7306,7 +7306,7 @@
     proxy-agent "6.3.1"
     tar-fs "3.0.4"
     unbzip2-stream "1.4.3"
-    yargs "17.7.1"
+    yargs "17.7.2"

 "@redux-saga/core@^1.1.3":
   version "1.1.3"
@@ -12933,10 +12933,10 @@ chromedriver@^119.0.1:
     proxy-from-env "^1.1.0"
     tcp-port-used "^1.0.2"

-chromium-bidi@0.4.28:
-  version "0.4.28"
-  resolved "https://registry.yarnpkg.com/chromium-bidi/-/chromium-bidi-0.4.28.tgz#05befef4f3f19003198237245780d1c60e6f4dbc"
-  integrity sha512-2HZ74QlAApJrEwcGlU/sUu0s4VS+FI3CJ09Toc9aE9VemMyhHZXeaROQgJKNRaYMUTUx6qIv1cLBs3F+vfgjSw==
+chromium-bidi@0.4.33:
+  version "0.4.33"
+  resolved "https://registry.yarnpkg.com/chromium-bidi/-/chromium-bidi-0.4.33.tgz#9a9aba5a5b07118c8e7d6405f8ee79f47418dd1d"
+  integrity sha512-IxoFM5WGQOIAd95qrSXzJUv4eXIrh+RvU3rwwqIiwYuvfE7U/Llj4fejbsJnjJMUYCuGtVQsY2gv7oGl4aTNSQ==
   dependencies:
     mitt "3.0.1"
     urlpattern-polyfill "9.0.0"
@@ -14914,10 +14914,10 @@ detective@^5.0.2:
     defined "^1.0.0"
     minimist "^1.1.1"

-devtools-protocol@0.0.1179426:
-  version "0.0.1179426"
-  resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.1179426.tgz#c4c3ee671efae868395569123002facbbbffa267"
-  integrity sha512-KKC7IGwdOr7u9kTGgjUvGTov/z1s2H7oHi3zKCdR9eSDyCPia5CBi4aRhtp7d8uR7l0GS5UTDw3TjKGu5CqINg==
+devtools-protocol@0.0.1203626:
+  version "0.0.1203626"
+  resolved "https://registry.yarnpkg.com/devtools-protocol/-/devtools-protocol-0.0.1203626.tgz#4366a4c81a7e0d4fd6924e9182c67f1e5941e820"
+  integrity sha512-nEzHZteIUZfGCZtTiS1fRpC8UZmsfD1SiyPvaUNvS13dvKf666OAm8YTi0+Ca3n1nLEyu49Cy4+dPWpaHFJk9g==

 dezalgo@^1.0.0, dezalgo@^1.0.4:
   version "1.0.4"
@@ -25068,26 +25068,26 @@ pupa@^2.1.1:
   dependencies:
     escape-goat "^2.0.0"

-puppeteer-core@21.3.6:
-  version "21.3.6"
-  resolved "https://registry.yarnpkg.com/puppeteer-core/-/puppeteer-core-21.3.6.tgz#5507fafb790692ff887e368de71a1c5a0d08af1e"
-  integrity sha512-ZH6tjTdRXwW2fx5W3jBbG+yUVQdDfZW1kjfwvWwMzsnKEli5ZwV70Zp97GOebHQHrK8zM3vX5VqI9sd48c9PnQ==
+puppeteer-core@21.5.2:
+  version "21.5.2"
+  resolved "https://registry.yarnpkg.com/puppeteer-core/-/puppeteer-core-21.5.2.tgz#6d3de4efb2ae65f1ee072043787b75594e88035f"
+  integrity sha512-v4T0cWnujSKs+iEfmb8ccd7u4/x8oblEyKqplqKnJ582Kw8PewYAWvkH4qUWhitN3O2q9RF7dzkvjyK5HbzjLA==
   dependencies:
-    "@puppeteer/browsers" "1.7.1"
-    chromium-bidi "0.4.28"
+    "@puppeteer/browsers" "1.8.0"
+    chromium-bidi "0.4.33"
     cross-fetch "4.0.0"
     debug "4.3.4"
-    devtools-protocol "0.0.1179426"
+    devtools-protocol "0.0.1203626"
     ws "8.14.2"

-puppeteer@21.3.6:
-  version "21.3.6"
-  resolved "https://registry.yarnpkg.com/puppeteer/-/puppeteer-21.3.6.tgz#961a44cd532ab5344ed53d7714aa56b4602ace10"
-  integrity sha512-ulK9+KLvdaVsG0EKbKyw/DCXCz88rsnrvIJg9tY8AmkGR01AxI4ZJTH9BJl1OE7cLfh2vxjBvY+xfvJod6rfgw==
+puppeteer@21.5.2:
+  version "21.5.2"
+  resolved "https://registry.yarnpkg.com/puppeteer/-/puppeteer-21.5.2.tgz#0a4a72175c0fd0944d6486f4734807e1671d527b"
+  integrity sha512-BaAGJOq8Fl6/cck6obmwaNLksuY0Bg/lIahCLhJPGXBFUD2mCffypa4A592MaWnDcye7eaHmSK9yot0pxctY8A==
   dependencies:
-    "@puppeteer/browsers" "1.7.1"
+    "@puppeteer/browsers" "1.8.0"
     cosmiconfig "8.3.6"
-    puppeteer-core "21.3.6"
+    puppeteer-core "21.5.2"

 pure-rand@^6.0.0:
   version "6.0.2"
@@ -31289,10 +31289,10 @@ yargs@16.2.0:
     y18n "^5.0.5"
     yargs-parser "^20.2.2"

-yargs@17.7.1:
-  version "17.7.1"
-  resolved "https://registry.yarnpkg.com/yargs/-/yargs-17.7.1.tgz#34a77645201d1a8fc5213ace787c220eabbd0967"
-  integrity sha512-cwiTb08Xuv5fqF4AovYacTFNxk62th7LKJ6BL9IGUpTJrWoU7/7WdQGTP2SjKf1dUNBGzDd28p/Yfs/GI6JrLw==
+yargs@17.7.2, yargs@^17.2.1, yargs@^17.3.1, yargs@^17.4.0, yargs@^17.6.0, yargs@^17.7.1, yargs@^17.7.2:
+  version "17.7.2"
+  resolved "https://registry.yarnpkg.com/yargs/-/yargs-17.7.2.tgz#991df39aca675a192b816e1e0363f9d75d2aa269"
+  integrity sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w==
   dependencies:
     cliui "^8.0.1"
     escalade "^3.1.1"
@@ -31319,19 +31319,6 @@ yargs@^15.0.2, yargs@^15.3.1, yargs@^15.4.1:
     y18n "^4.0.0"
     yargs-parser "^18.1.2"

-yargs@^17.2.1, yargs@^17.3.1, yargs@^17.4.0, yargs@^17.6.0, yargs@^17.7.1, yargs@^17.7.2:
-  version "17.7.2"
-  resolved "https://registry.yarnpkg.com/yargs/-/yargs-17.7.2.tgz#991df39aca675a192b816e1e0363f9d75d2aa269"
-  integrity sha512-7dSzzRQ++CKnNI/krKnYRV7JKKPUXMEh61soaHKg9mrWEhzFWhFnxPxGl+69cD1Ou63C13NUPCnmIcrvqCuM6w==
-  dependencies:
-    cliui "^8.0.1"
-    escalade "^3.1.1"
-    get-caller-file "^2.0.5"
-    require-directory "^2.1.1"
-    string-width "^4.2.3"
-    y18n "^5.0.5"
-    yargs-parser "^21.1.1"
-
 yargs@^3.15.0:
   version "3.32.0"
   resolved "https://registry.yarnpkg.com/yargs/-/yargs-3.32.0.tgz#03088e9ebf9e756b69751611d2a5ef591482c995"