[kbn/pm] rewrite to avoid needing a build process (#136207)

* [kbn/pm] rewrite to avoid needing a build process

* uncomment timing reporting

* throw in a few missing comments

* Update README.md

* remove extra SomeDevLog interface from ci-stats-core

* remove non-stdio logging from bazel_runner, improve output formatting

* use private fields instead of just ts private props

* promote args to a positional arg

* optionally require the ci-stats-reporter after each command

* allow opting out of vscode config management

* reduce to a single import

* add bit of docs regarding weird imports and package deps of kbn/pm

* clean extraDirs from Kibana's package.json file too

* tweak logging of run-in-packages to use --quiet and not just CI=true

* remove unlazy-loader

* add readme for @kbn/yarn-lock-validator

* convert @kbn/some-dev-logs docs to mdx

* remove missing navigation id and fix id in dev-cli-runner docs

* fix title of some-dev-logs docs page

* typo
Spencer 2022-07-18 10:46:13 -05:00 committed by GitHub
parent 11f7ace59f
commit 4f817ad8a0
154 changed files with 3093 additions and 67794 deletions

View file

@ -11,7 +11,6 @@ export DISABLE_BOOTSTRAP_VALIDATION=false
.buildkite/scripts/steps/checks/telemetry.sh
.buildkite/scripts/steps/checks/ts_projects.sh
.buildkite/scripts/steps/checks/jest_configs.sh
.buildkite/scripts/steps/checks/kbn_pm_dist.sh
.buildkite/scripts/steps/checks/plugin_list_docs.sh
.buildkite/scripts/steps/checks/bundle_limits.sh
.buildkite/scripts/steps/checks/i18n.sh

View file

@ -1,10 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail
source .buildkite/scripts/common/util.sh
echo "--- Building kbn-pm distributable"
yarn kbn run build -i @kbn/pm
check_for_changed_files 'yarn kbn run build -i @kbn/pm' true

View file

@ -6,4 +6,4 @@ source .buildkite/scripts/common/util.sh
echo --- Test Projects
checks-reporter-with-killswitch "Test Projects" \
yarn kbn run test --exclude kibana --oss --skip-kibana-plugins --skip-missing
yarn kbn run-in-packages test

View file

@ -17,16 +17,6 @@ Remember that any time you need to make sure the monorepo is ready to be used ju
yarn kbn bootstrap
----
[discrete]
=== Building Non Bazel Packages
Non Bazel packages can be built independently with
[source,bash]
----
yarn kbn run build -i PACKAGE_NAME
----
[discrete]
=== Building Bazel Packages
@ -34,7 +24,7 @@ Bazel packages are built as a whole for now. You can use:
[source,bash]
----
yarn kbn build
yarn kbn bootstrap
----
[discrete]

44
kbn_pm/README.mdx Normal file
View file

@ -0,0 +1,44 @@
---
id: kibDevDocsOpsKbnPm
slug: /kibana-dev-docs/ops/kbn-pm
title: "@kbn/pm"
description: 'The tool which bootstraps the repo and helps work with packages'
date: 2022-07-14
tags: ['kibana', 'dev', 'contributor', 'operations', 'packages', 'scripts']
---
`@kbn/pm` is the tool that we use to bootstrap the Kibana repository, build packages with Bazel, and run scripts in packages.
## commands
### `yarn kbn bootstrap`
Use this command to install dependencies, build packages, and prepare the repo for local development.
### `yarn kbn watch`
Use this command to build all packages and make sure that they are rebuilt as you make changes.
### and more!
There are several commands supported by `@kbn/pm`, but rather than documenting them here, they are documented in the help text. Please run `yarn kbn --help` locally to see the most up-to-date info.
## Why isn't this TypeScript?
Since this tool is required for bootstrapping the repository it needs to work without any dependencies installed and without a build toolchain. We accomplish this by writing the tool in vanilla JS (shocker!) and using TypeScript to validate the code which is typed via heavy use of JSDoc comments.
In order to use import/export syntax and enhance the developer experience a little we use the `.mjs` file extension.
In some cases we actually do use TypeScript files, just for defining complicated types. These files are then imported only in special TS-compatible JSDoc comments, so Node.js will never try to import them but they can be used to define types which are too complicated to define inline or in a JSDoc comment.
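As a rough sketch of that pattern (the file and type names here are hypothetical, not part of this change), a `.mjs` file can reference a type from a `.ts` file purely inside a JSDoc comment:

```js
/** @type {import('./types').BootstrapOptions} */
const defaultOptions = {
  offline: false,
  validate: true,
};

/**
 * @param {Partial<import('./types').BootstrapOptions>} overrides
 */
export function resolveOptions(overrides) {
  // Node.js runs this file as plain JavaScript; the `./types` module is never
  // loaded at runtime, it only exists for the TypeScript checker.
  return { ...defaultOptions, ...overrides };
}
```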
There are cases where `@kbn/pm` relies on code from packages, mostly to prevent reimplementing common functionality. This can only be done in one of two ways:
1. With a dynamic `await import(...)` statement that is always run after bootstrap is complete, or is wrapped in a try/catch in case bootstrap didn't complete successfully.
2. By pulling in the source code of the un-built package.
Option 1 is used in several places, with contingencies in place in case bootstrap failed. Option 2 is used for two pieces of code which are needed in order to run bootstrap:
1. `@kbn/plugin-discovery` as we need to populate the `@kbn/synthetic-package-map` to run Bazel
2. `@kbn/bazel-runner` as we want to have the logic for running bazel in a single location
Because we load these two packages from source, without being built, before bootstrap is ever run, they cannot depend on other packages and must be written in vanilla JS as well.
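As a simplified sketch of the two approaches (option 1 mirrors how `@kbn/ci-stats-reporter` is used in `cli.mjs`, option 2 mirrors how `@kbn/bazel-runner` is loaded in `lib/bazel.mjs`; the surrounding helper code is illustrative only):

```js
// Option 1: dynamic import of a built package, tolerating bootstrap failures.
async function tryToLoadCiStatsReporter(log) {
  try {
    const { CiStatsReporter } = await import('@kbn/ci-stats-reporter');
    return CiStatsReporter.fromEnv(log);
  } catch {
    // bootstrap probably hasn't completed yet, skip the optional functionality
    return undefined;
  }
}

// Option 2: import the un-built source of a package by relative path.
const BAZEL_RUNNER_SRC = '../../../packages/kbn-bazel-runner/src/index.js';
const { runBazel } = await import(BAZEL_RUNNER_SRC);
```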

130
kbn_pm/src/cli.mjs Normal file
View file

@ -0,0 +1,130 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
/**
* This is the script that's run by `yarn kbn`. This script has as little logic
* as possible so that it can:
* - run without being built and without any dependencies
* - can bootstrap the repository, installing all deps and building all packages
* - load additional commands from packages which will extend the functionality
* beyond bootstrapping
*/
import { Args } from './lib/args.mjs';
import { getHelp } from './lib/help.mjs';
import { createFlagError, isCliError } from './lib/cli_error.mjs';
import { COMMANDS } from './commands/index.mjs';
import { Log } from './lib/log.mjs';
const start = Date.now();
const args = new Args(process.argv.slice(2), process.env.CI ? ['--quiet'] : []);
const log = new Log(args.getLoggingLevel());
const cmdName = args.getCommandName();
/**
* @param {import('./lib/log.mjs').Log} log
*/
async function tryToGetCiStatsReporter(log) {
try {
const { CiStatsReporter } = await import('@kbn/ci-stats-reporter');
return CiStatsReporter.fromEnv(log);
} catch {
return;
}
}
try {
const cmd = cmdName ? COMMANDS.find((c) => c.name === cmdName) : undefined;
if (cmdName && !cmd) {
throw createFlagError(`Invalid command name [${cmdName}]`);
}
if (args.getBooleanValue('help')) {
log._write(await getHelp(cmdName));
process.exit(0);
}
if (!cmd) {
throw createFlagError('missing command name');
}
/** @type {import('@kbn/ci-stats-reporter').CiStatsTiming[]} */
const timings = [];
/** @type {import('./lib/command').CommandRunContext['time']} */
const time = async (id, block) => {
if (!cmd.reportTimings) {
return await block();
}
const start = Date.now();
log.verbose(`[${id}]`, 'start');
const [result] = await Promise.allSettled([block()]);
const ms = Date.now() - start;
log.verbose(`[${id}]`, result.status === 'fulfilled' ? 'success' : 'failure', 'in', ms, 'ms');
timings.push({
group: cmd.reportTimings.group,
id,
ms,
meta: {
success: result.status === 'fulfilled',
},
});
if (result.status === 'fulfilled') {
return result.value;
} else {
throw result.reason;
}
};
const [result] = await Promise.allSettled([
(async () =>
await cmd.run({
args,
log,
time,
}))(),
]);
if (cmd.reportTimings) {
timings.push({
group: cmd.reportTimings.group,
id: cmd.reportTimings.id,
ms: Date.now() - start,
meta: {
success: result.status === 'fulfilled',
},
});
}
if (timings.length) {
const reporter = await tryToGetCiStatsReporter(log);
if (reporter) {
await reporter.timings({ timings });
}
}
if (result.status === 'rejected') {
throw result.reason;
}
} catch (error) {
if (!isCliError(error)) {
throw error;
}
log.error(`[${cmdName}] failed: ${error.message}`);
if (error.showHelp) {
log._write('');
log._write(await getHelp(cmdName));
}
process.exit(error.exitCode ?? 1);
}

View file

@ -0,0 +1,127 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { spawnSync } from '../../lib/spawn.mjs';
import * as Bazel from '../../lib/bazel.mjs';
import { haveNodeModulesBeenManuallyDeleted, removeYarnIntegrityFileIfExists } from './yarn.mjs';
import { setupRemoteCache } from './setup_remote_cache.mjs';
import { regenerateSyntheticPackageMap } from './regenerate_synthetic_package_map.mjs';
import { sortPackageJson } from './sort_package_json.mjs';
import { pluginDiscovery } from './plugins.mjs';
import { regenerateBaseTsconfig } from './regenerate_base_tsconfig.mjs';
/** @type {import('../../lib/command').Command} */
export const command = {
name: 'bootstrap',
intro: 'Bootstrap the Kibana repository, installing all dependencies and building all packages',
description: `
This command should be run every time you check out a new revision, or it can be used to build all packages
once after making a change locally. Package builds are cached remotely, so when you don't have local
changes, build artifacts will be downloaded from the remote cache.
`,
flagsHelp: `
--force-install Use this flag to force bootstrap to install yarn dependencies. By default the
command will attempt to only run yarn installs when necessary, but if you manually
delete the node_modules directory or have an issue in your node_modules directory
you might need to force the install manually.
--offline Run the installation process without consulting online resources. This is useful and
sometimes necessary for using bootstrap on an airplane for instance. The local caches
will be used exclusively, including a yarn-registry local mirror which is created and
maintained by successful online bootstrap executions.
--no-validate By default bootstrap validates the yarn.lock file to check for a handful of
conditions. If you run into issues with this process locally you can disable it by
passing this flag.
--no-vscode By default bootstrap updates the .vscode directory to include commonly useful vscode
settings for local development. To disable this process either pass this flag or set
the KBN_BOOTSTRAP_NO_VSCODE=true environment variable.
--quiet Prevent logging more than basic success/error messages
`,
reportTimings: {
group: 'scripts/kbn bootstrap',
id: 'total',
},
async run({ args, log, time }) {
const offline = args.getBooleanValue('offline') ?? false;
const validate = args.getBooleanValue('validate') ?? true;
const quiet = args.getBooleanValue('quiet') ?? false;
const vscodeConfig =
args.getBooleanValue('vscode') ?? (process.env.KBN_BOOTSTRAP_NO_VSCODE ? false : true);
// Force install is set in case a flag is passed into yarn kbn bootstrap or
// our custom logic has determined there is a chance node_modules have been manually deleted and as such the bazel
// tracking mechanism is no longer valid
const forceInstall =
args.getBooleanValue('force-install') ?? haveNodeModulesBeenManuallyDeleted();
Bazel.tryRemovingBazeliskFromYarnGlobal(log);
// Install bazel machinery tools if needed
Bazel.ensureInstalled(log);
// Setup remote cache settings in .bazelrc.cache if needed
setupRemoteCache(log);
// Bootstrap process for Bazel packages
// Bazel is now managing dependencies so yarn install
// will happen as part of this
//
// NOTE: Bazel projects will be introduced incrementally,
// starting with the ones that have no dependencies.
// That way non-Bazel projects can depend on Bazel projects, but not the other way around.
// This is only intended during the migration process, while non-Bazel projects have not been fully removed.
if (forceInstall) {
await time('force install dependencies', async () => {
removeYarnIntegrityFileIfExists();
await Bazel.expungeCache(log, { quiet });
await Bazel.installYarnDeps(log, { offline, quiet });
});
}
const plugins = await time('plugin discovery', async () => {
return pluginDiscovery();
});
// generate the synthetic package map which powers several other features, needed
// as an input to the package build
await time('regenerate synthetic package map', async () => {
regenerateSyntheticPackageMap(plugins);
});
// build packages
await time('build packages', async () => {
await Bazel.buildPackages(log, { offline, quiet });
});
await time('sort package json', async () => {
await sortPackageJson();
});
await time('regenerate tsconfig.base.json', async () => {
regenerateBaseTsconfig(plugins);
});
if (validate) {
// now that packages are built we can import `@kbn/yarn-lock-validator`
const { readYarnLock, validateDependencies } = await import('@kbn/yarn-lock-validator');
const yarnLock = await time('read yarn.lock', async () => {
return await readYarnLock();
});
await time('validate dependencies', async () => {
await validateDependencies(log, yarnLock);
});
}
if (vscodeConfig) {
await time('update vscode config', async () => {
// Update vscode settings
spawnSync('node', ['scripts/update_vscode_config']);
log.success('vscode config updated');
});
}
},
};

View file

@ -0,0 +1,61 @@
/* eslint-disable @kbn/eslint/require-license-header */
/**
* @notice
* This code includes a copy of the `normalize-path`
* https://github.com/jonschlinkert/normalize-path/blob/52c3a95ebebc2d98c1ad7606cbafa7e658656899/index.js
*
* The MIT License (MIT)
*
* Copyright (c) 2014-2018, Jon Schlinkert.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
*/
/**
* @param {string} path
* @returns {string}
*/
export function normalizePath(path) {
if (typeof path !== 'string') {
throw new TypeError('expected path to be a string');
}
if (path === '\\' || path === '/') return '/';
const len = path.length;
if (len <= 1) return path;
// ensure that win32 namespaces has two leading slashes, so that the path is
// handled properly by the win32 version of path.parse() after being normalized
// https://msdn.microsoft.com/library/windows/desktop/aa365247(v=vs.85).aspx#namespaces
let prefix = '';
if (len > 4 && path[3] === '\\') {
const ch = path[2];
if ((ch === '?' || ch === '.') && path.slice(0, 2) === '\\\\') {
path = path.slice(2);
prefix = '//';
}
}
const segs = path.split(/[/\\]+/);
if (segs[segs.length - 1] === '') {
segs.pop();
}
return prefix + segs.join('/');
}
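// Illustrative examples (not part of the original file):
//   normalizePath('foo\\bar\\')     === 'foo/bar'
//   normalizePath('\\\\?\\c:\\foo') === '//?/c:/foo'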

View file

@ -0,0 +1,51 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { REPO_ROOT } from '../../lib/paths.mjs';
/** @type {string} */
const PLUGIN_DISCOVERY_SRC = '../../../../packages/kbn-plugin-discovery/src/index.js';
/**
* @param {string} pluginId
* @returns {string}
*/
export function convertPluginIdToPackageId(pluginId) {
if (pluginId === 'core') {
// core is the only non-plugin
return `@kbn/core`;
}
return `@kbn/${pluginId
.split('')
.flatMap((c) => (c.toUpperCase() === c ? `-${c.toLowerCase()}` : c))
.join('')}-plugin`
.replace(/-\w(-\w)+-/g, (match) => `-${match.split('-').join('')}-`)
.replace(/-plugin-plugin$/, '-plugin');
}
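// Illustrative examples (not part of the original file):
//   convertPluginIdToPackageId('core')             === '@kbn/core'
//   convertPluginIdToPackageId('dataViews')        === '@kbn/data-views-plugin'
//   convertPluginIdToPackageId('securitySolution') === '@kbn/security-solution-plugin'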
/**
* @returns {Promise<import('@kbn/plugin-discovery').KibanaPlatformPlugin[]>}
*/
export async function pluginDiscovery() {
/* eslint-disable no-unsanitized/method */
/** @type {import('@kbn/plugin-discovery')} */
const { getPluginSearchPaths, simpleKibanaPlatformPluginDiscovery } = await import(
PLUGIN_DISCOVERY_SRC
);
/* eslint-enable no-unsanitized/method */
const searchPaths = getPluginSearchPaths({
rootDir: REPO_ROOT,
examples: true,
oss: false,
testPlugins: true,
});
return simpleKibanaPlatformPluginDiscovery(searchPaths, []);
}

View file

@ -6,30 +6,33 @@
* Side Public License, v 1.
*/
import Fs from 'fs/promises';
import Path from 'path';
import Fs from 'fs';
import normalizePath from 'normalize-path';
import { KibanaPlatformPlugin } from '@kbn/plugin-discovery';
import { convertPluginIdToPackageId } from './convert_plugin_id_to_package_id';
import { REPO_ROOT } from '../../lib/paths.mjs';
import { convertPluginIdToPackageId } from './plugins.mjs';
import { normalizePath } from './normalize_path.mjs';
export async function regenerateBaseTsconfig(plugins: KibanaPlatformPlugin[], repoRoot: string) {
const tsconfigPath = Path.resolve(repoRoot, 'tsconfig.base.json');
const lines = (await Fs.readFile(tsconfigPath, 'utf-8')).split('\n');
/**
* @param {import('@kbn/plugin-discovery').KibanaPlatformPlugin[]} plugins
*/
export function regenerateBaseTsconfig(plugins) {
const tsconfigPath = Path.resolve(REPO_ROOT, 'tsconfig.base.json');
const lines = Fs.readFileSync(tsconfigPath, 'utf-8').split('\n');
const packageMap = plugins
.slice()
.sort((a, b) => a.manifestPath.localeCompare(b.manifestPath))
.flatMap((p) => {
const id = convertPluginIdToPackageId(p.manifest.id);
const path = normalizePath(Path.relative(repoRoot, p.directory));
const path = normalizePath(Path.relative(REPO_ROOT, p.directory));
return [` "${id}": ["${path}"],`, ` "${id}/*": ["${path}/*"],`];
});
const start = lines.findIndex((l) => l.trim() === '// START AUTOMATED PACKAGE LISTING');
const end = lines.findIndex((l) => l.trim() === '// END AUTOMATED PACKAGE LISTING');
await Fs.writeFile(
Fs.writeFileSync(
tsconfigPath,
[...lines.slice(0, start + 1), ...packageMap, ...lines.slice(end)].join('\n')
);
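// The generated region of tsconfig.base.json ends up with entries like (illustrative):
//   "@kbn/data-views-plugin": ["src/plugins/data_views"],
//   "@kbn/data-views-plugin/*": ["src/plugins/data_views/*"],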

View file

@ -0,0 +1,34 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import Fs from 'fs';
import { normalizePath } from './normalize_path.mjs';
import { REPO_ROOT } from '../../lib/paths.mjs';
import { convertPluginIdToPackageId } from './plugins.mjs';
/**
* @param {import('@kbn/plugin-discovery').KibanaPlatformPlugin[]} plugins
*/
export function regenerateSyntheticPackageMap(plugins) {
/** @type {Array<[string, string]>} */
const entries = [['@kbn/core', 'src/core']];
for (const plugin of plugins) {
entries.push([
convertPluginIdToPackageId(plugin.manifest.id),
normalizePath(Path.relative(REPO_ROOT, plugin.directory)),
]);
}
Fs.writeFileSync(
Path.resolve(REPO_ROOT, 'packages/kbn-synthetic-package-map/synthetic-packages.json'),
JSON.stringify(entries, null, 2)
);
}
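// synthetic-packages.json ends up as a JSON array of [packageId, repoRelativePath] pairs,
// e.g. (illustrative): [["@kbn/core", "src/core"], ["@kbn/data-views-plugin", "src/plugins/data_views"]]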

View file

@ -5,17 +5,19 @@
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import dedent from 'dedent';
import { writeFileSync } from 'fs';
import { resolve } from 'path';
import { spawn } from '../child_process';
import { log } from '../log';
import { isFile, readFile } from '../fs';
async function isElasticCommitter() {
import Path from 'path';
import Fs from 'fs';
import { spawnSync } from 'child_process';
import { isFile } from '../../lib/fs.mjs';
import { dedent } from '../../lib/indent.mjs';
import { REPO_ROOT } from '../../lib/paths.mjs';
function isElasticCommitter() {
try {
const { stdout: email } = await spawn('git', ['config', 'user.email'], {
stdio: 'pipe',
const { stdout: email } = spawnSync('git', ['config', 'user.email'], {
encoding: 'utf8',
});
return email.trim().endsWith('@elastic.co');
@ -24,31 +26,39 @@ async function isElasticCommitter() {
}
}
async function upToDate(settingsPath: string) {
if (!(await isFile(settingsPath))) {
/**
*
* @param {string} settingsPath
* @returns
*/
function upToDate(settingsPath) {
if (!isFile(settingsPath)) {
return false;
}
const readSettingsFile = await readFile(settingsPath, 'utf8');
const readSettingsFile = Fs.readFileSync(settingsPath, 'utf8');
return readSettingsFile.startsWith('# V2 ');
}
export async function setupRemoteCache(repoRootPath: string) {
/**
* @param {import('@kbn/some-dev-log').SomeDevLog} log
*/
export function setupRemoteCache(log) {
// The remote cache is only for Elastic employees working locally (CI cache settings are handled elsewhere)
if (
process.env.FORCE_BOOTSTRAP_REMOTE_CACHE !== 'true' &&
(process.env.CI || !(await isElasticCommitter()))
(process.env.CI || !isElasticCommitter())
) {
return;
}
log.debug(`[bazel_tools] setting up remote cache settings if necessary`);
log.debug(`setting up remote cache settings if necessary`);
const settingsPath = resolve(repoRootPath, '.bazelrc.cache');
const settingsPath = Path.resolve(REPO_ROOT, '.bazelrc.cache');
// Checks if we should upgrade or install the config file
if (await upToDate(settingsPath)) {
log.debug(`[bazel_tools] remote cache config already exists and is up-to-date, skipping`);
if (upToDate(settingsPath)) {
log.debug(`remote cache config already exists and is up-to-date, skipping`);
return;
}
@ -60,6 +70,6 @@ export async function setupRemoteCache(repoRootPath: string) {
build --incompatible_remote_results_ignore_disk
`;
writeFileSync(settingsPath, contents);
log.info(`[bazel_tools] remote cache settings written to ${settingsPath}`);
Fs.writeFileSync(settingsPath, contents);
log.info(`remote cache settings written to ${settingsPath}`);
}

View file

@ -6,13 +6,15 @@
* Side Public License, v 1.
*/
import Fsp from 'fs/promises';
import Path from 'path';
import Fs from 'fs';
import { sortPackageJson } from '@kbn/sort-package-json';
import { REPO_ROOT } from '../../lib/paths.mjs';
export async function regeneratePackageJson(rootPath: string) {
const path = Path.resolve(rootPath, 'package.json');
const json = await Fsp.readFile(path, 'utf8');
await Fsp.writeFile(path, sortPackageJson(json));
export async function sortPackageJson() {
const { sortPackageJson } = await import('@kbn/sort-package-json');
const path = Path.resolve(REPO_ROOT, 'package.json');
const json = Fs.readFileSync(path, 'utf8');
Fs.writeFileSync(path, sortPackageJson(json));
}

View file

@ -0,0 +1,53 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import Fs from 'fs';
import { REPO_ROOT } from '../../lib/paths.mjs';
import { maybeRealpath, isFile, isDirectory } from '../../lib/fs.mjs';
// yarn integrity file checker
export function removeYarnIntegrityFileIfExists() {
try {
const nodeModulesRealPath = maybeRealpath(Path.resolve(REPO_ROOT, 'node_modules'));
const yarnIntegrityFilePath = Path.resolve(nodeModulesRealPath, '.yarn-integrity');
// check if the file exists and delete it in that case
if (isFile(yarnIntegrityFilePath)) {
Fs.unlinkSync(yarnIntegrityFilePath);
}
} catch {
// no-op
}
}
// yarn and bazel integration checkers
function areNodeModulesPresent() {
try {
return isDirectory(Path.resolve(REPO_ROOT, 'node_modules'));
} catch {
return false;
}
}
function haveBazelFoldersBeenCreatedBefore() {
try {
return (
isDirectory(Path.resolve(REPO_ROOT, 'bazel-bin/packages')) ||
isDirectory(Path.resolve(REPO_ROOT, 'bazel-kibana/packages')) ||
isDirectory(Path.resolve(REPO_ROOT, 'bazel-out/host'))
);
} catch {
return false;
}
}
export function haveNodeModulesBeenManuallyDeleted() {
return !areNodeModulesPresent() && haveBazelFoldersBeenCreatedBefore();
}

View file

@ -0,0 +1,44 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { dedent } from '../lib/indent.mjs';
import { cleanPaths } from '../lib/clean.mjs';
import * as Bazel from '../lib/bazel.mjs';
import { findPluginCleanPaths } from '../lib/find_clean_paths.mjs';
/** @type {import('../lib/command').Command} */
export const command = {
name: 'clean',
description: 'Deletes output directories and resets internal caches',
reportTimings: {
group: 'scripts/kbn clean',
id: 'total',
},
flagsHelp: `
--quiet Prevent logging more than basic success/error messages
`,
async run({ args, log }) {
log.warning(dedent`
This command is only necessary for the circumstance where you need to recover a consistent
state when problems arise. If you need to run this command often, please let us know by
filling out this form: https://ela.st/yarn-kbn-clean.
Please note it might not solve problems with node_modules. To solve problems around node_modules
you might need to run 'yarn kbn reset'.
`);
await cleanPaths(log, await findPluginCleanPaths(log));
// Runs Bazel soft clean
if (Bazel.isInstalled(log)) {
await Bazel.clean(log, {
quiet: args.getBooleanValue('quiet'),
});
}
},
};

View file

@ -0,0 +1,15 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
export const COMMANDS = [
(await import('./bootstrap/bootstrap_command.mjs')).command,
(await import('./watch_command.mjs')).command,
(await import('./run_in_packages_command.mjs')).command,
(await import('./clean_command.mjs')).command,
(await import('./reset_command.mjs')).command,
];

View file

@ -0,0 +1,47 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import { REPO_ROOT } from '../lib/paths.mjs';
import { dedent } from '../lib/indent.mjs';
import { cleanPaths } from '../lib/clean.mjs';
import * as Bazel from '../lib/bazel.mjs';
import { findPluginCleanPaths, readCleanPatterns } from '../lib/find_clean_paths.mjs';
/** @type {import('../lib/command').Command} */
export const command = {
name: 'reset',
description:
'Deletes node_modules and output directories, resets internal and disk caches, and stops Bazel server',
reportTimings: {
group: 'scripts/kbn reset',
id: 'total',
},
flagsHelp: `
--quiet Prevent logging more than basic success/error messages
`,
async run({ args, log }) {
log.warning(dedent`
In most cases, 'yarn kbn clean' is all that should be needed to recover a consistent state when
problems arise. However, for the rare cases where something gets corrupted in node_modules, you might need this command.
If you think you need to use this command very often (which is not normal), please let us know.
`);
await cleanPaths(log, [
Path.resolve(REPO_ROOT, 'node_modules'),
Path.resolve(REPO_ROOT, 'x-pack/node_modules'),
...readCleanPatterns(REPO_ROOT),
...(await findPluginCleanPaths(log)),
]);
const quiet = args.getBooleanValue('quiet');
await Bazel.expungeCache(log, { quiet });
await Bazel.cleanDiskCache(log);
},
};

View file

@ -0,0 +1,76 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import { REPO_ROOT } from '../lib/paths.mjs';
import { spawnSync, spawnStreaming } from '../lib/spawn.mjs';
/** @type {import('../lib/command').Command} */
export const command = {
name: 'run-in-packages',
usage: '[...flags] <command> [...subFlags]',
description: `
Run a script defined in the package.json file of each package that defines that script. Any flags passed
after the script name will be passed directly to the script.
`,
flagsHelp: `
--filter package name to filter packages by, can be specified multiple times
and only packages matching this filter will be run
--exclude package name to be excluded, can be specified multiple times
--quiet only log the output of commands if they fail
`,
reportTimings: {
group: 'scripts/kbn run',
id: 'total',
},
async run({ log, args }) {
const scriptName = args.getPositionalArgs()[0];
const rawArgs = args.getRawArgs();
const i = rawArgs.indexOf(scriptName);
const scriptArgs = i !== -1 ? rawArgs.slice(i + 1) : [];
const exclude = args.getStringValues('exclude') ?? [];
const include = args.getStringValues('include') ?? [];
const { discoverBazelPackages } = await import('@kbn/bazel-packages');
const packages = await discoverBazelPackages(REPO_ROOT);
for (const { pkg, normalizedRepoRelativeDir } of packages) {
if (
exclude.includes(pkg.name) ||
(include.length && !include.includes(pkg.name)) ||
!pkg.scripts ||
!Object.hasOwn(pkg.scripts, scriptName)
) {
continue;
}
log.debug(
`running [${scriptName}] script in [${pkg.name}]`,
scriptArgs.length ? `with args [${scriptArgs.join(' ')}]` : ''
);
const cwd = Path.resolve(REPO_ROOT, normalizedRepoRelativeDir);
if (args.getBooleanValue('quiet')) {
spawnSync('yarn', ['run', scriptName, ...scriptArgs], {
cwd,
description: `${scriptName} in ${pkg.name}`,
});
} else {
await spawnStreaming('yarn', ['run', scriptName, ...scriptArgs], {
cwd: cwd,
logPrefix: ' ',
});
}
log.success(`Ran [${scriptName}] in [${pkg.name}]`);
}
},
};

View file

@ -0,0 +1,31 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import * as Bazel from '../lib/bazel.mjs';
/** @type {import('../lib/command').Command} */
export const command = {
name: 'watch',
description: 'Runs a build of the Bazel-built packages and keeps watching them for changes',
flagsHelp: `
--offline Run the installation process without consulting online resources. This is useful and
sometimes necessary for using bootstrap on an airplane for instance. The local caches
will be used exclusively, including a yarn-registry local mirror which is created and
maintained by successful online bootstrap executions.
`,
reportTimings: {
group: 'scripts/kbn watch',
id: 'total',
},
async run({ args, log }) {
await Bazel.watch(log, {
offline: args.getBooleanValue('offline') ?? true,
});
},
};

182
kbn_pm/src/lib/args.mjs Normal file
View file

@ -0,0 +1,182 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { createFlagError } from './cli_error.mjs';
/**
* @param {string[]} argv
*/
function parseArgv(argv) {
/** @type {string[]} */
const raw = [];
/** @type {string[]} */
const positional = [];
/** @type {Map<string, string | string[] | boolean>} */
const flags = new Map();
for (const arg of argv) {
raw.push(arg);
if (!arg.startsWith('--')) {
// positional arguments are anything that doesn't start with "--"
positional.push(arg);
continue;
}
// flags always start with "--" and might have an =value attached.
// - If the flag name starts with "no-", like `--no-flag`, it will set `flag` to `false`
// - If the flag has multiple string values they will be turned into an array of values
const [name, ...value] = arg.slice(2).split('=');
if (value.length === 0) {
// boolean flag
if (name.startsWith('no-')) {
flags.set(name.slice(3), false);
} else {
flags.set(name, true);
}
} else {
// string flag; repeated values for the same flag are collected into an array
const str = value.join('=');
const existing = flags.get(name);
if (typeof existing === 'string') {
flags.set(name, [existing, str]);
} else if (Array.isArray(existing)) {
existing.push(str);
} else {
flags.set(name, str);
}
}
}
return { raw, positional, flags };
}
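// Illustrative example (not part of the original file):
//   parseArgv(['run-in-packages', 'lint', '--exclude=foo', '--exclude=bar', '--quiet'])
//   -> positional: ['run-in-packages', 'lint']
//   -> flags: Map { 'exclude' => ['foo', 'bar'], 'quiet' => true }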
export class Args {
#flags;
#positional;
#raw;
#defaults;
/**
* @param {string[]} argv
* @param {string[]} defaultArgv
*/
constructor(argv, defaultArgv) {
const { flags, positional, raw } = parseArgv(argv);
this.#flags = flags;
this.#positional = positional;
this.#raw = raw;
this.#defaults = parseArgv(defaultArgv).flags;
}
/**
* @returns {import('@kbn/some-dev-log').SomeLogLevel}
*/
getLoggingLevel() {
if (this.getBooleanValue('quiet')) {
return 'quiet';
}
if (this.getBooleanValue('verbose')) {
return 'verbose';
}
if (this.getBooleanValue('debug')) {
return 'debug';
}
return 'info';
}
/**
* Get the command name from the args
*/
getCommandName() {
return this.#positional[0];
}
/**
* Get the positional arguments, excludes the command name
*/
getPositionalArgs() {
return this.#positional.slice(1);
}
/**
* Get all of the passed args
*/
getRawArgs() {
return this.#raw.slice();
}
/**
* Get the value of a specific flag as a string, if the argument is specified multiple
* times then only the last value specified will be returned. If the flag was specified
* as a boolean then an error will be thrown. If the flag wasn't specified then
* undefined will be returned.
* @param {string} name
*/
getStringValue(name) {
const value = this.#flags.get(name) ?? this.#defaults.get(name);
if (Array.isArray(value)) {
return value.at(-1);
}
if (typeof value === 'boolean') {
throw createFlagError(`Expected [--${name}] to have a value, not be a boolean flag`);
}
return value;
}
/**
* Get the string values of a flag as an array of values. This will return all values for a
* given flag and any boolean values will cause an error to be thrown.
* @param {string} name
*/
getStringValues(name) {
const value = this.#flags.get(name) ?? this.#defaults.get(name);
if (typeof value === 'string') {
return [value];
}
if (value === undefined || Array.isArray(value)) {
return value;
}
throw createFlagError(`Expected [--${name}] to have a string value`);
}
/**
* Get the boolean value of a specific flag. If the flag wasn't defined then undefined will
* be returned. If the flag was specified with a string value then an error will be thrown.
* @param {string} name
*/
getBooleanValue(name) {
const value = this.#flags.get(name) ?? this.#defaults.get(name);
if (typeof value === 'boolean' || value === undefined) {
return value;
}
throw createFlagError(
`Unexpected value for [--${name}], this is a boolean flag and should be specified as just [--${name}] or [--no-${name}]`
);
}
/**
* Get the value of a specific flag parsed as a number. If the flag wasn't specified then
* undefined will be returned. If the flag was specified multiple times then the last value
* specified will be used. If the flag's value can't be parsed as a number then an error
* will be returned.
* @param {string} name
*/
getNumberValue(name) {
const value = this.getStringValue(name);
if (value === undefined) {
return value;
}
const parsed = parseFloat(value);
if (Number.isNaN(parsed)) {
throw createFlagError(`Expected value of [--${name}] to be parsable as a valid number`);
}
return parsed;
}
}

260
kbn_pm/src/lib/bazel.mjs Normal file
View file

@ -0,0 +1,260 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import Fs from 'fs';
import { spawnSync } from './spawn.mjs';
import * as Color from './colors.mjs';
import { createCliError } from './cli_error.mjs';
import { REPO_ROOT } from './paths.mjs';
import { cleanPaths } from './clean.mjs';
import { indent } from './indent.mjs';
const BAZEL_RUNNER_SRC = '../../../packages/kbn-bazel-runner/src/index.js';
async function getBazelRunner() {
/* eslint-disable no-unsanitized/method */
/** @type {import('@kbn/bazel-runner')} */
const { runBazel, runIBazel } = await import(BAZEL_RUNNER_SRC);
/* eslint-enable no-unsanitized/method */
return { runBazel, runIBazel };
}
/**
* @param {import('./log.mjs').Log} log
* @param {string} name
* @param {number} code
* @param {string} output
*/
function throwBazelError(log, name, code, output) {
const tag = Color.title('HINT');
log._write(
[
tag,
tag +
'If experiencing problems with node_modules try `yarn kbn bootstrap --force-install` or as a last resort `yarn kbn reset && yarn kbn bootstrap`',
tag,
].join('\n')
);
throw createCliError(
`[${name}] exited with code [${code}]${output ? `\n output:\n${indent(4, output)}` : ''}`
);
}
/**
* @param {import('./log.mjs').Log} log
* @param {string[]} inputArgs
* @param {{ quiet?: boolean; offline?: boolean, env?: Record<string, string> } | undefined} opts
*/
async function runBazel(log, inputArgs, opts = undefined) {
const bazel = (await getBazelRunner()).runBazel;
const args = [...(opts?.offline ? ['--config=offline'] : []), ...inputArgs];
log.debug(`> bazel ${args.join(' ')}`);
await bazel(args, {
env: opts?.env,
cwd: REPO_ROOT,
quiet: opts?.quiet,
logPrefix: Color.info('[bazel]'),
onErrorExit(code, output) {
throwBazelError(log, 'bazel', code, output);
},
});
}
/**
*
* @param {import('./log.mjs').Log} log
* @param {{ offline: boolean } | undefined} opts
*/
export async function watch(log, opts = undefined) {
const ibazel = (await getBazelRunner()).runIBazel;
const args = [
// --run_output=false arg will disable the iBazel notifications about gazelle
// and buildozer when running it. Could also be solved by adding a root
// `.bazel_fix_commands.json` but it's not needed at the moment
'--run_output=false',
'build',
'//packages:build',
'--show_result=1',
...(opts?.offline ? ['--config=offline'] : []),
];
log.debug(`> ibazel ${args.join(' ')}`);
await ibazel(args, {
cwd: REPO_ROOT,
logPrefix: Color.info('[ibazel]'),
onErrorExit(code, output) {
throwBazelError(log, 'ibazel', code, output);
},
});
}
/**
* @param {import('./log.mjs').Log} log
* @param {{ quiet?: boolean } | undefined} opts
*/
export async function clean(log, opts = undefined) {
await runBazel(log, ['clean'], {
quiet: opts?.quiet,
});
log.success('soft cleaned bazel');
}
/**
* @param {import('./log.mjs').Log} log
* @param {{ quiet?: boolean } | undefined} opts
*/
export async function expungeCache(log, opts = undefined) {
await runBazel(log, ['clean', '--expunge'], {
quiet: opts?.quiet,
});
log.success('hard cleaned bazel');
}
/**
* @param {import('./log.mjs').Log} log
*/
export async function cleanDiskCache(log) {
const args = ['info', 'repository_cache'];
log.debug(`> bazel ${args.join(' ')}`);
const repositoryCachePath = spawnSync('bazel', args);
await cleanPaths(log, [
Path.resolve(Path.dirname(repositoryCachePath), 'disk-cache'),
Path.resolve(repositoryCachePath),
]);
log.success('removed disk caches');
}
/**
* @param {import('./log.mjs').Log} log
* @param {{ offline?: boolean, quiet?: boolean } | undefined} opts
*/
export async function installYarnDeps(log, opts = undefined) {
await runBazel(log, ['run', '@nodejs//:yarn'], {
offline: opts?.offline,
quiet: opts?.quiet,
env: {
SASS_BINARY_SITE:
'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-sass',
RE2_DOWNLOAD_MIRROR:
'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2',
},
});
log.success('yarn deps installed');
}
/**
* @param {import('./log.mjs').Log} log
* @param {{ offline?: boolean, quiet?: boolean } | undefined} opts
*/
export async function buildPackages(log, opts = undefined) {
await runBazel(log, ['build', '//packages:build', '--show_result=1'], {
offline: opts?.offline,
quiet: opts?.quiet,
});
log.success('packages built');
}
/**
* @param {string} versionFilename
* @returns
*/
function readBazelToolsVersionFile(versionFilename) {
const version = Fs.readFileSync(Path.resolve(REPO_ROOT, versionFilename), 'utf8').trim();
if (!version) {
throw new Error(
`Failed to read bazel tools versions\n ${versionFilename} file does not contain any version set`
);
}
return version;
}
/**
* @param {import('./log.mjs').Log} log
*/
export function tryRemovingBazeliskFromYarnGlobal(log) {
try {
log.debug('Checking if Bazelisk is installed on the yarn global scope');
const stdout = spawnSync('yarn', ['global', 'list']);
if (stdout.includes(`@bazel/bazelisk@`)) {
log.debug('Bazelisk was found on yarn global scope, removing it');
spawnSync('yarn', ['global', 'remove', `@bazel/bazelisk`]);
log.info(`bazelisk was installed on Yarn global packages and is now removed`);
return true;
}
return false;
} catch {
return false;
}
}
/**
* @param {import('./log.mjs').Log} log
*/
export function isInstalled(log) {
try {
log.debug('getting bazel version');
const stdout = spawnSync('bazel', ['--version']).trim();
const bazelVersion = readBazelToolsVersionFile('.bazelversion');
if (stdout === `bazel ${bazelVersion}`) {
return true;
} else {
log.info(`Bazel is installed (${stdout}), but was expecting ${bazelVersion}`);
return false;
}
} catch {
return false;
}
}
/**
* @param {import('./log.mjs').Log} log
*/
export function ensureInstalled(log) {
if (isInstalled(log)) {
return;
}
// Install bazelisk if not installed
log.debug(`reading bazel tools versions from version files`);
const bazeliskVersion = readBazelToolsVersionFile('.bazeliskversion');
const bazelVersion = readBazelToolsVersionFile('.bazelversion');
log.info(`installing Bazel tools`);
log.debug(
`bazelisk is not installed. Installing @bazel/bazelisk@${bazeliskVersion} and bazel@${bazelVersion}`
);
spawnSync('npm', ['install', '--global', `@bazel/bazelisk@${bazeliskVersion}`], {
env: {
USE_BAZEL_VERSION: bazelVersion,
},
});
const isBazelBinAvailableAfterInstall = isInstalled(log);
if (!isBazelBinAvailableAfterInstall) {
throw new Error(
`an error occurred when installing the Bazel tools. Please make sure you have access to npm globally installed modules on your $PATH`
);
}
log.success(`bazel tools installed`);
}

30
kbn_pm/src/lib/clean.mjs Normal file
View file

@ -0,0 +1,30 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Fs from 'fs';
import Fsp from 'fs/promises';
import Path from 'path';
/**
*
* @param {import('@kbn/some-dev-log').SomeDevLog} log
* @param {string[]} paths
*/
export async function cleanPaths(log, paths) {
for (const path of paths) {
if (!Fs.existsSync(path)) {
continue;
}
log.info('deleting', Path.relative(process.cwd(), path));
await Fsp.rm(path, {
recursive: true,
force: true,
});
}
}

View file

@ -0,0 +1,48 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { isObj } from './obj_helpers.mjs';
/** @typedef {Error & { showHelp: boolean, exitCode?: number }} CliError */
/**
* Create a CliError instance
* @param {string} message
* @param {{ showHelp?: boolean, exitCode?: number } | undefined} options
* @returns {CliError}
*/
export function createCliError(message, options = undefined) {
/** @type {true} */
const __isCliError = true;
return Object.assign(new Error(message), {
__isCliError,
showHelp: options?.showHelp || false,
exitCode: options?.exitCode,
});
}
/**
* @param {string} message
*/
export function createFlagError(message) {
return createCliError(message, {
showHelp: true,
exitCode: 1,
});
}
/**
* Determine if the passed value is a CliError
*
* @param {unknown} error
* @returns {error is CliError}
*/
export function isCliError(error) {
return isObj(error) && !!error.__isCliError;
}

49
kbn_pm/src/lib/colors.mjs Normal file
View file

@ -0,0 +1,49 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
/**
* Print an error title, prints on a red background with white text
* @param {string} txt
*/
export const err = (txt) => `\x1b[41m\x1b[37m${txt}\x1b[39m\x1b[49m`;
/**
* Print some text with some spacing, with very high contrast and bold text
* @param {string} txt
*/
export const title = (txt) => `\x1b[100m\x1b[37m\x1b[1m ${txt} \x1b[22m\x1b[39m\x1b[49m`;
/**
* Print the yellow warning label
* @param {string} txt
*/
export const warning = (txt) => `\x1b[33m${txt}\x1b[39m`;
/**
* Print the simple blue info label
* @param {string} txt
*/
export const info = (txt) => `\x1b[94m${txt}\x1b[39m`;
/**
* Print a green success label
* @param {string} txt
*/
export const success = (txt) => `\x1b[32m${txt}\x1b[39m`;
/**
* Print the simple dim debug label
* @param {string} txt
*/
export const debug = (txt) => `\x1b[2m${txt}\x1b[22m`;
/**
* Print the bright verbose label
* @param {string} txt
*/
export const verbose = (txt) => `\x1b[35m${txt}\x1b[39m`;

71
kbn_pm/src/lib/command.ts Normal file
View file

@ -0,0 +1,71 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import type { Log } from './log.mjs';
import type { Args } from './args.mjs';
/**
* Helper function to easily time specific parts of a kbn command. Does not produce
* timings unless the reportTimings config is also defined
*/
export type SubCommandTimeFn = <T>(id: string, block: () => Promise<T>) => Promise<T>;
/**
* Argument passed to the command run function
*/
export interface CommandRunContext {
args: Args;
log: Log;
time: SubCommandTimeFn;
}
/**
* Description of a command that can be run by kbn/pm
*/
export interface Command {
/**
* The name of the command
*/
name: string;
/**
* Additional usage details which should be added after the command name
*/
usage?: string;
/**
* Text to follow the name of the command in the help output
*/
intro?: string;
/**
* Summary of the functionality for this command, printed
* between the usage and flags help in the help output
*/
description?: string;
/**
* Description of the flags this command accepts
*/
flagsHelp?: string;
/**
* Function which executes the command.
*/
run(ctx: CommandRunContext): Promise<void>;
/**
* Configuration to send timing data to ci-stats for this command. If the
* time() fn is used those timing records will use the group from this config.
* If this config is not used then the time() fn won't report any data.
*/
reportTimings?: {
group: string;
id: string;
};
}

View file

@ -0,0 +1,75 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import Fs from 'fs';
import { REPO_ROOT } from './paths.mjs';
/**
* Attempt to load the synthetic package map, if bootstrap hasn't run successfully
* this might fail.
* @param {import('@kbn/some-dev-log').SomeDevLog} log
* @returns {Promise<import('@kbn/synthetic-package-map').PackageMap>}
*/
async function tryToGetSyntheticPackageMap(log) {
try {
const { readPackageMap } = await import('@kbn/synthetic-package-map');
return readPackageMap();
} catch (error) {
log.warning(
'unable to load the synthetic package map, so target directories in synthetic packages will not be cleaned'
);
return new Map();
}
}
/**
* @param {string} packageDir
* @returns {string[]}
*/
export function readCleanPatterns(packageDir) {
let json;
try {
const path = Path.resolve(packageDir, 'package.json');
json = JSON.parse(Fs.readFileSync(path, 'utf8'));
} catch (error) {
if (error.code === 'ENOENT') {
return [];
}
throw error;
}
/** @type {string[]} */
const patterns = json.kibana?.clean?.extraPatterns ?? [];
return patterns.flatMap((pattern) => {
const absolute = Path.resolve(packageDir, pattern);
// sanity check to make sure that resolved patterns are "relative" to
// the package dir, if they start with a . then they traverse out of
// the package dir so we drop them
if (Path.relative(packageDir, absolute).startsWith('.')) {
return [];
}
return absolute;
});
}
/**
* @param {import('@kbn/some-dev-log').SomeDevLog} log
* @returns {Promise<string[]>}
*/
export async function findPluginCleanPaths(log) {
const packageMap = await tryToGetSyntheticPackageMap(log);
return [...packageMap.values()].flatMap((repoRelativePath) => {
const pkgDir = Path.resolve(REPO_ROOT, repoRelativePath);
return [Path.resolve(pkgDir, 'target'), ...readCleanPatterns(pkgDir)];
});
}

51
kbn_pm/src/lib/fs.mjs Normal file
View file

@ -0,0 +1,51 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Fs from 'fs';
/**
* @param {string} path
* @returns {string}
*/
export function maybeRealpath(path) {
try {
return Fs.realpathSync.native(path);
} catch (error) {
if (error.code !== 'ENOENT') {
throw error;
}
}
return path;
}
/**
* @param {string} path
* @returns {boolean}
*/
export function isDirectory(path) {
try {
const stat = Fs.statSync(path);
return stat.isDirectory();
} catch (error) {
return false;
}
}
/**
* @param {string} path
* @returns {boolean}
*/
export function isFile(path) {
try {
const stat = Fs.statSync(path);
return stat.isFile();
} catch (error) {
return false;
}
}

56
kbn_pm/src/lib/help.mjs Normal file
View file

@ -0,0 +1,56 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { COMMANDS } from '../commands/index.mjs';
import { dedent, indent } from './indent.mjs';
import { title } from './colors.mjs';
/**
* @param {string | undefined} cmdName
* @returns {Promise<string>}
*/
export async function getHelp(cmdName = undefined) {
const cmd = cmdName && COMMANDS.find((c) => c.name === cmdName);
/**
* @param {number} depth
* @param {import('./command').Command} cmd
* @returns {string[]}
*/
const cmdLines = (depth, cmd) => {
const intro = cmd.intro ? dedent(cmd.intro) : '';
const desc = cmd.description ? dedent(cmd.description) : '';
const flags = cmd.flagsHelp ? dedent(cmd.flagsHelp) : '';
return [
indent(
depth,
`${title(`yarn kbn ${cmd.name}${cmd.usage ? ` ${cmd.usage}` : ''}`)}${
intro ? ` ${intro}` : ''
}`
),
'',
...(desc ? [indent(depth + 2, desc), ''] : []),
...(flags ? [indent(depth + 2, 'Flags:'), indent(depth + 4, flags), ''] : []),
];
};
if (cmd) {
return ['', ...cmdLines(0, cmd)].join('\n');
}
const lines = [
'Usage:',
' yarn kbn <command> [...flags]',
'',
'Commands:',
...COMMANDS.map((cmd) => cmdLines(2, cmd)).flat(),
];
return lines.join('\n');
}

55
kbn_pm/src/lib/indent.mjs Normal file
View file

@ -0,0 +1,55 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
const NON_WS_RE = /\S/;
/**
* @param {string} line
*/
const nonWsStart = (line) => line.match(NON_WS_RE)?.index ?? line.length;
/**
* Dedent the string, trimming all empty lines from the beginning of
* `txt` and finding the first line with non-whitespace characters, then
* subtracting that line's indent from all subsequent lines
* @param {TemplateStringsArray | string} txts
* @param {...any} vars
*/
export function dedent(txts, ...vars) {
/** @type {string[]} */
const lines = (
Array.isArray(txts) ? txts.reduce((acc, txt, i) => `${acc}${vars[i - 1]}${txt}`) : txts
).split('\n');
while (lines.length && lines[0].trim() === '') {
lines.shift();
}
/** @type {number | undefined} */
let depth;
return lines
.map((l) => {
if (depth === undefined) {
depth = nonWsStart(l);
}
return l.slice(Math.min(nonWsStart(l), depth));
})
.join('\n');
}
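// Illustrative example (not part of the original file):
//   dedent('\n    first\n      second\n') === 'first\n  second\n'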
/**
* @param {number} width
* @param {string} txt
* @returns {string}
*/
export const indent = (width, txt) =>
txt
.split('\n')
.map((l) => `${' '.repeat(width)}${l}`)
.join('\n');

120
kbn_pm/src/lib/log.mjs Normal file
View file

@ -0,0 +1,120 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { format } from 'util';
import * as Colors from './colors.mjs';
/** @typedef {import('@kbn/some-dev-log').SomeDevLog} SomeDevLog */
/**
* @implements {SomeDevLog}
*/
export class Log {
#flags;
/**
*
* @param {import('@kbn/some-dev-log').SomeLogLevel} level
*/
constructor(level) {
this.#flags = {
error: true,
success: true,
info: level !== 'quiet',
warning: level !== 'quiet',
debug: level === 'debug' || level === 'verbose',
verbose: level === 'verbose',
};
}
/**
* Log an error message
* @param {string} msg
* @param {...any} rest
*/
error(msg, ...rest) {
if (this.#flags.error) {
this._fmt(' ERROR ', Colors.err, msg, rest);
}
}
/**
* Log a warning message, hidden when using --quiet
* @param {string} msg
* @param {...any} rest
*/
warning(msg, ...rest) {
if (this.#flags.warning) {
this._fmt('warn', Colors.warning, msg, rest);
}
}
/**
* Log a standard message to the log
* @param {string} msg
* @param {...any} rest
*/
info(msg, ...rest) {
if (this.#flags.info) {
this._fmt('info', Colors.info, msg, rest);
}
}
/**
* Log a success message
* @param {string} msg
* @param {...any} rest
*/
success(msg, ...rest) {
if (this.#flags.success) {
this._fmt('success', Colors.success, msg, rest);
}
}
/**
* Log a debug message, only shown when using --debug or --verbose
* @param {string} msg
* @param {...any} rest
*/
debug(msg, ...rest) {
if (this.#flags.debug) {
this._fmt('debg', Colors.debug, msg, rest);
}
}
/**
* Log a verbose message, only shown when using --verbose
* @param {string} msg
* @param {...any} rest
*/
verbose(msg, ...rest) {
if (this.#flags.verbose) {
this._fmt('verb', Colors.verbose, msg, rest);
}
}
/**
* @param {string} tag
* @param {(txt: string) => string} color
* @param {string} msg
* @param {...any} rest
*/
_fmt(tag, color, msg, rest) {
const lines = format(msg, ...rest).split('\n');
const padding = ' '.repeat(tag.length + 1);
this._write(`${color(tag)} ${lines.map((l, i) => (i > 0 ? `${padding}${l}` : l)).join('\n')}`);
}
/**
* @param {string} txt
*/
_write(txt) {
console.log(txt);
}
}
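A brief usage sketch of the logger; `'info'` is assumed to be a valid `SomeLogLevel` value from `@kbn/some-dev-log`:
```
import { Log } from './log.mjs';

// levels other than 'quiet', 'debug' and 'verbose' behave like 'info'
const log = new Log('info');

log.info('bootstrapping %d packages', 12);
log.warning('yarn.lock appears to be out of date');
log.debug('only printed when the level is debug or verbose');
```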

View file

@ -0,0 +1,13 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
/**
* @param {unknown} v
* @returns {v is Record<string, unknown>}
*/
export const isObj = (v) => typeof v === 'object' && v !== null;

View file

@ -6,4 +6,6 @@
* Side Public License, v 1.
*/
export * from './src';
import Path from 'path';
export const REPO_ROOT = Path.resolve(Path.dirname(new URL(import.meta.url).pathname), '../../..');

113
kbn_pm/src/lib/spawn.mjs Normal file
View file

@ -0,0 +1,113 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import ChildProcess from 'child_process';
import Readline from 'readline';
import { createCliError } from './cli_error.mjs';
import { REPO_ROOT } from './paths.mjs';
import { indent } from './indent.mjs';
/** @typedef {{ cwd?: string, env?: Record<string, string> }} SpawnOpts */
/**
 * Run a child process and return its stdout
* @param {string} cmd
* @param {string[]} args
* @param {undefined | (SpawnOpts & { description?: string })} opts
*/
export function spawnSync(cmd, args, opts = undefined) {
const result = ChildProcess.spawnSync(cmd === 'node' ? process.execPath : cmd, args, {
cwd: opts?.cwd ?? REPO_ROOT,
encoding: 'utf8',
env: {
...process.env,
...opts?.env,
},
});
if (result.status !== null && result.status > 0) {
throw createCliError(
`[${opts?.description ?? cmd}] exited with ${result.status}:\n${
result.stdout.trim()
? ` stdout:\n${indent(4, result.stdout.trim())}\n\n`
: ' stdout: no output\n'
}${
result.stderr.trim()
? ` stderr:\n${indent(4, result.stderr.trim())}\n\n`
: ' stderr: no output\n'
}`
);
}
return result.stdout;
}
/**
* Print each line of output to the console
* @param {import('stream').Readable} stream
* @param {string | undefined} prefix
*/
async function printLines(stream, prefix) {
const int = Readline.createInterface({
input: stream,
crlfDelay: Infinity,
});
for await (const line of int) {
console.log(prefix ? `${prefix} ${line}` : line);
}
}
/**
* @param {import('events').EventEmitter} emitter
* @param {string} event
* @returns {Promise<any>}
*/
function once(emitter, event) {
return new Promise((resolve) => {
emitter.once(event, resolve);
});
}
/**
* Run a child process and print the output to the log
* @param {string} cmd
* @param {string[]} args
* @param {undefined | (SpawnOpts & { logPrefix?: string })} options
*/
export async function spawnStreaming(cmd, args, options = undefined) {
const proc = ChildProcess.spawn(cmd, args, {
env: {
...process.env,
...options?.env,
},
cwd: options?.cwd,
stdio: ['ignore', 'pipe', 'pipe'],
});
await Promise.all([
printLines(proc.stdout, options?.logPrefix),
printLines(proc.stderr, options?.logPrefix),
// Wait for process to exit, or error
Promise.race([
once(proc, 'exit').then((code) => {
if (typeof code !== 'number' || code === 0) {
return;
}
throw new Error(`[${cmd}] exited with code [${code}]`);
}),
once(proc, 'error').then((error) => {
throw error;
}),
]),
]);
}
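A brief usage sketch of the two spawn helpers (the commands shown are illustrative):
```
import { spawnSync, spawnStreaming } from './spawn.mjs';

// capture the stdout of a short-lived command; throws a CLI error on a non-zero exit
const nodeVersion = spawnSync('node', ['--version'], { description: 'node version' });
console.log('node version is %s', nodeVersion.trim());

// stream a longer-running command to the console, prefixing each line of output
await spawnStreaming('yarn', ['install'], { logPrefix: '[yarn]' });
```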

16
kbn_pm/tsconfig.json Normal file
View file

@ -0,0 +1,16 @@
{
"extends": "../tsconfig.base.json",
"compilerOptions": {
"outDir": "target",
"allowJs": true,
"checkJs": true,
"composite": false,
"target": "ES2022",
"module": "ESNext"
},
"include": [
"src/**/*.mjs",
"src/**/*.ts",
],
"exclude": []
}

View file

@ -229,9 +229,6 @@
{
"id": "kibCoreChromePluginApi"
},
{
"id": "kibCoreHttpPluginApi"
},
{
"id": "kibCoreSavedObjectsPluginApi"
},
@ -450,6 +447,9 @@
{
"label": "Build tooling",
"items": [
{
"id": "kibDevDocsOpsKbnPm"
},
{
"id": "kibDevDocsOpsOptimizer"
},
@ -487,6 +487,9 @@
},
{
"id": "kibDevDocsOpsEslintPluginImports"
},
{
"id": "kibDevDocsOpsKbnYarnLockValidator"
}
]
},
@ -519,6 +522,12 @@
},
{
"id": "kibDevDocsOpsEs"
},
{
"id": "kibDevDocsOpsSomeDevLog"
},
{
"id": "kibDevDocsOpsDevCliRunner"
}
]
}

View file

@ -597,8 +597,8 @@
"@kbn/performance-testing-dataset-extractor": "link:bazel-bin/packages/kbn-performance-testing-dataset-extractor",
"@kbn/plugin-generator": "link:bazel-bin/packages/kbn-plugin-generator",
"@kbn/plugin-helpers": "link:bazel-bin/packages/kbn-plugin-helpers",
"@kbn/pm": "link:packages/kbn-pm",
"@kbn/scalability-simulation-generator": "link:bazel-bin/packages/kbn-scalability-simulation-generator",
"@kbn/some-dev-log": "link:bazel-bin/packages/kbn-some-dev-log",
"@kbn/sort-package-json": "link:bazel-bin/packages/kbn-sort-package-json",
"@kbn/spec-to-console": "link:bazel-bin/packages/kbn-spec-to-console",
"@kbn/stdio-dev-helpers": "link:bazel-bin/packages/kbn-stdio-dev-helpers",
@ -612,6 +612,7 @@
"@kbn/type-summarizer": "link:bazel-bin/packages/kbn-type-summarizer",
"@kbn/type-summarizer-cli": "link:bazel-bin/packages/kbn-type-summarizer-cli",
"@kbn/type-summarizer-core": "link:bazel-bin/packages/kbn-type-summarizer-core",
"@kbn/yarn-lock-validator": "link:bazel-bin/packages/kbn-yarn-lock-validator",
"@loaders.gl/polyfills": "^2.3.5",
"@mapbox/vector-tile": "1.3.1",
"@octokit/rest": "^16.35.0",
@ -648,7 +649,6 @@
"@types/chroma-js": "^1.4.2",
"@types/chromedriver": "^81.0.1",
"@types/classnames": "^2.2.9",
"@types/cmd-shim": "^2.0.0",
"@types/color": "^3.0.3",
"@types/compression-webpack-plugin": "^2.0.2",
"@types/cytoscape": "^3.14.0",
@ -876,6 +876,7 @@
"@types/kbn__shared-ux-services": "link:bazel-bin/packages/kbn-shared-ux-services/npm_module_types",
"@types/kbn__shared-ux-storybook": "link:bazel-bin/packages/kbn-shared-ux-storybook/npm_module_types",
"@types/kbn__shared-ux-utility": "link:bazel-bin/packages/kbn-shared-ux-utility/npm_module_types",
"@types/kbn__some-dev-log": "link:bazel-bin/packages/kbn-some-dev-log/npm_module_types",
"@types/kbn__sort-package-json": "link:bazel-bin/packages/kbn-sort-package-json/npm_module_types",
"@types/kbn__std": "link:bazel-bin/packages/kbn-std/npm_module_types",
"@types/kbn__stdio-dev-helpers": "link:bazel-bin/packages/kbn-stdio-dev-helpers/npm_module_types",
@ -894,6 +895,7 @@
"@types/kbn__utility-types": "link:bazel-bin/packages/kbn-utility-types/npm_module_types",
"@types/kbn__utility-types-jest": "link:bazel-bin/packages/kbn-utility-types-jest/npm_module_types",
"@types/kbn__utils": "link:bazel-bin/packages/kbn-utils/npm_module_types",
"@types/kbn__yarn-lock-validator": "link:bazel-bin/packages/kbn-yarn-lock-validator/npm_module_types",
"@types/license-checker": "15.0.0",
"@types/listr": "^0.14.0",
"@types/loader-utils": "^1.1.3",
@ -913,7 +915,6 @@
"@types/moment-duration-format": "^2.2.3",
"@types/moment-timezone": "^0.5.30",
"@types/mustache": "^0.8.31",
"@types/ncp": "^2.0.1",
"@types/nock": "^10.0.3",
"@types/node": "16.11.41",
"@types/node-fetch": "^2.6.0",
@ -946,7 +947,6 @@
"@types/react-test-renderer": "^16.9.1",
"@types/react-virtualized": "^9.18.7",
"@types/react-vis": "^1.11.9",
"@types/read-pkg": "^4.0.0",
"@types/recompose": "^0.30.6",
"@types/reduce-reducers": "^1.0.0",
"@types/redux-actions": "^2.6.1",
@ -961,7 +961,6 @@
"@types/source-map-support": "^0.5.3",
"@types/stats-lite": "^2.2.0",
"@types/strip-ansi": "^5.2.1",
"@types/strong-log-transformer": "^1.0.0",
"@types/styled-components": "^5.1.0",
"@types/supertest": "^2.0.5",
"@types/tapable": "^1.0.6",
@ -980,7 +979,6 @@
"@types/webpack-env": "^1.15.3",
"@types/webpack-merge": "^4.1.5",
"@types/webpack-sources": "^0.1.4",
"@types/write-pkg": "^3.1.0",
"@types/xml-crypto": "^1.4.2",
"@types/xml2js": "^0.4.5",
"@types/yargs": "^15.0.0",
@ -1011,7 +1009,6 @@
"chokidar": "^3.4.3",
"chromedriver": "^103.0.0",
"clean-webpack-plugin": "^3.0.0",
"cmd-shim": "^2.1.0",
"compression-webpack-plugin": "^4.0.0",
"copy-webpack-plugin": "^6.0.2",
"cpy": "^8.1.1",
@ -1111,9 +1108,7 @@
"mochawesome-merge": "^4.2.1",
"mock-fs": "^5.1.2",
"ms-chromium-edge-driver": "^0.5.1",
"multimatch": "^4.0.0",
"mutation-observer": "^1.0.3",
"ncp": "^2.0.0",
"nock": "12.0.3",
"node-sass": "7.0.1",
"null-loader": "^3.0.0",
@ -1133,7 +1128,6 @@
"q": "^1.5.1",
"raw-loader": "^3.1.0",
"react-test-renderer": "^16.14.0",
"read-pkg": "^5.2.0",
"regenerate": "^1.4.0",
"resolve": "^1.22.0",
"rxjs-marbles": "^5.0.6",
@ -1144,7 +1138,6 @@
"sort-package-json": "^1.53.1",
"source-map": "^0.7.3",
"string-replace-loader": "^2.2.0",
"strong-log-transformer": "^2.1.0",
"style-loader": "^1.1.3",
"stylelint": "13.8.0",
"stylelint-scss": "^3.18.0",
@ -1161,7 +1154,6 @@
"ts-morph": "^13.0.2",
"tsd": "^0.20.0",
"typescript": "4.6.3",
"unlazy-loader": "^0.1.3",
"url-loader": "^2.2.0",
"val-loader": "^1.1.1",
"vinyl-fs": "^3.0.3",
@ -1172,7 +1164,6 @@
"webpack-dev-server": "^3.11.0",
"webpack-merge": "^4.2.2",
"webpack-sources": "^1.4.1",
"write-pkg": "^4.0.0",
"xml-crypto": "^2.1.3",
"xmlbuilder": "13.0.2",
"yargs": "^15.4.1"

View file

@ -170,6 +170,7 @@ filegroup(
"//packages/kbn-shared-ux-services:build",
"//packages/kbn-shared-ux-storybook:build",
"//packages/kbn-shared-ux-utility:build",
"//packages/kbn-some-dev-log:build",
"//packages/kbn-sort-package-json:build",
"//packages/kbn-spec-to-console:build",
"//packages/kbn-std:build",
@ -194,6 +195,7 @@ filegroup(
"//packages/kbn-utility-types-jest:build",
"//packages/kbn-utility-types:build",
"//packages/kbn-utils:build",
"//packages/kbn-yarn-lock-validator:build",
"//packages/shared-ux/avatar/solution:build",
"//packages/shared-ux/button_toolbar:build",
"//packages/shared-ux/button/exit_full_screen:build",
@ -367,6 +369,7 @@ filegroup(
"//packages/kbn-shared-ux-services:build_types",
"//packages/kbn-shared-ux-storybook:build_types",
"//packages/kbn-shared-ux-utility:build_types",
"//packages/kbn-some-dev-log:build_types",
"//packages/kbn-sort-package-json:build_types",
"//packages/kbn-std:build_types",
"//packages/kbn-stdio-dev-helpers:build_types",
@ -385,6 +388,7 @@ filegroup(
"//packages/kbn-utility-types-jest:build_types",
"//packages/kbn-utility-types:build_types",
"//packages/kbn-utils:build_types",
"//packages/kbn-yarn-lock-validator:build_types",
"//packages/shared-ux/avatar/solution:build_types",
"//packages/shared-ux/button_toolbar:build_types",
"//packages/shared-ux/button/exit_full_screen:build_types",

View file

@ -30,9 +30,6 @@ instead be:
"@kbn/i18n": "link:../../kibana/packages/kbn-i18n"
```
How all of this works is described in more detail in the
[`@kbn/pm` docs](./kbn-pm#how-it-works).
## Creating a new package
Create a new sub-folder. The name of the folder should mirror the `name` in the

View file

@ -30,12 +30,10 @@ export class BazelPackage {
const pkg = readPackageJson(Path.resolve(dir, 'package.json'));
let buildBazelContent;
if (pkg.name !== '@kbn/pm') {
try {
buildBazelContent = await Fsp.readFile(Path.resolve(dir, 'BUILD.bazel'), 'utf8');
} catch (error) {
throw new Error(`unable to read BUILD.bazel file in [${dir}]: ${error.message}`);
}
try {
buildBazelContent = await Fsp.readFile(Path.resolve(dir, 'BUILD.bazel'), 'utf8');
} catch (error) {
throw new Error(`unable to read BUILD.bazel file in [${dir}]: ${error.message}`);
}
return new BazelPackage(normalizePath(Path.relative(REPO_ROOT, dir)), pkg, buildBazelContent);

View file

@ -24,6 +24,10 @@ export interface ParsedPackageJson {
/** Is this package only intended for dev? */
devOnly?: boolean;
};
/** Scripts defined in the package.json file */
scripts?: {
[key: string]: string | undefined;
};
/** All other fields in the package.json are typed as unknown as we don't care what they are */
[key: string]: unknown;
}

View file

@ -7,6 +7,7 @@ PKG_REQUIRE_NAME = "@kbn/bazel-runner"
SOURCE_FILES = glob(
[
"src/**/*.js",
"src/**/*.ts",
],
exclude = [
@ -81,6 +82,7 @@ ts_project(
args = ['--pretty'],
srcs = SRCS,
deps = TYPES_DEPS,
allow_js = True,
declaration = True,
declaration_map = True,
emit_declaration_only = True,

View file

@ -9,29 +9,38 @@ tags: ['kibana', 'dev', 'contributor', 'operations', 'bazel', 'runner']
This is a package with helpers for invoking bazel and iBazel commands, used everywhere we programmatically run bazel.
## async runBazel(options: BazelRunOptions)
## API
### async runBazel(args: string[], options: BazelRunOptions)
It runs bazel in the background with the given arguments and options
## async runIBazel(options: BazelRunOptions)
### async runIBazel(args: string[], options: BazelRunOptions)
It runs iBazel in the background with the given arguments and options
### BazelRunOptions
#### BazelRunOptions
```
{
// a logger to print the command output
log: ToolingLog;
// the arguments to run the bazel or ibazel with
bazelArgs: string[];
log: SomeDevLog;
// run bazel with the no connection compatible config or not
offline?: boolean;
// additional options to pass into execa process
execaOpts?: execa.Options;
// environment variables to set in the process running Bazel
env?: Record<string, string>;
// directory to run bazel in
cwd?: string;
// text to include at the beginning of each line of log output produced by bazel
logPrefix?: string;
// handler to implement custom error handling
onErrorExit?: (code: number) => void;
}
```
### execa.Options
Info around available execa options can be found [here](https://github.com/sindresorhus/execa/blob/9a157b3bc247b19d55cc6fbec77800a5ac348d19/readme.md#options)
## NOTE:
This code is needed in order to properly bootstrap the repository. As such, it can't have any NPM dependencies or require being built. This code is loaded directly into the node.js process that bootstraps the repository from source, while also being built into a package and exposed to the rest of the package system. Please consider this when making any changes to the source.
The code is still type-checked as JS with JSDoc comments, and a single .ts file provides the interfaces used by the JS validation, which are also publicly available to package consumers.
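A minimal usage sketch based on the options described above (the build target is just an example):
```
import { runBazel } from '@kbn/bazel-runner';

await runBazel(['build', '//packages:build', '--show_result=1'], {
  quiet: true,
  onErrorExit(code, output) {
    // output is only collected when `quiet` is true
    throw new Error(`bazel exited with code ${code}:\n${output}`);
  },
});
```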

View file

@ -0,0 +1,128 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
const ChildProcess = require('child_process');
const Readline = require('readline');
/**
* Print each line of output to the console
* @param {import('stream').Readable} stream
* @param {string | undefined} prefix
*/
async function printLines(stream, prefix) {
const int = Readline.createInterface({
input: stream,
crlfDelay: Infinity,
});
for await (const line of int) {
console.log(prefix ? `${prefix} ${line}` : line);
}
}
/**
* Buffer each line of output to an array so that it can be printed if necessary
* @param {import('stream').Readable} stream
* @param {string[]} buffer
*/
async function bufferLines(stream, buffer) {
const int = Readline.createInterface({
input: stream,
crlfDelay: Infinity,
});
for await (const line of int) {
buffer.push(line);
}
}
/**
* @param {import('events').EventEmitter} emitter
* @param {string} event
* @returns {Promise<any>}
*/
function once(emitter, event) {
return new Promise((resolve) => {
emitter.once(event, resolve);
});
}
/**
* @param {'bazel' | 'ibazel'} runner
* @param {string[]} args
* @param {import('./types').BazelRunOptions | undefined} options
*/
async function runBazelRunner(runner, args, options = undefined) {
const proc = ChildProcess.spawn(runner, args, {
env: {
...process.env,
...options?.env,
},
cwd: options?.cwd,
stdio: ['ignore', 'pipe', 'pipe'],
});
/** @type {string[]} */
const buffer = [];
await Promise.all([
options?.quiet
? Promise.all([bufferLines(proc.stdout, buffer), bufferLines(proc.stderr, buffer)])
: Promise.all([
printLines(proc.stdout, options?.logPrefix),
printLines(proc.stderr, options?.logPrefix),
]),
// Wait for process to exit, or error
Promise.race([
once(proc, 'exit').then((code) => {
if (typeof code !== 'number' || code === 0) {
return;
}
if (options?.onErrorExit) {
options.onErrorExit(code, buffer.join('\n'));
} else {
throw new Error(
`The bazel command that was running exited with code [${code}]${
buffer.length ? `\n output:\n${buffer.map((l) => ` ${l}`).join('\n')}` : ''
}`
);
}
}),
once(proc, 'error').then((error) => {
throw error;
}),
]),
]);
}
/**
* @param {string[]} args
* @param {import('./types').BazelRunOptions | undefined} options
*/
async function runBazel(args, options = undefined) {
return await runBazelRunner('bazel', args, options);
}
/**
* @param {string[]} args
* @param {import('./types').BazelRunOptions | undefined} options
*/
async function runIBazel(args, options = undefined) {
return await runBazelRunner('ibazel', args, {
...options,
env: {
IBAZEL_USE_LEGACY_WATCHER: '0',
...options?.env,
},
});
}
module.exports = { runBazel, runIBazel };

View file

@ -1,76 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import chalk from 'chalk';
import execa from 'execa';
import * as Rx from 'rxjs';
import { tap } from 'rxjs/operators';
import { ToolingLog } from '@kbn/tooling-log';
import { observeLines } from '@kbn/stdio-dev-helpers';
type BazelCommandRunner = 'bazel' | 'ibazel';
interface BazelRunOptions {
log: ToolingLog;
bazelArgs: string[];
offline?: boolean;
execaOpts?: execa.Options;
}
async function runBazelCommandWithRunner(runner: BazelCommandRunner, options: BazelRunOptions) {
const bazelProc = execa(
runner,
options.offline ? [...options.bazelArgs, '--config=offline'] : options.bazelArgs,
{
...options.execaOpts,
stdio: 'pipe',
preferLocal: true,
}
);
await Promise.all([
// Bazel outputs machine readable output into stdout and human readable output goes to stderr.
// Therefore we need to get both. In order to get errors we need to parse the actual text line
Rx.lastValueFrom(
Rx.merge(
observeLines(bazelProc.stdout!).pipe(
tap((line) => options.log.info(`${chalk.cyan(`[${runner}]`)} ${line}`))
),
observeLines(bazelProc.stderr!).pipe(
tap((line) => options.log.info(`${chalk.cyan(`[${runner}]`)} ${line}`))
)
).pipe(Rx.defaultIfEmpty(undefined))
),
// Wait for process and logs to finish, unsubscribing in the end
bazelProc.catch(() => {
options.log.error(
'HINT: If experiencing problems with node_modules try `yarn kbn bootstrap --force-install` or as last resort `yarn kbn reset && yarn kbn bootstrap`'
);
throw new Error(`The bazel command that was running failed to complete.`);
}),
]);
}
export async function runBazel(options: BazelRunOptions) {
await runBazelCommandWithRunner('bazel', options);
}
export async function runIBazel(options: BazelRunOptions) {
await runBazelCommandWithRunner('ibazel', {
...options,
execaOpts: {
...options.execaOpts,
env: {
...options.execaOpts?.env,
IBAZEL_USE_LEGACY_WATCHER: '0',
},
},
});
}

View file

@ -6,6 +6,6 @@
* Side Public License, v 1.
*/
export * from './get_cache_folders';
export * from './install_tools';
export * from './yarn';
const { runBazel, runIBazel } = require('./bazel_runner');
module.exports = { runBazel, runIBazel };

View file

@ -0,0 +1,34 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
/**
* Options that can be used to customize how Bazel is run
*/
export interface BazelRunOptions {
/**
* Current working directory to run bazel in
*/
cwd?: string;
/**
* Custom environment variables to define for the bazel process
*/
env?: Record<string, string>;
/**
* Prevent logging bazel output unless there is something wrong
*/
quiet?: boolean;
/**
* Prefix to write before each line of output, does nothing if `quiet` is true.
*/
logPrefix?: string;
/**
 * Error handler which can be used to throw custom error types when bazel fails. Output will
* be empty unless `quiet` is true
*/
onErrorExit?: (code: number, output: string) => void;
}

View file

@ -4,6 +4,8 @@
"declaration": true,
"declarationMap": true,
"emitDeclarationOnly": true,
"allowJs": true,
"checkJs": true,
"outDir": "target_types",
"rootDir": "src",
"stripInternal": false,

View file

@ -52,6 +52,7 @@ TYPES_DEPS = [
"@npm//@types/node",
"@npm//@types/jest",
"//packages/kbn-tooling-log:npm_module_types",
"//packages/kbn-some-dev-log:npm_module_types",
]
jsts_transpiler(

View file

@ -6,7 +6,7 @@
* Side Public License, v 1.
*/
import type { ToolingLog } from '@kbn/tooling-log';
import { SomeDevLog } from '@kbn/some-dev-log';
/**
* Information about how CiStatsReporter should talk to the ci-stats service. Normally
@ -23,7 +23,7 @@ export interface Config {
buildId: string;
}
function validateConfig(log: ToolingLog, config: { [k in keyof Config]: unknown }) {
function validateConfig(log: SomeDevLog, config: { [k in keyof Config]: unknown }) {
const validApiToken = typeof config.apiToken === 'string' && config.apiToken.length !== 0;
if (!validApiToken) {
log.warning('KIBANA_CI_STATS_CONFIG is missing a valid api token, stats will not be reported');
@ -39,7 +39,7 @@ function validateConfig(log: ToolingLog, config: { [k in keyof Config]: unknown
return config as Config;
}
export function parseConfig(log: ToolingLog) {
export function parseConfig(log: SomeDevLog) {
const configJson = process.env.KIBANA_CI_STATS_CONFIG;
if (!configJson) {
log.debug('KIBANA_CI_STATS_CONFIG environment variable not found, disabling CiStatsReporter');

View file

@ -58,6 +58,7 @@ TYPES_DEPS = [
"@npm//@types/jest",
"//packages/kbn-tooling-log:npm_module_types",
"//packages/kbn-ci-stats-core:npm_module_types",
"//packages/kbn-some-dev-log:npm_module_types",
]
jsts_transpiler(

View file

@ -14,11 +14,13 @@ import crypto from 'crypto';
import execa from 'execa';
import Axios, { AxiosRequestConfig } from 'axios';
import { REPO_ROOT, kibanaPackageJson } from '@kbn/utils';
import { parseConfig, Config, CiStatsMetadata } from '@kbn/ci-stats-core';
import type { SomeDevLog } from '@kbn/some-dev-log';
// @ts-expect-error not "public", but necessary to prevent Jest shimming from breaking things
import httpAdapter from 'axios/lib/adapters/http';
import { ToolingLog } from '@kbn/tooling-log';
import { parseConfig, Config, CiStatsMetadata } from '@kbn/ci-stats-core';
import type { CiStatsTestGroupInfo, CiStatsTestRun } from './ci_stats_test_group_types';
const BASE_URL = 'https://ci-stats.kibana.dev';
@ -119,11 +121,11 @@ export class CiStatsReporter {
/**
* Create a CiStatsReporter by inspecting the ENV for the necessary config
*/
static fromEnv(log: ToolingLog) {
static fromEnv(log: SomeDevLog) {
return new CiStatsReporter(parseConfig(log), log);
}
constructor(private readonly config: Config | undefined, private readonly log: ToolingLog) {}
constructor(private readonly config: Config | undefined, private readonly log: SomeDevLog) {}
/**
* Determine if CI_STATS is explicitly disabled by the environment. To determine
@ -327,28 +329,16 @@ export class CiStatsReporter {
}
/**
* In order to allow this code to run before @kbn/utils is built, @kbn/pm will pass
* in the upstreamBranch when calling the timings() method. Outside of @kbn/pm
* we rely on @kbn/utils to find the package.json file.
* In order to allow this code to run before @kbn/utils is built
*/
private getUpstreamBranch() {
// specify the module id in a way that will keep webpack from bundling extra code into @kbn/pm
const hideFromWebpack = ['@', 'kbn/utils'];
// eslint-disable-next-line @typescript-eslint/no-var-requires
const { kibanaPackageJson } = require(hideFromWebpack.join(''));
return kibanaPackageJson.branch;
}
/**
* In order to allow this code to run before @kbn/utils is built, @kbn/pm will pass
* in the kibanaUuid when calling the timings() method. Outside of @kbn/pm
* we rely on @kbn/utils to find the repo root.
* In order to allow this code to run before @kbn/utils is built
*/
private getKibanaUuid() {
// specify the module id in a way that will keep webpack from bundling extra code into @kbn/pm
const hideFromWebpack = ['@', 'kbn/utils'];
// eslint-disable-next-line @typescript-eslint/no-var-requires
const { REPO_ROOT } = require(hideFromWebpack.join(''));
try {
return Fs.readFileSync(Path.resolve(REPO_ROOT, 'data/uuid'), 'utf-8').trim();
} catch (error) {

View file

@ -1,8 +1,17 @@
# @kbn/dev-cli-runner
---
id: kibDevDocsOpsDevCliRunner
slug: /kibana-dev-docs/ops/kbn-dev-cli-runner
title: "@kbn/dev-cli-runner"
description: 'Helper functions for writing little scripts for random build/ci/dev tasks'
date: 2022-07-14
tags: ['kibana', 'dev', 'contributor', 'operations', 'packages', 'scripts', 'cli']
---
## @kbn/dev-cli-runner
Helper functions for writing little scripts for random build/ci/dev tasks.
## Usage
### Usage
Define the function that should validate the CLI arguments and call your task fn:
@ -63,7 +72,7 @@ $ node scripts/my_task
#
```
## API
### API
- ***`run(fn: async ({ flags: Flags, log: ToolingLog, addCleanupTask }) => Promise<void>, options: Options)`***

View file

@ -7,7 +7,7 @@ module.exports = {
* Main JS configuration
*/
{
files: ['**/*.js'],
files: ['**/*.js', '**/*.mjs'],
parser: require.resolve('@babel/eslint-parser'),
plugins: [

View file

@ -7,6 +7,7 @@ PKG_REQUIRE_NAME = "@kbn/plugin-discovery"
SOURCE_FILES = glob(
[
"src/**/*.js",
"src/**/*.ts",
],
exclude = [
@ -36,10 +37,6 @@ NPM_MODULE_EXTRA_FILES = [
# "@npm//name-of-package"
# eg. "@npm//lodash"
RUNTIME_DEPS = [
"@npm//globby",
"@npm//load-json-file",
"@npm//normalize-path",
"@npm//tslib",
]
# In this array place dependencies necessary to build the types, which will include the
@ -54,11 +51,6 @@ RUNTIME_DEPS = [
TYPES_DEPS = [
"@npm//@types/jest",
"@npm//@types/node",
"@npm//@types/normalize-path",
"@npm//globby",
"@npm//load-json-file",
"@npm//normalize-path",
"@npm//tslib",
]
jsts_transpiler(
@ -83,6 +75,7 @@ ts_project(
deps = TYPES_DEPS,
declaration = True,
declaration_map = True,
allow_js = True,
emit_declaration_only = True,
out_dir = "target_types",
root_dir = "src",

View file

@ -11,14 +11,22 @@ At the moment plugins can live in a couple of different places and in the future
to live anywhere in the repository. This is a package that holds custom logic useful to find and
parse those.
## parseKibanaPlatformPlugin
## API
### parseKibanaPlatformPlugin
It returns a platform plugin for a given manifest path
## getPluginSearchPaths
### getPluginSearchPaths
It returns the paths where plugins will be searched for
## simpleKibanaPlatformPluginDiscovery
### simpleKibanaPlatformPluginDiscovery
It finds and returns the new platform plugins
## NOTE:
This code is needed in order to properly bootstrap the repository. As such, it can't have any NPM dependencies or require being built. This code is loaded directly into the node.js process that bootstraps the repository from source, while also being built into a package and exposed to the rest of the package system. Please consider this when making any changes to the source.
The code is still type-checked as JS with JSDoc comments, and a single .ts file provides the interfaces used by the JS validation, which are also publicly available to package consumers.
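A minimal usage sketch of the API described above, assuming the current working directory stands in for the Kibana repo root:
```
import {
  getPluginSearchPaths,
  simpleKibanaPlatformPluginDiscovery,
} from '@kbn/plugin-discovery';

// stand-in for the Kibana repo root in this sketch
const REPO_ROOT = process.cwd();

const scanDirs = getPluginSearchPaths({
  rootDir: REPO_ROOT,
  oss: false,
  examples: false,
});

// find kibana.json manifests in the scan dirs (and any explicit plugin paths) and parse them
const plugins = simpleKibanaPlatformPluginDiscovery(scanDirs, []);
console.log(plugins.map((p) => p.manifest.id));
```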

View file

@ -0,0 +1,65 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
const Path = require('path');
const Fs = require('fs');
/**
* @param {string} path
*/
function safeReadDir(path) {
try {
return Fs.readdirSync(path, {
withFileTypes: true,
});
} catch (error) {
if (error.code === 'ENOENT' || error.code === 'ENOTDIR') {
return [];
}
throw error;
}
}
/**
 * Search `dir` for `kibana.json` files, recursing into child directories up to
 * `depth` additional levels and stopping a branch as soon as a `kibana.json`
 * file is found. Returns the absolute paths of the discovered manifest files.
*
* @param {string} dir
* @param {number} depth
* @returns {string[]}
*/
function findKibanaJsonFiles(dir, depth) {
// if depth = 0 then we just need to determine if there is a kibana.json file in this directory
// and return either that path or an empty array
if (depth === 0) {
const path = Path.resolve(dir, 'kibana.json');
return Fs.existsSync(path) ? [path] : [];
}
// if depth > 0, read the files in this directory; if we find a kibana.json file we can stop,
// otherwise we iterate through the child directories and try to find kibana.json files there.
const files = safeReadDir(dir);
/** @type {string[]} */
const childDirs = [];
for (const ent of files) {
if (ent.isFile()) {
if (ent.name === 'kibana.json') {
return [Path.resolve(dir, ent.name)];
}
} else if (ent.isDirectory()) {
childDirs.push(Path.resolve(dir, ent.name));
}
}
return childDirs.flatMap((dir) => findKibanaJsonFiles(dir, depth - 1));
}
module.exports = { findKibanaJsonFiles };

View file

@ -0,0 +1,19 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
const { parseKibanaPlatformPlugin } = require('./parse_kibana_platform_plugin');
const { getPluginSearchPaths } = require('./plugin_search_paths');
const {
simpleKibanaPlatformPluginDiscovery,
} = require('./simple_kibana_platform_plugin_discovery');
module.exports = {
parseKibanaPlatformPlugin,
getPluginSearchPaths,
simpleKibanaPlatformPluginDiscovery,
};

View file

@ -0,0 +1,30 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
const Fs = require('fs');
const Path = require('path');
/**
* @param {string} path
* @returns {any}
*/
function loadJsonFile(path) {
try {
return JSON.parse(Fs.readFileSync(path, 'utf8'));
} catch (error) {
if (error.code === 'ENOENT') {
throw new Error(`Missing file: ${path}`);
}
throw new Error(
`Unable to read JSON at [${Path.relative(process.cwd(), path)}]: ${error.message}`
);
}
}
module.exports = { loadJsonFile };

View file

@ -6,16 +6,16 @@
* Side Public License, v 1.
*/
import Path from 'path';
import loadJsonFile from 'load-json-file';
const Path = require('path');
export interface KibanaPlatformPlugin {
readonly directory: string;
readonly manifestPath: string;
readonly manifest: KibanaPlatformPluginManifest;
}
const { loadJsonFile } = require('./load_json_file');
function isValidDepsDeclaration(input: unknown, type: string): string[] {
/**
* @param {unknown} input
* @param {string} type
* @returns {string[]}
*/
function isValidDepsDeclaration(input, type) {
if (typeof input === 'undefined') return [];
if (Array.isArray(input) && input.every((i) => typeof i === 'string')) {
return input;
@ -23,35 +23,17 @@ function isValidDepsDeclaration(input: unknown, type: string): string[] {
throw new TypeError(`The "${type}" in plugin manifest should be an array of strings.`);
}
export interface KibanaPlatformPluginManifest {
id: string;
ui: boolean;
server: boolean;
kibanaVersion: string;
version: string;
owner: {
// Internally, this should be a team name.
name: string;
// All internally owned plugins should have a github team specified that can be pinged in issues, or used to look up
// members who can be asked questions regarding the plugin.
githubTeam?: string;
};
// TODO: make required.
description?: string;
enabledOnAnonymousPages?: boolean;
serviceFolders: readonly string[];
requiredPlugins: readonly string[];
optionalPlugins: readonly string[];
requiredBundles: readonly string[];
extraPublicDirs: readonly string[];
}
export function parseKibanaPlatformPlugin(manifestPath: string): KibanaPlatformPlugin {
/**
* @param {string} manifestPath
* @returns {import('./types').KibanaPlatformPlugin}
*/
function parseKibanaPlatformPlugin(manifestPath) {
if (!Path.isAbsolute(manifestPath)) {
throw new TypeError('expected new platform manifest path to be absolute');
}
const manifest: Partial<KibanaPlatformPluginManifest> = loadJsonFile.sync(manifestPath);
/** @type {Partial<import('./types').KibanaPlatformPluginManifest>} */
const manifest = loadJsonFile(manifestPath);
if (!manifest || typeof manifest !== 'object' || Array.isArray(manifest)) {
throw new TypeError('expected new platform plugin manifest to be a JSON encoded object');
}
@ -92,3 +74,5 @@ export function parseKibanaPlatformPlugin(manifestPath: string): KibanaPlatformP
},
};
}
module.exports = { parseKibanaPlatformPlugin };

View file

@ -6,16 +6,13 @@
* Side Public License, v 1.
*/
import { resolve } from 'path';
const { resolve } = require('path');
export interface SearchOptions {
rootDir: string;
oss: boolean;
examples: boolean;
testPlugins?: boolean;
}
export function getPluginSearchPaths({ rootDir, oss, examples, testPlugins }: SearchOptions) {
/**
* @param {{ rootDir: string; oss: boolean; examples: boolean; testPlugins?: boolean; }} options
* @returns {string[]}
*/
function getPluginSearchPaths({ rootDir, oss, examples, testPlugins }) {
return [
resolve(rootDir, 'src', 'plugins'),
...(oss ? [] : [resolve(rootDir, 'x-pack', 'plugins')]),
@ -45,3 +42,5 @@ export function getPluginSearchPaths({ rootDir, oss, examples, testPlugins }: Se
: []),
];
}
module.exports = { getPluginSearchPaths };

View file

@ -0,0 +1,29 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
const { parseKibanaPlatformPlugin } = require('./parse_kibana_platform_plugin');
const { findKibanaJsonFiles } = require('./find_kibana_json_files');
/**
* Helper to find the new platform plugins.
* @param {string[]} scanDirs
* @param {string[]} pluginPaths
* @returns {Array<import('./types').KibanaPlatformPlugin>}
*/
function simpleKibanaPlatformPluginDiscovery(scanDirs, pluginPaths) {
return Array.from(
new Set([
// find kibana.json files up to 5 levels within each scan dir
...scanDirs.flatMap((dir) => findKibanaJsonFiles(dir, 5)),
// find kibana.json files at the root of each plugin path
...pluginPaths.flatMap((path) => findKibanaJsonFiles(path, 0)),
])
).map(parseKibanaPlatformPlugin);
}
module.exports = { simpleKibanaPlatformPluginDiscovery };

View file

@ -1,49 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import globby from 'globby';
import normalize from 'normalize-path';
import { parseKibanaPlatformPlugin, KibanaPlatformPlugin } from './parse_kibana_platform_plugin';
/**
* Helper to find the new platform plugins.
*/
export function simpleKibanaPlatformPluginDiscovery(
scanDirs: string[],
pluginPaths: string[]
): KibanaPlatformPlugin[] {
const patterns = Array.from(
new Set([
// find kibana.json files up to 5 levels within the scan dir
...scanDirs.reduce(
(acc: string[], dir) => [
...acc,
Path.resolve(dir, '*/kibana.json'),
Path.resolve(dir, '*/*/kibana.json'),
Path.resolve(dir, '*/*/*/kibana.json'),
Path.resolve(dir, '*/*/*/*/kibana.json'),
Path.resolve(dir, '*/*/*/*/*/kibana.json'),
],
[]
),
...pluginPaths.map((path) => Path.resolve(path, `kibana.json`)),
])
).map((path) => normalize(path));
const manifestPaths = globby.sync(patterns, { absolute: true }).map((path) =>
// absolute paths returned from globby are using normalize or
// something so the path separators are `/` even on windows,
// Path.resolve solves this
Path.resolve(path)
);
return manifestPaths.map(parseKibanaPlatformPlugin);
}

View file

@ -0,0 +1,44 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
export type JsonValue =
| { [key: string]: JsonValue }
| JsonValue[]
| string
| number
| boolean
| null;
export interface KibanaPlatformPlugin {
readonly directory: string;
readonly manifestPath: string;
readonly manifest: KibanaPlatformPluginManifest;
}
export interface KibanaPlatformPluginManifest {
id: string;
ui: boolean;
server: boolean;
kibanaVersion: string;
version: string;
owner: {
// Internally, this should be a team name.
name: string;
// All internally owned plugins should have a github team specified that can be pinged in issues, or used to look up
// members who can be asked questions regarding the plugin.
githubTeam?: string;
};
// TODO: make required.
description?: string;
enabledOnAnonymousPages?: boolean;
serviceFolders: readonly string[];
requiredPlugins: readonly string[];
optionalPlugins: readonly string[];
requiredBundles: readonly string[];
extraPublicDirs: readonly string[];
}

View file

@ -3,7 +3,8 @@
"compilerOptions": {
"declaration": true,
"declarationMap": true,
"emitDeclarationOnly": true,
"allowJs": true,
"checkJs": true,
"outDir": "target_types",
"rootDir": "src",
"stripInternal": false,
@ -13,6 +14,6 @@
]
},
"include": [
"src/**/*"
"src/**/*",
]
}

View file

@ -1,15 +0,0 @@
{
"presets": [
"@babel/typescript",
["@babel/preset-env", {
"targets": {
"node": "current"
}
}]
],
"plugins": [
"@babel/proposal-class-properties",
"@babel/proposal-object-rest-spread",
"@babel/proposal-optional-chaining"
]
}

View file

@ -1,266 +0,0 @@
# `@kbn/pm` — The Kibana project management tool
`@kbn/pm` is a project management tool inspired by Lerna, which enables sharing
code between Kibana and Kibana plugins.
To run `@kbn/pm`, go to Kibana root and run `yarn kbn`.
## Why `@kbn/pm`?
Long-term we want to get rid of Webpack from production (basically, it's causing
a lot of problems, using a lot of memory and adding a lot of complexity).
Ideally we want each plugin to build its own separate production bundles for
both server and UI. To get there all Kibana plugins (including x-pack) need to
be able to build their production bundles separately from Kibana, which means
they need to be able to depend on code from Kibana without `import`-ing random
files directly from the Kibana source code.
From a plugin perspective there are two different types of Kibana dependencies:
runtime and static dependencies. Runtime dependencies are things that are
instantiated at runtime and that are injected into the plugin, for example
config and elasticsearch clients. Static dependencies are those dependencies
that we want to `import`. `kbn-eslint-config` is one example of this, and
it's actually needed because eslint requires it to be a separate package. But we
also have dependencies like `datemath`, `flot`, `eui` and others that we
control, but where we want to `import` them in plugins instead of injecting them
(because injecting them would be painful to work with). (Btw, these examples
aren't necessarily a part of the Kibana repo today, they are just meant as
examples of code that we might at some point want to include in the repo while
having them be `import`able in Kibana plugins like any other npm package)
Another reason we need static dependencies is that we're starting to introduce
TypeScript into Kibana, and to work nicely with TypeScript across plugins we
need to be able to statically import dependencies. We have for example built an
observable library for Kibana in TypeScript and we need to expose both the
functionality and the TypeScript types to plugins (so other plugins built with
TypeScript can depend on the types for the lib).
However, even though we have multiple packages we don't necessarily want to
`npm publish` them. The ideal solution for us is being able to work on code
locally in the Kibana repo and have a nice workflow that doesn't require
publishing, but where we still get the value of having "packages" that are
available to plugins, without these plugins having to import files directly from
the Kibana folder.
Basically, we just want to be able to share "static code" (aka being able to
`import`) between Kibana and Kibana plugins. To get there we need tooling.
`@kbn/pm` is a tool that helps us manage these static dependencies, and it
enables us to share these packages between Kibana and Kibana plugins. It also
enables these packages to have their own dependencies and their own build
scripts, while still having a nice developer experience.
## How it works
### Internal usage
For packages that are referenced within the Kibana repo itself (for example,
using the `@kbn/i18n` package from an `x-pack` plugin), we are leveraging
Yarn's workspaces feature. This allows yarn to optimize node_modules within
the entire repo to avoid duplicate modules by hoisting common packages as high
in the dependency tree as possible.
To reference a package from within the Kibana repo, simply use the current
version number from that package's package.json file. Then, running `yarn kbn
bootstrap` will symlink that package into your dependency tree. That means
you can make changes to `@kbn/i18n` and immediately have them available
in Kibana itself. No `npm publish` needed anymore — Kibana will always rely
directly on the code that's in the local packages.
### External Plugins
For external plugins, referencing packages in Kibana relies on
`link:` style dependencies in Yarn. With `link:` dependencies you specify the
relative location to a package instead of a version when adding it to
`package.json`. For example:
```
"@kbn/i18n": "link:packages/kbn-i18n"
```
Now when you run `yarn` it will set up a symlink to this folder instead of
downloading code from the npm registry. This allows external plugins to always
use the versions of the package that is bundled with the Kibana version they
are running inside of.
```
"@kbn/i18n": "link:../../kibana/packages/kbn-date-math"
```
This works because we moved to a strict location of Kibana plugins,
`./plugins/{pluginName}` inside of Kibana, or `../kibana-extra/{pluginName}`
relative to Kibana. This is one of the reasons we wanted to move towards a setup
that looks like this:
```
elastic
└── kibana
└── plugins
├── kibana-canvas
└── x-pack-kibana
```
Relying on `link:` style dependencies means we no longer need to `npm publish`
our Kibana specific packages. It also means that plugin authors no longer need
to worry about the versions of the Kibana packages, as they will always use the
packages from their local Kibana.
## The `kbn` use-cases
### Bootstrapping
Now, instead of installing all the dependencies with just running `yarn` you use
the `@kbn/pm` tool, which can install dependencies (and set up symlinks) in
all the packages using one command (aka "bootstrap" the setup).
To bootstrap Kibana:
```
yarn kbn bootstrap
```
By default, `@kbn/pm` will bootstrap all packages within Kibana, plus all
Kibana plugins located in `./plugins` or `../kibana-extra`. There are several
options for skipping parts of this, e.g. to skip bootstrapping of Kibana
plugins:
```
yarn kbn bootstrap --skip-kibana-plugins
```
Or just skip few selected packages:
```
yarn kbn bootstrap --exclude @kbn/pm --exclude @kbn/i18n
```
For more details, run:
```
yarn kbn
```
Bootstrapping also calls the `kbn:bootstrap` script for every included project.
This is intended for packages that need to be built/transpiled to be usable.
### Running scripts
Some times you want to run the same script across multiple packages and plugins,
e.g. `build` or `test`. Instead of jumping into each package and running
`yarn build` you can run:
```
yarn kbn run build --skip-missing
```
And if needed, you can skip packages in the same way as for bootstrapping, e.g.
with `--exclude` and `--skip-kibana-plugins`:
```
yarn kbn run build --exclude kibana --skip-missing
```
### Watching
During development you can also use `kbn` to watch for changes. For this to work
package should define `kbn:watch` script in the `package.json`:
```
yarn kbn watch
```
By default `kbn watch` will sort all packages within Kibana into batches based on
their mutual dependencies and run watch script for all packages in the correct order.
As with any other `kbn` command, you can use `--include` and `--exclude` filters to watch
only for a selected packages:
```
yarn kbn watch --include @kbn/pm --include kibana
```
## Building packages for production
The production build process relies on both the Grunt setup at the root of the
Kibana project and code in `@kbn/pm`. The full process is described in
`tasks/build/packages.js`.
## Development
This package is run from Kibana root, using `yarn kbn`. This will run the
"pre-built" (aka built and committed to git) version of this tool, which is
located in the `dist/` folder. This will also use the included version of Yarn
instead of using your local install of Yarn.
If you need to build a new version of this package, run `yarn build` in this
folder.
Even though this file is generated we commit it to Kibana, because it's used
_before_ dependencies are fetched (as this is the tool actually responsible for
fetching dependencies).
## Technical decisions
### Why our own tool?
While exploring the approach to static dependencies we built PoCs using npm 5
(which symlinks packages using [`file:` dependencies][npm5-file]), [Yarn
workspaces][yarn-workspaces], Yarn (using `link:` dependencies), and
[Lerna][lerna].
In the end we decided to build our own tool, based on Yarn, and `link:`
dependencies, and workspaces. This gave us the control we wanted, and it fits
nicely into our context (e.g. where publishing to npm isn't necessarily
something we want to do).
### Some notes from this exploration
#### `file:` dependencies in npm<5 and in yarn
When you add a dependency like `"foo": "file:../../kibana/packages/foo"`, both
npm<5 and yarn copies the files into the `node_modules` folder. This means you
can't easily make changes to the plugin while developing. Therefore this is a
no-go.
#### `file:` dependencies in npm5
In npm5 `file:` dependencies changed to symlink instead of copy the files. This
means you can have a nicer workflow while developing packages locally. However,
we hit several bugs when using this feature, and we often had to re-run
`npm install` in packages. This is likely because we used an early version of
the new `file:` dependencies in npm5.
#### `link:` dependencies in Yarn
This is the same feature as `file:` dependencies in npm5. However, we did not
hit any problems with them during our exploration.
#### Yarn workspaces
Enables specifying multiple "workspaces" (aka packages/projects) in
`package.json`. When running `yarn` from the root, Yarn will install all the
dependencies for these workspaces and hoist the dependencies to the root (to
"deduplicate" packages). However:
> Workspaces must be children of the workspace root in term of folder hierarchy.
> You cannot and must not reference a workspace that is located outside of this
> filesystem hierarchy.
So Yarn workspaces requires a shared root, which (at least currently) doesn't
fit Kibana, and it's therefore a no-go for now.
#### Lerna
Lerna is based on symlinking packages (similarly to the [`link`][npm-link]
feature which exists in both npm and Yarn, but it's not directly using that
feature). It's a tool built specifically for managing JavaScript projects with
multiple packages. However, it's primarily built (i.e. optimized) for monorepo
_libraries_, so it's focused on publishing packages and other use-cases that are
not necessarily optimized for our use-cases. It's also not ideal for the setup
we currently have, with one app that "owns everything" and the rest being
packages for that app.
[npm-link]: https://docs.npmjs.com/cli/link
[npm5-file]: https://github.com/npm/npm/pull/15900
[yarn-workspaces]: https://yarnpkg.com/lang/en/docs/workspaces/
[lerna]: https://github.com/lerna/lerna

File diff suppressed because one or more lines are too long

View file

@ -1,15 +0,0 @@
{
"name": "@kbn/pm",
"main": "./dist/index.js",
"version": "1.0.0",
"license": "SSPL-1.0 OR Elastic License 2.0",
"private": true,
"kibana": {
"devOnly": true
},
"scripts": {
"build": "../../node_modules/.bin/webpack",
"kbn:watch": "../../node_modules/.bin/webpack --watch",
"prettier": "../../node_modules/.bin/prettier --write './src/**/*.ts'"
}
}

View file

@ -1,97 +0,0 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`excludes project if single \`exclude\` filter is specified 1`] = `
Object {
"graph": Object {
"bar": Array [],
"baz": Array [],
"kibana": Array [],
"quux": Array [],
"with-additional-projects": Array [],
},
"projects": Array [
"bar",
"baz",
"kibana",
"quux",
"with-additional-projects",
],
}
`;
exports[`excludes projects if multiple \`exclude\` filter are specified 1`] = `
Object {
"graph": Object {
"kibana": Array [],
"quux": Array [],
"with-additional-projects": Array [],
},
"projects": Array [
"kibana",
"quux",
"with-additional-projects",
],
}
`;
exports[`includes only projects specified in multiple \`include\` filters 1`] = `
Object {
"graph": Object {
"bar": Array [],
"baz": Array [],
"foo": Array [],
},
"projects": Array [
"bar",
"baz",
"foo",
],
}
`;
exports[`includes single project if single \`include\` filter is specified 1`] = `
Object {
"graph": Object {
"foo": Array [],
},
"projects": Array [
"foo",
],
}
`;
exports[`passes all found projects to the command if no filter is specified 1`] = `
Object {
"graph": Object {
"bar": Array [],
"baz": Array [],
"foo": Array [],
"kibana": Array [
"foo",
],
"quux": Array [],
"with-additional-projects": Array [],
},
"projects": Array [
"bar",
"baz",
"foo",
"kibana",
"quux",
"with-additional-projects",
],
}
`;
exports[`respects both \`include\` and \`exclude\` filters if specified at the same time 1`] = `
Object {
"graph": Object {
"baz": Array [],
"foo": Array [],
},
"projects": Array [
"baz",
"foo",
],
}
`;

View file

@ -1,108 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import dedent from 'dedent';
import getopts from 'getopts';
import { resolve } from 'path';
import { pickLevelFromFlags } from '@kbn/tooling-log';
import { commands } from './commands';
import { runCommand } from './run';
import { log } from './utils/log';
function help() {
log.info(
dedent`
usage: kbn <command> [<args>]
By default commands are run for Kibana itself, all packages in the 'packages/'
folder and for all plugins in './plugins' and '../kibana-extra'.
Available commands:
${Object.values(commands)
.map((command) => `${command.name} - ${command.description}`)
.join('\n ')}
Global options:
-e, --exclude Exclude specified project. Can be specified multiple times to exclude multiple projects, e.g. '-e kibana -e @kbn/pm'.
-i, --include Include only specified projects. If left unspecified, it defaults to including all projects.
--oss Do not include the x-pack when running command.
--skip-kibana-plugins Filter all plugins in ./plugins and ../kibana-extra when running command.
--no-cache Disable the kbn packages bootstrap cache
--no-validate Disable the bootstrap yarn.lock validation
--force-install Forces yarn install to run on bootstrap
--offline Run in offline mode
--verbose Set log level to verbose
--debug Set log level to debug
--quiet Set log level to error
--silent Disable log output
"run" options:
--skip-missing Ignore packages which don't have the requested script
` + '\n'
);
}
export async function run(argv: string[]) {
log.setLogLevel(
pickLevelFromFlags(
getopts(argv, {
boolean: ['verbose', 'debug', 'quiet', 'silent', 'skip-missing'],
})
)
);
// We can simplify this setup (and remove this extra handling) once Yarn
// starts forwarding the `--` directly to this script, see
// https://github.com/yarnpkg/yarn/blob/b2d3e1a8fe45ef376b716d597cc79b38702a9320/src/cli/index.js#L174-L182
if (argv.includes('--')) {
log.error(`Using "--" is not allowed, as it doesn't work with 'yarn kbn'.`);
process.exit(1);
}
const options = getopts(argv, {
alias: {
e: 'exclude',
h: 'help',
i: 'include',
},
default: {
cache: true,
'force-install': false,
offline: false,
validate: true,
},
boolean: ['cache', 'force-install', 'offline', 'validate'],
});
const args = options._;
if (options.help || args.length === 0) {
help();
return;
}
// This `rootPath` is relative to `./dist/` as that's the location of the
// built version of this tool.
const rootPath = resolve(__dirname, '../../../');
const commandName = args[0];
const extraArgs = args.slice(1);
const commandOptions = { options, extraArgs, rootPath };
const command = commands[commandName];
if (command === undefined) {
log.error(`[${commandName}] is not a valid command, see 'kbn --help'`);
process.exit(1);
}
await runCommand(command, commandOptions);
}

View file

@ -1,148 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import Path from 'path';
import { CiStatsReporter } from '@kbn/ci-stats-reporter';
import { runBazel } from '@kbn/bazel-runner';
import { log } from '../utils/log';
import { spawnStreaming } from '../utils/child_process';
import { linkProjectExecutables } from '../utils/link_project_executables';
import { ICommand } from '.';
import { readYarnLock } from '../utils/yarn_lock';
import { validateDependencies } from '../utils/validate_dependencies';
import {
installBazelTools,
haveNodeModulesBeenManuallyDeleted,
removeYarnIntegrityFileIfExists,
} from '../utils/bazel';
import { setupRemoteCache } from '../utils/bazel/setup_remote_cache';
export const BootstrapCommand: ICommand = {
description: 'Install dependencies and crosslink projects',
name: 'bootstrap',
reportTiming: {
group: 'scripts/kbn bootstrap',
id: 'total',
},
async run(projects, projectGraph, { options, kbn, rootPath }) {
const kibanaProjectPath = projects.get('kibana')?.path || '';
const offline = options?.offline === true;
const reporter = CiStatsReporter.fromEnv(log);
const timings: Array<{ id: string; ms: number }> = [];
const time = async <T>(id: string, body: () => Promise<T>): Promise<T> => {
const start = Date.now();
try {
return await body();
} finally {
timings.push({
id,
ms: Date.now() - start,
});
}
};
// Force install is set in case a flag is passed into yarn kbn bootstrap or
// our custom logic have determined there is a chance node_modules have been manually deleted and as such bazel
// tracking mechanism is no longer valid
const forceInstall =
(!!options && options['force-install'] === true) ||
(await haveNodeModulesBeenManuallyDeleted(kibanaProjectPath));
// Install bazel machinery tools if needed
await installBazelTools(rootPath);
// Setup remote cache settings in .bazelrc.cache if needed
await setupRemoteCache(rootPath);
// Bootstrap process for Bazel packages
// Bazel now manages dependencies, so `yarn install`
// will happen as part of this step.
//
// NOTE: Bazel projects will be introduced incrementally,
// starting with the ones that have no dependencies.
// That way non-Bazel projects can depend on Bazel projects, but not the other way around.
// This is only intended for the duration of the migration, while non-Bazel projects still exist.
//
if (forceInstall) {
await time('force install dependencies', async () => {
await removeYarnIntegrityFileIfExists(Path.resolve(kibanaProjectPath, 'node_modules'));
await runBazel({
bazelArgs: ['clean', '--expunge'],
log,
});
await runBazel({
bazelArgs: ['run', '@nodejs//:yarn'],
offline,
log,
execaOpts: {
env: {
SASS_BINARY_SITE:
'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-sass',
RE2_DOWNLOAD_MIRROR:
'https://us-central1-elastic-kibana-184716.cloudfunctions.net/kibana-ci-proxy-cache/node-re2',
},
},
});
});
}
// build packages
await time('build packages', async () => {
await runBazel({
bazelArgs: ['build', '//packages:build', '--show_result=1'],
log,
offline,
});
});
const yarnLock = await time('read yarn.lock', async () => await readYarnLock(kbn));
if (options.validate) {
await time('validate dependencies', async () => {
await validateDependencies(kbn, yarnLock);
});
}
// Ensure all kbn projects that define bin scripts
// copy those scripts into the top level node_modules folder
//
// NOTE: We probably don't need this anymore, it is actually not being used
await time('link project executables', async () => {
await linkProjectExecutables(projects, projectGraph);
});
await time('update vscode config', async () => {
// Update vscode settings
await spawnStreaming(
process.execPath,
['scripts/update_vscode_config'],
{
cwd: kbn.getAbsolute(),
env: process.env,
},
{ prefix: '[vscode]', debug: false }
);
});
// send timings
await reporter.timings({
upstreamBranch: kbn.kibanaProject.json.branch,
// prevent loading @kbn/utils by passing null
kibanaUuid: kbn.getUuid() || null,
timings: timings.map((t) => ({
group: 'scripts/kbn bootstrap',
...t,
})),
});
},
};

View file

@@ -1,30 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { runBazel } from '@kbn/bazel-runner';
import { ICommand } from '.';
import { log } from '../utils/log';
export const BuildCommand: ICommand = {
description: 'Runs a build in the Bazel built packages',
name: 'build',
reportTiming: {
group: 'scripts/kbn build',
id: 'total',
},
async run(projects, projectGraph, { options }) {
// Call bazel with the target to build all available packages
await runBazel({
bazelArgs: ['build', '//packages:build', '--show_result=1'],
log,
offline: options?.offline === true,
});
},
};

View file

@@ -1,94 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import dedent from 'dedent';
import del from 'del';
import ora from 'ora';
import { join, relative } from 'path';
import { runBazel } from '@kbn/bazel-runner';
import { isBazelBinAvailable } from '../utils/bazel';
import { isDirectory } from '../utils/fs';
import { log } from '../utils/log';
import { ICommand } from '.';
export const CleanCommand: ICommand = {
description: 'Deletes output directories and resets internal caches.',
name: 'clean',
reportTiming: {
group: 'scripts/kbn clean',
id: 'total',
},
async run(projects, projectGraph, { kbn }) {
log.warning(dedent`
This command is only necessary for the circumstance where you need to recover a consistent
state when problems arise. If you need to run this command often, please let us know by
filling out this form: https://ela.st/yarn-kbn-clean.
Please note it might not solve problems with node_modules. To solve problems around node_modules
you might need to run 'yarn kbn reset'.
`);
const toDelete = [];
for (const project of projects.values()) {
if (await isDirectory(project.targetLocation)) {
toDelete.push({
cwd: project.path,
pattern: relative(project.path, project.targetLocation),
});
}
const { extraPatterns } = project.getCleanConfig();
if (extraPatterns) {
toDelete.push({
cwd: project.path,
pattern: extraPatterns,
});
}
}
// Runs Bazel soft clean
if (await isBazelBinAvailable(kbn.getAbsolute())) {
await runBazel({
bazelArgs: ['clean'],
log,
});
log.success('Soft cleaned bazel');
}
if (toDelete.length === 0) {
log.success('Nothing to delete');
} else {
/**
* In order to avoid patterns like `/build` in packages from accidentally
* impacting files outside the package we use `process.chdir()` to change
* the cwd to the package and execute `del()` without the `force` option
* so it will check that each file being deleted is within the package.
*
* `del()` does support a `cwd` option, but it's only for resolving the
* patterns and does not impact the cwd check.
*/
const originalCwd = process.cwd();
try {
for (const { pattern, cwd } of toDelete) {
process.chdir(cwd);
const promise = del(pattern);
if (log.wouldLogLevel('info')) {
ora.promise(promise, relative(originalCwd, join(cwd, String(pattern))));
}
await promise;
}
} finally {
process.chdir(originalCwd);
}
}
},
};

View file

@@ -1,44 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { ProjectGraph, ProjectMap } from '../utils/projects';
export interface ICommandConfig {
extraArgs: string[];
options: { [key: string]: any };
rootPath: string;
kbn: Kibana;
}
export interface ICommand {
name: string;
description: string;
reportTiming?: {
group: string;
id: string;
};
run: (projects: ProjectMap, projectGraph: ProjectGraph, config: ICommandConfig) => Promise<void>;
}
import { BootstrapCommand } from './bootstrap';
import { BuildCommand } from './build';
import { CleanCommand } from './clean';
import { ResetCommand } from './reset';
import { RunCommand } from './run';
import { WatchCommand } from './watch';
import { Kibana } from '../utils/kibana';
export const commands: { [key: string]: ICommand } = {
bootstrap: BootstrapCommand,
build: BuildCommand,
clean: CleanCommand,
reset: ResetCommand,
run: RunCommand,
watch: WatchCommand,
};

View file

@@ -1,112 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import dedent from 'dedent';
import del from 'del';
import ora from 'ora';
import { join, relative } from 'path';
import { runBazel } from '@kbn/bazel-runner';
import {
getBazelDiskCacheFolder,
getBazelRepositoryCacheFolder,
isBazelBinAvailable,
} from '../utils/bazel';
import { isDirectory } from '../utils/fs';
import { log } from '../utils/log';
import { ICommand } from '.';
export const ResetCommand: ICommand = {
description:
'Deletes node_modules and output directories, resets internal and disk caches, and stops Bazel server',
name: 'reset',
reportTiming: {
group: 'scripts/kbn reset',
id: 'total',
},
async run(projects, projectGraph, { kbn }) {
log.warning(dedent`
In most cases, 'yarn kbn clean' is all that should be needed to recover a consistent state when
problems arise. However, for the rare cases where something gets corrupted in node_modules, you might need this command.
If you think you need to use this command very often (which is not normal), please let us know.
`);
const toDelete = [];
for (const project of projects.values()) {
if (await isDirectory(project.nodeModulesLocation)) {
toDelete.push({
cwd: project.path,
pattern: relative(project.path, project.nodeModulesLocation),
});
}
if (await isDirectory(project.targetLocation)) {
toDelete.push({
cwd: project.path,
pattern: relative(project.path, project.targetLocation),
});
}
const { extraPatterns } = project.getCleanConfig();
if (extraPatterns) {
toDelete.push({
cwd: project.path,
pattern: extraPatterns,
});
}
}
// Runs Bazel hard clean and deletes Bazel Cache Folders
if (await isBazelBinAvailable(kbn.getAbsolute())) {
// Hard cleaning bazel
await runBazel({
bazelArgs: ['clean', '--expunge'],
log,
});
log.success('Hard cleaned bazel');
// Deletes Bazel Cache Folders
await del([await getBazelDiskCacheFolder(), await getBazelRepositoryCacheFolder()], {
force: true,
});
log.success('Removed disk caches');
}
if (toDelete.length === 0) {
return;
}
/**
* In order to avoid patterns like `/build` in packages from accidentally
* impacting files outside the package we use `process.chdir()` to change
* the cwd to the package and execute `del()` without the `force` option
* so it will check that each file being deleted is within the package.
*
* `del()` does support a `cwd` option, but it's only for resolving the
* patterns and does not impact the cwd check.
*/
const originalCwd = process.cwd();
try {
for (const { pattern, cwd } of toDelete) {
process.chdir(cwd);
const promise = del(pattern);
if (log.wouldLogLevel('info')) {
ora.promise(promise, relative(originalCwd, join(cwd, String(pattern))));
}
await promise;
}
} finally {
process.chdir(originalCwd);
}
},
};

View file

@@ -1,60 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import dedent from 'dedent';
import { CliError } from '../utils/errors';
import { log } from '../utils/log';
import { parallelizeBatches } from '../utils/parallelize';
import { topologicallyBatchProjects } from '../utils/projects';
import { ICommand } from '.';
export const RunCommand: ICommand = {
description:
'Run a script defined in package.json in each package that contains that script (only works on packages not yet using Bazel)',
name: 'run',
reportTiming: {
group: 'scripts/kbn run',
id: 'total',
},
async run(projects, projectGraph, { extraArgs, options }) {
log.warning(dedent`
We are migrating packages into the Bazel build system, and running npm scripts with 'yarn kbn run' is no longer
supported on Bazel-built packages. If the package you are trying to act on contains a
BUILD.bazel file, please use 'yarn kbn build' to build it or 'yarn kbn watch' to watch it
`);
const batchedProjects = topologicallyBatchProjects(projects, projectGraph);
if (extraArgs.length === 0) {
throw new CliError('No script specified');
}
const scriptName = extraArgs[0];
const scriptArgs = extraArgs.slice(1);
await parallelizeBatches(batchedProjects, async (project) => {
if (!project.hasScript(scriptName)) {
if (!!options['skip-missing']) {
return;
}
throw new CliError(
`[${project.name}] no "${scriptName}" script defined. To skip packages without the "${scriptName}" script pass --skip-missing`
);
}
log.info(`[${project.name}] running "${scriptName}" script`);
await project.runScriptStreaming(scriptName, {
args: scriptArgs,
});
log.success(`[${project.name}] complete`);
});
},
};

View file

@@ -1,35 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { runIBazel } from '@kbn/bazel-runner';
import { ICommand } from '.';
import { log } from '../utils/log';
export const WatchCommand: ICommand = {
description: 'Runs a build in the Bazel built packages and keeps watching them for changes',
name: 'watch',
reportTiming: {
group: 'scripts/kbn watch',
id: 'total',
},
async run(projects, projectGraph, { options }) {
const runOffline = options?.offline === true;
// Call bazel with the target to build all available packages and run it through iBazel to watch it for changes
//
// Note: the --run_output=false arg disables the iBazel notifications about gazelle and buildozer when running it.
// This could also be solved by adding a root `.bazel_fix_commands.json`, but it's not needed at the moment
await runIBazel({
bazelArgs: ['--run_output=false', 'build', '//packages:build', '--show_result=1'],
log,
offline: runOffline,
});
},
};

View file

@@ -1,54 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { resolve } from 'path';
interface Options {
rootPath: string;
skipKibanaPlugins?: boolean;
ossOnly?: boolean;
}
/**
* Returns all the paths where packages and plugins are located
*/
export function getProjectPaths({ rootPath, ossOnly, skipKibanaPlugins }: Options) {
const projectPaths = [rootPath, resolve(rootPath, 'packages/*')];
// This is needed in order to install the dependencies for the plugins
// declared for use in the Selenium functional tests.
// As we are now using the webpack DLL for the client vendor dependencies,
// when we run the plugin functional tests against the distributable,
// dependencies used by such plugins (like @eui, react and react-dom) can't
// be loaded from the DLL as the context is different from the one declared
// in the webpack DLL reference plugin.
// In any case, having a plugin declare its own dependencies is the
// correct and expected behavior.
projectPaths.push(resolve(rootPath, 'test/plugin_functional/plugins/*'));
projectPaths.push(resolve(rootPath, 'test/interpreter_functional/plugins/*'));
projectPaths.push(resolve(rootPath, 'test/server_integration/__fixtures__/plugins/*'));
projectPaths.push(resolve(rootPath, 'examples/*'));
if (!ossOnly) {
projectPaths.push(resolve(rootPath, 'x-pack'));
projectPaths.push(resolve(rootPath, 'x-pack/plugins/*'));
projectPaths.push(resolve(rootPath, 'x-pack/legacy/plugins/*'));
projectPaths.push(resolve(rootPath, 'x-pack/test/functional_with_es_ssl/fixtures/plugins/*'));
}
if (!skipKibanaPlugins) {
projectPaths.push(resolve(rootPath, '../kibana-extra/*'));
projectPaths.push(resolve(rootPath, '../kibana-extra/*/packages/*'));
projectPaths.push(resolve(rootPath, '../kibana-extra/*/plugins/*'));
projectPaths.push(resolve(rootPath, 'plugins/*'));
projectPaths.push(resolve(rootPath, 'plugins/*/packages/*'));
projectPaths.push(resolve(rootPath, 'plugins/*/plugins/*'));
}
return projectPaths;
}
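To make the shape of the result concrete, here is a sketch of what this helper returns for a hypothetical root path (the path is illustrative; the entries are glob patterns, not resolved directories):

```ts
import { getProjectPaths } from './config';

const paths = getProjectPaths({
  rootPath: '/repo/kibana', // hypothetical checkout location
  ossOnly: true,            // skip the x-pack paths
  skipKibanaPlugins: true,  // skip the ../kibana-extra and plugins/* paths
});

// [
//   '/repo/kibana',
//   '/repo/kibana/packages/*',
//   '/repo/kibana/test/plugin_functional/plugins/*',
//   '/repo/kibana/test/interpreter_functional/plugins/*',
//   '/repo/kibana/test/server_integration/__fixtures__/plugins/*',
//   '/repo/kibana/examples/*',
// ]
```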

View file

@@ -1,12 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
export { run } from './cli';
export { getProjects } from './utils/projects';
export { Project } from './utils/project';
export { getProjectPaths } from './config';

View file

@@ -1,121 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { resolve } from 'path';
import { ICommand, ICommandConfig } from './commands';
import { runCommand } from './run';
import { Project } from './utils/project';
import { log } from './utils/log';
log.setLogLevel('silent');
const rootPath = resolve(__dirname, 'utils/__fixtures__/kibana');
jest.mock('./utils/regenerate_package_json');
jest.mock('./utils/regenerate_synthetic_package_map');
jest.mock('./utils/regenerate_base_tsconfig');
function getExpectedProjectsAndGraph(runMock: any) {
const [fullProjects, fullProjectGraph] = (runMock as jest.Mock<any>).mock.calls[0];
const projects = [...fullProjects.keys()].sort();
const graph = [...fullProjectGraph.entries()].reduce((expected, [projectName, dependencies]) => {
expected[projectName] = dependencies.map((project: Project) => project.name);
return expected;
}, {});
return { projects, graph };
}
let command: ICommand;
let config: Omit<ICommandConfig, 'kbn'>;
beforeEach(() => {
command = {
description: 'test description',
name: 'test name',
run: jest.fn(),
};
config = {
extraArgs: [],
options: {},
rootPath,
};
});
test('passes all found projects to the command if no filter is specified', async () => {
await runCommand(command, config);
expect(command.run).toHaveBeenCalledTimes(1);
expect(getExpectedProjectsAndGraph(command.run)).toMatchSnapshot();
});
test('excludes project if single `exclude` filter is specified', async () => {
await runCommand(command, {
...config,
options: { exclude: 'foo' },
});
expect(command.run).toHaveBeenCalledTimes(1);
expect(getExpectedProjectsAndGraph(command.run)).toMatchSnapshot();
});
test('excludes projects if multiple `exclude` filter are specified', async () => {
await runCommand(command, {
...config,
options: { exclude: ['foo', 'bar', 'baz'] },
});
expect(command.run).toHaveBeenCalledTimes(1);
expect(getExpectedProjectsAndGraph(command.run)).toMatchSnapshot();
});
test('includes single project if single `include` filter is specified', async () => {
await runCommand(command, {
...config,
options: { include: 'foo' },
});
expect(command.run).toHaveBeenCalledTimes(1);
expect(getExpectedProjectsAndGraph(command.run)).toMatchSnapshot();
});
test('includes only projects specified in multiple `include` filters', async () => {
await runCommand(command, {
...config,
options: { include: ['foo', 'bar', 'baz'] },
});
expect(command.run).toHaveBeenCalledTimes(1);
expect(getExpectedProjectsAndGraph(command.run)).toMatchSnapshot();
});
test('respects both `include` and `exclude` filters if specified at the same time', async () => {
await runCommand(command, {
...config,
options: { include: ['foo', 'bar', 'baz'], exclude: 'bar' },
});
expect(command.run).toHaveBeenCalledTimes(1);
expect(getExpectedProjectsAndGraph(command.run)).toMatchSnapshot();
});
test('does not run command if all projects are filtered out', async () => {
const mockProcessExit = jest.spyOn(process, 'exit').mockReturnValue(undefined as never);
await runCommand(command, {
...config,
// Including and excluding the same project will result in 0 projects selected.
options: { include: ['foo'], exclude: ['foo'] },
});
expect(command.run).not.toHaveBeenCalled();
expect(mockProcessExit).toHaveBeenCalledWith(1);
});

View file

@@ -1,162 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { CiStatsReporter, CiStatsTiming } from '@kbn/ci-stats-reporter';
import { simpleKibanaPlatformPluginDiscovery, getPluginSearchPaths } from '@kbn/plugin-discovery';
import { ICommand, ICommandConfig } from './commands';
import { CliError } from './utils/errors';
import { log } from './utils/log';
import { buildProjectGraph } from './utils/projects';
import { renderProjectsTree } from './utils/projects_tree';
import { regeneratePackageJson } from './utils/regenerate_package_json';
import { regenerateSyntheticPackageMap } from './utils/regenerate_synthetic_package_map';
import { regenerateBaseTsconfig } from './utils/regenerate_base_tsconfig';
import { Kibana } from './utils/kibana';
process.env.CI_STATS_NESTED_TIMING = 'true';
export async function runCommand(command: ICommand, config: Omit<ICommandConfig, 'kbn'>) {
const runStartTime = Date.now();
let kbn: undefined | Kibana;
const timings: Array<Omit<CiStatsTiming, 'group'>> = [];
async function time<T>(id: string, block: () => Promise<T>): Promise<T> {
const start = Date.now();
let success = true;
try {
return await block();
} catch (error) {
success = false;
throw error;
} finally {
timings.push({
id,
ms: Date.now() - start,
meta: {
success,
},
});
}
}
async function reportTimes(timingConfig: { group: string; id: string }, error?: Error) {
if (!kbn) {
// things are too broken to report remotely
return;
}
const reporter = CiStatsReporter.fromEnv(log);
try {
await reporter.timings({
upstreamBranch: kbn.kibanaProject.json.branch,
// prevent loading @kbn/utils by passing null
kibanaUuid: kbn.getUuid() || null,
timings: [
...timings.map((t) => ({ ...timingConfig, ...t })),
{
group: timingConfig.group,
id: timingConfig.id,
ms: Date.now() - runStartTime,
meta: {
success: !error,
},
},
],
});
} catch (e) {
// prevent hiding bootstrap errors
log.error('failed to report timings:');
log.error(e);
}
}
try {
log.debug(`Running [${command.name}] command from [${config.rootPath}]`);
await time('regenerate package.json, synthetic-package map and tsconfig', async () => {
const plugins = simpleKibanaPlatformPluginDiscovery(
getPluginSearchPaths({
rootDir: config.rootPath,
oss: false,
examples: true,
testPlugins: true,
}),
[]
);
await Promise.all([
regeneratePackageJson(config.rootPath),
regenerateSyntheticPackageMap(plugins, config.rootPath),
regenerateBaseTsconfig(plugins, config.rootPath),
]);
});
kbn = await time('load Kibana project', async () => await Kibana.loadFrom(config.rootPath));
const projects = kbn.getFilteredProjects({
skipKibanaPlugins: Boolean(config.options['skip-kibana-plugins']),
ossOnly: Boolean(config.options.oss),
exclude: toArray(config.options.exclude),
include: toArray(config.options.include),
});
if (projects.size === 0) {
log.error(
`There are no projects found. Double check project name(s) in '-i/--include' and '-e/--exclude' filters.`
);
return process.exit(1);
}
const projectGraph = buildProjectGraph(projects);
log.debug(`Found ${projects.size.toString()} projects`);
log.debug(renderProjectsTree(config.rootPath, projects));
await command.run(projects, projectGraph, {
...config,
kbn,
});
if (command.reportTiming) {
await reportTimes(command.reportTiming);
}
} catch (error) {
if (command.reportTiming) {
await reportTimes(command.reportTiming, error);
}
log.error(`[${command.name}] failed:`);
if (error instanceof CliError) {
log.error(error.message);
const metaOutput = Object.entries(error.meta)
.map(([key, value]) => `${key}: ${value}`)
.join('\n');
if (metaOutput) {
log.info('Additional debugging info:\n');
log.indent(2);
log.info(metaOutput);
log.indent(-2);
}
} else {
log.error(error);
}
process.exit(1);
}
}
function toArray<T>(value?: T | T[]) {
if (value == null) {
return [];
}
return Array.isArray(value) ? value : [value];
}

View file

@@ -1,40 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { cloneDeepWith } from 'lodash';
import { resolve, sep as pathSep } from 'path';
const repoRoot = resolve(__dirname, '../../../../');
const normalizePaths = (value: any) => {
let didReplacement = false;
const clone = cloneDeepWith(value, (v: any) => {
if (typeof v === 'string' && v.startsWith(repoRoot)) {
didReplacement = true;
return v
.replace(repoRoot, '<repoRoot>')
.split(pathSep) // normalize path separators
.join('/');
}
});
return {
clone,
didReplacement,
};
};
export const absolutePathSnapshotSerializer = {
print(value: any, serialize: (val: any) => string) {
return serialize(normalizePaths(value).clone);
},
test(value: any) {
return normalizePaths(value).didReplacement;
},
};

View file

@@ -1,11 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
export { absolutePathSnapshotSerializer } from './absolute_path_snapshot_serializer';
export { stripAnsiSnapshotSerializer } from './strip_ansi_snapshot_serializer';
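These serializers implement Jest's snapshot-serializer interface, so a (hypothetical) Jest setup file would register them globally along these lines:

```ts
// Hypothetical setup file; the import path assumes the test_helpers index above.
import {
  absolutePathSnapshotSerializer,
  stripAnsiSnapshotSerializer,
} from './test_helpers';

expect.addSnapshotSerializer(absolutePathSnapshotSerializer);
expect.addSnapshotSerializer(stripAnsiSnapshotSerializer);
```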

View file

@@ -1,20 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import hasAnsi from 'has-ansi';
import stripAnsi from 'strip-ansi';
export const stripAnsiSnapshotSerializer: jest.SnapshotSerializerPlugin = {
serialize(value: string) {
return stripAnsi(value);
},
test(value: any) {
return typeof value === 'string' && hasAnsi(value);
},
};

View file

@@ -1,4 +0,0 @@
{
"name": "with-additional-projects",
"version": "1.0.0"
}

View file

@@ -1,7 +0,0 @@
{
"name": "baz",
"version": "1.0.0",
"dependencies": {
"bar": "link:../../../../kibana/packages/bar"
}
}

View file

@@ -1,8 +0,0 @@
{
"name": "quux",
"version": "1.0.0",
"dependencies": {
"bar": "link:../../../../kibana/packages/bar",
"baz": "link:../../packages/baz"
}
}

View file

@@ -1,7 +0,0 @@
{
"name": "kibana",
"version": "1.0.0",
"dependencies": {
"foo": "link:packages/foo"
}
}

View file

@@ -1,4 +0,0 @@
{
"name": "bar",
"version": "1.0.0"
}

View file

@@ -1,4 +0,0 @@
{
"name": "foo",
"version": "1.0.0"
}

View file

@@ -1,4 +0,0 @@
{
"name": "baz",
"version": "1.0.0"
}

View file

@@ -1,4 +0,0 @@
{
"name": "baz",
"version": "1.0.0"
}

View file

@@ -1,4 +0,0 @@
{
"name": "quux",
"version": "1.0.0"
}

View file

@@ -1,4 +0,0 @@
{
"name": "zorge",
"version": "1.0.0"
}

View file

@@ -1,4 +0,0 @@
{
"name": "corge",
"version": "1.0.0"
}

View file

@@ -1,35 +0,0 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`bin script points nowhere does not try to create symlink on node_modules/.bin for that bin script: fs module calls 1`] = `
Object {
"chmod": Array [],
"copyDirectory": Array [],
"createSymlink": Array [],
"isDirectory": Array [],
"isFile": Array [],
"isSymlink": Array [],
"mkdirp": Array [],
"readFile": Array [],
"rmdirp": Array [],
"tryRealpath": Array [],
"unlink": Array [],
"writeFile": Array [],
}
`;
exports[`bin script points to a file creates a symlink for the project bin into the roots project node_modules/.bin directory as well as node_modules/.bin directory symlink into the roots one: fs module calls 1`] = `
Object {
"chmod": Array [],
"copyDirectory": Array [],
"createSymlink": Array [],
"isDirectory": Array [],
"isFile": Array [],
"isSymlink": Array [],
"mkdirp": Array [],
"readFile": Array [],
"rmdirp": Array [],
"tryRealpath": Array [],
"unlink": Array [],
"writeFile": Array [],
}
`;

View file

@@ -1,3 +0,0 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`#getExecutables() throws CliError when bin is something strange 1`] = `"[kibana] has an invalid \\"bin\\" field in its package.json, expected an object or a string"`;

View file

@@ -1,41 +0,0 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`#buildProjectGraph builds full project graph 1`] = `
Object {
"bar": Array [],
"baz": Array [],
"foo": Array [],
"kibana": Array [
"foo",
],
"quux": Array [],
"zorge": Array [],
}
`;
exports[`#topologicallyBatchProjects batches projects topologically based on their project dependencies 1`] = `
Array [
Array [
"bar",
"foo",
"baz",
"quux",
"zorge",
],
Array [
"kibana",
],
]
`;
exports[`#topologicallyBatchProjects batches projects topologically even if graph contains projects not presented in the project map 1`] = `
Array [
Array [
"kibana",
"bar",
"baz",
"quux",
"zorge",
],
]
`;

View file

@@ -1,29 +0,0 @@
// Jest Snapshot v1, https://goo.gl/fbAQLP
exports[`handles projects outside root folder 1`] = `
kibana
├── packages
│ ├── bar
│ └── foo
└── ../plugins
├── baz
├── quux
└── zorge
`;
exports[`handles projects with root folder 1`] = `
kibana
└── packages
├── bar
└── foo
`;
exports[`handles projects within projects outside root folder 1`] = `
kibana
├── packages
│ ├── bar
│ └── foo
└── ../kibana-extra/additional_projects (with-additional-projects)
├── packages/baz
└── plugins/quux
`;

View file

@@ -1,25 +0,0 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0 and the Server Side Public License, v 1; you may not use this file except
* in compliance with, at your election, the Elastic License 2.0 or the Server
* Side Public License, v 1.
*/
import { dirname, resolve } from 'path';
import { spawn } from '../child_process';
async function rawRunBazelInfoRepoCache() {
const { stdout: bazelRepositoryCachePath } = await spawn('bazel', ['info', 'repository_cache'], {
stdio: 'pipe',
});
return bazelRepositoryCachePath;
}
export async function getBazelDiskCacheFolder() {
return resolve(dirname(await rawRunBazelInfoRepoCache()), 'disk-cache');
}
export async function getBazelRepositoryCacheFolder() {
return await rawRunBazelInfoRepoCache();
}
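As a usage sketch, these helpers can be combined the same way the reset command above removes both Bazel cache folders (the import path and wrapper function are illustrative):

```ts
import del from 'del';
import { getBazelDiskCacheFolder, getBazelRepositoryCacheFolder } from '../utils/bazel';

// Resolve both cache locations (derived from `bazel info repository_cache`) and delete them.
async function removeBazelCaches() {
  const [diskCache, repoCache] = await Promise.all([
    getBazelDiskCacheFolder(),
    getBazelRepositoryCacheFolder(),
  ]);
  await del([diskCache, repoCache], { force: true });
}
```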

Some files were not shown because too many files have changed in this diff.