Instrument the inference `chatComplete` API with OpenTelemetry, and export helper functions to create spans with the right semconv attributes. Additionally, optionally export to Langfuse or Phoenix.

## Centralizes OpenTelemetry setup

As this is the first instance of OpenTelemetry-based _tracing_ (we already have metrics in the MonitoringCollection plugin), some bootstrapping code is necessary to centrally configure OpenTelemetry. To this end, I've added the following config settings:

- `telemetry.tracing.enabled`: whether OpenTelemetry tracing is enabled (defaults to undefined; if undefined, falls back to `telemetry.enabled`)
- `telemetry.tracing.sample_rate` (defaults to 1)

The naming of these configuration settings is mostly in line with [the Elasticsearch tracing settings](https://github.com/elastic/elasticsearch/blob/main/TRACING.md).

The following packages (containing bootstrapping logic, utility functions, types, and config schemas) were added:

- `@kbn/telemetry`
- `@kbn/telemetry-config`
- `@kbn/tracing`

The OpenTelemetry bootstrapping depends on `@kbn/apm-config-loader`, as it has the same constraints: it needs to run before any other code, and it needs to read the raw config. Additionally, a root `telemetry` logger was added that captures OpenTelemetry logs.

Note that there is no default exporter for spans, which means that although spans are being recorded, they do not get exported.

## Instrument chatComplete calls

Calls to `chatComplete` now create OpenTelemetry spans, roughly following semantic conventions (which for GenAI are very much in flux). Some helper functions were added to create other inference spans. These helper functions use baggage to determine whether the created inference span is the "root" of an inference trace. This allows us to export these spans as if they were root spans - something that is needed to be able to easily visualize them in other tools.

Leveraging these inference spans, two exporters were added.
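Put together, the tracing settings described above could look like this in `kibana.yml` (a minimal sketch; only the setting names come from this change, the values shown are illustrative):

```yaml
# Enable OpenTelemetry tracing explicitly.
# If omitted, this falls back to the value of telemetry.enabled.
telemetry.tracing.enabled: true

# Fraction of traces to sample; defaults to 1 (sample everything).
telemetry.tracing.sample_rate: 1
```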
One for [Phoenix](https://github.com/Arize-ai/phoenix) and one for [Langfuse](https://github.com/langfuse/langfuse/tree/main): two open-source LLM observability suites. This allows engineers that use the Inference plugin to inspect and improve their LLM-based workflows with much less effort.

For both Phoenix and Langfuse, service scripts were added. Run `node scripts/phoenix` or `node scripts/langfuse` to get started. Both scripts work with zero config - they will log the generated Kibana config to stdout.

---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
Co-authored-by: Elastic Machine <elasticmachine@users.noreply.github.com>
# Kibana Dev Scripts
This directory contains scripts useful for interacting with Kibana tools in development. Use the node executable and the `--help` flag to learn about how they work:

```sh
node scripts/{{script name}} --help
```
## For Developers
This directory is excluded from the build, and tools within it should help users discover their capabilities. Each script in this directory must:

- require `src/setup_node_env` to bootstrap the NodeJS environment
- call out to source code in the `src` or `packages` directories
- react to the `--help` flag
- run everywhere OR check and fail fast when a required OS or toolchain is not available
## Functional Test Scripts
```sh
node scripts/functional_tests [--config src/platform/test/functional/config.base.js --config test/api_integration/config.js]
```

Runs all the functional tests: Selenium tests and API integration tests. List configs with multiple `--config` arguments. Uses the `@kbn/test` library to run Elasticsearch and Kibana servers and tests against those servers, for multiple server+test setups. In particular, calls out to `runTests()`. Can be run on a single config.
```sh
node scripts/functional_tests_server [--config src/platform/test/functional/config.base.js]
```

Starts just the Elasticsearch and Kibana servers given a single config, e.g. via `--config src/platform/test/functional/config.base.js` or `--config test/api_integration/config`. Allows the user to start just the servers with this script, and keep them running while running tests against them. The idea is that the same config file configures both the Elasticsearch and Kibana servers. Uses the `startServers()` method from the `@kbn/test` library.
Example. Start servers and run tests, separately, but using the same config:

```sh
# Just the servers
node scripts/functional_tests_server --config path/to/config
```

In another terminal:

```sh
# Just the tests, against the running servers
node scripts/functional_test_runner --config path/to/config
```
For details on how the internal methods work, read this readme.
## ES archiver

### Loading data
If you wish to load up specific ES archived data for your test, you can do so via:

```sh
node scripts/es_archiver.js load <archive> [--es-url=http://username:password@localhost:9200] [--kibana-url=http://username:password@localhost:5601/{basepath?}]
```
That will load the specified archive from the archive directory specified by the default functional config file, located at `src/platform/test/functional/config.base.js`. To load archives from other functional config files, you can pass `--config path/to/config.js`.
Note: The `--es-url` and `--kibana-url` options may or may not be necessary depending on your current Kibana configuration settings, and their values may also change based on those settings (for example, if you are not running with security you will not need the `username:password` portion).
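For example, the same hypothetical archive could be loaded like this depending on whether security is enabled (archive name and credentials here are placeholders; substitute your own):

```sh
# Security enabled: pass credentials in both URLs
node scripts/es_archiver.js load my_archive \
  --es-url=http://elastic:changeme@localhost:9200 \
  --kibana-url=http://elastic:changeme@localhost:5601

# Security disabled: the username:password portion can be dropped
node scripts/es_archiver.js load my_archive --es-url=http://localhost:9200
```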
### Saving data
You can save existing data into an archive by using the `save` command:

```sh
node scripts/es_archiver.js save <archive name for kibana data> [space separated list of index patterns to include]
```
You may want to store the `.kibana` index separately from the data. Since adding a lot of data will bloat our repo size, we have many tests that reuse the same data indices but use their own `.kibana` index.