# @kbn/inference-cli

## Run EIS locally
1. Make sure you're connected to [Infra Vault](https://docs.elastic.dev/vault/infra-vault/home) using OIDC:

   `$ VAULT_ADDR={...} vault login -method oidc`

2. Run the `eis` script:

   `$ node scripts/eis.js`

2a. After it has started, run Elasticsearch with:

   `$ yarn es snapshot --license trial -E xpack.inference.elastic.url=http://localhost:8443`

2b. The command will output credentials for a preconfigured EIS connector. Paste them into `kibana.yml` (or `kibana.dev.yml`).

3. Start Kibana as usual (e.g. `yarn start`).

4. Run the `hello_world` recipe:

   `$ yarn run ts-node --transpile-only x-pack/solutions/observability/packages/kbn-genai-cli/recipes/hello_world.ts`

This should output something like:

```
yarn run v1.22.22
$ node_modules/.bin/ts-node --transpile-only x-pack/solutions/observability/packages/kbn-genai-cli/recipes/hello_world.ts
 info Discovered kibana running at: http://elastic:changeme@127.0.0.1:5601/kbn
 info {
        id: 'extract_personal_details',
        content: '',
        output: { name: 'Sarah', age: 29, city: 'San Francisco' }
      }
  Done in 5.47s.
```

## Usage

Exposes an Inference (plugin) API client for scripts that mimics the `chatComplete` and `output` APIs available on the Inference plugin's start contract. It depends on the `KibanaClient` that is exposed from the `@kbn/kibana-api-cli` package, and automatically selects a connector if one is available:

```ts
import { createInferenceClient } from '@kbn/inference-cli';
import { ToolingLog } from '@kbn/tooling-log';

const log = new ToolingLog({ level: 'info', writeTo: process.stdout });

const inferenceClient = await createInferenceClient({
  log,
  // pass in a signal that is triggered on teardown
  signal: new AbortController().signal,
});

const response = await inferenceClient.output({
  id: 'extract_personal_details',
  input: `Sarah is a 29-year-old software developer living in San Francisco.`,
  schema: {
    type: 'object',
    properties: {
      name: { type: 'string' },
      age: { type: 'number' },
      city: { type: 'string' },
    },
    required: ['name'],
  } as const,
});

log.info(response.output);
```
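
The client also mirrors the plugin's `chatComplete` API. Here is a minimal sketch, assuming the message types come from `@kbn/inference-common` and that, as with `output`, the connector is selected automatically; the prompt text is just an illustration:

```ts
import { createInferenceClient } from '@kbn/inference-cli';
import { MessageRole } from '@kbn/inference-common';
import { ToolingLog } from '@kbn/tooling-log';

const log = new ToolingLog({ level: 'info', writeTo: process.stdout });

const inferenceClient = await createInferenceClient({
  log,
  signal: new AbortController().signal,
});

// Assumption: the CLI client takes the same `messages` shape as the plugin's
// start-contract chatComplete, and fills in the connector for you.
const { content } = await inferenceClient.chatComplete({
  messages: [
    {
      role: MessageRole.User,
      content: 'Summarize what the Elastic Inference Service does in one sentence.',
    },
  ],
});

log.info(content);
```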

Running a recipe:

`$ yarn run ts-node x-pack/solutions/observability/packages/kbn-genai-cli/recipes/hello_world.ts`
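
A recipe is just a standalone TypeScript file that creates the client and does its work. Since the usage snippet above uses `await` at the top level, a recipe will typically wrap the calls in an async entry point; a minimal sketch along those lines (the `main` wrapper and error handling are illustrative, not a required structure):

```ts
import { createInferenceClient } from '@kbn/inference-cli';
import { ToolingLog } from '@kbn/tooling-log';

async function main() {
  const log = new ToolingLog({ level: 'info', writeTo: process.stdout });

  const inferenceClient = await createInferenceClient({
    log,
    signal: new AbortController().signal,
  });

  const response = await inferenceClient.output({
    id: 'extract_personal_details',
    input: `Sarah is a 29-year-old software developer living in San Francisco.`,
    schema: {
      type: 'object',
      properties: {
        name: { type: 'string' },
        age: { type: 'number' },
        city: { type: 'string' },
      },
      required: ['name'],
    } as const,
  });

  log.info(response.output);
}

main().catch((error) => {
  process.exitCode = 1;
  // eslint-disable-next-line no-console
  console.error(error);
});
```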

## EIS

You can set up a local instance of the Elastic Inference Service (EIS) by running `node scripts/eis.js`. This starts the EIS Gateway in a Docker container, and handles certificates and configuration. See "Run EIS locally" above for the end-to-end workflow.

### Prerequisites

EIS connects to external LLM providers, so you need to supply credentials. By default, the setup script tries to fetch these from Vault: make sure your Vault CLI is configured to point at Elastic's Infra Vault server and that you're logged in. The default secret path is `secret/kibana-issues/dev/inference/*` on the Infra Vault cluster, which is accessible to all employees. If you prefer, you can run Vault locally instead and set `VAULT_ADDR` and `VAULT_SECRET_PATH`, as sketched below.
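
For example, pointing the setup script at a local dev-mode Vault might look like this; the secret path here is an arbitrary illustration, so use whatever path you store the provider credentials under:

```
# In one terminal: start a dev-mode Vault server (it prints a root token and
# listens on 127.0.0.1:8200 by default).
$ vault server -dev

# In another terminal: point the eis script at it.
$ export VAULT_ADDR=http://127.0.0.1:8200
$ export VAULT_TOKEN=<root token printed above>
$ export VAULT_SECRET_PATH=secret/local/inference
$ node scripts/eis.js
```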