[Inference Connector][Serverless] Added preconfigured connector for inference Elastic Rainbow Sprinkles LLM (#209946)

This PR adds the Serverless Kibana preconfigured `.inference`
connector for the Elastic Rainbow Sprinkles LLM, defining it in the
Kibana serverless config directly instead of adding it within kibana-controller.
Yuliia Naumenko 2025-02-07 17:15:47 -08:00 committed by GitHub
parent 7a9bf1399c
commit 41a66ec75b


@@ -275,4 +275,16 @@ xpack.dataUsage.enableExperimental: ['dataUsageDisabled']
uiSettings.experimental.defaultTheme: "amsterdam"
# This feature is disabled in Serverless until Inference Endpoints become enabled within a Serverless environment
xpack.stack_connectors.enableExperimental: ['inferenceConnectorOff']
# This is the definition introducing the pre-configured Kibana connector for the Elastic default LLM
Elastic-Inference-Rainbow-Sprinkles:
  name: Elastic-Inference-Rainbow-Sprinkles
  actionTypeId: .inference
  exposeConfig: true
  config:
    provider: 'elastic'
    taskType: 'chat_completion'
    inferenceId: '.rainbow-sprinkles-elastic'
    providerConfig:
      model_id: 'rainbow-sprinkles'
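
For context, in self-managed Kibana deployments preconfigured connectors are registered under the `xpack.actions.preconfigured` key in `kibana.yml`. A minimal sketch of what the full entry would look like under that key, assuming the serverless config file follows the same shape (the parent key is not shown in the diff hunk above):

```yaml
# Sketch only: xpack.actions.preconfigured is the standard kibana.yml key for
# preconfigured connectors; the serverless file may nest this differently.
xpack.actions.preconfigured:
  Elastic-Inference-Rainbow-Sprinkles:
    name: Elastic-Inference-Rainbow-Sprinkles
    actionTypeId: .inference     # the inference connector type
    exposeConfig: true           # allow clients to read the config block
    config:
      provider: 'elastic'
      taskType: 'chat_completion'
      inferenceId: '.rainbow-sprinkles-elastic'
      providerConfig:
        model_id: 'rainbow-sprinkles'
```

Because the connector is preconfigured, its ID is the map key (`Elastic-Inference-Rainbow-Sprinkles`), and it is available at startup without being created through the Kibana connectors API.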