[DOCS][Playground] Mention local model compatibility (#192911)

Note about openai sdk compatible local models + links to examples
This commit is contained in:
Liam Thompson 2024-09-16 14:18:22 +01:00 committed by GitHub
parent a0973d6002
commit 0a385f30fd


@ -89,6 +89,17 @@ a|
|===
[[playground-local-llms]]
[TIP]
====
You can also use locally hosted LLMs that are compatible with the OpenAI SDK.
Once you've set up your LLM, you can connect to it using the OpenAI connector.
Refer to the following for examples:

* {security-guide}/connect-to-byo-llm.html[Using LM Studio]
* https://www.elastic.co/search-labs/blog/localai-for-text-embeddings[LocalAI with `docker-compose`]
====
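As background, a server that is compatible with the OpenAI SDK exposes the same `/v1/chat/completions` endpoint shape that the OpenAI connector targets. The following is a minimal sketch of the request such a connector sends; the `http://localhost:1234/v1` base URL and `local-model` name are placeholder assumptions for a locally hosted server, not values from this documentation.

```python
import json


def build_chat_request(base_url: str, model: str, prompt: str):
    """Build the URL, headers, and JSON body for an OpenAI-compatible
    chat completions call. Nothing is sent over the network here."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Local servers typically ignore the key, but the header must be present.
        "Authorization": "Bearer not-a-real-key",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, headers, body


url, headers, body = build_chat_request(
    "http://localhost:1234/v1", "local-model", "Hello"
)
print(url)  # http://localhost:1234/v1/chat/completions
```

Because the request shape is identical, any server that accepts this payload can sit behind the OpenAI connector.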
[float]
[[playground-getting-started]]
== Getting started
@ -101,13 +112,15 @@ image::get-started.png[width=600]
=== Connect to LLM provider
To get started with {x}, you need to create a <<action-types,connector>> for your LLM provider.
Follow these steps on the {x} landing page:
You can also use the OpenAI connector to connect to <<playground-local-llms,locally hosted LLMs>> that are compatible with the OpenAI API.
To connect to an LLM provider, follow these steps on the {x} landing page:
. Under *Connect to an LLM*, click *Create connector*.
. Select your *LLM provider*.
. *Name* your connector.
. Select a *URL endpoint* (or use the default).
. Enter *access credentials* for your LLM provider.
. Enter *access credentials* for your LLM provider. (If you're running a locally hosted LLM through the OpenAI connector, you must enter a value in the API key field, but the specific value doesn't matter.)
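The last step can be sanity-checked before you fill in the connector form: a locally hosted server usually accepts any non-empty key. This sketch builds (but does not send) a request against a hypothetical local endpoint; `http://localhost:1234/v1` and `dummy-key` are placeholder assumptions, not values the connector requires.

```python
import urllib.request

# Placeholder values for a locally hosted, OpenAI-compatible server.
# The server does not validate the key, but the connector form requires one.
BASE_URL = "http://localhost:1234/v1"
API_KEY = "dummy-key"  # any non-empty string works for most local servers

req = urllib.request.Request(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
# urllib.request.urlopen(req) would list the models the local server exposes;
# here the request is only constructed, not sent.
print(req.full_url)
```

If `urlopen` succeeds against your local server, the same URL endpoint and key values should work in the connector form.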
[TIP]
====