* Update service-openai.asciidoc (#125419)
Many customers want to use our OpenAI inference endpoint against OpenAI-compatible APIs they have written themselves, or against Ollama or the NVIDIA Triton OpenAI-compatible API front end. I had heard that this was the intent of the OpenAI inference endpoint, but we do not state it directly. Can we validate that this is OK with the Search PM and include it?
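As a sketch of what the docs change describes: the OpenAI inference service can reportedly be pointed at any OpenAI-compatible server by overriding the target URL in the service settings. The snippet below builds such a create-endpoint request body targeting a local Ollama instance. The `url` service setting is taken from the OpenAI inference service reference; the endpoint name, model, and Ollama address are illustrative assumptions, not values from this PR.

```python
import json

# Hypothetical example: create an inference endpoint that sends OpenAI-style
# chat completion requests to a local Ollama server instead of api.openai.com.
endpoint_id = "ollama-chat"  # example endpoint name, not from the PR
body = {
    "service": "openai",
    "service_settings": {
        # Many OpenAI-compatible servers ignore the key but the field is required.
        "api_key": "unused-placeholder",
        # Model name must match what the compatible backend actually serves.
        "model_id": "llama3",
        # Ollama's OpenAI-compatible chat completions route (assumed default port).
        "url": "http://localhost:11434/v1/chat/completions",
    },
}

print(f"PUT _inference/chat_completion/{endpoint_id}")
print(json.dumps(body, indent=2))
```

The same shape should apply to any other OpenAI-compatible front end (for example NVIDIA Triton); only `url` and `model_id` would change.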
Co-authored-by: István Zoltán Szabó <istvan.szabo@elastic.co>
* Update docs/reference/inference/service-openai.asciidoc
---------
Co-authored-by: Brad Quarry <38725582+bradquarry@users.noreply.github.com>
* Including examples
* Using js instead of json
* Adding unified docs to main page
* Adding missing description text
* Refactoring to remove unified route
* Adding back references to the _unified route
* Update docs/reference/inference/chat-completion-inference.asciidoc
* Address feedback
---------
Co-authored-by: István Zoltán Szabó <istvan.szabo@elastic.co>