
# response stream

This plugin demonstrates how to stream chunks of data to the client with just a single request.

To run Kibana with the described examples, use `yarn start --run-examples`.

The `response_stream` plugin demonstrates API endpoints that can stream data chunks with a single request, with optional gzip compression. gzip streams are decompressed natively by browsers. The plugin demonstrates two use cases to get started: streaming a raw string, as well as a more complex example that streams Redux-like actions to the client, which update React state via `useReducer()`.
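For the second use case, here is a minimal sketch of what such a Redux-like reducer could look like; the action names and state shape are hypothetical, not the ones the example plugin actually uses:

```ts
// Hypothetical stream actions and state to illustrate the Redux-like
// use case; the example plugin defines its own actions.
type StreamAction =
  | { type: 'add_chunk'; payload: string }
  | { type: 'reset' };

interface StreamState {
  chunks: string[];
}

// Each action received from the stream is fed into this reducer,
// which useReducer() applies to update the component state.
export function streamReducer(state: StreamState, action: StreamAction): StreamState {
  switch (action.type) {
    case 'add_chunk':
      return { ...state, chunks: [...state.chunks, action.payload] };
    case 'reset':
      return { chunks: [] };
    default:
      return state;
  }
}
```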

Code in `@kbn/ml-response-stream` contains helpers to set up a stream on the server side (`streamFactory()`) and consume it on the client side via a custom hook (`useFetchStream()`). The utilities make use of TS generics in a way that provides type safety for both request-related options and the returned data.

The helpers rely on no additional third-party libraries. On the server, they integrate with Hapi and use Node's own gzip. On the client, the custom hook abstracts away the logic needed to consume the stream; internally, it uses a generator function and `useReducer()` to update React state.

On the server, the simpler stream to send a string is set up like this:

```ts
const { end, push, responseWithHeaders } = streamFactory(request.headers);
```

The request's headers are passed on so the helper can automatically detect whether the client supports compression.
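As a rough sketch, assuming a standard Kibana route handler with `request` and `response` in scope (the chunk contents and the `setTimeout` scheduling are purely illustrative), the rest of such a handler could continue like this:

```ts
// Illustrative continuation of a route handler body; the actual example
// plugin pushes its own data and includes error handling.
const { end, push, responseWithHeaders } = streamFactory(request.headers);

// Push a few string chunks asynchronously, then close the stream.
setTimeout(() => {
  push('chunk one of the streamed string ');
  push('and chunk two.');
  end();
}, 100);

// Return the stream body together with the negotiated headers.
return response.ok(responseWithHeaders);
```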

On the client, the custom hook is used like this:

```ts
const {
  errors,
  start,
  cancel,
  data,
  isRunning
} = useFetchStream('/internal/response_stream/simple_string_stream');
```
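
A minimal sketch of how a component might wire these values up is shown below; the component, its markup, and the `@kbn/ml-response-stream/client` import path are assumptions for illustration, not the example plugin's actual UI:

```tsx
import React from 'react';
// Assumed client-side entry point of the package mentioned above.
import { useFetchStream } from '@kbn/ml-response-stream/client';

export const SimpleStringStreamDemo: React.FC = () => {
  const { errors, start, cancel, data, isRunning } = useFetchStream(
    '/internal/response_stream/simple_string_stream'
  );

  return (
    <div>
      {/* Start the stream, or cancel it while it is still running. */}
      <button onClick={isRunning ? cancel : start}>
        {isRunning ? 'Cancel' : 'Start stream'}
      </button>
      {/* `data` holds the reduced stream state received so far. */}
      <p>{data}</p>
      {errors.length > 0 && <p>{errors.join(', ')}</p>}
    </div>
  );
};
```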