# response stream
This plugin demonstrates how to stream chunks of data to the client with just a single request.
To run Kibana with the described examples, use `yarn start --run-examples`.
The `response_stream` plugin demonstrates API endpoints that can stream data chunks with a single request, with gzip compression support. gzip streams are decompressed natively by browsers. The plugin demonstrates two use cases to get started: streaming a raw string, and a more complex example that streams Redux-like actions to the client, which update React state via `useReducer()`.
Code in `@kbn/aiops-utils` contains helpers to set up a stream on the server side (`streamFactory()`) and consume it on the client side via a custom hook (`useFetchStream()`). The utilities make use of TS generics in a way that provides type safety for both the request-related options and the returned data.
No additional third-party libraries are used in the helpers. On the server, they integrate with Hapi and use Node's own `gzip`. On the client, the custom hook abstracts away the logic necessary to consume the stream; internally it uses a generator function and `useReducer()` to update React state.
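For illustration, the general pattern behind such a client-side consumer, reading a `fetch()` response body chunk by chunk via an async generator, could look like the following sketch. This is not the actual implementation of `useFetchStream()`; the function name is hypothetical.

```ts
// Illustrative sketch only, not the internals of useFetchStream():
// read a fetch() response body chunk by chunk and yield decoded strings.
async function* streamChunks(response: Response): AsyncGenerator<string> {
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    yield decoder.decode(value, { stream: true });
  }
}

// Each yielded chunk (or parsed action) can then be dispatched to a
// useReducer() dispatch function to update React state.
```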
On the server, the simpler stream to send a string is set up like this:
```ts
const { end, push, responseWithHeaders } = streamFactory(request.headers);
```
The request's headers are passed on to automatically identify whether the client supports compression.
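As a rough sketch of how `push()` and `end()` might then be used, the handler below pushes a few string chunks on a timer and closes the stream. The route path, the timing, and the handler wiring are illustrative assumptions, not copied from the plugin's code.

```ts
// Sketch only: assumes a Kibana route handler where streamFactory() provides
// push(), end() and responseWithHeaders as shown above.
router.post(
  { path: '/internal/response_stream/simple_string_stream', validate: false },
  async (context, request, response) => {
    const { end, push, responseWithHeaders } = streamFactory(request.headers);

    const chunks = ['Hello ', 'streaming ', 'world!'];
    let index = 0;

    // Push one chunk every 100ms, then close the stream.
    const interval = setInterval(() => {
      if (index < chunks.length) {
        push(chunks[index++]);
      } else {
        clearInterval(interval);
        end();
      }
    }, 100);

    // Returning responseWithHeaders starts the chunked (and, if supported,
    // gzipped) response while chunks keep being pushed.
    return response.ok(responseWithHeaders);
  }
);
```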
On the client, the custom hook is used like this:
```ts
const { errors, start, cancel, data, isRunning } = useFetchStream<
  ApiSimpleStringStream,
  typeof basePath
>(`${basePath}/internal/response_stream/simple_string_stream`);
```
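To show how those returned values could be wired up, here is a minimal, hypothetical component that starts the stream on mount and renders the accumulated data; the component name and markup are assumptions for illustration, not part of the plugin.

```tsx
// Sketch only: starts the stream on mount, cancels it on unmount, and
// renders the streamed string plus any errors.
import React, { useEffect } from 'react';

export const SimpleStringStreamExample = () => {
  const { errors, start, cancel, data, isRunning } = useFetchStream<
    ApiSimpleStringStream,
    typeof basePath
  >(`${basePath}/internal/response_stream/simple_string_stream`);

  useEffect(() => {
    start();
    return () => cancel();
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []);

  return (
    <div>
      <p>{isRunning ? 'Streaming…' : 'Complete'}</p>
      <p>{data}</p>
      {errors.length > 0 && <p>{errors.join(', ')}</p>}
    </div>
  );
};
```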