[8.12] [Security Solution] [Elastic AI Assistant] Include acknowledged alerts in the LangChain AlertCountsTool aggregation (#173701) (#173801)

# Backport

This will backport the following commits from `main` to `8.12`:
- [[Security Solution] [Elastic AI Assistant] Include `acknowledged`
alerts in the LangChain `AlertCountsTool` aggregation
(#173701)](https://github.com/elastic/kibana/pull/173701)

<!--- Backport version: 8.9.8 -->

### Questions?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport).

## [Security Solution] [Elastic AI Assistant] Include `acknowledged` alerts in the LangChain `AlertCountsTool` aggregation

This PR updates the LangChain `AlertCountsTool` aggregation, which answers questions like `How many open alerts do I have?`, to include `acknowledged` alerts. The `AlertCountsTool` was introduced as part of [[Security Solution] [Elastic AI Assistant] Retrieval Augmented Generation (RAG) for Alerts #172542](https://github.com/elastic/kibana/pull/172542).

- This PR is similar to <https://github.com/elastic/kibana/pull/173121>, where `acknowledged` alerts were added to the `OpenAndAcknowledgedAlertsTool`, which returns the _details_ of alerts.
  - In contrast to [#173121](https://github.com/elastic/kibana/pull/173121), this PR is focused on the alert counts _aggregation_.

- This PR also updates the `range` of **both** the `AlertCountsTool` and the `OpenAndAcknowledgedAlertsTool` queries to standardize on the following syntax, which aligns with the `Last 24 hours` option in the _Commonly used_ section of the Kibana date picker:

```json
  "range": {
    "@timestamp": {
      "gte": "now-24h",
      "lte": "now"
    }
  }
```

### Desk testing

To desk test this change:

- The `assistantRagOnAlerts` feature flag described in [#172542](https://github.com/elastic/kibana/pull/172542) must be enabled, per the following example:

```
xpack.securitySolution.enableExperimental: ['assistantRagOnAlerts']
```

- The `Alerts` feature must be enabled in the assistant settings, per the screenshot below:

![enable_alerts](f6a3077d-5815-4225-9a8e-7f5b51d5f2d4)

1) Generate alerts with a variety of severities (e.g. `low`, `medium`, `high`, and `critical`)

2) After the alerts have been generated, disable all detection rules to keep the counts static during testing

3) Navigate to Security > Alerts

4) Select `Last 24 hours` from the _Commonly used_ section of the global date picker

5) Click the `Treemap` button to select the Treemap visualization

6) In the Treemap's `Group by` input, enter `kibana.alert.severity`

7) Next, in the Treemap's `Group by top` input, enter `kibana.alert.workflow_status`

8) Click the `AI Assistant` button to open the assistant

9) Click the `X` button to clear the conversation

10) Close the assistant

11) Add the following two fields as columns to the Alerts page table:

```
kibana.alert.workflow_status
_id
```

12) Sort the Alerts table, first by `kibana.alert.risk_score` from high to low, and then by `@timestamp` from new to old, per the screenshot below:

![fields_sorted](e84f06d4-790d-4227-afbf-a233d4848178)

**Expected results**

- The Alerts page date range is `Last 24 hours`
- The `Treemap` is selected
- The treemap is grouped by `kibana.alert.severity` and then `kibana.alert.workflow_status`
- The alerts table has custom sorting and columns, per the screenshot below:

![alerts_page_setup](f4700abc-b2ca-483e-92d8-5a186142e1fb)

13) Click the `AI Assistant` button to open the assistant

14) Ask the assistant:

```
How many open alerts do I have?
```

**Expected results**

- The assistant will report on the counts and workflow status of alerts, per the example response and screenshot below:

```
You have a total of 47 open alerts. Here's the breakdown: 24 alerts with low severity, 12 alerts with medium severity, 7 alerts with high severity, and 4 alerts with critical severity.
```

![assistant_open_alerts](45740c07-9317-42e6-943d-fc346b8106e5)

15) Make note of the counts shown in the assistant, then close the assistant

**Expected result**

- The counts from the assistant match the counts in the treemap legend, per the example screenshot below:

![open_alerts_in_treemap](368fb707-9faf-4b9b-a0b3-81fab4d680b2)

16) Change the workflow status of an alert in the Alerts table from `open` to `acknowledged`

**Expected result**

- The treemap and alerts table include the updated (`acknowledged`) alert, per the screenshot below:

![updated_treemap_and_table](0b8bedb7-aed7-41f1-abcd-f79a79480739)

17) Once again, open the assistant

18) Once again, ask the (same) question:

```
How many open alerts do I have?
```

**Expected result**

- The response from the assistant references the alert whose workflow status was changed from `open` to `acknowledged`, per the example response and screenshot below:

```
Based on the latest data I had received, you have a total of 47 open alerts. Here's the breakdown: 24 alerts are of low severity, 12 alerts are of medium severity, 7 alerts are of high severity, and 4 alerts are of critical severity (Note: One of the critical severity alerts has been acknowledged).
```

![with_acknowledged_alerts](4a8961f2-80eb-457f-b16b-8ea48c5d5c38)
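The two standardized clauses described above can be sketched as plain query-builder helpers. This is illustrative only: the helper names below are hypothetical (not from the Kibana source), but the clause shapes mirror the queries in this PR.

```typescript
// Illustrative only: plain helpers producing the standardized clauses from
// this PR. The helper names are hypothetical; the clause shapes mirror the
// queries above.

// Matches alerts whose workflow status is `open` OR `acknowledged`.
const getWorkflowStatusFilter = () => ({
  bool: {
    should: [
      { match_phrase: { 'kibana.alert.workflow_status': 'open' } },
      { match_phrase: { 'kibana.alert.workflow_status': 'acknowledged' } },
    ],
    minimum_should_match: 1, // at least one of the two statuses must match
  },
});

// Rolling 24-hour window, aligned with the date picker's `Last 24 hours`.
const getLast24HoursRange = () => ({
  range: {
    '@timestamp': {
      gte: 'now-24h',
      lte: 'now',
    },
  },
});
```

Composing these two clauses into the `filter` array of a `bool` query yields the structure exercised by the tests in the diffs below.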
This commit is contained in:
Andrew Macri 2023-12-21 09:47:15 -05:00 committed by GitHub
parent 93588d5b46
commit d01e6b8b18
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
7 changed files with 109 additions and 21 deletions


```diff
@@ -14,10 +14,17 @@ describe('getAlertsCountQuery', () => {
     expect(query).toEqual({
       aggs: {
-        statusBySeverity: {
+        kibanaAlertSeverity: {
           terms: {
             field: 'kibana.alert.severity',
           },
+          aggs: {
+            kibanaAlertWorkflowStatus: {
+              terms: {
+                field: 'kibana.alert.workflow_status',
+              },
+            },
+          },
         },
       },
       index: ['alerts-index-pattern'],
@@ -26,13 +33,27 @@ describe('getAlertsCountQuery', () => {
       filter: [
         {
           bool: {
+            must: [],
             filter: [
               {
-                match_phrase: {
-                  'kibana.alert.workflow_status': 'open',
+                bool: {
+                  should: [
+                    {
+                      match_phrase: {
+                        'kibana.alert.workflow_status': 'open',
+                      },
+                    },
+                    {
+                      match_phrase: {
+                        'kibana.alert.workflow_status': 'acknowledged',
+                      },
+                    },
+                  ],
+                  minimum_should_match: 1,
                 },
               },
             ],
+            should: [],
             must_not: [
               {
                 exists: {
@@ -45,8 +66,8 @@ describe('getAlertsCountQuery', () => {
         {
           range: {
             '@timestamp': {
-              gte: 'now/d',
-              lte: 'now/d',
+              gte: 'now-24h',
+              lte: 'now',
             },
           },
         },
```


```diff
@@ -7,10 +7,17 @@
 export const getAlertsCountQuery = (alertsIndexPattern: string) => ({
   aggs: {
-    statusBySeverity: {
+    kibanaAlertSeverity: {
       terms: {
         field: 'kibana.alert.severity',
       },
+      aggs: {
+        kibanaAlertWorkflowStatus: {
+          terms: {
+            field: 'kibana.alert.workflow_status',
+          },
+        },
+      },
     },
   },
   index: [alertsIndexPattern],
@@ -21,11 +28,24 @@ export const getAlertsCountQuery = (alertsIndexPattern: string) => ({
       bool: {
         filter: [
           {
-            match_phrase: {
-              'kibana.alert.workflow_status': 'open',
+            bool: {
+              should: [
+                {
+                  match_phrase: {
+                    'kibana.alert.workflow_status': 'open',
+                  },
+                },
+                {
+                  match_phrase: {
+                    'kibana.alert.workflow_status': 'acknowledged',
+                  },
+                },
+              ],
+              minimum_should_match: 1,
             },
           },
         ],
+        must: [],
         must_not: [
           {
             exists: {
@@ -33,13 +53,14 @@ export const getAlertsCountQuery = (alertsIndexPattern: string) => ({
             },
           },
         ],
+        should: [],
       },
     },
     {
       range: {
         '@timestamp': {
-          gte: 'now/d',
-          lte: 'now/d',
+          gte: 'now-24h',
+          lte: 'now',
         },
       },
     },
```
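The nested aggregation introduced above returns severity buckets that each carry workflow-status sub-buckets. As a sketch of how such a response could be turned into the per-severity, per-status counts the assistant reports: the bucket shapes below follow the standard Elasticsearch `terms` aggregation response, but the summarizing helper is hypothetical, not part of this PR.

```typescript
// Illustrative sketch: flattening the nested terms-aggregation response
// (severity buckets, each with workflow-status sub-buckets) into a summary.
// Bucket shapes follow the standard Elasticsearch `terms` aggregation.

interface StatusBucket {
  key: string; // e.g. 'open' | 'acknowledged'
  doc_count: number;
}

interface SeverityBucket {
  key: string; // e.g. 'critical'
  doc_count: number;
  kibanaAlertWorkflowStatus: { buckets: StatusBucket[] };
}

const summarizeAlertCounts = (severityBuckets: SeverityBucket[]): string =>
  severityBuckets
    .map((severity) => {
      const byStatus = severity.kibanaAlertWorkflowStatus.buckets
        .map((status) => `${status.doc_count} ${status.key}`)
        .join(', ');
      return `${severity.key}: ${severity.doc_count} (${byStatus})`;
    })
    .join('; ');

// Example response fragment for one severity bucket:
const buckets: SeverityBucket[] = [
  {
    key: 'critical',
    doc_count: 4,
    kibanaAlertWorkflowStatus: {
      buckets: [
        { key: 'open', doc_count: 3 },
        { key: 'acknowledged', doc_count: 1 },
      ],
    },
  },
];

console.log(summarizeAlertCounts(buckets)); // → "critical: 4 (3 open, 1 acknowledged)"
```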


```diff
@@ -45,18 +45,64 @@ describe('getAlertCountsTool', () => {
     await tool.func('');
     expect(esClient.search).toHaveBeenCalledWith({
-      aggs: { statusBySeverity: { terms: { field: 'kibana.alert.severity' } } },
+      aggs: {
+        kibanaAlertSeverity: {
+          terms: {
+            field: 'kibana.alert.severity',
+          },
+          aggs: {
+            kibanaAlertWorkflowStatus: {
+              terms: {
+                field: 'kibana.alert.workflow_status',
+              },
+            },
+          },
+        },
+      },
       index: ['alerts-index'],
       query: {
         bool: {
           filter: [
             {
               bool: {
-                filter: [{ match_phrase: { 'kibana.alert.workflow_status': 'open' } }],
-                must_not: [{ exists: { field: 'kibana.alert.building_block_type' } }],
+                filter: [
+                  {
+                    bool: {
+                      should: [
+                        {
+                          match_phrase: {
+                            'kibana.alert.workflow_status': 'open',
+                          },
+                        },
+                        {
+                          match_phrase: {
+                            'kibana.alert.workflow_status': 'acknowledged',
+                          },
+                        },
+                      ],
+                      minimum_should_match: 1,
+                    },
+                  },
+                ],
+                should: [],
+                must: [],
+                must_not: [
+                  {
+                    exists: {
+                      field: 'kibana.alert.building_block_type',
+                    },
+                  },
+                ],
               },
             },
-            { range: { '@timestamp': { gte: 'now/d', lte: 'now/d' } } },
+            {
+              range: {
+                '@timestamp': {
+                  gte: 'now-24h',
+                  lte: 'now',
+                },
+              },
+            },
           ],
         },
       },
```


```diff
@@ -15,7 +15,7 @@ import { requestHasRequiredAnonymizationParams } from '../../helpers';
 import type { RequestBody } from '../../types';
 
 export const ALERT_COUNTS_TOOL_DESCRIPTION =
-  'Call this for the counts of last 24 hours of open alerts in the environment, grouped by their severity';
+  'Call this for the counts of last 24 hours of open and acknowledged alerts in the environment, grouped by their severity and workflow status.';
 
 export const getAlertCountsTool = ({
   alertsIndexPattern,
```

```diff
@@ -49,8 +49,8 @@ describe('getOpenAndAcknowledgedAlertsQuery', () => {
         {
           range: {
             '@timestamp': {
-              gte: 'now-1d/d',
-              lte: 'now/d',
+              gte: 'now-24h',
+              lte: 'now',
               format: 'strict_date_optional_time',
             },
           },
```


```diff
@@ -47,8 +47,8 @@ export const getOpenAndAcknowledgedAlertsQuery = ({
     {
       range: {
         '@timestamp': {
-          gte: 'now-1d/d',
-          lte: 'now/d',
+          gte: 'now-24h',
+          lte: 'now',
           format: 'strict_date_optional_time',
         },
       },
```


```diff
@@ -95,8 +95,8 @@ describe('getOpenAndAcknowledgedAlertsTool', () => {
       range: {
         '@timestamp': {
           format: 'strict_date_optional_time',
-          gte: 'now-1d/d',
-          lte: 'now/d',
+          gte: 'now-24h',
+          lte: 'now',
         },
       },
     },
```
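For context on the `range` changes across these files: `now-1d/d` and `now/d` are calendar-day-aligned Elasticsearch date-math expressions (rounded to day boundaries), while the new `now-24h`/`now` pair is a rolling window matching the date picker's `Last 24 hours` option. A minimal sketch of the rolling window in plain `Date` arithmetic (illustrative only; Elasticsearch evaluates the date-math strings server-side):

```typescript
// Illustrative only: the rolling window that `gte: 'now-24h', lte: 'now'`
// expresses. Elasticsearch resolves the date-math strings itself; this
// mirrors the semantics in plain JavaScript for comparison.

const last24HourWindow = (now: Date): { gte: Date; lte: Date } => ({
  gte: new Date(now.getTime() - 24 * 60 * 60 * 1000), // exactly 24 hours ago
  lte: now,
});

const w = last24HourWindow(new Date('2023-12-21T09:30:00Z'));
console.log(w.gte.toISOString()); // → 2023-12-20T09:30:00.000Z
```

Unlike the old calendar-aligned range, this window moves continuously with the clock, so the tool's counts track the same interval the Alerts page shows.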