[Security Solution][API testing] Move and restructures Risk Engine tests (#170530)

## Summary

Following the initial work in https://github.com/elastic/kibana/pull/166755, this PR:
- Addresses part of https://github.com/elastic/kibana/issues/151902 for
Entity Analytics/Risk Engine
- Introduces a new `entity_analytics/risk_engine` folder under
`security_solution_api_integration`
- Moves the utility files associated with risk_engine to the new
`security_solution_api_integration` directory. Files that were not
actively used in the previous folder were moved, while duplicated files
remained in their original positions.
- Updates the CODEOWNERS file for the newly moved tests
- Old/new group details and execution times:
[document](https://docs.google.com/document/d/1CRFfDWMzw3ob03euWIvT4-IoiLXjoiPWI8mTBqP4Zks/edit)


| Action | File | New Path if moved |
|--------|------|-------------------|
| Moved | group10/risk_engine/init_and_status_apis | /entity_analytics/default_license/risk_engine/init_and_status_apis.ts |
| Moved | group10/risk_engine/risk_score_calculation | /entity_analytics/default_license/risk_engine/risk_score_calculation.ts |
| Moved | group10/risk_engine/risk_score_preview | /entity_analytics/default_license/risk_engine/risk_score_preview.ts |
| Moved | group10/risk_engine/risk_scoring_task_execution | /entity_analytics/default_license/risk_engine/risk_scoring_task_execution.ts |
| Moved | group10/risk_engine/telemetry_usage | /entity_analytics/default_license/risk_engine/telemetry_usage.ts |
| Moved | group10/risk_engine/utils | entity_analytics/utils/risk_engine.ts |
| Moved | utils/get_stats | entity_analytics/utils/get_risk_engine_stats.ts |
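
For illustration, a hedged sketch of how a moved test file's imports change (relative paths and helper names are taken from the diffs below, not an exhaustive list of re-exported helpers):

```ts
// Before the move (group10/risk_engine/*.ts): helpers were imported from the
// sibling utils folder.
// import { riskEngineRouteHelpersFactory, cleanRiskEngine } from './utils';

// After the move (test_suites/entity_analytics/default_license/risk_engine/*.ts):
// the shared helpers come from the consolidated entity_analytics utils barrel.
import { riskEngineRouteHelpersFactory, cleanRiskEngine } from '../../utils';
```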

Tests skipped on main:


https://github.com/elastic/kibana/blob/main/x-pack/test/detection_engine_api_integration/security_and_spaces/group10/risk_engine/init_and_status_apis.ts#L363

---------

Co-authored-by: kibanamachine <42973632+kibanamachine@users.noreply.github.com>
Co-authored-by: Nikita Khristinin <nkhristinin@gmail.com>
Wafaa Nasr 2023-11-09 14:52:45 +01:00 committed by GitHub
parent 629ee090c3
commit eeaf815d23
33 changed files with 1328 additions and 165 deletions

View file

@ -462,6 +462,9 @@ enabled:
- x-pack/test/security_solution_api_integration/test_suites/detections_response/default_license/actions/configs/ess.config.ts
- x-pack/test/security_solution_api_integration/test_suites/detections_response/default_license/alerts/configs/serverless.config.ts
- x-pack/test/security_solution_api_integration/test_suites/detections_response/default_license/alerts/configs/ess.config.ts
- x-pack/test/security_solution_api_integration/test_suites/entity_analytics/default_license/risk_engine/configs/serverless.config.ts
- x-pack/test/security_solution_api_integration/test_suites/entity_analytics/default_license/risk_engine/configs/ess.config.ts

.github/CODEOWNERS vendored
View file

@ -1438,7 +1438,7 @@ x-pack/plugins/security_solution/public/overview/pages/entity_analytics.tsx @ela
x-pack/plugins/security_solution/public/overview/components/entity_analytics
x-pack/plugins/security_solution/server/lib/entity_analytics @elastic/security-entity-analytics
x-pack/plugins/security_solution/server/lib/risk_score @elastic/security-entity-analytics
x-pack/test/detection_engine_api_integration/security_and_spaces/group10/risk_engine @elastic/security-entity-analytics
x-pack/test/security_solution_api_integration/test_suites/entity_analytics @elastic/security-entity-analytics
# Security Defend Workflows - OSQuery Ownership
/x-pack/plugins/security_solution/common/api/detection_engine/model/rule_response_actions @elastic/security-defend-workflows

View file

@ -30,10 +30,5 @@ export default ({ loadTestFile }: FtrProviderContext): void => {
loadTestFile(require.resolve('./runtime'));
loadTestFile(require.resolve('./throttle'));
loadTestFile(require.resolve('./ignore_fields'));
loadTestFile(require.resolve('./risk_engine/init_and_status_apis'));
loadTestFile(require.resolve('./risk_engine/risk_score_preview'));
loadTestFile(require.resolve('./risk_engine/risk_score_calculation'));
loadTestFile(require.resolve('./risk_engine/risk_scoring_task_execution'));
loadTestFile(require.resolve('./risk_engine/telemetry_usage'));
});
};

View file

@ -28,7 +28,7 @@ ex:
```
## Adding new security area's tests
# Adding new security area's tests
1. Within the `test_suites` directory, create a new area folder.
2. Introduce `ess.config` and `serverless.config` files to reference the new test files and incorporate any additional custom properties defined in the `CreateTestConfigOptions` interface.
@ -36,7 +36,7 @@ ex:
4. Append a new entry in the `ftr_configs.yml` file to enable the execution of the newly added tests within the CI pipeline.
## Testing locally
# Testing locally
In the `package.json` file, you'll find commands to configure the server for each environment and to run tests against that specific environment. These commands adhere to the Mocha tagging system, allowing for the inclusion and exclusion of tags, mirroring the setup of the CI pipeline.
@ -44,49 +44,55 @@ In the `package.json` file, you'll find commands to configure the server for eac
In this project, you can run various commands to execute tests and workflows, each of which can be customized by specifying different parameters. Below is how to define the commands based on the parameters and their order.
### Command Structure
1. Server Initialization and running tests for ex: (Detections Response - Default License):
The command structure follows this pattern
- `<type>` can be either "server" or "runner," allowing you to either set up the server or execute the tests against the designated server.
- `<area>`: The area the test is defined under, such as "detection_engine, entity_analytics,.."
- `<licenseFolder>`: The license folder the test is defined under such as "default_license, basic_license,..."
The command structure follows this pattern:
#### `initialize-server:dr:default`
- `<command-name>`: The name of the specific command or test case.
- `<folder>`: The test folder or workflow you want to run.
- `<type>`: The type of operation, either "server" or "runner."
- `<environment>`: The testing environment, such as "serverlessEnv," "essEnv," or "qaEnv."
- `<licenseFolder>`: The license folder the test is defined under such as "default_license", by default the value is "default_license"
- `<area>`: The area the test is defined under, such as "detection_engine", by default the value is "detection_engine"
- Command: `node ./scripts/index.js server detections_response default_license`
- Description: Initiates the server for the Detections Response area with the default license.
#### `run-tests:dr:default`
### Serverless and Ess Configuration
- When using "serverless" or "ess" in the script, it specifies the correct configuration file for the tests.
- "Serverless" and "ess" help determine the configuration specific to the chosen test.
### serverlessEnv, essEnv, qaEnv Grep Command
- When using "serverlessEnv,.." in the script, it appends the correct grep command for filtering tests in the serverless testing environment.
- "serverlessEnv,..." is used to customize the test execution based on the serverless environment.
- Command: `node ./scripts/index.js runner detections_response default_license`
- Description: Runs the tests for the Detections Response area with the default license.
### Command Examples
Here are some command examples using the provided parameters:
2. Executes particular sets of test suites linked to the designated environment and license:
1. **Run the server for "exception_workflows" in the "serverlessEnv" environment:**
```shell
npm run initialize-server exceptions/workflows serverless
```
2. **To run tests for the "exception_workflows" using the serverless runner in the "serverlessEnv" environment, you can use the following command:**
```shell
npm run run-tests exceptions/workflows serverless serverlessEnv
```
3. **Run tests for "exception_workflows" using the serverless runner in the "qaEnv" environment:**
```shell
npm run run-tests exceptions/workflows serverless qaEnv
```
4. **Run the server for "exception_workflows" in the "essEnv" environment:**
```shell
npm run initialize-server exceptions/workflows ess
```
5. **Run tests for "exception_workflows" using the ess runner in the "essEnv" environment:**
```shell
npm run run-tests exceptions/workflows ess essEnv
```
The command structure follows this pattern:
- `<folder>`: The test folder or workflow you want to run.
- `<projectType>`: The type of project to pick the relevant configurations, either "serverless" or "ess."
- "serverless" and "ess" help determine the configuration specific to the chosen test.
- `<environment>`: The testing environment, such as "serverlessEnv," "essEnv," or "qaEnv."
- When using "serverlessEnv,.." in the script, it appends the correct grep command for filtering tests in the serverless testing environment.
- "serverlessEnv,..." is used to customize the test execution based on the serverless environment.
Here are some command examples for "exceptions", which is defined under the "detection_engine" area, using the default license:
1. **Run the server for "exception_workflows" in the "serverlessEnv" environment:**
```shell
npm run initialize-server:dr:default exceptions/workflows serverless
```
2. **To run tests for the "exception_workflows" using the serverless runner in the "serverlessEnv" environment, you can use the following command:**
```shell
npm run run-tests:dr:default exceptions/workflows serverless serverlessEnv
```
3. **Run tests for "exception_workflows" using the serverless runner in the "qaEnv" environment:**
```shell
npm run run-tests:dr:default exceptions/workflows serverless qaEnv
```
4. **Run the server for "exception_workflows" in the "essEnv" environment:**
```shell
npm run initialize-server:dr:default exceptions/workflows ess
```
5. **Run tests for "exception_workflows" using the ess runner in the "essEnv" environment:**
```shell
npm run run-tests:dr:default exceptions/workflows ess essEnv
```

View file

@ -4,7 +4,8 @@
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { GenericFtrProviderContext } from '@kbn/test';
import type { FtrProviderContext } from '../../test_serverless/api_integration/ftr_provider_context';
import { services } from '../../test_serverless/api_integration/services';
export type { FtrProviderContext };
export type FtrProviderContext = GenericFtrProviderContext<typeof services, {}>;

View file

@ -7,6 +7,8 @@
"scripts": {
"initialize-server:dr:default": "node ./scripts/index.js server detections_response default_license",
"run-tests:dr:default": "node ./scripts/index.js runner detections_response default_license",
"initialize-server:ea:default": "node ./scripts/index.js server entity_analytics default_license",
"run-tests:ea:default": "node ./scripts/index.js runner entity_analytics default_license",
"exception_workflows:server:serverless": "npm run initialize-server:dr:default exceptions/workflows serverless",
"exception_workflows:runner:serverless": "npm run run-tests:dr:default exceptions/workflows serverless serverlessEnv",
"exception_workflows:qa:serverless": "npm run run-tests:dr:default exceptions/workflows serverless qaEnv",
@ -41,6 +43,11 @@
"alerts:runner:serverless": "npm run run-tests:dr:default alerts serverless serverlessEnv",
"alerts:qa:serverless": "npm run run-tests:dr:default alerts serverless qaEnv",
"alerts:server:ess": "npm run initialize-server:dr:default alerts ess",
"alerts:runner:ess": "npm run run-tests:dr:default alerts ess essEnv"
"alerts:runner:ess": "npm run run-tests:dr:default alerts ess essEnv",
"entity_analytics:server:serverless": "npm run initialize-server:ea:default risk_engine serverless",
"entity_analytics:runner:serverless": "npm run run-tests:ea:default risk_engine serverless serverlessEnv",
"entity_analytics:qa:serverless": "npm run run-tests:ea:default risk_engine serverless qaEnv",
"entity_analytics:server:ess": "npm run initialize-server:ea:default risk_engine ess",
"entity_analytics:runner:ess": "npm run run-tests:ea:default risk_engine ess essEnv"
}
}

View file

@ -14,6 +14,7 @@ import type { RiskEnrichmentFields } from '@kbn/security-solution-plugin/server/
import { DETECTION_ENGINE_QUERY_SIGNALS_URL as DETECTION_ENGINE_QUERY_ALERTS_URL } from '@kbn/security-solution-plugin/common/constants';
import { countDownTest } from '../count_down_test';
import { getQueryAlertsId } from './get_query_alerts_ids';
import { routeWithNamespace } from '../route_with_namespace';
/**
* Given an array of rule ids this will return only alerts based on that rule id both
@ -25,12 +26,14 @@ export const getAlertsByIds = async (
supertest: SuperTest.SuperTest<SuperTest.Test>,
log: ToolingLog,
ids: string[],
size?: number
size?: number,
namespace?: string
): Promise<SearchResponse<DetectionAlert & RiskEnrichmentFields>> => {
const alertsOpen = await countDownTest<SearchResponse<DetectionAlert & RiskEnrichmentFields>>(
async () => {
const route = routeWithNamespace(DETECTION_ENGINE_QUERY_ALERTS_URL, namespace);
const response = await supertest
.post(DETECTION_ENGINE_QUERY_ALERTS_URL)
.post(route)
.set('kbn-xsrf', 'true')
.send(getQueryAlertsId(ids, size));
if (response.status !== 200) {

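The new `namespace` parameter is applied through the `routeWithNamespace` helper imported above. As a rough, non-authoritative sketch of the assumed behavior (prefixing the route with the Kibana space path when a namespace is given):

```ts
// Assumed shape of the helper (illustrative only): prepend the Kibana space
// prefix (/s/<namespace>) when a namespace is provided, otherwise return the
// route unchanged.
export const routeWithNamespace = (route: string, namespace?: string): string =>
  namespace ? `/s/${namespace}${route}` : route;

// routeWithNamespace(DETECTION_ENGINE_QUERY_ALERTS_URL, 'my-space') would yield
// '/s/my-space' + DETECTION_ENGINE_QUERY_ALERTS_URL
```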
View file

@ -21,11 +21,12 @@ export const waitForAlertsToBePresent = async (
supertest: SuperTest.SuperTest<SuperTest.Test>,
log: ToolingLog,
numberOfAlerts = 1,
alertIds: string[]
alertIds: string[],
namespace?: string
): Promise<void> => {
await waitFor(
async () => {
const alertsOpen = await getAlertsByIds(supertest, log, alertIds, numberOfAlerts);
const alertsOpen = await getAlertsByIds(supertest, log, alertIds, numberOfAlerts, namespace);
return alertsOpen.hits.hits.length >= numberOfAlerts;
},
'waitForAlertsToBePresent',

View file

@ -0,0 +1,606 @@
# Data Generator for functional tests
Helper to generate and index documents for use in Kibana functional tests
- [Data Generator for functional tests](#data-generator-for-functional-tests)
- [DataGenerator](#datagenerator)
- [Initialization](#initialization)
- [Prerequisites](#prerequisites)
- [dataGeneratorFactory](#datageneratorfactory)
- [methods](#methods)
- [**indexListOfDocuments**](#indexlistofdocuments)
- [**indexGeneratedDocuments**](#indexgenerateddocuments)
- [**indexEnhancedDocuments**](#indexenhanceddocuments)
- [Utils](#utils)
- [**generateDocuments**](#generatedocuments)
- [**enhanceDocument**](#enhancedocument)
- [**enhanceDocuments**](#enhancedocuments)
- [Usage](#usage)
- [create test query rule that queries indexed documents within a test](#create-test-query-rule-that-queries-indexed-documents-within-a-test)
## DataGenerator
### Initialization
#### Prerequisites
1. Create index mappings in `x-pack/test/functional/es_archives/security_solution`
- create folder for index `foo_bar`
- add mappings file `mappings.json` in it
<details>
<summary>x-pack/test/functional/es_archives/security_solution/foo_bar/mappings.json</summary>
```JSON
{
"type": "index",
"value": {
"index": "foo_bar",
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"@timestamp": {
"type": "date"
},
"foo": {
"type": "keyword"
},
}
},
"settings": {
"index": {
"number_of_replicas": "1",
"number_of_shards": "1"
}
}
}
}
```
</details>
2. Add in `before` of the test file index initialization
```ts
const esArchiver = getService('esArchiver');
before(async () => {
await esArchiver.load(
'x-pack/test/functional/es_archives/security_solution/foo_bar'
);
});
```
3. Add in `after` of the test file index removal
```ts
const esArchiver = getService('esArchiver');
before(async () => {
await esArchiver.unload(
'x-pack/test/functional/es_archives/security_solution/foo_bar'
);
});
```
#### dataGeneratorFactory
`DataGeneratorParams`
| Property | Description | Type |
| --------------- | ------------------------------------------------------ | ------ |
| es | ES client | `ESClient` |
| index | index where document will be added | `string` |
| log | log client | `LogClient`|
1. import and initialize factory
```ts
import { dataGeneratorFactory } from '../../utils/data_generator';
const es = getService('es');
const log = getService('log');
const { indexListOfDocuments, indexGeneratedDocuments } = dataGeneratorFactory({
es,
index: 'foo_bar',
log,
});
```
2. The factory will return 2 methods which can be used to index documents into `foo_bar`,
where `getService` is a method from `FtrProviderContext`
### methods
#### **indexListOfDocuments**
| Property | Description | Type |
| --------------- | ------------------------------------------------------ | ------ |
| documents | list of documents to index | `Record<string, unknown>` |
Will index the list of documents into the `foo_bar` index, as defined in the `dataGeneratorFactory` params
```ts
await indexListOfDocuments([{ foo: "bar" }, { id: "test-1" }])
```
#### **indexGeneratedDocuments**
Will generate documents in the defined interval and index them into the `foo_bar` index, as defined in the `dataGeneratorFactory` params
The method receives the same parameters as the [generateDocuments](#generatedocuments) util.
```ts
await indexGeneratedDocuments({
docsCount: 10,
interval: ['2020-10-28T07:30:00.000Z', '2020-10-30T07:30:00.000Z'],
seed: (i, id, timestamp) => ({ id, '@timestamp': timestamp, seq: i })
})
```
#### **indexEnhancedDocuments**
Will index the list of enhanced documents into the `foo_bar` index, as defined in the `dataGeneratorFactory` params
The method receives the same parameters as the [enhanceDocuments](#enhancedocuments) util.
```ts
await indexEnhancedDocuments({
interval: ['1996-02-15T13:02:37.531Z', '2000-02-15T13:02:37.531Z'],
documents: [{ foo: 'bar' }, { foo: 'bar-1' }, { foo: 'bar-2' }]
})
```
## Utils
### **generateDocuments**
The `generateDocuments` util generates a list of documents based on a basic seed function.
The seed callback receives the sequential number of the document, a generated id, and a timestamp.
It can be used to generate custom documents with a large set of options, depending on your needs. See the examples below.
| Property | Description | Type |
| --------------- | ------------------------------------------------------ | ------ |
| docsCount | number of documents to generate | `number` |
| seed | function that receives the sequential number of the document, a generated id, and a timestamp as arguments and can use them to create a document | `(index: number, id: string, timestamp: string) => Document` |
| interval | interval in which to generate documents, defined by the '@timestamp' field | `[string \| Date, string \| Date]` _(optional)_ |
Examples:
1. Generate 10 documents with a random id, a timestamp in the interval between '2020-10-28T07:30:00.000Z' and '2020-10-30T07:30:00.000Z', and a `seq` field that represents the sequential number of the document
```ts
const documents = generateDocuments({
docsCount: 10,
interval: ['2020-10-28T07:30:00.000Z', '2020-10-30T07:30:00.000Z'],
seed: (i, id, timestamp) => ({ id, '@timestamp': timestamp, seq: i })
})
```
<details>
<summary>Generated docs</summary>
```JSON
[
{
"id": "87d3d231-13c8-4d03-9ae4-d40781b3b2d1",
"@timestamp": "2020-10-30T04:00:55.790Z",
"seq": 0
},
{
"id": "90b99797-d0da-460d-86fd-eca40bedff39",
"@timestamp": "2020-10-28T08:43:01.117Z",
"seq": 1
},
{
"id": "809c05be-f401-4e31-86e1-55be8af4fac4",
"@timestamp": "2020-10-29T15:06:23.054Z",
"seq": 2
},
{
"id": "a2720f82-5401-4eab-b2eb-444a8425c937",
"@timestamp": "2020-10-29T23:19:47.790Z",
"seq": 3
},
{
"id": "e36e4418-4e89-4388-97df-97085b3fca92",
"@timestamp": "2020-10-29T09:14:00.966Z",
"seq": 4
},
{
"id": "4747adb3-0603-4651-8c0f-0c7df037f779",
"@timestamp": "2020-10-28T14:23:50.500Z",
"seq": 5
},
{
"id": "1fbfd873-b0ca-4cda-9c96-9a044622e712",
"@timestamp": "2020-10-28T10:00:20.995Z",
"seq": 6
},
{
"id": "9173cf93-1f9f-4f91-be5e-1e6888cb3aae",
"@timestamp": "2020-10-28T08:52:27.830Z",
"seq": 7
},
{
"id": "53245337-e383-4b28-9975-acbd79901b7c",
"@timestamp": "2020-10-29T08:58:02.385Z",
"seq": 8
},
{
"id": "0c700d33-df10-426e-8f71-677f437923ec",
"@timestamp": "2020-10-29T16:33:10.240Z",
"seq": 9
}
]
```
</details>
2. Generate 3 identical documents `{foo: bar}`
```ts
const documents = generateDocuments({
docsCount: 3,
seed: () => ({ foo: 'bar' })
})
```
<details>
<summary>Generated docs</summary>
```JSON
[
{
"foo": "bar"
},
{
"foo": "bar"
},
{
"foo": "bar"
}
]
```
</details>
3. Generate 5 documents with a custom ingested timestamp and no interval. If the interval is not defined, the timestamp will be the current time
```ts
const documents = generateDocuments({
docsCount: 5,
seed: (i, id, timestamp) => ({ foo: 'bar', event: { ingested: timestamp } })
})
```
<details>
<summary>Generated docs</summary>
```JSON
[
{
"foo": "bar",
"event": {
"ingested": "2023-02-15T13:02:37.531Z"
}
},
{
"foo": "bar",
"event": {
"ingested": "2023-02-15T13:02:37.531Z"
}
},
{
"foo": "bar",
"event": {
"ingested": "2023-02-15T13:02:37.531Z"
}
},
{
"foo": "bar",
"event": {
"ingested": "2023-02-15T13:02:37.531Z"
}
},
{
"foo": "bar",
"event": {
"ingested": "2023-02-15T13:02:37.531Z"
}
}
]
```
</details>
4. Generate 4 documents with a custom id based on the sequential number
```ts
const documents = generateDocuments({
docsCount: 4,
seed: (i) => ({ foo: 'bar', id: `id-${i}`})
})
```
<details>
<summary>Generated docs</summary>
```JSON
[
{
"foo": "bar",
"id": "id-0"
},
{
"foo": "bar",
"id": "id-1"
},
{
"foo": "bar",
"id": "id-2"
},
{
"foo": "bar",
"id": "id-3"
}
]
```
</details>
### **enhanceDocument**
Adds a generated `uuidv4` id and the current time as `@timestamp` to the document if the `id` and `timestamp` params are not specified
`EnhanceDocumentOptions`
| Property | Description | Type |
| --------------- | ------------------------------------------------------ | ------ |
| id | id for document | `string` _(optional)_ |
| timestamp | timestamp for document | `string` _(optional)_ |
| document | document to enhance | `Record<string, unknown>` |
Examples:
1. Enhance document with generated `uuidv4` id and current time as `@timestamp`
```ts
const document = enhanceDocument({
document: { foo: 'bar' },
});
```
<details>
<summary>document</summary>
```JSON
{
"foo": "bar",
"id": "b501a64f-0dd4-4275-a38c-889be6a15a4d",
"@timestamp": "2023-02-15T17:21:21.429Z"
}
```
</details>
2. Enhance document with generated `uuidv4` id and predefined timestamp
```ts
const document = enhanceDocument({
timestamp: '1996-02-15T13:02:37.531Z',
document: { foo: 'bar' },
});
```
<details>
<summary>document</summary>
```JSON
{
"foo": "bar",
"id": "7b7460bf-e173-4744-af15-2c01ac52963b",
"@timestamp": "1996-02-15T13:02:37.531Z"
}
```
</details>
3. Enhance document with a predefined id and the current time as `@timestamp`
```ts
const document = enhanceDocument({
id: 'test-id',
document: { foo: 'bar' },
});
```
<details>
<summary>document</summary>
```JSON
{
"foo": "bar",
"id": "test-id",
"@timestamp": "2023-02-15T17:21:21.429Z"
}
```
</details>
### **enhanceDocuments**
Adds a generated `uuidv4` `id` property to the list of documents if the `id` parameter is not specified.
Adds `@timestamp` within the defined interval to the list of documents. If the interval is not specified, `@timestamp` will be set to the current time
| Property | Description | Type |
| --------------- | ------------------------------------------------------ | ------ |
| documents | documents to enhance | `Record<string, unknown>[]` |
| id | id for documents | `string` _(optional)_ |
| interval | interval in which to generate documents, defined by the '@timestamp' field | `[string \| Date, string \| Date]` _(optional)_ |
Examples:
1. Enhance documents with generated `uuidv4` id and current time as `@timestamp`
```ts
const documents = enhanceDocuments({
documents: [{ foo: 'bar' }, { foo: 'bar-1' }, { foo: 'bar-2' }]
});
```
<details>
<summary>documents</summary>
```JSON
[
{
"foo": "bar",
"id": "c55ddd6b-3cf2-4ebf-94d6-4eeeb4e5b655",
"@timestamp": "2023-02-16T16:43:13.573Z"
},
{
"foo": "bar-1",
"id": "61b157b9-5f1f-4d99-a5bf-072069f5139d",
"@timestamp": "2023-02-16T16:43:13.573Z"
},
{
"foo": "bar-2",
"id": "04929927-6d9e-4ccc-b083-250e3fe2d7a7",
"@timestamp": "2023-02-16T16:43:13.573Z"
}
]
```
</details>
2. Enhance documents with a generated `uuidv4` id and a timestamp in the predefined interval
```ts
const documents = enhanceDocuments({
interval: ['1996-02-15T13:02:37.531Z', '2000-02-15T13:02:37.531Z'],
documents: [{ foo: 'bar' }, { foo: 'bar-1' }, { foo: 'bar-2' }]
});
```
<details>
<summary>documents</summary>
```JSON
[
{
"foo": "bar",
"id": "883a67cb-0a57-4711-bdf9-e8a394a52460",
"@timestamp": "1998-07-04T15:16:46.587Z"
},
{
"foo": "bar-1",
"id": "70691d9e-1030-412f-8ae1-c6db50e90e91",
"@timestamp": "1998-05-15T07:00:52.339Z"
},
{
"foo": "bar-2",
"id": "b2140328-5cc4-4532-947e-30b8fd830ed7",
"@timestamp": "1999-09-01T21:50:38.957Z"
}
]
```
</details>
3. Enhance documents with a predefined id and the current time as `@timestamp`
```ts
const documents = enhanceDocuments({
id: 'test-id',
documents: [{ foo: 'bar' }, { foo: 'bar-1' }, { foo: 'bar-2' }]
});
```
<details>
<summary>documents</summary>
```JSON
[
{
"foo": "bar",
"id": "test-id",
"@timestamp": "2023-02-16T16:43:13.574Z"
},
{
"foo": "bar-1",
"id": "test-id",
"@timestamp": "2023-02-16T16:43:13.574Z"
},
{
"foo": "bar-2",
"id": "test-id",
"@timestamp": "2023-02-16T16:43:13.574Z"
}
]
```
</details>
## Usage
### create test query rule that queries indexed documents within a test
Once documents are generated and indexed, there might be a need to create a test rule that targets only these documents, so that documents generated in a test are used only in the context of that test.
There are a few possible ways to do this:
1. Create a new index for every test, so the newly indexed documents are the only documents present in the test index. This can be a costly operation, as it requires creating a new index for each test (and re-initializing dataGeneratorFactory), or deleting the index after the rule's run.
2. Use the same id or specific field in documents.
For example:
```ts
const id = uuidv4();
const firstTimestamp = new Date().toISOString();
const firstDocument = {
id,
'@timestamp': firstTimestamp,
agent: {
name: 'agent-1',
},
};
await indexListOfDocuments([firstDocument, firstDocument]);
const rule: QueryRuleCreateProps = {
...getRuleForSignalTesting(['ecs_compliant']),
query: `id:${id}`,
};
```
All documents will have the same `id` and can be queried with `id:${id}`
3. Use utility method `getKQLQueryFromDocumentList` that will create query from all ids in generated documents
```ts
const { documents } = await indexGeneratedDocuments({
docsCount: 4,
document: { foo: 'bar' },
enhance: true,
});
const query = getKQLQueryFromDocumentList(documents);
const rule = {
...getRuleForSignalTesting(['ecs_non_compliant']),
query,
};
```
The util will generate the following query: `(id: "f6ca3ee1-407c-4685-a94b-11ef4ed5136b" or id: "2a7358b2-8cad-47ce-83b7-e4418c266f3e" or id: "9daec569-0ba1-4c46-a0c6-e340cee1c5fb" or id: "b03c2fdf-0ca1-447c-b8c6-2cc5a663ffe2")`, which will include all generated documents

View file

@ -0,0 +1,75 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import type { Client } from '@elastic/elasticsearch';
import { ToolingLog } from '@kbn/tooling-log';
import type { BulkResponse } from '@elastic/elasticsearch/lib/api/typesWithBodyKey';
import { indexDocuments } from './index_documents';
import { generateDocuments } from './generate_documents';
import { enhanceDocuments, EnhanceDocumentsOptions } from './enhance_documents';
import type { GenerateDocumentsParams } from './generate_documents';
import type { Document } from './types';
interface DataGeneratorParams {
es: Client;
documents: Array<Record<string, unknown>>;
index: string;
log: ToolingLog;
}
interface DataGeneratorResponse {
response: BulkResponse;
documents: Document[];
}
interface DataGenerator {
indexListOfDocuments: (docs: Document[]) => Promise<DataGeneratorResponse>;
indexGeneratedDocuments: (params: GenerateDocumentsParams) => Promise<DataGeneratorResponse>;
indexEnhancedDocuments: (params: EnhanceDocumentsOptions) => Promise<DataGeneratorResponse>;
}
/**
* initialize {@link DataGenerator}
* @param param.es - ES client
* @param params.index - index where document will be added
* @param params.log - logClient
* @returns methods of {@link DataGenerator}
*/
export const dataGeneratorFactory = ({
es,
index,
log,
}: Omit<DataGeneratorParams, 'documents'>): DataGenerator => {
return {
indexListOfDocuments: async (documents: DataGeneratorParams['documents']) => {
const response = await indexDocuments({ es, index, documents, log });
return {
documents,
response,
};
},
indexGeneratedDocuments: async (params: GenerateDocumentsParams) => {
const documents = generateDocuments(params);
const response = await indexDocuments({ es, index, documents, log });
return {
documents,
response,
};
},
indexEnhancedDocuments: async (params: EnhanceDocumentsOptions) => {
const documents = enhanceDocuments(params);
const response = await indexDocuments({ es, index, documents, log });
return {
documents,
response,
};
},
};
};

View file

@ -0,0 +1,29 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { v4 as uuidv4 } from 'uuid';
interface EnhanceDocumentOptions {
id?: string;
timestamp?: string;
document: Record<string, unknown>;
}
/**
* enhances document with generated id and timestamp
* @param {string} options.id - optional id, if not provided randomly generated
* @param {string} options.timestamp - optional timestamp of document, if not provided current time
* @param {Record<string, unknown>} options.document - document that will be enhanced
*/
export const enhanceDocument = (options: EnhanceDocumentOptions) => {
const id = options?.id ?? uuidv4();
const timestamp = options?.timestamp ?? new Date().toISOString();
return {
...options.document,
id,
'@timestamp': timestamp,
};
};

View file

@ -0,0 +1,32 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import type { IndexingInterval, Document } from './types';
import { getTimestamp } from './get_timestamp';
import { enhanceDocument } from './enhance_document';
export interface EnhanceDocumentsOptions {
interval?: IndexingInterval;
documents: Document[];
id?: string;
}
/**
* enhances documents with generated id and timestamp within interval
* @param {string} options.id - optional id, if not provided randomly generated
* @param {string} options.interval - optional interval of document, if not provided set as a current time
* @param {Record<string, unknown>[]} options.documents - documents that will be enhanced
*/
export const enhanceDocuments = ({ documents, interval, id }: EnhanceDocumentsOptions) => {
return documents.map((document) =>
enhanceDocument({
document,
id,
timestamp: getTimestamp(interval),
})
);
};

View file

@ -0,0 +1,39 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { v4 as uuidv4 } from 'uuid';
import { getTimestamp } from './get_timestamp';
import type { Document, IndexingInterval } from './types';
type DocumentSeedFunc = (index: number, id: string, timestamp: string) => Document;
export interface GenerateDocumentsParams {
interval?: IndexingInterval;
docsCount: number;
seed: DocumentSeedFunc;
}
/**
*
* @param param.interval - interval in which generate documents, defined by '@timestamp' field
* @param param.docsCount - number of document to generate
* @param param.seed - seed function. Function that receives index of document, generated id, timestamp as arguments and can used it create a document
* @returns generated Documents
*/
export const generateDocuments = ({ docsCount, interval, seed }: GenerateDocumentsParams) => {
const documents = [];
for (let i = 0; i < docsCount; i++) {
const id = uuidv4();
const timestamp = getTimestamp(interval);
documents.push(seed(i, id, timestamp));
}
return documents;
};

View file

@ -0,0 +1,41 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import type { Document } from './types';
/**
* returns KQL query from a list documents that includes all documents by their ids.
* it can be used later to create test rules that will query only these documents
* ```ts
* const documents = [
{
foo: 'bar',
id: 'f07df596-65ec-4ab1-b0b2-f3b69558ed26',
'@timestamp': '2020-10-29T07:10:51.989Z',
},
{
foo: 'bar',
id: 'e07614f9-1dc5-4849-90c4-31362bbdf8d0',
'@timestamp': '2020-10-30T00:32:48.987Z',
},
{
foo: 'test',
id: 'e03a5b12-77e6-4aa3-b0be-fbe5b0843f07',
'@timestamp': '2020-10-29T03:40:35.318Z',
},
];
const query = getKQLQueryFromDocumentList(documents);
// query equals to
// (id: "f07df596-65ec-4ab1-b0b2-f3b69558ed26" or id: "e07614f9-1dc5-4849-90c4-31362bbdf8d0" or id: "e03a5b12-77e6-4aa3-b0be-fbe5b0843f07")
* ```
*/
export const getKQLQueryFromDocumentList = (documents: Document[]) => {
const orClauses = documents.map(({ id }) => `id: "${id}"`).join(' or ');
return `(${orClauses})`;
};

View file

@ -0,0 +1,16 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import faker from 'faker';
import type { IndexingInterval } from './types';
export const getTimestamp = (interval?: IndexingInterval) => {
if (interval) {
return faker.date.between(...interval).toISOString();
}
return new Date().toISOString();
};

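A minimal usage sketch of `getTimestamp` (the interval values are illustrative):

```ts
// Random ISO timestamp inside a fixed two-day window
const inWindow = getTimestamp(['2020-10-28T07:30:00.000Z', '2020-10-30T07:30:00.000Z']);

// No interval: falls back to the current time
const now = getTimestamp();
```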
View file

@ -0,0 +1,14 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
export * from './data_generator_factory';
export * from './enhance_document';
export * from './enhance_documents';
export * from './generate_documents';
export * from './get_kql_query_from_documents_list';
export * from './get_timestamp';
export * from './index_documents';

View file

@ -0,0 +1,39 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import type { Client } from '@elastic/elasticsearch';
import type { BulkResponse } from '@elastic/elasticsearch/lib/api/typesWithBodyKey';
import { ToolingLog } from '@kbn/tooling-log';
interface IndexDocumentsParams {
es: Client;
documents: Array<Record<string, unknown>>;
index: string;
log: ToolingLog;
}
type IndexDocuments = (params: IndexDocumentsParams) => Promise<BulkResponse>;
/**
* Indexes documents into provided index
*/
export const indexDocuments: IndexDocuments = async ({ es, documents, index, log }) => {
const operations = documents.flatMap((doc: object) => [{ index: { _index: index } }, doc]);
const response = await es.bulk({ refresh: true, operations });
// throw error if document wasn't indexed, so test will be terminated earlier and no false positives can happen
response.items.some(({ index: responseIndex } = {}) => {
if (responseIndex?.error) {
log.error(
`Failed to index document in non_ecs_fields test suits: "${responseIndex.error?.reason}"`
);
throw Error(responseIndex.error.message);
}
});
return response;
};

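A minimal usage sketch of `indexDocuments`, assuming `es` and `log` come from the FTR services (`getService('es')` / `getService('log')`) and the target index already has mappings loaded:

```ts
const response = await indexDocuments({
  es,                      // Elasticsearch client from getService('es')
  log,                     // ToolingLog from getService('log')
  index: 'ecs_compliant',  // index used by the risk engine suites (illustrative)
  documents: [{ host: { name: 'host-1' } }, { host: { name: 'host-2' } }],
});
// response is the raw BulkResponse; a failed item throws before this point
```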
View file

@ -0,0 +1,10 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
export type IndexingInterval = [string | Date, string | Date];
export type Document = Record<string, unknown>;

View file

@ -8,6 +8,8 @@ export * from './rules';
export * from './exception_list_and_item';
export * from './alerts';
export * from './actions';
export * from './data_generator';
export * from './rules/get_rule_so_by_id';
export * from './rules/create_rule_saved_object';
export * from './rules/get_rule_with_legacy_investigation_fields';
@ -18,4 +20,5 @@ export * from './count_down_es';
export * from './update_username';
export * from './refresh_index';
export * from './wait_for';
export * from './route_with_namespace';
export * from './wait_for_index_to_populate';

View file

@ -0,0 +1,21 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { FtrConfigProviderContext } from '@kbn/test';
export default async function ({ readConfigFile }: FtrConfigProviderContext) {
const functionalConfig = await readConfigFile(
require.resolve('../../../../../config/ess/config.base.trial')
);
return {
...functionalConfig.getAll(),
testFiles: [require.resolve('..')],
junit: {
reportName: 'Entity Analytics API Integration Tests - ESS - Risk Engine',
},
};
}

View file

@ -0,0 +1,15 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { createTestConfig } from '../../../../../config/serverless/config.base';
export default createTestConfig({
testFiles: [require.resolve('..')],
junit: {
reportName: 'Entity Analytics API Integration Tests - Serverless - Risk Engine',
},
});

View file

@ -0,0 +1,19 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { FtrProviderContext } from '../../../../ftr_provider_context';
export default function ({ loadTestFile }: FtrProviderContext) {
describe('Entity Analytics - Risk Engine', function () {
loadTestFile(require.resolve('./init_and_status_apis'));
loadTestFile(require.resolve('./risk_score_calculation'));
loadTestFile(require.resolve('./risk_score_preview'));
loadTestFile(require.resolve('./risk_scoring_task/task_execution'));
loadTestFile(require.resolve('./risk_scoring_task/task_execution_nondefault_spaces'));
loadTestFile(require.resolve('./telemetry_usage'));
});
}

View file

@ -7,7 +7,7 @@
import expect from '@kbn/expect';
import { riskEngineConfigurationTypeName } from '@kbn/security-solution-plugin/server/lib/entity_analytics/risk_engine/saved_object';
import { FtrProviderContext } from '../../../common/ftr_provider_context';
import {
legacyTransformIds,
createLegacyTransforms,
@ -17,9 +17,9 @@ import {
getLegacyRiskScoreDashboards,
clearLegacyDashboards,
cleanRiskEngine,
} from './utils';
} from '../../utils';
import { FtrProviderContext } from '../../../../ftr_provider_context';
// eslint-disable-next-line import/no-default-export
export default ({ getService }: FtrProviderContext) => {
const es = getService('es');
const supertest = getService('supertest');
@ -27,7 +27,7 @@ export default ({ getService }: FtrProviderContext) => {
const riskEngineRoutes = riskEngineRouteHelpersFactory(supertest);
const log = getService('log');
describe('Risk Engine', () => {
describe('@ess @serverless init_and_status_apis', () => {
beforeEach(async () => {
await cleanRiskEngine({ kibanaServer, es, log });
});

View file

@ -6,12 +6,16 @@
*/
import expect from '@kbn/expect';
import { X_ELASTIC_INTERNAL_ORIGIN_REQUEST } from '@kbn/core-http-common';
import { RISK_SCORE_CALCULATION_URL } from '@kbn/security-solution-plugin/common/constants';
import type { RiskScore } from '@kbn/security-solution-plugin/common/risk_engine';
import { v4 as uuidv4 } from 'uuid';
import { FtrProviderContext } from '../../../common/ftr_provider_context';
import { deleteAllAlerts, deleteAllRules } from '../../../utils';
import { dataGeneratorFactory } from '../../../utils/data_generator';
import {
deleteAllAlerts,
deleteAllRules,
dataGeneratorFactory,
} from '../../../detections_response/utils';
import {
buildDocument,
createAndSyncRuleAndAlertsFactory,
@ -19,9 +23,9 @@ import {
readRiskScores,
normalizeScores,
waitForRiskScoresToBePresent,
} from './utils';
} from '../../utils';
import { FtrProviderContext } from '../../../../ftr_provider_context';
// eslint-disable-next-line import/no-default-export
export default ({ getService }: FtrProviderContext): void => {
const supertest = getService('supertest');
const esArchiver = getService('esArchiver');
@ -39,6 +43,7 @@ export default ({ getService }: FtrProviderContext): void => {
.post(RISK_SCORE_CALCULATION_URL)
.set('kbn-xsrf', 'true')
.set('elastic-api-version', '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send(body)
.expect(200);
return result;
@ -63,7 +68,7 @@ export default ({ getService }: FtrProviderContext): void => {
});
};
describe('Risk Engine - Risk Scoring Calculation API', () => {
describe('@ess @serverless Risk Scoring Calculation API', () => {
context('with auditbeat data', () => {
const { indexListOfDocuments } = dataGeneratorFactory({
es,

View file

@ -10,17 +10,22 @@ import { ALERT_RISK_SCORE } from '@kbn/rule-data-utils';
import { RISK_SCORE_PREVIEW_URL } from '@kbn/security-solution-plugin/common/constants';
import type { RiskScore } from '@kbn/security-solution-plugin/common/risk_engine';
import { v4 as uuidv4 } from 'uuid';
import { FtrProviderContext } from '../../../common/ftr_provider_context';
import { createSignalsIndex, deleteAllAlerts, deleteAllRules } from '../../../utils';
import { dataGeneratorFactory } from '../../../utils/data_generator';
import { X_ELASTIC_INTERNAL_ORIGIN_REQUEST } from '@kbn/core-http-common';
import {
createAlertsIndex,
deleteAllAlerts,
deleteAllRules,
dataGeneratorFactory,
} from '../../../detections_response/utils';
import {
buildDocument,
createAndSyncRuleAndAlertsFactory,
deleteAllRiskScores,
sanitizeScores,
} from './utils';
} from '../../utils';
import { FtrProviderContext } from '../../../../ftr_provider_context';
// eslint-disable-next-line import/no-default-export
export default ({ getService }: FtrProviderContext): void => {
const supertest = getService('supertest');
const esArchiver = getService('esArchiver');
@ -37,6 +42,7 @@ export default ({ getService }: FtrProviderContext): void => {
const { body: result } = await supertest
.post(RISK_SCORE_PREVIEW_URL)
.set('elastic-api-version', '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.set('kbn-xsrf', 'true')
.send({ ...defaultBody, ...body })
.expect(200);
@ -56,7 +62,7 @@ export default ({ getService }: FtrProviderContext): void => {
return await previewRiskScores({ body: {} });
};
describe('Risk Engine - Risk Scoring Preview API', () => {
describe('@ess @serverless Risk Scoring Preview API', () => {
context('with auditbeat data', () => {
const { indexListOfDocuments } = dataGeneratorFactory({
es,
@ -78,7 +84,7 @@ export default ({ getService }: FtrProviderContext): void => {
await deleteAllAlerts(supertest, log, es);
await deleteAllRules(supertest, log);
await createSignalsIndex(supertest, log);
await createAlertsIndex(supertest, log);
});
afterEach(async () => {

View file

@ -0,0 +1,16 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import { GenericFtrProviderContext } from '@kbn/test';
import { SpacesServiceProvider } from '../../../../../../common/services/spaces';
import { services as serverlessServices } from '../../../../../../../test_serverless/api_integration/services';
const services = {
...serverlessServices,
spaces: SpacesServiceProvider,
};
export type FtrProviderContextWithSpaces = GenericFtrProviderContext<typeof services, {}>;

View file

@ -7,9 +7,11 @@
import expect from '@kbn/expect';
import { v4 as uuidv4 } from 'uuid';
import { FtrProviderContext } from '../../../common/ftr_provider_context';
import { deleteAllAlerts, deleteAllRules } from '../../../utils';
import { dataGeneratorFactory } from '../../../utils/data_generator';
import {
deleteAllAlerts,
deleteAllRules,
dataGeneratorFactory,
} from '../../../../detections_response/utils';
import {
buildDocument,
createAndSyncRuleAndAlertsFactory,
@ -20,11 +22,10 @@ import {
updateRiskEngineConfigSO,
getRiskEngineTask,
waitForRiskEngineTaskToBeGone,
deleteRiskScoreIndices,
cleanRiskEngine,
} from './utils';
} from '../../../utils';
import { FtrProviderContext } from '../../../../../ftr_provider_context';
// eslint-disable-next-line import/no-default-export
export default ({ getService }: FtrProviderContext): void => {
const supertest = getService('supertest');
const esArchiver = getService('esArchiver');
@ -35,7 +36,7 @@ export default ({ getService }: FtrProviderContext): void => {
const createAndSyncRuleAndAlerts = createAndSyncRuleAndAlertsFactory({ supertest, log });
const riskEngineRoutes = riskEngineRouteHelpersFactory(supertest);
describe('Risk Engine - Risk Scoring Task', () => {
describe('@ess @serverless Risk Scoring Task Execution', () => {
context('with auditbeat data', () => {
const { indexListOfDocuments } = dataGeneratorFactory({
es,
@ -92,7 +93,7 @@ export default ({ getService }: FtrProviderContext): void => {
await riskEngineRoutes.init();
});
it('calculates and persists risk scores for alert documents', async () => {
it('@skipInQA calculates and persists risk scores for alert documents', async () => {
await waitForRiskScoresToBePresent({ es, log, scoreCount: 10 });
const scores = await readRiskScores(es);
@ -103,7 +104,7 @@ export default ({ getService }: FtrProviderContext): void => {
);
});
it('starts the latest transform', async () => {
it('@skipInQA starts the latest transform', async () => {
await waitForRiskScoresToBePresent({ es, log, scoreCount: 10 });
const transformStats = await es.transform.getTransformStats({
@ -113,7 +114,7 @@ export default ({ getService }: FtrProviderContext): void => {
expect(transformStats.transforms[0].state).to.eql('started');
});
describe('disabling and re-enabling the risk engine', () => {
describe('@skipInQA disabling and re-enabling the risk engine', () => {
beforeEach(async () => {
await waitForRiskScoresToBePresent({ es, log, scoreCount: 10 });
await riskEngineRoutes.disable();
@ -135,7 +136,7 @@ export default ({ getService }: FtrProviderContext): void => {
});
});
describe('disabling the risk engine', () => {
describe('@skipInQA disabling the risk engine', () => {
beforeEach(async () => {
await waitForRiskScoresToBePresent({ es, log, scoreCount: 10 });
});
@ -214,7 +215,7 @@ export default ({ getService }: FtrProviderContext): void => {
await riskEngineRoutes.init();
});
it('calculates and persists risk scores for both types of entities', async () => {
it('@skipInQA calculates and persists risk scores for both types of entities', async () => {
await waitForRiskScoresToBePresent({ es, log, scoreCount: 20 });
const riskScores = await readRiskScores(es);
@ -226,74 +227,6 @@ export default ({ getService }: FtrProviderContext): void => {
expect(scoredIdentifiers.includes('user.name')).to.be(true);
});
});
describe('with alerts in a non-default space', () => {
let namespace: string;
let index: string[];
let documentId: string;
let createAndSyncRuleAndAlertsForOtherSpace: ReturnType<
typeof createAndSyncRuleAndAlertsFactory
>;
beforeEach(async () => {
documentId = uuidv4();
namespace = uuidv4();
index = [`risk-score.risk-score-${namespace}`];
createAndSyncRuleAndAlertsForOtherSpace = createAndSyncRuleAndAlertsFactory({
supertest,
log,
namespace,
});
const riskEngineRoutesForNamespace = riskEngineRouteHelpersFactory(supertest, namespace);
const spaces = getService('spaces');
await spaces.create({
id: namespace,
name: namespace,
disabledFeatures: [],
});
const baseEvent = buildDocument({ host: { name: 'host-1' } }, documentId);
await indexListOfDocuments(
Array(10)
.fill(baseEvent)
.map((_baseEvent, _index) => ({
..._baseEvent,
'host.name': `host-${_index}`,
}))
);
await createAndSyncRuleAndAlertsForOtherSpace({
query: `id: ${documentId}`,
alerts: 10,
riskScore: 40,
});
await riskEngineRoutesForNamespace.init();
});
afterEach(async () => {
await getService('spaces').delete(namespace);
await deleteRiskScoreIndices({ log, es, namespace });
});
it('calculates and persists risk scores for alert documents', async () => {
await waitForRiskScoresToBePresent({
es,
log,
scoreCount: 10,
index,
});
const scores = await readRiskScores(es, index);
expect(normalizeScores(scores).map(({ id_value: idValue }) => idValue)).to.eql(
Array(10)
.fill(0)
.map((_, _index) => `host-${_index}`)
);
});
});
});
});
};

View file

@ -0,0 +1,133 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import expect from '@kbn/expect';
import { v4 as uuidv4 } from 'uuid';
import {
deleteAllAlerts,
deleteAllRules,
dataGeneratorFactory,
} from '../../../../detections_response/utils';
import {
buildDocument,
createAndSyncRuleAndAlertsFactory,
readRiskScores,
waitForRiskScoresToBePresent,
normalizeScores,
riskEngineRouteHelpersFactory,
cleanRiskEngine,
deleteRiskScoreIndices,
} from '../../../utils';
import { FtrProviderContextWithSpaces } from './ftr_provider_context_with_spaces';
export default ({ getService }: FtrProviderContextWithSpaces): void => {
const supertest = getService('supertest');
const esArchiver = getService('esArchiver');
const es = getService('es');
const log = getService('log');
const kibanaServer = getService('kibanaServer');
describe('@ess Risk Scoring Task in non-default space', () => {
context('with auditbeat data', () => {
const { indexListOfDocuments } = dataGeneratorFactory({
es,
index: 'ecs_compliant',
log,
});
before(async () => {
await esArchiver.load('x-pack/test/functional/es_archives/security_solution/ecs_compliant');
});
after(async () => {
await esArchiver.unload(
'x-pack/test/functional/es_archives/security_solution/ecs_compliant'
);
});
beforeEach(async () => {
await cleanRiskEngine({ kibanaServer, es, log });
await deleteAllAlerts(supertest, log, es);
await deleteAllRules(supertest, log);
});
afterEach(async () => {
await cleanRiskEngine({ kibanaServer, es, log });
await deleteAllAlerts(supertest, log, es);
await deleteAllRules(supertest, log);
});
describe('with alerts in a non-default space', () => {
let namespace: string;
let index: string[];
let documentId: string;
let createAndSyncRuleAndAlertsForOtherSpace: ReturnType<
typeof createAndSyncRuleAndAlertsFactory
>;
beforeEach(async () => {
documentId = uuidv4();
namespace = uuidv4();
index = [`risk-score.risk-score-${namespace}`];
createAndSyncRuleAndAlertsForOtherSpace = createAndSyncRuleAndAlertsFactory({
supertest,
log,
namespace,
});
const riskEngineRoutesForNamespace = riskEngineRouteHelpersFactory(supertest, namespace);
const spaces = getService('spaces');
await spaces.create({
id: namespace,
name: namespace,
disabledFeatures: [],
});
const baseEvent = buildDocument({ host: { name: 'host-1' } }, documentId);
await indexListOfDocuments(
Array(10)
.fill(baseEvent)
.map((_baseEvent, _index) => ({
..._baseEvent,
'host.name': `host-${_index}`,
}))
);
await createAndSyncRuleAndAlertsForOtherSpace({
query: `id: ${documentId}`,
alerts: 10,
riskScore: 40,
});
await riskEngineRoutesForNamespace.init();
});
afterEach(async () => {
await getService('spaces').delete(namespace);
await deleteRiskScoreIndices({ log, es, namespace });
});
it('calculates and persists risk scores for alert documents', async () => {
await waitForRiskScoresToBePresent({
es,
log,
scoreCount: 10,
index,
});
const scores = await readRiskScores(es, index);
expect(normalizeScores(scores).map(({ id_value: idValue }) => idValue)).to.eql(
Array(10)
.fill(0)
.map((_, _index) => `host-${_index}`)
);
});
});
});
});
};

View file

@ -7,18 +7,21 @@
import expect from '@kbn/expect';
import { v4 as uuidv4 } from 'uuid';
import type { FtrProviderContext } from '../../../../common/ftr_provider_context';
import { deleteAllRules, deleteAllAlerts, getRiskEngineStats } from '../../../utils';
import {
deleteAllRules,
deleteAllAlerts,
dataGeneratorFactory,
} from '../../../detections_response/utils';
import {
buildDocument,
createAndSyncRuleAndAlertsFactory,
waitForRiskScoresToBePresent,
riskEngineRouteHelpersFactory,
cleanRiskEngine,
} from './utils';
import { dataGeneratorFactory } from '../../../utils/data_generator';
getRiskEngineStats,
} from '../../utils';
import { FtrProviderContext } from '../../../../ftr_provider_context';
// eslint-disable-next-line import/no-default-export
export default ({ getService }: FtrProviderContext) => {
const supertest = getService('supertest');
const esArchiver = getService('esArchiver');
@ -29,7 +32,7 @@ export default ({ getService }: FtrProviderContext) => {
const createAndSyncRuleAndAlerts = createAndSyncRuleAndAlertsFactory({ supertest, log });
const riskEngineRoutes = riskEngineRouteHelpersFactory(supertest);
describe('Risk engine telemetry', async () => {
describe('@ess @serverless telemetry', async () => {
const { indexListOfDocuments } = dataGeneratorFactory({
es,
index: 'ecs_compliant',
@ -122,8 +125,6 @@ export default ({ getService }: FtrProviderContext) => {
all_host_risk_scores_total_day: 10,
};
expect(otherStats).to.eql(expected);
expect(allRiskScoreIndexSize).to.be.greaterThan(0);
expect(uniqueRiskScoreIndexSize).to.be.greaterThan(0);
});
});
});

View file

@ -0,0 +1,73 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
import type { ToolingLog } from '@kbn/tooling-log';
import type SuperTest from 'supertest';
import type { DetectionMetrics } from '@kbn/security-solution-plugin/server/usage/detections/types';
import type { RiskEngineMetrics } from '@kbn/security-solution-plugin/server/usage/risk_engine/types';
import {
ELASTIC_HTTP_VERSION_HEADER,
X_ELASTIC_INTERNAL_ORIGIN_REQUEST,
} from '@kbn/core-http-common';
import { getStatsUrl } from '../../../../detection_engine_api_integration/utils/get_stats_url';
import {
getDetectionMetricsFromBody,
getRiskEngineMetricsFromBody,
} from '../../../../detection_engine_api_integration/utils/get_detection_metrics_from_body';
/**
* Gets the stats from the stats endpoint.
* @param supertest The supertest agent.
* @returns The detection metrics
*/
export const getStats = async (
supertest: SuperTest.SuperTest<SuperTest.Test>,
log: ToolingLog
): Promise<DetectionMetrics> => {
const response = await supertest
.post(getStatsUrl())
.set('kbn-xsrf', 'true')
.set(ELASTIC_HTTP_VERSION_HEADER, '2')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send({ unencrypted: true, refreshCache: true });
if (response.status !== 200) {
log.error(
`Did not get an expected 200 "ok" when getting the stats for detections. CI issues could happen. Suspect this line if you are seeing CI issues. body: ${JSON.stringify(
response.body
)}, status: ${JSON.stringify(response.status)}`
);
}
return getDetectionMetricsFromBody(response.body);
};
/**
* Gets the stats from the stats endpoint.
* @param supertest The supertest agent.
* @returns The detection metrics
*/
export const getRiskEngineStats = async (
supertest: SuperTest.SuperTest<SuperTest.Test>,
log: ToolingLog
): Promise<RiskEngineMetrics> => {
const response = await supertest
.post(getStatsUrl())
.set('kbn-xsrf', 'true')
.set(ELASTIC_HTTP_VERSION_HEADER, '2')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send({ unencrypted: true, refreshCache: true });
if (response.status !== 200) {
log.error(
`Did not get an expected 200 "ok" when getting the stats for risk engine. CI issues could happen. Suspect this line if you are seeing CI issues. body: ${JSON.stringify(
response.body
)}, status: ${JSON.stringify(response.status)}`
);
}
return getRiskEngineMetricsFromBody(response.body);
};

View file

@ -0,0 +1,8 @@
/*
* Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
* or more contributor license agreements. Licensed under the Elastic License
* 2.0; you may not use this file except in compliance with the Elastic License
* 2.0.
*/
export * from './risk_engine';
export * from './get_risk_engine_stats';

View file

@ -5,7 +5,10 @@
* 2.0.
*/
import { ELASTIC_HTTP_VERSION_HEADER } from '@kbn/core-http-common';
import {
ELASTIC_HTTP_VERSION_HEADER,
X_ELASTIC_INTERNAL_ORIGIN_REQUEST,
} from '@kbn/core-http-common';
import { v4 as uuidv4 } from 'uuid';
import SuperTest from 'supertest';
import type { Client } from '@elastic/elasticsearch';
@ -21,13 +24,13 @@ import {
} from '@kbn/security-solution-plugin/common/constants';
import {
createRule,
waitForSignalsToBePresent,
waitForAlertsToBePresent,
waitForRuleSuccess,
getRuleForSignalTesting,
getRuleForAlertTesting,
countDownTest,
waitFor,
routeWithNamespace,
} from '../../../utils';
} from '../../detections_response/utils';
const sanitizeScore = (score: Partial<RiskScore>): Partial<RiskScore> => {
delete score['@timestamp'];
@ -79,7 +82,7 @@ export const createAndSyncRuleAndAlertsFactory =
query: string;
riskScoreOverride?: string;
}): Promise<void> => {
const rule = getRuleForSignalTesting(['ecs_compliant']);
const rule = getRuleForAlertTesting(['ecs_compliant']);
const { id } = await createRule(
supertest,
log,
@ -99,7 +102,7 @@ export const createAndSyncRuleAndAlertsFactory =
namespace
);
await waitForRuleSuccess({ supertest, log, id, namespace });
await waitForSignalsToBePresent(supertest, log, alerts, [id], namespace);
await waitForAlertsToBePresent(supertest, log, alerts, [id], namespace);
};
export const deleteRiskScoreIndices = async ({
@ -399,6 +402,7 @@ export const clearLegacyDashboards = async ({
'/internal/risk_score/prebuilt_content/saved_objects/_bulk_delete/hostRiskScoreDashboards'
)
.set('kbn-xsrf', 'true')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send()
.expect(200);
@ -407,6 +411,7 @@ export const clearLegacyDashboards = async ({
'/internal/risk_score/prebuilt_content/saved_objects/_bulk_delete/userRiskScoreDashboards'
)
.set('kbn-xsrf', 'true')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send()
.expect(200);
} catch (e) {
@ -468,6 +473,7 @@ export const riskEngineRouteHelpersFactory = (
.post(routeWithNamespace(RISK_ENGINE_INIT_URL, namespace))
.set('kbn-xsrf', 'true')
.set('elastic-api-version', '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send()
.expect(200),
@ -484,6 +490,7 @@ export const riskEngineRouteHelpersFactory = (
.post(routeWithNamespace(RISK_ENGINE_ENABLE_URL, namespace))
.set('kbn-xsrf', 'true')
.set('elastic-api-version', '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send()
.expect(200),
@ -492,6 +499,7 @@ export const riskEngineRouteHelpersFactory = (
.post(routeWithNamespace(RISK_ENGINE_DISABLE_URL, namespace))
.set('kbn-xsrf', 'true')
.set('elastic-api-version', '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send()
.expect(200),
});
@ -505,6 +513,7 @@ export const installLegacyRiskScore = async ({
.post('/internal/risk_score')
.set('kbn-xsrf', 'true')
.set(ELASTIC_HTTP_VERSION_HEADER, '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send({ riskScoreEntity: 'host' })
.expect(200);
@ -512,6 +521,7 @@ export const installLegacyRiskScore = async ({
.post('/internal/risk_score')
.set('kbn-xsrf', 'true')
.set(ELASTIC_HTTP_VERSION_HEADER, '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send({ riskScoreEntity: 'user' })
.expect(200);
@ -521,6 +531,7 @@ export const installLegacyRiskScore = async ({
)
.set('kbn-xsrf', 'true')
.set(ELASTIC_HTTP_VERSION_HEADER, '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send()
.expect(200);
@ -530,6 +541,7 @@ export const installLegacyRiskScore = async ({
)
.set('kbn-xsrf', 'true')
.set(ELASTIC_HTTP_VERSION_HEADER, '1')
.set(X_ELASTIC_INTERNAL_ORIGIN_REQUEST, 'kibana')
.send()
.expect(200);
};

View file

@ -30,6 +30,7 @@
"@kbn/core-saved-objects-server",
"@kbn/core",
"@kbn/alerting-plugin",
"@kbn/core-http-common",
"@kbn/securitysolution-ecs"
]
}