[TSDB] Removed summary and histogram metric types (#89937)

It seems that, for now, we don't have a good use for the histogram and summary metric types.
They had been left as placeholders for a while, but at this point there is no concrete plan forward for them.

This PR removes the histogram and summary metric types. We may add them back in the future.

Also, this PR completely removes the time_series_metric mapping parameter from the histogram field type, and allows only the gauge metric type for aggregate_metric_double fields.
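For reference, a minimal self-contained sketch of the metric type enum as it stands after this change, mirroring the TimeSeriesParams diff below (an illustrative model, not the production class):

// Sketch of the trimmed enum from org.elasticsearch.index.mapper.TimeSeriesParams
// after this PR: histogram and summary are gone, leaving gauge and counter.
public class MetricTypeSketch {

    enum MetricType {
        gauge(new String[] { "max", "min", "value_count", "sum" }),
        counter(new String[] { "last_value" });

        private final String[] supportedAggs;

        MetricType(String[] supportedAggs) {
            this.supportedAggs = supportedAggs;
        }

        String[] supportedAggs() {
            return supportedAggs;
        }
    }

    public static void main(String[] args) {
        for (MetricType type : MetricType.values()) {
            System.out.println(type + " supports: " + String.join(", ", type.supportedAggs()));
        }
    }
}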
Christos Soulios 2022-09-09 15:04:30 +03:00 committed by GitHub
parent 715476547f
commit 1a709caa65
23 changed files with 38 additions and 222 deletions

@@ -89,7 +89,7 @@ A TSDS document is uniquely identified by its time series and timestamp, both of
which are used to generate the document `_id`. So, two documents with the same
dimensions and the same timestamp are considered to be duplicates. When you use
the `_bulk` endpoint to add documents to a TSDS, a second document with the same
timestamp and dimensions overwrites the first. When you use the
`PUT /<target>/_create/<_id>` format to add an individual document and a document
with the same `_id` already exists, an error is generated.
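To make the overwrite behavior concrete, here is a hedged, self-contained illustration of the idea; this is not Elasticsearch's actual _id encoding, just a sketch of why equal dimensions plus an equal timestamp always yield the same _id:

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;
import java.util.TreeMap;

public class TsdbIdSketch {

    // Derive a document id from sorted dimension values plus the timestamp.
    // Equal inputs always produce equal ids, which is why a second _bulk
    // document with the same dimensions and timestamp overwrites the first.
    static String docId(TreeMap<String, String> dimensions, long timestampMillis) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        dimensions.forEach((k, v) -> digest.update((k + "=" + v + ";").getBytes(StandardCharsets.UTF_8)));
        digest.update(Long.toString(timestampMillis).getBytes(StandardCharsets.UTF_8));
        return Base64.getUrlEncoder().withoutPadding().encodeToString(digest.digest());
    }

    public static void main(String[] args) throws Exception {
        TreeMap<String, String> dims = new TreeMap<>(); // sample dimension values
        dims.put("metricset", "pod");
        dims.put("k8s.pod.uid", "947e4ced-1786-4e53-9e0c-5c447e959507");
        long ts = 1619740800000L; // a fixed timestamp
        System.out.println(docId(dims, ts).equals(docId(dims, ts))); // prints: true
    }
}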
@@ -105,16 +105,16 @@ parameter:
* <<number,`long`>>
* <<number,`unsigned_long`>>
[[dimension-limits]]
.Dimension limits
****
In a TSDS, {es} uses dimensions to
generate the document `_id` and <<tsid,`_tsid`>> values. The resulting `_id` is
always a short encoded hash. To prevent the `_tsid` value from being overly
large, {es} limits the number of dimensions for an index using the
<<index-mapping-dimension-fields-limit,`index.mapping.dimension_fields.limit`>>
index setting. While you can increase this limit, the resulting document `_tsid`
value can't exceed 32KB.
****
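A hedged sketch of the constraint described in the sidebar above; the 32KB figure comes from the docs, while the validation code itself is illustrative rather than the actual index-time check:

// Illustrative check for the documented constraint: however high you raise
// index.mapping.dimension_fields.limit, the encoded _tsid must stay within 32KB.
public class TsidLimitSketch {

    private static final int MAX_TSID_BYTES = 32 * 1024;

    static void validateTsid(byte[] encodedTsid) {
        if (encodedTsid.length > MAX_TSID_BYTES) {
            throw new IllegalArgumentException(
                "_tsid is [" + encodedTsid.length + "] bytes, which exceeds the [" + MAX_TSID_BYTES + "] byte limit"
            );
        }
    }

    public static void main(String[] args) {
        validateTsid(new byte[1024]);      // fine
        validateTsid(new byte[64 * 1024]); // throws IllegalArgumentException
    }
}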
[discrete]
@@ -157,20 +157,6 @@ available disk space.
Only numeric and `aggregate_metric_double` fields support the `gauge` metric
type.
// tag::time-series-metric-histogram[]
`histogram`:: A pair of numeric arrays that measure the distribution of values
across predefined buckets. For example, server response times by percentile.
// end::time-series-metric-histogram[]
+
Only `histogram` fields support the `histogram` metric type.
// tag::time-series-metric-summary[]
`summary`:: An array of aggregated values, such as `sum`, `avg`, `value_count`,
`min`, and `max`.
// end::time-series-metric-summary[]
+
Only `aggregate_metric_double` fields support the `summary` metric type.
// tag::time-series-metric-null[]
`null` (Default):: Not a time series metric.
// end::time-series-metric-null[]
@@ -303,4 +289,4 @@ Now that you know the basics, you're ready to <<set-up-tsds,create a TSDS>> or
<<set-up-tsds,convert an existing data stream to a TSDS>>.
include::set-up-tsds.asciidoc[]
include::tsds-index-settings.asciidoc[]

@@ -64,12 +64,9 @@ include::numeric.asciidoc[tag=time_series_metric]
.Valid `time_series_metric` values for `aggregate_metric_double` fields
[%collapsible%open]
====
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-counter]
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-gauge]
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-summary]
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-null]
====

@@ -24,23 +24,6 @@ per document. Nested arrays are not supported.
========
[role="child_attributes"]
[[histogram-params]]
==== Parameters
ifeval::["{release-state}"=="unreleased"]
`time_series_metric`::
preview:[] (Optional, string)
include::numeric.asciidoc[tag=time_series_metric]
+
.Valid `time_series_metric` values for `histogram` fields
[%collapsible%open]
====
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-histogram]
include::{es-repo-dir}/data-streams/tsds.asciidoc[tag=time-series-metric-null]
====
endif::[]
[[histogram-uses]]
==== Uses

@@ -24,9 +24,7 @@ public final class TimeSeriesParams {
public enum MetricType {
gauge(new String[] { "max", "min", "value_count", "sum" }),
counter(new String[] { "last_value" }),
histogram(new String[] { "value_count" }), // TODO Add more aggs
summary(new String[] { "value_count", "sum", "min", "max" });
counter(new String[] { "last_value" });
private final String[] supportedAggs;
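The supportedAggs array kept by each constant presumably feeds whatever code needs the list of aggregations valid for a metric type; a hypothetical lookup helper (not part of this PR) could use it like so:

// Hypothetical helper, for illustration only: check whether an aggregation
// name is among a metric type's supported aggregations.
public class SupportedAggsSketch {

    static boolean supportsAgg(String[] supportedAggs, String agg) {
        for (String supported : supportedAggs) {
            if (supported.equals(agg)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        String[] counterAggs = { "last_value" }; // from the trimmed enum above
        System.out.println(supportsAgg(counterAggs, "last_value")); // true
        System.out.println(supportsAgg(counterAggs, "avg"));        // false
    }
}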

@@ -35,7 +35,6 @@ import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.SourceLoader;
import org.elasticsearch.index.mapper.SourceValueFetcher;
import org.elasticsearch.index.mapper.TextSearchInfo;
import org.elasticsearch.index.mapper.TimeSeriesParams;
import org.elasticsearch.index.mapper.ValueFetcher;
import org.elasticsearch.index.query.SearchExecutionContext;
import org.elasticsearch.script.field.DocValuesScriptFieldFactory;
@@ -74,12 +73,6 @@ public class HistogramFieldMapper extends FieldMapper {
private final Parameter<Map<String, String>> meta = Parameter.metaParam();
private final Parameter<Explicit<Boolean>> ignoreMalformed;
/**
* Parameter that marks this field as a time series metric defining its time series metric type.
* For {@link HistogramFieldMapper} fields only the histogram metric type is supported.
*/
private final Parameter<TimeSeriesParams.MetricType> metric;
public Builder(String name, boolean ignoreMalformedByDefault) {
super(name);
this.ignoreMalformed = Parameter.explicitBoolParam(
@@ -88,20 +81,18 @@ public class HistogramFieldMapper extends FieldMapper {
m -> toType(m).ignoreMalformed,
ignoreMalformedByDefault
);
this.metric = TimeSeriesParams.metricParam(m -> toType(m).metricType, TimeSeriesParams.MetricType.histogram);
}
@Override
protected Parameter<?>[] getParameters() {
return new Parameter<?>[] { ignoreMalformed, meta, metric };
return new Parameter<?>[] { ignoreMalformed, meta };
}
@Override
public HistogramFieldMapper build(MapperBuilderContext context) {
return new HistogramFieldMapper(
name,
new HistogramFieldType(context.buildFullName(name), meta.getValue(), metric.getValue()),
new HistogramFieldType(context.buildFullName(name), meta.getValue()),
multiFieldsBuilder.build(this, context),
copyTo.build(),
this
@@ -117,9 +108,6 @@ public class HistogramFieldMapper extends FieldMapper {
private final Explicit<Boolean> ignoreMalformed;
private final boolean ignoreMalformedByDefault;
/** The metric type (gauge, counter, summary) if field is a time series metric */
private final TimeSeriesParams.MetricType metricType;
public HistogramFieldMapper(
String simpleName,
MappedFieldType mappedFieldType,
@@ -130,7 +118,6 @@ public class HistogramFieldMapper extends FieldMapper {
super(simpleName, mappedFieldType, multiFields, copyTo);
this.ignoreMalformed = builder.ignoreMalformed.getValue();
this.ignoreMalformedByDefault = builder.ignoreMalformed.getDefaultValue().value();
this.metricType = builder.metric.getValue();
}
boolean ignoreMalformed() {
@@ -154,11 +141,8 @@ public class HistogramFieldMapper extends FieldMapper {
public static class HistogramFieldType extends MappedFieldType {
private final TimeSeriesParams.MetricType metricType;
public HistogramFieldType(String name, Map<String, String> meta, TimeSeriesParams.MetricType metricType) {
public HistogramFieldType(String name, Map<String, String> meta) {
super(name, false, false, true, TextSearchInfo.NONE, meta);
this.metricType = metricType;
}
@Override
@@ -262,14 +246,6 @@ public class HistogramFieldMapper extends FieldMapper {
"[" + CONTENT_TYPE + "] field do not support searching, " + "use dedicated aggregations instead: [" + name() + "]"
);
}
/**
* If field is a time series metric field, returns its metric type
* @return the metric type or null
*/
public TimeSeriesParams.MetricType getMetricType() {
return metricType;
}
}
@Override

@@ -235,7 +235,7 @@ public class HistoBackedHistogramAggregatorTests extends AggregatorTestCase {
}
private MappedFieldType defaultFieldType(String fieldName) {
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap());
}
}

@@ -373,7 +373,7 @@ public class HistoBackedRangeAggregatorTests extends AggregatorTestCase {
private MappedFieldType defaultFieldType(String fieldName) {
if (fieldName.equals(HISTO_FIELD_NAME)) {
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(fieldName, Collections.emptyMap());
} else {
return new NumberFieldMapper.NumberFieldType(fieldName, NumberFieldMapper.NumberType.DOUBLE);
}

@@ -90,7 +90,7 @@ public class HDRPreAggregatedPercentileRanksAggregatorTests extends AggregatorTestCase {
PercentileRanksAggregationBuilder aggBuilder = new PercentileRanksAggregationBuilder("my_agg", new double[] { 0.1, 0.5, 12 })
.field("field")
.method(PercentilesMethod.HDR);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap());
try (IndexReader reader = w.getReader()) {
IndexSearcher searcher = new IndexSearcher(reader);
PercentileRanks ranks = searchAndReduce(searcher, new MatchAllDocsQuery(), aggBuilder, fieldType);

@@ -149,7 +149,7 @@ public class HDRPreAggregatedPercentilesAggregatorTests extends AggregatorTestCase {
PercentilesAggregationBuilder builder = new PercentilesAggregationBuilder("test").field("number")
.method(PercentilesMethod.HDR);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap());
Aggregator aggregator = createAggregator(builder, indexSearcher, fieldType);
aggregator.preCollection();
indexSearcher.search(query, aggregator.asCollector());

@@ -136,6 +136,6 @@ public class HistoBackedAvgAggregatorTests extends AggregatorTestCase {
}
private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedAvgAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedAvgAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}

@@ -135,6 +135,6 @@ public class HistoBackedMaxAggregatorTests extends AggregatorTestCase {
}
private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMaxAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMaxAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}

@@ -135,6 +135,6 @@ public class HistoBackedMinAggregatorTests extends AggregatorTestCase {
}
private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMinAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedMinAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}

@@ -135,6 +135,6 @@ public class HistoBackedSumAggregatorTests extends AggregatorTestCase {
}
private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType(HistoBackedSumAggregatorTests.FIELD_NAME, Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType(HistoBackedSumAggregatorTests.FIELD_NAME, Collections.emptyMap());
}
}

@@ -140,6 +140,6 @@ public class HistoBackedValueCountAggregatorTests extends AggregatorTestCase {
}
private MappedFieldType defaultFieldType() {
return new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap(), null);
return new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap());
}
}

@@ -70,7 +70,7 @@ public class TDigestPreAggregatedPercentileRanksAggregatorTests extends AggregatorTestCase {
PercentileRanksAggregationBuilder aggBuilder = new PercentileRanksAggregationBuilder("my_agg", new double[] { 0.1, 0.5, 12 })
.field("field")
.method(PercentilesMethod.TDIGEST);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("field", Collections.emptyMap());
try (IndexReader reader = w.getReader()) {
IndexSearcher searcher = new IndexSearcher(reader);
PercentileRanks ranks = searchAndReduce(searcher, new MatchAllDocsQuery(), aggBuilder, fieldType);

@@ -130,7 +130,7 @@ public class TDigestPreAggregatedPercentilesAggregatorTests extends AggregatorTestCase {
PercentilesAggregationBuilder builder = new PercentilesAggregationBuilder("test").field("number")
.method(PercentilesMethod.TDIGEST);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap(), null);
MappedFieldType fieldType = new HistogramFieldMapper.HistogramFieldType("number", Collections.emptyMap());
Aggregator aggregator = createAggregator(builder, indexSearcher, fieldType);
aggregator.preCollection();
indexSearcher.search(query, aggregator.asCollector());

@@ -9,7 +9,6 @@ package org.elasticsearch.xpack.analytics.mapper;
import org.elasticsearch.index.mapper.DocumentMapper;
import org.elasticsearch.index.mapper.MappedFieldType;
import org.elasticsearch.index.mapper.MapperParsingException;
import org.elasticsearch.index.mapper.MapperService;
import org.elasticsearch.index.mapper.MapperTestCase;
import org.elasticsearch.index.mapper.ParsedDocument;
import org.elasticsearch.index.mapper.SourceToParse;
@@ -318,37 +317,6 @@ public class HistogramFieldMapperTests extends MapperTestCase {
assertThat(e.getMessage(), containsString("Field [hist] of type [histogram] can't be used in multifields"));
}
public void testMetricType() throws IOException {
// Test default setting
MapperService mapperService = createMapperService(fieldMapping(b -> minimalMapping(b)));
HistogramFieldMapper.HistogramFieldType ft = (HistogramFieldMapper.HistogramFieldType) mapperService.fieldType("field");
assertNull(ft.getMetricType());
assertMetricType("histogram", HistogramFieldMapper.HistogramFieldType::getMetricType);
{
// Test invalid metric type for this field type
Exception e = expectThrows(MapperParsingException.class, () -> createMapperService(fieldMapping(b -> {
minimalMapping(b);
b.field("time_series_metric", "gauge");
})));
assertThat(
e.getCause().getMessage(),
containsString("Unknown value [gauge] for field [time_series_metric] - accepted values are [histogram]")
);
}
{
// Test invalid metric type for this field type
Exception e = expectThrows(MapperParsingException.class, () -> createMapperService(fieldMapping(b -> {
minimalMapping(b);
b.field("time_series_metric", "unknown");
})));
assertThat(
e.getCause().getMessage(),
containsString("Unknown value [unknown] for field [time_series_metric] - accepted values are [histogram]")
);
}
}
@Override
protected IngestScriptSupport ingestScriptSupport() {
throw new AssumptionViolatedException("not supported");

@@ -719,7 +719,7 @@ public class RateAggregatorTests extends AggregatorTestCase {
}
public void testHistogramFieldMonthToMonth() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month").field("val");
if (randomBoolean()) {
@@ -742,7 +742,7 @@ public class RateAggregatorTests extends AggregatorTestCase {
}
public void testHistogramFieldMonthToYear() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month").field("val");
if (randomBoolean()) {
@@ -762,7 +762,7 @@ public class RateAggregatorTests extends AggregatorTestCase {
}
public void testHistogramFieldMonthToMonthValueCount() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month")
.rateMode("value_count")
@@ -784,7 +784,7 @@ public class RateAggregatorTests extends AggregatorTestCase {
}
public void testHistogramFieldMonthToYearValueCount() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month")
.rateMode("value_count")
@@ -805,7 +805,7 @@ public class RateAggregatorTests extends AggregatorTestCase {
}
public void testFilterWithHistogramField() throws IOException {
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap(), null);
MappedFieldType histType = new HistogramFieldMapper.HistogramFieldType("val", Collections.emptyMap());
MappedFieldType dateType = dateFieldType(DATE_FIELD);
MappedFieldType keywordType = new KeywordFieldMapper.KeywordFieldType("term");
RateAggregationBuilder rateAggregationBuilder = new RateAggregationBuilder("my_rate").rateUnit("month").field("val");

@@ -79,6 +79,10 @@ tasks.named("yamlRestTestV7CompatTest").configure {
'unsigned_long/50_script_values/script_score query',
'unsigned_long/50_script_values/Script query',
'data_stream/140_data_stream_aliases/Fix IndexNotFoundException error when handling remove alias action',
'aggregate-metrics/90_tsdb_mappings/aggregate_double_metric with time series mappings',
'aggregate-metrics/90_tsdb_mappings/aggregate_double_metric with wrong time series mappings',
'analytics/histogram/histogram with wrong time series mappings',
'analytics/histogram/histogram with time series mappings',
].join(',')
}

@@ -171,12 +171,7 @@ public class AggregateDoubleMetricFieldMapper extends FieldMapper {
ignoreMalformedByDefault
);
this.timeSeriesMetric = TimeSeriesParams.metricParam(
m -> toType(m).metricType,
MetricType.gauge,
MetricType.counter,
MetricType.summary
);
this.timeSeriesMetric = TimeSeriesParams.metricParam(m -> toType(m).metricType, MetricType.gauge);
this.indexCreatedVersion = Objects.requireNonNull(indexCreatedVersion);
}

@@ -563,20 +563,17 @@ public class AggregateDoubleMetricFieldMapperTests extends MapperTestCase {
AggregateDoubleMetricFieldMapper.AggregateDoubleMetricFieldType ft =
(AggregateDoubleMetricFieldMapper.AggregateDoubleMetricFieldType) mapperService.fieldType("field");
assertNull(ft.getMetricType());
assertMetricType("gauge", AggregateDoubleMetricFieldMapper.AggregateDoubleMetricFieldType::getMetricType);
assertMetricType("counter", AggregateDoubleMetricFieldMapper.AggregateDoubleMetricFieldType::getMetricType);
assertMetricType("summary", AggregateDoubleMetricFieldMapper.AggregateDoubleMetricFieldType::getMetricType);
{
// Test invalid metric type for this field type
Exception e = expectThrows(MapperParsingException.class, () -> createMapperService(fieldMapping(b -> {
minimalMapping(b);
b.field("time_series_metric", "histogram");
b.field("time_series_metric", "counter");
})));
assertThat(
e.getCause().getMessage(),
containsString("Unknown value [histogram] for field [time_series_metric] - accepted values are [gauge, counter, summary]")
containsString("Unknown value [counter] for field [time_series_metric] - accepted values are [gauge]")
);
}
{
@@ -587,7 +584,7 @@ public class AggregateDoubleMetricFieldMapperTests extends MapperTestCase {
})));
assertThat(
e.getCause().getMessage(),
containsString("Unknown value [unknown] for field [time_series_metric] - accepted values are [gauge, counter, summary]")
containsString("Unknown value [unknown] for field [time_series_metric] - accepted values are [gauge]")
);
}
}

@@ -33,26 +33,21 @@ aggregate_double_metric with time series mappings:
type: aggregate_metric_double
metrics: [min, max, sum, value_count]
default_metric: max
time_series_metric: counter
time_series_metric: gauge
rx:
type: aggregate_metric_double
metrics: [min, max, sum, value_count]
default_metric: max
time_series_metric: gauge
packets_dropped:
type: aggregate_metric_double
metrics: [min, max, sum, value_count]
default_metric: max
time_series_metric: summary
---
aggregate_double_metric with wrong time series mappings:
- skip:
version: " - 7.15.99"
reason: introduced in 7.16.0
- do:
catch: /Unknown value \[histogram\] for field \[time_series_metric\] \- accepted values are \[gauge, counter, summary\]/
catch: /Unknown value \[histogram\] for field \[time_series_metric\] \- accepted values are \[gauge\]/
indices.create:
index: tsdb_index
body:
@@ -78,16 +73,11 @@ aggregate_double_metric with wrong time series mappings:
type: keyword
network:
properties:
packets_dropped:
type: aggregate_metric_double
metrics: [ min, max, sum, value_count ]
default_metric: max
time_series_metric: summary
tx:
type: aggregate_metric_double
metrics: [min, max, sum, value_count]
default_metric: max
time_series_metric: counter
time_series_metric: gauge
rx:
type: aggregate_metric_double
metrics: [min, max, sum, value_count]

@@ -172,84 +172,6 @@ setup:
- match: { aggregations.ranges.buckets.3.key: "0.5-*" }
- match: { aggregations.ranges.buckets.3.doc_count: 11 }
---
histogram with time series mappings:
- skip:
version: " - 8.0.99"
reason: introduced in 8.1.0
- do:
indices.create:
index: tsdb_index
body:
settings:
index:
mode: time_series
routing_path: [metricset, k8s.pod.uid]
time_series:
start_time: 2021-04-28T00:00:00Z
end_time: 2021-04-29T00:00:00Z
number_of_replicas: 0
number_of_shards: 2
mappings:
properties:
"@timestamp":
type: date
metricset:
type: keyword
time_series_dimension: true
k8s:
properties:
pod:
properties:
uid:
type: keyword
time_series_dimension: true
name:
type: keyword
network:
properties:
latency:
type: histogram
time_series_metric: histogram
---
histogram with wrong time series mappings:
- skip:
version: " - 7.15.99"
reason: introduced in 7.16.0
- do:
catch: /Unknown value \[counter\] for field \[time_series_metric\] \- accepted values are \[histogram\]/
indices.create:
index: tsdb_index
body:
settings:
index:
number_of_replicas: 0
number_of_shards: 2
mappings:
properties:
"@timestamp":
type: date
metricset:
type: keyword
time_series_dimension: true
k8s:
properties:
pod:
properties:
uid:
type: keyword
time_series_dimension: true
name:
type: keyword
network:
properties:
latency:
type: histogram
time_series_metric: counter
---
histogram with synthetic source:
- skip: