Merge branch 'master' into shadow-replicas

This commit is contained in:
Simon Willnauer 2015-02-17 10:54:20 +01:00
commit e8ad614673
12 changed files with 358 additions and 562 deletions

View file

@@ -6,9 +6,9 @@ your application to Elasticsearch 2.0.
=== Indices API
The <<alias-retrieving, get alias api>> will, by default, produce an error response
if a requested index does not exist. This change brings the defaults for this API in
line with the other Indices APIs. The <<multi-index>> options can be used on a request
to change this behavior.

`GetIndexRequest.features()` now returns an array of Feature Enums instead of an array of String values.
@@ -109,13 +109,13 @@ Some query builders have been removed or renamed:
==== Aggregations
The `date_histogram` aggregation now returns a `Histogram` object in the response, and the `DateHistogram` class has been removed. Similarly
the `date_range`, `ipv4_range`, and `geo_distance` aggregations all return a `Range` object in the response, and the `IPV4Range`, `DateRange`,
and `GeoDistance` classes have been removed. The motivation for this is to have a single response API for the Range and Histogram aggregations
regardless of the type of data being queried. To support this some changes were made in the `MultiBucketAggregation` interface which applies
to all bucket aggregations:

* The `getKey()` method now returns `Object` instead of `String`. The actual object type returned depends on the type of aggregation requested
(e.g. the `date_histogram` will return a `DateTime` object for this method whereas a `histogram` will return a `Number`).
* A `getKeyAsString()` method has been added to return the String representation of the key.
* All other `getKeyAsX()` methods have been removed.
@@ -125,6 +125,11 @@ The `histogram` and the `date_histogram` aggregation now support a simplified `o
`post_offset` rounding options. Instead of having to specify two separate offset shifts of the underlying buckets, the `offset` option
moves the bucket boundaries in a positive or negative direction depending on its argument.

The `date_histogram` options `pre_zone` and `post_zone` are replaced by the `time_zone` option. The behavior of `time_zone` is
equivalent to the former `pre_zone` option. Setting `time_zone` to a value like "+01:00" now leads to the bucket calculations
being applied in the specified time zone. In addition, the `pre_zone_adjust_large_interval` option is removed, because we
now always return dates and bucket keys in UTC.
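The single `offset` option described above amounts to shifting the bucket grid: subtract the offset, floor to the interval, add the offset back. A minimal standalone sketch of that arithmetic (names like `bucketStart` are illustrative, not Elasticsearch API):

```java
public class OffsetBucketSketch {
    /** Floor utcMillis onto an interval grid whose boundaries are shifted by offsetMillis. */
    public static long bucketStart(long utcMillis, long intervalMillis, long offsetMillis) {
        return Math.floorDiv(utcMillis - offsetMillis, intervalMillis) * intervalMillis + offsetMillis;
    }

    public static void main(String[] args) {
        long hour = 3_600_000L;
        long halfHour = 1_800_000L;
        // With offset +30min, hourly buckets start at :30 instead of :00:
        // 01:45 falls into the bucket starting at 01:30.
        System.out.println(bucketStart(105 * 60_000L, hour, halfHour)); // 5400000 (= 01:30)
        // A negative offset shifts boundaries the other way: 00:00 falls into [-00:30, 00:30).
        System.out.println(bucketStart(0L, hour, -halfHour)); // -1800000
    }
}
```

A positive or negative argument thus moves every boundary by the same amount, which is why a single option can replace the old `pre_offset`/`post_offset` pair.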
=== Terms filter lookup caching
The terms filter lookup mechanism does not support the `cache` option anymore.

View file

@@ -48,29 +48,17 @@ See <<time-units>> for accepted abbreviations.
==== Time Zone
By default, times are stored as UTC milliseconds since the epoch. Thus, all computation and "bucketing" / "rounding" is
-done on UTC. It is possible to provide a time zone (both pre rounding, and post rounding) value, which will cause all
-computations to take the relevant zone into account. The time returned for each bucket/entry is milliseconds since the
-epoch of the provided time zone.
-
-The parameters are `pre_zone` (pre rounding based on interval) and `post_zone` (post rounding based on interval). The
-`time_zone` parameter simply sets the `pre_zone` parameter. By default, those are set to `UTC`.
-
-The zone value accepts either a numeric value for the hours offset, for example: `"time_zone" : -2`. It also accepts a
-format of hours and minutes, like `"time_zone" : "-02:30"`. Another option is to provide a time zone accepted as one of
-the values listed here.
-
-Lets take an example. For `2012-04-01T04:15:30Z`, with a `pre_zone` of `-08:00`. For day interval, the actual time by
+done on UTC. It is possible to provide a time zone value, which will cause all bucket
+computations to take place in the specified zone. The time returned for each bucket/entry is milliseconds since the
+epoch in UTC. The parameter is called `time_zone`. It accepts either a numeric value for the hours offset, for example:
+`"time_zone" : -2`. It also accepts a format of hours and minutes, like `"time_zone" : "-02:30"`.
+Another option is to provide a time zone accepted as one of the values listed here.
+
+Let's take an example. For `2012-04-01T04:15:30Z` (UTC), with a `time_zone` of `"-08:00"`. For day interval, the actual time by
 applying the time zone and rounding falls under `2012-03-31`, so the returned value will be (in millis) of
-`2012-03-31T00:00:00Z` (UTC). For hour interval, applying the time zone results in `2012-03-31T20:15:30`, rounding it
-results in `2012-03-31T20:00:00`, but, we want to return it in UTC (`post_zone` is not set), so we convert it back to
-UTC: `2012-04-01T04:00:00Z`. Note, we are consistent in the results, returning the rounded value in UTC.
+`2012-03-31T08:00:00Z` (UTC). For hour interval, internally applying the time zone results in `2012-03-31T20:15:30`, so rounding it
+in the time zone results in `2012-03-31T20:00:00`, but we return that rounded value converted back to UTC, to be consistent, as
+`2012-04-01T04:00:00Z` (UTC).
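The worked example above can be checked independently. This standalone sketch uses `java.time` (the server code uses Joda-Time) purely to verify the arithmetic for a `time_zone` of `-08:00`:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.temporal.ChronoUnit;

public class TimeZoneBucketExample {
    /** Day bucket start: floor to midnight in the given zone, expressed back in UTC. */
    public static Instant dayFloor(Instant t, ZoneOffset zone) {
        return t.atOffset(zone).toLocalDate().atStartOfDay().toInstant(zone);
    }

    /** Hour bucket start: floor to the hour in the given zone, expressed back in UTC. */
    public static Instant hourFloor(Instant t, ZoneOffset zone) {
        return t.atOffset(zone).truncatedTo(ChronoUnit.HOURS).toInstant();
    }

    public static void main(String[] args) {
        Instant t = Instant.parse("2012-04-01T04:15:30Z");
        ZoneOffset tz = ZoneOffset.of("-08:00");
        // 04:15:30Z is 2012-03-31T20:15:30 local; day floor is local midnight = 08:00Z.
        System.out.println(dayFloor(t, tz));  // 2012-03-31T08:00:00Z
        // Hour floor is 20:00 local, converted back to UTC.
        System.out.println(hourFloor(t, tz)); // 2012-04-01T04:00:00Z
    }
}
```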
-`post_zone` simply takes the result, and adds the relevant offset.
-
-Sometimes, we want to apply the same conversion to UTC we did above for hour also for day (and up) intervals. We can
-set `pre_zone_adjust_large_interval` to `true`, which will apply the same conversion done for hour interval in the
-example, to day and above intervals (it can be set regardless of the interval, but only kick in when using day and
-higher intervals).

==== Offset

View file

@@ -19,7 +19,6 @@
 package org.elasticsearch.common.rounding;

 import org.elasticsearch.ElasticsearchException;
-import org.elasticsearch.Version;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.io.stream.Streamable;
@@ -239,12 +238,8 @@ public abstract class Rounding implements Streamable {
        byte id = in.readByte();
        switch (id) {
            case Interval.ID: rounding = new Interval(); break;
-           case TimeZoneRounding.TimeTimeZoneRoundingFloor.ID: rounding = new TimeZoneRounding.TimeTimeZoneRoundingFloor(); break;
-           case TimeZoneRounding.UTCTimeZoneRoundingFloor.ID: rounding = new TimeZoneRounding.UTCTimeZoneRoundingFloor(); break;
-           case TimeZoneRounding.DayTimeZoneRoundingFloor.ID: rounding = new TimeZoneRounding.DayTimeZoneRoundingFloor(); break;
-           case TimeZoneRounding.UTCIntervalTimeZoneRounding.ID: rounding = new TimeZoneRounding.UTCIntervalTimeZoneRounding(); break;
-           case TimeZoneRounding.TimeIntervalTimeZoneRounding.ID: rounding = new TimeZoneRounding.TimeIntervalTimeZoneRounding(); break;
-           case TimeZoneRounding.DayIntervalTimeZoneRounding.ID: rounding = new TimeZoneRounding.DayIntervalTimeZoneRounding(); break;
+           case TimeZoneRounding.TimeUnitRounding.ID: rounding = new TimeZoneRounding.TimeUnitRounding(); break;
+           case TimeZoneRounding.TimeIntervalRounding.ID: rounding = new TimeZoneRounding.TimeIntervalRounding(); break;
            case TimeZoneRounding.FactorRounding.ID: rounding = new FactorRounding(); break;
            case OffsetRounding.ID: rounding = new OffsetRounding(); break;
            default: throw new ElasticsearchException("unknown rounding id [" + id + "]");

View file

@@ -23,7 +23,6 @@ import org.elasticsearch.ElasticsearchIllegalArgumentException;
 import org.elasticsearch.common.io.stream.StreamInput;
 import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.common.unit.TimeValue;
-import org.joda.time.DateTimeConstants;
 import org.joda.time.DateTimeField;
 import org.joda.time.DateTimeZone;
 import org.joda.time.DurationField;
@@ -47,15 +46,12 @@ public abstract class TimeZoneRounding extends Rounding {
        private DateTimeUnit unit;
        private long interval = -1;
-       private DateTimeZone preTz = DateTimeZone.UTC;
-       private DateTimeZone postTz = DateTimeZone.UTC;
+       private DateTimeZone timeZone = DateTimeZone.UTC;
        private float factor = 1.0f;
        private long offset;
-       private boolean preZoneAdjustLargeInterval = false;

        public Builder(DateTimeUnit unit) {
            this.unit = unit;
            this.interval = -1;
@@ -68,18 +64,8 @@ public abstract class TimeZoneRounding extends Rounding {
            this.interval = interval.millis();
        }

-       public Builder preZone(DateTimeZone preTz) {
-           this.preTz = preTz;
-           return this;
-       }
-
-       public Builder preZoneAdjustLargeInterval(boolean preZoneAdjustLargeInterval) {
-           this.preZoneAdjustLargeInterval = preZoneAdjustLargeInterval;
-           return this;
-       }
-
-       public Builder postZone(DateTimeZone postTz) {
-           this.postTz = postTz;
+       public Builder timeZone(DateTimeZone timeZone) {
+           this.timeZone = timeZone;
            return this;
        }
@@ -96,21 +82,9 @@ public abstract class TimeZoneRounding extends Rounding {
        public Rounding build() {
            Rounding timeZoneRounding;
            if (unit != null) {
-               if (preTz.equals(DateTimeZone.UTC) && postTz.equals(DateTimeZone.UTC)) {
-                   timeZoneRounding = new UTCTimeZoneRoundingFloor(unit);
-               } else if (preZoneAdjustLargeInterval || unit.field().getDurationField().getUnitMillis() < DateTimeConstants.MILLIS_PER_HOUR * 12) {
-                   timeZoneRounding = new TimeTimeZoneRoundingFloor(unit, preTz, postTz);
-               } else {
-                   timeZoneRounding = new DayTimeZoneRoundingFloor(unit, preTz, postTz);
-               }
+               timeZoneRounding = new TimeUnitRounding(unit, timeZone);
            } else {
-               if (preTz.equals(DateTimeZone.UTC) && postTz.equals(DateTimeZone.UTC)) {
-                   timeZoneRounding = new UTCIntervalTimeZoneRounding(interval);
-               } else if (preZoneAdjustLargeInterval || interval < DateTimeConstants.MILLIS_PER_HOUR * 12) {
-                   timeZoneRounding = new TimeIntervalTimeZoneRounding(interval, preTz, postTz);
-               } else {
-                   timeZoneRounding = new DayIntervalTimeZoneRounding(interval, preTz, postTz);
-               }
+               timeZoneRounding = new TimeIntervalRounding(interval, timeZone);
            }
            if (offset != 0) {
                timeZoneRounding = new OffsetRounding(timeZoneRounding, offset);
@@ -122,25 +96,23 @@ public abstract class TimeZoneRounding extends Rounding {
        }
    }

-   static class TimeTimeZoneRoundingFloor extends TimeZoneRounding {
+   static class TimeUnitRounding extends TimeZoneRounding {

        static final byte ID = 1;

        private DateTimeUnit unit;
        private DateTimeField field;
        private DurationField durationField;
-       private DateTimeZone preTz;
-       private DateTimeZone postTz;
+       private DateTimeZone timeZone;

-       TimeTimeZoneRoundingFloor() { // for serialization
+       TimeUnitRounding() { // for serialization
        }

-       TimeTimeZoneRoundingFloor(DateTimeUnit unit, DateTimeZone preTz, DateTimeZone postTz) {
+       TimeUnitRounding(DateTimeUnit unit, DateTimeZone timeZone) {
            this.unit = unit;
-           field = unit.field();
-           durationField = field.getDurationField();
-           this.preTz = preTz;
-           this.postTz = postTz;
+           this.field = unit.field();
+           this.durationField = field.getDurationField();
+           this.timeZone = timeZone;
        }
        @Override
@@ -150,21 +122,24 @@ public abstract class TimeZoneRounding extends Rounding {
        @Override
        public long roundKey(long utcMillis) {
-           long offset = preTz.getOffset(utcMillis);
-           long time = utcMillis + offset;
-           return field.roundFloor(time) - offset;
+           long timeLocal = utcMillis;
+           timeLocal = timeZone.convertUTCToLocal(utcMillis);
+           long rounded = field.roundFloor(timeLocal);
+           return timeZone.convertLocalToUTC(rounded, true, utcMillis);
        }

        @Override
        public long valueForKey(long time) {
-           // now apply post Tz
-           time = time + postTz.getOffset(time);
+           assert roundKey(time) == time;
            return time;
        }

        @Override
-       public long nextRoundingValue(long value) {
-           return durationField.add(value, 1);
+       public long nextRoundingValue(long time) {
+           long timeLocal = time;
+           timeLocal = timeZone.convertUTCToLocal(time);
+           long nextInLocalTime = durationField.add(timeLocal, 1);
+           return timeZone.convertLocalToUTC(nextInLocalTime, true);
        }

        @Override
@@ -172,33 +147,31 @@ public abstract class TimeZoneRounding extends Rounding {
            unit = DateTimeUnit.resolve(in.readByte());
            field = unit.field();
            durationField = field.getDurationField();
-           preTz = DateTimeZone.forID(in.readString());
-           postTz = DateTimeZone.forID(in.readString());
+           timeZone = DateTimeZone.forID(in.readString());
        }

        @Override
        public void writeTo(StreamOutput out) throws IOException {
            out.writeByte(unit.id());
-           out.writeString(preTz.getID());
-           out.writeString(postTz.getID());
+           out.writeString(timeZone.getID());
        }
    }
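The pattern in `roundKey` and `nextRoundingValue` above — convert UTC millis to local wall-clock time, round there, then convert back to UTC — can be sketched with `java.time` as follows. This is illustrative only; the actual implementation uses Joda-Time's `convertUTCToLocal`/`convertLocalToUTC`:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.temporal.ChronoUnit;

public class LocalFloorSketch {
    /** Floor an instant to the start of its day in the given zone, returned as a UTC instant. */
    public static Instant floorToDay(Instant utc, ZoneId zone) {
        // UTC -> local wall-clock time, floor in local time, -> back to UTC.
        return utc.atZone(zone).truncatedTo(ChronoUnit.DAYS).toInstant();
    }

    public static void main(String[] args) {
        ZoneId berlin = ZoneId.of("Europe/Berlin");
        // 10:45Z is 12:45 local (CEST, +02:00); local midnight is 22:00Z the previous day.
        System.out.println(floorToDay(Instant.parse("2014-07-01T10:45:00Z"), berlin)); // 2014-06-30T22:00:00Z
    }
}
```

The `true` flag passed to `convertLocalToUTC` in the real code resolves ambiguous local times around DST transitions; `java.time` handles the equivalent resolution inside `ZonedDateTime`.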
-   static class UTCTimeZoneRoundingFloor extends TimeZoneRounding {
+   static class TimeIntervalRounding extends TimeZoneRounding {

        final static byte ID = 2;

-       private DateTimeUnit unit;
-       private DateTimeField field;
-       private DurationField durationField;
+       private long interval;
+       private DateTimeZone timeZone;

-       UTCTimeZoneRoundingFloor() { // for serialization
+       TimeIntervalRounding() { // for serialization
        }

-       UTCTimeZoneRoundingFloor(DateTimeUnit unit) {
-           this.unit = unit;
-           field = unit.field();
-           durationField = field.getDurationField();
+       TimeIntervalRounding(long interval, DateTimeZone timeZone) {
+           if (interval < 1)
+               throw new ElasticsearchIllegalArgumentException("Zero or negative time interval not supported");
+           this.interval = interval;
+           this.timeZone = timeZone;
        }
        @Override
@@ -208,257 +181,36 @@ public abstract class TimeZoneRounding extends Rounding {
        @Override
        public long roundKey(long utcMillis) {
-           return field.roundFloor(utcMillis);
-       }
-
-       @Override
-       public long valueForKey(long key) {
-           return key;
-       }
+           long timeLocal = utcMillis;
+           timeLocal = timeZone.convertUTCToLocal(utcMillis);
+           long rounded = Rounding.Interval.roundValue(Rounding.Interval.roundKey(timeLocal, interval), interval);
+           return timeZone.convertLocalToUTC(rounded, true);
@Override
public long nextRoundingValue(long value) {
return durationField.add(value, 1);
}
@Override
public void readFrom(StreamInput in) throws IOException {
unit = DateTimeUnit.resolve(in.readByte());
field = unit.field();
durationField = field.getDurationField();
}
@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeByte(unit.id());
}
}
static class DayTimeZoneRoundingFloor extends TimeZoneRounding {
final static byte ID = 3;
private DateTimeUnit unit;
private DateTimeField field;
private DurationField durationField;
private DateTimeZone preTz;
private DateTimeZone postTz;
DayTimeZoneRoundingFloor() { // for serialization
}
DayTimeZoneRoundingFloor(DateTimeUnit unit, DateTimeZone preTz, DateTimeZone postTz) {
this.unit = unit;
field = unit.field();
durationField = field.getDurationField();
this.preTz = preTz;
this.postTz = postTz;
}
@Override
public byte id() {
return ID;
}
@Override
public long roundKey(long utcMillis) {
long time = utcMillis + preTz.getOffset(utcMillis);
return field.roundFloor(time);
} }
@Override @Override
public long valueForKey(long time) { public long valueForKey(long time) {
// after rounding, since its day level (and above), its actually UTC! assert roundKey(time) == time;
// now apply post Tz
time = time + postTz.getOffset(time);
return time; return time;
} }
@Override @Override
public long nextRoundingValue(long value) { public long nextRoundingValue(long time) {
return durationField.add(value, 1); long timeLocal = time;
} timeLocal = timeZone.convertUTCToLocal(time);
long next = timeLocal + interval;
@Override return timeZone.convertLocalToUTC(next, true);
public void readFrom(StreamInput in) throws IOException {
unit = DateTimeUnit.resolve(in.readByte());
field = unit.field();
durationField = field.getDurationField();
preTz = DateTimeZone.forID(in.readString());
postTz = DateTimeZone.forID(in.readString());
}
@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeByte(unit.id());
out.writeString(preTz.getID());
out.writeString(postTz.getID());
}
}
static class UTCIntervalTimeZoneRounding extends TimeZoneRounding {
final static byte ID = 4;
private long interval;
UTCIntervalTimeZoneRounding() { // for serialization
}
UTCIntervalTimeZoneRounding(long interval) {
if (interval < 1)
throw new ElasticsearchIllegalArgumentException("Zero or negative time interval not supported");
this.interval = interval;
}
@Override
public byte id() {
return ID;
}
@Override
public long roundKey(long utcMillis) {
return Rounding.Interval.roundKey(utcMillis, interval);
}
@Override
public long valueForKey(long key) {
return Rounding.Interval.roundValue(key, interval);
}
@Override
public long nextRoundingValue(long value) {
return value + interval;
} }
        @Override
        public void readFrom(StreamInput in) throws IOException {
            interval = in.readVLong();
+           timeZone = DateTimeZone.forID(in.readString());
        }

        @Override
        public void writeTo(StreamOutput out) throws IOException {
            out.writeVLong(interval);
+           out.writeString(timeZone.getID());
        }
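For fixed-length intervals, `TimeIntervalRounding` applies the same local-time trick: floor the local wall-clock millis to the interval, then shift back to UTC. A standalone sketch with a fixed-offset zone (illustrative only, not the Joda-based implementation):

```java
import java.time.Instant;
import java.time.ZoneOffset;

public class IntervalRoundingSketch {
    /** Floor utcMillis to an interval measured in local wall-clock time, returned in UTC. */
    public static long roundLocal(long utcMillis, long intervalMillis, ZoneOffset zone) {
        long shift = zone.getTotalSeconds() * 1000L;
        long local = utcMillis + shift;                                  // UTC -> local wall clock
        long rounded = Math.floorDiv(local, intervalMillis) * intervalMillis; // floor in local time
        return rounded - shift;                                          // local -> back to UTC
    }

    public static void main(String[] args) {
        long t = Instant.parse("2009-02-03T01:01:01Z").toEpochMilli();
        long twelveH = 12 * 3_600_000L;
        // In UTC, the enclosing 12h bucket starts at 2009-02-03T00:00:00Z...
        System.out.println(Instant.ofEpochMilli(roundLocal(t, twelveH, ZoneOffset.UTC)));         // 2009-02-03T00:00:00Z
        // ...but at +01:00 the same instant falls into the bucket starting 2009-02-02T23:00:00Z.
        System.out.println(Instant.ofEpochMilli(roundLocal(t, twelveH, ZoneOffset.of("+01:00")))); // 2009-02-02T23:00:00Z
    }
}
```

The first case matches the `testUTCIntervalRounding` assertions added in the test file below; the second shows why the bucket key still comes back as a UTC instant even though the boundary was computed in local time.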
static class TimeIntervalTimeZoneRounding extends TimeZoneRounding {
final static byte ID = 5;
private long interval;
private DateTimeZone preTz;
private DateTimeZone postTz;
TimeIntervalTimeZoneRounding() { // for serialization
}
TimeIntervalTimeZoneRounding(long interval, DateTimeZone preTz, DateTimeZone postTz) {
if (interval < 1)
throw new ElasticsearchIllegalArgumentException("Zero or negative time interval not supported");
this.interval = interval;
this.preTz = preTz;
this.postTz = postTz;
}
@Override
public byte id() {
return ID;
}
@Override
public long roundKey(long utcMillis) {
long time = utcMillis + preTz.getOffset(utcMillis);
return Rounding.Interval.roundKey(time, interval);
}
@Override
public long valueForKey(long key) {
long time = Rounding.Interval.roundValue(key, interval);
// now, time is still in local, move it to UTC
time = time - preTz.getOffset(time);
// now apply post Tz
time = time + postTz.getOffset(time);
return time;
}
@Override
public long nextRoundingValue(long value) {
return value + interval;
}
@Override
public void readFrom(StreamInput in) throws IOException {
interval = in.readVLong();
preTz = DateTimeZone.forID(in.readString());
postTz = DateTimeZone.forID(in.readString());
}
@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeVLong(interval);
out.writeString(preTz.getID());
out.writeString(postTz.getID());
}
}
static class DayIntervalTimeZoneRounding extends TimeZoneRounding {
final static byte ID = 6;
private long interval;
private DateTimeZone preTz;
private DateTimeZone postTz;
DayIntervalTimeZoneRounding() { // for serialization
}
DayIntervalTimeZoneRounding(long interval, DateTimeZone preTz, DateTimeZone postTz) {
if (interval < 1)
throw new ElasticsearchIllegalArgumentException("Zero or negative time interval not supported");
this.interval = interval;
this.preTz = preTz;
this.postTz = postTz;
}
@Override
public byte id() {
return ID;
}
@Override
public long roundKey(long utcMillis) {
long time = utcMillis + preTz.getOffset(utcMillis);
return Rounding.Interval.roundKey(time, interval);
}
@Override
public long valueForKey(long key) {
long time = Rounding.Interval.roundValue(key, interval);
// after rounding, since its day level (and above), its actually UTC!
// now apply post Tz
time = time + postTz.getOffset(time);
return time;
}
@Override
public long nextRoundingValue(long value) {
return value + interval;
}
@Override
public void readFrom(StreamInput in) throws IOException {
interval = in.readVLong();
preTz = DateTimeZone.forID(in.readString());
postTz = DateTimeZone.forID(in.readString());
}
@Override
public void writeTo(StreamOutput out) throws IOException {
out.writeVLong(interval);
out.writeString(preTz.getID());
out.writeString(postTz.getID());
        }
    }
}

View file

@@ -402,8 +402,8 @@ public class RangeQueryBuilder extends BaseQueryBuilder implements MultiTermQuer
    /**
     * In case of date field, we can adjust the from/to fields using a timezone
     */
-   public RangeQueryBuilder timeZone(String preZone) {
-       this.timeZone = preZone;
+   public RangeQueryBuilder timeZone(String timezone) {
+       this.timeZone = timezone;
        return this;
    }
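A time zone matters for date ranges because a date-only bound such as `2012-01-01` denotes a different instant in each zone. A hypothetical `java.time` sketch of that mapping (not Elasticsearch code; `lowerBound` is an illustrative name):

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneOffset;

public class RangeBoundSketch {
    /** Interpret a date-only range bound as midnight in the given zone, expressed in UTC. */
    public static Instant lowerBound(String date, ZoneOffset zone) {
        return LocalDate.parse(date).atStartOfDay().toInstant(zone);
    }

    public static void main(String[] args) {
        System.out.println(lowerBound("2012-01-01", ZoneOffset.UTC));          // 2012-01-01T00:00:00Z
        System.out.println(lowerBound("2012-01-01", ZoneOffset.of("+01:00"))); // 2011-12-31T23:00:00Z
    }
}
```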

View file

@@ -37,9 +37,7 @@ public class DateHistogramBuilder extends ValuesSourceAggregationBuilder<DateHis
    private Long minDocCount;
    private Object extendedBoundsMin;
    private Object extendedBoundsMax;
-   private String preZone;
-   private String postZone;
-   private boolean preZoneAdjustLargeInterval;
+   private String timeZone;
    private String format;
    private String offset;
    private float factor = 1.0f;
@@ -87,24 +85,8 @@ public class DateHistogramBuilder extends ValuesSourceAggregationBuilder<DateHis
    /**
     * Set the timezone in which to translate dates before computing buckets.
     */
-   public DateHistogramBuilder preZone(String preZone) {
-       this.preZone = preZone;
-       return this;
-   }
-
-   /**
-    * Set the timezone in which to translate dates after having computed buckets.
-    */
-   public DateHistogramBuilder postZone(String postZone) {
-       this.postZone = postZone;
-       return this;
-   }
-
-   /**
-    * Set whether to adjust large intervals, when using days or larger intervals.
-    */
-   public DateHistogramBuilder preZoneAdjustLargeInterval(boolean preZoneAdjustLargeInterval) {
-       this.preZoneAdjustLargeInterval = preZoneAdjustLargeInterval;
+   public DateHistogramBuilder timeZone(String timeZone) {
+       this.timeZone = timeZone;
        return this;
    }
@@ -186,16 +168,8 @@ public class DateHistogramBuilder extends ValuesSourceAggregationBuilder<DateHis
            order.toXContent(builder, params);
        }

-       if (preZone != null) {
-           builder.field("pre_zone", preZone);
-       }
-       if (postZone != null) {
-           builder.field("post_zone", postZone);
-       }
-       if (preZoneAdjustLargeInterval) {
-           builder.field("pre_zone_adjust_large_interval", true);
+       if (timeZone != null) {
+           builder.field("time_zone", timeZone);
        }
        if (offset != null) {

View file

@@ -43,6 +43,9 @@ import java.io.IOException;
public class DateHistogramParser implements Aggregator.Parser {

    static final ParseField EXTENDED_BOUNDS = new ParseField("extended_bounds");
+   static final ParseField TIME_ZONE = new ParseField("time_zone");
+   static final ParseField OFFSET = new ParseField("offset");
+   static final ParseField INTERVAL = new ParseField("interval");

    private final ImmutableMap<String, DateTimeUnit> dateFieldUnits;
@@ -85,9 +88,7 @@ public class DateHistogramParser implements Aggregator.Parser {
        ExtendedBounds extendedBounds = null;
        InternalOrder order = (InternalOrder) Histogram.Order.KEY_ASC;
        String interval = null;
-       boolean preZoneAdjustLargeInterval = false;
-       DateTimeZone preZone = DateTimeZone.UTC;
-       DateTimeZone postZone = DateTimeZone.UTC;
+       DateTimeZone timeZone = DateTimeZone.UTC;
        long offset = 0;

        XContentParser.Token token;
@@ -98,15 +99,11 @@ public class DateHistogramParser implements Aggregator.Parser {
            } else if (vsParser.token(currentFieldName, token, parser)) {
                continue;
            } else if (token == XContentParser.Token.VALUE_STRING) {
-               if ("time_zone".equals(currentFieldName) || "timeZone".equals(currentFieldName)) {
-                   preZone = DateTimeZone.forID(parser.text());
-               } else if ("pre_zone".equals(currentFieldName) || "preZone".equals(currentFieldName)) {
-                   preZone = DateTimeZone.forID(parser.text());
-               } else if ("post_zone".equals(currentFieldName) || "postZone".equals(currentFieldName)) {
-                   postZone = DateTimeZone.forID(parser.text());
-               } else if ("offset".equals(currentFieldName)) {
+               if (TIME_ZONE.match(currentFieldName)) {
+                   timeZone = DateTimeZone.forID(parser.text());
+               } else if (OFFSET.match(currentFieldName)) {
                    offset = parseOffset(parser.text());
-               } else if ("interval".equals(currentFieldName)) {
+               } else if (INTERVAL.match(currentFieldName)) {
                    interval = parser.text();
                } else {
                    throw new SearchParseException(context, "Unknown key for a " + token + " in [" + aggregationName + "]: [" + currentFieldName + "].");
@@ -114,8 +111,6 @@ public class DateHistogramParser implements Aggregator.Parser {
            } else if (token == XContentParser.Token.VALUE_BOOLEAN) {
                if ("keyed".equals(currentFieldName)) {
                    keyed = parser.booleanValue();
-               } else if ("pre_zone_adjust_large_interval".equals(currentFieldName) || "preZoneAdjustLargeInterval".equals(currentFieldName)) {
-                   preZoneAdjustLargeInterval = parser.booleanValue();
                } else {
                    throw new SearchParseException(context, "Unknown key for a " + token + " in [" + aggregationName + "]: [" + currentFieldName + "].");
                }
@@ -123,11 +118,7 @@ public class DateHistogramParser implements Aggregator.Parser {
                if ("min_doc_count".equals(currentFieldName) || "minDocCount".equals(currentFieldName)) {
                    minDocCount = parser.longValue();
                } else if ("time_zone".equals(currentFieldName) || "timeZone".equals(currentFieldName)) {
-                   preZone = DateTimeZone.forOffsetHours(parser.intValue());
-               } else if ("pre_zone".equals(currentFieldName) || "preZone".equals(currentFieldName)) {
-                   preZone = DateTimeZone.forOffsetHours(parser.intValue());
-               } else if ("post_zone".equals(currentFieldName) || "postZone".equals(currentFieldName)) {
-                   postZone = DateTimeZone.forOffsetHours(parser.intValue());
+                   timeZone = DateTimeZone.forOffsetHours(parser.intValue());
                } else {
                    throw new SearchParseException(context, "Unknown key for a " + token + " in [" + aggregationName + "]: [" + currentFieldName + "].");
                }
@@ -191,8 +182,7 @@ public class DateHistogramParser implements Aggregator.Parser {
        }

        Rounding rounding = tzRoundingBuilder
-               .preZone(preZone).postZone(postZone)
-               .preZoneAdjustLargeInterval(preZoneAdjustLargeInterval)
+               .timeZone(timeZone)
                .offset(offset).build();

        return new HistogramAggregator.Factory(aggregationName, vsParser.config(), rounding, order, keyed, minDocCount, extendedBounds,

View file

@ -21,18 +21,23 @@ package org.elasticsearch.common.rounding;
import org.elasticsearch.common.unit.TimeValue; import org.elasticsearch.common.unit.TimeValue;
import org.elasticsearch.test.ElasticsearchTestCase; import org.elasticsearch.test.ElasticsearchTestCase;
import org.joda.time.DateTime;
import org.joda.time.DateTimeZone; import org.joda.time.DateTimeZone;
import org.joda.time.format.ISODateTimeFormat; import org.joda.time.format.ISODateTimeFormat;
import org.junit.Test; import org.junit.Test;
import java.util.concurrent.TimeUnit;
import static org.hamcrest.Matchers.equalTo; import static org.hamcrest.Matchers.equalTo;
import static org.hamcrest.Matchers.greaterThan;
import static org.hamcrest.Matchers.lessThanOrEqualTo;
/**
 */
public class TimeZoneRoundingTests extends ElasticsearchTestCase {
@Test
public void testUTCTimeUnitRounding() {
Rounding tzRounding = TimeZoneRounding.builder(DateTimeUnit.MONTH_OF_YEAR).build();
assertThat(tzRounding.round(utc("2009-02-03T01:01:01")), equalTo(utc("2009-02-01T00:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-01T00:00:00.000Z")), equalTo(utc("2009-03-01T00:00:00.000Z")));
@ -46,81 +51,242 @@ public class TimeZoneRoundingTests extends ElasticsearchTestCase {
assertThat(tzRounding.nextRoundingValue(utc("2012-01-08T00:00:00.000Z")), equalTo(utc("2012-01-15T00:00:00.000Z")));
}
@Test
public void testUTCIntervalRounding() {
Rounding tzRounding = TimeZoneRounding.builder(TimeValue.timeValueHours(12)).build();
assertThat(tzRounding.round(utc("2009-02-03T01:01:01")), equalTo(utc("2009-02-03T00:00:00.000Z")));
long roundKey = tzRounding.roundKey(utc("2009-02-03T01:01:01"));
assertThat(roundKey, equalTo(tzRounding.roundKey(utc("2009-02-03T00:00:00.000Z"))));
assertThat(tzRounding.valueForKey(roundKey), equalTo(utc("2009-02-03T00:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-03T00:00:00.000Z")), equalTo(utc("2009-02-03T12:00:00.000Z")));
assertThat(tzRounding.round(utc("2009-02-03T13:01:01")), equalTo(utc("2009-02-03T12:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-03T12:00:00.000Z")), equalTo(utc("2009-02-04T00:00:00.000Z")));
tzRounding = TimeZoneRounding.builder(TimeValue.timeValueHours(48)).build();
assertThat(tzRounding.round(utc("2009-02-03T01:01:01")), equalTo(utc("2009-02-03T00:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-03T00:00:00.000Z")), equalTo(utc("2009-02-05T00:00:00.000Z")));
assertThat(tzRounding.round(utc("2009-02-05T13:01:01")), equalTo(utc("2009-02-05T00:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-05T00:00:00.000Z")), equalTo(utc("2009-02-07T00:00:00.000Z")));
}
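The fixed-interval UTC cases above come down to flooring an epoch-millisecond timestamp to a multiple of the interval. A minimal sketch of that arithmetic, separate from the Elasticsearch classes under test (class and method names here are illustrative, not the real implementation):

```java
// Sketch of fixed-interval rounding in UTC: floor the epoch-millisecond
// timestamp to a multiple of the interval. Math.floorDiv keeps floor
// semantics for pre-epoch (negative) timestamps too.
public class IntervalRoundingSketch {

    public static long round(long utcMillis, long intervalMillis) {
        return Math.floorDiv(utcMillis, intervalMillis) * intervalMillis;
    }

    public static long nextRoundingValue(long roundedMillis, long intervalMillis) {
        return roundedMillis + intervalMillis;
    }

    public static void main(String[] args) {
        long twelveHours = 12L * 60 * 60 * 1000;
        long t = 1233622861000L; // 2009-02-03T01:01:01Z
        // floors to 2009-02-03T00:00:00Z (1233619200000), as in the test above
        System.out.println(round(t, twelveHours));
        // next bucket starts at 2009-02-03T12:00:00Z
        System.out.println(nextRoundingValue(round(t, twelveHours), twelveHours));
    }
}
```

Note that rounding is idempotent: rounding an already-rounded value returns it unchanged, which the randomized tests further down rely on.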
/**
* test TimeIntervalTimeZoneRounding (interval < 12h) with time zone shift
*/
@Test
public void testTimeIntervalTimeZoneRounding() {
Rounding tzRounding = TimeZoneRounding.builder(TimeValue.timeValueHours(6)).timeZone(DateTimeZone.forOffsetHours(-1)).build();
assertThat(tzRounding.round(utc("2009-02-03T00:01:01")), equalTo(utc("2009-02-02T19:00:00.000Z")));
long roundKey = tzRounding.roundKey(utc("2009-02-03T00:01:01"));
assertThat(roundKey, equalTo(tzRounding.roundKey(utc("2009-02-02T19:00:00.000Z"))));
assertThat(tzRounding.valueForKey(roundKey), equalTo(utc("2009-02-02T19:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-02T19:00:00.000Z")), equalTo(utc("2009-02-03T01:00:00.000Z")));
assertThat(tzRounding.round(utc("2009-02-03T13:01:01")), equalTo(utc("2009-02-03T13:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-03T13:00:00.000Z")), equalTo(utc("2009-02-03T19:00:00.000Z")));
}
/**
* test DayIntervalTimeZoneRounding (interval >= 12h) with time zone shift
*/
@Test
public void testDayIntervalTimeZoneRounding() {
Rounding tzRounding = TimeZoneRounding.builder(TimeValue.timeValueHours(12)).timeZone(DateTimeZone.forOffsetHours(-8)).build();
assertThat(tzRounding.round(utc("2009-02-03T00:01:01")), equalTo(utc("2009-02-02T20:00:00.000Z")));
long roundKey = tzRounding.roundKey(utc("2009-02-03T00:01:01"));
assertThat(roundKey, equalTo(tzRounding.roundKey(utc("2009-02-02T20:00:00.000Z"))));
assertThat(tzRounding.valueForKey(roundKey), equalTo(utc("2009-02-02T20:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-02T20:00:00.000Z")), equalTo(utc("2009-02-03T08:00:00.000Z")));
assertThat(tzRounding.round(utc("2009-02-03T13:01:01")), equalTo(utc("2009-02-03T08:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-03T08:00:00.000Z")), equalTo(utc("2009-02-03T20:00:00.000Z")));
}
@Test
public void testDayTimeZoneRounding() {
int timezoneOffset = -2;
Rounding tzRounding = TimeZoneRounding.builder(DateTimeUnit.DAY_OF_MONTH).timeZone(DateTimeZone.forOffsetHours(timezoneOffset))
.build();
assertThat(tzRounding.round(0), equalTo(0l - TimeValue.timeValueHours(24 + timezoneOffset).millis()));
assertThat(tzRounding.nextRoundingValue(0l - TimeValue.timeValueHours(24 + timezoneOffset).millis()), equalTo(0l - TimeValue
.timeValueHours(timezoneOffset).millis()));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.DAY_OF_MONTH).timeZone(DateTimeZone.forID("-08:00")).build();
assertThat(tzRounding.round(utc("2012-04-01T04:15:30Z")), equalTo(utc("2012-03-31T08:00:00Z")));
assertThat(toUTCDateString(tzRounding.nextRoundingValue(utc("2012-03-31T08:00:00Z"))),
equalTo(toUTCDateString(utc("2012-04-01T08:00:00Z"))));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.MONTH_OF_YEAR).timeZone(DateTimeZone.forID("-08:00")).build();
assertThat(tzRounding.round(utc("2012-04-01T04:15:30Z")), equalTo(utc("2012-03-01T08:00:00Z")));
assertThat(toUTCDateString(tzRounding.nextRoundingValue(utc("2012-03-01T08:00:00Z"))),
equalTo(toUTCDateString(utc("2012-04-01T08:00:00Z"))));
// date in Feb-3rd, but still in Feb-2nd in -02:00 timezone
tzRounding = TimeZoneRounding.builder(DateTimeUnit.DAY_OF_MONTH).timeZone(DateTimeZone.forID("-02:00")).build();
assertThat(tzRounding.round(utc("2009-02-03T01:01:01")), equalTo(utc("2009-02-02T02:00:00")));
long roundKey = tzRounding.roundKey(utc("2009-02-03T01:01:01"));
assertThat(roundKey, equalTo(tzRounding.roundKey(utc("2009-02-02T02:00:00.000Z"))));
assertThat(tzRounding.valueForKey(roundKey), equalTo(utc("2009-02-02T02:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-02T02:00:00")), equalTo(utc("2009-02-03T02:00:00")));
// date in Feb-3rd, also in -02:00 timezone
tzRounding = TimeZoneRounding.builder(DateTimeUnit.DAY_OF_MONTH).timeZone(DateTimeZone.forID("-02:00")).build();
assertThat(tzRounding.round(utc("2009-02-03T02:01:01")), equalTo(utc("2009-02-03T02:00:00")));
roundKey = tzRounding.roundKey(utc("2009-02-03T02:01:01"));
assertThat(roundKey, equalTo(tzRounding.roundKey(utc("2009-02-03T02:00:00.000Z"))));
assertThat(tzRounding.valueForKey(roundKey), equalTo(utc("2009-02-03T02:00:00.000Z")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-03T02:00:00")), equalTo(utc("2009-02-04T02:00:00")));
}
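For a fixed-offset zone, the day-of-month cases above amount to shifting into local wall-clock time, flooring to a day boundary, and shifting back. A rough sketch under that assumption (fixed offsets only — named zones with DST transitions need real calendar arithmetic, as the Joda-based code under test does; the class and method names here are made up for illustration):

```java
public class DayRoundingSketch {
    private static final long DAY = 24L * 60 * 60 * 1000;

    // Round a UTC timestamp down to the start of its local day in a
    // fixed-offset time zone. Only valid for fixed offsets: zones with
    // DST need proper calendar arithmetic (Joda-Time or java.time).
    public static long roundToLocalDay(long utcMillis, int offsetHours) {
        long offset = offsetHours * 3600_000L;
        long local = utcMillis + offset;                     // to wall-clock millis
        long flooredLocal = Math.floorDiv(local, DAY) * DAY; // start of local day
        return flooredLocal - offset;                        // back to UTC
    }

    public static void main(String[] args) {
        // 2009-02-03T01:01:01Z is still Feb 2nd in a -02:00 zone, so it
        // rounds to 2009-02-02T02:00:00Z -- the bucket the test above expects.
        System.out.println(roundToLocalDay(1233622861000L, -2));
    }
}
```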
@Test
public void testTimeTimeZoneRounding() {
// hour unit
Rounding tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forOffsetHours(-2)).build();
assertThat(tzRounding.round(0), equalTo(0l));
assertThat(tzRounding.nextRoundingValue(0l), equalTo(TimeValue.timeValueHours(1l).getMillis()));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forOffsetHours(-2)).build();
assertThat(tzRounding.round(utc("2009-02-03T01:01:01")), equalTo(utc("2009-02-03T01:00:00")));
assertThat(tzRounding.nextRoundingValue(utc("2009-02-03T01:00:00")), equalTo(utc("2009-02-03T02:00:00")));
}
@Test
public void testTimeUnitRoundingDST() {
Rounding tzRounding;
// testing savings to non savings switch
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("UTC")).build();
assertThat(tzRounding.round(time("2014-10-26T01:01:01", DateTimeZone.forID("CET"))),
equalTo(time("2014-10-26T01:00:00", DateTimeZone.forID("CET"))));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("CET")).build();
assertThat(tzRounding.round(time("2014-10-26T01:01:01", DateTimeZone.forID("CET"))),
equalTo(time("2014-10-26T01:00:00", DateTimeZone.forID("CET"))));
// testing non savings to savings switch
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("UTC")).build();
assertThat(tzRounding.round(time("2014-03-30T01:01:01", DateTimeZone.forID("CET"))),
equalTo(time("2014-03-30T01:00:00", DateTimeZone.forID("CET"))));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("CET")).build();
assertThat(tzRounding.round(time("2014-03-30T01:01:01", DateTimeZone.forID("CET"))),
equalTo(time("2014-03-30T01:00:00", DateTimeZone.forID("CET"))));
// testing non savings to savings switch (America/Chicago)
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("UTC")).build();
assertThat(tzRounding.round(time("2014-03-09T03:01:01", DateTimeZone.forID("America/Chicago"))),
equalTo(time("2014-03-09T03:00:00", DateTimeZone.forID("America/Chicago"))));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("America/Chicago")).build();
assertThat(tzRounding.round(time("2014-03-09T03:01:01", DateTimeZone.forID("America/Chicago"))),
equalTo(time("2014-03-09T03:00:00", DateTimeZone.forID("America/Chicago"))));
// testing savings to non savings switch 2013 (America/Chicago)
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("UTC")).build();
assertThat(tzRounding.round(time("2013-11-03T06:01:01", DateTimeZone.forID("America/Chicago"))),
equalTo(time("2013-11-03T06:00:00", DateTimeZone.forID("America/Chicago"))));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("America/Chicago")).build();
assertThat(tzRounding.round(time("2013-11-03T06:01:01", DateTimeZone.forID("America/Chicago"))),
equalTo(time("2013-11-03T06:00:00", DateTimeZone.forID("America/Chicago"))));
// testing savings to non savings switch 2014 (America/Chicago)
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("UTC")).build();
assertThat(tzRounding.round(time("2014-11-02T06:01:01", DateTimeZone.forID("America/Chicago"))),
equalTo(time("2014-11-02T06:00:00", DateTimeZone.forID("America/Chicago"))));
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("America/Chicago")).build();
assertThat(tzRounding.round(time("2014-11-02T06:01:01", DateTimeZone.forID("America/Chicago"))),
equalTo(time("2014-11-02T06:00:00", DateTimeZone.forID("America/Chicago"))));
}
/**
* randomized test on TimeUnitRounding with random time units and time zone offsets
*/
@Test
public void testTimeZoneRoundingRandom() {
for (int i = 0; i < 1000; ++i) {
DateTimeUnit timeUnit = randomTimeUnit();
TimeZoneRounding rounding;
int timezoneOffset = randomIntBetween(-23, 23);
rounding = new TimeZoneRounding.TimeUnitRounding(timeUnit, DateTimeZone.forOffsetHours(timezoneOffset));
long date = Math.abs(randomLong() % ((long) 10e11));
final long roundedDate = rounding.round(date);
final long nextRoundingValue = rounding.nextRoundingValue(roundedDate);
assertThat("Rounding should be idempotent", roundedDate, equalTo(rounding.round(roundedDate)));
assertThat("Rounded value smaller or equal than unrounded, regardless of timezone", roundedDate, lessThanOrEqualTo(date));
assertThat("NextRounding value should be greater than date", nextRoundingValue, greaterThan(roundedDate));
assertThat("NextRounding value should be a rounded date", nextRoundingValue, equalTo(rounding.round(nextRoundingValue)));
}
}
/**
* randomized test on TimeIntervalRounding with random interval and time zone offsets
*/
@Test
public void testIntervalRoundingRandom() {
for (int i = 0; i < 1000; ++i) {
// max random interval is a year, can be negative
long interval = Math.abs(randomLong() % (TimeUnit.DAYS.toMillis(365)));
TimeZoneRounding rounding;
int timezoneOffset = randomIntBetween(-23, 23);
rounding = new TimeZoneRounding.TimeIntervalRounding(interval, DateTimeZone.forOffsetHours(timezoneOffset));
long date = Math.abs(randomLong() % ((long) 10e11));
final long roundedDate = rounding.round(date);
final long nextRoundingValue = rounding.nextRoundingValue(roundedDate);
assertThat("Rounding should be idempotent", roundedDate, equalTo(rounding.round(roundedDate)));
assertThat("Rounded value smaller or equal than unrounded, regardless of timezone", roundedDate, lessThanOrEqualTo(date));
assertThat("NextRounding value should be greater than date", nextRoundingValue, greaterThan(roundedDate));
assertThat("NextRounding value should be interval from rounded value", nextRoundingValue - roundedDate, equalTo(interval));
assertThat("NextRounding value should be a rounded date", nextRoundingValue, equalTo(rounding.round(nextRoundingValue)));
}
}
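The two randomized tests above assert general invariants of any rounding: idempotence, the rounded value never exceeding the input, and `nextRoundingValue` landing exactly on a rounded value. Those properties can be spot-checked against a plain floor-to-interval rounding used as a stand-in (illustrative only, not the Elasticsearch `TimeZoneRounding` classes):

```java
import java.util.Random;

// Property check mirroring the randomized tests above, using a simple
// floor-to-interval rounding as a stand-in for the real implementation.
public class RoundingPropertiesSketch {

    public static long round(long t, long interval) {
        return Math.floorDiv(t, interval) * interval;
    }

    public static void main(String[] args) {
        Random r = new Random(42);
        for (int i = 0; i < 1000; i++) {
            long interval = 1 + Math.abs(r.nextLong() % (365L * 24 * 3600 * 1000));
            long date = Math.abs(r.nextLong() % (long) 10e11);
            long rounded = round(date, interval);
            long next = rounded + interval;
            if (round(rounded, interval) != rounded) throw new AssertionError("not idempotent");
            if (rounded > date) throw new AssertionError("rounded above input");
            if (next <= rounded || round(next, interval) != next) throw new AssertionError("next not a rounded value");
        }
        System.out.println("ok");
    }
}
```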
/**
* special test for DST switch from #9491
*/
@Test
public void testAmbiguousHoursAfterDSTSwitch() {
Rounding tzRounding;
tzRounding = TimeZoneRounding.builder(DateTimeUnit.HOUR_OF_DAY).timeZone(DateTimeZone.forID("Asia/Jerusalem")).build();
assertThat(tzRounding.round(time("2014-10-25T22:30:00", DateTimeZone.UTC)), equalTo(time("2014-10-25T22:00:00", DateTimeZone.UTC)));
assertThat(tzRounding.round(time("2014-10-25T23:30:00", DateTimeZone.UTC)), equalTo(time("2014-10-25T23:00:00", DateTimeZone.UTC)));
// Day interval
tzRounding = TimeZoneRounding.builder(DateTimeUnit.DAY_OF_MONTH).timeZone(DateTimeZone.forID("Asia/Jerusalem")).build();
assertThat(tzRounding.round(time("2014-11-11T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(time("2014-11-11T00:00:00", DateTimeZone.forID("Asia/Jerusalem"))));
// DST on
assertThat(tzRounding.round(time("2014-08-11T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(time("2014-08-11T00:00:00", DateTimeZone.forID("Asia/Jerusalem"))));
// Day of switching DST on -> off
assertThat(tzRounding.round(time("2014-10-26T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(time("2014-10-26T00:00:00", DateTimeZone.forID("Asia/Jerusalem"))));
// Day of switching DST off -> on
assertThat(tzRounding.round(time("2015-03-27T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(time("2015-03-27T00:00:00", DateTimeZone.forID("Asia/Jerusalem"))));
// Month interval
tzRounding = TimeZoneRounding.builder(DateTimeUnit.MONTH_OF_YEAR).timeZone(DateTimeZone.forID("Asia/Jerusalem")).build();
assertThat(tzRounding.round(time("2014-11-11T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(time("2014-11-01T00:00:00", DateTimeZone.forID("Asia/Jerusalem"))));
// DST on
assertThat(tzRounding.round(time("2014-10-10T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(time("2014-10-01T00:00:00", DateTimeZone.forID("Asia/Jerusalem"))));
// Year interval
tzRounding = TimeZoneRounding.builder(DateTimeUnit.YEAR_OF_CENTURY).timeZone(DateTimeZone.forID("Asia/Jerusalem")).build();
assertThat(tzRounding.round(time("2014-11-11T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(time("2014-01-01T00:00:00", DateTimeZone.forID("Asia/Jerusalem"))));
// Two time stamps in same year ("Double buckets" bug in 1.3.7)
tzRounding = TimeZoneRounding.builder(DateTimeUnit.YEAR_OF_CENTURY).timeZone(DateTimeZone.forID("Asia/Jerusalem")).build();
assertThat(tzRounding.round(time("2014-11-11T17:00:00", DateTimeZone.forID("Asia/Jerusalem"))),
equalTo(tzRounding.round(time("2014-08-11T17:00:00", DateTimeZone.forID("Asia/Jerusalem")))));
}
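The #9491 case above hinges on the hour after a fall-back transition being ambiguous: two distinct UTC instants share the same local wall-clock time. A small java.time illustration of why the two instants must stay in distinct hour buckets (separate from the Joda-based code under test; the class name is made up):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.temporal.ChronoUnit;

public class DstAmbiguitySketch {

    // Floor an instant to the start of its wall-clock hour in the given zone.
    // ZonedDateTime.truncatedTo retains the original offset where possible,
    // so the two UTC instants mapping to the same ambiguous local hour
    // resolve to two different bucket keys, one hour apart.
    public static Instant roundToHour(Instant t, ZoneId zone) {
        return t.atZone(zone).truncatedTo(ChronoUnit.HOURS).toInstant();
    }

    public static void main(String[] args) {
        ZoneId tz = ZoneId.of("Asia/Jerusalem"); // fell back on 2014-10-26, 02:00 -> 01:00
        // both instants read 01:30 on the local wall clock, one hour apart in UTC
        System.out.println(roundToHour(Instant.parse("2014-10-25T22:30:00Z"), tz)); // 2014-10-25T22:00:00Z
        System.out.println(roundToHour(Instant.parse("2014-10-25T23:30:00Z"), tz)); // 2014-10-25T23:00:00Z
    }
}
```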
private DateTimeUnit randomTimeUnit() {
byte id = (byte) randomIntBetween(1, 8);
return DateTimeUnit.resolve(id);
}
private String toUTCDateString(long time) {
return new DateTime(time, DateTimeZone.UTC).toString();
}

private long utc(String time) {


@ -49,7 +49,7 @@ public class IndicesQueryCacheTests extends ElasticsearchIntegrationTest {
// which used to not work well with the query cache because of the handles stream output
// see #9500
final SearchResponse r1 = client().prepareSearch("index").setSearchType(SearchType.COUNT)
.addAggregation(dateHistogram("histo").field("f").timeZone("+01:00").minDocCount(0).interval(DateHistogramInterval.MONTH)).get();
assertSearchResponse(r1);
// The cached is actually used
@ -57,7 +57,7 @@ public class IndicesQueryCacheTests extends ElasticsearchIntegrationTest {
for (int i = 0; i < 10; ++i) {
final SearchResponse r2 = client().prepareSearch("index").setSearchType(SearchType.COUNT)
.addAggregation(dateHistogram("histo").field("f").timeZone("+01:00").minDocCount(0).interval(DateHistogramInterval.MONTH)).get();
assertSearchResponse(r2);
Histogram h1 = r1.getAggregations().get("histo");
Histogram h2 = r2.getAggregations().get("histo");


@ -53,7 +53,7 @@ import static org.hamcrest.core.IsNull.notNullValue;
@ElasticsearchIntegrationTest.ClusterScope(scope=ElasticsearchIntegrationTest.Scope.SUITE)
public class DateHistogramOffsetTests extends ElasticsearchIntegrationTest {
private static final String DATE_FORMAT = "yyyy-MM-dd:hh-mm-ss";
private DateTime date(String date) {
return DateFieldMapper.Defaults.DATE_TIME_FORMATTER.parser().parseDateTime(date);


@ -24,9 +24,7 @@ import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.common.joda.Joda;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.index.mapper.core.DateFieldMapper;
import org.elasticsearch.search.aggregations.bucket.histogram.DateHistogramInterval;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram;
import org.elasticsearch.search.aggregations.bucket.histogram.Histogram.Bucket;
@ -45,6 +43,7 @@ import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.ExecutionException;
import static org.elasticsearch.common.xcontent.XContentFactory.jsonBuilder;
import static org.elasticsearch.index.query.QueryBuilders.matchAllQuery;
@ -56,6 +55,7 @@ import static org.elasticsearch.search.aggregations.AggregationBuilders.sum;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertSearchResponse;
import static org.hamcrest.Matchers.*;
import static org.hamcrest.core.IsNull.notNullValue;
import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
/**
 *
@ -134,7 +134,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@ -164,77 +163,55 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
}
@Test
public void singleValuedField_WithTimeZone() throws Exception {
SearchResponse response = client().prepareSearch("idx")
.addAggregation(dateHistogram("histo").field("date").interval(DateHistogramInterval.DAY).timeZone("+01:00")).execute()
.actionGet();
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
List<? extends Bucket> buckets = histo.getBuckets();
assertThat(buckets.size(), equalTo(6));
DateTime key = new DateTime(2012, 1, 1, 23, 0, DateTimeZone.UTC);
Histogram.Bucket bucket = buckets.get(0);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
assertThat(((DateTime) bucket.getKey()), equalTo(key));
assertThat(bucket.getDocCount(), equalTo(1l));
key = new DateTime(2012, 2, 1, 23, 0, DateTimeZone.UTC);
bucket = buckets.get(1);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
assertThat(((DateTime) bucket.getKey()), equalTo(key));
assertThat(bucket.getDocCount(), equalTo(1l));
key = new DateTime(2012, 2, 14, 23, 0, DateTimeZone.UTC);
bucket = buckets.get(2);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
assertThat(((DateTime) bucket.getKey()), equalTo(key));
assertThat(bucket.getDocCount(), equalTo(1l));
key = new DateTime(2012, 3, 1, 23, 0, DateTimeZone.UTC);
bucket = buckets.get(3);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
assertThat(((DateTime) bucket.getKey()), equalTo(key));
assertThat(bucket.getDocCount(), equalTo(1l));
key = new DateTime(2012, 3, 14, 23, 0, DateTimeZone.UTC);
bucket = buckets.get(4);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
assertThat(((DateTime) bucket.getKey()), equalTo(key.withZone(DateTimeZone.UTC)));
assertThat(bucket.getDocCount(), equalTo(1l));
key = new DateTime(2012, 3, 22, 23, 0, DateTimeZone.UTC);
bucket = buckets.get(5);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
@ -248,12 +225,11 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
.addAggregation(dateHistogram("histo")
.field("date")
.interval(DateHistogramInterval.MONTH)
.order(Histogram.Order.KEY_ASC))
.execute().actionGet();
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -278,7 +254,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -302,7 +277,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -326,7 +300,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -348,7 +321,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -407,7 +379,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -457,7 +428,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -482,7 +452,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -531,7 +500,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -555,7 +523,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -602,7 +569,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -649,7 +615,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -674,7 +639,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertThat(bucket.getDocCount(), equalTo(1l));
}
/**
* The script will change to document date values to the following:
*
@@ -696,7 +660,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -755,7 +718,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -819,7 +781,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -859,7 +820,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -905,7 +865,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -961,7 +920,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -1017,7 +975,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -1032,7 +989,6 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
@@ -1086,7 +1042,7 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
}
@Test
-public void singleValue_WithPreZone() throws Exception {
+public void singleValue_WithTimeZone() throws Exception {
prepareCreate("idx2").addMapping("type", "date", "type=date").execute().actionGet();
IndexRequestBuilder[] reqs = new IndexRequestBuilder[5];
DateTime date = date("2014-03-11T00:00:00+00:00");
@@ -1100,9 +1056,9 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
.setQuery(matchAllQuery())
.addAggregation(dateHistogram("date_histo")
.field("date")
-.preZone("-02:00")
+.timeZone("-02:00")
.interval(DateHistogramInterval.DAY)
-.format("yyyy-MM-dd"))
+.format("yyyy-MM-dd:hh-mm-ss"))
.execute().actionGet();
assertThat(response.getHits().getTotalHits(), equalTo(5l));
@@ -1111,58 +1067,14 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
List<? extends Histogram.Bucket> buckets = histo.getBuckets();
assertThat(buckets.size(), equalTo(2));
-DateTime key = new DateTime(2014, 3, 10, 0, 0, DateTimeZone.UTC);
Histogram.Bucket bucket = buckets.get(0);
assertThat(bucket, notNullValue());
-assertThat(bucket.getKeyAsString(), equalTo("2014-03-10"));
+assertThat(bucket.getKeyAsString(), equalTo("2014-03-10:02-00-00"));
assertThat(bucket.getDocCount(), equalTo(2l));
-key = new DateTime(2014, 3, 11, 0, 0, DateTimeZone.UTC);
bucket = buckets.get(1);
assertThat(bucket, notNullValue());
-assertThat(bucket.getKeyAsString(), equalTo("2014-03-11"));
+assertThat(bucket.getKeyAsString(), equalTo("2014-03-11:02-00-00"));
-assertThat(bucket.getDocCount(), equalTo(3l));
-}
-@Test
-public void singleValue_WithPreZone_WithAadjustLargeInterval() throws Exception {
-prepareCreate("idx2").addMapping("type", "date", "type=date").execute().actionGet();
-IndexRequestBuilder[] reqs = new IndexRequestBuilder[5];
-DateTime date = date("2014-03-11T00:00:00+00:00");
-for (int i = 0; i < reqs.length; i++) {
-reqs[i] = client().prepareIndex("idx2", "type", "" + i).setSource(jsonBuilder().startObject().field("date", date).endObject());
-date = date.plusHours(1);
-}
-indexRandom(true, reqs);
-SearchResponse response = client().prepareSearch("idx2")
-.setQuery(matchAllQuery())
-.addAggregation(dateHistogram("date_histo")
-.field("date")
-.preZone("-02:00")
-.interval(DateHistogramInterval.DAY)
-.preZoneAdjustLargeInterval(true)
-.format("yyyy-MM-dd'T'HH:mm:ss"))
-.execute().actionGet();
-assertThat(response.getHits().getTotalHits(), equalTo(5l));
-Histogram histo = response.getAggregations().get("date_histo");
-List<? extends Histogram.Bucket> buckets = histo.getBuckets();
-assertThat(buckets.size(), equalTo(2));
-DateTime key = new DateTime(2014, 3, 10, 2, 0, DateTimeZone.UTC);
-Histogram.Bucket bucket = buckets.get(0);
-assertThat(bucket, notNullValue());
-assertThat(bucket.getKeyAsString(), equalTo("2014-03-10T02:00:00"));
-assertThat(((DateTime) bucket.getKey()), equalTo(key));
-assertThat(bucket.getDocCount(), equalTo(2l));
-key = new DateTime(2014, 3, 11, 2, 0, DateTimeZone.UTC);
-bucket = buckets.get(1);
-assertThat(bucket, notNullValue());
-assertThat(bucket.getKeyAsString(), equalTo("2014-03-11T02:00:00"));
-assertThat(((DateTime) bucket.getKey()), equalTo(key));
assertThat(bucket.getDocCount(), equalTo(3l));
}
@@ -1229,7 +1141,8 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
DateTime boundsMaxKey = lastDataBucketKey.plusDays(boundsMaxKeyDelta);
DateTime boundsMax = boundsMaxKey.plusDays(randomIntBetween(0, interval - 1));
-// it could be that the random bounds.min we chose ended up greater than bounds.max - this should
+// it could be that the random bounds.min we chose ended up greater than
+// bounds.max - this should
// trigger an error
boolean invalidBoundsError = boundsMin.isAfter(boundsMax);
@@ -1284,7 +1197,7 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
@Test
public void singleValue_WithMultipleDateFormatsFromMapping() throws Exception {
String mappingJson = jsonBuilder().startObject().startObject("type").startObject("properties").startObject("date").field("type", "date").field("format", "dateOptionalTime||dd-MM-yyyy").endObject().endObject().endObject().endObject().string();
prepareCreate("idx2").addMapping("type", mappingJson).execute().actionGet();
IndexRequestBuilder[] reqs = new IndexRequestBuilder[5];
@@ -1316,33 +1229,32 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
public void testIssue6965() {
SearchResponse response = client().prepareSearch("idx")
-.addAggregation(dateHistogram("histo").field("date").preZone("+01:00").interval(DateHistogramInterval.MONTH).minDocCount(0))
+.addAggregation(dateHistogram("histo").field("date").timeZone("+01:00").interval(DateHistogramInterval.MONTH).minDocCount(0))
.execute().actionGet();
assertSearchResponse(response);
Histogram histo = response.getAggregations().get("histo");
assertThat(histo, notNullValue());
assertThat(histo.getName(), equalTo("histo"));
List<? extends Bucket> buckets = histo.getBuckets();
assertThat(buckets.size(), equalTo(3));
-DateTime key = new DateTime(2012, 1, 1, 0, 0, DateTimeZone.UTC);
+DateTime key = new DateTime(2011, 12, 31, 23, 0, DateTimeZone.UTC);
Histogram.Bucket bucket = buckets.get(0);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
assertThat(((DateTime) bucket.getKey()), equalTo(key));
assertThat(bucket.getDocCount(), equalTo(1l));
-key = new DateTime(2012, 2, 1, 0, 0, DateTimeZone.UTC);
+key = new DateTime(2012, 1, 31, 23, 0, DateTimeZone.UTC);
bucket = buckets.get(1);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
assertThat(((DateTime) bucket.getKey()), equalTo(key));
assertThat(bucket.getDocCount(), equalTo(2l));
-key = new DateTime(2012, 3, 1, 0, 0, DateTimeZone.UTC);
+key = new DateTime(2012, 2, 29, 23, 0, DateTimeZone.UTC);
bucket = buckets.get(2);
assertThat(bucket, notNullValue());
assertThat(bucket.getKeyAsString(), equalTo(getBucketKeyAsString(key)));
@@ -1350,6 +1262,20 @@ public class DateHistogramTests extends ElasticsearchIntegrationTest {
assertThat(bucket.getDocCount(), equalTo(3l));
}
+public void testDSTBoundaryIssue9491() throws InterruptedException, ExecutionException {
+assertAcked(client().admin().indices().prepareCreate("test9491").addMapping("type", "d", "type=date").get());
+indexRandom(true, client().prepareIndex("test9491", "type").setSource("d", "2014-10-08T13:00:00Z"),
+client().prepareIndex("test9491", "type").setSource("d", "2014-11-08T13:00:00Z"));
+ensureSearchable("test9491");
+SearchResponse response = client().prepareSearch("test9491")
+.addAggregation(dateHistogram("histo").field("d").interval(DateHistogramInterval.YEAR).timeZone("Asia/Jerusalem"))
+.execute().actionGet();
+assertSearchResponse(response);
+Histogram histo = response.getAggregations().get("histo");
+assertThat(histo.getBuckets().size(), equalTo(1));
+assertThat(histo.getBuckets().get(0).getKeyAsString(), equalTo("2013-12-31T22:00:00.000Z"));
+}
/**
* see issue #9634, negative interval in date_histogram should raise exception
*/
@@ -70,7 +70,7 @@ public class NettyTransportMultiPortTests extends ElasticsearchTestCase {
assertPortIsBound(ports[1]);
assertPortIsBound(ports[2]);
} finally {
-threadPool.shutdownNow();
+terminate(threadPool);
}
}
@@ -89,7 +89,7 @@ public class NettyTransportMultiPortTests extends ElasticsearchTestCase {
assertPortIsBound(ports[0]);
assertPortIsBound(ports[1]);
} finally {
-threadPool.shutdownNow();
+terminate(threadPool);
}
}
@@ -107,7 +107,7 @@ public class NettyTransportMultiPortTests extends ElasticsearchTestCase {
try (NettyTransport ignored = startNettyTransport(settings, threadPool)) {
assertPortIsBound(ports[0]);
} finally {
-threadPool.shutdownNow();
+terminate(threadPool);
}
}
@@ -128,7 +128,7 @@ public class NettyTransportMultiPortTests extends ElasticsearchTestCase {
assertConnectionRefused(ports[1]);
assertPortIsBound(ports[2]);
} finally {
-threadPool.shutdownNow();
+terminate(threadPool);
}
}
@@ -151,7 +151,7 @@ public class NettyTransportMultiPortTests extends ElasticsearchTestCase {
assertPortIsBound(firstNonLoopbackAddress.getHostAddress(), ports[1]);
assertConnectionRefused(ports[1]);
} finally {
-threadPool.shutdownNow();
+terminate(threadPool);
}
}