mirror of
https://github.com/elastic/elasticsearch.git
synced 2025-06-28 01:22:26 -04:00
Fix compiler warnings in :server - part 3 (#76024)
Part of #40366. Fix a number of javac issues when linting is enforced in `server/`.
This commit is contained in:
parent
263d9f2dac
commit
128a7e7744
53 changed files with 318 additions and 194 deletions
@@ -19,6 +19,9 @@ ij_continuation_indent_size = 2
 indent_size = 2
 max_line_length = 150
 
+[*.md]
+max_line_length = 80
+
 [*.groovy]
 indent_size = 4
 ij_continuation_indent_size = 4
@@ -1,7 +1,7 @@
 Contributing to elasticsearch
 =============================
 
-Elasticsearch is a free and open project and we love to receive contributions from our community — you! There are many ways to contribute, from writing tutorials or blog posts, improving the documentation, submitting bug reports and feature requests or writing code which can be incorporated into Elasticsearch itself.
+Elasticsearch is a free and open project and we love to receive contributions from our community — you! There are many ways to contribute, from writing tutorials or blog posts, improving the documentation, submitting bug reports and feature requests or writing code which can be incorporated into Elasticsearch itself.
 
 If you want to be rewarded for your contributions, sign up for the [Elastic Contributor Program](https://www.elastic.co/community/contributor). Each time you
 make a valid contribution, you’ll earn points that increase your chances of winning prizes and being recognized as a top contributor.
@@ -38,14 +38,14 @@ Contributing code and documentation changes
 -------------------------------------------
 
 If you would like to contribute a new feature or a bug fix to Elasticsearch,
-please discuss your idea first on the Github issue. If there is no Github issue
+please discuss your idea first on the GitHub issue. If there is no GitHub issue
 for your idea, please open one. It may be that somebody is already working on
 it, or that there are particular complexities that you should know about before
 starting the implementation. There are often a number of ways to fix a problem
 and it is important to find the right approach before spending time on a PR
 that cannot be merged.
 
-We add the `help wanted` label to existing Github issues for which community
+We add the `help wanted` label to existing GitHub issues for which community
 contributions are particularly welcome, and we use the `good first issue` label
 to mark issues that we think will be suitable for new contributors.
 
@@ -147,7 +147,6 @@ and then run `curl` in another window like this:
     curl -u elastic:password localhost:9200
 
-
 
 ### Importing the project into IntelliJ IDEA
 
 The minimum IntelliJ IDEA version required to import the Elasticsearch project is 2020.1
@@ -447,7 +446,6 @@ otherwise:
  * 2.0.
  */
 
-
 It is important that the only code covered by the Elastic licence is contained
 within the top-level `x-pack` directory. The build will fail its pre-commit
 checks if contributed code does not have the appropriate license headers.
@@ -456,52 +454,63 @@ checks if contributed code does not have the appropriate license headers.
 > be automatically configured to add the correct license header to new source
 > files based on the source location.
 
+### Type-checking, generics and casting
+
+You should try to write code that does not require suppressing any warnings from
+the compiler, e.g. suppressing type-checking, raw generics, and so on. However,
+this isn't always possible or practical. In such cases, you should use the
+`@SuppressWarnings` annotations to silence the compiler warning, trying to keep
+the scope of the suppression as small as possible. Where a piece of code
+requires a lot of suppressions, it may be better to apply a single suppression
+at a higher level e.g. at the method or even class level. Use your judgement.
+
+There are also cases where the compiler simply refuses to accept an assignment
+or cast of any kind, because it lacks the information to know that the types are
+OK. In such cases, you can use
+the [`Types.forciblyCast`](libs/core/src/main/java/org/elasticsearch/core/Types.java)
+utility method. As the name suggests, you can coerce any type to any other type,
+so please use it as a last resort.
+
 ### Creating A Distribution
 
 Run all build commands from within the root directory:
 
 ```sh
 cd elasticsearch/
 ```
 
 To build a darwin-tar distribution, run this command:
 
 ```sh
 ./gradlew -p distribution/archives/darwin-tar assemble
 ```
 
 You will find the distribution under:
-`./distribution/archives/darwin-tar/build/distributions/`
+
+    ./distribution/archives/darwin-tar/build/distributions/
 
 To create all build artifacts (e.g., plugins and Javadocs) as well as
 distributions in all formats, run this command:
 
 ```sh
 ./gradlew assemble
 ```
 
-> **NOTE:** Running the task above will fail if you don't have a available
+> **NOTE:** Running the task above will fail if you don't have an available
 > Docker installation.
 
 The package distributions (Debian and RPM) can be found under:
-`./distribution/packages/(deb|rpm|oss-deb|oss-rpm)/build/distributions/`
+
+    ./distribution/packages/(deb|rpm|oss-deb|oss-rpm)/build/distributions/
 
 The archive distributions (tar and zip) can be found under:
-`./distribution/archives/(darwin-tar|linux-tar|windows-zip|oss-darwin-tar|oss-linux-tar|oss-windows-zip)/build/distributions/`
+
+    ./distribution/archives/(darwin-tar|linux-tar|windows-zip|oss-darwin-tar|oss-linux-tar|oss-windows-zip)/build/distributions/
 
 ### Running The Full Test Suite
 
 Before submitting your changes, run the test suite to make sure that nothing is broken, with:
 
 ```sh
 ./gradlew check
 ```
 
 If your changes affect only the documentation, run:
 
 ```sh
 ./gradlew -p docs check
 ```
 
 For more information about testing code examples in the documentation, see
 https://github.com/elastic/elasticsearch/blob/master/docs/README.asciidoc
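The `Types.forciblyCast` helper that the new CONTRIBUTING section describes can be sketched in isolation. This is a hypothetical standalone reconstruction for illustration only; the real utility lives at `libs/core/src/main/java/org/elasticsearch/core/Types.java` and its exact signature may differ:

```java
// Hypothetical sketch of an unchecked-cast helper in the spirit of
// Types.forciblyCast; the real Elasticsearch utility may differ in detail.
final class Types {
    private Types() {}

    // The unchecked cast is confined to this one method, so call sites
    // stay free of their own @SuppressWarnings annotations.
    @SuppressWarnings("unchecked")
    static <T> T forciblyCast(Object argument) {
        return (T) argument;
    }
}

class ForciblyCastDemo {
    public static void main(String[] args) {
        // The compiler cannot prove that this Object is a String, but the
        // call site knows it is; forciblyCast bridges the gap via inference
        // of the target type T from the assignment context.
        Object value = "bottom sort value";
        String s = Types.forciblyCast(value);
        System.out.println(s.length()); // prints 17
    }
}
```

Note that nothing is checked at runtime until the result is actually used as the target type, which is why the guidance above treats it as a last resort.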
@@ -46,7 +46,7 @@ public class SearchAfterSortedDocQuery extends Query {
         this.sort = Objects.requireNonNull(sort);
         this.after = after;
         int numFields = sort.getSort().length;
-        this.fieldComparators = new FieldComparator[numFields];
+        this.fieldComparators = new FieldComparator<?>[numFields];
         this.reverseMuls = new int[numFields];
         for (int i = 0; i < numFields; i++) {
             SortField sortField = sort.getSort()[i];
@@ -15,13 +15,15 @@ import org.apache.lucene.search.TopFieldDocs;
 import org.elasticsearch.search.DocValueFormat;
 import org.elasticsearch.search.SearchSortValuesAndFormats;
 
+import static org.elasticsearch.core.Types.forciblyCast;
+
 /**
  * Utility class to keep track of the bottom doc's sort values in a distributed search.
  */
 class BottomSortValuesCollector {
     private final int topNSize;
     private final SortField[] sortFields;
-    private final FieldComparator[] comparators;
+    private final FieldComparator<?>[] comparators;
     private final int[] reverseMuls;
 
     private volatile long totalHits;
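Several hunks in this commit replace raw array creation such as `new FieldComparator[n]` with `new FieldComparator<?>[n]`. The underlying Java rule is that arrays of a concrete parameterized type cannot be created, while arrays of the unbounded wildcard type can, and the wildcard spelling is what clears the rawtypes lint warning. A small self-contained sketch, using `java.util.Comparator` as a stand-in for Lucene's `FieldComparator` purely so the example compiles on its own:

```java
import java.util.Comparator;

class GenericArrayDemo {
    public static void main(String[] args) {
        // new Comparator<String>[3] would be a compile error
        // ("generic array creation"), and new Comparator[3] is a raw type.
        // The unbounded wildcard is the legal, lint-clean spelling:
        Comparator<?>[] comparators = new Comparator<?>[3];

        // Elements of any parameterization can still be stored, since every
        // Comparator<X> is a Comparator<?>.
        comparators[0] = Comparator.<String>naturalOrder();

        System.out.println(comparators.length); // prints 3
    }
}
```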
@@ -29,7 +31,7 @@ class BottomSortValuesCollector {
 
     BottomSortValuesCollector(int topNSize, SortField[] sortFields) {
         this.topNSize = topNSize;
-        this.comparators = new FieldComparator[sortFields.length];
+        this.comparators = new FieldComparator<?>[sortFields.length];
         this.reverseMuls = new int[sortFields.length];
         this.sortFields = sortFields;
         for (int i = 0; i < sortFields.length; i++) {
@@ -90,7 +92,7 @@ class BottomSortValuesCollector {
 
     private int compareValues(Object[] v1, Object[] v2) {
         for (int i = 0; i < v1.length; i++) {
-            int cmp = reverseMuls[i] * comparators[i].compareValues(v1[i], v2[i]);
+            int cmp = reverseMuls[i] * comparators[i].compareValues(forciblyCast(v1[i]), forciblyCast(v2[i]));
             if (cmp != 0) {
                 return cmp;
             }
@@ -36,6 +36,8 @@ import java.util.stream.Collectors;
 import java.util.stream.IntStream;
 import java.util.stream.Stream;
 
+import static org.elasticsearch.core.Types.forciblyCast;
+
 /**
  * This search phase can be used as an initial search phase to pre-filter search shards based on query rewriting.
  * The queries are rewritten against the shards and based on the rewrite result shards might be able to be excluded
@@ -185,7 +187,11 @@ final class CanMatchPreFilterSearchPhase extends AbstractSearchAsyncAction<CanMa
     private static Comparator<Integer> shardComparator(GroupShardsIterator<SearchShardIterator> shardsIts,
                                                        MinAndMax<?>[] minAndMaxes,
                                                        SortOrder order) {
-        final Comparator<Integer> comparator = Comparator.comparing(index -> minAndMaxes[index], MinAndMax.getComparator(order));
+        final Comparator<Integer> comparator = Comparator.comparing(
+            index -> minAndMaxes[index],
+            forciblyCast(MinAndMax.getComparator(order))
+        );
+
         return comparator.thenComparing(index -> shardsIts.get(index));
     }
 
@@ -197,7 +203,7 @@ final class CanMatchPreFilterSearchPhase extends AbstractSearchAsyncAction<CanMa
         CanMatchSearchPhaseResults(int size) {
             super(size);
             possibleMatches = new FixedBitSet(size);
-            minAndMaxes = new MinAndMax[size];
+            minAndMaxes = new MinAndMax<?>[size];
         }
 
         @Override
@@ -110,7 +110,7 @@ public abstract class TransportBroadcastByNodeAction<Request extends BroadcastRe
 
     private Response newResponse(
         Request request,
-        AtomicReferenceArray responses,
+        AtomicReferenceArray<?> responses,
         List<NoShardAvailableActionException> unavailableShardExceptions,
         Map<String, List<ShardRouting>> nodes,
         ClusterState clusterState) {
@@ -126,6 +126,7 @@ public abstract class TransportBroadcastByNodeAction<Request extends BroadcastRe
                     exceptions.add(new DefaultShardOperationFailedException(shard.getIndexName(), shard.getId(), exception));
                 }
             } else {
+                @SuppressWarnings("unchecked")
                 NodeResponse response = (NodeResponse) responses.get(i);
                 broadcastByNodeResponses.addAll(response.results);
                 totalShards += response.getTotalShards();
@@ -401,7 +402,7 @@ public abstract class TransportBroadcastByNodeAction<Request extends BroadcastRe
         if (logger.isTraceEnabled()) {
             logger.trace("[{}] executing operation on [{}] shards", actionName, totalShards);
         }
-        final AtomicArray<Object> shardResultOrExceptions = new AtomicArray(totalShards);
+        final AtomicArray<Object> shardResultOrExceptions = new AtomicArray<>(totalShards);
 
         final AtomicInteger counter = new AtomicInteger(shards.size());
         int shardIndex = -1;
@@ -430,6 +431,7 @@ public abstract class TransportBroadcastByNodeAction<Request extends BroadcastRe
             }
         }
 
+        @SuppressWarnings("unchecked")
         private void finishHim(NodeRequest request, TransportChannel channel, Task task,
                                AtomicArray<Object> shardResultOrExceptions) {
             if (task instanceof CancellableTask) {
@@ -133,7 +133,7 @@ public abstract class TransportBroadcastReplicationAction<Request extends Broadc
 
     protected abstract ShardRequest newShardRequest(Request request, ShardId shardId);
 
-    private void finishAndNotifyListener(ActionListener listener, CopyOnWriteArrayList<ShardResponse> shardsResponses) {
+    private void finishAndNotifyListener(ActionListener<Response> listener, CopyOnWriteArrayList<ShardResponse> shardsResponses) {
         logger.trace("{}: got all shard responses", actionName);
         int successfulShards = 0;
         int failedShards = 0;
@@ -159,6 +159,6 @@ public abstract class TransportBroadcastReplicationAction<Request extends Broadc
         listener.onResponse(newResponse(successfulShards, failedShards, totalNumCopies, shardFailures));
     }
 
-    protected abstract BroadcastResponse newResponse(int successfulShards, int failedShards, int totalNumCopies,
+    protected abstract Response newResponse(int successfulShards, int failedShards, int totalNumCopies,
                                                      List<DefaultShardOperationFailedException> shardFailures);
 }
@@ -95,7 +95,7 @@ public class MultiOrdinals extends Ordinals {
         if (multiValued) {
             return new MultiDocs(this, values);
         } else {
-            return (SortedSetDocValues) DocValues.singleton(new SingleDocs(this, values));
+            return DocValues.singleton(new SingleDocs(this, values));
         }
     }
 
@@ -49,7 +49,7 @@ public class SinglePackedOrdinals extends Ordinals {
 
     @Override
     public SortedSetDocValues ordinals(ValuesHolder values) {
-        return (SortedSetDocValues) DocValues.singleton(new Docs(this, values));
+        return DocValues.singleton(new Docs(this, values));
     }
 
     private static class Docs extends AbstractSortedDocValues {
@@ -126,7 +126,7 @@ public abstract class AbstractGeometryQueryBuilder<QB extends AbstractGeometryQu
         this.fieldName = fieldName;
         this.shape = null;
         this.supplier = supplier;
-        this.indexedShapeId = indexedShapeId;;
+        this.indexedShapeId = indexedShapeId;
     }
 
     /**
@@ -190,6 +190,7 @@ public abstract class AbstractGeometryQueryBuilder<QB extends AbstractGeometryQu
      * @param geometry the geometry
      * @return this
      */
+    @SuppressWarnings("unchecked")
     public QB shape(Geometry geometry) {
         if (geometry == null) {
             throw new IllegalArgumentException("No geometry defined");
@@ -218,6 +219,7 @@ public abstract class AbstractGeometryQueryBuilder<QB extends AbstractGeometryQu
      * @param indexedShapeIndex Name of the index where the indexed Shape is
      * @return this
      */
+    @SuppressWarnings("unchecked")
     public QB indexedShapeIndex(String indexedShapeIndex) {
         this.indexedShapeIndex = indexedShapeIndex;
         return (QB)this;
@@ -237,6 +239,7 @@ public abstract class AbstractGeometryQueryBuilder<QB extends AbstractGeometryQu
      * @param indexedShapePath Path of the field where the Shape itself is defined
      * @return this
      */
+    @SuppressWarnings("unchecked")
     public QB indexedShapePath(String indexedShapePath) {
         this.indexedShapePath = indexedShapePath;
         return (QB)this;
@@ -255,6 +258,7 @@ public abstract class AbstractGeometryQueryBuilder<QB extends AbstractGeometryQu
      * @param indexedShapeRouting indexed shape routing
      * @return this
      */
+    @SuppressWarnings("unchecked")
     public QB indexedShapeRouting(String indexedShapeRouting) {
         this.indexedShapeRouting = indexedShapeRouting;
         return (QB)this;
@@ -275,6 +279,7 @@ public abstract class AbstractGeometryQueryBuilder<QB extends AbstractGeometryQu
      * @param relation relation of the shapes
      * @return this
      */
+    @SuppressWarnings("unchecked")
     public QB relation(ShapeRelation relation) {
         if (relation == null) {
             throw new IllegalArgumentException("No Shape Relation defined");
@@ -427,6 +432,7 @@ public abstract class AbstractGeometryQueryBuilder<QB extends AbstractGeometryQu
     }
 
     @Override
+    @SuppressWarnings("rawtypes")
     protected boolean doEquals(AbstractGeometryQueryBuilder other) {
         return Objects.equals(fieldName, other.fieldName)
             && Objects.equals(indexedShapeId, other.indexedShapeId)
@@ -206,6 +206,7 @@ public class MoreLikeThisQueryBuilder extends AbstractQueryBuilder<MoreLikeThisQ
         /**
          * Read from a stream.
          */
+        @SuppressWarnings("unchecked")
         Item(StreamInput in) throws IOException {
             index = in.readOptionalString();
             if (in.getVersion().before(Version.V_8_0_0)) {
@@ -644,7 +644,7 @@ public final class QueryBuilders {
      * @deprecated use {@link #geoShapeQuery(String, Geometry)} instead
      */
     @Deprecated
-    public static GeoShapeQueryBuilder geoShapeQuery(String name, ShapeBuilder shape) throws IOException {
+    public static GeoShapeQueryBuilder geoShapeQuery(String name, ShapeBuilder<?, ?, ?> shape) throws IOException {
         return new GeoShapeQueryBuilder(name, shape.buildGeometry());
     }
 
@@ -668,7 +668,7 @@ public final class QueryBuilders {
      * @deprecated use {@link #geoIntersectionQuery(String, Geometry)} instead
      */
     @Deprecated
-    public static GeoShapeQueryBuilder geoIntersectionQuery(String name, ShapeBuilder shape) throws IOException {
+    public static GeoShapeQueryBuilder geoIntersectionQuery(String name, ShapeBuilder<?, ?, ?> shape) throws IOException {
         GeoShapeQueryBuilder builder = geoShapeQuery(name, shape);
         builder.relation(ShapeRelation.INTERSECTS);
         return builder;
@@ -696,7 +696,7 @@ public final class QueryBuilders {
      * @deprecated use {@link #geoWithinQuery(String, Geometry)} instead
      */
     @Deprecated
-    public static GeoShapeQueryBuilder geoWithinQuery(String name, ShapeBuilder shape) throws IOException {
+    public static GeoShapeQueryBuilder geoWithinQuery(String name, ShapeBuilder<?, ?, ?> shape) throws IOException {
         GeoShapeQueryBuilder builder = geoShapeQuery(name, shape);
         builder.relation(ShapeRelation.WITHIN);
         return builder;
@@ -724,7 +724,7 @@ public final class QueryBuilders {
      * @deprecated use {@link #geoDisjointQuery(String, Geometry)} instead
      */
     @Deprecated
-    public static GeoShapeQueryBuilder geoDisjointQuery(String name, ShapeBuilder shape) throws IOException {
+    public static GeoShapeQueryBuilder geoDisjointQuery(String name, ShapeBuilder<?, ?, ?> shape) throws IOException {
         GeoShapeQueryBuilder builder = geoShapeQuery(name, shape);
         builder.relation(ShapeRelation.DISJOINT);
         return builder;
@@ -390,6 +390,7 @@ public class TermsQueryBuilder extends AbstractQueryBuilder<TermsQueryBuilder> {
         return this;
     }
 
+    @SuppressWarnings("rawtypes")
     private abstract static class Values extends AbstractCollection implements Writeable {
 
         private static Values readFrom(StreamInput in) throws IOException {
@@ -477,6 +478,7 @@ public class TermsQueryBuilder extends AbstractQueryBuilder<TermsQueryBuilder> {
      * When users send a query contain a lot of terms, A {@link BytesReference} can help
      * gc and reduce the cost of {@link #doWriteTo}, which can be slow for lots of terms.
      */
+    @SuppressWarnings("rawtypes")
     private static class BinaryValues extends Values {
 
         private final BytesReference valueRef;
@@ -577,6 +579,7 @@ public class TermsQueryBuilder extends AbstractQueryBuilder<TermsQueryBuilder> {
      *
      * TODO: remove in 9.0.0
      */
+    @SuppressWarnings("rawtypes")
     private static class ListValues extends Values {
 
         private final List<?> values;
@@ -611,6 +614,7 @@ public class TermsQueryBuilder extends AbstractQueryBuilder<TermsQueryBuilder> {
         }
 
         @Override
+        @SuppressWarnings("unchecked")
         public Object[] toArray(IntFunction generator) {
             return values.toArray(generator);
         }
@@ -69,7 +69,7 @@ public class ScoreFunctionBuilders {
     }
 
     public static WeightBuilder weightFactorFunction(float weight) {
-        return (WeightBuilder)(new WeightBuilder().setWeight(weight));
+        return new WeightBuilder().setWeight(weight);
     }
 
     public static FieldValueFactorFunctionBuilder fieldValueFactorFunction(String fieldName) {
@@ -367,11 +367,10 @@ public class BulkByScrollTask extends CancellableTask {
         FIELDS_SET.add(SLICES_FIELD);
     }
 
-    @SuppressWarnings("unchecked")
     static final ConstructingObjectParser<Tuple<Long, Long>, Void> RETRIES_PARSER = new ConstructingObjectParser<>(
         "bulk_by_scroll_task_status_retries",
         true,
-        a -> new Tuple(a[0], a[1])
+        a -> new Tuple<>(((Long) a[0]), (Long) a[1])
     );
     static {
         RETRIES_PARSER.declareLong(constructorArg(), new ParseField(RETRIES_BULK_FIELD));
@@ -163,6 +163,7 @@ public class RestActions {
      * @return Never {@code null}.
      * @throws IOException if building the response causes an issue
      */
+    @SuppressWarnings({"rawtypes", "unchecked"})
     public static <NodesResponse extends BaseNodesResponse & ToXContent> BytesRestResponse nodesResponse(final XContentBuilder builder,
                                                                                                          final Params params,
                                                                                                          final NodesResponse response)
@@ -214,7 +215,7 @@ public class RestActions {
      * });
      * </code>
      */
-    public static class NodesResponseRestListener<NodesResponse extends BaseNodesResponse & ToXContent>
+    public static class NodesResponseRestListener<NodesResponse extends BaseNodesResponse<?> & ToXContent>
         extends RestBuilderListener<NodesResponse> {
 
         public NodesResponseRestListener(RestChannel channel) {
@@ -40,6 +40,7 @@ public class RestClusterUpdateSettingsAction extends BaseRestHandler {
     }
 
     @Override
+    @SuppressWarnings("unchecked")
     public RestChannelConsumer prepareRequest(final RestRequest request, final NodeClient client) throws IOException {
         final ClusterUpdateSettingsRequest clusterUpdateSettingsRequest = Requests.clusterUpdateSettingsRequest();
         clusterUpdateSettingsRequest.timeout(request.paramAsTime("timeout", clusterUpdateSettingsRequest.timeout()));
@@ -50,10 +51,10 @@ public class RestClusterUpdateSettingsAction extends BaseRestHandler {
             source = parser.map();
         }
         if (source.containsKey(TRANSIENT)) {
-            clusterUpdateSettingsRequest.transientSettings((Map) source.get(TRANSIENT));
+            clusterUpdateSettingsRequest.transientSettings((Map<String, ?>) source.get(TRANSIENT));
         }
         if (source.containsKey(PERSISTENT)) {
-            clusterUpdateSettingsRequest.persistentSettings((Map) source.get(PERSISTENT));
+            clusterUpdateSettingsRequest.persistentSettings((Map<String, ?>) source.get(PERSISTENT));
         }
 
         return channel -> client.admin().cluster().updateSettings(clusterUpdateSettingsRequest, new RestToXContentListener<>(channel));
@@ -419,6 +419,7 @@ public class RestTable {
             this.ordering = ordering;
         }
 
+        @SuppressWarnings("unchecked")
        private int compareCell(Object o1, Object o2) {
             if (o1 == null && o2 == null) {
                 return 0;
@@ -58,8 +58,8 @@ public class MinAndMax<T extends Comparable<? super T>> implements Writeable {
     /**
      * Return a {@link Comparator} for {@link MinAndMax} values according to the provided {@link SortOrder}.
      */
-    public static Comparator<MinAndMax<?>> getComparator(SortOrder order) {
-        Comparator<MinAndMax<?>> cmp = order == SortOrder.ASC ?
+    public static <T extends Comparable<? super T>> Comparator<MinAndMax<T>> getComparator(SortOrder order) {
+        Comparator<MinAndMax<T>> cmp = order == SortOrder.ASC ?
             Comparator.comparing(MinAndMax::getMin) : Comparator.comparing(MinAndMax::getMax);
         if (order == SortOrder.DESC) {
             cmp = cmp.reversed();
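The hunk above makes `MinAndMax.getComparator` generic in `T`, which is why call sites that only hold `MinAndMax<?>` values (see the `CanMatchPreFilterSearchPhase` hunks earlier in this commit) now wrap the result in `forciblyCast`. The same pattern can be sketched with plain JDK types; all names below are hypothetical stand-ins for illustration, not the actual Elasticsearch API:

```java
import java.util.Arrays;
import java.util.Comparator;

class ComparatorCastDemo {
    // A generic factory, analogous in shape to the new getComparator signature.
    static <T extends Comparable<? super T>> Comparator<T> ascending() {
        return Comparator.naturalOrder();
    }

    // Confine the unchecked cast to one place, in the spirit of Types.forciblyCast.
    @SuppressWarnings("unchecked")
    static <T> T forciblyCast(Object o) {
        return (T) o;
    }

    public static void main(String[] args) {
        // With only wildcard-typed values, the generic Comparator<T> cannot be
        // applied directly; the cast bridges the wildcard to a usable type.
        Comparable<?>[] values = { 2, 1, 3 };
        Comparator<Object> cmp = forciblyCast(ascending());
        Arrays.sort(values, cmp);
        System.out.println(values[0]); // prints 1
    }
}
```

The design trade-off is the one the CONTRIBUTING section describes: the cast is unsound in general, but keeping it inside a single named helper makes call sites warning-free and easy to audit.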
@@ -65,6 +65,7 @@ import java.util.stream.Collectors;
 import java.util.stream.IntStream;
 
 import static org.elasticsearch.action.search.SearchAsyncActionTests.getShardsIter;
+import static org.elasticsearch.core.Types.forciblyCast;
 import static org.hamcrest.Matchers.equalTo;
 import static org.mockito.Mockito.mock;
 
@@ -353,7 +354,7 @@ public class CanMatchPreFilterSearchPhaseTests extends ESTestCase {
         latch.await();
         ShardId[] expected = IntStream.range(0, shardIds.size())
             .boxed()
-            .sorted(Comparator.comparing(minAndMaxes::get, MinAndMax.getComparator(order)).thenComparing(shardIds::get))
+            .sorted(Comparator.comparing(minAndMaxes::get, forciblyCast(MinAndMax.getComparator(order))).thenComparing(shardIds::get))
             .map(shardIds::get)
             .toArray(ShardId[]::new);
         if (shardToSkip.size() == expected.length) {
@@ -64,7 +64,7 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
 
     public void testEmptyTaskListProducesSameClusterState() throws Exception {
         final ClusterState clusterState = stateWithNoShard();
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, Collections.emptyList());
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, Collections.emptyList());
         assertSame(clusterState, result.resultingState);
     }
 
@@ -77,11 +77,11 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
             "test",
             ShardLongFieldRange.UNKNOWN);
 
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, singletonList(entry));
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, singletonList(entry));
         assertSame(clusterState, result.resultingState);
         assertThat(result.executionResults.size(), equalTo(1));
         assertThat(result.executionResults.containsKey(entry), is(true));
-        assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(entry)).isSuccess(), is(true));
+        assertThat(result.executionResults.get(entry).isSuccess(), is(true));
     }
 
     public void testNonExistentShardsAreMarkedAsSuccessful() throws Exception {
@@ -109,12 +109,12 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
 
         ).collect(Collectors.toList());
 
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, tasks);
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, tasks);
         assertSame(clusterState, result.resultingState);
         assertThat(result.executionResults.size(), equalTo(tasks.size()));
         tasks.forEach(task -> {
             assertThat(result.executionResults.containsKey(task), is(true));
-            assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
+            assertThat(result.executionResults.get(task).isSuccess(), is(true));
         });
     }
 
@@ -137,12 +137,12 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
             return new StartedShardEntry(shardId, allocationId, primaryTerm, "test", ShardLongFieldRange.UNKNOWN);
         }).collect(Collectors.toList());
 
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, tasks);
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, tasks);
         assertSame(clusterState, result.resultingState);
         assertThat(result.executionResults.size(), equalTo(tasks.size()));
         tasks.forEach(task -> {
             assertThat(result.executionResults.containsKey(task), is(true));
-            assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
+            assertThat(result.executionResults.get(task).isSuccess(), is(true));
         });
     }
 
@@ -163,12 +163,12 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
             final String replicaAllocationId = replicaShard.allocationId().getId();
             tasks.add(new StartedShardEntry(shardId, replicaAllocationId, primaryTerm, "test", ShardLongFieldRange.UNKNOWN));
         }
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, tasks);
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, tasks);
         assertNotSame(clusterState, result.resultingState);
         assertThat(result.executionResults.size(), equalTo(tasks.size()));
         tasks.forEach(task -> {
             assertThat(result.executionResults.containsKey(task), is(true));
-            assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
+            assertThat(result.executionResults.get(task).isSuccess(), is(true));
 
             final IndexShardRoutingTable shardRoutingTable = result.resultingState.routingTable().shardRoutingTable(task.shardId);
             assertThat(shardRoutingTable.getByAllocationId(task.allocationId).state(), is(ShardRoutingState.STARTED));
@@ -189,12 +189,12 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
             .mapToObj(i -> new StartedShardEntry(shardId, allocationId, primaryTerm, "test", ShardLongFieldRange.UNKNOWN))
             .collect(Collectors.toList());
 
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, tasks);
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, tasks);
         assertNotSame(clusterState, result.resultingState);
         assertThat(result.executionResults.size(), equalTo(tasks.size()));
         tasks.forEach(task -> {
             assertThat(result.executionResults.containsKey(task), is(true));
-            assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
+            assertThat(result.executionResults.get(task).isSuccess(), is(true));
 
             final IndexShardRoutingTable shardRoutingTable = result.resultingState.routingTable().shardRoutingTable(task.shardId);
             assertThat(shardRoutingTable.getByAllocationId(task.allocationId).state(), is(ShardRoutingState.STARTED));
@@ -224,11 +224,11 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
             "primary terms does not match on primary",
             ShardLongFieldRange.UNKNOWN);
 
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, singletonList(task));
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, singletonList(task));
         assertSame(clusterState, result.resultingState);
         assertThat(result.executionResults.size(), equalTo(1));
         assertThat(result.executionResults.containsKey(task), is(true));
-        assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
+        assertThat(result.executionResults.get(task).isSuccess(), is(true));
         IndexShardRoutingTable shardRoutingTable = result.resultingState.routingTable().shardRoutingTable(task.shardId);
         assertThat(shardRoutingTable.getByAllocationId(task.allocationId).state(), is(ShardRoutingState.INITIALIZING));
         assertSame(clusterState, result.resultingState);
@@ -237,11 +237,11 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
         final StartedShardEntry task = new StartedShardEntry(
             shardId, primaryAllocationId, primaryTerm, "primary terms match on primary", ShardLongFieldRange.UNKNOWN);
 
-        final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, singletonList(task));
+        final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, singletonList(task));
         assertNotSame(clusterState, result.resultingState);
         assertThat(result.executionResults.size(), equalTo(1));
         assertThat(result.executionResults.containsKey(task), is(true));
-        assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
+        assertThat(result.executionResults.get(task).isSuccess(), is(true));
         IndexShardRoutingTable shardRoutingTable = result.resultingState.routingTable().shardRoutingTable(task.shardId);
         assertThat(shardRoutingTable.getByAllocationId(task.allocationId).state(), is(ShardRoutingState.STARTED));
         assertNotSame(clusterState, result.resultingState);
@ -255,11 +255,11 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
|
|||
final StartedShardEntry task = new StartedShardEntry(
|
||||
shardId, replicaAllocationId, replicaPrimaryTerm, "test on replica", ShardLongFieldRange.UNKNOWN);
|
||||
|
||||
final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, singletonList(task));
|
||||
final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, singletonList(task));
|
||||
assertNotSame(clusterState, result.resultingState);
|
||||
assertThat(result.executionResults.size(), equalTo(1));
|
||||
assertThat(result.executionResults.containsKey(task), is(true));
|
||||
assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
|
||||
assertThat(result.executionResults.get(task).isSuccess(), is(true));
|
||||
IndexShardRoutingTable shardRoutingTable = result.resultingState.routingTable().shardRoutingTable(task.shardId);
|
||||
assertThat(shardRoutingTable.getByAllocationId(task.allocationId).state(), is(ShardRoutingState.STARTED));
|
||||
assertNotSame(clusterState, result.resultingState);
|
||||
|
@ -288,12 +288,12 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
|
|||
final String replicaAllocationId = replicaShard.allocationId().getId();
|
||||
tasks.add(new StartedShardEntry(shardId, replicaAllocationId, primaryTerm, "test", shardTimestampRange));
|
||||
}
|
||||
final ClusterStateTaskExecutor.ClusterTasksResult result = executeTasks(clusterState, tasks);
|
||||
final ClusterStateTaskExecutor.ClusterTasksResult<?> result = executeTasks(clusterState, tasks);
|
||||
assertNotSame(clusterState, result.resultingState);
|
||||
assertThat(result.executionResults.size(), equalTo(tasks.size()));
|
||||
tasks.forEach(task -> {
|
||||
assertThat(result.executionResults.containsKey(task), is(true));
|
||||
assertThat(((ClusterStateTaskExecutor.TaskResult) result.executionResults.get(task)).isSuccess(), is(true));
|
||||
assertThat(result.executionResults.get(task).isSuccess(), is(true));
|
||||
|
||||
final IndexShardRoutingTable shardRoutingTable = result.resultingState.routingTable().shardRoutingTable(task.shardId);
|
||||
assertThat(shardRoutingTable.getByAllocationId(task.allocationId).state(), is(ShardRoutingState.STARTED));
|
||||
|
@ -311,7 +311,7 @@ public class ShardStartedClusterStateTaskExecutorTests extends ESAllocationTestC
|
|||
});
|
||||
}
|
||||
|
||||
private ClusterStateTaskExecutor.ClusterTasksResult executeTasks(final ClusterState state,
|
||||
private ClusterStateTaskExecutor.ClusterTasksResult<?> executeTasks(final ClusterState state,
|
||||
final List<StartedShardEntry> tasks) throws Exception {
|
||||
final ClusterStateTaskExecutor.ClusterTasksResult<StartedShardEntry> result = executor.execute(state, tasks);
|
||||
assertThat(result, notNullValue());
|
||||
|
|
|
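The hunks above parameterize the result type (`ClusterTasksResult<?>`) so that `executionResults.get(task)` no longer needs a raw-type cast to `TaskResult`. A minimal sketch of the pattern, using illustrative stand-in names rather than the Elasticsearch API:

```java
import java.util.HashMap;
import java.util.Map;

public class TypedResults {
    // Hypothetical stand-in for ClusterStateTaskExecutor.TaskResult
    record TaskResult(boolean isSuccess) {}

    // Raw-ish version: values are typed as Object, so callers must cast
    static Map<String, Object> rawResults() {
        Map<String, Object> m = new HashMap<>();
        m.put("task-1", new TaskResult(true));
        return m;
    }

    // Parameterized version: the map's declared value type carries the information
    static Map<String, TaskResult> typedResults() {
        Map<String, TaskResult> m = new HashMap<>();
        m.put("task-1", new TaskResult(true));
        return m;
    }

    public static void main(String[] args) {
        boolean rawOk = ((TaskResult) rawResults().get("task-1")).isSuccess(); // cast needed
        boolean typedOk = typedResults().get("task-1").isSuccess();            // no cast
        System.out.println(rawOk && typedOk); // prints "true"
    }
}
```

With the parameterized type, the compiler checks the element type at the call site instead of deferring to a runtime cast.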
@@ -1606,7 +1606,7 @@ public class IndexNameExpressionResolverTests extends ESTestCase {
             .indexAliases(state, "test-0", x -> true, true, new HashSet<>(Arrays.asList("test-0", "test-alias")));
         Arrays.sort(strings);
         assertArrayEquals(new String[] {"test-alias"}, strings);
-        DocWriteRequest request = randomFrom(new IndexRequest("test-alias"),
+        DocWriteRequest<?> request = randomFrom(new IndexRequest("test-alias"),
             new UpdateRequest("test-alias", "_id"), new DeleteRequest("test-alias"));
         IllegalArgumentException exception = expectThrows(IllegalArgumentException.class,
             () -> indexNameExpressionResolver.concreteWriteIndex(state, request.indicesOptions(), request.indices()[0], false, false));

@@ -1626,7 +1626,7 @@ public class IndexNameExpressionResolverTests extends ESTestCase {
             .indexAliases(state, "test-0", x -> true, true, new HashSet<>(Arrays.asList("test-0", "test-1", "test-alias")));
         Arrays.sort(strings);
         assertArrayEquals(new String[] {"test-alias"}, strings);
-        DocWriteRequest request = randomFrom(new IndexRequest("test-alias"),
+        DocWriteRequest<?> request = randomFrom(new IndexRequest("test-alias"),
             new UpdateRequest("test-alias", "_id"), new DeleteRequest("test-alias"));
         IllegalArgumentException exception = expectThrows(IllegalArgumentException.class,
             () -> indexNameExpressionResolver.concreteWriteIndex(state, request.indicesOptions(), request.indices()[0], false, false));

@@ -546,6 +546,7 @@ public class MetadataCreateIndexServiceTests extends ESTestCase {
         }));
     }

+    @SuppressWarnings("unchecked")
     public void testParseMappingsAppliesDataFromTemplateAndRequest() throws Exception {
         IndexTemplateMetadata templateMetadata = addMatchingTemplate(templateBuilder -> {
             templateBuilder.putAlias(AliasMetadata.builder("alias1"));

@@ -608,6 +609,7 @@ public class MetadataCreateIndexServiceTests extends ESTestCase {
         assertEquals("date-math-based-2021-01-01", aliasMetadata.get(0).alias() );
     }

+    @SuppressWarnings("unchecked")
     public void testRequestDataHavePriorityOverTemplateData() throws Exception {
         CompressedXContent templateMapping = createMapping("test", "text");
         CompressedXContent reqMapping = createMapping("test", "keyword");

@@ -922,7 +922,7 @@ public class MetadataTests extends ESTestCase {
         "}";

     public void testTransientSettingsOverridePersistentSettings() {
-        final Setting setting = Setting.simpleString("key");
+        final Setting<String> setting = Setting.simpleString("key");
         final Metadata metadata = Metadata.builder()
             .persistentSettings(Settings.builder().put(setting.getKey(), "persistent-value").build())
             .transientSettings(Settings.builder().put(setting.getKey(), "transient-value").build()).build();

@@ -48,12 +48,15 @@ public class DiffableTests extends ESTestCase {
             }

             @Override
-            protected MapDiff diff(Map<Integer, TestDiffable> before, Map<Integer, TestDiffable> after) {
+            protected MapDiff<Integer, TestDiffable, Map<Integer, TestDiffable>> diff(
+                Map<Integer, TestDiffable> before,
+                Map<Integer, TestDiffable> after
+            ) {
                 return DiffableUtils.diff(before, after, keySerializer);
             }

             @Override
-            protected MapDiff readDiff(StreamInput in) throws IOException {
+            protected MapDiff<Integer, TestDiffable, Map<Integer, TestDiffable>> readDiff(StreamInput in) throws IOException {
                 return useProtoForDiffableSerialization
                     ? DiffableUtils.readJdkMapDiff(in, keySerializer, TestDiffable::readFrom, TestDiffable::readDiffFrom)
                     : DiffableUtils.readJdkMapDiff(in, keySerializer, diffableValueSerializer());

@@ -72,12 +75,12 @@ public class DiffableTests extends ESTestCase {
             }

             @Override
-            protected MapDiff diff(Map<Integer, String> before, Map<Integer, String> after) {
+            protected MapDiff<Integer, String, Map<Integer, String>> diff(Map<Integer, String> before, Map<Integer, String> after) {
                 return DiffableUtils.diff(before, after, keySerializer, nonDiffableValueSerializer());
             }

             @Override
-            protected MapDiff readDiff(StreamInput in) throws IOException {
+            protected MapDiff<Integer, String, Map<Integer, String>> readDiff(StreamInput in) throws IOException {
                 return DiffableUtils.readJdkMapDiff(in, keySerializer, nonDiffableValueSerializer());
             }
         }.execute();

@@ -96,12 +99,15 @@ public class DiffableTests extends ESTestCase {
             }

             @Override
-            protected MapDiff diff(ImmutableOpenMap<Integer, TestDiffable> before, ImmutableOpenMap<Integer, TestDiffable> after) {
+            protected MapDiff<Integer, TestDiffable, ImmutableOpenMap<Integer, TestDiffable>> diff(
+                ImmutableOpenMap<Integer, TestDiffable> before,
+                ImmutableOpenMap<Integer, TestDiffable> after
+            ) {
                 return DiffableUtils.diff(before, after, keySerializer);
             }

             @Override
-            protected MapDiff readDiff(StreamInput in) throws IOException {
+            protected MapDiff<Integer, TestDiffable, ImmutableOpenMap<Integer, TestDiffable>> readDiff(StreamInput in) throws IOException {
                 return useProtoForDiffableSerialization
                     ? DiffableUtils.readImmutableOpenMapDiff(in, keySerializer,
                         new DiffableUtils.DiffableValueReader<>(TestDiffable::readFrom, TestDiffable::readDiffFrom))

@@ -121,12 +127,15 @@ public class DiffableTests extends ESTestCase {
             }

             @Override
-            protected MapDiff diff(ImmutableOpenMap<Integer, String> before, ImmutableOpenMap<Integer, String> after) {
+            protected MapDiff<Integer, String, ImmutableOpenMap<Integer, String>> diff(
+                ImmutableOpenMap<Integer, String> before,
+                ImmutableOpenMap<Integer, String> after
+            ) {
                 return DiffableUtils.diff(before, after, keySerializer, nonDiffableValueSerializer());
             }

             @Override
-            protected MapDiff readDiff(StreamInput in) throws IOException {
+            protected MapDiff<Integer, String, ImmutableOpenMap<Integer, String>> readDiff(StreamInput in) throws IOException {
                 return DiffableUtils.readImmutableOpenMapDiff(in, keySerializer, nonDiffableValueSerializer());
             }
         }.execute();

@@ -145,12 +154,15 @@ public class DiffableTests extends ESTestCase {
             }

             @Override
-            protected MapDiff diff(ImmutableOpenIntMap<TestDiffable> before, ImmutableOpenIntMap<TestDiffable> after) {
+            protected MapDiff<Integer, TestDiffable, ImmutableOpenIntMap<TestDiffable>> diff(
+                ImmutableOpenIntMap<TestDiffable> before,
+                ImmutableOpenIntMap<TestDiffable> after
+            ) {
                 return DiffableUtils.diff(before, after, keySerializer);
             }

             @Override
-            protected MapDiff readDiff(StreamInput in) throws IOException {
+            protected MapDiff<Integer, TestDiffable, ImmutableOpenIntMap<TestDiffable>> readDiff(StreamInput in) throws IOException {
                 return useProtoForDiffableSerialization
                     ? DiffableUtils.readImmutableOpenIntMapDiff(in, keySerializer, TestDiffable::readFrom, TestDiffable::readDiffFrom)
                     : DiffableUtils.readImmutableOpenIntMapDiff(in, keySerializer, diffableValueSerializer());

@@ -169,12 +181,15 @@ public class DiffableTests extends ESTestCase {
             }

             @Override
-            protected MapDiff diff(ImmutableOpenIntMap<String> before, ImmutableOpenIntMap<String> after) {
+            protected MapDiff<Integer, String, ImmutableOpenIntMap<String>> diff(
+                ImmutableOpenIntMap<String> before,
+                ImmutableOpenIntMap<String> after
+            ) {
                 return DiffableUtils.diff(before, after, keySerializer, nonDiffableValueSerializer());
             }

             @Override
-            protected MapDiff readDiff(StreamInput in) throws IOException {
+            protected MapDiff<Integer, String, ImmutableOpenIntMap<String>> readDiff(StreamInput in) throws IOException {
                 return DiffableUtils.readImmutableOpenIntMapDiff(in, keySerializer, nonDiffableValueSerializer());
             }
         }.execute();

@@ -279,7 +294,7 @@ public class DiffableTests extends ESTestCase {
         MapDiff<Integer, V, T> diffMap = diff(beforeMap, afterMap);

         // check properties of diffMap
-        assertThat(new HashSet(diffMap.getDeletes()), equalTo(keysToRemove));
+        assertThat(new HashSet<>(diffMap.getDeletes()), equalTo(keysToRemove));
         if (diffableValues()) {
             assertThat(diffMap.getDiffs().keySet(), equalTo(keysToOverride));
             for (Integer key : keysToOverride) {

@@ -329,7 +344,7 @@ public class DiffableTests extends ESTestCase {
     abstract class JdkMapDriver<V> extends MapDriver<Map<Integer, V>, V> {

         @Override
-        protected Map<Integer, V> createMap(Map values) {
+        protected Map<Integer, V> createMap(Map<Integer, V> values) {
             return values;
         }

@@ -347,7 +362,7 @@ public class DiffableTests extends ESTestCase {
     abstract class ImmutableOpenMapDriver<V> extends MapDriver<ImmutableOpenMap<Integer, V>, V> {

         @Override
-        protected ImmutableOpenMap<Integer, V> createMap(Map values) {
+        protected ImmutableOpenMap<Integer, V> createMap(Map<Integer, V> values) {
             return ImmutableOpenMap.<Integer, V>builder().putAll(values).build();
         }

@@ -366,7 +381,7 @@ public class DiffableTests extends ESTestCase {
     abstract class ImmutableOpenIntMapDriver<V> extends MapDriver<ImmutableOpenIntMap<V>, V> {

         @Override
-        protected ImmutableOpenIntMap<V> createMap(Map values) {
+        protected ImmutableOpenIntMap<V> createMap(Map<Integer, V> values) {
             return ImmutableOpenIntMap.<V>builder().putAll(values).build();
         }

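The `DiffableTests` hunks above replace raw `MapDiff` return types with the fully parameterized `MapDiff<K, V, M>`, so the overriding methods stay statically typed and no unchecked conversion happens at the call site. A simplified sketch of the same idea, with a hypothetical `MapDiff` stand-in rather than the real `DiffableUtils` types:

```java
import java.util.Map;

public class ParameterizedOverride {
    // Hypothetical stand-in for DiffableUtils.MapDiff<K, V, M>
    interface MapDiff<K, V, M extends Map<K, V>> {
        M apply(M part);
    }

    abstract static class MapDriver<K, V, M extends Map<K, V>> {
        // A raw 'MapDiff' return type here would compile, but every use site
        // would get an unchecked warning; the full parameterization keeps
        // apply()'s result typed as M.
        abstract MapDiff<K, V, M> diff(M before, M after);
    }

    static class IdentityDriver extends MapDriver<Integer, String, Map<Integer, String>> {
        @Override
        MapDiff<Integer, String, Map<Integer, String>> diff(Map<Integer, String> before, Map<Integer, String> after) {
            return part -> after; // trivially maps anything onto the "after" state
        }
    }

    public static void main(String[] args) {
        Map<Integer, String> before = Map.of(1, "a");
        Map<Integer, String> after = Map.of(1, "b");
        // No cast needed: the diff's result is already Map<Integer, String>
        Map<Integer, String> result = new IdentityDriver().diff(before, after).apply(before);
        System.out.println(result.get(1)); // prints "b"
    }
}
```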
@@ -53,10 +53,11 @@ public class TaskBatcherTests extends TaskExecutorTests {
             super(logger, threadExecutor);
         }

+        @SuppressWarnings("unchecked")
         @Override
         protected void run(Object batchingKey, List<? extends BatchedTask> tasks, String tasksSummary) {
-            List<UpdateTask> updateTasks = (List) tasks;
-            ((TestExecutor) batchingKey).execute(updateTasks.stream().map(t -> t.task).collect(Collectors.toList()));
+            List<UpdateTask> updateTasks = (List<UpdateTask>) tasks;
+            ((TestExecutor<Object>) batchingKey).execute(updateTasks.stream().map(t -> t.task).collect(Collectors.toList()));
             updateTasks.forEach(updateTask -> updateTask.listener.processed(updateTask.source));
         }

@@ -77,6 +78,7 @@ public class TaskBatcherTests extends TaskExecutorTests {
         }

         @Override
+        @SuppressWarnings("unchecked")
         public String describeTasks(List<? extends BatchedTask> tasks) {
             return ((TestExecutor<Object>) batchingKey).describeTasks(
                 tasks.stream().map(BatchedTask::getTask).collect(Collectors.toList()));

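The `TaskBatcherTests` hunk keeps the unavoidable unchecked casts but spells them out with their full parameterized targets and suppresses the warning on the smallest enclosing scope. A self-contained sketch of that pattern, with illustrative names:

```java
import java.util.List;
import java.util.stream.Collectors;

public class NarrowSuppress {
    interface BatchedTask {
        Object getTask();
    }

    record UpdateTask(String task) implements BatchedTask {
        public Object getTask() {
            return task;
        }
    }

    // The cast from List<? extends BatchedTask> to List<UpdateTask> cannot be
    // verified by the compiler, so the warning is suppressed on just this
    // method, with the target type written out explicitly instead of a raw cast.
    @SuppressWarnings("unchecked")
    static List<String> run(List<? extends BatchedTask> tasks) {
        List<UpdateTask> updateTasks = (List<UpdateTask>) tasks;
        return updateTasks.stream().map(t -> t.task()).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(run(List.of(new UpdateTask("reroute")))); // prints "[reroute]"
    }
}
```

Keeping the suppression close to the cast documents exactly which operation is unchecked, instead of silencing the whole class.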
@@ -103,6 +103,7 @@ import java.util.stream.Collectors;

 import static com.carrotsearch.randomizedtesting.RandomizedTest.getRandom;
 import static org.elasticsearch.env.Environment.PATH_HOME_SETTING;
+import static org.elasticsearch.test.CheckedFunctionUtils.anyCheckedFunction;
 import static org.hamcrest.Matchers.notNullValue;
 import static org.junit.Assert.assertThat;
 import static org.mockito.Matchers.any;

@@ -133,6 +134,7 @@ public class ClusterStateChanges {
     private final NodeRemovalClusterStateTaskExecutor nodeRemovalExecutor;
     private final JoinTaskExecutor joinTaskExecutor;

+    @SuppressWarnings("unchecked")
     public ClusterStateChanges(NamedXContentRegistry xContentRegistry, ThreadPool threadPool) {
         ClusterSettings clusterSettings = new ClusterSettings(SETTINGS, ClusterSettings.BUILT_IN_CLUSTER_SETTINGS);
         allocationService = new AllocationService(new AllocationDeciders(

@@ -157,7 +159,7 @@ public class ClusterStateChanges {
         IndicesService indicesService = mock(IndicesService.class);
         // MetadataCreateIndexService uses withTempIndexService to check mappings -> fake it here
         try {
-            when(indicesService.withTempIndexService(any(IndexMetadata.class), any(CheckedFunction.class)))
+            when(indicesService.withTempIndexService(any(IndexMetadata.class), anyCheckedFunction()))
                 .then(invocationOnMock -> {
                     IndexService indexService = mock(IndexService.class);
                     IndexMetadata indexMetadata = (IndexMetadata) invocationOnMock.getArguments()[0];

@@ -167,8 +169,7 @@ public class ClusterStateChanges {
                     when(mapperService.documentMapper()).thenReturn(null);
                     when(indexService.getIndexEventListener()).thenReturn(new IndexEventListener() {});
                     when(indexService.getIndexSortSupplier()).thenReturn(() -> null);
-                    //noinspection unchecked
-                    return ((CheckedFunction) invocationOnMock.getArguments()[1]).apply(indexService);
+                    return ((CheckedFunction<IndexService, ?, ?>) invocationOnMock.getArguments()[1]).apply(indexService);
                 });
         } catch (Exception e) {
             /*

@@ -507,6 +507,7 @@ public class RecoverySourceHandlerTests extends ESTestCase {
         IOUtils.close(store);
     }

+    @SuppressWarnings("unchecked")
     public void testThrowExceptionOnPrimaryRelocatedBeforePhase1Started() throws IOException {
         final RecoverySettings recoverySettings = new RecoverySettings(Settings.EMPTY, service);
         final StartRecoveryRequest request = getStartRecoveryRequest();

@@ -574,6 +575,7 @@ public class RecoverySourceHandlerTests extends ESTestCase {
         assertFalse(phase2Called.get());
     }

+    @SuppressWarnings("unchecked")
     public void testCancellationsDoesNotLeakPrimaryPermits() throws Exception {
         final CancellableThreads cancellableThreads = new CancellableThreads();
         final IndexShard shard = mock(IndexShard.class);

@@ -934,7 +934,7 @@ public class IngestServiceTests extends ESTestCase {
         int numRequest = scaledRandomIntBetween(8, 64);
         int numIndexRequests = 0;
         for (int i = 0; i < numRequest; i++) {
-            DocWriteRequest request;
+            DocWriteRequest<?> request;
             if (randomBoolean()) {
                 if (randomBoolean()) {
                     request = new DeleteRequest("_index", "_id");

@@ -80,8 +80,7 @@ public class JvmGcMonitorServiceSettingsTests extends ESTestCase {
         Settings.Builder builder = Settings.builder();

         // drop a random setting or two
-        for (@SuppressWarnings("unchecked") AbstractMap.SimpleEntry<String, String> entry : randomSubsetOf(randomIntBetween(1, 2),
-            entries.toArray(new AbstractMap.SimpleEntry[0]))) {
+        for (AbstractMap.SimpleEntry<String, String> entry : randomSubsetOf(randomIntBetween(1, 2), entries)) {
             builder.put(entry.getKey(), entry.getValue());
         }

@@ -103,6 +103,7 @@ public abstract class PersistentTasksDecidersTestCase extends ESTestCase {
     }

     /** Asserts that the given cluster state contains nbTasks tasks that are assigned **/
+    @SuppressWarnings("rawtypes")
     protected static void assertNbAssignedTasks(final long nbTasks, final ClusterState clusterState) {
         assertPersistentTasks(nbTasks, clusterState, PersistentTasksCustomMetadata.PersistentTask::isAssigned);
     }

@@ -113,6 +114,7 @@ public abstract class PersistentTasksDecidersTestCase extends ESTestCase {
     }

     /** Asserts that the cluster state contains nbTasks tasks that verify the given predicate **/
+    @SuppressWarnings("rawtypes")
     protected static void assertPersistentTasks(final long nbTasks,
                                                 final ClusterState clusterState,
                                                 final Predicate<PersistentTasksCustomMetadata.PersistentTask> predicate) {

@@ -12,13 +12,13 @@ import org.apache.logging.log4j.Level;
 import org.apache.lucene.util.Constants;
 import org.apache.lucene.util.LuceneTestCase;
 import org.elasticsearch.Version;
-import org.elasticsearch.jdk.JarHell;
-import org.elasticsearch.core.Tuple;
-import org.elasticsearch.core.PathUtils;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.core.PathUtils;
+import org.elasticsearch.core.Tuple;
 import org.elasticsearch.env.Environment;
 import org.elasticsearch.env.TestEnvironment;
 import org.elasticsearch.index.IndexModule;
+import org.elasticsearch.jdk.JarHell;
 import org.elasticsearch.test.ESTestCase;
 import org.hamcrest.Matchers;

@@ -71,10 +71,35 @@ public class PluginsServiceTests extends ESTestCase {

     public static class FilterablePlugin extends Plugin implements ScriptPlugin {}

-    static PluginsService newPluginsService(Settings settings, Class<? extends Plugin>... classpathPlugins) {
+    static PluginsService newPluginsService(Settings settings) {
         return new PluginsService(
             settings, null, null,
-            TestEnvironment.newEnvironment(settings).pluginsFile(), Arrays.asList(classpathPlugins)
+            TestEnvironment.newEnvironment(settings).pluginsFile(), List.of()
         );
     }
+
+    static PluginsService newPluginsService(Settings settings, Class<? extends Plugin> classpathPlugin) {
+        return new PluginsService(
+            settings, null, null,
+            TestEnvironment.newEnvironment(settings).pluginsFile(), List.of(classpathPlugin)
+        );
+    }
+
+    static PluginsService newPluginsService(Settings settings, List<Class<? extends Plugin>> classpathPlugins) {
+        return new PluginsService(
+            settings, null, null,
+            TestEnvironment.newEnvironment(settings).pluginsFile(), classpathPlugins
+        );
+    }
+
+    static PluginsService newPluginsService(
+        Settings settings,
+        Class<? extends Plugin> classpathPlugin1,
+        Class<? extends Plugin> classpathPlugin2
+    ) {
+        return new PluginsService(
+            settings, null, null,
+            TestEnvironment.newEnvironment(settings).pluginsFile(), List.of(classpathPlugin1, classpathPlugin2)
+        );
+    }

@@ -136,7 +161,6 @@ public class PluginsServiceTests extends ESTestCase {
             .build();
         final Path hidden = home.resolve("plugins").resolve(".hidden");
         Files.createDirectories(hidden);
-        @SuppressWarnings("unchecked")
         final IllegalStateException e = expectThrows(
             IllegalStateException.class,
             () -> newPluginsService(settings));

@@ -156,7 +180,7 @@ public class PluginsServiceTests extends ESTestCase {
         final Path desktopServicesStore = plugins.resolve(".DS_Store");
         Files.createFile(desktopServicesStore);
         if (Constants.MAC_OS_X) {
-            @SuppressWarnings("unchecked") final PluginsService pluginsService = newPluginsService(settings);
+            final PluginsService pluginsService = newPluginsService(settings);
             assertNotNull(pluginsService);
         } else {
             final IllegalStateException e = expectThrows(IllegalStateException.class, () -> newPluginsService(settings));

@@ -409,9 +433,9 @@ public class PluginsServiceTests extends ESTestCase {

     public static class DummyClass3 {}

-    void makeJar(Path jarFile, Class... classes) throws Exception {
+    void makeJar(Path jarFile, Class<?>... classes) throws Exception {
         try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(jarFile))) {
-            for (Class clazz : classes) {
+            for (Class<?> clazz : classes) {
                 String relativePath = clazz.getCanonicalName().replaceAll("\\.", "/") + ".class";
                 if (relativePath.contains(PluginsServiceTests.class.getSimpleName())) {
                     // static inner class of this test

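The `@@ -71,10 +71,35 @@` hunk above replaces a generic varargs parameter (`Class<? extends Plugin>...`) with fixed-arity overloads plus a `List` variant. Generic varargs force the compiler to allocate a raw array at each call site, which produces "unchecked generic array creation" / possible heap-pollution warnings. A hypothetical sketch of the overload approach with stand-in types:

```java
import java.util.List;

public class OverloadsOverVarargs {
    // Generic varargs version (analogous to what was removed): each call site
    // would warn about unchecked generic array creation.
    //   static int count(Class<? extends Number>... types) { return types.length; }

    // Fixed-arity overloads plus a List variant avoid the array entirely.
    static int count() {
        return 0;
    }

    static int count(Class<? extends Number> t1) {
        return 1;
    }

    static int count(Class<? extends Number> t1, Class<? extends Number> t2) {
        return 2;
    }

    static int count(List<Class<? extends Number>> types) {
        return types.size();
    }

    public static void main(String[] args) {
        System.out.println(count(Integer.class, Long.class)); // prints "2"
        System.out.println(count(List.of(Integer.class, Double.class, Float.class))); // prints "3"
    }
}
```

Callers with more arguments than the overloads cover fall back to the `List` variant, so no raw array is ever created.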
@@ -68,7 +68,7 @@ public class ScriptContextInfoTests extends ESTestCase {
             assertEquals(eparams.get(i).v2(), info.execute.parameters.get(i).name);
         }
         assertEquals(2, info.getters.size());
-        HashMap<String,String> getters = new HashMap(Map.of("getByte","byte", "getChar","char"));
+        HashMap<String,String> getters = new HashMap<>(Map.of("getByte","byte", "getChar","char"));
         for (ScriptContextInfo.ScriptMethodInfo getter: info.getters) {
             assertEquals(0, getter.parameters.size());
             String returnType = getters.remove(getter.name);

@@ -109,7 +109,7 @@ public class ScriptContextInfoTests extends ESTestCase {
             assertEquals(eparams.get(i).v2(), info.execute.parameters.get(i).name);
         }
         assertEquals(2, info.getters.size());
-        HashMap<String,String> getters = new HashMap(Map.of("getCustom1",ct1, "getCustom2",ct2));
+        HashMap<String,String> getters = new HashMap<>(Map.of("getCustom1",ct1, "getCustom2",ct2));
         for (ScriptContextInfo.ScriptMethodInfo getter: info.getters) {
             assertEquals(0, getter.parameters.size());
             String returnType = getters.remove(getter.name);

@@ -118,7 +118,7 @@ public class ScriptContextInfoTests extends ESTestCase {
         }
         assertEquals(0, getters.size());

-        HashMap<String,String> methods = new HashMap(Map.of("getCustom1",ct1, "getCustom2",ct2, "execute",ct0));
+        HashMap<String,String> methods = new HashMap<>(Map.of("getCustom1",ct1, "getCustom2",ct2, "execute",ct0));
         for (ScriptContextInfo.ScriptMethodInfo method: info.methods()) {
             String returnType = methods.remove(method.name);
             assertNotNull(returnType);

@@ -199,7 +199,7 @@ public class ScriptContextInfoTests extends ESTestCase {
         Set<ScriptMethodInfo> getters =
             new ScriptContextInfo("getter_conditional", GetterConditional.class).getters;
         assertEquals(2, getters.size());
-        HashMap<String,String> methods = new HashMap(Map.of("getNonDefault1","boolean", "getNonDefault2","float"));
+        HashMap<String,String> methods = new HashMap<>(Map.of("getNonDefault1","boolean", "getNonDefault2","float"));
         for (ScriptContextInfo.ScriptMethodInfo method: getters) {
             String returnType = methods.remove(method.name);
             assertNotNull(returnType);

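The `ScriptContextInfoTests` hunks above are all the same one-character fix: `new HashMap(...)` creates a raw `HashMap`, so the assignment to `HashMap<String,String>` is an unchecked conversion; the diamond (`new HashMap<>(...)`) infers the type arguments from the declaration. A minimal demonstration:

```java
import java.util.HashMap;
import java.util.Map;

public class DiamondOperator {
    public static void main(String[] args) {
        // 'new HashMap(Map.of(...))' would compile but produce a raw HashMap
        // plus an unchecked-conversion warning on the assignment.
        // The diamond keeps the declared type parameters:
        Map<String, String> getters = new HashMap<>(Map.of("getByte", "byte", "getChar", "char"));
        System.out.println(getters.remove("getByte")); // prints "byte"
        System.out.println(getters.size());            // prints "1"
    }
}
```

Note that a mutable copy is needed here because `Map.of(...)` itself is immutable, so `remove` on it would throw.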
@@ -489,7 +489,7 @@ public class ScriptServiceTests extends ESTestCase {
         assertEquals(ingest.cacheExpireDefault, holder.contextCache.get(name).get().cacheExpire);
     }

-    private void assertCompileRejected(String lang, String script, ScriptType scriptType, ScriptContext scriptContext) {
+    private void assertCompileRejected(String lang, String script, ScriptType scriptType, ScriptContext<?> scriptContext) {
         try {
             scriptService.compile(new Script(scriptType, lang, script, Collections.emptyMap()), scriptContext);
             fail("compile should have been rejected for lang [" + lang + "], " +

@@ -499,7 +499,7 @@ public class ScriptServiceTests extends ESTestCase {
         }
     }

-    private void assertCompileAccepted(String lang, String script, ScriptType scriptType, ScriptContext scriptContext) {
+    private void assertCompileAccepted(String lang, String script, ScriptType scriptType, ScriptContext<?> scriptContext) {
         assertThat(
             scriptService.compile(new Script(scriptType, lang, script, Collections.emptyMap()), scriptContext),
             notNullValue()

@@ -114,7 +114,7 @@ public class NestedIdentityTests extends ESTestCase {
         }
         List<Supplier<NestedIdentity>> mutations = new ArrayList<>();
         int offset = original.getOffset();
-        NestedIdentity child = (NestedIdentity) original.getChild();
+        NestedIdentity child = original.getChild();
         String fieldName = original.getField().string();
         mutations.add(() ->
             new NestedIdentity(original.getField().string() + "_prefix", offset, child));

@@ -605,13 +605,14 @@ public class SearchModuleTests extends ESTestCase {
         }
     }

-    private static class TestSuggestion extends Suggestion {
+    @SuppressWarnings("rawtypes")
+    private static class TestSuggestion<T extends Suggestion.Entry> extends Suggestion<T> {
         TestSuggestion(StreamInput in) throws IOException {
             super(in);
         }

         @Override
-        protected Entry newEntry(StreamInput in) throws IOException {
+        protected T newEntry(StreamInput in) throws IOException {
             return null;
         }

@@ -65,7 +65,7 @@ public class TopHitsAggregatorTests extends AggregatorTestCase {

     public void testNoResults() throws Exception {
         TopHits result = (TopHits) testCase(new MatchNoDocsQuery(), topHits("_name").sort("string", SortOrder.DESC));
-        SearchHits searchHits = ((TopHits) result).getHits();
+        SearchHits searchHits = result.getHits();
         assertEquals(0L, searchHits.getTotalHits().value);
         assertFalse(AggregationInspectionHelper.hasValue(((InternalTopHits)result)));
     }

@@ -42,7 +42,7 @@ public class LeafStoredFieldsLookupTests extends ESTestCase {
     }

     public void testBasicLookup() {
-        FieldLookup fieldLookup = (FieldLookup) fieldsLookup.get("field");
+        FieldLookup fieldLookup = fieldsLookup.get("field");
         assertEquals("field", fieldLookup.fieldType().name());

         List<Object> values = fieldLookup.getValues();

@@ -52,7 +52,7 @@ public class LeafStoredFieldsLookupTests extends ESTestCase {
     }

     public void testLookupWithFieldAlias() {
-        FieldLookup fieldLookup = (FieldLookup) fieldsLookup.get("alias");
+        FieldLookup fieldLookup = fieldsLookup.get("alias");
         assertEquals("field", fieldLookup.fieldType().name());

         List<Object> values = fieldLookup.getValues();

@@ -470,7 +470,8 @@ public class FieldSortBuilderTests extends AbstractSortTestCase<FieldSortBuilder

         try (Directory dir = newDirectory()) {
             int numDocs = randomIntBetween(10, 30);
-            final Comparable[] values = new Comparable[numDocs];
+            @SuppressWarnings("rawtypes")
+            final Comparable<?>[] values = new Comparable[numDocs];
             try (RandomIndexWriter writer = new RandomIndexWriter(random(), dir)) {
                 for (int i = 0; i < numDocs; i++) {
                     Document doc = new Document();

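The `FieldSortBuilderTests` hunk above keeps the raw `new Comparable[numDocs]` allocation (suppressing the `rawtypes` warning on that local) while changing the declared element type to `Comparable<?>`, so subsequent reads through the array stay typed. A minimal, hypothetical illustration of the same shape:

```java
public class GenericArrays {
    public static void main(String[] args) {
        // The allocation uses the raw type Comparable, which is what the
        // narrowly scoped rawtypes suppression covers; the declaration on the
        // left keeps the element type as Comparable<?> for later use.
        @SuppressWarnings("rawtypes")
        final Comparable<?>[] values = new Comparable[3];
        values[0] = 10;     // Integer implements Comparable<Integer>
        values[1] = "text"; // String implements Comparable<String>
        values[2] = 3.5d;   // Double implements Comparable<Double>
        System.out.println(values.length); // prints "3"
    }
}
```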
@@ -561,8 +561,7 @@ public class GeoDistanceSortBuilderTests extends AbstractSortTestCase<GeoDistanc
         }
         };
         sortBuilder.setNestedSort(new NestedSortBuilder("path").setFilter(rangeQuery));
-        GeoDistanceSortBuilder rewritten = (GeoDistanceSortBuilder) sortBuilder
-            .rewrite(createMockSearchExecutionContext());
+        GeoDistanceSortBuilder rewritten = sortBuilder.rewrite(createMockSearchExecutionContext());
         assertNotSame(rangeQuery, rewritten.getNestedSort().getFilter());
     }

@@ -103,7 +103,7 @@ public class SuggestTests extends ESTestCase {
             assertNull(parser.nextToken());
         }
         assertEquals(suggest.size(), parsed.size());
-        for (Suggestion suggestion : suggest) {
+        for (Suggestion<?> suggestion : suggest) {
             Suggestion<? extends Entry<? extends Option>> parsedSuggestion = parsed.getSuggestion(suggestion.getName());
             assertNotNull(parsedSuggestion);
             assertEquals(suggestion.getClass(), parsedSuggestion.getClass());

@@ -30,12 +30,13 @@ import java.util.function.Supplier;

 import static org.elasticsearch.common.xcontent.XContentHelper.toXContent;
 import static org.elasticsearch.common.xcontent.XContentParserUtils.ensureExpectedToken;
+import static org.elasticsearch.core.Types.forciblyCast;
 import static org.elasticsearch.test.XContentTestUtils.insertRandomFields;
 import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertToXContentEquivalent;

 public class SuggestionEntryTests extends ESTestCase {

-    private static final Map<Class<? extends Entry>, Function<XContentParser, ? extends Entry>> ENTRY_PARSERS = new HashMap<>();
+    private static final Map<Class<? extends Entry<?>>, Function<XContentParser, ? extends Entry<?>>> ENTRY_PARSERS = new HashMap<>();
     static {
         ENTRY_PARSERS.put(TermSuggestion.Entry.class, TermSuggestion.Entry::fromXContent);
         ENTRY_PARSERS.put(PhraseSuggestion.Entry.class, PhraseSuggestion.Entry::fromXContent);

@@ -46,27 +47,27 @@ public class SuggestionEntryTests extends ESTestCase {
      * Create a randomized Suggestion.Entry
      */
     @SuppressWarnings("unchecked")
-    public static <O extends Option> Entry<O> createTestItem(Class<? extends Entry> entryType) {
+    public static <O extends Option> Entry<O> createTestItem(Class<? extends Entry<O>> entryType) {
         Text entryText = new Text(randomAlphaOfLengthBetween(5, 15));
         int offset = randomInt();
         int length = randomInt();
-        Entry entry;
+        Entry<O> entry;
         Supplier<Option> supplier;
-        if (entryType == TermSuggestion.Entry.class) {
-            entry = new TermSuggestion.Entry(entryText, offset, length);
+        if (entryType == (Class<? extends Entry<? extends Option>>) TermSuggestion.Entry.class) {
+            entry = (Entry<O>) new TermSuggestion.Entry(entryText, offset, length);
             supplier = TermSuggestionOptionTests::createTestItem;
-        } else if (entryType == PhraseSuggestion.Entry.class) {
-            entry = new PhraseSuggestion.Entry(entryText, offset, length, randomDouble());
+        } else if (entryType == (Class<? extends Entry<? extends Option>>) PhraseSuggestion.Entry.class) {
+            entry = (Entry<O>) new PhraseSuggestion.Entry(entryText, offset, length, randomDouble());
             supplier = SuggestionOptionTests::createTestItem;
-        } else if (entryType == CompletionSuggestion.Entry.class) {
-            entry = new CompletionSuggestion.Entry(entryText, offset, length);
+        } else if (entryType == (Class<? extends Entry<? extends Option>>) CompletionSuggestion.Entry.class) {
+            entry = (Entry<O>) new CompletionSuggestion.Entry(entryText, offset, length);
             supplier = CompletionSuggestionOptionTests::createTestItem;
         } else {
             throw new UnsupportedOperationException("entryType not supported [" + entryType + "]");
         }
         int numOptions = randomIntBetween(0, 5);
         for (int i = 0; i < numOptions; i++) {
-            entry.addOption(supplier.get());
+            entry.addOption((O) supplier.get());
         }
         return entry;
     }

@@ -81,8 +82,8 @@ public class SuggestionEntryTests extends ESTestCase {

     @SuppressWarnings("unchecked")
     private void doTestFromXContent(boolean addRandomFields) throws IOException {
-        for (Class<? extends Entry> entryType : ENTRY_PARSERS.keySet()) {
-            Entry<Option> entry = createTestItem(entryType);
+        for (Class<? extends Entry<?>> entryType : ENTRY_PARSERS.keySet()) {
+            Entry<Option> entry = createTestItem((forciblyCast(entryType)));
             XContentType xContentType = randomFrom(XContentType.values());
             boolean humanReadable = randomBoolean();
             BytesReference originalBytes = toShuffledXContent(entry, xContentType, ToXContent.EMPTY_PARAMS, humanReadable);

@@ -105,7 +106,7 @@ public class SuggestionEntryTests extends ESTestCase {
             Entry<Option> parsed;
             try (XContentParser parser = createParser(xContentType.xContent(), mutated)) {
                 ensureExpectedToken(XContentParser.Token.START_OBJECT, parser.nextToken(), parser);
-                parsed = ENTRY_PARSERS.get(entry.getClass()).apply(parser);
+                parsed = (Entry<Option>) ENTRY_PARSERS.get(entry.getClass()).apply(parser);
                 assertEquals(XContentParser.Token.END_OBJECT, parser.currentToken());
                 assertNull(parser.nextToken());
             }
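The `forciblyCast(entryType)` call above comes from the new `org.elasticsearch.core.Types` helper introduced by this PR. As a rough standalone illustration of the idea (this sketch is not the actual Elasticsearch source, and the class name `ForciblyCastSketch` is invented), the trick is to confine one inherently-unsafe cast to a single `@SuppressWarnings("unchecked")` helper whose return type is inferred at the call site:

```java
import java.util.Arrays;
import java.util.List;

public class ForciblyCastSketch {
    // One suppressed unchecked cast here, instead of a suppression at every call site.
    // The target type T is inferred from the assignment context.
    @SuppressWarnings("unchecked")
    static <T> T forciblyCast(Object argument) {
        return (T) argument;
    }

    public static void main(String[] args) {
        Object raw = Arrays.asList(1, 2, 3);
        // Compiles without an unchecked warning at this call site.
        List<Integer> typed = forciblyCast(raw);
        System.out.println(typed.size()); // prints 3
    }
}
```

The cast is no safer at runtime than before; the helper only moves the single necessary suppression out of otherwise-clean test code.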
@@ -41,7 +41,7 @@ import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertToXC

 public class SuggestionTests extends ESTestCase {

-    @SuppressWarnings("unchecked")
+    @SuppressWarnings({"rawtypes", "unchecked"})
     private static final Class<Suggestion<? extends Entry<? extends Option>>>[] SUGGESTION_TYPES = new Class[] {
         TermSuggestion.class, PhraseSuggestion.class, CompletionSuggestion.class
     };
@@ -159,7 +159,7 @@ public class RestoreServiceTests extends ESTestCase {
         );
         doAnswer(invocationOnMock -> {
             assertTrue(pendingRefreshes.remove(repositoryName));
-            // noinspection unchecked
+            @SuppressWarnings("unchecked")
             ActionListener<RepositoryData> repositoryDataListener = (ActionListener<RepositoryData>) invocationOnMock
                 .getArguments()[0];
             if (randomBoolean()) {
@@ -52,27 +52,27 @@ public class RandomShapeGenerator extends RandomGeoGenerator {
         }
     }

-    public static ShapeBuilder createShape(Random r) throws InvalidShapeException {
+    public static ShapeBuilder<?, ?, ?> createShape(Random r) throws InvalidShapeException {
         return createShapeNear(r, null);
     }

-    public static ShapeBuilder createShape(Random r, ShapeType st) {
+    public static ShapeBuilder<?, ?, ?> createShape(Random r, ShapeType st) {
         return createShapeNear(r, null, st);
     }

-    public static ShapeBuilder createShapeNear(Random r, Point nearPoint) throws InvalidShapeException {
+    public static ShapeBuilder<?, ?, ?> createShapeNear(Random r, Point nearPoint) throws InvalidShapeException {
         return createShape(r, nearPoint, null, null);
     }

-    public static ShapeBuilder createShapeNear(Random r, Point nearPoint, ShapeType st) throws InvalidShapeException {
+    public static ShapeBuilder<?, ?, ?> createShapeNear(Random r, Point nearPoint, ShapeType st) throws InvalidShapeException {
         return createShape(r, nearPoint, null, st);
     }

-    public static ShapeBuilder createShapeWithin(Random r, Rectangle bbox) throws InvalidShapeException {
+    public static ShapeBuilder<?, ?, ?> createShapeWithin(Random r, Rectangle bbox) throws InvalidShapeException {
         return createShape(r, null, bbox, null);
     }

-    public static ShapeBuilder createShapeWithin(Random r, Rectangle bbox, ShapeType st) throws InvalidShapeException {
+    public static ShapeBuilder<?, ?, ?> createShapeWithin(Random r, Rectangle bbox, ShapeType st) throws InvalidShapeException {
         return createShape(r, null, bbox, st);
     }

@@ -115,7 +115,7 @@ public class RandomShapeGenerator extends RandomGeoGenerator {

         GeometryCollectionBuilder gcb = new GeometryCollectionBuilder();
         for (int i=0; i<numGeometries;) {
-            ShapeBuilder builder = createShapeWithin(r, bounds);
+            ShapeBuilder<?, ?, ?> builder = createShapeWithin(r, bounds);
             // due to world wrapping, and the possibility for ambiguous polygons, the random shape generation could bail with
             // a null shape. We catch that situation here, and only increment the counter when a valid shape is returned.
             // Not the most efficient but its the lesser of the evil alternatives

@@ -127,8 +127,9 @@ public class RandomShapeGenerator extends RandomGeoGenerator {
         return gcb;
     }

-    private static ShapeBuilder createShape(Random r, Point nearPoint, Rectangle within, ShapeType st) throws InvalidShapeException {
-        ShapeBuilder shape;
+    private static ShapeBuilder<?, ?, ?> createShape(Random r, Point nearPoint, Rectangle within, ShapeType st)
+        throws InvalidShapeException {
+        ShapeBuilder<?, ?, ?> shape;
         short i=0;
         do {
             shape = createShape(r, nearPoint, within, st, ST_VALIDATE);

@@ -149,7 +150,7 @@ public class RandomShapeGenerator extends RandomGeoGenerator {
      * @param st Create a random shape of the provided type
      * @return the ShapeBuilder for a random shape
      */
-    private static ShapeBuilder createShape(Random r, Point nearPoint, Rectangle within, ShapeType st, boolean validate) throws
+    private static ShapeBuilder<?, ?, ?> createShape(Random r, Point nearPoint, Rectangle within, ShapeType st, boolean validate) throws
         InvalidShapeException {

         if (st == null) {

@@ -179,7 +180,7 @@ public class RandomShapeGenerator extends RandomGeoGenerator {
             p = xRandomPointIn(r, within);
             coordinatesBuilder.coordinate(p.getX(), p.getY());
         }
-        ShapeBuilder pcb = (st == ShapeType.MULTIPOINT)
+        ShapeBuilder<?, ?, ?> pcb = (st == ShapeType.MULTIPOINT)
             ? new MultiPointBuilder(coordinatesBuilder.build())
             : new LineStringBuilder(coordinatesBuilder);
         return pcb;
@@ -182,7 +182,7 @@ public class ElasticsearchGeoAssertions {
         assertEquals(g1.getGeom(), g2.getGeom());
     }

-    public static void assertEquals(ShapeCollection s1, ShapeCollection s2) {
+    public static void assertEquals(ShapeCollection<?> s1, ShapeCollection<?> s2) {
         Assert.assertEquals(s1.size(), s2.size());
         for (int i = 0; i < s1.size(); i++) {
             assertEquals(s1.get(i), s2.get(i));

@@ -197,7 +197,7 @@ public class ElasticsearchGeoAssertions {
             JtsPoint p2 = (JtsPoint) s2;
             Assert.assertEquals(p1, p2);
         } else if (s1 instanceof ShapeCollection && s2 instanceof ShapeCollection) {
-            assertEquals((ShapeCollection)s1, (ShapeCollection)s2);
+            assertEquals((ShapeCollection<?>) s1, (ShapeCollection<?>) s2);
         } else if (s1 instanceof GeoCircle && s2 instanceof GeoCircle) {
             Assert.assertEquals(s1, s2);
         } else if (s1 instanceof RectangleImpl && s2 instanceof RectangleImpl) {
@@ -61,6 +61,7 @@ public class ThreadPoolSerializationTests extends ESTestCase {
         assertThat(newInfo.getQueueSize(), is(nullValue()));
     }

+    @SuppressWarnings("unchecked")
     public void testThatToXContentWritesOutUnboundedCorrectly() throws Exception {
         ThreadPool.Info info = new ThreadPool.Info("foo", threadPoolType, 1, 10, TimeValue.timeValueMillis(3000), null);
         XContentBuilder builder = jsonBuilder();

@@ -82,6 +83,7 @@ public class ThreadPoolSerializationTests extends ESTestCase {
         terminate(threadPool);
     }

+    @SuppressWarnings("unchecked")
     public void testThatToXContentWritesInteger() throws Exception {
         ThreadPool.Info info = new ThreadPool.Info("foo", threadPoolType, 1, 10,
             TimeValue.timeValueMillis(3000), SizeValue.parseSizeValue("1k"));
@@ -33,7 +33,7 @@ import java.util.concurrent.TimeUnit;
 import java.util.concurrent.atomic.AtomicInteger;
 import java.util.concurrent.atomic.AtomicReference;

-import static org.mockito.Matchers.any;
+import static org.elasticsearch.test.ActionListenerUtils.anyActionListener;
 import static org.mockito.Matchers.eq;
 import static org.mockito.Mockito.doAnswer;
 import static org.mockito.Mockito.mock;

@@ -83,10 +83,11 @@ public class ClusterConnectionManagerTests extends ESTestCase {
         DiscoveryNode node = new DiscoveryNode("", new TransportAddress(InetAddress.getLoopbackAddress(), 0), Version.CURRENT);
         Transport.Connection connection = new TestConnect(node);
         doAnswer(invocationOnMock -> {
+            @SuppressWarnings("unchecked")
             ActionListener<Transport.Connection> listener = (ActionListener<Transport.Connection>) invocationOnMock.getArguments()[2];
             listener.onResponse(connection);
             return null;
-        }).when(transport).openConnection(eq(node), eq(connectionProfile), any(ActionListener.class));
+        }).when(transport).openConnection(eq(node), eq(connectionProfile), anyActionListener());

         assertFalse(connectionManager.nodeConnected(node));

@@ -120,6 +121,7 @@ public class ClusterConnectionManagerTests extends ESTestCase {

         DiscoveryNode node = new DiscoveryNode("", new TransportAddress(InetAddress.getLoopbackAddress(), 0), Version.CURRENT);
         doAnswer(invocationOnMock -> {
+            @SuppressWarnings("unchecked")
             ActionListener<Transport.Connection> listener = (ActionListener<Transport.Connection>) invocationOnMock.getArguments()[2];

             boolean success = randomBoolean();

@@ -135,7 +137,7 @@ public class ClusterConnectionManagerTests extends ESTestCase {
                 threadPool.generic().execute(() -> listener.onFailure(new IllegalStateException("dummy exception")));
             }
             return null;
-        }).when(transport).openConnection(eq(node), eq(connectionProfile), any(ActionListener.class));
+        }).when(transport).openConnection(eq(node), eq(connectionProfile), anyActionListener());

         assertFalse(connectionManager.nodeConnected(node));

@@ -237,10 +239,11 @@ public class ClusterConnectionManagerTests extends ESTestCase {
         DiscoveryNode node = new DiscoveryNode("", new TransportAddress(InetAddress.getLoopbackAddress(), 0), Version.CURRENT);
         Transport.Connection connection = new TestConnect(node);
         doAnswer(invocationOnMock -> {
+            @SuppressWarnings("unchecked")
             ActionListener<Transport.Connection> listener = (ActionListener<Transport.Connection>) invocationOnMock.getArguments()[2];
             listener.onResponse(connection);
             return null;
-        }).when(transport).openConnection(eq(node), eq(connectionProfile), any(ActionListener.class));
+        }).when(transport).openConnection(eq(node), eq(connectionProfile), anyActionListener());

         assertFalse(connectionManager.nodeConnected(node));

@@ -276,10 +279,11 @@ public class ClusterConnectionManagerTests extends ESTestCase {

         DiscoveryNode node = new DiscoveryNode("", new TransportAddress(InetAddress.getLoopbackAddress(), 0), Version.CURRENT);
         doAnswer(invocationOnMock -> {
+            @SuppressWarnings("unchecked")
             ActionListener<Transport.Connection> listener = (ActionListener<Transport.Connection>)invocationOnMock.getArguments()[2];
             listener.onFailure(new ConnectTransportException(node, ""));
             return null;
-        }).when(transport).openConnection(eq(node), eq(connectionProfile), any(ActionListener.class));
+        }).when(transport).openConnection(eq(node), eq(connectionProfile), anyActionListener());

         assertFalse(connectionManager.nodeConnected(node));
@@ -66,7 +66,7 @@ public class ResourceWatcherServiceTests extends ESTestCase {
         };

         // checking default freq
-        WatcherHandle handle = service.add(watcher);
+        WatcherHandle<ResourceWatcher> handle = service.add(watcher);
         assertThat(handle, notNullValue());
         assertThat(handle.frequency(), equalTo(ResourceWatcherService.Frequency.MEDIUM));
         assertThat(service.lowMonitor.watchers.size(), is(0));
@@ -12,8 +12,16 @@ import org.elasticsearch.action.ActionListener;

 import static org.mockito.Matchers.any;

+/**
+ * Test utilities for working with {@link ActionListener}s.
+ */
 public abstract class ActionListenerUtils {

+    /**
+     * Returns a Mockito matcher for any argument that is an {@link ActionListener}.
+     * @param <T> the action listener type that the caller expects. Do not specify this, it will be inferred
+     * @return an action listener matcher
+     */
     @SuppressWarnings("unchecked")
     public static <T> ActionListener<T> anyActionListener() {
         return any(ActionListener.class);
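The `anyActionListener()` helper above centralizes the unchecked cast that `any(ActionListener.class)` forces on every call site. A dependency-free sketch of the same pattern (no Mockito or Elasticsearch classes; `TypedAnySketch`, `rawAny`, and `anyConsumer` are invented names, with `Consumer` standing in for `ActionListener`):

```java
import java.util.function.Consumer;

public class TypedAnySketch {
    // Stand-in for Mockito's raw-typed any(Class) matcher; returns a no-op value.
    static Consumer<?> rawAny() {
        return value -> { };
    }

    // The anyActionListener() pattern: one suppressed cast in a shared helper,
    // so every call site gets a correctly inferred generic type with no warning.
    @SuppressWarnings("unchecked")
    static <T> Consumer<T> anyConsumer() {
        return (Consumer<T>) rawAny();
    }

    public static void main(String[] args) {
        Consumer<String> listener = anyConsumer(); // inferred as Consumer<String>
        listener.accept("no unchecked warning at the call site");
        System.out.println("ok");
    }
}
```

This is why the diffs below can replace `any(ActionListener.class)` with a bare `anyActionListener()` and drop per-site suppressions.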
@@ -0,0 +1,43 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0 and the Server Side Public License, v 1; you may not use this file except
+ * in compliance with, at your election, the Elastic License 2.0 or the Server
+ * Side Public License, v 1.
+ */
+
+package org.elasticsearch.test;
+
+import org.elasticsearch.common.CheckedSupplier;
+import org.elasticsearch.core.CheckedFunction;
+
+import static org.mockito.Matchers.any;
+
+/**
+ * Test utilities for working with {@link CheckedFunction}s and {@link CheckedSupplier}s.
+ */
+public class CheckedFunctionUtils {
+
+    /**
+     * Returns a Mockito matcher for any argument that is an {@link CheckedFunction}.
+     * @param <T> the function input type that the caller expects. Do not specify this, it will be inferred
+     * @param <R> the function output type that the caller expects. Do not specify this, it will be inferred
+     * @param <E> the function exception type that the caller expects. Do not specify this, it will be inferred
+     * @return a checked function matcher
+     */
+    @SuppressWarnings("unchecked")
+    public static <T, R, E extends Exception> CheckedFunction<T, R, E> anyCheckedFunction() {
+        return any(CheckedFunction.class);
+    }
+
+    /**
+     * Returns a Mockito matcher for any argument that is an {@link CheckedSupplier}.
+     * @param <R> the supplier output type that the caller expects. Do not specify this, it will be inferred
+     * @param <E> the supplier exception type that the caller expects. Do not specify this, it will be inferred
+     * @return a checked supplier matcher
+     */
+    @SuppressWarnings("unchecked")
+    public static <R, E extends Exception> CheckedSupplier<R, E> anyCheckedSupplier() {
+        return any(CheckedSupplier.class);
+    }
+}
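For context on the new utility file above: `CheckedFunction` and `CheckedSupplier` are functional interfaces whose single abstract method declares a generic checked-exception type, which is what makes their Mockito matchers worth centralizing. A minimal self-contained mirror of their shape (illustration only; the real types live in `org.elasticsearch.core` and `org.elasticsearch.common`, and `CheckedShapesSketch` is an invented name):

```java
import java.io.IOException;

public class CheckedShapesSketch {
    // Minimal mirrors of the Elasticsearch interfaces, for illustration only.
    @FunctionalInterface
    interface CheckedSupplier<R, E extends Exception> {
        R get() throws E;
    }

    @FunctionalInterface
    interface CheckedFunction<T, R, E extends Exception> {
        R apply(T t) throws E;
    }

    public static void main(String[] args) throws IOException {
        // The exception type parameter lets lambdas throw checked exceptions.
        CheckedSupplier<String, IOException> supplier = () -> "value";
        CheckedFunction<String, Integer, IOException> length = String::length;
        System.out.println(length.apply(supplier.get())); // prints 5
    }
}
```

Because the exception type `E` is generic, a raw `any(CheckedFunction.class)` matcher cannot infer it, hence the suppressed helpers that the following diffs deduplicate into `CheckedFunctionUtils`.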
@@ -21,7 +21,6 @@ import org.elasticsearch.common.xcontent.ToXContent;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.common.xcontent.XContentType;
-import org.elasticsearch.core.CheckedFunction;
 import org.elasticsearch.env.Environment;
 import org.elasticsearch.env.TestEnvironment;
 import org.elasticsearch.protocol.xpack.XPackInfoResponse;

@@ -57,6 +56,8 @@ import java.util.Map;
 import java.util.Set;
 import javax.net.ssl.SSLException;

+import static org.elasticsearch.test.CheckedFunctionUtils.anyCheckedFunction;
+import static org.elasticsearch.test.CheckedFunctionUtils.anyCheckedSupplier;
 import static org.hamcrest.CoreMatchers.containsString;
 import static org.mockito.Matchers.any;
 import static org.mockito.Matchers.anyString;

@@ -562,14 +563,4 @@ public class SetupPasswordToolTests extends CommandTestCase {
         }
         return user + "-password";
     }
-
-    @SuppressWarnings("unchecked")
-    private static <T, R, E extends Exception> CheckedFunction<T, R, E> anyCheckedFunction() {
-        return any(CheckedFunction.class);
-    }
-
-    @SuppressWarnings("unchecked")
-    private static <T, E extends Exception> CheckedSupplier<T, E> anyCheckedSupplier() {
-        return any(CheckedSupplier.class);
-    }
 }
@@ -7,7 +7,6 @@

 package org.elasticsearch.xpack.security.enrollment;

-import org.elasticsearch.common.CheckedSupplier;
 import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.settings.MockSecureSettings;
 import org.elasticsearch.common.settings.SecureString;

@@ -17,7 +16,6 @@ import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.common.xcontent.XContentType;
 import org.elasticsearch.common.xcontent.json.JsonXContent;
-import org.elasticsearch.core.CheckedFunction;
 import org.elasticsearch.env.Environment;
 import org.elasticsearch.test.ESTestCase;
 import org.elasticsearch.xpack.core.security.user.ElasticUser;

@@ -41,6 +39,8 @@ import java.util.List;
 import java.util.Map;
 import java.util.stream.Collectors;

+import static org.elasticsearch.test.CheckedFunctionUtils.anyCheckedFunction;
+import static org.elasticsearch.test.CheckedFunctionUtils.anyCheckedSupplier;
 import static org.elasticsearch.xpack.security.enrollment.CreateEnrollmentToken.getFilteredAddresses;
 import static org.mockito.Matchers.any;
 import static org.mockito.Matchers.anyString;

@@ -392,14 +392,4 @@ public class CreateEnrollmentTokenTests extends ESTestCase {
         builder.withResponseBody(responseJson);
         return builder.build();
     }
-
-    @SuppressWarnings("unchecked")
-    private static <T, E extends Exception> CheckedSupplier<T, E> anyCheckedSupplier() {
-        return any(CheckedSupplier.class);
-    }
-
-    @SuppressWarnings("unchecked")
-    private static <T, R, E extends Exception> CheckedFunction<T, R, E> anyCheckedFunction() {
-        return any(CheckedFunction.class);
-    }
 }