Mirror of https://github.com/elastic/elasticsearch.git, synced 2025-06-29 09:54:06 -04:00
This change removes Lucene's experimental flag from the documentation of the following tokenizers/filters:

* Simple Pattern Split Tokenizer
* Simple Pattern Tokenizer
* Flatten Graph Token Filter
* Word Delimiter Graph Token Filter

The flag is still present in the Lucene codebase, but these tokenizers/filters have been fully supported in ES for a long time now, so the docs flag is misleading.

Co-authored-by: James Rodewig <james.rodewig@elastic.co>
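For context, the two pattern tokenizers named above behave differently: `simple_pattern` emits each non-overlapping regex match as a token, while `simple_pattern_split` uses matches as separators and emits the text between them. A minimal Python sketch approximating that behavior (the function names here are illustrative, not part of Elasticsearch or Lucene):

```python
import re

def simple_pattern_tokenize(pattern, text):
    # simple_pattern: each non-overlapping regex match becomes a token
    return re.findall(pattern, text)

def simple_pattern_split_tokenize(pattern, text):
    # simple_pattern_split: regex matches act as delimiters;
    # empty fragments between adjacent delimiters are dropped
    return [t for t in re.split(pattern, text) if t]

# Three-digit runs extracted as tokens
print(simple_pattern_tokenize(r"[0-9]{3}", "fd-786-335-514-x"))
# Splitting on hyphens keeps the text between them
print(simple_pattern_split_tokenize(r"-", "fd-786-335-514-x"))
```

Note that the actual Lucene implementations accept a restricted regex subset (no lookahead or backreferences) for deterministic performance, so Python's full `re` syntax is a superset of what these tokenizers support.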
chargroup-tokenizer.asciidoc
classic-tokenizer.asciidoc
edgengram-tokenizer.asciidoc
keyword-tokenizer.asciidoc
letter-tokenizer.asciidoc
lowercase-tokenizer.asciidoc
ngram-tokenizer.asciidoc
pathhierarchy-tokenizer-examples.asciidoc
pathhierarchy-tokenizer.asciidoc
pattern-tokenizer.asciidoc
simplepattern-tokenizer.asciidoc
simplepatternsplit-tokenizer.asciidoc
standard-tokenizer.asciidoc
thai-tokenizer.asciidoc
uaxurlemail-tokenizer.asciidoc
whitespace-tokenizer.asciidoc