Class EdgeNGramTokenizer
java.lang.Object
com.azure.search.documents.indexes.models.LexicalTokenizer
com.azure.search.documents.indexes.models.EdgeNGramTokenizer
- All Implemented Interfaces:
com.azure.json.JsonSerializable<LexicalTokenizer>
Tokenizes the input from an edge into n-grams of the given size(s). This tokenizer is implemented using Apache
Lucene.
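As a rough illustration of what this tokenizer emits, the plain-Java sketch below generates n-grams anchored at the front edge of the input. It is not the Lucene implementation; minGram and maxGram mirror the properties documented below, with their defaults of 1 and 2:

```java
import java.util.ArrayList;
import java.util.List;

public class EdgeNGramSketch {
    /** Emits n-grams anchored at the front edge of the input, from minGram to maxGram characters. */
    static List<String> edgeNGrams(String input, int minGram, int maxGram) {
        List<String> grams = new ArrayList<>();
        int longest = Math.min(maxGram, input.length());
        for (int len = minGram; len <= longest; len++) {
            grams.add(input.substring(0, len));
        }
        return grams;
    }

    public static void main(String[] args) {
        // With the documented defaults (minGram = 1, maxGram = 2), "search" yields "s" and "se".
        System.out.println(edgeNGrams("search", 1, 2));
    }
}
```

Edge n-grams like these are commonly used for search-as-you-type prefix matching, which is why the grams all start at the beginning of the input.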
Constructor Summary
Constructors
EdgeNGramTokenizer(String name) - Creates an instance of EdgeNGramTokenizer class.
Method Summary
Modifier and Type / Method / Description
static EdgeNGramTokenizer fromJson(com.azure.json.JsonReader jsonReader) - Reads an instance of EdgeNGramTokenizer from the JsonReader.
Integer getMaxGram() - Get the maxGram property: The maximum n-gram length.
Integer getMinGram() - Get the minGram property: The minimum n-gram length.
String getOdataType() - Get the odataType property: A URI fragment specifying the type of tokenizer.
List<TokenCharacterKind> getTokenChars() - Get the tokenChars property: Character classes to keep in the tokens.
EdgeNGramTokenizer setMaxGram(Integer maxGram) - Set the maxGram property: The maximum n-gram length.
EdgeNGramTokenizer setMinGram(Integer minGram) - Set the minGram property: The minimum n-gram length.
EdgeNGramTokenizer setTokenChars(TokenCharacterKind... tokenChars) - Set the tokenChars property: Character classes to keep in the tokens.
EdgeNGramTokenizer setTokenChars(List<TokenCharacterKind> tokenChars) - Set the tokenChars property: Character classes to keep in the tokens.
com.azure.json.JsonWriter toJson(com.azure.json.JsonWriter jsonWriter)

Methods inherited from class LexicalTokenizer
getName

Methods inherited from class Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface com.azure.json.JsonSerializable
toJson, toJson, toJsonBytes, toJsonString
Constructor Details
EdgeNGramTokenizer
EdgeNGramTokenizer(String name)
Creates an instance of EdgeNGramTokenizer class.
Parameters:
name - the name value to set.
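A minimal usage sketch, assuming the azure-search-documents dependency is on the classpath. The tokenizer name "my_edge_ngram" and the property values are illustrative, and the fluent chaining relies on the setters documented below returning the object itself:

```java
import com.azure.search.documents.indexes.models.EdgeNGramTokenizer;
import com.azure.search.documents.indexes.models.TokenCharacterKind;

public class EdgeNGramTokenizerExample {
    public static void main(String[] args) {
        // "my_edge_ngram" is an illustrative tokenizer name.
        EdgeNGramTokenizer tokenizer = new EdgeNGramTokenizer("my_edge_ngram")
                .setMinGram(2)   // must be less than maxGram
                .setMaxGram(10)  // maximum allowed is 300
                .setTokenChars(TokenCharacterKind.LETTER, TokenCharacterKind.DIGIT);

        System.out.println(tokenizer.getName() + ": "
                + tokenizer.getMinGram() + ".." + tokenizer.getMaxGram());
    }
}
```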
Method Details
getOdataType
Get the odataType property: A URI fragment specifying the type of tokenizer.
Overrides:
getOdataType in class LexicalTokenizer
Returns:
the odataType value.
getMinGram
Get the minGram property: The minimum n-gram length. Default is 1. Maximum is 300. Must be less than the value of maxGram.
Returns:
the minGram value.
setMinGram
Set the minGram property: The minimum n-gram length. Default is 1. Maximum is 300. Must be less than the value of maxGram.
Parameters:
minGram - the minGram value to set.
Returns:
the EdgeNGramTokenizer object itself.
getMaxGram
Get the maxGram property: The maximum n-gram length. Default is 2. Maximum is 300.
Returns:
the maxGram value.
setMaxGram
Set the maxGram property: The maximum n-gram length. Default is 2. Maximum is 300.
Parameters:
maxGram - the maxGram value to set.
Returns:
the EdgeNGramTokenizer object itself.
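The constraints stated above (defaults of 1 and 2, a cap of 300 on both properties, and minGram strictly less than maxGram) can be sketched as a plain-Java validation helper. This is illustrative only, not the SDK's own validation, and the assumption that minGram is at least 1 is inferred from its default:

```java
public class GramRangeCheck {
    static final int MAX_ALLOWED = 300; // documented maximum for both properties

    /** Returns true when the documented constraints on minGram/maxGram hold. */
    static boolean isValidGramRange(int minGram, int maxGram) {
        return minGram >= 1            // assumption: at least 1, inferred from the default
                && maxGram <= MAX_ALLOWED
                && minGram < maxGram;  // minGram must be less than maxGram
    }

    public static void main(String[] args) {
        System.out.println(isValidGramRange(1, 2));   // the documented defaults
        System.out.println(isValidGramRange(5, 301)); // exceeds the 300 cap
    }
}
```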
getTokenChars
Get the tokenChars property: Character classes to keep in the tokens.
Returns:
the tokenChars value.
setTokenChars
setTokenChars(TokenCharacterKind... tokenChars)
Set the tokenChars property: Character classes to keep in the tokens.
Parameters:
tokenChars - the tokenChars value to set.
Returns:
the EdgeNGramTokenizer object itself.
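To illustrate the effect of tokenChars, the plain-Java sketch below (not the SDK or Lucene code) splits input on characters outside the kept classes before edge-gramming each resulting token; here only letters are kept, so digits act as separators:

```java
import java.util.ArrayList;
import java.util.List;

public class TokenCharsSketch {
    /** Splits on any character that is not a letter, then emits edge n-grams per token. */
    static List<String> letterEdgeGrams(String input, int minGram, int maxGram) {
        List<String> grams = new ArrayList<>();
        for (String token : input.split("[^\\p{L}]+")) {
            if (token.isEmpty()) {
                continue;
            }
            int longest = Math.min(maxGram, token.length());
            for (int len = minGram; len <= longest; len++) {
                grams.add(token.substring(0, len));
            }
        }
        return grams;
    }

    public static void main(String[] args) {
        // In "ab 12 cd" the digits and spaces are not kept, so only "ab" and "cd" are edge-grammed.
        System.out.println(letterEdgeGrams("ab 12 cd", 1, 2));
    }
}
```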
toJson
Specified by:
toJson in interface com.azure.json.JsonSerializable<LexicalTokenizer>
Overrides:
toJson in class LexicalTokenizer
Throws:
IOException
fromJson
Reads an instance of EdgeNGramTokenizer from the JsonReader.
Parameters:
jsonReader - The JsonReader being read.
Returns:
An instance of EdgeNGramTokenizer if the JsonReader was pointing to an instance of it, or null if it was pointing to JSON null.
Throws:
IllegalStateException - If the deserialized JSON object was missing any required properties.
IOException - If an error occurs while reading the EdgeNGramTokenizer.
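For orientation, a tokenizer of this type serializes to roughly the following JSON shape inside an index definition. The field names are assumed to follow the Azure AI Search REST API convention for custom tokenizers, and the values are illustrative:

```json
{
  "@odata.type": "#Microsoft.Azure.Search.EdgeNGramTokenizer",
  "name": "my_edge_ngram",
  "minGram": 2,
  "maxGram": 10,
  "tokenChars": [ "letter", "digit" ]
}
```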
setTokenChars
setTokenChars(List<TokenCharacterKind> tokenChars)
Set the tokenChars property: Character classes to keep in the tokens.
Parameters:
tokenChars - the tokenChars value to set.
Returns:
the EdgeNGramTokenizer object itself.