Module documentation

Tokenizer for lunr: splits strings (and lists of strings) into Token instances ready to be inserted into the search index.

Function default_separator Return a truthy value when a character is a separator (whitespace or hyphen) and should end the current token.
Function Tokenizer Splits a string into tokens ready to be inserted into the search index.
Constant SEPARATOR_CHARS Characters on which the default separator splits a string: whitespace and the hyphen.
def default_separator(char): (source)

Default separator predicate: returns a truthy value when char is one of SEPARATOR_CHARS (whitespace or the hyphen), marking it as a token boundary. Used when no separator argument is given to Tokenizer.
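
A minimal usage sketch; the import path (lunr.tokenizer.default_separator) and the exact return values are assumptions based on the signature and the SEPARATOR_CHARS constant documented below:

    # Sketch only: import path and return values are assumptions.
    from lunr.tokenizer import default_separator

    # Separator characters (whitespace and hyphen) are token boundaries ...
    assert default_separator(" ")
    assert default_separator("-")
    # ... while ordinary characters are not.
    assert not default_separator("a")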

def Tokenizer(obj, metadata=None, separator=None): (source)

Splits a string into tokens ready to be inserted into the search index.

The tokenizer converts its argument to a string by calling `str` and splits that string on characters for which `separator` is True. Lists have their elements converted to strings and wrapped in a lunr `Token`.

Args:
    metadata (dict): Optional metadata passed to the tokenizer; it is cloned and added as metadata to every token created from the object being tokenized.
    separator (callable or compiled regex): Predicate (or pattern) identifying the characters on which to split; defaults to `default_separator`.

Returns:
    List of Token instances.
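
A minimal usage sketch based on the docstring above; the import path (lunr.tokenizer.Tokenizer) and the rendering of tokens with str() are assumptions:

    # Sketch only: import path and expected output are assumptions.
    from lunr.tokenizer import Tokenizer

    # Default behaviour: split on whitespace and hyphens.
    tokens = Tokenizer("full-text search")
    print([str(token) for token in tokens])  # expected: ['full', 'text', 'search']

    # Optional metadata is cloned onto every resulting Token.
    tokens = Tokenizer("hello world", metadata={"field": "title"})

    # A custom separator can be a callable tested against single characters ...
    tokens = Tokenizer("a,b;c", separator=lambda char: char in ",;")

    # ... and list input wraps each element in its own Token instead of splitting it.
    tokens = Tokenizer(["alpha", "beta"])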

SEPARATOR_CHARS: str = (source)

Characters that the default separator treats as token boundaries: space, tab, newline, carriage return, form feed, vertical tab, and the hyphen.

Value
' \t\n\r\f\v-'
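
Because most of these characters are whitespace, the value is easiest to inspect through repr(); a small sketch, assuming the constant is importable as lunr.tokenizer.SEPARATOR_CHARS:

    # Sketch only: the import path is an assumption.
    from lunr.tokenizer import SEPARATOR_CHARS

    # repr() shows the whitespace characters as escape sequences
    # (\x0c is form feed, \x0b is vertical tab).
    print(repr(SEPARATOR_CHARS))  # expected: ' \t\n\r\x0c\x0b-'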