class documentation

Class that implements a lexer for a given environment. It is created automatically by the environment class, so you usually don't have to create one yourself. Note that the lexer is not automatically bound to an environment; multiple environments can share the same lexer.
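The sharing noted above can be observed directly. A small sketch (assuming Jinja2 is installed): Jinja caches lexers keyed by the lexer-relevant environment settings, so two environments configured identically reuse one `Lexer` instance.

```python
from jinja2 import Environment

# Two environments with identical settings: the lexer cache hands
# both the same Lexer instance.
env_a = Environment()
env_b = Environment()
print(env_a.lexer is env_b.lexer)

# Changing a lexer-relevant option (here, the variable delimiters)
# produces a different Lexer.
env_c = Environment(variable_start_string="<<", variable_end_string=">>")
print(env_a.lexer is env_c.lexer)
```

This is why the class docstring stresses that a lexer is not bound to any one environment: it only captures the lexing-related settings.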

Method __init__ Undocumented
Method tokeniter This method tokenizes the text and returns the tokens in a generator. Use this method if you just want to tokenize a template.
Method tokenize Calls `tokeniter` and `wrap` and returns the result as a token stream.
Method wrap This is called with the raw stream as returned by `tokeniter`; it wraps every token in a :class:`Token` and converts the value.
Instance Variable keep_trailing_newline Undocumented
Instance Variable lstrip_blocks Undocumented
Instance Variable newline_sequence Undocumented
Instance Variable rules Undocumented
Method _normalize_newlines Replace all newlines with the configured sequence in strings and template data.
def __init__(self, environment): (source)

Undocumented

Parameters
environment: Environment
    Undocumented
def tokeniter(self, source, name, filename=None, state=None): (source)

This method tokenizes the text and returns the tokens in a generator. Use this method if you just want to tokenize a template.

.. versionchanged:: 3.0
    Only ``\n``, ``\r\n`` and ``\r`` are treated as line breaks.

Parameters
source: str
    Undocumented
name: t.Optional[str]
    Undocumented
filename: t.Optional[str]
    Undocumented
state: t.Optional[str]
    Undocumented
Returns
t.Iterator[t.Tuple[int, str, str]]
    Undocumented
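A sketch of the raw output (assuming Jinja2 is installed): `tokeniter` yields plain `(lineno, token_type, value)` tuples, including low-level tokens such as whitespace that `wrap` later filters out.

```python
from jinja2 import Environment

lexer = Environment().lexer
# name is a required argument here (unlike tokenize, where it defaults
# to None); "example" is just an illustrative template name.
tokens = list(lexer.tokeniter("Hello {{ name }}!", "example"))

# Each item is a raw (lineno, token_type, value) tuple.
for lineno, token_type, value in tokens:
    print(lineno, token_type, repr(value))
```

The raw stream is mostly useful for tooling (highlighters, analyzers); template compilation goes through `tokenize` instead.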
def tokenize(self, source, name=None, filename=None, state=None): (source)

Calls `tokeniter` and `wrap` and returns the result as a token stream.

Parameters
source: str
    Undocumented
name: t.Optional[str]
    Undocumented
filename: t.Optional[str]
    Undocumented
state: t.Optional[str]
    Undocumented
Returns
TokenStream
    Undocumented
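For most callers this is the entry point. A short sketch (assuming Jinja2 is installed) showing the resulting `TokenStream` of `Token` objects:

```python
from jinja2 import Environment

lexer = Environment().lexer
stream = lexer.tokenize("Hello {{ name }}!")

# Iterating the TokenStream yields Token objects with lineno, type,
# and value attributes; whitespace tokens have already been dropped
# by wrap(), and iteration stops at the EOF token.
for token in stream:
    print(token.lineno, token.type, token.value)
```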
def wrap(self, stream, name=None, filename=None): (source)

This is called with the raw stream as returned by `tokeniter`; it wraps every token in a :class:`Token` and converts the value.

Parameters
stream: t.Iterable[t.Tuple[int, str, str]]
    Undocumented
name: t.Optional[str]
    Undocumented
filename: t.Optional[str]
    Undocumented
Returns
t.Iterator[Token]
    Undocumented
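A sketch of composing the two steps by hand (assuming Jinja2 is installed), which is essentially what `tokenize` does internally:

```python
from jinja2 import Environment

lexer = Environment().lexer

raw = lexer.tokeniter("{{ 42 }}", "example")
tokens = list(lexer.wrap(raw, "example"))

# wrap() drops the whitespace tokens and converts values: the
# literal "42" arrives as the integer 42 on an 'integer' token.
for token in tokens:
    print(token.type, repr(token.value))
```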
keep_trailing_newline = (source)

Undocumented

lstrip_blocks = (source)

Undocumented

newline_sequence = (source)

Undocumented

rules = (source)

Undocumented

def _normalize_newlines(self, value): (source)

Replace all newlines with the configured sequence in strings and template data.

Parameters
value: str
    Undocumented
Returns
str
    Undocumented
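A small demonstration (assuming Jinja2 is installed; note this is a private helper) using the environment's default `newline_sequence` of `"\n"`:

```python
from jinja2 import Environment

lexer = Environment().lexer  # default newline_sequence is "\n"
mixed = "line1\r\nline2\rline3\n"

# All three recognized newline styles (\r\n, \r, \n) are replaced
# with the configured sequence.
print(repr(lexer._normalize_newlines(mixed)))
```

An environment created with a different `newline_sequence` (e.g. `"\r\n"`) would normalize toward that sequence instead.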