Class ``Lexer``

Implements a lexer for a given environment. The environment class creates it automatically, so you usually do not have to do that yourself. Note that the lexer is not automatically bound to an environment; multiple environments can share the same lexer.
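The sharing behavior can be observed directly. This is a sketch assuming Jinja2 3.x, where ``Environment.lexer`` is a property backed by a cache keyed on the lexing configuration:

```python
from jinja2 import Environment

# Two environments with identical lexing configuration get the
# same cached Lexer instance.
a = Environment()
b = Environment()
print(a.lexer is b.lexer)  # True

# Changing a delimiter-related option produces a different lexer.
c = Environment(variable_start_string="[[", variable_end_string="]]")
print(c.lexer is a.lexer)  # False
```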
Method ``__init__``
    Undocumented

Method ``tokeniter``
    This method tokenizes the text and returns the tokens in a generator. Use this method if you just want to tokenize a template.

Method ``tokenize``
    Calls ``tokeniter`` + ``wrap`` and wraps the result in a token stream.

Method ``wrap``
    This is called with the stream as returned by ``tokeniter`` and wraps every token in a :class:`Token` and converts the value.

Instance Variable ``keep``
    Undocumented

Instance Variable ``lstrip``
    Undocumented

Instance Variable ``newline``
    Undocumented

Instance Variable ``rules``
    Undocumented

Method ``_normalize``
    Replace all newlines with the configured sequence in strings and template data.
Method ``tokeniter``

This method tokenizes the text and returns the tokens in a generator. Use this method if you just want to tokenize a template.

.. versionchanged:: 3.0
   Only ``\n``, ``\r\n`` and ``\r`` are treated as line breaks.

Parameters
    ``source: str``
        Undocumented
    ``name: t.Optional[``
        Undocumented
    ``filename: t.Optional[``
        Undocumented
    ``state: t.Optional[``
        Undocumented

Returns
    ``t.Iterator[``
        Undocumented
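As a sketch (assuming Jinja2 3.x, where the lexer is reachable via ``Environment.lexer`` and ``name`` is a positional argument), ``tokeniter`` yields raw ``(lineno, token_type, value)`` tuples; whitespace inside delimiters may appear as its own token, so it is filtered out here:

```python
from jinja2 import Environment

env = Environment()
source = "Hello {{ name }}!"

# tokeniter yields raw (lineno, token_type, value) tuples; whitespace
# tokens are skipped here to show only the meaningful ones.
for lineno, token_type, value in env.lexer.tokeniter(source, None):
    if token_type != "whitespace":
        print(lineno, token_type, repr(value))
```

For most purposes the higher-level ``tokenize`` method is more convenient, since it also converts values and wraps the result in a ``TokenStream``.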
Method ``tokenize``

Calls ``tokeniter`` + ``wrap`` and wraps the result in a token stream.

Parameters
    ``source: str``
        Undocumented
    ``name: t.Optional[``
        Undocumented
    ``filename: t.Optional[``
        Undocumented
    ``state: t.Optional[``
        Undocumented

Returns
    ``TokenStream``
        Undocumented
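A minimal sketch (assuming Jinja2 3.x): ``tokenize`` returns a ``TokenStream`` whose tokens already carry converted values, e.g. an integer literal arrives as an ``int``:

```python
from jinja2 import Environment

env = Environment()
stream = env.lexer.tokenize("{{ 42 }}")

# Iterating a TokenStream yields Token(lineno, type, value) tuples
# up to (but not including) the trailing EOF token.
tokens = list(stream)
print([t.type for t in tokens])  # ['variable_begin', 'integer', 'variable_end']
print(tokens[1].value)           # 42 (an int, not the string "42")
```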
Method ``wrap``

This is called with the stream as returned by ``tokeniter`` and wraps every token in a :class:`Token` and converts the value.

Parameters
    ``stream: t.Iterable[``
        Undocumented
    ``name: t.Optional[``
        Undocumented
    ``filename: t.Optional[``
        Undocumented

Returns
    ``t.Iterator[``
        Undocumented
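As a sketch of the division of labor (assuming Jinja2 3.x), ``wrap`` consumes the raw tuples from ``tokeniter``, drops ignored tokens such as whitespace, and converts values: string literals lose their quotes, integer literals become ints, and operators are reported under symbolic token names:

```python
from jinja2 import Environment

env = Environment()
raw = env.lexer.tokeniter("{{ 'a' ~ 1 }}", None)

# wrap() turns raw tuples into Token objects with converted values:
# the string token's value is 'a' (quotes stripped), the integer's
# is the int 1, and the '~' operator gets the token name 'tilde'.
for token in env.lexer.wrap(raw):
    print(token.type, repr(token.value))
```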