
class Tokenizer: (source)


Context-sensitive token parsing. Provides methods to examine the input stream to check whether the next token matches.

Method __init__ Initialize the tokenizer with the source string and a mapping of token rules.
Method check Check whether the next token has the provided name.
Method consume Move past the token with the provided name, if it is at the current position.
Method enclosing_tokens Context manager that matches a pair of enclosing tokens around a block.
Method expect Expect a certain token name next, failing with a syntax error otherwise.
Method raise_syntax_error Raise ParserSyntaxError at the given position.
Method read Consume the next token and return it.
Instance Variable next_token The token loaded by the last successful check, awaiting read (None otherwise).
Instance Variable position Current offset into the source string.
Instance Variable rules Mapping of token names to compiled regular expression patterns.
Instance Variable source The string being tokenized.
def __init__(self, source: str, *, rules: Dict[str, Union[str, re.Pattern[str]]]): (source)

Initialize the tokenizer with the source string and a rules mapping from token names to patterns; patterns given as plain strings are compiled to regular expressions.
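Since the parameter accepts either strings or pre-compiled patterns while the `rules` attribute holds only `re.Pattern` objects, the constructor implies a normalization step. A minimal sketch of that step (the helper name is illustrative, not from the library):

```python
import re
from typing import Dict, Union


def compile_rules(rules: Dict[str, Union[str, re.Pattern]]) -> Dict[str, re.Pattern]:
    # Compile plain-string patterns; pass pre-compiled patterns through unchanged.
    return {
        name: pattern if isinstance(pattern, re.Pattern) else re.compile(pattern)
        for name, pattern in rules.items()
    }


rules = compile_rules({"IDENT": r"[A-Za-z_]\w*", "WS": re.compile(r"\s+")})
```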

def check(self, name: str, *, peek: bool = False) -> bool: (source)

Check whether the next token has the provided name. By default, if the check succeeds, the token *must* be read before another check. If `peek` is set to `True`, the token is not loaded and would need to be checked again.
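The check-then-read discipline described above can be illustrated with a minimal, self-contained sketch; `MiniTokenizer` and its details are assumptions for illustration, not the library's implementation:

```python
import re
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class Token:
    name: str
    text: str
    position: int


class MiniTokenizer:
    """Sketch of the check/read protocol: a successful non-peek check
    loads the token, and it must be read before the next check."""

    def __init__(self, source: str, rules: Dict[str, str]) -> None:
        self.source = source
        self.rules = {name: re.compile(pat) for name, pat in rules.items()}
        self.position = 0
        self.next_token: Optional[Token] = None

    def check(self, name: str, *, peek: bool = False) -> bool:
        # A previously loaded token must be read before checking again.
        assert self.next_token is None, "previous token was not read"
        match = self.rules[name].match(self.source, self.position)
        if match is None:
            return False
        if not peek:
            # Load the token so the next read() returns it.
            self.next_token = Token(name, match.group(), self.position)
        return True

    def read(self) -> Token:
        token = self.next_token
        assert token is not None, "no token loaded; call check() first"
        self.next_token = None
        self.position += len(token.text)
        return token


tok = MiniTokenizer("abc 123", {"WORD": r"[a-z]+", "NUM": r"\d+", "WS": r"\s+"})
assert tok.check("WORD", peek=True)  # peek: nothing is loaded
assert tok.check("WORD")             # loads the token; it must be read next
word = tok.read()                    # the loaded token: name "WORD", text "abc"
```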

def consume(self, name: str): (source)

Move past the token with the provided name, if it is at the current position; otherwise do nothing.

@contextlib.contextmanager
def enclosing_tokens(self, open_token: str, close_token: str) -> Iterator[bool]: (source)

Context manager that consumes open_token if it is next, yields whether it was found, and, if it was, requires the matching close_token when the block exits.
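A self-contained sketch of that paired-token pattern, using a simplified token stream in place of the real tokenizer (all names here are illustrative assumptions):

```python
import contextlib
from typing import Iterator, List


class TokenStream:
    """Illustrative stand-in for the tokenizer: a list of token names and a cursor."""

    def __init__(self, names: List[str]) -> None:
        self.names = names
        self.position = 0

    def check(self, name: str) -> bool:
        return self.position < len(self.names) and self.names[self.position] == name

    def read(self) -> str:
        name = self.names[self.position]
        self.position += 1
        return name

    @contextlib.contextmanager
    def enclosing_tokens(self, open_token: str, close_token: str) -> Iterator[bool]:
        # Consume the opening token if present and report whether it was found.
        opened = self.check(open_token)
        if opened:
            self.read()
        yield opened
        # If we opened, the matching close token is mandatory on exit.
        if opened:
            if not self.check(close_token):
                raise SyntaxError(f"expected closing {close_token!r}")
            self.read()


stream = TokenStream(["LPAREN", "IDENT", "RPAREN"])
with stream.enclosing_tokens("LPAREN", "RPAREN") as opened:
    assert opened
    assert stream.read() == "IDENT"
# after the block, RPAREN has also been consumed
```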

def expect(self, name: str, *, expected: str) -> Token: (source)

Expect a certain token name next, failing with a syntax error otherwise. The token is *not* read.

def raise_syntax_error(self, message: str, *, span_start: Optional[int] = None, span_end: Optional[int] = None) -> NoReturn: (source)

Raise ParserSyntaxError at the given position, highlighting the span between span_start and span_end (each defaulting to the current position).
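A sketch of how such an error might render the offending span with a caret marker; this ParserSyntaxError and the free function are stand-ins for illustration, not the library's definitions:

```python
from typing import NoReturn, Optional, Tuple


class ParserSyntaxError(Exception):
    """Illustrative error type carrying the message, the source, and the bad span."""

    def __init__(self, message: str, *, source: str, span: Tuple[int, int]) -> None:
        self.span = span
        start, end = span
        # Underline the span and point a caret at its end.
        marker = " " * start + "~" * max(end - start - 1, 0) + "^"
        super().__init__(f"{message}\n    {source}\n    {marker}")


def raise_syntax_error(
    source: str,
    position: int,
    message: str,
    *,
    span_start: Optional[int] = None,
    span_end: Optional[int] = None,
) -> NoReturn:
    # Both ends of the span default to the current position, as the signature suggests.
    span = (
        position if span_start is None else span_start,
        position if span_end is None else span_end,
    )
    raise ParserSyntaxError(message, source=source, span=span)
```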

def read(self) -> Token: (source)

Consume the next token and return it.

next_token = (source)

The token loaded by the most recent successful check, or None once it has been read (or before any check has matched).

position: int = (source)

Current offset into the source string.

rules: Dict[str, re.Pattern[str]] = (source)

Mapping of token names to compiled regular expression patterns; string rules passed to the constructor are compiled into this form.

source: str = (source)

The string being tokenized.