pygments.lexer module documentation

Base lexer classes.

:copyright: Copyright 2006-2022 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.

Class combined Indicates a state combined from multiple states.
Class default Indicates a state or state action (e.g. ``#pop``) to apply. For example, ``default('#pop')`` is equivalent to ``('', Token, '#pop')``. Note that state tuples may be used as well.
Class DelegatingLexer This lexer takes two lexers as arguments: a root lexer and a language lexer. First everything is scanned using the language lexer, afterwards all ``Other`` tokens are lexed using the root lexer.
Class ExtendedRegexLexer A RegexLexer that uses a context object to store its state.
Class include Indicates that a state should include rules from another state.
Class Lexer Lexer for a specific language.
Class LexerContext A helper object that holds lexer position data.
Class LexerMeta This metaclass automagically converts ``analyse_text`` methods into static methods which always return float values.
Class ProfilingRegexLexer Drop-in replacement for RegexLexer that does profiling of its regexes.
Class ProfilingRegexLexerMeta Metaclass for ProfilingRegexLexer, collects regex timing info.
Class RegexLexer Base for simple stateful regular expression-based lexers. Simplifies the lexing process so that you need only provide a list of states and regular expressions. See the sketch after this list for a minimal example.
Class RegexLexerMeta Metaclass for RegexLexer, creates the self._tokens attribute from self.tokens on the first instantiation.
Class words Indicates a list of literal words that is transformed into an optimized regex that matches any of the words.
Function bygroups Callback that yields multiple actions for each group in the match.
Function do_insertions Helper for lexers which must combine the results of several sublexers.
Function using Callback that processes the match with a different lexer.
Variable inherit Undocumented
Variable line_re Undocumented
Variable this Undocumented
Class _inherit Indicates that a state should inherit from its superclass.
Class _PseudoMatch A pseudo match object constructed from a string.
Class _This Special singleton used for indicating the caller class. Used by ``using``.
Variable _default_analyse Undocumented
Variable _encoding_map Undocumented
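
Putting several of these pieces together, the sketch below shows how a ``RegexLexer`` subclass typically combines states, ``include``, ``words``, and ``default`` in its ``tokens`` table. The ``ToyLexer`` class, its state names, and its rules are illustrative assumptions, not part of any lexer shipped with Pygments::

    from pygments.lexer import RegexLexer, include, words, default
    from pygments.token import Comment, Keyword, Name, String, Text

    class ToyLexer(RegexLexer):
        """Hypothetical lexer illustrating states, include, words, and default."""
        name = 'Toy'
        aliases = ['toy']

        tokens = {
            'whitespace': [
                (r'\s+', Text),
                (r'#.*$', Comment.Single),
            ],
            'root': [
                include('whitespace'),                         # pull in shared rules
                (words(('if', 'else', 'while'), suffix=r'\b'), Keyword),
                (r'func\b', Keyword.Declaration, 'funcname'),  # push the 'funcname' state
                (r'"', String, 'string'),                      # push the 'string' state
                (r'\w+', Name),
                (r'.', Text),
            ],
            'funcname': [
                include('whitespace'),
                (r'\w+', Name.Function, '#pop'),
                default('#pop'),            # no name follows: silently return to 'root'
            ],
            'string': [
                (r'[^"\\]+', String),
                (r'\\.', String.Escape),
                (r'"', String, '#pop'),     # closing quote returns to the previous state
            ],
        }

Iterating over ``ToyLexer().get_tokens('func add(a, b)')`` then yields ``(token_type, value)`` pairs for a formatter to consume.
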
def bygroups(*args): (source)

Callback that yields multiple actions for each group in the match.
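
A minimal sketch of ``bygroups`` inside a rule: each regex group is paired with the token type (or callback) at the same position. The ``AssignLexer`` class and its single rule are hypothetical::

    from pygments.lexer import RegexLexer, bygroups
    from pygments.token import Name, Number, Operator, Text

    class AssignLexer(RegexLexer):
        """Hypothetical lexer: one rule, three groups, three token types."""
        name = 'Assign'
        tokens = {
            'root': [
                # group 1 -> variable name, group 2 -> '=', group 3 -> integer literal
                (r'(\w+)(\s*=\s*)(\d+)', bygroups(Name.Variable, Operator, Number.Integer)),
                (r'\s+', Text),
            ],
        }

``list(AssignLexer().get_tokens('x = 42'))`` then yields roughly ``[(Token.Name.Variable, 'x'), (Token.Operator, ' = '), (Token.Literal.Number.Integer, '42'), (Token.Text, '\n')]``; the trailing newline is added by ``get_tokens``.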

def do_insertions(insertions, tokens): (source)

Helper for lexers which must combine the results of several sublexers. ``insertions`` is a list of ``(index, itokens)`` pairs. Each ``itokens`` iterable should be inserted at position ``index`` into the token stream given by the ``tokens`` argument. The result is a combined token stream. TODO: clean up the code here.
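
The usual pattern, modeled loosely on Pygments' console lexers, is to collect prompt tokens as insertions while accumulating the code that follows the prompts, then merge them into the sublexer's output. The ``ToyConsoleLexer`` class and its ``>>>`` prompt convention below are assumptions for illustration, and character offsets are simplified::

    from pygments.lexer import Lexer, do_insertions
    from pygments.lexers import PythonLexer
    from pygments.token import Generic

    class ToyConsoleLexer(Lexer):
        """Hypothetical console lexer: '>>> ' prompts followed by Python code."""
        name = 'ToyConsole'

        def get_tokens_unprocessed(self, text):
            pylexer = PythonLexer(**self.options)
            insertions = []   # list of (offset into `code`, [(0, token, value)]) pairs
            code = ''
            for line in text.splitlines(keepends=True):
                if line.startswith('>>> '):
                    # remember that a prompt token belongs at this offset of `code`
                    insertions.append((len(code), [(0, Generic.Prompt, line[:4])]))
                    code += line[4:]
                else:
                    # flush the accumulated code through the sublexer,
                    # merging the remembered prompt tokens back in
                    yield from do_insertions(
                        insertions, pylexer.get_tokens_unprocessed(code))
                    insertions, code = [], ''
                    yield 0, Generic.Output, line   # offsets simplified for brevity
            if code:
                yield from do_insertions(
                    insertions, pylexer.get_tokens_unprocessed(code))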

def using(_other, **kwargs): (source)

Callback that processes the match with a different lexer. The keyword arguments are forwarded to the lexer, except `state` which is handled separately. `state` specifies the state that the new lexer will start in, and can be an enumerable such as ('root', 'inline', 'string') or a simple string which is assumed to be on top of the root state. Note: For that to work, `_other` must not be an `ExtendedRegexLexer`.
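
A sketch of ``using`` as a rule callback: the text matched by one group is re-lexed with another lexer (here ``PythonLexer``), while ``using(this, state='...')`` would re-lex it with the current lexer starting in a different state. The ``TemplateishLexer`` class and its ``{% py %}`` delimiters are invented for this example::

    import re

    from pygments.lexer import RegexLexer, bygroups, using
    from pygments.lexers import PythonLexer
    from pygments.token import Keyword, Other

    class TemplateishLexer(RegexLexer):
        """Hypothetical lexer: code between {% py %} ... {% end %} is Python."""
        name = 'Templateish'
        flags = re.MULTILINE | re.DOTALL   # let .*? span several lines

        tokens = {
            'root': [
                # delegate the middle group to a fresh PythonLexer;
                # using(this, state='...') would re-lex it with this lexer instead
                (r'(\{% py %\})(.*?)(\{% end %\})',
                 bygroups(Keyword, using(PythonLexer), Keyword)),
                (r'[^{]+', Other),
                (r'\{', Other),
            ],
        }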

inherit = (source)

Undocumented

line_re = (source)

Undocumented

this = (source)

Undocumented

_default_analyse = (source)

Undocumented

_encoding_map: list[tuple] = (source)

Undocumented