class documentation

Convert natural language type strings to reStructuredText.

Syntax is based on the numpydoc type specification, with additional recognition of PEP 484-like type annotations (using parenthesis or square bracket characters).

Examples of valid type strings and their output
Type string                                                                Output
-----------                                                                ------
List[str] or list(bytes), optional                                         List[str] or list(bytes), optional
{"html", "json", "xml"}, optional                                          {"html", "json", "xml"}, optional
list of int or float or None, default: None                                list of int or float or None, default: None
`complicated string` or `strIO <twisted.python.compat.NativeStringIO>`     complicated string or strIO
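
A minimal usage sketch follows. The page header does not show the class or module name, so the name TypeDocstring and the import path below are assumptions; adjust them to the actual class.

    # Assumed class name and import path -- adjust to the actual class.
    from pydoctor.napoleon.docstring import TypeDocstring

    spec = TypeDocstring("list of int or float or None, default: None")
    rst = str(spec)    # the type string rendered as reStructuredText
    print(rst)
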
Method __init__ Undocumented
Method __str__ No summary
Instance Variable warnings Undocumented
Class Method _tokenize_type_spec Split the string into tokens for further processing.
Static Method _recombine_set_tokens Merge the special literal choices tokens together.
Method _build_tokens Undocumented
Method _convert_type_spec_to_rst Undocumented
Method _token_type Find the type of a token. Types are defined in the TokenType enum.
Method _trigger_warnings Append some warnings.
Class Variable _ast_like_delimiters_regex Undocumented
Class Variable _ast_like_delimiters_regex_str Undocumented
Class Variable _default_regex Undocumented
Class Variable _natural_language_delimiters_regex Undocumented
Class Variable _natural_language_delimiters_regex_str Undocumented
Class Variable _token_regex Undocumented
Instance Variable _annotation Undocumented
Instance Variable _tokens Undocumented
Instance Variable _warns_on_unknown_tokens Undocumented
def __init__(self, annotation, warns_on_unknown_tokens=False): (source)

Undocumented

Parameters
annotation: str
    Undocumented
warns_on_unknown_tokens: bool
    Undocumented
def __str__(self): (source)
Returns
str
    The parsed type in reStructuredText format.
warnings: List[str] = (source)

Undocumented
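
A hedged sketch of inspecting the warnings attribute after parsing; the class name, import path, and the exact warning text are assumptions:

    from pydoctor.napoleon.docstring import TypeDocstring  # assumed import path

    spec = TypeDocstring("list of (int", warns_on_unknown_tokens=True)
    print(str(spec))
    for message in spec.warnings:
        print(message)    # e.g. a warning about the unbalanced parenthesis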

@classmethod
def _tokenize_type_spec(cls, spec): (source)

Split the string into tokens for further processing.

Parameters
spec: str
    Undocumented
Returns
List[str]
    Undocumented
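
The splitting rules are not documented on this page, so the following is only a rough, self-contained sketch of how a split-based tokenizer for such type specs can work; the delimiter pattern below is an assumption, not the class's actual _token_regex.

    import re

    # Simplified stand-in for the class's _token_regex (assumption): split on
    # the natural-language delimiters while keeping them as tokens.
    _sketch_token_regex = re.compile(r"(\sor\s|\sof\s|\sto\s|,\s|:\s)")

    def tokenize_type_spec(spec: str) -> list[str]:
        # re.split() with a capturing group keeps the delimiters; drop the
        # empty strings the split produces.
        return [token for token in _sketch_token_regex.split(spec) if token]

    print(tokenize_type_spec("list of int or float or None, default: None"))
    # ['list', ' of ', 'int', ' or ', 'float', ' or ', 'None', ', ', 'default', ': ', 'None']
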
@staticmethod
def _recombine_set_tokens(tokens): (source)

Merge the special literal choices tokens together.

Example

>>> tokens = ["{", "1", ", ", "2", "}"]
>>> ann._recombine_set_tokens(tokens)
['{1, 2}']
Parameters
tokens: List[str]
    Undocumented
Returns
List[str]
    Undocumented
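
A self-contained sketch of the merging step shown in the example above (not the class's actual implementation): whenever an opening brace token is seen, everything up to the matching closing brace is joined back into a single token.

    def recombine_set_tokens(tokens: list[str]) -> list[str]:
        combined: list[str] = []
        iterator = iter(tokens)
        for token in iterator:
            if token == "{":
                # Collect tokens until the closing brace and re-join them.
                parts = [token]
                for inner in iterator:
                    parts.append(inner)
                    if inner == "}":
                        break
                combined.append("".join(parts))
            else:
                combined.append(token)
        return combined

    print(recombine_set_tokens(["{", "1", ", ", "2", "}"]))   # ['{1, 2}']
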
def _build_tokens(self, _tokens): (source)

Undocumented

Parameters
_tokens: List[Union[str, Any]]
    Undocumented
Returns
List[Tuple[str, TokenType]]
    Undocumented
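
The method itself is undocumented; its signature suggests it pairs every token with the token type found by _token_type. A rough sketch of that pairing, with a trivial classifier standing in for _token_type:

    from typing import Any, Callable, List, Tuple

    def build_tokens(tokens: List[str],
                     token_type: Callable[[str], Any]) -> List[Tuple[str, Any]]:
        # Pair every token with whatever type the classifier assigns to it.
        return [(token, token_type(token)) for token in tokens]

    # Stand-in classifier for illustration only.
    classify = lambda tok: "delimiter" if tok.strip() in ("of", "or", ",") else "obj"
    print(build_tokens(["list", " of ", "int"], classify))
    # [('list', 'obj'), (' of ', 'delimiter'), ('int', 'obj')]
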
def _convert_type_spec_to_rst(self): (source)

Undocumented

Returns
str
    Undocumented
def _token_type(self, token): (source)

Find the type of a token. Types are defined in the TokenType enum.

Parameters
token: Union[str, Any]
    Undocumented
Returns
TokenType
    Undocumented
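
The classification rules are not documented on this page; the following is only a rough sketch, and the TokenType member names used here are assumptions:

    import enum

    class TokenType(enum.Enum):          # assumed member names, for illustration
        LITERAL = enum.auto()            # e.g. '"html"', "{1, 2}", "True", "42"
        OBJ = enum.auto()                # e.g. "list", "numpy.ndarray"
        DELIMITER = enum.auto()          # e.g. " of ", " or ", ", "
        CONTROL = enum.auto()            # e.g. "optional", "default"
        UNKNOWN = enum.auto()

    def token_type(token: str) -> TokenType:
        # Rough classification sketch, not the class's actual rules.
        stripped = token.strip()
        if stripped.startswith(("{", '"', "'")) or stripped.isdigit():
            return TokenType.LITERAL
        if stripped in ("of", "or", "to", "and", ",", ":"):
            return TokenType.DELIMITER
        if stripped in ("optional", "default"):
            return TokenType.CONTROL
        if stripped.replace(".", "").isidentifier():
            return TokenType.OBJ
        return TokenType.UNKNOWN
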
def _trigger_warnings(self): (source)

Append some warnings.
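
The docstring only says that warnings are appended; one plausible check, sketched here purely as an assumption, is verifying that opening and closing delimiters in the original annotation are balanced:

    def trigger_warnings(annotation: str, warnings: list[str]) -> None:
        # Hedged sketch: report unbalanced delimiter pairs in the type string.
        pairs = (("(", ")", "parenthesis"),
                 ("[", "]", "square bracket"),
                 ("{", "}", "curly brace"))
        for opener, closer, name in pairs:
            if annotation.count(opener) != annotation.count(closer):
                warnings.append(f"unbalanced {name} in type expression: {annotation!r}")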

_ast_like_delimiters_regex = (source)

Undocumented

_ast_like_delimiters_regex_str: str = (source)

Undocumented

_default_regex = (source)

Undocumented

_natural_language_delimiters_regex = (source)

Undocumented

_natural_language_delimiters_regex_str: str = (source)

Undocumented

_token_regex = (source)

Undocumented

_annotation = (source)

Undocumented

_warns_on_unknown_tokens = (source)

Undocumented