pygments: add typing annotations for missing base files of pygments
Diff from mypy_primer, showing the effect of this PR on open source code:
pwndbg (https://github.com/pwndbg/pwndbg)
+ pwndbg/integration/binja.py: note: In member "decompile" of class "BinjaProvider":
+ pwndbg/integration/binja.py:503: error: No overload variant of "format" matches argument types "list[tuple[Any, str]]", "Terminal256Formatter[str]" [call-overload]
+ pwndbg/integration/binja.py:503: note: Possible overload variants:
+ pwndbg/integration/binja.py:503: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: SupportsWrite[_T]) -> None
+ pwndbg/integration/binja.py:503: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: None = ...) -> _T
+ pwndbg/integration/binja.py:509: error: No overload variant of "format" matches argument types "list[tuple[Any, str]]", "Terminal256Formatter[str]" [call-overload]
+ pwndbg/integration/binja.py:509: note: Possible overload variants:
+ pwndbg/integration/binja.py:509: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: SupportsWrite[_T]) -> None
+ pwndbg/integration/binja.py:509: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: None = ...) -> _T
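The call-overload failures suggest the stub's `format` declares `tokens: Iterator[...]` while pwndbg passes a plain list. A minimal, self-contained sketch of the mismatch (simplified signatures, not the real pygments API):

```python
from typing import Iterable, Iterator

def format_strict(tokens: Iterator[tuple[object, str]]) -> str:
    # Declared to take an Iterator, as in the reported stub signature;
    # a type checker rejects a list argument even though iteration works.
    return "".join(value for _, value in tokens)

def format_lenient(tokens: Iterable[tuple[object, str]]) -> str:
    # Accepting Iterable admits lists, tuples, and generators alike.
    return "".join(value for _, value in tokens)

token_list = [(None, "int"), (None, " x")]
# format_strict(token_list)       # flagged: "list" is not an "Iterator"
print(format_lenient(token_list))  # runtime is happy either way: "int x"
```

At runtime pygments iterates the tokens either way; the question the diff raises is only whether the stub should say `Iterator` or the looser `Iterable`.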
sphinx (https://github.com/sphinx-doc/sphinx)
+ sphinx/highlighting.py: note: In member "get_lexer" of class "PygmentsBridge":
+ sphinx/highlighting.py:179:30: error: Argument 1 to "add_filter" of "Lexer" has incompatible type "str"; expected "Filter" [arg-type]
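The arg-type error indicates `add_filter` is annotated to accept only a `Filter`, while sphinx passes a filter name string. At runtime pygments resolves names to filter instances, so the stub parameter plausibly wants `Filter | str`; a simplified sketch with a hypothetical registry, not the real implementation:

```python
from typing import Union

class Filter:
    """Stand-in for pygments.filter.Filter."""

# Hypothetical name-to-class registry standing in for pygments' filter lookup.
_registry: dict[str, type[Filter]] = {"raiseonerror": Filter}

class Lexer:
    def __init__(self) -> None:
        self.filters: list[Filter] = []

    def add_filter(self, filter_: Union[Filter, str]) -> None:
        # Widening the parameter to Filter | str matches the runtime
        # behaviour: a string is looked up and instantiated first.
        if isinstance(filter_, str):
            filter_ = _registry[filter_]()
        self.filters.append(filter_)

lexer = Lexer()
lexer.add_filter("raiseonerror")  # accepted at runtime; the strict stub rejects it
print(len(lexer.filters))  # 1
```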
strawberry (https://github.com/strawberry-graphql/strawberry)
+ strawberry/utils/graphql_lexer.py:15: error: Incompatible types in assignment (expression has type "dict[str, list[tuple[str, Any]]]", base class "RegexLexer" defined the type as "dict[str, list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]]]") [assignment]
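The strawberry assignment error is the classic invariance problem: overriding a mutable class attribute such as `tokens` requires exactly the base class's declared type, not merely an overlapping one. A hedged sketch with illustrative types, not strawberry's actual lexer:

```python
from typing import Any, Union

class Base:
    # Declared with a union value type, like the stub's RegexLexer.tokens.
    tokens: dict[str, list[Union[tuple[str, int], tuple[str, int, str]]]] = {}

class Sub(Base):
    # mypy flags this [assignment]: dict and list are invariant, so
    # list[tuple[str, Any]] is not interchangeable with the base's declared
    # union, even though every entry here is fine at runtime.
    tokens: dict[str, list[tuple[str, Any]]] = {"root": [("query", 1)]}

print(Sub.tokens["root"])
```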
alectryon (https://github.com/cpitclaudel/alectryon)
+ alectryon/pygments_lexer.py:432: error: List item 0 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:433: error: List item 1 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:434: error: List item 2 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:435: error: List item 3 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:439: error: List item 0 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:440: error: List item 1 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:441: error: List item 2 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:456: error: Dict entry 3 has incompatible type "str": "list[tuple[str | Any, ...]]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]]" [dict-item]
+ alectryon/pygments_lexer.py:457: error: Dict entry 4 has incompatible type "str": "list[tuple[str | Any, ...]]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]]" [dict-item]
+ alectryon/pygments_lexer.py:465: error: List item 0 has incompatible type "tuple[str, Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:471: error: List item 2 has incompatible type "tuple[str, Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:475: error: List item 5 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:477: error: Dict entry 7 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]]" [dict-item]
+ alectryon/pygments_lexer.py:483: error: List item 3 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:484: error: List item 4 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:485: error: List item 5 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:486: error: List item 6 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:487: error: List item 7 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:488: error: List item 8 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:490: error: Dict entry 9 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]]" [dict-item]
+ alectryon/pygments_lexer.py:491: error: Dict entry 10 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]]" [dict-item]
+ alectryon/pygments_lexer.py:492: error: Dict entry 11 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]]" [dict-item]
+ alectryon/pygments_lexer.py:496: error: List item 2 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:501: error: List item 2 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:506: error: List item 2 has incompatible type "include"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
+ alectryon/pygments_lexer.py:507: error: List item 3 has incompatible type "default"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]], str]" [list-item]
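The alectryon errors all trace to one cause: its token tables use pygments' `include`, `default`, `words`, and callable helpers, which this first version of the stub's `tokens` type did not admit (the later diffs below show the union being widened step by step). A self-contained sketch with simplified stand-ins for the helpers, not the real pygments classes:

```python
from typing import Union

class include(str):
    """Stand-in marker: splice in the rules of another state here."""

class default:
    """Stand-in rule that matches nothing and just switches state."""
    def __init__(self, state: str) -> None:
        self.state = state

# A token table mixing plain (regex, token) rules with the helpers, as
# alectryon's lexer does; the stub's value type must be a union wide enough
# to cover every form, or each helper entry is flagged [list-item].
Rule = Union[tuple[str, object], tuple[str, object, str], include, default]
tokens: dict[str, list[Rule]] = {
    "root": [(r"\d+", "Number"), include("common"), default("pop")],
    "common": [(r"\s+", "Whitespace")],
}
print(sorted(tokens))  # the states defined in the table
```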
Diff from mypy_primer, showing the effect of this PR on open source code:
pwndbg (https://github.com/pwndbg/pwndbg)
+ pwndbg/integration/binja.py: note: In member "decompile" of class "BinjaProvider":
+ pwndbg/integration/binja.py:503: error: No overload variant of "format" matches argument types "list[tuple[Any, str]]", "Terminal256Formatter[str]" [call-overload]
+ pwndbg/integration/binja.py:503: note: Possible overload variants:
+ pwndbg/integration/binja.py:503: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: SupportsWrite[_T]) -> None
+ pwndbg/integration/binja.py:503: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: None = ...) -> _T
+ pwndbg/integration/binja.py:509: error: No overload variant of "format" matches argument types "list[tuple[Any, str]]", "Terminal256Formatter[str]" [call-overload]
+ pwndbg/integration/binja.py:509: note: Possible overload variants:
+ pwndbg/integration/binja.py:509: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: SupportsWrite[_T]) -> None
+ pwndbg/integration/binja.py:509: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: None = ...) -> _T
sphinx (https://github.com/sphinx-doc/sphinx)
+ sphinx/highlighting.py:39: error: Unused "type: ignore" comment [unused-ignore]
+ sphinx/highlighting.py:39:5: error: Cannot assign to a method [method-assign]
+ sphinx/highlighting.py:39:5: note: Error code "method-assign" not covered by "type: ignore" comment
+ sphinx/highlighting.py:39:35: error: Incompatible types in assignment (expression has type "classmethod[Any, [Any], Any]", variable has type "Callable[[Any], GenericAlias]") [assignment]
+ sphinx/highlighting.py:39:35: note: Error code "assignment" not covered by "type: ignore" comment
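The sphinx errors suggest highlighting.py:39 monkeypatches a method on a pygments class under a `type: ignore` whose error codes no longer match what mypy reports. A hedged, self-contained sketch of the pattern (class and function names are illustrative, not sphinx's actual code):

```python
# Illustrative third-party class (not sphinx's real target):
class ThirdParty:
    @classmethod
    def describe(cls, text: str) -> float:
        return 0.0

def patched(cls: type, text: str) -> float:
    return 1.0

# Monkeypatching a method is flagged [method-assign], and wrapping a plain
# function in classmethod() is flagged [assignment]; a mismatched
# "type: ignore" then also earns [unused-ignore]. The ignore comment must
# name the codes mypy actually reports:
ThirdParty.describe = classmethod(patched)  # type: ignore[method-assign, assignment]
print(ThirdParty.describe("x"))  # 1.0
```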
alectryon (https://github.com/cpitclaudel/alectryon)
+ alectryon/pygments_lexer.py:456: error: Dict entry 3 has incompatible type "str": "list[tuple[str | Any, ...]]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include]" [dict-item]
+ alectryon/pygments_lexer.py:457: error: Dict entry 4 has incompatible type "str": "list[tuple[str | Any, ...]]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include]" [dict-item]
+ alectryon/pygments_lexer.py:475: error: List item 5 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
+ alectryon/pygments_lexer.py:477: error: Dict entry 7 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include]" [dict-item]
+ alectryon/pygments_lexer.py:483: error: List item 3 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
+ alectryon/pygments_lexer.py:484: error: List item 4 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
+ alectryon/pygments_lexer.py:485: error: List item 5 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
+ alectryon/pygments_lexer.py:486: error: List item 6 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
+ alectryon/pygments_lexer.py:487: error: List item 7 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
+ alectryon/pygments_lexer.py:488: error: List item 8 has incompatible type "tuple[words, _TokenType]"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
+ alectryon/pygments_lexer.py:490: error: Dict entry 9 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include]" [dict-item]
+ alectryon/pygments_lexer.py:491: error: Dict entry 10 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include]" [dict-item]
+ alectryon/pygments_lexer.py:492: error: Dict entry 11 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include]" [dict-item]
+ alectryon/pygments_lexer.py:507: error: List item 3 has incompatible type "default"; expected "tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include" [list-item]
Diff from mypy_primer, showing the effect of this PR on open source code:
pwndbg (https://github.com/pwndbg/pwndbg)
+ pwndbg/integration/binja.py: note: In member "decompile" of class "BinjaProvider":
+ pwndbg/integration/binja.py:503: error: No overload variant of "format" matches argument types "list[tuple[Any, str]]", "Terminal256Formatter[str]" [call-overload]
+ pwndbg/integration/binja.py:503: note: Possible overload variants:
+ pwndbg/integration/binja.py:503: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: SupportsWrite[_T]) -> None
+ pwndbg/integration/binja.py:503: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: None = ...) -> _T
+ pwndbg/integration/binja.py:509: error: No overload variant of "format" matches argument types "list[tuple[Any, str]]", "Terminal256Formatter[str]" [call-overload]
+ pwndbg/integration/binja.py:509: note: Possible overload variants:
+ pwndbg/integration/binja.py:509: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: SupportsWrite[_T]) -> None
+ pwndbg/integration/binja.py:509: note: def [_T: (str, bytes)] format(tokens: Iterator[tuple[_TokenType, str]], formatter: Formatter[_T], outfile: None = ...) -> _T
sphinx (https://github.com/sphinx-doc/sphinx)
+ sphinx/highlighting.py:39: error: Unused "type: ignore" comment [unused-ignore]
+ sphinx/highlighting.py:39:5: error: Cannot assign to a method [method-assign]
+ sphinx/highlighting.py:39:5: note: Error code "method-assign" not covered by "type: ignore" comment
+ sphinx/highlighting.py:39:35: error: Incompatible types in assignment (expression has type "classmethod[Any, [Any], Any]", variable has type "Callable[[Any], GenericAlias]") [assignment]
+ sphinx/highlighting.py:39:35: note: Error code "assignment" not covered by "type: ignore" comment
alectryon (https://github.com/cpitclaudel/alectryon)
+ alectryon/pygments_lexer.py:456: error: Dict entry 3 has incompatible type "str": "list[tuple[str | Any, ...]]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:457: error: Dict entry 4 has incompatible type "str": "list[tuple[str | Any, ...]]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:477: error: Dict entry 7 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:490: error: Dict entry 9 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:491: error: Dict entry 10 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:492: error: Dict entry 11 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:457: error: Dict entry 4 has incompatible type "str": "list[tuple[str | Any, ...]]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:477: error: Dict entry 7 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:490: error: Dict entry 9 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:491: error: Dict entry 10 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
+ alectryon/pygments_lexer.py:492: error: Dict entry 11 has incompatible type "str": "list[object]"; expected "str": "list[tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]]] | tuple[str | words, _TokenType | Iterator[tuple[int, _TokenType, str]] | Callable[[Lexer, _PseudoMatch, LexerContext], Iterator[tuple[int, _TokenType, str]]], str] | include | default]" [dict-item]
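The pwndbg errors above come from the new stubs typing `format`'s first parameter as an `Iterator[tuple[_TokenType, str]]` rather than an arbitrary iterable, so passing a plain `list` of tokens no longer matches any overload. A minimal stdlib-only sketch of the mismatch and the one-line fix at the call site (`render` is a hypothetical stand-in for `pygments.format`, not the real function):

```python
from typing import Iterator

# Stand-in with the same shape as the stubbed pygments.format:
# it is annotated to accept an Iterator, not an Iterable.
def render(tokens: Iterator[tuple[str, str]]) -> str:
    # Concatenate the text of each (token_type, value) pair.
    return "".join(value for _token_type, value in tokens)

tokens = [("Token.Keyword", "def"), ("Token.Text", " f")]

# render(tokens)           # mypy: a list is Iterable but not an Iterator
result = render(iter(tokens))  # wrapping in iter() satisfies the checker
```

Call sites affected by this PR could either wrap their token lists in `iter()` as above, or the stubs could be loosened to `Iterable`, which is what `lex()` actually returns anyway.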