Reason given for disallowing non-concrete subtype assignment is unsound
Issue
According to the discussion https://github.com/python/typing/discussions/1305, “type checkers allow incompatible __init__ overrides, because flagging them would be too disruptive.”
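For context, a minimal sketch of the behaviour that discussion describes (the class names here are made up for illustration):

```python
class Base:
    def __init__(self) -> None: ...

class Sub(Base):
    # The signature is incompatible with Base.__init__, but type checkers
    # do not flag the override itself.
    def __init__(self, x: int) -> None: ...
```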
Accepting the above as the de facto behaviour, the example given as the main reason for disallowing non-concrete subtype assignment in the following section of the spec is unsound: https://github.com/python/typing/blob/e08290b70f58df509f998cbbe09a8e65abb57a9b/docs/spec/protocol.rst#type-and-class-objects-vs-protocols
```python
from abc import abstractmethod
from typing import Protocol, Type

class Proto(Protocol):
    @abstractmethod
    def meth(self) -> int:
        ...

class Concrete:
    def meth(self) -> int:
        return 42

def fun(cls: type[Proto]) -> int:
    return cls().meth()  # ???

fun(Proto)      # Why should this error?
fun(Concrete)   # OK

var: Type[Proto]
var = Proto     # Why should this error?
var = Concrete  # OK
var().meth()    # ???
```
(credit https://github.com/python/mypy/issues/4717#issuecomment-1978239641 for pointing out the contradiction)
Thoughts
One radical approach would be to remove the concreteness rule from the spec altogether. The type of `Proto` is `type[Proto]`, and if constructor compatibility is not checked anyway, there is little reason to disallow assigning it to variables annotated as `type[Proto]`.
If that is too radical, the spec can stay as is, but the reasoning and associated examples should be clearly marked as 'historical' and no longer valid. That way, we avoid any immediate changes in practice while encouraging discussion towards an appropriate future spec.
Thanks for posting this. I suggest posting this on Discuss, where it should attract more attention.
https://github.com/python/typing/discussions/1649
FYI discuss is here: https://discuss.python.org/
I think `type[ProtocolType]` often doesn't make sense. In the example above, you're better off with `Callable[..., ProtocolType]`, which makes it clear that there is no single way to instantiate a structural type.
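A rough sketch of that alternative, reusing `Proto` and `Concrete` from the issue (the `factory` parameter name is my own):

```python
from typing import Callable, Protocol

class Proto(Protocol):
    def meth(self) -> int: ...

class Concrete:
    def meth(self) -> int:
        return 42

# Accept a factory rather than a class object: any callable returning
# something that satisfies Proto works, so no single constructor
# signature is assumed.
def fun(factory: Callable[..., Proto]) -> int:
    return factory().meth()

fun(Concrete)            # OK: the class object is itself such a callable
fun(lambda: Concrete())  # OK: so is any other factory
```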
Also see cases like https://github.com/python/mypy/issues/16919 / https://github.com/python/mypy/issues/16890
I'm curious what the real world code where you're encountering issues looks like?
> I'm curious what the real world code where you're encountering issues looks like?
Please check this issue: https://github.com/python/mypy/issues/4717
Any function that accepts a type object as its argument and returns an instance of that type is affected by the concreteness restriction. In my experience, DI containers really struggle with this, because they are supposed to facilitate binding an implementation to an abstraction (a simplified sketch follows the links below):
- https://github.com/python-injector/injector/blob/1db1f56cfbe13951d4eb1d1e9b5207ed3c028bd8/injector/__init__.py#L899
- https://github.com/ivankorobkov/python-inject/blob/05ad3e87bb179ce56cabb272bd5f75cca2880f7c/inject/__init__.py#L395
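To illustrate the pattern, here is a simplified sketch of what such containers do (not the actual injector/inject API; `Container`, `bind`, and `resolve` are invented names):

```python
from typing import Protocol, TypeVar

T = TypeVar("T")

class Repository(Protocol):
    def get(self, key: str) -> str: ...

class SqlRepository:
    def get(self, key: str) -> str:
        return f"row for {key}"

class Container:
    def __init__(self) -> None:
        self._bindings: dict[type, type] = {}

    def bind(self, abstraction: type[T], implementation: type[T]) -> None:
        self._bindings[abstraction] = implementation

    def resolve(self, abstraction: type[T]) -> T:
        return self._bindings[abstraction]()

container = Container()
# Binding an implementation to an abstraction is the core use case, yet the
# concreteness rule rejects passing the protocol class as a type[T] argument.
container.bind(Repository, SqlRepository)  # error under the current rule
repo = container.resolve(Repository)       # error under the current rule
```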
Thanks for the links; both of those look like cases of `(type[T]) -> T` where `T` is a TypeVar, not a Protocol. Sounds like you want something more like https://github.com/python/mypy/issues/9773
> both of those look like cases of `(type[T]) -> T` where `T` is a TypeVar, not a Protocol.
I think `T` can bind to `Proto` to make `(type[Proto]) -> Proto` without a problem. The issue is that such a function does not accept `Proto` as its argument, because `Proto` is a non-concrete type even though the type of `Proto` is `type[Proto]`.
https://github.com/python/typing/blob/e08290b70f58df509f998cbbe09a8e65abb57a9b/docs/spec/protocol.rst#type-and-class-objects-vs-protocols
> Variables and parameters annotated with `type[Proto]` accept only concrete (non-protocol) subtypes of `Proto`.
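For example, with a simple generic factory (the `create` name is mine, just to illustrate):

```python
from typing import Protocol, TypeVar

T = TypeVar("T")

class Proto(Protocol):
    def meth(self) -> int: ...

class Concrete:
    def meth(self) -> int:
        return 42

def create(cls: type[T]) -> T:
    return cls()

create(Concrete)  # OK: T binds to Concrete
create(Proto)     # rejected: type[Proto] only accepts concrete subtypes,
                  # even though the type of Proto is type[Proto]
```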
https://github.com/python/mypy/issues/9773 could somewhat improve the situation, but I'm not sure DI containers would want to accept arbitrary TypeForms rather than just concrete and non-concrete types.
Probably the AbstractType idea here works better.
But in any case, when the main reason for having the concreteness restriction is unsound, why keep it at all?
@ippeiukai I believe Eric brought up your issue here in case you wanted to participate!
@NeilGirdhar Thanks!