Can I use Mypy stubs as interfaces? - python

Mypy allows us to write class stubs which can be placed in the same directory as the actual class. This stub is very similar to an interface as known from other languages. Is it possible to have a client use the stub and the implementation strictly follow the stub?
Example I would like to work:
class IDependency:
    def do_something(self) -> None: ...
    def do_something_else(self) -> None: ...

class Service:
    def __init__(self, dependency: IDependency):
        dependency.do_something()
        dependency.do_something_else()  # this fails silently

class DependencyImplementation(IDependency):
    def do_something(self) -> None:
        print("doing something")
    # Note there is no `do_something_else` here.
This works. However, if DependencyImplementation doesn't implement the do_something_else method, there is no error from Mypy and no error from Python itself: the call just doesn't do anything. Do I have to write raise NotImplementedError() or annotate each method with @abc.abstractmethod for this to work? Are there special flags in Mypy or the Python interpreter?
Is this a use case for Mypy Protocols? They seem to be coming soon (maybe in Python 4?)

This is indeed something you can do using either @abc.abstractmethod or protocols. The former is akin to using Java's abstract classes; the latter is akin to using Go's interfaces or Rust traits.
Here is an example that uses ABCs:
from abc import abstractmethod

class Parent:
    @abstractmethod
    def foo(self) -> None: ...

# Missing an implementation for 'foo'!
class Child(Parent): pass

print(Child())  # Error: Cannot instantiate abstract class 'Child' with abstract attribute 'foo'
A few things to note about this example:
You get an error on the instantiation of the Child class, not on its declaration. This is to support the use case where you never instantiate Child, but instead subclass it again and define foo in that second subclass (see the sketch after these notes).
We don't need to add the usual abc metaclass to Parent (e.g. class Parent(metaclass=ABCMeta)): mypy will understand what @abc.abstractmethod means with or without it. Include the metaclass only if you want the Python interpreter to also enforce, at runtime, that you've correctly overridden anything marked as abstract.
ABCs are not quite interfaces -- you can still define fields and non-abstract methods. They're more akin to Java-style abstract classes.
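For instance, here is a minimal sketch of that "subclass it again" pattern, which also shows what adding the metaclass buys you at runtime:
from abc import ABCMeta, abstractmethod

class Parent(metaclass=ABCMeta):
    @abstractmethod
    def foo(self) -> None: ...

class Child(Parent):         # still abstract: 'foo' is not defined yet
    pass

class Grandchild(Child):     # concrete: finally provides 'foo'
    def foo(self) -> None:
        print("foo")

Grandchild()  # fine, for both mypy and the interpreter
Child()       # mypy error; with ABCMeta present it also raises TypeError at runtime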
You can also use protocols, though on Python versions before 3.8 you'll need to first pip install typing_extensions to use them (on 3.8+, Protocol is available directly from typing). Here's an example:
from typing_extensions import Protocol

class CanFoo(Protocol):
    def foo(self) -> None: ...

class Child: pass

def expects_fooable(x: CanFoo) -> None: ...

x = Child()
expects_fooable(x)  # Error: Argument 1 to "expects_fooable" has incompatible type "Child"; expected "CanFoo"
A few notes:
Here, Child deliberately does not inherit from CanFoo: there is no explicit link between a class and the protocol(s) it implements. Protocols are very similar to Go-style interfaces and can be used more ad hoc. Contrast this with languages like Java, where you do need an "implements Blah" clause in the class definition.
Unlike in the previous example, we do not get an error on the instantiation of Child: there's nothing inherently wrong with the class itself. Rather, we get an error when we try to use it where a CanFoo is expected (see the sketch below).
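Continuing the example, a class that happens to define a compatible foo is accepted without ever naming the protocol (GoodChild is a made-up name for illustration):
class GoodChild:                  # no mention of CanFoo anywhere
    def foo(self) -> None:
        print("foo")

expects_fooable(GoodChild())      # OK: GoodChild structurally satisfies CanFoo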
A few final notes:
Stub files may superficially look like interfaces, but they're really not: they're just a way to attach types to code that you cannot easily modify to add type hints inline. You can think of them as being vaguely similar to C-style header files: a way of storing the signatures of existing objects independently from the source code.
"Typeshed" is the name of a specific project which includes stubs for the standard library and a few popular third-party modules; that word is not a synonym for "stub files". Similarly, the term "class stub" is also a bit of a misnomer: there are only stub files, which may or may not contain definitions for classes. (If the original Python or C extension library you're trying to type contains only functions, the corresponding stub file would also likely only contain signatures for those functions.)

Related

Programming Languages where a keyword is needed to specify that the method is extended from its parent class

Forgive me for my ignorance, but does anyone know of any languages that strictly enforce the condition I've given in the title? For example, using Python syntax, we can extend a class with a new method like this:
class A:
    pass

class B(A):
    def foo(self):
        pass
But is there a language that needs an additional keyword, let's say new, to specify that this method is unique to the child class and is not an override of the methods of its parent class/es? For example:
class A:
    pass

class B(A):
    def new foo(self):
        pass
I am asking because, when I am working on a project that requires multiple inheritance such as class B(A, C, D), and I see a method defined in B, I need to check whether that method comes from one of its parent classes or is its own, and I find that extremely tedious.
The closest I can think of is the @Override annotation in Java, which can be applied to a method declaration in order for the compiler to check that it overrides an inherited method (or implements an interface method).
When used in conjunction with a linter which checks that all method overrides are annotated with @Override, your IDE will give you a linter warning when you omit the annotation. IntelliJ IDEA and SonarSource both have linter rules for this, for example.
So long as you are strict about obeying the linter warning, it's "strict" in that sense, but of course linter warnings don't actually prevent your code from being compiled or executed. Nonetheless, I don't know of a closer example from a real programming language. Unfortunately, Java doesn't have multiple inheritance, so it's not directly applicable to your problem.
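As a side note on the Python side of the question, recent versions of the typing ecosystem offer a comparable marker: typing.override (Python 3.12+, PEP 698), also available from typing_extensions, which type checkers such as mypy can verify. A minimal sketch, assuming typing_extensions is installed:
from typing_extensions import override   # typing.override on Python 3.12+

class A:
    def foo(self) -> None: ...

class B(A):
    @override
    def foo(self) -> None:   # OK: genuinely overrides A.foo
        ...

    @override
    def bar(self) -> None:   # mypy error: marked @override but overrides nothing
        ...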

python3: type annotations and enum subclasses

I don't like getting complaints from Mypy about function signatures, but I don't know how to resolve this one.
I am building a package, that is to be used by several programs. I have a subclass of IntEnum (FWIW, called _Event), which contains a number of relevant properties and methods. _Event is never used directly, since it contains no members, but several different programs use incompatible subclasses of _Event (AlphaEvent, BetaEvent, etc.), which define the actual members of the Enum (i.e, the actual events). Only one member of each subclass is common, and as it happens, it's called END. Since you can't subclass an enum with members, it is defined in each subclass individually (so _Event has no END member, but AlphaEvent.END and BetaEvent.END exist).
I have several functions which utilise subclasses of _Event. I have a couple which need to access properties of the END member, but are generic to all instances. So they contain a signature:
def generic_event_func(events: _Event):
    ...
    events.END.action = <expr>
Mypy flags the last line of code above with: error: "_Event" has no attribute "END"
True enough, but the subclasses do. How do I annotate the function signature to remove this error?
I faced a somewhat similar issue recently and ended up refactoring with ABCs. Not sure if you have the latitude to refactor much, but maybe it can help in some way:
from abc import ABC, abstractmethod
from enum import Enum
from typing import Union

class AbstractEvent(ABC):
    @abstractmethod
    def action(self):
        pass

class EventA(AbstractEvent):
    def action(self):
        ...

class EventB(AbstractEvent):
    def action(self):
        ...

class EventC(AbstractEvent):
    def action(self):
        ...

class Event(Enum):
    a = EventA
    b = EventB
    c = EventC

event_cls = Event["a"].value
event: Union[EventA, EventB, EventC] = event_cls()
event.action()
I'm not (yet) an optional static typing person, but something like the following might work:
from enum import IntEnum
from typing import Any

class _Event(IntEnum):
    END: Any
END is now type-hinted, but doesn't actually exist at runtime, so it won't interfere with subclassing _Event.
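Continuing that snippet, here is a quick sketch of both halves of the claim (generic_event_func is the function from the question):
def generic_event_func(events: _Event) -> None:
    print(events.END)   # mypy no longer reports: "_Event" has no attribute "END"

print(list(_Event))     # [] -- the bare annotation did not create a member,
                        # so subclasses are still free to define their own END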

Extending Python ABCs with a protocol

I'd like to create a new typing protocol that extends some common base classes (e.g., Iterable) with additional attributes. For example
from typing import Iterable, Protocol

class IterableWithMethod(Iterable, Protocol):
    def method(self) -> None: ...

class ImplementsProtocol:
    def __iter__(self):
        ...
    def method(self):
        ...

def operatesOnProtocol(obj: IterableWithMethod):
    ...
However, protocols cannot subclass normal classes. Is there a standard way to do this? Do protocol versions of the ABCs need to be added to the standard library?
Given that mypy lists many of the abstract types exported by typing as "predefined protocols", and "Python protocols such as Iterable and Sized" are explicitly mentioned in PEP 544's rationale section, it should be fine to treat these built-in abstract classes as protocols.
For now, if mypy doesn't treat a built-in ABC like a protocol where it feels like it should, and the behavior is undocumented/undiscussed elsewhere, it may be worth raising an issue until the docs for the existing built-in protocol-like ABCs are better reconciled with the newer Protocol system.
More generally, until we have something like pure mixin classes, there won't be a pythonic way to go from a normal class to a protocol.
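In the meantime, one workaround that type checks today is to skip the inheritance and declare __iter__ directly on the protocol. A minimal sketch, with int as a stand-in element type:
from typing import Iterator, Protocol

class IterableWithMethod(Protocol):
    def __iter__(self) -> Iterator[int]: ...
    def method(self) -> None: ...

class ImplementsProtocol:
    def __iter__(self) -> Iterator[int]:
        return iter([1, 2, 3])
    def method(self) -> None:
        ...

def operates_on_protocol(obj: IterableWithMethod) -> None:
    for item in obj:
        ...

operates_on_protocol(ImplementsProtocol())   # accepted structurally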

How to get Mypy working with multiple mixins relying on each other?

Currently in Electrum we use the Union type on self to be able to access methods from multiple mixed-in parent classes. For example, QtPluginBase relies on being mixed into a subclass of HW_PluginBase to work. For example, a valid use is class TrezorPlugin(QtPluginBase, HW_PluginBase).
There is the Qt gui, the Kivy gui, and there is also CLI. Although hardware wallets are not implemented for Kivy, they could be in the future. You can already use them on the CLI.
However there are also multiple hardware wallet manufacturers, all with their own plugins.
Consider Trezor + Qt:
For Qt, we have this class hierarchy:
electrum.plugins.hw_wallet.qt.QtPluginBase used by
electrum.plugins.trezor.qt.QtPlugin(QtPluginBase)
For Trezor, we have:
electrum.plugin.BasePlugin used by
electrum.plugins.hw_wallet.plugin.HW_PluginBase(BasePlugin) used by
electrum.plugins.trezor.trezor.TrezorPlugin(HW_PluginBase)
And to create the actual Qt Trezor plugin:
electrum.plugins.trezor.qt.Plugin(TrezorPlugin, QtPlugin)
The point is that the base gui-neutral plugin will first gain manufacturer-specific methods; then it will gain gui-specific methods.
Aaron (in the comments) suggests that QtPluginBase could subclass HW_PluginBase, but that would mean that the manufacturer-specific stuff would come after, which means the resulting classes cannot be used by the CLI or Kivy.
Note that both
electrum.plugins.trezor.trezor.TrezorPlugin(HW_PluginBase)
and
electrum.plugins.hw_wallet.qt.QtPluginBase
rely on HW_PluginBase. They can't both subclass it.
So if we avoid mix-ins, the only alternatives would be either to have QtPluginBase subclass TrezorPlugin (but there are many manufacturers), or to have TrezorPlugin subclass QtPluginBase, but then, again, the resulting classes could not be used by the CLI or Kivy.
I realize that Union is an "or", so the hint is indeed not making sense. But there is no Intersection type. With Union, most of the PyCharm functionality works.
One thing that would be nice is if QtPluginBase could have a type-hint that it subclasses HW_PluginBase, but without actually subclassing at runtime.
How could this be typed with Mypy without having to use this hacky Union type hint on every method (since every method has self)?
With the Protocols added in PEP-544 (Python 3.8+), you can define the intersection interface yourself! This also lets you hide implementation details in ClassA that you don't want ClassB to use.
from typing import Protocol

class InterfaceAB(Protocol):
    def method_a(self) -> None: ...
    def method_b(self) -> None: ...

class ClassA:
    def method_a(self) -> None:
        print("a")

class ClassB:
    def method_b(self: InterfaceAB) -> None:
        print("b")
        self.method_a()

# if I remove ClassA here, I get a type checking error!
class AB(ClassA, ClassB): pass

ab = AB()
ab.method_b()

# % mypy --version
# mypy 0.761
# % mypy mypy-protocol-demo.py
# Success: no issues found in 1 source file
Credits to SomberNight/ghost43 for the initial version of this file.
Since mypy doesn't offer an Intersection type yet, you can't type the self arg correctly (and Union is not a replacement for that!). What you can do is introduce base classes for your mixins that exist for type checking only. This is a trick I often use when working with mixins in Django projects. Example:
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from .plugin import HW_PluginBase
    _Base = HW_PluginBase
else:
    _Base = object

class QtPluginBase(_Base):
    def load_wallet(self, wallet: 'Abstract_Wallet', window: ElectrumWindow):
        ...
You can now drop the explicit typing of self since mypy can infer all necessary base classes itself.
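For completeness, a sketch of how the final concrete class then ties the pieces together (simplified from the hierarchy described in the question; the exact class names in Electrum may differ):
# TrezorPlugin subclasses HW_PluginBase, so the combined class really does
# provide everything that QtPluginBase's type-checking-only base promises.
class Plugin(TrezorPlugin, QtPluginBase):
    pass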

Typehints for python class

Is it possible to typehint a class's self?
The reason is that I'm basing a class off of an ambiguous, dynamic base class whose definitions are given in hint stubs, called BaseClassB and SubClassD.
I would have expected this to be valid Python, but it's not. Is there a way to typehint the base class argument(s) when creating a class?
I'd also accept any tricks that get PyCharm to autocomplete off self correctly as an answer, as this doesn't seem to be supported in Python 3.7.4.
e.g.
class MyClass(BaseClassAmbiguous: Union[BaseClassB, SubClassD]):
    def func(self):
        self.self_doesnt_autocomplete_correctly
        self.  # Desired functionality is for this to autocomplete according to BaseClassB and SubClassD
I suspect the reason your type checker is choking on your code is that it's not actually valid syntax: the annotation in your base class list is a syntax error.
Probably the best available workaround is to just give BaseClassAmbiguous a fake type, like so:
from typing import Union, TYPE_CHECKING

if TYPE_CHECKING:
    class BaseClassAmbiguous(BaseClassB, SubClassD): pass
else:
    # Create your ambiguous base class however it's actually constructed at runtime
    ...

class MyClass(BaseClassAmbiguous):
    def func(self) -> None:
        self.blah
Basically, lie to your type checker and pretend that BaseClassAmbiguous directly inherits from your two classes. I'm not sure whether PyCharm specifically supports this kind of thing, but it's something it ought to support in principle (it's possible to pull off these kinds of shenanigans in Python).
That said, if you're going to use this approach, you're probably better off just having BaseClassAmbiguous actually inherit directly from both subclasses if at all possible.
To answer your original question: yes, it's legal to annotate self. That technique is usually reserved for when you want your self variable to be generic -- see this example in PEP 484, and this mini-tutorial in the mypy docs.
But you could, in principle, annotate self using any type hint, including Unions, as long as it's not fundamentally incompatible with what your class really is -- your annotation for self would essentially need to be the same type as, or a supertype of, MyClass.
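For reference, here is a minimal sketch of that generic-self pattern, essentially the Shape/Circle example from the mypy docs:
from typing import TypeVar

T = TypeVar("T", bound="Shape")

class Shape:
    scale: float = 1.0

    def set_scale(self: T, scale: float) -> T:
        self.scale = scale
        return self

class Circle(Shape):
    radius: float = 0.0

    def set_radius(self, radius: float) -> "Circle":
        self.radius = radius
        return self

Circle().set_scale(0.5).set_radius(2.7)   # inferred as Circle, thanks to the generic self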
That said, this technique will likely not help you here: what you'd need is an intersection type, not a union type, and annotating self won't help resolve the syntax error in your base class list.
I think the broader problem here is that the PEP 484 ecosystem doesn't necessarily deal well with ambiguous or dynamic base classes. Perhaps this is not possible in your case, but if I were in your shoes, I'd concentrate my efforts on making the base class more concrete.
