Generics/templates in python?

How does python handle generic/template type scenarios? Say I want to create an external file "BinaryTree.py" and have it handle binary trees, but for any data type.
So I could pass it the type of a custom object and have a binary tree of that object. How is this done in python?

The other answers are totally fine:
One does not need a special syntax to support generics in Python
Python uses duck typing as pointed out by André.
However, if you still want a typed variant, there is a built-in solution since Python 3.5.
A full list of available type annotations is available in the Python documentation.
Generic classes:
from typing import TypeVar, Generic, List

T = TypeVar('T')

class Stack(Generic[T]):
    def __init__(self) -> None:
        # Create an empty list with items of type T
        self.items: List[T] = []

    def push(self, item: T) -> None:
        self.items.append(item)

    def pop(self) -> T:
        return self.items.pop()

    def empty(self) -> bool:
        return not self.items

# Construct an empty Stack[int] instance
stack = Stack[int]()
stack.push(2)
stack.pop()
stack.push('x')  # Type error
Generic functions:
from typing import TypeVar, Sequence

T = TypeVar('T')  # Declare type variable

def first(seq: Sequence[T]) -> T:
    return seq[0]

def last(seq: Sequence[T]) -> T:
    return seq[-1]

n = first([1, 2, 3])  # n has type int.
Static type checking:
You must use a static type checker such as mypy or Pyre (developed by Meta/FB) to analyze your source code.
Install mypy:
python3 -m pip install mypy
Analyze your source code, for example a certain file:
mypy foo.py
or directory:
mypy some_directory
mypy will detect and print type errors. A concrete output for the Stack example provided above:
foo.py:23: error: Argument 1 to "push" of "Stack" has incompatible type "str"; expected "int"
References: mypy documentation about generics and running mypy

Python uses duck typing, so it doesn't need special syntax to handle multiple types.
If you're from a C++ background, you'll remember that, as long as the operations used in the template function/class are defined on some type T (at the syntax level), you can use that type T in the template.
So, basically, it works the same way:
define a contract for the type of items you want to insert in the binary tree.
document this contract (e.g. in the class documentation)
implement the binary tree using only operations specified in the contract
enjoy
You'll note, however, that unless you write explicit type checking (which is usually discouraged), you won't be able to enforce that a binary tree contains only elements of the chosen type.
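A minimal sketch of that contract approach (the names and the "supports <" contract below are illustrative, not part of the original answer):
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

class BinaryTree:
    """Binary search tree. Contract: inserted items must support the < operator."""

    def __init__(self):
        self.root = None

    def insert(self, value):
        if self.root is None:
            self.root = Node(value)
            return
        node = self.root
        while True:
            if value < node.value:  # the only operation the contract requires
                if node.left is None:
                    node.left = Node(value)
                    return
                node = node.left
            else:
                if node.right is None:
                    node.right = Node(value)
                    return
                node = node.right

tree = BinaryTree()
tree.insert(3)
tree.insert(1)  # works for any type that implements <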

Actually now you can use generics in Python 3.5+.
See PEP-484 and typing module documentation.
In my practice it is not very seamless or clear, especially for those who are familiar with Java generics, but it is still usable.

After coming up with some good thoughts on making generic types in python, I started looking for others who had the same idea, but I couldn't find any. So, here it is. I tried this out and it works well. It allows us to parameterize our types in python.
class List(type):
    def __new__(type_ref, member_type):
        class List(list):
            def append(self, member):
                if not isinstance(member, member_type):
                    raise TypeError('Attempted to append a "{0}" to a "{1}" which only takes a "{2}"'.format(
                        type(member).__name__,
                        type(self).__name__,
                        member_type.__name__
                    ))
                list.append(self, member)
        return List
You can now derive types from this generic type.
class TestMember:
    pass

class TestList(List(TestMember)):
    def __init__(self):
        super().__init__()

test_list = TestList()
test_list.append(TestMember())
test_list.append('test')  # This line will raise an exception
This solution is simplistic, and it does have its limitations. Each time you create a generic type, it will create a new type. Thus, multiple classes inheriting List( str ) as a parent would be inheriting from two separate classes. To overcome this, you need to create a dict to store the various forms of the inner class and return the previously created inner class, rather than creating a new one. This would prevent duplicate types with the same parameters from being created. If interested, a more elegant solution can be made with decorators and/or metaclasses.
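A sketch of that caching idea (my own variant, not the original author's code): the generated classes are stored in a dict so that List(str) always returns the same class object:
_list_cache = {}

class List(type):
    def __new__(type_ref, member_type):
        # Reuse a previously created class for this parameter, if any
        if member_type in _list_cache:
            return _list_cache[member_type]

        class TypedList(list):
            def append(self, member):
                if not isinstance(member, member_type):
                    raise TypeError('Attempted to append a "{0}" to a "{1}" which only takes a "{2}"'.format(
                        type(member).__name__,
                        type(self).__name__,
                        member_type.__name__
                    ))
                list.append(self, member)

        _list_cache[member_type] = TypedList
        return TypedList

assert List(str) is List(str)  # subclasses of List(str) now share a single parent class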

Here's a variant of this answer that uses metaclasses to avoid the messy syntax, and to use the typing-style List[int] syntax:
class template(type):
    def __new__(metacls, f):
        cls = type.__new__(metacls, f.__name__, (), {
            '_f': f,
            '__qualname__': f.__qualname__,
            '__module__': f.__module__,
            '__doc__': f.__doc__
        })
        cls.__instances = {}
        return cls

    def __init__(cls, f):  # only needed in 3.5 and below
        pass

    def __getitem__(cls, item):
        if not isinstance(item, tuple):
            item = (item,)
        try:
            return cls.__instances[item]
        except KeyError:
            cls.__instances[item] = c = cls._f(*item)
            item_repr = '[' + ', '.join(repr(i) for i in item) + ']'
            c.__name__ = cls.__name__ + item_repr
            c.__qualname__ = cls.__qualname__ + item_repr
            c.__template__ = cls
            return c

    def __subclasscheck__(cls, subclass):
        for c in subclass.mro():
            if getattr(c, '__template__', None) == cls:
                return True
        return False

    def __instancecheck__(cls, instance):
        return cls.__subclasscheck__(type(instance))

    def __repr__(cls):
        import inspect
        return '<template {!r}>'.format('{}.{}[{}]'.format(
            cls.__module__, cls.__qualname__, str(inspect.signature(cls._f))[1:-1]
        ))
With this new metaclass, we can rewrite the example in the answer I link to as:
@template
def List(member_type):
    class List(list):
        def append(self, member):
            if not isinstance(member, member_type):
                raise TypeError('Attempted to append a "{0}" to a "{1}" which only takes a "{2}"'.format(
                    type(member).__name__,
                    type(self).__name__,
                    member_type.__name__
                ))
            list.append(self, member)
    return List

l = List[int]()
l.append(1)      # ok
l.append("one")  # error
This approach has some nice benefits:
print(List)       # <template '__main__.List[member_type]'>
print(List[int])  # <class '__main__.List[<class 'int'>]'>
assert List[int] is List[int]
assert issubclass(List[int], List)  # True

Since python is dynamically typed, this is super easy. In fact, you'd have to do extra work for your BinaryTree class not to work with any data type.
For example, if you want the key values that are used to place the object in the tree to be available from a method like key(), you just call key() on the objects. For example:
class BinaryTree(object):
    def insert(self, object_to_insert):
        key = object_to_insert.key()
Note that you never need to define what kind of class object_to_insert is. So long as it has a key() method, it will work.
The exception is if you want it to work with basic data types like strings or integers. You'll have to wrap them in a class to get them to work with your generic BinaryTree. If that sounds too heavyweight and you want the extra efficiency of actually just storing strings, sorry, that's not what Python is good at.
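A minimal sketch of such a wrapper (StrItem is a hypothetical name, not from the original answer), giving plain strings the key() method the tree expects:
class StrItem:
    def __init__(self, value):
        self.value = value

    def key(self):
        return self.value

tree = BinaryTree()
tree.insert(StrItem('apple'))  # works, because StrItem provides key()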

If you are using Python 2 or want to rewrite Java code, there is no real solution for this. Here is what I got working in a night: https://github.com/FlorianSteenbuck/python-generics I still have no compiler, so you currently use it like this:
class A(GenericObject):
    def __init__(self, *args, **kwargs):
        GenericObject.__init__(self, [
            ['b', extends, int],
            ['a', extends, str],
            [0, extends, bool],
            ['T', extends, float]
        ], *args, **kwargs)

    def _init(self, c, a, b):
        print "success c="+str(c)+" a="+str(a)+" b="+str(b)
TODOs
Compiler
Get Generic Classes and Types working (For things like <? extends List<Number>>)
Add super support
Add ? support
Code Clean Up

Look at how the built-in containers do it. dict and list and so on contain heterogeneous elements of whatever types you like. If you define, say, an insert(val) function for your tree, it will at some point do something like node.value = val and Python will take care of the rest.
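A rough sketch of that point (Node and Tree are illustrative names, not from the original answer): the assignment node.value = val works for values of any type, just like the built-in containers:
class Node:
    def __init__(self, value):
        self.value = value      # Python doesn't care what type `value` is
        self.children = []

class Tree:
    def __init__(self):
        self.root = None

    def insert(self, val):
        node = Node(val)        # at some point this does node.value = val
        if self.root is None:
            self.root = node
        else:
            self.root.children.append(node)

t = Tree()
t.insert(42)
t.insert('anything')            # heterogeneous values are fine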

Fortunately, there have been some efforts toward generic programming in Python.
There is a library: generic
Here is the documentation for it: http://generic.readthedocs.org/en/latest/
It hasn't progressed over the years, but you can get a rough idea of how to use it and make your own library.
Cheers

Related

Type support for dynamically added instance variable [duplicate]

I have a class decorator, which adds a few functions and fields to the decorated class.
@mydecorator
@dataclass
class A:
    a: str = ""
Added (via setattr()) is a .save() function and a set of info for dataclass fields as a separate dict.
I'd like VScode and mypy to properly recognize that, so that when I use:
a=A()
a.save()
or a.my_fields_dict, those two are properly recognized.
Is there any way to do that? Maybe modify class A type annotations at runtime?
TL;DR
What you are trying to do is not possible with the current type system.
1. Intersection types
If the attributes and methods you are adding to the class via your decorator are static (in the sense that they are not just known at runtime), then what you are describing is effectively the extension of any given class T by mixing in a protocol P. That protocol defines the method save and so on.
To annotate this you would need an intersection of T & P. It would look something like this:
from typing import Protocol, TypeVar

T = TypeVar("T")

class P(Protocol):
    @staticmethod
    def bar() -> str: ...

def dec(cls: type[T]) -> type[Intersection[T, P]]:
    setattr(cls, "bar", lambda: "x")
    return cls  # type: ignore[return-value]

@dec
class A:
    @staticmethod
    def foo() -> int:
        return 1
You might notice that the import of Intersection is conspicuously missing. That is because despite being one of the most requested features for the Python type system, it is still missing as of today. There is currently no way to express this concept in Python typing.
2. Class decorator problems
The only workaround right now is a custom implementation alongside a corresponding plugin for the type checker(s) of your choice. I just stumbled across the typing-protocol-intersection package, which does just that for mypy.
If you install that and add plugins = typing_protocol_intersection.mypy_plugin to your mypy configuration, you could write your code like this:
from typing import Protocol, TypeVar
from typing_protocol_intersection import ProtocolIntersection

T = TypeVar("T")

class P(Protocol):
    @staticmethod
    def bar() -> str: ...

def dec(cls: type[T]) -> type[ProtocolIntersection[T, P]]:
    setattr(cls, "bar", lambda: "x")
    return cls  # type: ignore[return-value]

@dec
class A:
    @staticmethod
    def foo() -> int:
        return 1
But here we run into the next problem. Testing this with reveal_type(A.bar()) via mypy will yield the following:
error: "Type[A]" has no attribute "bar" [attr-defined]
note: Revealed type is "Any"
Yet if we do this instead:
class A:
    @staticmethod
    def foo() -> int:
        return 1

B = dec(A)
reveal_type(B.bar())
we get no complaints from mypy and note: Revealed type is "builtins.str". Even though what we did before was equivalent!
This is not a bug in the plugin, but in the mypy internals. It is another long-standing issue: mypy does not handle class decorators correctly.
A person in that issue thread even mentioned your use case in conjunction with the desired intersection type.
DIY
In other words, you'll just have to wait until those two holes are patched. Or you can hope that at least the decorator issue in mypy is fixed soon-ish and write your own VSCode plugin for intersection types in the meantime. Maybe you can get together with the person behind that mypy plugin I mentioned above.

how to define python generic classes [duplicate]

I have a class:
T = TypeVar('T')

class Stack(Generic[T]):
    def __init__(self) -> None:
        self.items: list[T] = []

    def push(self, item: T) -> None:
        self.items.append(item)

    def pop(self) -> T:
        return self.items.pop()

    def empty(self) -> bool:
        return not self.items
but I can also do:
T = TypeVar('T')

class Stack:
    def __init__(self) -> None:
        # Create an empty list with items of type T
        self.items: list[T] = []

    def push(self, item: T) -> None:
        self.items.append(item)

    def pop(self) -> T:
        return self.items.pop()

    def empty(self) -> bool:
        return not self.items
what is the difference between these two samples?
which one should I use?
I tried running both, and both worked.
Type checking vs runtime
After writing this, I finally understood @Alexander's point in the first comment: whatever you write in annotations does not affect runtime, and your code is executed in the same way (sorry, I missed that you're looking at this not just from the type checking perspective). This is a core principle of python typing, as opposed to strongly typed languages (which makes it wonderful IMO): you can always say "I don't need types here - save my time and mental health". Type annotations are used to help some third-party tools, like mypy (a type checker maintained by the python core team) and IDEs. IDEs can suggest you something based on this information, and mypy checks whether your code can work if your types match the reality.
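A small sketch of that point, using the generic Stack class from the question: running this with plain python raises no error, only mypy complains:
bad_stack: Stack[int] = Stack()
bad_stack.push('not an int')  # runs fine at runtime...
print(bad_stack.pop())        # ...and prints: not an int
# but mypy reports:
#   error: Argument 1 to "push" of "Stack" has incompatible type "str"; expected "int"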
Generic version
T = TypeVar('T')

class Stack(Generic[T]):
    def __init__(self) -> None:
        self.items: list[T] = []

    def push(self, item: T) -> None:
        self.items.append(item)

    def pop(self) -> T:
        return self.items.pop()

    def empty(self) -> bool:
        return not self.items
You can treat type variables like regular variables, but intended for "meta" usage and ignored (well, there are some runtime traces, but they exist primary for introspection purpose) on runtime. They are substituted once for every binding context (more about it - below), and can be defined only once per module scope.
The code above declares normal generic class with one type argument. Now you can say Stack[int] to refer to a stack of integers, which is great. Current definition allows either explicit typing or using implicit Any parametrization:
# Explicit type
int_stack: Stack[int] = Stack()
reveal_type(int_stack) # N: revealed type is "__main__.Stack[builtins.int]"
int_stack.push(1) # ok
int_stack.push('foo') # E: Argument 1 to "push" of "Stack" has incompatible type "str"; expected "int" [arg-type]
reveal_type(int_stack.pop()) # N: revealed type is "builtins.int"
# No type results in mypy error, similar to `x = []`
any_stack = Stack() # E: need type annotation for any_stack
# But if you ignore it, the type becomes `Stack[Any]`
reveal_type(any_stack) # N: revealed type is "__main__.Stack[Any]"
any_stack.push(1) # ok
any_stack.push('foo') # ok too
reveal_type(any_stack.pop()) # N: revealed type is "Any"
To make the intended usage easier, you can allow initialization from iterable (I'm not covering the fact that you should be using collections.deque instead of list and maybe instead of this Stack class, assuming it is just a toy collection):
from collections.abc import Iterable

class Stack(Generic[T]):
    def __init__(self, items: Iterable[T] | None = None) -> None:
        # Create an empty list with items of type T
        self.items: list[T] = list(items or [])
    ...

deduced_int_stack = Stack([1])
reveal_type(deduced_int_stack)  # N: revealed type is "__main__.Stack[builtins.int]"
To sum up, generic classes have some type variable bound to the class body. When you create an instance of such class, it can be parametrized with some type - it may be another type variable or some fixed type, like int or tuple[str, Callable[[], MyClass[bool]]]. Then all occurrences of T in its body (except for nested classes, which are perhaps out of "quick glance" explanation context) are replaced with this type (or Any, if it is not given and cannot be deduced). This type can be deduced iff at least one of __init__ or __new__ arguments has type referring to T (just T or, say, list[T]), and otherwise you have to specify it. Note that if you have T used in __init__ of non-generic class, it is not very cool, although currently not disallowed.
Now, if you use T in some methods of a generic class, it refers to that replaced value and results in typecheck errors if the passed types are not compatible with the expected ones.
You can play with this example here.
Working outside of generic context
However, not all usages of type variables are related to generic classes. Fortunately, you cannot declare a generic function where the type argument is specified on the calling side (like function<T> fun(x: number): int and fun<string>(0)), but there is plenty more stuff. Let's begin with simpler examples - pure functions:
T = TypeVar('T')

def func1() -> T:
    return 1

def func2(x: T) -> int:
    return 1

def func3(x: T) -> T:
    return x

def func4(x: T, y: T) -> int:
    return 1
The first function is declared to return some value of unbound type T. It obviously makes no sense, and recent mypy versions have even learned to mark it as an error. Your function return depends only on arguments and external state - and the type variable must be present there, right? You also cannot declare a global variable of type T in module scope, because T is still unbound - and thus neither func1 args nor module-scoped variables can depend on T.
The second function is more interesting. It does not cause a mypy error, although it still does not make much sense: we can bind some type to T, but what is the difference between this and func2_1(x: Any) -> int: ...? We can speculate that now T can be used as an annotation in the function body, which can help in some corner case with a type variable having an upper bound, and I won't say it is impossible - but I cannot quickly construct such an example, and have never seen such usage in a proper context (it was always a mistake). A similar example is even explicitly referenced in the PEP as valid.
The third and fourth functions are typical examples of type variables in functions. The third declares a function returning the same type as its argument.
The fourth function takes two arguments of the same (arbitrary) type. It is more useful if you have T = TypeVar('T', bound=Something) or T = TypeVar('T', str, bytes): you can concatenate two arguments of type T, but cannot concatenate two arguments of type str | bytes, as in the example below:
T = TypeVar('T', str, bytes)

def total_length(x: T, y: T) -> int:
    return len(x + y)
The most important fact about all examples above in this section: T does not have to be the same for different functions. You can call func3(1), then func3(['bar']) and then func4('foo', 'bar'). T is int, list[str] and str in these calls - no need to match.
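A short illustration of that binding-per-call behaviour, using the functions defined earlier (the reveal_type lines are for mypy; the exact revealed-type text may vary by mypy version):
reveal_type(func3(1))        # N: revealed type is "builtins.int"
reveal_type(func3(['bar']))  # N: revealed type is "builtins.list[builtins.str]"
func4('foo', 'bar')          # ok: T is str for this call only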
With this in mind your second solution is clear:
T = TypeVar('T')

class Stack:
    def __init__(self) -> None:
        # Create an empty list with items of type T
        self.items: list[T] = []  # E: Type variable "__main__.T" is unbound [valid-type]

    def push(self, item: T) -> None:
        self.items.append(item)

    def pop(self) -> T:  # E: A function returning TypeVar should receive at least one argument containing the same TypeVar [type-var]
        return self.items.pop()
Here is a mypy issue discussing a similar case.
__init__ says that we set the items attribute to a value of type T, but this T is lost later (T is scoped only within __init__) - so mypy rejects the assignment.
push is ill-formed and T has no meaning here, but it does not result in an invalid typing situation, so it is not rejected (the type of the argument is erased to Any, so you can still call push with some argument).
pop is invalid, because the typechecker needs to know what my_stack.pop() will return. It could say "I give up - just have your Any", and that would be perfectly valid (the PEP does not enforce this), but mypy is smarter and denies this invalid-by-design usage.
Edge case: you can return SomeGeneric[T] with unbound T, for example, in factory functions:
def make_list() -> list[T]: ...
mylist: list[str] = make_list()
because otherwise the type argument couldn't have been specified at the call site
For a better understanding of type variables and generics in python, I suggest you read PEP 483 and PEP 484 - usually PEPs are more like a boring standard, but these are really good as a starting point.
There are many edge cases omitted there, which still cause hot discussions in mypy team (and probably other typecheckers too) - say, type variables in staticmethods of generic classes, or binding in classmethods used as constructors - mind that they can be used on instances too. However, basically you can:
have a TypeVar bound to class (Generic or Protocol, or some Generic subclass - if you subclass Iterable[T], your class is already generic in T) - then all methods use the same T and can contain it in one or both sides
or have a method-scoped/function-scoped type variable - then it's useful if repeated in the signature more than once (not necessary "clean" - it may be parametrizing another generic)
or use type variables in generic aliases (like LongTuple = tuple[T, T, T, T]) - then you can do x: LongTuple[int] = (1, 2, 3, 4); see the short sketch after this list
or do something more exotic with type variables, which is probably out of scope
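A runnable version of the generic-alias bullet above (assuming Python 3.9+ for subscripting the builtin tuple; the error text is approximate):
from typing import TypeVar

T = TypeVar('T')
LongTuple = tuple[T, T, T, T]

x: LongTuple[int] = (1, 2, 3, 4)    # ok
y: LongTuple[int] = (1, 2, '3', 4)  # E: incompatible type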

Create dataclass that accepts type like List[int] [duplicate]


Why does mypy complain about Any as return annotation and how should I annotate a return value that can be anything?

I have a class that is essentially a wrapper around a dictionary:
from typing import Any, Hashable

class Wrapper(dict):
    ...
    def __getitem__(self, item: Hashable) -> Any:
        return self.wrapped[item]
When checking the annotations with mypy, it yields an Explicit "Any" is not allowed error when checking this code. My guess is that this stems from the concept that Any is the ancestor and successor of all other types. How should I annotate such a function where I want to allow anything to be returned?
We need to tell mypy that the types of the self.wrapped container are related to those of the Wrapper methods; in our case we can do that using typing.TypeVar instances for keys and values.
So we can end up with something like
from collections import abc
from typing import (Dict,
                    Iterator,
                    TypeVar)

class Wrapper(abc.MutableMapping):
    KeyType = TypeVar('KeyType')
    ValueType = TypeVar('ValueType')

    def __init__(self, wrapped: Dict[KeyType, ValueType]) -> None:
        self.wrapped = wrapped

    def __delitem__(self, key: KeyType) -> None:
        del self.wrapped[key]

    def __len__(self) -> int:
        return len(self.wrapped)

    def __iter__(self) -> Iterator[KeyType]:
        return iter(self.wrapped)

    def __setitem__(self, key: KeyType, value: ValueType) -> None:
        self.wrapped[key] = value

    def __getitem__(self, key: KeyType) -> ValueType:
        return self.wrapped[key]
Test
Running mypy with --disallow-any-explicit flag causes no errors/warnings.
Note about inheritance/composition
If one really needs a custom mapping, I'd recommend not mixing inheritance from dict with a self.wrapped dict field, as it can cause a lot of pain in the future (we would have to re-define all of its methods, or sometimes you would be using self.wrapped and sometimes not).
We can simply use the MutableMapping ABC from the collections.abc module and define the basic methods (like __getitem__); the rest (keys(), values(), items()) are defined already.
You have provided insufficient information in the question to recreate this error but if your sole question is:
How should I annotate such function where I want to allow anything to be returned?
The simple answer is don't annotate it at all. It will by default act as Any.
This happens when disallow_any_explicit is set to True for that module in the configuration file. Just remove that option to get the default of False.
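A small sketch of the "don't annotate it" approach, mirroring the Wrapper from the question: with no annotation, the return type is implicitly Any, so the explicit-Any check has nothing to flag:
class Wrapper(dict):
    def __init__(self, wrapped: dict) -> None:
        self.wrapped = wrapped

    def __getitem__(self, item):  # no annotations: the return type is implicitly Any
        return self.wrapped[item]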

How can I create my own "parameterized" type in Python (like `Optional[T]`)?

I want to create my own parameterized type in Python for use in type hinting:
class MaybeWrapped:
    # magic goes here
    ...

T = TypeVar('T')
assert MaybeWrapped[T] == Union[T, Tuple[T]]
Never mind the contrived example; how can I implement this? I looked at the source for Union and Optional, but it looks like some fairly low-level hackery that I'd like to avoid.
The only suggestion in the documentation comes from an example re-implementation of Mapping[KT,VT] that inherits from Generic. But that example is more about the __getitem__ method than about the class itself.
If you're just trying to create generic classes or functions, try taking a look at the documentation on mypy-lang.org about generic types -- it's fairly comprehensive, and more detailed than the standard library typing docs.
If you're trying to implement your specific example, it's worth pointing out that type aliases work with typevars -- you can simply do:
from typing import Union, TypeVar, Tuple

T = TypeVar('T')
MaybeWrapped = Union[T, Tuple[T]]

def foo(x: int) -> MaybeWrapped[str]:
    if x % 2 == 0:
        return "hi"
    else:
        return ("bye",)

# When running mypy, the output of this line is:
# test.py:13: error: Revealed type is 'Union[builtins.str, Tuple[builtins.str]]'
reveal_type(foo(3))
However, if you're trying to construct a generic type with genuinely new semantics, you're very likely out of luck. Your remaining options are to:
Construct some kind of custom class/metaclass thing that PEP 484-compliant type checkers can understand and use that.
Modify the type checker you're using somehow (mypy has an experimental "plugin" system, for example)
Petition to modify PEP 484 to include your new, custom type (you can do this by opening an issue in the typing module repo).
It is exactly the __getitem__ method that does all the magic.
That is the method called when you subscript a name with [ and ] brackets.
So, you need a __getitem__ method in the class of your class - that is, its metaclass - which will get as a parameter whatever is within the brackets. That method is responsible for dynamically creating (or retrieving a cached copy of) whatever you want to generate, and returning it.
I just can't possibly imagine how you want this for type hinting, since the typing library seems to cover all reasonable cases (I can't think of an example they don't cover already). But let's suppose you want a class to return a copy of itself, but with the parameter annotated as its type attribute:
import types

class MyMeta(type):
    def __getitem__(cls, key):
        new_cls = types.new_class(f"{cls.__name__}_{key.__name__}", (cls,), {},
                                  lambda ns: ns.__setitem__("type", key))
        return new_cls

class Base(metaclass=MyMeta): pass
And on trying this in interactive mode, one can do:
In [27]: Base[int]
Out[27]: types.Base_int
update: As of Python 3.7, there is also the special method __class_getitem__, which is created just for this purpose: it acts as a classmethod and avoids the need for a metaclass just for this case. Whatever would be written in a metaclass's __getitem__ can be put in the cls.__class_getitem__ method directly. Defined in PEP 560.
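A minimal sketch of that __class_getitem__ variant (my own example, not from the original answer; requires Python 3.7+):
import types

class Base:
    type = None  # filled in by the parametrized subclass

    def __class_getitem__(cls, key):
        # Create Base_int, Base_str, ... with the parameter stored as `type`
        return types.new_class(
            f"{cls.__name__}_{key.__name__}",
            (cls,),
            {},
            lambda ns: ns.__setitem__("type", key),
        )

print(Base[int])       # <class '...Base_int'>
print(Base[int].type)  # <class 'int'>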
I'd like to propose an improved solution, based on @jsbueno's answer. Now our "generics" can be used in comparisons and identity checks, and they will behave like "true" generics from typing. Also, we can forbid instantiation of the non-typed class itself. Moreover! We get isinstance checking for free!
Also, meet the BaseMetaMixin class for perfect static type checking!
import types
from typing import Type, Optional, TypeVar, Union

T = TypeVar('T')

class BaseMetaMixin:
    type: Type

class BaseMeta(type):
    cache = {}

    def __getitem__(cls: T, key: Type) -> Union[T, Type[BaseMetaMixin]]:
        if key not in BaseMeta.cache:
            BaseMeta.cache[key] = types.new_class(
                f"{cls.__name__}_{key.__name__}",
                (cls,),
                {},
                lambda ns: ns.__setitem__("type", key)
            )
        return BaseMeta.cache[key]

    def __call__(cls, *args, **kwargs):
        assert getattr(cls, 'type', None) is not None, "Can not instantiate Base[] generic"
        return super().__call__(*args, **kwargs)

class Base(metaclass=BaseMeta):
    def __init__(self, some: int):
        self.some = some

# identity checking
assert Base[int] is Base[int]
assert Base[int] == Base[int]
assert Base[int].type is int
assert Optional[int] is Optional[int]

# instantiation
# noinspection PyCallByClass
b = Base[int](some=1)
assert b.type is int
assert b.some == 1

try:
    b = Base(1)
except AssertionError as e:
    assert str(e) == 'Can not instantiate Base[] generic'

# isinstance checking
assert isinstance(b, Base)
assert isinstance(b, Base[int])
assert not isinstance(b, Base[float])

exit(0)

# type hinting in IDE
assert b.type2 is not None  # Cannot find reference 'type2' in 'Base | BaseMetaMixin'
b2 = Base[2]()  # Expected type 'type', got 'int' instead
