If I call QApplication's __init__ without arguments, I get:
TypeError: arguments did not match any overloaded call:
QApplication(list-of-str): not enough arguments
QApplication(list-of-str, bool): not enough arguments
QApplication(list-of-str, QApplication.Type): not enough arguments
QApplication(Display, int visual=0, int colormap=0): not enough arguments
QApplication(Display, list-of-str, int visual=0, int cmap=0): not enough arguments
Very interesting! How can I write a class like that? Every trick I've seen for this kind of function overloading did not involve explicit signatures.
TypeError is just another exception. You can accept *args and **kwargs, check them yourself, and raise a TypeError with whatever text you like - e.g. listing the expected call signatures.
That being said, PyQt modules are .pyd files, i.e. native Python extensions written in C++ (PyQt is generated with SIP; Boost.Python is a similar wrapper library). Such wrapper generators do support "real" overloads, afaik.
Either way, you shouldn't do this unless you have a really good reason. Python is duck-typed, embrace it.
It's quite possible that its __init__ simply accepts (self, *args, **kwargs) and then does its own signature testing against the args list and kwargs dict.
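A minimal sketch of that pattern (a hypothetical class, not PyQt's actual implementation) could look like this:

```python
class Overloaded:
    """Emulates C++-style constructor overloads by inspecting *args."""

    def __init__(self, *args):
        if len(args) == 1 and isinstance(args[0], list):
            self._init_from_list(args[0])
        elif len(args) == 2 and isinstance(args[0], list) and isinstance(args[1], bool):
            self._init_from_list(args[0], gui_enabled=args[1])
        else:
            # Mimic the PyQt-style error that lists the accepted signatures
            raise TypeError(
                "arguments did not match any overloaded call:\n"
                "  Overloaded(list-of-str)\n"
                "  Overloaded(list-of-str, bool)"
            )

    def _init_from_list(self, argv, gui_enabled=True):
        self.argv = argv
        self.gui_enabled = gui_enabled


o = Overloaded(["prog"])       # matches the first "overload"
print(o.argv, o.gui_enabled)   # ['prog'] True
```

The dispatch logic and error text live entirely in Python; no special language support is needed.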
Is there any advantage to using the 'type hint' notation in python?
import sys
def parse(arg_line: int) -> str:
    print(arg_line)  # passing a string, returning None

if __name__ == '__main__':
    parse(' '.join(sys.argv[1:]))
To me it seems like it complicates the syntax without providing any actual benefit (outside of perhaps within a development environment). Based on this:
Are there any plans for python to contain type constraints within the language itself?
What is the advantage of having a "type hint" ? Couldn't I just as easily throw that into the docstring or something?
I also don't see this much in the Python standard library itself as far as I can tell -- most types are enforced manually, for example in argparse.py and the other files I've glanced at in https://github.com/python/cpython/blob/3.7/Lib/.
Are there any plans for python to contain type constraints within the language itself?
Almost certainly not, and definitely not before the next major version.
What is the advantage of having a "type hint" ? Couldn't I just as easily throw that into the docstring or something?
Off the top of my head, consider the following:
Type hints can be verified with tooling like mypy.
Type hints can be used by IDEs and other tooling to give hints and tips. E.g., when you're calling a function and you've just written foo(, the IDE can pick up on the type hints and display a box nearby that shows foo(x: int, y: List[int]). The advantage to you as a developer is that you have exactly the information you need exposed to you and don't have to munge an entire docstring.
Type hints can be used by modules like functools.singledispatch or external libraries like multipledispatch to add additional type-related features (in this case, dispatching function calls based on name and type, not just name).
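As a sketch of the first point: the hints below do nothing at runtime, but a checker like mypy compares every call site against them (the exact wording of mypy's message may vary by version):

```python
from typing import List

def total(xs: List[int]) -> int:
    # At runtime the annotation is ignored; mypy checks callers statically.
    return sum(xs)

print(total([1, 2, 3]))  # 6

# A call like total("oops") still runs (and fails inside sum), but mypy
# reports it before the program is ever executed, roughly:
#   error: Argument 1 to "total" has incompatible type "str";
#          expected "List[int]"
```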
One option for taking advantage of type hints is the type_enforced module. As for official Python support, it still seems unlikely that type hints will be enforced directly in the near future.
type_enforced lets you enforce the hints you write. It supports both input and output typing, and only the types you specify are enforced. Multiple possible input types are also supported, so you can specify something like int or float.
Input types are validated first (lazily, on the function call); if they are valid, the function is executed and the return value is then validated.
One limitation is that nested type structures are not supported: you can specify list, for example, but not a list of integers. You would need to validate the items of the list inside your function.
pip install type_enforced
>>> import type_enforced
>>> @type_enforced.Enforcer
... def my_fn(a: int, b: [int, str] = 2, c: int = 3) -> None:
... pass
...
>>> my_fn(a=1, b=2, c=3)
>>> my_fn(a=1, b='2', c=3)
>>> my_fn(a='a', b=2, c=3)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 47, in __call__
return self.__validate_types__(*args, **kwargs)
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 83, in __validate_types__
self.__check_type__(assigned_vars.get(key), value, key)
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 56, in __check_type__
self.__exception__(
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 37, in __exception__
raise TypeError(f"({self.__fn__.__qualname__}): {message}")
TypeError: (my_fn): Type mismatch for typed function (my_fn) with `a`. Expected one of the following `[<class 'int'>]` but got `<class 'str'>` instead.
I'm using strictly typed Python, and would like to achieve something similar to the converting-constructor overloading of C++. That is, I'd like to make my object constructible from another type using an explicit definition.
Here's an example:
class CallerSurface(StringEnum):
IOS_APPLICATION = "ios_application"
ANDROID_APPLICATION = "android_application"
WEB_APPLICATION = "web_application"
which I can use with some function such as:
def getResponse(data: Data, caller_name: CallerSurface) -> str:
I'd like to extend the definition of class CallerSurface so that a function taking a parameter of type CallerSurface can also accept a str, and just "know" how to convert the str to CallerSurface without the caller writing an explicit conversion.
So I want to use in the following way:
caller_name: str = HTTPUtils.extractCallerFromUserAgent(request)
response = getResponse(other_data, caller_name)
caller_name is a str, but getResponse takes a CallerSurface. I'd like the conversion to be implicitly defined in the CallerSurface class.
In C++ you could achieve this by defining a converting constructor that takes a string. Is there something like that in Python?
There is no way to automate the conversion (there's no such thing as implicit type conversion of the sort you're looking for), so your options are:
Expand the allowed argument types (caller_name: CallerSurface becomes caller_name: Union[CallerSurface, str]), manually convert the related types to the intended type, or
Use functools.singledispatch to make multiple versions of the function, one accepting each type, where all but one implementation just convert to the intended type and call that one
In short, this isn't C++, and implicit type conversions aren't a thing in the general case.
Python doesn't do type conversion implicitly. It does provide things like the __str__ magic method, but that conversion still requires an explicit call to the str() function.
What you want to do, I think, is leave the typing as CallerSurface, and use a static type checker to force the caller to do (e.g.):
caller_name = HTTPUtils.extractCallerFromUserAgent(request)
response = getResponse(other_data, CallerSurface(caller_name))
Using a type checker (e.g. mypy) is key, since that makes it impossible to forget the CallerSurface call (or whatever other kind of conversion needs to happen).
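Since StringEnum isn't shown in the question, here is a sketch assuming it behaves like a str-valued Enum; it combines the Union signature from option 1 with Enum's built-in lookup-by-value, so the conversion happens once at the function boundary:

```python
from enum import Enum
from typing import Union

class CallerSurface(str, Enum):  # assumed stand-in for the question's StringEnum
    IOS_APPLICATION = "ios_application"
    ANDROID_APPLICATION = "android_application"
    WEB_APPLICATION = "web_application"

def getResponse(data: dict, caller_name: Union[CallerSurface, str]) -> str:
    # Enum lookup-by-value converts a str; passing a member returns it unchanged.
    caller = CallerSurface(caller_name)
    return f"handled for {caller.value}"

print(getResponse({}, "ios_application"))              # str accepted
print(getResponse({}, CallerSurface.WEB_APPLICATION))  # enum accepted
```

An unknown string such as "tv_application" raises ValueError at the boundary, which is usually where you want that failure to surface.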
I've found a number of questions and answers dealing with determining the parameters/arguments needed for functions* in Python. But Python has a number of objects that are callable (per callable()) for which inspect.getfullargspec raises an error (and inspect.Signature seems no more reliable in my informal testing).
For example:
inspect.getfullargspec(max)
Returns an Error:
TypeError: unsupported callable
Is there an alternative to inspect that will work for these otherwise "unsupported callable" functions/objects?
Looking at a number of builtin callable objects, it seems like many are in the "unsupported" category; I can only guess at objects in other modules/libraries.
*I may be mincing my terms here, apologies - is there any way to determine specifically which is which?
In CPython, builtin functions are implemented in C, and thus have many quirks. This is one of them. Taking some examples from PEP 570:
In current versions of Python, many CPython "builtin" and standard
library functions only accept positional-only parameters. The
resulting semantics can be easily observed by calling one of these
functions using keyword arguments:
>>> help(pow)
...
pow(x, y, z=None, /)
...
>>> pow(x=5, y=3)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: pow() takes no keyword arguments
pow() expresses that its parameters are positional-only via the / marker.
However, this is only a documentation convention; Python developers
cannot use this syntax in code.
There are functions with other interesting semantics:
range(), an overloaded function, accepts an optional parameter to the
left of its required parameter.
dict(), whose mapping/iterator
parameter is optional and semantically must be positional-only. Any
externally visible name for this parameter would occlude that name
going into the **kwarg keyword variadic parameter dict.
At the time that answer was written (PEP 570 has since been implemented, in Python 3.8), Python had no official way to express these unusual semantics, such as positional-only parameters, in code.
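You can observe the split from Python with inspect.signature: builtins whose signature was exposed via CPython's Argument Clinic introspect cleanly, while others still raise ValueError. The exact set of supported builtins varies by CPython version:

```python
import inspect

# pow() went through Argument Clinic, so its signature is available
# on recent CPython versions:
print(inspect.signature(pow))  # e.g. (base, exp, mod=None)

# max(), with its overloaded call forms, still has no introspectable
# signature and raises ValueError:
try:
    inspect.signature(max)
except ValueError as exc:
    print("unsupported:", exc)
```

This is the same distinction behind the "unsupported callable" error in the question: getfullargspec and signature both need metadata that many C-implemented callables simply don't carry.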
One of the new features in Python 3.5 is type hinting. For example, the code below is valid now:
def greeting(name: str) -> str:
return 'Hello ' + name
But, as I understand, it doesn't check anything by itself and is interpreted exactly the same way as this:
def greeting(name):
return 'Hello ' + name
and was implemented mostly to help static analyzers (and to make code easier to understand). But is there (or is there planned to be) any way, maybe using a third-party library, to make Python throw errors when an argument of an invalid type is passed to a function with annotated argument types (using only the type-hinting syntax)?
Type hints are specified by PEP 484, which explicitly lists runtime checking as a non-goal:
While the proposed typing module will contain some building blocks for
runtime type checking -- in particular the get_type_hints() function
-- third party packages would have to be developed to implement specific runtime type checking functionality, for example using
decorators or metaclasses. Using type hints for performance
optimizations is left as an exercise for the reader.
From this it seems to follow that the Python developers have no plan to add the functionality that you seek. The quote mentions decorators, and that does seem the way to go. In concept it is straightforward: the decorator would use get_type_hints() on the function to be decorated and would iterate through the arguments, checking their types against any hints, either throwing an error on a clash or simply passing the arguments on to the function. This would be similar to pzelasko's answer, but with the decorator using the hints to handle the boilerplate automatically.
The simplest approach would be to vet just the arguments, though you should also be able to make a decorator that raises an error if the return type clashes with the hint. I don't yet have Python 3.5 and don't have the time to pursue it, but it seems like a good learning exercise for someone who wants to learn about both decorators and type hints. Perhaps you can be one of the "third parties" the PEP alludes to.
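Here is a rough sketch of such a decorator. It only handles plain classes (int, str, ...), not generics like List[int], and it checks the return value as well:

```python
import functools
import inspect
from typing import get_type_hints

def enforce_hints(func):
    """Check annotated arguments and the return value with isinstance()
    at call time. Generic hints (e.g. List[int]) are skipped."""
    hints = get_type_hints(func)
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        bound.apply_defaults()
        for name, value in bound.arguments.items():
            expected = hints.get(name)
            # Only plain classes can be fed to isinstance() directly
            if isinstance(expected, type) and not isinstance(value, expected):
                raise TypeError(
                    f"{func.__name__}() argument {name!r} expected "
                    f"{expected.__name__}, got {type(value).__name__}"
                )
        result = func(*args, **kwargs)
        ret = hints.get("return")
        if isinstance(ret, type) and not isinstance(result, ret):
            raise TypeError(
                f"{func.__name__}() returned {type(result).__name__}, "
                f"expected {ret.__name__}"
            )
        return result

    return wrapper

@enforce_hints
def greeting(name: str) -> str:
    return 'Hello ' + name

greeting("world")   # OK
# greeting(42)      # raises TypeError before the function body runs
```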
If you use Python 3.6 annotations, you can use the typeguard decorators:
https://typeguard.readthedocs.io/en/latest/userguide.html#using-the-decorator
NB: This should only be a "debug" or "testing" tool, not a production one.
Accordingly, the typeguard docs advise adding the -O option to the python command line in production so the checks are skipped.
It does not check inline variable annotations automatically, only function parameters, function return values and object types.
From the doc:
from typeguard import typechecked

@typechecked
def some_function(a: int, b: float, c: str, *args: str) -> bool:
    ...
    return retval

@typechecked
class SomeClass:
    # All type annotated methods (including static and class methods and properties)
    # are type checked.
    # Does not apply to inner classes!
    def method(self, x: int) -> int:
        ...
You may also automate this for all functions with:
from typeguard import install_import_hook

with install_import_hook('myapp'):
    from myapp import some_module
I think the simplest way is to check the type:
def greeting(name):
    if not isinstance(name, str):
        raise TypeError('Expected str; got %s' % type(name).__name__)
    return 'Hello ' + name
Since the release of Python 3.7, it has been possible to do this using functools.singledispatch.
from functools import singledispatch
@singledispatch
def greet(arg: object):
    raise NotImplementedError(f"Don't know how to greet {type(arg)}")

@greet.register
def _(arg: str):
    print(f"Hello, {arg}!")

@greet.register
def _(arg: int):
    print(', '.join("Hello" for _ in range(arg)) + "!")

greet("Bob")  # string implementation is called: prints "Hello, Bob!"
greet(4)      # int implementation is called: prints "Hello, Hello, Hello, Hello!"
greet(["Alice", "Bob"])  # no list implementation, so the base implementation raises NotImplementedError
In the above example, a base implementation is registered that raises NotImplementedError. This is the "fallback" implementation that is called if none of the other implementations is suitable.
Following the definition of the base implementation, an arbitrary number of type-specific implementations can be registered, as shown in the example: the function behaves completely differently depending on the type of its argument, and @singledispatch uses the type annotation to associate each implementation with a type.
A singledispatch function can have any number of arguments, but only the type annotation for the first argument is relevant to which implementation is called.
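A small illustration of that last point, using a hypothetical two-argument function: only the first argument's type selects the implementation; the second is passed through untouched.

```python
from functools import singledispatch

@singledispatch
def describe(value, verbose=False):
    # Fallback for types with no registered implementation
    return "something"

@describe.register
def _(value: int, verbose=False):
    # Chosen whenever the *first* argument is an int, regardless of verbose
    return "a big int" if verbose and value > 100 else "an int"

print(describe(5))          # an int
print(describe(500, True))  # a big int
print(describe("x", True))  # something (str has no registered implementation)
```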
One can use the library https://github.com/h2oai/typesentry for doing this at runtime.
I'm trying to figure out the arguments of a method retrieved from a module.
I found an inspect module with a handy function, getargspec.
It works for a function that I define, but won't work for functions from an imported module.
import math, inspect
def foobar(a,b=11): pass
inspect.getargspec(foobar) # this works
inspect.getargspec(math.sin) # this doesn't
I'll get an error like this:
File "C:\...\Python 2.5\Lib\inspect.py", line 743, in getargspec
raise TypeError('arg is not a Python function')
TypeError: arg is not a Python function
Is inspect.getargspec designed only for local functions or am I doing something wrong?
It is impossible to get this kind of information for a function that is implemented in C instead of Python.
The reason is that the only record of what arguments such a function accepts is its (free-form) docstring: the C implementation pulls arguments out of the argument tuple itself (in a PyArg_ParseTuple-like way), so it's impossible to find out what arguments it accepts without actually executing the function.
You can get the docstring for such functions/methods, which nearly always contains the same type of information as getargspec (i.e. parameter names, number of parameters, optional ones, default values).
In your example
import math
math.sin.__doc__
Gives
"sin(x)
Return the sine of x (measured in radians)"
Unfortunately there are several different standards in operation. See What is the standard Python docstring format?
You could detect which of the standards is in use, and then grab the info that way. From the above link it looks like pyment could be helpful in doing just that.
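As a very rough illustration, a heuristic that pulls parameter names out of a Python-2-style first docstring line might look like this. It is best-effort only, since docstrings are free-form and many don't start with a signature line:

```python
import re

def params_from_docstring(doc):
    """Best-effort sketch: extract parameter names from a leading
    'name(args)' line of a docstring, e.g. 'sin(x)'. Returns None
    when no such line is found."""
    if not doc:
        return None
    match = re.match(r"\s*\w+\((.*?)\)", doc)
    if not match:
        return None
    inner = match.group(1).strip()
    if not inner:
        return []
    return [p.strip() for p in inner.split(",")]

# Using the docstring quoted above:
doc = "sin(x)\n\nReturn the sine of x (measured in radians)."
print(params_from_docstring(doc))  # ['x']
```

For anything beyond this, a docstring-format-aware tool (like the pyment mentioned above) is a sturdier bet than hand-rolled regexes.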