I need to pass kwargs to a Python instance method, but I get the error below.
class test(object):
    def __init__(self):
        print 'Initialized'

    def testmethod(self, **kwargs):
        for each in kwargs:
            print each

x = test()
x.testmethod({"name": "mike", "age": 200})
Error:
Traceback (most recent call last):
  File "C:\Users\AN\workspace\Scratch\Scratch\test.py", line 20, in <module>
    x.testmethod({"name":"mike","age":200})
TypeError: testmethod() takes exactly 1 argument (2 given)
x.testmethod({"name":"mike","age":200})
This invokes testmethod() with two positional arguments - one is implicit (the object instance, i.e. self) and the other one is the dictionary you gave it.
To pass in keyword arguments, use the ** operator for unpacking your dictionary into key=value pairs, like this:
x.testmethod(**{"name":"mike","age":200})
This translates to x.testmethod(name='mike', age=200) which is what you want. You can also read about argument unpacking in the documentation.
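A minimal Python 3 sketch of the fix (the class is reduced to just the relevant method): passing the dict directly makes it one positional argument, while **-unpacking turns its items into keyword arguments.

```python
class Test(object):
    def testmethod(self, **kwargs):
        return kwargs  # each key/value pair arrives as a keyword argument

x = Test()
print(x.testmethod(**{"name": "mike", "age": 200}))  # {'name': 'mike', 'age': 200}
```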
I am trying to create a metaclass:
from typing import Tuple

class StructMeta(Tuple):
    pass

class Struct(metaclass=StructMeta):
    pass

print(type(Struct))
Executing this gives:
Traceback (most recent call last):
  File "main.py", line 9, in <module>
    class Struct(metaclass=StructMeta):
TypeError: tuple expected at most 1 argument, got 3
I don't understand why this error occurs.
typing.Tuple appears to be a subclass of tuple, which only takes one argument, an iterable.
When creating a class, Python passes 3 arguments to its metaclass: the class name, a tuple of base classes, and a dict representing the class body.
It's not really possible to use typing.Tuple as a metaclass.
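A working variant, as a sketch: derive the metaclass from type, which accepts the three arguments (name, bases, namespace) that the class statement passes.

```python
# A metaclass must be able to accept the three arguments Python passes
# when building a class: the class name, the base classes, and the body.
# Subclassing type (rather than typing.Tuple) satisfies that.
class StructMeta(type):
    pass

class Struct(metaclass=StructMeta):
    pass

print(type(Struct))  # <class '__main__.StructMeta'>
```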
I'm developing a class in Python 3.7 with a function which overrides the run function of its base class. The run() function must be defined with a precise number of arguments, e.g def run(self, a, b), and it will behave differently based on how many arguments are defined -- its signature is inspected when the function is called, in order to provide the right arguments number to the function. In my design, the number (and names) of arguments are passed to the class when its instance is constructed.
I've looked at modules such as functools and solutions like *args and **kwargs, but I fear they won't work for my scenario, since I don't want to 'bind' any variable before calling the function, nor do I want a variable argument list. I need a fixed list of arguments for my function, but I need to define the function using a variable which is provided to the class :)
It is probably much simpler to show my expected results:
>>> args1 = ['arg1', 'arg2']
>>> args2 = ['arg1', 'arg2', 'arg3']
...
...
>>> c1 = MyClass(run_args=args1)
>>> c2 = MyClass(run_args=args2)
>>> import inspect
>>> inspect.getargspec(c1.run)
ArgSpec(args=['self', 'arg1', 'arg2'], varargs=None, keywords=None, defaults=None)
>>> inspect.getargspec(c2.run)
ArgSpec(args=['self', 'arg1', 'arg2', 'arg3'], varargs=None, keywords=None, defaults=None)
This is a rather strange design, because objects from the same class are expected to share common methods, hence common method signatures.
But in Python 3 every object can define an attribute with a method name, and that attribute will be used when the idiom obj.method(args) is called. And if that attribute is a function or a lambda, it can have a signature.
Here is a minimalist code demonstrating the concept:
import abc
import inspect

class Base(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def run(self, *args):
        """Just an example to show what arguments have been passed"""
        print("run method called with", *args)

class Child(Base):
    def __init__(self, run_args):
        """run_args is the list of the names of the arguments of run"""
        params = [inspect.Parameter(n, inspect.Parameter.POSITIONAL_OR_KEYWORD)
                  for n in run_args]
        # self.run is a lambda calling Child.run, but has a specific signature
        self.run = lambda *args: Child.run(self, *args)
        self.run.__signature__ = inspect.Signature(params)

    def run(self, *args):
        # check that the signature of self.run has been observed
        inspect.signature(self.run).bind(*args)
        # do the processing - here only call the base class method
        super().run(*args)
You can now test it:
>>> c1 = Child(['arg1', 'arg2'])
>>> c1.run(1,2)
run method called with 1 2
>>> c1.run(1,2,3)
Traceback (most recent call last):
...
TypeError: too many positional arguments
>>> c1.run(1)
Traceback (most recent call last):
File "<pyshell#119>", line 1, in <module>
...
TypeError: missing a required argument: 'arg2'
>>> c2 = Child(['arg1', 'arg2', 'arg3'])
>>> c2.run(1,2,3)
run method called with 1 2 3
>>> c1.run(1,2)
run method called with 1 2
Python obviously has a way to verify whether a function call has valid arguments (correct number of positional arguments, correct keyword arguments, etc). The following is a basic example of what I mean:
def test_func(x, y, z=None):
    print(x, y, z)

test_func(2)             # Raises a "missing positional argument" TypeError
test_func(1, 2, 3, a=5)  # Raises an "unexpected keyword argument" TypeError
Is there a way that I can use this argument verification, without actually calling the function?
I'm basically trying to write a decorator that does some preprocessing steps based on the function arguments before calling the wrapped function itself, such as:
def preprocess(func):
    def wrapper(*args, **kwargs):
        # Verify *args and **kwargs are valid for the original function.
        # I want the exact behavior of calling func() in the case of bad arguments,
        # but without actually calling func() if the arguments are ok.
        preprocess_stuff(*args, **kwargs)
        func(*args, **kwargs)
    return wrapper
I want my wrapper function to verify that the arguments would be valid if used on the wrapped function before doing any preprocessing work.
I would like to take advantage of the checks Python already does every time you call a function and the various exceptions it will raise. I just do not want to actually call the function, because the function may not be idempotent. Writing my own checks and exceptions feels like reinventing the wheel.
You can't invoke the actual built-in argument verification for a function without calling the function, but you can use something pretty close.
The inspect module has a function signature(), which returns a Signature object representing the argument signature of the function passed to it. That Signature object has a bind() method which attempts to create a BoundArguments object using the arguments passed to it. If those arguments don't match the signature, a TypeError is raised.
While this mostly behaves like the built-in argument binding logic, it has a few differences. For example, it can't always determine the signature of functions written in C, and its interaction with decorators will depend on whether they use functools.wraps (or something else that sets the __wrapped__ attribute). That said, since the real argument binding logic is inaccessible, inspect.signature is the best alternative.
We can use all this to create your decorator:
import functools
import inspect

def preprocess(func):
    sig = inspect.signature(func)
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            sig.bind(*args, **kwargs)
        except TypeError:
            pass  # bad arguments; skip preprocessing
        else:
            print("Preprocessing: args=%r, kwargs=%r" % (args, kwargs))
            # ... etc.
        return func(*args, **kwargs)
    return wrapper
Usage:
@preprocess
def test_func(x, y, z=None):
    print(x, y, z)
>>> test_func(2)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 10, in wrapper
TypeError: test_func() missing 1 required positional argument: 'y'
>>> test_func(1, 2, 3, a=5)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 10, in wrapper
TypeError: test_func() got an unexpected keyword argument 'a'
>>> test_func(1, 2)
Preprocessing: args=(1, 2), kwargs={}
1 2 None
Note that, if bad arguments are supplied, you do in fact want to call the function, because you "want the exact behavior of calling func() in the case of bad arguments" (to quote your comment), and the only way of getting the exact behaviour of calling an arbitrary function (even if that behaviour is to immediately fail) is to actually call it. What you don't want to do in such cases is the preprocessing, which the decorator above achieves for you.
I am trying to pass a dictionary to a function in Python, but it shows me an error.
class stud:
    def method(**arg):
        print(arg)

dict1 = {1: "abc", 2: "xyz"}
a = stud()
a.method(dict1)
This raises the following error:
>>> a.method(dict1)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: method() takes 0 positional arguments but 2 were given
Can you tell me where I went wrong, or the right way to pass a dictionary to a function?
As @Bit mentions, method is not a static method, so you need to add a self parameter.
There are two options here:
you use a normal parameter in the method:
class stud:
    def method(self, arg):  # normal parameter
        print(arg)
you pass the dictionary as named parameters in the call:
class stud:
    def method(self, **arg):  # keyword arguments
        print(arg)

a.method(**dict1)
Note that ** unpacking requires all keys to be strings; dict1 in your example has integer keys, so this call would raise a TypeError unless you switch to string keys.
Personally I would go with the first one, since it is:
more efficient: you pass only a reference to the dictionary; and
if you want to alter the original dictionary, that is still possible.
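A runnable sketch of both options side by side (method_kw is a hypothetical name for the second variant, used here only so both can live in one class; note again that ** unpacking requires string keys):

```python
class stud:
    def method(self, arg):       # option 1: the dict is a single argument
        return arg

    def method_kw(self, **arg):  # option 2: items become keyword arguments
        return arg

a = stud()
print(a.method({1: "abc", 2: "xyz"}))           # works with any keys
print(a.method_kw(**{"x": "abc", "y": "xyz"}))  # ** requires string keys
```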
I noticed this from the docstring of __build_class__:
__build_class__(func, name, *bases, metaclass=None, **kwds) -> class
Internal helper function used by the class statement.
The part that intrigued me was the **kwds part. Can class definitions take keyword arguments? I tried it, but I got a very strange error:
>>> class Test(a=1):
... pass
...
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: type() takes 1 or 3 arguments
What's the deal here? Can classes in Python 3 somehow accept keyword arguments? Maybe a special metaclass is required?
Can classes in Python 3 somehow accept keyword arguments?
Yes. Any keyword arguments in the class statement besides metaclass are passed to the metaclass. If the metaclass argument is specified, it's used as the metaclass; otherwise, the metaclass is type. See PEP 3115 for more details.
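A minimal sketch of a metaclass that consumes such a keyword argument (Meta and greeting are illustrative names, not anything from the standard library):

```python
# Keyword arguments in the class statement (other than metaclass) are
# forwarded to the metaclass's __new__ and __init__.
class Meta(type):
    def __new__(mcls, name, bases, namespace, greeting=None):
        cls = super().__new__(mcls, name, bases, namespace)
        cls.greeting = greeting  # keyword argument from the class statement
        return cls

    def __init__(cls, name, bases, namespace, greeting=None):
        super().__init__(name, bases, namespace)

class Test(metaclass=Meta, greeting="hello"):
    pass

print(Test.greeting)  # hello
```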