Is there a way in Python to declare instance variables from a method's arguments without writing boilerplate?
For example, is there a way for self.foo, self.bar and all other arguments to be automatically declared?
def __init__(self, foo, bar, ..., last):
self.foo = foo
self.bar = bar
...
self.last = last
There actually is, though I think it's a very ugly way:
class Foo:
def __init__(self, **kwargs):
for name, value in kwargs.items():
setattr(self, name, value)
Foo(name='foo', baz=123, bar=True, last=None)
As Tim Roberts suggested in the comments, using a namedtuple is a much better and cleaner way (you may also take a look at dataclasses.dataclass).
You can use keyword arguments (**kwargs) in __init__. For example:
class MyClass:
def __init__(self, **kwargs):
for name, value in kwargs.items():
setattr(self, name, value)
Then you can create an instance using whatever keyword arguments you want:
>>> mc = MyClass(foo=123, bar=456, baz=789)
>>>
Those arguments are assigned as attributes of the class instance:
>>> mc.foo
123
>>> mc.bar
456
>>> mc.baz
789
>>>
You may want to add some checks if you do this, e.g. restrict the names to some known set. Otherwise an undesired name could override a method name, for instance.
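A minimal sketch of such a check (the allowed names here are hypothetical):
class MyClass:
    ALLOWED = {'foo', 'bar', 'baz'}  # hypothetical whitelist of attribute names

    def __init__(self, **kwargs):
        for name, value in kwargs.items():
            if name not in self.ALLOWED:
                raise TypeError('unexpected keyword argument %r' % name)
            setattr(self, name, value)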
Or, you could save the keyword dictionary directly in the class instance, as a single attribute. That would insulate the class from the names. For example:
class MyClass:
def __init__(self, **kwargs):
self.args = kwargs
You could then access them through self.args.
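For example:
>>> mc = MyClass(foo=123, bar=456)
>>> mc.args['foo']
123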
Solution 1
As @Tim Roberts suggested, this is how it can be accomplished with a namedtuple.
from collections import namedtuple
class Foo(namedtuple("Foo", ["param1", "param2"])):
pass # Rest of code goes here
my_class = Foo(1, 2)
my_class.param1 # returns 1
You can even give the fields default values (the defaults parameter was added in Python 3.7):
namedtuple("Foo", ["param1", "param2"], defaults=[1, 2])
Solution 2
Alternatively, you can use **kwargs to set the instance variables (which doesn't use the collections library and is arguably easier) if you are fine with the user passing any random argument.
Here is an example
class Foo:
def __init__(self, **kwargs):
self.__dict__.update(kwargs)
my_class = Foo(param1=1)
my_class.param1 # returns 1
# Doesn't force you to pass the correct parameters
random_param_class = Foo(a_made_up_parameter=2)
random_param_class.param1 # raises an AttributeError
Bear in mind that none of this is standard and most often people just declare the instance variables from within __init__.
To be honest, if you have a fixed set of inputs, you can use this:
from inspect import getargvalues, stack
def arguments():
args = getargvalues(stack()[1][0])[-1]
del args['self']
if 'kwargs' in args:
args.update(args['kwargs'])
del args['kwargs']
return args
class myClass():
def __init__(self, foo, bar, ..., last):
# Auto update all arguments into object dictionary
self.__dict__.update(arguments())
This should do it, as long as you don't use *args:
obj = myClass(1, 2, 3, 'foo', 'random')
# all the right attributes will be created:
# obj.foo == 1
# obj.bar == 2
I tried writing a decorator as such (going off memory, excuse any problems in code):
def required(fn):
    def wrapped(self):
        self.required_attributes += [fn.__name__]
        return fn(self)
    return wrapped
and I used this to decorate @property attributes in classes, e.g.:
@property
@required
def some_property(self):
return self._some_property
...so that I could do something like this:
def validate_required_attributes(instance):
for attribute in instance.required_attributes:
if not hasattr(instance, attribute):
raise ValueError(f"Required attribute {attribute} was not set!")
Now, I forgot that this wouldn't work, because for required_attributes to be updated with the property's name, I would have to retrieve the property first. So in essence, in the class's __init__ I could just do a self.propertyname to add it... but this solution is not nice at all; I might as well create a list of required attribute names in __init__.
From what I know, the decorator is applied when the class is defined, so I wouldn't be able to modify required_attributes before the wrapped function is defined. Is there another way I can make this work? I just want a nice, elegant solution.
Thanks!
I think the attrs library does what you want. You can define a class like this, where x and y are required and z is optional.
from attr import attrs, attrib
@attrs
class MyClass:
x = attrib()
y = attrib()
z = attrib(default=0)
Testing it out:
>>> instance = MyClass(1, 2)
>>> print(instance)
MyClass(x=1, y=2, z=0)
Here's my take at doing it with a class decorator and a method decorator. There's probably a nicer way of doing this using metaclasses (nice meaning the API, not the implementation ;)).
def requiredproperty(f):
setattr(f, "_required", True)
return property(f)
def hasrequiredprops(cls):
props = [x for x in cls.__dict__.items() if isinstance(x[1], property)]
cls._required_props = {k for k, v in props if getattr(v.fget, "_required", False)}
return cls
@hasrequiredprops
class A(object):
def __init__(self):
self._my_prop = 1
def validate(self):
print("required attributes are", ",".join(self._required_props))
@requiredproperty
def my_prop(self):
return self._my_prop
This should make validation work without requiring the caller to touch the property first:
>>> a = A()
>>> a.validate()
required attributes are my_prop
>>> a.my_prop
1
The class decorator is required to make sure the class knows the required property names during instantiation. The requiredproperty function is just a way to mark the properties as required.
That being said, I'm not completely sure what you are trying to achieve here. Perhaps validation of the instance attribute values that the property should return?
I have a class:
class A(object):
def do_computing(self):
print "do_computing"
Then I have:
new_class = type('B', (object,), {'a': '#A', 'b': '#B'})
What I want to achieve is to make all methods and properties of class A members of class B. Class A can have anywhere from 0 to N such elements, and I want to make them all members of class B.
So far I get to:
methods = {}
for el in dir(A):
if el.startswith('_'):
continue
tmp = getattr(A, el)
if isinstance(tmp, property):
methods[el] = tmp
if isinstance(tmp, types.MethodType):
methods[el] = tmp
instance_class = type('B', (object,), {'a': '#A', 'b': '#B'})
for name, func in methods.items():
new_method = types.MethodType(func, None, instance_class)
setattr(instance_class, name, new_method)
But then when I run:
instance_class().do_computing()
I get an error:
TypeError: unbound method do_computing() must be called with A instance as first argument (got B instance instead)
Why do I have to do this? We have a lot of legacy code, and I need fancy objects that will pretend to be the old objects without really being them.
One more important thing: I cannot use inheritance; too much magic happens in the background.
If you do it like this, it will work (the key difference is that A.__dict__ contains plain functions, whereas getattr(A, el) returns unbound methods that are already tied to A):
import types
class A(object):
def do_computing(self):
print "do_computing"
methods = {name:value for name, value in A.__dict__.iteritems()
if not name.startswith('_')}
instance_class = type('B', (object,), {'a': '#A', 'b': '#B'})
for name, func in methods.iteritems():
new_method = types.MethodType(func, None, instance_class)
setattr(instance_class, name, new_method)
instance_class().do_computing()
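As an aside, Python 3 has no unbound methods, so this dance becomes unnecessary there: a plain function placed into a class namespace already behaves as a method. A minimal Python 3 re-spin of the same idea:
class A:
    def do_computing(self):
        print("do_computing")

# A.__dict__ holds plain functions, which become methods automatically
methods = {name: value for name, value in A.__dict__.items()
           if not name.startswith('_')}

B = type('B', (object,), dict({'a': '#A', 'b': '#B'}, **methods))
B().do_computing()  # prints: do_computing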
Unless I'm missing something, you can do this with inheritance:
class B(A):
def __init__(self):
super(B, self).__init__()
Then:
>>> b = B()
>>> b.do_computing()
do_computing
Edit: cms_mgr said the same in the comments, also fixed indentation
Are you creating a facade? Maybe you want something like this:
Making a facade in Python 2.5
http://en.wikipedia.org/wiki/Facade_pattern
You could also use delegators. Here's an example from the wxPython AGW:
_methods = ["GetIndent", "SetIndent", "GetSpacing", "SetSpacing", "GetImageList", "GetStateImageList",
"GetButtonsImageList", "AssignImageList", "AssignStateImageList", "AssignButtonsImageList",
"SetImageList", "SetButtonsImageList", "SetStateImageList", 'other_methods']
def create_delegator_for(method):
"""
Creates a method that forwards calls to `self._main_win` (an instance of :class:`TreeListMainWindow`).
:param `method`: one method inside the :class:`TreeListMainWindow` local scope.
"""
def delegate(self, *args, **kwargs):
return getattr(self._main_win, method)(*args, **kwargs)
return delegate
# Create methods that delegate to self._main_win. This approach allows for
# overriding these methods in possible subclasses of HyperTreeList
for method in _methods:
setattr(HyperTreeList, method, create_delegator_for(method))
Note that these wrap class methods... i.e. both functions take a signature like def func(self, some, other, args) and are intended to be called like self.func(some, args). If you want to delegate a class function to a non-class function, you'll need to modify the delegator.
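For example, a minimal sketch of such a modified delegator (all names here are illustrative):
def create_function_delegator(func):
    # forward the method call to a plain function, dropping `self`
    def delegate(self, *args, **kwargs):
        return func(*args, **kwargs)
    return delegate

class Wrapper(object):
    pass

# hypothetical usage: expose the builtin max() as a method
Wrapper.biggest = create_function_delegator(max)
print(Wrapper().biggest(1, 5, 3))  # prints: 5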
You can inherit from a parent class as such:
class Awesome:
    def method_a(self):
        return "blee"

class Beauty(Awesome):
    def __init__(self):
        self.x = self.method_a()
b = Beauty()
print(b.x)
>>> "blee"
This was freely typed, but the logic is the same nonetheless and should work.
You can also do fun things with setattr like so:
#as you can see this class is worthless and is nothing
class blee():
pass
b = blee()
setattr(b, "variable_1", "123456")
print(b.variable_1)
# prints: 123456
Essentially, you can attach any object or method to a class instance with setattr.
EDIT: Just realized that you did use setattr, woops ;)
Hope this helps!
I'm currently creating an object like this:
class Obj(object):
def __init__(self,**kwargs):
params = ['val1','val2','val3','val4',...]
for p in params:
setattr(self,p,kwargs.get(p,None))
I'm doing this so I don't have to do this:
class Obj(object):
def __init__(self,val1=None,val2=None,val3=None,val4=None,...):
self.val1=val1
self.val2=val2
self.val3=val3
self.val4=val4
...
My question is: can you do a mix of the two, where I can define the expected parameters yet still loop over them to set the attributes? I like the idea of defining the expected parameters because it is self-documenting, and other developers don't have to search for what kwargs are used.
I know it seems pretty petty but I'm creating an object from some XML so I'll be passing in many parameters, it just clutters the code and bugs me.
I did google this but couldn't find anything, probably because dictionary and kwargs together point to kwarg examples.
UPDATE: To be more specific, is it possible to get a dictionary of the passed-in parameters so I don't have to use kwargs at all?
Pseudocode:
class Obj(object):
def __init__(self,val1=None,val2=None,val3=None,val4=None,...):
for k,v in dictionary_of_paramters.iteritems():
setattr(self,k,v)
You can use the inspect module:
import inspect
def myargs(val1, val2, val3=None, val4=5):
print inspect.currentframe().f_locals
It shows all the locals available in the current stack frame.
myargs('a','b')
==> {'val3': None, 'val2': 'b', 'val1': 'a', 'val4': 5}
(note: it's not guaranteed to be implemented on all Python interpreters)
Edit: I concur that it's not a pretty solution. What I would do is more like:
def _yourargs(*names):
"returns a dict with your named local vars"
alllocs = inspect.stack()[1][0].f_locals
return {n:alllocs[n] for n in names}
def askformine(val1, val2, val3=None, val4=5):
"example to show just those args i'm interested in"
print _yourargs('val1','val2','val3','val4')
class Obj(object):
"example inserting some named args as instance attributes"
def __init__(self, arg1, arg2=4):
self.__dict__.update(_yourargs('arg1','arg2'))
Edit 2, slightly better:
def pickdict(d,*names):
"picks some values from a dict"
return {n:d[n] for n in names}
class Obj(object):
"example inserting some named args as instance attributes"
def __init__(self, arg1, arg2=4):
self.__dict__.update(pickdict(locals(),'arg1','arg2'))
There is no nice way to get a dictionary of all the arguments to a function. The **kwargs syntax only collects up the extra keyword arguments, not the ones that match explicit parameters in the function definition.
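A quick demonstration:
def demo(a, b=2, **kwargs):
    print(kwargs)

demo(1, b=5, c=3)  # prints {'c': 3}; a and b never land in kwargs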
Although you won't be able to get the parameters without using kwargs or the inspect module (see other answers), you can do something like this...
class Obj(object):
def __init__(self, **kwargs):
self.__dict__.update(**kwargs)
Every object has a dictionary that stores all of the attributes, which you can access via self.__dict__. Then you're just using update to set all of the attributes in that object's internal dictionary.
See this question for some discussion of this method.
If you want to obtain the args dict at the very top of your method, before you define any locals, this is as simple as:
class Obj(object):
def __init__(self,val1=None,val2=None,val3=None,val4=None):
kwargs = dict(locals())  # note: this also includes 'self'
To read this dict later on, some introspection magic is required:
import sys

class Obj(object):
def __init__(self,val1=None,val2=None,val3=None,val4=None):
# feel free to add more locals
loc = dict(locals())
fun = sys._getframe().f_code
kwargs = {x:loc[x] for x in fun.co_varnames[:fun.co_argcount]}
You can also make the latter reusable by adding a function like this:
import sys

def getargs():
f = sys._getframe(1)
return {x:f.f_locals[x] for x in f.f_code.co_varnames[:f.f_code.co_argcount]}
and then:
class Obj(object):
def __init__(self,val1=None,val2=None,val3=None,val4=None):
# feel free to add more locals
kwargs = getargs()
This is CPython-specific, I guess.
Yes you can mix the two.
See below:
def method(a, b=1, *args, **kwargs):
'''some code'''
This is valid. Here:
'a' is a required argument,
'b' is an argument with a default value,
'args' will collect any extra positional (non-keyword) arguments, and
'kwargs' will collect any extra keyword arguments.
Example:
method(1,2,3,4,5,test=6,test1=7)
This call will have:
a=1
b=2
args=(3,4,5)
kwargs={'test':6,'test1':7}
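You can confirm this by printing the values:
def method(a, b=1, *args, **kwargs):
    print(a, b, args, kwargs)

method(1, 2, 3, 4, 5, test=6, test1=7)
# prints: 1 2 (3, 4, 5) {'test': 6, 'test1': 7}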
A kind of ugly workaround: inject the defaults as an extra key into kwargs, and use that where you want to loop over all keyword arguments. (PS: this is an example usage of the inspect module, and not recommended for production use.)
#!/usr/bin/env python
import inspect
def inject_defaults(func):
""" injects '__defaults' key into into kwargs,
so it can be merged with kwargs in the decorated method """
args, varargs, varkwargs, defaults = inspect.getargspec(func)
have_defaults = args[-len(defaults):]
defaults_dict = dict(zip(have_defaults, defaults))
def fun(*args, **kwargs):
kwargs['__defaults'] = defaults_dict
return func(*args, **kwargs)
return fun
@inject_defaults
def f1(a,b,c, x=1, **kwargs):
kwargs.update(kwargs['__defaults'])
del kwargs['__defaults']
for k, v in kwargs.items():
# here, x, y and z will appear
print(k, v)
f1(1, 2, 3, y=3, z=2)
# prints
# ('y', 3)
# ('x', 1)
# ('z', 2)
Instead of writing code like this every time I define a class:
class Foo(object):
def __init__(self, a, b, c, d, e, f, g):
self.a = a
self.b = b
self.c = c
self.d = d
self.e = e
self.f = f
self.g = g
I could use this recipe for automatic attribute assignment.
class Foo(object):
@autoassign
def __init__(self, a, b, c, d, e, f, g):
pass
Two questions:
Are there drawbacks or pitfalls associated with this shortcut?
Is there a better way to achieve similar convenience?
There are some things about the autoassign code that bug me (mostly stylistic, but one more serious problem):
autoassign does not assign an 'args' attribute:
class Foo(object):
@autoassign
def __init__(self,a,b,c=False,*args):
pass
a=Foo('IBM','/tmp',True, 100, 101)
print(a.args)
# AttributeError: 'Foo' object has no attribute 'args'
autoassign acts like a decorator, but autoassign(*argnames) calls a function which returns a decorator. To achieve this magic, autoassign needs to test the type of its first argument. If given a choice, I prefer that functions not test the type of their arguments.
There seems to be a considerable amount of code devoted to setting up sieve, lambdas within lambdas, ifilters, and lots of conditions.
if kwargs:
exclude, f = set(kwargs['exclude']), None
sieve = lambda l:itertools.ifilter(lambda nv: nv[0] not in exclude, l)
elif len(names) == 1 and inspect.isfunction(names[0]):
f = names[0]
sieve = lambda l:l
else:
names, f = set(names), None
sieve = lambda l: itertools.ifilter(lambda nv: nv[0] in names, l)
I think there might be a simpler way (see below).
for _ in itertools.starmap(assigned.setdefault, defaults): pass. I don't think map or starmap was meant to call functions whose only purpose is their side effects. It could have been written more clearly with the mundane:
for key,value in defaults.iteritems():
assigned.setdefault(key,value)
Here is an alternative, simpler implementation which has the same functionality as autoassign (e.g. it can do includes and excludes), and which addresses the above points:
import inspect
import functools
def autoargs(*include, **kwargs):
def _autoargs(func):
attrs, varargs, varkw, defaults = inspect.getargspec(func)
def sieve(attr):
if kwargs and attr in kwargs['exclude']:
return False
if not include or attr in include:
return True
else:
return False
@functools.wraps(func)
def wrapper(self, *args, **kwargs):
# handle default values
if defaults:
for attr, val in zip(reversed(attrs), reversed(defaults)):
if sieve(attr):
setattr(self, attr, val)
# handle positional arguments
positional_attrs = attrs[1:]
for attr, val in zip(positional_attrs, args):
if sieve(attr):
setattr(self, attr, val)
# handle varargs
if varargs:
remaining_args = args[len(positional_attrs):]
if sieve(varargs):
setattr(self, varargs, remaining_args)
# handle varkw
if kwargs:
for attr, val in kwargs.items():
if sieve(attr):
setattr(self, attr, val)
return func(self, *args, **kwargs)
return wrapper
return _autoargs
And here is the unit test I used to check its behavior:
import sys
import unittest
import utils_method as um
class Test(unittest.TestCase):
def test_autoargs(self):
class A(object):
@um.autoargs()
def __init__(self,foo,path,debug=False):
pass
a=A('rhubarb','pie',debug=True)
self.assertTrue(a.foo=='rhubarb')
self.assertTrue(a.path=='pie')
self.assertTrue(a.debug==True)
class B(object):
@um.autoargs()
def __init__(self,foo,path,debug=False,*args):
pass
a=B('rhubarb','pie',True, 100, 101)
self.assertTrue(a.foo=='rhubarb')
self.assertTrue(a.path=='pie')
self.assertTrue(a.debug==True)
self.assertTrue(a.args==(100,101))
class C(object):
@um.autoargs()
def __init__(self,foo,path,debug=False,*args,**kw):
pass
a=C('rhubarb','pie',True, 100, 101,verbose=True)
self.assertTrue(a.foo=='rhubarb')
self.assertTrue(a.path=='pie')
self.assertTrue(a.debug==True)
self.assertTrue(a.verbose==True)
self.assertTrue(a.args==(100,101))
def test_autoargs_names(self):
class C(object):
@um.autoargs('bar','baz','verbose')
def __init__(self,foo,bar,baz,verbose=False):
pass
a=C('rhubarb','pie',1)
self.assertTrue(a.bar=='pie')
self.assertTrue(a.baz==1)
self.assertTrue(a.verbose==False)
self.assertRaises(AttributeError,getattr,a,'foo')
def test_autoargs_exclude(self):
class C(object):
@um.autoargs(exclude=('bar','baz','verbose'))
def __init__(self,foo,bar,baz,verbose=False):
pass
a=C('rhubarb','pie',1)
self.assertTrue(a.foo=='rhubarb')
self.assertRaises(AttributeError,getattr,a,'bar')
def test_defaults_none(self):
class A(object):
@um.autoargs()
def __init__(self,foo,path,debug):
pass
a=A('rhubarb','pie',debug=True)
self.assertTrue(a.foo=='rhubarb')
self.assertTrue(a.path=='pie')
self.assertTrue(a.debug==True)
if __name__ == '__main__':
unittest.main(argv = sys.argv + ['--verbose'])
PS. Using autoassign or autoargs is compatible with IPython code completion.
From Python 3.7+ you can use a Data Class, which achieves what you want and more.
It allows you to define fields for your class, which are attributes automatically assigned.
It would look something like this:
from dataclasses import dataclass

@dataclass
class Foo:
a: str
b: int
c: str
...
The __init__ method will be automatically created in your class, and it will assign the arguments of instance creation to those attributes (note, though, that dataclasses do not validate the types at runtime).
Note that here type hinting is required, that is why I have used int and str in the example. If you don't know the type of your field, you can use Any from the typing module.
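A quick demonstration (the field names here are illustrative):
from dataclasses import dataclass
from typing import Any

@dataclass
class Point:
    x: int
    y: int
    label: Any = None

p = Point(1, 2)
print(p)  # prints: Point(x=1, y=2, label=None)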
Is there a better way to achieve similar convenience?
I don't know if it is necessarily better, but you could do this:
class Foo(object):
def __init__(self, **kwargs):
self.__dict__.update(kwargs)
>>> foo = Foo(a = 1, b = 'bar', c = [1, 2])
>>> foo.a
1
>>> foo.b
'bar'
>>> foo.c
[1, 2]
>>>
Courtesy Peter Norvig's Python: Infrequently Answered Questions.
One drawback: many IDEs parse __init__ to discover an object's attributes. If you want automatic code completion in your IDE to be more functional, then you may be better off spelling it out the old-fashioned way.
If you have a lot of variables, you could pass one single configuration dict or object.
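A minimal sketch of that approach (all names here are illustrative):
class Config(object):
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

class Foo(object):
    def __init__(self, config):
        self.config = config  # one attribute instead of many

foo = Foo(Config(a=1, b='bar'))
print(foo.config.a)  # prints: 1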
Similar to the above, though not the same... the following is very short and deals with both args and kwargs:
def autoassign(lcls):
    # lcls is expected to be the caller's locals()
    for key in lcls.keys():
        if key != "self":
            lcls["self"].__dict__[key] = lcls[key]
Use it like this:
class Test(object):
    def __init__(self, a, b, *args, **kwargs):
        autoassign(locals())
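For example:
t = Test(1, 2, 3, x=4)
print(t.a, t.b, t.args, t.kwargs)
# prints: 1 2 (3,) {'x': 4}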
This is a simple implementation by judy2k:
from inspect import signature
def auto_args(f):
sig = signature(f) # Get a signature object for the target:
def replacement(self, *args, **kwargs):
# Parse the provided arguments using the target's signature:
bound_args = sig.bind(self, *args, **kwargs)
# Save away the arguments on `self`:
for k, v in bound_args.arguments.items():
if k != 'self':
setattr(self, k, v)
# Call the actual constructor for anything else:
f(self, *args, **kwargs)
return replacement
class MyClass:
@auto_args
def __init__(self, a, b, c=None):
pass
m = MyClass('A', 'B', 'C')
print(m.__dict__)
# {'a': 'A', 'b': 'B', 'c': 'C'}
In this package you can now find:
@autoargs, inspired by answer-3653049
@autoprops, to transform the fields generated by @autoargs into @property, for use in combination with a validation library such as enforce or pycontracts.
Note that this has been validated for Python 3.5+.
class MyClass(object):
def __init__(self, **kwargs):
for key, value in kwargs.items():
setattr(self, key, value)
You just can't use *args this way, but you could store the extra positional arguments in an instance attribute (self.args, say).
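A minimal sketch of that idea (a variant of the class above):
class MyClass(object):
    def __init__(self, *args, **kwargs):
        self.args = args  # keep extra positional arguments as a tuple
        for key, value in kwargs.items():
            setattr(self, key, value)

m = MyClass(1, 2, foo='bar')
print(m.args, m.foo)  # prints: (1, 2) bar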