I have a Python module containing several classes that reference each other.
class Main(object):
    def foo(self):
        return 'bar'

    def getClass(self, className, *args):
        return eval(className + "(%s)" % ",".join(args))

class A(object):
    def __init__(self, b):
        self._b = b

class B(object):
    def __init__(self, c):
        self._c = c
What I would like to do is this: in the class Main, I want to create an instance of a class knowing only the class name, passing in the arguments needed to construct it.
I know I can do a conditional if/else, but I was wondering whether there is a better way. I figured I could use eval(), but I have heard that can be evil.
Suggestions? Comments? I have a class that references multiple subclasses, and instead of writing separate code for each type, I figured this would be easier.
Thank you
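A safer alternative to eval() is to look the class up by name and call it directly. A minimal sketch, assuming the classes live in the same module (get_class is a hypothetical renaming of the question's getClass; a whitelist dict would be even safer than globals()):

```python
class A(object):
    def __init__(self, b):
        self._b = b

class B(object):
    def __init__(self, c):
        self._c = c

class Main(object):
    def get_class(self, class_name, *args):
        # globals() maps names to objects defined in this module,
        # so no string evaluation is needed.
        cls = globals()[class_name]
        return cls(*args)

main = Main()
a = main.get_class('A', 42)
print(a._b)  # -> 42
```

Unlike the eval() version, this also passes the arguments through unchanged instead of joining them into a source string, so non-string arguments work too.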
I'm using Python 2.7. I've created a class with a property, and I'm instantiating an object from it with an input value. The class:
class Prop(object):
    def __init__(self, data):
        self.A = data
and the getter/setter:
@property
def A(self):
    return self._A

@A.setter
def A(self, data):
    self._A = data
The above works fine. Inside my main script I'm instantiating:
prop_demo = Prop('some text')
However, if inside __init__ I change self.A = data to self._A = data, then the assignment no longer passes through the setter, which seems to miss the point of having a setter. I made that change because using _ for an internal class variable seems more intuitive to me, and self.A = data looks like something a user of the instance should do (outside the class, in my main script).
I have 2 questions:
Why do we write self.A and not self._A in __init__? Am I missing something about the importance of _ in Python class attributes?
Why does __init__ have to go through the setter explicitly -- why isn't the setter automatically executed when I instantiate an object?
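The key point is that self.A = data *is* the setter call: A is a property on the (new-style) class, so any assignment to the public name dispatches to the setter, while self._A is just the plain attribute the setter stores into. A small sketch (the print in the setter is added only to make the dispatch visible):

```python
class Prop(object):
    def __init__(self, data):
        self.A = data          # dispatches to the A.setter below

    @property
    def A(self):
        return self._A

    @A.setter
    def A(self, data):
        print("setter called")  # added to show when the setter runs
        self._A = data

p = Prop('some text')  # prints "setter called": __init__ went through the setter
p.A = 'new text'       # prints "setter called" again
print(p._A)            # -> new text
```

Writing self._A = data in __init__ bypasses the property on purpose; that is exactly what the setter itself does to avoid infinite recursion, but anywhere else it defeats any validation the setter performs.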
First day learning Python, please excuse the basic question.
Assuming I have been given an object which contains an unimplemented method that I need to implement, e.g.:
class myclass():
    def __init__(self):
        self.unimplementedmethod = False
What is the correct way to implement this in an instantiated object? I do not want to alter the base class in any way.
I have experimented and found the following code seems to work, but is it correct/good style?
def methodimplementation():
    print("method called")

myobject = myclass()
myobject.unimplementedmethod = methodimplementation
Is this the right path? Or should I be doing something different like perhaps creating a derived class first, implementing the methods in it, and then instantiating an object based on the derived class? What is best practice?
You need to subclass the base class:
class myclass():
    def some_method(self):
        raise NotImplementedError

class my_subclass(myclass):
    def some_method(self):
        print("method called")
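A quick check of the behavior (same classes, with some_method returning a value instead of printing so the result can be inspected):

```python
class myclass(object):
    def some_method(self):
        raise NotImplementedError

class my_subclass(myclass):
    def some_method(self):
        return "method called"

print(my_subclass().some_method())  # -> method called

try:
    myclass().some_method()         # the base class still raises
except NotImplementedError:
    print("base raises NotImplementedError")
```

This is generally preferable to assigning a function to an instance attribute: a function attached that way is not bound to the instance, so it never receives self.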
You want to create an abstract base class. For that, your base class needs to use abc.ABCMeta as its metaclass. Then, to mark a method as abstract, decorate it with @abstractmethod. For example:
from abc import ABCMeta, abstractmethod

class BaseClass(object):
    __metaclass__ = ABCMeta

    @abstractmethod
    def my_method(self):
        pass
Then you may create the child class as:
class MyChildClass(BaseClass):
    def my_method(self):
        print 'my method'
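For completeness, in Python 3 the metaclass is declared in the class header instead of a __metaclass__ attribute; a sketch of the same idea (my_method returns a value here so the result can be checked):

```python
from abc import ABCMeta, abstractmethod

# Python 3 spelling: metaclass is passed as a keyword in the class header.
class BaseClass(metaclass=ABCMeta):
    @abstractmethod
    def my_method(self):
        pass

class MyChildClass(BaseClass):
    def my_method(self):
        return 'my method'

# The abstract base cannot be instantiated while my_method is abstract:
try:
    BaseClass()
except TypeError:
    print("abstract class cannot be instantiated")

print(MyChildClass().my_method())  # -> my method
```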
The good way is to use subclasses, but if you can't, here is a way to access self from a plain function not defined in a class:
class Bar:
    def __init__(self):
        pass

    def foo(self):
        try:
            self._foo(self)
        except AttributeError:
            raise NotImplementedError

    def set_foo(self, function):
        setattr(self, '_foo', function)

    def another_method(self):
        print "Another method from {}".format(self)

def foo(self):
    self.another_method()

bar = Bar()
bar.set_foo(foo)
bar.foo()
So, def foo(self) defines a function with a single argument self, like a method. This function calls the instance method another_method.
Bar.set_foo creates a new attribute _foo on the Bar instance.
Finally, Bar.foo tries to call self._foo with self as the argument (self must be passed explicitly, because a function stored on the instance is not bound). If _foo does not exist, Bar.foo raises NotImplementedError as expected.
This way you can access self from foo without subclassing.
I have a structure like
class A:
    def __init__(self, x):
        self.a = x

class B(A):
    def __init__(self, x):
        self.b = x

class C(A):
    def __init__(self, x):
        self.c = x

class D(B, C):
    def __init__(self, x):
        self.d = x
Now I'd like to extend the __init__s such that both B and C call A's __init__ too (B and C can be used as stand-alone classes). Moreover, D should call A, B and C (D combines features from B and C but still needs to run all the initializations). All __init__s take the same parameter. Obviously A's __init__ should be called only once.
Do you see an easy way to do that?
Use super. As far as I'm aware, this is its purpose ...
First, some proposed reading:
"Python's Super Considered Harmful" and "Python's super() considered super!" (written simultaneously -- by different people)
Next, an example:
class A(object):
    def __init__(self, x):
        print "Called A"
        self.a = x

class B(A):
    def __init__(self, x):
        print "Called B"
        super(B, self).__init__(x)
        self.b = x

class C(A):
    def __init__(self, x):
        print "Called C"
        super(C, self).__init__(x)
        self.c = x

class D(B, C):
    def __init__(self, x):
        print "Called D"
        super(D, self).__init__(x)
        self.d = x

foo = D(10)
As stated in the comments, you often see methods which use super defined to accept any number of positional and keyword arguments:
def mymethod(self, *args, **kwargs):
    super(thisclass, self).mymethod(*args, **kwargs)
    ...

since that allows super to pass the necessary/unnecessary arguments on to the other classes in the inheritance chain. Each method can then pick out the arguments it needs and ignore the rest (passing them on to the next super call, of course).
Finally, to complete this discussion, we need to discuss python2.x vs. python3.x. In python2.x, all of your classes must be new style (they need to inherit from object eventually). In python3.x, this is automatic. And in python3.x, you can omit the arguments to super.
super().__init__(*args,**kwargs)
However, I prefer the python2.x style as it works for both python2.x and python3.x, and I'm always left wondering how super() (python3.x style) knows what object to pass to the underlying methods as self. It seems a lot more magical than even the python2.x version to me...
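The call order follows the class's method resolution order (MRO), which you can inspect directly. A Python 3 sketch of the same diamond (super() without arguments), showing that A runs exactly once:

```python
# The diamond from the question, with cooperative __init__ calls.
class A(object):
    def __init__(self, x):
        print("Called A")
        self.a = x

class B(A):
    def __init__(self, x):
        print("Called B")
        super().__init__(x)
        self.b = x

class C(A):
    def __init__(self, x):
        print("Called C")
        super().__init__(x)
        self.c = x

class D(B, C):
    def __init__(self, x):
        print("Called D")
        super().__init__(x)
        self.d = x

foo = D(10)
# Prints: Called D, Called B, Called C, Called A -- one call each.
print([cls.__name__ for cls in D.__mro__])
# -> ['D', 'B', 'C', 'A', 'object']
```

Note that super() inside B does not call A directly; it calls the *next* class in the instance's MRO, which for a D instance is C. That is what lets every class run exactly once.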
Using super
class D(B, C):
    def __init__(self, *args, **kwargs):
        super(D, self).__init__(*args, **kwargs)
        self.d = args[0]
Some explanation about super here and a related question
In Python 2 you should also inherit from object to get new-style classes, and add super calls to your other classes too.
class x():
    def __init__(self):
        self.z = 2

class hi():
    def __init__(self):
        self.child = x()

f = hi()
print f.z
I want it to print 2.
Basically I want to forward any calls to that class to another class.
The simplest approach is implementing __getattr__:
class hi():
    def __init__(self):
        self.child = x()

    def __getattr__(self, attr):
        return getattr(self.child, attr)
This has certain disadvantages, but it might work for your limited use case. You might want to implement __setattr__ as well. (Note there is no __hasattr__ special method; hasattr already works through __getattr__.)
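Putting the pieces together, a runnable sketch of the delegation (new-style classes assumed):

```python
class x(object):
    def __init__(self):
        self.z = 2

class hi(object):
    def __init__(self):
        self.child = x()

    def __getattr__(self, attr):
        # __getattr__ is only called when normal lookup fails, so
        # hi's own attributes (like child) are unaffected.
        return getattr(self.child, attr)

f = hi()
print(f.z)  # -> 2, forwarded to f.child.z
```

Attributes missing from both hi and its child still raise AttributeError, since getattr on the child fails the same way.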
The Python syntax is:
class hi(x):
To say that hi inherit (should be a child of) x.
Note: since hi defines its own __init__, x.__init__ is not run automatically; for hi to have the attribute z (which is set in x's __init__), x.__init__ needs to be called explicitly. That is,
class hi(x):
    def __init__(self):
        x.__init__(self)
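Put together, the inheritance version of the question's example:

```python
class x(object):
    def __init__(self):
        self.z = 2

class hi(x):
    def __init__(self):
        x.__init__(self)   # run x's __init__ so self.z exists

f = hi()
print(f.z)  # -> 2
```

Here f.z works because f really is an x (no forwarding involved), which also means isinstance(f, x) is True.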
I have the following structure:
class foo(object):
    class bar(object):
        def __init__(self, parent):
            self._parent = parent  # this
        def worker(self):
            return self._parent.name

    def __init__(self, name):
        self.name = name

    def giveNamePointer(self):
        return self.bar(self)  # and this
This works fine; however, I was wondering if there is an implicit or easier way to get a reference to the creating instance in the special case that the created instance is of a class defined inside the creating class.
edit: could this help me: implementing descriptors, and if so, how?
No. Explicit is better than implicit.
(There's nothing special about defining a class inside another class.)