I am using motor with tornado. I have the following class:
class N():
    def __init__(self, collectionName, host='localhost', port=27017):
self.con=motor.MotorClient(host,port)
self.xDb=self.con.XDb
setattr(self,collectionName,self.xDb[collectionName])
This is actually a parent class that I want to extend. The child class will call this class's __init__ to set the collection name. The problem is that I also have some other methods in this class, e.g.:
@tornado.gen.coroutine
def dropDB(self):
yield self.xDb.drop_collection(self.COLLECTION??)
The above is broken because I set the collection attribute dynamically in __init__. What is a way I can determine which self attribute I set, so that I can use it in the base methods?
Set another variable:
class N():
def __init__(self, collectionName, host='localhost', port=27017):
# ... your existing code ...
self.collectionName = collectionName
    @tornado.gen.coroutine
def dropDB(self):
yield self.xDb.drop_collection(self.collectionName)
Since drop_collection takes a name or a MotorCollection object, there are other ways you could store this data on self, but the way I showed might be the easiest.
http://motor.readthedocs.io/en/stable/api/motor_database.html#motor.motor_tornado.MotorDatabase.drop_collection
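If you would rather not duplicate the name, another option (a minimal sketch; the fixed collection attribute and the Users subclass are my own illustration, not from the question) is to store the MotorCollection object under a single well-known attribute and pass it to drop_collection directly:
import motor
import tornado.gen


class N(object):
    def __init__(self, collectionName, host='localhost', port=27017):
        self.con = motor.MotorClient(host, port)
        self.xDb = self.con.XDb
        # One well-known attribute, so base-class methods always know where to look.
        self.collection = self.xDb[collectionName]

    @tornado.gen.coroutine
    def dropDB(self):
        # drop_collection accepts a MotorCollection object as well as a name
        yield self.xDb.drop_collection(self.collection)


class Users(N):
    def __init__(self, **kwargs):
        super().__init__('users', **kwargs)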
Currently I am writing a Python program with a plugin system. To develop a new plugin, a new class must be created that inherits from a base plugin class. It should also be possible to add optional functionality via mixins. Some mixins provide new functions; others access built-in types of the base class and can act on or change them.
Here is a simplified structure:
import abc
import threading
class Base:
def __init__(self):
self.config = dict()
if hasattr(self, "edit_config"):
self.edit_config()
def start(self):
"""Starts the Plugin"""
if hasattr(self, "loop"):
self._loop()
class AMixin:
def edit_config(self):
self.config["foo"] = 123
class BMixin(abc.ABC):
def _loop(self):
thread = threading.Thread(target=self.loop, daemon=True)
thread.start()
    @abc.abstractmethod
def loop(self):
"""Override this method with a while true loop to establish a ongoing loop
"""
pass
class NewPlugin(Base, AMixin, BMixin):
def loop(self):
while True:
print("Hello")
plugin = NewPlugin()
plugin.start()
What is the best way to tackle this problem?
EDIT: I need to make my question more specific. The question is whether the above is the Pythonic way to do this, and whether it is possible to ensure that the mixins are only used in conjunction with the Base class. Additionally, it would be good to get support in an IDE like VSCode (e.g. autocomplete) when accessing built-in types of the Base class, like in AMixin, without inheriting from it, of course.
If you want to allow but not require subclasses to define some behaviour in a method called by the base class, the simplest way is to declare the method in the base class, have an empty implementation, and just call the method unconditionally. This way you don't have to check whether the method exists before calling it.
class Base:
def __init__(self):
self.config = dict()
self.edit_config()
def start(self):
self.loop()
def edit_config(self):
pass
def loop(self):
pass
class AMixin:
def edit_config(self):
self.config["foo"] = 123
class NewPlugin(AMixin, Base):
def loop(self):
for i in range(10):
print("Hello")
Note that you have to write AMixin before Base in the list of superclasses, so that its edit_config method overrides the one from Base, and not the other way around. You can avoid this by writing class AMixin(Base): so that AMixin.edit_config always overrides Base.edit_config in the method resolution order.
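A quick way to see why the ordering matters is to inspect the MRO; the class names below are purely illustrative:
class Base:
    def edit_config(self):
        pass  # default no-op

class AMixin:
    def edit_config(self):
        print("AMixin.edit_config")

class GoodPlugin(AMixin, Base):   # mixin first: AMixin.edit_config wins
    pass

class BadPlugin(Base, AMixin):    # base first: the no-op wins
    pass

print([c.__name__ for c in GoodPlugin.__mro__])  # ['GoodPlugin', 'AMixin', 'Base', 'object']
print([c.__name__ for c in BadPlugin.__mro__])   # ['BadPlugin', 'Base', 'AMixin', 'object']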
If you want to require subclasses to implement one of the methods, then you can raise TypeError() instead of pass in the base class's method.
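For instance, a required hook could look like this sketch, building on the Base class above:
class Base:
    def __init__(self):
        self.config = dict()
        self.edit_config()

    def start(self):
        self.loop()

    def edit_config(self):
        # optional hook: overriding is allowed but not required
        pass

    def loop(self):
        # required hook: a subclass that forgets to override it fails loudly
        raise TypeError(f"{type(self).__name__} must override loop()")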
I would move the calls to the methods provided by the mix-ins to __init__ methods defined by those classes.
import abc
import threading
class Base:
    def __init__(self, **kwargs):
        self.config = dict()   # set up shared state before the cooperative super() chain runs
        super().__init__(**kwargs)
class AMixin:
def __init__(self, **kwargs):
super().__init__(**kwargs)
self.edit_config()
def edit_config(self):
self.config["foo"] = 123
class BMixin(abc.ABC):
def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self._loop()   # start the background loop once initialisation reaches this mixin
def _loop(self):
thread = threading.Thread(target=self.loop, daemon=True)
thread.start()
    @abc.abstractmethod
def loop(self):
"""Override this method with a while true loop to establish a ongoing loop
"""
pass
class NewPlugin(Base, AMixin, BMixin):
pass
When you instantiate a concrete subclass of NewPlugin, Base.__init__, AMixin.__init__, and BMixin.__init__ will be called in that order.
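A concrete plugin (HelloPlugin is a made-up name) then only needs to supply loop, and instantiating it walks the whole cooperative __init__ chain:
class HelloPlugin(NewPlugin):
    def loop(self):
        while True:
            print("Hello")

plugin = HelloPlugin()   # Base, AMixin and BMixin all run their __init__ cooperatively,
                         # and BMixin starts the daemon thread that calls loop()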
In Python, I'm using inheritance for a class. The initial init for the main parent class is below:
def __init__(self, Date = None):
self.Date = Date
self.DatabaseClass = Database()
self.Connection = self.DatabaseClass.databaseConnection()
I've inherited the class in a child class, but am wondering what the correct approach is to inherit the DatabaseClass and Connection variables, i.e., what should go in the child's def __init__?
You just need to call the inherited __init__ method from your own class's __init__ method.
class Child(Parent):
    def __init__(self, other, arguments, Date=None):
super().__init__(Date)
# ...
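For example (Database here is a stub standing in for the real class, and Table is a made-up extra argument), the child picks up DatabaseClass and Connection just by calling the parent's __init__:
class Database:
    """Stand-in for the real Database class from the question."""
    def databaseConnection(self):
        return "connection-object"


class Parent:
    def __init__(self, Date=None):
        self.Date = Date
        self.DatabaseClass = Database()
        self.Connection = self.DatabaseClass.databaseConnection()


class Child(Parent):
    def __init__(self, Table=None, Date=None):
        super().__init__(Date)   # sets Date, DatabaseClass and Connection
        self.Table = Table       # child-specific state


child = Child(Table="orders", Date="2021-01-01")
print(child.Connection)          # inherited attribute, no extra wiring needed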
from itertools import count
class myobject(object):
id=count(0)
def __init__(self):
self.ID=next(self.id)
self.dat=[]
class bucket(object):
def __init__(self):
self.container_object=self.Container_object
class Container_object(object):
self.container={}
self.contained_objects=self.container.keys()
def create_object(myobject):
self.container[myobject.ID]=object_data
I am looking to create a container object within class bucket such that I can create different instances of myobject within class bucket.
That way, when create_object creates a new object, it can be accessed and contained within bucket.Container_object, and bucket.Container_object.ID would be an instance of myobject.
PS: I am new to classes and am still trying to understand how to use them, so the question most likely shows this. Please feel free to point out any misunderstandings. Thanks.
I think this is what you are looking for.
A class Bucket which contains multiple instances of class MyObject. Since you don't specify any particular peculiarities of your Container object, I will use a simple dictionary.
from itertools import count
class MyObject(object):
id=count(0)
def __init__(self):
self.ID=next(self.id)
self.dat=[]
class Bucket(object):
def __init__(self):
self.contained_objects={}
    def add_object(self, myobject):
self.contained_objects[myobject.ID]=myobject.dat
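Usage would then look something like this (the IDs come from the shared itertools counter):
bucket = Bucket()
obj_a = MyObject()          # e.g. ID 0
obj_b = MyObject()          # e.g. ID 1
bucket.add_object(obj_a)
bucket.add_object(obj_b)
print(bucket.contained_objects)   # e.g. {0: [], 1: []}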
If you want to wrap the collection inside some other object for some reason, you define the WrapperContainer class outside the Bucket class and instantiate a WrapperContainer inside it.
from itertools import count
class MyObject(object):
id=count(0)
def __init__(self):
self.ID=next(self.id)
self.dat=[]
class WrapperContainer(object):
def __init__(self):
self.collection = {}
    def add_object(self, my_object):
        self.collection[my_object.ID] = my_object.dat
    def do_whatever_you_want(self):
        pass
class Bucket(object):
def __init__(self):
self.container_object= WrapperContainer() #a specific instance
    def add_object(self, myobject):
        self.container_object.add_object(myobject)
Finally, you can think of your container class as a subclass of an already existing collection (list, dictionary, or anything else) to inherit its features.
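As a rough sketch of that last idea, the container could itself be a dict subclass:
from itertools import count


class MyObject(object):
    id = count(0)

    def __init__(self):
        self.ID = next(self.id)
        self.dat = []


class Bucket(dict):
    """A Bucket is simply a dict mapping each MyObject's ID to its data."""
    def add_object(self, my_object):
        self[my_object.ID] = my_object.dat


bucket = Bucket()
bucket.add_object(MyObject())
print(bucket)   # e.g. {0: []}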
I am trying to create a class (MySerial) that instantiates a serial object so that I can write/read to a serial device (UART). There is an instance method that is a decorator which wraps a function belonging to a completely different class (App), so the decorator is responsible for writing to and reading from the serial buffer.
If I create an instance of MySerial inside the App class, I can't use the decorator instance method that is created from MySerial.
I have tried foregoing instance methods and using class methods as explained in this second answer, but I really need to instantiate MySerial, thus create an instance using __init__.
How can this be accomplished? Is it impossible?
Create a decorator that is an instance method.
Use this decorator within another class
class MySerial():
def __init__(self):
pass # I have to have an __init__
def write(self):
pass # write to buffer
def read(self):
pass # read to buffer
def decorator(self, func):
def func_wrap(*args, **kwargs):
            self.write(func(*args, **kwargs))
return self.read()
return func_wrap
class App():
def __init__(self):
self.ser = MySerial()
    @self.ser.decorator # <-- does not work here.
def myfunc(self):
# 'yummy_bytes' is written to the serial buffer via
# MySerial's decorator method
return 'yummy_bytes'
if __name__ == '__main__':
app = App()
You can use a staticmethod to wrap decorator. The inner func_wrap function of decorator contains an additional parameter in its signature: cls. cls can be used to access the ser attribute of the instance of App, and then the desired methods write and read can be called from cls.ser. Also, note that in your declarations, MySerial.write takes no parameters but is passed the result of the wrapped function. The code below uses *args to prevent the TypeError which would otherwise be raised:
class MySerial():
def __init__(self):
pass # I have to have an __init__
def write(self, *args):
pass # write to buffer
def read(self):
pass # read to buffer
    @staticmethod
def decorator(func):
def func_wrap(cls, *args, **kwargs):
cls.ser.write(func(cls, *args, **kwargs))
return cls.ser.read()
return func_wrap
class App():
def __init__(self):
self.ser = MySerial()
    @MySerial.decorator
def myfunc(self):
# 'yummy_bytes' is written to the serial buffer via
# MySerial's decorator method
return 'yummy_bytes'
App().myfunc()
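To see the round trip, you could give write and read visible bodies in a throwaway subclass; EchoSerial below is hypothetical and exists only to check the wiring:
class EchoSerial(MySerial):
    def write(self, *args):
        print("write() received:", args)

    def read(self):
        return "response-from-device"


class App():
    def __init__(self):
        self.ser = EchoSerial()

    @MySerial.decorator
    def myfunc(self):
        return 'yummy_bytes'


print(App().myfunc())
# write() received: ('yummy_bytes',)
# response-from-device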
The reason this does not work is that you are referring to self in the class body, where it is not defined. Here are two solutions.
Store the serial object as a class attribute
If you store the MySerial instance as a class attribute, then it will still be accessible in the class body:
class App():
ser = MySerial()
    @ser.decorator
def myfunc(self):
return 'yummy_bytes'
Decorate on each instantiation
Or if you need a different MySerial instance for every App instance, then you will need to wait for the instance to be created before defining an instance attribute my_func. This means the function is decorated dynamically on every instance creation, in which case the @ decorator syntax must be replaced by a function call.
class App():
def __init__(self):
self.ser = MySerial()
self.my_func = self.ser.decorator(self.myfunc)
def myfunc(self):
return 'yummy_bytes'
This solution generalizes to decorating multiple methods or conditionally deactivating serializing, say in a test environment.
import env
class App():
def __init__(self):
self.ser = MySerial()
to_decorate = [] if env.test else ['myfunc']
for fn_name in to_decorate:
fn = getattr(self, fn_name)
setattr(self, fn_name, self.ser.decorator(fn))
There's a lot of hidden pitfalls that make this a risky design, however it is a great learning example.
First off, the call to 'self' when decorating fails because there is no self at that scope. It only exists inside the methods. Now that the easy one is out of the way...
myfunc is an attribute of the App class. When you create an instance of App, it is always that one function that gets called; even when it is bound as a method, that binding wraps the same underlying function.
a1 = App()
a2 = App()
assert a1.myfunc.__func__ is a2.myfunc.__func__
assert id(a1.myfunc) == id(a2.myfunc) # the bound method objects won't compare identical, but their ids show they wrap the same function
This is why self is needed to get a unique namespace for the instance. It is also why you won't be able to get a decorator that is unique to the instance in this way.
Another way to think about it is that the class must be defined before you can produce instances. Therefore, you can't use an instance in the definition of a class.
Solution
The decorator needs to be written in a way that it won't store any instance attributes. It will access the App instance attributes instead.
class MySerial():
def __init__(self):
pass # Possibly don't need to have an __init__
def write(self, serial_config):
pass # write to buffer
def read(self, serial_config):
pass # read to buffer
def decorator(self, func):
        def func_wrap(self_app: "App", *args, **kwargs):
            self.write(func(self_app, *args, **kwargs), self_app.serial_config)
return self.read(self_app.serial_config)
return func_wrap
ser = MySerial()
class App():
def __init__(self, serial_config):
self.serial_config = serial_config # This is the instance data for MySerial
    @ser.decorator
def myfunc(self):
# 'yummy_bytes' is written to the serial buffer via
# MySerial's decorator method
return 'yummy_bytes'
if __name__ == '__main__':
    app = App(serial_config={})  # pass whatever per-instance settings MySerial needs
Now I'm assuming MySerial was going to have a unique file, or port, or something per instance of App; this is what would be recorded in serial_config. This may not be elegant if the stream is opening and closing, but you should be able to improve this for your exact application.
I'm trying to provide framework which allows people to write their own plugins. These plugins are basically derived classes. My base class needs some variables to initialize, how can I initialize my base class without having to let my derived class feed the variable in the base class initialization?
#!/bin/python
class BaseClass():
def __init__(self,config):
self.config=config
def showConfig(self):
print "I am using %s" % self.config
class UserPlugin(BaseClass):
def __init__(self,config):
BaseClass.__init__(self,config)
def doSomething(self):
print "Something"
fubar = UserPlugin('/tmp/config.cfg')
fubar.showConfig()
My goal is to avoid the need to define the config parameter in the UserPlugin class, since this is something I don't want the user who writes a plugin to be bothered with.
You can use argument lists to pass any remaining arguments to the base class:
class UserPlugin(BaseClass):
def __init__(self, *args, **kwargs):
BaseClass.__init__(self, *args, **kwargs)
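Putting that together with your code, a sketch could look like the following; the framework is still the one that decides which config to pass in, while the plugin author only forwards arguments:
class BaseClass:
    def __init__(self, config):
        self.config = config

    def showConfig(self):
        print("I am using %s" % self.config)


class UserPlugin(BaseClass):
    def __init__(self, *args, **kwargs):
        # The plugin author never names config; it is forwarded untouched.
        BaseClass.__init__(self, *args, **kwargs)

    def doSomething(self):
        print("Something")


# The framework supplies the config when it loads the plugin.
fubar = UserPlugin('/tmp/config.cfg')
fubar.showConfig()   # I am using /tmp/config.cfg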
Based on your Pastebin code, how about this? This avoids using a separate global, instead using a class attribute, which is accessible as a member to all derived classes and their instances.
#!/bin/python
class BaseClass():
config = '/tmp/config.cfg'
def __init__(self):
pass
def showConfig(self):
print "I am using %s" % self.config
class UserPlugin(BaseClass):
def __init__(self):
BaseClass.__init__(self)
def doSomething(self):
print "Something"
fubar = UserPlugin()
fubar.showConfig()
This was the other way to do it that I mentioned before. Keep in mind that if you want to change the value of BaseClass.config itself, you should access it directly (i.e. BaseClass.config = '/foo/path'); otherwise, you wind up creating a separate config attribute on the UserPlugin instance, leaving BaseClass.config unchanged.
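A quick illustration of that pitfall, using the classes above:
fubar = UserPlugin()

BaseClass.config = '/foo/path'   # changes the shared class attribute
fubar.showConfig()               # I am using /foo/path

fubar.config = '/bar/path'       # creates an instance attribute that shadows it
fubar.showConfig()               # I am using /bar/path
print(BaseClass.config)          # still /foo/path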