Abstraction layer on top of SQLAlchemy - Python

I am on a Python 3.6 project that uses SQLAlchemy, and we want another layer of abstraction on top of SQLAlchemy. This will allow us to more easily replace SQLAlchemy with another library later if desired.
In this example it is the DbHelper class:
dbhelper.py
from dbconn import dbconn
from models.animals import Dogs

class DbHelper():

    @staticmethod
    def get_dog_by_nickname(nickname):
        return dbconn.session.query(Dogs).get(nickname)
main.py
from dbhelper import DbHelper

class Dog():
    def __init__(self, nickname, breed, age):
        self.nickname = nickname
        self.breed = breed
        self.age = age

    @classmethod
    def using_nickname(cls, nickname):
        row = DbHelper.get_dog_by_nickname(nickname)
        return Dog(row.id, row.breed, row.age)

dog = Dog.using_nickname('Tom')
Question: Is there a better approach than creating the DbHelper class purely as a container with nothing but staticmethods in it? I have read that this is not Pythonic.
Converting all the staticmethod functions in dbhelper.py to regular methods will populate the namespace when we do from dbhelper import *

Yes, there is a better solution than creating a class full of staticmethods: Just don't create the class, and make them all module-level functions:
from models.animals import Dogs

dbconn = ...

def get_dog_by_nickname(nickname):
    return dbconn.session.query(Dogs).get(nickname)
The only real point of a class full of staticmethods is to provide a namespace for all those functions to live in. A module already is a namespace for all those functions to live in. And, because it's more directly supported by the language syntax, it means you can more easily go around the namespacing when you want to do so explicitly (e.g., from dbhelper import ...).
You say:
Converting all the staticmethod functions in dbhelper.py to regular methods will populate the namespace when we do from dbhelper import *
But the answer to that is obvious: Don't from dbhelper import *, just import dbhelper. Then, all the code that you would have written with DbHelper.spam() becomes dbhelper.spam().
If you really want two levels of namespace, you can just use a package with a submodule, rather than a module with a class. But I can't see any good reason you'd need two levels of namespace here.
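For concreteness, a minimal sketch of that layout (the dbhelpers package and dogs submodule names here are hypothetical, not from the original code):

# dbhelpers/__init__.py   -- can be empty
# dbhelpers/dogs.py       -- holds the dog-related helpers:
#     def get_dog_by_nickname(nickname): ...

# caller code: two levels of namespace, package.module.function
import dbhelpers.dogs

dog_row = dbhelpers.dogs.get_dog_by_nickname('Tom')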
Another alternative (as suggested by juanpa.arrivillaga in the comments) is to turn this into a real class, where each instance (even if there will probably only be one in your real code) has its own self.dbconn instead of using a module global. That dbconn can either be passed into the __init__, or constructed directly inside the __init__. For example:
class DbHelper:
    def __init__(self, dbname, otherdbparam):
        self.dbconn = dblib.connect(dbname, otherdbparam)

    def get_dog_by_nickname(self, nickname):
        return self.dbconn.session.query(Dogs).get(nickname)
Notice that we're using normal methods, and accessing a normal instance variable. This is what a class is for—to wrap up some state together with the methods that transform that state.
How do you decide between the two? Well, if there's only ever going to be one dbconn per process, they're functionally equivalent, but conceptually they have different connotations. If you think of a DbHelper as being the database, both the connection and the behavior around it, it should be a class, and you should instantiate an instance of that class and use it that way. If you think of it as just a bunch of helper functions that operate on a dbconn that has its own independent existence, it should be a flat module.
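A quick usage sketch of the instance-based version (dblib and the constructor arguments are placeholders carried over from the example above, not a real API):

# Each helper instance owns its own connection; no module-level global.
db = DbHelper('animals', 'db.example.internal')
rex = db.get_dog_by_nickname('Rex')

# Tests (or a second database) just build another instance.
test_db = DbHelper('animals_test', 'db.example.internal')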
In some languages (like Java), there is another point to using a class full of staticmethod-equivalents: the language either doesn't support "free functions", or makes them a completely different kind of thing from methods. But that isn't true in Python.
While we're at it, do you want your module to export Dogs and dbconn as a "public" part of the interface? If not, you should add an __all__ spec to the top of your module, like this:
from models.animals import Dogs

__all__ = [
    'get_dog_by_nickname',
    ...
]

dbconn = ...

def get_dog_by_nickname(nickname):
    return dbconn.session.query(Dogs).get(nickname)
Or, alternatively, name all your "private" module members with underscores:
from models.animals import Dogs as _Dogs

_dbconn = ...

def get_dog_by_nickname(nickname):
    return _dbconn.session.query(_Dogs).get(nickname)
Either way, users of your module can still access the "private" data, but it won't show up in from dbhelper import *, help(dbhelper), the default autocomplete in many IDEs, etc.

Related

How can I resolve a circular import of two Python classes in the same module?

I have two tightly-coupled Python classes that need references to each other (at the class, not instance, level). How can I resolve the circular imports? Ideally I'd like to be able to make it work either within the same module or between two distinct modules, but I'll settle for one or the other.
# yin_yang.py
class MyYin(Yin):
    __yang__ = MyYang   # NameError: MyYang is not defined yet

class MyYang(Yang):
    __yin__ = MyYin
You could set the class attributes for one or both classes after they have been declared.
class MyYin(Yin):
    pass

class MyYang(Yang):
    __yin__ = MyYin

MyYin.__yang__ = MyYang
While @phillip-martin's answer is the most Pythonic one, there is an alternative way to accomplish the task:
from werkzeug.local import LocalProxy

class MyYin:
    __yang__ = LocalProxy(lambda: MyYang)
    foo = 42

class MyYang:
    __yin__ = LocalProxy(lambda: MyYin)
    bar = 9002

print(MyYin.__yang__.bar)
print(MyYang.__yin__.foo)
The magic behind the LocalProxy trick comes from overriding __getattr__, __setattr__, and the other special methods so that every operation is forwarded to the object returned by the callable. Check out the details in the Werkzeug repo.
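As a rough illustration of the idea (a minimal sketch, not Werkzeug's actual implementation), an attribute-forwarding proxy can be hand-rolled like this:

class LazyProxy:
    """Resolve the wrapped object only when an attribute is actually used."""
    def __init__(self, factory):
        self._factory = factory

    def __getattr__(self, name):
        # Only called for attributes not found on the proxy itself,
        # so the _factory lookup above never recurses into here.
        return getattr(self._factory(), name)

class A:
    other = LazyProxy(lambda: B)   # B does not exist yet; that's fine

class B:
    answer = 42

print(A.other.answer)   # 42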
When the Python interpreter reaches a class statement, it creates a new scope and executes the class body immediately; i.e., all class attributes are evaluated at the moment the class statement is executed.
You can avoid this in a really simple manner:
class MyYin(Yin):
    pass

class MyYang(Yang):
    __yin__ = MyYin

MyYin.__yang__ = MyYang
or
class MyYang(Yang):
    pass

class MyYin(Yin):
    __yang__ = MyYang

MyYang.__yin__ = MyYin

Can a class be renamed programmatically?

I have a Vehicle class. I want to rename it to Rover. I know we can create another reference to it, but I don't want the class's name to be Vehicle anymore.
I played with __name__, but had no success.
class Vehicle:
    pass

st = "Rover"
Vehicle.__name__ = st
obj1 = Rover()   # NameError: the name Rover is never bound
Expected: the Vehicle class renamed to Rover from within the program, without any file manipulation.
In the comments, you have a (very slightly mangled) quote from https://docs.python.org/3/tutorial/classes.html:
This provides semantic for importing and renaming
Here's the actual quote, with a bit more context:
As in Smalltalk, classes themselves are objects. This provides semantics for importing and renaming.
What this is talking about is the fact that we can have a Python library with a class in the library:
# lib_k.py
class Klass:
    ... definitions ...
Then, in some other Python module, we can write:
from lib_k import Klass as LibKClass
which is mainly useful if we're also going to do:
from lib_l import Klass as LibLClass
and then write code like:
def f(args):
    obj_k = LibKClass(...)
    obj_l = LibLClass(...)
I personally prefer to write:
import lib_k
import lib_l

def f(args):
    obj_k = lib_k.Klass(...)
    obj_l = lib_l.Klass(...)
but both ways are allowed, and which to use is something of a matter of taste, rather than correctness.
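Within a single module, the same "classes are objects" idea also gives a minimal sketch of what the question seems to be after (rebinding the class to a new name rather than editing any file):

class Vehicle:
    pass

# Bind the class object to the new name and update its introspection name.
Rover = Vehicle
Rover.__name__ = "Rover"
del Vehicle                    # optional: drop the old name from the namespace

obj1 = Rover()
print(type(obj1).__name__)     # Rover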

Using a metaclass to substitute a class definition?

Python 3.6
I'm trying to modify the behavior of a third party library.
I don't want to directly change the source code.
Considering this code below:
class UselessObject(object):
    pass

class PretendClassDef(object):
    """
    A class to highlight my problem
    """
    def do_something(self):
        # A lot of code here
        result = UselessObject()
        return result
I'd like to substitute my own class for UselessObject
I'd like to know if using a metaclass in my module to intercept the creation of UselessObject is a valid idea?
EDIT
This answer posted by Ashwini Chaudhary on the same question may be of use to others, as well as the answer below.
P.S. I also discovered that module-level __metaclass__ doesn't work in Python 3, so the answer to my initial question of whether it is 'a valid idea' is no.
FWIW, here's some code that illustrates Rawing's idea.
class UselessObject(object):
    def __repr__(self):
        return "I'm useless"

class PretendClassDef(object):
    def do_something(self):
        return UselessObject()

# -------

class CoolObject(object):
    def __repr__(self):
        return "I'm cool"

UselessObject = CoolObject

p = PretendClassDef()
print(p.do_something())
output
I'm cool
We can even use this technique if CoolObject needs to inherit UselessObject. If we change the definition of CoolObject to:
class CoolObject(UselessObject):
    def __repr__(self):
        s = super().__repr__()
        return "I'm cool, but my parent says " + s
we get this output:
I'm cool, but my parent says I'm useless
This works because the name UselessObject has its old definition when the CoolObject class definition is executed.
This is not a job for metaclasses.
Rather, Python allows you to do this through a technique called "monkeypatching": at run time, you substitute one object for another.
In this case, you'd replace thirdparty.UselessObject with your CoolObject before calling thirdparty.PretendClassDef.do_something.
The way to do that is a simple assignment.
So, supposing the example snippet you gave in the question is the thirdparty module in the library, your code would look like:
import thirdparty

class CoolObject:
    # Your class definition here
    pass

thirdparty.UselessObject = CoolObject
The thing you have to take care of is to change the name UselessObject at the place where your target module actually looks it up. If, for example, PretendClassDef and UselessObject are defined in different modules, it matters whether UselessObject is imported with from useless import UselessObject (in which case the example above is fine) or with import useless and later used as useless.UselessObject; in the second case, you have to patch the name on the useless module.
Also, Python's unittest.mock has a nice patch callable that can perform the monkeypatching and undo it automatically, which is useful if you want the modification to apply only in a limited scope, such as inside one of your functions or inside a with block. That might be the case if you don't want to change the behavior of the thirdparty module in other sections of your program.
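A short sketch of that scoped variant, assuming the hypothetical thirdparty module from the example above:

import unittest.mock

import thirdparty

class CoolObject:
    def __repr__(self):
        return "I'm cool"

# The replacement only applies inside the with block and is undone afterwards.
with unittest.mock.patch('thirdparty.UselessObject', CoolObject):
    print(thirdparty.PretendClassDef().do_something())   # I'm cool

print(thirdparty.PretendClassDef().do_something())       # original UselessObject again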
As for metaclasses, they would only be of any use here if you needed to change the metaclass of the class you are replacing, and even then they would only matter if you wanted to insert behavior into classes that inherit from UselessObject. In that case a metaclass would be used to create the local CoolObject, and you would still proceed as above, taking care to perform the monkeypatching before Python runs the class body of any of the derived classes of UselessObject, and being extremely careful with any imports from the thirdparty library (which gets tricky if those subclasses are defined in the same file).
This is just building on PM 2Ring's and jsbueno's answers with more context:
If you happen to be creating a library for others to use as a third-party library (rather than consuming one yourself), and you need CoolObject to inherit from UselessObject to avoid repetition, the following may help you avoid an infinite recursion error that can occur in some circumstances:
module1.py
class Parent:
    def __init__(self):
        print("I'm the parent.")

class Actor:
    def __init__(self, parent_class=None):
        # In case you don't want it to actually be useless 100% of the time.
        if parent_class is not None:
            global Parent
            Parent = parent_class
        Parent()
module2.py
from module1 import *

class Child(Parent):
    def __init__(self):
        print("I'm the child.")

# There's not necessarily a need to subclass Actor, but in the situation
# I'm thinking of, it seems it would be a common thing.
class LeadActor(Actor):
    def __init__(self):
        Actor.__init__(self, parent_class=Child)

a = Actor(parent_class=Child)   # prints "I'm the child." instead of "I'm the parent."
l = LeadActor()                 # prints "I'm the child." instead of "I'm the parent."
Just be careful that the user knows not to set a different value for parent_class with different subclasses of Actor. I mean, if you make multiple kinds of Actors, you'll only want to set parent_class once, unless you want it to change for all of them.

Can I build an automatic class-factory that works on import?

I have a class-factory F that generates classes. It takes no arguments other than a name. I'd like to be able to wrap this method and use it like this:
from myproject.myfactory.virtualmodule import Foo
"myfactory" is a real module in the project, but I want virtualmodule to be something that pretends to be a module.
Whenever I import something from virtualmodule I want it to build a new class using my factory method and make it look as if that was imported.
Can this be done? Is there a pattern that will allow me to wrap a class-factory as a module?
Thanks!
--
UPDATE0: Why do I need this? It's actually to test a process that will be run on a grid which requires that all classes should be importable.
An instance of the auto-generated class will be serialized on my PC and then deserialized on each of the grid nodes. If the class cannot be imported on a grid node, deserialization will fail.
If I hijack the import mechanism as an interface for making my test-classes, then I know for sure that anything I can import on my PC can be re-created exactly the same on the grid. That will satisfy my test's requirements.
You can stuff an arbitrary object into the sys.modules structure:
import sys

class VirtualModule(object):
    def __init__(self, name):
        self.__name__ = name.rsplit('.', 1)[-1]
        self.__package__ = name
        self.__loader__ = None

    def __getattr__(self, name):
        if name is valid:   # pseudocode: decide whether this name should exist
            # Return dynamic classes here
            return VirtualClass(name)
        raise AttributeError(name)

virtual_module_name = 'myproject.myfactory.virtualmodule'
sys.modules[virtual_module_name] = VirtualModule(virtual_module_name)
The Python import machinery will look up objects using attribute access, triggering the __getattr__ method on your VirtualModule instance.
Do this in the myproject/__init__.py or myproject/myfactory/__init__.py file and you are all set to go. The myproject.myfactory package does need to exist for the import machinery to find the myproject.myfactory.virtualmodule object.
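A quick usage sketch, assuming the registration above runs in myproject/myfactory/__init__.py and the factory can build a class for the requested name:

# Importing the package runs its __init__.py, which registers the virtual module,
# so this attribute lookup hits VirtualModule.__getattr__ and builds Foo on the fly.
from myproject.myfactory.virtualmodule import Foo

obj = Foo()
print(type(obj).__name__)   # Foo

For the pickling use case, the factory would also want to set the generated class's __module__ (and a stable __qualname__) so that the same import can locate the class again on the grid nodes.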

How can I avoid circular imports in Python?

I'm having a problem with circular imports. I have three Python test modules: robot_test.py which is my main script, then two auxiliary modules, controller_test.py and servo_test.py. The idea is that I want controller_test.py to define a class for my microcontroller and servo_test.py to define a class for my servos. I then want to instantiate these classes in robot_test.py. Here are my three test modules:
""" robot_test.py """
from pi.nodes.actuators.servo_test import Servo
from pi.nodes.controllers.controller_test import Controller
myController = Controller()
myServo = Servo()
print myController.ID, myServo.ID
""" controller_test.py """
class Controller():
def __init__(self, id="controller"):
self.ID = id
""" servo_test.py """
class Servo():
def __init__(self, id="servo"):
self.ID = id
If I run robot_test.py, I get the expected printout:
controller servo
However, now comes the twist. In reality, the servo_test.py module depends on controller_test.py by way of robot_test.py. This is because my servo definitions require an already-instantiated controller object before they themselves can be instantiated. But I'd like to keep all the initial instantiations in robot_test.py. So I tried modifying my servo_test.py script as follows:
""" servo_test.py """
from pi.nodes.robots.robot_test import myController
class Servo():
def __init__(self, id="servo"):
self.ID = id
print myController.ID
Of course, I could sense that the circularity was going to cause problems and sure enough, when I now try to run robot_test.py I get the error:
ImportError: Cannot import name Servo
which in turn is caused by servo_test.py returning the error:
ImportError: Cannot import name myController
In C# I would define myController and myServo as static objects in robot_test.py and then I could use them in other classes. Is there any way to do the same in Python? One workaround I have found is to pass the myController object to the Servo class as an argument, but I was hoping to avoid having to do this.
Thanks!
One workaround I have found is to pass the myController object to the Servo class as an argument, but I was hoping to avoid having to do this.
Why ever would you want to avoid it? It's a classic case of a crucial Design Pattern (maybe the most important one that wasn't in the original Gang of Four masterpiece), Dependency Injection.
DI implementation alternatives to having the dependency in the initializer include using a setter method (which completes another crucial non-GoF pattern, two-phase construction, started in __init__); that avoids a different circularity problem, unrelated to imports, where A's instances need a B and B's instances need an A, in each case to complete what is logically the instance's initialization. But when you don't need two-phase construction, initializers are the natural place to inject dependencies.
Beyond the advantages related to breaking circularity, DI facilitates reuse (by generalization: e.g., if and when you need multiple controllers and servos rather than just one of each, it lets you easily control the "pairing up" among them) and testing (it becomes easy to isolate each class for testing purposes by injecting a mock of the other into it, for example).
What's not to like?
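As a rough sketch of the testing benefit (this assumes the constructor-injected Servo shown in the last answer below; Mock simply stands in for a real Controller):

from unittest.mock import Mock

# Hypothetical constructor-injected Servo, as in the last answer below.
class Servo():
    def __init__(self, controller, id="servo"):
        self.ID = id
        self.ctrl = controller

# In a test, inject a fake controller instead of building a real one.
fake_controller = Mock()
fake_controller.ID = "fake-controller"

servo = Servo(fake_controller)
assert servo.ctrl.ID == "fake-controller"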
servo_test.py doesn't actually need myController, since the access is within a function. Import the module instead, and access myController via that module.
import pi.nodes.robots.robot_test as robot_test

class Servo():
    def __init__(self, id="servo"):
        self.ID = id
        print(robot_test.myController.ID)
This will work as long as myController exists before Servo is instantiated.
Pass the instantiated controller object as an __init__ argument when instantiating the Servo.
""" servo_test.py """
class Servo():
def __init__(self,controller,id="servo"):
self.ID = id
self.ctrl = controller
print self.ctrl.ID
""" robot_test.py """
from pi.nodes.actuators.servo_test import Servo
from pi.nodes.controllers.controller_test import Controller
myController = Controller()
myServo = Servo(myController)
