Using ConfigParser in a Python class: good or bad?

Is it bad practice to use a ConfigParser within class methods? Doing this ties the class to the config and makes it less reusable, but it means fewer input arguments in methods; I find long argument lists messy, especially when arguments have to be passed down through multiple layers.
Are there good alternatives (apart from just passing config values as method arguments)? Or a particular pattern people find works well for this?
For example:
# get shared config parser configured by main script
from utils.config_utils import config_parser

class FooClass(object):
    def foo_method(self):
        self._foo_bar_method()

    def _foo_bar_method(self):
        some_property = config_parser.get("root", "FooProperty")
        ...

If you need a lot of arguments in a single class, that might be a symptom that you are trying to do too much with that class (see the Single Responsibility Principle, SRP).
If there really are too many configuration options to pass to a simple class as individual arguments, I would advise abstracting the configuration into a separate class and using that as an argument:
class Configuration(object):
    def __init__(self, config_parser):
        # getint() so the numeric comparison in optionY works
        self.optionA = config_parser.getint("root", "AProperty")
        self.optionB = config_parser.get("root", "BProperty")
        self.optionX = config_parser.get("root", "XProperty")

    @property
    def optionY(self):
        return self.optionX == 'something' and self.optionA > 10

class FooClass(object):
    def __init__(self, config):
        self._config = config

    def _foo_bar_method(self):
        some_property = self._config.optionY
        ...

config = Configuration(config_parser)
foo = FooClass(config)
This way you can reuse your configuration abstraction, or even build different configuration abstractions for different purposes from the same config parser.
You can even improve the configuration class to have a more declarative way to map configuration properties to instance attributes (but that's a more advanced topic).
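For instance, here is a minimal sketch of such a declarative mapping, assuming a made-up ConfigOption descriptor (the option names are reused from above purely for illustration): each option is declared on one line, and the descriptor reads the value from the parser on attribute access.

class ConfigOption:
    """Descriptor that lazily reads one option from the wrapped parser."""
    def __init__(self, section, option, getter="get"):
        self.section = section
        self.option = option
        self.getter = getter

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        get = getattr(obj._parser, self.getter)   # e.g. get, getint, getboolean
        return get(self.section, self.option)

class Configuration(object):
    optionA = ConfigOption("root", "AProperty", getter="getint")
    optionB = ConfigOption("root", "BProperty")
    optionX = ConfigOption("root", "XProperty")

    def __init__(self, config_parser):
        self._parser = config_parser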

Related

Build a Python library that depends on one specific class from where this library is going to be used

I'm building a Python library magic_lib in which I need to instantiate a Python class (let's call it SomeClass) which is defined in the Python application that would import magic_lib.
What's the appropriate way to use/work on SomeClass when I develop magic_lib, since I don't have SomeClass in the magic_lib repo?
I'm thinking of creating a dummy SomeClass like this, and then excluding it during packaging:
from typing import Any

class SomeClass:
    def __init__(self, *args: Any, **kwargs: Any):
        pass
I'm wondering if this is the right approach. If not, any suggestions on how I could approach this problem?
Thanks.
Additional thoughts: maybe I could use importlib like this?
import importlib

my_module = importlib.import_module('some.module.available.in.production')
some_class = my_module.SomeClass()
Here is a more specific example:
Let's say I have two repos: workflows and magic_lib. Within workflows, it has defined a class named Task. Generally, we define tasks directly within the workflows repo. Everything works just fine. Now, let's say, I want to use magic_lib to programmatically define tasks in the workflows repo. Something like the following in the workflows repo:
from magic_lib import Generator

tasks: List[Task] = Generator().generate_tasks()
In order to do that, within magic_lib, I need to somehow have access to the class Task so that I can have it returned through the function generate_tasks(). I cannot really import Task defined in workflows from magic_lib. My question is how I can have access to Task within magic_lib.
Original question:
In Python, there are decorators:
from <MY_PACKAGE> import add_method

@add_method
class MyClass:
    def old_method(self):
        print('Old method')
Decorators are functions which take classes/functions/... as arguments:
def add_method(cls):
    class Wrapper(cls):
        def new_method(self):
            print('New method')
    return Wrapper
MyClass is passed as the cls argument to the add_method decorator function. The function can return a new class which inherits from MyClass:
x = MyClass()
x.old_method()
x.new_method()
We can see that the method has been added. YAY!
So to recap, decorators are a great way to pass your user's custom class to your library. Decorators are just functions so they are easy to handle.
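To make that concrete for the Task example above: a registration decorator would let the workflows repo hand its class to magic_lib without magic_lib ever importing it. A minimal sketch, assuming made-up names (register_task_class and _registry are illustrations, not an existing magic_lib API):

# magic_lib side
_registry = {}

def register_task_class(cls):
    _registry['task'] = cls   # remember the user's class
    return cls                # return the class unchanged

class Generator:
    def generate_tasks(self):
        task_cls = _registry['task']          # look up the registered class
        return [task_cls() for _ in range(5)]

# workflows side
from magic_lib import register_task_class

@register_task_class
class Task:
    ...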
Modified question:
Classes can be passed to functions and methods as arguments:
from magic_lib import generate_five_instances

tasks: List[Task] = generate_five_instances(Task)
Inside magic_lib, generate_five_instances only needs the class object:
def generate_five_instances(cls):
    return [cls() for _ in range(5)]
If you come from another language, you might find this weird, but classes are FIRST CLASS CITIZENS in Python. That means you can assign them to variables and pass them as arguments.
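The same idea applies to the original Generator example: instead of magic_lib knowing about Task, the workflows repo can inject the class when constructing the Generator. A minimal sketch; the constructor parameter is my assumption, not the actual magic_lib interface:

from typing import List, Type

class Generator:
    def __init__(self, task_cls: Type):
        self._task_cls = task_cls   # the caller injects its own class

    def generate_tasks(self) -> List:
        return [self._task_cls() for _ in range(5)]

# in the workflows repo:
tasks = Generator(Task).generate_tasks()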

Are classes that take an argument with variable type bad architecture

I'm working on a code base that has multiple Python modules that provide specific functionality each having a class. The classes are imported elsewhere in the code and they take a single argument which is a custom parameters object that is created from a configuration file.
This works fine in the application, but it's not great for importing the classes on their own to use their functionality elsewhere, because you would have to create a parameters object for each class, even if the particular class only has a single parameter.
To simplify, my idea is to check the type of the single argument:
- if it's a parameters object, proceed as already implemented
- if it's a string, instantiate the class in a custom way
class Ruler:
    def __init__(self, parameters):
        if isinstance(parameters, paramsObject):
            self.config = parameters
        elif isinstance(parameters, str):
            self.length = parameters
After this I could handle ruler = Ruler('30cm') without needing to create a parameters object.
The question is: is that good architecture, and are there principles I'm missing here?
I would say that you have proposed an anti-pattern solution to an anti-pattern problem.
It is somewhat unhelpful that the existing architecture (over)uses paramsObject. After all, named function parameters are there for a reason, and this just obfuscates what Ruler really needs in order to instantiate. It isn't much different from having all functions take *args and **kwargs.
Your proposed solution is a sort of manual function overloading, which Python doesn't have, given its dynamic type system. Python has duck typing, which, to paraphrase, says that if it walks like a paramsObject and quacks like a paramsObject then it is a paramsObject.
In other words, the simpler solution would be to work out which values Ruler is looking for in parameters and add only those to a new class:
class RulerParameter:
    def __init__(self, length):
        self.length = length

class Ruler:
    def __init__(self, parameters):
        self.config = parameters

    def get_length(self):
        return self.config.length

my_ruler = Ruler(RulerParameter(30))
print(my_ruler.get_length())
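For completeness, a minimal sketch of the named-parameter route mentioned above: give Ruler an explicit length argument, and keep a small adapter for the existing config-driven call sites (the from_params classmethod is hypothetical, not part of the original code base):

class Ruler:
    def __init__(self, length):
        self.length = length   # the one value Ruler actually needs

    @classmethod
    def from_params(cls, parameters):
        # adapter for the existing parameters-object call sites;
        # assumes the object exposes a .length attribute
        return cls(parameters.length)

ruler = Ruler('30cm')   # standalone use, no parameters object needed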

Abstraction Layer On Top of Sqlalchemy

I am on a Python 3.6 project that uses Sqlalchemy, where we want another layer of abstraction on top of Sqlalchemy so we can more easily replace it with another library if desired.
In this example it is the DbHelper class:
dbhelper.py
from dbconn import dbconn
from models.animals import Dogs

class DbHelper():
    @staticmethod
    def get_dog_by_nickname(nickname):
        return dbconn.session.query(Dogs).get(nickname)
main.py
from dbhelper import DbHelper

class Dog():
    def __init__(self, nickname, breed, age):
        self.nickname = nickname
        self.breed = breed
        self.age = age

    @classmethod
    def using_nickname(cls, nickname):
        row = DbHelper.get_dog_by_nickname(nickname)
        return cls(row.id, row.breed, row.age)

dog = Dog.using_nickname('Tom')
Question: Is there a better method than creating the DbHelper class for use as a container and having only staticmethods in it? I have read that this is not pythonic.
Converting all the staticmethod functions in dbhelper.py to regular methods will populate the namespace when we do from dbhelper import *
Yes, there is a better solution than creating a class full of staticmethods: Just don't create the class, and make them all module-level functions:
from models.animals import Dogs

dbconn = ...

def get_dog_by_nickname(nickname):
    return dbconn.session.query(Dogs).get(nickname)
The only real point of a class full of staticmethods is to provide a namespace for all those functions to live in. A module already is a namespace for all those functions to live in. And, because it's more directly supported by the language syntax, it means you can more easily go around the namespacing when you want to do so explicitly (e.g., from dbhelper import ...).
You say:
Converting all the staticmethod functions in dbhelper.py to regular methods will populate the namespace when we do from dbhelper import *
But the answer to that is obvious: Don't from dbhelper import *, just import dbhelper. Then, all the code that you would have written with DbHelper.spam() becomes dbhelper.spam().
If you really want two levels of namespace, you can just use a package with a submodule, rather than a module with a class. But I can't see any good reason you'd need two levels of namespace here.
Another alternative (as suggested by juanpa.arrivillaga in the comments) is to turn this into a real class, where each instance (even if there will probably only be one in your real code) has its own self.dbconn instead of using a module global. That dbconn can either be passed into the __init__, or constructed directly inside the __init__. For example:
class DbHelper:
    def __init__(self, dbname, otherdbparam):
        self.dbconn = dblib.connect(dbname, otherdbparam)

    def get_dog_by_nickname(self, nickname):
        return self.dbconn.session.query(Dogs).get(nickname)
Notice that we're using normal methods, and accessing a normal instance variable. This is what a class is for—to wrap up some state together with the methods that transform that state.
How do you decide between the two? Well, if there's only ever going to be one dbconn per process, they're functionally equivalent, but conceptually they have different connotations. If you think of a DbHelper as representing a database (both the connection and the database behavior), it should be a class, and you should instantiate it and use it that way. If you think of it as just a bunch of helper functions that operate on a dbconn that has its own independent existence, it should be a flat module.
In some languages (like Java), there is another point to using a class full of staticmethod-equivalents: the language either doesn't support "free functions", or makes them a completely different kind of thing from methods. But that isn't true in Python.
While we're at it, do you want your module to export Dogs and dbconn as a "public" part of the interface? If not, you should add an __all__ spec to the top of your module, like this:
from models.animals import Dogs

__all__ = [
    'get_dog_by_nickname',
    ...
]

dbconn = ...

def get_dog_by_nickname(nickname):
    return dbconn.session.query(Dogs).get(nickname)
Or, alternatively, name all your "private" module members with underscores:
from models.animals import Dogs as _Dogs

_dbconn = ...

def get_dog_by_nickname(nickname):
    return _dbconn.session.query(_Dogs).get(nickname)
Either way, users of your module can still access the "private" data, but it won't show up in from dbhelper import *, help(dbhelper), the default autocomplete in many IDEs, etc.

How to setup an object before calling class methods like in Java/Spring?

I have an object that needs to be initialised by reading a config file and environment variables. It has class methods, and I want to make sure the object is initialised before any classmethod is executed.
Is there any way to initialise all classes of this sort of nature? I will probably have quite a few of these in my code.
I'm coming from a Java/Spring background where simply putting @Service on top of the class or @PostConstruct on the initializer method would make sure it's called. If there's no clean way to do this in plain Python, is there a framework that'll make this easier?
So the class looks something like this:
class MyClass(object):
    def setup(self):
        # read variables from the environment and config files
        ...

    @classmethod
    def my_method(cls, params):
        # run some code based on params and variables initialised during setup
        ...
You could always use a simple singleton implementation, which sounds like what you're trying to do. This post has examples of how to do it and some thoughts about whether you really need a singleton.
Is there a simple, elegant way to define singletons?
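For reference, a minimal sketch of one common singleton variant (an illustration, not taken from the linked post): override __new__ so every construction returns the same, already-set-up instance.

class MyClass(object):
    _instance = None

    def __new__(cls, *args, **kwargs):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.setup()   # one-time initialisation
        return cls._instance

    def setup(self):
        # read environment variables and config files here
        self.ready = True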
Option #2:
class MyClass(object):
    # No setup(); you can define class variables here instead
    setting_A = "foo"
    setting_B = "bar"
    dict_of_settings = {"foo": "bar"}

    @classmethod
    def my_method(cls, params):
        print(cls.setting_B)  # etc.
This option avoids the use of any global code at all. However, it comes at the price that your settings are no longer nicely encapsulated within an instance.
Yes.
class MyClass(object):
    def __init__(self):  # instead of setup()
        # read variables from the environment and config files
        ...

    @classmethod
    def my_method(cls, params):
        # run some code based on params and variables initialised during setup
        ...

MyClass._instance = MyClass()
Basically, when you first import/load the file containing MyClass, it will run the __init__ constructor once for its internal _instance. That instance (a singleton) will then be accessible from all other class methods.
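If you want something closer to Spring's @Service, a small class decorator can run the setup at import time for every class you mark. A minimal sketch; auto_setup is a made-up name, not a framework API:

def auto_setup(cls):
    cls._instance = cls()   # run __init__ (the setup) once, at import time
    return cls

@auto_setup
class MyService(object):
    def __init__(self):
        # read environment variables and config files here
        self.configured = True

    @classmethod
    def my_method(cls, params):
        # class methods can rely on cls._instance being initialised
        return cls._instance.configured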

Redefinition of class method in python

Context
I'm trying to add some "plugins" (I'm not sure this is the correct term for it) to my code. By "plugin", I mean a module which defines a model (this is a scientific code) in such a way that its mere existence is enough to use it anywhere else in the code.
Of course these plugins must follow a template which uses some modules/function/classes defined in my code. Here is a small snippet for the relevant part of my code:
# [In the code]
class AllModels():
    def __init__(self):
        """
        Init.
        """
        self.count = 0

    def register(self, name, model):
        """
        Adds a model to the code
        """
        setattr(self, name, model)
        self.count += 1
        return

class Model():
    def __init__(self, **kwargs):
        """
        Some constants that define a model
        """
        self.a = kwargs.get("a", None)
        self.b = kwargs.get("b", None)
        # and so on...

    def function1(self, *args, **kwargs):
        """
        A function that all models will have, but which needs:
        - to have a default behavior (when the instance is created)
        - to be redefinable by the "plugin" (ie. the model)
        """
        # default code for the default behavior
        return

instance = AllModels()
and here is the relevant part of the "plugin":
# [in the plugin file]
from code import Model, instance

newmodel = Model(a="a name", b="some other stuff")

def function1(*args, **kwargs):
    """
    Work to do by this model
    """
    # some specific model-dependent work
    return

instance.register("newmodel", newmodel)
Additional information and requirements
- function1 has exactly the same signature for any model plugin, but usually does a different job for each.
- I'd like a default behavior for function1, so that if it is not defined by the plugin, I'll still be able to do something (try different possibilities, and/or raise a warning/error).
- In the plugin, function1 may use some other functions that are only defined in this plugin. I'm stating this because the code runs with the multiprocessing module, and I need the instance instance of AllModels to be able to call function1 in child processes. instance is defined in the parent process, as well as the model plugins, but will be used in different child processes (no modification is done to it though).
- It would be awesome if function1, when "redefined" by the plugin, could access the attributes of the Model instance (ie. self).
Problem
I've read many different sources of Python documentation and some SO questions. I only see two or three possible solutions to this problem:
1) Not declaring the function1 method in the Model class, but just setting it as an attribute when the plugin creates a new instance of it:
# [in the plugin file]
def function1(*args, **kwargs):
    # ....
    return

newmodel.function1 = function1
and then calling it whenever needed. In that case the function1 attribute on the Model object would probably be initialised to None. One caveat is that there is no "default behaviour" for function1 (it has to be dealt with in the code, eg. testing if instance.function1 is None: ...), and an even bigger one is that I can't access self this way...
2) Using Python decorators somehow. I've never used them, and the documentation I've read is not that simple (not straightforward, given the huge number of possible usages). But it seems to be a good solution. However I'm worried about the performance impact (I've read that decorators can slow down the execution of the decorated function/method). If this solution is the best option, then I'd like to know how to use it (a quick snippet maybe), and whether it is possible to use attributes of the class Model:
# [in the plugin file]
@mydecorator
def function1(self, *args, **kwargs):
    """
    I'm not sure I can use *self*, but it would be great since some
    attributes of self are used for some other function similar to *function1*...
    """
    # some stuff using *self*, eg.:
    x = self.var ** 2 + 3.4
    # where self.var has been defined before, eg.: newmodel.var = 100.
3) Using the types module and its MethodType... I'm not sure that is relevant in my case, but I may be wrong; a rough sketch of what I mean is below.
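(My rough understanding of option 3: types.MethodType binds a plain function to an existing instance, so the bound function would get self. This sketch is my guess, reusing newmodel from the snippet above:)

import types

def function1(self, *args, **kwargs):
    # can use the instance's attributes, eg.:
    return self.a

# bind the plugin's function to this particular Model instance
newmodel.function1 = types.MethodType(function1, newmodel)
newmodel.function1()   # returns "a name"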
As you can probably see from this long question, I'm not very familiar with these Python features, and my understanding of decorators is really poor right now. While I keep reading documentation, I thought it might be worth asking the question here, since I'm not sure which direction to take to treat my problem.
Solution
The beauty of Senderle's answer is that it is really simple and obvious... Having missed it is a shame. Sorry for polluting SO with that question.
Well, unless I'm mistaken, you want to subclass Model. This is sort of like creating an instance of Model and replacing its function1 attribute with a function defined in the plugin module (i.e. your option 1); but it's much cleaner, and takes care of all the details for you:
# [in the plugin file]
from code import Model, instance

class MyModel(Model):
    def function1(self, *args, **kwargs):
        """
        Work to do by this model
        """
        # some specific model-dependent work
        return

newmodel = MyModel(a="a name", b="some other stuff")
instance.register("newmodel", newmodel)
This way, all the other methods (functions "attached" to a Model instance) are inherited from Model; they will behave in just the same way, but function1 will be overridden, and will follow your customized function1 definition.
Could you write a dummy function1() function in the Model class and raise a NotImplementedError? That way, if anyone tries to inherit from Model without implementing function1(), they'll get an exception when they try to run the code. If you're running the code for them, you can catch that error and return a helpful error message to the user.
For example:
class Model:
    # Your code

    def function1(self):
        raise NotImplementedError("You need to implement function1 "
                                  "when you inherit from Model")
Then, you can do the following when you run the code:
try:
    modelObj.function1()
except NotImplementedError as e:
    # Perform error handling here
    ...
EDIT: The official Python documentation for NotImplementedError states: "In user defined base classes, abstract methods should raise this exception when they require derived classes to override the method." That does seem to fit the requirements here.
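A stricter variant of the same idea, using the standard library's abc module rather than a hand-raised NotImplementedError (my addition, not part of the original answer): the override becomes mandatory at instantiation time instead of call time.

from abc import ABC, abstractmethod

class Model(ABC):
    @abstractmethod
    def function1(self):
        """Concrete models must override this."""

class MyModel(Model):
    def function1(self):
        return "model-specific work"

MyModel().function1()   # works
# Model()               # TypeError: can't instantiate abstract class Model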
What you are trying to do can be done in pretty straightforward ways, just using object-oriented techniques and taking advantage of the fact that in Python functions are also normal objects.
One simple thing to do is to have your "model" class accept function1 as a parameter and store it as an object member.
Some code like this, with minimal changes to your code - though much more interesting things are certainly possible:
# [In the code]
class AllModels():
    def __init__(self):
        """
        Init.
        """
        self.count = 0

    def register(self, name, **kwargs):
        """
        Adds a model to the code
        """
        model = Model(**kwargs)
        setattr(self, name, model)
        self.count += 1
        return

class Model():
    def __init__(self, **kwargs):
        """
        Some constants that define a model
        """
        self.a = kwargs.get("a", None)
        self.b = kwargs.get("b", None)
        # default to None so function1 can test whether the plugin
        # supplied an implementation; its docstring stays available
        # on self.real_function1.__doc__
        self.real_function1 = kwargs.get("function1", None)
        # and so on...

    def function1(self, *args, **kwargs):
        """
        A function that all models will have, but which needs:
        - to have a default behavior (when the instance is created)
        - to be redefinable by the "plugin" (ie. the model)
        """
        if self.real_function1:
            return self.real_function1(self, *args, **kwargs)
        # default code for the default behavior
        return

instance = AllModels()
and
# [in the plugin file]
from code import instance

def function1(self, *args, **kwargs):
    """
    Work to do by this model
    """
    # some specific model-dependent work
    return

instance.register("name", function1=function1, a="a name", b="some other stuff")
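A quick usage sketch under the same assumptions: the registered model becomes an attribute of instance, and calling function1 on it dispatches to the plugin-supplied function (with self passed explicitly by the wrapper method).

instance.name.function1()   # runs the plugin's function1
print(instance.name.a)      # "a name"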
