I have an application consisting of a base app that brings in several modules. The base app reads a parameter file into a configuration hash, and I want to share it across all my modules.
Currently, I am passing a 'parent' object down to modules, and then those modules are doing stuff like self.parent.config to obtain the configuration.
However, as there are several levels to the module hierarchy, I find myself doing things like self.parent.parent.config, which is starting to look bad.
What are some better patterns for sharing a config object across an application and its modules? I am thinking about having a 'config' module which basically creates a global config variable that can be set by the base app, then imported and accessed by other modules, but I am not sure if using globals like that is bad practice for other reasons.
I am reasonably new to Python so be nice =)
You could just:
import config
and have a global config module
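For example (a minimal sketch; the settings values here are made up, and in practice the base app would fill them from its parameter file):
# config.py -- the shared configuration module
settings = {}

# base app, e.g. main.py: populate it once at startup
import config
config.settings.update({"timeout": 30, "retries": 5})

# any other module, at any depth: just import and read
import config
timeout = config.settings["timeout"]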
excerpts from my comments:
You can always add special rules for odd situations by just saying oddValue if isOddSituation() else config.normalValue.
If you want to have configuration modules be hierarchically subclassable (like my other answer describes), then you can represent a config as a class, or you can use the copy module and make a shallow copy and modify it, or you can use a "config dictionary", e.g.:
import config as baseConfig
config = dict(vars(baseConfig), overriddenValue=42)  # vars() gives the module's attributes as a dict; 42 is a stand-in
It doesn't really matter too much which scope you're in.
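To make the copy-and-override idea concrete, here is a small sketch using plain dicts (the keys are made up for illustration):
base = {"debug": False, "timeout": 30}
dev = dict(base, debug=True)          # shallow copy with one value overridden
print(dev["debug"], dev["timeout"])   # True 30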
Answering old question:
Just use dependency injection as suggested by @Reed Copsey here. E.g.
class MyClass:
    def __init__(self, myConfig):
        self.myConfig = myConfig
        ...

    def foo(self):
        self.myConfig.getConfig(key)
        ...
        self.myConfig.setConfig(key, val)
        ...

# myConfig is your configuration management module/class instance
obj = MyClass(myConfig)
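For completeness, here is a hypothetical sketch of the injected object; the getConfig/setConfig names are taken from the example above, not from any standard API:
class MyConfig:
    """Trivial wrapper around a dict of settings."""
    def __init__(self, initial=None):
        self._data = dict(initial or {})

    def getConfig(self, key):
        return self._data.get(key)

    def setConfig(self, key, val):
        self._data[key] = val

myConfig = MyConfig({"timeout": 30})
obj = MyClass(myConfig)  # inject it wherever it is needed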
I think by 'module', you are actually referring to a 'class/object'. An object is an instance of a class, for example:
class MyClass(object):
    def __init__(self):
        ...

myObject = MyClass()
A module is a .py file you import, like so:
import mymodule
It seems unlikely that all the classes you instantiate would want to have access to a global configuration. However if you really need everything in your application to have access to some global parameters, you can put them in your own config module:
myParam1 = 1
myParam2 = 2
and then from any module or any object or anywhere really, as long as you did import config, you could just say print(config.myParam1)
Alternatively, if you want a large hierarchy of objects to all share access to the same property, you don't need to wire it up by manually setting a self.parent on each object. As long as you use inheritance, you can do stuff like:
class Parent(object):
    def __init__(self, theConfig):
        self.theConfig = theConfig

class Child(Parent):
    ...
    def method(self):
        print(self.theConfig)
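A quick usage sketch (using a plain dict as the config; any object would do):
child = Child({"theme": "dark"})  # Child inherits Parent.__init__
child.method()                    # prints {'theme': 'dark'}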
Take a look at this. It could help you:
https://gist.github.com/dgarana/c052a3287629dd7c0b0c9d7921081e9d
Related
I would like to convert a singleton object programmatically into a Python module so that I can use the methods of this singleton object directly by importing them via the module instead of accessing them as object attributes. By "programmatically" I mean that I do not want to have to copy-paste the class methods explicitly into a module file. I need some sort of workaround that allows me to import the object methods into the global scope of another module.
I would really appreciate if someone could help me on this one.
Here is a basic example that should illustrate my problem:
mymodule.py
class MyClass:
    """This is my custom class"""

    def my_method(self):
        return "myValue"

singleton = MyClass()
main_as_is.py
from mymodule import MyClass
myobject = MyClass()
print(myobject.my_method())
main_to_be.py
from mymodule import my_method # or from mymodule.singleton import my_method
print(my_method())
You can use the same strategy that the standard random module uses. All the functions in that module are actually methods of a "private" instance of the Random class. That's convenient for most common uses of the module, although sometimes it's useful to create your own instances of Random so that you can have multiple independent random streams.
I've adapted your code to illustrate that technique. I named the class and its instance with a single leading underscore, since that's the usual convention in Python to signify a private name, but bear in mind it's simply a convention, Python doesn't do anything to enforce this privacy.
mymodule.py
class _MyClass:
    """ This is my custom class """

    def my_method(self):
        return "myValue"

_myclass = _MyClass()
my_method = _myclass.my_method
main_to_be.py
from mymodule import my_method
print(my_method())
output
myValue
BTW, the from mymodule import method1, method2 syntax is fine if you only import a small number of names, if it's clear from the name which module a function comes from (like the math module's functions and constants), and if you don't import from many modules. Otherwise it's better to use this sort of syntax:
import mymodule as mm
# Call a method from the module
mm.method1()
That way it's obvious which names are local, and which ones are imported and where they're imported from. Sure, it's a little more typing, but it makes the code a whole lot more readable. And it eliminates the possibility of name collisions.
FWIW, here's a way to automate adding all of the _myclass methods without explicitly listing them (but remember "explicit is better than implicit"). At the end of "mymodule.py", in place of my_method = _myclass.my_method, add this:
globals().update({k: getattr(_myclass, k)
                  for k in _MyClass.__dict__
                  if not k.startswith('__')})
I'm not comfortable with recommending this, since it directly injects items into the globals() dict. Note that that code will add all class attributes, not just methods.
In your question you talk about singleton objects. We don't normally use singletons in Python, and many programmers in various OOP languages consider them to be an anti-pattern. See https://stackoverflow.com/questions/12755539/why-is-singleton-considered-an-anti-pattern for details. For this application there is absolutely no need at all to use a singleton. If you only want a single instance of _MyClass then simply don't create another instance of it, just use the instance that mymodule creates for you. But if your boss insists that you must use a singleton, please see the example code here.
I am struggling to figure out how to handle importing dependencies that are used in multiple files.
Let's say I want to import an external API for example, and two classes depend on this import. Putting the import into the 'index' file, as an attempt to make it global does not work. I can import it in each class file fine, but that seems to be a violation of DRY, as well as setting myself up for failure later on.
So is there a way to import once, in a single file that is globally accessible? What I experimented with was creating an index.py, foo.py (for the foo class) and bar.py (for the bar class):
Index:
from example import API
import foo
import bar
foo()
bar()
foo.py:
class foo:
    ...  # (try to put the example API to use)
bar.py: (same as foo.py really, just here to make the case for using the same dependency in two different places)
This failed to work, as the classes appeared to not be able to access exampleAPI. What is the correct way to do this, or am I looking at it wrong? Thanks!
In general, you should import each module you need in each of your own modules that needs to use it. You don't need to worry about duplication, since each module has its own global namespace. Furthermore, modules are cached (in the sys.modules dictionary) so you don't need to worry about extra work being done to load the module multiple times.
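You can see the caching at work (a tiny demonstration; math is just a convenient stdlib module):
import sys
import math
import math  # the second import is a cheap sys.modules lookup, not a reload
print('math' in sys.modules)  # True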
That said, there can be some exceptions. For instance, if the specific source of an API is considered "private" information (e.g. because it's an implementation detail or because it might be configurable and not always come from the same place all the time), it might make sense to import it into some namespace where all other users will look for it.
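For that case, a thin indirection module works (a sketch; example is the question's placeholder package, and api.py is a made-up name):
# api.py -- the only module that knows where the API really comes from
from example import API

# everywhere else:
from api import API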
On the other hand, your example suggests you may be splitting up your code more than you should. Unlike some other languages (such as Java), in Python it's neither required nor recommended for each class to live in its own file. Instead, you should divide your code up into modules dictated by how closely they interact with each other. Closely related classes should be part of the same module, while pieces that don't interact at all might make more sense in separate modules (especially if some other code might need one part but not the other). It may not be inappropriate for your whole program to be in a single module! Obviously, some of this is a matter of style and taste, so there's not a single best option for every programmer in every situation.
For your example code, if you want to keep separate modules, I'd suggest something like this:
index.py:
from foo import Foo # no need to import API here if you're not using it directly
from bar import Bar
foo = Foo() # create an instance of the foo class
result = foo.some_method() # call methods on it
bar = Bar(foo) # you can also pass your instances around to other classes
foo.py:
from example import API

class Foo:
    def some_method(self):
        return API.whatever()  # use the API in some way
bar.py:
from example import API  # don't worry about importing the API more than once

class Bar:
    def __init__(self, foo):
        self.foo = foo

    def blah(self):
        API.something_else(self.foo.some_method())
Note that I changed some names around. Python's convention is to use CapitalizedNames for classes, and lowercase_names_with_underscores (sometimes known as "snake case") for modules, variables and methods. Your original code had some confusion between the modules named foo and bar and the identically named classes within them. Using different styles for the different kinds of names can help avoid that confusion.
I'm writing a Python library which is meant to be used as a third party library.
great_library/__init__.py:
class ClassA(object):
    @cache()
    def foo(self):
        pass

class ClassB(object):
    @cache()
    def bar(self):
        pass

class GreatLibrary(object):
    @classmethod
    def great_api(cls):
        # uses ClassA and ClassB
        pass
This library is used as:
from great_library import GreatLibrary
GreatLibrary.great_api()
Now the problem is, I'd like the user to configure the cache expiration time, which should be passed to @cache() as @cache(seconds).
How should I design the module structure so the user can easily pass in this config and have it used by ClassA and ClassB? Thanks!
The base problem is that the value passed to the decorator is read when the module is loaded, so there is no way to change it afterwards (short of reloading the module with some hackery, which still cannot change objects that already exist). So you need some hook through which great_library can get the cache time and into which the user can write the desired value.
The simplest and most widely used method is to set an environment variable. At the top of your great_library module you can check the variable and load the default cache time:
import os
default_time = os.getenv("GREAT_LIBRARY_CACHE_TIME", None)
In your code, use @cache(default_time). I'm not sure whether the cache() API accepts None as an argument; if not, it is simple to adapt this recipe to your problem.
Now users of great_library can set the variable either by assigning to os.environ in development (before importing the module; note that os.putenv() does not update os.environ) or via the OS environment in production.
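Putting it together, a sketch of how great_library/__init__.py might read the variable (assuming the cache decorator from your snippet accepts a seconds argument, with None meaning "never expire"):
import os

_raw = os.getenv("GREAT_LIBRARY_CACHE_TIME")
default_time = int(_raw) if _raw is not None else None

class ClassA(object):
    @cache(default_time)
    def foo(self):
        pass
And on the user's side:
# user code, before the first import of great_library:
import os
os.environ["GREAT_LIBRARY_CACHE_TIME"] = "600"
from great_library import GreatLibrary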
Another way to provide a hook is a configuration module to import. IMHO that method is only useful if you have a bunch of properties to set. If you follow that path, your great_library module should implement something like this:
try:
    from great_library_config import *
except ImportError:
    # Default configurations like...
    default_time = None
Personally I try to avoid solutions like that for a module, but they can be useful for applications or frameworks with a high degree of configurability. In that case, too, the user can have a config module for production and override it with a development/testing one.
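For example, a user-supplied override module could be as small as this (the module name matches the great_library_config import above; the value is made up):
# great_library_config.py
default_time = 300  # expire cache entries after five minutes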
I'm considering a package implementation set up like this:
wordproc/
    __init__.py
    _generic.py
    gedit.py
    oofice.py
    word.py
_generic.py would have a class like this:
class WordProc(object):
    def __init__(self):
        pass

    def createNewDoc(self):
        print("createNewDoc unimplemented in current interface")

    def getWordCount(self):
        print("getWordCount unimplemented in current interface")

    # etc...
These could print out as shown, or raise errors. App-specific modules would just be copies of _generic.py with the WordProc classes deriving from _generic.WordProc. In this way, functionality could be implemented iteratively over time, with messages about unimplemented things simply raising alerts.
I'm imagining that __init__.py could look for the following things (listed in order) to figure out which module to use:
1. a wordproc module variable
2. a settings file in the path
3. a wordproc environment variable
4. a function that attempts to determine the environment
5. a default in __init__.py (probably _generic.py)
I think 3 could be a function in each app's module, or these could go into folders with particularly named environment test scripts (e.g. env.py), and __init__.py could loop over them.
I'd like then in any libraries that want to use wordproc to simply be able to do this:
import wordproc as wp
wp.createNewDoc()
etc...
What I don't know is how to have wp resolve to the proper class in the proper module as determined by __init__.py. It doesn't make sense to do this:
import wordproc.gedit as wp
This destroys the point of having __init__.py determine which module in wordproc to use. I need something like class inheritance, but on the module level.
You can achieve your desired effect by writing __init__.py like this:
1. Import the appropriate module first. See the Python docs on importlib.import_module or __import__ for help on dynamic imports.
2. Instantiate the class from which you want to export methods.
3. Assign the instance methods to locals().
# import the appropriate module as mod depending on settings/environment,
# using importlib.import_module or __import__

__all__ = []
_instance = mod.WordProc()

for attr in dir(_instance):
    if not attr.startswith('_') and callable(getattr(_instance, attr)):
        locals()[attr] = getattr(_instance, attr)
        __all__.append(attr)  # so "from wordproc import *" exports these too
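For the dynamic import step, something along these lines should work (a sketch; the WORDPROC_BACKEND environment variable is a made-up name standing in for whichever selection rule from the list above you settle on):
import importlib
import os

_backend = os.getenv("WORDPROC_BACKEND", "_generic")  # e.g. "gedit", "oofice", "word"
mod = importlib.import_module("." + _backend, package=__name__)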
I have a large Python code base with many modules and classes. I have a special class whose single instance is needed everywhere throughout the code (it's a threaded application, and that instance also holds thread-local storage, locks, etc.). It's a bit uncomfortable to always "populate" that instance in every imported module. I know using globals is not the best practice, but anyway: is there any "import hook" in Python I can use to have my instance available in every module without extra work? It should work for normal imports, for "from mod import ..." and for other import constructs too. If this is not possible, can you suggest a better solution? It's certainly not fun to pass that instance to the constructors of every class. Inheritance also does not help, since I have modules without classes, and I need a single instance, not the class itself.
class master():
    def import_module(self, name):
        mod = __import__(name)
        mod.m = self
        return mod

    [...]

m = master()
Currently I am thinking of something like the above: but then I have to use m.import_module() to import modules, and other modules will then have the master instance available under the name m, so they can use m.import_module() too, etc. But then I have to give up "normal" import statements and write this:
example_mod = m.import_module("example_mod")
instead of just this:
import example_mod
(though of course I could live with that, and then assign m to example_mod.m as well)
"Certainly it's not fun to pass that instance to the constructors of every class."
You don't have to do this. Set up your global object in a module (like config) and import it:
# /myapp/environment/__init__.py
class ThatSingleInstanceClass:
    pass

# create the singleton object directly, or have a function init the module
singleton = ThatSingleInstanceClass()

# /myapp/somewhere.py
# all you need to use the object is importing it
from myapp.environment import singleton

class SomeClass:
    def __init__(self):  # no need to pass that object
        print("Always the same object:", singleton)
What's wrong with having each module import the needed object? Explicit is better than implicit.