Python: use of one logger instance across multiple modules

As a newbie to Python, I finally managed to create my own customized logger as a class in an extra module (tools.py). One of the main achievements here is that I can set the name of the logger, and with the name also the log file name.
I want to instantiate the logger in main() in one module (recalc.py) and from there call a function prepData in another module (getData.py). That function prepData in getData.py is supposed to use the logger instance I instantiated in recalc.py. prepData also calls other functions in the same module that shall use the same logger instance.
I fail to grasp how to declare/instantiate the logger, since only recalc.py with the main function knows the correct name for the logger.
module tools.py
:
class Logg():
:
.
module recalc.py
import tools
from getData import prepData
:
lg = tools.Logg("recalc", ...)  # instantiate the logger
:
result = prepData(...)
.
module getData.py
:
def otherFunc(...):
    lg.debug(...)
    :
    return X
def prepData(...):
    lg.info(...)
    :
    x = otherFunc(...)
    :
    return RES
I have read a lot about (not really existing) global variables in Python (also Python 3), about one class used across multiple modules, and similar topics. The config.py solution implies initializing the global variable in config.py itself, but that doesn't work for me, since only the main module knows the correct logger name.
The workaround I have right now is not acceptable in the long run.
I appreciate any help.
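For what it's worth, the standard library already covers this pattern: logging.getLogger(name) returns the same logger object for a given name, no matter which module calls it, so only the main module needs to know and configure the name. A minimal sketch, assuming Logg wraps (or can be replaced by) the stdlib logging module:

# recalc.py - configure the named logger once, in the only module that knows the name
import logging

def make_logger(name):                    # hypothetical helper standing in for Logg
    lg = logging.getLogger(name)
    lg.setLevel(logging.DEBUG)
    lg.addHandler(logging.FileHandler(name + ".log"))  # log file named after the logger
    return lg

lg = make_logger("recalc")

# getData.py - no import from recalc needed; fetch the same instance by name
import logging

def prepData(logger_name):
    lg = logging.getLogger(logger_name)  # the very object configured in recalc.py
    lg.info("preparing data")

Passing the logger instance (or just its name) as a function argument works equally well; the key point is that the stdlib keeps a per-name registry of loggers, so no global variable is required.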

Related

Dynamic Import - Python - Call function not working

I am having some trouble with dynamically importing classes and attempting to run functions in those classes. This is my problem, specifically.
I have a Python script dyn_imports.py in a directory called dynamic_imports. Inside this folder is a subdir called scripts. In this scripts directory there is an __init__.py and a Python file called AbhayScript.py. In the class AbhayScript, there is a function called say_hello().
My objective is: from dyn_imports.py, I would like to be able to import scripts.AbhayScript and call the function say_hello() in AbhayScript.
So far, I have attempted a variety of options including __import__, importlib.import_module and pydoc.locate. All of these options give me access to the module AbhayScript, but when I try getattr() or try to call the object, I get an error stating that it's not callable or has no attribute.
dyn_imports.py - One of my experiments
myclass = 'scripts.AbhayScript'
import importlib
mymod = importlib.import_module(myclass)
mod, newclass = myclass.rsplit('.', 1)
ncls = getattr(mod, newclass)  # AttributeError: 'str' object has no attribute 'AbhayScript',
                               # because mod here is the string 'scripts', not the module object
AbhayScript.py code
class AbhayScript(object):
    def say_hello(self):
        print 'Hello Abhay Bhargav'
The directory structure is as follows:
dynamic_imports/
    dyn_imports.py
    scripts/
        __init__.py
        AbhayScript.py
The __init__.py in the scripts folder is empty.
I have attempted nearly all the solutions on Stack Overflow, but I am a little flummoxed by this one. What am I doing wrong?
I realized what I was doing wrong. I was importing the module but not referencing the class in the getattr call. I made the class explicit in the __import__ call and in the getattr call, and I was subsequently able to gain access to the functions in the class.
Here's the code in dyn_imports.py
myclass = 'scripts.AbhayScript'
mod = __import__(myclass, fromlist=['AbhayScript'])  # fromlist makes __import__ return the submodule
klass = getattr(mod, 'AbhayScript')                  # the class itself, made explicit
klass().say_hello()                                  # calls the function as desired
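For comparison, importlib.import_module returns the leaf submodule directly, so no fromlist is needed; an equivalent sketch using the asker's names:

import importlib

mod = importlib.import_module('scripts.AbhayScript')  # the submodule itself
klass = getattr(mod, 'AbhayScript')                   # the class, not the dotted string
klass().say_hello()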

Custom logging class python scope

The following dev environment is to be considered: a small number of Python modules, each containing one or more classes; a main module that calls those classes; and a custom logging module called logger with a class called Logger. Assume that I call the main execution class with a logging level of debug. How can I make that log level be inherited by every other call, including the rest of the classes, the methods in those classes, the functions in the main module, and so forth?
The Logger objects are created like log = Logger(0), for example (the logging level is an int, to keep the same syntax that we use in our other scripts, which are shell scripts, not Python).
My final picture is to have the code filled with log.log_debug('debug message') and log.log_error('error message') calls, but only actually printing a message when the right log_level is chosen. And, if possible, with just one
from logger import Logger
call within the main module.
Thanks for your time.
==================================
Edit
In the main execution module there is a main() function in which a parser.parse_args() object is returned, with an argument --log_level to globally define (at least that is my intention) the log_level. There is default log_level handling (i.e., it is always defined).
I will try to mock up a minimal example
import argparse
from logfile import Logfile
from logger import Logger

def argument_parser():
    parser = argparse.ArgumentParser()
    # stuff: add arguments here, including --log_level
    return parser.parse_args()

def log_file_stuff():
    log_file = Logfile()
    log_file.methods()  # [*]

def main():
    args = argument_parser()
    # Here log_level is defined with args.log_level
    global log
    log = Logger(args.log_level)
    log_file_stuff()

main()
[*] One of those methods may create a Logger object, which I want to be exactly the same as the one defined in the main() function. The question is: how may I achieve this without a log return-and-argument waterfall?
Well, to answer my own question... I actually implemented a cross-module dictionary to use as settings. Those settings include the log_level variable, so with only
from main import settings
from logger import Logger
log = Logger(settings['log_level'])
I then have my custom logging class with the user-supplied log_level. Of course, settings is constructed with argparse, mainly as vars(args), following the notation in the question.
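A minimal sketch of that settings pattern; here the dictionary lives in its own tiny module (a hypothetical config.py, not part of the answer) to sidestep circular imports between main and logger, and it is updated in place rather than rebound, so every importer sees the parsed values:

# config.py - holds nothing but the shared dictionary
settings = {}

# main.py
import argparse
import config
from logger import Logger

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('--log_level', type=int, default=0)
    args = parser.parse_args()
    config.settings.update(vars(args))  # mutate in place; rebinding would hide the values
    log = Logger(config.settings['log_level'])

if __name__ == '__main__':
    main()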

How to enable interaction between objects in different modules

I am using SimPy, and I am trying to simulate a network.
This is my main module:
from SimPy.Simulation import *
import node0
import message0
import network0
reload(message0)
reload(node0)
reload(network0)
initialize()
topology=network0.Network()
activate(topology, topology.operate())
node1=node0.Node(1)
node1.interface.send(destination='node1')
simulate(until=25)
I want an object of class Message, which is activated by an object of class Node, to interrupt
class Message(Process):
    def arrive(self, destination, myEvent=delay):
        self.destination = destination
        self.interrupt(topology)
an object of class Network (topology).
But I'm getting an error:
NameError: global name 'topology' is not defined
I don't know how to make an object global. And if I type topology in the Python shell, it shows me the topology object, so why can't Message see it?
I'm pretty sure the issue is that your Message class is defined in a different module than your topology variable. So-called "global" variables in Python are not really global (in the sense that there is one single global namespace); they just sit at the top of a specific module's namespace. So the global variable topology in your main module's namespace is not accessible as a global variable from a different module.
My suggestion for working around this is to pass the topology value to the Message as a parameter to its __init__ method. If the message is being created by something other than your own code (e.g. by your Node class), you might need to pass it around a bit more, so that it will be available when needed.
If that is not possible, you might be able to put the topology value in the namespace of a module that can be imported by your Message code. This can get messy though, as circular imports can break things if you're not careful.
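A minimal sketch of that suggestion, with the SimPy 2 details trimmed down; the point is only that Message stores the reference it is given instead of reaching for a global:

from SimPy.Simulation import Process

class Message(Process):
    def __init__(self, topology):
        Process.__init__(self)
        self.topology = topology       # injected reference; no global needed

    def arrive(self, destination):
        self.destination = destination
        self.interrupt(self.topology)  # interrupt the Network process passed in

# in the main module, after topology is created:
# message = Message(topology)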

Python - can a class act like a module?

I'm considering a package implementation set up like this:
wordproc/
    __init__.py
    _generic.py
    gedit.py
    oofice.py
    word.py
_generic.py would have a class like this:
class WordProc(object):
    def __init__(self):
        pass
    def createNewDoc(self):
        print "createNewDoc unimplemented in current interface"
    def getWordCount(self):
        print "getWordCount unimplemented in current interface"
    # etc...
These could print out as shown, or raise errors. App-specific modules would just be copies of _generic.py with the WordProc classes deriving from _generic.WordProc. In this way, functionality could be implemented iteratively over time, with messages about unimplemented things simply raising alerts.
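For example, gedit.py might look like this (a sketch; the overriding body is an assumption):

# gedit.py - derive from the generic interface and override what is implemented
from ._generic import WordProc as GenericWordProc

class WordProc(GenericWordProc):
    def createNewDoc(self):
        print("creating a new document in gedit")  # real gedit logic would go here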
I'm imagining that __init__.py could look for the following things (listed in order) to figure out which module to use:
1. a wordproc module variable
2. a settings file in the path
3. a wordproc environment variable
4. a function that attempts to determine the environment
5. a default in __init__.py (probably _generic.py)
I think 3 could be a function in each app's module, or these could go into folders with particularly named environment test scripts (e.g. env.py), and __init__.py could loop over them.
Then, in any library that wants to use wordproc, I'd like to simply be able to do this:
import wordproc as wp
wp.createNewDoc()
etc...
What I don't know is how to have wp resolve to the proper class in the proper module as determined by __init__.py. It doesn't make sense to do this:
import wordproc.gedit as wp
This destroys the point of having __init__.py determine which module in wordproc to use. I need something like class inheritance, but on the module level.
You can achieve your desired effect by writing __init__.py like this:
1. Import the appropriate module first; see the Python docs on importlib.import_module or __import__ for help with dynamic imports.
2. Instantiate the class whose methods you want to export.
3. Assign the instance's methods to locals().
# import the appropriate module as mod, depending on settings/environment,
# using importlib.import_module or __import__
__all__ = []
_instance = mod.WordProc()
for attr in dir(_instance):
    if not attr.startswith('_') and callable(getattr(_instance, attr)):
        locals()[attr] = getattr(_instance, attr)  # at module level, locals() is globals()
        __all__.append(attr)
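The dynamic-import step that the first comment alludes to could look like this (a sketch; the WORDPROC environment variable and the _generic default are assumptions, not part of the answer):

# at the top of wordproc/__init__.py: pick the backend module at import time
import importlib
import os

_backend = os.environ.get('WORDPROC', '_generic')        # hypothetical selection rule
mod = importlib.import_module('.' + _backend, __name__)  # e.g. wordproc._generic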

Python pattern for sharing configuration throughout application

I have an application consisting of a base app that brings in several modules. The base app reads a parameter file into a configuration hash, and I want to share it across all my modules.
Currently, I am passing a 'parent' object down to modules, and then those modules are doing stuff like self.parent.config to obtain the configuration.
However, as there are several levels to the module hierarchy, I find myself doing things like self.parent.parent.config, which is starting to look bad.
What are some better patterns for sharing a config object across an application and its modules? I am thinking about having a 'config' module that basically creates a global config variable which can be set by the base app, then imported and accessed by other modules, but I am not sure whether using globals like that is bad practice for other reasons.
I am reasonably new to Python so be nice =)
You could just:
import config
and have a global config module
excerpts from my comments:
You can always add special rules for odd situations by just saying oddValue if isOddSituation() else config.normalValue.
If you want configuration modules to be hierarchically subclassable (like my other answer describes), then you can represent a config as a class, or you can use the copy module and make a shallow copy and modify it, or you can use a "config dictionary", e.g.:
import config as baseConfig
config = dict(vars(baseConfig), overriddenValue=etc)  # vars() gives the module's namespace as a dict
It doesn't really matter too much which scope you're in.
Answering an old question:
Just use dependency injection, as suggested by @Reed Copsey here. E.g.
class MyClass:
    def __init__(self, myConfig):
        self.myConfig = myConfig
        ...
    def foo(self):
        self.myConfig.getConfig(key)
        ...
        self.myConfig.setConfig(key, val)
        ...
    ...

# myConfig is your configuration management module/class
obj = MyClass(myConfig)
I think by 'module', you are actually referring to a 'class/object'. An object is an instance of a class, for example:
class MyClass(object):
    def __init__(self, ...):
        ...
    ...

myObject = MyClass()
A module is a .py file you import, like so:
import mymodule
It seems unlikely that all the classes you instantiate would want to have access to a global configuration. However if you really need everything in your application to have access to some global parameters, you can put them in your own config module:
myParam1 = 1
myParam2 = 2
and then from any module or any object or anywhere really, as long as you did import config, you could just say print(config.myParam1)
Alternatively, if you want a large hierarchy of objects to all share access to the same property, you don't need to refer to it by manually setting a self.parent. As long as you use inheritance, you can do stuff like:
class Parent(object):
    def __init__(self, theConfig):
        self.theConfig = theConfig

class Child(Parent):
    ...
    def method(self, ...):
        print(self.theConfig)
Take a look at this. It could help you:
https://gist.github.com/dgarana/c052a3287629dd7c0b0c9d7921081e9d
