I have a script as follows:
from mapper import Mapper

class A(object):
    def foo(self):
        print "world"

a = A()
a.foo()
Mapper['test']()
with Mapper defined in the file mapper.py:
Mapper = {'test': a.foo}
where I want to define a function call that references an object defined not in mapper.py, but in the original script. However, the code above gives the error
NameError: name 'a' is not defined
which makes some sense, as a is not defined in mapper.py itself. However, is it possible to change the code so that the name is resolved in the main script instead, perhaps via globals or something similar?
To solve this problem I could store the implementation in mapper.py as text and use eval in the main code, but I would like to avoid using eval.
Additional information:
The full definition of the function has to be made in mapper.py
It is not known beforehand what the instance a is, or from what class it is instantiated.
Barring security holes like eval, it's not possible to use a name a in mapper.py unless the name is either defined somewhere in mapper.py or imported from another module. There is no way to just let mapper.py automatically and silently access a value a from a different module.
In addition, if you're using it just in a dict as in your example, a.foo is going to be evaluated as soon as the dict is created. It's not going to wait until you actually call the function; as soon as it evaluates a.foo to create the dict, it will fail because it doesn't know what a is.
You could get around this second problem by wrapping the element in a function (using a lambda for brevity):
Mapper = {'test': lambda: a.foo}
. . . but this still won't help unless you can somehow get a to be available inside mapper.py.
One possibility is to parameterize your Mapper by the "mystery" object and then pass that object in from outside:
# mapper.py
Mapper = {'test': lambda a: a.foo}
# other module
from mapper import Mapper
Mapper['test'](a)()
Or, similar to what mgilson suggested, you could "register" the object a with Mapper somehow. This lets you pass the object a only once to register it, and then you don't have to pass it for every call:
# mapper.py
Mapper = {'test': lambda: Mapper['a'].foo}
# other module
from mapper import Mapper
Mapper['a'] = a
Mapper['test']()()
Note the two sets of parentheses at the end there: one set to evaluate the lambda and extract the function you want to call, and the second set to actually call that function. You could do a similar deal by, instead of using Mapper['a'] as the reference, using a module-level variable:
# mapper.py
Mapper = {'test': lambda: a.foo}
# other module
import mapper
Mapper = mapper.Mapper
mapper.a = a
Mapper['test']()()
Note that this requires you to do import mapper in order to set the module variable in that other module.
You could streamline this somewhat by using a custom class for Mapper instead of a regular dict, and having that class do some work in its __getitem__ to look in a "known location" (e.g., read some module variable) to use as a base for evaluating a. That would be a heavier-weight solution though.
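A minimal sketch of that heavier-weight approach (the class and attribute names here are illustrative, not a fixed API):
# mapper.py
class _Mapper(object):
    def __init__(self):
        self._entries = {}
        self.target = None                    # the "known location", set from outside

    def __setitem__(self, key, func):
        self._entries[key] = func

    def __getitem__(self, key):
        # defer the lookup: bind the stored function to the registered target
        return lambda *args: self._entries[key](self.target, *args)

Mapper = _Mapper()
Mapper['test'] = lambda a: a.foo()

# other module
from mapper import Mapper
Mapper.target = a
Mapper['test']()   # calls a.foo()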
The bottom line is that you simply cannot (again, without the use of eval or other such holes) write code in mapper.py that uses an undefined variable a, and then define a variable a in another module and have mapper.py automatically know about that. There has to be some line of code somewhere that "tells" mapper.py what value of a you want it to use.
I'm not sure I completely follow, but a could "register" its method with Mapper from anywhere that has a reference to Mapper:
#mapping.py
Mapper = {}
and then:
#main.py
from mapping import Mapper
#snip
a = A()
Mapper['test'] = a.foo #put your instance method into the Mapper dict.
#snip
Mapper['test']()
Related
globalEx1.py:
globals()['a'] = '100'

def setvalue(val):
    globals()['a'] = val
globalEx2.py:
from globalEx1 import *
print a
setvalue('200')
print a
On executing globalEx2.py:
Output:
100
100
How can I change the value of globals()['a'] using a function, so that the change is reflected across the .py files?
Each module has its own globals. Python is behaving exactly as expected. Updating globalEx1's a to point to something else isn't going to affect where globalEx2's a is pointing.
There are various ways around this, depending on exactly what you want.
1. Re-import a after the setvalue() call.
2. Return a and assign it, like a = setvalue().
3. import globalEx1 and use globalEx1.a instead of a. (Or use import globalEx1 as with a shorter name.)
4. Pass globalEx2's globals() as an argument to setvalue and set the value on that instead (see the sketch after this list).
5. Make a a mutable object containing your value, like a list, dict or types.SimpleNamespace, and mutate it in setvalue.
6. Use inspect inside setvalue to get the caller's globals from its stack frame. (Convenient, but brittle.)
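For instance, option 4 makes the target namespace an explicit parameter; a minimal sketch (restating globalEx1 with an assumed signature change):
# globalEx1.py
a = '100'

def setvalue(val, namespace=None):
    globals()['a'] = val
    if namespace is not None:
        namespace['a'] = val        # also rebind a in the caller's module

# globalEx2.py
from globalEx1 import *
print a                    # 100
setvalue('200', globals())
print a                    # 200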
The last option looks suitable for me; it will do the job with minimal code change. But can I update the globals of multiple modules the same way, or does it only give me the caller's globals?
Option 6 is actually the riskiest. The caller itself basically becomes a hidden parameter to the function, so something like a decorator from another module can break it without warning. Option 4 just makes that hidden parameter explicit, so it's not so brittle.
If you need this to work across more than two modules, option 6 isn't good enough, since it only gives you the current call stack. Option 3 is probably the most reliable for what you seem to be trying to do.
How does option 1 work? Is it about running from globalEx1 import * again? I have many variables like a.
A module becomes an object when imported the first time and it's saved in the sys.modules cache, so importing it again doesn't execute the module again. A from ... import (even with the *) just gets attributes from that module object and adds them to the local scope (which is the module globals if done at the top level, that is, outside of any definition.)
The module object's __dict__ is basically its globals, so any function that alters the module's globals will affect the resulting module object's attrs, even if it's done after the module was imported.
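So option 1 amounts to repeating the import after the update; a minimal sketch:
# globalEx2.py
from globalEx1 import *
print a                   # 100
setvalue('200')
from globalEx1 import a   # re-read the attribute from the cached module object
print a                   # 200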
We cannot do from globalEx1 import * inside a Python function; is there an alternative to this?
The star syntax is only allowed at the top level. But remember that it's just reading attributes from the module object. So you can get a dict of all the module attributes like
return vars(globalEx1)
This will give you more than * would: by default the star doesn't import names that begin with an _, and when __all__ is defined it imports only that subset. You can filter the resulting dict with a dict comprehension, and even .update() the globals dict of some other module with the result.
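A sketch of that filtering (this version ignores __all__ handling):
import globalEx1

public = {k: v for k, v in vars(globalEx1).items()
          if not k.startswith('_')}
globals().update(public)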
But rather than re-implementing this filtering logic, you could just use exec to make it the top level. Then the only weird key you'd get is __builtins__:
namespace = {}
exec('from globalEx1 import *', namespace)
del namespace['__builtins__']
return namespace
Then you can globals().update(namespace) or whatever.
Using exec like this is probably considered bad form, but then so is import * to begin with, honestly.
This is an interesting problem, related to the fact that strings are immutable. The line from globalEx1 import * creates two references in the globalEx2 module: a and setvalue. globalEx2.a initially refers to the same string object as globalEx1.a, since that's how imports work.
However, once you call setvalue, which operates on the globals of globalEx1, the value referenced by globalEx1.a is replaced by another string object. Since strings are immutable, there is no way to do this in place. The value of globalEx2.a remains bound to the original string object, as it should.
You have a couple of workarounds available here. The most pythonic is to fix the import in globalEx2:
import globalEx1
print globalEx1.a
globalEx1.setvalue('200')
print globalEx1.a
Another option would be to use a mutable container for a, and access that:
globals()['a'] = ['100']

def setvalue(val):
    globals()['a'][0] = val
from globalEx1 import *
print a[0]
setvalue('200')
print a[0]
A third, and wilder option, is to make globalEx2's setvalue a copy of the original function, but with its __globals__ attribute set to the namespace of globalEx2 instead of globalEx1:
from functools import update_wrapper
from types import FunctionType

from globalEx1 import *

_setvalue = FunctionType(setvalue.__code__, globals(), name=setvalue.__name__,
                         argdefs=setvalue.__defaults__,
                         closure=setvalue.__closure__)
_setvalue = update_wrapper(_setvalue, setvalue)
_setvalue.__kwdefaults__ = setvalue.__kwdefaults__
setvalue = _setvalue
del _setvalue

print a
print a
...
The reason you have to make the copy is that __globals__ is a read-only attribute, and also you don't want to mess with the function in globalEx1. See https://stackoverflow.com/a/13503277/2988730.
Globals are bound once, when the import statement runs. Thus, if the global is later rebound to a new object (as always happens with immutable values like str, int, etc.), the update will not be reflected in the importing module. However, if the global is a mutable object like a list that is modified in place, updates will be reflected. For example,
globalEx1.py:
globals()['a'] = [100]

def setvalue(val):
    globals()['a'][0] = val
The output will be changed as expected:
[100]
[200]
Aside
It's easier to define globals like normal variables:
a = [100]
def setvalue(value):
a[0] = value
Or when editing value of immutable objects:
a = 100

def setvalue(value):
    global a
    a = value
I'm trying to dynamically update code during runtime by reloading modules using importlib.reload. However, I need a specific module variable to be set before the module's code is executed. I could easily set it as an attribute after reloading, but by then each module would have already executed its code (e.g., defined its default arguments).
A simple example:
# module.py
def do():
    try:
        print(a)
    except NameError:
        print('failed')
# main.py
import module
module.do() # prints failed
module.a = 'succeeded'
module.do() # prints succeeded
The desired pseudocode:
import_module_without_executing_code module
module.initialise(a = 'succeeded')
module.do()
Is there a way to control module namespace initialisation (like with classes using metaclasses)?
It's not usually a good idea to use reload other than for interactive debugging. For example, it can easily create situations where two objects of type module.A are not the same type.
What you want is execfile. Pass a globals dictionary (you don't need an explicit locals dictionary) to keep each execution isolated; anything you store in it ahead of time acts exactly like the "pre-set" variables you want. If you do want to have a "real" module interface change, you can have a wrapper module that calls (or just holds as an attribute) the most recently loaded function from your changing file.
Of course, since you're using Python 3, you'll have to use one of the replacements for execfile.
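For example, a sketch of the Python 3 replacement, seeding the namespace before the code runs:
module_globals = {'a': 'succeeded'}              # the "pre-set" variable
with open('module.py') as f:
    code = compile(f.read(), 'module.py', 'exec')
exec(code, module_globals)
module_globals['do']()                           # prints: succeeded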
Strictly speaking, I don't believe there is a way to do what you're describing in Python natively. However, assuming you own the module you're trying to import, a common approach with Python modules that need some initializing input is to use an init function.
If all you need is some internal variables to be set, like a in your example above, that's easy: just declare some module-global variables and set them in your init function:
Demo: https://repl.it/MyK0
Module:
## mymodule.py
a = None

def do():
    print(a)

def init(_a):
    global a
    a = _a
Main:
## main.py
import mymodule
mymodule.init(123)
mymodule.do()
mymodule.init('foo')
mymodule.do()
Output:
123
foo
Where things can get trickier is if you need to actually redefine some functions because some dynamic internal something is dependent on the input you give. Here's one solution, borrowed from https://stackoverflow.com/a/1676860. Basically, the idea is to grab a reference to the current module by using the magic variable __name__ to index into the system module dictionary, sys.modules, and then define or overwrite the functions that need it. We can define the functions locally as inner functions, then add them to the module:
Demo: https://repl.it/MyHT/2
Module:
## mymodule.py
import sys

def init(a):
    current_module = sys.modules[__name__]

    def _do():
        try:
            print(a)
        except NameError:
            print('failed')

    current_module.do = _do
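Usage then looks just like the first demo:
## main.py
import mymodule

mymodule.init('succeeded')
mymodule.do()    # prints: succeeded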
I have a dictionary called fsdata at module level (like a global variable).
The content gets read from the file system. It should load its data once, on first access. Up to now it loads the data while the module is being imported; this should be optimized.
If no code accesses fsdata, the content should not be read from the file system (save CPU/IO).
Loading should happen, if you check for the boolean value, too:
if mymodule.fsdata:
    ... do_something()
Update: Some code already uses mymodule.fsdata, and I don't want to change the other places. It should be a variable, not a function. And mymodule needs to remain a module, since it is already used in a lot of code.
I think you should use a Future/Promise, like this: https://gist.github.com/2935416
The main point: you create not the object itself, but a "promise" about the object, which behaves like the object.
You can replace your module with an object that has descriptor semantics:
class FooModule(object):
    @property
    def bar(self):
        print "get"

import sys
sys.modules[__name__] = FooModule()
Take a look at http://pypi.python.org/pypi/apipkg for a packaged approach.
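Applied to your fsdata case, a sketch might look like this (load_fsdata_from_file is a hypothetical stand-in for whatever reads the file system):
# mymodule.py
import sys

def load_fsdata_from_file():
    # hypothetical loader: read the dict from the file system
    return {'some': 'data'}

class _LazyModule(object):
    def __init__(self, real_module):
        self._real = real_module    # keep the original module (and its globals) alive
        self._fsdata = None

    @property
    def fsdata(self):
        if self._fsdata is None:
            self._fsdata = load_fsdata_from_file()   # runs once, on first access
        return self._fsdata

sys.modules[__name__] = _LazyModule(sys.modules[__name__])
Since fsdata is a property, both mymodule.fsdata and if mymodule.fsdata: trigger the load, so existing call sites keep working.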
You could just create a simple function that memoizes the data:
fsdata = []

def get_fsdata():
    if not fsdata:
        fsdata.append(load_fsdata_from_file())
    return fsdata[0]
(I'm using a list as that's an easy way to make a variable global without mucking around with the global keyword).
Now instead of referring to module.fsdata you can just call module.get_fsdata().
I have a config.cfg which I parse using the Python module ConfigParser. In one section I want to configure assignments of the form fileextension : ClassName. Parsing results in the following dictionary:
types = {
    "extension1": "ClassName1",
    "extension2": "ClassName2"
}
EDIT: I know I can now do:
class_ = eval(types[extension])
foo = class_()
But I was given to understand that eval is evil and should not be used.
Do you know a nicer way to dynamically configure which file-extension results in which class?
You could use eval, if the class name in the config file exactly matches the class names in your python code (and if the classes are in scope!), but ..... eval is evil (a coincidence that there's only one letter difference? I think not!)
A safer way to do it would be to add an extra dictionary that maps from configuration class name to python class name. I'd do this because:
configuration files don't have to know about your code's names
can change config files without changing code and vice versa
it avoids eval
So it'd look something like:
mappingDict = {"ClassName1": MyPythonClass1,
               "ClassName2": MyPythonClass2, ...}
# keys are strings, values are classes
Then you perform a lookup using the value from the config file:
myClassName = types['extension1']
myClass = mappingDict[myClassName]
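Instantiating is then just a call on the looked-up class object (assuming its constructor takes no arguments):
foo = myClass()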
If module is the module the class named classname lives in, you can get the class object using
class_ = getattr(module, classname)
(If the class lives in the main module, use import __main__ to get a module object for this module.)
To look up the class in the current module's global scope, use
class_ = globals()[classname]
I think a static dictionary as in Matt's answer is the better solution.
I'd like to init a class from data stored in a simple Python file specified while calling the script. The config file named myconfig.py is:
str='home'
val=2
flt=7.0
I'd like to call it during class initialization like so. One of the objectives is to define variable types in the file as well. I know of configparser, but this method is less verbose if it can be made to work.
import imp
import os.path as osp
from sys import argv

class ClassInit(object):
    def __init__(self, configFile):
        fp, path, des = imp.find_module('', configFile)
        module = imp.load_module(configFile, fp, path, des)
        self.__dict__ = module.__dict__
        fp.close()

    def printVal(self):
        print '%s %0.2f' % (self.str, self.val)

if __name__ == '__main__':
    srcDir = 'src/'
    config = osp.join(srcDir, argv[0])  # config for current run
    ci = ClassInit(config)
    ci.printVal()
Is anything like this possible?
Well, there are several ways to do this. The easiest way would be to use eval() or exec to evaluate this code within the class scope. But that's also the most dangerous way, especially if these files can be created by someone other than you. In that case, the creator can write malicious code that can pretty much do anything. You can override the __builtins__ key of the globals dictionary, but I'm not sure if this makes eval/exec entirely safe. For example:
class ClassInit(object):
    def __init__(self, configFile):
        f = open(configFile)
        config = f.read()
        f.close()
        config_dic = {'__builtins__': None}
        exec config in config_dic
        for key, value in config_dic.iteritems():
            if key != '__builtins__':
                setattr(self, key, value)
This method kills the unsafe __builtins__ object, but it's still not quite safe. For instance, the file may be able to define a function which would override one of your class's functions with malicious code. So I really don't recommend it, unless you absolutely control those .py files.
A safer but more complex way would be to create a custom interpreter that interprets this file but doesn't allow running any custom code.
You can read the following thread, to see some suggestions for parsing libraries or other safer alternatives to eval():
Python: make eval safe
Besides, if all you ever need your config.py file for is to initialize some variables in a nice way, and you don't need to call fancy Python functions from inside it, you should consider using JSON instead. Python 2.6 and up includes the json module, which you can use to initialize an object from a file. The syntax is JavaScript rather than Python, but for initializing variables there's little difference.
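A sketch of that approach (assuming the config is rewritten as myconfig.json containing {"str": "home", "val": 2, "flt": 7.0}):
import json

class ClassInit(object):
    def __init__(self, configFile):
        with open(configFile) as f:
            self.__dict__.update(json.load(f))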
Can you try self.__dict__.update(configFile.__dict__)? I don't see why that wouldn't work.