Importing a function that uses module-level variables - python

Example:
$ cat m1.py
a = 1
def f():
    print a
$ cat m2.py
from m1 import f
a = 2
f()
I want python m2.py to print 2 when I run it, but it prints 1.
Do I have to make f take a as an argument or is there a better way of achieving this? I'm trying to code DRY and reuse the same function in a different "environment" this way. It would make sense to define a inside f, as well, if I could override it upon importing.
Another way I thought of is:
$ cat m1.py
a = 1
def make_f(a):
    def f():
        print a
    return f
f = make_f(a)
$ cat m2.py
from m1 import make_f
a = 2
f = make_f(a)
f()
This works as needed, but are there more concise ways?
Edit: Thanks for the answers so far; I don't think I can clarify anything by providing a more realistic example, but I'd say the reason I'm even asking this is that in my mind there is a distinction between the actual arguments of f (which it would use the same way in both modules) and the "environment" a, which should differ. Maybe I shouldn't really distinguish the two (judging by the need to use different values of a in different modules), but the distinction makes sense based on the meaning a bears.
Edit 2: I gave it another thought and concluded that I probably want to use a closure, the reason being that I don't want other functions in each module to have to supply a when calling f. This is, I guess, the observable, non-virtual distinction that is there.
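A minimal sketch of the closure approach the question settles on (Python 3 syntax; the names are illustrative):

```python
# make_f bakes the "environment" a into f at creation time, so other
# functions in the importing module can call f() without passing a.
def make_f(a):
    def f():
        return a  # a is captured from the enclosing scope
    return f

f = make_f(2)
print(f())  # prints 2
```

Each importing module calls make_f once with its own value and then uses the resulting f everywhere, which is exactly the "observable, non-virtual distinction" described above.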

(If I understand you properly) -- Try this:
#m2.py
from m1 import f
import m1
m1.a = 2
f()
However, I should mention that the very fact that you need to do this throws off all sorts of bells and whistles in my head -- This seems like a very bad design.

I'd imagine that your code is more complex than this example, so I'd recommend that you try using a class:
YourClass.py
class YourClass(object):
    def __init__(self, a=1):
        self.a = a
    def f(self):
        print self.a
YourOtherFile.py
from YourClass import YourClass
o = YourClass(a=2) # Without explicitly setting `a=2`, `a` defaults to `1`
o.f()
Without much more context, I can't offer any more advice.

Related

How to set a string before importing a module in python using cli?

code.py:
"""
Checks whether ABC is set.
If not set, sets "ABC = 10".
"""
try: ABC
except NameError: ABC = 10
print(ABC)
Outputs => "10"
cli:
python -c "ABC = 20; import code"
Expected to print "20", but it outputs "10".
Is there any possible way to fix this?
Global variables, despite their name, are not global to every part of your program. Each module has its own global namespace, and the same variable can exist, with different values, in different namespaces.
In your example, you're running a script given on the command line. That gets interpreted as the __main__ module in the interpreter. So in __main__, ABC is going to be equal to 20. But when the main module imports code, it has its own namespace. It doesn't see the __main__.ABC value that already exists, so it creates (and prints) its own ABC value.
As for "fixing" the code, I'm not sure it's worth trying. You could probably have code import __main__ and poke around in its namespace, but that seems like a lot of work for a sort of silly goal. There is almost certainly a better way to achieve whatever your actual end goal is (e.g. printing 10), and messing around with other modules' namespaces is unlikely to be it. I suspect this is an XY problem, so while I'm addressing your reasonable question about why the code you have behaves the way it does, I don't really think there's a reasonable fix.
I don't think it's possible to set a value before importing the module: an imported module does not necessarily share the same set of variables. The variables inside a module are essentially scoped within that module's own context.
You might be able to set a temporary environmental variable or use sys.argv to get arguments passed via command line but that's very limited (for example, you can't pass on a Python object).
I personally would use a class to achieve similar functions (however you do need to import it first). For example:
In code.py:
class SampleClassName():
    def __init__(self, ABC = 10) -> None:
        print(ABC)
        # The rest of your logic
Then, you can create an instance of this class using:
python3 -c "from code import SampleClassName; instance = SampleClassName(20)"
Notice that here, ABC = 10 defines the default value for ABC. If it's not set, it defaults to 10.
You might want to learn more about classes by reading the Python Docs

Python: how to get a function based on whether it matches an assigned string to it [duplicate]

I have a function name stored in a variable like this:
myvar = 'mypackage.mymodule.myfunction'
and I now want to call myfunction like this
myvar(parameter1, parameter2)
What's the easiest way to achieve this?
funcdict = {
    'mypackage.mymodule.myfunction': mypackage.mymodule.myfunction,
    ....
}
funcdict[myvar](parameter1, parameter2)
It's much nicer to be able to just store the function itself, since they're first-class objects in python.
import mypackage
myfunc = mypackage.mymodule.myfunction
myfunc(parameter1, parameter2)
But, if you have to import the package dynamically, then you can achieve this through:
mypackage = __import__('mypackage')
mymodule = getattr(mypackage, 'mymodule')
myfunction = getattr(mymodule, 'myfunction')
myfunction(parameter1, parameter2)
Bear in mind however, that all of that work applies to whatever scope you're currently in. If you don't persist them somehow, you can't count on them staying around if you leave the local scope.
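As a side note, `importlib.import_module` is the documented equivalent of the `__import__`/`getattr` dance, and it handles dotted module paths directly. A sketch using a stdlib module as a stand-in for the hypothetical package:

```python
import importlib

# "json" and "dumps" stand in for "mypackage.mymodule" and "myfunction".
module = importlib.import_module("json")
func = getattr(module, "dumps")
print(func([1, 2]))  # prints [1, 2]
```

Unlike `__import__("mypackage.mymodule")`, which returns the top-level package, `import_module` returns the leaf module, so only one `getattr` is needed.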
def f(a, b):
    return a + b
xx = 'f'
print eval('%s(%s,%s)'%(xx,2,3))
OUTPUT
5
Easiest
eval(myvar)(parameter1, parameter2)
You don't have a function "pointer". You have a function "name".
While this works well, you will have a large number of folks telling you it's "insecure" or a "security risk".
Why not store the function itself? myvar = mypackage.mymodule.myfunction is much cleaner.
import sys

modname, funcname = myvar.rsplit('.', 1)
getattr(sys.modules[modname], funcname)(parameter1, parameter2)
eval(compile(myvar,'<str>','eval'))(myargs)
compile(...,'eval') allows only a single statement, so that there can't be arbitrary commands after a call, or there will be a SyntaxError. Then a tiny bit of validation can at least constrain the expression to something in your power, like testing for 'mypackage' to start.
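A hedged sketch of that validation idea: compile in 'eval' mode (which rejects statements), then check the name against an allow-list before evaluating. `ALLOWED` and `resolve` are illustrative names, not part of any answer above:

```python
import math

# Only dotted names we explicitly trust may be resolved.
ALLOWED = {"math.sqrt", "math.floor"}

def resolve(name):
    if name not in ALLOWED:
        raise ValueError("not an allowed function: %r" % name)
    # 'eval' mode raises SyntaxError on statements or multiple commands.
    code = compile(name, "<str>", "eval")
    return eval(code)

print(resolve("math.sqrt")(9.0))  # prints 3.0
```

The allow-list does the real security work here; `compile(..., 'eval')` only guarantees the string is a single expression.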
I ran into a similar problem while creating a library to handle authentication. I want the app owner using my library to be able to register a callback with the library for checking authorization against LDAP groups the authenticated person is in. The configuration is getting passed in as a config.py file that gets imported and contains a dict with all the config parameters.
I got this to work:
>>> class MyClass(object):
...     def target_func(self):
...         print "made it!"
...
...     def __init__(self, config):
...         self.config = config
...         self.config['funcname'] = getattr(self, self.config['funcname'])
...         self.config['funcname']()
...
>>> instance = MyClass({'funcname':'target_func'})
made it!
Is there a pythonic-er way to do this?

Override globals in function imported from another module

Let's say I have two modules:
a.py
value = 3
def x():
    return value
b.py
from a import x
value = 4
My goal is to use the functionality of a.x in b, but change the value returned by the function. Specifically, value will be looked up with a as the source of global names even when I run b.x(). I am basically trying to create a copy of the function object in b.x that is identical to a.x but uses b to get its globals. Is there a reasonably straightforward way to do that?
Here is an example:
import a, b
print(a.x(), b.x())
The result is currently 3 3, but I want it to be 3 4.
I have come up with two convoluted methods that work, but I am not happy with either one:
Re-define x in module b using copy-and paste. The real function is much more complex than shown, so this doesn't sit right with me.
Define a parameter that can be passed in to x and just use the module's value:
def x(value):
    return value
This adds a burden on the user that I want to avoid, and does not really solve the problem.
Is there a way to modify where the function gets its globals somehow?
I've come up with a solution through a mixture of guess-and-check and research. You can do pretty much exactly what I proposed in the question: copy a function object and replace its __globals__ attribute.
I am using Python 3, so here is a modified version of the answer to the question linked above, with an added option to override the globals:
import copy
import types
import functools
def copy_func(f, globals=None, module=None):
    """Based on https://stackoverflow.com/a/13503277/2988730 (@unutbu)"""
    if globals is None:
        globals = f.__globals__
    g = types.FunctionType(f.__code__, globals, name=f.__name__,
                           argdefs=f.__defaults__, closure=f.__closure__)
    g = functools.update_wrapper(g, f)
    if module is not None:
        g.__module__ = module
    g.__kwdefaults__ = copy.copy(f.__kwdefaults__)
    return g
b.py
from a import x
value = 4
x = copy_func(x, globals(), __name__)
The __globals__ attribute is read-only, which is why it must be passed to the constructor of FunctionType. The __globals__ reference of an existing function object can not be changed.
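The core of the trick can be seen in isolation: rebuilding a function with `types.FunctionType` and a different globals dict changes which `value` its body resolves (a minimal sketch, Python 3 syntax):

```python
import types

value = 3
def x():
    return value

# Same code object, different globals: y looks up `value` elsewhere.
y = types.FunctionType(x.__code__, {"value": 4}, name=x.__name__)
print(x(), y())  # prints 3 4
```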
Postscript
I've used this enough times now that it's implemented in a utility library I wrote and maintain called haggis. See haggis.objects.copy_func.
So I found a way to (sort of) do this, although I don't think it entirely solves your problems. Using inspect, you can access the global variables of the file calling your function. So if you set up your files like so:
a.py
import inspect
value = 3
def a():
    return inspect.stack()[1][0].f_globals['value']
b.py
from a import a
value = 5
print(a())
The output is 5, instead of 3. However, if you imported both of these into a third file, it would look for the globals of the third file. Just wanted to share this snippet however.
I had the same problem. But then I remembered eval was a thing.
Here's a much shorter version (if you don't need arguments):
b.py:
from a import x as xx
# Define globals for the function here
glob = {'value': 4}
def x():
    return eval(xx.__code__, glob)
Hopefully after 2 years it'll still be helpful

How to get module variable in function from another module?

I'd like to define a helper function that has the ability to modify a module-level variable (with known name) from surrounding context without explicitly passing it, e.g.
# mod1.py
mod_var = 1
modify_var()
# mod_var modified
print mod_var
The problem is - I can't reference variable by mod1.mod_var, because I want to use helper function across many modules (helper itself will be defined in other module); it should dynamically 'pick' mod_var from surrounding calling context/scope.
Is this possible? How to obtain this?
My use case is to enhance defining URL -> view mapping in Django. Those definitions are spread across many sub-modules that define urlpatterns module-level variable. Helper function should pick this variable from the module that calls it and modify it. Avoiding explicitly passing it as argument would be great.
Edit:
For additional solution - check this answer.
Edit2:
Wrong solution below! (left for references in comments)
Recently I've found another solution (the least magical in my opinion ;))
modify_var() function could be implemented like this:
def modify_var():
    calling_module = __import__("__main__")
    calling_module.mod_var = 42
Still, potential profits are arguable.
unittest module uses this technique in its main method.
It's a truly bad, horrible, and awful idea, which will lead to future maintenance nightmares. However, Python does offer "enough rope to shoot yourself in the foot", if you truly insist: introspection and metaprogramming tools which are mostly intended for debugging purposes, but can be abused to perform the ill-conceived task you so desperately crave.
For example, in evil.py:
import inspect
def modify_var():
    callersframe = inspect.stack()[1][0]
    callersglobals = callersframe.f_globals
    if 'mod_var' not in callersglobals:
        raise ValueError, 'calling module has no "mod_var"!'
    callersglobals['mod_var'] += 1
now say you have two modules, a.py:
import evil
mod_var = 23
evil.modify_var()
print 'a mod_var now:', mod_var
and b.py:
import evil
mod_var = 100
evil.modify_var()
print 'b mod_var now:', mod_var
you could do:
$ python -c'import a; import b'
a mod_var now: 24
b mod_var now: 101
However, maintaining this kind of black-magic tricks in the future is going to be a headache, so I'd strongly recommend not doing things this way.
What you want to do sounds like too much magic. Pass in urlpatterns and be done with it. Explicit is better than implicit.
OK, here's the magic, but again, I recommend not using it:
import sys
def modify_var():
    """Mysteriously change `mod_var` in the caller's context."""
    f = sys._getframe(1)
    f.f_locals['mod_var'] += " (modified)"
mod_var = "Hello"
modify_var()
print mod_var
prints:
Hello (modified)
As a further warning against this technique: _getframe is one of those functions that other implementations of Python don't provide, and the docs include this sentence: "This function should be used for internal and specialized purposes only."
If you really want to do that then you'll need to import mod1 in either the other module or directly in the function, and then modify it off that import. But don't do that; seasoned programmers will point and laugh.

cross module variable

from here I got an idea about how using variables from other modules. this all works fine with
import foo as bar
But I don't want to import my modules as "bar" I want to use it without any prefix like
from foo import *
Using this, it's impossible to modify variables from other modules; reading will work. Any ideas or suggestions?
Short answer: No, it's impossible, and you'll have to use a prefix.
It's important to understand that from foo import x, y is copying x to your namespace. It's equivalent to:
import foo
# COPY TO YOUR NAMESPACE
x = foo.x
y = foo.y
# `from foo import` does NOT leave `foo` in your namespace
del foo
This way, each module will get a local copy of x and y. Changing x won't be seen in other modules, and you won't see changes other modules do :-(
To change the central copy of a variable you must import the module itself: import foo and change foo.x. This way only one copy exists and everybody is accessing it :-)
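The copy-vs-reference difference can be demonstrated with an in-memory module, so the example is self-contained (in real code `shared` would be a .py file):

```python
import types

shared = types.ModuleType("shared")
shared.x = 1

x = shared.x      # what `from shared import x` effectively does
shared.x = 2      # update the central copy

print(x)          # prints 1 -- the local copy is stale
print(shared.x)   # prints 2 -- attribute access sees the update
```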
[The linked questions also mention the possibility to put the shared variable in the module builtin. DON'T! This would eliminate the prefix for reading it, but not for writing, and is extremely bad style.]
A note in defense of Python
If you resent the need to use a foo. prefix here, you'll probably also resent the need for the self. prefix to access object variables. The bottom line is that's how Python works - since you don't declare variables, there is no choice but to use prefixes.
But there is also an upside: when reading Python code, you easily see where each variable lives. IMHO that's very good.
Supporting evidence: in other languages like C++/Java, many people observe conventions like an m_ prefix on all object variable names to achieve a similar effect...
Style remarks
You don't need import foo as bar, just use import foo.
The as form doesn't do anything new, it just renames it, which is just confusing.
It's only useful if "foo" is a very long name; a particularly accepted case is when "foo" lives deep in some package, so you can do import long.package.foo as foo.
The from foo import * is considered very bad style in programs because:
The person reading your code won't know where names came from.
It pollutes your namespace, which can lead to subtle bugs when names from
different modules clash.
The explicit form from foo import x, y is OK, but starts suffering from the same problems if you use many names from the module.
In such cases, it's best to import foo and explicitly write foo.x, foo.y.
Bottom line: when in doubt, a simple import foo is best.
Exception: It is very handy to use import * when experimenting at the interactive interpreter. Note however that it doesn't play well with reload(), so don't use it when debugging changing code. (To debug a module you are writing, it's best to launch a fresh interpreter inside the module's namespace - python -i mymodule.py / F5 in IDLE.)
As far as I know, there is no way to import a value from a module and have it readable and writable by the importing scope. When you just import foo in Python, it creates a module object named foo. Getting and setting attributes on a module object will change them in the module's scope. But when you from foo import something, foo is imported and a module object is created, but is not returned. Instead, Python copies the values you specified out of foo and puts them in the local scope. If what you are importing is an immutable type like int or str, then changing it and having the changes reflect in the foo module is impossible. It's similar to this:
>>> class N(object):
...     def __init__(self, value):
...         self.value = value
>>> n = N(3)
>>> value = n.value
>>> print value, n.value
3 3
>>> value = 4
>>> print value, n.value
4 3
Excepting crude hacks, if you really want to be able to modify the module's variable, you will need to import the module itself and modify the variable on the module. But generally, having to do this is indicative of bad design. If you are the writer of the foo module in question, you may want to look at some other, more Pythonic ways to solve your problem.
With `from bar import foo` you effectively import foo as a constant, not a variable: rebinding your local name won't affect bar.
from foo import * is frowned upon (by me, by Google's style guide, by the OLPC style guide - which you should see, as it has the best explanations of why this is bad - but not by PEP-8, unfortunately). - it makes for unreadable code.
Consider:
from foo import *
from bar import *
from baz import *
dostuff()
If you have an error running dostuff(), where do you look for the problem? It could have come from any of those imports.
For readable, maintainable code, stick with from foo import bar. For readable, modular, maintainable code, don't hack with globals - extend bar (by subclassing, if you can't change the upstream source) to expose methods for modifying the values you need to access.