I have two python files. From python file #1, I want to check to see if there is a certain global variable defined in python file #2.
What is the best way to do this?
You can directly test whether the file2 module (which is a module object) has an attribute with the right name:
import file2
if hasattr(file2, 'varName'):
    # varName is defined in file2…
This may be more direct and legible than the try… except… approach (depending on how you want to use it).
try:
    from file import varName
except ImportError:
    print('var not found')
Alternatively you could do this (if you already imported the file):
import file
# ...
try:
    v = file.varName
except AttributeError:
    print('var not found')
This will work only if the var is global. If you are after scoped variables, you'll need to use introspection.
With the getattr() built-in function you can also specify a default value:
import file2
myVar = getattr(file2, 'varName', False)
See the documentation
Related
I would like to know if it is possible to import some variables from another python file using importlib.import_module function (or another similar). I need to use importlib because in my project I employ a variable for the name of the file. For instance, if the name of the file was explicitly given - let this be myfile - I would use
from myfile import a, b
to import variables a and b.
But for the case of myfile being a variable for the file name, I was thinking that I could use something like
import importlib
a = importlib.import_module(f"{myfile}.a")
However this seems to only work for the case of myfile being a variable for the name of a package and not a variable for the name of a file. If it is the case of the latter and the name of the file is for instance "foo", the error
No module named 'foo.a'; 'foo' is not a package
happens. Also, I did not find a way to import the additional variable b through the importlib.import_module function. Is there any importlib function that could do this?
We ended up using something like
imported_function = getattr(import_module(module_name), function_name)
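Applied to the question above, a minimal sketch (assuming the module name is held in a variable, here myfile, and that the module defines a and b at top level):
from importlib import import_module

myfile = "foo"               # hypothetical module name held in a variable
mod = import_module(myfile)  # import the module itself, not "foo.a"
a = getattr(mod, "a")        # then pull individual names out as attributes
b = getattr(mod, "b")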
I can import a python script using import_module. But, how can I call a function stored as a variable from that script? I've previously used getattr to work with dictionaries stored as variables, but I don't think this same method works with functions. Here's an example that does not currently work:
from importlib import import_module
file_list = ['file1','file2']
func_list = ['func1','func2']
for file in file_list:
    test_file = import_module(file)
    for func in func_list:
        from test_file import func
file1:
def func1():
    ...
def func2():
    ...
file2:
def func1():
    ...
def func2():
    ...
I can import a python script using import_module.
When you do this, the result is a module object - just the same as an import statement provides.
from test_file import func
The reason this doesn't work is that the statement looks for a module literally named test_file on the import search path; it knows nothing about your local variable names.
Fortunately, since you already have the module object, you presumably realized you could access the contents normally, as attributes, e.g. test_file.func.
I've previously used getattr to work with dictionaries stored as variables, but I don't think this same method works with functions
I'm not quite sure what you mean here. Attributes are attributes, whether they're plain data, functions, classes or anything else. test_file is a thing that has a func attribute, therefore getattr(test_file, 'func') gets that attribute.
The remaining issue is the variable-variables problem - you don't really want to be creating a name for that result dynamically. So yes, you can store that in a dict, if you want. But frankly it's easier to just use the module object. Unless perhaps for some reason you need/want to "trim" the contents and only expose a limited interface (for some other client); but you can't avoid loading the whole module. from X import Y does that anyway.
The module object that you got from the dynamic import is already working as a namespace, which you need here anyway because you're importing multiple modules that have overlapping attribute names.
tl;dr: if you want to call a function from that imported module, just do it the same way that you would have if you had imported the module (not a name from that module) normally. We can, for example, put the imported modules in a list:
modules = [import_module(f) for f in filenames]
and then call the appropriate method by looking it up within the appropriate module object:
modules[desired_module_id].desired_func()
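Putting that together for the lists in the question (file1/file2 and func1/func2 are the question's names), one possible sketch:
from importlib import import_module

file_list = ['file1', 'file2']
func_list = ['func1', 'func2']

# map each module name to its imported module object
modules = {name: import_module(name) for name in file_list}

# look each function up by name with getattr and call it
for mod_name, mod in modules.items():
    for func_name in func_list:
        getattr(mod, func_name)()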
Basically, you would run this code in a separate file, and where it says the_file_where_this_is_needed.py you would put the file in which you want these import statements to appear (you could probably also run this code in that very file). It is sort of like hardcoding the imports, but automatically.
file_list = ['file1', 'file2']
func_list = ['func1', 'func2']
with open('the_file_where_this_is_needed.py', 'r') as f:
    data = f.read()

string = ''
for file in file_list:
    for func in func_list:
        string += f'from {file} import {func}\n'

data = string + data

with open('the_file_where_this_is_needed.py', 'w') as f:
    f.write(data)
I want to import some package depending on which value the user chooses.
The default is file1.py:
from files import file1
If user chooses file2, it should be :
from files import file2
In PHP, I can do this using variable variables:
$file_name = 'file1';
include($$file_name);
$file_name = 'file2';
include($$file_name);
How can I do this in Python?
Python doesn't have a feature that's directly equivalent to PHP's "variable variables". To get a "variable variable"'s value (or the value of any other expression) you can use the eval function.
foo = "Hello World"
print(eval("foo"))
However, this can't be used in an import statement.
It is possible to use the __import__ function to import using a variable.
package = "os"
name = "path"
imported = getattr(__import__(package, fromlist=[name]), name)
is equivalent to
from os import path as imported
Old thread, but I needed the answer, so someone else still might...
There's a cleaner way to do this in Python 2.7+:
import importlib
my_module = importlib.import_module("package.path.%s" % module_name)
As Fredrik Lundh states:
Anyway, here’s how these statements and functions work:
import X imports the module X, and creates a reference to that module
in the current namespace. Or in other words, after you’ve run this
statement, you can use X.name to refer to things defined in module X.
from X import * imports the module X, and creates references in the
current namespace to all public objects defined by that module (that
is, everything that doesn’t have a name starting with “_”). Or in
other words, after you’ve run this statement, you can simply use a
plain name to refer to things defined in module X. But X itself is not
defined, so X.name doesn’t work. And if name was already defined, it
is replaced by the new version. And if name in X is changed to point
to some other object, your module won’t notice.
from X import a, b, c imports the module X, and creates references in
the current namespace to the given objects. Or in other words, you can
now use a and b and c in your program.
Finally, X = __import__('X') works like import X, with the difference
that you 1) pass the module name as a string, and 2) explicitly assign
it to a variable in your current namespace.
And by the way, that last method is the one you're interested in.
Simply write (for example):
var = "datetime"
module = __import__(var)
Building on mattjbray's answer:
from importlib import import_module
# lookup in a set is in constant time
safe_names = {"file1", "file2", "file3", ...}  # module names, not .py file names
user_input = ...
if user_input in safe_names:
    file = import_module(user_input)
else:
    print("Nope, not doing this.")
Saves a few lines of code, and allows you to set safe_names programmatically, or load multiple modules and assign them to a dict, for example.
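For instance, a minimal sketch of the dict variant mentioned above (the module names are placeholders):
from importlib import import_module

safe_names = {"file1", "file2"}  # hypothetical module names
loaded = {name: import_module(name) for name in safe_names}

# later, pick one by name
module = loaded["file1"]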
It's probably a very bad idea to let the user choose what to import. Packages can execute code on import, so you're effectively allowing a user to arbitrarily execute code on your system! Much safer to do something like
if user_input == 'file1.py':
    from files import file1 as file
elif user_input == 'file2.py':
    from files import file2 as file
else:
    file = None
    print("Sorry, you can't import that file")
I would like to load a .py file at runtime. This .py file is basically a config file with the following format:
var1=value
var2=value
predicate_function = lambda line: <return True or False>
Once this file is loaded, I would like to be able to access var1, var2 and predicate_function. For each line, I'll pass it to the predicate function, and if it returns false, I'll ignore it.
In any case, I'm not sure how to load a python file at runtime and access its variables.
Clarification: there may be any number of these config files that I need to pass to the main program and I won't know their names until runtime. Google tells me I should use __import__. I'm not sure how to correctly use that method and then access the variables of the imported file.
As written in the official Python documentation, if you just want to import a module by name, you can look it up in the sys.modules dictionary after using __import__.
Supposing your configuration is in myproject.mymodule, you would do it like this:
module_name = 'myproject.mymodule'
import sys
__import__(module_name)
mymodule = sys.modules[module_name]
# Then you can just access your variables and functions
print(mymodule.var1)
print(mymodule.var2)
# etc...
You can also use the return value of __import__, but then you will have to fully understand how Python handles namespaces and scopes.
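A minimal illustration of that caveat, reusing the myproject.mymodule layout from above: for a dotted name, __import__ returns the top-level package, not the submodule.
top = __import__('myproject.mymodule')  # returns the package 'myproject'
mymodule = top.mymodule                 # the submodule is reachable as an attribute
print(mymodule.var1)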
You just need to be able to dynamically specify the imports and then dynamically get at the variables.
Let's say your config file is bar.py and looks like this:
x = 3
y = 4
def f(x): return (x<4)
Then your code should look like this:
import sys
# somehow modnames should be a list of strings that are the names of config files
#
# you can do this more dynamically depending on what you're doing
modnames = ['bar']
for modname in modnames:
    exec('import %s' % modname)

for modname in modnames:
    mod = sys.modules[modname]
    for k in mod.__dict__:
        if k[:2] != '__':
            print(modname, k, mod.__dict__[k])
I get this output:
bar f <function f at 0x7f2354eb4cf8>
bar x 3
bar y 4
Then you at least have all the variables and functions. I didn't quite get what you wanted from the predicate functions, but maybe you can get that on your own now.
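For the predicate part, a minimal sketch reusing the bar module from above (the input values are hypothetical):
bar = sys.modules['bar']
values = [1, 3, 5, 7]                    # hypothetical inputs to filter
kept = [v for v in values if bar.f(v)]   # keep only items the predicate accepts
print(kept)                              # [1, 3], since f(x) is x < 4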
To access another Python module, you import it. execfile has been mentioned by a couple people, but it is messy and dangerous. execfile clutters your namespace, possibly even messing up the code you are running. When you want to access another Python source file, use the import statement.
Even better would be not to use a Python file for configuration at all, but rather to use the built-in ConfigParser module or a serialization format like JSON. That way your configuration files don't allow execution of arbitrary (possibly malicious) code, don't require people to know Python to configure your program, and can easily be altered programmatically.
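For example, a minimal sketch using the standard-library configparser module (named ConfigParser in Python 2); the settings.ini file name, section and keys are hypothetical:
import configparser

config = configparser.ConfigParser()
config.read('settings.ini')

var1 = config['main']['var1']            # values come back as strings
var2 = config.getfloat('main', 'var2')   # typed accessors are also available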
If the imported module is on the regular search path, you can use __import__.
If you need to load the module from an arbitrary path in the filesystem, use imp.load_module.
Be sure to consider the security implications of loading arbitrary user-specified code.
In Python 2.*, execfile works (I recommend passing a specific dictionary and accessing the variables from there -- as the note in the docs says, execfile can't affect the calling function's locals() dictionary).
In Python 3.*, execfile has been removed, so do, instead:
somedict = {}
with open('thefile.py') as f:
    exec(f.read(), somedict)
Since the Python version hasn't been clearly mentioned, it is worth pointing out that the imp module has been deprecated in newer Python versions in favor of the importlib module.
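A minimal sketch of the importlib-based replacement for loading a module from an arbitrary path (the path is hypothetical):
import importlib.util

spec = importlib.util.spec_from_file_location('config', '/path/to/config.py')
config = importlib.util.module_from_spec(spec)
spec.loader.exec_module(config)
print(config.var1)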
I'm kinda late to the party, but I want to present an alternative answer nonetheless.
If you want to import code without affecting the global module namespace, you can create an anonymous module (using types.ModuleType) and load arbitrary code in it (using compile and exec). For instance, like this:
import types

filename = "/path/to/your/file.py"
with open(filename) as fp:
    code = compile(fp.read(), filename, "exec")
config_module = types.ModuleType("<config>")
exec(code, config_module.__dict__)
You can then access the variables as config_module.var1, &c.
If you want a configuration file that will only be edited by the user when the program isn't running, just import it as a normal Python file,
i.e.
main.py:
import config
print(config.var1)
config.py:
var1 = "var12"
var2 = 100.5
Try the imp module: http://docs.python.org/library/imp.html