I have different config files, which are basically Python files that define variables, and I want to import them into my main program.
Usually I will have a "config_file1.py" and do something like:
import config_file1 as params
# and then I can access the different parameters as
params.var1
params.var2
Now I want to be able to select which config_file I want to use, so I want to pass a parameter when calling my program that tells which config file to use. Something like:
config_filename = sys.argv[1]
import config_filename as params
However, this tries to import a module literally named "config_filename".
I want instead to import the module referenced by the value of config_filename.
EDIT:
Basically my program will run a set of experiments, and those experiments need a set of parameters to run.
Eg:
**** config1.py ****
num_iterations = 100
initial_value1 = 10
initial_value2 = 20
So I can run my program loading those variables into memory.
However another config file (config2.py) might have another set of parameters, so I want to be able to select which experiment I want to run by loading the desired config file.
If you really want to do this, you can use the importlib module:
import sys
import importlib

params = importlib.import_module(sys.argv[1])
Then you can use the variables like this:
params.var1
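For instance, with the config1.py from the question importable from the working directory, a minimal sketch of the calling program (the file name main.py is an assumption) could look like this:
# main.py -- minimal sketch; assumes config1.py sits next to it
import importlib
import sys

params = importlib.import_module(sys.argv[1])  # e.g. run: python main.py config1
print(params.num_iterations)  # 100
print(params.initial_value1)  # 10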
This is in response to your details and not the question.
If you want to load variables such as num_iterations = 100, initial_value1 = 10, initial_value2 = 20 from a file, then I'd really recommend some sort of config instead of abusing imports for global variables.
JSON would be the easiest way: you load the file and you straight up get a dict:
>>> import json
>>> with open('config_file1.json') as f:
...     params = json.load(f)
>>> params
{'num_iterations': 100, 'initial_value1': 10, 'initial_value2': 20}
Alternatively you could use ConfigParser, which looks nicer, but I've found it to be quite prone to breaking.
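For reference, here is a minimal ConfigParser sketch; the file name config1.ini and the section name experiment are assumptions:
# config1.ini (hypothetical):
#     [experiment]
#     num_iterations = 100
import configparser

parser = configparser.ConfigParser()
parser.read('config1.ini')
# options are read back as strings, hence getint() for numeric values
num_iterations = parser.getint('experiment', 'num_iterations')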
You can do it like this:
import sys

config_filename = sys.argv[1]
params = __import__(config_filename)
I wouldn't recommend such a risky approach. You relinquish all control at the point of reading sys.argv, and your script can fail if any of the named attributes doesn't exist within the module.
Instead I would suggest explicitly controlling what are the supported modules being passed in:
import sys

config_filename = sys.argv[1].lower()

if config_filename == 'os':
    import os as params
elif config_filename == 'shutil':
    import shutil as params
else:  # unhandled modules
    raise ImportError("Unknown module")
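If the set of supported configs grows, the same whitelist idea scales better as a lookup followed by a dynamic import; a sketch assuming the config module names from the question:
import importlib
import sys

ALLOWED_CONFIGS = {'config1', 'config2'}  # hypothetical whitelist

config_filename = sys.argv[1]
if config_filename not in ALLOWED_CONFIGS:
    raise ImportError("Unknown module: %s" % config_filename)
params = importlib.import_module(config_filename)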
Instead of using the import statement, you can use the __import__ function, like this:
params = __import__('module_name')
Definition:
importlib.__import__(name, globals=None, locals=None, fromlist=(), level=0)
Reference:
https://docs.python.org/3/library/importlib.html#importlib.__import__
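One caveat worth knowing: for a dotted name, __import__ returns the top-level package unless you pass a non-empty fromlist, which is one reason importlib.import_module is usually preferred:
import importlib

pkg = __import__('os.path')                 # returns the os package, not os.path
mod = __import__('os.path', fromlist=[''])  # returns os.path itself
mod2 = importlib.import_module('os.path')   # clearer equivalent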
Related
I want to pass arguments to a Python module to help me decide whether or not to execute some part of the module's initialisation code.
Suppose I have a Python module named my_module:
import sys

flag = sys.argv[1]
if flag:
    pass  # do something
else:
    pass  # do something else

def module_hello():
    print("hello")
However, I don't want a user script to interfere with the arguments; it has to be purely based on the parameters passed while spawning. In the environment where this will be used, I control the spawning of the script, but the script itself is provided by the user of the module.
Say a user writes a script which imports this module:
sys.argv[1] = "Change to something unpleasant"
import my_module
I don't want user to have control over sys.argv. The CLI arguments passed to the script should go to the module unharmed.
Is there a way to achieve this?
If you want to set some global values for a module, you should probably consider encapsulating them in a class, or setting them by calling an initialisation function, so you can pass the parameters like that.
main.py:
import my_module
my_module.init('some values')
my_module.py:
VALUES = None

def init(values):
    global VALUES
    VALUES = values
But why not simply declare some variables in the module and just set the value when you load it?
main.py:
import my_module
my_module.values = 'some values'
my_module.py:
values = None
Or if you just want to read the arguments, it's like any other script:
main.py:
import my_module
my_module.py:
import sys
values = sys.argv[1]
Of course you can get as fancy as you like; see https://docs.python.org/3/library/argparse.html
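For example, a small sketch of the same module using argparse instead of raw sys.argv (the argument name values is made up):
# my_module.py -- sketch; the 'values' argument name is an assumption
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('values')
args, _unknown = parser.parse_known_args()  # tolerates extra args meant for the host script
values = args.values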
So, read the arguments in your module.
my_module.py (note: a hyphenated name like my-module.py cannot be imported)
import sys

# Assign my module properties
is_debug = sys.argv[1]
I'm trying to dynamically import a python-based SQL query module from a sub-folder, and that folder is obtained by using the argparse module.
My project structure :
main_file.py
Data_Projects/
ITA_data/
__init__.py
sqlfile.py
UK_data/
__init__.py
sqlfile.py
Within main_file.py, argparse defines the argument 'dir', containing the location of the specified directory, i.e.:
parser.add_argument('--dir', default='Data_Projects/ITA_data/', type=str,
                    help="set the data directory")
My understanding thus far is that modules should be imported at the top and to import just one sql query I would use:
from Data_Projects.ITA_data import sqlfile
I know I can't set the import statement after the args have been defined, so how can I keep the format correct with the imports at the top, and yet retrospectively update this with the arguments that get defined afterwards?
Many thanks.
UPDATE
Thanks to the answer below, I've now tried to assign:
sqlfile = __import__(in_arg.dir + 'sqlfile.py')
However I'm getting the following error:
*** ModuleNotFoundError: No module named 'Data_Projects/ITA_data/sqlfile'
I've tried using things like
os.path.join(Path(__file__).resolve().parents[0], in_arg.dir + 'sqlfile')
If it helps, when I try just:
__import__('Data_Projects') - works fine
__import__('Data_Projects/ITA_data') - doesn't work - ModuleNotFound
And as a check to verify I'm not crazy:
os.path.exists('Data_Projects/ITA_data/sqlfile.py') >>> True
os.path.exists(in_arg.dir + 'sqlfile.py') >>> True
I don't see anything wrong with:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('dir', choices=['UK', 'ITA'])
args = parser.parse_args()

if args.dir == 'UK':
    import UK_data as data
elif args.dir == 'ITA':
    import ITA_data as data
else:
    ...
You could refine this with functions and __name__ etc. But a conditional import is ok, just so long as it occurs before the data module is used.
You can use the __import__(name) function instead of the import statement; note that it takes a module name, not a file name. It does the same:
# option 1
import foo as bar
# option 2
bar = __import__('foo')
If you need to import from elsewhere, you need to add your directory to the module search path. There are several ways to achieve that, depending on your version of Python; you can find them all in this great post:
How to import a module given the full path?
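In this particular case, one way to bridge the argparse value and the import system is to turn the directory argument into a dotted module name; a sketch assuming in_arg.dir holds 'Data_Projects/ITA_data/' and the packages are importable from the working directory:
import importlib

# 'Data_Projects/ITA_data/' -> 'Data_Projects.ITA_data.sqlfile'
module_name = in_arg.dir.strip('/').replace('/', '.') + '.sqlfile'
sqlfile = importlib.import_module(module_name)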
The issue was resolved by using:
import os
import sys

sys.path.insert(0, os.getcwd() + "/" + in_arg.dir)
This prepends the directory I want (which changes depending on the argument) to sys.path, the list of paths Python searches for modules.
From there, using grapes' help, it was a case of doing:
sqlfile = __import__('sqlfile')
And from there I could use the variable to perform the relevant sql query.
As described in this answer to "How to import a module given the full path?", one can import a module located at another path this way:
import sys
sys.path.append('PathToModule')
import models.user
My question is:
How can I execute this other module (and also pass parameters to it), if this other module is setup this way:
if __name__ == '__main__':
    do_something()
and do_something() uses argparse.ArgumentParser to work with the parameters supplied?
I ADDED THE FOLLOWING AFTER THE FIRST QUESTIONS/COMMENTS CAME UP
I am able to pass the parameters via
sys.argv[1:] = [
    "--param1", "123",
    "--param2", "456",
    "--param3", "111"
]
so this topic is already covered.
Why do I want to call another module with parameters?
I would like to be able to do a kind of small regression test for another project. I would like to get this other project via a git clone and have different versions locally available that I can debug, too, if needed.
But I do not want to be involved too much in that other project (so that forking does not make sense).
AND SO MY REMAINING QUESTION IS
How can I tweak the contents of __name__ when calling the other module?
There are multiple ways to approach this problem.
If the module you want to import is well-written, it should have separate functions for parsing the command line arguments and for actually doing work. It should look something like this:
def main(arg1, arg2):
    pass  # do something

def parse_args():
    parser = argparse.ArgumentParser()
    ...  # lots of code
    return vars(parser.parse_args())

if __name__ == '__main__':
    args = parse_args()
    main(**args)
In this case, you would simply import the module and then call its main function with the correct arguments:
import yourModule
yourModule.main('foo', 'bar')
This is the optimal solution.
If the module doesn't define such a main function, you can manually set sys.argv and use runpy.run_module to execute the module:
import runpy
import sys
sys.argv[1:] = ['foo', 'bar']
runpy.run_module('yourModule', run_name='__main__', alter_sys=True)
Note that this only executes the module; it doesn't import it. (I.e. the module won't be added to sys.modules and you don't get a module object that you can interact with.)
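If you do need results back, run_module returns the executed module's globals as a dict, so a sketch like this can still recover values (the global name result_value is hypothetical):
import runpy
import sys

sys.argv[1:] = ['foo', 'bar']
globals_dict = runpy.run_module('yourModule', run_name='__main__', alter_sys=True)
result = globals_dict.get('result_value')  # hypothetical global set by yourModule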
To illustrate what I am trying to do, let's say I have a module testmod that lives in ./testmod.py. The entire content of this module is:
x = test
I would like to be able to successfully import this module into Python, using any of the tools available in importlib or any other built in library.
Obviously doing a simple import testmod statement from the current directory results in an error: NameError: name 'test' is not defined.
I thought that maybe passing either globals or locals to __import__ correctly would modify the environment inside the script being run, but it does not:
>>> testmod = __import__('testmod', globals={'test': 'globals'}, locals={'test': 'locals'})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/jfoxrabi/testmod.py", line 1, in <module>
x = test
NameError: name 'test' is not defined
I was setting the value of test differently so I could see which dict testmod.x came from if this worked.
Since neither of these seems to work, I am stuck. Is it even possible to accomplish what I am trying to do? I would guess that yes, since this is Python, not Sparta.
I am using Python 3.5 on Anaconda. I would very much prefer not to use external libraries.
Update: The Why
I am importing a module into my program as a configuration file. The reason that I am not using JSON or INI is that I would like to have the full scope of Python's interpreter available to compute the values in the config from expressions. I would like to have certain values that I compute before-hand in the program available to do those calculations.
While I am aware of the fact that this is about as bad as calling eval (I do that too in my program), I am not concerned with the security aspect for the time being. I am, however, quite willing to entertain better solutions should this indeed turn out to be a case of XY.
I came up with a solution based on this answer and the importlib docs. Basically, I have access to the module object before its code is executed, by using the correct sequence of calls to importlib:
from importlib.util import spec_from_file_location, module_from_spec
from os.path import splitext, basename
def loadConfig(fileName):
    test = 'This is a test'
    name = splitext(basename(fileName))[0]
    spec = spec_from_file_location(name, fileName)
    config = module_from_spec(spec)  # module object created, but its code not yet run
    config.test = test               # inject the name before execution
    spec.loader.exec_module(config)  # run the module's code; 'test' is now visible in it
    return config
testmod = loadConfig('./testmod.py')
This is a bit better than modifying builtins, which may have unintended consequences in other parts of the program, and may also restrict the names I can pass in to the module.
I decided to put all the configuration items into a single field accessible at load time, which I named config. This allows me to do the following in testmod:
if 'test' in config:
    x = config['test']
The loader now looks like this:
from importlib.util import spec_from_file_location, module_from_spec
from os.path import splitext, basename
def loadConfig(fileName, **kwargs):
    name = splitext(basename(fileName))[0]
    spec = spec_from_file_location(name, fileName)
    config = module_from_spec(spec)
    config.config = kwargs  # all injected values live in one 'config' dict
    spec.loader.exec_module(config)
    return config
testmod = loadConfig('./testmod.py', test='This is a test')
After finding myself using this a bunch of times, I finally ended up adding this functionality to the utility library I maintain, haggis. haggis.load.load_module loads a text file as a module with injection, while haggis.load.module_as_dict does a more advanced version of the same that loads it as a potentially nested configuration file into a dict.
You could screw with Python's builtins to inject your own fake built-in test variable:
import builtins # __builtin__, no s, in Python 2
builtins.test = 5 # or whatever other placeholder value
import testmod
del builtins.test # clean up after ourselves
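If you take this route, a try/finally block makes the cleanup run even when the import itself raises; a small sketch:
import builtins

builtins.test = 5
try:
    import testmod
finally:
    del builtins.test  # removed even if the import fails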
I am relatively new to Python. I am looking to create a "settings" module where various application-specific constants will be stored.
Here is how I am wanting to set up my code:
settings.py
CONSTANT = 'value'
script.py
import settings
def func():
    var = CONSTANT
    # do some more coding
    return var
I am getting a Python error stating:
global name 'CONSTANT' is not defined.
I have noticed that in Django's source code the settings.py file has constants named just like mine. I am confused about how they can be imported into a script and referenced throughout the application.
EDIT
Thank you for all your answers! I tried the following:
import settings
print(settings.CONSTANT)
I get the same error
ImportError: cannot import name CONSTANT
The easiest way to do this is to just have settings be a module.
(settings.py)
CONSTANT1 = "value1"
CONSTANT2 = "value2"
(consumer.py)
import settings
print(settings.CONSTANT1)
print(settings.CONSTANT2)
When you import a Python module, you have to prefix the variables that you pull from it with the module name. If you know exactly what values you want to use from it in a given file, and you are not worried about them changing during execution, then you can do
from settings import CONSTANT1, CONSTANT2
print(CONSTANT1)
print(CONSTANT2)
but I wouldn't get carried away with that last one. It makes it difficult for people reading your code to tell where values are coming from, and precludes those values being updated if another client module changes them. One final way to do it is:
import settings as s
print(s.CONSTANT1)
print(s.CONSTANT2)
This saves you typing, will propagate updates, and only requires readers to remember that anything prefixed with s. comes from the settings module.
Step 1: create a new file settings.py in the same directory for easier access.
# database configuration settings
database = dict(
    DATABASE = "mysql",
    USER = "Lark",
    PASS = ""
)

# application predefined constants
app = dict(
    VERSION = 1.0,
    GITHUB = "{url}"
)
Step 2: import the settings module into your application file.
import settings as s  # s aliases the settings module; note you do not add .py
print(s.database['DATABASE']) # should output mysql
print(s.app['VERSION']) # should output 1.0
If you do not want to use an alias like s, you can use a different syntax:
from settings import database, app
print(database['DATABASE']) # should output mysql
print(app['VERSION']) # should output 1.0
Notice that with the second import method you can use the dict names directly.
A small tip: you can import everything from the settings file by using *, in case you have a large file and will be using most of its settings in your application:
from settings import *  # * imports all the names from the file; it works like step 2
print(database['USER'])  # should output Lark
print(app['VERSION'])    # should output 1.0
I hope that helps.
When you import settings, a module object called settings is placed in the global namespace, and this object has everything that was defined in settings.py as attributes. I.e. outside of settings.py, you refer to CONSTANT as settings.CONSTANT.
Leave your settings.py exactly as it is, then you can use it just as Django does:
import settings
def func():
    var = settings.CONSTANT
...Or, if you really want all the constants from settings.py to be imported into the global namespace, you can run
from settings import *
...but otherwise using settings.CONSTANT, as everyone else has mentioned here, is quite right.
See the answer I posted to Can I prevent modifying an object in Python?, which does what you want (as well as forcing the use of UPPERCASE identifiers). It might actually be a better answer for this question than it was for the other.
This way is more efficient since it loads/evaluates your settings variables only once. It works well for all my Python projects.
pip install python-settings
Docs here: https://github.com/charlsagente/python-settings
You need a settings.py file with all your defined constants like:
# settings.py
DATABASE_HOST = '10.0.0.1'
Then you need to either set an env variable (export SETTINGS_MODULE=settings) or manually call the configure method:
# something_else.py
from python_settings import settings
from . import settings as my_local_settings
settings.configure(my_local_settings) # configure() receives a python module
The utility also supports lazy initialization for heavy-to-load objects, so your Python project starts faster, since each settings variable is only evaluated when it's needed:
# settings.py
from python_settings import LazySetting
from my_awesome_library import HeavyInitializationClass # Heavy to initialize object
LAZY_INITIALIZATION = LazySetting(HeavyInitializationClass, "127.0.0.1:4222")
# LazySetting(Class, *args, **kwargs)
Just configure once, and access your variables wherever needed:
# my_awesome_file.py
from python_settings import settings
print(settings.DATABASE_HOST) # Will print '10.0.0.1'
I'm new to Python, but you can define a constant as a function.
In settings.py:
def CONST1():
    return "some value"
main.py:
import settings
print(settings.CONST1())  # get the constant value
This way the value can't be changed, though it has to be accessed like a function call.
Try this:
In settings.py:
CONSTANT = 5
In your main file:
from settings import CONSTANT
class A:
    b = CONSTANT

    def printb(self):
        print(self.b)
I think your above error is coming from the settings file being imported too late. Make sure it's at the top of the file.
Also worth checking out is the simple-settings project, which allows you to feed the settings into your script at runtime, enabling environment-specific settings (think dev, test, prod, ...).
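For reference, a minimal sketch of simple-settings usage based on its README; the settings module name prod_settings and the constant are assumptions:
# app.py
from simple_settings import settings

print(settings.DATABASE_HOST)  # hypothetical constant defined in the chosen settings module
Run it with: python app.py --settings=prod_settings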