I apologize if this is a basic question, but I can't seem to find the answer here or on Google. Basically, I'm trying to create a single config module that would be available to all other modules imported in a Python application. Of course it works if I have import config in each file, but I would like to make my config dynamic based on the environment the application is running in, and I'd prefer not to copy that logic into every file.
Here's an example:
app.py:
import config
import submodule
# do other stuff
submodule.py:
print config.some_config_variable
But python of course complains that config isn't defined.
I did find some stuff about global variables but that didn't seem to work either. Here's what I tried:
Edit: I changed this to show that I'd like the actual config being imported to be dynamic. However, I do currently have a static config module for my tests, just to figure out how to import globally first and worry about that logic later.
app.py
# logic here that defines some_dynamic_config
global config
config = __import__(some_dynamic_config)
import submodule
submodule.py
print config.some_config_variable
But config still isn't defined.
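(As an aside on why this fails: global only affects name binding inside the current module, and every module has its own globals() namespace, so there is no interpreter-wide scope to assign into. A minimal illustration with two hypothetical files:
a.py:
config = "set in a"  # module-level, i.e. already "global" within a.py
b.py:
import a
print a.config  # works: attribute access on the module object
print config    # NameError: b.py's globals don't contain it
)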
I'm aware that I could create a single config.py and place logic there to set the variables, but I dislike that. I prefer the config file to be just configuration, not a bunch of logic.
You've got to put your logic somewhere. Your config.py could be a module that determines which config files to load, something like this:
#config.py
import sys
from common_config import *
if sys.platform == 'darwin':
    from mac_config import *
elif sys.platform == 'win32':
    from win32_config import *
else:
    from linux_config import *
With this approach, you can put common settings in common_config.py, and platform-specific settings in their respective files. Since the platform-specific settings are imported after common_config, you can also override anything in common_config by defining it again in the platform-specific files.
This is a common pattern used in Django settings files, and works quite well there.
You could also wrap each import call with try... except ImportError: blocks if need be.
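For instance, a minimal sketch of that (reusing the hypothetical per-platform modules above), falling back to the common settings when a platform file is missing:
#config.py
import sys
from common_config import *
try:
    if sys.platform == 'darwin':
        from mac_config import *
    elif sys.platform == 'win32':
        from win32_config import *
    else:
        from linux_config import *
except ImportError:
    pass  # no platform-specific file found; keep the common settings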
Modules are shared, so each module can import config without issue.
config.py itself can have logic to set its global variables however you like. As an example:
config.py:
import sys
if sys.platform == "win32":
    temp = "c:\\temp"
else:
    temp = "/tmp"
Now import config in any module to use it:
import config
print "Using tmp dir: %s" % config.temp
If you have a module that you know will be initialized before anything else, you can create an empty config.py, and then set it externally:
import config
config.temp = "c:\\temp"
But you'll need to run this code before anything else that uses it. The empty module can be used as a singleton.
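A minimal sketch of that singleton pattern, assuming an empty config.py and the temp attribute from the example above:
app.py:
import config
config.temp = "c:\\temp"  # populate the empty module first
import submodule          # submodule now sees config.temp
submodule.py:
import config
print "Using tmp dir: %s" % config.temp
Every import config returns the same cached module object, which is what makes the empty module behave as a singleton.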
Related
I have a large codebase where all settings & constants are stored inside a settings.py file, that gets imported in various places. I want to be able to import an arbitrary .yml file instead, with the filename given at run-time to the executable main.py.
It's easy to change settings to load a .yml file; but of course I can't pass it an arbitrary filename from main in a way that is remembered everywhere else settings gets imported.
I tried to think of a few solutions:
Modify main so that the first thing it does is copy the arbitrary yaml file into the default place the yaml file will be loaded from (ugly)
Import the settings in main.py, and change references from import settings to import main (circular imports)
Use os.environ to set an environment variable SETTINGS_FILE=some_file.yml that later gets read by the settings module (somewhat ugly... sketched after the edit below)
Refactor the entire codebase to pass around a settings class instead of a module (a lot of work)
I feel like I'm generally thinking of this all in a stupid way, and that there must be a better way (although I couldn't find anything by search)... what am I missing?
EDIT:
I had no idea but apparently this works...
$ cat settings.py
setting = 1
$ cat other_module.py
import settings
print(settings.setting)
$ cat main.py
import settings
print(settings.setting)
settings.setting = 2
import other_module
$ python main.py
1
2
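That works because Python creates the settings module once and caches it, so the object mutated in main.py is the very object other_module sees. Building on the same fact, here is a minimal sketch of option 3 from the list above; the SETTINGS_FILE variable name and a flat yaml layout are assumptions for illustration:
$ cat settings.py
import os
import yaml  # PyYAML

# read the filename chosen at run-time; fall back to a default
_path = os.environ.get('SETTINGS_FILE', 'settings.yml')
with open(_path) as f:
    _data = yaml.safe_load(f)

# expose each top-level yaml key as a module attribute, e.g. settings.setting
globals().update(_data)
$ cat main.py
import os
import sys

os.environ['SETTINGS_FILE'] = sys.argv[1]  # must happen before the first import of settings
import settings
print(settings.setting)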
I'm trying to make my modules available globally
Filesystem structure
main.py
module_static.py
folder/module_dynamic.py # this is an example, but imagine 100s of modules
main.py
print('Loading module_static')
import module_static
module_static.test()
# Trying to make module_static available globally
globals()['module_static'] = module_static
__all__ = ['module_static']
print('Loading module_dynamic')
import sys
sys.path.append('./folder/')
import module_dynamic
module_dynamic.test()
module_static.py
def test():
    print(' -> This is module_static')
module_dynamic.py
def test():
    print(' -> This is module_dynamic')
    module_static.test()
Running main.py creates the following execution flow: main.py -> module_dynamic.py -> module_static.py
So as you can see:
Loading of modules is working properly
However, despite trying to make module_static available globally, it isn't working, and module_dynamic.py throws an error saying module_static doesn't exist
How can I make module_static.py available in module_dynamic.py (ideally without having to write any additional code in module_dynamic.py)?
Not saying it's good practice, but you can do
main.py
import builtins
import module_static
builtins.module_static = module_static
This should allow you to use module_static from anywhere.
More info on builtins: How to make a cross-module variable?
It can't work the way you expect. globals() returns a dict of the global variables in your script (the current module only). Maybe this will help you understand:
You can take a look at this course for better understanding
https://www.python-course.eu/python3_global_vs_local_variables.php
Anyway, you will have to import the module to use it.
If it's just a local tool for your personal use, you could move it to the folder
{Python_installation_folder}/Lib.
Then, in any script, you will be able to do
import module_static
and use your module.
If you want to share your module with other people, publish (upload) it on PyPI. You could follow the tutorial below:
https://anweshadas.in/how-to-upload-a-package-in-pypi-using-twine/
I've got a situation where I need to import * from a list of modules which may or may not exist. So far, I've got a working solution that goes a little something like this:
for module in modules:
    try:
        imported = __import__(module, globals(), locals(), [], -1)
        for sub in module.split(".")[1:]:
            imported = getattr(imported, sub)
        for name in [n for n in dir(imported) if not n.startswith("_")]:
            globals()[name] = getattr(imported, name)
    except (ImportError, AttributeError):
        pass
This works, but is a total mess and confuses linters no end when they're looking for the variables that get imported in this way. I'm certain I'm missing something in the way __import__ works, since surely from foo import * doesn't generate such a horrible call as I've got above. Can anyone enlighten me?
Update:
The actual use case here is refactoring a django project settings file, to move app-specific settings into a settings file within that app. Currently, all settings are in the main settings file, which has become unmaintainably large. I need to retain the ability to override any of the settings defined in that way from another settings file (client_settings.py), which gets imported as from foo_project.client_settings import * at the bottom of the settings file.
Doing from app.settings import * for each app would work, but I want to drive this from the installed apps setting to maintain DRY.
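For comparison, a sketch of the same loop using importlib.import_module, which resolves dotted paths directly and respects __all__ the way from foo import * does (the same caveat about confused linters applies):
import importlib

for module in modules:
    try:
        imported = importlib.import_module(module)  # returns the leaf module for dotted names
    except ImportError:
        continue
    # mimic "from module import *": honor __all__ when the module defines it
    names = getattr(imported, "__all__",
                    [n for n in dir(imported) if not n.startswith("_")])
    for name in names:
        globals()[name] = getattr(imported, name)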
I have a directory, let's call it Storage full of packages with unwieldy names like mypackage-xxyyzzww, and of course Storage is on my PYTHONPATH. Since packages have long unmemorable names, all of the packages are symlinked to friendlier names, such as mypackage.
Now, I don't want to rely on filesystem symbolic links to do this; instead, I tried mucking around with sys.path and sys.modules. Currently I'm doing something like this:
import imp
imp.load_package('mypackage', 'Storage/mypackage-xxyyzzww')
How bad is it to do things this way, and is there a chance this will break in the future? One funny thing is that there's no mention of the imp.load_package function in the docs at all.
EDIT: besides not relying on symbolic links, I can't use the PYTHONPATH variable anymore.
Instead of using imp, you can assign different names to imported modules.
import mypackage_xxyyzzww as mypackage
If you then create an __init__.py file inside of Storage, you can add several of the above lines to make importing easier.
Storage/__init__.py:
import mypackage_xxyyzzww as mypackage
import otherpackage_xxyyzzww as otherpackage
Interpreter:
>>> from Storage import mypackage, otherpackage
importlib may be more appropriate, as it uses/implements the PEP 302 mechanism.
Follow the DictImporter example, but override find_module to find the real filename and store it in the dict, then override load_module to get the code from the found file.
You shouldn't need to use sys.path once you've created your Storage module
#from importlib import abc
import imp
import os
import sys
import logging

logging.basicConfig(level=logging.DEBUG)
dprint = logging.debug

class MyImporter(object):
    def __init__(self, path):
        self.path = path
        self.names = {}

    def find_module(self, fullname, path=None):
        dprint("find_module({fullname},{path})".format(**locals()))
        ml = imp.find_module(fullname, path)
        dprint(repr(ml))
        raise ImportError

    def load_module(self, fullname):
        dprint("load_module({fullname})".format(**locals()))
        return imp.load_module(fullname)
        raise ImportError

def load_storage(path, modname=None):
    if modname is None:
        modname = os.path.basename(path)
    mod = imp.new_module(modname)
    sys.modules[modname] = mod
    assert mod.__name__ == modname
    mod.__path__ = [path]
    #sys.meta_path.append(MyImporter(path))
    mod.__loader__ = MyImporter(path)
    return mod

if __name__ == "__main__":
    load_storage("arbitrary-path-to-code/Storage")
    from Storage import plain
    from Storage import mypkg
Then when you import Storage.mypackage, python will immediately use your importer without bothering to look on sys.path
That doesn't work. The code above does work to import ordinary modules under Storage without requiring Storage to be on sys.path, but both 3.1 and 2.6 seem to ignore the __loader__ attribute mentioned in PEP 302.
If I uncomment the sys.meta_path line, 3.1 dies with a stack overflow, and 2.6 dies with ImportError. Hmm... I'm out of time now, but may look at it later.
Packages are just entries in the namespace. You should not name your path components with anything that is not a legal python variable name.
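If renaming on disk isn't an option and you'd rather not add an __init__.py, another possibility is to register an alias in sys.modules early in startup (a sketch, reusing the question's example name with underscores, since hyphenated directories can't be imported at all):
import sys
import mypackage_xxyyzzww

# every later "import mypackage" finds the cached alias first
sys.modules['mypackage'] = mypackage_xxyyzzww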
I am relatively new to Python. I am looking to create a "settings" module where various application-specific constants will be stored.
Here is how I am wanting to set up my code:
settings.py
CONSTANT = 'value'
script.py
import settings
def func():
    var = CONSTANT
    # do some more coding
    return var
I am getting a Python error stating:
global name 'CONSTANT' is not defined.
I have noticed in Django's source code that their settings.py file has constants named just like mine. I am confused about how they can be imported into a script and referenced throughout the application.
EDIT
Thank you for all your answers! I tried the following:
import settings
print settings.CONSTANT
I get the same error
ImportError: cannot import name CONSTANT
The easiest way to do this is to just have settings be a module.
(settings.py)
CONSTANT1 = "value1"
CONSTANT2 = "value2"
(consumer.py)
import settings
print settings.CONSTANT1
print settings.CONSTANT2
When you import a python module, you have to prefix the variables that you pull from it with the module name. If you know exactly what values you want to use from it in a given file, and you are not worried about them changing during execution, then you can do
from settings import CONSTANT1, CONSTANT2
print CONSTANT1
print CONSTANT2
but I wouldn't get carried away with that last one. It makes it difficult for people reading your code to tell where values are coming from, and it precludes those values being updated if another client module changes them. One final way to do it is
import settings as s
print s.CONSTANT1
print s.CONSTANT2
This saves you typing, will propagate updates, and only requires readers to remember that anything prefixed with s. is from the settings module.
step 1: create a new file settings.py in the same directory for easier access.
#database configuration settings
database = dict(
    DATABASE = "mysql",
    USER = "Lark",
    PASS = ""
)

#application predefined constants
app = dict(
    VERSION = 1.0,
    GITHUB = "{url}"
)
step 2: import the settings module into your application file.
import settings as s # s aliases settings; settings is the actual file, and you do not have to add .py
print(s.database['DATABASE']) # should output mysql
print(s.app['VERSION']) # should output 1.0
if you do not like to use alias like s you can use a different syntax
from settings import database, app
print(database['DATABASE']) # should output mysql
print(app['VERSION']) # should output 1.0
notice that with the second import method you can use the dict names directly
A small tip: you can import all the code in the settings file by using *, in case you have a large file and will be using most of its settings in your application
from settings import * # * represent all the code on the file, it will work like step 2
print(database['USER']) # should output Lark
print(app['VERSION']) # should output 1.0
I hope that helps.
When you import settings, a module object called settings is placed in the global namespace - and this object carries everything that was defined in settings.py as attributes. I.e., outside of settings.py, you refer to CONSTANT as settings.CONSTANT.
Leave your settings.py exactly as it is, then you can use it just as Django does:
import settings
def func():
var = settings.CONSTANT
...Or, if you really want all the constants from settings.py to be imported into the global namespace, you can run
from settings import *
...but otherwise using settings.CONSTANT, as everyone else has mentioned here, is quite right.
See the answer I posted to Can I prevent modifying an object in Python? which does what you want (as well as force the use of UPPERCASE identifiers). It might actually be a better answer for this question than it was for the other.
This way is more efficient since it loads/evaluates your settings variables only once. It works well for all my Python projects.
pip install python-settings
Docs here: https://github.com/charlsagente/python-settings
You need a settings.py file with all your defined constants like:
# settings.py
DATABASE_HOST = '10.0.0.1'
Then you need to either set an env variable (export SETTINGS_MODULE=settings) or manually call the configure method:
# something_else.py
from python_settings import settings
from . import settings as my_local_settings
settings.configure(my_local_settings) # configure() receives a python module
The utility also supports lazy initialization for heavy-to-load objects, so when you run your Python project it loads faster, since it only evaluates a settings variable when it's needed:
# settings.py
from python_settings import LazySetting
from my_awesome_library import HeavyInitializationClass # Heavy to initialize object
LAZY_INITIALIZATION = LazySetting(HeavyInitializationClass, "127.0.0.1:4222")
# LazySetting(Class, *args, **kwargs)
Just configure once and now call your variables wherever needed:
# my_awesome_file.py
from python_settings import settings
print(settings.DATABASE_HOST) # Will print '10.0.0.1'
I'm new to Python, but what if we define a constant as a function?
in settings.py:
def CONST1():
    return "some value"
main.py
import settings
print settings.CONST1() ## take the constant value
The only downside I see is that the value can't be changed, but it works like a function.
Try this:
In settings.py:
CONSTANT = 5
In your main file:
from settings import CONSTANT
class A:
    b = CONSTANT
    def printb(self):
        print self.b
I think your above error is coming from the settings file being imported too late. Make sure it's at the top of the file.
Also worth checking out is the simple-settings project, which allows you to feed the settings into your script at runtime, which allows for environment-specific settings (think dev, test, prod, ...)