I am trying to build an infrastructure that uses a text file to dynamically set up the environment in which code runs. The text file will contain paths of Python modules to load AND variables that other parts of my code will need. Thus, I am trying to use one mechanism for loading both modules and variables (versus virtualenv just for modules and another scheme for variables).
My code would look like:
my_script_which_uses_env.py:

import envcontrol_module
import some_other_module_whose_path_is_unknown_at_run_time

def main():
    my_envcontrol = envcontrol_module.EnvControlClass()
    my_variable = my_envcontrol.get_value("SOME_VARIABLE")
Those using my infrastructure would be required to put "import envcontrol_module" as the first line in their code. Once that import is done, sys.path has already been modified to include the paths of the modules to load, i.e. when "import envcontrol_module" returns, sys.path contains the path for some_other_module_whose_path_is_unknown_at_run_time.
At a high level, I think the following is needed:
1. "import envcontrol_module" must trigger a read of my config text file. This would create a dict with the variable names and values; here I could also modify sys.path.
2. That dict must somehow be kept around so that when an instance of EnvControlClass is created, the instance can access it.
envcontrol_module.py:

def main():
    config_dict = read_config_file()
    modify_sys_path(config_dict)

class EnvControl():
    self.config_dict = config_dict
How can I do the above in a Python-friendly way:
1. the config dict is not global to envcontrol_module
2. it stays transparent to programmers: they simply do "import envcontrol_module" first and then do subsequent imports normally
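One common pattern that satisfies the transparency requirement is to run the setup once at module import time and keep the result in a private module attribute. A sketch only: the config file name, its format (KEY = VALUE lines plus path: lines), and all helper names here are my assumptions, not part of the question.

```python
# envcontrol_module.py -- a sketch; the config format and names are assumptions.
import sys

def _read_config_file(path):
    """Parse the config file into a dict of variables and a list of paths."""
    config, paths = {}, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("path:"):
                paths.append(line[len("path:"):].strip())
            elif "=" in line:
                key, _, value = line.partition("=")
                config[key.strip()] = value.strip()
    return config, paths

# Demo config written here only so the sketch is self-contained;
# in real use the file would already exist.
with open("envcontrol.cfg", "w") as fh:
    fh.write("SOME_VARIABLE = hello\npath: ./plugins\n")

# This top-level code runs exactly once, on the FIRST import of the module;
# later imports get the cached module object from sys.modules.
_config, _paths = _read_config_file("envcontrol.cfg")
for _p in _paths:
    if _p not in sys.path:
        sys.path.append(_p)

class EnvControlClass:
    """Hands out values from the config that was read at import time."""
    def __init__(self):
        self._config = _config

    def get_value(self, name):
        return self._config[name]
```

Strictly, _config is still module-level state, just marked private by convention; a module attribute is the idiomatic place for once-per-process state in Python, since modules are themselves singletons via sys.modules.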
I know I can do the following, but then the programmers would have to do their imports in main() rather than at the top of the file, which would be different/not as nice/not Pythonic:
my_script_which_uses_env.py:

import envcontrol_module

def main():
    my_envcontrol = envcontrol_module.EnvControlClass()
    import some_other_module_whose_path_is_unknown_at_run_time
    my_variable = my_envcontrol.get_value("SOME_VARIABLE")
envcontrol_module.py:

class EnvControl():
    def __init__(self):
        self.config_dict = read_config_file()
        modify_sys_path(self.config_dict)
Not sure if I explained this well. If I haven't, let me know and I can clarify more.
thanks!
Related
Let's say I wanted to make a core library for a project, with functions like:

def foo(x):
    """common, useful function"""

and I want to make these functions globally available in my project so that when I call them in a file, I don't need to import them. I have a virtualenv, so I feel like I should be able to modify my interpreter to make them globally available, but I wasn't sure whether there is any established methodology behind this. I am aware it defies some Pythonic principles.
It is possible to create a custom "launcher" that sets up some global variables and executes the code in a python file:
from sys import argv

# we read the code of the file passed as the first CLI argument
with open(argv[1]) as fin:
    code = fin.read()

# just an example: this will be available in the executed python file
def my_function():
    return "World"

global_variables = {
    'MY_CONSTANT': "Hello",     # prepare a global variable
    'my_function': my_function  # prepare a global function
}

exec(code, global_variables)  # run the file with new global variables
Use it like this: python launcher.py my_dsl_file.py.
Example my_dsl_file.py:
# notice: no imports at all
print(MY_CONSTANT)
print(my_function())
Interestingly, Python (at least CPython) uses a different way to set up some useful functions like help: it runs a file called site.py that adds some values to the builtins module.
import builtins

def my_function():
    return "World"

builtins.MY_CONSTANT = "Hello"
builtins.my_function = my_function
# run your file like above or simply import it
import <your file>
I wouldn't recommend either of these ways. A simple from <your library> import * is a much better approach.
The downside of the first two variants is that no tool will know anything about your injected globals; e.g. mypy, flake8, and all IDEs I know of will fail.
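For comparison, the recommended from &lt;your library&gt; import * route could look like this sketch (corelib and foo are made-up names; the library file is written to disk here only so the example is self-contained):

```python
import pathlib
import sys

# create a tiny library module on disk (in real life this file already exists)
pathlib.Path("corelib.py").write_text(
    '__all__ = ["foo"]  # controls exactly what the * import exposes\n'
    "def foo(x):\n"
    '    """common, useful function"""\n'
    "    return x * 2\n"
)

sys.path.insert(0, ".")  # ensure the current directory is importable
from corelib import *    # binds the names listed in corelib's __all__
```

Unlike the exec and builtins tricks, this import is visible to mypy, flake8, and IDEs, and `__all__` keeps the star import from dragging in unwanted names.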
I'm trying to make my modules available globally
Filesystem structure
main.py
module_static.py
folder/module_dynamic.py # this is example, but imagine 100s of modules
main.py
print('Loading module_static')
import module_static
module_static.test()
# Trying to make module_static available globally
globals()['module_static'] = module_static
__all__ = ['module_static']
print('Loading module_dynamic')
import sys
sys.path.append('./folder/')
import module_dynamic
module_dynamic.test()
module_static.py

def test():
    print(' -> This is module_static')
module_dynamic.py

def test():
    print(' -> This is module_dynamic')
    module_static.test()
Running main.py creates the following execution flow: main.py -> module_dynamic.py -> module_static.py
So as you can see:
Loading of modules is working properly
However, despite trying to make module_static available globally, it isn't working: module_dynamic.py throws an error saying module_static doesn't exist.
How can I make module_static.py available in module_dynamic.py (ideally without having to write any additional code in module_dynamic.py)?
Not saying it's good practice, but you can do
main.py
import builtins
import module_static
builtins.module_static = module_static
This should allow you to use module_static from anywhere.
More info on builtins: How to make a cross-module variable?
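A self-contained demonstration of the builtins trick (the two modules are faked with types.ModuleType here, since the real files aren't available): unqualified name lookup in a module falls back to builtins, which is why module_dynamic finds module_static without importing it.

```python
import builtins
import types

# stand-in for module_static (hypothetical, mirrors the question's file)
module_static = types.ModuleType("module_static")
module_static.test = lambda: " -> This is module_static"

# inject it so every module can see the name without importing it
builtins.module_static = module_static

# stand-in for module_dynamic: note it never imports module_static,
# yet the name resolves through the builtins fallback of name lookup
module_dynamic = types.ModuleType("module_dynamic")
exec("def test():\n    return module_static.test()", module_dynamic.__dict__)

print(module_dynamic.test())  # -> This is module_static
```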
It can't work the way you expect: globals() returns a dict of the global variables of the current module only, not an interpreter-wide namespace. Maybe this will help you understand:
You can take a look at this course for better understanding
https://www.python-course.eu/python3_global_vs_local_variables.php
Anyway, you will have to import the module to use it.
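To see that globals() is per-module rather than interpreter-wide, here is a minimal demonstration using a throwaway module object:

```python
import types

# build a throwaway module and define a variable inside it
mod = types.ModuleType("throwaway")
exec("x = 1", mod.__dict__)

print(mod.x)             # 1 -- visible as an attribute of that module
print("x" in globals())  # False -- not in THIS module's globals()
```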
If it's just a local tool for your personal use, you could move it to the folder
{Python_installation_folder}/Lib.
Then, in any script, you will be able to do
import module_static
and use your module.
If you want to share your module with other people, publish (upload) it on PyPI. You could follow the tutorial below:
https://anweshadas.in/how-to-upload-a-package-in-pypi-using-twine/
I want to import a Python module without adding its containing folder to the Python path. I would want the import to look like
from A import B as C
Because a specific path must be used, the import currently looks like
import imp
A = imp.load_source('A', 'path')
C = A.B
This is quite unhandy with long paths and module names. Is there an easier way? Is there a way where the module is not added to the local variables (no A)?
If you just don't want A to be visible at a global level, you could stick the import (imp.load_source) inside a function. If you actually don't want a module object at all in the local scope, you can do that too, but I wouldn't recommend it.
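On Python 3, imp is deprecated; importlib.util does the same job and can be wrapped in a helper so that no module object leaks into the caller's namespace. A sketch (the file and attribute names are illustrative, and the demo file is written here only so the example is self-contained):

```python
import importlib.util

def import_attr(path, name, module_name="_loaded"):
    """Load only `name` from the file at `path`, without touching sys.path."""
    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return getattr(module, name)

# demo source file standing in for module A
with open("A_demo.py", "w") as fh:
    fh.write("def B():\n    return 'hello'\n")

C = import_attr("A_demo.py", "B")  # like `from A import B as C`, no `A` left behind
print(C())  # hello
```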
If module A is a Python source file, you could read in the file (or even just the relevant portion that you want) and run an exec on it.
source.py
MY_GLOBAL_VAR = 1
def my_func():
    print 'hello'
Let's say you have some code that wants my_func
path = '/path/to/source.py'
execfile(path)
my_func()
# 'hello'
Be aware that you're also going to get anything else defined in the file (like MY_GLOBAL_VAR). Again, this will work, but I wouldn't recommend it
Someone looking at your code won't be able to see where my_func came from.
You're essentially doing the same thing as a from A import * import, which is generally frowned upon in Python, because you could be importing all sorts of things into your namespace that you didn't want. And even if it works now, if the source code changes, it could import names that shadow your own global symbols.
It's potentially a security hole, since you could be exec'ing an untrusted source file.
It's way more verbose than a regular python import.
I am trying to create a Python add-in for ArcMap. The toolbar I am creating has four buttons, each of which will run one Python script. These scripts all run data management processes, mostly centred around transferring data from one or more personal geodatabases to an Oracle database. Each of the individual Python scripts works, but they need some updating to make them more user friendly. The issue I am having at the moment is that I want to keep the four scripts separate, so I need to import/call them from a master script on which the add-in is based. At the moment each script is made up of a single class, but in the future they may end up having more than one class.
The issue I am having at the moment is that the script below seems not to import/run the individual scripts when I try to use the toolbar in ArcMap. The four buttons are there, but nothing happens when I click on them. I have an __init__.py script in the package directory where the four Python scripts are stored. At the moment I am just trying to run the individual class from each script, but I may need to run several classes/the whole script in the future.
import os
import sys
sys.path.append(os.path.dirname(__file__))
import arcpy
import pythonaddins
import getopt
import fnmatch
import traceback
import pyodbc
import csv
from datetime import datetime
import logging
from package import *
#print dir()

class AppendModelsOracle(object):
    """Implementation for Append_Models_Oracle.button (Button)"""
    def __init__(self):
        self.enabled = True
        self.checked = False
    def onClick(self):
        # Run the Append_Models_Oracle.py script stored in the package directory
        package.Append_Models_Oracle.AppendReachNetwork()

class SAGIStoOracle(object):
    """Implementation for SAGIS_to_Oracle.button (Button)"""
    def __init__(self):
        self.enabled = True
        self.checked = False
    def onClick(self):
        # Run the SAGIS_to_Oracle.py script stored in the package directory
        package.SAGIS_to_Oracle.SAGIS()

class ImportGIS1CSV(object):
    """Implementation for Import_GIS1_CSV.button (Button)"""
    def __init__(self):
        self.enabled = True
        self.checked = False
    def onClick(self):
        # Run the Import_GIS1_csv.py script stored in the package directory
        package.Import_GIS1_csv.ImportGIS1()

class ImportDeterminandCSV(object):
    """Implementation for Import_determinand_csv.button (Button)"""
    def __init__(self):
        self.enabled = True
        self.checked = False
    def onClick(self):
        # Run the Import_Determinands_csv.py script stored in the package directory
        package.Import_Determinands_csv.ImportDets()
I would be grateful for any help as to why the individual scripts are not being imported/run when I try to use the Python add-in in ArcMap.
Thanks in advance,
Ben.
As specified in this answer, you should use the __all__ variable to set the submodules that will be imported when you use "*", as described in the Python docs:
Now what happens when the user writes from sound.effects import *? Ideally, one would hope that this somehow goes out to the filesystem, finds which submodules are present in the package, and imports them all. This could take a long time and importing sub-modules might have unwanted side-effects that should only happen when the sub-module is explicitly imported.
The only solution is for the package author to provide an explicit index of the package. The import statement uses the following convention: if a package's __init__.py code defines a list named __all__, it is taken to be the list of module names that should be imported when from package import * is encountered. It is up to the package author to keep this list up-to-date when a new version of the package is released. Package authors may also decide not to support it, if they don't see a use for importing * from their package. For example, the file sound/effects/__init__.py could contain the following code:
__all__ = ["echo", "surround", "reverse"]
This would mean that from sound.effects import * would import the three named submodules of the sound package.
If __all__ is not defined, the statement from sound.effects import * does not import all submodules from the package sound.effects into the current namespace;
Pretty sure you already figured it out; answering for future readers.
I am relatively new to Python. I am looking to create a "settings" module where various application-specific constants will be stored.
Here is how I want to set up my code:
settings.py
CONSTANT = 'value'
script.py

import settings

def func():
    var = CONSTANT
    # do some more coding
    return var
I am getting a Python error stating:
global name 'CONSTANT' is not defined.
I have noticed that Django's source code has a settings.py file with constants named just like mine. I am confused about how they can be imported into a script and referenced throughout the application.
EDIT
Thank you for all your answers! I tried the following:
import settings
print settings.CONSTANT
I get the same error
ImportError: cannot import name CONSTANT
The easiest way to do this is to just have settings be a module.
(settings.py)
CONSTANT1 = "value1"
CONSTANT2 = "value2"
(consumer.py)
import settings
print settings.CONSTANT1
print settings.CONSTANT2
When you import a Python module, you have to prefix the variables that you pull from it with the module name. If you know exactly what values you want to use from it in a given file and you are not worried about them changing during execution, then you can do
from settings import CONSTANT1, CONSTANT2
print CONSTANT1
print CONSTANT2
but I wouldn't get carried away with that last one. It makes it difficult for people reading your code to tell where values are coming from, and it precludes those values being updated if another client module changes them. One final way to do it is
import settings as s
print s.CONSTANT1
print s.CONSTANT2
This saves you typing, will propagate updates and only requires readers to remember that anything prefixed with s. is from the settings module.
Step 1: create a new file settings.py in the same directory for easier access.
# database configuration settings
database = dict(
    DATABASE = "mysql",
    USER = "Lark",
    PASS = ""
)

# application predefined constants
app = dict(
    VERSION = 1.0,
    GITHUB = "{url}"
)
Step 2: import the settings module into your application file.
import settings as s # s is aliasing settings & settings is the actual file you do not have to add .py
print(s.database['DATABASE']) # should output mysql
print(s.app['VERSION']) # should output 1.0
If you do not like to use an alias like s, you can use a different syntax:
from settings import database, app
print(database['DATABASE']) # should output mysql
print(app['VERSION']) # should output 1.0
Notice that with the second import method you can use the dict names directly.
A small tip: you can import all the code in the settings file by using * in case you have a large file and you will be using most of its settings in your application.
from settings import * # * represents all the code in the file; it will work like step 2
print(database['USER']) # should output Lark
print(app['VERSION']) # should output 1.0
I hope that helps.
When you import settings, a module object called settings is placed in the global namespace, and this object carries everything that was in settings.py as attributes. I.e. outside of settings.py, you refer to CONSTANT as settings.CONSTANT.
Leave your settings.py exactly as it is, then you can use it just as Django does:
import settings
def func():
var = settings.CONSTANT
...Or, if you really want all the constants from settings.py to be imported into the global namespace, you can run
from settings import *
...but otherwise using settings.CONSTANT, as everyone else has mentioned here, is quite right.
See the answer I posted to Can I prevent modifying an object in Python? which does what you want (as well as force the use of UPPERCASE identifiers). It might actually be a better answer for this question than it was for the other.
This way is more efficient since it loads/evaluates your settings variables only once. It works well for all my Python projects.
pip install python-settings
Docs here: https://github.com/charlsagente/python-settings
You need a settings.py file with all your defined constants like:
# settings.py
DATABASE_HOST = '10.0.0.1'
Then you need to either set an env variable (export SETTINGS_MODULE=settings) or manually call the configure method:
# something_else.py
from python_settings import settings
from . import settings as my_local_settings
settings.configure(my_local_settings) # configure() receives a python module
The utility also supports lazy initialization for heavy-to-load objects, so your Python project loads faster, since it only evaluates a settings variable when it is needed.
# settings.py
from python_settings import LazySetting
from my_awesome_library import HeavyInitializationClass # Heavy to initialize object
LAZY_INITIALIZATION = LazySetting(HeavyInitializationClass, "127.0.0.1:4222")
# LazySetting(Class, *args, **kwargs)
Just configure once and now call your variables where is needed:
# my_awesome_file.py
from python_settings import settings
print(settings.DATABASE_HOST) # Will print '10.0.0.1'
I'm new to Python, but what if we define a constant as a function?
settings.py:

def CONST1():
    return "some value"

main.py:

import settings
print settings.CONST1()  # prints the constant value

This way the value can't be changed, but reading it works like a function call.
Try this:
In settings.py:

CONSTANT = 5

In your main file:

from settings import CONSTANT

class A:
    b = CONSTANT

    def printb(self):
        print self.b
I think your above error comes from the settings file being imported too late. Make sure the import is at the top of the file.
Also worth checking out is the simple-settings project, which allows you to feed the settings into your script at runtime, allowing for environment-specific settings (think dev, test, prod, ...).