I created a module with one function (read_file) that reads a file and stores the data in a dict (DATA).
The goal is to have access to the dict (DATA) everywhere in the other modules, but with only one call to the function read_file. I need this module to be compatible with Python 2 and Python 3.
The tree is:
Config\
    __init__.py
    config.py
OtherModule\
    __init__.py
    module1.py
    module2.py
The code is:
Config
==== __init__.py ====
from config.config import read_file
DATA = dict()
==== config.py ====
import config

def read_file(param):
    ...
    config.DATA = ...  # depends on param
OtherModule
==== module1.py =====
import config
config.read_file(param)
==== module2.py =====
import config
print(config.DATA)  # -> Empty
import module1
print(config.DATA)  # -> NotEmpty
The question is: why does this work fine in Python 3, but DATA is always empty in Python 2?
I know I could go through another module, like config.variable.DATA, but I would like to avoid that. Is that possible?
I am using Python 2.7 and Python 3.6.
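A likely explanation and fix (my sketch, not from the original post): in Python 2, import config inside Config/config.py is an implicit relative import and binds the sibling module config.py itself, so DATA is set on config.config instead of on the package; in Python 3 the same statement is an absolute import and resolves to the package. Forcing absolute imports makes both versions behave the same:
==== config.py ====
from __future__ import absolute_import  # no-op on Python 3

import config  # now always the top-level package, on Python 2 and 3

def read_file(param):
    ...
    config.DATA = ...  # depends on param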
I want to import constants from a constants module from two different modules, but I get the following error:
Traceback (most recent call last):
File "C:\Temp\tmp\pycircular\pycircular\pycircular.py", line 2, in <module>
from my_classes.foo import Foo
File "C:\Temp\tmp\pycircular\pycircular\my_classes\foo.py", line 1, in <module>
from pycircular.constants import ANOTHER_CONSTANT
File "C:\Temp\tmp\pycircular\pycircular\pycircular.py", line 2, in <module>
from my_classes.foo import Foo
ImportError: cannot import name 'Foo' from partially initialized module 'my_classes.foo' (most likely due to a circular import) (C:\Temp\tmp\pycircular\pycircular\my_classes\foo.py)
My project structure is the following:
|-constants.py
|-my_classes
| |-foo.py
| |-__init__.py
|-pycircular.py
|-__init__.py
# =============
# pycircular.py
# =============
from constants import SOME_CONSTANT
from my_classes.foo import Foo
def main():
    print(SOME_CONSTANT)
    my_foo = Foo()
    my_foo.do_something()

if __name__ == "__main__":
    main()
# =============
# foo.py
# =============
from pycircular.constants import ANOTHER_CONSTANT
class Foo:
    def do_something(self):
        print(ANOTHER_CONSTANT)
# =============
# constants.py
# =============
ANOTHER_CONSTANT = "ANOTHER"
SOME_CONSTANT = "CONSTANT"
I assume that it is the same problem as solved here https://stackoverflow.com/a/62303448/2021763.
But I really do not get why from my_classes.foo import Foo in pycircular.py is executed a second time.
Update:
After renaming the package pycircular to pycircular_pack, it worked in PyCharm.
But it only works because PyCharm automatically sets the option "Add content roots to PYTHONPATH".
The output of sys.path is ['C:\\Temp\\tmp\\pycircular\\pycircular_pack', 'C:\\Temp\\tmp\\pycircular', 'C:\\Tools\\miniconda\\envs\\my_env\\python39.zip', 'C:\\Tools\\miniconda\\envs\\my_env\\DLLs', 'C:\\Tools\\miniconda\\envs\\my_env\\lib', 'C:\\Tools\\miniconda\\envs\\my_env', 'C:\\Tools\\miniconda\\envs\\my_env\\lib\\site-packages']
Without the option the output is ['C:\\Temp\\tmp\\pycircular\\pycircular_pack', 'C:\\Tools\\miniconda\\envs\\my_env\\python39.zip', 'C:\\Tools\\miniconda\\envs\\my_env\\DLLs', 'C:\\Tools\\miniconda\\envs\\my_env\\lib', 'C:\\Tools\\miniconda\\envs\\my_env', 'C:\\Tools\\miniconda\\envs\\my_env\\lib\\site-packages']
And without the option I only get it to work with absolute imports.
# pycircular.py
from constants import SOME_CONSTANT
from my_classes.foo import Foo
...
# foo.py
from constants import ANOTHER_CONSTANT
To elaborate based on the comments and edit:
After renaming the package pycircular to pycircular_pack it worked in PyCharm. But it only works because in PyCharm the option "Add content roots to PYTHONPATH" is automatically set.
You should make sure the package directory is not set as a content root or source root. The directory hosting the package directory should be set as source root.
C:\Temp\tmp\pycircular # <- source root
|- pycircular_pack # <- not set as anything
| |- constants.py
| |- my_classes
| | |- foo.py
| | |- __init__.py
| |- pycircular.py
| |- __init__.py
|- other_file.py # <- for illustration's sake
Now your sys.path will include only C:\Temp\tmp\pycircular, and there will be exactly one way to import things from your package. (This also answers why the import ran twice before: with the package directory itself on sys.path, from pycircular.constants import ... in foo.py resolved pycircular to the module pycircular.py instead of the package, executing the script a second time and closing the circle.)
Namely,
- other_file.py (outside the package) will be able to use the package as pycircular_pack
- pycircular_pack/*.py can refer to modules in the pycircular_pack package by either
  - (e.g.) from .constants import ... (relative import from the current package), or
  - (e.g.) from pycircular_pack.constants import ... (absolute import)
- pycircular_pack/my_classes/*.py can refer to modules in the pycircular_pack package by either
  - (e.g.) from ..constants import ... (relative import from the parent package), or
  - (e.g.) from pycircular_pack.constants import ... (absolute import)
If your pycircular_pack package contained a runnable script, e.g. a CLI as pycircular_pack/cli.py, the correct way to run it from the command line would be python -m pycircular_pack.cli; this has Python set up the path just like we want here, whereas python pycircular_pack/cli.py would not do the right thing.
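To make that concrete, here is a minimal sketch (cli.py is hypothetical, not part of the original project):
# pycircular_pack/cli.py (hypothetical)
from pycircular_pack.constants import SOME_CONSTANT

def main():
    print(SOME_CONSTANT)

if __name__ == "__main__":
    main()
Run from C:\Temp\tmp\pycircular:
$ python -m pycircular_pack.cli     # works: sys.path contains the package's parent
$ python pycircular_pack/cli.py     # typically fails: sys.path[0] is the package dir itself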
I am running multiple tests in a tests package, and I want to print each module name in the package, without duplicating code.
So, I wanted to insert some code to __init__.py or conftest.py that will give me the executing module name.
Let's say my test modules are called: checker1, checker2, etc...
My directory structure is like this:
tests_dir/
├── __init__.py
├── conftest.py
├── checker1
├── checker2
└── checker3
So, inside __init__.py I tried inserting:
import os

def module_name():
    return os.path.splitext(__file__)[0]
But it still gives me __init__.py from each file when I call it.
I also tried using a fixture inside conftest.py, like:
@pytest.fixture(scope='module')
def module_name(request):
    return request.node.name
But it seems as if I still need to define a function inside each module to get module_name as a parameter.
What is the best method of getting this to work?
Edit:
In the end, here is what I did:
conftest.py
import pytest

@pytest.fixture(scope='module', autouse=True)
def module_name(request):
    return request.node.name
An example of a test file with a test function; the same fixture parameter needs to be added to every test function in every file:
checker1.py
from conftest import *

def test_columns(expected_res, actual_res, module_name):
    expected_cols = expected_res.columns
    actual_cols = actual_res.columns
    val = expected_cols.difference(actual_cols)  # verify all expected cols are in actual_cols
    if not val.empty:
        log.error('[{}]: Expected columns are missing: {}'.format(module_name, val.values))
    assert val.empty
Notice the module_name fixture I added to the function's parameters.
expected_res and actual_res are pandas DataFrames read from an Excel file.
log is a Logger object from the logging package.
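As an aside (my sketch, not from the original post): for a module-scoped fixture, pytest's request.module attribute is the module object itself, so its __name__ avoids the '.py' suffix that request.node.name carries:
# conftest.py (alternative sketch)
import pytest

@pytest.fixture(scope='module', autouse=True)
def module_name(request):
    # request.module is the module object for module-scoped fixtures
    return request.module.__name__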
In each module (checker1, checker2, checker3, conftest.py), in the main function, execute
print(__name__)
When the __init__.py file imports those modules, it should print the module name along with it.
Based on your comment, you can perhaps modify the behaviour in the __init__.py file for local imports.
__init__.py
import sys, os
sys.path.append(os.path.split(__file__)[0])

def my_import(module):
    print("Module name is {}".format(module))
    exec("import {}".format(module))
testerfn.py
print(__name__)
print("Test")
Directory structure
tests_dir/
├── __init__.py
└── testerfn.py
Command to test
import tests_dir
tests_dir.my_import("testerfn")
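A small variation (my suggestion, not from the original answer): importlib.import_module is the idiomatic replacement for an exec-based import, returns the module object, and, given a relative name, also removes the need for the sys.path.append hack:
__init__.py
import importlib

def my_import(module):
    print("Module name is {}".format(module))
    return importlib.import_module("." + module, package=__name__)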
I have a module - let's call it foo - and I want to make it usable via a python -m foo call. My project looks like this:
my_project
├── foo
│ └── __init__.py
└── my_program.py
In __init__.py I have some code which I run when calling python -m foo:
def bar(name):
    print(name)

# -- code used to 'run' the module
def main():
    bar("fritz")

if __name__ == "__main__":
    main()
Since I now have a fair amount of execution code in __init__.py (argparse stuff and some logic), I want to separate it into a __main__.py:
my_project
├── foo
│ ├── __init__.py
│ └── __main__.py
└── my_program.py
Although this looks very simple to me, I haven't managed to import anything located in __init__.py from __main__.py yet.
I know that if foo is located in site-packages or is accessible via PYTHONPATH, I can just import foo.
But in case I want to execute __main__.py directly (e.g. from some IDE) with foo located anywhere (i.e. not in a folder where Python looks for packages) - is there a way to import foo (i.e. __init__.py from the same directory)?
I tried import . and import foo - but both approaches fail (because they just mean something else, of course).
What I can do - at least to explain my goal - is something like this:
sys.path.append(os.path.join(os.path.dirname(__file__), ".."))
import foo
Works, but it's ugly and a bit dangerous, since I don't even know whether I'm really importing foo from the same directory.
You can manually set the module import state as if __main__.py were executed with -m:
# foo/__main__.py
import os
import sys

if __package__ is None and __name__ == "__main__":  # executed without -m
    # set special attributes as if part of the package
    __file__ = os.path.abspath(__file__)
    __package__ = os.path.basename(os.path.dirname(__file__))
    # replace the import path entry for __main__ with the one for the package
    script_dir = os.path.dirname(__file__)      # .../my_project/foo
    package_dir = os.path.dirname(script_dir)   # .../my_project
    try:
        index = sys.path.index(script_dir)
        if index not in (0, 1):
            raise ValueError('expected script directory after current directory or matching it')
    except ValueError:
        raise RuntimeError('sys.path does not include script directory as expected')
    else:
        sys.path[index] = package_dir

# import regularly
from . import bar
This exploits that python3 path/to/foo/__main__.py executes __main__ as a standalone script: __package__ is None and the __name__ does not include the package either. The search path in this case is <current directory>, <__main__ directory>, ..., though it gets collapsed if the two are the same: the index is either 0 or 1.
As with all trickery on internals, there is some transient state where invariants are violated. Do not perform any imports before the module is patched!
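With that guard in place, both invocations should behave the same (a usage sketch based on the question's layout):
$ cd my_project
$ python3 -m foo             # -m sets up __package__ and sys.path by itself
$ python3 foo/__main__.py    # direct run, handled by the patching above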
I have several 'app' modules (which are started by a main application)
and a utility module with some functionality:
my_utility/
├── __init__.py
└── __main__.py
apps/
├── app1/
│ ├── __init__.py
│ └── __main__.py
├── app2/
│ ├── __init__.py
│ └── __main__.py
...
main_app.py
The apps are being started like this (by the main application):
python3 -m <app-name>
I need to provide some meta information (tied to the module) about each app which is readable by the main_app and the apps themselves:
apps/app1/__init__.py:
meta_info = {'min_platform_version': '1.0',
             'logger_name': 'mm1'}
... and use it like this:
apps/app1/__main__.py:
from my_utility import handle_meta_info
# does something with meta_info (checking, etc.)
handle_meta_info()
main_app.py:
import importlib

mod = importlib.import_module('app1')
meta_inf = getattr(mod, 'meta_info')
do_something(meta_inf)
The Problem
I don't know how to access meta_info from within the apps. I know I can
import the module itself and access meta_info:
apps/app1/__main__.py:
import app1
do_something(app1.meta_info)
But this is only possible if I know the name of the module. From inside another module - e.g. my_utility - I don't know how to access the module which was started in the first place (or its name).
my_utility/__main__.py:
def handle_meta_info():
    import MAIN_MODULE  # <-- don't know what to import here
    do_something(MAIN_MODULE.meta_info)
In other words
I don't know how to access meta_info from within an app's process (started via python3 -m <name>) from another module which does not know the name of the 'root' module that was started.
Approaches
- Always provide the module name when calling meta-info functions (bad, because it's verbose and redundant):
  from my_utility import handle_meta_info
  handle_meta_info('app1')
- Add meta_info to __builtins__ (generally bad to pollute the global namespace)
- Parse the command line (ugly)
- Analyze the call stack on import my_utility (dangerous, ugly)
The solution I'd like to see
It would be nice to be able to either access the "main" module's global namespace OR know its name (to import it):
my_utility/__main__.py:
def handle_meta_info():
    do_something(__main_module__.meta_info)
OR
def handle_meta_info():
    if process_has_been_started_as_module():
        mod = importlib.import_module(name_of_main_module())
        meta_inf = getattr(mod, 'meta_info')
        do_something(meta_inf)
Any ideas?
My current (bloody) solution:
Inside my_utility I use psutil to get the command line the module was started with (why not sys.argv? Because.). From it I extract the module name. This way I attach the desired meta information to my_utility (so I have to load it only once).
my_utility/__init__.py:
def __get_executed_modules_meta_info__() -> dict:
    def get_executed_module_name():
        from psutil import Process
        from os import getpid
        _cmdline = Process(getpid()).cmdline()
        try:
            # normal case: app has been started via 'python3 -m <app>'
            return _cmdline[_cmdline.index('-m') + 1]
        except ValueError:
            return None

    from importlib import import_module
    try:
        return import_module(get_executed_module_name()).meta_info
    except (AttributeError, TypeError):
        # no meta_info attribute, or not started via -m at all
        return {}
__executed_modules_meta_info__ = __get_executed_modules_meta_info__()
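For what it's worth, an alternative sketch that avoids psutil (my addition, assuming Python 3 and a 'python3 -m <app>' launch): when started with -m, the __main__ module keeps the __spec__ of the module it was loaded from, so the app name can be recovered without parsing any command line:
my_utility/__init__.py:
import sys

def get_executed_module_name():
    spec = getattr(sys.modules['__main__'], '__spec__', None)
    if spec is None:  # started as a plain script, not via -m
        return None
    # 'app1.__main__' -> 'app1'; a bare module keeps its own name
    return spec.parent or spec.name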
I am trying to organize some modules for my own use. I have something like this:
lib/
    __init__.py
    settings.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
In lib/__init__.py, I want to define some classes to be used if I import lib. However, I can't seem to figure it out without separating the classes into files and importing them in __init__.py.
Rather than say:
lib/
    __init__.py
    settings.py
    helperclass.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
from lib.settings import Values
from lib.helperclass import Helper
I want something like this:
lib/
    __init__.py  # Helper defined in this file
    settings.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
from lib.settings import Values
from lib import Helper
Is it possible, or do I have to separate the class into another file?
EDIT
OK, if I import lib from another script, I can access the Helper class. How can I access the Helper class from settings.py?
The example here describes Intra-Package References. I quote: "submodules often need to refer to each other". In my case, lib/settings.py needs Helper, and lib/foo/someobject.py needs access to Helper, so where should I define the Helper class?
'lib/'s parent directory must be in sys.path.
Your 'lib/__init__.py' might look like this:
from . import settings # or just 'import settings' on old Python versions
class Helper(object):
    pass
Then the following example should work:
from lib.settings import Values
from lib import Helper
Answer to the edited version of the question:
__init__.py defines how your package looks from outside. If you need to use Helper in settings.py then define Helper in a different file e.g., 'lib/helper.py'.
.
|-- import_submodule.py
`-- lib
    |-- __init__.py
    |-- foo
    |   |-- __init__.py
    |   `-- someobject.py
    |-- helper.py
    `-- settings.py
2 directories, 6 files
The command:
$ python import_submodule.py
Output:
settings
helper
Helper in lib.settings
someobject
Helper in lib.foo.someobject
# ./import_submodule.py
import fnmatch, os
from lib.settings import Values
from lib import Helper

print
for root, dirs, files in os.walk('.'):
    for f in fnmatch.filter(files, '*.py'):
        print "# %s/%s" % (os.path.basename(root), f)
        print open(os.path.join(root, f)).read()
        print
# lib/helper.py
print 'helper'

class Helper(object):
    def __init__(self, module_name):
        print "Helper in", module_name
# lib/settings.py
print "settings"
import helper
class Values(object):
pass
helper.Helper(__name__)
# lib/__init__.py
#from __future__ import absolute_import
import settings, foo.someobject, helper
Helper = helper.Helper
# foo/someobject.py
print "someobject"
from .. import helper
helper.Helper(__name__)
# foo/__init__.py
import someobject
If lib/__init__.py defines the Helper class then in settings.py you can use:
from . import Helper
This works because . refers to the current package, and acts as a synonym for the lib package from the point of view of the settings module. Note that it is not necessary to export Helper via __all__.
(Confirmed with python 2.7.10, running on Windows.)
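Put together, a minimal sketch of that arrangement (the ordering comment is my addition):
# lib/__init__.py
class Helper(object):
    pass

from . import settings  # imported after Helper is defined

# lib/settings.py
from . import Helper

class Values(object):
    pass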
You just put them in __init__.py.
So with test/classes.py being:
class A(object): pass
class B(object): pass
... and test/__init__.py being:
from classes import *
class Helper(object): pass
You can import test and have access to A, B and Helper
>>> import test
>>> test.A
<class 'test.classes.A'>
>>> test.B
<class 'test.classes.B'>
>>> test.Helper
<class 'test.Helper'>
Add something like this to lib/__init__.py
from .helperclass import Helper
Now you can import it directly:
from lib import Helper
Edit, since I misunderstood the question:
Just put the Helper class in __init__.py. That's perfectly Pythonic. It just feels strange coming from languages like Java.
Yes, it is possible. You might also want to define __all__ in __init__.py files. It's a list of modules that will be imported when you do
from lib import *
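For instance, a short sketch (the names come from the question's layout):
# lib/__init__.py
class Helper(object):
    pass

__all__ = ['settings', 'foo', 'bar', 'Helper']  # what 'from lib import *' brings in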
Maybe this could work:
import __init__ as lib