I have the following package:
foo/
    __init__.py:
        from .body import UsefulClass
        from .another import AnotherClass
    body.py:
        from . import util
        class UsefulClass:
            util.do_something()
    another.py:
        class AnotherClass: ...
    util.py:
        def do_something(): ...
The idea is, when someone imports foo, they should be able to use foo.UsefulClass without worrying about the internal structure of the package. In other words, I don't want them to import foo.body, just foo.
However, when I do from . import util in body.py, this also imports __init__.py, which in turn imports body once again. I realize that Python handles this situation well; still, I'm not comfortable with this obviously circular dependency.
Is there a better way to export things at the package level without creating circular dependencies in imports?
PS: I'd like to avoid in-function imports
I think your initial premise is wrong. With your current setup, an import foo statement imports __init__.py and body.py only once, as can be shown by putting a print statement at the top of each of those files:
The layout of directory foo (I have omitted another.py, since its presence or absence is not relevant):
File __init__.py:
print('__init__.py imported')
from .body import UsefulClass
File body.py:
print('body.py imported')
from . import util
class UsefulClass:
    x = util.do_something()
File util.py:
def do_something():
    return 9
And finally:
Python 3.8.5 (tags/v3.8.5:580fbb0, Jul 20 2020, 15:57:54) [MSC v.1924 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import foo
__init__.py imported
body.py imported
>>> foo.UsefulClass.x
9
>>>
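If you still want __init__.py to avoid importing the submodules eagerly, one option (not part of the answer above, just a sketch) is PEP 562's module-level __getattr__, available in Python 3.7+, which defers the import until the attribute is first accessed:
# foo/__init__.py -- lazy re-export via module __getattr__ (PEP 562, Python 3.7+); a sketch
def __getattr__(name):
    if name == 'UsefulClass':
        from .body import UsefulClass
        return UsefulClass
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
Users still write import foo; foo.UsefulClass, but body.py is only imported on first access, so nothing in body runs during the package's own initialization.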
Related
I need to import a single class (not the whole module) from a Python file whose name starts with a number.
There is an existing topic on importing such a module as a whole, and that works, but I can't find my way around this case.
(In python, how to import filename starts with a number)
Normally it would be:
from uni_class import Student
though the file is called 123_uni_class.
I tried different variations of
importlib.import_module("123_uni_class")
and
uni_class=__import__("123_uni_class")
Error:
from 123_uni_class import Student
^
SyntaxError: invalid decimal literal
importlib.import_module("123_uni_class") returns the module after importing it; you must bind it to a valid name in order to reuse it:
import importlib
my_uni_class = importlib.import_module("123_uni_class")
Then you can access your module under the name 'my_uni_class'.
This would be equivalent to import 123_uni_class as my_uni_class if 123_uni_class were valid in this context.
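For example (assuming 123_uni_class.py really defines a Student class, as in the question), you could then do:
import importlib

my_uni_class = importlib.import_module("123_uni_class")
Student = my_uni_class.Student   # access attributes through the bound name
student = Student()              # hypothetical usage; constructor arguments depend on your class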
It works for me:
Python 3.6.8 (default, Feb 14 2019, 22:09:48)
[GCC 7.4.0] on cygwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import importlib
>>> importlib.import_module('123_a')
<module '123_a' from '/path/to/123_a.py'>
>>> __import__('123_a')
<module '123_a' from '/path/to/123_a.py'>
You would not see an actual syntax error containing the literal text "from 123_uni_class import ..." unless you actually have some source code containing that line.
If you must, you can also bypass the import system entirely by reading the contents of the file and exec()-ing them, possibly into a namespace you provide. For example:
mod = {}
with open('123_uni_class.py') as fobj:
    exec(fobj.read(), mod)
Student = mod['Student']
This general technique is used, for example, to read config files written in Python. I would discourage it for normal usage, though, and suggest you simply give the module a valid name.
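Another option that avoids exec() while still loading the file by path is importlib.util; this is a sketch I'm adding here, not from the answers above, and the module name chosen ("uni_class") is arbitrary:
import importlib.util

spec = importlib.util.spec_from_file_location("uni_class", "123_uni_class.py")
uni_class = importlib.util.module_from_spec(spec)
spec.loader.exec_module(uni_class)   # runs the file's code inside the new module object
Student = uni_class.Student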
In my project, I am trying to share a global variable across modules as described in the Python documentation: https://docs.python.org/3/faq/programming.html#how-do-i-share-global-variables-across-modules
I seemed to find, however, that the global variable is instantiated twice and I am not sure why. I want the variable to be a singleton.
I was able to reproduce the issue with this minimal example:
# main.py
class TheClass:
    def __init__(self):
        print("init TheClass")

    def do_work(self):
        import worker
        worker.do_work()

the_class: TheClass = TheClass()

if __name__ == '__main__':
    the_class.do_work()
# worker.py
from main import the_class

def do_work():
    print("doing work...")
Output of python main.py:
init TheClass
init TheClass
doing work...
'init TheClass' is logged twice which implies that the class was instantiated twice.
I was unable to reproduce this behaviour in the interactive Python shell:
Python 3.7.4 (tags/v3.7.4:e09359112e, Jul 8 2019, 20:34:20) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import main
init TheClass
>>> main.the_class.do_work()
doing work...
>>>
How do I fix this undesired behaviour?
This happens because you have a circular import, combined with something special about the file you actually start Python with.
The file you pass to python becomes the __main__ module, regardless of its filename. If the same file is later imported, it is not recognized as already imported and is re-imported under its real name, main in your case. That is why the module-level code runs twice; a third import would not rerun it, because main is now in sys.modules.
You can fix it by avoiding the circular import (don't put things that others need to import into the start script), or by moving the code that should not run twice under the if __name__ == '__main__': check you already have.
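A common way to apply the first suggestion is to move the shared object into its own module, so nothing ever needs to import the start script. A minimal sketch, assuming a new file called shared.py (the name is mine, not from the question):
# shared.py -- holds the singleton; every importer gets the same instance
class TheClass:
    def __init__(self):
        print("init TheClass")

the_class: TheClass = TheClass()

# worker.py
from shared import the_class

def do_work():
    print("doing work...")

# main.py
from shared import the_class   # the same instance worker sees
import worker

if __name__ == '__main__':
    worker.do_work()   # "init TheClass" is printed exactly once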
I've always used from a import b but recently a team at work decided to move a module into a new namespace, and issued a warning notice telling people to replace import b with import a.b as b.
I've never used import as and the only documentation I can find seems to suggest it doesn't support import a.b as b, though clearly it does.
But is there actually a difference, and if so, what is it?
As far as I know (correct me if I am wrong):
First, import a.b requires b to be a module (a file) or a package (a directory containing __init__.py). For example, you can import tornado.web as web, but you cannot import flask.Flask as Flask, because Flask is an object inside the flask package, not a module.
Second, a plain import a.b also binds the top-level name a in your namespace, which from a import b does not (and import a.b as b binds only b). You can check this with globals().
So what's the practical effect? For example:
import tornado.web as web
This imports the whole tornado package (it ends up in sys.modules), although only the name web is bound in your namespace. The tornado.options submodule has not been loaded, even though the tornado package contains it, because submodules are only loaded when something imports them. Thanks to Python's global module cache, if you later do from tornado import options, you not only get the name options bound locally, the submodule is also set as an attribute of the tornado package, so it becomes reachable as tornado.options as well.
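A quick way to see the binding differences is the standard library's xml package (a small demo I'm adding here, not part of the original answer):
import xml.sax as sax          # binds only "sax"; the parent package xml is imported but not bound (Python 3)
print('xml' in globals())      # False
print('sax' in globals())      # True

import xml.dom                 # a plain dotted import binds the top-level name "xml"
print('xml' in globals())      # True

from xml import etree          # binds only "etree", and also sets it as an attribute of the xml package
print(xml.etree is etree)      # True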
I am going to provide only a partial answer, and I will speculate a bit.
1) Sometimes I have observed that the from a import b form works while import a.b as b does not. On my system:
Python 3.6.3 (default, Oct 3 2017, 21:45:48)
[GCC 7.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from tensorflow import keras # <- this works
>>>
>>> import tensorflow.keras as K # <- this fails
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'tensorflow.keras'
>>>
2) I usually don't see a difference between these two approaches to importing. I haven't investigated why there is a difference with TensorFlow. It may have to do with which names the various TensorFlow subpackage __init__.py files pull up to the top level (most of them are completely empty, but the one in ../dist_packages/tensorflow/python is quite long).
I have the following folder structure for a Python 3 project, where vehicle.py is the main script and the folder stats is treated as a package containing several modules:
project/
    vehicle.py
    stats/
        cars.py
        trucks.py
The cars module defines the following functions:
def neon():
    print('Neon')
    print('mpg = 32')

def mustang():
    print('Mustang')
    print('mpg = 27')
Using Python 3, I can access the functions in each module from within vehicle.py as follows:
import stats.cars as c
c.mustang()
However, I would like to access the functions defined in each module directly, but I receive an error when doing this:
import stats as st
st.mustang()
# AttributeError: 'module' object has no attribute 'mustang'
I also tried placing an __init__.py file in the stats folder with the following code:
from cars import *
from trucks import *
but I still receive an error:
import stats as st
st.mustang()
# ImportError: No module named 'cars'
I'm trying to use the same approach as NumPy such as:
import numpy as np
np.arange(10)
# prints array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
How can I create a package like NumPy in Python 3 to access functions directly in modules?
Put an __init__.py file in the stats folder (as others have said), and put this in it:
from .cars import neon, mustang
from .trucks import truck_a, truck_b
Not so neat, but easier would be to use the * wildcard:
from .cars import *
from .trucks import *
This way, the __init__.py script does some importing for you, into its own namespace.
Now you can use functions/classes from the neon/mustang module directly after you import stats:
import stats as st
st.mustang()
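If you go the wildcard route, you can also control what each submodule exposes with __all__. A minimal sketch (the function names come from the question; the __all__ line is my addition):
# stats/cars.py -- __all__ limits what "from .cars import *" re-exports
__all__ = ['neon', 'mustang']

def neon():
    print('Neon')
    print('mpg = 32')

def mustang():
    print('Mustang')
    print('mpg = 27')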
Add an empty __init__.py file in your stats folder and the magic happens.
You need to create an __init__.py file in the stats folder.
The __init__.py files are required to make Python treat directories containing them as packages.
Documentation
Have you tried something like
from stats import cars as c
You may also need an empty __init__.py file in that directory.
host:~ lcerezo$ python
Python 2.7.10 (default, Oct 23 2015, 18:05:06)
[GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.0.59.5)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from boto.s3.connection import S3Connection as mys3
>>>
Importing the standard "logging" module pollutes sys.modules with a bunch of dummy entries:
Python 2.5.4 (r254:67916, Dec 23 2008, 15:10:54) [MSC v.1310 32 bit (Intel)] on win32
>>> import sys
>>> import logging
>>> sorted(x for x in sys.modules.keys() if 'log' in x)
['logging', 'logging.atexit', 'logging.cStringIO', 'logging.codecs',
'logging.os', 'logging.string', 'logging.sys', 'logging.thread',
'logging.threading', 'logging.time', 'logging.traceback', 'logging.types']
# and perhaps even more surprising:
>>> import traceback
>>> traceback is sys.modules['logging.traceback']
False
>>> sys.modules['logging.traceback'] is None
True
So importing this package puts extra names into sys.modules, except that they are not modules, just references to None. Other modules (e.g. xml.dom and encodings) have this issue as well. Why?
Edit: Building on bobince's answer, there are pages describing the origin (see section "Dummy Entries in sys.modules") and future of the feature.
None values in sys.modules are cached failures of relative lookups.
So when you're in package foo and you import sys, Python looks first for a foo.sys module, and if that fails goes to the top-level sys module. To avoid having to check the filesystem for foo/sys.py again on further relative imports, it stores None in the sys.modules to flag that the module didn't exist and a subsequent import shouldn't look there again, but go straight to the loaded sys.
This is a CPython implementation detail you can't usefully rely on, but you will need to know about it if you're doing nasty import/reload hacking.
It happens to all packages, not just logging. For example, import xml.dom and see xml.dom.xml in the module list as it tries to import xml from inside xml.dom.
As Python moves towards absolute imports, this ugliness will happen less.
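For reference, here is a quick way to list those dummy entries yourself (this only shows them on Python 2, since implicit relative imports were removed in Python 3, so a modern interpreter prints an empty list):
# Python 2: list the None placeholders left behind by failed relative lookups
import sys
import logging

dummies = sorted(name for name, mod in sys.modules.items() if mod is None)
print(dummies)   # e.g. ['logging.os', 'logging.sys', 'logging.time', ...] on Python 2.x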