Python: relative import imports whole package

I just noticed that a relative import like this:
from .foo import myfunc
print myfunc # ok
print foo # ok
imports both foo and myfunc. Is such behaviour documented anywhere? Can I disable it?
-- Update
Basically problem is following.
bar/foo/__init__.py:
__all__ = ['myfunc']
def myfunc(): pass
bar/__init__.py:
from .foo import *
# here I expect that there is only myfunc defined
main.py:
import foo
from bar import * # this import shadows original foo
I can add __all__ to the bar/__init__.py as well, but that way I have to repeat names in several places.

I am assuming your package layout is
my_package/
    __init__.py
        from .foo import myfunc
    foo.py
        def myfunc(): pass
The statement from .foo import myfunc first imports the module foo, generally without introducing any names into the local scope. After this first step, myfunc is imported into the local namespace.
In this particular case, however, the first step also imports the module into the local namespace: sub-modules of packages are put into the package's namespace upon import, regardless of where they are imported from. Since __init__.py is also executed in the package's namespace, this happens to coincide with the local namespace.
You cannot reasonably disable this behaviour. If you don't want the name foo in your package's namespace, my advice is to rename the module to _foo to mark it as internal.
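If renaming is not an option, a minimal sketch for the layout in the update: reuse foo's own __all__ as bar's __all__, so the names are not repeated and the star import in main.py no longer picks up the submodule binding:
# bar/__init__.py -- sketch for the bar/foo layout from the update
from .foo import *                    # brings in myfunc (and binds the submodule name foo here)
from .foo import __all__ as __all__   # "from bar import *" now exports exactly foo's public names
With __all__ defined this way, the star import in main.py only exports the listed names, so the submodule binding stays out of the importer's namespace.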

Related

Import objects from a python module that has the same name as an __init__ import

The project is structured as follows:
dir/
    __init__.py
    foo.py
foo.py has a function that uses a module-level assignment:
"""foo.py"""
BAR = 12345
def foo():
    # do something with BAR
My goal is to import the object BAR to use in my own code. However, __init__.py imports the function foo, and that binding masks any attempt to reach foo as a module:
"""__init__.py"""
from dir.foo import foo
So when I interact with the package, I'm only able to see dir.foo as a function definition instead of a module. How can I get access to dir.foo.BAR?
Lame hack:
import sys
foo_module = sys.modules["dir.foo"]
bar = foo_module.BAR
A better way might be to ask the author of dir not to shadow the submodule name in the top-level namespace, for example by not naming a function the same as the module in which it is defined.
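A slightly tidier variant of the same hack, shown here only as a sketch, is to ask the import machinery for the submodule explicitly instead of reaching into sys.modules:
import importlib

foo_module = importlib.import_module("dir.foo")  # the submodule, even though dir shadows the name "foo"
bar = foo_module.BAR
importlib.import_module returns the module object itself, regardless of what names the package re-exports.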

Why is relative import so restrictive?

With directory:
app/
    sub1/
        __init__.py
        module1.py
    sub2/
        __init__.py
        test.py
what I imagine importing a module does is:
create a scope (or thread?)
run module.py in that scope
from ..sub1 import module1 is invalid when test.py is the top-level script,
but open('../sub1/module1.py', 'r') works !!
So it's readable, but not importable.
Start with something similar to from module1 import *:
exec(open('../sub1/module1.py', 'r').read())
Go further by executing this script in a specific scope, and naming that scope.
A class body provides a scope, and accessing class variables is similar to accessing module variables.
# roughly: import module1 as cus
class Module:
    exec(open('../sub1/module1.py', 'r').read(), locals(), locals())
cus = Module()
cus.function_inside_module1()
The function exec(object[, globals[, locals]]) runs object in the globals scope and stores variables into locals (I guess).
Since the globals and locals arguments are both the locals() of class Module, it behaves like what I imagine import does.
If this works properly, a module under a module could be written as a nested class, I guess.
What kind of problems will this odd importing cause?
If not, why is a file readable but not importable (with the top-level restriction)?
Edit
@user2357112 sorry, I didn't know how to write a multiline comment:
would this give the behaviour you asked about for loading the parent package?
class sub1:
    exec(open('../sub1/__init__.py', 'r').read(), locals(), locals())
    class Module:
        exec(open('../sub1/module1.py', 'r').read(), locals(), locals())
cus = sub1.Module()
del sub1
Relative imports are not a directory traversal mechanism. from ..a import b does not mean "go up a directory, enter the a directory, and load b.py". It means "import the b member of the a submodule of the current package's parent package". This usually looks a lot like what the directory traversal would do, but it is not the same, especially for cases involving namespace packages, custom module loaders, or sys.modules manipulation.
sub2 has no parent package. Trying to refer to a nonexistent parent package is an error. Also, if you ran test.py directly by file name, sub2 is not even considered a package at all.
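For completeness, a minimal sketch of how the relative import does become legal: give sub2 a real parent package by adding an app/__init__.py, and run test.py as a module of that package rather than as a script:
# app/sub2/test.py
# run from the directory that contains app/ with:  python -m app.sub2.test
from ..sub1 import module1      # "app" is now the parent package of sub2

print(module1.__name__)         # app.sub1.module1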

Custom Py Module: Intra Module Importing

I am creating a custom Python module that I wanted to work similarly to something like numpy, where most of the functionality can be accessed by calling np.function() or what have you. However, in creating my submodules and __init__.py file, I think that I've run into a circular import reference. Here's what I'm trying to do and the error I'm seeing; any tips to help me get this working the way I envisioned are greatly appreciated:
Let's call the module "foobar" and say I have a folder named "foobar" on the path and in this folder are three py files: "__init__.py", "foo.py", and "bar.py".
An example of submodule 1, "foo.py":
import bar
class Foo():
    def __init__(self):
        pass
    def runfunc(self):
        bar.func()
An example of submodule 2, "bar.py":
import foo
def func():
    f = foo.Foo()
An example of __init__.py:
import numpy
from foo import *
from bar import *
__all__ = ['foo', 'bar']
So what I perceive to be my issue is that when foo is imported, it imports bar, which imports foo, which creates a loop (in my mind). I am also confused about the purpose of __init__.py if, when it runs and imports other modules, those modules cannot see each other while executing (i.e. I need to import bar into foo even though __init__.py already did this).
When I run a script that tries to import the entire module, I get the error in bar.py "ImportError: cannot import name foo". I am actually also having this issue elsewhere in the same module, where I have a file with the base class "Base" and the extended class "Extended": base.py needs to import extended.py so it can spawn instances of the extended class, and extended.py needs to import base.py so it can inherit the class.
Thanks for any help in advance!
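A minimal sketch of one common way out, assuming Python 3 and the foobar layout above: use explicit relative imports (a plain import bar will not find a sibling submodule there) and defer one side of the cycle into the function that needs it, so neither module has to be fully initialised before the other:
# foobar/foo.py -- sketch
class Foo(object):
    def runfunc(self):
        from . import bar      # deferred relative import breaks the cycle
        bar.func()

# foobar/bar.py -- sketch
def func():
    from . import foo          # only runs when func() is called
    f = foo.Foo()

# foobar/__init__.py -- sketch
from .foo import Foo
from .bar import func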

Why does this import-related code work in __init__.py but not in a different .py file?

Let's have this __init__.py in a Python3 package:
from .mod1 import *
from .mod2 import *
from .mod3 import *
__all__ = mod1.__all__ + mod2.__all__ + mod3.__all__
The code looks quite simple and does what is expected: it imports from modules mod1, mod2 and mod3 all symbols that these modules have put into their __all__ list and then a summary of all three __all__ lists is created.
I tried to run the very same code in a module, i.e. not in the __init__.py. It imported the three modules, but mod1, mod2 and mod3 were undefined variables.
(BTW, if you run pylint on the original __init__.py, you will get this error too.)
The same statement from .mod1 import * creates a mod1 object when executed in the __init__.py, but does not create it elsewhere. Why?
__init__.py is a special file, but till now, I thought only its name was special.
According to the documentation, this is expected behaviour:
When a submodule is loaded using any mechanism (e.g. importlib APIs, the import or import-from statements, or built-in __import__()) a binding is placed in the parent module’s namespace to the submodule object. For example, if package spam has a submodule foo, after importing spam.foo, spam will have an attribute foo which is bound to the submodule.
In other words, when you do a from .whatever import something within a module, you magically get a whatever attribute bound to the parent package. Since __init__.py executes in the package's own namespace, you can access that attribute there as if it were a variable defined in the file. In another module you cannot. In this sense __init__.py is special indeed.
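If you want the original snippet to work outside __init__.py as well, a minimal sketch (assuming the same mod1/mod2/mod3 package) is to bind the submodule names explicitly instead of relying on that attribute being visible as a variable:
# any other module inside the same package -- sketch
from . import mod1, mod2, mod3      # bind the submodule objects as local names
from .mod1 import *
from .mod2 import *
from .mod3 import *

__all__ = mod1.__all__ + mod2.__all__ + mod3.__all__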

Import module that contains main module functions

Suppose I have this folder structure:
module/
    module.py
    __init__.py
main.py
main.py imports module.py, which itself needs to call functions that are only defined in main.py. For example, main.py code:
from module import *
def foo(var):
    print var
module.foo_module()
Content of module.py:
def foo_module():
    foo("Hello world!")
Is there any way I can achieve this without repeating the functions? If not, how can I import main.py into module.py?
Many thanks
Everything is an object in Python, including functions. You can pass the necessary function as an argument. Whether this makes sense in your case, I don't have enough information to know.
# main.py
def foo(var):
    print var

module.foo_module(foo)

# module/module.py
def foo_module(foo):
    foo("Hello world!")
Avoid circular imports. You could do that here by placing foo in module.
If you don't want foo in module, you could instead create a separate module bar to hold foo, and import bar in both main and module.
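A minimal sketch of that last option, with a hypothetical top-level bar.py holding foo so that neither side has to import the other:
# bar.py  (hypothetical helper, next to main.py and the module/ package)
def foo(var):
    print(var)

# module/module.py
import bar

def foo_module():
    bar.foo("Hello world!")

# main.py
import module.module

module.module.foo_module()
Both main.py and module/module.py now depend only on bar, so there is no cycle.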
