Use 'import module' or 'from module import'?

I've tried to find a comprehensive guide on whether it is best to use import module or from module import. I've just started with Python and I'm trying to start off with best practices in mind.
Basically, I was hoping someone could share their experiences: what preferences other developers have, and what's the best way to avoid gotchas down the road?

The difference between import module and from module import foo is mainly subjective. Pick the one you like best and be consistent in your use of it. Here are some points to help you decide.
import module
Pros:
Less maintenance of your import statements. Don't need to add any additional imports to start using another item from the module
Cons:
Typing module.foo in your code can be tedious and redundant (tedium can be minimized by using import module as mo then typing mo.foo)
from module import foo
Pros:
Less typing to use foo
More control over which items of a module can be accessed
Cons:
To use a new item from the module you have to update your import statement
You lose context about foo. For example, it's less clear what ceil() does compared to math.ceil()
Either method is acceptable, but don't use from module import *.
In any reasonably large codebase, an import * tends to get cemented into the module, impossible to remove. It becomes difficult to determine which names used in the code actually come from 'module', so you can easily reach the point where you think you no longer need the import but can't be sure.
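To make the trade-off concrete, here is a small, runnable sketch using the standard math module (any module behaves the same way):
import math                 # qualify every use: math.ceil(2.5)
from math import ceil       # bare name, but the import list needs maintenance: ceil(2.5)
import math as m            # shorter qualified form: m.ceil(2.5)

print(math.ceil(2.5), ceil(2.5), m.ceil(2.5))   # all three call the same function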

There's another detail here, not mentioned, related to writing to a module. Granted this may not be very common, but I've needed it from time to time.
Due to the way references and name binding work in Python, if you want to update some symbol in a module, say foo.bar, from outside that module and have other importing code "see" that change, you have to import foo a certain way. For example:
module foo:
bar = "apples"

module a:
import foo
foo.bar = "oranges"   # update bar inside the foo module object

module b:
import foo
print(foo.bar)        # if executed after a's "foo.bar" assignment, this prints "oranges"
However, if you import symbol names instead of module names, this will not work.
For example, if I do this in module a:
from foo import bar
bar = "oranges"
No code outside of a will see bar as "oranges", because my assignment to bar merely rebound the name "bar" inside module a; it did not "reach into" the foo module object and update its bar.
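The same behaviour can be reproduced in a single file, using the standard json module purely as a stand-in for foo (the MY_FLAG attribute is made up for the demonstration):
import sys
import json

json.MY_FLAG = "oranges"            # rebind an attribute on the module object itself
print(sys.modules["json"].MY_FLAG)  # every 'import json' shares this object -> "oranges"

from json import MY_FLAG            # copies the current value into a local name
MY_FLAG = "apples"                  # rebinding the local name...
print(json.MY_FLAG)                 # ...leaves the module untouched -> still "oranges"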

Even though many people have already explained import vs. from ... import, I want to go a bit deeper into what happens under the hood, and where things actually change.
import foo:
Imports foo and creates a reference to that module in the current namespace. You then need to use the full module path to access a particular attribute or method from inside the module.
E.g. foo.bar but not bar
from foo import bar:
Imports foo, and creates references to all the members listed (bar). Does not set the variable foo.
E.g. bar but not baz or foo.baz
from foo import *:
Imports foo, and creates references to all public objects defined by that module in the current namespace (everything listed in __all__ if __all__ exists, otherwise everything that doesn't start with _). Does not set the variable foo.
E.g. bar and baz but not _qux or foo._qux.
Now let's see what happens when we do import X.Y:
>>> import sys
>>> import os.path
Check sys.modules with name os and os.path:
>>> sys.modules['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> sys.modules['os.path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
Check globals() and locals() namespace dicts with os and os.path:
>>> globals()['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> locals()['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> globals()['os.path']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'os.path'
>>>
From the above example we can see that only os is inserted into the local and global namespaces.
So we should be able to use:
>>> os
<module 'os' from
'/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> os.path
<module 'posixpath' from
'/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>>
But not path.
>>> path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'path' is not defined
>>>
Once you delete os from the locals() namespace, you won't be able to access os or os.path, even though they still exist in sys.modules:
>>> del locals()['os']
>>> os
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
>>> os.path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
>>>
Now let's talk about from ... import:
>>> import sys
>>> from os import path
Check sys.modules with os and os.path:
>>> sys.modules['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> sys.modules['os.path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
In sys.modules we find the same entries as before, when we used import os.path.
OK, let's check what the locals() and globals() namespace dicts look like:
>>> globals()['path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> locals()['path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> globals()['os']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'os'
>>>
You can access it by the name path, but not by os.path:
>>> path
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> os.path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
>>>
Let's delete 'path' from locals():
>>> del locals()['path']
>>> path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'path' is not defined
>>>
One final example using an alias:
>>> from os import path as HELL_BOY
>>> locals()['HELL_BOY']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> globals()['HELL_BOY']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>>
And no path defined:
>>> globals()['path']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'path'
>>>

Both ways are supported for a reason: there are times when one is more appropriate than the other.
import module: nice when you are using many bits from the module. drawback is that you'll need to qualify each reference with the module name.
from module import ...: nice that imported items are usable directly without module name prefix. The drawback is that you must list each thing you use, and that it's not clear in code where something came from.
Which to use depends on which makes the code clear and readable, and has more than a little to do with personal preference. I lean toward import module generally because in the code it's very clear where an object or function came from. I use from module import ... when I'm using some object/function a lot in the code.

I personally always use
from package.subpackage.subsubpackage import module
and then access everything as
module.function
module.modulevar
etc. The reason is that you get a short invocation while still clearly defining the module namespace of each routine, which is very useful when you have to search your source for usages of a given module.
Needless to say, do not use import *: it pollutes your namespace and does not tell you which module a given function comes from.
Of course, you can run into trouble if two different packages contain modules with the same name, like
from package1.subpackage import module
from package2.subpackage import module
In this case you do run into trouble, but that is a strong hint that your package layout is flawed and you should rethink it.
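Before restructuring, note that aliasing on import is often enough to sidestep a one-off clash; a minimal sketch using two real standard-library functions that share a name:
from json import load as json_load        # same bare name in two different modules...
from pickle import load as pickle_load    # ...kept apart by explicit aliases

print(json_load is pickle_load)            # False: two distinct functions, no collision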

import module
Is best when you will use many functions from the module.
from module import function
Is best when you want to avoid polluting the global namespace with all the functions and types from a module when you only need function.

I've just discovered one more subtle difference between these two methods.
If module foo uses a following import:
from itertools import count
Then another module, say bar, can by mistake use count as though it were defined in foo rather than in itertools:
import foo
foo.count()
If foo uses:
import itertools
the mistake is still possible, but less likely to be made, because bar would need to write:
import foo
foo.itertools.count()
This caused me some trouble: I had a module that mistakenly imported an exception from a module that did not define it, only imported it from another module (using from module import SomeException). When that import was no longer needed and was removed, the offending module broke.
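One way to defend against this in foo is to make the re-exported name visibly internal and declare the public API explicitly; a small sketch of what foo.py could look like (tick() and _counter are made-up names):
# foo.py
from itertools import count as _count   # leading underscore: clearly not part of foo's API

__all__ = ["tick"]                       # 'from foo import *' exposes only this

_counter = _count()

def tick():
    return next(_counter)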

Here is another difference not mentioned. This is copied verbatim from http://docs.python.org/2/tutorial/modules.html
Note that when using
from package import item
the item can be either a submodule (or subpackage) of the package, or some other name defined in the package, like a function, class or variable. The import statement first tests whether the item is defined in the package; if not, it assumes it is a module and attempts to load it. If it fails to find it, an ImportError exception is raised.
Contrarily, when using syntax like
import item.subitem.subsubitem
each item except for the last must be a package; the last item can be a module or a package but can’t be a class or function or variable defined in the previous item.
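Both kinds of 'item' described in the quoted passage can be seen with the os package (a quick, runnable illustration):
from os import path     # item is a submodule (os.path)
from os import getcwd   # item is a name (function) the package exposes

print(path.join("a", "b"), getcwd())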

Since I am also a beginner, I will be trying to explain this in a simple way:
In Python, we have three types of import statements which are:
1. Generic imports:
import math
This type of import is my personal favorite. The only downside of this technique is that if you need to use any of the module's functions, you must use the following syntax:
math.sqrt(4)
Of course, it increases the typing effort, but as a beginner it helps you keep track of which module each function is associated with (a good text editor will reduce the typing effort significantly and is recommended).
Typing effort can be further reduced by using this import statement:
import math as m
now, instead of using math.sqrt() you can use m.sqrt().
2. Function imports:
from math import sqrt
This type of import is best suited when your code needs to access only one or a few functions from the module; however, to use any new item from the module you have to update the import statement.
3. Universal imports:
from math import *
Although it reduces the typing effort significantly, it is not recommended, because it fills your namespace with the module's various functions, and their names could conflict with the names of user-defined functions.
example:
If you have a function of your very own named sqrt and you import math, your function is safe: there is your sqrt and there is math.sqrt. If you do from math import *, however, you have a problem: namely, two different functions with the exact same name. Source: Codecademy
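A small, runnable sketch of that clash (the toy sqrt below is made up for illustration):
from math import *

def sqrt(x):              # your own sqrt now shadows the one pulled in by the star import
    return round(x ** 0.5, 1)

print(sqrt(2))            # calls your version -> 1.4

import math
print(math.sqrt(2))       # the qualified name stays unambiguous -> 1.4142135623730951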

I would like to add to this. It can be useful to understand how Python handles imported modules as attributes if you run into circular imports.
I have the following structure:
mod/
    __init__.py
    main.py
    a.py
    b.py
    c.py
    d.py
From main.py I will import the other modules using different import methods:
main.py:
import mod.a
import b
from mod import c
import mod.d as d
dis.dis shows the difference (note module names, a b c d):
  1           0 LOAD_CONST               0 (-1)
              3 LOAD_CONST               1 (None)
              6 IMPORT_NAME              0 (mod.a)
              9 STORE_NAME               1 (mod)

  2          12 LOAD_CONST               0 (-1)
             15 LOAD_CONST               1 (None)
             18 IMPORT_NAME              2 (b)
             21 STORE_NAME               2 (b)

  3          24 LOAD_CONST               0 (-1)
             27 LOAD_CONST               2 (('c',))
             30 IMPORT_NAME              1 (mod)
             33 IMPORT_FROM              3 (c)
             36 STORE_NAME               3 (c)
             39 POP_TOP

  4          40 LOAD_CONST               0 (-1)
             43 LOAD_CONST               1 (None)
             46 IMPORT_NAME              4 (mod.d)
             49 LOAD_ATTR                5 (d)
             52 STORE_NAME               5 (d)
             55 LOAD_CONST               1 (None)
In the end they look much the same (each ends with a STORE_NAME), but this is worth keeping in mind if you need to consider the following four circular-import cases:
example1
foo/
__init__.py
a.py
b.py
a.py:
import foo.b
b.py:
import foo.a
>>> import foo.a
>>>
This works
example2
bar/
__init__.py
a.py
b.py
a.py:
import bar.b as b
b.py:
import bar.a as a
>>> import bar.a
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "bar\a.py", line 1, in <module>
import bar.b as b
File "bar\b.py", line 1, in <module>
import bar.a as a
AttributeError: 'module' object has no attribute 'a'
No dice
example3
baz/
__init__.py
a.py
b.py
a.py:
from baz import b
b.py:
from baz import a
>>> import baz.a
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "baz\a.py", line 1, in <module>
from baz import b
File "baz\b.py", line 1, in <module>
from baz import a
ImportError: cannot import name a
Similar issue... but clearly from x import y is not the same as import x.y as y
example4
qux/
__init__.py
a.py
b.py
a.py:
import b
b.py:
import a
>>> import qux.a
>>>
This one also works
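If you want to reproduce the bytecode comparison yourself, compiling the import statements is enough; compile() does not execute them, so the mod package does not even have to exist (the exact opcodes differ between Python versions, but the stored names are the point):
import dis

src = "import mod.a\nimport b\nfrom mod import c\nimport mod.d as d\n"
dis.dis(compile(src, "<imports>", "exec"))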

import package
import module
With import, the token must be a module (a file containing Python commands) or a package (a folder in the sys.path containing a file __init__.py.)
When there are subpackages:
import package1.package2.package
import package1.package2.module
the requirements for folder (package) or file (module) are the same, but the folder or file must be inside package2 which must be inside package1, and both package1 and package2 must contain __init__.py files. https://docs.python.org/2/tutorial/modules.html
With the from style of import:
from package1.package2 import package
from package1.package2 import module
the package or module enters the namespace of the file containing the import statement as module (or package) instead of package1.package2.module. You can always bind to a more convenient name:
a = big_package_name.subpackage.even_longer_subpackage_name.function
Only the from style of import permits you to name a particular function or variable:
from package3.module import some_function
is allowed, but
import package3.module.some_function
is not allowed.
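The same rules with real standard-library names (a quick sketch):
import os.path              # fine: every dotted component is a package or module
from os.path import join    # fine: only the from-form can name a function directly
# import os.path.join       # ModuleNotFoundError: join is a function, not a module

print(os.path.join("a", "b"), join("a", "b"))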

To add to what people have said about from x import *: besides making it more difficult to tell where names came from, this throws off code checkers like Pylint. They will report those names as undefined variables.

My own answer to this depends mostly on how many different modules I'll be using. If I'm only going to use one or two, I'll often use from ... import, since it makes for fewer keystrokes in the rest of the file; but if I'm going to make use of many different modules, I prefer plain import, because that means each module reference is self-documenting. I can see where each symbol comes from without having to hunt around.
Usually I prefer the self-documenting style of plain import and only change to from ... import when the number of times I have to type the module name grows above 10 to 20, even if there's only one module being imported.

This is the directory structure of my current directory:
.
└── a
    └── b
        └── c
The import statement remembers all intermediate names.
These names have to be qualified:
In[1]: import a.b.c
In[2]: a
Out[2]: <module 'a' (namespace)>
In[3]: a.b
Out[3]: <module 'a.b' (namespace)>
In[4]: a.b.c
Out[4]: <module 'a.b.c' (namespace)>
The from ... import ... statement remembers only the imported name.
This name must not be qualified:
In[1]: from a.b import c
In[2]: a
NameError: name 'a' is not defined
In[2]: a.b
NameError: name 'a' is not defined
In[3]: a.b.c
NameError: name 'a' is not defined
In[4]: c
Out[4]: <module 'a.b.c' (namespace)>
Note: Of course, I restarted my Python console between steps 1 and 2.

There have been many answers, but none have mentioned testing (with unittest or pytest).
tl;dr
Use import foo for external modules to simplify testing.
The Hard Way
Importing classes/functions (from foo import bar) individually from a module makes red-green-refactor cycles tedious. For example, if my file looks like
# my_module.py
from foo import bar

class Thing:
    def do_thing(self):
        bar('do a thing')
and my test is
# test_my_module.py
from unittest.mock import patch
import my_module

@patch.object(my_module, 'bar')
def test_do_thing(mock_bar):
    my_module.Thing().do_thing()
    mock_bar.assert_called_with('do a thing')
At first glance, this seems great. But what happens if I want to move the Thing class into a different file? My structure would have to change like this...
# my_module.py
from tools import Thing

def do_thing():
    Thing().do_thing()

# tools.py
from foo import bar

class Thing:
    def do_thing(self):
        bar('do a thing')

# test_my_module.py
from unittest.mock import patch
import my_module
import tools  # Had to import the implementation file...

@patch.object(tools, 'bar')  # Changed patch
def test_do_thing(mock_bar):
    my_module.do_thing()  # Changed test (expected)
    mock_bar.assert_called_with('do a thing')
Unfortunately, since I used from foo import bar, I need to update my patch to reference the tools module. Essentially, since my test knows too much about implementation, much more than expected needs to be changed to do this refactor.
The Better Approach
Using import foo, my tests can ignore how the implementation is laid out and simply patch the name on the foo module itself.
# my_module.py
from tools import Thing

def do_thing():
    Thing().do_thing()

# tools.py
import foo

class Thing:
    def do_thing(self):
        foo.bar('do a thing')  # Specify that 'bar' comes from the 'foo' module

# test_my_module.py
from unittest.mock import patch
import my_module

@patch('foo.bar')  # Patch bar where it lives; every 'import foo' user sees the mock
def test_do_thing(mock_bar):
    my_module.do_thing()  # Changed test (expected)
    mock_bar.assert_called_with('do a thing')
The less implementation details your tests know, the better. That way, if you come up with a better solution (use classes instead of functions, use additional files to separate ideas, etc.), less needs to be changed in your tests to accommodate the refactor.

One significant difference, which surprisingly no-one has mentioned, is that a plain import lets you access a module's private variables and private functions through the module name, while from module import * skips names with a leading underscore. (You can still pull them in explicitly with from module import _name, but the convention discourages it.)
settings.py
public_variable = 42
_private_variable = 141

def public_function():
    print("I'm a public function! yay!")

def _private_function():
    print("Ain't nobody accessing me from another module...usually")
plain_importer.py
import settings
print (settings._private_variable)
print (settings.public_variable)
settings.public_function()
settings._private_function()
# Prints:
# 141
# 42
# I'm a public function! yay!
# Ain't nobody accessing me from another module...usually
from_importer.py
from settings import *
#print (_private_variable) #doesn't work
print (public_variable)
public_function()
#_private_function() #doesn't work
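Worth noting for completeness: it is only the star form that skips underscore names; an explicit from-import can still fetch them if you really want to (a sketch against the settings.py above):
from settings import _private_variable   # works, though it goes against the convention

print(_private_variable)                  # 141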

As Jan Wrobel mentions, one aspect of the different imports is in which way the imports are disclosed.
Module mymath
from math import gcd
...
Use of mymath:
import mymath
mymath.gcd(30, 42) # will work though maybe not expected
If I imported gcd only for internal use, not to disclose it to users of mymath, this can be inconvenient. I have this pretty often, and in most cases I want to "keep my modules clean".
Apart from the proposal of Jan Wrobel to obscure this a bit more by using import math instead, I have started to hide imports from disclosure by using a leading underscore:
# for instance...
from math import gcd as _gcd
# or...
import math as _math
In larger projects, this "best practice" allows me to control exactly what is disclosed to subsequent imports and what isn't. It keeps my modules clean and pays off once a project reaches a certain size.
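A related knob is __all__, which controls what from mymath import * hands out; a small sketch (the lcm helper is made up for the example). The underscore alias shown above additionally keeps the name out of casual dir()/tab-completion use:
# mymath.py
from math import gcd as _gcd   # internal helper, not part of mymath's API

__all__ = ["lcm"]               # 'from mymath import *' exposes only lcm

def lcm(a, b):
    return a * b // _gcd(a, b)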

Many people have already answered here, but I'll try my best :)
import module is best when you don't yet know which items you will need from the module. A downside is that debugging can be harder when a problem arises, because you don't immediately know which imported item is at fault.
from module import foo is best when you know exactly which item you need, and it gives you more control by importing only the specific items you need. Debugging may also be easier this way, because you know exactly which items you imported.

import module - You don't need any additional effort to fetch another item from the module. Its disadvantage is redundant typing.
from module import item - Less typing and more control over which items of a module can be accessed. To use a new item from the module you have to update your import statement.

There are some built-in modules that contain mostly bare functions (base64, math, os, shutil, sys, time, ...), and it is definitely good practice to have these bare functions bound to some namespace and thus improve the readability of your code. Consider how much more difficult it is to understand the meaning of these functions without their namespace:
copysign(foo, bar)
monotonic()
copystat(foo, bar)
than when they are bound to some module:
math.copysign(foo, bar)
time.monotonic()
shutil.copystat(foo, bar)
Sometimes you even need the namespace to avoid conflicts between different modules (json.load vs. pickle.load)
On the other hand there are some modules that contain mostly classes (configparser, datetime, tempfile, zipfile, ...) and many of them make their class names self-explanatory enough:
configparser.RawConfigParser()
datetime.datetime()
email.message.EmailMessage()
tempfile.NamedTemporaryFile()
zipfile.ZipFile()
so it is debatable whether using these classes with the additional module namespace in your code adds new information or just lengthens the code.
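A tiny sketch of the kind of collision the module namespace prevents (using loads/dumps to keep it self-contained):
import json
import pickle

payload = {"answer": 42}
print(json.loads(json.dumps(payload)))      # the json flavour
print(pickle.loads(pickle.dumps(payload)))  # same verb, different module, no ambiguity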

I was answering a similar question, but the poster deleted it before I could post. Here is one example to illustrate the differences.
Python libraries may have one or more files (modules). For example,
package1
|-- __init__.py
or
package2
|-- __init__.py
|-- module1.py
|-- module2.py
We can define Python functions or classes inside any of these files, based on design requirements.
Let's define
func1() in __init__.py under package1, and
foo() in module2.py under package2.
We can access func1() using one of these methods
import package1
package1.func1()
or
import package1 as my
my.func1()
or
from package1 import func1
func1()
or
from package1 import *
func1()
We can use one of these methods to access foo():
import package2.module2
package2.module2.foo()
or
import package2.module2 as mod2
mod2.foo()
or
from package2 import module2
module2.foo()
or
from package2 import module2 as mod2
mod2.foo()
or
from package2.module2 import *
foo()

In simple words, this is all about programmer convenience. At the core, both forms load the whole module; they differ only in which names get bound in your namespace.
import module: to use the methods of this module you have to write module.method(). Every time you use any method or property, you refer to the module.
from module import *: to use the methods of this module you just write method(), without referring to the module.

There is a crucial aspect of these imports that an earlier answer already touched on: the internals of the module-loading process. It pops up if your system ends up needing circular imports (e.g. you want to use dependency injection in some popular HTTP frameworks). In such cases, from {module} import {function} is much more demanding about how far the loading process has progressed. Take this example:
# m1.py
print('--start-m1--')
from m2 import *  # the form does not matter; we just need to force the import of m2
print('--mid-m1--')

def do1(x):
    print(x)

print('--end-m1--')
and the module it imports:
# m2.py
print('--start-m2--')
# from m1 import *    # A
# from m1 import do1  # B
# import m1           # C
# D -- no import of "do1" at all
print('--mid-m2--')

def do2(x):
    m1.do1(x)

print('--end-m2--')
run via
#main.py:
from m1 import do1
do1('ok')
Of all the import possibilities in m2.py (A,B,C,D), the from {module} import {function} is the only one that actually crashes the load process, leading to the infamous (CPython 3.10.6)
ImportError: cannot import name 'do1' from partially initialized module 'm1'
(most likely due to a circular import)
The reason is that from ... import ... puts a more stringent requirement on how far the module in question has already got in its initialization: from m1 import do1 needs the name do1 to already be bound on the (still partially initialized) m1 at import time, whereas a plain import m1 only needs the module object to exist in sys.modules, and the m1.do1 attribute lookup is deferred until do2() is actually called.
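A common workaround, if restructuring is not an option, is to defer the import until call time so that m1 is fully initialized by the time the name is needed (a sketch of an alternative m2.py):
# m2.py (alternative)
print('--start-m2--')

def do2(x):
    from m1 import do1   # resolved lazily, after both modules have finished loading
    do1(x)

print('--end-m2--')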

Related


Python module import - why are components only available when explicitly imported?

I have recently installed scikit-image version 0.11.3. I am using python 2.7.10. When I import the entire module I cannot access the io module.
import skimage
img = skimage.io.imread(path_)
Gives error:
AttributeError: 'module' object has no attribute 'io'
However the following does not error.
from skimage import io
img = io.imread(path_)
Question: Why?
Quick answer: io is a submodule. Submodules are not loaded automatically when you import the parent package; they need to be imported explicitly.
Long answer: From section 5.4.2 of the python docs:
When a submodule is loaded using any mechanism (e.g. importlib APIs, the import or import-from statements, or built-in import()) a binding is placed in the parent module’s namespace to the submodule object. For example, if package spam has a submodule foo, after importing spam.foo, spam will have an attribute foo which is bound to the submodule. Let’s say you have the following directory structure:
spam/
__init__.py
foo.py
bar.py
and spam/__init__.py has the following lines in it:
from .foo import Foo
from .bar import Bar
then executing the following puts a name binding to foo and bar in the spam module:
>>> import spam
>>> spam.foo
<module 'spam.foo' from '/tmp/imports/spam/foo.py'>
>>> spam.bar
<module 'spam.bar' from '/tmp/imports/spam/bar.py'>
Given Python’s familiar name binding rules this might seem surprising, but it’s actually a fundamental feature of the import system. The invariant holding is that if you have sys.modules['spam'] and sys.modules['spam.foo'] (as you would after the above import), the latter must appear as the foo attribute of the former.
It's simply the way Python handles modules.
One reason is that it would make importing one module very slow if CPython needed to scan for submodules, import all of them, and then import all of their submodules.
The other reason is "explicit is better than implicit": why should Python import everything possible when you only need a small fraction of a package with a complex module hierarchy?
Instead of from skimage import io you can also write
import skimage.io
then skimage.io.imread will be found.

import everything from a module except a few methods

Is it possible to import everything (*) from an existing Python module except a number of explicitly specified methods?
(Background: Against recommended Python practice it is common in FEniCS to do from dolfin import *. A few of the methods names contain the string "Test" though (e.g., TestFunction()) and are mistaken for unit tests by nose.)
In case you don't have access to the module, you can also simply remove these names from the global namespace after the import. Here's how this could be done:
to_exclude = ['foo']
from somemodule import *
for name in to_exclude:
    del globals()[name]
Yes, you can define the __all__ list.
Add
__all__ = ["echo", "surround", "reverse"]  # or whatever names your module defines
to the module file, or to the __init__.py of the package you want to import from.
Now
from module import *
imports only the names specified in __all__.
@alexander-zhukov's solution will work most of the time, but not when the imported module happens to contain a name called globals.
For example,
to_exclude = ['abort']
from flask import *
for name in to_exclude:
    del globals()[name]
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
TypeError: 'module' object is not callable
The error occurs because the flask package contains a submodule called globals (which cannot be called), and the star import rebinds the name globals in your namespace, shadowing the built-in globals().
The following solution will work for flask and others:
to_exclude = ['abort']
from flask import *
for name in to_exclude:
    __builtins__.globals().pop(name)
Somewhat ridiculously, it does not work if you open a Python console and type the commands in manually (I think this is a defect of Python 3). If you want this to work in a Python console, you have to import the builtins module explicitly:
import builtins
to_exclude = ['abort']
from flask import *
for name in to_exclude:
    builtins.globals().pop(name)
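A third option is a small wrapper module that copies over everything except the unwanted names once, so the rest of your code can star-import the wrapper instead; a sketch assuming the dolfin package from the question (mydolfin is a made-up name):
# mydolfin.py
import dolfin

globals().update(
    {name: obj for name, obj in vars(dolfin).items()
     if not (name.startswith("_") or name.startswith("Test"))}
)

# elsewhere: from mydolfin import *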

difference between from x import y and import x.y

So I am confused as to what the difference is... Here is some code to display my confusion:
>>> import collections.OrderedDict as od
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named OrderedDict
>>> from collections import OrderedDict as od
>>> od
<class 'collections.OrderedDict'>
explanation:
import collections.OrderedDict did not find the module, yet from collections import OrderedDict found it?! What is the difference between those two statements?
the class is displayed as collections.OrderedDict, so I don't understand why the first attempt was unable to find the module
note:
I am simply using collections as an example. I am not looking for specifically why my example acted the way it did for collections, but rather an explanation for what the different lines of code are actually requesting as far as imports go. If you would like to include an explanation on the error, feel free! Thanks!
OrderedDict is a class within the collections module. When you write import x.y, the name y must itself be a module; it cannot be a class or any other object.
You should read the docs about how importing works. They are long and involved, but at the same time fairly straightforward about how Python looks into the different packages and modules to find what should be brought in; see in particular the sections on the import statement and the import system.
PEP 221 talks about import as.
import foo.bar
is for importing a submodule bar of the module foo. This can be 'imported as'
import foo.bar as fb
An object is imported with
from foo import baz
This, too, can be 'imported as'
from foo import baz as fb
collections.OrderedDict is not a submodule but an object so it can only be 'imported as' in the second way.

Problem with python and __import__

Sorry for the generic title, will change it once I understand the source of my problem
I have the following structure:
foo/
foo/__init__.py
foo/bar/
foo/bar/__init__.py
foo/bar/some_module.py
When I try to import some_module by doing so:
from foo.bar import some_module
it works like a charm.
But this is no good for me, since I only know the name of the module to import at runtime. So if I try:
from foo.bar import *
mod=__import__('some_module')
I get an error. Am I doing something wrong? Is there a better way to do this? And why is this happening?
I am not quite sure I completely understand the concept behind Python packages; I thought they were equivalent to Java's packages.
I believe the proper way to do this is:
mod = __import__('foo.bar', fromlist = ['some_module'])
This way even the 'foo.bar' part can be changed at runtime.
As a result, some_module will be available as mod.some_module; use getattr if you want it in a separate variable:
the_module = getattr(mod, 'some_module')
from foo.bar import *
is a bad practice since it imports some_module into the global scope.
You should be able to access your module through:
import foo.bar
mod = getattr(foo.bar, 'some_module')
It can be easily demonstrated that this approach works:
>>> import os.path
>>> getattr(os.path, 'basename')
<function basename at 0x00BBA468>
>>> getattr(os.path, 'basename\n')
Traceback (most recent call last):
File "<pyshell#31>", line 1, in <module>
getattr(os.path, 'basename\n')
AttributeError: 'module' object has no attribute 'basename
'
P.S. If you're still interested in using your kind of import statement, you need an eval:
from foo.bar import *
eval('some_module')
To clarify: not only is it bad practice to use a *-import, it's even worse in combination with eval. So just use getattr; it's designed exactly for situations like yours.
From the docs:
Direct use of __import__() is rare, except in cases where you want to import a module whose name is only known at runtime.
However, the dotted notation should work too. Note that __import__ with a dotted name returns the top-level package (foo here), so you would then access the submodule as mod.bar.some_module:
mod = __import__('foo.bar.some_module')
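For completeness, the modern spelling of the same thing is importlib.import_module, which resolves a dotted name and returns the leaf module directly (foo.bar.some_module being the hypothetical package from the question):
import importlib

some_module = importlib.import_module('foo.bar.some_module')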
