import from vs import in Python 2.7 - python

Is from foo import * equivalent to import foo? Please help.
The question is about Python 2.7.

No.
When you use import foo, you have to call a function from the package as foo.my_function(), whereas with from foo import * you can write my_function() directly.

I think the difference is mostly one of style. But I would not use from foo import *; it's better to be specific about what you need to import:
from foo import package_0
from foo import package_1
from foo import package_2
# etc
However, you need to add an additional import every time you need a new package, which can be avoided if you use import foo. But in that case you need to prefix every package inside it with foo:
foo.package_0.some_method()
foo.package_1.another_method()
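For a quick runnable illustration of the basic difference (using the standard math module rather than the hypothetical foo):
import math                # binds the module object; members need the math. prefix
print(math.sqrt(16))       # 4.0

from math import sqrt      # binds only the name sqrt in the current namespace
print(sqrt(16))            # 4.0, no prefix needed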

Related

Abbreviation for long python import path

Given imports like this:
from a.very.long.list.of.packages.aaa.bbb.ccc import abc
from a.very.long.list.of.packages.ddd.eee import de
from a.very.long.list.of.packages.fff import f
from a.very.long.list.of.packages import somepackage
is there any way to define aliases for the common part of the module path and reuse it?
I'm imagining something like this:
x = a.very.long.list.of.packages
from x.aaa.bbb.ccc import abc
from x.ddd.eee import de
from x.fff import f
from x import somepackage
Given x = a.very.long.list.of.packages, Python will try to resolve all the attributes and fail immediately because no name a has been defined. If it already exists, it's unlikely that it has the attribute very and the object this attribute points to has the attribute long and so on. Anyway, everything to the right of the assignment operator will be evaluated to some object, and it's not possible to import stuff from objects with from ... import ....
You can use dynamic importing with the built-in importlib module. It lets you treat strings as paths to modules.
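For instance, a minimal sketch using importlib.import_module (the long package path is the hypothetical one from the question, so this only runs if such a package actually exists):
import importlib

base = 'a.very.long.list.of.packages'  # hypothetical common prefix from the question

somepackage = importlib.import_module(base + '.somepackage')  # import a subpackage
ccc = importlib.import_module(base + '.aaa.bbb.ccc')          # import a nested module
abc = getattr(ccc, 'abc')                                     # then pull out an attribute from it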

Can we enforce named imports in Python (and 'disable' * imports)

Python allows for importing symbols from other modules using the import <symbol> statement. Similarly, I can also say from <module> import * and it will import all the symbols from module.py.
Now, say if I want to not allow anyone to import * from my module, can I override this functionality and disable it? I.e. can I enforce that users of my module can only import symbols by their names and not the * wildcard; say, if you import using * wildcard, the Python interpreter would throw an error.
I know that most Python linters would catch and flag such imports by default (or if configured); but I was wondering if there's a way I can enforce such a thing in my module code itself.
You can define which objects, functions, and classes get imported by import * from your module using __all__.
At the beginning of your module, add:
__all__ = []
Everything you put in that list can be imported with from yourmodule import *.
Everything else that is not named can still be accessed with yourmodule.objectname.
Example:
Let's assume you have 2 functions in your module.
E.g.:
def foo():
    print("Foo")

def bar():
    print("Bar")
Importing * from your module will import foo and bar.
If you add __all__ = ['foo'] then import * will only import foo.
And if you add __all__ = [] then import * should not import anything.
Edit:
If you leave the list empty, nothing will be imported. However, if you want an error to be raised so that users understand you don't allow import *, add one entry that will fail, e.g. __all__ = ['NO_WILDCARD_IMPORT_ALLOWED'].
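Putting the whole example together (mymodule is a hypothetical file name):
# mymodule.py
__all__ = ['foo']  # only foo is exported by "from mymodule import *"

def foo():
    print("Foo")

def bar():
    print("Bar")

# client code:
#   from mymodule import *          -> only foo is bound
#   from mymodule import bar        -> still works; __all__ only affects the wildcard
#   import mymodule; mymodule.bar() -> also works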

Best practice for nested Python module imports

Suppose I have a Python module "main.py":
import math # from the standard Python library
import my_own_module
...
foo = math.cos(bar)
And I also need to import the standard math module in "my_own_module.py":
import math
...
baz = math.sin(qux)
In this case I think import math in "main.py" is redundant and can be omitted.
What's best practice in this case:
Omit import math from "main.py" because it's redundant? Or,
Keep import math in "main.py" to clarify that the code in that module requires it?
The reference to math.cos in main.py means that import math is required in main.py, regardless of whether my_own_module.py imports it or not. It is not redundant, and it cannot be omitted (and if you try to omit it, you'll get an error).
import math
does something other than simply include the full text of one file in the other.
It introduces a new namespace with the name math, and the name math becomes known in your current namespace.
If you omit the
import math
from your main.py file, your command
foo = math.cos(bar)
becomes illegal, as the name math will not be recognized in the main.py namespace.
This is not like, e.g., #include in C++. The import is not optional: importing a module is required to be able to refer to its contents, and this is true for every single file that does so.
A good question. The short answer is yes: if you use a math function in a .py file, then you need to import the module at the top, regardless of how many times it's imported elsewhere.
It gets interesting when we throw a third file into the mix; let's call it "explanation.py".
And let's suppose that your "main.py" becomes "my_functions.py" and contains a function called foo:
# my_functions.py
import math
import my_own_module

def foo(bar):
    return math.cos(bar)
and in my_own_module.py:
# my_own_module.py
import math

def bar(foo):
    return math.sin(foo)
and finally explanation.py (the new main):
# explanation.py
import my_functions
import my_own_module
bar = my_functions.foo(10)
foo = my_own_module.bar(10)
print(foo)
print(bar)
Notice how you DO NOT need to import math in explanation.py if you only call the functions imported from the other files. I hope that adds further clarity to your enquiry :)
However, it is worth noting that this keeps math out of explanation.py's namespace, so any direct calls to math functions there would fail.

Python package naming and importing

I have a package called foo. It's organized as follows:
package_dir/foo/foo.py
package_dir/foo/utils.py
package_dir/foo/other.py
package_dir/foo/__init__.py
I probably should have named foo.py something else, but this library has grown and evolved over time, now supports other stuff, and is used all over the place. The package is bundled and stored on our internal PyPI server, so when installed I end up with /usr/lib/python2.7/site-packages/foo.
What is the best way to import from this package?
Currently I do this:
import foo
I then end up doing this in client code:
foo.foo.myfunction()
There are way too many classes and functions in use for this approach:
from foo import blah, blah, blah
This gets kind of clunky. Is this a Pythonic approach to packages? Is there a better way to do this?
Thanks for any help.
* update *
So I've done this, but it's not finding my function(s).
In __init__.py:
import foo
import utils
import other
I then import like this:
import foo
Then in my code I have tried:
foo.myfunc()
And also:
myfunc()
Both complain:
AttributeError: 'module' object has no attribute 'myfunc'
You have some options:
from foo import foo
foo.myfunction()
Or
from foo import foo as something
something.myfunction()
Or
import foo.foo as something
something.myfunction()
Or you can "promote" some APIs so they're published from the top-level. Import them in foo/__init__.py and then you can use them from the package object. For example, your new foo/__init__.py could be:
from .foo import myfunction
Notice the explicit relative import syntax to avoid ambiguity between the top-level foo package and the nested foo package.
Then you can write:
import foo
foo.myfunction()
and so on.

Use 'import module' or 'from module import'?

I've tried to find a comprehensive guide on whether it is best to use import module or from module import. I've just started with Python and I'm trying to start off with best practices in mind.
Basically, I was hoping if anyone could share their experiences, what preferences other developers have and what's the best way to avoid any gotchas down the road?
The difference between import module and from module import foo is mainly subjective. Pick the one you like best and be consistent in your use of it. Here are some points to help you decide.
import module
Pros:
Less maintenance of your import statements. Don't need to add any additional imports to start using another item from the module
Cons:
Typing module.foo in your code can be tedious and redundant (tedium can be minimized by using import module as mo then typing mo.foo)
from module import foo
Pros:
Less typing to use foo
More control over which items of a module can be accessed
Cons:
To use a new item from the module you have to update your import statement
You lose context about foo. For example, it's less clear what ceil() does compared to math.ceil()
Either method is acceptable, but don't use from module import *.
For any reasonably large body of code, if you import * you will likely cement it into the module, unable to be removed. This is because it is difficult to determine which items used in the code come from 'module', making it easy to reach the point where you think you don't use the import any more but it's extremely difficult to be sure.
There's another detail here, not mentioned, related to writing to a module. Granted this may not be very common, but I've needed it from time to time.
Due to the way references and name binding works in Python, if you want to update some symbol in a module, say foo.bar, from outside that module, and have other importing code "see" that change, you have to import foo a certain way. For example:
module foo:
    bar = "apples"
module a:
    import foo
    foo.bar = "oranges"  # update bar inside the foo module object
module b:
    import foo
    print foo.bar  # if executed after a's "foo.bar" assignment, will print "oranges"
However, if you import symbol names instead of module names, this will not work.
For example, if I do this in module a:
from foo import bar
bar = "oranges"
No code outside of a will see bar as "oranges", because my assignment to bar merely rebound the name "bar" inside module a; it did not "reach into" the foo module object and update its bar.
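Here is a self-contained sketch of the same behaviour; the foo module is built in memory with types.ModuleType purely so the example can run on its own:
import sys
import types

foo = types.ModuleType('foo')  # stand-in for the foo.py above
foo.bar = "apples"
sys.modules['foo'] = foo       # register it so "import foo" finds it

# what module a does:
import foo
foo.bar = "oranges"            # mutates the attribute on the shared module object

# what module b sees:
import foo as foo_again
print(foo_again.bar)           # "oranges" -- same module object

# but a name copied out with "from foo import bar" is independent:
from foo import bar
bar = "pears"                  # rebinds only the local name
print(foo.bar)                 # still "oranges"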
Even though many people have already explained import vs. from ... import, I want to go a bit deeper into what happens under the hood and where the changes actually land.
import foo:
Imports foo and creates a reference to that module in the current namespace. You then need to use the full module path to access a particular attribute or method from inside the module.
E.g. foo.bar but not bar
from foo import bar:
Imports foo, and creates references to all the members listed (bar). Does not set the variable foo.
E.g. bar but not baz or foo.baz
from foo import *:
Imports foo, and creates references to all public objects defined by that module in the current namespace (everything listed in __all__ if __all__ exists, otherwise everything that doesn't start with _). Does not set the variable foo.
E.g. bar and baz but not _qux or foo._qux.
Now let's see what happens when we do import X.Y:
>>> import sys
>>> import os.path
Check sys.modules with name os and os.path:
>>> sys.modules['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> sys.modules['os.path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
Check globals() and locals() namespace dicts with os and os.path:
>>> globals()['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> locals()['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> globals()['os.path']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'os.path'
>>>
From the above example we see that only os is inserted into the local and global namespaces.
So, we should be able to use:
>>> os
<module 'os' from
'/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> os.path
<module 'posixpath' from
'/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>>
But not path.
>>> path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'path' is not defined
>>>
Once you delete os from the locals() namespace, you won't be able to access os or os.path, even though they still exist in sys.modules:
>>> del locals()['os']
>>> os
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
>>> os.path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
>>>
Now let's talk about from ... import:
>>> import sys
>>> from os import path
Check sys.modules with os and os.path:
>>> sys.modules['os']
<module 'os' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/os.pyc'>
>>> sys.modules['os.path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
We find the same entries in sys.modules as we did before with import os.path.
OK, let's check how it looks in the locals() and globals() namespace dicts:
>>> globals()['path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> locals()['path']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> globals()['os']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'os'
>>>
You can access it using the name path, but not os.path:
>>> path
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> os.path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'os' is not defined
>>>
Let's delete 'path' from locals():
>>> del locals()['path']
>>> path
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
NameError: name 'path' is not defined
>>>
One final example using an alias:
>>> from os import path as HELL_BOY
>>> locals()['HELL_BOY']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>> globals()['HELL_BOY']
<module 'posixpath' from '/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/posixpath.pyc'>
>>>
And no path defined:
>>> globals()['path']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'path'
>>>
Both ways are supported for a reason: there are times when one is more appropriate than the other.
import module: nice when you are using many bits from the module. drawback is that you'll need to qualify each reference with the module name.
from module import ...: nice that imported items are usable directly without module name prefix. The drawback is that you must list each thing you use, and that it's not clear in code where something came from.
Which to use depends on which makes the code clear and readable, and has more than a little to do with personal preference. I lean toward import module generally because in the code it's very clear where an object or function came from. I use from module import ... when I'm using some object/function a lot in the code.
I personally always use
from package.subpackage.subsubpackage import module
and then access everything as
module.function
module.modulevar
etc. The reason is that you get a short invocation while still clearly defining the module namespace of each routine, which is very useful if you have to search your source for usages of a given module.
Needless to say, do not use import *, because it pollutes your namespace and does not tell you which module a given function comes from.
Of course, you can run into trouble if the same module name exists in two different packages, like
from package1.subpackage import module
from package2.subpackage import module
In this case you do run into trouble, but then that's a strong hint that your package layout is flawed and you have to rethink it.
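If you do have to live with such a clash for a while, aliasing is the usual workaround (a sketch only; package1, package2, and function are the hypothetical names from above):
from package1.subpackage import module as module1  # hypothetical packages
from package2.subpackage import module as module2

module1.function()  # the call sites stay unambiguous
module2.function()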
import module
Is best when you will use many functions from the module.
from module import function
Is best when you want to avoid polluting the global namespace with all the functions and types from a module when you only need function.
I've just discovered one more subtle difference between these two methods.
If module foo uses the following import:
from itertools import count
Then module bar can mistakenly use count as though it were defined in foo, not in itertools:
import foo
foo.count()
If foo uses:
import itertools
the mistake is still possible, but less likely to be made. bar needs to:
import foo
foo.itertools.count()
This caused me some trouble. I had a module that mistakenly imported an exception from a module that did not define it, but only imported it from another module (using from module import SomeException). When that import was no longer needed and was removed, the offending module broke.
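A two-file sketch of that trap (foo.py and bar.py here are hypothetical files of your own, not installed packages):
# foo.py
from itertools import count  # count becomes an attribute of foo as a side effect

# bar.py
import foo
counter = foo.count()        # works only because foo happens to import count
print(next(counter))         # 0 -- drop the import in foo.py and this line breaks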
Here is another difference not mentioned. This is copied verbatim from http://docs.python.org/2/tutorial/modules.html
Note that when using
from package import item
the item can be either a submodule (or subpackage) of the package, or some other name defined in the package, like a function, class or variable. The import statement first tests whether the item is defined in the package; if not, it assumes it is a module and attempts to load it. If it fails to find it, an ImportError exception is raised.
Contrarily, when using syntax like
import item.subitem.subsubitem
each item except for the last must be a package; the last item can be a module or a package but can’t be a class or function or variable defined in the previous item.
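A concrete consequence of that rule, using the standard library:
from os.path import join  # fine: join is a function defined in the os.path module
print(join('a', 'b'))      # a/b (a\b on Windows)

import os.path.join        # ImportError: join is a function, not a module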
Since I am also a beginner, I will try to explain this in a simple way:
In Python, we have three types of import statements which are:
1. Generic imports:
import math
This type of import is my personal favorite. The only downside to this technique is that to use any of the module's functions you must use the following syntax:
math.sqrt(4)
Of course, this increases the typing effort, but as a beginner it helps you keep track of which function belongs to which module (a good text editor will reduce the typing effort significantly and is recommended).
Typing effort can be further reduced by using this import statement:
import math as m
now, instead of using math.sqrt() you can use m.sqrt().
2. Function imports:
from math import sqrt
This type of import is best suited when your code only needs one or a few functions from the module, but to use any new item from the module you have to update the import statement.
3. Universal imports:
from math import *
Although it reduces typing effort significantly, it is not recommended, because it fills your namespace with the module's functions and their names could conflict with the names of user-defined functions.
example:
If you have a function of your very own named sqrt and you import math, your function is safe: there is your sqrt and there is math.sqrt. If you do from math import *, however, you have a problem: namely, two different functions with the exact same name. Source: Codecademy
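The shadowing problem spelled out in code:
def sqrt(x):
    return "my own sqrt of %s" % x  # user-defined function

import math
print(math.sqrt(4))                 # 2.0 -- the plain import leaves our sqrt alone
print(sqrt(4))                      # 'my own sqrt of 4'

from math import *                  # rebinds sqrt to math.sqrt, silently discarding ours
print(sqrt(4))                      # 2.0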
I would like to add to this. It can be useful to understand how Python handles imported modules as attributes if you run into circular imports.
I have the following structure:
mod/
__init__.py
main.py
a.py
b.py
c.py
d.py
From main.py I will import the other modules using different import methods.
main.py:
import mod.a
import mod.b as b
from mod import c
import d
dis.dis shows the difference (note module names, a b c d):
1 0 LOAD_CONST 0 (-1)
3 LOAD_CONST 1 (None)
6 IMPORT_NAME 0 (mod.a)
9 STORE_NAME 1 (mod)
2 12 LOAD_CONST 0 (-1)
15 LOAD_CONST 1 (None)
18 IMPORT_NAME 2 (b)
21 STORE_NAME 2 (b)
3 24 LOAD_CONST 0 (-1)
27 LOAD_CONST 2 (('c',))
30 IMPORT_NAME 1 (mod)
33 IMPORT_FROM 3 (c)
36 STORE_NAME 3 (c)
39 POP_TOP
4 40 LOAD_CONST 0 (-1)
43 LOAD_CONST 1 (None)
46 IMPORT_NAME 4 (mod.d)
49 LOAD_ATTR 5 (d)
52 STORE_NAME 5 (d)
55 LOAD_CONST 1 (None)
In the end they look the same (STORE_NAME is the result in each case), but this is worth noting if you need to consider the following four circular-import examples:
example1
foo/
__init__.py
a.py
b.py
a.py:
import foo.b
b.py:
import foo.a
>>> import foo.a
>>>
This works
example2
bar/
__init__.py
a.py
b.py
a.py:
import bar.b as b
b.py:
import bar.a as a
>>> import bar.a
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "bar\a.py", line 1, in <module>
import bar.b as b
File "bar\b.py", line 1, in <module>
import bar.a as a
AttributeError: 'module' object has no attribute 'a'
No dice
example3
baz/
__init__.py
a.py
b.py
a.py:
from baz import b
b.py:
from baz import a
>>> import baz.a
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "baz\a.py", line 1, in <module>
from baz import b
File "baz\b.py", line 1, in <module>
from baz import a
ImportError: cannot import name a
Similar issue... but clearly from x import y is not the same as import x.y as y.
example4
qux/
__init__.py
a.py
b.py
a.py:
import b
b.py:
import a
>>> import qux.a
>>>
This one also works
import package
import module
With import, the token must be a module (a file containing Python commands) or a package (a folder on sys.path containing an __init__.py file).
When there are subpackages:
import package1.package2.package
import package1.package2.module
the requirements for folder (package) or file (module) are the same, but the folder or file must be inside package2 which must be inside package1, and both package1 and package2 must contain __init__.py files. https://docs.python.org/2/tutorial/modules.html
With the from style of import:
from package1.package2 import package
from package1.package2 import module
the package or module enters the namespace of the file containing the import statement as module (or package) instead of package1.package2.module. You can always bind to a more convenient name:
a = big_package_name.subpackage.even_longer_subpackage_name.function
Only the from style of import permits you to name a particular function or variable:
from package3.module import some_function
is allowed, but
import package3.module.some_function
is not allowed.
To add to what people have said about from x import *: besides making it more difficult to tell where names came from, this throws off code checkers like Pylint. They will report those names as undefined variables.
My own answer to this depends mostly on how many different modules I'll be using. If I'm only going to use one or two, I'll often use from ... import, since it makes for fewer keystrokes in the rest of the file; but if I'm going to make use of many different modules, I prefer plain import, because then each module reference is self-documenting. I can see where each symbol comes from without having to hunt around.
Usually I prefer the self-documenting style of plain import and only change to from ... import when the number of times I have to type the module name grows above 10 to 20, even if only one module is being imported.
This is my directory structure of my current directory:
.
└─a
└─b
└─c
The import statement remembers all intermediate names.
These names have to be qualified:
In[1]: import a.b.c
In[2]: a
Out[2]: <module 'a' (namespace)>
In[3]: a.b
Out[3]: <module 'a.b' (namespace)>
In[4]: a.b.c
Out[4]: <module 'a.b.c' (namespace)>
The from ... import ... statement remembers only the imported name.
This name must not be qualified:
In[1]: from a.b import c
In[2]: a
NameError: name 'a' is not defined
In[2]: a.b
NameError: name 'a' is not defined
In[3]: a.b.c
NameError: name 'a' is not defined
In[4]: c
Out[4]: <module 'a.b.c' (namespace)>
Note: Of course, I restarted my Python console between steps 1 and 2.
There have been many answers, but none have mentioned testing (with unittest or pytest).
tl;dr
Use import foo for external modules to simplify testing.
The Hard Way
Importing classes/functions (from foo import bar) individually from a module makes red-green-refactor cycles tedious. For example, if my file looks like
# my_module.py
from foo import bar

class Thing:
    def do_thing(self):
        bar('do a thing')
and my test is
# test_my_module.py
from unittest.mock import patch
import my_module

@patch.object(my_module, 'bar')
def test_do_thing(mock_bar):
    my_module.Thing().do_thing()
    mock_bar.assert_called_with('do a thing')
At first glance, this seems great. But what happens if I want to implement the Thing class in a different file? My structure would have to change like this...
# my_module.py
from tools import Thing

def do_thing():
    Thing().do_thing()

# tools.py
from foo import bar

class Thing:
    def do_thing(self):
        bar('do a thing')
# test_my_module.py
from unittest.mock import patch
import my_module
import tools  # Had to import implementation file...

@patch.object(tools, 'bar')  # Changed patch
def test_do_thing(mock_bar):
    my_module.do_thing()  # Changed test (expected)
    mock_bar.assert_called_with('do a thing')
Unfortunately, since I used from foo import bar, I need to update my patch to reference the tools module. Essentially, because my test knows too much about the implementation, much more than expected needs to change to do this refactor.
The Better Approach
Using import foo, my tests can ignore how the module is implemented and simply patch the whole module.
# my_module.py
from tools import Thing

def do_thing():
    Thing().do_thing()

# tools.py
import foo

class Thing:
    def do_thing(self):
        foo.bar('do a thing')  # Specify 'bar' is from 'foo' module
# test_my_module.py
from unittest.mock import patch
import my_module

@patch('tools.foo')  # Patch the entire foo module as seen by tools
def test_do_thing(mock_foo):
    my_module.do_thing()
    mock_foo.bar.assert_called_with('do a thing')
The less implementation details your tests know, the better. That way, if you come up with a better solution (use classes instead of functions, use additional files to separate ideas, etc.), less needs to be changed in your tests to accommodate the refactor.
One significant difference I found, which surprisingly no one has talked about, is that with a plain import you can access a module's private variables and private functions, which a wildcard from module import * will not give you.
settings.py
public_variable = 42
_private_variable = 141

def public_function():
    print("I'm a public function! yay!")

def _private_function():
    print("Ain't nobody accessing me from another module...usually")
plain_importer.py
import settings
print (settings._private_variable)
print (settings.public_variable)
settings.public_function()
settings._private_function()
# Prints:
# 141
# 42
# I'm a public function! yay!
# Ain't nobody accessing me from another module...usually
from_importer.py
from settings import *
#print (_private_variable) #doesn't work
print (public_variable)
public_function()
#_private_function() #doesn't work
As Jan Wrobel mentions, one aspect of the different imports is the way the imported names are disclosed to users of your module.
Module mymath
from math import gcd
...
Use of mymath:
import mymath
mymath.gcd(30, 42) # will work though maybe not expected
If I imported gcd only for internal use, not to disclose it to users of mymath, this can be inconvenient. I have this pretty often, and in most cases I want to "keep my modules clean".
Apart from the proposal of Jan Wrobel to obscure this a bit more by using import math instead, I have started to hide imports from disclosure by using a leading underscore:
# for instance...
from math import gcd as _gcd
# or...
import math as _math
In larger projects, this "best practice" allows me to control exactly what is disclosed to subsequent imports and what isn't. This keeps my modules clean and pays off once a project reaches a certain size.
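For example, a mymath module along those lines (a sketch; math.gcd needs Python 3.5+, on 2.7 you would use fractions.gcd instead):
# mymath.py
from math import gcd as _gcd  # usable internally, but not re-exported

def reduced(a, b):
    """Return the fraction a/b in lowest terms."""
    d = _gcd(a, b)
    return a // d, b // d

# client code:
#   import mymath
#   mymath.reduced(30, 42)  # (5, 7)
#   mymath.gcd              # AttributeError -- the helper stays private
#   from mymath import *    # also skips _gcd thanks to the leading underscore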
Many people have already answered here, but I am just trying my best :)
import module is best when you don't know in advance which items you will need from the module. A downside is that it can be harder to debug when a problem arises, because you don't immediately know which item is at fault.
from module import <foo> is best when you know exactly which item you need, and it gives you more control by importing only the specific items you need. Debugging can also be easier this way, because you know exactly which items you imported.
import module - You don't need any additional effort to use another item from the module. The disadvantage is redundant typing.
from module import ... - Less typing and more control over which items of a module can be accessed. To use a new item from the module you have to update your import statement.
There are some builtin modules that contain mostly bare functions (base64, math, os, shutil, sys, time, ...) and it is definitely a good practice to have these bare functions bound to some namespace and thus improve the readability of your code. Consider how more difficult is to understand the meaning of these functions without their namespace:
copysign(foo, bar)
monotonic()
copystat(foo, bar)
than when they are bound to some module:
math.copysign(foo, bar)
time.monotonic()
shutil.copystat(foo, bar)
Sometimes you even need the namespace to avoid conflicts between different modules (json.load vs. pickle.load)
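For example (using the loads variants so the snippet runs without any files):
import json
import pickle

blob = '{"answer": 42}'
print(json.loads(blob))                    # parsed as a dict -- obviously JSON
print(pickle.loads(pickle.dumps([1, 2])))  # [1, 2] -- obviously pickle

# With "from json import loads" and "from pickle import loads",
# the second import silently shadows the first and a bare loads() is ambiguous.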
On the other hand there are some modules that contain mostly classes (configparser, datetime, tempfile, zipfile, ...) and many of them make their class names self-explanatory enough:
configparser.RawConfigParser()
datetime.datetime()
email.message.EmailMessage()
tempfile.NamedTemporaryFile()
zipfile.ZipFile()
so there can be a debate whether using these classes with the additional module namespace in your code adds some new information or just lengthens the code.
I was answering a similar question, but the poster deleted it before I could post. Here is one example to illustrate the differences.
Python libraries may have one or more files (modules). For example:
package1
|-- __init__.py
or
package2
|-- __init__.py
|-- module1.py
|-- module2.py
We can define Python functions or classes inside any of these files, based on design requirements.
Let's define
func1() in __init__.py under package1, and
foo() in module2.py under package2.
We can access func1() using one of these methods
import package1
package1.func1()
or
import package1 as my
my.func1()
or
from package1 import func1
func1()
or
from package1 import *
func1()
We can use one of these methods to access foo():
import package2.module2
package2.module2.foo()
or
import package2.module2 as mod2
mod2.foo()
or
from package2 import module2
module2.foo()
or
from package2 import module2 as mod2
mod2.foo()
or
from package2.module2 import *
foo()
In simple words, this is all about programmer convenience. At the core level, both forms simply import the module's functionality.
import module: When you use import module, then to use the methods of this module you have to write module.method(). Every time you use any method or property, you have to refer to the module.
from module import *: When you use from module import *, then to use the methods of this module you just write method(), without referring to the module.
There is a crucial aspect of these imports that #ahfx already mentioned, namely the internals of the module-loading process. It pops up if your system needs circular imports (e.g. you want to use dependency injection in some popular HTTP frameworks). In such cases the from {module} import {function} form is much more demanding about how the loading process proceeds. Let us take an example:
# m1.py
print('--start-m1--')
from m2 import *  # form does not matter; just need to force import of m2
print('--mid-m1--')

def do1(x):
    print(x)

print('--end-m1--')
importing
# m2.py
print('--start-m2--')
# from m1 import *    # A
# from m1 import do1  # B
# import m1           # C
# D -- no import of "do1" at all
print('--mid-m2--')

def do2(x):
    m1.do1(x)

print('--end-m2--')
run via
#main.py:
from m1 import do1
do1('ok')
Of all the import possibilities in m2.py (A,B,C,D), the from {module} import {function} is the only one that actually crashes the load process, leading to the infamous (CPython 3.10.6)
ImportError: cannot import name 'do1' from partially initialized module 'm1'
(most likely due to a circular import)
While I cannot say why this happens, it appears that the from ... import ... statement puts a more stringent requirement on "how far" the module in question is already in its initialization process.
