Import modules using an alias [duplicate] - python

This question already has answers here: Python: importing a sub‑package or sub‑module (3 answers)
When attempting to import from an alias (which is common in Scala), I was surprised to see the following results:
Create an alias
import numpy as np
Use the alias to import modules it contains
from np import linalg
ImportError: No module named np.linalg
Is there any other syntax or equivalent in Python that is useful for importing modules through an alias?

Using import module as name does not create an alias. You misunderstood the import system.
Importing does two things:
Load the module into memory and store the result in sys.modules. This is done once only; subsequent imports re-use the already loaded module object.
Bind one or more names in your current namespace.
The as name syntax lets you control the name in the last step.
For the from module import name syntax, you still need to name the full module, as module is looked up in sys.modules. If you really want an alias for it, you would have to add an extra reference there:
import numpy # loads sys.modules['numpy']
import sys
sys.modules['np'] = numpy # creates another reference
However, doing so can have side effects when you also import submodules. Generally speaking, you don't want to create aliases for packages by poking about in sys.modules without also creating aliases for all (possible) submodules; otherwise Python can re-import submodules as separate namespaces.
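For illustration only, a small sketch of that pitfall, aliasing just the package in sys.modules and then importing through the alias:
import sys
import numpy

sys.modules['np'] = numpy        # alias the package only, not its submodules

import np.linalg                 # no 'np.linalg' entry exists, so the import
                                 # machinery loads a second copy under that name
import numpy.linalg

print(sys.modules['np.linalg'] is sys.modules['numpy.linalg'])  # False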
In this specific case, importing numpy also triggers the loading of numpy.linalg, so all you really have to do is:
import numpy as np
# np.linalg now is available
No module aliasing is needed. For packages that don't import submodules automatically, you'd have to use:
import package as alias
import package.submodule
and alias.submodule is then available anyway, because a submodule is always added as an attribute on the parent package.
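As a rough illustration with a standard-library package that does not import its submodules automatically (xml here is purely an example):
import xml as x               # 'xml' does not load xml.etree by itself
import xml.etree.ElementTree  # loads the submodule and attaches it to the package

root = x.etree.ElementTree.fromstring("<root/>")  # reachable through the alias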

My understanding of your example is that since you already imported numpy, you can't re-import it under an alias, as the linalg portion would already have been imported.

Related

Import a module once and use it globally in Python

I have done some research and learned that Python's import statement only imports something once; when used again, it just checks whether it was already imported. I'm working on a bigger project and noticed that the same thing is imported in multiple files, which apparently doesn't affect performance but leaves the code a bit polluted, in my opinion. My question is: is there a way to import something only once and use it everywhere in the directory without calling the import statement over and over?
Here are some of the modules that I'm importing in various files:
from PyQt5.QtWebEngineWidgets import QWebEngineView
from PyQt5.QtCore import *
Each module (.py file) that needs those imported names in scope has to have its own import statements. This is the standard convention in Python. However, it's recommended not to import * but rather to import only the names you will actually use from the package.
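For instance (the specific names below are just examples of what you might actually be using from QtCore):
# Instead of: from PyQt5.QtCore import *
from PyQt5.QtCore import Qt, QUrl, pyqtSignal   # only the names this module uses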
It is possible to put your package import statements in an __init__.py file in the directory instead of in each .py file, but then you will still need a relative import statement to pull those names from your package, as described here: Importing external package once in my module without it being added to the namespace
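A minimal sketch of that approach, assuming a hypothetical package named mypkg with a module browser.py inside it:
# mypkg/__init__.py
from PyQt5.QtWebEngineWidgets import QWebEngineView

# mypkg/browser.py (any module inside the package)
from . import QWebEngineView   # still needs its own, relative, import statement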

Does importing a Python file also import the imported files into shell?

I am running Python 3.6.2 and trying to import other files into my shell prompt as needed. I have the following code inside my_file.py.
import numpy as np
def my_file(x):
    s = 1 / (1 + np.exp(-x))
    return s
From my 3.6.2 shell prompt I call
from my_file import my_file
But in my shell prompt, if I want to use the numpy library, I still have to import numpy into the shell even though I have imported a file that imports numpy. Is this behaviour by design? Or is there a way to import numpy once?
import has three completely separate effects:
If the module has not yet been imported in the current process (by any script or module), execute its code (usually from disk) and store a module object with the resulting classes, functions, and variables.
If the module is in a package, (import the package first, and) store the new module as an attribute on the containing package (so that references like scipy.special work).
Assign the module ultimately imported to a variable in the invoking scope. (import foo.bar assigns foo; import baz.quux as frob assigns baz.quux to the name frob.)
The first two effects are shared among all clients, while the last is completely local. This is by design, as it avoids accidentally using a dependency of an imported module without making sure it’s available (which would break later if the other modules changed what they imported). It also lets different clients use different shorthands.
As hpaul noted, you can use another module’s imports with a qualified name, but this is abusing the module’s interface just like any other use of a private name unless (like six.moves, for example, or os.path which is actually not a module at all) the module intends to publish names for other modules.
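As a rough sketch of the two options in the shell (reaching numpy through my_file is shown only to illustrate the qualified-name access this answer warns about):
import numpy as np        # the usual, explicit way
np.exp(1.0)

import my_file            # or reach the already-loaded dependency through the
my_file.np.exp(1.0)       # importing module; avoid this unless my_file intends
                          # to expose numpy as part of its interface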

How to create deprecation warnings when using old namespace

I have a python package with a large number of subpackages and I've recently rewritten a good chunk of it and renamed and reorganized the packages and objects.
For example, in the past, I would import something like
from package import MyClass
but now, it should be
from package.subpackage import MyClass
For backwards compatibility, I've created symbols in the old locations that import and use the modules and objects from the new namespaces.
In /package/__init__.py
from .subpackage import MyClass
Is there a way to raise a deprecation warning when someone tries to import or access one of the old namespace locations? With a class I could use __getattr__. Is there a similar mechanism for catching attribute access on modules and packages?
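There is an analogous hook worth sketching: on Python 3.7+ (PEP 562) a package's __init__.py can define a module-level __getattr__, so access to the old location can emit a warning while still resolving the name. A minimal, illustrative version (package and subpackage names taken from the question):
# package/__init__.py  (sketch; requires Python 3.7+, PEP 562)
import warnings

def __getattr__(name):
    if name == "MyClass":
        warnings.warn(
            "package.MyClass is deprecated; use package.subpackage.MyClass",
            DeprecationWarning,
            stacklevel=2,
        )
        # imported here, not at module top level, so that attribute access
        # still goes through __getattr__
        from .subpackage import MyClass
        return MyClass
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")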

Import package from file with same name as package [duplicate]

I have a module that conflicts with a built-in module. For example, a myapp.email module defined in myapp/email.py.
I can reference myapp.email anywhere in my code without issue. However, I need to reference the built-in email module from my email module.
# myapp/email.py
from email import message_from_string
It only finds itself, and therefore raises an ImportError, since myapp.email doesn't define message_from_string. import email causes the same issue when I try email.message_from_string.
Is there any native support to do this in Python, or am I stuck with renaming my "email" module to something more specific?
You will want to read about Absolute and Relative Imports which addresses this very problem. Use:
from __future__ import absolute_import
Using that, any unadorned package name will always refer to the top level package. You will then need to use relative imports (from .email import ...) to access your own package.
NOTE: In Python 2.x, the from __future__ line above must appear before the other import lines in every .py file that uses it. In Python 3.x this is the default behavior, so it is no longer needed.
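A minimal sketch of the fix applied to the file from the question (Python 2; on Python 3 the __future__ line is unnecessary):
# myapp/email.py
from __future__ import absolute_import

# 'email' now refers to the standard-library package, not myapp.email
from email import message_from_string

msg = message_from_string("Subject: hi\n\nbody")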

Properly importing modules in Python

How do I set up module imports so that each module can access the objects of all the others?
I have a medium-sized Python application with module files in various subdirectories. I have created modules that append these subdirectories to sys.path and import a group of modules, using import thisModule as tm. Module objects are referred to with that qualification. I then import that module into the others with from moduleImports import *. The code is sloppy right now and has several of these things, which are often duplicative.
First, the application is failing because some module references aren't assigned. This same code does run when unit tested.
Second, I'm worried that I'm causing a problem with recursive module imports. Importing moduleImports imports thisModule, which imports moduleImports, and so on.
What is the right way to do this?
"I have a medium size Python application with modules files in various subdirectories."
Good. Make absolutely sure that each directory includes an __init__.py file, so that it's a package.
"I have created modules that append these subdirectories to sys.path"
Bad. Use PYTHONPATH or install the whole structure into Lib/site-packages. Don't update sys.path dynamically. It's a bad thing: hard to manage and maintain.
"imports a group of modules, using import thisModule as tm."
Doesn't make sense. Perhaps you have one import thisModule as tm for each module in your structure. This is typical, standard practice: import just the modules you need, no others.
"I then import that module into the others with from moduleImports import *"
Bad. Don't blanket import a bunch of random stuff.
Each module should have a longish list of the specific things it needs.
import this
import that
import package.module
Explicit list. No magic. No dynamic change to sys.path.
My current project has 100's of modules, a dozen or so packages. Each module imports just what it needs. No magic.
A few pointers:
- You may have already split functionality into various modules. If correctly done, most of the time you will not run into circular import problems (e.g. if module a depends on b and b on a, you can make a third module c to remove such a circular dependency). As a last resort, in a import b, but in b import a only at the point where a is needed, e.g. inside a function (see the sketch after this list).
- Once functionality is properly split into modules, group them in packages under a subdirectory and add an __init__.py file to it so that you can import the package. Keep such packages in a folder, e.g. lib, and then either add it to sys.path or set the PYTHONPATH environment variable.
- from module import * may not be a good idea. Instead, import whatever is needed. It may be fully qualified; it doesn't hurt to be verbose, e.g. from packageA.moduleB import CoolClass.
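As promised above, a small illustrative sketch of the last-resort deferred import (the module names a and b, and the helper function, are placeholders):
# b.py -- b needs something from a, but a already imports b
def needs_a(x):
    import a              # deferred import, performed only when the function runs,
    return a.helper(x)    # so loading b no longer requires a to be fully loaded first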
The way to do this is to avoid magic. In other words, if your module requires something from another module, it should import it explicitly. You shouldn't rely on things being imported automatically.
As the Zen of Python (import this) has it, explicit is better than implicit.
You won't get recursion on imports because Python caches each module and won't reload one it already has.
