Automatic imports from imported file - Python

I have a file called utils.py with the following code:
from __future__ import division
import numpy as np
In another file test.py I call the previous one:
from utils import *
print np.sqrt(4)
print 1/2
Now, as the outcome I get 2 and 0. That is, np imported in utils.py is also made available in test.py through the star import, but the division behaviour is not. Is there a way to make sure division takes effect in test.py by importing everything from utils.py?
The motivation is that in almost all my files I import utils.py so I do not want to import division in each file separately, as I can currently do with np.

Imports from __future__ are not real imports! They are a different kind of statement that happens to have a similar syntax.
The documentation states clearly:
It allows use of the new features on a per-module basis before the
release in which the feature becomes standard.
They are a way to tell Python to treat that file differently: in particular, to compile the code with possibly different syntax or semantics.
So, no: you cannot "re-export" __future__ imports.
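One way to see the difference: the standard-library __future__ module describes each feature as a plain record, not as an importable submodule. A minimal stdlib-only check:

```python
import __future__

# Each "future feature" is a _Feature record, not a module. It carries
# the release where the feature first became available via the future
# statement, and the release where it became the language default.
feat = __future__.division
print(type(feat).__name__)  # _Feature
print(feat.optional)        # (2, 2, 0, 'alpha', 2) -- importable since Python 2.2
print(feat.mandatory)       # (3, 0, 0, 'alpha', 0) -- the default since Python 3.0
```

Since the statement only sets a compiler flag for the file it appears in, there is nothing here that a star import could carry over to another module.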


python package best practice: managing imports

Coming from R, I'm trying to wrap my head around the package system in python.
My question (in short) is: what is the best practice for managing external library imports?
Suppose I have a package (call it pointless) with the following directory structure.
pointless/
setup.py
...etc
pointless/
__init__.py
module1.py
module2.py
And suppose both module1 and module2 had the header:
from __future__ import division
import numpy as np
...
My issue is that when I import pointless I get a double-whammy of np and division in both pointless.module1 and pointless.module2. There has to be a better way?
EDIT
Apologies if that wasn't clear. It bugs me that when I run (ipython):
>>> import pointless
>>> pointless.module1.<TAB>
pointless.module1.np
pointless.module1.division
...
>>> pointless.module2.<TAB>
pointless.module2.np
pointless.module2.division
...
I can see the np namespace in both modules, which seems messy and way overkill.
Is there a way I can "centralize" my external library imports so that I don't see them in every module? Or am I missing something?
This is related to this question: "What happens when I import a module twice in Python?". Long story short: if you import a module twice, it is only loaded once, so your example is not problematic at all.
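The caching behaviour is easy to check; here is a minimal sketch using the stdlib math module as a stand-in for numpy:

```python
import sys
import math
import math as math_again  # second import: served from the cache

# After the first import, the module object lives in sys.modules;
# any later import returns that same object instead of re-executing
# the module's code.
print("math" in sys.modules)  # True
print(math is math_again)     # True -- one module object, imported twice
```

So module1 and module2 share a single numpy module object; the names showing up in both namespaces are just references to it.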

Importing system modules in python

Background:
I'm writing a symbolic package aimed at pypy for representing expressions and taking derivatives. Not working on simplification or anything like that at the moment, just differentiation and evaluation.
Question:
I'm writing expression classes to represent many of the functions in math (sqrt, log, exp, sin, cos, etc.). I have named the module these go in math, to coincide with the module where you naturally find said functions. I may in the future do the same with cmath and any other module of particular mathematical interest. My module is part of a larger package, ldb.algebra, so the name math isn't going to collide with anything in the base namespace. However, in order to evaluate expressions I'm going to need to import the system math module inside the ldb.algebra.math module, and I can't do this from inside my math module (or any other module at the same level of the package, for that matter). Here's my basic layout so far:
ldb/
__init__.py - empty
algebra/
__init__.py - empty
math.py
Contents of ldb/algebra/math.py as an example of the problem:
import math
my_sin = math.sin
Obviously not terribly useful, but it shows the idea. I'm using Python 2 (or rather PyPy for Python 2). I've found that I can get around this by creating a subpackage, importing math in a module there (say sys_math) and exposing the necessary functions, but this seems like a lot of work and an ugly mess.
So my question is: can I get at the system math module while still having a module called ldb.algebra.math? Perhaps something funny in the __init__.py file like making the module _math and then changing the name in __init__.py somehow, or some special way to import the system module instead of myself.
My ugly solution so far:
ldb/
algebra/
extra_package/
__init__.py - empty
sys_math.py
Contents of sys_math.py:
import math
sin = math.sin
from __future__ import absolute_import
Put that at the top of your module to disable implicit relative imports. Then import math will import the standard-library math module, and if you want to use relative imports you have to write them explicitly, with syntax like from . import math.
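Putting the pieces together, a hedged sketch of what ldb/algebra/math.py could look like under this approach (my_sin is from the question; my_log is an assumed extra re-export for illustration):

```python
# ldb/algebra/math.py (sketch)
from __future__ import absolute_import  # Python 2: make bare imports absolute

import math  # now unambiguously the standard-library math module

# Re-export what the symbolic package needs under its own names.
my_sin = math.sin
my_log = math.log
```

No subpackage needed: the future statement alone makes the bare import skip the package's own math.py.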

Python import modules in another file

I'm currently re-factoring a project (formerly one big file) into several separate Python files, each of which runs a specific part of my application.
E.g., GUIthread.py runs the GUI, Computethread.py does some maths, etc.
Each thread includes the use of functions from imported modules like math, time, numpy, etc.
I already have a file globalClasses.py containing class definitions for my datatypes etc., which each .py file imports at the start, as per the recommendation here: http://effbot.org/pyfaq/how-do-i-share-global-variables-across-modules.htm . This is working well.
What I would like to do is have all my 3rdparty module imports in the globals file as well, so that I can write, for example, import math once but have all of my project files able to use math functions.
Questions:
1. Is this possible?
2. Is it a good idea/is it good Python practice?
My current solution is just to put
import math
import time
import numpy
...
(plus imports for all the other modules I'm using as well)
at the top of every file in my project... But that doesn't seem very tidy, and it's easy to forget to move a dependency's import statement when moving code-chunks from file to file...
Yeah, I guess there is a more elegant way of doing this which will save redundant lines of code. Suppose you want to import some modules, say math, time and numpy; then you can create a file, say importing_modules.py, and import the various modules there with from module_name import *. So importing_modules.py may look something like this:
importing_modules.py
from math import *
from numpy import *
from time import *
main.py
from importing_modules import *
#Now you can call the methods of that module directly
print sqrt(25) #Now we can call sqrt() directly in place of math.sqrt() or importing_modules.math.sqrt().
The other answer shows how what you want is (sort of) possible, but didn't address your second question about good practice.
Using import * is almost invariably considered bad practice. See "Why is import * bad?" and "Importing * from a package" from the docs.
Remember from PEP 20 that explicit is better than implicit. With explicit, specific imports (e.g. from math import sqrt) in every module, there is never confusion about from where a name came, your module's namespace includes only what it needs, and bugs are prevented.
The downside of having to write a couple import statements per module does not outweigh the potential problems introduced by trying to get around writing them.
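A small stdlib-only illustration of the kind of bug wildcard imports invite: even a single from math import pow silently shadows the built-in pow, and from math import * does this for every overlapping name at once.

```python
print(pow(2, 3))      # 8 -- the built-in pow, returns an int

from math import pow  # quietly replaces the built-in in this namespace
print(pow(2, 3))      # 8.0 -- math.pow always returns a float
```

With an explicit import the shadowing is at least visible in the source; with a star import hidden inside a shared "imports" module, nothing in the file hints at which pow (or floor, or e) you are actually calling.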

Circular function import

I'm trying to do the following in Python 2.6.
my_module.py:-
from another_module import another_factory
def my_factory(name):
pass
another_module.py:-
from my_module import my_factory
def another_factory(name):
pass
Both modules in the same folder.
It gives me the error:
Error: cannot import name my_factory
As seen from the comments, you are trying to do a circular import, which is impossible.
If in your module A you try to import something from module B, and while loading module B (to satisfy this dependency) it tries to import something from module A, you are back where you started and you have a circular import: A needs B and B needs A. It is somewhat like saying that A needs A, which is quite illogical.
For instance:
# moduleA
from moduleB import functionB
...
So the interpreter tries to load the moduleB, which looks like the following:
# moduleB
from moduleA import functionA
...
And goes back to the moduleA, which tries again to import B, and, etc. Therefore python just raises the error and stops the insanity for a greater good.
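The failure is easy to reproduce in a self-contained way; this sketch writes two mutually-importing modules (hypothetical names module_a/module_b) to a temporary directory and imports one of them:

```python
import os
import sys
import tempfile

# Write two modules that import from each other.
d = tempfile.mkdtemp()
with open(os.path.join(d, "module_a.py"), "w") as f:
    f.write("from module_b import function_b\ndef function_a():\n    pass\n")
with open(os.path.join(d, "module_b.py"), "w") as f:
    f.write("from module_a import function_a\ndef function_b():\n    pass\n")

sys.path.insert(0, d)
err = None
try:
    import module_a  # triggers the A -> B -> A cycle
except ImportError as exc:
    err = exc
print(type(err).__name__)  # ImportError
```

When module_b asks for function_a, module_a is already in sys.modules but only partially initialized (function_a is not defined yet), so the import fails with exactly the "cannot import name" error from the question.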
Dependencies don't work like this. Define what module needs the other one, and just do a simple import. In your example, it seems that another_module needs my_module, so change my_module and eliminate the dependency on another_module.
If both modules actually need each other, it is a clear sign that they belong to the same logical concept, and should be merged.
PS: in some cases, to avoid huge files, you can split a logical unit in two; to avoid the circular dependencies, you write your imports inside the functions (which are not executed at load time), so that there is no cycle. This is, however, generally something to avoid.
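A minimal, self-contained sketch of that function-local import (using the stdlib math module as a stand-in for the other half of the split unit):

```python
def circle_area(radius):
    # Deferred import: executed on the first call, not at module load
    # time, so it cannot take part in an import cycle during loading.
    import math
    return math.pi * radius ** 2

print(circle_area(1.0))  # 3.141592653589793
```

Because the module body no longer contains the import, loading this file never pulls in the other module, and the cycle is broken at import time.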
The real question is: do you consider each file a module, or are they part of a package?
Trying to import modules outside a package is sometimes painful. You should instead build a package by simply creating an empty __init__.py module in the directory. So, if you have
__init__.py
my_module.py
another_module.py
If you have the following function in my_module.py,
def my_factory(x):
return x * x
You should be able to access the my_factory() function from another_module.py by writing this :
from my_module import my_factory
But if you don't have the __init__.py file/module, the package-relative import will not work and Python will only search sys.path for other modules. You may then add the following lines (before the import) in the another_module.py file:
import os
import sys
sys.path.append(os.path.dirname(os.path.expanduser('.')))
You may also use one of the various packages available to help with importing modules, like imp or import_file (see the documentation), or you can decide to use load_source (also see the doc: https://docs.python.org/2/library/imp.html).

PEP 8: How should __future__ imports be grouped?

According to PEP 8:
Imports should be grouped in the following order:
standard library imports
related third party imports
local application/library specific imports
You should put a blank line between each group of imports.
But it does not mention __future__ imports. Should __future__ imports be grouped together with standard library imports or separated from them?
So, which is more preferred:
from __future__ import absolute_import

import sys
import os.path

from .submod import xyz
or:
from __future__ import absolute_import
import sys
import os.path

from .submod import xyz
I personally separate them. A __future__ import isn't just binding a name like other imports; it changes the meaning of the language. With something like from __future__ import division, the module will likely run fine both with and without the import but silently give different (wrong) results, and nothing at the point of use tells me to go look at the imports to understand why. __future__ imports should stand out as much as possible.
Also, I generally sort imports within a group alphabetically (no particularly good reason for doing that; I just find it has some very small benefits to diffs and merging branches), and __future__ imports have to be first, so I put them in their own group.
