Background:
I'm writing a symbolic package aimed at pypy for representing expressions and taking derivatives. Not working on simplification or anything like that at the moment, just differentiation and evaluation.
Question:
I'm writing expression classes to represent many of the functions in math (sqrt, log, exp, sin, cos, etc.). I have named the module they go in math to coincide with the module where you would naturally find said functions. I may in the future do the same with cmath and any other module of particular mathematical interest. My module is part of a larger package, ldb.algebra, so the name math isn't going to collide with anything at the base namespace. However, in order to evaluate expressions I'm going to need to import the system math module inside the ldb.algebra.math module, and I can't do this from inside my math module (or any other module at the same package level, for that matter). Here's my basic layout so far:
ldb/
    __init__.py - empty
    algebra/
        __init__.py - empty
        math.py
Contents of ldb/algebra/math.py as an example of the problem:
import math
my_sin = math.sin
Obviously not terribly useful, but it shows the idea. I'm using Python 2 (or rather PyPy for Python 2). I've found that I can get around this by creating a subpackage, adding a module under it (say sys_math), importing math in that module, and exposing the necessary functions, but this seems like a lot of work and an ugly mess.
So my question is: can I get at the system math module while still having a module called ldb.algebra.math? Perhaps something funny in the __init__.py file, like naming the module _math and then changing the name in __init__.py somehow, or some special way to import the system module instead of my own.
My ugly solution so far:
ldb/
    algebra/
        extra_package/
            __init__.py - empty
            sys_math.py
Contents of sys_math.py:
import math
sin = math.sin
from __future__ import absolute_import
Put that at the top of your module to disable implicit relative imports. Then import math will import the built-in math module, and if you want to use relative imports, you have to do them explicitly with syntax like from . import math.
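For example, ldb/algebra/math.py could then look something like this (just a sketch of the question's own example with the directive added):
from __future__ import absolute_import

import math  # now resolves to the standard library math, not ldb.algebra.math

my_sin = math.sin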
Related
My project structure looks like this:
/project
    main.py
    /a_module
        __init__.py
        /sub_module
            __init__.py
            some_file.py
main.py
from a_module import main_api
a_module/__init__.py
from sub_module import sub_api
sub_module/__init__.py
from some_file import detail_api
The import in a_module/__init__.py gives an "Unable to import 'sub_module'" error.
Why can't I import 'sub_module'?
When I change it to a relative import, the error goes away:
from .sub_module import sub_api
But I don't understand: isn't __init__.py designed to expose the public API of the package? Why isn't sub_module treated as a module instead of a directory? It seems like bad design to me...
__init__.py is executed when you import the package that contains it. But that's not your problem. Your problem is that module imports are always absolute unless explicitly relative. That means they must chain from some directory on sys.path. By default this includes the directory containing the script you ran (here, project itself), so when you run main.py from within project, Python can find a_module, and nothing else.
from sub_module import sub_api
In a_module/__init__.py doesn't work though, because imports are always absolute unless explicitly relative. So that import says "starting from some sys.path root, find a top level package named sub_module and import sub_api from it". Since no such module exists you get an error. from .sub_module import sub_api works because you opted into relative imports, so it doesn't start over from sys.path.
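Note that sub_module/__init__.py has the same issue: from some_file import detail_api is also an absolute import, so it needs the same treatment. A minimal sketch of both files with explicit relative imports:
a_module/__init__.py
from .sub_module import sub_api
sub_module/__init__.py
from .some_file import detail_api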
For an example of why you would do this, I'll give you something that broke in our own code back in the Python 2 days, before absolute import by default was the law (from __future__ import absolute_import enabled the Py3 behavior, which is how we fixed it; despite what the docs say, it was never made the default in Py2, where implicit relative imports remained the default behavior). Our layout was:
teamnamespace/
    module.py
    math/
        mathrelatedsubmodule.py
        othermathsubmodule.py
Now, we innocently thought hey, we'll put all our packages under a single shared top level namespace, and subpackages cover broad categories within them, and since we had a lot of additional utilities for basic mathematics, we put them under teamnamespace.math. Problem was, for the non-math modules, like teamnamespace.module, when they did:
import math # or
from math import ceil
it defaulted to relative lookup, and imported teamnamespace.math as math (a thoroughly useless import, since it was a namespace package only; all the functionality was in the sub-modules), not the built-in math module. In fact, without the Python 3 behavior, there was no reasonable way to get the built-in math module from a module under teamnamespace. Whereas with the Python 3 behavior, you can get either one or both (by aliasing one or the other with as), with no ambiguity:
# Gets built-in
import math
# Gets teamnamespace.math
from . import math
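So a module under teamnamespace can hold onto both at once by aliasing; the names stdlib_math and team_math below are just illustrative:
# Only needed on Python 2 to get the Python 3 behavior
from __future__ import absolute_import

# The built-in math module
import math as stdlib_math
# teamnamespace.math
from . import math as team_math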
I'm building a dependency graph in Python 3 using the ast module. How do I know what file(s) will be imported if a given import statement were to be executed?
Not a complete answer, but here are some bits you should be aware of:
Imports might happen in conditional or try/except blocks. So depending on the setting of an environment variable, module A might or might not import module B.
There's a wide variety of import syntax: import A, from A import B, from A import *, from . import A, from .. import A, from ..A import B as well as their versions with A replaced with sub-modules.
Imports can happen in any executable context - the top-level of the file, in a function, in a class definition etc.
eval can evaluate code with imports. Up to you if you consider such code to be a dependency.
The standard library modulefinder module might help.
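As a starting point, here is a minimal sketch that only covers static import statements (none of the dynamic cases above): it uses ast to collect the module names a file refers to, which you could then resolve to files with importlib.util.find_spec or modulefinder:
import ast

def imported_modules(path):
    # Collect module names referenced by static, absolute import statements.
    with open(path) as f:
        tree = ast.parse(f.read(), filename=path)
    names = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            names.add(node.module)  # skips relative imports for simplicity
    return names

# e.g. importlib.util.find_spec(name).origin gives the file a collected name resolves to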
As suggested in a comment: the other answers are valid, but one of the fundamental problems is that approaches like this only work for 'simple' scripts or files. A lot of more complex code will use things like dynamic imports. Consider the following:
import importlib

path, task_name = "module.function".rsplit(".", 1)
module = importlib.import_module(path)
real_func = getattr(module, task_name)
real_func()
The actual original string could be obfuscated, pulled from a DB, read from a file, or...
There are alternatives to importlib, but this is on top of the exec-type stuff you might see in @horia's good answer.
I'm currently refactoring a project (formerly one big file) into several separate Python files, each of which runs a specific part of my application.
E.g. GUIthread.py runs the GUI, Computethread.py does some maths, etc.
Each thread makes use of functions from imported modules like math, time, numpy, etc.
I already have a file globalClasses.py containing class definitions for my datatypes etc., which each .py file imports at the start, as per the recommendation here: http://effbot.org/pyfaq/how-do-i-share-global-variables-across-modules.htm. This is working well.
What I would like to do is have all my third-party module imports in the globals file as well, so that I can write, for example, import math once but have all of my project files able to use math functions.
Questions:
1. Is this possible?
2. Is it a good idea/is it good Python practice?
My current solution is just to put
import math
import time
import numpy
...
(plus imports for all the other modules I'm using as well)
at the top of every file in my project... But that doesn't seem very tidy, and it's easy to forget to move a dependency's import statement when moving code-chunks from file to file...
Yeah, I guess there is a more elegant way of doing this which will save redundant lines of code. Suppose you want to import some modules, say math, time, and numpy. Then you can create a file, say importing_modules.py, and import the various modules there with from module_name import *. The resulting file may look something like this:
importing_modules.py
from math import *
from numpy import *
from time import *
main.py
from importing_modules import *
#Now you can call the methods of that module directly
print(sqrt(25))  # sqrt() can now be called directly in place of math.sqrt()
The other answer shows how what you want is (sort of) possible, but didn't address your second question about good practice.
Using import * is almost invariably considered bad practice. See "Why is import * bad?" and "Importing * from a package" from the docs.
Remember from PEP 20 that explicit is better than implicit. With explicit, specific imports (e.g. from math import sqrt) in every module, there is never confusion about where a name came from, your module's namespace includes only what it needs, and bugs are prevented.
The downside of having to write a couple import statements per module does not outweigh the potential problems introduced by trying to get around writing them.
I'm trying to do the following in python 2.6.
my_module.py:
from another_module import another_factory

def my_factory(name):
    pass
another_module.py:
from my_module import my_factory

def another_factory(name):
    pass
Both modules are in the same folder.
It gives me the error:
Error: cannot import name my_factory
As seen from the comments, you are trying to do a circular import, which is impossible.
If your module A tries to import something from module B, and while loading module B (to satisfy this dependency) it tries to import something from module A, you are back where you started and you've got a circular import: A needs B and B needs A. It is somewhat like saying that A needs A, which is quite illogical.
For instance:
# moduleA
from moduleB import functionB
...
So the interpreter tries to load the moduleB, which looks like the following:
# moduleB
from moduleA import functionA
...
And it goes back to moduleA, which tries to import moduleB again, and so on. Therefore Python just raises the error and stops the insanity for the greater good.
Dependencies don't work like this. Decide which module needs the other one, and just do a simple import. In your example, it seems that another_module needs my_module, so change my_module and eliminate its dependency on another_module.
If both modules actually need each other, it is a clear sign that they belong to the same logical concept, and should be merged.
PS: in some cases, to avoid huge files, you can split a logical unit in two; to avoid the circular dependency, you write your imports inside the functions (which are not executed at load time), so that there is no cycle. This is, however, generally something to avoid.
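For illustration, a deferred import in the question's my_module.py could look like the sketch below (whether the dependency should exist at all is a separate question, and the body is hypothetical since the original functions just pass):
# my_module.py: no top-level import of another_module

def my_factory(name):
    # Imported only when the function is called, so loading my_module
    # no longer requires another_module to be fully loaded first.
    from another_module import another_factory
    return another_factory(name)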
The real question is... do you consider each file to be a standalone module, or are they part of a package?
Trying to import modules outside of a package is sometimes painful. You should instead build a package by simply creating an empty __init__.py module in the directory. So, if you have:
__init__.py
my_module.py
another_module.py
and you have the following function in my_module.py,
def my_factory(x):
    return x * x
you should be able to access the my_factory() function from another_module.py by writing this:
from my_module import my_factory
But if you don't have the __init__.py file/module, the import machinery will be (somewhat) lost and will only use sys.path for searching other modules. You may then add the following lines (before the import) in the another_module.py file:
import os, sys
sys.path.append(os.path.dirname(os.path.expanduser('.')))
You may also use the various packages available to help with importing modules, like imp or import_file (see the documentation). Or you can decide to use imp.load_source (also see the docs: https://docs.python.org/2/library/imp.html).
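For instance, a minimal imp.load_source sketch (Python 2; the path here is just an example):
import imp

# Load my_module.py directly from an explicit path, bypassing package lookup
my_module = imp.load_source('my_module', '/path/to/my_module.py')
print(my_module.my_factory(3))  # -> 9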
How can a standard-library module (say math) be accessed when a file prog.py is placed in the same directory as a local module with the same name (math.py)?
I'm asking this question because I would like to create a package uncertainties that one can use as
import uncertainties
from uncertainties.math import *
Thus, there is a local math module inside the uncertainties directory. The problem is that I want to access the standard library math module from uncertainties/__init__.py.
I prefer not to rename uncertainties.math because this module is precisely intended to replace functions from the math module (with equivalents that handle numerical uncertainties).
PS: this question pertains to the module I wrote for performing calculations with uncertainties while taking into account correlations between variables.
You are looking for Absolute/Relative imports from PEP 328, available in Python 2.5 and upward.
In Python 2.5, you can switch import's behaviour to absolute imports using a from __future__ import absolute_import directive. This absolute-import behaviour will become the default in a future version (probably Python 2.7). Once absolute imports are the default, import math will always find the standard library's version. It's suggested that users should begin using absolute imports as much as possible, so it's preferable to begin writing from pkg import string in your code.
Relative imports are still possible by adding a leading period to the module name when using the from ... import form:
from __future__ import absolute_import
# Import uncertainties.math
from . import math as local_math
# Import the standard library math
import math as sys_math
Why can't you rename your local module again?
Clearly, it's not a "total" replacement, if you still need things from the installed math module.
Since it's a partial replacement, you should not give it the same name.
What's different? What's the same? Pick a better name based on that.