I am wrapping a Fortran library with f2py. So far it has been successful in terms of getting the functions to work within Python.
I am now at the point of structuring my package and am running into __init__.py issues when using these f2py-generated modules. This concerns a package built with distutils extensions for f2py.
After installing the egg, I can import the f2py-generated module directly in an external script:
from package.core.module1 import fortran_fn
but when I place
from module1 import fortran_fn
in package/core/__init__.py in order to refactor the namespace for users, the package import fails:
----> 1 import package.core as core
~/venv/dev/lib/python3.5/site-packages/package-py3.5-linux-x86_64.egg/package/core/__init__.py in <module>()
----> 3 from module1 import fortran_fn
ImportError: No module named 'module1'
However, there is one exception: for whatever reason, one module in particular does import successfully this way.
Project Structure:
setup.py
package/
    __init__.py
    core/
        __init__.py
        module1.pyf
        module2.pyf
        ...
src/
    lib/
        module1.f
        module2.f
        ...
The *.pyf files were generated with f2py -h package/core/moduleX.pyf src/lib/moduleX.f -m moduleX.
Further,
#core/__init__.py
from module1 import fortran_fn
and,
#setup.py
from setuptools import find_packages
from numpy.distutils.core import setup, Extension

ext1 = Extension(name='package.core.module1',
                 sources=['package/core/module1.pyf',
                          'src/lib/module1.f'])
#ext2 = ...
# and others

setup(name="package",
      packages=find_packages(),
      ext_modules=[ext1])  # [ ext1, ext2, ... ]
The code samples are simplified to illustrate the problem; the specific code is available at this GitHub repository.
For some context, I initially tried to have f2py place the Fortran into the .core subpackage directly by compiling with -m package.core, and likewise name='package.core' for the extension; however, f2py would complain about missing names.
In this particular structure there is an additional layer where each Fortran module is just one function, and I would like to remove the extra submodule layer from the API.
Using a fully qualified name in core/__init__.py will resolve the immediate import problem.
Instead of:
#core/__init__.py
from module1 import fortran_fn
This will work:
#core/__init__.py
from package.core.module1 import fortran_fn
then
> import package.core as c
> c.fortran_fn()
SUCCESS
It appears the f2py extensions do not register as subpackages/submodules when distutils runs. I may be mistaken in that interpretation.
In Python 3 all imports are absolute. To perform a relative import, do it explicitly:
from .module1 import fortran_fn
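For example, core/__init__.py can flatten the namespace with explicit relative imports. A minimal sketch based on the structure above (module2 and its function name are placeholders):

#core/__init__.py
# Relative imports resolve against the package itself, so the compiled
# extension modules are found no matter where the egg is installed.
from .module1 import fortran_fn
from .module2 import another_fn  # hypothetical second wrapped function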
Related
I have a Python package defined via __init__.py in a folder named python_utils that has many Python scripts in it.
python_utils/
__init__.py
print_pretty.py
script_1.py
...
For convenience this __init__.py imports all of them using:
from .print_pretty import print_pretty
from .script_1 import func1
from .script_2 import func2
...
The problem I have is that some of the Python scripts depend on packages that may not be available at run time. So when I try to import a very trivial function from the package (from python_utils import print_pretty), it fails because of the dependencies the other scripts have. I also tried to import directly with from python_utils.print_pretty import print_pretty; the problem is the same. I can get rid of the __init__.py to solve the problem, but it saves me a lot of typing when importing packages. Is there a way I can import only print_pretty without pulling in the dependencies of the other scripts?
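A common workaround is to guard the optional imports in __init__.py with try/except. A sketch, assuming only print_pretty must always be importable:

#python_utils/__init__.py
from .print_pretty import print_pretty  # no heavy dependencies

try:
    from .script_1 import func1
except ImportError:  # script_1's dependencies may be absent at run time
    func1 = None

try:
    from .script_2 import func2
except ImportError:
    func2 = None

With this, from python_utils import print_pretty works even when the other scripts' dependencies are missing.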
tl;dr - My Python package requires import package.package instead of working with just import package. How do I get it to work with the latter?
I am trying to set up my first python package and am running into some issues with the import part of the process.
My python package file setup looks like this on my computer:
my-package
- build
- dist
- package
- package.egg-info
- LICENSE
- README.md
- Setup.py
Inside package is the following:
__init__.py
package.py
__init__.py reads name = 'package', and package.py contains all of the content of the package.
EDIT: I have attempted using various versions of __init__.py, including adding import package, import package.package, or import package.package as package below the line name = 'package', but all of them resulted in the same issue.
Using the Packaging Python Projects tutorial, I've been able to get my package uploaded to TestPyPI, but when I install the package on my computer, none of the functions/methods are available, and when I run "import package" and do help(package), I get the following:
Help on package package:
NAME
package
PACKAGE CONTENTS
package
DATA
name = 'package'
FILE
url/to/package
When I run import package.package and help(package), I can access the methods/functions, and get the expected help text for the package's content.
My question is: how do I configure the package files on my computer in such a way that, once the package is uploaded to TestPyPI and then installed, import package works instead of needing import package.package?
When you write import package, you can access names in package/__init__.py as package.foo.
So, inside __init__.py, if you import all the necessary functions/variables/etc from package.py, those names will be visible to clients that just import package.
So, if you have this in package/__init__.py:
from .package import (foo, bar, baz)
Then in your other code you can do this:
from package import foo
And you don't have to worry about from package.package import foo.
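Putting it together, a minimal sketch (foo here stands in for whatever package.py actually defines):

#package/package.py
def foo():
    return "hello from foo"

#package/__init__.py
name = 'package'
from .package import foo  # re-export so clients can call package.foo

After installation, import package followed by package.foo() works as expected.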
I am attempting to compile a *.pyx file. It uses some definitions and constants inside a __init__.py in the same directory. The project structure is:
setup.py
Foo/__init__.py
Foo/Foo.pyx
and the setup command is as follows:
from setuptools import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
import numpy
setup(
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension(name='Foo', sources=['Foo/Foo.pyx'])],
    include_dirs=[numpy.get_include()],
    name='Foo',
    packages=['Foo'],
    zip_safe=True
)
The problem arises when the egg is built and deployed. The resulting egg has the following structure:
Foo.so
Foo.py
Foo/__init__.py
Now, Foo.py contains some dynamic import code that basically imports the *.so file. However, because of the presence of Foo/__init__.py, import Foo attempts to import symbols only from __init__.py, which contains just some constants (all the relevant code is actually in Foo.so).
I've hacked around this issue by pasting all the definitions from __init__.py into Foo.pyx, but I'm trying to figure out what a proper solution might be.
Any advice is appreciated!
I tracked down my problem to an extraneous argument to the setup() command. Judging by the documentation at https://docs.python.org/2/distutils/setupscript.html, I do not need the packages=['Foo'] argument, and in fact that's what's causing it to create the inner Foo package that's messing everything up.
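For reference, the corrected setup() call would simply drop that argument. A sketch of the fix described above:

from setuptools import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
import numpy

setup(
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension(name='Foo', sources=['Foo/Foo.pyx'])],
    include_dirs=[numpy.get_include()],
    name='Foo',
    zip_safe=True  # no packages=['Foo'], so no inner Foo package shadows Foo.so
)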
I am struggling a bit to set up a working structure in one of my projects. The problem is that I have a main package and a subpackage, in a structure like this (I left out all unnecessary files):
code.py
mypackage/__init__.py
mypackage/work.py
mypackage/utils.py
The utils.py has some utility code that is normally only used in the mypackage package.
I normally have some test code in each module file that calls some methods of the current module and prints a few things to quick-check that everything is working correctly. This code is placed in an if __name__ == "__main__" block at the end of the file, so I import utils.py directly via import utils. E.g., mypackage/work.py looks like:
import utils

def someMethod():
    pass

if __name__ == "__main__":
    print(someMethod())
But now when I use this module in the parent package (e.g. code.py) and import it like this:
import mypackage.work
I get the following error:
ImportError: No module named 'utils'
After some research I found out that this can be fixed by adding the mypackage/ folder to the PYTHONPATH environment variable, but that feels strange to me. Isn't there another way to fix this? I have heard about relative imports, but the Python docs on modules mention this:
Note that relative imports are based on the name of the current module. Since the name of the main module is always "__main__", modules intended for use as the main module of a Python application must always use absolute imports.
Any suggestions for how I can have an if __name__ == "__main__" section in the submodule and still use this file from the parent package without messing up the imports?
EDIT: If I use a relative import in work.py, as suggested in an answer, to import utils:
from . import utils
I get the following error:
SystemError: Parent module '' not loaded, cannot perform relative import
Unfortunately relative imports and direct running of submodules don't mix.
Add the parent directory of mypackage to your PYTHONPATH or always cd into the parent directory when you want to run a submodule.
Then you have two possibilities:
Use absolute (from mypackage import utils) instead of relative imports (from . import utils) and run them directly as before. The drawback with that solution is that you'll always need to write the fully qualified path, making it more work to rename mypackage later, among other things.
or
Run python3 -m mypackage.utils etc. to run your submodules instead of running python3 mypackage/utils.py.
This may take some time to adapt to, but it's the more correct way (a module in a package isn't the same as a standalone script) and you can continue to use relative imports.
There are more "magical" solutions involving __package__ and sys.path but they all require extra code at the top of every file with relative imports you want to run directly. I wouldn't recommend these.
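As a concrete sketch of the second option, work.py keeps its relative import and its self-test block, and is run with -m:

#mypackage/work.py
from . import utils  # relative import works when the module runs as part of the package

def someMethod():
    pass

if __name__ == "__main__":
    # From the directory containing mypackage/, run:
    #   python3 -m mypackage.work
    print(someMethod())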
You should create a structure like this:
flammi88
├── flammi88
│ ├── __init__.py
│ ├── code.py
│ └── mypackage
│ ├── __init__.py
│ ├── utils.py
│ └── work.py
└── setup.py
then put at least this in the setup.py:
import setuptools
from distutils.core import setup

setup(
    name='flammi88',
    packages=['flammi88'],
)
now, from the directory containing setup.py, run
pip install -e .
This will make the flammi88 package available in development mode. Now you can say:
from flammi88.mypackage import utils
everywhere. This is the normal way to develop packages in Python, and it solves all of your relative import problems. That being said, Guido frowns upon standalone scripts in sub-packages. With this structure I would move the tests inside flammi88/tests/... (and run them with e.g. py.test), but some people like to keep the tests next to the code.
Update:
an extended setup.py that describes external requirements and creates executables of the sub-packages you want to run can look like:
import setuptools
from distutils.core import setup

setup(
    name='flammi88',
    packages=['flammi88'],
    install_requires=[
        'markdown',
    ],
    entry_points={
        'console_scripts': [
            'work = flammi88.mypackage.work:someMethod',
        ]
    }
)
now, after pip installing your package, you can just type work at the command line.
Import utils inside the work.py as follows:
import mypackage.utils
or if you want to use shorter name:
from mypackage import utils
EDIT: If you need to run work.py outside of the package, then try:
try:
    from mypackage import utils
except ImportError:
    import utils
Use:
from . import utils
as suggested by Peter
In your code.py you should use:
from mypackage import work
I'm trying to create an install package for a Python project with included unit tests. My project layout is as follows:
setup.py
src/
    disttest/
        __init__.py
        core.py
tests/
    disttest/
        __init__.py
        testcore.py
My setup.py looks like this:
from distutils.core import setup
import setuptools
setup(name='disttest',
      version='0.1',
      package_dir={'': 'src'},
      packages=setuptools.find_packages('src'),
      test_suite='nose.collector',
      tests_require=['Nose'],
      )
The file tests/disttest/testcore.py contains the line from disttest.core import DistTestCore.
Running setup.py test now gives an ImportError: No module named core.
After a setup.py install, python -c "from disttest.core import DistTestCore" works fine. It also works if I put import core into src/disttest/__init__.py, but I don't really want to maintain that and it only seems necessary for the tests.
Why is that? And what is the correct way to fix it?
You may want to double-check this, but it looks like your tests are importing the disttest package in the tests/ directory, instead of the package-under-test from the src/ directory.
Why do you need to use a package with the same name as the package-under-test? I'd simply move the testcore module up to the tests directory, or rename the tests/disttest package and avoid the potential naming conflict altogether.
In any case, you'll want to insert an import pdb; pdb.set_trace() line just before the failing import and play around with different import statements to see what is being imported from where (import sys; sys.modules['modulename'].__file__ is your friend), so you get better insight into what is going wrong.
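For instance, a few throwaway lines at the top of tests/disttest/testcore.py (a sketch of the debugging step above) reveal which disttest actually gets imported:

#tests/disttest/testcore.py
import pdb; pdb.set_trace()  # drop into the debugger before the failing import

import disttest
print(disttest.__file__)  # shows whether tests/disttest or src/disttest was picked up

from disttest.core import DistTestCore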