I am trying to call a custom module installed in my virtualenv. The module, which I built myself, has the following structure:
test
├── README.md
├── setup.py
├── src
│   ├── __init__.py
│   └── module_x
│       └── abc.py
└── tests
setup.py looks like the following:

    from setuptools import find_packages, setup

    with open('README.md', 'r') as f:
        long_description = f.read()

    setup(
        name='test',
        version='0.1.0',
        author='me',
        description='description',
        long_description=long_description,
        long_description_content_type='text/markdown',
        packages=find_packages('src'),
        package_dir={'': 'src'},
        install_requires=[],
        entry_points={
            'console_scripts': [
                'test=module_x.abc:main'
            ],
        }
    )
Inside abc.py, I am using the following code:

    class RandomClass:
        def __init__(self, msg):
            self.msg = msg
            self.info()

        def info(self):
            print(self.msg)

    def main(msg):
        RandomClass(msg)
src/__init__.py looks like

    from .module_x.abc import main

The egg was installed using pip install -e ..
UPDATED
How can I call the module in a new Python script with from test import main? Currently it only works as from test.src import main, and I don't want to have to know the whole internal structure of the package. In the future there will also be module_y, module_z, etc. My assumption is that I need to modify something in setup.py.
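One hedged sketch of a fix (assuming the goal is a package importable as test): move the code under src/test/ so that find_packages('src') discovers it, and re-export main from the package's __init__.py:

```python
# Hypothetical layout:
#   src/test/__init__.py
#   src/test/module_x/__init__.py   (module_x needs its own, possibly empty, __init__.py)
#   src/test/module_x/abc.py
#
# src/test/__init__.py -- re-export so that `from test import main` works:
from .module_x.abc import main
```

With package_dir={'': 'src'} unchanged, find_packages('src') then finds test and test.module_x, and future subpackages (module_y, module_z) can be re-exported the same way without callers knowing the internal layout.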
Related
I'm making a python package using setuptools, and I'm having trouble making all nested folders in my source code available for import after installing the package. The directory I'm working in has a structure like the one illustrated below.
.
├── setup.py
└── src
    └── foo
        ├── a
        │   ├── aa
        │   │   └── aafile.py
        │   └── afile.py
        ├── b
        │   └── bfile.py
        └── __init__.py
Currently, I can't import submodules, such as from foo.a import aa or from foo.a.aa import some_method, unless I explicitly pass the names of the submodules to setuptools. That is, setup.py needs to contain something like
    from setuptools import setup

    setup(
        version="0.0.1",
        name="foo",
        py_modules=["foo", "foo.a", "foo.a.aa", "foo.b"],
        package_dir={"": "src"},
        packages=["foo", "foo.a", "foo.a.aa", "foo.b"],
        include_package_data=True,
        # more arguments go here
    )
This makes organizing the code pretty cumbersome. Is there a simple way to just allow users of the package to import any submodule contained in src/foo?
You'll want setuptools.find_packages() – though, all in all, you might want to consider dropping setup.py altogether in favor of a PEP 517-style build with no arbitrary Python code, just pyproject.toml (and possibly setup.cfg).
    from setuptools import setup, find_packages

    setup(
        version="0.0.1",
        name="foo",
        package_dir={"": "src"},
        packages=find_packages(where='src'),
        include_package_data=True,
    )
Every package/subpackage must contain an __init__.py file (at least an empty one) to be recognized as such.
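That rule can be checked with a runnable sketch (a throwaway tree in a temp directory, not the poster's actual project): find_packages() only reports directories that contain an __init__.py.

```python
import os
import tempfile

from setuptools import find_packages

# Build a scratch tree mirroring the layout above, deliberately leaving
# foo/b without an __init__.py.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "foo", "a", "aa"))
os.makedirs(os.path.join(root, "foo", "b"))
for pkg in ("foo", "foo/a", "foo/a/aa"):
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

found = sorted(find_packages(where=root))
print(found)  # -> ['foo', 'foo.a', 'foo.a.aa']; foo.b is missing because it lacks an __init__.py
```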
If you want the whole package/subpackage tree to be imported with just one import foo, consider filling your __init__.py files with imports of the relative subpackages:
    # src/foo/__init__.py
    import foo.a
    import foo.b

    # src/foo/a/__init__.py
    import foo.a.aa

    # src/foo/b/__init__.py
    import foo.b.bfile
Otherwise, leave the __init__.py files empty and the user will need to manually import the subpackage/submodule they want.
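With empty __init__.py files, the import side then looks like this (an illustrative fragment, assuming the foo package from above is installed):

```python
# Each submodule must be pulled in explicitly; `import foo` alone
# does not load foo.a or foo.a.aa.
import foo.a.aa            # loads foo, then foo.a, then foo.a.aa
from foo.b import bfile    # or import a single submodule by name
```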
I'm struggling to figure out how to copy the wrapper generated by SWIG to the same level as the SWIG shared library. Consider this tree structure:
│   .gitignore
│   setup.py
│
├───hello
├───src
│       hello.c
│       hello.h
│       hello.i
│
└───test
        test_hello.py
and this setup.py:

    import os
    import sys
    from setuptools import setup, find_packages, Extension
    from setuptools.command.build_py import build_py as _build_py

    class build_py(_build_py):
        def run(self):
            self.run_command("build_ext")
            return super().run()

    setup(
        name='hello_world',
        version='0.1',
        cmdclass={'build_py': build_py},
        packages=["hello"],
        ext_modules=[
            Extension(
                'hello._hello',
                [
                    'src/hello.i',
                    'src/hello.c'
                ],
                include_dirs=[
                    "src",
                ],
                depends=[
                    'src/hello.h'
                ],
            )
        ],
        py_modules=[
            "hello"
        ],
    )
When I do pip install . I get this content in site-packages:

    >tree /f d:\virtual_envs\py364_32\Lib\site-packages\hello
    D:\VIRTUAL_ENVS\PY364_32\LIB\SITE-PACKAGES\HELLO
        _hello.cp36-win32.pyd

    >tree /f d:\virtual_envs\py364_32\Lib\site-packages\hello_world-0.1.dist-info
    D:\VIRTUAL_ENVS\PY364_32\LIB\SITE-PACKAGES\HELLO_WORLD-0.1.DIST-INFO
        INSTALLER
        METADATA
        RECORD
        top_level.txt
        WHEEL

As you can see, hello.py (the file generated by SWIG) hasn't been copied into site-packages.
The thing is, I've already tried many answers from the similar posts below:

setup.py: run build_ext before anything else
python distutils not include the SWIG generated module

Unfortunately, the question remains unsolved.
QUESTION: How can I fix my current setup.py so that the SWIG wrapper is copied to the same level as the .pyd file?
setuptools cannot do this the way you want: it will look for py_modules only where setup.py is located. The easiest way, IMHO, is to keep the SWIG modules where you want them in the namespace/directory structure: rename src to hello, and add hello/__init__.py (it may be empty, or may simply import everything from hello.hello), leaving you with this tree:
$ tree .
.
├── hello
│   ├── __init__.py
│   ├── _hello.cpython-37m-darwin.so
│   ├── hello.c
│   ├── hello.h
│   ├── hello.i
│   ├── hello.py
│   └── hello_wrap.c
└── setup.py
Remove py_modules from setup.py. The "hello" entry in the packages list makes setuptools pick up the whole package, including __init__.py and the generated hello.py:
    import os
    import sys
    from setuptools import setup, find_packages, Extension
    from setuptools.command.build_py import build_py as _build_py

    class build_py(_build_py):
        def run(self):
            self.run_command("build_ext")
            return super().run()

    setup(
        name='hello_world',
        version='0.1',
        cmdclass={'build_py': build_py},
        packages=["hello"],
        ext_modules=[
            Extension(
                'hello._hello',
                [
                    'hello/hello.i',
                    'hello/hello.c'
                ],
                include_dirs=[
                    "hello",
                ],
                depends=[
                    'hello/hello.h'
                ],
            )
        ],
    )
This way, .egg-linking the package also works (python setup.py develop), so you can link the package under development into a venv. This is also the reason setuptools (and distutils) works the way it does: the dev sandbox should be structured so that the code can be run directly from it, without moving modules around.
The SWIG-generated hello.py and the generated extension _hello will then live under hello:
    >>> from hello import hello, _hello
    >>> print(hello)
    <module 'hello.hello' from '~/so56562132/hello/hello.py'>
    >>> print(_hello)
    <module 'hello._hello' from '~/so56562132/hello/_hello.cpython-37m-darwin.so'>
(as you can see from the extension filename, I am on a Mac right now, but this works exactly the same under Windows)
Also, beyond packaging, there's more useful information about SWIG and Python namespaces and packages in the SWIG manual: http://swig.org/Doc4.0/Python.html#Python_nn72
Update: I have changed my file directory
I have a directory structure as follows and I would like to import a module in a parent directory.
project/
    __init__.py
    main.py
    APP_NAME/
        parser/
            __init__.py
            parser.py
        test/
            __init__.py
            parser_test.py
parser.py

    class Parser(object):
        pass

main.py (works fine)

    from APP_NAME.parser.parser import Parser

parser_test.py (throws an error)

    from ..APP_NAME.parser.parser import Parser
Throws the following error at parser_test.py:

    Parent module '' not loaded, cannot perform relative import
I know I can fix it using sys.path.append(), but I want to import it like a package the way I did it in main.py.
Any help is appreciated. Thanks.
I had to check back at one of my projects for a reference.
To test files in the tests folder, you must first create a setup.py so that you can install your project for Python to use. If on Linux, use the command sudo python setup.py install to install the package. When changes have been made to the project, you must install again for the changes to take effect.

These folders will be created in your root project directory after installing: build, dist, and project.egg-info.

You may need to clean the build directory before re-installing to pick up updates:

    python setup.py clean
    python setup.py build
    python setup.py install
Project Structure
project
├── setup.py
├── tests
│ └── parser_test.py
│
└── project
├── __init__.py
├── __init__.pyc
├── main.py
└── parser
├── __init__.py
├── __init__.pyc
├── parser.py
└── parser.pyc
project/setup.py
    from setuptools import setup

    # Make sure the project name will not conflict with other libraries.
    # For example, do not name the project 'os', 'sys', etc.
    setup(
        name='project',
        description='My project description',
        author='your_online_name',
        license='MIT',  # Check out software licenses
        packages=['project', 'tests']
    )
project/tests/parser_test.py

    from project.parser import Parser

    parser = Parser()

project/project/__init__.py

    from . import parser

project/project/parser/__init__.py

    from .parser import Parser

project/project/parser/parser.py

    class Parser(object):
        pass
You shouldn't be using absolute imports within your package. In-package imports should be done with relative imports, this way:

parser_test.py

    from ..parser.parser import Parser
With relative imports in Python, the first dot refers to the file's own package and each extra dot refers to a parent package. In this case, you would be pointing at the project/parser/parser.py file, which from test_parser.py's standpoint is ../parser/parser.py.
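The dot arithmetic can be verified end to end with a runnable sketch (throwaway files in a temp directory, with minimal hypothetical module contents):

```python
import importlib
import os
import sys
import tempfile

# Recreate the layout under a scratch directory:
# project/__init__.py, project/parser/{__init__,parser}.py,
# project/tests/{__init__,parser_test}.py
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "project", "parser"))
os.makedirs(os.path.join(root, "project", "tests"))

open(os.path.join(root, "project", "__init__.py"), "w").close()
open(os.path.join(root, "project", "tests", "__init__.py"), "w").close()
with open(os.path.join(root, "project", "parser", "__init__.py"), "w") as f:
    f.write("from .parser import Parser\n")
with open(os.path.join(root, "project", "parser", "parser.py"), "w") as f:
    f.write("class Parser(object):\n    pass\n")
with open(os.path.join(root, "project", "tests", "parser_test.py"), "w") as f:
    # one dot = this file's own package (project.tests);
    # two dots = its parent package (project)
    f.write("from ..parser.parser import Parser\n")

sys.path.insert(0, root)
test_mod = importlib.import_module("project.tests.parser_test")
print(test_mod.Parser)  # the relative import resolved to project.parser.parser.Parser
```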
If you are using Python 2, you should add the following line at the top of all the files in your parser package:

    from __future__ import absolute_import

This prevents you from accidentally using implicit relative imports inside your package files. Still assuming you are working with Python 2, you should also import unicode_literals for native Unicode support and print_function to replace the print statement with the print() function.
However, I would rather have my tests in the top folder of the package, which, assuming the package is called project and not parser, would give the following directory structure:
project/ # top project directory
├── main.py
└── project # top package directory
├── __init__.py # this file is required even if it is empty
├── parser
│ ├── __init__.py
│ └── parser.py
└── tests
└── test_parser.py
Also, the project/project/parser/__init__.py could contain the following:
    from .parser import Parser

So that your main.py file can import the Parser class like this:

    from project.parser import Parser

instead of the more tedious:

    from project.parser.parser import Parser
Your test_parser.py file can likewise import the Parser class through the package's __init__.py:

    from ..parser import Parser

since names exposed in an __init__.py are available to relative imports as well.
Finally, if you are starting a new independent project, you should do it in Python 3 (that's a PEP recommendation), where all the above rules apply, except the from __future__ imports which are unnecessary.
Sources: https://axialcorps.wordpress.com/2013/08/29/5-simple-rules-for-building-great-python-packages/
Let's say I have very simple package with a following structure:
.
├── foo
│   ├── bar
│   │   └── __init__.py
│   └── __init__.py
└── setup.py
Content of the files:
setup.py:
    from distutils.core import setup

    setup(
        name='foobar',
        version='',
        packages=['foo', 'foo.bar'],
        url='',
        license='Apache License 2.0',
        author='foobar',
        author_email='',
        description=''
    )
foo/bar/__init__.py:

    def foobar(x):
        return x
The remaining files are empty.
I install the package using pip:
    cd foobar
    pip install .
and can confirm it is installed correctly.
Now I want to create a separate package with stub files:
.
├── foo
│   ├── bar
│   │   └── __init__.pyi
│   └── __init__.pyi
└── setup.py
Content of the files:
setup.py:
    from distutils.core import setup
    import sys
    import pathlib

    setup(
        name='foobar_annot',
        version='',
        packages=['foo', 'foo.bar'],
        url='',
        license='Apache License 2.0',
        author='foobar',
        author_email='',
        description='',
        data_files=[
            (
                'shared/typehints/python{}.{}/foo/bar'.format(*sys.version_info[:2]),
                ["foo/bar/__init__.pyi"]
            ),
        ],
    )
foo/bar/__init__.pyi:

    def foobar(x: int) -> int: ...
I can install this package and see that it creates anaconda3/shared/typehints/python3.5/foo/bar/__init__.pyi in my Anaconda root, but it doesn't look like it is recognized by PyCharm (I get no warnings). When I place the .pyi file in the main package, everything works OK.

I would be grateful for any hints on how to make this work:

I've been trying to make some sense of PEP 484 - Storing and distributing stub files, but to no avail. Even the pathlib part seems to offend my version of distutils.
PY-18597 and https://github.com/python/mypy/issues/1190#issuecomment-188526651 seem to be related, but somehow I cannot connect the dots.
I tried putting stubs in .PyCharmX.X/config/python-skeletons, but it didn't help.
Some things that work, but don't resolve the problem:
Putting stub files in the current project and marking as sources.
Adding stub package root to the interpreter path (at least in some simple cases).
So the question: how do I create a minimal, distributable package of Python stubs that will be recognized by existing tools? Based on my experiments, I suspect one of these problems:

I misunderstood the structure that should be created by the package in shared/typehints/pythonX.Y - if this is true, how should I define data_files?
PyCharm doesn't consider these files at all (this seems to be contradicted by some comments in the linked issue).
It is supposed to work just fine, but I made some configuration mistake and am looking for an external problem that doesn't exist.

Are there any established procedures for troubleshooting problems like this?
The problem is that you didn't include the foo/__init__.pyi file in your stub distribution. Even though it's empty, it makes foo a stub package and enables the search for foo.bar.
You can modify the data_files in your setup.py to include both:

    data_files=[
        (
            'shared/typehints/python{}.{}/foo/bar'.format(*sys.version_info[:2]),
            ["foo/bar/__init__.pyi"]
        ),
        (
            'shared/typehints/python{}.{}/foo'.format(*sys.version_info[:2]),
            ["foo/__init__.pyi"]
        ),
    ],
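Worth noting: the shared/typehints location sketched in PEP 484 never gained wide tool support; what type checkers eventually standardized on is PEP 561 stub-only distributions. A minimal sketch (hypothetical foobar-stubs project; the stub directory is the runtime package name plus a -stubs suffix):

```python
# setup.py for a PEP 561 stub-only distribution, with
# foo-stubs/__init__.pyi and foo-stubs/bar/__init__.pyi next to it
from setuptools import setup

setup(
    name='foobar-stubs',
    version='0.1',
    packages=['foo-stubs', 'foo-stubs.bar'],
    package_data={'foo-stubs': ['*.pyi'], 'foo-stubs.bar': ['*.pyi']},
)
```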
For a directory structure like the following, I haven't been able to make xy an importable package.
xy
├── __init__.py
├── z
│   ├── __init__.py
│   └── stuff.py
└── setup.py
If the setup.py were a directory up, I could use
    from setuptools import setup

    setup(name='xy',
          packages=['xy'])
but short of that, no combination of package_dir and packages has let me import xy; only import z works. Unfortunately, moving setup.py up a directory isn't really an option due to an excessive number of hard-coded paths.
See the following answer for ideas how to use package_dir and packages to help with such projects: https://stackoverflow.com/a/58429242/11138259
In short for this case here:
    #!/usr/bin/env python3
    import setuptools

    setuptools.setup(
        # ...
        packages=['xy', 'xy.z'],
        # packages=setuptools.find_packages('..'),  # only if the parent directory is otherwise empty
        package_dir={
            'xy': '.',
        },
    )
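If it helps, the intended result of that mapping can be sketched as a usage fragment (hypothetical session, assuming stuff.py defines something importable):

```python
# after `pip install .` from inside the xy directory:
import xy               # the '.' directory is installed as the "xy" package
from xy.z import stuff  # subpackages resolve through the same mapping
```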