Building a Python package where the source tree looks like this is pretty clear:
src
├── module
│   └── <stuff>
└── setup.py
Is it possible to build a package where the module source doesn't reside in the same location as setup.py? For a more specific use case: the code for the module is either partially or fully autogenerated in a location other than src.
E.g.
src
└── setup.py
generated
└── module
    └── <module code>
You can control the directory where packages reside by using the package_dir argument to setup(...).
While that does appear to build a proper source distribution when package_dir is a relative path starting with .., it appears that pip will refuse to install it. I'd suggest instead nesting your generated code inside the src directory and then using package_dir to select it.
Here's an example which moves all modules inside a generated subdir:
from setuptools import setup, find_packages

setup(
    name='mypkg',
    package_dir={'': 'generated'},
    packages=find_packages('generated'),
)
Using a setup like:
$ tree .
.
├── generated
│   ├── mod1
│   │   └── __init__.py
│   └── mod2
│       └── __init__.py
└── setup.py
This would make the following succeed after install: import mod1; import mod2
If you wanted to make those modules available under a different prefix, you would do:
from setuptools import setup, find_packages

setup(
    name='mypkg',
    package_dir={'hello': 'generated'},
    packages=[f'hello.{mod}' for mod in find_packages('generated')],
)
This would make import hello.mod1; import hello.mod2 succeed after installation.
You can use relative paths in package lookup configuration. Examples:
all distributed sources are in generated
from setuptools import setup, find_packages

setup(
    ...
    package_dir={'': '../generated'},
    packages=find_packages(where='../generated'),
)
selected packages should be included from generated
In this example, only packages spam and eggs from generated will be included:
import pathlib  # used by the dynamic lookup variant below
from setuptools import setup, find_packages

setup(
    name='so',
    package_dir={'spam': '../generated/spam', 'eggs': '../generated/eggs'},
    packages=find_packages(where='../generated', include=['spam', 'eggs']),  # or just ['spam', 'eggs']
)
Or implement a dynamic lookup, e.g.:
package_dir={p.name: str(p.resolve()) for p in pathlib.Path('..', 'generated').iterdir() if p.is_dir()}
Better implementation: resolving all paths relative to the setup.py file
Resolving all paths relative to the setup.py script allows you to run the script from a directory other than src, e.g. python src/setup.py bdist_wheel etc. You may or may not need this, depending on your use case. Nevertheless, the recipe is as usual: resolve all paths relative to __file__, e.g.
import pathlib
from setuptools import setup, find_packages

src_base = pathlib.Path(__file__, '..', '..', 'generated').resolve()

setup(
    ...
    package_dir={'': str(src_base)},
    packages=find_packages(where=str(src_base)),
)
Related
My project structure:
/myproject/        <- I would like to skip that folder
    /mypackage/
        /subpackage/
            mymodule.py
    run.py
- setup.py
Within run.py I'd like to import from mymodule.py like this:
from mypackage.subpackage.mymodule import something
instead of:
from myproject.mypackage.subpackage.mymodule import something
Is there a way in setup() to define the entrypoint as being mypackage and skip myproject?
You can use what's called a src-layout (so named because src/ is the name more typically used for the top-level directory holding the packages). See https://setuptools.readthedocs.io/en/latest/setuptools.html#using-a-src-layout
If using setup.cfg you can write this like:
[options]
package_dir=
    =src
packages=find:

[options.packages.find]
where=src
or equivalently, using an old-style setup.py:
from setuptools import setup, find_packages

setup(
    ...
    package_dir={'': 'src'},
    packages=find_packages(where='src'),
    ...
)
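Adapted to the layout in the question, a minimal sketch (assuming setup.py sits next to the myproject/ directory, with myproject/ playing the role of src/ and the packages containing __init__.py files):
from setuptools import setup, find_packages

setup(
    name='mypackage',  # hypothetical distribution name
    package_dir={'': 'myproject'},
    packages=find_packages(where='myproject'),
)
After installing, from mypackage.subpackage.mymodule import something should work without the myproject prefix.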
I have a python package called mltester which contains two sub-packages (actions, dialogs) and a main script ml_tester.py, structured as follows:
+ <ProjectFolder>
  +---+ <mltester>
  |    +---- <actions>
  |    +---- <dialogs>
  |    +---- ml_tester.py
  |    +---- __init__.py
  +---- setup.py
My __init__.py looks as follows:
import actions
import dialogs
import ml_tester
In ml_tester.py I do something like:
from actions import *
from dialogs import *
All works fine when running from Eclipse. When doing pip install, the following setup.py also works fine:
from setuptools import setup

setup(
    name="MLTester",
    version="1.0",
    packages=["mltester",
              "mltester.actions",
              "mltester.dialogs"],
    install_requires=[
        "matplotlib",
    ],
    entry_points='''
        [console_scripts]
        ml_tester_gui=mltester.ml_tester:main
    '''
)
But when I remove "mltester.actions", "mltester.dialogs" from the list of packages, I now get an error like:
File "/usr/local/lib/python2.7/dist-packages/mltester/__init__.py", line 1, in <module>
import actions
ImportError: No module named actions
And I don't understand why listing just the containing mltester package is not enough. Of Course I can simply add the packages back, but now I think that I'm missing something more conceptual here.
That's because the packages argument does not do any recursive lookup in the subtree. Adding a package to packages will only include the package itself and all of its direct submodules, but none of its subpackages.
For example, if you have a source tree with package spam containing a module eggs and subpackage bacon:
src
└── spam
    ├── __init__.py
    ├── eggs.py
    └── bacon
        └── __init__.py
Specifying packages=['spam'] will include only spam and spam.eggs, but not spam.bacon, so spam.bacon will not be installed. You have to add it separately to include the complete source codebase: packages=['spam', 'spam.bacon'].
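In setup.py terms that means listing both explicitly; a minimal sketch for the layout above (assuming setup.py sits next to the src directory):
from setuptools import setup

setup(
    name='spam',  # hypothetical distribution name
    package_dir={'': 'src'},
    packages=['spam', 'spam.bacon'],  # 'spam' alone would leave out spam.bacon
)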
To automate the building of the packages list, setuptools offers a handy function find_packages:
from setuptools import find_packages, setup

setup(
    packages=find_packages(),
    ...
)
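Note that with the src layout shown above, find_packages would need to be pointed at the src directory, roughly like this (a sketch):
from setuptools import setup, find_packages

setup(
    name='spam',  # hypothetical distribution name
    package_dir={'': 'src'},
    packages=find_packages(where='src'),  # finds both spam and spam.bacon
)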
Let's say I have a very simple package with the following structure:
.
├── foo
│   ├── bar
│   │   └── __init__.py
│   └── __init__.py
└── setup.py
Content of the files:
setup.py:
from distutils.core import setup

setup(
    name='foobar',
    version='',
    packages=['foo', 'foo.bar'],
    url='',
    license='Apache License 2.0',
    author='foobar',
    author_email='',
    description=''
)
foo/bar/__init__.py:
def foobar(x):
return x
The remaining files are empty.
I install the package using pip:
cd foobar
pip install .
and can confirm it is installed correctly.
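For example, a quick sanity check along these lines, run from outside the source tree (hypothetical):
from foo.bar import foobar

assert foobar(42) == 42  # foobar just returns its argument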
Now I want to create a separate package with stub files:
.
├── foo
│   ├── bar
│   │   └── __init__.pyi
│   └── __init__.pyi
└── setup.py
Content of the files:
setup.py:
from distutils.core import setup
import sys
import pathlib

setup(
    name='foobar_annot',
    version='',
    packages=['foo', 'foo.bar'],
    url='',
    license='Apache License 2.0',
    author='foobar',
    author_email='',
    description='',
    data_files=[
        (
            'shared/typehints/python{}.{}/foo/bar'.format(*sys.version_info[:2]),
            ["foo/bar/__init__.pyi"]
        ),
    ],
)
foo/bar/__init__.pyi:
def foobar(x: int) -> int: ...
I can install this package and see that it creates anaconda3/shared/typehints/python3.5/foo/bar/__init__.pyi in my Anaconda root, but it doesn't look like it is recognized by PyCharm (I get no warnings). When I place the pyi file in the main package, everything works OK.
I would be grateful for any hints on how to make this work:
I've been trying to make some sense of PEP 484 - Storing and distributing stub files, but to no avail. Even the pathlib part seems to offend my version of distutils.
PY-18597 and https://github.com/python/mypy/issues/1190#issuecomment-188526651 seem to be related, but somehow I cannot connect the dots.
I tried putting stubs in the .PyCharmX.X/config/python-skeletons directory, but it didn't help.
Some things that work, but don't resolve the problem:
Putting stub files in the current project and marking them as sources.
Adding stub package root to the interpreter path (at least in some simple cases).
So the questions: how to create a minimal, distributable package with Python stubs which will be recognized by existing tools? Based on the experiments, I suspect one of three problems:
I misunderstood the structure which should be created by the package in shared/typehints/pythonX.Y - if this is true, how should I define data_files?
PyCharm doesn't consider these files at all (this seems to be contradicted by some comments in the linked issue).
It is supposed to work just fine, but I made some configuration mistake and am looking for an external problem which doesn't exist.
Are there any established procedures to troubleshoot problems like this?
The problem is that you didn't include the foo/__init__.pyi file in your stub distribution. Even though it's empty, it makes foo a stub package and enables the search for foo.bar.
You can modify the data_files in your setup.py to include both:
data_files=[
    (
        'shared/typehints/python{}.{}/foo/bar'.format(*sys.version_info[:2]),
        ["foo/bar/__init__.pyi"]
    ),
    (
        'shared/typehints/python{}.{}/foo'.format(*sys.version_info[:2]),
        ["foo/__init__.pyi"]
    ),
],
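After reinstalling, you should then find both shared/typehints/pythonX.Y/foo/__init__.pyi and shared/typehints/pythonX.Y/foo/bar/__init__.pyi under your environment prefix, which is what lets the tooling resolve foo.bar from the stub tree.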
I have a project named myproj structured like
/myproj
    __init__.py
    module1.py
    module2.py
    setup.py
My setup.py looks like this:
from distutils.core import setup

setup(name='myproj',
      version='0.1',
      description='Does projecty stuff',
      author='Me',
      author_email='me@domain.com',
      packages=[''])
But this places module1.py and module2.py in the install directory.
How do I specify setup such that the directory /myproj and all of its contents are dropped into the install directory?
In the myproj root directory for this project, you want to move module1.py and module2.py into a directory named myproj under it, and, if you wish to maintain Python < 3.3 compatibility, add an __init__.py in there.
├── myproj
│   ├── __init__.py
│   ├── module1.py
│   └── module2.py
└── setup.py
You may also consider using setuptools instead of just distutils. setuptools provides a lot more helper methods and additional attributes that make setting up this file a lot easier. This is the bare minimum setup.py I would construct for the above project:
from setuptools import setup, find_packages

setup(name='myproj',
      version='0.1',
      description="My project",
      author='me',
      author_email='me@example.com',
      packages=find_packages(),
)
Running the installation you should see lines like this:
copying build/lib.linux-x86_64-2.7/myproj/__init__.py -> build/bdist.linux-x86_64/egg/myproj
copying build/lib.linux-x86_64-2.7/myproj/module1.py -> build/bdist.linux-x86_64/egg/myproj
copying build/lib.linux-x86_64-2.7/myproj/module2.py -> build/bdist.linux-x86_64/egg/myproj
This signifies that the setup script has picked up the required source files. Run the Python interpreter (preferably from outside this project directory) to ensure that those modules can be imported (and not merely because of a relative import from the current directory).
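For example, a quick check along these lines from some unrelated directory (hypothetical):
# Run from outside the project directory, after installing the package:
import myproj.module1
import myproj.module2

print(myproj.module1.__file__)  # should point into site-packages, not the source tree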
On the other hand, if you wish to provide those modules at the root level, you definitely need to declare py_modules explicitly.
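For reference, a minimal sketch of that alternative (assuming module1.py and module2.py stay next to setup.py rather than inside a myproj directory):
from setuptools import setup

setup(name='myproj',
      version='0.1',
      # top-level modules installed directly into site-packages
      py_modules=['module1', 'module2'])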
Finally, the Python Packaging User Guide is a good resource for more specific questions anyone may have about building distributable python packages.
For a directory structure like the following, I haven't been able to make xy an importable package.
xy
├── __init__.py
├── z
│   ├── __init__.py
│   └── stuff.py
└── setup.py
If the setup.py were a directory up, I could use
from setuptools import setup

setup(name='xy',
      packages=['xy'])
but short of that, no combination of package_dir and packages has let me import xy, only import z. Unfortunately, moving the setup.py a directory up isn't really an option due to an excessive number of hard-coded paths.
See the following answer for ideas on how to use package_dir and packages to help with such projects: https://stackoverflow.com/a/58429242/11138259
In short for this case here:
#!/usr/bin/env python3
import setuptools

setuptools.setup(
    # ...
    packages=['xy', 'xy.z'],
    # packages=setuptools.find_packages('..'),  # only if the parent directory is otherwise empty
    package_dir={
        'xy': '.',
    },
)
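With that mapping, both import xy and import xy.z should succeed after installation, since the directory containing setup.py is itself installed as the xy package.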