I want to create multiple sub-packages and distribute them individually. The obvious solution is to create a separate setup.py for each sub-package and a script that front-ends the creation and distribution of those packages. However, I am not sure if this is a good idea or the right way to do it. Any recommendations? I looked into namespace packages, but that's not exactly what I need; my goal is to ship slim packages with minimal dependencies, which is why I am breaking my project into multiple sub-packages that can be distributed separately.
Second question: currently I have the following structure:
a/b/c/
    __init__.py
    setup.py
    1.py
    2.py
    subpackage/
        __init__.py
        3.py
        setup.py
Now when I create an sdist using:
python a/b/c/subpackage/setup.py sdist bdist_wheel
it also adds all the files from the top-level directory, e.g. 1.py and 2.py, but I would like to distribute only the files from the subpackage directory. How can I force it to add only the files from the subpackage directory?
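One way to keep the parent files out (a minimal sketch, assuming the inner setup.py is run from inside a/b/c/subpackage/ and that the distribution name subpackage is just a placeholder) is to list the package explicitly instead of relying on automatic discovery:

# subpackage/setup.py -- a sketch; name and version are placeholders.
# Listing the package explicitly keeps 1.py and 2.py from the parent
# directory out of the sdist/wheel.
from setuptools import setup

setup(
    name='subpackage',
    version='0.1',
    packages=['subpackage'],
    # setup.py lives inside the subpackage itself, so map the package
    # name to the current directory.
    package_dir={'subpackage': '.'},
)

Running python setup.py sdist bdist_wheel from inside a/b/c/subpackage/ (rather than passing the path from the top level) also matters, because setuptools resolves package paths relative to the current working directory, not relative to the location of setup.py.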
I've reviewed the information here: How can I make setuptools (or distribute) install a package from the local file system
It looks like that question was posted a very long time ago, and I'm hoping that in the seven years since then there have been some new developments in managing dependencies in the Python world.
Specifically, in our case we are working on a repository of related GCP packages:
src/
    airflow/
        __init__.py
        dags/
            __init__.py
            requirements.txt
            dag1.py
        libs/
            __init__.py
            utils.py
        tests/
            dags/
                test_dag1.py
        plugins/
    dataflow/
        __init__.py
        setup.py
        dataflow1.py
        libs/
            __init__.py
            utils.py
    cli_helper/
        __init__.py
        cli_command.py
        libs/
            __init__.py
            util.py
    shared_utils/
        util1.py
I have found myself repeating the same bits of helper functions within each package and would like to put those helper functions in one place, then have a linked copy of the shared_utils files either in a shared_utils folder under each package, or even just a copy of util1.py placed under the existing libs directory of each package.
What is the most "pythonic" way to accomplish this?
Right now it seems that my only options would be to:
1. Use requirements.txt as listed above where I can, and use a custom command in setup.py where requirements.txt can't be used.
2. Create an OS-level link from the shared_utils directory into each package, so that the directory appears to exist natively in each of my packages.
3. Package shared_utils and install it directly from git. Though this option again requires requirements.txt, and in some of my deployment environments I can't rely on requirements.txt; I have to run everything through setup.py.
If you can't use requirements.txt, maybe you can use setuptools' install_requires and point it at the shared_utils package root. Would that solve your issue?
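A rough sketch of what that could look like for one of the packages, assuming shared_utils is given its own setup.py so it becomes installable on its own, and assuming a pip/setuptools recent enough to understand PEP 508 direct references (the paths and URLs are placeholders):

# dataflow/setup.py -- a sketch; shared_utils is assumed to be packaged
# separately with its own setup.py.
from setuptools import setup, find_packages

setup(
    name='dataflow',
    version='0.1',
    packages=find_packages(),
    install_requires=[
        # Direct reference to the locally packaged shared code; pip
        # resolves this without a requirements.txt.
        'shared_utils @ file:///absolute/path/to/src/shared_utils',
        # or, pulled from version control:
        # 'shared_utils @ git+https://example.com/your-org/shared_utils.git',
    ],
)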
I want to create a Python source distribution by running python setup.py sdist from a directory outside the one I want to package up, and I can't seem to find a way to do this. I have a script that generates a setup.py and MANIFEST.in dynamically, and I'd like to tell Python to use those files to create an sdist of the source in a different directory "over there".
What I'm doing is creating a script that lets a user create an sdist w/o any setup.py etc. They just say "package up this directory and everything under it". So I generate a setup.py and MANIFEST.in (with recursive-include * to grab all files) in a python tempfile.mkdtemp directory (in an unrelated file path like /tmp/whatever) that I can clean up afterwards...but I can't seem to use those to package their directory. I don't want to create those files in their source dir.
You can use setuptools' --dist-dir=DIR / -d DIR option to specify where the otherwise-default dist/ folder is created. In other words, this changes the output directory.
E.g.:
python setup.py sdist -d /tmp/whatever
If you are using distutils.core: Instead of using from distutils.core import setup you can use from setuptools import setup.
In order to define where the source directories come from, I think you can add the directory to sys.path and then setup() will discover the content files automatically:
import sys
from os import path
# ...
# Add other folders to sys.path
sys.path.append('/tmp/whatever')
sys.path.append(path.join(path.abspath('..'), 'some', 'folder'))
Sort of a hack, but this worked for me: right before running setup(), use os.chdir() to change the working directory to the base path where setup.py would normally run. To specify where the distribution packages go, I use arguments to setup.py, specifically:
python setup.py sdist --formats=gztar -d 'directory_for_the_distribution' egg_info --egg-base 'directory_for_the_egg_info'
Thus you can run setuptools from a directory other than the base of the package directories, and the distribution and temporary egg-info directories go wherever you want.
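For completeness, a minimal sketch of that chdir approach (the path and names are placeholders for wherever the sources actually live):

import os
from setuptools import setup, find_packages

# Change into the package's base directory before setup() runs, so that
# package discovery and relative paths resolve against it rather than
# against wherever the command was invoked from.
os.chdir('/path/to/package_base')

setup(
    name='whatever',
    version='0.1',
    packages=find_packages(),
)

Combined with the -d / --egg-base options shown above, the sources are read from one place and all of the build output lands somewhere else.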
I have created a custom python package following this guide, so I have the following structure:
mypackage/          <-- VCS root
    mypackage/
        submodule1/
        submodule2/
    setup.py
And setup.py contains exactly the same information as in the guide:
from setuptools import setup, find_packages

setup(name='mypackage',
      version='0.1',
      description='desc',
      url='vcs_url',
      author='Hodossy, Szabolcs',
      author_email='myemail@example.com',
      license='MIT',
      packages=find_packages(),
      install_requires=[
          # deps
      ],
      zip_safe=False)
I have noticed that if I go into the folder where setup.py is and call python setup.py install in a virtual environment, the following structure is installed in site-packages:
.../site-packages/mypackage-0.1-py3.6.egg/mypackage/
    submodule1/
    submodule2/
but if I call it from one folder up like python mypackage/setup.py install, then the structure is the following:
.../site-packages/mypackage-0.1-py3.6.egg/mypackage/
    mypackage/
        submodule1/
        submodule2/
This latter one ruins all imports from my module, as the path to the submodules is different.
Could you explain what is happening here and how to prevent that kind of behaviour?
This is experienced with Python 3.6 on both Windows and Linux.
Your setup.py does not contain any paths; it only finds the files via find_packages(). So of course the result depends on where you run it from; the setup.py isn't strictly tied to its own location. You could do things like chdir to the dirname of the setup-file path in sys.argv[0], but that's rather ugly.
The question is, WHY do you want to build it that way? It looks more like you would want a structure like
mypackage-source/
    mypackage/
        submodule1/
        submodule2/
    setup.py
And then execute setup.py from the working directory. If you want to be able to run it from anywhere, the better workaround would be to put a shell script next to it, like:
#!/bin/sh
cd "$(dirname "$0")"
python setup.py "$@"
which separates the task of changing to the right directory (here I assume setup.py sits in the same directory as the script) from the task of running setup.py.
I have a repository I inherited that is used by a lot of teams; lots of scripts call it, and it seems like it's going to be a real headache to make any structural changes to it. I would like to make this repo installable somehow. It is structured like this:
my_repo/
    scripts.py
If it were my repository, I would change the structure like so to make it installable, and run python setup.py install:
my_repo/
    setup.py
    my_repo/
        __init__.py
        scripts.py
If this is not feasible (and it sounds like it might not be), can I somehow do something like:
my_repo/
    setup.py
    __init__.py
    scripts.py
And add something to setup.py to let it know that the repo is structured funny like this, so that I can install it?
You can do what you suggest.
my_repo/
    setup.py
    __init__.py
    scripts.py
The only thing is that you will need to import modules in your package by their bare names if they are at the base level. So, for example, if your structure looked like this:
my_repo/
    setup.py
    __init__.py
    scripts.py
    lib.py
    pkg/
        __init__.py
        pkgmodule.py
Then your imports in scripts.py might look like
from lib import func1, func2
from pkg.pkgmodule import stuff1, stuff2
So in your base directory, imports are essentially by module name, not by package. This could clash with other packages' namespaces if you're not careful, for example if another dependency provides a module named lib. So it would be best to run these scripts in a virtualenv, and to test to make sure the namespacing doesn't get messed up.
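For illustration, a setup.py for that flat layout might look roughly like this (a sketch only; the module and package names are taken from the structure above, and the metadata is made up):

# setup.py at the top of my_repo/ -- a sketch for the flat layout above.
from setuptools import setup

setup(
    name='my_repo',
    version='0.1',
    # Top-level modules are installed by their bare names...
    py_modules=['scripts', 'lib'],
    # ...while pkg/ remains a regular package with its own namespace.
    packages=['pkg'],
)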
There is a directive in the setup.py file to set the name of the package to install and where it should get its modules from. That lets you keep the desired directory structure. For instance, with a directory structure like:
my_repo/
    setup.py
    __init__.py
    scripts.py
You could write a setup.py such as:
setup(
    # -- Package structure ----
    packages=['my_repo'],
    package_dir={'my_repo': '.'})
Thus anyone installing the contents of my_repo with the command "./setup.py install" or "pip install ." would end up with an installed copy of my_repo's modules.
As a side note: relative imports work differently in Python 2 and Python 3. In the latter, any relative import has to be written explicitly as one. This method of installing my_repo will work in Python 3 when you use absolute imports:
from my_repo import scripts
I've taken to putting module code directly in a package's __init__.py, even for simple packages where this ends up being the only file.
So I have a bunch of packages that look like this (though they're not all called pants:)
+ pants/
  \-- __init__.py
  \-- setup.py
  \-- README.txt
  \--+ test/
     \-- __init__.py
I started doing this because it allows me to put the code in a separate (and, critically, separately versionable) directory, and have it work in the same way as it would if the package were located in a single module.py. I keep these in my dev python lib directory, which I have added into $PYTHONPATH when working on such things. Each package is a separate git repo.
Edit: compared to the typical Python package layout, as exemplified in Radomir's answer, this setup saves me from having to add each package's directory to my PYTHONPATH.
This has worked out pretty well, but I've hit upon this (somewhat obscure) issue:
When running tests from within the package directory, the package itself, i.e. the code in __init__.py, is not guaranteed to be on sys.path. This is not a problem in my typical environment, but if someone downloads the source distribution pants-4.6.tgz, extracts the tarball, cds into the directory, and runs python setup.py test, the package pants itself won't normally be on their sys.path.
I find this strange, because I would expect setuptools to run the tests from a parent directory of the package under test. However, for whatever reason, it doesn't do that; I guess because normally you wouldn't package things this way.
Relative imports don't work because test is a top-level package, having been found as a subdirectory of the current-directory component of sys.path.
I'd like to avoid having to move the code into a separate file and importing its public names into __init__.py. Mostly because that seems like pointless clutter for a simple module.
I could explicitly add the parent directory to sys.path from within setup.py, but would prefer not to. For one thing, this could, at least in theory, fail, e.g. if somebody decides to run the test from the root of their filesystem (presumably a Windows drive). But mostly it just feels jerry-rigged.
Is there a better way?
Is it considered particularly bad form to put code in __init__.py?
I think the standard way to package Python programs would be more like this:
\-- setup.py
\-- README.txt
\--+ pants/
    \-- __init__.py
    \-- __main__.py
    ...
\--+ tests/
    \-- __init__.py
    ...
\--+ some_dependency_you_need/
    ...
Then you avoid the problem.
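A setup.py to go with that layout might look roughly like this (a sketch only; the metadata is made up, and the exclude pattern assumes you don't want the tests installed):

# setup.py at the project root, next to pants/ and tests/.
from setuptools import setup, find_packages

setup(
    name='pants',
    version='4.6',
    # Ship the pants package but keep the tests out of the install.
    packages=find_packages(exclude=['tests', 'tests.*']),
)

Because setup.py sits next to the pants/ directory instead of inside it, the package is importable from the project root when the tests run, which sidesteps the sys.path issue described in the question.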