To avoid specifying dependencies in two places, I have a Python project whose setup.py parses a requirements.txt file to generate the list of install_requires packages. This works great until I try to upload a wheel to a devpi server and then install it - I get the error that requirements.txt is not found.
Is it possible to build a distribution that includes the requirements.txt file next to setup.py? I've tried package_data and data_files, but the resulting distribution still didn't contain the file.
Just add a MANIFEST.in in the project folder with the content:
include requirements.txt
That will include the file in the distribution. You can also use wildcards such as *.
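For completeness, the setup.py side of this pattern can be sketched like so; the parsing helper below is a minimal assumption of how such a setup.py might read the file (it skips blank lines and comments), not the asker's actual code:

```python
# Sketch of parsing requirements.txt for install_requires. Assumes
# requirements.txt sits next to setup.py and is shipped in the sdist
# via the MANIFEST.in line above.
import os

def parse_requirements(path):
    """Return the non-empty, non-comment lines of a requirements file."""
    with open(path) as f:
        return [line.strip() for line in f
                if line.strip() and not line.strip().startswith("#")]

# In setup.py you would then pass the result to setuptools, e.g.:
#
#   from setuptools import setup, find_packages
#   here = os.path.dirname(os.path.abspath(__file__))
#   setup(
#       name="myproject",  # hypothetical
#       packages=find_packages(),
#       install_requires=parse_requirements(
#           os.path.join(here, "requirements.txt")),
#   )
```

With the MANIFEST.in entry in place, the file travels inside the sdist, so the open() call no longer fails on install.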
I want to reuse some code for my internal team at work. My plan is to create a package and then have people install the package using pip straight out of our git repo. i.e. as shown here: https://pip.pypa.io/en/latest/reference/pip_install/#git
My question is, do I commit the dist folder to git? What is pip looking for?
Or is there a better way to share / reuse code internally for a team (across many different projects)?
I used a .gitignore file from here (is that GitHub's default Python .gitignore file?) and it ignores all the dist files:
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
but it seems wrong to exclude these from the repo when I'm trying to install from the repo.
You do not need to commit the dist folder. pip really just needs the repository to have a setup.py file along with the packages and/or modules you're installing.
dist is a default name for a directory that contains the final build result: your project ready to be distributed, that is, packaged into a file which pip or other package managers know how to install:
$ python setup.py sdist --help
...
--dist-dir (-d) directory to put the source distribution archive(s) in
[default: dist]
So it is safe to ignore the directory and all of its contents in .gitignore. If you do not plan to upload your project's installation files to PyPI and intend to install it by passing a Git URL, you don't even need the dist directory and can safely delete it. It will be recreated anyway once you run any dist command (sdist, bdist, bdist_wheel, bdist_rpm, etc.).
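A minimal setup.py at the repository root is all pip needs for a git install; this sketch uses a hypothetical package name and URL, not anything from the question:

```python
# Minimal setup.py sketch. With this at the repo root, teammates can
# install straight from git, e.g.:
#   pip install git+https://github.com/yourorg/teamlib.git   (URL hypothetical)
# No dist/ directory ever needs to be committed; pip builds from source.
from setuptools import setup, find_packages

setup(
    name="teamlib",          # hypothetical package name
    version="0.1.0",
    packages=find_packages(exclude=["tests", "tests.*"]),
)
```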
I uploaded my package on pypi using this guide.
But it seems that there is an error with this line in setup.py
long_description=open(os.path.join(os.path.dirname(__file__), 'README.md')).read()
which, when I try to install via pip, gives me:
IOError: No such file or directory
How can I fix this? Would a simple open('README.md') work?
Is the long_description line really needed when I already have this in my setup.cfg?
[metadata]
description-file = README.md
You need to add the line include README.md to MANIFEST.in; if you do not have that file, create it.
Just execute this in your repository root directory (where the setup.py file is located):
echo "include README.md" >> MANIFEST.in
Cheers.
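As an aside, setup.py can also be made tolerant of a missing README, so an sdist that happens to lack the file still installs; this defensive helper is a sketch of my own, not part of the original answer:

```python
# Defensive long_description reading: fall back to an empty string when
# README.md is absent (e.g. it was left out of the sdist).
import io
import os

def read_long_description(base_dir):
    path = os.path.join(base_dir, "README.md")
    if not os.path.exists(path):
        return ""
    with io.open(path, encoding="utf-8") as f:
        return f.read()

# In setup.py:
#   long_description=read_long_description(
#       os.path.dirname(os.path.abspath(__file__)))
```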
I have a Python package myapp which depends on a Python package theirapp.
theirapp is used by others and may update occasionally, but it is not hosted on PyPI.
I currently have my repository setup like this:
my-app/
myapp/
__init__.py
requirements.txt
their-app/
setup.py
theirapp/
__init__.py
My requirements.txt file contains the following line (among others):
./their-app/
their-app is not hosted on PyPI, but I want to make sure the latest version is installed. Up to this point I have been downloading a zip file containing my-app, running pip install -U -r requirements.txt, and using the application manually.
I would like to make an installable Python package. Ideally I would like to download a my-app.zip file and type pip install my-app.zip to install myapp, theirapp and any other dependencies.
Is this possible? If not, what is the best way to handle this scenario?
You may just need to bundle theirapp as part of your project and import it as myapp.contrib.theirapp. If both projects are versioned in git you can implement it as a submodule, but that may increase complexity for maintainers.
How pip handles a similar problem:
https://github.com/pypa/pip/tree/develop/pip/_vendor
You can see pip imports bundled vendor packages as pip._vendor.theirapp.
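The vendoring layout can be sketched as a runnable toy: build the package tree (empty __init__.py files), put its root on sys.path, and the bundled copy imports under your own namespace. The names are taken from the question; the temp-dir setup merely stands in for files you would actually commit:

```python
# Toy demonstration of the pip._vendor-style layout: myapp bundles
# theirapp as myapp.contrib.theirapp. In a real project these would be
# committed files, not a temp directory.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for pkg in ("myapp", "myapp/contrib", "myapp/contrib/theirapp"):
    os.makedirs(os.path.join(root, pkg))
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

sys.path.insert(0, root)
import myapp.contrib.theirapp

print(myapp.contrib.theirapp.__name__)  # -> myapp.contrib.theirapp
```

Since the bundled copy lives inside your own package, there is nothing extra for users to install, at the cost of updating the copy yourself when theirapp changes.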
I'm trying to grasp the git(hub) way of managing software. I have a repository:
https://github.com/pythonishvili/django-inguri
And I try to pip install it with this command
pip install git+git://github.com/pythonishvili/django-inguri.git
The response I get:
Downloading/unpacking git+git://github.com/pythonishvili/django-inguri.git
Cloning git://github.com/pythonishvili/django-inguri.git to /tmp/pip-bv5r89-build
Running setup.py egg_info for package from git+git://github.com/pythonishvili/django-inguri.git
Installing collected packages: inguri
Running setup.py install for inguri
Successfully installed inguri
Cleaning up...
But the installation clearly went wrong, because all I get in my virtualenv (/home/username/.virtualenvs/envname/lib/python2.7/site-packages/inguri) are two files:
__init__.py
__init__.pyc
What did I do wrong? How do I make this work?
I believe you need to add all the subdirectories of your project to the packages option of your setup.py file. Right now you have only the outermost directory, inguri. You would need to add inguri.ads, inguri.ads.migrations and so forth, since they also contain .py files that you want to include in your distribution.
You also need to add the following line to your MANIFEST.in file: recursive-include inguri *
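Rather than listing every subpackage by hand, setuptools can discover them with find_packages. This runnable sketch uses the directory names from the question, built in a temp dir purely for illustration:

```python
# find_packages() picks up every directory that has an __init__.py,
# so inguri.ads, inguri.ads.migrations etc. are not forgotten.
import os
import tempfile

from setuptools import find_packages

root = tempfile.mkdtemp()
for pkg in ("inguri", "inguri/ads", "inguri/ads/migrations"):
    os.makedirs(os.path.join(root, pkg))
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

print(sorted(find_packages(root)))
# -> ['inguri', 'inguri.ads', 'inguri.ads.migrations']

# In setup.py you would simply write:
#   setup(name="inguri", version="...", packages=find_packages())
```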
Goals:
Make use of modern Python packaging toolsets to deploy/install proprietary packages into some virtualenv.
The installed packages should include compiled *.pyc (or *.pyo) files only, without source files.
There are a couple of packages, and a vendor name (here we choose dgmx for our studio) is used as the namespace package name. Therefore, the installed packages would be something like dgmx/alucard, dgmx/banshee, dgmx/carmilla, ...
The file hierarchy of the installed packages should match the one produced by python setup.py install --single-version-externally-managed or pip install. Refer to: How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
Question in short:
I'd like to deploy proprietary namespaced packages into a virtualenv as compiled *.pyc (or *.pyo) files only, where the file/directory hierarchy simply reflects the namespace, without polluting sys.path with lots of ooxx.egg paths.
Something I have tried:
python setup.py bdist_egg --exclude-source-files, then easy_install ooxx.egg.
- pollutes sys.path with an .egg path for each namespace package.
python setup.py install --single-version-externally-managed.
- not *.pyc only.
- install_requires is ignored.
- you need to manually put an ooxx.egg-info/installed-files.txt in place to make uninstall work correctly.
pip install . in the directory containing setup.py.
- not *.pyc only.
pysetup install . in the directory containing setup.py.
- not *.pyc only.
Update:
My current idea is to follow method 2.
python setup.py egg_info --egg-base . # get requires.txt
python setup.py install --single-version-externally-managed --record installed-files.txt # get installed-files.txt
manually install the other dependencies listed in requires.txt
manually delete the installed source files (*.py) listed in installed-files.txt
remove the source-file (*.py) entries from installed-files.txt and put the result into the deployed ooxx.egg-info/installed-files.txt
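The last two steps can be sketched as a small helper (hypothetical, not part of any tool) that deletes the recorded *.py files and rewrites the record so uninstall still matches what is actually on disk:

```python
# Strip source files recorded in an installed-files.txt-style record,
# keeping only the remaining (compiled) entries in the record.
import os

def strip_sources(record_path):
    """Delete *.py files listed in the record and rewrite the record
    with only the entries that are kept."""
    with open(record_path) as f:
        entries = [line.strip() for line in f if line.strip()]
    kept = []
    for path in entries:
        if path.endswith(".py"):
            if os.path.exists(path):
                os.remove(path)  # drop the source file
        else:
            kept.append(path)
    with open(record_path, "w") as f:
        f.write("\n".join(kept) + "\n")
    return kept
```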
References:
Migrating to pip+virtualenv from setuptools
installing only .pyc (python compiled) with setuptools
Can I deploy Python .pyc files only to Google App Engine?
How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
A trick that may help:
Compile your sources into .pyc files and zip them up into a single .zip file.
Write a small loader module whose only job is to add that .zip to sys.path.
When you import this module, the .zip is on the path. All that remains is a custom step in setup.py that copies the zip file to the proper place.
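The trick above can be sketched end-to-end; the module name is borrowed from the question's examples, and in a real deployment the compile-and-zip part would run as a setup.py step rather than at import time:

```python
# Compile a module to .pyc, ship only the .pyc inside a zip, and import
# it via zipimport by putting the zip on sys.path.
import os
import py_compile
import sys
import tempfile
import zipfile

work = tempfile.mkdtemp()

# Stand-in for a proprietary module.
src = os.path.join(work, "alucard.py")
with open(src, "w") as f:
    f.write("SECRET = 42\n")

# Compile; inside the zip the file must be named alucard.pyc at the
# top level (not under __pycache__) for zipimport to find it.
pyc = os.path.join(work, "alucard.pyc")
py_compile.compile(src, cfile=pyc)

bundle = os.path.join(work, "bundle.zip")
with zipfile.ZipFile(bundle, "w") as zf:
    zf.write(pyc, arcname="alucard.pyc")

sys.path.insert(0, bundle)  # what the small loader module would do
import alucard

print(alucard.SECRET)  # -> 42
```

Note the import succeeds even though the zip contains no .py source at all, which is exactly the pyc-only deployment the question asks for.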