I have a requirements.txt file that contains both normal package names (to install from pypi) and paths to local tar.gz packages within the repo, e.g.
flask
pandas
local_dir/local_pkg.tar.gz
The problem is, there are two different deployment pipelines used for this repo, which both need to work.
The first will only run pip install -r requirements.txt (I cannot modify this command or add any additional options), but it always runs it from the base repo path. So currently, this runs successfully without issue.
The second is the problem. It runs from a different location entirely and installs the packages via pip install -r /path/to/repo/requirements.txt. The trouble is, pip install doesn't automatically look in /path/to/repo/ for the listed local package path (local_dir/local_pkg.tar.gz); it instead resolves that path relative to the directory where the command is being run. It obviously can't find the local package there, and so throws an error.
With this second deployment pipeline, I can add additional options to pip install. However, I've tried out some of the listed options and cannot find anything that resolves my issue.
tl;dr:
How can I modify the pip install -r /path/to/repo/requirements.txt command, so that it looks for local packages as if it's running from /path/to/repo/ (regardless of where the command is actually being run from)?
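(For reference, a possible workaround, assuming the second pipeline lets you wrap the pip call in a shell rather than only append options to it: run the install from a subshell that first changes into the repo, so relative paths resolve exactly as they do in the first pipeline.)
( cd /path/to/repo && pip install -r requirements.txt )
This leaves requirements.txt untouched, so the first pipeline keeps working unchanged.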
Related
Given a .py file in our local system, is there a way to find its dependencies?
By dependencies I mean the import statements that we specify.
I tried pip show [package name], but it does not give the dependencies of a .py file in our local system.
Use pip freeze to get all the dependencies installed in your environment. (This will list every installed package, even ones you have not actually used in the project.)
pip freeze > requirements.txt
If you want to list only the packages actually used in the project, use pipreqs:
pip install pipreqs
then,
pipreqs path/to/project
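If all you need are the import statements of a single local .py file, here is a minimal sketch using only the standard-library ast module (the file name myscript.py is a placeholder):
import ast

def top_level_imports(path):
    # Parse the source file and collect the top-level module names it imports.
    with open(path) as f:
        tree = ast.parse(f.read())
    names = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            names.update(alias.name.split('.')[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module.split('.')[0])
    return sorted(names)

print(top_level_imports('myscript.py'))
Note that this lists imported module names, which are not always the same as the PyPI package names (e.g. import sklearn comes from the scikit-learn distribution); pipreqs takes care of that mapping for you.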
What I have:
local Python 3 files that I want to turn into a module test_module
a test_module folder containing an empty __init__.py, a setup.py file (see below) and subdirectories with several source files
What I want:
1. continuously work on and improve test_module locally
2. have an easy way to install test_module and all its dependencies locally in my own virtual environment (created using python3 -m venv my_environment)
3. run files that make use of the module via python myexample.py, without having to adapt my local PYTHONPATH variable each time I enter or exit my_environment
4. share my Python code with others via git, and allow them to install the code locally on their machines using the same procedure (as simple as possible)
5. learn best practices on how to create my own module
How I'm doing it at the moment:
pip freeze > requirements.txt and pip install -r requirements.txt for installing dependencies
adding export PYTHONPATH="${PYTHONPATH}:." to my_environment/bin/activate, to have my own module in the search path
(as found here: How do you set your pythonpath in an already-created virtualenv?)
I'd like to know if there are "cleaner" solutions based on setup.py, possibly involving something like pip install ./test_module or similar, that take care of points 2.-3. automagically.
My current setup.py file looks as follows
from setuptools import setup

setup(
    name='test_module',
    version='0.1',
    description='Some really good stuff, that I am still working on',
    author='Bud Spencer',
    author_email='bud.spencer@stackoverflow.com',
    packages=['test_module'],  # same as name
    install_requires=['numpy', 'scipy', 'sklearn', 'argparse'],  # external packages as dependencies
)
It sounds like you want to run pip install -e <path/url> from within your virtual env, which will install a package (with a setup.py file as you have) from either a local path or a Git repo. See https://pip.pypa.io/en/stable/reference/pip_install/#vcs-support for an explanation of the syntax of the latter.
Example:
pip install -e git+https://github.com/me/test_module/#egg=test-module
If you have already installed and want to pull the latest code from the repo, add an --upgrade switch to the above.
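For the purely local workflow described in the question, the same editable mechanism applies. A minimal sketch, assuming my_environment is activated and setup.py sits in the test_module folder as described:
pip install -e ./test_module
python myexample.py
After the editable install, test_module is importable from anywhere in the environment, so there is no need to touch PYTHONPATH in my_environment/bin/activate, and changes to the source are picked up without reinstalling.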
I have a package that I am developing for a local server. I would like to have the current stable release importable in a Jupyter notebook using import my_package and the current development state importable (for end-to-end testing and stuff) with import my_package_dev, or something like that.
The package is version controlled with git. The master branch holds the stable release, and new development work is done in the develop branch.
I currently pulled these two branches into two different folders:
my_package/          # tracks master branch of repository
    setup.py
    requirements.txt
    my_package/
        __init__.py
        # other stuff

my_package_dev/      # tracks develop branch of repository
    setup.py
    requirements.txt
    my_package/
        __init__.py
        # other stuff for dev branch
My setup.py file looks like this:
from setuptools import setup

setup(
    name='my_package',  # or 'my_package_dev' for the dev version
    # metadata stuff...
)
I can pip install my_package just fine, but I have been unable to get anything to link to the name my_package_dev in Python.
Things I have tried
pip install my_package_dev
Doesn't seem to overwrite the existing my_package, but doesn't seem to make my_package_dev available either, even though pip says it finishes OK.
pip install -e my_package_dev
makes an egg and puts the development package path in easy-install.pth, but I cannot import my_package_dev, and my_package is still the old content.
Adding a file my_package_dev.pth to site-packages directory and filling it with /path/to/my_package_dev
causes no visible change. Still does not allow me to import my_package_dev.
Thoughts on a solution
It looks like the best approach is going to be to use virtual environments, as discussed in the answers.
With pip install, you install a package under the name given in the name attribute of its setup.py. If you have installed both and execute pip freeze, you will see both packages listed. Which code is actually importable depends on how the packages appear on the Python path.
The issue is that both of those packages contain just a Python module named my_package; that is why you cannot import my_package_dev (it does not exist).
I would suggest keeping a working copy for each version (without modifying the package name) and using virtualenv to keep the environments isolated (one virtualenv for the stable version and another for dev).
You could also use pip's editable install to keep each environment updated with its working copy.
Note: renaming my_package_dev's my_package module directory to my_package_dev will also work, but it will make it harder to merge changes from one version to the other.
The answer provided by Gonzalo got me on the right track: use virtual environments to manage two different builds. I created the virtual environment for the master (stable) branch with:
$ cd my_package
$ virtualenv venv # make the virtual environment
$ source venv/bin/activate
(venv) $ pip install -r requirements.txt # install everything listed as a requirement
(venv) $ pip install -e . # install my_package dynamically so that any changes are visible right away
(venv) $ sudo venv/bin/python -m ipykernel install --name 'master' --display-name 'Python 3 (default)'
And for the develop branch, I followed the same procedure in my my_package_dev folder, giving it a different --name and --display-name value.
Note that I needed to use sudo for the final ipykernel install command because I kept getting permission denied errors on my system. I would recommend trying without sudo first, but for this system it needed to be installed system-wide.
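(If you want to check what got registered, jupyter kernelspec list prints the installed kernel names and the paths they resolve to.)
jupyter kernelspec list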
Finally, to switch between which version of the tools I am using, I just have to select Kernel -> Change kernel and choose Python 3 (default) or Python 3 (develop). The import stays the same (import my_package), so nothing in the notebook has to change.
This isn't quite my ideal scenario since it means that I will then have to re-run the whole notebook any time I change kernels, but it works!
I have a requirements.txt file with several dependencies listed.
Whenever I try pip install -r requirements.txt on a brand new system, this will usually fail when some dependency is not met (see for example: here and here). This happens especially with the matplotlib package.
After downloading the entire package (in the case of matplotlib, about 50 MB) and failing to install it, I go and fix the issue and then attempt to install the package again.
pip does not seem to be smart enough to realize that it just downloaded that package and to re-use the same file (perhaps because it keeps no copy by default?), so the package is downloaded in its entirety again.
To get around this issue I can follow the instructions given here and use:
pip install --download=/path/to/packages -r requirements.txt
to first download all packages and then:
pip install --no-index --find-links=/path/to/packages -r requirements.txt
to install all the packages using the locally stored files.
My question: is there a smart command that includes both these directives? I'm after a single line I can run repeatedly so that it will use stored copies of the packages if they exist or download them if they don't, and in this last case, store those copies to some location so they can be re-used later on if needed.
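(For reference, a hedged sketch of a single reusable line, assuming a newer pip where pip download has replaced the deprecated --download option; pip download skips files that are already present in the destination directory, so repeated runs only fetch what is missing:)
pip download -d /path/to/packages -r requirements.txt && pip install --no-index --find-links=/path/to/packages -r requirements.txt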
To my eyes, the pip documentation says too little about the parameters that deal with sources and destinations.
I've experienced strange things installing Sphinx with pip3 and playing with the options that seemingly allow me to install it precisely where I want (for various reasons, I want to have each thing in its own directory). I say "playing" not because I did not read the doc or try --help, but because pip3 help install did not help, and the official pip install documentation page is too short on this topic and actually says little more than pip3 help install does.
Here are the experiments done and the observations.
First case with --root
I downloaded the current Sphinx repository tarball, unpacked it, went into the newly created directory and did:
pip3 install --root /home/<user-name>/apps/sphinx -e .
I thought this would be the same as --prefix, as there was no --prefix option visibly available. To my surprise, it installed the commands in the bin directory of Python 3 (which is also installed locally in its own directory), along with some things in its library directory, and, strangely, instead of a /home/<user-name>/apps/sphinx directory, I got /home/<user-name>/apps/sphinx/home/<user-name>/apps/sphinx/…: it appended the specified path to itself.
How does the last point in particular make sense? What's the purpose of --root?
Second case with --target
Then I thought that if it's not --root, it may be --target, so I did (after a clean-up):
pip3 install --target /home/<user-name>/apps/sphinx -e .
It did not work, complaining about an unrecognized --home option.
What is this --home (which I did not specify) that it complains about, and what exactly is --target?
Third case with --install-option='--prefix=…'
After some web searching and a thread on Stack Overflow, I tried this:
pip3 install --install-option='--prefix=/home/<user-name>/apps/sphinx' -e .
It just complained that it could not install a .pth file and that something was wrong with my PYTHONPATH, which was fixable by rerunning the same command with an additional variable definition:
export PYTHONPATH=/home/<user-name>/apps/sphinx/lib/python3.4/site-packages
pip3 install --install-option='--prefix=/home/<user-name>/apps/sphinx' -e .
I had to set PYTHONPATH even before the directory actually existed and anything was installed in it, but with that, this one was OK (whether pip should update PYTHONPATH itself during the process and remind you to set it permanently is a debatable question).
This option, which turned out to be the right one, was also the least visible one.
One last related question:
What's the difference between --editable and --src?
Update #1
I can't tell if it's Sphinx related, but I noticed two additional things.
Doing
pip3 install --install-option='--prefix=<install-dir>' -e <repository-dir>
where repository-dir is a local checkout of Sphinx, Sphinx gets installed in install-dir and is listed by pip3 list, but it can't be uninstalled.
On the contrary, doing
pip3 install --install-option='--prefix=<install-dir>' Sphinx
that is, letting pip3 retrieve an archive, Sphinx is not installed in install-dir but in the Python directory instead; it is listed by pip3 list and can be uninstalled.
So depending on whether the source is a local repository or a remote archive, it is not installed in the same location, and it is or is not uninstallable.
Dependencies were not affected; they were handled the same way in both cases (installed where expected, listed, and uninstallable).
Update #2
The behaviour of --root makes me think of a kind of fake root (like the one you get when building a Debian package or when cross-compiling). If it is indeed intended to work that way, then the path which surprised me is, on the contrary, expected.
First and obvious question: why don't you just install the package from PyPI?
sudo pip install sphinx
If you want to install anything that has a setup.py file with pip, you can use the --editable flag:
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools “develop mode”) from a local project path or a VCS url.
So you can just issue the command (prefix with sudo if necessary):
pip3 install -e /path/to/pkg
where /path/to/pkg is the directory where setup.py can be found (where you extracted the files).
To answer the other questions:
--root <dir> is used to change the root directory of the file system where pip should install package resources, not to change where to find the package.
--target is used to tell pip in which folder to install the package.
--install-option is used to set some variables that will be used by setup.py, not to change where pip should look for the file.
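To make the contrast concrete, a small illustrative sketch (paths are placeholders, and the exact site-packages layout depends on your Python build):
# --root stages files under the given directory while keeping the normal
# install layout below it (a fake root, as guessed in Update #2):
pip3 install --root /tmp/stage sphinx
#   -> /tmp/stage/usr/lib/python3.x/site-packages/sphinx/...
# --target drops the package's modules directly into the given folder:
pip3 install --target /home/<user-name>/apps/sphinx sphinx
#   -> /home/<user-name>/apps/sphinx/sphinx/...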