Easy_install cache downloaded files - python

Is there a way to configure easy_install to avoid having to download the files again when an installation fails?

Update, 13 years later: easy_install was removed from setuptools in January 2021. The standard Python package manager is now pip, and it caches downloaded packages by default.
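With a reasonably recent pip (roughly 20.1 or newer) you can inspect that cache directly; this is just a sketch, and the path shown is illustrative output from a typical Linux setup:
$ pip cache dir            # location of pip's download/wheel cache
/home/user/.cache/pip
$ pip cache list lxml      # wheels pip has built or cached for a given package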
pip (http://pypi.python.org/pypi/pip/) is a drop-in replacement for the easy_install tool and can do that.
Just run easy_install pip, then set the environment variable PIP_DOWNLOAD_CACHE to the path where you want pip to store the downloaded files.
Note that the cache won't work with dependencies that are checked out from a source code repository (like svn/git/hg/bzr).
Then use pip install instead of easy_install.
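On an old pip that still honored PIP_DOWNLOAD_CACHE, the whole setup might have looked roughly like this (the cache path is arbitrary; recent pip versions ignore this variable and cache automatically instead):
$ easy_install pip
$ export PIP_DOWNLOAD_CACHE=$HOME/.pip_download_cache
$ mkdir -p "$PIP_DOWNLOAD_CACHE"
$ pip install lxml    # downloaded archives are kept in $PIP_DOWNLOAD_CACHE for reuse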

Here is my solution using pip. It also handles installation of binary packages and works on both Linux and Windows. As requested, it limits downloads from PyPI to a minimum, and as an extra bonus, on Linux it speeds up repeated installation of packages that normally require compilation to a fraction of a second.
Setup takes a few steps, but I think it is worth doing.
Create a pip config file
Create the pip configuration file (on Linux: ~/.pip/pip.conf, on Windows: %HOME%\pip\pip.ini).
Mine has this content:
[global]
download-cache = /home/javl/.pip/cache
find-links = /home/javl/.pip/packages
[install]
use-wheel = yes
[wheel]
wheel-dir = /home/javl/.pip/packages
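On current pip versions the download-cache and use-wheel options no longer exist (downloads are cached by default), so a rough modern equivalent keeps only the local package directory. This is a sketch assuming Linux and the newer ~/.config/pip location:
$ mkdir -p ~/.config/pip
$ cat > ~/.config/pip/pip.conf <<'EOF'
[global]
find-links = /home/javl/.pip/packages
EOF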
Populating the cache dir happens automatically
The cache dir receives a cached copy of everything downloaded from PyPI each time pip fetches a package from there. It is easy to get data in there (no special care needed), but note that from pip's point of view these are just cached downloads, not packages, so if you use the --no-index option, the cache alone will not work.
Use pip download to populate the packages dir
The packages dir is the place to put real package files. E.g. for my favorite package plac, I would do:
$ pip download --dest ~/.pip/packages plac
and the plac package file will appear in that dir. You can even use a -r requirements.txt file to do this for multiple packages at once.
These packages are used even with $ pip install --no-index <something>.
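For example, to pre-fetch everything from a requirements file into the packages dir and then install offline (a sketch; plac stands in for any package already present in the dir):
$ pip download --dest ~/.pip/packages -r requirements.txt
$ pip install --no-index --find-links ~/.pip/packages plac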
Prevent repeated compilation of the same package on Linux
E.g. the lxml package requires compilation, and downloading and compiling it may take from 45 seconds to several minutes. Using the wheel format, you can save a lot of time here.
Install the wheel tool, if you do not have it yet:
$ pip install wheel
Create the wheel for lxml (assuming you have managed to install lxml in the past; it requires some system libraries to be installed):
$ pip wheel lxml
This goes through download and compilation, but finally results in an lxml .whl file in the packages dir.
From then on,
$ pip install lxml
or even faster
$ pip install --no-index lxml
will take a fraction of a second, as it uses the wheel-formatted package.
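If you prefer not to rely on the config file, the same wheel workflow can be spelled out explicitly; a sketch using the paths from the config above:
$ pip wheel --wheel-dir ~/.pip/packages lxml                  # compile once, store the .whl
$ pip install --no-index --find-links ~/.pip/packages lxml    # later installs reuse the .whl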
Prepare a wheel package from a Windows setup .exe package
(Note: this can be done even on a Linux machine; there is no compilation, only some repacking from the .exe file into a .whl.)
Download the .exe form of the package from PyPI, e.g.:
$ wget https://pypi.python.org/packages/2.7/l/lxml/lxml-3.2.3.win32-py2.7.exe#md5=14ab978b7f0a3382719b65a1ca938d33
$ dir
lxml-3.2.3.win32-py2.7.exe
Convert it to a .whl:
$ wheel convert lxml-3.2.3.win32-py2.7.exe
$ dir
lxml-3.2.3.win32-py2.7.exe
lxml-3.2.3-cp27-none-win32.whl
Test it:
$ pip install lxml
or
$ pip install --no-index lxml
should be very quick.
Note that wheel convert can do exactly the same conversion for egg-formatted packages.
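For instance, converting an old egg works the same way (the egg filename here is hypothetical):
$ wheel convert plac-1.3.5-py2.7.egg            # produces a .whl next to the .egg
$ pip install --no-index --find-links . plac    # install from the freshly converted wheel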
Let easy_install and setup.py install reuse your packages dir
easy_install and $ python setup.py install do not seem to offer a download cache, but they can use the packages we already have in our packages dir.
To do so, edit the config file for these two tools:
On Linux: $HOME/.pydistutils.cfg
On Windows: %HOME%\pydistutils.cfg
In my case, /home/javl/.pydistutils.cfg contains:
[easy_install]
find_links = /home/javl/.pip/packages
This config can even help with some pip install calls: when pip installs a package that declares dependencies on other packages, it delegates that part to a setup.py call, and without the .pydistutils.cfg config that call would download the files from PyPI.
Unfortunately, the wheel format is not supported in this case (as far as I am aware).
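Creating that config from the shell and exercising it might look like this (a sketch; plac is just an example of something already present in the packages dir):
$ cat > ~/.pydistutils.cfg <<'EOF'
[easy_install]
find_links = /home/javl/.pip/packages
EOF
$ easy_install plac    # resolved from the local packages dir when available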

Related

How to modify package before installing it with pip

Trying to install a package on Windows fails, as it requires a site.cfg file to contain the path of a library.
It looks like pip extracts packages to c:\users\[username]\appdata\local\temp\pip-install-[random string]\ during installation and deletes them afterwards (whether the install succeeded or not), so I can't "hot-edit" them.
Can I make pip wait for me before installing?
Can I make pip download and unpack the package and afterwards tell it to install a package from a directory rather than a package name or URL?
Feel free to comment alternative solutions for installing scikits.audiolab on Windows (demands sndfile to be defined in site.cfg).
Does not work on wheels (see hoefling's comment).
Download the package from its URL (pip tells you the URL when you do pip install <package_name>).
Untar the file and make your modifications.
Then do pip install <path_of_package>
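Put together, the steps above might look like this for the scikits.audiolab case from the question (the version number and filenames are illustrative; --no-binary forces a source archive so there is something to edit):
$ pip download --no-deps --no-binary :all: scikits.audiolab
$ tar xzf scikits.audiolab-0.11.0.tar.gz
$ cd scikits.audiolab-0.11.0/
$ $EDITOR site.cfg      # point it at the sndfile library location
$ pip install .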

Can't uninstall project with no packages

While trying to build an MCVE for another question, I created an example directory with one file in it, a setup.py with the following contents:
from setuptools import setup
setup(
    name='example',
)
and installed it with
python3.6 setup.py sdist
python3.6 -m pip install --user dist/example-0.0.0.tar.gz
No actual packages or modules, but something got installed:
redacted:~/example> python3.6 -m pip list | grep example
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
example (0.0.0)
Now I can't uninstall it:
redacted:~/example> python3.6 -m pip uninstall example
Can't uninstall 'example'. No files were found to uninstall.
Other posts suggest there might be a .pth file I have to remove from my site-packages directory, but I don't see any:
redacted:~/example> find ~/.local/lib/python3.6/site-packages/ -name '*.pth'
redacted:~/example>
What did I just do to my system, and how can I undo it?
The steps shown in the question will actually create and install a real package. It won't create any importable files, but it will create metadata in a site-packages directory. Exactly where it is installed depends on your USER_SITE configuration, which you can check with python3.6 -m site, but it's probably going to be at ~/.local/lib/python3.6/site-packages/example-0.0.0-py3.6.egg-info.
Path files (.pth) are unrelated.
The reason it can't uninstall, saying:
Can't uninstall 'example'. No files were found to uninstall.
is because the build command executed earlier will have created an example.egg-info directory in the current working directory, and using python3.6 -m pip means the empty string is in sys.path, so the current directory is also considered a package location. Since the current working directory, at sys.path[0], comes before the user site, the example.egg-info metadata is found there instead of in site-packages.
The python3.6 -m pip uninstall command also finds this build artifact first, for the same reasons, and does not find the metadata in site-packages, which holds the record of the files that should be removed during an uninstall. To correctly uninstall this package you could:
rm -rf example.egg-info # first prevent pip from getting confused by the temporary build artifact in cwd
python3.6 -m pip uninstall example # uninstall it from the user site
Or, you could change directory before uninstalling, so that pip finds the package metadata for example in the user site instead of in the working directory.
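A sketch of that second option (the target directory is arbitrary):
$ cd /tmp                                   # anywhere outside the project directory
$ python3.6 -m pip uninstall -y example     # now the user-site metadata is found
$ python3.6 -m pip list | grep example      # should print nothing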
Note 1: These workarounds are not required for pip >= 20.1. Since April 2020, using python -m pip now ejects the cwd from sys.path and it will uninstall successfully from the user site in the first place without getting confused (#7731)
Note 2: some details are slightly different if this python3.6 environment has a wheel installation in it - in this case the install command will first create a wheel file from the sdist, and then install the wheel, which will result in an example-0.0.0.dist-info subdirectory for the metadata instead of an egg-info subdirectory, but the important details are the same whether you have an .egg-info or .dist-info style install in the user site. It is not possible to determine from the details in the question whether the python3.6 environment had a wheel installation available.
Since you didn't specify any files, there was nothing to be installed. So you can't uninstall anything either.

"yum install package" or "python setup.py install" in CentOS?

I was wondering how "yum install package" and "python setup.py install" differ on CentOS. I use yum install ... all the time. However, when I try to run python setup.py install, I always get "this setup.py file couldn't be found", even though its path shows up under echo $PATH, unless I run it from its own directory or use the absolute path.
When you type python setup.py install, your shell checks your $PATH for the python command and runs it. Python then examines its arguments, which are setup.py install. It knows it can be given the name of a script, so it looks for a file called setup.py to run. Python doesn't use your $PATH to find scripts, though, so you must give it a real path to a file. If you just give it the name setup.py, it will only look in your current directory.
The source directory for a python module should not, ideally, be in your $PATH.
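A small illustration of the difference (the paths and output here are hypothetical):
$ which python                      # the shell resolves the 'python' command via $PATH
/usr/bin/python
$ cd /home/me/src/somemodule        # setup.py lives in the source tree, not on $PATH
$ python setup.py install --user    # the script path is relative to the current directory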
yum install is a command that will go to a package repository, download all the files needed to install something, and then put them in the right place. yum (and equivalents on other distributions, like apt for Debian systems) will also fetch and install any other packages you need, including any that aren't python modules.
Python has a package manager, too. You may also find using pip install modulename or pip install --user modulename (if you don't have administrative rights) easier than downloading and installing the module by hand. You can often get more recent versions of modules this way, as the ones provided by an operating system (through yum) tend to be older, more stable versions. Sometimes the module is not available through yum at all. pip can't install any extra packages that aren't python modules, though.
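For example, the same library can usually be obtained either way (the package names are illustrative, and the yum name varies by distribution):
$ sudo yum install python-requests   # distribution package, often an older version
$ pip install --user requests        # latest release from PyPI, no root needed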
If you don't have pip already (it comes with Python3, but might need installing separately for Python2, depending on how it was set up), then you can install it by following the instructions here: https://pip.pypa.io/en/stable/installing/

Breaking 'pip install' to smaller steps, so I can edit the package before it is installed

My familiarity with pip ends with the ability to do pip install, pip uninstall, and pip list - with the name of the package I want to install as the single argument.
This limited knowledge has carried me quite far: I'm able to install most simple packages, and sometimes, when I'm lucky, even packages that require compilation. It is all magic to me.
I'm now facing a situation where I need to do a little bit of editing to a C file (side note: this seems to be a known workaround for the netifaces package, which everyone seems to be at peace with. By itself this is an amazing phenomenon).
So I would like to break the installation into smaller steps:
Download the egg file (I've figured out this one: pip install --download).
Unzip or otherwise unpack the package file, to the point where I can edit individual files.
Do my custom modification.
Do the build
Do the installation.
Other than step #1, I don't know how to proceed.
Modern pip (8.0 and later)
Use pip download:
pip download mypackage
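With modern pip, the full download/edit/install cycle might look roughly like this, using netifaces from the question as the example (--no-binary forces a source tarball so the C file is there to edit):
$ pip download --no-deps --no-binary :all: netifaces -d .
$ tar xzf netifaces-*.tar.gz
$ cd netifaces-*/
$ $EDITOR netifaces.c      # apply the custom modification
$ pip install .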
pip 1.5 - 7.x
Use pip install -d
pip install -d . --allow-external netifaces --allow-unverified netifaces netifaces
tar xzf netifaces-0.8.tar.gz # Unpack the downloaded file.
cd netifaces-0.8
Now do your modifications and continue:
pip install .
Old pip (Before 1.5)
Install the package with the --no-install option; with --no-install, pip downloads and unpacks all packages but does not actually install them.
pip install --no-install netifaces
Change to the build directory. If you don't know where the build directory is, issue the above command again and it will display the location.
cd /tmp/pip_build_falsetru/netifaces
Do the custom modification.
Install the package using pip install . (add the --no-clean option if you want to keep the build directory) or python setup.py install.
sudo pip install --no-clean .
First, download the source for 0.8 from the author's home page (there's no direct download link from PyPI, for some reason). Go to the directory where you downloaded it and unpack it:
tar zxvf netifaces-0.8.tar.gz
Enter the netifaces-0.8/ directory and edit netifaces.c with your favorite editor. Save the file. Then, build the module:
python setup.py build
and install it:
sudo python setup.py install
To test, first leave the directory, then start your python interpreter and import netifaces to see if it works.
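A quick check along those lines (the output will be a list of interface names on your machine):
$ cd ~    # leave the source directory so the local copy doesn't shadow the installed one
$ python -c "import netifaces; print(netifaces.interfaces())"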
Good luck!
Download your selected package, extract the files, and edit what you want. Then open the directory in your terminal/cmd and run:
python setup.py install
Depending on your OS, you might need to add sudo to the beginning of this command (if you intend to install globally on a Unix machine).
You could also just download the source from PyPI, edit it, and use setup.py build followed by setup.py install.

Installing Python packages from local file system folder to virtualenv with pip

Is it possible to install packages using pip from the local filesystem?
I have run python setup.py sdist for my package, which has created the appropriate tar.gz file. This file is stored on my system at /srv/pkg/mypackage/mypackage-0.1.0.tar.gz.
Now in a virtual environment I would like to install packages either coming from pypi or from the specific local location /srv/pkg.
Is this possible?
PS
I know that I can specify pip install /srv/pkg/mypackage/mypackage-0.1.0.tar.gz. That will work, but I am talking about using the /srv/pkg location as another place for pip to search if I typed pip install mypackage.
What about:
pip install --help
...
-e, --editable <path/url> Install a project in editable mode (i.e. setuptools
"develop mode") from a local project path or a VCS url.
eg, pip install -e /srv/pkg
where /srv/pkg is the top-level directory where 'setup.py' can be found.
I am pretty sure that what you are looking for is the --find-links option.
You can do
pip install mypackage --no-index --find-links file:///srv/pkg/mypackage
From the installing-packages page you can simply run:
pip install /srv/pkg/mypackage
where /srv/pkg/mypackage is the directory containing setup.py.
Additionally [1], you can install it from the archive file:
pip install ./mypackage-1.0.4.tar.gz
[1] Although this is noted in the question, it is included here as well due to its popularity.
I was installing pyfuzzy, but it is not on PyPI; pip returns the message: No matching distribution found for pyfuzzy.
I tried the accepted answer:
pip install --no-index --find-links=file:///Users/victor/Downloads/pyfuzzy-0.1.0 pyfuzzy
But it does not work either and returns the following error:
Ignoring indexes: https://pypi.python.org/simple
Collecting pyfuzzy
Could not find a version that satisfies the requirement pyfuzzy (from versions: )
No matching distribution found for pyfuzzy
At last, I found a simple, good way here: https://pip.pypa.io/en/latest/reference/pip_install.html
Install a particular source archive file.
$ pip install ./downloads/SomePackage-1.0.4.tar.gz
$ pip install http://my.package.repo/SomePackage-1.0.4.zip
So the following command worked for me:
pip install ../pyfuzzy-0.1.0.tar.gz
Hope it helps.
This is the solution that I ended up using:
import pip

def install(package):
    # Debugging:
    # pip.main(["install", "--pre", "--upgrade", "--no-index",
    #           "--find-links=.", package, "--log-file", "log.txt", "-vv"])
    pip.main(["install", "--upgrade", "--no-index", "--find-links=.", package])

if __name__ == "__main__":
    install("mypackagename")
    raw_input("Press Enter to Exit...\n")
I pieced this together from pip install examples as well as from Rikard's answer on another question. The "--pre" argument lets you install non-production versions. The "--no-index" argument avoids searching the PyPI indexes. The "--find-links=." argument searches in the local folder (this can be relative or absolute). I used the "--log-file", "log.txt", and "-vv" arguments for debugging. The "--upgrade" argument lets you install newer versions over older ones.
I also found a good way to uninstall them. This is useful when you have several different Python environments. It's the same basic format, just using "uninstall" instead of "install", with a safety measure to prevent unintended uninstalls:
import pip

def uninstall(package):
    response = raw_input("Uninstall '%s'? [y/n]:\n" % package)
    if "y" in response.lower():
        # Debugging:
        # pip.main(["uninstall", package, "-vv"])
        pip.main(["uninstall", package])

if __name__ == "__main__":
    uninstall("mypackagename")
    raw_input("Press Enter to Exit...\n")
The local folder contains these files: install.py, uninstall.py, mypackagename-1.0.zip
The --find-links option does the job, and it works from a requirements.txt file!
You can put package archives in a folder and always take the latest one without changing the requirements file. For example:
.
├───requirements.txt
└───requirements
    ├───foo_bar-0.1.5-py2.py3-none-any.whl
    ├───foo_bar-0.1.6-py2.py3-none-any.whl
    ├───wiz_bang-0.7-py2.py3-none-any.whl
    ├───wiz_bang-0.8-py2.py3-none-any.whl
    ├───base.txt
    ├───local.txt
    └───production.txt
Now in requirements/base.txt put:
--find-links=requirements
foo_bar
wiz_bang>=0.8
A neat way to update proprietary packages: just drop the new one into the folder.
This way you can install packages from the local folder AND from PyPI with the same single call: pip install -r requirements/production.txt
PS. See my cookiecutter-djangopackage fork to see how to split requirements and use folder based requirements organization.
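As a sketch of that split, a hypothetical requirements/production.txt could simply pull in base.txt and add production-only packages, and a single install consumes it all:
$ cat requirements/production.txt
-r base.txt
gunicorn
$ pip install -r requirements/production.txt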
Assuming you have a virtualenv and a requirements.txt file, you can define inside this file where to get the packages from:
# Published pypi packages
PyJWT==1.6.4
email_validator==1.0.3
# Remote GIT repo package, this will install as django-bootstrap-themes
git+https://github.com/marquicus/django-bootstrap-themes#egg=django-bootstrap-themes
# Local GIT repo package, this will install as django-knowledge
git+file:///soft/SANDBOX/python/django/forks/django-knowledge#egg=django-knowledge
To install only from local you need 2 options:
--find-links: where to look for dependencies. There is no need for the file:// prefix mentioned by others.
--no-index: do not look in pypi indexes for missing dependencies (dependencies not installed and not in the --find-links path).
So you could run from any folder the following:
pip install --no-index --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz
If your mypackage is set up properly, it will list all its dependencies, and if you used pip download to fetch the cascade of dependencies (i.e. dependencies of dependencies, etc.), everything will work.
If you want to use the PyPI index when it is accessible, but fall back to local wheels when it is not, you can remove --no-index and add --retries 0. You will see pip pause for a bit while it tries to reach PyPI for a missing dependency (one not installed), and when it finds it cannot reach it, it will fall back to the local files. There does not seem to be a way to tell pip to "look at the local ones first, then the index".
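That fallback variant from the last paragraph looks like this (same paths as above):
$ pip install --retries 0 --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz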
Having your requirements in requirements.txt and eggs_dir as a directory,
you can build your local cache:
$ pip download -r requirements.txt -d eggs_dir
then, using that "cache" is simple like:
$ pip install -r requirements.txt --find-links=eggs_dir
What you need is --find-links of pip install.
-f, --find-links If a url or path to an html file, then parse for links to archives. If a local path or
file:// url that's a directory, then look for archives in the directory listing.
In my case, after python -m build, the tar.gz package (and the .whl file) are generated in the ./dist directory.
pip install --no-index -f ./dist YOUR_PACKAGE_NAME
Any tar.gz Python package in ./dist can be installed this way.
But if your package has dependencies, this command will raise an error.
To solve this, you can either pip install those deps from the official PyPI source and then add --no-deps, like this:
pip install --no-index --no-deps -f ./dist YOUR_PACKAGE_NAME
or copy your deps' packages to the ./dist directory.
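The whole flow, including pre-fetching the dependencies into ./dist, might look like this (a sketch assuming a requirements.txt that lists the deps):
$ python -m build                              # writes the sdist and wheel into ./dist
$ pip download -r requirements.txt -d ./dist   # drop the dependencies next to them
$ pip install --no-index -f ./dist YOUR_PACKAGE_NAME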
I was trying to achieve something really simple and failed miserably; probably I was being stupid.
Anyway, if you have a script/Dockerfile which downloads a Python package zip file (e.g. from GitHub) and you then want to install it, you can use the file:/// prefix, as shown in the following example:
$ wget https://example.com/mypackage.zip
$ echo "${MYPACKAGE_MD5} mypackage.zip" | md5sum --check -
$ pip install file:///.mypackage.zip
NOTE: I know you could install the package straight away using pip install https://example.com/mypackage.zip, but in my case I wanted to verify the checksum first (you can never be paranoid enough), and I failed miserably when trying to use the various options pip provides and the #md5 fragment.
It's been surprisingly frustrating to do something so simple directly with pip. I just wanted to pass a checksum and have pip verify that the zip matched before installing it.
I was probably doing something very stupid but in the end I gave up and opted for this. I hope it helps others trying to do something similar.
In my case, it was because this library depended on another local library, which I had not yet installed. Installing the dependency with pip, and then the dependent library, solved the issue.
If you want to install one local package (package A) to be used inside another local project/package (B), this is quite simple. All you need is to cd into (B) and call:
pip install /path/to/package(A)
Of course, you will first need to build and install package (A) with:
sudo python3 ./setup.py install
And each time you change package A, just run setup.py again in package (A), then pip install ... inside the using project/package (B).
Just add the directory to the pip command:
pip install mypackage file:/location/in/disk/mypackagename.filetype
