I have some local packages hosted on my own machine, and I would like to include a copy of them in the distributions of other packages that depend on them. When a local package is installed, pip freeze shows something like
public-package==3.0.1
local-package # file:///home/user/local-package/dist/local-package-1.0.0.tar.gz
but if one tries to install that package on another computer, pip raises an error because the local-package path does not exist. Can I extend the setup.py commands to process that requirements.txt file, extract the local package paths, copy the local packages into a deps folder of the dist archive, and rewrite requirements.txt like
public-package==3.0.1
local-package # deps/local-package-1.0.0.tar.gz
and make pip treat deps/ as a path relative to the package archive itself?
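For context, a command override along those lines might look roughly like the sketch below. This is only an illustration of the idea; LOCAL_DEPS and all names are placeholders, and the requirements.txt rewriting step is left out.
# setup.py -- hypothetical sketch: copy local archives into a deps/ folder of the sdist
import os
import shutil
from setuptools import setup
from setuptools.command.sdist import sdist as _sdist

# Placeholder list of local dependency archives to bundle
LOCAL_DEPS = ["/home/user/local-package/dist/local-package-1.0.0.tar.gz"]

class BundlingSdist(_sdist):
    def make_release_tree(self, base_dir, files):
        # Build the normal release tree, then copy the local archives into deps/
        super().make_release_tree(base_dir, files)
        deps_dir = os.path.join(base_dir, "deps")
        os.makedirs(deps_dir, exist_ok=True)
        for archive in LOCAL_DEPS:
            shutil.copy(archive, deps_dir)

setup(
    name="my-package",
    version="1.0.0",
    cmdclass={"sdist": BundlingSdist},
)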
I managed to do it with source distributions by overriding the sdist and egg_info commands, so that setuptools bundles the local dependencies together with the package and pip searches for dependencies in that bundle when installing the built package later. But I later figured out that this makes the system vulnerable to dependency confusion attacks: local packages installed from that bundle are visible with pip freeze, and if for some reason the dependency location, like local-package # file:///home/user/packages/local-package.tar.gz, is stripped down to just local-package, pip will search for it on PyPI, which allows dependency confusion to happen.
The best solution for this problem is to vendor all local dependencies, i.e. copy their source code into the package itself; pip vendors its own dependencies using the vendoring tool.
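As a rough illustration (directory and module names are made up), vendoring can be as simple as copying the dependency's source tree into a private subpackage and importing it from there, so nothing about it ever appears in pip freeze:
mypackage/
    __init__.py
    core.py
    _vendor/
        __init__.py
        local_package/        # copied source of local-package
            __init__.py

# mypackage/core.py then imports the vendored copy instead of an installed distribution
from mypackage._vendor import local_package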
Related
I have a Python script which uses open source third party libraries for geoprocessing (OGR and Shapely). My plan is to execute this script on a computer without having to install the required libraries.
I know that there are tools such as py2exe available for this purpose. However, compiling an executable file is not my first choice, as I noticed that such files can get pretty large. Besides, I would like to use the code within another script. I would therefore like to create a portable Python script which already includes the third-party methods needed for execution.
Is it possible to include third party methods in a Python script in order to avoid the installation of third party libraries? And if not, what can I do instead, besides compiling an executable file? I work on Windows OS.
You can export your libraries using pip and embed them into your application.
pip wheel requires the wheel package to be installed, which provides the "bdist_wheel" setuptools extension that it uses.
To build wheels for your requirements and all their dependencies to a local directory:
pip install wheel
pip freeze > requirements.txt
At this point, check requirements.txt and clean it up; then you can build the wheels into a local folder:
pip wheel --wheel-dir=/local/wheels -r requirements.txt
And then to install those requirements just using your local directory of wheels (and not from PyPI):
pip install --no-index --find-links=/local/wheels -r requirements.txt
You'll still need pip on the target machine, though it's shipped with the latest versions of Python.
Check this: https://pip.readthedocs.io/en/latest/user_guide/#requirements-files
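As that guide describes, the options can also live inside the requirements file itself, so the target machine only needs that file plus the wheel directory (the path and package name below are placeholders):
# requirements.txt
--no-index
--find-links=/local/wheels
somepackage==1.0.0
With that in place, pip install -r requirements.txt alone should be enough.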
If your third-party library does not have any dependencies, you can take its source files (.py), put them into your project folder, and use it as a package via import. If it does have dependencies, your project size will grow, and it is better to create an exe for your script.
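For a pure-Python library, that can look like the following layout (names are illustrative):
myproject/
    my_script.py
    thirdpartylib/        # the library's .py source files copied in
        __init__.py

# my_script.py
import thirdpartylib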
I have a Python project hosted on PyPI. I've discovered some dependency conflicts that need to be pinned. I know pip looks at the install_requires key in setup.py for dependencies, but I've read it's best to place pinned dependencies in a requirements.txt file. I've generated this file (see below) using pip freeze, but I am unsure whether pip install project is sufficient to install the dependencies as well.
# requirements.txt
numpy==1.9.2
pandas==0.16.2
I would like to make the simplest installation process for the user. For a package hosted on PyPI:
How do I set up requirements so that pip install project also includes all of its pinned dependencies automatically (similar to conda)?
Must install_requires=['numpy', 'pandas'] be included? If so, how do I best set it up to install the pinned versions only?
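A minimal sketch of what that can look like, assuming you simply mirror the pins from requirements.txt in install_requires (the project name and version are placeholders):
# setup.py
from setuptools import setup, find_packages

setup(
    name="project",
    version="0.1.0",
    packages=find_packages(),
    # pip install project resolves and installs these pins automatically
    install_requires=[
        "numpy==1.9.2",
        "pandas==0.16.2",
    ],
)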
I am working on a Windows system with limited download access. My workaround is to download an app into a local folder and then, in the virtualenv, use pip install -e <download folder> to install it. For instance, pip install -e c:\django\test_virtualenv\South-0.8.4 installs South in the virtual environment.
In the virtualenv I am trying to install django-user-accounts, using the same method. Unfortunately, this app has two dependencies listed in setup.py, namely, "django-appconf>=0.6", "pytz>=2013.9". You can see these in the install_requires section in the setup.py file.
I was unable to redirect to a local folder such as c:\download\django-appconf
This led to the question of whether arguments are available when using install_requires (part of setuptools). The question "install_requires in setup.py depending on installed Python version" points out that install_requires takes just a list.
In http://pythonhosted.org/setuptools/setuptools.html#declaring-dependencies the text talks about "limited network access" and "manually download all the eggs to a single directory". This doc also suggests using easy_install with the -f option.
I have a neat system operational at the moment using pip, and I am asking whether anyone has found a way to redirect the downloading of dependencies so they come from a local folder, with no access to network resources for downloads.
Tommy.
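One hedged sketch of that kind of redirection, assuming the dependencies have already been downloaded into c:\download and that easy_install (which setup.py uses to resolve install_requires) is pointed at that directory via find_links in a setup.cfg next to the setup.py being installed:
# setup.cfg -- illustrative only
[easy_install]
find_links = c:\download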
I am running a local pypi server. I can install packages from this server by either specifying it with the -i option of the pip command or by setting the PIP_INDEX_URL environment variable. When I install a package that has prerequisites, setup.py has historically honored the PIP_INDEX_URL environment variable, pulling the additional packages from my local server.
However, on a couple of systems that have been recently installed, it is behaving differently. Running, for instance, python setup.py develop fails because it tries to install prerequisite packages from pypi.python.org.
I have updated all of the related python packages (python, distribute, virtualenv, pip, etc...) on all the systems I'm testing on and continue to see this discrepancy. On my "original" system, setup.py downloads prerequisites from the pypi server specified in my PIP_INDEX_URL environment variable. On the newer systems, I can't seem to make it honor this variable.
What am I missing?
Create setup.cfg in the same folder as your setup.py with the following content:
[easy_install]
allow_hosts = *.myintranet.example.com
From: http://pythonhosted.org/setuptools/easy_install.html#restricting-downloads-with-allow-hosts
You can use the --allow-hosts (-H) option to restrict what domains EasyInstall will look for links and downloads on.
--allow-hosts=None prevents downloading altogether.
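If the packages are served from an intranet index rather than plain link pages, a variant of the same setup.cfg could also point easy_install at that index directly; the hostnames below are placeholders, and index_url is assumed to be accepted in setup.cfg the same way --index-url is on the command line:
[easy_install]
allow_hosts = *.myintranet.example.com
index_url = https://pypi.myintranet.example.com/simple/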
I ran into the same issue. Fundamentally, setup.py is using setuptools which leverages easy_install, not pip. Thus, it ignores any pip-related environment variables you set.
Rather than using python setup.py develop, you can run pip from the top of the package, pip install -e ., to produce the same effect.
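For example, on a POSIX shell (the index URL is a placeholder for your local server):
export PIP_INDEX_URL=http://pypi.example.internal/simple/
pip install -e .
Because pip does the dependency resolution here, PIP_INDEX_URL is honored again.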
I am using some custom modules, not available on PyPI. Is it possible to manage the dependency through virtualenv?
Yes. pip can install packages from:
PyPI (and other indexes) using requirement specifiers.
VCS project urls.
Local project directories.
Local or remote source archives.
So all you have to do is provide the location of the module (from a VCS or a local directory) in the requirements.txt file, then run pip install -r requirements.txt after activating the virtualenv, and it'll work. More examples can be found in the pip documentation.
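For instance, a requirements.txt mixing those source types might look like this (all names, URLs, and paths are placeholders):
# from PyPI (or another index)
somepublicpackage==1.2.3
# from a VCS project URL
git+https://github.com/example/custom-module.git#egg=custom-module
# from a local project directory
./libs/custom_module
# from a local source archive
./archives/custom-module-1.0.0.tar.gz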
Just keep in mind that pip will run python setup.py install after downloading and extracting your custom module. So you must package your module to support that.