I have a package that will be installed with pip; let's say it's named PackageA.
PackageA has one requirement, PackageB, written out in requirements.txt like this:
file://../../PackageB
PackageB should in turn require another package, PackageC, again via setup.py or requirements.txt. All the packages are stored locally. How should this be done?
As far as I can tell, setup.py cannot declare requirements on local packages, and I don't know how to make PackageB depend on the local PackageC.
Try pip install --download-cache=<cachedir> packageA,
where cachedir is the directory containing the dependency packages.
You can also fiddle with --no-download and --no-deps;
check pip install --help for more information.
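With newer pip there is also a declarative option: PEP 508 direct references let a requirement point at a local project via an absolute file:// URL, usable both in requirements.txt and in install_requires. A minimal sketch, reusing the question's hypothetical PackageB/PackageC names (the helper name local_requirement is mine, not pip's):

```python
# Sketch: build a PEP 508 direct-reference requirement ("name @ file://...")
# for a local project directory. PackageC and its path are the question's
# hypothetical ones.
from pathlib import Path

def local_requirement(name: str, project_dir: str) -> str:
    """Return an absolute file:// direct reference to a local project."""
    return f"{name} @ {Path(project_dir).resolve().as_uri()}"

# PackageB could declare its local dependency like this:
req = local_requirement("PackageC", "../PackageC")
print(req)  # e.g. PackageC @ file:///home/user/PackageC
```

Using an absolute file:// URL also sidesteps the question of what a relative path in a requirements file is resolved against.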
I am trying to build my own Python package (installable by pip) using the twine package. This is all going well right up until the point where I try to pip install my actual package (so after uploading it to PyPI).
So I first run:
python3 setup.py sdist bdist_wheel
In which my setup.py install_requires list looks like this:
install_requires=[
'jupyter_kernel_gateway==2.4.0',
'pandas==1.0.2',
'numpy==1.18.1',
'azure-storage-blob==2.0.1',
'azure-datalake-store==0.0.48',
'psycopg2-binary==2.8.4',
'xlsxwriter==1.2.6',
'SQLAlchemy==1.3.12',
'geoalchemy2==0.6.3',
'tabulate==0.8.2',
'pyproj==1.9.6',
'geopandas==0.4.0',
'contextily==0.99.0',
'matplotlib==3.0.2',
'humanize==0.5.1',
'ujson==1.35',
'singleton-decorator==1.0.0',
'dataclasses==0.6',
'xlrd==1.2.0'],
In my understanding, these install_requires would be installed by pip when installing my own package.
After this I run
python3 -m twine upload --repository testpypi dist/*
To actually upload my package to PyPI. However, when pip installing my package, I get errors saying that no version can be found to satisfy many of the listed requirements, e.g.: ERROR: Could not find a version that satisfies the requirement psycopg2-binary==2.8.4
When I manually install these packages (e.g. pip install psycopg2-binary==2.8.4), they do get installed.
Is there any way to make the pip install of my package actually install the install_requires requirement list successfully?
You didn't show how you're pip install-ing your package, but I'm guessing you're using something like:
pip install your_project --index-url https://test.pypi.org/simple
The issue is that TestPyPI doesn't contain copies of your dependencies that exist on PyPI. For example:
Exists: https://pypi.org/project/psycopg2-binary/2.8.4/
Does not exist: https://test.pypi.org/project/psycopg2-binary/2.8.4/
You can fix this by using --extra-index-url instead of --index-url: PyPI then remains the main index and TestPyPI becomes an additional one, so the dependencies can still be resolved from PyPI:
pip install your_project --extra-index-url https://test.pypi.org/simple
I want pip to install my setup.py, which contains a dependency that I have locally modified for my needs and that shares its name with a package on PyPI; naturally, pip goes for the PyPI one first.
Initially the setup I had was:
install_requires=[f'klvdata @ file://{str(Path("localhost", Path.cwd(), "salt", "libs", "3rd_party", "klvdata-0.0.3-py3-none-any.whl"))}']
This points to my local copy, and python setup.py install installs it no problem; pip, on the other hand (running pip install setup.py), fails and looks to PyPI for the package.
My second attempt was modifying my install_requires and dependency_links keys with the following:
install_requires=['klvdata']
dependency_links=[str(Path("localhost", Path.cwd(), "salt", "libs", "3rd_party", "klvdata-0.0.3-py3-none-any.whl"))],
This fails as well for pip.
Is there something I'm missing?
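A hedged note that may explain both failures: modern pip ignores dependency_links entirely, and the supported replacement is a PEP 508 direct reference ("name @ file://..."), which needs an @ separator and an absolute URL. A minimal sketch, reusing the question's hypothetical wheel path:

```python
# Sketch: pin a local wheel through a PEP 508 direct reference.
# dependency_links is ignored by modern pip; an install_requires entry of
# the form "name @ file://..." is the supported way to point at a local file.
from pathlib import Path

wheel = Path("salt/libs/3rd_party/klvdata-0.0.3-py3-none-any.whl")
requirement = f"klvdata @ {wheel.resolve().as_uri()}"  # file:// URLs must be absolute
print(requirement)

# In setup.py:
# setup(..., install_requires=[requirement])
```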
I have created and uploaded a package to PyPI. I have also generated the requirements.txt file within the package. How can I now ensure that when a user does pip install MyPackage, all its dependencies are also installed?
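pip only installs what the package metadata declares through install_requires; a requirements.txt shipped inside the package is not consulted during pip install MyPackage. One common pattern is to read requirements.txt into install_requires inside setup.py; a sketch, with a helper name of my own choosing:

```python
# Sketch: read requirements.txt into install_requires so that
# `pip install MyPackage` resolves the same dependency list.
from pathlib import Path

def load_requirements(path: str = "requirements.txt") -> list[str]:
    """Return non-empty, non-comment lines as requirement specifiers."""
    lines = Path(path).read_text().splitlines()
    return [ln.strip() for ln in lines
            if ln.strip() and not ln.strip().startswith("#")]

# In setup.py:
# setup(name="MyPackage", ..., install_requires=load_requirements())
```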
Is there any significant difference between
pip install -e /path/to/mypackage
and the setuptools variant?
python /path/to/mypackage/setup.py develop
Try to avoid calling setup.py directly; it will not properly tell pip that you've installed your package.
With pip install -e:
For local projects, the “SomeProject.egg-info” directory is created
relative to the project path. This is one advantage over just using
setup.py develop, which creates the “egg-info” directly relative to the
current working directory.
More: docs
Also read the setuptools' docs.
One more difference: pip install -e uses wheels, while
python setup.py develop
doesn't.
With install, you could achieve the same behavior by using
pip install -e /path/to/package --no-use-wheel
More info on wheels: python wheels
Another difference that may favor pip install -e is that if your project has dependencies in install_requires in setup.py, then pip install -e . installs the dependencies with pip, while python setup.py develop may install them with easy_install, which can cause the 'egg-info' problems mentioned above. When install_requires uses dependency_links with custom git URLs and attached egg identifiers, this can be especially annoying.
Yet another difference: when you run python setup.py develop for a version that is considered a pre-release (perhaps because you're running it from a git clone when not having checked out a release), then you will enable installation of pre-releases of your dependencies. On the other hand, with pip install --editable you would have to pass --pre explicitly if you want these pre-releases.
(See the CI log with pre-releases accidentally used and compare that to a fixed build here.)
If I want to use the pip command to download a package (and its dependencies), but keep all of the zipped files that get downloaded (say, django-socialregistration.tar.gz) - is there a way to do that?
I've tried various command-line options, but it always seems to unpack and delete the zipfile - or it gets the zipfile, but only for the original package, not the dependencies.
pip install --download is deprecated. Starting from version 8.0.0 you should use the pip download command:
pip download <package-name>
The --download-cache option should do what you want:
pip install --download-cache="/pth/to/downloaded/files" package
However, when I tested this, the main package downloaded, saved and installed OK, but the dependencies were saved with their full URL path as the name - a bit annoying, though all the tar.gz files were there.
The --download option downloads the main package and its dependencies and does not install any of them. (Note that prior to version 1.1 the --download option did not download dependencies.)
pip install package --download="/pth/to/downloaded/files"
The pip documentation outlines using --download for fast & local installs.
I always do this to download the packages:
pip install --download /path/to/download/to packagename
OR
pip install --download=/path/to/packages/downloaded -r requirements.txt
And when I want to install all of those libraries I just downloaded, I do this:
pip install --no-index --find-links="/path/to/downloaded/dependencies" packagename
OR
pip install --no-index --find-links="/path/to/downloaded/packages" -r requirements.txt
Update
Also, to export all the packages installed on one system so they can be installed on another, run:
pip freeze > requirements.txt
Then requirements.txt can be used with the download commands above, or to install everything directly:
pip install -r requirements.txt
REFERENCE: pip installer
pip wheel is another option you should consider:
pip wheel mypackage -w .\outputdir
It will download packages and their dependencies to a directory (current working directory by default), but it performs the additional step of converting any source packages to wheels.
It conveniently supports requirements files:
pip wheel -r requirements.txt -w .\outputdir
Add the --no-deps argument if you only want the specifically requested packages:
pip wheel mypackage -w .\outputdir --no-deps
Use pip download <package1 package2 ... packageN> to download the packages, including dependencies.
Use pip install --no-index --find-links . <package1 package2 ... packageN> to install the packages, including dependencies.
It gets all the files from the CWD and will not download anything.
In version 7.1.2 pip downloads the wheel of a package (if available) with the following:
pip install package -d /path/to/downloaded/file
The following downloads a source distribution:
pip install package -d /path/to/downloaded/file --no-binary :all:
These download the dependencies as well, if pip is aware of them (e.g., if pip show package lists them).
Update
As noted by Anton Khodak, the pip download command is preferred since version 8. For the above examples this means replacing install with download; the target directory is still given with -d /path/to/downloaded/file.
installing python packages offline
For Windows users:
To download into a folder,
open cmd and follow these steps:
cd <*the folder path where you want to save it*>
pip download <*package name*>
The package and its dependencies will be downloaded into the current working directory.
To install from the current working directory:
Set the folder where you downloaded the packages as the cwd, then run:
pip install <*the package name which is downloaded as .whl*> --no-index --find-links <*the file location where the files are downloaded*>
This will search for the dependencies in that location.
All the answers in this thread assume that the packages are downloaded on the same OS configuration as the target OS where they will be installed.
In my personal experience, I was using Windows as my work machine and had to download packages for a Linux environment, and I have seen people doing the reverse as well. After some extensive googling, I found sodim.dev.
All I had to do was upload the requirements.txt file and select the environment configuration, such as OS and Python version, and it produced a CSV with the download URL, source code URL, etc.
I guess in the backend the app spins up a VM with the requested OS, installs that particular Python version, and then generates the report, because it does take about 15-20 minutes for 30-50 packages.
P.S.: I work in an offline environment where security is a very high concern and downloading packages is not that frequent. We whitelist source code and download URLs for each individual request, and after running some appsec tools we approve/reject the source code to be downloaded.
On RHEL, I would prefer: pip download package==version --no-deps --no-binary=:all: