Install dependencies locally with pip install via setup.py

I want pip to install my setup.py, which contains a dependency I have modified locally for my needs. It shares its name with a package on PyPI, and naturally pip goes for that one first.
Initially the setup I had was:
install_requires=[f'klvdata @ file://{str(Path("localhost", Path.cwd(), "salt", "libs", "3rd_party", "klvdata-0.0.3-py3-none-any.whl"))}']
This pointed to my local copy, and python setup.py install installs it with no problem; pip, on the other hand, fails when run against the same setup.py and looks to PyPI for the package.
My second attempt was to modify the install_requires and dependency_links keys as follows:
install_requires=['klvdata']
dependency_links=[str(Path("localhost", Path.cwd(), "salt", "libs", "3rd_party", "klvdata-0.0.3-py3-none-any.whl"))],
This fails as well for pip.
Is there something I'm missing?
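For what it's worth, a reasonably recent pip and setuptools accept PEP 508 direct references in install_requires, which is roughly what both attempts above are reaching for. A minimal sketch, assuming the wheel really sits under salt/libs/3rd_party relative to the project root; the project name mypackage is a placeholder:

from pathlib import Path
from setuptools import setup

# Absolute path to the locally modified wheel (assumed location under the project root).
local_wheel = Path.cwd() / "salt" / "libs" / "3rd_party" / "klvdata-0.0.3-py3-none-any.whl"

setup(
    name="mypackage",          # placeholder project name
    version="0.1.0",
    # PEP 508 direct reference: pip resolves klvdata from this file URL
    # instead of looking it up on PyPI.
    install_requires=[f"klvdata @ {local_wheel.as_uri()}"],
)

Note that PyPI rejects releases whose dependencies are direct references, so this shape works for local installs (pip install .) but not for a package you intend to upload.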

Could not find a version that satisfies the requirement length-hpi [duplicate]

I am trying to build my own Python package (installable with pip) using the twine package. This all goes well right up until the point where I try to pip install my actual package (so after uploading it to PyPI).
So I first run:
python3 setup.py sdist bdist_wheel
where the install_requires list in my setup.py looks like this:
install_requires=[
    'jupyter_kernel_gateway==2.4.0',
    'pandas==1.0.2',
    'numpy==1.18.1',
    'azure-storage-blob==2.0.1',
    'azure-datalake-store==0.0.48',
    'psycopg2-binary==2.8.4',
    'xlsxwriter==1.2.6',
    'SQLAlchemy==1.3.12',
    'geoalchemy2==0.6.3',
    'tabulate==0.8.2',
    'pyproj==1.9.6',
    'geopandas==0.4.0',
    'contextily==0.99.0',
    'matplotlib==3.0.2',
    'humanize==0.5.1',
    'ujson==1.35',
    'singleton-decorator==1.0.0',
    'dataclasses==0.6',
    'xlrd==1.2.0',
],
My understanding is that pip would install these requirements when installing my own package.
After this I run
python3 -m twine upload --repository testpypi dist/*
to actually upload my package to PyPI. However, when pip installing my package, I get errors saying that no version satisfies the requirement for many of the listed requirements, e.g.: ERROR: Could not find a version that satisfies the requirement psycopg2-binary==2.8.4
When I manually install these packages (e.g. pip install psycopg2-binary==2.8.4), they do get installed.
Is there any way to make the pip install of my package actually install the install_requires list successfully?
You didn't show how you're pip install-ing your package, but I'm guessing you're using something like:
pip install your_project --index-url https://test.pypi.org/simple
The issue is that TestPyPI doesn't contain copies of your dependencies that exist on PyPI. For example:
Exists: https://pypi.org/project/psycopg2-binary/2.8.4/
Does not exist: https://test.pypi.org/project/psycopg2-binary/2.8.4/
You can instead keep PyPI as the default index and have pip fall back on TestPyPI only for packages (like yours) that are missing from PyPI, by specifying --extra-index-url:
pip install your_project --extra-index-url https://test.pypi.org/simple
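If you'd rather not type the flag every time, the same option can live at the top of a requirements file; a small sketch, where your_project and its version are placeholders:

--extra-index-url https://test.pypi.org/simple
your_project==0.1.0

Running pip install -r requirements.txt then resolves your_project against TestPyPI and every other requirement against PyPI.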

Can't install a github repository

I'm trying to run this repository from GitHub in order to work on it. I downloaded the files, but I keep getting the error 'no module named pyttrex' (pyttrex is the name of the repository).
I tried to install it using pip (pip install pyttrex), but pip didn't even find the module. After that I tried pip install git+https://github.com/icoprimers/pyttrex, but I got the error Command "python setup.py egg_info" failed with error code 1, even though setuptools is upgraded. I have two versions of Python installed, 2.7 and 3.6; I just can't see the error.
By default pip install pyttrex installs packages from PyPI. There is no pyttrex on PyPI (the URL https://pypi.python.org/pypi/pyttrex returns a 404 error), hence pip cannot install it.
The GitHub repository has no setup.py. In short, the repo is not a package and isn't pip-installable. Your best bet is to clone the repository and copy the pyttrex subdirectory manually into site-packages.
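A rough sketch of that manual route, assuming the clone really contains a pyttrex/ package directory; the destination path is a placeholder you would replace with your own site-packages (or just your project directory):

git clone https://github.com/icoprimers/pyttrex
python -m site --user-site                      # prints a site-packages directory you can write to
cp -r pyttrex/pyttrex /path/to/site-packages/   # placeholder destination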

"pip install --editable ./" vs "python setup.py develop"

Is there any significant difference between
pip install -e /path/to/mypackage
and the setuptools variant?
python /path/to/mypackage/setup.py develop
Try to avoid calling setup.py directly; it will not properly tell pip that you've installed your package.
With pip install -e:
For local projects, the “SomeProject.egg-info” directory is created relative to the project path. This is one advantage over just using setup.py develop, which creates the “egg-info” directory relative to the current working directory.
More detail is in the pip documentation.
Also read the setuptools docs.
One more difference: pip install -e uses wheel, while python setup.py develop doesn't.
With install, you could achieve the same behavior by using
pip install -e /path/to/package --no-use-wheel
More info on wheels: python wheels
Another difference that may favor pip install -e is that if your project has dependencies in install_requires in setup.py, then pip install -e . installs those dependencies with pip, while python setup.py develop may install them with easy_install, which can cause the 'egg-info' problems mentioned above. When install_requires uses dependency_links with custom git URLs and attached egg identifiers, this can be especially annoying.
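For illustration, the kind of dependency_links entry being described looks roughly like this (the project name and URL are made up); it's exactly this legacy pattern that easy_install and pip treat differently:

from setuptools import setup

setup(
    name="myproject",                # hypothetical package
    version="0.1.0",
    install_requires=["somepkg==1.0"],
    # Legacy setuptools mechanism: a VCS URL with an attached #egg= identifier.
    dependency_links=[
        "git+https://github.com/example/somepkg.git@v1.0#egg=somepkg-1.0",
    ],
)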
Yet another difference: when you run python setup.py develop for a version that is considered a pre-release (perhaps because you're running it from a git clone without having checked out a release), you enable installation of pre-releases of your dependencies. With pip install --editable, on the other hand, you would have to pass --pre explicitly if you want those pre-releases.
(See, for example, a CI log where pre-releases were accidentally pulled in this way, compared with the fixed build.)
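For the editable-install side, that explicit opt-in looks like:

pip install --pre -e /path/to/mypackage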

setup.py & pip: override one of the dependency's sub-dependency from requirements.txt

I'm currently working on a package and in my requirements.txt, I have a dependency: wikipedia. Now, wikipedia 1.3 uses requests-2.2.1 while my package uses version 2.3.0.
Also, as one would expect, wikipedia-1.3's installation depends on the presence of its dependency.
But if I start a new virtualenv and directly include wikipedia in my requirements.txt, it gives an ImportError on requests, since at the time wikipedia's setup.py runs, requests-2.3.0's setup.py has not executed yet. In the screenshot attached to the original question (not reproduced here), there is no 'Running setup.py' step for requests after it gets unpacked.
For some weird reason, wikipedia's setup.py contains import wikipedia, which in turn imports its dependencies before they're even installed; it nevertheless passes CI because CI installs the requirements separately through pip and then runs setup.py.
To overcome this situation, I've made a setup script consisting of:
pip install -r requirements.txt
pip install wikipedia
pip install -e .
This installs requests-2.3.0 and beautifulsoup4;
then installs wikipedia (whose setup.py can now run, installing wikipedia and requests-2.2.1);
then pip install -e . installs my package along with requests-2.3.0 again.
Hence requests-2.3.0 is first installed, then replaced by the older 2.2.1, and then replaced again by 2.3.0.
I tried going through various specifications on how to overcome this, but they were confusing. How can I overcome this mess?
As noted by Martijn, the correct way would be to specify a minimum version in the project (something like the sketch below), assuming full compatibility is preserved in future releases of the sub-dependency.
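Concretely, that means the wikipedia package declaring a floor rather than an exact pin; a sketch of what its setup.py would then contain (not the actual file):

from setuptools import setup

setup(
    name="wikipedia",
    version="1.3.0",
    # A minimum version instead of an exact pin lets downstream projects
    # use a newer requests, such as 2.3.0.
    install_requires=[
        "requests>=2.2.1",
        "beautifulsoup4",
    ],
)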
If you do not have any way to change the requirements file, you can download the project and edit the requirements file locally to specify whatever version you want. This can be done via the pip download command:
pip download wikipedia==1.3
Besides that, if you want to use pip for the whole process and preserve requests==2.3.0 without it being removed and reinstalled, you can specify a constraints file. This can be done with:
pip install -c constraints.txt wikipedia==1.3
Where constraints.txt contains something like:
requests>=2.3.0
beautifulsoup4
This will produce a warning, but the wikipedia package will be installed:
wikipedia 1.3.0 has requirement requests==2.2.1, but you'll have requests 2.3.0 which is incompatible.
Installing collected packages: wikipedia
Successfully installed wikipedia-1.3.0
Now, if you really know what you are doing (or just want to see whether it works), you can use the --no-deps flag, which ignores package dependencies entirely and will not produce the warning above:
pip install --no-deps -c constraints.txt wikipedia==1.3
In both cases pip freeze shows:
beautifulsoup4==4.6.0
bs4==0.0.1
requests==2.3.0
wikipedia==1.3.0
Note: This was tested with pip 10.0.1, but it should work with any recent pip version.

PIP and setup.py with local requirements

I have a package that will be installed with pip; let's say it's named PackageA.
PackageA has one requirement, written out in requirements.txt, pointing to PackageB.
Like:
file://../../PackageB
PackageB should require another package, PackageC, via setup.py or requirements.txt. All packages are placed locally. How should it be done?
setup.py can't have requirements pointing to local packages.
I do not know how to give PackageB a requirement on a local PackageC.
Try with pip install --download-cache=<cachedir> packageA
where <cachedir> is the directory containing the dependency packages.
Try fiddling with --no-download and --no-deps as well.
Check pip install --help for more information.
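Note that --download-cache and --no-download were dropped from later pip releases; on a current pip, a comparable all-local install can be sketched with a local links directory, where every path below is a placeholder:

# Collect wheels for the local packages into one directory.
pip wheel ./PackageB ./PackageC -w ./local-wheels
# Install PackageA, resolving its requirements only from that directory.
pip install --no-index --find-links=./local-wheels ./PackageA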
