Pipenv: dependencies of platform specific packages are installed unconditionally? - python

I am trying to integrate pipenv into my new project because I need to have some dependencies for development only and I couldn't find an easy way to have additional dev dependencies with pip or venv.
However, I ran into an issue when trying it:
My project depends on pypiwin32 when it's used in windows. Because it's a windows dependency, I installed it with the command:
pipenv install "pypiwin32 ; platform_system == 'Windows'"
which successfully added this dependency to my Pipfile with the platform restriction:
# Pipfile
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
# dependencies...
[packages]
# dependencies...
pypiwin32 = {markers = "platform_system == 'Windows'", version = "*"}
[requires]
python_version = "3.7"
The problem is that pypiwin32 depends on the pywin32 package and when I looked in the Pipfile.lock file I saw that it's not restricted to windows only:
# Pipfile.lock
"pypiwin32": {
"index": "pypi",
"markers": "platform_system == 'Windows'",
"version": "==223"
},
"pywin32": {
"version": "==224"
},
which is weird. I tested it by changing the restriction to 'Linux' and installing the dependencies into a new environment without the Pipfile.lock: it ignored pypiwin32 as expected, but it installed pywin32 anyway, even though pywin32 is not listed in the Pipfile and is only a dependency of a package that was skipped.
This is a serious problem: if I decide to deploy or develop on a Linux machine, it will simply not work, because pywin32 isn't available on Linux or macOS (as far as I know).
Is there a way to correctly mark the dependencies of a package with the restrictions of the package?
I don't want to manually edit Pipfile.lock (I also don't think it will work because when I tried to install it into a new environment, I copied the Pipfile only).
Alternatively, is there a better way to manage my dependencies? I need the ability to specify development dependencies that won't be installed in a production environment, and the ability to specify platform-specific dependencies. pip with requirements.txt can't handle the first requirement as far as I know, which is why I tried switching to pipenv.
I tried looking it up with no success. Any help would be appreciated.

Adding platform-specific markers for both the pypiwin32 and pywin32 packages explicitly in the Pipfile should solve the issue:
pypiwin32 = {version = "*", markers = "sys_platform == 'win32'"}
pywin32 = {version = "*", markers = "sys_platform == 'win32'"}
The prerequisite package pywin32 will then also be ignored on Linux.
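To double-check how a marker will evaluate on another platform without creating a fresh environment, the packaging library (which pip and pipenv use internally for marker evaluation) can be queried directly; a small sketch:

```python
from packaging.markers import Marker

# The same marker pipenv writes into the Pipfile/Pipfile.lock
marker = Marker("platform_system == 'Windows'")

# Evaluate against the current interpreter's environment...
print(marker.evaluate())

# ...or against an explicit (partial) environment, to see what an
# install on another platform would do
print(marker.evaluate({"platform_system": "Linux"}))    # False
print(marker.evaluate({"platform_system": "Windows"}))  # True
```

Values passed to evaluate() are merged with the defaults for the current interpreter, so overriding just platform_system is enough for this check.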

pyproject.toml listing an editable package as a dependency for an editable package

Using setuptools, is it possible to list another editable package as a dependency for an editable package?
I'm trying to develop a collection of packages to use across different production services. One of these packages (my_pkg_1) depends on a subset of the collection (my_pkg_2, my_pkg_x, ...). So far, I've managed to put together this pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[project]
name = "my_pkg_1"
version = "0.0.1"
dependencies = [
"my_pkg_2 @ file:///somewhere/in/mysystem/my_pkg_2"
]
which does work for installing my_pkg_1 in editable mode, and it does install my_pkg_2, but not in editable mode. This is what I see when I run pip list:
Package Version Editable project location
--------------- ------- -------------------------
my_pkg_2 0.0.1
my_pkg_1 0.0.1 /somewhere/in/mysystem/my_pkg_1
Is what I'm trying to do even possible? If so, how?
You may install my_pkg_2 explicitly in editable mode before installing my_pkg_1:
pip install --editable /somewhere/in/mysystem/my_pkg_2
Unfortunately, it is not possible to install dependencies (or dependencies of dependencies) in editable mode automatically by installing the main package. I am curious why this is not implemented.
Alternatively, you may add the package paths to the environment variable PYTHONPATH before running code from your main package. That way, you are able to import python modules from your other packages without having to install them.
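The PYTHONPATH idea can be sketched with sys.path, which has the same effect at runtime. The /somewhere/in/mysystem/my_pkg_2 path from the question is hypothetical, so a throwaway single-module package stands in for it here:

```python
import os
import sys
import tempfile

# Simulate making an uninstalled package importable purely via the path.
# A throwaway module stands in for the real my_pkg_2 directory.
pkg_dir = tempfile.mkdtemp()
with open(os.path.join(pkg_dir, "my_pkg_2.py"), "w") as f:
    f.write("__version__ = '0.0.1'\n")

# Equivalent to running with PYTHONPATH=<pkg_dir>
sys.path.insert(0, pkg_dir)

import my_pkg_2

print(my_pkg_2.__version__)  # -> 0.0.1
```

Because the source directory itself is on the path, edits to the module are picked up on the next interpreter start, much like an editable install, just without any installation metadata.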
This cannot be done in pyproject.toml, at least not the way you want it and not in a standard way.
If I were you I would write for myself a requirements.txt file (you could also give it a different name, obviously):
# install the current project as editable
--editable .
# install `my_pk_2` as editable
--editable /somewhere/in/mysystem/my_pkg_2
And you could use it like so:
path/to/venv/bin/python -m pip install --requirement 'path/to/requirements.txt'
for when you want to work on (edit) both pieces of software at the same time in the same environment.
Alternatively you could use a "development workflow tool" (such as PDM, Hatch, Poetry, and so on), and maybe one of those would be a better fit for your expectations.

Python build error - Could not find a version that satisfies the requirement wheel

I am trying to build a Python wheel following the instructions described in the link below. I am doing this for the first time.
https://packaging.python.org/en/latest/tutorials/packaging-projects/
I set up the folder structure, files and all. I have added this in pyproject.toml file.
[build-system]
requires = ["setuptools>=57.4.0", "wheel>=0.37.1"]
build-backend = "setuptools.build_meta"
I have installed setuptools and wheel in my virtual environment.
When I try to run the build command, I get SSL warnings and the error below.
Could not find a version that satisfies the requirement wheel>=0.37.1
Could not fetch from URL https://pypi.org/simple
Even though I have installed setuptools and wheel in my virtual environment, I think it is hitting PyPI to find and download these packages.
I don't know how the build module finds the modules/packages listed in "requires". I can't find a way to direct it to use the setuptools and wheel already installed on my machine instead of fetching them from PyPI.
Even if it has to download them again, how can I direct it to use our Artifactory instead of PyPI?
Any help in this is greatly appreciated.
I tried all of the below with different combinations, but it did not work. Obviously I am missing something.
1.
I added a pip.ini in my virtual environment (Lib\site-packages\pip).
Added the index-url with our organization's artifactory url.
Added trusted-host
Also tried pip.config
I downloaded the wheels for setuptools and wheel.
Added another argument in pyproject.toml
[easy-install]
find-links = c:\wheels
Added the wheels directly in the src folder.
Thanks.
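One likely explanation, offered as an assumption rather than a confirmed diagnosis: python -m build creates an isolated environment and uses pip to download the "requires" entries into it, so the copies already installed in the venv are ignored. Two things that commonly resolve this are running the build with --no-isolation (which reuses the venv's setuptools/wheel), or configuring pip at the user level (on Windows, %APPDATA%\pip\pip.ini rather than Lib\site-packages\pip) so the isolated build environment also resolves against the internal index. A sketch of the latter, with a placeholder Artifactory URL:

```ini
; %APPDATA%\pip\pip.ini  (URL is a placeholder for your Artifactory index)
[global]
index-url = https://artifactory.mycompany.example/api/pypi/pypi-remote/simple
trusted-host = artifactory.mycompany.example
```

The equivalent can also be set per-shell via the PIP_INDEX_URL environment variable, which the isolated build environment inherits.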

pip and tox ignore full path dependencies, instead look for "best match" in pypi

This is an extension of SO setup.py ignores full path dependencies, instead looks for "best match" in pypi
I am trying to write setup.py to install a proprietary package from a .tar.gz file on an internal web site. Unfortunately for me the prop package name duplicates a public package in the public PyPI, so I need to force install of the proprietary package at a specific version. I'm building a docker image from a Debian-Buster base image, so pip, setuptools and tox are all freshly installed, the image brings python 3.8 and pip upgrades itself to version 21.2.4.
Solution 1 - dependency_links
I followed the instructions at the post linked above to put the prop package in install_requires and dependency_links. Here are the relevant lines from my setup.py:
install_requires=["requests", "proppkg==70.1.0"],
dependency_links=["https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"]
Installation is successful in Debian-Buster if I run python3 setup.py install in my package directory. I see the proprietary package get downloaded and installed.
Installation fails if I run pip3 install .; tox (version 3.24.4) fails similarly. In both cases, pip shows a message "Looking in indexes" then fails with "ERROR: Could not find a version that satisfies the requirement".
Solution 2 - PEP 508
Studying SO answer pip ignores dependency_links in setup.py which states that dependency_links is deprecated, I started over, revised setup.py to have:
install_requires=[
"requests",
"proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
Installation is successful in Debian-Buster if I run pip3 install . in my package directory. Pip shows a message "Looking in indexes" but still downloads and installs the proprietary package successfully.
Installation fails in Debian-Buster if I run python3 setup.py install in my package directory. I see these messages:
Searching for proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0
..
Reading https://pypi.org/simple/proppkg/
..
error: Could not find suitable distribution for Requirement.parse(...).
Tox also fails in this scenario as it installs dependencies.
Really speculating now, it almost seems like there's an ordering issue. Tox invokes pip like this:
python -m pip install --exists-action w .tox/.tmp/package/1/te-0.3.5.zip
In that output I see "Collecting proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0" as the first step. That install fails because it fails to import the package requests. Then tox continues collecting other dependencies. Finally, tox reports "Collecting requests" as its last step (and that succeeds). Do I have to worry about the ordering of install steps?
I'm starting to think that maybe the proprietary package is broken. I verified that the prop package setup.py has requests in its install_requires entry. Not sure what else to check.
Workaround solution
My workaround is installing the proprietary package in the docker image as a separate step before I install my own package, just by running pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz. The setup.py has the PEP508 URL in install_requires. Then pip and tox find the prop package in the pip cache, and work fine.
Please suggest what to try for the latest pip and tox, or if this is as good as it gets, thanks in advance.
Update - add setup.py
Here's a (slightly sanitized) version of my package's setup.py
from setuptools import setup, find_packages
def get_version():
"""
read version string
"""
version_globals = {}
with open("te/version.py") as fp:
exec(fp.read(), version_globals)
return version_globals['__version__']
setup(
name="te",
version=get_version(),
packages=find_packages(exclude=["tests.*", "tests"]),
author="My Name",
author_email="email@mycompany.com",
description="My Back-End Server",
entry_points={"console_scripts": [
"te-be=te.server:main"
]},
python_requires=">=3.7",
install_requires=["connexion[swagger-ui]",
"Flask",
"gevent",
"redis",
"requests",
"proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
package_data={"te": ["openapi_te.yml"]},
include_package_data=True, # read MANIFEST.in
)
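The name @ URL form that Solution 2 relies on is a PEP 508 "direct reference". The packaging library can parse it, which is a quick way to confirm the string is well-formed before handing it to pip (the URL below is the sanitized placeholder from the question):

```python
from packaging.requirements import Requirement

# PEP 508 direct reference: "name @ URL"
req = Requirement(
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz"
)

print(req.name)  # proppkg
print(req.url)   # https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz
```

If the string is malformed (for example, a # where the @ should be), the constructor raises InvalidRequirement instead of silently treating the URL as part of the name.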

How to `pipenv uninstall` a package installed from an archive file on the web?

I started to use pipenv a few days ago. I have installed a 2.0.0 version of a library, I did:
pipenv install https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz#egg=es_core_news_sm
Then I realized I need a 2.3.0 version, so I did
pip install https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.3.0/es_core_news_sm-2.3.0.tar.gz#egg=es_core_news_sm
And I would like to remove the previous one (2.0.0), so:
pipenv uninstall https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz
Un-installing https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz…
No package https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz to remove from Pipfile.
Locking [dev-packages] dependencies…
Locking [packages] dependencies…
So it looks like pipenv did not remove the first version. The Pipfile still has lines for both:
[packages.f8ba4b6]
file = "https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz"
[packages.0feb3d5]
file = "https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.3.0/es_core_news_sm-2.3.0.tar.gz"
And so does the Pipfile.lock:
"default": {
"0feb3d5": {
"file": "https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.3.0/es_core_news_sm-2.3.0.tar.gz"
},
"f8ba4b6": {
"file": "https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz"
}
}
I also tried uninstalling them like this:
pipenv uninstall es_core_news_sm en_core_web_sm
Un-installing es_core_news_sm…
Found existing installation: es-core-news-sm 2.3.0
Uninstalling es-core-news-sm-2.3.0:
Successfully uninstalled es-core-news-sm-2.3.0
No package es_core_news_sm to remove from Pipfile.
Un-installing en_core_web_sm…
Found existing installation: en-core-web-sm 2.2.0
Uninstalling en-core-web-sm-2.2.0:
Successfully uninstalled en-core-web-sm-2.2.0
But the Pipfile still has the lines with the 2.0.0 version (the ones I've pasted above).
So far I have removed the obsolete 2.0.0 entry from the Pipfile, then ran pipenv lock, and now the Pipfile.lock does not have the 2.0.0 entry. I wonder, however, if the package is still in my .venv, poor lost soul.
Have you tried looking in the virtual environment folder?
pipenv --venv
In my experience, the pipenv install/uninstall commands only work for packages listed under [packages] and [dev-packages]. Somehow your install command generated another section, so it seems like you did the right thing by removing it manually. If you install again using the same command as above:
pipenv install https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz#egg=es_core_news_sm
You should end up with something like this:
[packages]
es-core-news-sm = {file = "https://github.com/explosion/spacy-models/releases/download/es_core_news_sm-2.0.0/es_core_news_sm-2.0.0.tar.gz"}

Install needed libraries for Weasyprint on pipenv (Windows environment)

In order to start generating documents with Weasyprint I installed it on my Windows machine following these instructions:
https://weasyprint.readthedocs.io/en/stable/install.html#step-5-run-weasyprint
On my computer it works, but I have a Django project where I want to integrate this library, and I use pipenv. How can I install the necessary libraries inside the virtual environment as well?
I tried setting the path for the pycairo package in the Pipfile like this:
pycairo = {path = "C:/Program Files/GTK3-Runtime Win64/bin/"}
but still it throws the error:
OSError: dlopen() failed to load a library: cairo / cairo-2 / cairo-gobject-2 / cairo.so.2
I have a 64-bit Windows machine and this is the Pipfile:
[[source]]
name = "pypi"
url = "https://pypi.org/simple"
verify_ssl = true
[dev-packages]
pylint = "*"
[packages]
django = "*"
mysql = "*"
ipython = "*"
django-webpack = "*"
django-webpack-loader = "*"
django-livereload-server = "*"
pylint = "*"
reportlab = "*"
weasyprint = "*"
django-weasyprint = "*"
pycairo = {path = "C:/Program Files/GTK3-Runtime Win64/bin/"}
cairocffi = "*"
[requires]
python_version = "3.7"
You need to install the 'GTK+ 64 Bit Installer' into this path:
C:\msys2
Source: the WeasyPrint GitHub repository
I came across this error as well and followed every step mentioned in the WeasyPrint docs for the installation. I was using PowerShell as my default terminal, with pipenv. First I tried running import weasyprint in a Python shell inside my virtualenv, but it always returned the cairo / dlopen() error.
What worked for me was switching to cmd. I switched, used the same commands, and now it executes perfectly.
Also, in my Pipfile I only have weasyprint installed, which is enough to generate a report in Django. If it is still relevant for anyone: the libraries reportlab, django-weasyprint, pycairo and cairocffi from the question can be safely removed/uninstalled from pipenv.
Please type the following command:
WHERE libcairo-2.dll
You should get 'C:\msys2\mingw64\bin\libcairo-2.dll'.
Then open cmd and type the following:
SET PROPER_GTK_FOLDER=
SET PATH=%PROPER_GTK_FOLDER%;%PATH%
Please follow the documentation; it has everything needed to run on Windows. It worked for me and I hope it will work for you.
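Once the PATH is set, a quick way to check from inside the virtualenv whether the libraries WeasyPrint tries to dlopen() are findable is ctypes (a hedged sketch; find_library searches the usual system locations, including PATH on Windows, and returns None when a library cannot be located):

```python
import ctypes.util

# Each of these is one of the names from the dlopen() error message.
# A result of None means the library is still not on the search path.
for name in ("cairo", "cairo-2", "cairo-gobject-2"):
    print(name, "->", ctypes.util.find_library(name))
```

Running this in the same shell before and after adjusting PATH shows immediately whether the PATH change reached the interpreter.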