setup.py file using requirements.txt - python

I've read a discussion suggesting that requirements.txt be read inside setup.py, to ensure the correct dependencies are installed across multiple deployments without having to maintain both a requirements.txt and the list in setup.py.
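The pattern in question looks roughly like this (a sketch, not my exact file; the project name is illustrative):

from pathlib import Path
from setuptools import setup, find_packages

# Assumption: requirements.txt sits next to setup.py and contains plain
# requirement specifiers (no pip-only lines such as -e or --index-url).
requirements = [
    line.strip()
    for line in Path("requirements.txt").read_text().splitlines()
    if line.strip() and not line.startswith("#")
]

setup(
    name="myproject",
    version="1.0.0",
    packages=find_packages(),
    install_requires=requirements,
)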
However, when I try to install via pip install -e ., I get an error:
Obtaining file:///Users/myuser/Documents/myproject
Processing /home/ktietz/src/ci/alabaster_1611921544520/work
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory:
'/System/Volumes/Data/home/ktietz/src/ci/alabaster_1611921544520/work'
It looks like pip is trying to find packages that are available on PyPI (alabaster) on my local machine instead. Why? What am I missing here? Why isn't pip looking for the required packages on the PyPI server?

I have done it the other way around before, maintaining the setup file and not the requirements file. For the requirements file, just save it as:
*
and for setup, do:

from setuptools import setup, find_packages

try:
    from Module.version import __version__
except ModuleNotFoundError:
    exec(open("Module/version.py").read())

setup(
    name="Package Name",
    version=__version__,
    packages=find_packages(),
    package_data={p: ["*"] for p in find_packages()},
    url="",
    license="",
    install_requires=[
        "numpy",
        "pandas",
    ],
    python_requires=">=3.8.0",
    author="First.Last",
    author_email="author@company.com",
    description="Description",
)
For reference, my version.py script looks like:
__build_number__ = "_LOCAL_"
__version__ = f"1.0.{__build_number__}"
Jenkins replaces the build number with a tag during the CI build.
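For illustration, the Jenkins substitution could be as simple as the following shell step (hypothetical; the question does not show the actual pipeline):

# replace the _LOCAL_ placeholder with the Jenkins build tag
sed -i "s/_LOCAL_/${BUILD_TAG}/" Module/version.py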

This question actually contains two separate questions; the rather philosophic choice of how to arrange setup requirements is unrelated to the installation error you are experiencing.
First, about the error: it looks like the project you are trying to install depends on another library (alabaster), of which you apparently also did an editable install using pip3 install -e . that points to this directory:
/home/ktietz/src/ci/alabaster_1611921544520/work
What the error tells you is that the directory where the editable install is supposed to be located no longer exists. You should only install your project itself in editable mode; the dependencies should be installed into a regular site-packages directory, i.e. without the -e option.
To clean up, I would suggest that you do the following:
# clean up references to the broken editable install
pip3 uninstall alabaster
# now do a proper non-editable install
pip3 install alabaster
Concerning the question of how to arrange setup requirements: you should primarily use the install_requires and extras_require options of setuptools:

# either in setup.py
setuptools.setup(
    install_requires=[
        'dep1>=1.2',
        'dep2>=2.4.1',
    ]
)

# or in setup.cfg
[options]
install_requires =
    dep1>=1.2
    dep2>=2.4.1

[options.extras_require]
extra_deps_a =
    dep3
    dep4>=4.2.3
extra_deps_b =
    dep5>=5.2.1
Optional requirements can be organised in named groups. To include such an extra group with the install, you can do pip3 install '.[extra_deps_a]'.
If you wish to define specific dependency environments with exact versions (e.g. for continuous integration), you may use requirements.txt files in addition, but the general dependency and version-constraint definitions should be done in setup.cfg or setup.py.
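For example, a CI requirements.txt might pin exact versions (the version numbers here are hypothetical) while the looser ranges above stay in setup.cfg:

# requirements.txt -- reproducible environment for CI
dep1==1.2.5
dep2==2.4.1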


pip and tox ignore full path dependencies, instead look for "best match" in pypi

This is an extension of SO setup.py ignores full path dependencies, instead looks for "best match" in pypi
I am trying to write setup.py to install a proprietary package from a .tar.gz file on an internal web site. Unfortunately for me, the proprietary package name duplicates a public package in the public PyPI, so I need to force installation of the proprietary package at a specific version. I'm building a Docker image from a Debian-Buster base image, so pip, setuptools and tox are all freshly installed; the image brings Python 3.8, and pip upgrades itself to version 21.2.4.
Solution 1 - dependency_links
I followed the instructions at the post linked above to put the prop package in install_requires and dependency_links. Here are the relevant lines from my setup.py:
install_requires=["requests", "proppkg==70.1.0"],
dependency_links=["https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"]
Installation is successful in Debian-Buster if I run python3 setup.py install in my package directory. I see the proprietary package get downloaded and installed.
Installation fails if I run pip3 install .; tox (version 3.24.4) fails similarly. In both cases, pip shows a message "Looking in indexes" and then fails with "ERROR: Could not find a version that satisfies the requirement".
Solution 2 - PEP 508
Studying the SO answer pip ignores dependency_links in setup.py, which states that dependency_links is deprecated, I started over and revised setup.py to have:
install_requires=[
    "requests",
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
Installation is successful in Debian-Buster if I run pip3 install . in my package directory. Pip shows a message "Looking in indexes" but still downloads and installs the proprietary package successfully.
Installation fails in Debian-Buster if I run python3 setup.py install in my package directory. I see these messages:
Searching for proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0
..
Reading https://pypi.org/simple/proppkg/
..
error: Could not find suitable distribution for Requirement.parse(...).
Tox also fails in this scenario as it installs dependencies.
Really speculating now, it almost seems like there's an ordering issue. Tox invokes pip like this:
python -m pip install --exists-action w .tox/.tmp/package/1/te-0.3.5.zip
In that output I see "Collecting proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0" as the first step. That install fails because it cannot import the package requests. Then tox continues collecting other dependencies. Finally, tox reports "Collecting requests" as its last step (and that succeeds). Do I have to worry about the ordering of install steps?
I'm starting to think that maybe the proprietary package is broken. I verified that the prop package setup.py has requests in its install_requires entry. Not sure what else to check.
Workaround solution
My workaround is to install the proprietary package in the Docker image as a separate step before installing my own package, simply by running pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz. The setup.py keeps the PEP 508 URL in install_requires. Then pip and tox find the proprietary package in the pip cache and work fine.
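In shell terms, the workaround boils down to two steps in the Docker build (a sketch of my setup, not the exact Dockerfile):

# step 1: pre-install the proprietary package from the internal URL
pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz
# step 2: install my package; the PEP 508 URL now resolves from the pip cache
pip3 install .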
Please suggest what to try with the latest pip and tox, or tell me if this is as good as it gets. Thanks in advance.
Update - add setup.py
Here's a (slightly sanitized) version of my package's setup.py:

from setuptools import setup, find_packages

def get_version():
    """
    read version string
    """
    version_globals = {}
    with open("te/version.py") as fp:
        exec(fp.read(), version_globals)
    return version_globals['__version__']

setup(
    name="te",
    version=get_version(),
    packages=find_packages(exclude=["tests.*", "tests"]),
    author="My Name",
    author_email="email@mycompany.com",
    description="My Back-End Server",
    entry_points={"console_scripts": [
        "te-be=te.server:main"
    ]},
    python_requires=">=3.7",
    install_requires=[
        "connexion[swagger-ui]",
        "Flask",
        "gevent",
        "redis",
        "requests",
        "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
    ],
    package_data={"te": ["openapi_te.yml"]},
    include_package_data=True,  # read MANIFEST.in
)

Python3. Setuptools. Adding a local package to an assembly

There is a locally built package (e.g. main-0.1.tar.gz). There is another package (for example base-0.1) that requires main-0.1 as a dependency.
The goal is that a subsequent installation of the base-0.1 package also installs the main-0.1 package.
That is, install_requires can only name packages that are available on PyPI; it is not clear how to add local packages to the build.
You can add the package main-0.1.tar.gz to the base-0.1 archive using MANIFEST.in (include main-0.1.tar.gz), but then dependency_links, for example, does not work correctly.
How do I add a local package to the build of another package and then have it installed along with that package, as if it were pulled from PyPI?
You might want to look at:
PEP 440 ("File URLs")
PEP 508
import setuptools

setuptools.setup(
    # [...]
    install_requires=[
        'main @ file:///path/to/main-0.1.tar.gz',
        # [...]
    ],
)
Alternatively (probably better actually), use some combination of pip install options:
pip install --no-index --find-links '/path/to/distributions' main base
Reference:
https://pip.pypa.io/en/stable/user_guide/#installing-from-local-packages
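For example (assuming both sdists have already been built and copied into the directory):

mkdir -p /path/to/distributions
cp main-0.1.tar.gz base-0.1.tar.gz /path/to/distributions/
pip install --no-index --find-links '/path/to/distributions' base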
Found a rough solution. I don't know how proper it is, but it works.
Add include main-0.1.tar.gz to MANIFEST.in.
In setup.py, at the end of the file (after the setup() call), add:

import os
import sys

if 'sdist' not in sys.argv[1]:
    os.system('pip install main-0.1.tar.gz')
The condition may be different if, for example, sdist is not used for building (python setup.py sdist). The main thing is to somehow detect that setup is running for a build rather than for an installation (a later pip install base-0.1.tar.gz).
This way, the local dependency package is copied into the archive of the package being built, distributed along with it, and installed in the same way.

Python package additional name

I'm building a new PyPI package based on an existing open-source project, using setuptools and adding some code modifications (they are not the same).
Example:
opensource-custom==2.13.1
Since this project requires dependencies that will look for opensource, what options can I pass to my setup.py when building my wheel files so that when I do pip freeze/pip list I can see both?
opensource-custom==2.13.1
opensource==2.13.0
An example of this scenario is intel-numpy: if you pip install it, it pulls in a copy of numpy.
>pip install intel-numpy
>pip freeze
icc-rt==2019.0
intel-numpy==1.15.1
intel-openmp==2019.0
mkl==2019.0
mkl-fft==1.0.6
mkl-random==1.0.1.1
numpy==1.15.1
tbb==2019.0
tbb4py==2019.0
It sounds like you want to make opensource a dependency of opensource-custom. To do this, you can specify the install_requires parameter in setup.py:
from setuptools import setup

setup(
    name='opensource-custom',
    install_requires=[
        'opensource',
    ],
    ...
)
See https://packaging.python.org/guides/distributing-packages-using-setuptools/#install-requires

I pip install a specific package with --global-option. How do I add that package as a dependency in setup.py?

I'm using this package called Dulwich. While developing, I install it like this:
pip install dulwich --global-option="--pure"
I want to add dulwich as a dependency to the setup.py file for my own package, but I'm not sure how to get it to use that --pure flag. If my dependencies just look like this:
DEPENDENCIES = [
    'dulwich',
]

setup(
    install_requires=DEPENDENCIES,
    ...
)
it will fail. I've tried all variations of adding --pure and --global-option, but they all fail with errors like:
'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Invalid requirement, parse error at "'--pure'"
How am I supposed to correctly add this package as a dependency? The end goal is to put my package on PyPI, so that when someone runs
pip install my_package
it automatically runs the equivalent of pip install dulwich --global-option="--pure" as well.

Using setuptools to install files to arbitrary locations

Is there a way to install files to arbitrary locations with setuptools? I've used Data Files with setuptools before, but those are typically installed inside the package directory. I need to install a plugin file that will be located in the install directory of another application.
It seems that setuptools has purposely made it difficult to install files outside of the package directory.
I instead included the plugin files as package data and used the Entry Points feature of setuptools to expose the install/uninstall functions for the plugin files I wanted to distribute.
setup(
    ...
    entry_points={
        'console_scripts': [
            'mypackage_install_plugins = mypackage:install_plugins',
            'mypackage_uninstall_plugins = mypackage:uninstall_plugins',
        ],
    }
)
I just added an additional step to the installation instructions: run the following command after installing the Python package:
$> mypackage_install_plugins
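The install/uninstall functions themselves are not shown above; here is a minimal sketch of what they might look like (the mypackage.plugins subpackage and the target directory are assumptions, not part of the original answer):

import shutil
from pathlib import Path
from importlib import resources  # Python 3.9+

# hypothetical install directory of the other application
TARGET_DIR = Path("/opt/otherapp/plugins")

def install_plugins():
    # copy every plugin file shipped as package data into the target directory
    for entry in resources.files("mypackage.plugins").iterdir():
        if entry.is_file() and entry.name != "__init__.py":
            (TARGET_DIR / entry.name).write_bytes(entry.read_bytes())

def uninstall_plugins():
    # remove the previously copied plugin files, ignoring missing ones
    for entry in resources.files("mypackage.plugins").iterdir():
        (TARGET_DIR / entry.name).unlink(missing_ok=True)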
The data_files attribute will allow you to specify full paths.
You could also do some shutil.copy magic in your setup.py, except don't.
Check out this answer:
Execute a Python script post install using distutils / setuptools
which shows how to add an arbitrary install script (Python, shell, whatever) that runs at the end of the install. It'll run whether you use setup.py install directly or a package manager like pip install. With this, you can add any files you want, anywhere you want.
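The linked approach boils down to subclassing the setuptools install command; a minimal sketch (the post-install body is illustrative):

from setuptools import setup
from setuptools.command.install import install

class PostInstallCommand(install):
    def run(self):
        install.run(self)  # perform the normal installation first
        # arbitrary post-install work goes here, e.g. copying plugin files
        print("running post-install step")

setup(
    # ...
    cmdclass={"install": PostInstallCommand},
)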
Unfortunately, I feel Brendan's pain: setuptools, not being a full package manager itself, does not handle uninstallation. Therefore, there's no way to register an uninstall hook that reverses what you did in the post-install script.
