Install python package from svn using dependency_links in setup.py - python

I am trying to install the hw3 package, which has a dependency on package hw2. My setup.py looks as follows:
setup(
    name='hw3',
    version='0.1',
    packages=find_packages(),
    install_requires=['hw2'],
    dependency_links=[
        r'svn+https://server.local/svn/Libraries/testPkg2/trunk#egg=hw2'
    ]
)
I get the following error when I run python setup.py install in Windows cmd:
svn: E170013: Unable to connect to a repository at URL 'svn+https://server.local/svn/Libraries/testPkg2/trunk'
svn: E125002: Undefined tunnel scheme 'https'
Alternatively, I have a requirements.txt which is as follows:
svn+https://server.local/svn/Libraries/testPkg2/trunk#egg=hw2
If I run pip install -r requirements.txt, it installs hw2 package successfully.
My svn version is
svn, version 1.9.7 (r1800392) compiled Aug 8 2017, 22:14:48 on
x86-microsoft-windows
How do I resolve this error? Thanks.
I am getting the same error for 'http' and 'svn'.
For 'ssh' it is:
svn: E170012: Can't create tunnel
svn: E720002: Can't create tunnel: The system cannot find the file specified.
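The 'Undefined tunnel scheme' message comes from the svn client itself: for a svn+X:// URL it looks up a tunnel named X under the [tunnels] section of its runtime configuration (%APPDATA%\Subversion\config on Windows), and no https or http tunnel is defined there. pip strips the svn+ prefix before shelling out to svn, which is why the requirements.txt route above works, while setuptools passes the URL through verbatim, as the error message shows. For illustration, the default config only ships a commented-out ssh tunnel entry:
[tunnels]
# ssh = $SVN_SSH ssh -q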

Maybe try it directly with the install_requires option (requires pip>=18.1):
setup(
    name='hw3',
    version='0.1',
    packages=find_packages(),
    install_requires=['hw2 @ svn+https://server.local/svn/Libraries/testPkg2/trunk#egg=hw2'],
)
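Direct references like this are resolved by pip (18.1 or newer, as noted), not by easy_install, so install the package through pip rather than python setup.py install:
python -m pip install .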
See also this answer to a related question https://stackoverflow.com/a/54216163/13835019.

Related

pip: Can't install dependencies from built poetry wheel if pyproject.toml contains deps from file

I'm using poetry in my project, so my pyproject.toml contains all my dependencies. My goal is to build a wheel from the current project and install it in another virtualenv using pip. But I'm facing an error:
pip._vendor.pkg_resources.RequirementParseError: Invalid URL: artifacts/pp-0.358.0.tar.gz
I guess the problem may be the relative path, but I don't know how to fix it, because when dependencies are installed this package is taken from the tar.gz file located at that path.
My pyproject.toml:
[tool.poetry]
name = "package_name"
version = "0.0.0"
description = ""
readme = "README.md"
packages = [
    {include = "package1"},
    {include = "package2"},
    {include = "package3"},
]

[tool.poetry.dependencies]
python = "~3.8"
pp = {file = "artifacts/pp-0.358.0.tar.gz"}
...some other deps
To build a wheel I'm using the following command:
poetry build --format wheel
Installing into another virtualenv with the following command:
pip3 install /Users/av/Projects/package_name/dist/package_name-0.0.0-py3-none-any.whl
Is there any way to fix this? If I comment out the pp line in the dependencies, installation works fine.
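One hedged sketch of a possible direction (not a confirmed fix): poetry also accepts url sources, which end up in the wheel metadata as an absolute, pip-resolvable reference instead of a bare relative path. The host below is hypothetical:
pp = {url = "https://artifacts.example.com/pp-0.358.0.tar.gz"}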

setup.py file using requirements.txt

I've read a discussion suggesting that setup.py read the requirements.txt, to ensure the correct installation is available on multiple deployments without having to maintain both a requirements.txt and the list in setup.py.
However, when I'm trying to do an installation via pip install -e ., I get an error:
Obtaining file:///Users/myuser/Documents/myproject
Processing /home/ktietz/src/ci/alabaster_1611921544520/work
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory:
'/System/Volumes/Data/home/ktietz/src/ci/alabaster_1611921544520/work'
It looks like pip is trying to look for packages that are available on PyPI (alabaster) on my local machine. Why? What am I missing here? Why isn't pip looking for the required packages on the PyPI server?
I have done it before the other way around, maintaining the setup file and not the requirements file. For the requirements file, just save it as:
*
and for setup.py, do:
from setuptools import setup, find_packages

try:
    from Module.version import __version__
except ModuleNotFoundError:
    exec(open("Module/version.py").read())

setup(
    name="Package Name",
    version=__version__,
    packages=find_packages(),
    package_data={p: ["*"] for p in find_packages()},
    url="",
    license="",
    install_requires=[
        "numpy",
        "pandas"
    ],
    python_requires=">=3.8.0",
    author="First.Last",
    author_email="author@company.com",
    description="Description",
)
For reference, my version.py script looks like:
__build_number__ = "_LOCAL_"
__version__ = f"1.0.{__build_number__}"
Jenkins then replaces the build number with a tag.
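A minimal sketch of that Jenkins step, assuming a simple in-place substitution (the sed call and BUILD_TAG variable are illustrative, not the answerer's actual pipeline):
sed -i "s/_LOCAL_/${BUILD_TAG}/" Module/version.py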
This question really contains two separate questions, for the rather philosophic choice of how to arrange setup requirements is actually unrelated to the installation error that you are experiencing.
First about the error: It looks like the project you are trying to install depends on another library (alabaster) of which you apparently also did an editable install using pip3 install -e . that points to this directory:
/home/ktietz/src/ci/alabaster_1611921544520/work
What the error tells you is that the directory where the install is supposed to be located does not exist anymore. You should only install your project itself in editable mode; the dependencies should be installed into a classical system directory, i.e. without the option -e.
To clean up, I would suggest that you do the following:
# clean up references to the broken editable install
pip3 uninstall alabaster
# now do a proper non-editable install
pip3 install alabaster
Concerning the question how to arrange setup requirements, you should primarily use the install_requires and extras_require options of setuptools:
# either in setup.py
setuptools.setup(
    install_requires=[
        'dep1>=1.2',
        'dep2>=2.4.1',
    ]
)
# or in setup.cfg
[options]
install_requires =
    dep1>=1.2
    dep2>=2.4.1

[options.extras_require]
extra_deps_a =
    dep3
    dep4>=4.2.3
extra_deps_b =
    dep5>=5.2.1
Optional requirements can be organised in groups. To include such an extra group with the install, you can do pip3 install .[extra_deps_name].
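With the extras defined above, for instance:
pip3 install ".[extra_deps_a]"
(Quoting the argument keeps shells such as zsh from interpreting the square brackets as a glob pattern.)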
If you wish to define specific dependency environments with exact versions (e.g. for Continuous Integration), you may use requirements.txt files in addition, but the general dependency and version constraint definitions should be done in setup.cfg or setup.py.
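For example, such a pinned requirements.txt can be captured from a known-good environment (the version pins below are illustrative):
pip3 freeze > requirements.txt
# requirements.txt then contains exact versions, e.g.
# dep1==1.2.0
# dep2==2.4.1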

pip and tox ignore full path dependencies, instead look for "best match" in pypi

This is an extension of SO setup.py ignores full path dependencies, instead looks for "best match" in pypi
I am trying to write a setup.py to install a proprietary package from a .tar.gz file on an internal web site. Unfortunately for me, the prop package name duplicates a public package on the public PyPI, so I need to force installation of the proprietary package at a specific version. I'm building a docker image from a Debian-Buster base image, so pip, setuptools and tox are all freshly installed; the image provides Python 3.8, and pip upgrades itself to version 21.2.4.
Solution 1 - dependency_links
I followed the instructions at the post linked above to put the prop package in install_requires and dependency_links. Here are the relevant lines from my setup.py:
install_requires=["requests", "proppkg==70.1.0"],
dependency_links=["https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"]
Installation is successful in Debian-Buster if I run python3 setup.py install in my package directory. I see the proprietary package get downloaded and installed.
Installation fails if I run pip3 install .; tox (version 3.24.4) fails similarly. In both cases, pip shows a message "Looking in indexes", then fails with "ERROR: Could not find a version that satisfies the requirement".
Solution 2 - PEP 508
Studying SO answer pip ignores dependency_links in setup.py which states that dependency_links is deprecated, I started over, revised setup.py to have:
install_requires=[
    "requests",
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
Installation is successful in Debian-Buster if I run pip3 install . in my package directory. Pip shows a message "Looking in indexes" but still downloads and installs the proprietary package successfully.
Installation fails in Debian-Buster if I run python3 setup.py install in my package directory. I see these messages:
Searching for proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0
..
Reading https://pypi.org/simple/proppkg/
..
error: Could not find suitable distribution for Requirement.parse(...).
Tox also fails in this scenario as it installs dependencies.
Really speculating now, it almost seems like there's an ordering issue. Tox invokes pip like this:
python -m pip install --exists-action w .tox/.tmp/package/1/te-0.3.5.zip
In that output I see "Collecting proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0" as the first step. That install fails because it cannot import the requests package. Then tox continues collecting other dependencies. Finally tox reports "Collecting requests" as its last step (and that succeeds). Do I have to worry about the ordering of install steps?
I'm starting to think that maybe the proprietary package is broken. I verified that the prop package setup.py has requests in its install_requires entry. Not sure what else to check.
Workaround solution
My workaround is to install the proprietary package in the docker image as a separate step, before I install my own package, just by running pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz. The setup.py keeps the PEP 508 URL in install_requires. Then pip and tox find the prop package in the pip cache, and work fine.
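As Dockerfile steps, that workaround looks roughly like this (the base image and apt line are assumptions based on the question's description; the two pip3 commands are the actual workaround):
FROM debian:buster
RUN apt-get update && apt-get install -y python3-pip
# pre-install the proprietary package so pip later finds it already satisfied
RUN pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz
# now install the project; its PEP 508 reference is satisfied from the cache
COPY . /app
RUN pip3 install /app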
Please suggest what to try with the latest pip and tox, or tell me if this is as good as it gets. Thanks in advance.
Update - add setup.py
Here's a (slightly sanitized) version of my package's setup.py
from setuptools import setup, find_packages

def get_version():
    """
    read version string
    """
    version_globals = {}
    with open("te/version.py") as fp:
        exec(fp.read(), version_globals)
    return version_globals['__version__']

setup(
    name="te",
    version=get_version(),
    packages=find_packages(exclude=["tests.*", "tests"]),
    author="My Name",
    author_email="email@mycompany.com",
    description="My Back-End Server",
    entry_points={"console_scripts": [
        "te-be=te.server:main"
    ]},
    python_requires=">=3.7",
    install_requires=[
        "connexion[swagger-ui]",
        "Flask",
        "gevent",
        "redis",
        "requests",
        "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
    ],
    package_data={"te": ["openapi_te.yml"]},
    include_package_data=True,  # read MANIFEST.in
)

Download dependencies declared in pyproject.toml using Pip

I have a Python project that doesn't contain requirements.txt.
But it has a pyproject.toml file.
How can I download the packages (dependencies) required by this Python project and declared in pyproject.toml, using the pip package manager (instead of the build tool Poetry)?
So instead of pip download -r requirements.txt, something like pip download -r pyproject.toml.
Here is an example of the .toml file:
[build-system]
requires = [
    "flit_core >=3.2,<4",
]
build-backend = "flit_core.buildapi"

[project]
name = "aedttest"
authors = [
    {name = "Maksim Beliaev", email = "beliaev.m.s@gmail.com"},
    {name = "Bo Yang", email = "boy@kth.se"},
]
readme = "README.md"
requires-python = ">=3.7"
classifiers = ["License :: OSI Approved :: MIT License"]
dynamic = ["version", "description"]
dependencies = [
    "pyaedt==0.4.7",
    "Django==3.2.8",
]

[project.optional-dependencies]
test = [
    "black==21.9b0",
    "pre-commit==2.15.0",
    "mypy==0.910",
    "pytest==6.2.5",
    "pytest-cov==3.0.0",
]
deploy = [
    "flit==3.4.0",
]
To install the core dependencies you run:
pip install .
If you need the test (develop) environment (we use test because that is the name defined in the .toml file; you can use any):
pip install .[test]
To install from a wheel:
pip install C:\git\aedt-testing\dist\aedttest-0.0.1-py3-none-any.whl[test]
pip supports installing pyproject.toml dependencies natively.
As of version 10.0, pip supports projects declaring dependencies that are required at install time using a pyproject.toml file, in the form described in PEP 518. When building a project, pip will install the required dependencies locally, and make them available to the build process. Furthermore, from version 19.0 onwards, pip supports projects specifying the build backend they use in pyproject.toml, in the form described in PEP 517.
From the project's root, use pip's local project install:
python -m pip install .
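If the goal is only to download the dependencies rather than install them, pip download should accept the same local-project form, since it performs the same resolution as pip install (the destination directory here is illustrative):
python -m pip download . -d ./deps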
You can export the dependencies to a requirements.txt and use pip download afterwards:
poetry export -f requirements.txt > requirements.txt
pip download -r requirements.txt

Why does setuptools not understand git+https URLs?

According to the Dependency section of the setuptools manual, git repository URLs can be specified in the dependency_links argument to setup as git+URL. Yet,
cd /tmp
mkdir py-test
cd py-test
touch __init__.py
and creation of a setup.py file with
from setuptools import setup, find_packages
from pkg_resources import parse_version

setup(
    name="py-test",
    version="1.0",
    packages=["."],
    dependency_links=[
        "git+https://github.com/wxWidgets/wxPython.git"
    ],
    install_requires=["wxPython"],
)
causes the error Download error on git+https://github.com/wxWidgets/wxPython.git: unknown url type: git+https -- Some packages may not be found! when I run python setup.py build && sudo python setup.py install.
Installing the python-setuptools-git package doesn't help.
I'm using setuptools 18.2 with python 2.7 on Ubuntu 15.04.
From the setuptools docs:
In the case of a VCS checkout, you should also append #egg=project-version in order to identify for what package that checkout should be used
So the fix is just to append the #egg=wxPython fragment onto the end:
dependency_links = [
    "git+https://github.com/wxWidgets/wxPython.git#egg=wxPython"
]
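Note that on current tooling this only goes so far: pip 19.0 dropped support for dependency_links entirely, so on modern setups the PEP 508 direct-reference form (as used in the answers above) is the equivalent, e.g.:
install_requires=[
    "wxPython @ git+https://github.com/wxWidgets/wxPython.git"
]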
