# setup.py
from setuptools import setup
from setuptools.command.install import install
from subprocess import check_call

class CustomInstall(install):
    def run(self):
        check_call("./build.sh")
        install.run(self)

setup(
    name='customlib',
    packages=['customlib'],
    version='0.0.1',
    ...
    cmdclass={'install': CustomInstall},
)
build.sh runs make and make install, which takes more than 10 minutes to finish.
Is there a PyPI way to "package" the output of build.sh to speed up the pip install process?
Use wheel. A wheel is a great standard format for passing around Python packages, and it can contain C code compiled for various architectures. PyPI supports uploading wheels for your project, and pip will download them when available.
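As a rough sketch using the names from the question (the exact wheel filename depends on your Python version and platform tag, and the outputs of build.sh must be included as package data to land in the wheel):

$ pip install wheel twine
$ python setup.py bdist_wheel    # runs build.sh once, at build time
$ twine upload dist/customlib-0.0.1-*.whl

Users who run pip install customlib then get the prebuilt wheel when one matches their platform, skipping the long build entirely.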
Very useful docs can be found here: https://packaging.python.org/tutorials/distributing-packages/#packaging-your-project
Related
This is an extension of the SO question "setup.py ignores full path dependencies, instead looks for 'best match' in pypi".
I am trying to write setup.py to install a proprietary package from a .tar.gz file on an internal web site. Unfortunately for me, the proprietary package name duplicates a public package on the public PyPI, so I need to force installation of the proprietary package at a specific version. I'm building a Docker image from a Debian Buster base image, so pip, setuptools and tox are all freshly installed; the image provides Python 3.8 and pip upgrades itself to version 21.2.4.
Solution 1 - dependency_links
I followed the instructions at the post linked above to put the prop package in install_requires and dependency_links. Here are the relevant lines from my setup.py:
install_requires=["requests", "proppkg==70.1.0"],
dependency_links=["https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"]
Installation is successful in Debian-Buster if I run python3 setup.py install in my package directory. I see the proprietary package get downloaded and installed.
Installation fails if I run pip3 install .; tox (version 3.24.4) fails similarly. In both cases, pip shows a message "Looking in indexes", then fails with "ERROR: Could not find a version that satisfies the requirement".
Solution 2 - PEP 508
Studying the SO answer "pip ignores dependency_links in setup.py", which states that dependency_links is deprecated, I started over and revised setup.py to have:
install_requires=[
    "requests",
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
Installation is successful in Debian-Buster if I run pip3 install . in my package directory. Pip shows a message "Looking in indexes" but still downloads and installs the proprietary package successfully.
Installation fails in Debian-Buster if I run python3 setup.py install in my package directory. I see these messages:
Searching for proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0
..
Reading https://pypi.org/simple/proppkg/
..
error: Could not find suitable distribution for Requirement.parse(...).
Tox also fails in this scenario when it installs dependencies.
Really speculating now, it almost seems like there's an ordering issue. Tox invokes pip like this:
python -m pip install --exists-action w .tox/.tmp/package/1/te-0.3.5.zip
In that output I see "Collecting proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0" as the first step. That install fails because it cannot import the requests package. Then tox continues collecting other dependencies. Finally tox reports as its last step "Collecting requests" (and that succeeds). Do I have to worry about the ordering of install steps?
I'm starting to think that maybe the proprietary package is broken. I verified that the prop package setup.py has requests in its install_requires entry. Not sure what else to check.
Workaround solution
My workaround is installing the proprietary package in the Docker image as a separate step before I install my own package, just by running pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz. The setup.py keeps the PEP 508 URL in install_requires. Then pip and tox find the proprietary package in the pip cache and work fine.
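For reference, the workaround boils down to two separate install steps, roughly:

$ pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz
$ pip3 install .    # proppkg is now already satisfied, so resolution succeeds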
Please suggest what to try for the latest pip and tox, or if this is as good as it gets, thanks in advance.
Update - add setup.py
Here's a (slightly sanitized) version of my package's setup.py
from setuptools import setup, find_packages

def get_version():
    """
    read version string
    """
    version_globals = {}
    with open("te/version.py") as fp:
        exec(fp.read(), version_globals)
    return version_globals['__version__']

setup(
    name="te",
    version=get_version(),
    packages=find_packages(exclude=["tests.*", "tests"]),
    author="My Name",
    author_email="email@mycompany.com",
    description="My Back-End Server",
    entry_points={"console_scripts": [
        "te-be=te.server:main"
    ]},
    python_requires=">=3.7",
    install_requires=[
        "connexion[swagger-ui]",
        "Flask",
        "gevent",
        "redis",
        "requests",
        "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
    ],
    package_data={"te": ["openapi_te.yml"]},
    include_package_data=True,  # read MANIFEST.in
)
I want to make a distributable package, and my package depends on an OS package.
Here is what I want to install:
import subprocess
import sys

def install_libmagic():
    if sys.platform == 'darwin':
        subprocess.run(['brew', 'install', 'libmagic'])
    elif sys.platform == 'linux':
        subprocess.run(['apt-get', 'update'])
        subprocess.run(['apt-get', 'install', '-y', 'libmagic1'])
    else:
        raise Exception(f'Unknown system: {sys.platform}, cannot install libmagic')
I want this code to be executed only when somebody calls:
pip install mypackage
I don't want it to be executed when I run: python setup.py bdist_wheel
How can I achieve this?
I tried this:
setup(
    ...
    install_requires=install_libmagic(),
)
Also tried to override install command:
from setuptools.command.install import install

class MyInstall(install):
    def run(self):
        install_libmagic()
        install.run(self)

setup(
    ...
    cmdclass={'install': MyInstall}
)
But the function was executed during python setup.py bdist_wheel, which is not what I am trying to achieve.
I think you're mixing up the behaviors of built distributions (wheels) and source distributions.
If your goal is to run some subprocesses at install time, then you can't do this with a built distribution. A built distribution executes no Python code at install time. It only executes setup.py at build time, which is why you're seeing your functions executed when you call python setup.py bdist_wheel.
On the other hand, a source distribution (python setup.py sdist) does execute the setup.py file at both build time and install time (roughly the same as python setup.py install) and would give you the behavior you're looking for.
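As a rough sketch (package name from the question, archive name assumed):

$ python setup.py sdist                                    # build only a source distribution
$ pip install --no-binary :all: dist/mypackage-*.tar.gz    # setup.py runs on the target machine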
However, as the comments have already mentioned, this is going to be very fragile and not very user-friendly or portable. What you're describing is really a distro/OS package that contains some Python module, and you'd probably be better off with that instead.
I'm building a new PyPI package based on an existing open source project using setuptools, adding some code modifications (they are not the same).
Example:
opensource-custom==2.13.1
Since this project requires dependencies that will look for opensource, what options can I pass to my setup.py when building my wheel files so that when I do pip freeze/pip list I can see both?
opensource-custom==2.13.1
opensource==2.13.0
An example of this scenario is intel-numpy: if you do a pip install of it, it will pull in its own copy of numpy.
>pip install intel-numpy
>pip freeze
icc-rt==2019.0
intel-numpy==1.15.1
intel-openmp==2019.0
mkl==2019.0
mkl-fft==1.0.6
mkl-random==1.0.1.1
numpy==1.15.1
tbb==2019.0
tbb4py==2019.0
It sounds like you want to make opensource a dependency of opensource-custom. To do this, you can specify the install_requires parameter in setup.py:
from setuptools import setup

setup(
    name='opensource-custom',
    install_requires=[
        'opensource',
    ],
    ...
)
See https://packaging.python.org/guides/distributing-packages-using-setuptools/#install-requires
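Assuming opensource 2.13.0 is the version PyPI currently serves, installing the custom package then lists both:

$ pip install opensource-custom
$ pip freeze
opensource==2.13.0
opensource-custom==2.13.1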
I use pip with setuptools to install a package.
I want pip to copy some resource files to, say, /etc/my_package.
My setup.py looks like this:
setup(
    ...
    data_files=[('/etc/my_package', ['config.yml'])]
)
When running pip install, the file ends up in
~/.local/lib/python3.5/site-packages/etc/my_package/config.yml
instead of /etc/my_package.
What am I doing wrong?
(pip version 9.0.1)
Short answer: use pip install --no-binary :all: to install your package.
I struggled with this for a while and eventually figured out that there is some weirdness/inconsistency in how data_files are handled between binary wheels and source distributions. Specifically, there is a bug with wheels that makes all paths in data_files relative to the install location (see https://github.com/pypa/wheel/issues/92 for an issue tracking this).
"Thats fine", you might say, "but I'm not using a wheel!". Not so fast! It turns out recent versions of pip (I am working with 9.0.1) will try to compile a wheel even from a source distribution. For example, if you have a package my_package you can see this doing something like
$ python setup.py sdist # create source tarball as dist/my_package.tar.gz
[...]
$ pip install dist/my_package.tar.gz # install the generated source
[...]
Building wheels for collected packages: my_package
Running setup.py bdist_wheel for my_package ... done
pip tries to be helpful and build a wheel to install from and cache for later. This means you will run into the above bug even though in theory you are not using bdist_wheel yourself. You can get around this by running python setup.py install directly from the package source folder. This avoids the building and caching of built wheels that pip will try to do but is majorly inconvenient when the package you want is already on PyPI somewhere. Fortunately pip offers an option to explicitly disable binaries.
$ pip install --no-binary :all: my_package
[...]
Skipping bdist_wheel for my_package, due to binaries being disabled for it.
Installing collected packages: my_package
Running setup.py install for my_package ... done
Successfully installed my_package-0.1.0
Using the --no-binary option prevents wheel building and lets us reference absolute paths in our data_files paths again. For the case where you are installing a lot of packages together and want to selectively disable wheels you can replace :all: with a comma separated list of packages.
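For example, to disable wheels for just my_package while leaving other packages alone:

$ pip install --no-binary my_package my_package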
It seems that data_files doesn't support absolute paths; pip will prepend sys.prefix to "/etc/my_package". If you want to put config.yml into .../site-packages/my_package, please try:
import os
import sys
from distutils.sysconfig import get_python_lib

relative_site_packages = get_python_lib().split(sys.prefix + os.sep)[1]
data_files_relative_path = os.path.join(relative_site_packages, "my_package")

setup(
    ...
    data_files=[(data_files_relative_path, ['config.yml'])]
)
I ended up writing an init() function that installs the config file on first run instead of creating it during the installation:
import os
from os import path
from shutil import copyfile

import pkg_resources

# config_dir and config_file are defined elsewhere in the module
def init():
    try:
        if not path.isdir(config_dir):
            os.mkdir(config_dir)
        copyfile(pkg_resources.resource_filename(
            __name__, "default_config.yml"), config_file)
        print("INFO: config file created.")
    except IOError as ex:
        print("ERROR: could not create config directory: " + str(ex))

if __name__ == "__main__":
    init()
    main()
When I install pytz via setuptools, iterating over pytz.all_timezones takes multiple seconds. Someone suggested running pip unzip pytz, and that fixes the performance problem. Now I want to make setuptools install pytz uncompressed any time someone installs my package.
Can I configure setuptools to always unzip a particular dependency of my package?
$ virtualenv ve2.7
$ source ve2.7/bin/activate
(ve2.7)$ python setup.py install
(ve2.7)$ python slowpytz.py
2.62620520592s
(ve2.7)$ pip unzip pytz
DEPRECATION: 'pip zip' and 'pip unzip' are deprecated, and will be removed in a future release.
Unzipping pytz (in ./ve2.7/lib/python2.7/site-packages/pytz-2014.7-py2.7.egg)
(ve2.7)$ python slowpytz.py
0.0149159431458s
setup.py
from setuptools import setup
setup(name='slowpytz', version='0.0.1', install_requires=['pytz==2014.7'])
slowpytz.py
import pytz
import time
start = time.time()
zones = list(pytz.all_timezones)
print(str(time.time() - start) + 's')
There's no way that I know of to force unzipping of your dependencies in all cases. Some things that fall slightly short of that, but might still be useful:
You could submit a bug report for pytz to set zip_safe=False in its setup.py, using performance data as a justification for the change.
Failing that, you could fork pytz, add zip_safe=False (a minimal sketch follows below), and have your package depend on your fork. (Not a great option.)
You could recommend that users always install your package with pip, which always installs everything unzipped (including dependencies), rather than easy_install or python setup.py install.
If your users must use easy_install, you can recommend they use easy_install -Z, which forces unzipped installation.
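For reference, here is a minimal sketch of the zip_safe change mentioned above, as it would appear in the fork's setup.py (name and version taken from this question):

from setuptools import setup

setup(
    name='pytz',
    version='2014.7',
    zip_safe=False,  # tells setuptools the package cannot run zipped, so easy_install unzips it
)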