When you tell pip to install multiple packages simultaneously, it looks up all child dependencies and installs the most recent version that is allowed by all parent package restrictions.
For example, if packageA requires child>=0.9 and packageB requires child<=1.1, then pip will install child==1.1.
I'm trying to write a script to scan a requirements.txt and pin all unlisted child package versions.
One way to do this would be to install everything from a requirements.txt file via pip install -r requirements.txt, then parse the output of pip freeze and strip out all the packages that are already listed in my requirements.txt file. Everything left should be the child packages, pinned at the versions pip calculated.
However, this requires creating a Python virtual environment and installing all the packages, which can take a bit of time. Is there any option in pip to do the version calculation without actually installing the packages? If not, is there a pip-like tool that provides this?
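For concreteness, a minimal sketch of that freeze-and-diff idea, assuming requirements.txt contains plain name==version lines and the script runs inside the virtualenv where everything was installed:
import subprocess

# capture everything pip installed into this virtualenv
frozen = subprocess.run(
    ["pip", "freeze"], capture_output=True, text=True, check=True
).stdout.splitlines()

# names listed explicitly in requirements.txt
with open("requirements.txt") as f:
    listed = {line.split("==")[0].strip().lower()
              for line in f if line.strip() and not line.startswith("#")}

# whatever remains is a child dependency, pinned at the version pip chose
children = [line for line in frozen if line.split("==")[0].lower() not in listed]
print("\n".join(children))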
You can read the package requirements for PyPI-hosted packages from the PyPI JSON API with requests:
Python 3 script:
import csv
import requests

your_package = 'packageA'  # hypothetical package name from the question
pypi_url = 'https://pypi.org/pypi/' + your_package + '/json'
data = requests.get(pypi_url).json()
# requires_dist is None when the package declares no dependencies
reqs = data['info']['requires_dist'] or []
print(reqs)
print('Writing requirements for ' + your_package + ' to requirements.csv')
with open('requirements.csv', 'w', newline='') as f:
    w = csv.writer(f)
    # each requirement specifier is split on commas (e.g. version ranges)
    w.writerows([x.split(',') for x in reqs])
If I understand your question correctly, you're trying to install a particular version of a package. One way to do this is to pin the version in your requirements.txt file; you can also install it directly by giving pip install the package name plus the version, for example:
pip install WhatsApp==3.5
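The same pinning works inside requirements.txt; a minimal example using the hypothetical names from the question:
packageA==1.2   # hypothetical pinned versions
child==1.1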
I am enrolled in a machine learning competition and, for some reason, the submission is not a CSV file, but rather the code in Python.
In order to make it run, they asked the participants to create another file called install.py to automatically install all the packages used.
I need to install multiple packages (keras, numpy, etc.).
For each package, I have to use the command os.system. I have no idea what it does, and this is the only information that I have.
Yes, this type of question was asked before, but not with several packages and this specific os.system line.
I don't know if this might work for your specific issues. Give it a go.
import os

packages = ["keras", "sklearn"]  # etc.
for package in packages:
    os.system("pip install " + package)  # installs one particular package per iteration
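A slightly more robust sketch, assuming you want the officially supported route: pip's documentation recommends running pip as a subprocess of the current interpreter rather than through os.system:
import subprocess
import sys

packages = ["keras", "sklearn"]  # same hypothetical list as above
for package in packages:
    # use the interpreter running this script, so the matching pip is invoked
    subprocess.check_call([sys.executable, "-m", "pip", "install", package])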
The way I recommend doing this is to import pip as a module, as follows (untested):
import pip

def install(package):
    # newer pip versions moved main() under pip._internal
    if hasattr(pip, 'main'):
        pip.main(['install', package])
    else:
        pip._internal.main(['install', package])

packages = []  # Add your packages as strings
for package in packages:
    install(package)
I used this question for most of the code.
You could create a requirements.txt file with all of your package requirements.
import os
os.system("pip install -r requirements.txt")
What I should have:
I want my Yocto project to build a package for my Python project with all dependencies bundled inside. The project has to run out of the box on the resulting read-only sdcard image.
It should simply install all requirements, at the required versions, into the package.
What I tried without luck:
Calling pip in do_install():
"pip/pip3 is not found", even it's in RDEPENDS.
Anyway, I really prefer this way.
With inherit pypi:
When trying with inherit pypi, it also tries to fetch my local sources (my Python project) from PyPI. And I always have to copy the requirements into the recipe. This is not my preferred way.
Calling pip in pkg_postinst():
It tries to install the modules on first start and fails, because the system has no internet connection and is read-only. The image must run out of the box without any installation at first boot, so this approach does its work too late.
What I want to get around:
There should be no need to change anything in the recipes when something changes in requirements.txt.
Background information
I'm working with Yocto Rocko in a Linux environment.
On the host system, there is no pip installed. I want to use the one installed via RDEPENDS on the target system.
Building the Package (only this recipe) with:
bitbake myproject
Building the whole sdcard image:
bitbake myProject-image-base
The recipe:
myproject.bb (relevant lines):
RDEPENDS_${PN} = "python3 python3-pip"
APP_SOURCES_DIR := "${@os.path.abspath(os.path.dirname(d.getVar('FILE', True)) + '/../../../../app-sources')}"
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI = " \
file://${APP_SOURCES_DIR}/myProject \
...
"
inherit allarch # tried also with pypi and setuptools3 for the pypi way.
do_install() { # Line 116
    install -d -m 0755 ${D}/myProject
    cp -R --no-dereference --preserve=mode,links -v ${APP_SOURCES_DIR}/myProject/* ${D}/myProject/
    pip3 install -r ${APP_SOURCES_DIR}/myProject/requirements.txt
    # Tried also: python ${APP_SOURCES_DIR}/myProject/setup.py install
}
# Tried also this, but it's no option because the data MUST be included in the Package:
# pkg_postinst_${PN}() {
#     #!/bin/sh -e
#     pip3 install -r /myProject/requirements.txt
# }
FILES_${PN} = "/myProject/*"
Resulting Errors:
Expected to install the listed modules from requirements.txt into the myProject package, so that the python app will run directly on the resulting readonly sdcard image.
With pip, I get:
| /*/tmp/work/*/myProject/0.1.0-r0/temp/run.do_install: 116: pip3: not found
| WARNING: exit code 127 from a shell command.
| ERROR: Function failed: do_install ...
When using pypi:
404 Not Found
ERROR: myProject-0.1.0-r0 do_fetch: Fetcher failure for URL: 'https://files.pythonhosted.org/packages/source/m/myproject/myproject-0.1.0.tar.gz'. Unable to fetch URL from any source.
=> But it should not fetch myProject, since it is local only and not hosted anywhere remote.
Any ideas? What would be the best way to reach to a ready to use sdcard image without the need to change recipes when requirements.txt changes?
You should use RDEPENDS_${PN} in the recipe to take care of your app's runtime dependencies.
For example, assuming your Python app needs the aws-iot-device-sdk-python module, you should add it to RDEPENDS in the recipe. In your case, it would look like this:
RDEPENDS_${PN} = "python3 \
python3-pip \
python3-aws-iot-device-sdk-python \
"
Here's the link listing the Python modules provided by the OpenEmbedded meta-python layer:
https://layers.openembedded.org/layerindex/branch/master/layer/meta-python/
If the modules you need are not there, you will likely need to create recipes for the modules.
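For illustration, a minimal sketch of what such a recipe could look like, assuming a hypothetical module named foo hosted on PyPI (name, version, license, and checksum are placeholders to fill in):
# python3-foo_1.0.bb -- hypothetical recipe for a PyPI module named foo
SUMMARY = "foo Python module"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://LICENSE;md5=<fill in>"

PYPI_PACKAGE = "foo"
SRC_URI[sha256sum] = "<fill in>"

inherit pypi setuptools3
The pypi class derives the download URL from PYPI_PACKAGE, and setuptools3 handles building and installing the module into its own package.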
My newest findings:
Yocto/BitBake seems to suppress interpreting the requirements, because honoring them would break automatic dependency resolution, which could lead to conflicts.
Reason: the modules required by setup.py would not be stored as independent packages, but as part of my package. So BitBake would not know about these modules, which could conflict with other packages that require the same modules in different versions.
What was in my recipe:
MY_INSTALL_ARGS = "--root=${D} \
--prefix=${prefix} \
--install-lib=${PYTHON_SITEPACKAGES_DIR} \
--install-data=${datadir}"
do_install() {
    PYTHONPATH=${PYTHON_SITEPACKAGES_DIR} \
        ${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py install ${MY_INSTALL_ARGS}
}
If I execute this outside of BitBake as python3 setup.py install ${MY_INSTALL_ARGS}, everything is installed correctly, but inside the recipe no requirements are installed.
There is a parameter --no-deps, but I didn't find where it is set.
I think there could be one way to get the requirements from setup.py honored:
Find out where --no-deps is set for easy_install in the openembedded/poky layers and disable it.
Create a separate PYTHON_SITEPACKAGES_DIR.
Install this separate PYTHON_SITEPACKAGES_DIR in e.g. the home directory as a private Python modules dir.
This way, no Python module would trigger a conflict.
Since I do not have the time to experiment with this, I'll now define one recipe per requirement.
Have you tried installing pip?
Debian:
apt-get install python-pip
apt-get install python3-pip
CentOS:
yum install python-pip
For some reason, I cannot run pip install %CD%\*.whl, as I then get:
Requirement 'C:\\Users\\fredrik\\Downloads\\*.whl' looks like a filename, but the file does not exist
*.whl is not a valid wheel filename.
On macOS (and I believe on Linux), I can do this without issues:
pip install *.whl
Processing ./certifi-2017.11.5-py2.py3-none-any.whl
Processing ./chardet-3.0.4-py2.py3-none-any.whl
Processing ./idna-2.6-py2.py3-none-any.whl
Processing ./requests-2.18.4-py2.py3-none-any.whl
Processing ./urllib3-1.22-py2.py3-none-any.whl
...
Why is there a difference in this behavior between the platforms?
Is there a preferred way to make this (pip install *.whl) work on Windows?
The difference exists because POSIX shells (macOS, Linux) expand *.whl into a list of matching files before pip ever sees the arguments, while cmd.exe on Windows passes the wildcard through literally and pip does no globbing itself.
If you know the package name, e.g. foo, you can use this:
python -m pip install --find-links=C:\Users\fredrik\Downloads foo
This will find any foo wheel in that directory, e.g. foo-X.Y.Z-cp37-cp37m-win_amd64.whl, etc.
If you don't know the package names, you can try:
FOR %i in (C:\Users\fredrik\Downloads\*.whl) DO python -m pip install %i
(Use %%i in both places if you put this in a batch file.)
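A cross-platform alternative is to expand the wildcard yourself; a minimal Python sketch, assuming the same download path as above:
import glob
import subprocess
import sys

# cmd.exe passes *.whl through literally, so expand the wildcard here instead
for wheel in glob.glob(r"C:\Users\fredrik\Downloads\*.whl"):
    subprocess.check_call([sys.executable, "-m", "pip", "install", wheel])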
I have a directory structure like this, with foobar and alphabet data directories alongside the code in something.py:
\mylibrary
    \packages
        \foobar
            foo.zip
            bar.zip
        \alphabet
            abc.zip
            xyz.zip
    something.py
    setup.py
The goal is that users can pip install the library like this:
pip install mylibrary[alphabet]
That should include only the data from packages/alphabet/* plus the Python code. Similar behavior should be available for pip install mylibrary[foobar].
If the user installs without the specification:
pip install mylibrary
Then it'll include all the data directories under packages/.
Currently, I've tried writing the setup.py with Python 3.5 like this:
import glob
from setuptools import setup, find_packages

setup(
    name='mylibrary',
    packages=['packages'],
    package_data={'packages': glob.glob('packages/**/*.txt', recursive=True)},
)
That will create a distribution with all the data directories when users do pip install mylibrary.
How should I change the setup.py so that specific pip installs like pip install mylibrary[alphabet] are possible?
First you have to package and publish alphabet and foobar as separate packages, because pip install mylibrary[alphabet] effectively means:
pip install mylibrary
pip install alphabet
After that, declare alphabet and foobar as extras (note the setuptools keyword is extras_require):
setup(
    …,
    extras_require={
        'alphabet': ['alphabet'],
        'foobar': ['foobar'],
    },
)
The keys in the dictionary are the names used in pip install mylibrary[EXTRA_NAME]; the values are lists of package names that will be installed from PyPI.
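Putting it together, a minimal sketch of the resulting setup.py (names taken from the question; the version is hypothetical, and alphabet and foobar are assumed to exist on PyPI):
from setuptools import setup, find_packages

setup(
    name='mylibrary',
    version='0.1',                 # hypothetical version
    packages=find_packages(),
    extras_require={
        'alphabet': ['alphabet'],  # installed by: pip install mylibrary[alphabet]
        'foobar': ['foobar'],      # installed by: pip install mylibrary[foobar]
    },
)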
PS. And no, you cannot use extras to install data files that are not available as packages from PyPI.
When I install pytz via setuptools, iterating over pytz.all_timezones takes multiple seconds. Someone suggested running pip unzip pytz, and that fixes the performance problem. Now I want to make setuptools install pytz uncompressed any time someone installs my package.
Can I configure setuptools to always unzip a particular dependency of my package?
$ virtualenv ve2.7
$ source ve2.7/bin/activate
(ve2.7)$ python setup.py install
(ve2.7)$ python slowpytz.py
2.62620520592s
(ve2.7)$ pip unzip pytz
DEPRECATION: 'pip zip' and 'pip unzip' are deprecated, and will be removed in a future release.
Unzipping pytz (in ./ve2.7/lib/python2.7/site-packages/pytz-2014.7-py2.7.egg)
(ve2.7)$ python slowpytz.py
0.0149159431458s
setup.py
from setuptools import setup
setup(name='slowpytz', version='0.0.1', install_requires=['pytz==2014.7'])
slowpytz.py
import pytz
import time
start = time.time()
zones = list(pytz.all_timezones)
print(str(time.time() - start) + 's')
There's no way that I know of to force unzipping of your dependencies in all cases. Some things that fall slightly short of that, but might still be useful:
You could submit a bug report asking pytz to set zip_safe=False in its setup.py, using the performance data as justification for the change (see the sketch after this list).
Failing that, you could fork pytz, add zip_safe=False, and have your package depend on your fork. (Not a great option.)
You could recommend that users always install your package with pip, which always installs everything unzipped (including dependencies), rather than easy_install or python setup.py install.
If your users must use easy_install, you can recommend they use easy_install -Z, which forces unzipped installation.
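For reference, a minimal sketch of what that zip_safe change would look like in pytz's setup.py (all other existing arguments stay unchanged):
from setuptools import setup

setup(
    name='pytz',
    # ...existing arguments unchanged...
    zip_safe=False,  # tells setuptools/easy_install not to install as a zipped egg
)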