I've got a couple of projects here for which I'm preparing documentation at the moment, hosted at readthedocs.org. FYI, all of them use poetry and I use custom .readthedocs.yml files with this entry:
python:
  install:
    - method: pip
      path: .
It works fine for most projects, but it fails for two of them, for different reasons, during installation of the project via pip:
The first one uses PyGObject, which fails like this:
Package gobject-introspection-1.0 was not found in the pkg-config search path.
Perhaps you should add the directory containing `gobject-introspection-1.0.pc'
to the PKG_CONFIG_PATH environment variable
No package 'gobject-introspection-1.0' found
Command '('pkg-config', '--print-errors', '--exists', 'gobject-introspection-1.0 >= 1.56.0')' returned non-zero exit status 1.
Try installing it with: 'sudo apt install libgirepository1.0-dev'
So it seems that PyGObject cannot be installed without some system packages being installed first. I could rearrange the code so that the import is not top-level, but I would still need it in the dependencies. Can I tell pip install to ignore this single package somehow? Any other idea?
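One avenue I'm considering: Read the Docs can install Ubuntu packages itself via the v2 config key build.apt_packages. A sketch of the relevant .readthedocs.yml entries (the os and tools values are just examples, to be merged with the existing file):
version: 2
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"
  # System library needed to compile PyGObject from source:
  apt_packages:
    - libgirepository1.0-dev
python:
  install:
    - method: pip
      path: .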
The second project compiles some C++ code via Cython and fails because a library is missing. I use a custom build script in the pyproject.toml:
[tool.poetry.build]
script = "build.py"
generate-setup-file = false
Is there some flag in pip that I could set and retrieve in build.py to skip the compilation? Or is there a better way?
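One idea: Read the Docs sets the environment variable READTHEDOCS=True in its build environment, so build.py could detect it and bail out. A sketch, assuming build.py exposes the common build() hook (adjust to however your script is actually invoked):
# build.py -- sketch; skip native compilation on Read the Docs
import os

def build(setup_kwargs):
    # READTHEDOCS=True is set in Read the Docs build environments.
    if os.environ.get("READTHEDOCS") == "True":
        return  # docs build: install as pure Python, no Cython/C++ step
    # ... the normal cythonize/compile steps would go here ...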
Whilst in some_other_package, I am importing files from the snnalgorithms pip package. I received the error:
No module named 'src.snnalgorithms'. This is a valid error, because that src.snnalgorithms module does not exist in the some_other_package project from which I was calling the pip package.
As a solution, I can make all the imports in the snnalgorithms pip package relative to itself. Instead of:
from src.snnalgorithms.population.DUMMY import DUMMY
One could write:
from snnalgorithms.population.DUMMY import DUMMY
However, that implies that each time I want to run the code to briefly verify a minor change, or run the tests after a change, I will have to:
Upload the changes into the pip package.
Re-install the pip package locally to reflect the changes.
This significantly slows down development. Hence I was wondering: are there more efficient solutions for this?
You can use editable mode, also known as development mode:
pip install -e . # Install package locally
From the pip documentation:
Editable installs allow you to install your project without copying any files. Instead, the files in the development directory are added to Python’s import path. This approach is well suited for development and is also known as a “development installation”.
With an editable install, you only need to perform a re-installation if you change the project metadata (eg: version, what scripts need to be generated etc). You will still need to run build commands when you need to perform a compilation for non-Python code in the project (eg: C extensions).
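For example, if the project contained C extensions built with setuptools (not the case for a pure-Python package like snnalgorithms), the recompile step would be something along these lines:
# Rebuild C extensions in place so the editable install picks them up.
python setup.py build_ext --inplace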
Example
If you have a project some_other_package from which you call the pip package snnalgorithms, you can:
cd snnalgorithms
pip install -e .
cd ..
cd some_other_package
python -m src.some_other_package
Assuming you use the same conda environment for both packages, both packages will then be able to use your newest changes that have not even been published to pypi.org yet.
I am attempting to install from a Python setup.py. I had this working on Ubuntu 18.04 after a month of hellish attempts, and now need it on 20.04.
The setup.py I am using is in this repository (https://github.com/flashlight/flashlight/tree/main/bindings/python).
I am using Ubuntu 20.04, fresh install sans a few dependencies I know the system needs.
I have Python3.10 installed locally.
During the install at steps
cd bindings/python
python3.10 setup.py install
I hit the error
CMake Error at cmake/FindMKL.cmake:400. MKL library not found. Please specify the library location by appending the root directory of the MKL installation to the environment variable CMAKE_PREFIX_PATH
Which is weird for a variety of reasons.
I did a sudo apt-get install mkl-devel, which took; but also, the setup script claims it defaults to MKL off.
Right prior to this error, I do get a warning:
CMake Warning at cmake/FindMKL.cmake:387. MKL library files are found, but MKL header files are not. You can get them by 'pip install mkl-devel' (I did this btw) if using pip. If build fails with header files available on the system, please make sure that CMake will search the directory containing them by setting CMAKE_INCLUDE_PATH
That last part I did not do, primarily because I'm not sure how to tell where pip install mkl-devel would have put those header files.
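One way to find out, assuming the headers were installed by pip, is pip show -f, which prints the install location together with every file belonging to a distribution:
pip show -f mkl-devel
# mkl-devel is a meta-package; if its file list is short, the headers
# probably belong to a dependency such as mkl-include (an assumption):
pip show -f mkl-include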
That said, when I do a find / -name "*mkl*" I notice files in two primary locations:
.h files : /home/myusername/.local/include
.so files: /home/myusername/.local/lib
So I set the following environment variables in my terminal:
LIBRARY_PATH=/home/myusername/.local/lib
LD_LIBRARY_PATH=/home/myusername/.local/lib
KENLM_ROOT=/home/myusername/Flashlight/kenlm
USE_MKL=0
CMAKE_INCLUDE_PATH=/home/myusername/.local/include/
CMAKE_PREFIX_PATH=/home/myusername/.local/lib
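(A sanity check, assuming a POSIX shell: plain VAR=value assignments stay local to the shell, so unless the variables are exported they never reach the cmake child process. Something like:)
export CMAKE_INCLUDE_PATH=/home/myusername/.local/include/
export CMAKE_PREFIX_PATH=/home/myusername/.local/lib
# should print the path if the variable is visible to child processes:
python3.10 -c 'import os; print(os.environ.get("CMAKE_PREFIX_PATH"))'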
However, when I try to install again, the process still fails with both of the MKL issues above (the warning and the error).
I'm baffled as to what I am supposed to be pointing where to get this to succeed at this point.
What I want:
I want my Yocto project to build a package for my Python project with all dependencies inside. The project has to run out of the box on the resulting read-only sdcard image.
It should simply install all requirements, at the required versions, into the package.
What I tried without luck:
Calling pip in do_install():
"pip/pip3 is not found", even it's in RDEPENDS.
Anyway, I really prefer this way.
With inherit pypi:
When trying with inherit pypi, it also tries to fetch my local sources (my Python project) from pypi. And I always have to copy the requirements into the recipe. This is not my preferred way.
Calling pip in pkg_postinst():
It tries to install the modules on first start and fails, because the system has no internet connection and is read-only. It must run out of the box without any installation at first boot; this approach does its work too late.
What I want to get around:
There should be no need to change anything in the recipes when something changes in requirements.txt.
Background information
I'm working with Yocto Rocko in a Linux environment.
On the host system, there is no pip installed. I want to use the one installed via RDEPENDS on the target system.
Building the Package (only this recipe) with:
bitbake myproject
Building the whole sdcard image:
bitbake myProject-image-base
The recipe:
myproject.bb (relevant lines):
RDEPENDS_${PN} = "python3 python3-pip"
APP_SOURCES_DIR := "${@os.path.abspath(os.path.dirname(d.getVar('FILE', True)) + '/../../../../app-sources')}"
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI = " \
file://${APP_SOURCES_DIR}/myProject \
...
"
inherit allarch # tried also with pypi and setuptools3 for the pypi way.
do_install() { # Line 116
install -d -m 0755 ${D}/myProject
cp -R --no-dereference --preserve=mode,links -v ${APP_SOURCES_DIR}/myProject/* ${D}/myProject/
pip3 install -r ${APP_SOURCES_DIR}/myProject/requirements.txt
# Tried also python ${APP_SOURCES_DIR}/myProject/setup.py install
}
# Tried also this, but it's no option because the data MUST be included in the Package:
# pkg_postinst_${PN}() {
# #!/bin/sh -e
# pip3 install -r /myProject/requirements.txt
# }
FILES_${PN} = "/myProject/*"
Resulting errors:
I expected the listed modules from requirements.txt to be installed into the myProject package, so that the Python app runs directly on the resulting read-only sdcard image.
With pip, I get:
| /*/tmp/work/*/myProject/0.1.0-r0/temp/run.do_install: 116: pip3: not found
| WARNING: exit code 127 from a shell command.
| ERROR: Function failed: do_install ...
When using pypi:
404 Not Found
ERROR: myProject-0.1.0-r0 do_fetch: Fetcher failure for URL: 'https://files.pythonhosted.org/packages/source/m/myproject/myproject-0.1.0.tar.gz'. Unable to fetch URL from any source.
=> But it should not fetch myProject, since it is already local and nowhere remote.
Any ideas? What would be the best way to get to a ready-to-use sdcard image without the need to change recipes when requirements.txt changes?
You should use RDEPENDS_${PN} to take care of your app's dependencies in the recipe.
For example, assuming your Python app needs the aws-iot-device-sdk-python module, you should add it to RDEPENDS in the recipe. In your case, it would be like this:
RDEPENDS_${PN} = "python3 \
python3-pip \
python3-aws-iot-device-sdk-python \
"
Here's the link listing the Python modules supported by the OpenEmbedded meta-python layer:
https://layers.openembedded.org/layerindex/branch/master/layer/meta-python/
If the modules you need are not there, you will likely need to create recipes for the modules.
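Such a recipe is usually tiny when the module builds with setuptools. A sketch with a hypothetical module foo (the name, version, license, and checksums are placeholders to fill in):
# python3-foo_1.2.3.bb -- hypothetical recipe for a module missing from meta-python
SUMMARY = "foo module for Python 3"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://LICENSE;md5=<fill-in>"

PYPI_PACKAGE = "foo"
SRC_URI[sha256sum] = "<fill-in>"

inherit pypi setuptools3

RDEPENDS_${PN} = "python3-core"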
My newest findings:
Yocto/bitbake seems to suppress interpreting the requirements, because doing so would break automatic dependency resolution, which could lead to conflicts.
Reason: the modules required by setup.py would not be stored as independent packages, but as part of my package. So bitbake does not know about these modules, which could conflict with other packages that may require the same modules in different versions.
What was in my recipe:
MY_INSTALL_ARGS = "--root=${D} \
--prefix=${prefix} \
--install-lib=${PYTHON_SITEPACKAGES_DIR} \
--install-data=${datadir}"
do_install() {
PYTHONPATH=${PYTHON_SITEPACKAGES_DIR} \
${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py install ${MY_INSTALL_ARGS}
}
If I execute this outside of bitbake as python3 setup.py install ${MY_INSTALL_ARGS}, everything is installed correctly, but in the recipe, no requirements are installed.
There is a parameter --no-deps, but I didn't find where it is set.
I think there could be one way to get the requirements from setup.py installed:
Find out where --no-deps is set for easy_install in the openembedded/poky layer and disable it.
Create a separate PYTHON_SITEPACKAGES_DIR.
Install this separate PYTHON_SITEPACKAGES_DIR, e.g. in the home directory, as a private Python modules dir.
This way, no Python module would trigger a conflict.
Since I do not have the time to experiment with this, I'll now define one recipe per requirement.
Have you tried installing pip?
Debian
apt-get install python-pip
apt-get install python3-pip
CentOS
yum install python-pip
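If the distribution packages are unavailable, pip can also be bootstrapped from the standard library (assuming the Python build ships the ensurepip module):
python3 -m ensurepip --upgrade
# verify that pip is now available:
python3 -m pip --version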
I'm running Python 3.6 via Anaconda 3, using Visual Studio Code.
I followed instructions like these (Interactive Brokers API install) and downloaded the package to a local directory of mine, say c:\dev\pyib, so now the code is in c:\dev\pyib\IbPy-master.
I open that directory in a command line and run
python setup.py install
All runs ok.
But then my program, which is in c:\dev\pyib, says "Module not found" (in my case, ibapi). The linter is also showing red.
There is no other python installed on this pc.
Where did the package install to, and how do I check that? What will I find, in the place the package installed itself to, that shows me it's there?
Or do I have to use trial-and-error with the linter and sys.path.append()? (I tried that with the directory where the files are downloaded to, to no avail.)
I'm trying to set up the PYTHONPATH using the "env" in launch.json from Visual Studio Code, as shown in this unaccepted answer.
Current sys.path:
'c:\\dev\\pyIb',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\python36.zip',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\DLLs',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\lib',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\lib\\site-packages',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\lib\\site-packages\\Babel-2.5.0-py3.6.egg',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\lib\\site-packages\\win32',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\lib\\site-packages\\win32\\lib',
'C:\\Users\\user\\AppData\\Local\\Continuum\\anaconda3\\lib\\site-packages\\Pythonwin'
I deleted the ib directory and re-ran the install. The last line says: Writing C:\Users\user\AppData\Local\Continuum\anaconda3\Lib\site-packages\IbPy2-0.8.0-py3.6.egg-info. So is the location of the egg-info the location of my undetected module? The actual folder in site-packages is called ib.
Or could my problems be because of a difference in Lib vs. lib with the lowercase in the sys.path and the uppercase in the actual directory?
But the real question here is still: HOW DO I KNOW WHERE the package was installed, and what should I search for?
This answer is specific to anaconda3 Python and packages installed using python setup.py install (which actually uses distutils).
Take a look at anaconda3\Lib\site-packages; you should see a directory for the package you installed.
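A quick way to confirm from Python itself (a sketch; it assumes the import succeeds at all, and uses the ib package name mentioned above) is to ask the module where it was loaded from:
import ib  # the package directory installed by IbPy2 is named "ib"
print(ib.__file__)  # prints the path Python actually loaded it from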
The way to know for sure where your package is is to do a pip list, then try to pip uninstall and re-install again using python setup.py install. Here are the detailed steps:
When uninstalling, pip will tell you it cannot because it was done via distutils.
You'll get a message like this:
DEPRECATION: Uninstalling a distutils installed project (ibpy2) has been deprecated and will be removed in a future version.
This is due to the fact that uninstalling a distutils project will only partially uninstall the project.
You'll be prompted to continue anyway. If you choose No, then you can find the directory in
C:\Users\<yourusername>\AppData\Local\Continuum\anaconda3\Lib\site-packages
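A shorter check, assuming the distribution is visible to pip at all, is pip show, whose Location field prints the directory a package was installed into:
pip show ibpy2
# look for the "Location:" line in the output, for example:
# Location: c:\users\<yourusername>\appdata\local\continuum\anaconda3\lib\site-packages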
Thanks to Emanuel Mtali for pointing me in the right direction
Some more information:
The problem I had was due to a stupid mistake of mine. I was running the setup of a different (but related) package that is no longer used: IbPy2 instead of TwsAPI. I was supposed to run the setup.py of the package installed via the latest version of the MSI from IB, and NOT the IbPy2 package. :-(
Since upgrading to Subversion 1.7 I get "unrecognized .svn/entries format" when running buildout. I notice there are unresolved bug reports for both distribute and setuptools for this error, and it also seems that you can use setuptools_subversion to resolve the issue.
What I can't find out how to do is install setuptools_subversion so that buildout picks it up. Please can someone help?
I've tried:
downloading it and running python setup.py install
adding it to the eggs list of the [buildout] part of my buildout configuration
You need to install it at the python site-packages level; easy_install (used under the hood by buildout) needs it available before it'll install anything else.
That said, the python setup.py install stanza should have installed it just fine; check by running the following test:
$ python -m setuptools_subversion
/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/setuptools_subversion.py directory
That should print the installation path of the module, like it did for me in the above example. You could try to use pip or easy_install for automatic download:
$ pip install setuptools_subversion
or
$ easy_install setuptools_subversion
You can do that in a virtualenv if you want to isolate the installation. Because this is basically a dependency for svn 1.7, installing this at the same level as the svn binary (usually system wide) is certainly acceptable and the norm.
Note that the unrecognized .svn/entries format error message will not disappear, but your buildout will otherwise succeed. The message is printed no matter what as easy_install first tries the internal .svn parser before deferring to the external plugin.
If you really, really want to verify if the plugin is installed, run the following python code:
import pkg_resources
for entrypoint in pkg_resources.iter_entry_points('setuptools.file_finders'):
print entrypoint
On my system this prints:
svn = setuptools_subversion:listfiles
svn_cvs = setuptools.command.sdist:_default_revctrl
git = setuptools_git:gitlsfiles
hg = setuptools_hg:hg_file_finder