Shipping *.so and binaries while building RPM package - python

I have created a Python application in which I would like to ship .so and some binary files in the final RPM package. After long reading I found a way to add binaries, images, and other data files via setup.py. Now, when I build an RPM with the python setup.py bdist_rpm command, it complains about architecture dependency:
Arch dependent binaries in noarch package
error: command 'rpmbuild' failed with exit status 1
After googling I found that we can add:
%define _binaries_in_noarch_packages_terminate_build 0
or remove the line BuildArch: noarch in the packagename.spec file to overcome the rpmbuild failure. However, every time I add or remove a line in build/bdist.linux-i686/rpm/SPECS/packagename.spec, the python setup.py bdist_rpm command overwrites the .spec file.
Is there a way to avoid Arch dependent binaries and ship *.so and other binary files in rpm?

The behavior of bdist_rpm is defined by a bunch of settings in:
/usr/lib/rpm/macros
/etc/rpm/macros
$HOME/.rpmmacros
I'm willing to bet that only /usr/lib/rpm/macros exists on your system. This is normal.
So, in order to prevent the "Arch dependent binaries in noarch package" error you would create /etc/rpm/macros or ~/.rpmmacros and add the following:
%_unpackaged_files_terminate_build 0
%_binaries_in_noarch_packages_terminate_build 0
Do not modify /usr/lib/rpm/macros because that file will be overwritten by the system whenever the rpm-build package is upgraded, downgraded, or re-installed.
If you want to override the behavior for everyone on the system, put the settings in /etc/rpm/macros. If you want to override the behavior for a particular user, add the settings to $HOME/.rpmmacros.
$HOME/.rpmmacros trumps /etc/rpm/macros, which trumps /usr/lib/rpm/macros.
Note: it's useful to examine /usr/lib/rpm/macros to see what settings are available and for syntax examples.
As a side note, the %_unpackaged_files_terminate_build 0 setting prevents the "Installed (but unpackaged) file(s) found" error.
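For example, to apply both settings for your user only, you could create the file like this (a minimal sketch; write to /etc/rpm/macros instead for a system-wide override):
cat >> ~/.rpmmacros <<'EOF'
%_unpackaged_files_terminate_build 0
%_binaries_in_noarch_packages_terminate_build 0
EOF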

.so files are always architecture-dependent as far as I know.
In your case, to avoid having to edit the spec file all the time, you can add --force-arch=<your_arch> to your setup.py bdist_rpm invocation,
e.g.
python setup.py bdist_rpm --force-arch=x86_64
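To avoid typing the flag on every build, the same option can be made persistent in a setup.cfg next to setup.py (a sketch; distutils reads per-command defaults from sections named after the command, with hyphens in option names written as underscores):
[bdist_rpm]
force_arch = x86_64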

If you encounter this issue in a .spec file when you try to build a new .rpm package, change the BuildArch from noarch to x86_64 (or whatever you have on the build system):
[root@devel-mga7][~/build/yate-ota]# grep Arch yate-ota.spec
BuildArch: x86_64
[root@devel-mga7][~/build/yate-ota]#
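If the spec file keeps being regenerated (as with bdist_rpm above), the edit can be re-applied non-interactively (a hypothetical one-liner; yate-ota.spec is the file from the example):
sed -i 's/^BuildArch:.*/BuildArch: x86_64/' yate-ota.spec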


How to diagnose conan install issue

I have some installation issues with conan.
After my Ubuntu 18.04 told me "Command 'conan' not found", I guessed the Python version was wrong, so I attempted to upgrade, with this result:
$ sudo apt-get install python
python is already the newest version (2.7.15~rc1-1)
However
$ locate python
/var/lib/binfmts/python2.7
/var/lib/binfmts/python3.6
While in this state, I attempted to install conan:
$ pip install conan
Collecting conan
...
Successfully installed Jinja2-2.10.1 MarkupSafe-1.1.1 PyJWT-1.7.1 PyYAML-5.1.2 astroid-1.6.6 attrs-19.1.0 backports.functools-lru-cache-1.5 bottle-0.12.17 certifi-2019.6.16 chardet-3.0.4 colorama-0.4.1 conan-1.18.0 configparser-3.7.4 deprecation-2.0.6 distro-1.1.0 enum34-1.1.6 fasteners-0.15 future-0.16.0 futures-3.3.0 idna-2.8 isort-4.3.21 lazy-object-proxy-1.4.1 mccabe-0.6.1 monotonic-1.5 node-semver-0.6.1 packaging-19.1 patch-1.16 pluginbase-0.7 pygments-2.4.2 pylint-1.9.5 pyparsing-2.4.2 python-dateutil-2.8.0 requests-2.22.0 singledispatch-3.4.0.3 six-1.12.0 tqdm-4.32.2 urllib3-1.25.3 wrapt-1.11.2
then 'conan' is listed as being installed but
$ conan
Command 'conan' not found, did you mean:
I.e., no error message or warning during installation; the command just is not found.
I found out that the path was not listed in my PATH, so I added '~/.local/bin'. Now the story goes on with the error message
CMake Error at CMakeLists.txt:90 (include):
include could not find load file:
Conan
I found
https://docs.conan.io/en/latest/howtos/cmake_launch.html.
OK, I inserted these lines in my CMakeLists.txt file:
# Download automatically, you can also just copy the conan.cmake file
if(NOT EXISTS "${CMAKE_BINARY_DIR}/conan.cmake")
    message(STATUS "Downloading conan.cmake from https://github.com/conan-io/cmake-conan")
    file(DOWNLOAD "https://raw.githubusercontent.com/conan-io/cmake-conan/master/conan.cmake"
         "${CMAKE_BINARY_DIR}/conan.cmake")
endif()
include(${CMAKE_BINARY_DIR}/conan.cmake)
conan_cmake_run(REQUIRES Catch2/2.6.0@catchorg/stable
                BASIC_SETUP)
I was also advised,
Please specify in command line CMAKE_BUILD_TYPE
(-DCMAKE_BUILD_TYPE=Release)
So I use
cmake .. -DCMAKE_BUILD_TYPE=Release
rather than
cmake ..
Still, I receive
ERROR: compiler not defined for compiler.libcxx
Please define compiler value first too
FATAL_ERROR;conan install command failed.
STATUS;Conan: Compiler GCC>=5, checking major version 7
STATUS;Conan: Checking correct version: 7
About two weeks ago I could install the same project flawlessly on another system. Can I somehow go back to that state? I expected conan to be stable, not alpha.
Edit 2:
I issued
conan profile new default --detect --force
The reply is
Found gcc 7
gcc>=5, using the major as version
************************* WARNING: GCC OLD ABI COMPATIBILITY ***********************
Conan detected a GCC version > 5 but has adjusted the 'compiler.libcxx' setting to
'libstdc++' for backwards compatibility.
Your compiler is likely using the new CXX11 ABI by default (libstdc++11).
(I do not really know why I need backward compatibility in the case of a new project.) After that,
cmake ..
finally seems to work. I am afraid I will have further issues due to compiler standards. For example, SystemC defaults to C++98, but some other library uses features needing C++14, and now conan forces C++11. Is there a way to handle all this centrally, specific to MY system?
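One way that should handle it centrally is to pin the setting in Conan's default profile (a sketch, assuming Conan 1.x; this is also what the warning above hints at):
conan profile update settings.compiler.libcxx=libstdc++11 default
After that, every conan install on this machine should use the C++11 ABI.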
Concerning the two Python versions: I did not install these manually; only some other install programs did so. I do not really know why, or which install script caused this doubling. BTW: Ubuntu said that V2.7 is the newest version, although V3.x is also present. I am a bit confused about these version numbers.
I simply made a new install and did not note exactly WHEN the second version of Python appeared. I personally do not even use Python; only some install scripts could have installed it.
Whether my system is specific: I do not think so. I just installed Ubuntu 18.04.2, and my primary goal was to install this SystemC related stuff. I really installed ONLY what was declared as missing. (plus livetex, git, etc.)
In the meantime 'cmake ..' terminated. Apparently, the installation by conan terminated OK. However, configuring my project gives messages like
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
SCV_INCLUDE_DIRS
The missing files are also installed by conan, using
[requires]
SystemC/2.3.3@minres/stable
SystemCVerification/2.0.1@minres/stable
doxygen_installer/1.8.15@bincrafters/stable
qt/5.12.0@bincrafters/stable
gtest/1.8.1@bincrafters/stable
flex/2.6.4@bincrafters/stable
I am using literally the same files (either my old disk connected to the bus, or the new one, using the same cable). The installation made about a month ago runs fine; the new one behaves as described.
It looks like installing and using conan is too complicated for me. I wanted to simplify installation rather than complicate it.
There is a bunch of cases related to installation listed here:
https://docs.conan.io/en/latest/installation.html#known-installation-issues-with-pip
I would say Conan is installed but its location is not listed in your PATH. You can find conan in your Python package folder and add its location to your PATH:
python -m site                      # list your package folders
find <package folder> -name conan   # locate the conan script
echo 'export PATH=${PATH}:<package folder>' >> ~/.bashrc
source ~/.bashrc

Aubio 0.4.0 Python Windows Installed but failing when creating aubio source

So I managed to get aubio 0.4.0 installed so that it imports into Python without errors; however, I haven't figured out how to pass files to be analyzed.
Here are the steps I took to install aubio 0.4.0, taken from here:
Downloaded the most recent git build of Aubio 0.4.0 source download - http://git.aubio.org/
Unpacked onto C:\
installed python 2.7.6
appended C:\python27 to the 'Path' environment variable
installed MinGW v-0.6.2 mingw.org/download/installer
inside MinGW Installation manager I included - [mingw32-base]
appended C:\MinGW\bin to the 'Path' environment variable
created file "C:\Python27\Lib\distutils\distutils.cfg" containing:
[build]
compiler = mingw32
--------------- INCLUDING LIBAV libraries ---------------------------
download pygtk-all-in-one-2.24.2.win32-py2.7.msi to get pkgconfig and all its dependencies: ftp.gnome.org/pub/GNOME/binaries/win32/pygtk/2.24/
download libav win32 build win32.libav.org/win32/ and unpack into C:\libav\
create a new environment variable name: "PKG_CONFIG_PATH" with the value at: C:\libav\usr\lib\pkgconfig
append C:\libav\usr\bin\ to the 'Path' environment variable
-------------------- END LIBAV ---------------------------------------
Inside the aubio path run the command: python .\waf configure build -j 1 --check-c-compiler=gcc
I get a crash at 168/193 with test-delnull.exe, but the build keeps going and returns "build" finished successfully
Install numpy v-1.8.0 sourceforge.net/projects/numpy/files/NumPy/
Inside the aubio\python path run the command: python setup.py build
Inside the aubio\python path run the command: python setup.py install
I had to copy the dll from aubio\build\src\libaubio-4.dll into python27\Lib\site-packages\aubio\
Then I added one of my own test.mp3 and test.wav files into aubio\python\tests\sounds\
Inside the aubio\python\tests path I ran the command: python run_all_tests -v
------------------- EDIT ---------------------------------
The above instructions should now work without the problem originally asked about
------------------- END EDIT -----------------------------
So from the results I get a lot of 'ok' for the many different tests being run; however, the first problem is with "test_many_sinks", where it tries to use the .wav file from sounds and gives:
AUBIO ERROR: failed creating aubio source with [wav file path]
It continues giving the same error for the rest of the tests until it crashes on "test_zero_hop_size" and stops.
Any further advice as to what I still need to do would be much appreciated.
Thanks!
With help from Paul Brossier we found out two issues:
Because I never included libav in my build, I can't use .mp3s for testing.
Using a newer git repository allowed me to successfully run demo_bpm_extract.py, which was previously erroring even when I tested with a .wav file. The git source I used can be found here: http://git.aubio.org/?p=aubio.git;a=commit;h=4a1378c12ffe7fd518448f6a1ab00f99f0557286
There are still quite a few errors showing up when executing "run_all_tests", which I've passed on to Paul.
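As a quick sanity check, independent of the test suite, you can try creating an aubio source directly (a sketch; test.wav stands in for any wav file you have):
python -c "import aubio; s = aubio.source('test.wav', 0, 512); print(s.samplerate)"
If this prints a sample rate instead of the AUBIO ERROR above, source creation itself works.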

Installing python module to python-version neutral location

I've built a debian package containing a python module.
The problem is that
dpkg-deb -c python-mymodule_1.0_i386.deb
show that all the files will be installed under
/usr/lib/python2.6/dist-packages/mymodule*
This means that the end-user who installs my deb package will need to be using exactly the same version of python as I am - yet I know that my module works fine on later versions too.
In my Makefile I have the following target:
install:
	python setup.py install --root $(DESTDIR) $(COMPILE) --install-layout=deb
Where setup.py is
from distutils.core import setup

setup(name='mymodule',
      version='1.0',
      description='does my stuff',
      author='Me',
      author_email='myemail@localhost',
      url='http://myurl/',
      packages=['mymodule'],
      )
Is there some way I can edit either the setup.py file or the Makefile so that the resulting module is installed in a python-version neutral directory instead of /usr/lib/python2.6?
Thanks,
Alex
I think I have found the answer now (thanks to the pointers from Tshepang):
In debian/rules you need to invoke dh_pysupport.
This grabs all the files installed by setup.py on the build machine in
$(DESTDIR)/usr/lib/pythonX.Y/dist-packages/mymodule*
and puts them into a non python-version specific location in the .deb file, namely
/usr/share/pyshared/mymodule*
Finally, it adds a call to update-python-modules to the postinst script, which makes sure that the module is available on every version of Python present on the target machine.
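For reference, with debhelper 7 era packaging a minimal debian/rules could look like this (a sketch, assuming debhelper 7 and python-support are installed; in that era dh ran dh_pysupport as part of its default sequence):
#!/usr/bin/make -f
# Minimal rules file; dh picks up dh_pysupport automatically when
# python-support is installed (debhelper 7 era behavior).
%:
	dh $@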

unable to install graph-tool in windows 7

I'd like to use graph-tool on windows 7, but I'm having trouble installing it.
All the requirements listed here are successfully installed. Python 2.7 is installed in C:\python27. Boost 1.49.0 was successfully compiled with MinGW, installed in C:\boost, and the BOOST_ROOT environment variable points to it. Boost is compiled in both debug and release mode, both static and dynamic.
Invoking configure from within MSYS leads to the following error.
configure: error:
Could not link test program to Python. Maybe the main Python library has been
installed in some non-standard library path. If so, pass it to configure,
via the LDFLAGS environment variable.
Example: ./configure LDFLAGS="-L/usr/non-standard-path/python/lib"
============================================================================
ERROR!
You probably have to install the development version of the Python package
for your distribution. The exact name of this package varies among them.
============================================================================
Calling configure LDFLAGS="-LC:/python27/libs" fixed this error, but led to the following error:
checking for boostlib >= 1.38.0... configure: error: We could not detect the boost libraries (version 1.38 or higher). If you have a staged boost library (still not installed) please specify $BOOST_ROOT in your environment and do not give a PATH to --with-boost option. If you are sure you have boost installed, then check your version number looking in <boost/version.hpp>. See http://randspringer.de/boost for more documentation.
This is weird, since BOOST_ROOT is clearly defined (I checked it with the printenv command).
The next command I tried was configure --with-boost="C:/boost" LDFLAGS="-LC:/python27/libs"
checking for boostlib >= 1.38.0... yes
checking whether the Boost::Python library is available... no
configure: error: No usable boost::python found
Alright, it detects boost, but it can't find boost::python. Due to its size I'm unable to post the config.log on stackoverflow, but you can find it here.
I'm really confused right now and would appreciate any help.
I have zero experience with compiling graph-tool (or anything else) for Windows, but the following part of your config.log stands out:
configure:17224: checking whether the Boost::Python library is available
configure:17254: g++ -c -Wall -ftemplate-depth-150 -Wno-deprecated -Wno-unknown-pragmas -O99 -fvisibility=default -fvisibility-inlines-hidden -Wno-unknown-pragmas -Ic:\python27\include conftest.cpp >&5
conftest.cpp:32:36: fatal error: boost/python/module.hpp: No such file or directory
compilation terminated.
Note how the boost path you passed is not being used! Try to pass CXXFLAGS="-IC:\boost\include" to configure as well.
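Putting both hints together, the invocation would look something like this (a sketch; the include path must be whatever directory actually contains boost/python/module.hpp under your C:\boost install):
./configure --with-boost="C:/boost" LDFLAGS="-LC:/python27/libs" CXXFLAGS="-IC:/boost/include"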
Maybe something like this would help:
./configure --prefix=/usr/
On Windows the path is different; try it yourself.

How to build debian package with CPack to execute setup.py?

Until now, my project had only .cpp files that were compiled into different binaries and I managed to configure CPack to build a proper debian package without any problems.
Recently I wrote a couple of python applications and added them to the project, as well as some custom modules that I would also like to incorporate to the package.
After writing a setup.py script, I'm wondering how to add these files to the CPack configuration in a way that setup.py gets executed automatically when the user installs the package on the system with dpkg -i package.deb.
I'm struggling to find relevant information on how to configure CPack to install custom python applications/modules. Has anyone tried this?
I figured out a way to do it but it's not very simple. I'll do my best to explain the procedure so please be patient.
The idea of this approach is to use postinst and prerm to install and remove the python application from the system.
In the CMakeLists.txt that defines the project, you need to state that CPack is going to be used to generate a .deb package. There are some variables that need to be filled with info related to the package itself, but one named CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA is very important because it specifies the location of postinst and prerm, standard scripts of the Debian packaging system that dpkg executes automatically when the package is installed/removed.
At some point of your main CMakeLists.txt you should have something like this:
add_subdirectory(name_of_python_app)
set(CPACK_COMPONENTS_ALL_IN_ONE_PACKAGE 1)
set(CPACK_PACKAGE_NAME "fake-package")
set(CPACK_PACKAGE_VENDOR "ACME")
set(CPACK_PACKAGE_DESCRIPTION_SUMMARY "fake-package - brought to you by ACME")
set(CPACK_PACKAGE_VERSION "1.0.2")
set(CPACK_PACKAGE_VERSION_MAJOR "1")
set(CPACK_PACKAGE_VERSION_MINOR "0")
set(CPACK_PACKAGE_VERSION_PATCH "2")
set(CPACK_SYSTEM_NAME "i386")
set(CPACK_GENERATOR "DEB")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "ACME Technology")
set(CPACK_DEBIAN_PACKAGE_DEPENDS "libc6 (>= 2.3.1-6), libgcc1 (>= 1:3.4.2-12), python2.6, libboost-program-options1.40.0 (>= 1.40.0)")
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA "${CMAKE_SOURCE_DIR}/name_of_python_app/postinst;${CMAKE_SOURCE_DIR}/name_of_python_app/prerm;")
set(CPACK_SET_DESTDIR "ON")
include(CPack)
Some of these variables are optional, but I'm filling them with info for educational purposes.
Now, let's take a look at the scripts:
postinst:
#!/bin/sh
# postinst script for fake_python_app
set -e
cd /usr/share/pyshared/fake_package
sudo python setup.py install
prerm:
#!/bin/sh
# prerm script
#
# Removes all files installed by: ./setup.py install
sudo rm -rf /usr/share/pyshared/fake_package
sudo rm /usr/local/bin/fake_python_app
As you may have noticed, the postinst script enters /usr/share/pyshared/fake_package and executes the setup.py lying there to install the app on the system. Where does this file come from and how does it end up there? This file is created by you and will be copied to that location when your package is installed on the system. This action is configured in name_of_python_app/CMakeLists.txt:
install(FILES setup.py
        DESTINATION "/usr/share/pyshared/fake_package"
)
install(FILES __init__.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_python_app
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_1.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_2.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
As you can probably tell, besides the Python application I want to install, there are also two custom Python modules that I wrote that also need to be installed. Below I describe the contents of the most important files:
setup.py:
#!/usr/bin/env python
from distutils.core import setup

setup(name='fake_package',
      version='1.0.5',
      description='Python modules used by fake-package',
      py_modules=['fake_package.fake_module_1', 'fake_package.fake_module_2'],
      scripts=['fake_package/fake_python_app']
      )
__init__.py: an empty file.
fake_python_app: your Python application that will be installed in /usr/local/bin
And that's pretty much it!
A setup.py file is the equivalent of the configure && make && make install dance for a standard Unix source distribution and as such is inappropriate to run as part of a distribution's package install process. See this discussion of the different ways to include Python modules in a .deb package.
