I'm trying to cross-compile Python 2.7.18 for an x86_64/uClibc machine using a crosstool-ng example toolchain. The commands used are the following:
CONFIG_SITE=config.site \
CC=/home/msainz/x-tools/x86_64-unknown-linux-uclibc/bin/x86_64-unknown-linux-uclibc-gcc \
CXX=/home/msainz/x-tools/x86_64-unknown-linux-uclibc/bin/x86_64-unknown-linux-uclibc-g++ \
AR=/home/msainz/x-tools/x86_64-unknown-linux-uclibc/bin/x86_64-unknown-linux-uclibc-ar \
RANLIB=/home/msainz/x-tools/x86_64-unknown-linux-uclibc/bin/x86_64-unknown-linux-uclibc-ranlib \
READELF=/home/msainz/x-tools/x86_64-unknown-linux-uclibc/bin/x86_64-unknown-linux-uclibc-readelf \
LDFLAGS="-L/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/lib -L/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/usr/lib" \
CFLAGS="-I/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/usr/include -I/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/include" \
CPPFLAGS="-I/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/usr/include -I/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/include" \
./configure --enable-shared --host=x86_64-unknown-linux-uclibc --build=x86_64 --disable-ipv6 --prefix=/home/msainz/Projects/python2_top_uclibc/
followed by
PATH=$PATH:/home/msainz/Projects/python2_top_glibc/bin/ make
and
PATH=$PATH:/home/msainz/Projects/python2_top_glibc/bin/ make install
Execution ends with the following error:
fi
/home/msainz/x-tools/x86_64-unknown-linux-uclibc/bin/x86_64-unknown-linux-uclibc-gcc -L/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/lib -L/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs/usr/lib -Xlinker -export-dynamic -o python \
        Modules/python.o \
        -L. -lpython2.7 -ldl -lpthread -lm
_PYTHON_PROJECT_BASE=/home/msainz/Projects/Python-2.7.18 _PYTHON_HOST_PLATFORM=linux2-x86_64 PYTHONPATH=./Lib:./Lib/plat-linux2 python -S -m sysconfig --generate-posix-vars ;\
if test $? -ne 0 ; then \
        echo "generate-posix-vars failed" ; \
        rm -f ./pybuilddir.txt ; \
        exit 1 ; \
fi
python: error while loading shared libraries: libc.so.0: cannot open shared object file: No such file or directory
generate-posix-vars failed
make: *** [Makefile:523: pybuilddir.txt] Error 1
The python2_top_glibc dir contains a previous Python-2.7.18 installation, but for native glibc, which compiled perfectly. libc.so.0 is in fact present in the base_rootfs of the target system, which is the tree being linked against in the ./configure stage. I'm stuck on this at the moment. Any clue will be appreciated; any additional info will be supplied on demand.
Thanks in advance.
python: cannot open shared object file: No such file or directory
This is a run-time loader error. You are trying to run a python executable that is linked against that libc.so.0.
If this executable can actually run in your host environment, you can enable it by adding your base_rootfs library directories to LD_LIBRARY_PATH. Otherwise, you need to use your host python executable for this step of the build process, or disable the step altogether.
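As a sketch, assuming the ROOTFS path from the question (substitute your own base_rootfs), prepending the target library directories would look like this:

```shell
# Sketch: make the runtime loader search the target rootfs library dirs.
# ROOTFS is taken from the question's own paths; adjust to your tree.
ROOTFS=${ROOTFS:-/home/msainz/Projects/Selene/WP3/local/uclibc/base_rootfs}
# Prepend both lib dirs, preserving any pre-existing LD_LIBRARY_PATH
# and avoiding a stray trailing ':' when it was empty.
LD_LIBRARY_PATH="$ROOTFS/lib:$ROOTFS/usr/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
export LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"
```

With that exported, ./python would find libc.so.0 in the rootfs, provided the binary can run on the host at all.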
Upon attempting to compile Python 3.7, I hit "Could not import runpy module":
jeremyr#b88:$ wget https://www.python.org/ftp/python/3.7.3/Python-3.7.3.tar.xz
....
jeremyr#b88:~/Python-3.7.3$ ./configure --enable-optimizations
jeremyr#b88:~/Python-3.7.3$ make clean
jeremyr#b88:~/Python-3.7.3$ make -j32
....
gcc -pthread -Xlinker -export-dynamic -o Programs/_testembed Programs/_testembed.o libpython3.7m.a -lcrypt -lpthread -ldl -lutil -lm
./python -E -S -m sysconfig --generate-posix-vars ;\
if test $? -ne 0 ; then \
echo "generate-posix-vars failed" ; \
rm -f ./pybuilddir.txt ; \
exit 1 ; \
fi
Could not import runpy module
Traceback (most recent call last):
File "/home/jeremyr/Python-3.7.3/Lib/runpy.py", line 15, in <module>
import importlib.util
File "/home/jeremyr/Python-3.7.3/Lib/importlib/util.py", line 14, in <module>
from contextlib import contextmanager
File "/home/jeremyr/Python-3.7.3/Lib/contextlib.py", line 4, in <module>
import _collections_abc
SystemError: <built-in function compile> returned NULL without setting an error
generate-posix-vars failed
Makefile:603: recipe for target 'pybuilddir.txt' failed
make[1]: *** [pybuilddir.txt] Error 1
make[1]: Leaving directory '/home/jeremyr/Python-3.7.3'
Makefile:531: recipe for target 'profile-opt' failed
make: *** [profile-opt] Error 2
jeremyr#88:~/Python-3.7.3$ lsb_release -a
No LSB modules are available.
Distributor ID: Debian
Description: Debian GNU/Linux 8.11 (jessie)
Release: 8.11
Codename: jessie
jeremyr#88:~/Python-3.7.3$ gcc --version
gcc (Debian 4.9.2-10+deb8u2) 4.9.2
Copyright (C) 2014 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
jeremyr#88:~/Python-3.7.3$ sudo apt upgrade gcc
Reading package lists... Done
Building dependency tree
Reading state information... Done
Calculating upgrade... gcc is already the newest version.
jeremyr#b88:~/Python-3.7.3$ echo $PYTHONPATH
Any advice on how to overcome this and install python3.7 appreciated.
Edit - the solution listed below seems to work for various other python versions, so I changed title to python 3.x from 3.7
It seems --enable-optimizations was the problem. Re-running without it:
jeremyr#b88:~/Python-3.7.3$ ./configure
jeremyr#b88:~/Python-3.7.3$ make clean
followed by make takes care of it in my case.
In case others come across this question: I encountered the same problem on CentOS 7. I also had --enable-optimizations but didn't want to remove that flag. Updating my build dependencies and then re-running solved the problem. To do that I ran:
sudo yum groupinstall "Development Tools" -y
In case the yum group is not available, you can also install the packages individually using:
sudo yum install bison byacc cscope ctags cvs diffstat doxygen flex gcc gcc-c++ gcc-gfortran gettext git indent intltool libtool patch patchutils rcs redhat-rpm-config rpm-build subversion swig systemtap
For whoever MicGer's answer didn't work and who would like to retain --enable-optimizations, check your gcc version. The error was solved for me on gcc 8.3.0.
For me, on CentOS 7, compiling Python 3.9, the fix was updating gcc and related packages.
Removing --enable-optimizations didn't solve the issue here, nor did upgrading GCC from gcc-7.5.0 to gcc-8.2.1, while trying to build Python 3.6.13 on openSUSE 15.2.
Here, for some reason, the build was picking up a local python installation, so before calling make I updated the PATH env var to include the build directory (where the python binary is generated), for instance: export PATH=$(pwd):$PATH
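A toy check of that PATH trick (the fake python below is only a stand-in for the freshly built binary): whichever directory comes first in PATH wins the lookup.

```shell
# Create a stand-in "python" in a scratch dir and put that dir first in PATH,
# the same idea as: export PATH=$(pwd):$PATH  in the build tree.
dir=$(mktemp -d)
printf '#!/bin/sh\necho build-tree python\n' > "$dir/python"
chmod +x "$dir/python"
PATH="$dir:$PATH"
export PATH
command -v python   # should now resolve to "$dir/python", not a system python
```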
I want to debug ARM applications on devices such as Android machines. I prefer to use gdb (ARM version) on the device rather than gdb with gdbserver, because there is a dashboard, a visual interface for GDB in Python.
It must run together with gdb (ARM version) on the device, so I need to cross-compile an ARM version of gdb with Python support. The command used is shown below:
./configure --host=arm-linux-gnueabi --target=arm-linux-gnueabi --with-python=/usr/bin
But finally an error message appeared:
configure:8096: checking whether to use python
configure:8098: result: /usr/bin/
configure:8316: checking for python2.7
configure:8334: arm-linux-gnueabi-gcc -o conftest -g -O2 -I/usr/include/python2.7 -I/usr/include/python2.7 conftest.c -ldl -ltermcap -lm -lpthread -ldl -lutil -lm -lpython2.7 -Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions >&5
In file included from /usr/include/python2.7/Python.h:8:0,
from conftest.c:50:
/usr/include/python2.7/pyconfig.h:15:52: fatal error: arm-linux-gnueabi/python2.7/pyconfig.h: No such file or directory
compilation terminated.
Then I looked at line 15 of /usr/include/python2.7/pyconfig.h, which reads:
# elif defined(__ARM_EABI__) && !defined(__ARM_PCS_VFP)
#include <arm-linux-gnueabi/python2.7/pyconfig.h>
Here is the point: I only have x86_64-linux-gnu/python2.7/pyconfig.h in /usr/include. How can I get arm-linux-gnueabi/python2.7/pyconfig.h? I have already run apt-get install python2.7-dev.
I ran into this problem when attempting to cross-compile a python wrapper module built with SWIG, but it looks to me like it will happen to anyone cross compiling python-linked C code on a Debian system.
Apparently the Debian python-dev packages are not set up with the header files needed to facilitate a cross compile, but it is possible to go get them manually. I'm not sure whether this is a Python bug or a Debian packaging bug, and I haven't researched whether it applies to other distros.
pyconfig.h sets preprocessor defines to tell the Python source code about platform-dependent things like endianness and data type sizes, so the proper pyconfig.h is definitely needed to correctly compile the Python source. Fortunately, the pyconfig.h file should be the only file you need to grab separately, and it is available from the Debian python-dev package for your target platform.
You can download the package file for armel or any other architecture from https://packages.debian.org/jessie/libpython2.7-dev and extract the include directory yourself, or you can use the following commands to download the package and copy the proper files for armel to /usr/local/include:
wget http://security.debian.org/debian-security/pool/updates/main/p/python2.7/libpython2.7-dev_2.7.9-2+deb8u2_armel.deb
dpkg -x libpython2.7-dev_2.7.9-2+deb8u2_armel.deb libpython2.7-dev_2.7.9-2+deb8u2_armel_extracted
sudo cp -r libpython2.7-dev_2.7.9-2+deb8u2_armel_extracted/usr/include/arm-linux-gnueabi/ /usr/local/include/
rm -r libpython2.7-dev_2.7.9-2+deb8u2_armel*
Note that with some cross compilers you will have to add -I/usr/local/include to the compiler options if they do not search this location by default, but to me this is better than modifying /usr/include and risking your changes being wiped out by the OS.
When compiling python-3.4.0rc3 with a local openssl-1.0.1f shared install, make prints no error but then I get the following core dump on make install or make test:
Program terminated with signal 11, Segmentation fault.
(gdb) bt
#0 0x00007f131dd10510 in EVP_PKEY_CTX_dup () from /data2/soft/openssl/lib/libcrypto.so.1.0.0
#1 0x00007f131dd0284f in EVP_MD_CTX_copy_ex () from /data2/soft/openssl/lib/libcrypto.so.1.0.0
#2 0x00007f131e256ab5 in EVPnew (name_obj=0x7f131e46a500, digest=0x0, initial_ctx=0x7f131e459a40, cp=0x0, len=0) at /data2/soft/python3/Python-3.4.0rc3/Modules/_hashopenssl.c:410
#3 0x00007f131e25726e in EVP_new_md5 (self=<value optimized out>, args=<value optimized out>) at /data2/soft/python3/Python-3.4.0rc3/Modules/_hashopenssl.c:799
#4 0x00000000004c7eef in ?? ()
Here is the complete list of commands used
tar -axf Python-3.4.0rc3.tgz
cd Python-3*
# For lzma and a pip-compatible openssl
export CFLAGS='-I/data2/soft/openssl/include/openssl -I/data2/local/include/'
export LDFLAGS='-L/data2/soft/openssl/lib -L/data2/local/lib/'
export LD_LIBRARY_PATH="/data2/soft/openssl/lib:/data2/local/lib/:$LD_LIBRARY_PATH"
# Ready !
./configure --prefix=/data2/soft/python3
make
make install
Notes:
OS is SUSE Linux Enterprise Server 11 (x86_64)
pointing python to a custom location for the lzma lib located in
openssl was built with ./config shared --openssldir=/data2/soft/openssl
openssl make test prints ALL TESTS SUCCESSFUL.
without the custom openssl *FLAGS, make install is successful and I get these results for make test:
71 tests OK.
tests failed:
test_cmd_line test_gdb test_smtpnet test_ssl
How can I fix this, or at least investigate what is going on?
Edit 1--5:
The shared libs were generated correctly:
> ls /data2/soft/openssl/lib
drwxr-xr-x engines
-rw-r--r-- libcrypto.a
lrwxrwxrwx libcrypto.so -> libcrypto.so.1.0.0
-r-xr-xr-x libcrypto.so.1.0.0
-rw-r--r-- libssl.a
lrwxrwxrwx libssl.so -> libssl.so.1.0.0
-r-xr-xr-x libssl.so.1.0.0
drwxr-xr-x pkgconfig
So I changed this in Setup :
SSL=/data2/soft/openssl/
_ssl _ssl.c \
-DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
$(SSL)/lib/libssl.a $(SSL)/lib/libcrypto.a -ldl
And I changed LDFLAGS/CFLAGS back accordingly. But there is still a -lssl when I run make clean && make, because of the _hashopenssl module:
gcc -pthread -shared -L/data2/local/lib/ -L/data2/local/lib/ -L/data2/local/lib/ -I/data2/local/include/ build/temp.linux-x86_64-3.4/data2/soft/python3/Python-3.4.0rc3/Modules/_hashopenssl.o -L/data2/local/lib/ -L/usr/local/lib -lssl -lcrypto -o build/lib.linux-x86_64-3.4/_hashlib.cpython-34m.so
I guess it is the one causing the cores, because they are still there... I tried adding similar stuff to the Setup file, but there is no commented item for this one, and creating it causes another, more cryptic failure:
gcc -pthread -Xlinker -export-dynamic -o python Modules/python.o libpython3.4m.a -lpthread -ldl -lutil /data2/eoubrayrie/soft/openssl/lib/libssl.a /data2/eoubrayrie/soft/openssl/lib/libcrypto.a -ldl /data2/eoubrayrie/soft/openssl/lib/libssl.a /data2/eoubrayrie/soft/openssl/lib/libcrypto.a -ldl -lm
libpython3.4m.a(config.o):(.data+0x158): undefined reference to `PyInit__hashopenssl'
collect2: ld returned 1 exit status
Edit 6:
I couldn't find any way to modify how _hashlib.so is generated, as there is too much Makefile magic involved (it doesn't appear anywhere, nor does '-lssl', yet both magically end up on the same line together).
But I can get it linking dynamically to my own openssl through good old -I/-L:
ldd build/lib.linux-x86_64-3.4/_hashlib.cpython-34m.so
libssl.so.1.0.0 => /data2/soft/openssl/lib/libssl.so.1.0.0 (0x00007f5605799000)
libcrypto.so.1.0.0 => /data2/soft/openssl/lib/libcrypto.so.1.0.0 (0x00007f56053bd000)
Now the only problem is, gdb's info shared still tells me another one is used at core time... but how?
From To Syms Read Shared Object Library
0x00007ffff5465930 0x00007ffff5466e98 Yes /data2/soft/python3/Python-3.4.0rc3/build/lib.linux-x86_64-3.4/_hashlib.cpython-34m.so
0x00007ffff5321220 0x00007ffff5351878 Yes /opt/python-2.6-64/lib/libssl.so.1.0.0
0x00007ffff50d3100 0x00007ffff519b118 Yes /opt/python-2.6-64/lib/libcrypto.so.1.0.0
env | grep -F 'python-2.6-64' -> shows nothing !
grep -RF 'python-2.6-64' /etc/ld.so.* -> idem
gcc -print-search-dirs | sed 's/:/\n/g' | grep python -> idem
find . -name '*.so*' | xargs ldd | grep ssl -> only gives me the good ones
Level 1 dependencies don't require any wrong ssl version either. This was checked with: find . -name '*.so*' | xargs ldd | awk '/\t+[[:alnum:].]+ => [[:alnum:]./]+ \(/ {print $3}' | sort | uniq | xargs ldd | grep ssl
strace ./python ./Tools/scripts/run_tests.py 2>&1 | grep python-2.6-64 -> shows nothing
So how does the loader pick this wrong library if it cannot know about it?? It's not in any standard location (if it were in /lib I could understand...).
Solution:
Found out how to statically link _hashlib thanks to this OpenOffice bug: though the -Wl,--exclude-libs=ALL option didn't work either, it pointed me to the right lines in setup.py.
TL;DR Here is the patch to setup.py I applied.
And finally... it works !
@noloader I'm accepting your most complete answer, as your help was invaluable, though the "exact" answer for anyone encountering this problem is to compile with the patch above.
How can I fix this, or at least investigate what is going on ?
...
export LDFLAGS='-L/data2/soft/openssl/lib -L/data2/local/lib/'
export LD_LIBRARY_PATH="/data2/soft/openssl/lib:/data2/local/lib/:$LD_LIBRARY_PATH"
I run into these problems a lot too because I avoid the crippled versions of OpenSSL shipped by Debian, Ubuntu, Fedora, et al. For example, Ubuntu ships an OpenSSL that disables TLSv1.1 and TLS v1.2 (cf., Ubuntu 12.04 LTS: OpenSSL downlevel version and does not support TLS 1.2).
You're probably loading the wrong version of the OpenSSL library. If you can get the misbehaving program under the debugger, issue info shared to see which libcrypto and libssl you are actually loading.
ldd might help too. Run it on the Python executable: ldd /data2/soft/python3/python. I can only say it "may" help because the OpenSSL versions are binary compatible, so you might only see a dependency on, for example, libcrypto.so.1.0.0 (use otool -L on Mac OS X). Below I used an rpath to force linking against the libraries in /usr/local/ssl/lib/.
$ ldd my-test.exe
linux-vdso.so.1 => (0x00007fffd61ff000)
libssl.so.1.0.0 => /usr/local/ssl/lib/libssl.so.1.0.0 (0x00007f151528e000)
libcrypto.so.1.0.0 => /usr/local/ssl/lib/libcrypto.so.1.0.0 (0x00007f1514e74000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f1514c42000)
...
As a fix, you might try adding an rpath:
LDFLAGS='-L/data2/soft/openssl/lib -L/data2/local/lib/ -Wl,-rpath,/data2/soft/openssl/lib'
A better fix would be to link to the static version of the OpenSSL library to avoid these problems all together. I believe you can do that with: -Bstatic -lssl -lcrypto -Bdynamic -ldl.
Personally, I don't even use -Bstatic because of different one-off problems. I open the Makefiles, remove all instances of -lssl -lcrypto, and add a full path to the archive to remove all ambiguity. For example, /data2/soft/openssl/lib/libssl.a and /data2/soft/openssl/lib/libcrypto.a.
Note well: an rpath is not honored on Mac OS X. More extreme measures are needed for Mac OS X because the linker does not honor -Bstatic either. You'll have to use the full path trick.
openssl was built with ./config shared --openssldir=/data2/soft/openssl
Another thing... On Fedora, it's not enough to specify shared. You need to add the following, too:
export CFLAGS="-fPIC"
Otherwise, the shared object is not built. If the 1.0.1f shared object is not built, you're probably getting a downlevel version supplied by the distro. You can check what was built, before installing, with:
./config shared --openssldir=/data2/soft/openssl
make all
# Verify artifacts
find . -iname 'libcrypto.*'
find . -iname 'libssl.*'
# Proceed if OK
sudo make install
Finally, make sure that all of Python's dependencies are also using your version of OpenSSL and not the system's version of OpenSSL.
I recently suffered that problem when my program used my OpenSSL; but my program also used libevent and libevent used the system's version of OpenSSL. I fixed it by rebuilding libevent and forcing it to statically link to my version of OpenSSL.
How should I go about changing it? Will adding a _ssl.c: gcc ... line anywhere override the default behaviour? My Makefile skills are rusty and this beast is 1600+ lines long!
Here are the files you want to look at:
$ cd Python-3.4.0rc3
$ grep -R -- '-lcrypto' *
Modules/Setup.dist:# -L$(SSL)/lib -lssl -lcrypto
$ grep -R -- '-lssl' *
Modules/Setup.dist:# -L$(SSL)/lib -lssl -lcrypto
Here are the lines of interest in Setup.dist:
# Socket module helper for SSL support; you must comment out the other
# socket line above, and possibly edit the SSL variable:
#SSL=/usr/local/ssl
#_ssl _ssl.c \
# -DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
# -L$(SSL)/lib -lssl -lcrypto
Uncomment the lines and change Setup.dist to the following. I keep my OpenSSL in /usr/local/ssl, so that's how mine is setup below (you should use /data2/soft/openssl/lib/...):
SSL=/usr/local/ssl
_ssl _ssl.c \
-DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
/usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
Use the full path to the archive, and don't use -l. Be sure to add -ldl because OpenSSL needs it in this configuration.
Once you change Setup.dist, re-run ./configure to effect the changes.
After changing the above lines, here's what my build tree looks like after a configure:
$ grep -R "libssl.a" *
Makefile:LOCALMODLIBS= /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
Makefile:Modules/_ssl$(SO): Modules/_ssl.o; $(BLDSHARED) Modules/_ssl.o /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl -o Modules/_ssl$(SO)
Modules/Setup: /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
Modules/Setup.dist: /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
You can test the static linking by using ldd or otool -L. You will not see an OpenSSL dependency. After make'ing, here's what I got:
$ find . -iname python
./python
$ ldd ./python
linux-vdso.so.1 => (0x00007fff67709000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f3aed8e1000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f3aed6dd000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007f3aed4d9000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f3aed257000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f3aececc000)
/lib64/ld-linux-x86-64.so.2 (0x00007f3aedb14000)
No libssl or libcrypto dependencies to go wrong :) And make test ran fine (actually, one failed test due to a Python bug: Issue 20896):
======================================================================
ERROR: test_get_server_certificate (test.test_ssl.NetworkedTests)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/jwalton/Python-3.4.0rc3/Lib/test/test_ssl.py", line 1373, in test_get_server_certificate
_test_get_server_certificate('svn.python.org', 443, SVN_PYTHON_ORG_ROOT_CERT)
File "/home/jwalton/Python-3.4.0rc3/Lib/test/test_ssl.py", line 1354, in _test_get_server_certificate
pem = ssl.get_server_certificate((host, port))
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 902, in get_server_certificate
with context.wrap_socket(sock) as sslsock:
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 344, in wrap_socket
_context=self)
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 540, in __init__
self.do_handshake()
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 767, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:598)
----------------------------------------------------------------------
Ran 96 tests in 8.610s
FAILED (errors=1, skipped=3)
test test_ssl failed
make: *** [test] Error 1
This isn't an answer, just an observation after running through this with you:
$ make
...
gcc -pthread -c -Wno-unused-result -Werror=declaration-after-statement
-DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I. -IInclude -I./Include
-DPy_BUILD_CORE -o Objects/capsule.o Objects/capsule.c
-fwrapv is really bad. It's used to make illegal programs work. It's better to fix the broken program and drop the -fwrapv. See Ian Lance Taylor's blog on signed overflow.
I couldn't find any way to modify how _hashlib.so is generated, as there is too much Makefile magic involved (it doesn't appear anywhere, nor does '-lssl', yet both magically end up on the same line together).
grep is your friend ;)
$ grep -R _hashlib * | grep ssl
Lib/hashlib.py: f = getattr(_hashlib, 'openssl_' + name)
Lib/hashlib.py: _hashlib.openssl_md_meth_names)
Lib/test/test_hashlib.py: self.assertTrue(hasattr(_hashlib, 'openssl_md5'))
Lib/test/test_hashlib.py: self.assertTrue(hasattr(_hashlib, 'openssl_sha1'))
Lib/test/test_hashlib.py: constructor = getattr(_hashlib, 'openssl_'+algorithm, None)
Lib/test/ssltests.py:TESTS = ['test_asyncio', 'test_ftplib', 'test_hashlib', 'test_httplib',
Lib/test/time_hashlib.py: print(" '_hashlib' 'openssl_hName' 'fast' tests the builtin _hashlib")
Modules/_hashopenssl.c: "_hashlib.HASH", /*tp_name*/
Modules/_hashopenssl.c:static struct PyModuleDef _hashlibmodule = {
Modules/_hashopenssl.c: "_hashlib",
Modules/_hashopenssl.c:PyInit__hashlib(void)
Modules/_hashopenssl.c: m = PyModule_Create(&_hashlibmodule);
PCbuild/build_ssl.py:# Script for building the _ssl and _hashlib modules for Windows.
PCbuild/build_ssl.py:# for the actual _ssl.pyd and _hashlib.pyd DLLs.
PCbuild/build_ssl.py:# it should configure and build SSL, then build the _ssl and _hashlib
setup.py: exts.append( Extension('_hashlib', ['_hashopenssl.c'],
setup.py: print("warning: openssl 0x%08x is too old for _hashlib" %
Tools/ssl/test_multiple_versions.py: "test_asyncio", "test_ftplib", "test_hashlib", "test_httplib",
Tools/ssl/test_multiple_versions.py:MINIMAL_TESTS = ["test_ssl", "test_hashlib"]
Another partial answer...
But I can get it linking dynamically to my own openssl through good old -I/-L:
...
Now the only problem is, gdb info shared still tells me another one is used at core-time ... but how ?
That's your good old friends -l and -L at work. Don't use them, because they do this sort of thing all the time (take it from a guy who has suffered it in the past). Instead, specify the full path to libssl and libcrypto, e.g., /data2/soft/openssl/lib/libssl.a.
We had a similar problem. We are using Apache httpd + mod_wsgi + Python + Django, plus our own C++ module for httpd, which also uses OpenSSL. Everything is loaded within one httpd process, and the correct version of the OpenSSL shared lib (1.0.0l) was loaded with our C++ module. But as soon as we accessed the web app, Python loaded hashlib and exactly the same problem appeared: a segfault in OpenSSL called from Python.
Normally Python compiles against whatever OpenSSL is available at the default location, and there is no way to specify another one without fiddling with setup.py or the makefiles. The Python developers should add a configure option like --with-ssl=path.
We installed the new OpenSSL libs and rebuilt Python and the other binaries, with no success. We mapped the default libssl.so and libcrypto.so to the new OpenSSL binaries, with no success. Finally, after reading this thread, I realized that the wrong headers were probably being used while compiling Python. And that was the problem. Follow these steps to work around it:
There must not be an incorrect version of the OpenSSL headers in /usr/include or the default locations /usr/local/ssl and /usr/contrib/ssl (if you installed openssl-devel, uninstall it, or simply erase/rename the directory):
yum remove openssl-devel
Make a symbolic link to your OpenSSL installation from /usr/local/ssl:
ln -s /opt/openssl-1.0.1l /usr/local/ssl
Ensure that the new OpenSSL libs are accessible from /usr/lib (make symbolic links if they are not installed there):
ln -s /opt/openssl-1.0.1l/lib/libcrypto.so.1.0.0 /usr/lib/libcrypto.so.1.0.0
...
Now configure and do a clean build of Python.
I am able to run Python 2.7.11 with a non-default SSL after applying the fix mentioned here: https://gist.github.com/eddy-geek/9604982
However, with this it does not build the _socket module, which is required by many other modules. For example, easy_install / pip started failing with the error ImportError: No module named _socket.
In Modules/Setup.dist, am I supposed to uncomment or comment the line
_socket socketmodule.o
?
I see socketmodule.o and timemodule.o being generated, but not _socket.so.
Am I missing something?
I'm trying to run mod_wsgi 3.1 under Apache 2.2.14 using a non-default python installation on Mac OS X 10.6.
After downloading the mod_wsgi source I run:
sudo apachectl -k stop
then
./configure --with-python=/usr/local/Cellar/python/2.6.4/bin/python
make
sudo make install
I then start up apache again
sudo apachectl -k start
When I cat /var/log/httpd/error_log I see:
[Mon Dec 21 12:27:26 2009] [warn] mod_wsgi: Compiled for Python/2.6.4.
[Mon Dec 21 12:27:26 2009] [warn] mod_wsgi: Runtime using Python/2.6.1.
[Mon Dec 21 12:27:26 2009] [notice] Apache/2.2.14 (Unix) DAV/2 mod_wsgi/3.1 Python/2.6.1 configured -- resuming normal operations
When I run otool -L mod_wsgi.so I see:
mod_wsgi.so:
/System/Library/Frameworks/Python.framework/Versions/2.6/Python (compatibility version 2.6.0, current version 2.6.1)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 125.0.0)
What gives? Why is it linking with the system framework?
Here is the output from my mod_wsgi configure and build commands:
Archimedes:mod_wsgi-3.1 awolf$ ./configure --with-python=/usr/local/Cellar/python/2.6.4/bin/python
checking for apxs2... no
checking for apxs... /opt/apache2/bin/apxs
checking Apache version... 2.2.14
configure: creating ./config.status
config.status: creating Makefile
Archimedes:mod_wsgi-3.1 awolf$ make
/opt/apache2/bin/apxs -c -I/usr/local/Cellar/python/2.6.4/include/python2.6 -DNDEBUG -Wc,'-arch x86_64' mod_wsgi.c -L/usr/local/Cellar/python/2.6.4/lib -L/usr/local/Cellar/python/2.6.4/lib/python2.6/config -arch x86_64 -lpython2.6 -ldl
/Library/Webserver/build/libtool --silent --mode=compile gcc -prefer-pic -DDARWIN -DSIGPROCMASK_SETS_THREAD_MASK -no-cpp-precomp -g -O2 -I/opt/apache2/include -I/opt/apache2/include -I/opt/apache2/include -arch x86_64 -I/usr/local/Cellar/python/2.6.4/include/python2.6 -DNDEBUG -c -o mod_wsgi.lo mod_wsgi.c && touch mod_wsgi.slo
In file included from /usr/local/Cellar/python/2.6.4/include/python2.6/Python.h:125,
from mod_wsgi.c:135:
/usr/local/Cellar/python/2.6.4/include/python2.6/modsupport.h:27: warning: 'PyArg_ParseTuple' is an unrecognized format function type
/Library/Webserver/build/libtool --silent --mode=link gcc -o mod_wsgi.la -rpath /opt/apache2/modules -module -avoid-version mod_wsgi.lo -L/usr/local/Cellar/python/2.6.4/lib -L/usr/local/Cellar/python/2.6.4/lib/python2.6/config -arch x86_64 -lpython2.6 -ldl
Archimedes:mod_wsgi-3.1 awolf$ sudo make install
Password:
/opt/apache2/bin/apxs -i -S LIBEXECDIR=/opt/apache2/modules -n 'mod_wsgi' mod_wsgi.la
/Library/Webserver/build/instdso.sh SH_LIBTOOL='/Library/Webserver/build/libtool' mod_wsgi.la /opt/apache2/modules
/Library/Webserver/build/libtool --mode=install cp mod_wsgi.la /opt/apache2/modules/
cp .libs/mod_wsgi.so /opt/apache2/modules/mod_wsgi.so
cp .libs/mod_wsgi.lai /opt/apache2/modules/mod_wsgi.la
cp .libs/mod_wsgi.a /opt/apache2/modules/mod_wsgi.a
chmod 644 /opt/apache2/modules/mod_wsgi.a
ranlib /opt/apache2/modules/mod_wsgi.a
This post is old, but it still turns up in searches about mac + homebrew + python, so I thought I'd add some useful information. I was having the same problem as the OP, just with a different module (uwsgi). I learned that you don't have to abandon Homebrew. Homebrew can, in fact, install python as a framework; you just have to tell it to do so:
% brew uninstall python
Uninstalling /usr/local/Cellar/python/2.7.2...
% brew install python --universal --framework
... and all's well.
For some reason, some Python framework installs from source code, usually MacPorts ones, have something wrong with the information embedded in the Python framework, and the run-time lookup path of the executable isn't set correctly. As a result, mod_wsgi ends up using the Python framework from /System/Library instead.
When you run 'configure' for mod_wsgi, add the additional option '--disable-framework'. E.g.:
./configure --with-python=/usr/local/Cellar/python/2.6.4/bin/python --disable-framework
This will change how Python library/framework is linked and may resolve the problem.
For more details see bug fixes (1) and (2) in:
http://code.google.com/p/modwsgi/wiki/ChangesInVersion0206
Graham helped me solve this over on the mod_wsgi mailing list.
http://groups.google.com/group/modwsgi/browse_thread/thread/4046eaf290a49b1e/ae14888450de39f5#ae14888450de39f5
Here's a summary:
The problem was that my installation of Python was done via Homebrew. Homebrew's Python is not installed as a framework or a dylib install, so it could not be used for embedding (such as in Apache/mod_wsgi).
Instead I installed python 2.6.4 from source:
./configure --prefix=/usr/local/python-2.6.4 --enable-framework=/usr/local/python-2.6.4/frameworks --enable-universalsdk=/ MACOSX_DEPLOYMENT_TARGET=10.5 --with-universal-archs=3-way
make
sudo make install
I was able to build a version of python 2.6.4 that I could then build
mod_wsgi with:
./configure --with-python=/usr/local/python-2.6.4/bin/python
make
sudo make install
To confirm:
otool -L /opt/apache2/modules/mod_wsgi.so
/opt/apache2/modules/mod_wsgi.so:
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current
version 125.0.0)
/usr/local/python-2.6.4/frameworks/Python.framework/Versions/2.6/Python (compatibility version 2.6.0, current version 2.6.0)
shows that mod_wsgi is now linking against the 2.6.4 framework and not the system one. When I start Apache, I no longer get the version-mismatch warnings.
I re-installed django, psycopg2, and so on into my new python installation and everything is working like a charm.
Thanks again for your help!