I am trying to compile python-mysqlclient against mariadb-connector-c in a Conda environment. That means that the install prefix is not /usr/local but, for example, $HOME/conda/envs/test. I also want to use the auth_gssapi_client.so plugin.
Both packages build, but import MySQLdb raises the following exception:
Traceback (most recent call last):
File "/opt/emsconda/conda-bld/env/test_tmp/run_test.py", line 2, in <module>
import MySQLdb
File "/opt/emsconda/conda-bld/env/lib/python3.6/site-packages/MySQLdb/__init__.py", line 19, in <module>
import _mysql
ImportError: libmariadb.so.3: cannot open shared object file: No such file or directory
The reason for this is that mysqlclient only searches lib/ but not
lib/mariadb, even though it was configured with the right path and successfully
built. I can work around this issue by copying the *.so files to lib/ (or
by creating a symlink), but then it does not find the GSSAPI plugins …
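For reference, the symlink workaround looks roughly like this (a sketch based on the prefix layout above; it still leaves the plugin problem unsolved):
# expose the connector libraries where mysqlclient looks for them
ln -s $PREFIX/lib/mariadb/libmariadb.so.3 $PREFIX/lib/libmariadb.so.3
ln -s $PREFIX/lib/mariadb/libmariadb.so $PREFIX/lib/libmariadb.so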
I build mariadb-connector-c 3.0.2 this way:
mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=$PREFIX -DCMAKE_BUILD_TYPE=Release ..
make
make install
I can install it and run mariadb_config which gives this output:
Copyright 2011-2015 MariaDB Corporation AB
Get compiler flags for using the MariaDB Connector/C.
Usage: /opt/emsconda/conda-bld/mysqlclient_1510048680472/_h_env/bin/mariadb_config [OPTIONS]
--cflags [-I/opt/emsconda/conda-bld/env/include/mariadb -I/opt/emsconda/conda-bld/env/include/mariadb/mysql]
--include [-I/opt/emsconda/conda-bld/env/include/mariadb -I/opt/emsconda/conda-bld/env/include/mariadb/mysql]
--libs [-L/opt/emsconda/conda-bld/env/lib/mariadb/ -lmariadb -lpthread -ldl -lm -lssl -lcrypto]
--libs_r [-L/opt/emsconda/conda-bld/env/lib/mariadb/ -lmariadb -lpthread -ldl -lm -lssl -lcrypto]
--libs_sys [-lpthread -ldl -lm -lssl -lcrypto]
--version [10.2.6]
--socket [/tmp/mysql.sock]
--port [3306]
--plugindir [/opt/emsconda/conda-bld/env/lib/mariadb/plugin]
--tlsinfo [OpenSSL 1.0.2k]
I then build python-mysqlclient 1.3.12 this way:
MYSQL_CONFIG="$PREFIX/bin/mariadb_config"
echo "mysql_config = $PREFIX/bin/mariadb_config" >> site.cfg
$PYTHON -m pip install -I --no-deps .
There are two possible solutions for this problem:
Configure mariadb-connector-c to directly put its stuff into lib/ – I have not found documentation on how to do this.
Make python-mysqlclient respect the paths that mariadb_config returns – How?
You have to statically link the GSSAPI plugin into mariadb-connector-c; then Python's mysqlclient will work.
This is how to build mariadb-connector-c:
mkdir build
cd build
cmake \
-DCMAKE_INSTALL_PREFIX=$PREFIX \
-DINSTALL_LIBDIR=lib \
-DINSTALL_PLUGINDIR=lib/plugin \
-DWITH_MYSQLCOMPAT=ON \
-DAUTH_GSSAPI=STATIC \
-DCMAKE_BUILD_TYPE=Release \
..
make
make install
# WITH_MYSQLCOMPAT only creates links for the libs, not for the binary:
cd $PREFIX/bin
ln -s mariadb_config mysql_config
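To confirm the result, an import check along these lines should now succeed inside the environment (illustrative, not part of the build recipe):
python -c "import MySQLdb; print(MySQLdb.version_info)"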
I want to debug an ARM application on devices such as an Android machine. I prefer to use gdb (the ARM version) rather than gdb with gdbserver, because there is a dashboard, a visual interface for GDB in Python.
It must work together with gdb (the ARM version) on the device, so I need to cross-compile an ARM version of gdb with Python support. The command used is shown below:
./configure --host=arm-linux-gnueabi --target=arm-linux-gnueabi --with-python=/usr/bin
But then an error message appeared:
configure:8096: checking whether to use python
configure:8098: result: /usr/bin/
configure:8316: checking for python2.7
configure:8334: arm-linux-gnueabi-gcc -o conftest -g -O2 -I/usr/include/python2.7 -I/usr/include/python2.7 conftest.c -ldl -ltermcap -lm -lpthread -ldl -lutil -lm -lpython2.7 -Xlinker -export-dynamic -Wl,-O1 -Wl,-Bsymbolic-functions >&5
In file included from /usr/include/python2.7/Python.h:8:0,
from conftest.c:50:
/usr/include/python2.7/pyconfig.h:15:52: fatal error: arm-linux-gnueabi/python2.7/pyconfig.h: No such file or directory
compilation terminated.
Then I looked at line 15 in /usr/include/python2.7/pyconfig.h, shown below:
# elif defined(__ARM_EABI__) && !defined(__ARM_PCS_VFP)
#include <arm-linux-gnueabi/python2.7/pyconfig.h>
Here is the point: I only have x86_64-linux-gnu/python2.7/pyconfig.h in /usr/include. How can I get arm-linux-gnueabi/python2.7/pyconfig.h? I have already run apt-get install python2.7-dev.
I ran into this problem when attempting to cross-compile a python wrapper module built with SWIG, but it looks to me like it will happen to anyone cross compiling python-linked C code on a Debian system.
Apparently the Debian python-dev packages are not set up with the header files to facilitate a cross compile, but it is possible to go get them manually. I'm not sure if this is a python bug or a Debian package bug, and I haven't researched if it applies to other distros.
pyconfig.h sets preprocessor defines to tell the Python source code about platform-dependent things like endianness and data type sizes, so the proper pyconfig.h is definitely needed to compile Python source correctly. Fortunately, pyconfig.h should be the only file you need to grab separately, and it is available from the Debian python-dev package for your target platform.
You can download the package file for armel or any other architecture from https://packages.debian.org/jessie/libpython2.7-dev and extract the include directory yourself, or you can use the following commands to download the package and copy the proper files for armel to /usr/local/include:
wget http://security.debian.org/debian-security/pool/updates/main/p/python2.7/libpython2.7-dev_2.7.9-2+deb8u2_armel.deb
dpkg -x libpython2.7-dev_2.7.9-2+deb8u2_armel.deb libpython2.7-dev_2.7.9-2+deb8u2_armel_extracted
sudo cp -r libpython2.7-dev_2.7.9-2+deb8u2_armel_extracted/usr/include/arm-linux-gnueabi/ /usr/local/include/
rm -r libpython2.7-dev_2.7.9-2+deb8u2_armel*
Note that with some cross compilers you will have to add -I /usr/local/include to the compiler options if they do not search this location by default, but to me this is better than modifying /usr/include and risking your changes being wiped out by the OS.
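As a rough sanity check (assuming the headers were copied as above), you can ask the cross compiler to preprocess an include of Python.h and see whether the ARM pyconfig.h now resolves:
echo '#include <Python.h>' | arm-linux-gnueabi-gcc -I/usr/local/include -I/usr/include/python2.7 -E -x c - > /dev/null && echo "pyconfig.h found"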
When compiling python-3.4.0rc3 with a local openssl-1.0.1f shared install, make prints no error but then I get the following core dump on make install or make test:
Program terminated with signal 11, Segmentation fault.
(gdb) bt
#0 0x00007f131dd10510 in EVP_PKEY_CTX_dup () from /data2/soft/openssl/lib/libcrypto.so.1.0.0
#1 0x00007f131dd0284f in EVP_MD_CTX_copy_ex () from /data2/soft/openssl/lib/libcrypto.so.1.0.0
#2 0x00007f131e256ab5 in EVPnew (name_obj=0x7f131e46a500, digest=0x0, initial_ctx=0x7f131e459a40, cp=0x0, len=0) at /data2/soft/python3/Python-3.4.0rc3/Modules/_hashopenssl.c:410
#3 0x00007f131e25726e in EVP_new_md5 (self=<value optimized out>, args=<value optimized out>) at /data2/soft/python3/Python-3.4.0rc3/Modules/_hashopenssl.c:799
#4 0x00000000004c7eef in ?? ()
Here is the complete list of commands used
tar -axf Python-3.4.0rc3.tgz
cd Python-3*
# For lzma and a pip-compatible openssl
export CFLAGS='-I/data2/soft/openssl/include/openssl -I/data2/local/include/'
export LDFLAGS='-L/data2/soft/openssl/lib -L/data2/local/lib/'
export LD_LIBRARY_PATH="/data2/soft/openssl/lib:/data2/local/lib/:$LD_LIBRARY_PATH"
# Ready !
./configure --prefix=/data2/soft/python3
make
make install
Notes:
OS is SUSE Linux Enterprise Server 11 (x86_64)
the /data2/local paths point Python to a custom location for the lzma lib
openssl was built with ./config shared --openssldir=/data2/soft/openssl
openssl make test prints ALL TESTS SUCCESSFUL.
without the custom openssl *FLAGS, make install is successful and I get these results for make test:
71 tests OK.
tests failed:
test_cmd_line test_gdb test_smtpnet test_ssl
How can I fix this, or at least investigate what is going on?
Edits 1-5:
The shared libs were generated correctly:
> ls /data2/soft/openssl/lib
drwxr-xr-x engines
-rw-r--r-- libcrypto.a
lrwxrwxrwx libcrypto.so -> libcrypto.so.1.0.0
-r-xr-xr-x libcrypto.so.1.0.0
-rw-r--r-- libssl.a
lrwxrwxrwx libssl.so -> libssl.so.1.0.0
-r-xr-xr-x libssl.so.1.0.0
drwxr-xr-x pkgconfig
So I changed this in Setup:
SSL=/data2/soft/openssl/
_ssl _ssl.c \
-DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
$(SSL)/lib/libssl.a $(SSL)/lib/libcrypto.a -ldl
And I changed LDFLAGS/CFLAGS back accordingly. But there is still a -lssl when I run make clean && make, because of the _hashopenssl module:
gcc -pthread -shared -L/data2/local/lib/ -L/data2/local/lib/ -L/data2/local/lib/ -I/data2/local/include/ build/temp.linux-x86_64-3.4/data2/soft/python3/Python-3.4.0rc3/Modules/_hashopenssl.o -L/data2/local/lib/ -L/usr/local/lib -lssl -lcrypto -o build/lib.linux-x86_64-3.4/_hashlib.cpython-34m.so
I guess it is the one causing the cores, because they are still there... I tried adding similar stuff to the Setup file, but there is no commented item for this one, and creating one causes another, more cryptic failure:
gcc -pthread -Xlinker -export-dynamic -o python Modules/python.o libpython3.4m.a -lpthread -ldl -lutil /data2/eoubrayrie/soft/openssl/lib/libssl.a /data2/eoubrayrie/soft/openssl/lib/libcrypto.a -ldl /data2/eoubrayrie/soft/openssl/lib/libssl.a /data2/eoubrayrie/soft/openssl/lib/libcrypto.a -ldl -lm
libpython3.4m.a(config.o):(.data+0x158): undefined reference to `PyInit__hashopenssl'
collect2: ld returned 1 exit status
Edit 6:
I couldn't find any way to modify how _hashlib.so is generated, as there is too much Makefile magic involved (it doesn't appear anywhere, nor does '-lssl', yet both magically end up on the same line together).
But I can get it linking dynamically to my own openssl through good old -I/-L:
ldd build/lib.linux-x86_64-3.4/_hashlib.cpython-34m.so
libssl.so.1.0.0 => /data2/soft/openssl/lib/libssl.so.1.0.0 (0x00007f5605799000)
libcrypto.so.1.0.0 => /data2/soft/openssl/lib/libcrypto.so.1.0.0 (0x00007f56053bd000)
Now the only problem is, gdb info shared still tells me another one is used at core time ... but how?
From To Syms Read Shared Object Library
0x00007ffff5465930 0x00007ffff5466e98 Yes /data2/soft/python3/Python-3.4.0rc3/build/lib.linux-x86_64-3.4/_hashlib.cpython-34m.so
0x00007ffff5321220 0x00007ffff5351878 Yes /opt/python-2.6-64/lib/libssl.so.1.0.0
0x00007ffff50d3100 0x00007ffff519b118 Yes /opt/python-2.6-64/lib/libcrypto.so.1.0.0
env | grep -F 'python-2.6-64' -> shows nothing!
grep -RF 'python-2.6-64' /etc/ld.so.* -> idem
gcc -print-search-dirs | sed 's/:/\n/g' | grep python -> idem
find . -name '*.so*' | xargs ldd | grep ssl -> only gives me the good ones
Level 1 dependencies don't require any wrong ssl version either. This was checked with: find . -name '*.so*' | xargs ldd | awk '/\t+[[:alnum:].]+ => [[:alnum:]./]+ \(/ {print $3}' | sort | uniq | xargs ldd | grep ssl
strace ./python ./Tools/scripts/run_tests.py 2>&1 | grep python-2.6-64 -> shows nothing
So how does ld pick this wrong library if it cannot know about it?? It's not in any standard location (if it were in /lib I could understand...)
Solution:
Found how to statically link _hashlib thanks to this OpenOffice bug: though the -Wl,--exclude-libs=ALL option didn't work either, it pointed me to the right lines in setup.py.
TL;DR Here is the patch to setup.py I applied.
And finally... it works !
@noloader I'm accepting your most complete answer as your help was invaluable, though the "exact" answer for anyone encountering this problem is to compile with the patch above.
How can I fix this, or at least investigate what is going on?
...
export LDFLAGS='-L/data2/soft/openssl/lib -L/data2/local/lib/'
export LD_LIBRARY_PATH="/data2/soft/openssl/lib:/data2/local/lib/:$LD_LIBRARY_PATH"
I run into these problems a lot too because I avoid the crippled versions of OpenSSL shipped by Debian, Ubuntu, Fedora, et al. For example, Ubuntu ships an OpenSSL that disables TLSv1.1 and TLS v1.2 (cf., Ubuntu 12.04 LTS: OpenSSL downlevel version and does not support TLS 1.2).
You're probably loading the wrong version of the OpenSSL library. If you can get the misbehaving program under the debugger, issue info shared to see which libcrypto and libssl you are actually loading.
ldd might help too. Run it on the Python executable: ldd /data2/soft/python3/python. I can only say it "may" help because the OpenSSL's are binary compatible, so you might only see a dependency on, for example, libcrypto.so.1.0.0 (use otool -L on Mac OS X). Below I used an rpath to force linking against the libraries in /usr/local/ssl/lib/.
$ ldd my-test.exe
linux-vdso.so.1 => (0x00007fffd61ff000)
libssl.so.1.0.0 => /usr/local/ssl/lib/libssl.so.1.0.0 (0x00007f151528e000)
libcrypto.so.1.0.0 => /usr/local/ssl/lib/libcrypto.so.1.0.0 (0x00007f1514e74000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f1514c42000)
...
As a fix, you might try adding an rpath:
LDFLAGS='-L/data2/soft/openssl/lib -L/data2/local/lib/ -Wl,-rpath,/data2/soft/openssl/lib'
A better fix would be to link to the static version of the OpenSSL library to avoid these problems altogether. I believe you can do that with: -Bstatic -lssl -lcrypto -Bdynamic -ldl.
Personally, I don't even use -Bstatic because of different one-off problems. I open the Makefiles, remove all instances of -lssl -lcrypto, and add a full path to the archive to remove all ambiguity. For example, /data2/soft/openssl/lib/libssl.a and /data2/soft/openssl/lib/libcrypto.a.
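As a sketch, the rule for the _ssl module then ends up carrying the archives directly instead of -lssl -lcrypto (the same shape as the Setup.dist result shown further down):
$(BLDSHARED) Modules/_ssl.o /data2/soft/openssl/lib/libssl.a /data2/soft/openssl/lib/libcrypto.a -ldl -o Modules/_ssl$(SO)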
Note well: an rpath is not honored on Mac OS X. More extreme measures are needed for Mac OS X because the linker does not honor -Bstatic either. You'll have to use the full path trick.
openssl was built with ./config shared --openssldir=/data2/soft/openssl
Another thing... On Fedora, it's not enough to specify shared. You need to add the following, too:
export CFLAGS="-fPIC"
Otherwise, the shared object is not built. If the 1.0.1f shared object is not built, you're probably getting a downlevel version supplied by the distro. You can check what was built before install with:
./config shared --openssldir=/data2/soft/openssl
make all
# Verify artifacts
find . -iname 'libcrypto.*'
find . -iname 'libssl.*'
# Proceed if OK
sudo make install
Finally, make sure that all of Python's dependencies are also using your version of OpenSSL and not the system's version of OpenSSL.
I recently suffered that problem when my program used my OpenSSL; but my program also used libevent and libevent used the system's version of OpenSSL. I fixed it by rebuilding libevent and forcing it to statically link to my version of OpenSSL.
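A rough way to spot such an indirect dependency is to walk the shared objects your process loads and see which OpenSSL they resolve to (the path is illustrative; adapt it to your build tree):
find /path/to/build -name '*.so*' | xargs ldd 2>/dev/null | grep -E 'libssl|libcrypto'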
How should I go about changing it? Will adding a _ssl.c: gcc ... line anywhere override the default behaviour? My Makefile skills are rusty and this beast is 1600+ lines long!
Here are the files you want to look at:
$ cd Python-3.4.0rc3
$ grep -R -- '-lcrypto' *
Modules/Setup.dist:# -L$(SSL)/lib -lssl -lcrypto
$ grep -R -- '-lssl' *
Modules/Setup.dist:# -L$(SSL)/lib -lssl -lcrypto
Here are the lines of interest in Setup.dist:
# Socket module helper for SSL support; you must comment out the other
# socket line above, and possibly edit the SSL variable:
#SSL=/usr/local/ssl
#_ssl _ssl.c \
# -DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
# -L$(SSL)/lib -lssl -lcrypto
Uncomment the lines and change Setup.dist to the following. I keep my OpenSSL in /usr/local/ssl, so that's how mine is set up below (you should use /data2/soft/openssl/lib/...):
SSL=/usr/local/ssl
_ssl _ssl.c \
-DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
/usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
Use the full path to the archive, and don't use -l. Be sure to add -ldl because OpenSSL needs it in this configuration.
Once you change Setup.dist, re-run ./configure to effect the changes.
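A minimal rebuild sequence after editing Setup.dist might look like this (using the prefix from the question):
./configure --prefix=/data2/soft/python3
make clean
make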
After changing the above lines, here's what things look like after a configure:
$ grep -R "libssl.a" *
Makefile:LOCALMODLIBS= /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
Makefile:Modules/_ssl$(SO): Modules/_ssl.o; $(BLDSHARED) Modules/_ssl.o /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl -o Modules/_ssl$(SO)
Modules/Setup: /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
Modules/Setup.dist: /usr/local/ssl/lib/libssl.a /usr/local/ssl/lib/libcrypto.a -ldl
You can test the static linking by using ldd or otool -L; you will not see an OpenSSL dependency. After running make, here's what I got:
$ find . -iname python
./python
$ ldd ./python
linux-vdso.so.1 => (0x00007fff67709000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f3aed8e1000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f3aed6dd000)
libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007f3aed4d9000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f3aed257000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f3aececc000)
/lib64/ld-linux-x86-64.so.2 (0x00007f3aedb14000)
No libssl or libcrypto dependencies to go wrong :) And make test ran fine (actually, one failed test due to a Python bug: Issue 20896):
======================================================================
ERROR: test_get_server_certificate (test.test_ssl.NetworkedTests)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/jwalton/Python-3.4.0rc3/Lib/test/test_ssl.py", line 1373, in test_get_server_certificate
_test_get_server_certificate('svn.python.org', 443, SVN_PYTHON_ORG_ROOT_CERT)
File "/home/jwalton/Python-3.4.0rc3/Lib/test/test_ssl.py", line 1354, in _test_get_server_certificate
pem = ssl.get_server_certificate((host, port))
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 902, in get_server_certificate
with context.wrap_socket(sock) as sslsock:
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 344, in wrap_socket
_context=self)
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 540, in __init__
self.do_handshake()
File "/home/jwalton/Python-3.4.0rc3/Lib/ssl.py", line 767, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:598)
----------------------------------------------------------------------
Ran 96 tests in 8.610s
FAILED (errors=1, skipped=3)
test test_ssl failed
make: *** [test] Error 1
This isn't an answer, just an observation after running through this with you:
$ make
...
gcc -pthread -c -Wno-unused-result -Werror=declaration-after-statement
-DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I. -IInclude -I./Include
-DPy_BUILD_CORE -o Objects/capsule.o Objects/capsule.c
-fwrapv is really bad. It's used to make illegal programs work. It's better to fix the broken program and drop the -fwrapv. See Ian Lance Taylor's blog on Signed Overflow.
I couldn't find any way to modify how _hashlib.so is generated, as there is too much Makefile magic involved (it doesn't appear anywhere, nor does '-lssl', yet both magically end up on the same line together).
grep is your friend ;)
$ grep -R _hashlib * | grep ssl
Lib/hashlib.py: f = getattr(_hashlib, 'openssl_' + name)
Lib/hashlib.py: _hashlib.openssl_md_meth_names)
Lib/test/test_hashlib.py: self.assertTrue(hasattr(_hashlib, 'openssl_md5'))
Lib/test/test_hashlib.py: self.assertTrue(hasattr(_hashlib, 'openssl_sha1'))
Lib/test/test_hashlib.py: constructor = getattr(_hashlib, 'openssl_'+algorithm, None)
Lib/test/ssltests.py:TESTS = ['test_asyncio', 'test_ftplib', 'test_hashlib', 'test_httplib',
Lib/test/time_hashlib.py: print(" '_hashlib' 'openssl_hName' 'fast' tests the builtin _hashlib")
Modules/_hashopenssl.c: "_hashlib.HASH", /*tp_name*/
Modules/_hashopenssl.c:static struct PyModuleDef _hashlibmodule = {
Modules/_hashopenssl.c: "_hashlib",
Modules/_hashopenssl.c:PyInit__hashlib(void)
Modules/_hashopenssl.c: m = PyModule_Create(&_hashlibmodule);
PCbuild/build_ssl.py:# Script for building the _ssl and _hashlib modules for Windows.
PCbuild/build_ssl.py:# for the actual _ssl.pyd and _hashlib.pyd DLLs.
PCbuild/build_ssl.py:# it should configure and build SSL, then build the _ssl and _hashlib
setup.py: exts.append( Extension('_hashlib', ['_hashopenssl.c'],
setup.py: print("warning: openssl 0x%08x is too old for _hashlib" %
Tools/ssl/test_multiple_versions.py: "test_asyncio", "test_ftplib", "test_hashlib", "test_httplib",
Tools/ssl/test_multiple_versions.py:MINIMAL_TESTS = ["test_ssl", "test_hashlib"]
Another partial answer...
But I can get it linking dynamically to my own openssl through good old -I/-L:
...
Now the only problem is, gdb info shared still tells me another one is used at core time ... but how?
Those are your good old friends -l and -L. Don't use them, because they do this sort of thing all the time (take it from a guy who has suffered it in the past). Instead, specify the full path to libssl and libcrypto. E.g., use /data2/soft/openssl/lib/libssl.a.
We had a similar problem. We use Apache httpd + mod_wsgi + Python + Django plus our own C++ module for httpd, which also uses OpenSSL. Everything is loaded within one httpd process, and the correct version of the OpenSSL shared lib (1.0.0l) was loaded with our C++ module. But as soon as we accessed the web app, Python loaded hashlib and exactly the same problem appeared: a segfault in OpenSSL called from Python.
Normally Python compiles against whatever OpenSSL is available at the default location, and there is no way to specify a different one without fiddling with setup.py or the makefiles. The Python developers should add a configure setting such as --with_ssl=path.
We installed the new OpenSSL libs and rebuilt Python and the other binaries, but with no success. We mapped the default libssl.so and libcrypto.so to the new OpenSSL binaries, with no success. Finally, after reading this thread, I realized that the wrong headers were probably being used while compiling Python. And that was the problem. Follow these steps to work around it:
there must not be an incorrect version of the OpenSSL headers in /usr/include or in the default locations /usr/local/ssl and /usr/contrib/ssl (if you installed openssl-devel then uninstall it, or simply erase/rename the directory)
yum remove openssl-devel
make a symbolic link to your OpenSSL installation from /usr/local/ssl
ln -s /opt/openssl-1.0.1l /usr/local/ssl
ensure that the new OpenSSL libs are accessible from /usr/lib (make symbolic links if they are not installed there)
ln -s /opt/openssl-1.0.1l/lib/libcrypto.so.1.0.0 /usr/lib/libcrypto.so.1.0.0
...
now configure and do a clean build of Python; see the quick check below to verify which OpenSSL it picked up
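A quick check (illustrative) that the rebuilt interpreter picked up the intended OpenSSL and that hashlib no longer crashes:
python -c "import ssl, hashlib; print(ssl.OPENSSL_VERSION); print(hashlib.md5(b'x').hexdigest())"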
I am able to run Python 2.7.11 with a non-default SSL after applying the patch mentioned here: https://gist.github.com/eddy-geek/9604982
However, with this it does not build the _socket module, which is required by many other modules. For example, easy_install / pip started failing with the error ImportError: No module named _socket.
In Modules/Setup.dist, am I supposed to uncomment or comment out the line
_socket socketmodule.o
?
I see socketmodule.o and timemodule.o getting generated, but not _socket.so.
Am I missing something?
I have a virtual Linux box with Debian 7.1 where I need a Python 2.4.6 to reanimate an old Zope installation (in order to update it to Plone 4, of course).
I definitely need ssl support, and when I'm compiling, I want readline as well, of course. Finally, of course, I need zlib, otherwise ez_setup.py etc. won't work; I'm having a hard time getting zlib included.
I downloaded the tarball of Python 2.4.6, enabled ssl in Modules/Setup.dist:
SSL=/usr/local/ssl
_ssl _ssl.c \
-DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
-L$(SSL)/lib -lssl -lcrypto
... and called:
./configure --prefix=/my/dest/dir --with-zlib
make
make gives me some warnings at the end about crypt and nis, but make install doesn't yield any errors. However, the resulting Python features both readline and ssl support, but no zlib; thus, I can't use ez_setup.py to get setuptools/pip etc.
I tried both uncommenting and re-excluding the line
zlib zlibmodule.c -I$(prefix)/include -L$(exec_prefix)/lib -lz
from Setup.dist.
Some system packages which are installed:
zlib1g-dev
lib32z1-dev
libreadline-gplv2-dev
Is there anything else I have missed?
Update, after having read https://stackoverflow.com/a/4047583/1051649:
I did
$ sudo apt-get install zlib1g zlib1g-dev libncurses5-dev libreadline6-dev ncurses-doc
$ python setup.py clean
$ ./configure --with-ssl --with-zlib --prefix=...
$ make
$ sudo make install
The resulting interpreter was not able to execute distribute_setup.py.
I found the solution here:
I changed setup.py, looking for the first assignment to the lib_dirs variable, changing it like so:
lib_dirs = self.compiler.library_dirs + [
'/lib64', '/usr/lib64',
'/lib', '/usr/lib',
'/usr/lib/x86_64-linux-gnu', # added
'/usr/lib/i386-linux-gnu', # added
]
Then I repeated the whole thing, starting with setup.py clean, and it worked.
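A quick way to confirm that the zlib module made it into the build (illustrative; adjust the prefix):
/my/dest/dir/bin/python -c "import zlib; print zlib.ZLIB_VERSION"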
I have Trac 0.12rc1 (customized by somebody); it needs the Python Subversion bindings to work with SVN repos. But all of my attempts to compile the libraries ended with:
Last command in make:
/bin/sh /usr/local/src/subversion-1.6.20/libtool --tag=CC --silent --mode=compile gcc -pthread -fPIC -g -O2 -pthread -DLINUX=2 -D_REENTRANT -D_GNU_SOURCE -I/usr/local
/src/subversion-1.6.20/subversion -I/usr/local/src/subversion-1.6.20/subversion/include
-I/usr/local/src/subversion-1.6.20/subversion/bindings/swig -I/usr/local/src/subversion-1.6.20/subversion/bindings/swig/include
-I/usr/local/src/subversion-1.6.20/subversion/bindings/swig/proxy
-I/usr/local/src/subversion-1.6.20/subversion/bindings/swig/proxy
-I/usr/include/apr-1 -I/usr/include/apr-1 -I/usr/include/python2.6
-I/usr/local/src/subversion-1.6.20/subversion/bindings/swig/python/libsvn_swig_py
-prefer-pic -c -o subversion/bindings/swig/python/svn_client.lo subversion/bindings
/swig/python/svn_client.c
Last part of its output (it looks the same every time):
subversion/bindings/swig/python/svn_client.c:23637: error: expected ‘)’ before ‘*’ token
subversion/bindings/swig/python/svn_client.c: In function ‘init_client’:
subversion/bindings/swig/python/svn_client.c:23690: error: ‘PyObject’ undeclared (first use in this function)
subversion/bindings/swig/python/svn_client.c:23690: error: ‘m’ undeclared (first use in this function)
subversion/bindings/swig/python/svn_client.c:23690: error: ‘d’ undeclared (first use in this function)
subversion/bindings/swig/python/svn_client.c:23693: error: ‘SwigMethods’ undeclared (first use in this function)
What I tried:
Python:
2.4 (works, but our Trac doesn't work properly with it)
2.6 "make swig-py" fails
2.7 "make swig-py" fails
Subversion:
1.6.17
1.6.20
1.7.8
SWIG:
2.0.9
1.3.29
Has anyone had success building the Subversion bindings for Python 2.6+?
Is it possible at all?
Is it possible to find any pre-built binaries for RHEL/OEL/CentOS 5.x?
Use this: http://egao1980.blogspot.com/2011/03/installing-trac-and-subversion-with.html
Below is a copy-paste from the site:
Install the EPEL repo:
rpm -ihv http://download.fedora.redhat.com/pub/epel/5/x86_64/epel-release-5-4.noarch.rpm
Good to have
yum install bash-completion
Blacklist subversion in the [base] and [updates] repos:
vim /etc/yum.repos.d/CentOS-Base.repo
[base]
exclude=subversion
exclude=subversion-devel
...
[updates]
exclude=subversion
exclude=subversion-devel
...
Get 1.6.6 Subversion install script from www.wandisco.com and follow instructions to install it.
Install MySQL.
yum install mysql mysql-devel
Install Python 2.5, Trac and rebuild mod_python according to Installing-python-25-on-centos-5 and Installing-trac-on-centos-5.
Build Subversion bindings:
get Sqlite:
wget http://www.sqlite.org/sqlite-autoconf-3070500.tar.gz
tar xzvf sqlite-autoconf-3070500.tar.gz && cd sqlite-autoconf-3070500 && ./configure && make && make install
get and build SWIG
wget http://downloads.sourceforge.net/project/swig/swig/swig-2.0.2/swig-2.0.2.tar.gz
tar xzvf swig-2.0.2.tar.gz && cd swig-2.0.2 && ./configure --with-python=/usr/bin/python25 --prefix=/usr && make && make install
build Python 2.5 subversion bindings
wget http://subversion.tigris.org/downloads/subversion-1.6.16.tar.bz2
tar xjvf subversion-1.6.16.tar.bz2 && cd subversion-1.6.16
./configure PYTHON=/usr/bin/python25 --with-sqlite=/usr/local && make && make swig-py && make install-swig-py
echo /usr/local/lib/svn-python > /usr/lib/python2.5/site-packages/svn.pth
At this point you should have Subversion 1.6.6, Trac 0.12, MySQLDb 1.2.2 and mod_python configured to run with Python 2.5.
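To verify that the bindings are importable from Python 2.5, something like this should run without errors (illustrative):
python25 -c "import svn.core, svn.client; print 'bindings OK'"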
I've contacted the trac-users mailing list and they provided a solution. It was a patch to the spec file used for RPM building. After the patch was applied to the package, everything went fine.
https://groups.google.com/d/topic/trac-users/BVVnh9I17Po/discussion