Building 64-bit libpython27.a using Cygwin dlltool - python

I'm trying to build a Python extension DLL on a 64-bit Win7 machine using Cygwin (as Cygwin only runs as a 32-bit process, this is actually cross-compiling).
I created libpython27.a myself from python27.dll using dlltool (as explained, for example, here), but the build fails during the linker phase saying
skipping incompatible c:\Python27\libs/libpython27.a when searching for -lpython27
This is exactly the error reported here (where the asker ended up switching to the MSVC compiler...).
More info:
- ActivePython 2.7.2, win64, x64
- latest version of Cygwin, using the /usr/bin/x86_64-w64-mingw32-g++.exe compiler
Does anyone know if this is supported?
Is there a way to use dlltool that I'm missing here?
(I did find here the guidance to use
dlltool --as-flags=--64 -m i386:x86-64 -k -l libpython27.a -d python.def
but when doing so I got an "invalid bfd target" error from dlltool.)
Thanks!
Update: I believe it can be done, because the Enthought Python distribution contains such a file. I would like to create one for the more common distributions, which don't include it.

The problem is that you are using the 32-bit dlltool, probably the one in C:\MinGW\bin instead of C:\MinGW64\bin. You can change your PATH, or run the 64-bit tool explicitly like this:
C:\MinGW64\bin\dlltool -v --dllname python27.dll --def python27.def --output-lib libpython27.a
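If you also need to generate the .def file first, one possible end-to-end sketch (hedged: gendef ships with the mingw-w64 tools, and the package name and install path vary by distribution):
# dump the exports of python27.dll into python27.def
gendef python27.dll
# then build the import library with the 64-bit dlltool as above
C:\MinGW64\bin\dlltool -v --dllname python27.dll --def python27.def --output-lib libpython27.a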

I'm not sure how helpful you'll find this, but at the bottom of the page you linked to there's a link to here, where it says:
Do not use MinGW-w64. As you will notice, the MinGW import library for
Python (e.g. libpython27.a) is omitted from the AMD64 version of
Python. This is deliberate. Do not try to make one using dlltool.
There is no official MinGW-w64 release yet, it is still in "beta" and
considered unstable, although you can get a 64-bit build from e.g.
TDM-GCC. There have also been issues with the mingw runtime
conflicting with the MSVC runtime; this can happen from places you
don't expect, such as inside runtime libraries for g++ or gfortran. To
stay on the safe side, avoid MinGW-w64 for now.

Related

How do I build a C Python Extension for Windows using Cygwin64?

To preface: my code works as I expect when compiling and running on Linux. However, this library needs to be compiled for use on a Windows machine. I looked into a couple of different options and decided that using Cygwin to compile for Windows seemed to be the correct choice. I'm using a setup.py file with the distutils.core library and compiling using python setup.py install.
When compiling on Windows in Cygwin, it fails to find pthread.h, arpa/inet.h, netinet/in.h, and sys/socket.h. I was under the impression that Cygwin came prepackaged with these headers, which is why I chose to use it. The alternative to Cygwin is putting preprocessor commands everywhere and using Windows-specific libraries such as winsock2.h, which I want to avoid if at all possible. Is it possible to compile for Windows using Cygwin? If so, what have I done wrong to cause Cygwin to not recognize these headers?
You need to install the proper headers
$ cygcheck -p usr/include/pthread.h
Found 9 matches for usr/include/pthread.h
cygwin-devel-3.0.7-1 - cygwin-devel: Core development files
..
cygwin-devel-3.1.6-1 - cygwin-devel: Core development files
...
so install the cygwin-devel package
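If you prefer to script that rather than rerun the graphical installer, a rough sketch (assuming the Cygwin installer is available as setup-x86_64.exe; adjust the path to wherever you keep it):
# -q runs unattended, -P names extra packages to install
setup-x86_64.exe -q -P cygwin-devel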
To check all the shared libraries needed by the built DLL, you can use cygcheck:
$ cygcheck /usr/lib/python3.8/site-packages/Cython/Compiler/FlowControl.cpython-38-x86_64-cygwin.dll
D:\cygwin64\lib\python3.8\site-packages\Cython\Compiler\FlowControl.cpython-38-x86_64-cygwin.
dll
D:\cygwin64\bin\cygwin1.dll
C:\WINDOWS\system32\KERNEL32.dll
C:\WINDOWS\system32\ntdll.dll
C:\WINDOWS\system32\KERNELBASE.dll
D:\cygwin64\bin\libpython3.8.dll
D:\cygwin64\bin\cygintl-8.dll
D:\cygwin64\bin\cygiconv-2.dll
D:\cygwin64\bin\cyggcc_s-seh-1.dll
As it was built with Cygwin's Python, you also need to transfer the Cygwin Python...
Most important, I think, is to follow the instructions in the Python help or on the Python doc web site for "Extending and Embedding the Python Interpreter" for the version you are building the extension for. For Windows, the build instructions identify the build environment used to create the binary package that you download from python.org, usually something like VS2013 or VS2017. (As an aside, I think the Community editions have everything you need, and I don't think you actually have to use the Visual Studio GUI when you build using nmake from the CMD.EXE terminal.)
To build in Cygwin for use in a Windows version of Python, you may need to install and then use the x86_64-w64-mingw32-gcc, etc., Cygwin packages to cross-compile non-Cygwin (i.e. pure Windows) executables and DLLs from Cygwin.
Binary extensions must be built using the source tree for a specific Python major.minor version and bitness. For Windows, you will need to build multiple versions of the extension, one for each major.minor, bitness combination of Python that will import it, e.g. 3.6, 3.7, 3.8, 3.9, 32-bit, 64-bit. The extension code may not require changes between versions, but it still needs to be compiled with the right compiler and linked against exactly the same shared libraries (in this case .DLL files) as used by the Python executable. For instance, it must use exactly the same version of Microsoft's C runtime library DLL as the Python executable does. This is a bit more sensitive and restrictive than on Linux, where you can more easily rebuild the Python executable and your extension with the same toolchain from your distro.
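As a rough illustration only (the file names, Python version, and include/library paths below are placeholders, and the exact locations depend on how the target Python was installed), cross-compiling a single-file extension from Cygwin for a native 64-bit Windows Python 3.8 might look like:
# build a native Windows extension module from Cygwin with the mingw-w64 cross compiler
x86_64-w64-mingw32-gcc -shared -O2 \
    -I/cygdrive/c/Python38/include \
    spam.c \
    -L/cygdrive/c/Python38/libs -lpython38 \
    -o spam.pyd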

Is a Python file compiled under Debian compatible with Ubuntu?

I'm using Nuitka to compile my Python code. I use the --module option so I can import my code inside other Python files:
nuitka --module --recurse-none file.py
Output: file.so
If I don't need to import the code and just need to run it in a terminal, I follow the regular compilation process:
nuitka --recurse-none file.py
Output: file.exe
I'm compiling these files under Debian and they work without a problem there. When I move them to an Ubuntu system, I sometimes get Segmentation Fault errors. Is it because Python code compiled under Debian is not compatible with Ubuntu, or am I making a mistake somewhere (like a missing library, etc.)?
As answered by abarnert, if you want to make your executable independent of the specific Python installation on your device, you need to use the --standalone option.
You can check that info in the Nuitka Manual
Dynamic Linking
From the docs,
It translates the Python into a C level program that then uses "libpython" to execute in the same way as CPython does.
Do you have libpython installed and pointing to the same version as the one you are compiling from? For example, on Arch:
$ whereis libpython
libpython: /usr/lib/libpython3.so
This shows I have libpython installed and belonging to Python 3.x (note the 3 at the end of the path).
Static Linking
The other way, as others have suggested, is to use the --standalone option. This should avoid the need for libpython.
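For instance, reusing the command from the question (a sketch; Nuitka's options change between releases, so check nuitka --help for your version):
# bundle the interpreter and all needed libraries alongside the binary
nuitka --standalone --recurse-none file.py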
I'm kind of suspicious that you have your hint right in your question. *.exe is generally a Windows executable, while *.so is a UNIX/Linux reloadable module. Without delving into the manual very far, I notice that in one example you pass --module and you get, sure enough, a Linux module. In the other case, you don't pass it, and you don't get one.

Missing python on MinGW

I am trying to build gdb on Windows 7 using MinGW. I am configuring it with the "--with-python" flag. However, I get this error:
configure:8898: checking for python
configure:8916: found /c/Python34//python
configure:8929: result: /c/Python34//python
configure:9067: checking for python34
...
configure:9095: result: no
configure:9278: error: python is missing or unusable
I suppose it may be because I am using a 64-bit version of Python, while MinGW is 32-bit. I tried to use MinGW64, but I do not find it useful - it does not support bash, so I can't use the configure script.
Please show me how to build gdb on mingw32 with Python, or how to make MinGW usable.
Try using MSYS2 to provide a 64-bit (and 32-bit, if you so wish) gcc.
It comes with bash and the standard coreutils.
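A rough sketch of that route (package names follow MSYS2's usual mingw-w64-x86_64-* convention, but verify them with pacman -Ss; --with-python takes the path of the interpreter you want gdb linked against):
# inside an MSYS2 MinGW 64-bit shell
pacman -S mingw-w64-x86_64-gcc mingw-w64-x86_64-python
./configure --with-python=/mingw64/bin/python
make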

Installing Numpy locally

I have an account in a remote computer without root permissions and I needed to install a local version of Python (the remote computer has a version of Python that is incompatible with some codes I have), Numpy and Scipy there. I've been trying to install numpy locally since yesterday, with no success.
I successfully installed a local version of Python (2.7.3) in /home/myusername/.local/, so I access this version of Python via /home/myusername/.local/bin/python. I tried two ways of installing Numpy:
I downloaded the latest stable version of Numpy from the official web page, unpacked it, went into the unpacked folder and ran: /home/myusername/.local/bin/python setup.py install --prefix=/home/myusername/.local. However, I get the following error, which is followed by a series of other errors (deriving from this one):
gcc -pthread -shared build/temp.linux-x86_64-2.7/numpy/core/blasdot/_dotblas.o
-L/usr/local/lib -Lbuild/temp.linux-x86_64-2.7 -lptf77blas -lptcblas -latlas
-o build/lib.linux-x86_64-2.7/numpy/core/_dotblas.so
/usr/bin/ld: /usr/local/lib/libptcblas.a(cblas_dptgemm.o): relocation
R_X86_64_32 against `a local symbol' can not be used when making a shared
object; recompile with -fPIC
Not really knowing what this meant (except that the error apparently has to do with the LAPACK library), I just ran the same command as above, but this time setting LDFLAGS="-fPIC", as suggested by the error; i.e., I did
LDFLAGS="-fPIC" /home/myusername/.local/bin/python setup.py install --prefix=/home/myusername/.local.
However, I got the same error (except that the -fPIC flag was added after the gcc command above).
I also tried installing it using pip, i.e., doing /home/myusername/.local/bin/pip install numpy (after successfully installing pip in my local path). However, I get the exact same error.
I searched on the web, but none of the errors seemed to be similar to mine. My first guess is that this has to do with some piece of code that needs root permissions to be executed, or maybe with some problem with the version of the LAPACK libraries.
Help, anyone?
The error message is telling you that your ATLAS library has not been built with the -fPIC flag. That means it cannot be linked into a shared library like Python extension modules. You need to rebuild ATLAS with the -fPIC flag. The ATLAS documentation describes how to do so.
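If rebuilding ATLAS is not practical, one workaround to consider (a sketch, not part of the answer above: these environment variables were honoured by numpy.distutils-based builds of that era, so verify against your NumPy version) is to build NumPy without the external BLAS/LAPACK, at the cost of slower linear algebra:
# skip ATLAS/BLAS/LAPACK detection entirely and fall back to NumPy's internal routines
BLAS=None LAPACK=None ATLAS=None /home/myusername/.local/bin/python setup.py install --prefix=/home/myusername/.local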
It's kind of a pain to build from source. Is it possible to avoid doing that?
If we assume that you are trying to install on an x86 computer (Intel, AMD, whatever), can you just install Python on another x86 computer where you do have root, then make a tar archive of the Python installation, copy the tar to the other computer, and unpack the tar archive?
The problem with the above is that the pre-built Python might have hard-coded paths for where to look for libraries: it might need the libraries to be in /usr/share or whatever. It would be a bit of a hack, but you might be able to make a chroot jail and get Python to run.
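In shell terms, the copy-over idea above might look roughly like this (hostnames, prefixes, and the caveat about hard-coded paths all still apply; everything here is a placeholder):
# on the machine where you installed Python with root
tar czf python-local.tar.gz -C /usr/local .
scp python-local.tar.gz myusername@remote.example.com:
# on the remote machine, unpack into your local prefix
mkdir -p ~/.local && tar xzf python-local.tar.gz -C ~/.local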
You might also want to take a look at Enthought Python Distribution (EPD). I believe the EPD installer simply asks you where you want EPD installed, and installs it there.
http://www.enthought.com/products/epdgetstart.php?platform=linux
There is a free version of EPD. If you want 64-bit you would have to pay for EPD, but if 32-bit will work for you, EPD Free might be all you need.
http://www.enthought.com/products/epd_free.php
P.S. The Enthought web site seems to be rejecting any URL that doesn't start with www.! This means that some Google search links don't work unless you edit them to insert the www. at the beginning. I'm sure they will fix this soon.
You may want to look into EasyBuild for building your local Python version with numpy and scipy enabled, see http://hpcugent.github.com/easybuild/.
It basically takes all the nasty stuff away from you; you just need to configure it a little bit (specify where you want the software to end up, for example), and then you can build Python with the packages of your choice with a single command.
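For instance, something along these lines (a very rough sketch: the eb command, the easyconfig file name, and the options depend on the EasyBuild release, so treat them as placeholders):
# resolve and build Python plus its dependencies under your home directory
eb Python-2.7.3.eb --robot --prefix=$HOME/easybuild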

Finding the correct Python framework with cmake

I am using the MacPorts version of Python on a Snow Leopard computer, and using CMake to build a cross-platform extension to it. I search for the Python interpreter and libraries on the system using the following commands in CMakeLists.txt:
include(FindPythonInterp)
include(FindPythonLibs)
However, while CMake identifies the correct interpreter in /opt/local/bin, it tries to link against the wrong framework - namely the system Python framework.
-- Found PythonInterp: /opt/local/bin/python2.6
-- Found PythonLibs: -framework Python
And this causes the following runtime error
Fatal Python error: Interpreter not initialized (version mismatch?)
As soon as I replace -framework Python with /opt/local/Library/Frameworks/Python.framework/Python things seem to work as expected.
How can I make cmake link against the correct Python framework found in
/opt/local/Library/Frameworks/Python.framework/Python
rather than the system one in
/System/Library/Frameworks/Python.framework/Python
?
Adding the following to ~/.bash_profile
export DYLD_FRAMEWORK_PATH=/opt/local/Library/Frameworks
fixes the problem, at least temporarily. Apparently, this inconsistency between the Python interpreter and the Python framework used by CMake is a bug that should hopefully be fixed in a newer version.
I am not intimately familiar with CMake, but with the Apple version of gcc/ld you can pass the -F flag to specify an additional framework search path. For example, -F/opt/local/Library/Frameworks will search MacPorts' frameworks directory. If you can specify such a flag through CMake, it may solve your problem.
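If you would rather keep it inside the build configuration, another option (a sketch: PYTHON_LIBRARY and PYTHON_INCLUDE_DIR are the cache variables used by the old FindPythonLibs module, and newer CMake releases use the FindPython module instead) is to point CMake at the MacPorts framework explicitly:
# force FindPythonLibs to pick up the MacPorts framework instead of the system one
cmake -DPYTHON_LIBRARY=/opt/local/Library/Frameworks/Python.framework/Python \
      -DPYTHON_INCLUDE_DIR=/opt/local/Library/Frameworks/Python.framework/Headers \
      /path/to/source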
