The output produced by running perl -V is packed with useful information (see example below). Is there anything like it for Python?
Example output:
% perl -V
Summary of my perl5 (revision 5 version 10 subversion 1) configuration:
Platform:
osname=linux, osvers=2.6.32-5-amd64, archname=x86_64-linux-gnu-thread-multi
uname='linux brahms 2.6.32-5-amd64 #1 smp tue jun 14 09:42:28 utc 2011 x86_64 gnulinux '
config_args='-Dusethreads -Duselargefiles -Dccflags=-DDEBIAN -Dcccdlflags=-fPIC -Darchname=x86_64-linux-gnu -Dprefix=/usr -Dprivlib=/usr/share/perl/5.10 -Darchlib=/usr/lib/perl/5.10 -Dvendorprefix=/usr -Dvendorlib=/usr/share/perl5 -Dvendorarch=/usr/lib/perl5 -Dsiteprefix=/usr/local -Dsitelib=/usr/local/share/perl/5.10.1 -Dsitearch=/usr/local/lib/perl/5.10.1 -Dman1dir=/usr/share/man/man1 -Dman3dir=/usr/share/man/man3 -Dsiteman1dir=/usr/local/man/man1 -Dsiteman3dir=/usr/local/man/man3 -Dman1ext=1 -Dman3ext=3perl -Dpager=/usr/bin/sensible-pager -Uafs -Ud_csh -Ud_ualarm -Uusesfio -Uusenm -DDEBUGGING=-g -Doptimize=-O2 -Duseshrplib -Dlibperl=libperl.so.5.10.1 -Dd_dosuid -des'
hint=recommended, useposix=true, d_sigaction=define
useithreads=define, usemultiplicity=define
useperlio=define, d_sfio=undef, uselargefiles=define, usesocks=undef
use64bitint=define, use64bitall=define, uselongdouble=undef
usemymalloc=n, bincompat5005=undef
Compiler:
cc='cc', ccflags ='-D_REENTRANT -D_GNU_SOURCE -DDEBIAN -fno-strict-aliasing -pipe -fstack-protector -I/usr/local/include -D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64',
optimize='-O2 -g',
cppflags='-D_REENTRANT -D_GNU_SOURCE -DDEBIAN -fno-strict-aliasing -pipe -fstack-protector -I/usr/local/include'
ccversion='', gccversion='4.4.5', gccosandvers=''
intsize=4, longsize=8, ptrsize=8, doublesize=8, byteorder=12345678
d_longlong=define, longlongsize=8, d_longdbl=define, longdblsize=16
ivtype='long', ivsize=8, nvtype='double', nvsize=8, Off_t='off_t', lseeksize=8
alignbytes=8, prototype=define
Linker and Libraries:
ld='cc', ldflags =' -fstack-protector -L/usr/local/lib'
libpth=/usr/local/lib /lib /usr/lib /lib64 /usr/lib64
libs=-lgdbm -lgdbm_compat -ldb -ldl -lm -lpthread -lc -lcrypt
perllibs=-ldl -lm -lpthread -lc -lcrypt
libc=/lib/libc-2.11.2.so, so=so, useshrplib=true, libperl=libperl.so.5.10.1
gnulibc_version='2.11.2'
Dynamic Linking:
dlsrc=dl_dlopen.xs, dlext=so, d_dlsymun=undef, ccdlflags='-Wl,-E'
cccdlflags='-fPIC', lddlflags='-shared -O2 -g -L/usr/local/lib -fstack-protector'
Characteristics of this binary (from libperl):
Compile-time options: MULTIPLICITY PERL_DONT_CREATE_GVSV
PERL_IMPLICIT_CONTEXT PERL_MALLOC_WRAP USE_64_BIT_ALL
USE_64_BIT_INT USE_ITHREADS USE_LARGE_FILES
USE_PERLIO USE_REENTRANT_API
Locally applied patches:
DEBPKG:debian/arm_thread_stress_timeout - http://bugs.debian.org/501970 Raise the timeout of ext/threads/shared/t/stress.t to accommodate slower build hosts
DEBPKG:debian/cpan_config_path - Set location of CPAN::Config to /etc/perl as /usr may not be writable.
<snip-- iow patches galore --you get the picture>
DEBPKG:fixes/safe-reval-rdo-cve-2010-1447 - [PATCH] Wrap by default coderefs returned by rdo and reval
DEBPKG:patchlevel - http://bugs.debian.org/567489 List packaged patches for 5.10.1-17squeeze2 in patchlevel.h
Built under linux
Compiled at Jun 30 2011 22:28:00
@INC:
/etc/perl
/usr/local/lib/perl/5.10.1
/usr/local/share/perl/5.10.1
/usr/lib/perl5
/usr/share/perl5
/usr/lib/perl/5.10
/usr/share/perl/5.10
/usr/local/lib/site_perl
/usr/local/lib/perl/5.10.0
/usr/local/share/perl/5.10.0
.
Not to be confused with the much less informative perl -v:
% perl -v
This is perl, v5.10.1 (*) built for x86_64-linux-gnu-thread-multi
(with 53 registered patches, see perl -V for more detail)
Copyright 1987-2009, Larry Wall
Perl may be copied only under the terms of either the Artistic License or the
GNU General Public License, which may be found in the Perl 5 source kit.
Complete documentation for Perl, including FAQ lists, should be found on
this system using "man perl" or "perldoc perl". If you have access to the
Internet, point your browser at http://www.perl.org/, the Perl Home Page.
python -c 'import sysconfig, pprint; pprint.pprint(sysconfig.get_config_vars())'
Although it is incredibly hackish, impractical, and not as detailed as perl -V, the following one-liner can get decent information about the environment.
python -c "import platform as p;exec('for x in vars(p):\n try:\n print ({x:vars(p)[x]()})\n except:\n pass')"
Since this is not your typical easy-to-remember command, you could save this line to Python's Lib directory as sys_info.py and then you could just run:
python -m sys_info
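A fuller sys_info.py along these lines might look like the following. This is a sketch of what such a module could contain, not a canonical implementation; it combines platform and sysconfig, which between them cover most of what perl -V reports:

```python
# sys_info.py -- a sketch of a perl -V analogue for Python.
import platform
import sys
import sysconfig


def main():
    print(sys.version)
    print("platform:", platform.platform())
    print("implementation:", platform.python_implementation())
    # A few of the build-time configuration variables, perl -V style.
    for key in ("CC", "CFLAGS", "LDFLAGS", "prefix"):
        print("%s = %s" % (key, sysconfig.get_config_var(key)))
    # sys.path is the closest equivalent of perl's @INC listing.
    print("sys.path:")
    for entry in sys.path:
        print("    " + entry)


if __name__ == "__main__":
    main()
```

Dropped into a directory on sys.path, `python -m sys_info` then prints the whole summary in one go.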
Related
I build the netifaces module with "python setup.py build", but across two build runs the output path is different.
First build: the output path is lib.linux-x86_64-3.9
Second build: the output path is lib.linux-x86_64-cpython-39
I want to know how python setup.py determines the output path.
Thanks.
I don't know the reason why the build path changes, and unfortunately I'm unable to replicate your experience. I created this Dockerfile in an attempt to replicate it:
FROM jaraco/multipy-tox
RUN pipx install httpie
RUN http GET https://files.pythonhosted.org/packages/a6/91/86a6eac449ddfae239e93ffc1918cf33fd9bab35c04d1e963b311e347a73/netifaces-0.11.0.tar.gz | tar xz
WORKDIR netifaces-0.11.0
RUN pip-run -q setuptools -- setup.py build > build1.txt
RUN pip-run -q setuptools -- setup.py build > build2.txt
CMD diff --unified build1.txt build2.txt
It produces this output:
$ docker run -it $(docker build -q .)
--- build1.txt 2022-09-05 14:54:42.639229009 +0000
+++ build2.txt 2022-09-05 14:54:43.869229009 +0000
@@ -1,16 +1,12 @@
running build
running build_ext
-checking for getifaddrs...found.
-checking for getnameinfo...found.
-checking for IPv6 socket IOCTLs...not found.
-checking for optional header files...netash/ash.h netatalk/at.h netax25/ax25.h neteconet/ec.h netipx/ipx.h netpacket/packet.h netrose/rose.h linux/atm.h linux/llc.h linux/tipc.h linux/dn.h.
-checking whether struct sockaddr has a length field...no.
-checking which sockaddr_xxx structs are defined...at ax25 in in6 ipx un rose ash ec ll atmpvc atmsvc dn llc.
-checking for routing socket support...no.
-checking for sysctl(CTL_NET...) support...no.
-checking for netlink support...yes.
+checking for getifaddrs...found. (cached)
+checking for getnameinfo...found. (cached)
+checking for IPv6 socket IOCTLs...not found. (cached)
+checking for optional header files...netash/ash.h netatalk/at.h netax25/ax25.h neteconet/ec.h netipx/ipx.h netpacket/packet.h netrose/rose.h linux/atm.h linux/llc.h linux/tipc.h linux/dn.h. (cached)
+checking whether struct sockaddr has a length field...no. (cached)
+checking which sockaddr_xxx structs are defined...at ax25 in in6 ipx un rose ash ec ll atmpvc atmsvc dn llc. (cached)
+checking for routing socket support...no. (cached)
+checking for sysctl(CTL_NET...) support...no. (cached)
+checking for netlink support...yes. (cached)
will use netlink to read routing table
-building 'netifaces' extension
-aarch64-linux-gnu-gcc -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -fPIC -DNETIFACES_VERSION=0.11.0 -DHAVE_GETIFADDRS=1 -DHAVE_GETNAMEINFO=1 -DHAVE_NETASH_ASH_H=1 -DHAVE_NETATALK_AT_H=1 -DHAVE_NETAX25_AX25_H=1 -DHAVE_NETECONET_EC_H=1 -DHAVE_NETIPX_IPX_H=1 -DHAVE_NETPACKET_PACKET_H=1 -DHAVE_NETROSE_ROSE_H=1 -DHAVE_LINUX_ATM_H=1 -DHAVE_LINUX_LLC_H=1 -DHAVE_LINUX_TIPC_H=1 -DHAVE_LINUX_DN_H=1 -DHAVE_SOCKADDR_AT=1 -DHAVE_SOCKADDR_AX25=1 -DHAVE_SOCKADDR_IN=1 -DHAVE_SOCKADDR_IN6=1 -DHAVE_SOCKADDR_IPX=1 -DHAVE_SOCKADDR_UN=1 -DHAVE_SOCKADDR_ROSE=1 -DHAVE_SOCKADDR_ASH=1 -DHAVE_SOCKADDR_EC=1 -DHAVE_SOCKADDR_LL=1 -DHAVE_SOCKADDR_ATMPVC=1 -DHAVE_SOCKADDR_ATMSVC=1 -DHAVE_SOCKADDR_DN=1 -DHAVE_SOCKADDR_LLC=1 -DHAVE_PF_NETLINK=1 -I/usr/include/python3.11 -c netifaces.c -o build/temp.linux-aarch64-cpython-311/netifaces.o
-creating build/lib.linux-aarch64-cpython-311
-aarch64-linux-gnu-gcc -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -g -fwrapv -O2 build/temp.linux-aarch64-cpython-311/netifaces.o -L/usr/lib/aarch64-linux-gnu -o build/lib.linux-aarch64-cpython-311/netifaces.cpython-311-aarch64-linux-gnu.so
If I remove the build file between builds, there's no diff.
The output path is determined by setuptools (or possibly by distutils), but I also observe that netifaces has a giant setup.py file, so it's entirely possible that the behavior affecting the output directory lives there.
What I recommend is to:
(a) Determine if the behavior you're observing is unique to netifaces or happens to other (simpler) packages with an extension module.
(b) Using a simpler package with an extension module, create a set of steps that replicate the behavior you observe.
(c) Put breakpoints or print statements at various points in the distutils/setuptools code to trace the behavior and determine where the output dir is set and how it changes.
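As a quick starting point for (c), the pieces that make up the build directory name can be printed directly. The two suffixes you saw appear to match two naming schemes: an older one based on the short version number and a newer one based on the interpreter's cache tag (a sketch; which scheme a given build uses depends on the distutils/setuptools version involved):

```python
# Print the components of the build directory name, e.g.
# build/lib.linux-x86_64-3.9 (older scheme) vs.
# build/lib.linux-x86_64-cpython-39 (newer scheme).
import sys
import sysconfig

plat = sysconfig.get_platform()            # e.g. 'linux-x86_64'
cache_tag = sys.implementation.cache_tag   # e.g. 'cpython-39'
short_ver = "%d.%d" % sys.version_info[:2] # e.g. '3.9'

print("newer-style dir: build/lib.%s-%s" % (plat, cache_tag))
print("older-style dir: build/lib.%s-%s" % (plat, short_ver))
```

If the two builds print different values here, the difference is in the interpreter or toolchain, not in netifaces's setup.py.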
I have just updated my Macbook to Monterey and installed XCode 13. I'm now seeing errors when trying to link my code - for example one library needs to link to the system python2.7, but gives the error:
Keiths-MacBook-Pro:libcdb keith$ make
rm -f libcdb.1.0.0.dylib
libcdb.dylib libcdb.1.dylib libcdb.1.0.dylib
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang++
-stdlib=libc++ -headerpad_max_install_names -arch x86_64 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX12.0.sdk
-mmacosx-version-min=10.13 -Wl,-rpath,@executable_path/../Frameworks -Wl,-rpath,/usr/local/Qt-5.15.7/lib -single_module -dynamiclib -compatibility_version 1.0 -current_version 1.0.0 -install_name libcdb.1.dylib -o libcdb.1.0.0.dylib release/db.o release/KDTree.o release/db_Wlist.o release/db_VSeg.o
release/db_View.o release/db_ViaInst.o release/db_ViaGen.o
release/db_Via.o release/db_Vertex.o release/db_Vector.o
release/db_Utils.o release/db_Trapezoid.o release/db_Transform64.o
release/db_Transform.o release/db_Techfile.o release/db_Style.o
release/db_Signal.o release/db_Shape.o release/db_SegParam.o
release/db_Segment.o release/db_Rectangle.o release/db_Rect.o
release/db_QTree.o release/db_Property.o release/db_Polygon.o
release/db_PointList.o release/db_Point.o release/db_Pin.o
release/db_Path.o release/db_ObjList.o release/db_Obj.o
release/db_Net.o release/db_Mst.o release/db_Mpp.o release/db_Lpp.o
release/db_Line.o release/db_Library.o release/db_Layer.o
release/db_Label.o release/db_InstPin.o release/db_Inst.o
release/db_HVTree.o release/db_HSeg.o release/db_HierObj.o
release/db_Group.o release/db_Ellipse.o release/db_Edge.o
release/db_CellView.o release/db_Cell.o release/db_Array.o
release/db_Arc.o -F/usr/local/Qt-5.15.7/lib -L../libcpp/release -lcpp
-L/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/config
-lpython2.7 -framework QtWidgets -framework QtGui -framework AppKit -framework Metal -framework QtNetwork -framework QtCore -framework DiskArbitration -framework IOKit -framework OpenGL -framework AGL
ld: cannot link directly with dylib/framework, your binary is not an
allowed client of
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX12.0.sdk/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/config/libpython2.7.tbd
for architecture x86_64 clang: error: linker command failed with exit
code 1 (use -v to see invocation) make: ***
[release/libcdb.1.0.0.dylib] Error 1
Given that I have recompiled (successfully) the Qt libs and the code for this library, why is it giving me this 'your binary is not an allowed client' error?
As far as I can see the python2.7 paths have not changed, so the error is baffling.
So the quick and dirty fix is to edit the Python.tbd file that is located at:
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX12.0.sdk/System/Library/Frameworks/Python.framework/Versions/Current/Python.tbd
and add your library/executable targets to the clients list.
Of course, there is a reason Apple is doing this: python2 is deprecated and sooner or later they will drop it. But until they do, this works.
What fixed it for me was to drop the Python2 framework and link with the framework provided by my version of Python3 installed via Brew.
Locate your installation of Python3 by using this command in Terminal: ls -la $(which python3).
The framework is in the "Frameworks" folder located one level above "bin", for example: /usr/local/Cellar/python@3.9/3.9.7_1/Frameworks
Once you have the location of the Python3 framework, add it as a framework in your Xcode project.
In the Build Settings, don't forget to add:
the Frameworks folder location to Framework Search Paths (e.g. "/usr/local/Cellar/python@3.9/3.9.7_1/Frameworks")
the framework's Headers folder to Header Search Paths (e.g. "/usr/local/Cellar/python@3.9/3.9.7_1/Frameworks/Python.framework/Headers")
Some functions changed in version 3 so you'll need to update some of your Python function calls.
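Before wiring the paths into Xcode, you can also ask the target interpreter itself where its framework, headers, and libraries live (a sketch; run it with the same python3 you intend to link against, and note that PYTHONFRAMEWORKPREFIX is only set for framework builds, so it may print an empty value elsewhere):

```python
# Report where this interpreter's framework and headers live,
# so the Xcode search paths can be filled in without guessing.
import sysconfig

for var in ("PYTHONFRAMEWORKPREFIX", "INCLUDEPY", "LIBDIR", "LDLIBRARY"):
    print("%s = %s" % (var, sysconfig.get_config_var(var)))
```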
Given the below makefile:
TARGET = _example.pyd
OFILES = example.obj example_wrap.obj
HFILES =
CC = cl
CXX = cl
LINK = link
CPPFLAGS = -DNDEBUG -DUNICODE -DWIN32 -I. -Id:\virtual_envs\py351\include
CFLAGS = -nologo -Zm200 -Zc:wchar_t- -FS -Zc:strictStrings -O2 -MD -W3 -w44456 -w44457 -w44458
CXXFLAGS = -nologo -Zm200 -Zc:wchar_t- -FS -Zc:strictStrings -D_HAS_EXCEPTIONS=0 -O2 -MD -W3 -w34100 -w34189 -w44996 -w44456 -w44457 -w44458 -wd4577
LFLAGS = /LIBPATH:. /NOLOGO /DYNAMICBASE /NXCOMPAT /DLL /MANIFEST /MANIFESTFILE:$(TARGET).manifest /SUBSYSTEM:WINDOWS /INCREMENTAL:NO
LIBS = /LIBPATH:d:\virtual_envs\py351\libs python35.lib
.SUFFIXES: .c .cpp .cc .cxx .C
{.}.cpp{}.obj::
$(CXX) -c $(CXXFLAGS) $(CPPFLAGS) -Fo @<<
$<
<<
{.}.cc{}.obj::
$(CXX) -c $(CXXFLAGS) $(CPPFLAGS) -Fo @<<
$<
<<
{.}.cxx{}.obj::
$(CXX) -c $(CXXFLAGS) $(CPPFLAGS) -Fo @<<
$<
<<
{.}.C{}.obj::
$(CXX) -c $(CXXFLAGS) $(CPPFLAGS) -Fo @<<
$<
<<
{.}.c{}.obj::
$(CC) -c $(CFLAGS) $(CPPFLAGS) -Fo @<<
$<
<<
all: $(TARGET)
$(OFILES): $(HFILES)
$(TARGET): $(OFILES)
$(LINK) $(LFLAGS) /OUT:$(TARGET) @<<
$(OFILES) $(LIBS)
<<
mt -nologo -manifest $(TARGET).manifest -outputresource:$(TARGET);2
install: $(TARGET)
@if not exist d:\virtual_envs\py351\Lib\site-packages mkdir d:\virtual_envs\py351\Lib\site-packages
copy /y $(TARGET) d:\virtual_envs\py351\Lib\site-packages\$(TARGET)
clean:
-del $(TARGET)
-del *.obj
-del *.exp
-del *.lib
-del $(TARGET).manifest
test:
python runme.py
I'd like to improve a couple of things here:
I'd like to consider swig files (*.i) in the makefile. For example, every time some swig file has been changed a new wrap file should be generated (ie: swig -python -c++ file_has_changed.cpp) and then rebuild the project
I'd like to avoid having hardcoded object files. For instance, I'd like to use all cpp files using wildcards somehow
I've read a little bit of the docs talking about Makefiles but I'm still pretty much confused. How could I achieve this?
Right now I'm using a hacky solution like swig -python -c++ whatever_file.i && nmake, which of course is not ideal at all
REFERENCES
Achieving this inside the Visual Studio IDE is quite easy following these steps, but I'd like to use this makefile inside Sublime Text, which is why I'm quite interested in knowing how to write a proper Makefile
Producing any kind of target from any kind of source, that's the essence of a makefile:
.i.cpp:
swig -python -c++ $<
This elegance will, however, break with nmake (as opposed to GNU make) if the .cpp file is missing because nmake doesn't try to chain inference rules through a missing link.
Moreover, it will break silently and "build" from stale versions of the files that are later in the build chain (which includes the resulting executable) if they are present.
Possible workarounds here (short of ditching nmake, of course) are:
invoke nmake multiple times: first to generate all files that are intermediate steps between two inference rules (which can in turn require multiple invocations if they are generated from one another), and then for the final targets
This requires an external script which can very well be another makefile. E.g.:
move the current Makefile to main_makefile and create a new Makefile with commands for the main target like this:
python -c "import os,os.path,subprocess;
subprocess.check_call(['nmake', '/F', 'main_makefile']
+[os.path.splitext(f)[0]+'.cpp'
for f in os.listdir('.') if os.path.isfile(f)
and f.endswith('.i')])"
nmake /F main_makefile
do not rely solely on inference rules but have an explicit rule for each .cpp to be produced (that's what CMake does btw)
this asks for the relevant part of Makefile to be autogenerated. That part can be !INCLUDE'd, but still, external code is needed to do the generation before nmake gets to work on the result. Example code (again, in Python):
import os, os.path, subprocess
for f in os.listdir('.'):
    if os.path.isfile(f) and f.endswith('.i'):
        print '"%s": "%s"' % (os.path.splitext(f)[0] + '.cxx', f)
        # quotes are to allow for special characters,
        # see https://msdn.microsoft.com/en-us/library/956d3677.aspx
        # command is not needed, it will be added from the inference rule I gave
        # in the beginning, see http://www.darkblue.ch/programming/Namke.pdf, p.41 (567)
I have solved this using CMake and this translates directly to using autoconf and automake and thereby makefiles.
The idea is to introduce the following variable
DEPENDENCIES = `swig -M -python -c++ -I. example.i | sed 's/\\//g'`
and make your target depend on this. The above generates a list of dependencies of all headers and .i files your SWIG interface file may include.
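The same idea can be scripted for a plain-makefile setup: run swig -M, join the backslash-continued lines, and write the result where the makefile can pick it up. Below is a sketch; the swig output is simulated with a literal string, since in practice you would obtain it with subprocess.check_output(['swig', '-M', '-python', '-c++', '-I.', 'example.i']), and the file names are placeholders:

```python
# Turn `swig -M` output into a single dependency line for a makefile.
# `raw` stands in for the real swig output, which uses backslash
# line continuations just like a compiler's -M output does.
raw = "example_wrap.cxx: \\\n  example.i \\\n  example.h\n"

# Join the backslash-continued lines and normalize the whitespace.
deps = raw.replace("\\\n", " ")
deps = " ".join(deps.split())

print(deps)  # one flat "target: dep dep" line
```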
I have written a Python C module (just ffmpeg.c which depends on some FFmpeg libs and other libs) and I am wondering how to link.
I'm compiling with:
cc -std=c99 -c ../ffmpeg.c -I /usr/include/python2.7 -g
I'm trying to link right now with:
ld -shared -o ../ffmpeg.so -L/usr/local/lib -lpython2.7 -lavutil -lavformat -lavcodec -lswresample -lportaudio -lchromaprint ffmpeg.o -lc
There is no error. However, when I try to import ffmpeg in Python, I get:
ImportError: ./ffmpeg.so: undefined symbol: avio_alloc_context
Maybe this is already correct. I checked the resulting ffmpeg.so with ldd and it partly links to the wrong FFmpeg. This is strange, however, because -L/usr/local/lib should take precedence over the default. Maybe it's because my custom-installed FFmpeg (in /usr/local/lib) has for some reason only installed static *.a libs, and *.so files take precedence over *.a files.
You should put the libraries that you're linking to after the .o file; i.e.:
ld -shared -o ../ffmpeg.so ffmpeg.o -L/usr/local/lib -lpython2.7 -lavutil -lavformat -lavcodec -lswresample -lportaudio -lchromaprint -lc
The linker is dumb and will not link in code from static libraries that it doesn't think is needed until a dependency arises. The use of avio_alloc_context happens in ffmpeg.o, and because the library is not listed after that use, the linker does not consider the code in the library needed, so it doesn't get linked in. This is the biggest reason why linking against .a files fails.
You can also use --start-group and --end-group around all the files that you are linking - this allows you to link static libraries that have cross dependencies that just seem impossible to resolve through other means:
ld -shared -o ../ffmpeg.so -L/usr/local/lib -lpython2.7 --start-group -lavutil -lavformat -lavcodec -lswresample -lportaudio -lchromaprint ffmpeg.o --end-group -lc
Using .a files is a little trickier than .so files, but these two items will generally work around any issues you have when linking.
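One way to surface this kind of undefined-symbol problem without going through a full import is to dlopen the freshly linked extension with ctypes, which reports the same loader error directly (a sketch; the path to ffmpeg.so is a placeholder for wherever your build puts it):

```python
# Probe a shared object with dlopen before trying to import it;
# an undefined symbol shows up here with the same message the
# ImportError would carry.
import ctypes


def check_loadable(path):
    """Return (True, '') if dlopen succeeds, else (False, error message)."""
    try:
        ctypes.CDLL(path)
        return True, ""
    except OSError as exc:
        return False, str(exc)


ok, msg = check_loadable("./ffmpeg.so")
print("loaded OK" if ok else "load failed: " + msg)
```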
I'm trying to embed Python into a MATLAB mex function on OS X. I've seen references that this can be done (eg here) but I can't find any OS X specific information. So far I can successfully build an embedded Python (so my linker flags must be OK) and I can also build example mex files without any trouble and with the default options:
jm-g26b101:mex robince$ cat pytestnomex.c
#include <Python/Python.h>
int main() {
Py_Initialize();
PyRun_SimpleString("print 'hello'");
Py_Finalize();
return 0;
}
jm-g26b101:mex robince$ gcc -arch i386 pytestnomex.c -I/Library/Frameworks/Python.framework/Versions/2.5/include/python2.5 -L/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/config -ldl -lpython2.5
jm-g26b101:mex robince$ ./a.out
hello
But when I try to build a mex file that embeds Python I run into a problem with undefined symbol main. Here is my mex function:
#include <Python.h>
#include <mex.h>
void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray*prhs[])
{
mexPrintf("hello1\n");
Py_Initialize();
PyRun_SimpleString("print 'hello from python'");
Py_Finalize();
}
Here are the mex compilation steps:
jm-g26b101:mex robince$ gcc -c -I/Applications/MATLAB_R2009a.app/extern/include -I/Applications/MATLAB_R2009a.app/simulink/include -DMATLAB_MEX_FILE -arch i386 -I/Library/Frameworks/Python.framework/Versions/2.5/include/python2.5 -DMX_COMPAT_32 -O2 -DNDEBUG "pytest.c"
jm-g26b101:mex robince$ gcc -O -arch i386 -L/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/config -ldl -lpython2.5 -o "pytest.mexmaci" pytest.o -L/Applications/MATLAB_R2009a.app/bin/maci -lmx -lmex -lmat -lstdc++
Undefined symbols:
"_main", referenced from:
start in crt1.10.6.o
ld: symbol(s) not found
collect2: ld returned 1 exit status
I've tried playing around with arch settings (I added -arch i386 it to try and keep everything 32bit - I am using the python.org 32 bit 2.5 build), and the order of the linker flags, but haven't been able to get anywhere. Can't find much online either. Does anyone have any ideas of how I can get this to build?
[EDIT: should probably add I'm on OS X 10.6.1 with MATLAB 7.8 (r2009a), Python 2.5.4 (python.org) - I've tried both gcc-4.0 and gcc-4.2 (apple)]
I think I found the answer - by including the mysterious apple linker flags:
-undefined dynamic_lookup -bundle
I was able to get it built and it seems to work OK. I'd be very interested if anyone has any references about these flags or library handling on OS X in general. Now I see them I remember being bitten by the same thing in the past - yet I'm unable to find any documentation on what they actually do and why/when they should be needed.