Python embedding on Mac: ModuleNotFoundError: No module named 'encodings'

I'm currently unable to use the Cython embedding feature. The binary compiles fine, and otool -L embedded returns the following:
embedded:
@rpath/libpython3.6m.dylib (compatibility version 3.6.0, current version 3.6.0)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1238.60.2)
/System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation (compatibility version 150.0.0, current version 1349.8.0)
These are the commands I ran. Any thoughts on why this is not working? Cython via setup.py works fine when I want to create a Cython module, i.e. I'm able to import the Cython module in Python.
$ make
gcc -c embedded.c -I/Users/$USER/miniconda3/include/python3.6m -I/Users/$USER/miniconda3/include/python3.6m
gcc -o embedded embedded.o -L/Users/$USER/miniconda3/lib -L/Users/$USER/miniconda3/lib/python3.6/config-3.6m-darwin -lpython3.6m -ldl -framework CoreFoundation -Wl,-stack_size,1000000 -framework CoreFoundation
$ ./embedded
Could not find platform independent libraries <prefix>
Could not find platform dependent libraries <exec_prefix>
Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
Fatal Python error: Py_Initialize: unable to load the file system codec
ModuleNotFoundError: No module named 'encodings'
Current thread 0x000000010f8113c0 (most recent call first):
[1] 32931 abort ./embedded
Suggestions?

You are basically trying to run a Python native code extension as a standalone binary, without the Python interpreter. This will never work.
Cython produces extension modules for the Python interpreter.
They are shared libraries that can only be loaded inside a running Python interpreter; they cannot be used as standalone binaries.
If you want to build and distribute a standalone binary of Python code, with or without extensions, the interpreter needs to be bundled along with the code - see cx_freeze.
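For example, a minimal cx_Freeze setup.py looks roughly like this (a sketch, assuming a top-level script named app.py; the name and options are placeholders, see the cx_Freeze docs for real projects):
# setup.py -- bundles app.py together with a Python interpreter
# into a standalone executable under the build/ directory.
from cx_Freeze import setup, Executable

setup(
    name="app",
    version="0.1",
    description="Standalone build of app.py",
    executables=[Executable("app.py")],
)
Run it with python setup.py build; the resulting folder contains the executable plus the bundled interpreter and libraries.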

Is this Boost::Python (Python 3.7) error "__init__() should return None, not 'NoneType'" a linking problem?

Update
I'm not going to add this as an answer, since I still haven't technically solved the problem. But since I've now spent 2.5 days trying to get things to work with boost-python3, I've lost the will to live with it.
I've just come across pybind11 (how my previous lengthy searches for Python binding tools didn't turn it up, I don't know) and am using that. 2.5 days of misery compared to under 20 minutes installing and building their cmake example... and all the specific-Python-version dependency hell is gone.
It's syntactically similar to boost-python but much easier to manage, quicker, header-only, and more feature-rich.
Yay!
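For anyone who prefers a setup.py over the cmake example, here is a minimal sketch (assuming a recent pybind11 installed via pip -- 2.6 or later, which ships setup_helpers -- with the bindings in a placeholder file example.cpp):
# setup.py -- builds a pybind11 extension module named "example".
from setuptools import setup
from pybind11.setup_helpers import Pybind11Extension, build_ext

setup(
    name="example",
    ext_modules=[Pybind11Extension("example", ["example.cpp"])],
    cmdclass={"build_ext": build_ext},  # picks the C++ standard and flags per platform
)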
Original question
I'm using boost::python to bind a class in python 3.7.2.
The class imports successfully, but instantiating it gives the following error:
<my-terminal>$ python
Python 3.7.2 (default, Feb 14 2019, 17:36:47)
[Clang 10.0.0 (clang-1000.11.45.5)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import classes
>>> t = classes.World()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: __init__() should return None, not 'NoneType'
>>>
Here is classes.cpp:
#include <boost/python.hpp>
#include <boost/python/list.hpp>
#include <boost/python/extract.hpp>
#include <string>
#include <sstream>
#include <vector>
struct World
{
    void set(std::string msg) { mMsg = msg; }
    void many(boost::python::list msgs) {
        long l = len(msgs);
        std::stringstream ss;
        for (long i = 0; i < l; ++i) {
            if (i > 0) ss << ", ";
            std::string s = boost::python::extract<std::string>(msgs[i]);
            ss << s;
        }
        mMsg = ss.str();
    }
    std::string greet() { return mMsg; }
    std::string mMsg;
};

using namespace boost::python;

BOOST_PYTHON_MODULE(classes)
{
    class_<World>("World")
        .def("greet", &World::greet)
        .def("set", &World::set)
        .def("many", &World::many)
    ;
};
Hypothesis
An almost identical question was solved because of a Python 2/3 issue (linking against Python 3 instead of Python 2 libraries), so I suspected a library linking issue.
Checking the hypothesis
I can't get bjam to work, and wouldn't be able to switch all our build systems over for one module anyway... so I'm building with cmake, which compiles successfully to classes.so with the output below, suggesting it's finding all the correct includes, libraries and executables:
-- Found PythonInterp: /Users/me/.pyenv/versions/boost37/bin/python3 (found suitable version "3.7.2", minimum required is "3")
PYTHON_VERSION_SUFFIX
-- Boost version: 1.68.0
-- Found the following Boost libraries:
-- python37
-- Found PythonLibs: /usr/local/Frameworks/Python.framework/Versions/3.7/lib/libpython3.7m.dylib (found suitable version "3.7.2", minimum required is "3")
-- PYTHON_LIBRARIES = /usr/local/Frameworks/Python.framework/Versions/3.7/lib/libpython3.7m.dylib
-- PYTHON_EXECUTABLE = /Users/thc29/.pyenv/versions/boost37/bin/python3
-- PYTHON_INCLUDE_DIRS = /usr/local/Frameworks/Python.framework/Versions/3.7/include/python3.7m
-- Boost_LIBRARIES = /usr/local/lib/libboost_python37-mt.dylib
Boost-python3 library directory contents:
ls /usr/local/Cellar/boost-python3/1.68.0/lib
libboost_numpy37-mt.a libboost_numpy37.dylib libboost_python37.a
libboost_numpy37-mt.dylib libboost_python37-mt.a libboost_python37.dylib
libboost_numpy37.a libboost_python37-mt.dylib
I used brew install boost, and brew install boost-python3 --build-from-source with my python 3.7 virtualenv activated, to ensure boost-python3 is linked against the correct version of python.
Checking libraries...
otool -L classes.so gives:
classes.so:
/usr/local/opt/boost-python3/lib/libboost_python37-mt.dylib (compatibility version 0.0.0, current version 0.0.0)
/usr/local/opt/python/Frameworks/Python.framework/Versions/3.7/Python (compatibility version 3.7.0, current version 3.7.0)
/usr/lib/libc++.1.dylib (compatibility version 1.0.0, current version 400.9.4)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1252.200.5)
otool -L /usr/local/opt/boost-python3/lib/libboost_python37-mt.dylib gives:
/usr/local/opt/boost-python3/lib/libboost_python37-mt.dylib:
/usr/local/opt/boost-python3/lib/libboost_python37-mt.dylib (compatibility version 0.0.0, current version 0.0.0)
/usr/lib/libc++.1.dylib (compatibility version 1.0.0, current version 400.9.4)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1252.200.5)
In the related question, that output revealed the problem. But here it looks fine!
No progress yet...
After the painful process of getting this all compiling properly and checking the linking, I can't spot any flaws. Is this a different problem? Or is there a linking issue that I haven't spotted?
Thanks for any help!
Adding an answer here for those using the Anaconda or Conda-Forge Distribution:
The Python interpreter in those distributions statically links in the libpythonXY library, which is why their python binary behaves differently from other distributions.
The fix for the problem reported by OP is to use:
-undefined dynamic_lookup
Instead of:
-lpythonXY
You are creating a Python C/C++ extension, not embedding the Python interpreter, so you shouldn't be linking against the Python library at all. pybind11 handles this correctly.
See the following for more information:
https://gitlab.kitware.com/cmake/cmake/issues/18100
https://github.com/ContinuumIO/anaconda-issues/issues/9078
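As a quick sanity check from Python itself, you can inspect the link flags your interpreter's own build uses for extension modules (a minimal sketch; the output varies by distribution, but on typical macOS framework builds LDSHARED already contains -undefined dynamic_lookup rather than -lpythonXY):
# Print the linker-related config variables of the running interpreter.
import sysconfig

for var in ("LDSHARED", "BLDSHARED", "LIBS"):
    print(var, "=", sysconfig.get_config_var(var))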
On a side note, Python 3.8 added an --embed flag, and only then does python-config include -lpythonXY in the output:
$ python3.8-config --libs
-ldl -framework CoreFoundation
$ python3.8-config --libs --embed
-lpython3.8 -ldl -framework CoreFoundation
I am following a similar example and adapted the Makefile from here. I have installed Python 3.7.4 and boost-python via brew on macOS. To fix the NoneType issue, I follow the procedure below:
1. Check the Python Path
To check the python path, use
which python
If the output does not look like the following one (brew's python installation path)
/usr/local/opt/python/libexec/bin/python
set the PATH variable as
export PATH="/usr/local/opt/python/libexec/bin:$PATH"
Then check that the Python path matches the one above.
2. Check the Compilation Flag
Below is the adapted Makefile. Note the LIB variable: if the boost-python flag is -lboost_python, change it to -lboost_python37.
CPP = clang++
PYLIBPATH = $(shell python-config --exec-prefix)/lib
# LIB = -L$(PYLIBPATH) $(shell python-config --libs) -lboost_python
LIB = -L$(PYLIBPATH) $(shell python-config --libs) -lboost_python37
OPTS = $(shell python-config --include) -O2

default: hello.so

hello.so: hello.o
	$(CPP) $(LIB) -Wl,-rpath,$(PYLIBPATH) -shared $< -o $@

hello.o: hello.cpp Makefile
	$(CPP) $(OPTS) -c $< -o $@

clean:
	rm -rf *.so *.o

.PHONY: default clean
Recompile the C++ code and run the python script. The NoneType issue should disappear.
Hope this helps.
Note
If you are using anaconda and want to restore the PATH variable after the above changes, try
export PATH="~/anaconda3/bin:$PATH"
Your anaconda's path may be different.
Credit
1. George's comment in How do I use brew installed Python as the default Python?
2. leiyc's comment in ld: library not found for -lboost_python on MacOS
Following up on Nehal's answer for Anaconda-based builds: rereading the FindPython documentation for CMake shows that the Python3::Module target was added in CMake 3.15 for creating modules. That means the CMakeLists.txt should be:
set(Python3_FIND_VIRTUALENV FIRST)
find_package(Python3 REQUIRED Development)
find_package(Boost REQUIRED python3)
add_library(classes MODULE classes.cpp)
target_link_libraries(classes PRIVATE Python3::Module Boost::python3)
Apparently, Python3::Python is for embedding Python in another application.
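After rebuilding the module with that CMakeLists.txt, the original repro from the question should work (a quick check, using the methods defined in classes.cpp above):
# Quick check that the rebuilt boost-python module instantiates cleanly.
import classes

w = classes.World()
w.set("hello")
print(w.greet())  # expected output: hello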

Building graph-tool on OSX with Python 3.4

I tried to install graph-tool on Mac OSX 10.10 using homebrew. The brew build process works fine, but when I try to import graph-tool I get the error described in this question.
Another problem with homebrew is that it always builds graph-tool for Python 2.7 and installs the packages in the Python 2.7 site-packages folder, but I want to use it with Python 3.4. These are the reasons why I tried to build graph-tool from source.
The ./configure command automatically uses Python 2.7, too. So I passed it the desired Python version with ./configure PYTHON=python3.4
It then detects the correct version as well as the related paths, but crashes with the following error:
configure: error:
Could not link test program to Python. Maybe the
main Python library has been installed in some non-standard library
path. If so, pass it to configure, via the LDFLAGS environment
variable.
Example: ./configure LDFLAGS="-L/usr/non-standard-path/python/lib"
======================================================================
ERROR!
You probably have to install the development version of the
Python package for your distribution. The exact name of this package varies
among them.
======================================================================
The error occurs with and without the PYTHON variable set.
From the output of ./configure I can see that everything works fine except for the last line, which says:
checking consistency of all components of python development
environment... no
What does the above line mean, and how do I properly install graph-tool on my machine?
The error message is explaining exactly what needs to be done. Since python was installed in a non-standard path, you need to pass the flag LDFLAGS="-L/usr/non-standard-path/python/lib" pointing to the directory where the python libraries are located. This is most likely "/usr/local/lib", if you are using homebrew.
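One way to find that directory for the exact interpreter you are targeting is to ask Python itself (a minimal sketch; run it with the same interpreter you pass to ./configure):
# Print the library directory of the interpreter running this script,
# e.g. somewhere under /usr/local/... for a homebrew Python.
import sysconfig
print(sysconfig.get_config_var("LIBDIR"))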
I was getting this error when I was trying to install graph-tool using an outdated autoconf / automake / pkg-config combination (installed using yum on CentOS 5.10). Installing those packages from source fixed the problem... although I'm not sure how this relates to my Python installation.
It worked for me after passing the variable PYTHON_EXTRA_LDFLAGS="-Wl,-stack_size,1000000 -F/usr/local/Cellar/python3/3.6.3/Frameworks -framework CoreFoundation".
In your case, it would be the path to the homebrew installation of python3.4.
The way I found out: in config.log, the error message shows the following:
configure:19023: checking python extra libraries
configure:19030: result: -ldl -framework CoreFoundation
configure:19037: checking python extra linking flags
configure:19044: result: -Wl,-stack_size,1000000 -framework CoreFoundation Python.framework/Versions/3.6/Python
configure:19051: checking consistency of all components of python development environment
configure:19079: gcc -o conftest -g -O2 -DNDEBUG -I/usr/local/Cellar/python3/3.6.3/Frameworks/Python.framework/Versions/3.6/include/python3.6m -F/usr/local/Cellar/python3/3.6.3/Frameworks/ -Wl,-stack_size,1000000 -framework CoreFoundation Python.framework/Versions/3.6/Python conftest.c -L/usr/local/opt/python3/Frameworks/Python.framework/Versions/3.6/lib -lpython3.6m -ldl -framework CoreFoundation -ldl -framework CoreFoundation >&5
clang: error: no such file or directory: 'Python.framework/Versions/3.6/Python'
The problem seems to be the path 'Python.framework/Versions/3.6/Python', which does not exist in a homebrew installation. I searched for the same path in config.log and found this line:
PYTHON_EXTRA_LDFLAGS="-Wl,-stack_size,1000000 -framework CoreFoundation Python.framework/Versions/3.6/Python"
So, the solution for me was to pass this variable with the right path.

Making Cython work with Python 3.4 on Anacondas, Windows 7 64-bit

I have just installed Python 3.4 on my Windows 7 64-bit machine, using Anaconda/conda.
When I run the "hello world" cython example I get this error:
[py34] C:\Users\Jon\Documents\GitHub\CythonFunctions\cython_funcs>python setup.py build_ext --inplace
running build_ext
building 'cython_funcs.hello' extension
C:\Anaconda\envs\py34\MinGW\bin\gcc.exe -mdll -O -Wall -IC:\Anaconda\envs\py34\include -IC:\Anaconda\envs\py34\include -c hello.c -o build\temp.win-amd64-3.4\Release\hello.o
writing build\temp.win-amd64-3.4\Release\hello.def
C:\Anaconda\envs\py34\MinGW\bin\gcc.exe -shared -s build\temp.win-amd64-3.4\Release\hello.o build\temp.win-amd64-3.4\Release\hello.def -LC:\Anaconda\envs\py34\libs -LC:\Anaconda\envs\py34\PCbuild\amd64 -lpython34 -lmsvcr100 -o C:\Users\Jon\Documents\GitHub\CythonFunctions\cython_funcs\cython_funcs\hello.pyd
build\temp.win-amd64-3.4\Release\hello.o:hello.c:(.text+0x314): undefined reference to `__imp__PyThreadState_Current'
build\temp.win-amd64-3.4\Release\hello.o:hello.c:(.text+0x493): undefined reference to `__imp__Py_NoneStruct'
build\temp.win-amd64-3.4\Release\hello.o:hello.c:(.text+0x97b): undefined reference to `__imp_PyExc_ImportError'
collect2.exe: error: ld returned 1 exit status
error: command 'C:\\Anaconda\\envs\\py34\\MinGW\\bin\\gcc.exe' failed with exit status 1
From searching Stack Overflow and Google, this error occurs when gcc and Python are not both 32-bit or both 64-bit.
I have checked that my Python is 64-bit. The MinGW I have, as can be seen from the path above, was part of my Python installation. How can I check whether it is 64-bit or not? Or could this error be due to something else?
Update:
Strangely, the IPython cythonmagic command here works fine:
http://docs.cython.org/src/quickstart/build.html?highlight=cythonmagic
One way would be to conda remove libpython (this causes distutils to stop using MinGW), install Visual Studio 2010, and use that to compile.
Your gcc line is missing a define: -DMS_WIN64. Anaconda (I presume) has modified the file cygwinccompiler.py in Lib/distutils of the 2.7 envs, but this modification is not present in the distutils of the 3.4 env. I was getting different errors from yours, but this change fixed my setup.
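One way to inject that define without patching distutils is through the Extension itself -- a hedged sketch, assuming the standard Cython hello-world layout with a hello.pyx file:
# setup.py -- builds the Cython "hello" extension while passing the
# MS_WIN64 define that the MinGW command line is missing.
from distutils.core import setup
from distutils.extension import Extension
from Cython.Build import cythonize

ext = Extension(
    "hello",
    sources=["hello.pyx"],
    define_macros=[("MS_WIN64", None)],  # equivalent to -DMS_WIN64 on the compile line
)

setup(ext_modules=cythonize([ext]))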
I ran into a similar problem with Anaconda, but it worked when I stopped trying to compile in the py34 directory. Instead, put your helloworld.pyx file (make sure it has the .pyx extension) in the Anaconda3 folder along with setup.py, and make sure you are in the Anaconda3 folder when you compile, i.e. run python setup.py build_ext --inplace from C:\Anaconda3. If memory serves, this ran just fine.
This might work for linking problems (to be done in a temporary directory; use copy instead of cp if you're not in an MSYS2 environment):
gendef c:/Windows/System32/python34.dll
dlltool -U -d python34.def -l libpython34.dll.a
cp libpython34.dll.a c:/Python34/libs
If gendef is unable to access python34.dll, it can be copied using Windows Explorer before running the gendef command.
gendef is available at least with mingw-w64 packages.

swig-generated code links to wrong python installation

I have the following problem: I'm building a Python module using swig to wrap C code.
I have installed Python, gcc (4.5), etc. using MacPorts.
Here's a minimal setup which reproduces the problem:
Two files:
test.i:
%module test
double sum(double a, double b);
test.c:
double sum(double a, double b){return a+b;}
I run
$ swig -python -I. test.i
$ gcc -fPIC -I/opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c test_wrap.c
$ gcc -c -o test.o test.c
$ gcc -shared -I/opt/local/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -lpython -dynamiclib -fPIC -o _test.so test.o test_wrap.o
When I run python (the MacPorts one: /opt/local/bin/python2.7) and try to import the module via import test, the code crashes with exactly the same problem as above.
Examining the file _test.so with otool yields the following:
$ otool -L _test.so
_test.so:
_test.so (compatibility version 0.0.0, current version 0.0.0)
/System/Library/Frameworks/Python.framework/Versions/2.7/Python (compatibility version 2.7.0, current version 2.7.2)
/usr/lib/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1669.0.0)
/opt/local/lib/libgcc/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 169.3.0)
I found out that one of the swig-generated files contains the line #include <Python.h>. However, there is a Python.h in /System/Library/... and one in /opt/local/...
My guess is that the mistake is happening here. But how can I make the compiler/linker point to the correct one?
Thanks a lot for any help!!
Thomas
I don't have a Mac, but I've had good results using distutils to deal with multiple Python versions on Linux and Windows when using swig (see the swig and distutils docs). The following shows a minimal setup.py:
from distutils.core import setup, Extension
ext = Extension('_test', sources=['test.c', 'test.i'])
setup(name='test', ext_modules=[ext], py_modules=["test"])
Distutils binds to the calling Python version and knows about swig. However, it is important to compile with:
python setup.py build_ext
python setup.py build_py
and not by calling build directly.
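Once built this way, a quick smoke test (a sketch; the exact build/lib.* directory name depends on your platform and Python version, so run it from wherever _test.so and test.py ended up):
# The C function sum(double, double) from test.c should be callable
# through the swig-generated test module.
import test
print(test.sum(1.5, 2.5))  # expected output: 4.0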

Undefined symbol when importing Python Sybase module on OSX 10.6

I'm trying to get the python-sybase module working on OSX 10.6, but I've run into a bit of a snag.
When I do
import Sybase
I get
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "Sybase.py", line 15, in <module>
from sybasect import *
ImportError: dlopen(/Library/Python/2.6/site-packages/python_sybase-0.40pre2-py2.6-macosx-10.6-universal.egg/sybasect.so, 2): Symbol not found: _blk_alloc
Referenced from: /Library/Python/2.6/site-packages/python_sybase-0.40pre2-py2.6-macosx-10.6-universal.egg/sybasect.so
Expected in: flat namespace
in /Library/Python/2.6/site-packages/python_sybase-0.40pre2-py2.6-macosx-10.6-universal.egg/sybasect.so
I took a look at sybasect.so, and sure enough, _blk_alloc is undefined. The function is located in Sybase's sybblk.dylib, which is installed, and its containing directory is in LD_LIBRARY_PATH.
When I compiled python-sybase using python setup.py build, the gcc command appears to find all of the right libs correctly, but for some reason those libs don't appear to be linked after installing sybasect.so to the Python module dir.
The gcc command is
gcc-4.2 -Wl,-F. -bundle -undefined dynamic_lookup -arch i386 -arch ppc -arch x86_64 build/temp.macosx-10.6-universal-2.6/blk.o build/temp.macosx-10.6-universal-2.6/databuf.o build/temp.macosx-10.6-universal-2.6/cmd.o build/temp.macosx-10.6-universal-2.6/conn.o build/temp.macosx-10.6-universal-2.6/ctx.o build/temp.macosx-10.6-universal-2.6/datafmt.o build/temp.macosx-10.6-universal-2.6/iodesc.o build/temp.macosx-10.6-universal-2.6/locale.o build/temp.macosx-10.6-universal-2.6/msgs.o build/temp.macosx-10.6-universal-2.6/numeric.o build/temp.macosx-10.6-universal-2.6/money.o build/temp.macosx-10.6-universal-2.6/datetime.o build/temp.macosx-10.6-universal-2.6/date.o build/temp.macosx-10.6-universal-2.6/sybasect.o -L/Applications/Sybase/System/OCS-15_0/lib -lsybblk -lsybct -lsybcs -lsybtcl -lsybcomn -lsybintl -lsybunic -o build/lib.macosx-10.6-universal-2.6/sybasect.so
The -L/Applications/Sybase/System/OCS-15_0/lib location is correct, and that folder contains all of the right .dylib's.
When I run otool the output is:
$ otool -L build/lib.macosx-10.6-universal-2.6/sybasect.so
build/lib.macosx-10.6-universal-2.6/sybasect.so:
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 125.2.11)
I was expecting to see the Sybase libs in there.
I'm a little new to linking on Mac. How do I ensure that sybasect.so references the Sybase libs?
Fixed it.
The problem was that the various Sybase libraries I was linking to were 32-bit only, but I was running Python in 64-bit mode. The fix was just to run Python in 32-bit mode.
I used the command defaults write com.apple.versioner.python Prefer-32-Bit -bool yes since I don't have any particular need for 64-bit mode.
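For anyone hitting the same mismatch, a quick way to confirm which mode the interpreter is actually running in (prints 64 for a 64-bit process, 32 for a 32-bit one):
# Pointer size of the running interpreter, in bits.
import struct
print(struct.calcsize("P") * 8)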
