Cannot find -lpython3.5 -> Eclipse CPP - python

I'm trying to run Python commands inside a C++ project using Eclipse.
I've already included "/usr/include/python3.5" in my include paths and in the library search path for the Cross G++ Linker. Under Miscellaneous for the Cross G++ Linker I've added -lpython3.5.
With these settings, my build command looks like: g++ -L/usr/include/python3.5/ -lpython3.5 -o "CppPyTest" ./src/CppPyTest.o
However, I get the following error:
/usr/bin/ld: cannot find -lpython3.5
makefile:45: recipe for target 'CppPyTest' failed
If I remove -lpython3.5, I get the error:
undefined reference to `Py_Initialize'
My full code is:
#include "Python.h"
#include <iostream>
using namespace std;

int main(int argc, char *argv[]) {
    Py_Initialize();
    PyRun_SimpleString("from time import time,ctime\n"
                       "print 'Today is',ctime(time())\n");
    Py_Finalize();
    return 0;
}

This shows how to compile and link your code.
It gives the proper parameters for either the Python from the package manager or the Anaconda package.
Code (made Python 3 compatible):
#include "Python.h"
#include <iostream>
using namespace std;

int main(int argc, char *argv[]) {
    Py_Initialize();
    PyRun_SimpleString("from time import time,ctime\n"
                       "print('Today is', ctime(time()))\n");
    Py_Finalize();
    return 0;
}
Calling (for Anaconda)
g++ -I/opt/anaconda3/include/python3.7m CppPyTest.cpp -L/opt/anaconda3/lib -lpython3.7m
or (Python from the package manager)
g++ -I/usr/include/python3.6m/ CppPyTest.cpp -lpython3.6m
creates an executable a.out in the folder. (Note the source file comes before the -l flags; the linker only resolves symbols from libraries listed after the objects that need them.)
We can execute a.out by
export LD_LIBRARY_PATH=/opt/anaconda3/lib:$LD_LIBRARY_PATH
./a.out
giving the output:
Today is Tue Jul 7 14:56:31 2020
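As a side note (not part of the original answer): rather than hard-coding the Anaconda or system paths, the interpreter can be asked for its own header and library locations via the standard sysconfig module:

```python
# Query the interpreter for the directories to pass as -I and -L, and the
# version tag for -lpythonX.Y; this avoids guessing installation paths.
import sysconfig

include_dir = sysconfig.get_path("include")        # -> -I<include_dir>
lib_dir = sysconfig.get_config_var("LIBDIR")       # -> -L<lib_dir>
ldversion = sysconfig.get_config_var("LDVERSION")  # e.g. "3.7m" -> -lpython3.7m

print(f"g++ -I{include_dir} CppPyTest.cpp -L{lib_dir} -lpython{ldversion}")
```

Running this under the target interpreter (system Python or Anaconda's) prints a compile command with that interpreter's own paths filled in.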
To help you adjust the paths properly, here is the content of the Anaconda folders:
$ ls /opt/anaconda3/lib/libpython*
/opt/anaconda3/lib/libpython3.7m.a
/opt/anaconda3/lib/libpython3.7m.nolto.a
/opt/anaconda3/lib/libpython3.7m.so
/opt/anaconda3/lib/libpython3.7m.so.1.0
/opt/anaconda3/lib/libpython3.so
$ ls /opt/anaconda3/include/python3.7m/
Python-ast.h errcode.h object.h pymem.h
Python.h eval.h objimpl.h pyport.h
abstract.h fileobject.h odictobject.h pystate.h
accu.h fileutils.h opcode.h pystrcmp.h
asdl.h floatobject.h osdefs.h pystrhex.h
ast.h frameobject.h osmodule.h pystrtod.h
bitset.h funcobject.h parsetok.h pythonrun.h
bltinmodule.h genobject.h patchlevel.h pythread.h
boolobject.h graminit.h pgen.h pytime.h
bytearrayobject.h grammar.h pgenheaders.h rangeobject.h
bytes_methods.h greenlet py_curses.h setobject.h
bytesobject.h import.h pyarena.h sip.h
cellobject.h internal pyatomic.h sliceobject.h
ceval.h intrcheck.h pycapsule.h structmember.h
classobject.h iterobject.h pyconfig.h structseq.h
code.h listobject.h pyctype.h symtable.h
codecs.h longintrepr.h pydebug.h sysmodule.h
compile.h longobject.h pydtrace.h token.h
complexobject.h marshal.h pyerrors.h traceback.h
context.h memoryobject.h pyexpat.h tupleobject.h
datetime.h metagrammar.h pyfpe.h typeslots.h
descrobject.h methodobject.h pyhash.h ucnhash.h
dictobject.h modsupport.h pylifecycle.h unicodeobject.h
dtoa.h moduleobject.h pymacconfig.h warnings.h
dynamic_annotations.h namespaceobject.h pymacro.h weakrefobject.h
enumobject.h node.h pymath.h

Related

Call a function from a cython generated .so file inside a c++ code

My goal is to call Python functions from C++. These Python functions must be compiled with Cython into a .so file, and the .so file must be the one that communicates with the C++ program.
First of all: I am on Ubuntu, working with miniconda and Python 3.9.
I am in a folder (~/exp) laid out like this:
exp
    exp
        __init__.py
        main.py
    setup.py
    run.cpp
    run.py
I translated main.py to main.pyx; the file contains this code:
def add(a, b):
    return a + b

def entry_point():
    print(add(21, 21))
I compiled this script with Cython and obtained a .so file, using this setup.py script:
from setuptools import setup
from Cython.Build import cythonize

setup(
    name='exp',
    ext_modules=cythonize("exp/main.pyx"),
    libraries=[('python3.9', {'include_dirs': ["~/miniconda3/include/python3.9"]})],
    library_dirs=['~/miniconda3/lib']
)
And this command:
python3 setup.py build_ext --inplace
Now I have a main.cpython-39-x86_64-linux-gnu.so file in ~/exp/exp.
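That long suffix is not arbitrary; it comes from the interpreter's build configuration, which can be checked directly (a side note for anyone wondering where the name comes from):

```python
# setuptools names a compiled extension <module><EXT_SUFFIX>; the suffix
# encodes the CPython version, ABI tag, and platform.
import sysconfig

suffix = sysconfig.get_config_var("EXT_SUFFIX")
print("main" + suffix)  # e.g. main.cpython-39-x86_64-linux-gnu.so on the asker's setup
```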
When I launch run.py, which contains:
from exp.main import entry_point

if __name__ == "__main__":
    entry_point()
I get the expected behavior: it prints 42.
Now, here come the problems.
I compile my run.cpp source, which contains:
#include <iostream>
#include <dlfcn.h>
#include "Python.h"

int main(int argc, char *argv[]) {
    setenv("PYTHONPATH", ".", 1);
    Py_Initialize();
    PyObject *pName, *pModule, *pDict, *pFunc, *pValue, *presult;
    // Initialize the Python Interpreter
    Py_Initialize();
    // Build the name object
    pName = PyUnicode_FromString((char*)"exp/main");
    // Load the module object
    pModule = PyImport_Import(pName);
    // pDict is a borrowed reference
    pDict = PyModule_GetDict(pModule);
    // pFunc is also a borrowed reference
    pFunc = PyDict_GetItemString(pDict, (char*)"entry_point");
    Py_DECREF(pValue);
    // Clean up
    Py_DECREF(pModule);
    Py_DECREF(pName);
    Py_Finalize();
}
with the command:
g++ -Wall -I~/miniconda3/include/python3.9 run.cpp -L~/miniconda3/lib -lpython3.9 -o run.o -ldl
And then execute: ./run.o
To end with a beautiful:
Segmentation fault (core dumped)
I tried with dlopen, without success either.
Maybe I missed something; any help would be welcome.
Thank you :)
Firstly, thank you to Craig Estey and DavidW for their comments.
I finally was able to make it work; two things were wrong:
pValue was never assigned, so the Py_DECREF on it crashed
the module path "exp/main" was indeed not valid, but "exp.main" was
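The second point mirrors how Python's own import machinery behaves; it is easy to check from the interpreter (using os.path as a stand-in for exp.main):

```python
# PyImport_Import follows the same rule as importlib: module names are
# dotted, never slash-separated paths ("exp.main" works, "exp/main" cannot).
import importlib

mod = importlib.import_module("os.path")        # dotted name resolves
print(mod.__name__)

try:
    importlib.import_module("os/path")          # path-style name fails
except ModuleNotFoundError as err:
    print("rejected:", err)
```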
One last thing I had omitted was the PyObject_CallObject call that actually invokes my PyObject pFunc.
I finally got my '42' answer.
Here is the final code:
#include <iostream>
#include <dlfcn.h>
#include "Python.h"

int main(int argc, char *argv[]) {
    setenv("PYTHONPATH", ".", 1);
    // Initialize the Python Interpreter
    Py_Initialize();
    PyObject *pName, *pModule, *pDict, *pFunc, *presult;
    // Build the name object
    pName = PyUnicode_FromString((char*)"exp.main");
    // Load the module object
    pModule = PyImport_Import(pName);
    // pDict is a borrowed reference
    pDict = PyModule_GetDict(pModule);
    // pFunc is also a borrowed reference
    pFunc = PyDict_GetItemString(pDict, (char*)"entry_point");
    presult = PyObject_CallObject(pFunc, NULL);
    // Py_DECREF(pValue);
    // Clean up
    Py_DECREF(pModule);
    Py_DECREF(pName);
    Py_Finalize();
}
(Craig pointed out that an executable file should not end in '.o'; learn more: What is a *.o file)
So, the new compile command is:
g++ -Wall -I~/miniconda3/include/python3.9 run.cpp -L~/miniconda3/lib -lpython3.9 -o run

Embedding Python in C: Undefined reference (but works in Go)

I'm trying to embed Python 3.7 in a C application in Windows 10.
#define PY_SSIZE_T_CLEAN
#include <Python.h>

int main () {
    Py_Initialize();
    PyRun_SimpleString("print('OK')");
}
I use the following command to compile (MinGW-W64-builds-4.3.4, gcc 7.3.0):
gcc "-IC:/Program Files/Python37_64/include" "-LC:/Program Files/Python37_64/libs" -lpython37 main.c
But it gives the following error:
C:\Users\Paul\AppData\Local\Temp\ccKQF3zu.o:main.c:(.text+0x10): undefined reference to `__imp_Py_Initialize'
C:\Users\Paul\AppData\Local\Temp\ccKQF3zu.o:main.c:(.text+0x25): undefined reference to `__imp_PyRun_SimpleStringFlags'
collect2.exe: error: ld returned 1 exit status
The strange thing is, when I try the same in Go 1.13 (Golang), it does work:
package main

/*
#cgo CFLAGS: "-IC:/Program Files/Python37_64/include"
#cgo LDFLAGS: "-LC:/Program Files/Python37_64/libs" -lpython37
#define PY_SSIZE_T_CLEAN
#include <Python.h>
void run () {
    Py_Initialize();
    PyRun_SimpleString("print('OK')");
}
*/
import "C"

func main () {
    C.run()
}
Compile command:
go build python.go
How to fix this?
I found the solution in this answer.
The argument main.c has to be put somewhere before -lpython37.
So this works:
gcc "-IC:/Program Files/Python37_64/include" "-LC:/Program Files/Python37_64/libs" main.c -lpython37

Import error "undefined symbol: _ZNK9FastNoise8GetNoiseEff" when calling C++ extension in Python 3.6

I am currently trying to write some C++ extensions for a Python script. On the C++ side everything seems to compile just fine and it generates my .so shared library, but when I import it in my Python script it raises an undefined symbol error. The current code is as follows:
#include <iostream>
#include "FastNoise.h"
#include <string>
#include <time.h>
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/highgui.hpp>
#include <boost/python/def.hpp>
#include <boost/python/module.hpp>

using namespace std;
using namespace cv;
namespace bp = boost::python;

int gen(int _size)
{
    FastNoise myNoise;
    myNoise.SetNoiseType(FastNoise::Simplex);
    myNoise.SetSeed((int)(rand() * time(NULL)));
    Size img_size(_size, _size);
    Mat noise_map(img_size, CV_32FC3);
    for (int y = 0; y < _size; y++) {
        for (int x = 0; x < _size; x++) {
            Vec3f &color = noise_map.at<Vec3f>(Point(x, y));
            color.val[0] = (myNoise.GetNoise(x, y) + 1) / 2;
            color.val[1] = (myNoise.GetNoise(x, y) + 1) / 2;
            color.val[2] = (myNoise.GetNoise(x, y) + 1) / 2;
        }
    }
    imshow("test", noise_map);
    waitKey(0);
    return 0;
}

BOOST_PYTHON_MODULE(gen) {
    bp::def("gen", gen);
}
And here is how I compiled it:
g++ main.cpp -I/opt/opencv/include/opencv -I/usr/include/python3.6m -I/usr/local/include/boost -L/opt/opencv/release/lib -L/usr/lib/python3.6/config-3.6m-x86_64-linux-gnu -L/usr/local/lib -lopencv_core -lopencv_highgui -lopencv_imgcodecs -lpython3.6m -lboost_python36 -o NoiseModule.so -shared -fPIC
When I import it within python it gives me this error:
>>> import NoiseModule
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: /home/matheus/PycharmProjects/TerrainGenerator/NoiseGenerator/NoiseModule.so: undefined symbol: _ZNK9FastNoise8GetNoiseEff
>>>
Any kind of help with this problem will be really appreciated.
Your shared object doesn't have access to every function you use. You probably have a file like FastNoise.cpp which implements your FastNoise class, yet you only pass main.cpp when compiling your dynamic library (.so). Make sure all .cpp files are included in the build of your Python C++ extension.
Another option is to implement your FastNoise class entirely inside the header.

Segmentation fault using boost python and numpy

I have a boost::python library and am using numpy directly (not boost::numpy). My Python script uses my library and its functions without any issue. However, when my script ends, numpy segfaults. I have tried to produce an MWE that shows the issue.
The following Python script will sometimes segfault after finishing:
test.py
import arrayTest
arr = arrayTest.getTestArray()
print(arr)
pythonModule.cpp
#include <boost/python.hpp>
#define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
#include <numpy/ndarrayobject.h>
#include <vector>

namespace detail
{
#if PY_MAJOR_VERSION >= 3
    inline void* init()
    {
        import_array();
    }
#else
    inline void init()
    {
        import_array();
    }
#endif
}

boost::python::object getNPArray()
{
    using namespace boost::python;
    auto data = std::vector<double>(2*3*4);
    data.at(6) = 12;
    npy_intp shape[3] = { 2,3,4 };
    PyObject* obj = PyArray_New(
        &PyArray_Type,
        3,
        shape,
        NPY_DOUBLE,
        nullptr,
        data.data(),
        0,
        NPY_ARRAY_FARRAY,
        nullptr);
    handle<> handle( obj );
    numeric::array arr( handle );
    return arr.copy();
}

BOOST_PYTHON_MODULE(arrayTest)
{
    detail::init();
    using namespace boost::python;
    numeric::array::set_module_and_type(
        "numpy",
        "ndarray");
    def("getTestArray", &getNPArray);
}
To make it easy for you to compile I have also prepared a cmake script:
CMakeLists.txt
cmake_minimum_required(VERSION 3.6 FATAL_ERROR)
project("testArray")

set(Boost_USE_STATIC_LIBS OFF)
set(Boost_USE_MULTITHREADED ON)
set(Boost_USE_STATIC_RUNTIME OFF)
find_package(Boost COMPONENTS python3 REQUIRED)
find_package(PythonLibs 3 REQUIRED)

# If you have your own findNumpy you can remove this part
find_package(PythonInterp 3)
if(PYTHON_EXECUTABLE)
    execute_process(
        COMMAND "${PYTHON_EXECUTABLE}" -c "try: import numpy; print(numpy.get_include(), end='')\nexcept:pass\n" OUTPUT_VARIABLE path)
    execute_process(
        COMMAND "${PYTHON_EXECUTABLE}" -c "try: import numpy; print(numpy.__version__, end='')\nexcept:pass\n" OUTPUT_VARIABLE version)
    find_path(PYTHON_NUMPY_INCLUDE_DIRS numpy/ndarrayobject.h
        HINTS "${path}" "${PYTHON_INCLUDE_PATH}" NO_DEFAULT_PATH)
    if(PYTHON_NUMPY_INCLUDE_DIRS)
        set(PYTHON_NUMPY_VERSION ${version})
    endif()
else()
    message(SEND_ERROR "Python executable not found => Cannot determine numpy location.")
endif()
include(FindPackageHandleStandardArgs)
find_package_handle_standard_args(NumPy REQUIRED_VARS PYTHON_NUMPY_INCLUDE_DIRS VERSION_VAR PYTHON_NUMPY_VERSION)
# end of findNumpy part

include_directories(${PYTHON_INCLUDE_DIRS} ${PYTHON_NUMPY_INCLUDE_DIRS})
set(libs
    ${Boost_LIBRARIES}
    ${PYTHON_LIBRARIES})
add_library(arrayTest SHARED pythonModule.cpp)
target_link_libraries(arrayTest ${libs})
set_target_properties(arrayTest
    PROPERTIES
    PREFIX "")
Edit:
In the NumPy documentation I found the following:
If data is passed to PyArray_NewFromDescr or PyArray_New, this memory must not be deallocated until the new array is deleted.
So maybe I need to manually delete the array? How do I do this?
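The ownership rule quoted above can be observed from pure Python (numpy assumed available; this is an illustration of the rule, not an answer to the lifetime question):

```python
# An array wrapping foreign memory does not own it (like PyArray_New with
# a caller-supplied data pointer); .copy() produces one that does, which
# is what arr.copy() achieves in the C++ code above.
import numpy as np

buf = bytearray(8 * 24)                       # room for 2*3*4 doubles
view = np.frombuffer(buf, dtype=np.float64)   # borrows buf's memory
owned = view.copy()                           # owns a fresh buffer

print(view.flags["OWNDATA"], owned.flags["OWNDATA"])  # False True
```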

Error while creating a python binding

This C program compiles and runs fine:
#ifdef HAVE_CONFIG_H
#include <config.h>
#endif
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <getopt.h>
#include <atasmart.h>

int main(){
    const char *device = "/dev/sda";
    int ret;
    uint64_t ms;
    SkDisk *d;
    if ((ret = sk_disk_open(device, &d)) < 0) {
        printf("Failed to open disk\n");
        return 1;
    }
    if ((ret = sk_disk_smart_read_data(d)) < 0) {
        printf("Failed to read SMART data: \n");
    }
    if ((ret = sk_disk_smart_get_power_on(d, &ms)) < 0) {
        printf("Failed to get power on time:\n");
    }
    printf("%llu\n", (unsigned long long) ms);
    return 0;
}
using:
gcc atatest.c `pkg-config --cflags --libs libatasmart`
However, while trying to create a Python binding based on that program:
#ifdef HAVE_CONFIG_H
#include <config.h>
#endif
#include <stdio.h>
#include <string.h>
#include <errno.h>
#include <getopt.h>
#include <atasmart.h>
#include <Python.h>

static PyObject *pySmart_powerOn(PyObject *self, PyObject *args)
{
    const char *device = "/dev/sda";
    int ret;
    uint64_t ms;
    SkDisk *d;
    if (!PyArg_ParseTuple(args, "s", &device))
    {
        return NULL;
    }
    if ((ret = sk_disk_smart_get_power_on(d, &ms)) < 0) {
        return Py_BuildValue("s", "Failed to get power on time");
    }
    return Py_BuildValue("K", (unsigned long long) ms);
}

static PyMethodDef pySmart_methods[] = {
    { "powerOn", (PyCFunction)pySmart_powerOn, METH_VARARGS, NULL },
    { NULL, NULL, 0, NULL }
};

PyMODINIT_FUNC initpySmart()
{
    Py_InitModule3("pySmart", pySmart_methods, "Trial module");
}
I create a shared library using:
gcc -shared -I/usr/include/python2.7 `pkg-config --cflags --libs libatasmart` atabind.c -o pySmart.so -fPIC
Then I get the following warning, but the file compiles:
In file included from /usr/include/python2.7/Python.h:8:0,
from atabind.c:12:
/usr/include/python2.7/pyconfig.h:1158:0: warning: "_POSIX_C_SOURCE" redefined [enabled by default]
/usr/include/features.h:214:0: note: this is the location of the previous definition
When in Python I run:
import pySmart
I get:
ImportError: ./pySmart.so: undefined symbol: sk_disk_smart_get_power_on
My guess is that the error occurs because I compiled the pySmart.so shared library with incorrect flags/options, but I'm unable to figure it out!
You need to specify linker flags (-lfoo) after your source files. That's because of the way the linker works: when it encounters a library, it checks it for symbols that are needed so far. If no symbols are needed yet (as when you haven't processed any source objects), it simply skips the library.
Try the following commandline:
gcc -shared -I/usr/include/python2.7 \
`pkg-config --cflags libatasmart` \
atabind.c \
`pkg-config --libs libatasmart` \
-o pySmart.so -fPIC
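The single-pass behavior described above is easy to reproduce with a toy static archive (the file names here are made up just for the demonstration):

```shell
# Toy reproduction of why library order matters to the linker.
cat > libdemo.c <<'EOF'
int answer(void) { return 42; }
EOF
cat > use.c <<'EOF'
int answer(void);
int main(void) { return answer() == 42 ? 0 : 1; }
EOF
gcc -c libdemo.c -o libdemo.o
ar rcs libdemo.a libdemo.o

# Source before the library: 'answer' is needed when the archive
# is scanned, so it is pulled in and the link succeeds.
gcc use.c libdemo.a -o ok

# Library before the source: nothing needs 'answer' yet when the
# archive is scanned, so it is skipped -> undefined reference.
gcc libdemo.a use.c -o broken 2>/dev/null || echo "undefined reference, as expected"
```

The same mechanism explains the pySmart.so import error: `pkg-config --libs` expanded before atabind.c, so libatasmart was scanned (and skipped) before any of its symbols were needed.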
You should include Python.h first, before any standard header.
All function, type and macro definitions needed to use the Python/C API are included in your code by
#include "Python.h"
This implies inclusion of the following standard headers:
<stdio.h>, <string.h>, <errno.h>, <limits.h>, <assert.h> and <stdlib.h> (if available).
Since Python may define some pre-processor definitions which affect the standard headers on some systems, you must include Python.h before any standard headers are included.
Alternatively: define just _GNU_SOURCE, and the _POSIX_C_SOURCE redefinition will be handled by GNU libc's /usr/include/features.h.
