How to import a function from a Python file with Boost.Python - python

I am totally new to Boost.Python.
I have reviewed many recommendations for using Boost.Python with Python, but I still find it hard to understand and to find a solution for my case.
What I want is to import a function or class directly from a Python source file.
Example files:
Main.cpp
MyPythonClass.py
Let's say there is a "Dog" class in "MyPythonClass.py" with a "bark()" function. How do I call it and pass an argument from C++?
I have no idea what I should do!
Please help me!

When one needs to call Python from C++, and C++ owns the main function, then one must embed the Python interpreter within the C++ program. The Boost.Python API is not a complete wrapper around the Python/C API, so one may find the need to directly invoke parts of the Python/C API. Nevertheless, Boost.Python's API can make interoperability easier. Consider reading the official Boost.Python embedding tutorial for more information.
Here is a basic skeleton for a C++ program that embeds Python:
int main()
{
    // Initialize Python.
    Py_Initialize();

    namespace python = boost::python;
    try
    {
        ... Boost.Python calls ...
    }
    catch (const python::error_already_set&)
    {
        PyErr_Print();
        return 1;
    }

    // Do not call Py_Finalize() with Boost.Python.
}
When embedding Python, it may be necessary to augment the module search path via PYTHONPATH so that modules can be imported from custom locations.
// Allow Python to load modules from the current directory.
setenv("PYTHONPATH", ".", 1);
// Initialize Python.
Py_Initialize();
Often, the Boost.Python API provides a way to write C++ code in a Python-ish manner. The following example demonstrates embedding a Python interpreter in C++, having C++ import the MyPythonClass Python module from disk, instantiating an instance of MyPythonClass.Dog, and then invoking bark() on the Dog instance:
#include <boost/python.hpp>
#include <cstdlib> // setenv

int main()
{
    // Allow Python to load modules from the current directory.
    setenv("PYTHONPATH", ".", 1);
    // Initialize Python.
    Py_Initialize();

    namespace python = boost::python;
    try
    {
        // >>> import MyPythonClass
        python::object my_python_class_module = python::import("MyPythonClass");

        // >>> dog = MyPythonClass.Dog()
        python::object dog = my_python_class_module.attr("Dog")();

        // >>> dog.bark("woof")
        dog.attr("bark")("woof");
    }
    catch (const python::error_already_set&)
    {
        PyErr_Print();
        return 1;
    }

    // Do not call Py_Finalize() with Boost.Python.
}
Given a MyPythonClass module that contains:
class Dog():
    def bark(self, message):
        print "The dog barks: {}".format(message)
The above program outputs:
The dog barks: woof
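To pass arguments, hand them to the call as shown above (dog.attr("bark")("woof")); to get a value back in C++, boost::python::extract can be used inside the same try block. The fragment below is a minimal sketch and assumes a hypothetical Dog.get_sound() method that returns a str (not part of the original example):

    // >>> sound = dog.get_sound()   (get_sound() is a hypothetical method returning a str)
    python::object sound = dog.attr("get_sound")();
    // Convert the Python str into a C++ std::string; a mismatch raises error_already_set.
    std::string sound_text = python::extract<std::string>(sound);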

Boost.Python is also used to call C++ functions from a Python source, much like the Perl XS module.
If you have a function, say bark(), in main.cpp, you can use Boost.Python to turn main.cpp into a module that Python can import.
Then from a Python script (assuming the linker output file is main.so):
import main
main.bark()
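A minimal sketch of what that module could look like in main.cpp (the bark() body and the module name main are assumptions for illustration; the file has to be compiled and linked as a shared library named main.so):

    #include <boost/python.hpp>
    #include <iostream>

    // Hypothetical C++ function to be exposed to Python.
    void bark()
    {
        std::cout << "woof from C++" << std::endl;
    }

    // Expose bark() in a Python module named 'main'.
    BOOST_PYTHON_MODULE(main)
    {
        boost::python::def("bark", &bark);
    }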

Related

Python Initialization fails with dynamically loaded DLL [duplicate]

I have a C++ program with a sort of plugin structure: when the program starts up, it looks for DLLs in the plugin folder with certain exported function signatures, such as:
void InitPlugin(FuncTable* funcTable);
The program then calls the function in the DLL to initialize it and to pass function pointers to the DLL. From that point on, the DLL can talk to the program.
I know Cython lets you call C functions from Python, but I'm not sure whether I can write Cython code and compile it to a DLL so my C++ program can initialize with it. An example would be great.
Using a Cython module in a dll is not unlike using a Cython module in an embedded Python interpreter.
The first step would be to mark the cdef-functions which should be used from external C code as public, for example:
#cyfun.pyx:

#doesn't need python interpreter
cdef public int double_me(int me):
    return 2*me;

#needs initialized python interpreter
cdef public void print_me(int me):
    print("I'm", me);
cyfun.c and cyfun.h can be generated with
cython -3 cyfun.pyx
These files will be used for building the dll.
The dll will need one function to initialize the Python interpreter and another to finalize it; the initialization should be called exactly once before double_me and print_me can be used (OK, double_me would also work without the interpreter, but that is an implementation detail). Note: the initialization/clean-up could also be put in DllMain - see such a version further below.
The header file for the dll would look like the following:
//cyfun_dll.h
#ifdef BUILDING_DLL
#define DLL_PUBLIC __declspec(dllexport)
#else
#define DLL_PUBLIC __declspec(dllimport)
#endif
//return 0 if everything ok
DLL_PUBLIC int cyfun_init();
DLL_PUBLIC void cyfun_finalize();
DLL_PUBLIC int cyfun_double_me(int me);
DLL_PUBLIC void cyfun_print_me(int me);
So there are the necessary init/finalize functions, and the symbols are exported via DLL_PUBLIC (which needs to be done, see this SO-post) so they can be used outside of the dll.
The implementation follows in the cyfun_dll.c file:
//cyfun_dll.c
#define BUILDING_DLL
#include "cyfun_dll.h"

#define PY_SSIZE_T_CLEAN
#include <Python.h>
#include "cyfun.h"

DLL_PUBLIC int cyfun_init(){
    int status=PyImport_AppendInittab("cyfun", PyInit_cyfun);
    if(status==-1){
        return -1;//error
    }
    Py_Initialize();
    PyObject *module = PyImport_ImportModule("cyfun");
    if(module==NULL){
        Py_Finalize();
        return -1;//error
    }
    return 0;
}

DLL_PUBLIC void cyfun_finalize(){
    Py_Finalize();
}

DLL_PUBLIC int cyfun_double_me(int me){
    return double_me(me);
}

DLL_PUBLIC void cyfun_print_me(int me){
    print_me(me);
}
Noteworthy details:
we define BUILDING_DLL so DLL_PUBLIC becomes __declspec(dllexport).
we use cyfun.h generated by cython from cyfun.pyx.
cyfun_init initializes the Python interpreter and imports the built-in module cyfun. The code is somewhat involved because, since Cython 0.29, PEP 489 multi-phase initialization is the default. More information can be found in this SO-post. If the Python interpreter isn't initialized or the module cyfun isn't imported, the chances are high that calling the functionality from cyfun.h will end in a segmentation fault.
cyfun_double_me just wraps double_me so it becomes visible outside of the dll.
Now we can build the dll!
:: set up tool chain
call "<path_to_vcvarsall>\vcvarsall.bat" x64
:: build cyfun.c generated by cython
cl /Tccyfun.c /Focyfun.obj /c <other_coptions> -I<path_to_python_include>
:: build dll-wrapper
cl /Tccyfun_dll.c /Focyfun_dll.obj /c <other_coptions> -I<path_to_python_include>
:: link both obj-files into a dll
link cyfun.obj cyfun_dll.obj /OUT:cyfun.dll /IMPLIB:cyfun.lib /DLL <other_loptions> -L<path_to_python_dll>
The dll is now built, but the following details are noteworthy:
<other_coptions> and <other_loptions> can vary from installation to installation. An easy way to see them is to run cythonize some_file.pyx and inspect the log.
we don't need to pass python-dll, because it will be linked automatically, but we need to set the right library-path.
we have the dependency on the python-dll, so later on it must be somewhere where it can be found.
Where you go from here depends on your task; here we test our dll with a simple main:
//test.c
#include "cyfun_dll.h"

int main(){
    if(0!=cyfun_init()){
        return -1;
    }
    cyfun_print_me(cyfun_double_me(2));
    cyfun_finalize();
    return 0;
}
which can be built via
...
:: build main-program
cl /Tctest.c /Focytest.obj /c <other_coptions> -I<path_to_python_include>
:: link the exe
link test.obj cyfun.lib /OUT:test_prog.exe <other_loptions> -L<path_to_python_dll>
And now calling test_prog.exe leads to the expected output "I'm 4".
Depending on your installation, the following things must be considered:
test_prog.exe depends on pythonX.Y.dll which should be somewhere in the path so it can be found (the easiest way is to copy it next to the exe)
The embedded Python interpreter needs an installation, see this and/or this SO-post.
IIRC, it is not a great idea to initialize, finalize and then initialize the Python interpreter again (that might work for some scenarios, but not all; see for example this) - the interpreter should be initialized only once and stay alive until the program ends.
Thus, it may make sense to put initialization/clean-up code into DllMain (and make cyfun_init() and cyfun_finalize() private), e.g.
BOOL WINAPI DllMain(
    HINSTANCE hinstDLL,  // handle to DLL module
    DWORD fdwReason,     // reason for calling function
    LPVOID lpReserved )  // reserved
{
    // Perform actions based on the reason for calling.
    switch( fdwReason )
    {
        case DLL_PROCESS_ATTACH:
            return cyfun_init()==0;

        case DLL_PROCESS_DETACH:
            cyfun_finalize();
            break;

        case DLL_THREAD_ATTACH:
            // Do thread-specific initialization.
            break;

        case DLL_THREAD_DETACH:
            // Do thread-specific cleanup.
            break;
    }
    return TRUE;
}
If your C/C++ program already has an initialized Python interpreter, it would make sense to offer a function which only imports the module cyfun and doesn't initialize the Python interpreter. In this case I would define CYTHON_PEP489_MULTI_PHASE_INIT=0, because PyImport_AppendInittab must be called before Py_Initialize, which might already be too late when the dll is loaded.
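A rough sketch of such an import-only entry point, assuming single-phase initialization (CYTHON_PEP489_MULTI_PHASE_INIT=0) so that the generated PyInit_cyfun can simply be called once the host's interpreter is already running (the function name cyfun_import is made up for this example):

    // Import-only variant: the host program has already called Py_Initialize().
    DLL_PUBLIC int cyfun_import(){
        PyObject *module = PyInit_cyfun();  // declared in cyfun.h; with single-phase init it returns the module
        if(module==NULL){
            PyErr_Print();
            return -1;//error
        }
        return 0;
    }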
I'd imagine it'd be difficult to call it directly, with Cython depending so much on the Python runtime.
Your best bet is to embed a Python interpreter directly inside your app, e.g. as described in this answer, and call your Cython code from the interpreter. That's what I would do.

Boost - Importing a Qt-created C++ Python module in Python

I have the following code in a Qt Quick application together with Boost.
In this cpp file there is a custom module created using BOOST_PYTHON_MODULE(hello). The main goal is to be able to import hello in Python and call the methods of the hello struct. My Python script is very simple, as I just want to see no errors when importing hello.
import hello
print("Import was successful!")
Most of the code below is copied from a different Stack Overflow question, but not entirely, so I had to repost the relevant parts.
Main.cpp
#include <cstdlib>   // setenv, atoi
#include <iostream>  // cerr, cout, endl
#include <string>
#include <boost/python.hpp>
#include <QGuiApplication>
#include <QQmlApplicationEngine>

struct World
{
    void set(std::string msg) { this->msg = msg; }
    std::string greet() { return msg; }
    std::string msg;
};

//---------------------------------------------------------------------------------------------------------------------
/// Statically linking a Python extension for embedded Python.
BOOST_PYTHON_MODULE(hello)
{
    namespace python = boost::python;
    python::class_<World>("World")
        .def("greet", &World::greet)
        .def("set", &World::set)
    ;
}

//---------------------------------------------------------------------------------------------------------------------
int main(int argc, char *argv[])
{
    QCoreApplication::setAttribute(Qt::AA_EnableHighDpiScaling);
    QGuiApplication app(argc, argv);

    namespace python = boost::python;
    try
    {
        int uploaded = PyImport_AppendInittab("hello", &PyInit_hello);

        // This executes the else part
        if(uploaded == -1)
            std::cout << "Module table was not extended: " << uploaded << std::endl;
        else
            std::cout << "Module table was extended" << std::endl;

        Py_Initialize();
    }
    catch (...)
    {
        PyErr_Print();
        return 1;
    }

    return app.exec();
}
Finally, I run my Qt application, and return app.exec(); keeps it running while I try to run my Python script, as mentioned above, from the terminal. The Python script is in the same directory as the currently running application; not sure if that makes any difference.
Then the error I get is:
Traceback (most recent call last):
File "test_hilton.py", line 1, in <module>
import hello
ModuleNotFoundError: No module named 'hello'
Not sure what I am missing here. According to the Python API documentation:
PyImport_AppendInittab - Add a single module to the existing table of built-in modules. This is a convenience wrapper around PyImport_ExtendInittab(), returning -1 if the table could not be extended. The new module can be imported by the name name, and uses the function initfunc as the initialization function called on the first attempted import.
And the if-else part inside the try-catch block in main shows that the hello module is being added to the table. I am out of ideas on what to do and have looked in different places, but I am still stuck on this part of the problem.
Since the hello module is defined inside that Qt program, it is available only within that program. Executing the program doesn't make it available to an external Python interpreter, which expects to find hello.py or hello.so (the file extension may vary depending on the operating system) when it executes import hello.
You need to build a separate Python extension module instead; this answer might help.
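A minimal sketch of such a standalone module, reusing the World struct from the question: put only the binding code into its own translation unit and build it as its own shared library named hello.pyd (Windows) or hello.so, with no Qt and no PyImport_AppendInittab involved:

    #include <boost/python.hpp>
    #include <string>

    struct World
    {
        void set(std::string msg) { this->msg = msg; }
        std::string greet() { return msg; }
        std::string msg;
    };

    // Compiled into its own shared library, this becomes an importable 'hello' module.
    BOOST_PYTHON_MODULE(hello)
    {
        namespace python = boost::python;
        python::class_<World>("World")
            .def("greet", &World::greet)
            .def("set", &World::set);
    }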

python import .dll compiled by .net

I used JScript.NET to create a .dll file.
As a test, I can successfully call the function hello() from another JScript.NET program compiled as an .exe.
BUT my question is:
How can I call the function from Python?
The dll can be successfully loaded using ctypes.windll.LoadLibrary("power.dll"); however, the name of the function can't be found...
I have done some tests on my dll file.
I used "dumpbin /all" to check my dll, but I can't find any usable function names. It is odd...
The code of my dll file in JScript.NET:
""""""""""""""""""""""
import System;
import System.Console;
import System.IO;
package power{
public class testp {
function hello (){
var time_1 = DateTime.Now;
for (var i =0; i<10000; i++){
Console.WriteLine ("hello world!");
};
var time_2 = DateTime.Now;
Console.WriteLine (time_2-time_1);
};
};
};
""""""""""""""""""""""
Yes, you can consume any DLL in Python if you make the code COM-exposed. I only skimmed through it, but this tutorial seems to give a good example of how to do it:
http://www.codeproject.com/Articles/73880/Using-COM-Objects-in-Scripting-Languages-Part-2-Py

Python - C embedded Segmentation fault

I am facing a problem similar to Py_initialize / Py_Finalize not working twice with numpy. The basic code in C is:
Py_Initialize();
import_array();
//Call a python function which imports numpy as a module
//Py_Finalize()
The program runs in a loop and gives a segfault if the Python code has numpy as one of its imported modules. If I remove numpy, it works fine.
As a temporary workaround I tried not calling Py_Finalize(), but that causes huge memory leaks [observed as the memory usage in top keeps increasing]. I tried, but did not understand, the suggestion in the link I posted. Can someone please suggest the best way to finalize the call while having imports such as numpy?
Thanks
santhosh.
I recently faced a very similar issue and developed a workaround that works for my purposes, so I thought I would write it here in the hope it might help others.
The problem
I work with a postprocessing pipeline for which I can write my own functor to work on data passing through the pipeline, and I wanted to be able to use Python scripts for some of the operations.
The problem is that the only thing I can control is the functor itself, which gets instantiated and destroyed at times beyond my control. Furthermore, even if I do not call Py_Finalize, the pipeline sometimes crashes once I pass another dataset through it.
The solution in a Nutshell
For those who don't want to read the whole story and get straight to the point, here's the gist of my solution:
The main idea behind my workaround is not to link against the Python library, but instead load it dynamically using dlopen and then get all the addresses of the required Python functions using dlsym. Once that's done, one can call Py_Initialize() followed by whatever you want to do with Python functions followed by a call to Py_Finalize() once you're done. Then, one can simply unload the Python library. The next time you need to use Python functions, simply repeat the steps above and Bob's your uncle.
However, if you are importing NumPy at any point between Py_Initialize and Py_Finalize, you will also need to look for all the currently loaded libraries in your program and manually unload those using dlclose.
Detailed workaround
Loading instead of linking Python
The main idea as I mentioned above is not to link against the Python library. Instead, what we will do is load the Python library dynamically using dlopen():
#include <dlfcn.h>
...
void* pHandle = dlopen("/path/to/library/libpython2.7.so", RTLD_NOW | RTLD_GLOBAL);
The code above loads the Python shared library and returns a handle to it (the return type is an obscure pointer type, thus the void*). The second argument (RTLD_NOW | RTLD_GLOBAL) is there to make sure that the symbols are properly imported into the current application's scope.
Once we have a pointer to the handle of the loaded library, we can search that library for the functions it exports using the dlsym function:
#include <dlfcn.h>
...
// Typedef named 'void_func_t' for a pointer to a function taking
// no arguments and returning nothing
typedef void (*void_func_t)(void);

void_func_t MyPy_Initialize = (void_func_t)dlsym(pHandle, "Py_Initialize");
The dlsym function takes two parameters: a pointer to the handle of the library that we obtained previously and the name of the function we are looking for (in this case, Py_Initialize). Once we have the address of the function we want, we can create a function pointer and initialize it to that address. To actually call the Py_Initialize function, one would then simply write:
MyPy_Initialize();
For all the other functions provided by the Python C-API, one can just add calls to dlsym, initialize function pointers to its return value, and then use those function pointers instead of the Python functions. One simply has to know the parameter and return types of the Python function in order to create the correct kind of function pointer.
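For example, a pointer to PyRun_SimpleString (which takes a const char* and returns an int) might be set up like this; the MyPyRun_SimpleString name simply follows the naming convention used later in this answer:

    // Function pointer type matching: int PyRun_SimpleString(const char *command);
    typedef int (*simple_string_func_t)(const char*);

    simple_string_func_t MyPyRun_SimpleString =
        (simple_string_func_t)dlsym(pHandle, "PyRun_SimpleString");

    MyPyRun_SimpleString("import numpy");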
Once we are finished with the Python functions and have called Py_Finalize (using a procedure similar to the one for Py_Initialize), one can unload the Python dynamic library in the following way:
dlclose(pHandle);
pHandle = NULL;
Manually unloading NumPy libraries
Unfortunately, this does not solve the segmentation fault problems that occur when importing NumPy. The problem comes from the fact that NumPy also loads some libraries using dlopen (or something equivalent), and those do not get unloaded when you call Py_Finalize. Indeed, if you list all the loaded libraries within your program, you will notice that after closing the Python environment with Py_Finalize, followed by a call to dlclose, some NumPy libraries remain loaded in memory.
The second part of the solution requires listing all the Python libraries that remain in memory after the call to dlclose(pHandle);. Then, for each of those libraries, grab a handle and call dlclose on it. After that, they should get unloaded automatically by the operating system.
Fortunately, there are functions for this under both Windows and Linux (sorry macOS, I couldn't find anything that would work in your case...):
- Linux: dl_iterate_phdr
- Windows: EnumProcessModules in conjunction with OpenProcess and GetModuleFileNameEx
Linux
This is rather straightforward once you read the documentation for dl_iterate_phdr:
#include <link.h>
#include <string>
#include <vector>

// global variables are evil!!! but this is just for demonstration purposes...
std::vector<std::string> loaded_libraries;

// callback function that gets called for every loaded library that
// dl_iterate_phdr finds
int dl_list_callback(struct dl_phdr_info *info, size_t, void *)
{
    loaded_libraries.push_back(info->dlpi_name);
    return 0;
}

int main()
{
    ...
    loaded_libraries.clear();
    dl_iterate_phdr(dl_list_callback, NULL);
    // loaded_libraries now contains a list of all dynamic libraries loaded
    // in your program
    ....
}
Basically, the function dl_iterate_phdr cycles through all the loaded libraries (in the reverse order they were loaded) until either the callback returns something other than 0 or it reaches the end of the list. To save the list, the callback simply adds each element to a global std::vector (one should obviously avoid global variables and use a class for example).
Windows
Under Windows, things get a little more complicated, but still manageable:
#include <windows.h>
#include <psapi.h>

std::vector<std::string> list_loaded_libraries()
{
    std::vector<std::string> m_asDllList;
    HANDLE hProcess(OpenProcess(PROCESS_QUERY_INFORMATION
                                | PROCESS_VM_READ,
                                FALSE, GetCurrentProcessId()));
    if (hProcess) {
        HMODULE hMods[1024];
        DWORD cbNeeded;
        if (EnumProcessModules(hProcess, hMods, sizeof(hMods), &cbNeeded)) {
            const DWORD SIZE(cbNeeded / sizeof(HMODULE));
            for (DWORD i(0); i < SIZE; ++i) {
                TCHAR szModName[MAX_PATH];
                // Get the full path to the module file.
                if (GetModuleFileNameEx(hProcess,
                                        hMods[i],
                                        szModName,
                                        sizeof(szModName) / sizeof(TCHAR))) {
#ifdef UNICODE
                    std::wstring wStr(szModName);
                    std::string tModuleName(wStr.begin(), wStr.end());
#else
                    std::string tModuleName(szModName);
#endif /* UNICODE */
                    if (tModuleName.substr(tModuleName.size()-3) == "dll") {
                        m_asDllList.push_back(tModuleName);
                    }
                }
            }
        }
        CloseHandle(hProcess);
    }
    return m_asDllList;
}
The code in this case is slightly longer than for the Linux case, but the main idea is the same: list all the loaded libraries and save them into a std::vector. Don't forget to also link your program to the Psapi.lib!
Manual unloading
Now that we can list all the loaded libraries, all you need to do is find among those the ones that come from loading NumPy, grab a handle to them and then call dlclose on that handle. The code below will work on both Windows and Linux, provided that you use the dlfcn-win32 library.
#ifdef WIN32
#  include <windows.h>
#  include <psapi.h>
#  include "dlfcn_win32.h"
#else
#  include <dlfcn.h>
#  include <link.h> // for dl_iterate_phdr
#endif /* WIN32 */

#include <string>
#include <vector>

// Function that lists all loaded libraries (not implemented here)
std::vector<std::string> list_loaded_libraries();

int main()
{
    // do some preprocessing stuff...

    // store the list of loaded libraries now;
    // any libraries that get added to the list from now on must be Python
    // libraries
    std::vector<std::string> loaded_libraries(list_loaded_libraries());
    std::size_t start_idx(loaded_libraries.size());

    void* pHandle = dlopen("/path/to/library/libpython2.7.so", RTLD_NOW | RTLD_GLOBAL);

    // Not implemented here: get the addresses of the Python functions you need

    MyPy_Initialize();                    // Needs to be defined somewhere above!
    MyPyRun_SimpleString("import numpy"); // Needs to be defined somewhere above!
    // ...
    MyPyFinalize();                       // Needs to be defined somewhere above!

    // Now list the loaded libraries again and start manually unloading them,
    // starting from the end
    loaded_libraries = list_loaded_libraries();

    // NB: this below assumes that start_idx != 0, which should always hold true
    for(std::size_t i(loaded_libraries.size()-1) ; i >= start_idx ; --i) {
        void* pHandle = dlopen(loaded_libraries[i].c_str(),
#ifdef WIN32
                               RTLD_NOW // no support for RTLD_NOLOAD
#else
                               RTLD_NOW|RTLD_NOLOAD
#endif /* WIN32 */
                               );
        if (pHandle) {
            const unsigned int Nmax(50); // Avoid getting stuck in an infinite loop
            for (unsigned int j(0) ; j < Nmax && !dlclose(pHandle) ; ++j);
        }
    }
}
Final words
The examples shown here capture the basic ideas behind my solution, but can certainly be improved to avoid global variables and facilitate ease of use (for example, I wrote a singleton class that handles the automatic initialization of all the function pointers after loading the Python library).
I hope this can be useful to someone in the future.
References
dl_iterate_phdr: https://linux.die.net/man/3/dl_iterate_phdr
PsAPI library: https://msdn.microsoft.com/en-us/library/windows/desktop/ms684894(v=vs.85).aspx
OpenProcess: https://msdn.microsoft.com/en-us/library/windows/desktop/ms684320(v=vs.85).aspx
EnumProcess: https://msdn.microsoft.com/en-us/library/windows/desktop/ms682629(v=vs.85).aspx
GetModuleFileNameEx: https://msdn.microsoft.com/en-us/library/windows/desktop/ms683198(v=vs.85).aspx
dlfcn-win32 library: https://github.com/dlfcn-win32/dlfcn-win32
I'm not quite sure what it is you don't understand about the solution posted in Py_initialize / Py_Finalize not working twice with numpy. The solution posted is quite simple: call Py_Initialize and Py_Finalize only once each time your program executes. Do not call them every time you run the loop.
I assume that your program, when it starts, runs some initialization commands (which are only run once). Call Py_Initialize there. Never call it again. Also, I assume that when your program terminates, it has some code to tear down things, dump log files, etc. Call Py_Finalize there. Py_Initialize and Py_Finalize are not intended to help you manage memory in the Python interpreter. Do not use them for that, as they cause your program to crash. Instead, use Python's own functions to get rid of objects you don't want to keep.
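In other words, the structure would roughly look like the sketch below (a sketch only; import_array() is left as a comment because its exact error-handling form depends on the NumPy headers being used):

    #include <Python.h>

    int main(void)
    {
        Py_Initialize();       // once, at program start-up
        // import_array();     // NumPy C-API init, once, right after Py_Initialize

        for (int i = 0; i < 100; ++i) {
            // ... call the Python function that imports numpy here,
            // reusing the already-initialized interpreter ...
        }

        Py_Finalize();         // once, at program shutdown
        return 0;
    }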
If you really MUST create a new environment every time you run your code, you can use Py_NewInterpreter to create a sub-interpreter and Py_EndInterpreter to destroy that sub-interpreter later. They're documented near the bottom of the Python C API page. This works similarly to having a new interpreter, except that modules are not re-initialized each time a sub-interpreter starts.
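A minimal sketch of that sub-interpreter pattern, assuming the main interpreter is already initialized:

    // Remember the main interpreter's thread state.
    PyThreadState *main_state = PyThreadState_Get();

    // Create and switch to a fresh sub-interpreter.
    PyThreadState *sub = Py_NewInterpreter();
    PyRun_SimpleString("print('hello from a sub-interpreter')");

    // Destroy the sub-interpreter and switch back to the main interpreter.
    Py_EndInterpreter(sub);
    PyThreadState_Swap(main_state);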

Segfault on calling standard windows .dll from python ctypes with wine

I'm trying to call some functions from Kernel32.dll in my Python script running on Linux. As Johannes Weiß pointed out in How to call Wine dll from python on Linux?, I'm loading the kernel32.dll.so library via ctypes.cdll.LoadLibrary() and it loads fine. I can see kernel32 loaded, and it even has a GetLastError() function inside. However, whenever I try to call the function I get a segfault.
import ctypes
kernel32 = ctypes.cdll.LoadLibrary('/usr/lib/i386-linux-gnu/wine/kernel32.dll.so')
print kernel32
# <CDLL '/usr/lib/i386-linux-gnu/wine/kernel32.dll.so', handle 8843c10 at b7412e8c>
print kernel32.GetLastError
# <_FuncPtr object at 0xb740b094>
gle = kernel32.GetLastError
# OK
gle_result = gle()
# fails with
# Segmentation fault (core dumped)
print gle_result
First I was thinking about calling convention differences, but that seems to be okay after all. I ended up testing the simple GetLastError function, which takes no params, but I'm still getting a segmentation fault anyway.
My testing system is Ubuntu 12.10, Python 2.7.3 and wine-1.4.1 (everything is 32bit)
UPD
I proceeded with my testing and found several functions that I can call via ctypes without a segfault, for instance Beep() and GetCurrentThread(); many other functions still give me a segfault. I created a small C application to test the kernel32.dll.so library without Python, but I got essentially the same results.
#include <stdio.h>   // fprintf, printf
#include <stdlib.h>  // exit
#include <dlfcn.h>   // dlopen, dlsym, dlerror, dlclose

int main(int argc, char **argv)
{
    void *lib_handle;
#define LOAD_LIBRARY_AS_DATAFILE 0x00000002
    long (*GetCurrentThread)(void);
    long (*beep)(long,long);
    void (*sleep)(long);
    long (*LoadLibraryExA)(char*, long, long);
    long x;
    char *error;

    lib_handle = dlopen("/usr/local/lib/wine/kernel32.dll.so", RTLD_LAZY);
    if (!lib_handle)
    {
        fprintf(stderr, "%s\n", dlerror());
        exit(1);
    }

    // All the functions are loaded, e.g. sleep != NULL
    GetCurrentThread = dlsym(lib_handle, "GetCurrentThread");
    beep = dlsym(lib_handle, "Beep");
    LoadLibraryExA = dlsym(lib_handle, "LoadLibraryExA");
    sleep = dlsym(lib_handle, "Sleep");
    if ((error = dlerror()) != NULL)
    {
        fprintf(stderr, "%s\n", error);
        exit(1);
    }

    // Works
    x = (*GetCurrentThread)();
    printf("Val x=%d\n", x);

    // Works (no beeping, but no segfault either)
    (*beep)(500, 500);

    // Segfault
    (*sleep)(5000);

    // Segfault
    (*LoadLibraryExA)("/home/ubuntu/test.dll", 0, LOAD_LIBRARY_AS_DATAFILE);

    printf("The End\n");
    dlclose(lib_handle);
    return 0;
}
I tried using different calling conventions for the Sleep() function but had no luck with that either. When comparing function declarations/implementations in the Wine sources, they are essentially the same:
Declarations
HANDLE WINAPI GetCurrentThread(void) // http://source.winehq.org/source/dlls/kernel32/thread.c#L573
BOOL WINAPI Beep( DWORD dwFreq, DWORD dwDur ) // http://source.winehq.org/source/dlls/kernel32/console.c#L354
HMODULE WINAPI DECLSPEC_HOTPATCH LoadLibraryExA(LPCSTR libname, HANDLE hfile, DWORD flags) // http://source.winehq.org/source/dlls/kernel32/module.c#L928
VOID WINAPI DECLSPEC_HOTPATCH Sleep( DWORD timeout ) // http://source.winehq.org/source/dlls/kernel32/sync.c#L95
WINAPI is defined to be __stdcall
However, some of them work and some don't. As I understand it, these sources are for the kernel32.dll file, and the kernel32.dll.so file is some kind of proxy that is supposed to provide access to kernel32.dll for Linux code. Probably I need to find the exact sources of the kernel32.dll.so file and take a look at the declarations.
Is there any tool I can use to look inside a .so file and find out which functions and calling conventions are used?
The simplest way to examine a DLL is to use the nm command, i.e.
$ nm kernel32.dll.so | grep GetLastError
7b86aae0 T _GetLastError
As others have pointed out, the default calling convention for Windows C DLLs is stdcall. It has nothing to do with using Python. On the Windows platform, ctypes.windll is available.
However, I am not even sure that what you are trying to do is possible at all. Wine is a full-blown Windows emulator, and it is safe to guess that at the very least you would have to start it with wine_init before loading any other functions. The Windows API probably has some state (set when Windows boots).
The easiest way to continue is probably to install a Windows version of Python under Wine and run your script from there.
