Execute RandomForest model built using Python in Julia - python

Is it possible to export a random forest model built in Python and execute it natively in Julia? Will it give a performance boost?

You can use PyCall to call Python code from Julia, but Julia can't magically make Python code (or any other code) faster. You could call more basic components written in Python and glue the desired end results together in Julia, which should in theory be faster. For example, much of scikit-learn is built on NumPy; you could call the NumPy library from Julia and construct the relevant code to create a random forest yourself, which may be faster because Julia can compile the binding code. At that point it would make more sense to just use Julia entirely, though, because most of NumPy's functionality is available in Julia's LinearAlgebra package.
It's a trade-off between how fast you want your code to be and how much work you want to put into optimizing it.
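If the goal is simply to reuse a model trained in Python from Julia, one minimal sketch (assuming scikit-learn and joblib are installed; the file name rf_model.joblib is arbitrary) is to persist the fitted forest and load it later through PyCall. Prediction still runs in the same scikit-learn code, so this alone gives no speedup:

# Python side: train and persist a random forest so it can later be loaded
# from Julia through PyCall. The file name is arbitrary.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
import joblib

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)

joblib.dump(model, "rf_model.joblib")   # later: model = joblib.load("rf_model.joblib")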

Related

Calling a Julia 0.4 function from Python. How can I make it work?

I wrote an optimization function in Julia 0.4 and I want to call it from Python. I'm storing the function in a .jl file in the same working directory as the Python code. The data to be transferred is numeric, and I'm thinking of using NumPy arrays on the Python side and Julia arrays on the Julia side. Is there a tutorial on how to make this work?
Check out the pyjulia module. It allows you to make calls to Julia from within Python.
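A minimal sketch of that (assuming pyjulia is installed and importable as the julia package; myopt.jl and my_optimize are placeholder names for your own file and function):

# Python side: call a Julia function defined in myopt.jl via pyjulia.
import numpy as np
from julia import Main

Main.include("myopt.jl")          # load the .jl file from the working directory
x0 = np.array([1.0, 2.0, 3.0])    # NumPy arrays are passed across as Julia arrays
result = Main.my_optimize(x0)     # call the Julia function by name
print(result)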
In my experience, calling Julia through the Python pyjulia package is difficult and not a robust solution beyond HelloWorld usage.
1) pyjulia is very neglected.
There is practically no documentation aside from the source code. For example, the only old tutorials I've found still use julia.run(), which was replaced by julia.eval(). pyjulia is not registered on PyPI, the Python Package Index. Many old issues have few or no responses. It is also buggy, particularly with heavy memory usage, and you can run into mysterious segfaults, possibly from garbage-collector conflicts...
2) You should limit pyjulia use to Julia functions that return simple types.
Julia objects imported into Python using pyjulia are difficult to use. pyjulia doesn't import any type constructors. For example, pyjulia seems to convert complex Julia types into plain Python matrices.
3) If you can isolate the software modules and manage the I/O yourself, you should consider the shell facility, particularly in a Linux/Unix environment.
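A minimal sketch of the shell approach (compute.jl, input.csv and output.csv are placeholder names; the Julia script is assumed to read the input file and write its results to the output file):

# Python side: run the Julia script as a subprocess and exchange data through files.
import subprocess

subprocess.run(["julia", "compute.jl", "input.csv", "output.csv"], check=True)

with open("output.csv") as f:
    results = [line.strip().split(",") for line in f]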

Connecting Matlab to Tensorflow

I am using an open-source Matlab toolbox for brain-computer interface (BCI). I want to send the brain imaging data over to Tensorflow for classification and get the results back to Matlab. Is there any way to pass data structures from Matlab to Tensorflow and get the results back into Matlab?
In case someone lands here with a similar question, I'd like to suggest a Matlab package I am currently writing. It's called tensorflow.m and it's available on GitHub. There's no stable release yet, but simple functionality like importing a frozen graph and running inference is already possible (see the examples), which is all you'd need to classify the images in Matlab (only).
The advantage is that you don't need any expensive toolbox or a Python/Tensorflow installation on your machine. Matlab's Python interface also seems rather adventurous, while tensorflow.m is pure Matlab/C.
I'd be glad if the package is of use to someone looking for a similar solution; even more so if you extend or implement something and open a PR.
So far the best way I've found is to run your Python module from Matlab through Matlab's now built-in mechanism for connecting to Python:
I wrote my Python script in a .py file, imported tensorflow there and used it in different functions. You can then return the results to Matlab by calling
results = py.myModule.myFunction(arg1,arg2,...,argN)
More detailed instructions for calling user-defined Python modules from Matlab can be found at the following link:
http://www.mathworks.com/help/matlab/matlab_external/call-user-defined-custom-module.html
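For reference, the Python side of such a call might look like the sketch below (myModule/myFunction match the call above; the saved_model path and the use of a Keras model are assumptions, not part of the original answer):

# myModule.py -- the module that py.myModule.myFunction(...) refers to.
import numpy as np
import tensorflow as tf

_model = tf.keras.models.load_model("saved_model")   # assumed path to a saved Keras model

def myFunction(data):
    x = np.asarray(data, dtype=np.float32)   # MATLAB data arrives as a Python sequence
    if x.ndim == 1:
        x = x[np.newaxis, :]                 # predict expects a batch dimension
    probs = _model.predict(x)
    return probs.tolist()                    # plain lists convert cleanly back to MATLAB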

Rewrite part of a c/c++ program in python

I am trying to develop a plugin interface in Python for a popular 3D program called Blender 3D. The purpose of the plugin interface is to extend some functionality of the program through Python.
Now my problem is that I am trying to assess the performance impact. I will be replacing existing functionality written in C with something written in Python.
I am worried that this might slow down the application, because the functionality I am replacing is executed in real time and has to be really fast. It consists of a plain C function that just splits some polygons into triangles.
The operations I am executing work on pieces of data that usually have no more than 30 or 40 input points, and at most the operations have a complexity of log(n) * n^2.
But I will be creating plenty of Python objects each second, so I am already prepared to implement pooling to recycle the objects.
Now I am mostly worried that the Python code will run 100 times slower than the C code and slow down the application. Should I be worried?
At most I will be doing 8500 computations in a single Python function call. This function will be called each time the application interface is rendered.
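One rough way to check, a sketch assuming a simple fan triangulation of a convex 40-point polygon, is to time the pure-Python version directly:

# Rough timing check: naive fan triangulation of a 40-point convex polygon.
import timeit

def fan_triangulate(points):
    # Split a convex polygon, given as a list of (x, y) tuples, into triangles.
    return [(points[0], points[i], points[i + 1]) for i in range(1, len(points) - 1)]

poly = [(float(i), float(i * i % 7)) for i in range(40)]
t = timeit.timeit(lambda: fan_triangulate(poly), number=10000)
print("%.3f microseconds per call" % (t * 100))   # total for 10000 calls -> per-call in us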
Whether to use C or Python will depend on how your work will be used. Is this a function that the Blender developers would accept into Blender development? Do you expect many Blender users to want to use it? A Python addon lets you develop your work outside of the main Blender development and gives many users access to it, while a patch to the C code that requires users to compile their own version will reduce the number of users.
You could also look at compiling your C code to a binary library that is included with the Python addon and loaded as a Python module. See the two addons by Pyroevil created using cython, molecular and cubesurfer; some pre-built binaries are available on his main website. I'm not sure whether using cython makes the Python module creation easier or not; you could also use cython only as glue between Python and your library.
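If you go the binary-library route without cython, a sketch of loading it from the addon with ctypes from the standard library (libtris.so and triangulate are placeholder names, and the function signature here is purely illustrative):

# Load a pre-built C library from Python with ctypes (an alternative to cython glue).
import ctypes

lib = ctypes.CDLL("./libtris.so")
lib.triangulate.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_int,
                            ctypes.POINTER(ctypes.c_int)]
lib.triangulate.restype = ctypes.c_int

coords = (ctypes.c_double * 8)(0, 0, 1, 0, 1, 1, 0, 1)   # 4 points, x/y interleaved
indices = (ctypes.c_int * 6)()                           # room for two output triangles
n_triangles = lib.triangulate(coords, 4, indices)
print(n_triangles, list(indices))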

c++ gis/spatial library for higher performance computing

So I am working on setting up an agent-based model that runs over a geographic map, Syria in this case. I tried writing it in Python, but the performance is rather slow, even after some optimization tricks. I was thinking that I should shift to writing the model in C++, but I don't know which visualization packages can incorporate maps. I tend to use gnuplot with C++, but I have not been able to find a way to incorporate a GIS basemap in that package. I am not sure if this is possible in VTK or any other package. I would like to find a way to run my model fast in C++ without losing the geographic information. Any suggestions?
Perhaps this project could be useful to you?
http://code.google.com/p/vtk-grass-bridge/
If you can handle your GIS data using GRASS, it seems that project can convert it to something VTK can render, all in one C++ application.
So I actually figured out the answer to this problem and am posting the solution for everyone. The best choice, if you are using Python, is to just use the mayavi and tvtk packages from Enthought. Mayavi is a GUI on top of the C++ VTK libraries, and tvtk is a wrapper that gives Python access to VTK objects. This lets you use Python GIS packages, like pyshp, Shapely, and others, to manipulate GIS objects and then hand them to the robust and fast mayavi for visualization. At the same time, if you want to stick to C++, you can still write your code in C++ using gdal or ogr, etc., and then run your visualization in VTK. This seems a lot easier and more intuitive than trying to run through some other packages like GRASS, QGIS, or ArcGIS.
Here is a good example of this toolset in action.
Example
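A minimal sketch of that toolchain (assuming pyshp and mayavi are installed; districts.shp is a placeholder path, and the shapes are drawn as flat outlines at z = 0):

# Read GIS geometry with pyshp and hand it to mayavi (VTK underneath) for rendering.
import numpy as np
import shapefile              # pyshp
from mayavi import mlab

sf = shapefile.Reader("districts.shp")
for shape in sf.shapes():
    pts = np.array(shape.points)                 # (N, 2) array of x/y coordinates
    mlab.plot3d(pts[:, 0], pts[:, 1], np.zeros(len(pts)), tube_radius=None)
mlab.show()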
What makes you believe that a C++ implementation of your model will be dramatically faster? Before being concerned with how you will visualize the results, I suggest you focus first on what causes your Python implementation to be slow. Is it that your algorithm won't scale? If you have tried optimization tricks, what tricks were those, and why do you believe they did not work?
It all eventually comes down to machine instructions being executed on hardware, whether those instructions started out as Python, C++, or some other language's source code. Unless your Python was running fully interpreted all the time, I don't think you will find that switching languages alone will give you a fundamentally different performance profile. Premature optimization is still something to be avoided.
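A quick way to find out where the time goes, a sketch assuming run_simulation() is the model's entry point (a placeholder name):

# Profile the model before rewriting it in C++ to see where the time is actually spent.
import cProfile
import pstats

cProfile.run("run_simulation()", "model.prof")
stats = pstats.Stats("model.prof")
stats.sort_stats("cumulative").print_stats(20)   # the 20 most expensive call sites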

Creating a shared library in MATLAB

A researcher has created a small simulation in MATLAB and we want to make it accessible to others. My plan is to take the simulation, clean up a few things and turn it into a set of functions. Then I plan to compile it into a C library and use SWIG to create a Python wrapper. At that point, I should be able to call the simulation from a small Django application. At least I hope so.
Do I have the right plan? Are there any serious pitfalls that I'm not aware of at the moment?
One thing to remember is that the MATLAB Compiler does not actually compile the MATLAB code into native machine instructions. It simply wraps it into a stand-alone executable or a library with its own runtime engine that runs it. You would be able to run your code without MATLAB installed and to interface it with other languages, but it will still be interpreted MATLAB code, so there would be no speedup.
MATLAB Coder, on the other hand, is the tool that can generate C code from MATLAB. There are some limitations, though: not all MATLAB functions are supported for code generation, and there are things you cannot do, like changing the type of a variable on the fly.
I remember that I was able to wrap a MATLAB simulation into a DLL file and then call it from a Delphi application. It worked really well.
I'd also try ctypes first.
1. Use the MATLAB compiler to compile the code into C.
2. Compile the C code into a DLL.
3. Use ctypes to load and call code from this DLL.
The hardest step is probably step 1, but if you already know MATLAB and have used the MATLAB compiler, you should not have serious problems with it.
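A sketch of step 3 (simulation.dll and simulate_step are hypothetical names; the real exported symbols and their argument marshalling depend on what the MATLAB compiler generates, and the MATLAB runtime must be installed):

# Step 3: load the compiled DLL with ctypes and call one exported function.
import ctypes

lib = ctypes.CDLL("simulation.dll")            # .so / .dylib on Linux / macOS
lib.simulate_step.argtypes = [ctypes.c_double]
lib.simulate_step.restype = ctypes.c_double

print(lib.simulate_step(0.5))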
Perhaps try ctypes instead of SWIG. If it has been included as a part of Python 2.5, then it must be good :-)
