Linked brushing possible in R / Python?

I saw this just yesterday, but it's for matplotlib which, as far as I know, is Python only. This functionality would be stupendously useful for my work.
Is anything similar available for R? I've looked around and the closest I've seen mentioned is iPlots/Acinonyx, but the websites for those are a few years out of date. Do those packages work reasonably well? I've not seen any examples of their use.
Alternatively, does mpld3/matplotlib/python play well with R? By that I mean, could I load my dataframes in R, use mpld3/matplotlib/python to explore my data, then make final/pretty plots in R?
Full disclosure: I'm a newbie (R is the first programming language that I've really tried to learn since QBASIC as a child...).

While R doesn't seem to have anything quite like this yet, I want to note that mpld3 now has a well-defined JSON layout for figure representations, in some ways similar to Vega (but at a much lower level). I'm not an R/ggplot user, but it seems like the ggvis ggplot-to-vega approach could be rather easily adapted to convert from ggplot to mpld3.

I've forgotten how to do linked plots with brushing in R, but I know the capability is there. I use GGobi for that, however - http://ggobi.org/. It's designed for exploratory data analysis using visualizations, and there are R packages to communicate with it and script it.
There's a pretty good book on GGobi - Interactive and Dynamic Graphics for Data Analysis: With R and GGobi.

The R package ggvis will have similar functionality. It is still in relatively early development, as version 0.1 was just tagged a few days ago. (Although that's also true of mpld3).
To answer your second question, yes they work reasonably well together. The easiest way to do what you suggested would use the R magic function in the IPython notebook.
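As a rough sketch of that workflow in an IPython notebook (assuming the rpy2 package is installed; the `%%R` cell magic and its `-i`/`-o` flags for passing dataframes between R and Python come from rpy2):

```text
%load_ext rpy2.ipython

%%R -o df
# build the dataframe in R, export it to Python as "df"
df <- data.frame(x = rnorm(100), y = rnorm(100))

# ...explore df in Python cells with mpld3/matplotlib...

%%R -i df
# pull df back into R for the final/pretty plot
library(ggplot2)
ggplot(df, aes(x, y)) + geom_point()
```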

The package JGR provides a Java interface for R. From there, you can use the iplots library. In your R terminal, type
install.packages("JGR");
library(JGR);
JGR()
This will open a new window that you can use just like the standard R terminal.
You should now be able to brush using iplots:
library(iplots)
X = matrix(rnorm(900), ncol = 3)
iplot(X[,1], X[,2])
iplot(X[,1], X[,3])
ihist(X[,1])

Also take a look at http://cranvas.org/ - it might be somewhat hard to install (especially for a newbie) but it's well worth the effort.

Related

Loading/Parsing Mathematical Programming System files

In order not to reinvent the wheel, I tried to find some code to parse Mathematical Programming System (MPS) files, but I didn't find any implementations in Python.
Is there any code already available for this?
Update
Reading Mathematical Programming System (MPS) files.
Example MPS file (afiro.mps: link1, link2).
An MPS file contains:
an objective function: one row, n columns
a table of constraints: m rows, n columns
a right-hand-side vector: one column, m rows
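The structure described above can be sketched with a minimal, hand-rolled reader (illustrative only; `parse_mps` is a made-up helper that handles just the ROWS, COLUMNS and RHS sections of a small, well-formed file, none of the format's many extensions):

```python
def parse_mps(text):
    """Crude MPS reader: returns (row senses, column entries, rhs)."""
    rows, cols, rhs = {}, {}, {}
    section = None
    for line in text.splitlines():
        if not line.strip() or line.startswith("*"):
            continue                        # skip blanks and comments
        if not line[0].isspace():           # unindented line = section header
            section = line.split()[0]
            continue
        fields = line.split()
        if section == "ROWS":
            rows[fields[1]] = fields[0]     # row name -> sense (N, L, G, E)
        elif section == "COLUMNS":
            col = cols.setdefault(fields[0], {})
            for r, v in zip(fields[1::2], fields[2::2]):  # (row, value) pairs
                col[r] = float(v)
        elif section == "RHS":
            for r, v in zip(fields[1::2], fields[2::2]):
                rhs[r] = float(v)
    return rows, cols, rhs

sample = """\
NAME          TINY
ROWS
 N  COST
 L  LIM1
COLUMNS
    X1        COST      1.0   LIM1      2.0
    X2        COST      3.0
RHS
    RHS1      LIM1      4.0
ENDATA
"""
rows, cols, rhs = parse_mps(sample)
```

Real-world files (ranges, bounds, free-format variants) need far more care, which is exactly why the packages below exist.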
Many languages have packages for reading and writing these files.
The question does not address the specifics, e.g. pure python vs. c-wrapper-based, nor any license-issues.
But well... two things that worked for me in the past (the former was tested more thoroughly, for my own IPM method on the netlib dataset; the latter looked good too):
Dirty code to use netlib's test cases with scipy's solvers based on the former approach.
cvxopt
MPS-reading is somewhat hidden here and here.
Looks pretty much python-only to me.
One should be careful about modifications cvxopt may already have applied to the problem, at least when asking cvxopt for the matrix form. I don't remember exactly what to expect here (and it also did not matter much in my cases).
Warning: cvxopt is known for a non-trivial installation process on Windows, if you try to install the whole project!
There are also some warnings about which features of MPS files are not supported.
GLPK + swiglpk
Basically SWIG-based bindings for GLPK, available here (probably the most current Python bindings to GLPK). If you use this, use it together with GLPK's manual and some understanding of SWIG.
This one should give you more control over what is read (see the manual)!
You can use the pysmps package in Python. It can be installed with pip install pysmps. Further details can be found at:
https://pypi.org/project/pysmps/
https://github.com/jmaerte/pysmps

Octave/MATLAB vs. Python On An Embedded Computer

I want to perform image processing on a low-end (Atom processor) embedded computer or microcontroller that is running Linux.
I'm trying to decide whether I should write my image processing code in Octave or Python. I feel comfortable in both languages, but is there any reason to use one over the other? Are there big performance differences? I feel as though Octave's syntax may more closely match the domain of image processing than Python's.
Thanks for your input.
Edit: The motivation for this question comes from the fact that I design in Octave and get a working algorithm and then port the algorithm to C++. I am trying to avoid this double work and go from design to deployment easily.
I am a bit surprised that you don't stick with C/C++, since many convenient image processing libraries exist. Even though I have about 20 years of experience with C, 8 years with Matlab, and only one year with Python, I would choose Python together with OpenCV, an extremely optimized computer vision library that supports Intel Performance Primitives. Once you have a working Python solution, it is easy to translate it to C or C++ for additional performance or lower power consumption. I would start with Python and NumPy, use matplotlib for display and prototyping, optimize with OpenCV from within Python, and finally move to C++, testing it against the Python reference implementation.
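To make the prototyping step concrete, here is a minimal NumPy-only sketch of the kind of operation you would prototype first (a 3x3 box blur plus threshold, both invented for illustration; in the workflow above, OpenCV's optimized routines would replace these loops before porting to C++):

```python
import numpy as np

def box_blur(img):
    """3x3 box blur via shifted sums over an edge-padded copy."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
    return out / 9.0

img = np.zeros((8, 8))
img[3:5, 3:5] = 1.0            # bright square on a dark background
mask = box_blur(img) > 0.2     # crude blur-then-threshold segmentation
```

The same few lines translate almost mechanically to Octave, which is part of why either language works for the design phase.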
MATLAB has a code generation feature which could potentially help with your workflow. Have a look at this example. My understanding is that the Atom is x86 architecture, so the generated code ought to work on it too. You could consider getting a Trial version and giving the above example a spin on your specific target to evaluate performance and inspect the generated C code.

GNU Radio--raw data from uhd_fft.py

I would like to do spectrum sensing with GNU Radio. Is there a good way to get the raw output from uhd_fft.py (the value for each frequency)? I would like to do this programmatically (with code), rather than through a GUI.
I have tried doing spectrum sensing with usrp_spectrum_sense.py, and this script has questionable accuracy and seems to be much slower than uhd_fft.py.
Thanks!
You should really direct your question to the GNURadio mailing list. This is a very application-specific question, which isn't necessarily appropriate for SO.
https://lists.gnu.org/mailman/listinfo/discuss-gnuradio
To answer your question a bit: uhd_fft.py is just a Python program that performs a transform on your data. You can do the same thing in C++ with GNURadio. Just edit the Python code to dump the bin data instead of plotting it, and you should get what you want.
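The transform in question is essentially a windowed FFT of a block of complex baseband samples. A NumPy-only sketch of that step (the samples here are synthetic; in GNU Radio they would come from the USRP source block, and the sample rate is an assumption):

```python
import numpy as np

fs = 1e6                                    # assumed sample rate, Hz
n = 1024                                    # FFT size
t = np.arange(n) / fs
tone = np.exp(2j * np.pi * 100e3 * t)       # test tone at +100 kHz

# Window, transform, and shift DC to the center, as a spectrum display does
spectrum = np.fft.fftshift(np.fft.fft(tone * np.hanning(n)))
freqs = np.fft.fftshift(np.fft.fftfreq(n, 1 / fs))
power_db = 20 * np.log10(np.abs(spectrum) + 1e-12)

peak_freq = freqs[np.argmax(power_db)]      # frequency bin with most power
```

Dumping `freqs` and `power_db` per block gives you exactly the "value for each frequency" the GUI plots.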

Can I program Nvidia's CUDA using only Python or do I have to learn C?

I guess the question speaks for itself. I'm interested in doing some serious computations but am not a programmer by trade. I can string enough python together to get done what I want. But can I write a program in python and have the GPU execute it using CUDA? Or do I have to use some mix of python and C?
The examples on Klöckner's PyCUDA webpage had a mix of both Python and C, so I'm not sure what the answer is.
If anyone wants to chime in about Opencl, feel free. I heard about this CUDA business only a couple of weeks ago and didn't know you could use your video cards like this.
You should take a look at CUDAmat and Theano. Both are approaches to writing code that executes on the GPU without really having to know much about GPU programming.
I believe that, with PyCUDA, your computational kernels will always have to be written as "CUDA C Code". PyCUDA takes charge of a lot of otherwise-tedious book-keeping, but does not build computational CUDA kernels from Python code.
pyopencl offers an interesting alternative to PyCUDA. It is described as a "sister project" to PyCUDA. It is a complete wrapper around OpenCL's API.
As far as I understand, OpenCL has the advantage of running on GPUs beyond Nvidia's.
Great answers already, but another option is Clyther. It will let you write OpenCL programs without even using C, by compiling a subset of Python into OpenCL kernels.
A promising library is Copperhead (alternative link); you just need to decorate the function that you want to run on the GPU (and you can then opt it in or out to see whether the CPU or GPU is faster for that function).
There is a good, basic set of math constructs with compute kernels already written, accessible through PyCUDA's cumath module. If you want to do more involved or custom work, you will have to write a touch of C in the kernel definition, but the nice thing about PyCUDA is that it does the heavy C lifting for you; it does a lot of metaprogramming on the back end, so you don't have to worry about serious C programming, just the little pieces. One of the examples given is a map/reduce kernel to calculate the dot product:
import numpy as np
import pycuda.autoinit                      # initializes the CUDA context
from pycuda.reduction import ReductionKernel

dot_krnl = ReductionKernel(np.float32, neutral="0",
                           reduce_expr="a+b",
                           map_expr="x[i]*y[i]",
                           arguments="float *x, float *y")
The little snippets of code inside those arguments are lines of C, but it actually writes the program for you. The ReductionKernel is a custom kernel type for map/reduce-ish functions, and there are other types as well. The examples section of the official PyCUDA documentation goes into more detail.
Good luck!
The scikits.cuda package could be a better option; it doesn't require any low-level knowledge or C code for operations that can be represented as NumPy array manipulations.
I was wondering the same thing and ran a few searches. I found the article linked below, which seems to answer your question. However, you asked this back in 2014, and the Nvidia article is undated.
https://developer.nvidia.com/how-to-cuda-python
The video goes through the setup, an initial example and, quite importantly, profiling. However, I do not know whether you can implement all of the usual general compute patterns. I would think you can, because as far as I could tell there are no limitations in NumPy.

Are there any matrix math modules compatible with Python 3.x?

When I began this project, I thought it would be easy to get libraries for common stuff like matrix math, so I chose to work in Python 3.1- it being the most recent, updated version of the language. Unfortunately, NumPy is only compatible with 2.5 and 2.6 and seems to be the only game in town! Even other stuff that I turned up like gameobjects seemed to be based on NumPy and therefore incompatible with 3.x as well.
Does anyone out there know of a matrix library that is compatible with 3? I need to be able to do the following: matrix add, subtract, multiply, scalar multiply, inverse, transpose, and determinant. I've been looking all day and all roads seem to lead back to NumPy. I even tried this module: http://www.nightmare.com/squirl/python-ext/misc/matrix.py but it too is for 2.x. Even after converting it using the 2to3 tool, I can't get the yarn module that it refers to (and is probably itself 2.x).
Any help is hugely appreciated.
Given that a large portion of those interested in this sort of development are involved in NumPy, and given their schedule for migrating I think the answer is "no, there is nothing yet".
I would advise treating Python 3.x as "still experimental" and starting with Python 2.6 instead. Make some small effort to write your code in such a way that it won't be too hard to migrate in, say, a year or two, when the Python 3.x series really stabilizes, but don't jump there just yet. Other more general questions have answers that may help you decide.
EDIT: PyEuclid supports matrices and vectors up to 4 dimensions and is designed for geometric operations.
Otherwise, the answer is probably not what you want, but:
use Python 2.x instead, and do use NumPy (which is really good), until NumPy supports Python 3.x
implement your own Matrix class, since you don't require too much functionality and it's a good exercise. It should be less work than spending a day searching the internet.
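A bare-bones version of the suggested Matrix class might look like this (illustrative sketch only; the class name and methods are made up, and the Laplace-expansion determinant is only sensible for small matrices):

```python
class Matrix:
    """Tiny pure-Python matrix: add, multiply, scalar multiply,
    transpose, determinant. No NumPy required."""

    def __init__(self, rows):
        self.rows = [list(r) for r in rows]

    def __add__(self, other):
        return Matrix([[a + b for a, b in zip(r1, r2)]
                       for r1, r2 in zip(self.rows, other.rows)])

    def __mul__(self, other):
        if isinstance(other, Matrix):       # matrix product
            cols = list(zip(*other.rows))
            return Matrix([[sum(a * b for a, b in zip(row, col))
                            for col in cols] for row in self.rows])
        return Matrix([[a * other for a in row] for row in self.rows])

    def transpose(self):
        return Matrix(zip(*self.rows))

    def det(self):
        # Laplace expansion along the first row; O(n!) but fine for small n
        r = self.rows
        if len(r) == 1:
            return r[0][0]
        return sum((-1) ** j * r[0][j] *
                   Matrix([row[:j] + row[j + 1:] for row in r[1:]]).det()
                   for j in range(len(r)))
```

Inversion (the one operation missing here) can be added via Gaussian elimination or the adjugate once the rest is in place.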
