In Jupyter Notebook, when I execute a function from an R package (in my case climatol) from a notebook that uses the R kernel, messages reporting the procedures being carried out are displayed as output. Nothing unusual there.
The code block used is this:
library(maps)
library(mapdata)
library(climatol)
# Apply function (from R kernel)
homogen('Vel',2011,2012,tinc='6 hour',expl=TRUE)
Now, using the Python kernel from another notebook, when I call the same function through rpy2 with the same parameters, I don't get the messages shown above; the progress output simply does not appear.
This time, the code block used is this:
from rpy2.robjects import r
from rpy2.robjects.packages import importr
importr('maps')
importr('mapdata')
importr('climatol')
# Apply function ( from Python kernel)
r["homogen"]("Vel",2011,2012,tinc="6 hour",expl=r['as.logical']("T"))
I ran the same Python code from Sublime Text, and in that case the messages are displayed.
The messages are also displayed when running the code from the Windows console, which leads me to think that the problem lies with Jupyter. That being said, how can I get those messages in Jupyter?
I'm using Python 3.7, and the rpy2 version is 2.9.4.
Thanks for the help.
rpy2 is not fully supported on Windows. The callbacks used to define how R output is handled are likely not working, which results in the issue you are observing.
If this improves, it will be in more recent rpy2 versions; the latest rpy2 release is 3.3.6, for example.
Otherwise, for better compatibility on Windows, consider running rpy2/Jupyter in Docker or, if it works for you, under WSL (Windows Subsystem for Linux).
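If you want to experiment, the way R console output reaches Python in rpy2 is through replaceable callbacks. A minimal sketch for the rpy2 3.x series (the module and attribute names below are from 3.x, and this may still not help if the underlying callbacks are broken on Windows):
import rpy2.rinterface_lib.callbacks as callbacks

def write_to_stdout(msg: str) -> None:
    # Forward R's console text to Python's stdout, which Jupyter captures.
    print(msg, end='')

callbacks.consolewrite_print = write_to_stdout
callbacks.consolewrite_warnerror = write_to_stdout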
This is partially related to an already closed question about the keras package and R in Google Colab, but I have some specific doubts about such a workflow.
It is known that we can use R in Google Colab, and the use of Google Colab's GPU and TPU is indeed interesting.
And although the documentation says we need to run install_keras() in R if we want to use the GPU with keras, in Google Colab it works without this step; no separate Python installation is required either.
But deep learning processes are time consuming, and running all the code in just one notebook has its limitations. Splitting it across several notebooks, saving the resulting objects and sharing them for re-use in the next notebook, would be attractive.
This is all the more desirable because the environment is ephemeral. The solution would be to mount Google Drive in order to use its data and save some partial outputs there. But mounting Google Drive appears to be restricted to Python notebooks; yes, there are discussions proposing solutions, such as here and here, but I was not able to implement them.
So I am really curious how R keras users (and other R users) deal with this issue when using Google Colab.
If we stick with the idea of a workflow spread over more than one notebook, a possibly related question is this one (without an answer).
So I have tried another alternative: using a Python notebook and running R in specific cells inside it with rpy2, as indicated here and in other discussions I mentioned before. (Yes, one could ask why not just code in Python; let's ignore that and keep R.) The setup looks roughly like the sketch below.
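That is, in the Python notebook I first load the rpy2 IPython extension, and individual cells are then marked with %%R, as in the cells shown further down:
%load_ext rpy2.ipython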
But it happens that R's keras is an API for Python's keras and needs Python to run. And I do not know why, but when I try to run any keras function, even a simple
%%R
imdb<-dataset_imdb()
I get:
R[write to console]: Error: Error 1 occurred creating conda
environment /root/.local/share/r-miniconda/envs/r-reticulate
I also see one discussion saying the R kernel does not see Colab's Python, like here, but I know that is not true, because R's keras works in the R kernel, and if I run the same py_config() there, I can see the Python versions.
But the point is: why, in this Python notebook using rpy2, can we not see that Python?
If we run the notebook with the R kernel, all the packages requiring Python work well without any intervention, which is strange.
I see discussions of how to install conda, like here, but I believe this should not be necessary; maybe it is related to rpy2.
I have tried some alternatives to check for existing Python versions inside the R cells (called with %%R), and I believe the R invoked this way is not able to see Python:
%%R
library(reticulate)
py_config()
It returns the same error:
R[write to console]: Error: Error 1 occurred creating conda
environment /root/.local/share/r-miniconda/envs/r-reticulate
Error: Error 1 occurred creating conda environment
/root/.local/share/r-miniconda/envs/r-reticulate
So, my major questions:
How can I effectively use R's keras (and other R packages that use Python in the background) inside a Google Colab notebook with the Python kernel?
What am I missing here with rpy2?
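For what it's worth, one workaround I intend to try (untested; I am assuming reticulate will pick up an environment variable set from the Python side of the same process) is to point reticulate at the notebook's own interpreter before any R code loads keras, so it does not try to create a conda environment:
import os, sys

# Hypothetical workaround: make reticulate use Colab's own Python instead of
# trying to build /root/.local/share/r-miniconda. Must run before any %%R cell
# loads reticulate or keras.
os.environ["RETICULATE_PYTHON"] = sys.executable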
I am working with R in Python using rpy2 on Windows 7.
I need to open some rasters as RasterLayer objects using the raster() function from the raster package. I managed to install the package, but not to use its function.
I install the packages that I need (rgdal, sp, raster, lidR, io) using
utils.install_packages(StrVector(names_to_install))
names_to_install is a list of the packages that are still not installed. This works fine.
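(For completeness, a sketch of the full pattern with the imports included; it follows the standard rpy2 documentation example, and the isinstalled() check is just one way to build names_to_install.)
import rpy2.robjects.packages as rpackages
from rpy2.robjects.vectors import StrVector

utils = rpackages.importr('utils')
utils.chooseCRANmirror(ind=1)  # select a CRAN mirror

packages = ('rgdal', 'sp', 'raster', 'lidR', 'io')
names_to_install = [p for p in packages if not rpackages.isinstalled(p)]
if names_to_install:
    utils.install_packages(StrVector(names_to_install))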
I know how to try the "basic" functions, like sum, and it works:
import rpy2.robjects as robjects
function_sum = robjects.r['sum']
But the same doesn't seem to work with the raster function from the raster package:
function_raster = robjects.r['raster']
since I get the error:
LookupError: 'raster' not found
I also tried the following:
raster_package = importr('raster')
with the intention of then being able to run the following and load my raster file:
raster_package.raster(my_raster_file)
but the first line (importr('raster')) causes Python to crash, and I get the error:
Process finished with exit code -1073741819 (0xC0000005)
This doesn't happen with other loaded packages like rgdal, but with the raster and lidR packages I get the error.
I looked up this error; it seems to be an access violation, but I don't know what I can do about it or why it only happens with certain packages.
I expect to be able to call the raster function from the package raster.
Edit
I tried it on a computer with Windows 10 and the error no longer appears when running
raster_package = importr('raster')
It would still be nice to know what the problem is with Windows 7 and whether there is any solution.
rpy2 does not currently have Windows support. This is not a final situation; most of what is needed is likely contributions to finish this work: https://github.com/rpy2/rpy2/blob/master/rpy2/rinterface_lib/embedded_mswin.py
I am a big fan of RStudio Cloud and would like to integrate R and Python by using the reticulate package.
It looks like RStudio Cloud is using Python 2.7 (no problem with that). When I try to write Python code in an R Markdown document, nothing gets run.
---
title: "reticulate"
output: html_document
---
```{r setup, include=FALSE}
knitr::opts_chunk$set(echo = TRUE)
```
```{r}
library(reticulate)
py_config()
```
```{python}
import pandas
x = 4
```
Python code does not get run.
I am also finding that if I want to install Python packages in an R script using reticulate, I have to create a virtual environment. What is the reason behind that?
library(reticulate)
virtualenv_create("r-reticulate")
virtualenv_install("r-reticulate", "scipy")
virtualenv_install("r-reticulate", "pandas")
If I use conda_install, I get an error message.
conda_create("r-reticulate")
Error: Unable to find conda binary. Is Anaconda installed?
conda_install("r-reticulate", "scipy")
Error: Unable to find conda binary. Is Anaconda installed?
The goal is to have Python working in RStudio Cloud from R Markdown. Right now I cannot install packages or execute code.
I just succeeded in getting conda installed in RStudio Cloud after receiving the same error message as you¹, so I thought I'd share how I got this working.
I created two scripts:
1) setup.R, to install miniconda (I think that's the step you're missing, and why conda didn't work for you) and then restart the session so the installation becomes accessible;
2) nested_reticulate_setup.R, to separately store the commands for setting up conda, with the running of this script passed as a command to the call to restartSession (because otherwise the commands are triggered before R has restarted and they fail; Sys.sleep() didn't seem to work, but this method did).
setup.R
setwd("/cloud/project") # to ensure students get required resources
install.packages("rstudioapi") # to restart R session w/ installations
install.packages("reticulate") # for python
reticulate::install_miniconda("miniconda") # for python
# Restart again to make sure all system things are loaded
# and then create a new Conda environment
rstudioapi::restartSession(command="source('nested_reticulate_setup.R')")
nested_reticulate_setup.R
reticulate::conda_create("r-reticulate")
reticulate::conda_install("r-reticulate", "scipy")
Sys.setenv(RETICULATE_PYTHON="/cloud/project/miniconda/envs/r-reticulate/bin/python")
reticulate::use_condaenv("r-reticulate")
scipy <- reticulate::import("scipy")
Then if you make a call to scipy, e.g. scipy$`__version__`, I believe it should work for you without the error you observed.
I couldn't find a solution to this issue elsewhere, so I thought it worth responding to this old post in case it helps somebody some day. I am sure there are other ways of approaching this.
¹ Perhaps for a different reason; I'll explain later in the post...
Image gallery: http://imgur.com/a/qZkTW#qGj7I0H
I just installed the new version of Canopy 1.3 from Enthought. I opened up IPython and imported Mayavi's mlab without issue. I then plotted a 3D sphere using the following:
import mayavi
from mayavi import mlab
mlab.points3d(1,1,1)
mlab.show()
And I get what I would expect (see figure #2 in the gallery). I can then open the scene editor without issue (see figure #1 in the gallery), but when I try to open any other traits editor, I get a weird black background with no text:
scalarscatter editor
This issue affects all editors other than the scene editor. It has been reproduced after uninstalling Canopy per the description on their website, restarting the computer, and reinstalling Canopy. It has persisted despite reinstallation with both the 32- and 64-bit installers, and it also affects mayavi2 when run from the command line. I don't get this error when I open the Canopy app and run everything from inside Canopy, but that is not really a viable option for my current workflow (I want to use IPython notebooks).
The only error I get via stderr seems to be unrelated:
Python[4434:d0f] CoreText performance note: Client called CTFontCreateWithName() using name ".Lucida Grande UI" and got font with PostScript name ".LucidaGrandeUI". For best performance, only use PostScript names when calling this API.
Python[4434:d0f] CoreText performance note: Set a breakpoint on CTFontLogSuboptimalRequest to debug.
I have updated all the Canopy packages using the built-in installer. I'm using Canopy's built-in Python. I never had any similar issues in the past with EPD, only since installing Canopy 1.3 on my computer.
I have searched the internet and cannot find any other complaints about this issue. Please let me know if you have any ideas. I would really like to use the IPython notebook feature rather than opening Canopy.app every time.
Any help would be greatly appreciated!
Several notes:
1) This should do it:
ETS_TOOLKIT=qt4 ipython notebook --pylab qt
(These settings are default within the Canopy app).
2) Be sure that you are starting Canopy User Python from the Terminal. sys.prefix in the terminal should be the same as from within Canopy's (i)Python shell (a quick check is sketched after these notes). For details, see https://support.enthought.com/entries/23646538-Make-Canopy-User-Python-be-your-default-Python
3) FWIW, the IPython notebook is usable directly within Canopy (File / New / IPython Notebook), but admittedly the experience is still not as good as in a regular browser, especially on Mac. By Canopy 1.4 or 1.5 we hope that it will be, so you can have the best of both worlds.
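Regarding note 2, a minimal way to compare the two environments is to run the same check in the Terminal-started Python and in Canopy's shell:
import sys
print(sys.prefix)  # the two outputs should match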
I'm trying to run a standard Mayavi example. I just installed mayavi2 on Ubuntu (Kubuntu 12.04) and this is my first step with Mayavi. Unfortunately, this step is failing.
The examples I wish to run come from here:
http://docs.enthought.com/mayavi/mayavi/auto/examples.html
For example, this one.
The behavior I am seeing is that the plot canvas area is blank (mostly). The popup window is shown and its controls are present and working.
The only errors I am seeing are:
libGL error: failed to load driver: swrast
libGL error: Try again with LIBGL_DEBUG=verbose for more details.
Where would I add LIBGL_DEBUG=verbose?
I'm on Kubuntu 12.04 with:
Python 2.7.3
IPython 1.1.0
wxPython 2.8
vtk 5.8.0-5
setuptools, numpy, scipy - latest versions (just updated)
I am running the examples in IPython (which seems to be the recommended way). I am using this command to start the shell:
ipython --gui=wx --pylab=wx
I also tried running the examples from within an IPython notebook as so:
%run example.py
In all cases the examples fail to display the animation. The window itself is displayed, as are the controls, but the animation canvas is mostly blank, although a flash of the images will sometimes appear.
At least once previously I saw my attempts crash Python. The message was:
The crashed program seems to use third-party or local libraries:
/usr/local/lib/python2.7/dist-packages/traits/ctraits.so
/usr/local/lib/python2.7/dist-packages/tvtk/array_ext.so
However, I am not seeing that crash now.
I found some important clues here:
https://askubuntu.com/questions/283640/libgl-error-failed-to-load-driver-i965
Like that person, I ended up reinstalling my graphics driver and that solved my problem. (The problem wasn't related to Mayavi or Python after all.)
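As for where to add LIBGL_DEBUG=verbose: it only needs to be in the process environment before the GL driver is loaded, so besides prefixing it to the launch command, a rough sketch is to set it at the very top of the example script before Mayavi opens a window:
import os
os.environ["LIBGL_DEBUG"] = "verbose"  # set before Mayavi/VTK create a GL context

from mayavi import mlab

mlab.points3d(1, 1, 1)
mlab.show()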