Is there a way to automatically switch matplotlib backend in Jupyter? - python

One function in my Python package does some interactive matplotlib work.
In Jupyter Notebook I always have to run the magic %matplotlib qt to switch the backend before it works.
That step is obvious to me, but for other people trying to use my package it is not straightforward.
This is what I have so far in my __init__.py:
def run_from_notebook():
    return hasattr(__builtins__, '__IPYTHON__')

if run_from_notebook():
    # this has no effect
    try:
        from IPython import get_ipython
        ipython = get_ipython()
    except ImportError:
        import IPython.ipapi
        ipython = IPython.ipapi.get()
    ipython.magic("matplotlib qt")
I also tried:
if matplotlib.get_backend() != 'Qt5Agg':
    matplotlib.use('Qt5Agg')
but still no effect.
Is there a way to automatically switch backend in Jupyter Notebook when someone imports my package?
And also: is there any reason this is not considered good practice?

It turns out that the problem is with the run_from_notebook function. When run directly in a notebook cell it returns True, but when it is imported from my module it returns False. The question is now rather: how do I detect whether code is run inside Jupyter?
For example, running the following manually

def switch_backend_to_Qt5():
    import matplotlib
    if matplotlib.get_backend() != 'Qt5Agg':
        matplotlib.use('Qt5Agg')

gets the job done.
EDIT:
The following function suits my needs:
import os

def run_from_notebook():
    try:
        __IPYTHON__
        # If it's run inside Spyder we don't need to do anything
        if any('SPYDER' in name for name in os.environ):
            return False
        # else it's probably necessary to switch backend
        return True
    except NameError:
        return False
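For reference, here is a minimal sketch of how the pieces could be wired together in __init__.py, using run_line_magic (the programmatic equivalent of %matplotlib qt). Whether silently switching the user's backend at import time is good practice is debatable, since it changes global state behind their back; some packages expose an explicit helper instead.

import os

def run_from_notebook():
    try:
        __IPYTHON__
    except NameError:
        return False
    # Spyder also defines __IPYTHON__ but manages its own backend
    return not any('SPYDER' in name for name in os.environ)

if run_from_notebook():
    try:
        from IPython import get_ipython
        get_ipython().run_line_magic('matplotlib', 'qt')
    except Exception:
        pass  # fall back silently if the magic cannot be run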

Related

Check if module is running in Jupyter or not

I'm looking for a reliable way to figure out if my module is being loaded/run from within a Jupyter notebook, or more specifically, if ipywidgets is available.
This is not a duplicate of other questions: everything else I've found either has no reliable solution, or (more often) they use the "just try it and fail gently" approach that's common in Python. In my case, I'm trying to write the following logic:
if in_jupyter():
    from tqdm import tqdm_notebook as tqdm
else:
    from tqdm import tqdm
I don't think "try and fail" is an appropriate solution here since I don't want to produce any output yet.
The closest I've found to a solution so far is:
from IPython import get_ipython
get_ipython().config['IPKernelApp']['parent_appname'] == 'ipython-notebook'
but this configuration property is some seemingly empty traitlets.config.loader.LazyConfigValue (.get_value(None) is just an empty string).
You could use the following snippet to figure out if you are in jupyter, ipython or in a terminal:
def type_of_script():
    try:
        ipy_str = str(type(get_ipython()))
        if 'zmqshell' in ipy_str:
            return 'jupyter'
        if 'terminal' in ipy_str:
            return 'ipython'
    except:
        return 'terminal'
You can find more in-depth info at How can I check if code is executed in the IPython notebook?
If you have tqdm >= 4.27.0, you can import from tqdm.auto to handle this:
from tqdm.auto import tqdm
tqdm can then be used as normal. If you are in a notebook, tqdm will be imported from .notebook.
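For example, a minimal check (assuming tqdm >= 4.27.0 is installed):

import time
from tqdm.auto import tqdm

for _ in tqdm(range(5)):
    time.sleep(0.1)  # shows a widget in a notebook (if ipywidgets is available), a text bar otherwise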
You could check whether type(get_ipython()) is from ipykernel, e.g. if
type(get_ipython()).__module__.startswith('ipykernel.')
I'm not sure how stable that is though.
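Combining that ipykernel check with the import logic from the question, a sketch of an in_jupyter() helper could look like this (keeping in mind the caveat above that the module name is an implementation detail):

def in_jupyter():
    try:
        from IPython import get_ipython
    except ImportError:
        return False
    shell = get_ipython()
    # ZMQInteractiveShell (used by notebook kernels) lives in ipykernel;
    # the plain IPython shell lives in IPython.terminal.
    return shell is not None and type(shell).__module__.startswith('ipykernel.')

if in_jupyter():
    from tqdm import tqdm_notebook as tqdm
else:
    from tqdm import tqdm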

Signal handler works in python but not in ipython

I'm attempting to set the numpy print options using a signal handler on the window resize event. Don't want to make the connection until numpy has been imported, and don't want to import numpy automatically at python startup. I've got it almost-working with the code below:
# example.py
import wrapt

@wrapt.when_imported('numpy')
def post_import_hook(numpy):
    import signal
    try:
        from shutil import get_terminal_size
    except ImportError:
        # Python 2
        from shutil_backports import get_terminal_size

    def resize_handler(signum=signal.SIGWINCH, frame=None):
        w, h = get_terminal_size()
        numpy.set_printoptions(linewidth=w)
        print('handled window resize {}'.format(w))

    resize_handler()
    signal.signal(signal.SIGWINCH, resize_handler)
It works in vanilla python REPL (test with python -i example.py and resize the terminal a bit). But it doesn't work in ipython when the same code is added to my startup ipython config, and I don't understand why.
I'm not fixed on this particular approach (that's just what I've tried so far), so I'll phrase the question more generally:
How can numpy correctly fill to the terminal width automatically in ipython?
You can use print(np.arange(200)), for example, to check numpy's line wrapping behaviour.
Inspired by the standard fix for printing large arrays without truncation, I tried setting the line width to infinity. This seems to be working fine both in the REPL and in ipython, so I suggest this workaround:
import numpy
numpy.set_printoptions(linewidth=numpy.inf)
This doesn't explain why your fix doesn't work in ipython, but provided the line above doesn't interfere with anything else, it should make printing immune to resizing.
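A quick way to confirm the workaround took effect is the np.arange(200) check mentioned above:

import numpy as np

np.set_printoptions(linewidth=np.inf)
print(np.arange(200))  # should now print without wrapping at the terminal width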

Matplotlib: check if undefined DISPLAY

Sometimes I run my script via ssh. This answer told me to set up
import matplotlib
matplotlib.use('Agg')  # Must be before importing matplotlib.pyplot or pylab!
import matplotlib.pyplot as plt
when I get the undefined SCREEN error by running the script via ssh. However with that preamble I cannot view the graphs interactively when I run the script on my local machine.
What's the condition to check if the screen is defined? I'd like to do
if SCREEN == None:
    matplotlib.use('Agg')
What is the proper code for that? How can I check this?
It looks like the easiest way to do this is to check the 'DISPLAY' environment variable:
import os

# 'DISPLAY' will be something like ':0'
# on your local machine, and unset (None) otherwise
if os.environ.get('DISPLAY') is None:
    matplotlib.use('Agg')
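Put together with the preamble from the question, a minimal sketch looks like this (the DISPLAY check only tells you something on platforms that use X11, e.g. Linux over ssh):

import os
import matplotlib

# No DISPLAY (e.g. a headless ssh session): fall back to a non-interactive backend.
# This must happen before importing matplotlib.pyplot or pylab.
if os.environ.get('DISPLAY') is None:
    matplotlib.use('Agg')

import matplotlib.pyplot as plt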

Conditional import in a module

I have created a module modA which I am importing in my main program. Depending on what happens in my main program (it has an interactive mode and a batch script mode), I want modA itself to import matplotlib with either the TkAgg backend or the ps backend. Is there a way for my main program to communicate information to modA to tell it how it should import matplotlib?
To clarify the situation:
The main program:
# if we are in interactive mode
#     import modA, which imports matplotlib using the TkAgg backend
# else
#     import modA, which imports matplotlib using the ps backend
Module modA:
# import matplotlib
# matplotlib.use('ps') or matplotlib.use('TkAgg')  (how can I do this?)
Have a function in your module which will determine this.
import matplotlib

def setEnv(env):
    matplotlib.use(env)
Then in your program you can call modA.setEnv('ps') or another backend, depending on an if-else condition.
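For example, a usage sketch (modA is the module from the question; interactive_mode is a placeholder for however the main program decides which mode it is in):

# main.py -- usage sketch
import modA

interactive_mode = True  # placeholder condition
modA.setEnv('TkAgg' if interactive_mode else 'ps')

import matplotlib.pyplot as plt  # import pyplot only after the backend is set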
You do not need a conditional import here (since you are using only one external module), but it is possible to do it:
if condition:
    import matplotlib as mlib
else:
    import modifiedmatplotlib as mlib
For more information about importing modules within function see these:
Python: how to make global imports from a function
Is it possible to import to the global scope from inside a function (Python)?
You can probably detect the way your session is started by evaluating the arguments passed to the command line:
import sys
import matplotlib

if '-i' in sys.argv:
    # program started with an interactive session
    matplotlib.use('TkAgg')
else:
    # batch session
    matplotlib.use('ps')
If not, you can use os.environ to communicate between modules:
In main:
import os

if interactive:
    os.environ['MATPLOTLIB_USE'] = 'TkAgg'
else:
    os.environ['MATPLOTLIB_USE'] = 'ps'
In modA:
import os
import matplotlib
matplotlib.use(os.environ['MATPLOTLIB_USE'])
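A slightly more defensive variant of modA uses a default so that importing modA outside the main program still works; this is a sketch, and the TkAgg fallback is an assumption, not part of the original answer:

# modA.py -- sketch with a fallback default
import os
import matplotlib

# Use the backend requested by the main program, or TkAgg if nothing was set.
matplotlib.use(os.environ.get('MATPLOTLIB_USE', 'TkAgg'))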

Importing an ipynb file from another ipynb file?

Interactive Python (ipython) is simply amazing, especially as you are piecing things together on the fly... and does it in such a way that it is easy to go back.
However, what seems to be interesting is the use-case of having multiple ipython notebooks (ipynb files). It apparently seems like a notebook is NOT supposed to have a relationship with other notebooks, which makes sense, except that I would love to import other ipynb files.
The only workaround I see is converting my *.ipynb files into *.py files, which then can be imported into my notebook. Having one file hold everything in a project is a bit weird, especially if I want to really push for code-reuse (isn't that a core tenet of python?).
Am I missing something? Is this not a supported use case of ipython notebooks? Is there another solution I can be using for this import of an ipynb file into another notebook? I'd love to continue to use ipynb, but it's really messing up my workflow right now :(
It is really simple in newer Jupyter:
%run MyOtherNotebook.ipynb
For details, see the official docs on the %run IPython magic command.
Install my helper library from the command prompt:
pip install import-ipynb
Import it from your notebook:
import import_ipynb
Now import your .ipynb notebook as if it was a .py file
import TheOtherNotebook
This import-ipynb module is just one file and it strictly adheres to the official howto on the jupyter site.
PS It also supports things like from A import foo, from A import * etc
PPS Works with subdirectories: import A.B
Run
!pip install ipynb
and then import the other notebook as
from ipynb.fs.full.<notebook_name> import *
or
from ipynb.fs.full.<notebook_name> import <function_name>
Make sure that all the notebooks are in the same directory.
Edit 1: You can see the official documentation here - https://ipynb.readthedocs.io/en/stable/
Also, if you would like to import only class and function definitions from a notebook (and not the top-level statements), you can use ipynb.fs.defs instead of ipynb.fs.full. Fully uppercase variable assignments will be evaluated as well.
Install import-ipynb from your command prompt
pip install import-ipynb
Import in your notebook file
import import_ipynb
Now use regular import command to import your file
import MyOtherNotebook
%run YourNotebookfile.ipynb works fine;
if you want to import a specific function, just add the import command after the notebook name,
i.e. if YourNotebookfile.ipynb defines def Add(),
then you can use
%run YourNotebookfile.ipynb import Add
You can use import nbimporter and then import notebookName.
The answers above are very useful, but they can be a bit difficult to implement. The following steps are simpler and worked for me:
1. Download the notebook as a .py file (you can find that option in the File tab).
2. Copy the downloaded file into the working directory of your Jupyter Notebook.
3. Import the .py file into the ipynb file as usual.
The issue is that a notebook is not a plain Python file. The steps to import the .ipynb file are outlined in the following: Importing Notebooks.
I am pasting the code below, so you can just do a quick copy and paste if you need it. Notice that at the end I have an import primes statement; you'll have to change that, of course. The name of my file is primes.ipynb. From this point on you can use the content of that file as you normally would.
I wish there were a simpler method, but this is straight from the docs.
Note: I am using Jupyter, not IPython.
import io, os, sys, types

from IPython import get_ipython
from nbformat import current
from IPython.core.interactiveshell import InteractiveShell


def find_notebook(fullname, path=None):
    """find a notebook, given its fully qualified name and an optional path

    This turns "foo.bar" into "foo/bar.ipynb"
    and tries turning "Foo_Bar" into "Foo Bar" if Foo_Bar
    does not exist.
    """
    name = fullname.rsplit('.', 1)[-1]
    if not path:
        path = ['']
    for d in path:
        nb_path = os.path.join(d, name + ".ipynb")
        if os.path.isfile(nb_path):
            return nb_path
        # let import Notebook_Name find "Notebook Name.ipynb"
        nb_path = nb_path.replace("_", " ")
        if os.path.isfile(nb_path):
            return nb_path


class NotebookLoader(object):
    """Module Loader for Jupyter Notebooks"""
    def __init__(self, path=None):
        self.shell = InteractiveShell.instance()
        self.path = path

    def load_module(self, fullname):
        """import a notebook as a module"""
        path = find_notebook(fullname, self.path)

        print("importing Jupyter notebook from %s" % path)

        # load the notebook object
        with io.open(path, 'r', encoding='utf-8') as f:
            nb = current.read(f, 'json')

        # create the module and add it to sys.modules
        # if name in sys.modules:
        #     return sys.modules[name]
        mod = types.ModuleType(fullname)
        mod.__file__ = path
        mod.__loader__ = self
        mod.__dict__['get_ipython'] = get_ipython
        sys.modules[fullname] = mod

        # extra work to ensure that magics that would affect the user_ns
        # actually affect the notebook module's ns
        save_user_ns = self.shell.user_ns
        self.shell.user_ns = mod.__dict__

        try:
            for cell in nb.worksheets[0].cells:
                if cell.cell_type == 'code' and cell.language == 'python':
                    # transform the input to executable Python
                    code = self.shell.input_transformer_manager.transform_cell(cell.input)
                    # run the code in the module
                    exec(code, mod.__dict__)
        finally:
            self.shell.user_ns = save_user_ns
        return mod


class NotebookFinder(object):
    """Module finder that locates Jupyter Notebooks"""
    def __init__(self):
        self.loaders = {}

    def find_module(self, fullname, path=None):
        nb_path = find_notebook(fullname, path)
        if not nb_path:
            return

        key = path
        if path:
            # lists aren't hashable
            key = os.path.sep.join(path)

        if key not in self.loaders:
            self.loaders[key] = NotebookLoader(path)
        return self.loaders[key]


sys.meta_path.append(NotebookFinder())

import primes
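Once the finder is registered, the notebook is imported like any other module. For illustration only (get_primes is a hypothetical function assumed to be defined in primes.ipynb):

# get_primes(n) is a hypothetical function assumed to exist in primes.ipynb
print(primes.get_primes(20))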
There is no problem at all using Jupyter with existing or new Python .py modules. With Jupyter running, simply fire up Spyder (or any editor of your choice) to build / modify your module class definitions in a .py file, and then just import the modules as needed into Jupyter.
One thing that makes this really seamless is using the autoreload magic extension. You can see documentation for autoreload here:
http://ipython.readthedocs.io/en/stable/config/extensions/autoreload.html
Here is the code to automatically reload the module any time it has been modified:
# autoreload sets up auto reloading of modified .py modules
%load_ext autoreload
%autoreload 2
Note that I tried the code mentioned in a prior reply to simulate loading .ipynb files as modules, and got it to work, but it chokes when you make changes to the .ipynb file. It looks like you need to restart the Jupyter development environment in order to reload the .ipynb 'module', which was not acceptable to me since I am making lots of changes to my code.
Please make sure that you also add an __init__.py file to the package where all your other .ipynb files are located.
This is in addition to the nbviewer link that minrk and syi provided above.
I had a similar problem, so I wrote up a solution along with a link to my public Google Drive folder, which has a working example :)
My Stack Overflow post with step-by-step experimentation and the solution:
Jupyter Notebook: Import .ipynb file and access it's method in other .ipynb file giving error
Hope this will help others as well.
Thanks all!
While %run childNotebook.ipynb is a pretty simple and useful solution (as mentioned in a previous answer), be cautious about using it when the child file itself contains another %run grandChildNotebook.ipynb but is located in a different directory. This can result in files being run twice, and it is also error prone, because the child no longer uses the same path as its parent, even though Jupyter assumes it does when running.
One way to resolve this problem: just before importing any file, first check what the current directory is, and then act based on that. Here is an example:
import os

if 'myFolder' in os.getcwd():
    %run grandChildNotebook.ipynb
else:
    %run myFolder/grandChildNotebook.ipynb
In the example above, we first check whether we are already in the 'myFolder' directory. If so, grandChildNotebook is in the same directory and running it as normal is enough. Otherwise, we need to prefix the path with the name of the folder the file is located in.
Keep in mind this is just an example; adapt it to your own case.
