Type-check Jupyter Notebooks with mypy

I have a project containing a bunch of Python modules (.py files) and a bunch of Jupyter Notebooks (.ipynb files) which import things from the Python modules.
I can (assuming I've got __init__.py files in all subfolders) type-check all the .py files by simply running mypy . from the root of the project. But I'd like to also be able to type-check my Jupyter Notebooks.
An ideal solution would:
type check all Python code in my Jupyter Notebooks,
be able to follow imports of .py modules from within Jupyter Notebooks when type-checking, just like imports in .py files,
let me type-check the whole project from the command line, so that I can run type-checking as part of a test suite or a pre-commit hook, and
in some way meaningfully report the locations of type errors within my Notebooks, analogously to how mypy prints line numbers for errors in .py files.
How can I do this?

You could use nbQA and do
pip install -U nbqa
nbqa mypy your_notebook.ipynb
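nbQA also accepts directories, so the whole project can be checked in one command; a minimal sketch (the folder names are illustrative):
nbqa mypy .                # type-check every notebook under the current directory
nbqa mypy notebooks/ src/  # or name specific folders
Since nbqa maps mypy's messages back to notebook cells and line positions, the error locations stay meaningful, and the command drops straight into a test suite or pre-commit hook.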

You can:
Convert all notebooks to Python, then run mypy on that (How do I convert an IPython Notebook into a Python file via commandline?).
jupyter nbconvert --to script [YOUR_NOTEBOOK].ipynb
Just write a small script to do this and you are fine :) For example:
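A minimal sketch of such a script, assuming your notebooks live under the project root (the find filter merely skips checkpoint copies):
# convert every notebook to a .py script, then type-check the whole tree
find . -name '*.ipynb' -not -path '*/.ipynb_checkpoints/*' \
    -exec jupyter nbconvert --to script {} +
mypy .
One caveat: mypy's reported line numbers will refer to the generated .py files rather than to notebook cells.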

Check out nb-mypy
Nb Mypy is a facility to automatically run mypy on Jupyter notebook cells as they are executed, whilst retaining information about the execution history.
More details here
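For completeness, a hedged sketch of getting it running (the package is on PyPI as nb-mypy):
pip install nb_mypy
Then, in the first cell of the notebook:
%load_ext nb_mypy
From that point on, each executed cell is passed through mypy and any messages are shown alongside the cell's output.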

I use Jupytext and my IDE for this.
I export a copy in py:percent format and pair it with the notebook. I do the development in the JupyterLab environment, but the .py file is the one that goes in the git repository. Before committing, I run it through the usual linters: black, pydocstyle, and mypy (with a strict configuration). I then reload the notebook in JupyterLab, restart the kernel, and 'Run All' to make sure the results are still OK, and only then commit the file to the repository. The pairing commands are sketched below.
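For reference, the pairing can be set up from the command line (notebook.ipynb is an illustrative name):
# pair the notebook with a py:percent script; jupytext keeps the two in sync
jupytext --set-formats ipynb,py:percent notebook.ipynb
# after editing the .py copy outside Jupyter, refresh the paired notebook
jupytext --sync notebook.ipynb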

Related

The variables created using jupyter (.ipynb) do not work on .py files in vscode (in the same environment in WSL). Different terminals?

Context: I'm using vscode with WSL and I also use conda for environment management.
I'm trying to create a variable in a Jupyter notebook, let's say x = [10, 20], and then use that same variable in a .py file (not in a notebook). I'm already using the same environment for both, but I believe the terminal/kernel is different for each: when I run a cell in the notebook, nothing happens in the terminal, whereas when I run a .py file, the terminal runs the code I selected.
I would like to see the terminal running something for jupyter (.ipynb) and also for my .py files.
Any help would be really appreciated.

How to edit and debug a library in python?

I have created my own library (package) and installed it in development mode using pip install -e.
Now I would like to edit the library's .py files and see the updates in a Jupyter notebook. Every time I edit a .py file, I have to close and reopen the IPython notebook to see the change. Is there an easy way to edit and debug .py package files?
Put this as first cell of your notebooks:
%load_ext autoreload
%autoreload 2
More info in the doc.
When you start Jupyter, you are initializing a Python kernel, and imported modules stay fixed at the state they were in when the kernel loaded them.
In this case, your kernel holds your locally installed (editable) package as it was when you started Jupyter, so by default you would need to restart the kernel every time you update your local package.
@BlackBear has a great solution of using autoreload in your first cell:
%load_ext autoreload
%autoreload 2
A follow-up solution assumes you do not need to make changes to your notebooks, but just want the updated outputs given changes to your package. One way I have gotten around this is to use an automated notebook generation process with jupyter nbconvert and shell scripting. You essentially create some Jupyter templates stored in a templates folder that you auto-execute every time you update your package.
An example script:
# remove outputs from previous runs (-f avoids errors if none exist yet)
rm -f ./templates/*.nbconvert.ipynb
rm -f ./*.nbconvert.ipynb
# re-execute every template notebook against the updated package
for file in templates/*.ipynb
do
    echo "$file"
    jupyter nbconvert --to notebook --execute "$file"
done
# collect the freshly executed copies
mv ./templates/*.nbconvert.ipynb .
Assuming you want to actively debug your package, I would recommend writing test scripts that start a fresh interpreter on every run, e.g.:
# mytest.py
from mypackage import myfunction

inputs = {'some': 'inputs'}  # placeholder: whatever your function expects
expected_outputs = {'some': 'expected', 'outputs': 'here'}

if myfunction(inputs) == expected_outputs:
    print('Success')
else:
    print('Fail')

Run it with a fresh interpreter each time:
python3 mytest.py

How to interpret .py files as jupyter notebooks

I am using an online Jupyter notebook that is somehow configured to open all .py files as Jupyter notebook files.
I am a big fan of this setup and would like to use it everywhere. On my own Jupyter installation, however, .py files are just interpreted as text files and are not loaded into Jupyter cells by default. How can I achieve the same configuration for my Jupyter notebook?
What you're looking for is jupytext.
You just need to install it into the Python env from which you're running your Jupyter notebooks:
pip install jupytext --upgrade
Once it's installed, Jupyter will let you open .py files as notebooks.
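If you only need a one-off conversion rather than the integrated view, the jupytext CLI also works from a terminal (analysis.py is an illustrative name):
# turn a script into a notebook, and back again
jupytext --to notebook analysis.py
jupytext --to py:percent analysis.ipynb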
That's not exactly what you asked, but you can achieve something close to that by using the magic %load FILE.py in a new jupyter notebook.
%load FILE.py will copy the contents of FILE.py in the current cell upon executing it.
You can use the Python code in your Jupyter Notebook by just pasting the whole code in a cell, OR:
%load pythonfile.py to load code from a file (not necessarily a .py file) into a Jupyter notebook cell;
%run pythonfile.py to execute the file instead of loading it (outputs whatever that file outputs).
Also, pythonfile.py should exist in the current working directory, or you can use its full path. A short sketch follows.
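For instance, %run also forwards command-line arguments to the script, so a typical cell might look like this (the script name and arguments are illustrative):
# execute the script as if from the command line; sys.argv is populated for it
%run pythonfile.py --verbose input.csv
# pull the file's source into the current cell for editing
%load pythonfile.py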

How to test Jupyter notebooks on Travis CI?

Is there a way to deploy Jupyter Notebooks on Travis CI and test running all the cells?
My Jupyter Notebooks use IPython Kernel, and I have a conda environment file.
I've been wondering something similar and have compiled some information but haven't fully tested it yet.
Firstly, you can rely on jupyter nbconvert --execute to run the notebooks, and then look for errors. There's an example set up with Travis CI and Conda at ghego/travis_anaconda_jupyter. I believe Travis CI relies on pytest too to catch issues, though I'm not entirely sure how this fits together. A sketch of the nbconvert invocation is below.
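The invocation usually looks something like this (the timeout value and notebook name are illustrative; a failing cell gives nbconvert a non-zero exit code, which fails the CI job):
jupyter nbconvert --to notebook --execute --ExecutePreprocessor.timeout=600 mynotebook.ipynb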
Another way you can run this is with pytest-notebook, which relies on you having a working version of the notebooks you want in some environment. This package's main purpose is to detect if changes to the environment will create issues within the notebooks. This can also potentially be used in conjunction with the above method, though it might be redundant.
It might be additionally beneficial for version management (tracking, seeing diffs, legibility) to write your notebooks in markdown format and then use jupytext to convert them into a .ipynb file to then run with the above options. jupytext can also execute notebooks directly with the --execute flag, so perhaps there's an even simpler way to integrate such a workflow!
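A hedged sketch of that jupytext variant (notebook.md is an illustrative name):
# convert the markdown source to a notebook and execute it in one step
jupytext --to notebook --execute notebook.md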
I will be testing this in the coming weeks and will update this comment if I learn anything new.

Activate ipython notebook autocomplete in .py files

I have a .py file shared with colleagues that we are modifying almost in parallel. I would like to edit it with the IPython notebook. The notebook opens it correctly and even highlights the syntax with colors. Nevertheless, it does not autocomplete while I'm programming.
I wonder whether it is possible to activate autocompletion for .py files in the notebook.
Thanks.
