How do I document the Jupyter Notebook Profile startup? - python

I've modified ipython_config.py in my IPython profile so that when I start up the Jupyter Notebook, numpy is automatically loaded as np:
c.InteractiveShellApp.exec_lines = [
'import numpy as np',
]
This works great. When I start up a Notebook, I can immediately use the whole numpy library via np. in the first cell. However, if I'm sharing this Notebook via a gist or some other method, these imports are not shown explicitly. This is suboptimal because it makes clear reproducibility impossible.
My question: Is there a way that I could automatically populate the first cell of a new Notebook with the code that I'm importing? (Or some other similar way to document the imports that are occurring for the Notebook).
I'd be OK with removing the exec_lines option and pre-populating the code that I have to run myself or some other solution that gets at the main idea: clear reproducibility of the code that I'm initially importing in the Notebook.
Edit
A deleted answer that might be helpful to people landing here: I found jupyter_boilerplate, an installable Notebook extension that "Adds a customizable menu item to Jupyter (IPython) notebooks to insert boilerplate snippets of code" -- it would allow one to easily create a starting code snippet that could be filled in.
Sidenote to MLavoie because "comments disabled on deleted / locked posts / reviews"
Yes, you are right that:
While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - From Review – MLavoie Jul 8 '16 at 17:27
But, you'll notice, this is an extension to be installed, so there isn't any relevant code to paste here. It was unhelpful to delete the above answer.

Almost automatically:
%load startup.py
Put import/config code in a version controlled file on your PYTHONPATH and %load it into the first cell.
This has the advantage of allowing you to use different startup code without tweaking your startup config, and notebooks remain portable, i.e. send the notebook and startup file to other users and they can run it without tweaking their startup config.
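For example, a minimal startup.py kept next to the notebook might look like the following (the specific imports and settings here are just placeholders for whatever you want documented):

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

np.set_printoptions(precision=4)  # any other session setup you want recorded

A nice side effect of %load is that it replaces the cell with the file's contents (commenting out the %load line itself), so the imports end up visible in the shared notebook.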

Create a notebook that contains the preparations you want and use that as a template. That is, copy it to a new file and open it.

Related

How come widgets from ipywidgets are not defined in an HTML file?

I am using nbinteract to develop an interactive web page from a Jupyter notebook. I finally got to the end and published the first version of it, but it does not appear that the Python libraries loaded properly (see image below). This appears to be the problem even in the original nbinteract tutorial. Any ideas on what might be the problem here?
Thank you
It’s possible to export a notebook including widgets to HTML using nbconvert (https://github.com/jupyter/nbconvert). You need to make sure to save the notebook using the classic Jupyter Notebook (not JupyterLab) with the “Widgets -> Save Notebook Widget State” menu option.
Unfortunately, it’s not possible to preserve the behavior of the callback functions this way because these functions are defined using Python, and there’s no Python kernel available in the standalone HTML file.
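For reference, here is a minimal sketch of that export using nbconvert's Python API (file names are placeholders; the command-line equivalent is jupyter nbconvert --to html notebook.ipynb):

import nbformat
from nbconvert import HTMLExporter

# Read the notebook that was saved with "Widgets -> Save Notebook Widget State"
nb = nbformat.read("notebook.ipynb", as_version=4)
body, resources = HTMLExporter().from_notebook_node(nb)

with open("notebook.html", "w", encoding="utf-8") as f:
    f.write(body)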
To host interactive figures outside of the notebook, writing a Dash app is always a good option (https://dash.plot.ly/). Or, if you want to stay in the notebook/widget world, you could use https://github.com/QuantStack/voila.

Call and run a jupyter notebook from Excel using xlwings?

In the company I work at, we have just started experimenting with the migration of several computation-heavy projects from Excel to Python. In this process, we have discovered xlwings and the power it can bring to the integration of Excel and Python.
Several of our projects include reading in input data from Excel worksheets, doing some background calculations, and then outputting the results to different sheets in the same workbook. From the example projects on the xlwings website, I know it's possible to replace VBA macros (which we used so far) with Python, while keeping the Excel front-end.
However, my co-workers, who are primarily financial experts and not programmers, really like the interactivity of jupyter notebooks and would appreciate it if they could work in them during the modeling phase of the projects (rather than switching to PyCharm all of a sudden).
All in all, the ideal workflow would look like this for us:
Inputting from Excel sheets, doing some modeling and model calibration in Python through Jupyter notebooks, running some tests, and then, once we're at a final stage, outputting to Excel. A key constraint is that the end-users of our models are used to VBA-like functionality (e.g. Run buttons in Excel).
So my question is the following:
Is it possible to call and run a Jupyter notebook from Excel as if it were a .py file (i.e. through the RunPython function)? This way, I assume we could avoid the intermediate step of "converting" the models from .ipynb to .py, not to mention maintaining two code versions of the same model.
Thank you for any suggestions!
Look here: https://github.com/luozhijian/jupyterexcel, which works using 'def'. That creates functions callable in Jupyter.
Thanks for the replies!
We started experimenting with nbconvert as @Wayne advised, and basically wrote a small wrapper .py file that can be called in the Excel macro via RunPython and that runs the specified notebook as-is. In the notebook, interaction between Excel and Jupyter (e.g. reading parameter data in from Excel and writing values from Jupyter back to Excel) is in turn handled by xlwings.
I'm not sure it is the optimal solution in terms of execution speed, though. However, in our case it works just fine, and so far we haven't experienced any additional overhead that would hinder the user experience.
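For anyone looking for a concrete starting point, the wrapper might look roughly like this (a sketch only; the file names and function name are placeholders, and the xlwings calls live inside the notebook itself):

import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

def run_notebook(path="model.ipynb"):
    # Execute the notebook top to bottom, exactly as it is saved
    nb = nbformat.read(path, as_version=4)
    ep = ExecutePreprocessor(timeout=600, kernel_name="python3")
    ep.preprocess(nb, {"metadata": {"path": "."}})

The Excel macro then calls it like any other xlwings entry point, e.g. RunPython("import wrapper; wrapper.run_notebook()").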
This is possible with xlOil (disclaimer: I wrote it). The relevant docs are here.
The summarised procedure is: install xloil
pip install xloil
xloil install
Then edit %APPDATA%\xloil\xloil.ini to add a reference to xloil.jupyter. Then, in an Excel cell, type:
=xloJpyConnect("MyNotebook.ipynb")
where you have "MyNotebook.ipynb" loaded in a local Jupyter kernel. Then you can execute code on the kernel with:
=xloJpyRun(<result of connect func>, "{} + {} * {}", A1, B1, C1)
Or watch a global variable in the kernel with:
=xloJpyWatch(<result of connect func>, "variable_to_watch")
Or you can add the @xloil.func decorator to functions defined in the kernel to make them appear as Excel worksheet functions.
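For example, a cell in the connected notebook could contain something like this (a minimal sketch; the function name and body are arbitrary):

import xloil

@xloil.func
def py_sum(x: float, y: float) -> float:
    # Appears in Excel as the worksheet function =py_sum(x, y)
    return x + y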
It sounds like this is what you want https://towardsdatascience.com/python-jupyter-notebooks-in-excel-5ab34fc6439
PyXLL and the pyxll-jupyter package allow you to run a Jupyter notebook inside of Excel and seamlessly interact with Excel from Python. You can also write Python functions in the notebook and call them from Excel as user defined functions or macros.
From the article:
"""
It used to be an “either/or” choice between Excel and Python Jupyter Notebooks. With the introduction of the PyXLL-Jupyter package now you can use both together, side by side.
In this article I’ll show you how to set up Jupyter Notebooks running inside Excel. Share data between the two and even call Python functions written in your Jupyter notebook from your Excel workbook!
"""
If the above link doesn't work you can also find it here https://www.pyxll.com/blog/python-jupyter-notebooks-in-excel/
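As an illustrative sketch (the function name and signature here are made up), a PyXLL user defined function written in a notebook cell looks like:

from pyxll import xl_func

@xl_func
def py_discount(cashflow: float, rate: float, periods: int) -> float:
    # Callable from a worksheet as =py_discount(...) once the cell has run
    return cashflow / (1 + rate) ** periods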

Is there a docstring autocompletion tool for jupyter notebook?

I am looking for a tool/extension that helps you writing python docstrings in jupyter notebook.
I normally use VS code where you have the autodocstring extension that automatically generates templates (e.g. the sphinx or numpy template) for docstrings. Is there an equivalent to this in jupyter notebook?
I have been looking online for a long time now, but have trouble finding it.
Run this in a Notebook cell:
%config IPCompleter.greedy=True
Then press tab where you want to do autocomplete.
(Extracted from Reddit post)
To make use of autocomplete without pressing Tab or Shift+Tab, follow this.
However, I do not think there is an autodocstring extension for jupyter notebook like the one on VS Code you mentioned.

MyBinder.Org - Jupyter Notebook interactive function not showing

I used this answer to build a requirements.txt file for my Jupyter notebook. I then put both the ipynb and requirements.txt file in a Git repo and published it to Binder. As I understand, this should give me an interactive version of my notebook which I can share with people for them to play around with.
The published Binder can be found here.
Does anyone know why the interactive bit is not showing? Specifically, the sliders.
You need to enable the extension that allows the sliders, namely ipywidgets. There is an example of using ipywidgets presently here among the Binder example template repos. (If that repo gets altered, I was talking about this specific commit.)
Right now the extension gets enabled separately for JupyterLab vs the classic interface. If you just want to have your launches default to the classic interface, you can leave out the JupyterLab-related enabling.
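As a rough sketch of the classic-notebook setup (the exact steps depend on the versions in your Binder image, so treat this as an assumption to verify against the example repo), the repository would contain an ipywidgets entry in requirements.txt plus a postBuild file:

# requirements.txt
ipywidgets

# postBuild
jupyter nbextension enable --py widgetsnbextension --sys-prefix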

How to merge changes in Jupyter notebooks

Collaboration with a coworker on a Jupyter notebook is driving me nuts. We're working on different versions (I would say "branches", but that's probably too fancy for what we're doing) of the same notebook. I try to merge (some of) the changes he introduces into my version. Since diffing JSON files is a nightmare, I convert the two notebooks to .py files (Download as > Python (.py file) from the File menu of the notebooks) and then compare the .py files in PyCharm. This works nicely, also because all output is removed when exporting to .py.
The problem now is to import the changed .py file back into Jupyter. Is this possible? The one thing that gives me hope of an affirmative answer is that the exported .py files contain some # In[4]: comments, which the Jupyter interface may use to understand how the code is divided into cells. Or is it just impossible to go back? If so, do you have any other suggestions for merging some of the changes between two different versions of a Jupyter notebook?
To answer the second question:
(And this question looks related there)
When we had that problem, using jq as described in this post worked okay. (The part starting from "Enter jq".)
To use this, you would have a second, "stripped" version of the notebook which is the one you add to git, in addition to your development notebook (which is not added to git, otherwise you get merge conflicts with the development notebooks of your teammates).
You always need an additional step,
nbstrip_jq mynotebook.ipynb > mynotebook_stripped.ipynb
before doing git add mynotebook_stripped.ipynb, git commit, etc. But if everyone in your team does that, the changes are more or less nicely manageable with git. For bigger projects, you could try automating it as described further down in the same post.
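If you would rather stay in Python than depend on jq, a rough equivalent of that stripping step (a sketch of the same idea using nbformat, not the nbstrip_jq alias itself) is:

import nbformat

def strip_outputs(src, dst):
    nb = nbformat.read(src, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.outputs = []            # drop cell outputs
            cell.execution_count = None  # drop execution counters
    nbformat.write(nb, dst)

strip_outputs("mynotebook.ipynb", "mynotebook_stripped.ipynb")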
