Reconnect jupyter notebook to existing kernel - python

I am somewhat confused about the frontend/backend separation of Jupyter Notebook. I am working on a Linux server and suddenly encountered a memory error which was very unlikely given the available memory. It led me to realize that the kernels never completely shut down, as reported here: Memory leakage due to presumably unclosed kernels. There the solution is to just kill all the processes, but since I don't want to run my notebook from the beginning again, it would be handy to reconnect the notebook to the existing kernel, which holds all the variables and data that were already calculated.
I tried it with
jupyter lab --existing kernel-XXkernelnrXX.json
but it just brings me to the starting page of JupyterLab, where I can open a new file. When I open the file I want to reconnect to, a new kernel is started (which I can see in htop), and %connect_info executed in the notebook reports a different kernel. I also tried clicking "Reconnect to kernel" in the menu bar, as well as "Select kernel for: YX.ipynb", but neither works. So how can I connect the visual notebook frontend to an already running kernel?
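For reference, the frontend that can reliably attach to an already running kernel is the console client (jupyter lab does not accept --existing the way jupyter console does). Assuming the connection file is the one named above, something like the following should drop you into a REPL that shares the kernel's state, from which you could, for example, pickle your variables to disk:

```shell
# List the connection files of currently running kernels
ls "$(jupyter --runtime-dir)"/kernel-*.json

# Attach a console frontend to one of them; it shares the kernel's variables
jupyter console --existing kernel-XXkernelnrXX.json
```

This does not give you back the notebook UI, but it does let you inspect and save the state held by the surviving kernel.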

Related

Jupyter notebook loses connection

I am fairly new to working with Python, so I am sorry if this is a naive question.
I have set up a Jupyter notebook that I start through the Windows terminal. I run it with Python 3.9.7 in an Anaconda virtual environment. I use Microsoft Edge as the host browser because I had problems with Google Chrome blocking JupyterLab.
Since yesterday, I receive an error message after opening my notebook with the jupyter lab command (which has worked for me for some weeks now). The puzzling thing is that I can usually open my notebook and work in it for some minutes, but then this message pops up:
Server Connection Error: A connection to the Jupyter server could not be established. JupyterLab will continue trying to reconnect. Check
your network connection or Jupyter server configuration.
Unfortunately, I really have no idea why this could be, so I have not tried much to fix it yet. I have a stable internet connection. Simply closing the notebook and reopening it worked a few times, but after a while of working in the notebook it loses the connection again.
Has anyone experienced similar problems?
Thank you for your help and ideas!
I had this issue with JupyterLab and my error seems the same:
"A connection to the Jupyter server could not be established.
JupyterLab will continue trying to reconnect. Check your network
connection or Jupyter server configuration."
The problem for me was that I had started the program from the command line using the "jupyter-lab" command and had subsequently closed my terminal after JupyterLab opened. Closing the terminal shut down the required server. I left the terminal open and it is working normally now.
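If you do need to close the terminal, one option is to detach the server from the terminal session so it survives. A sketch using standard tools (nohup and background execution; the log file name is arbitrary):

```shell
# Start JupyterLab detached from the terminal; closing the terminal
# will no longer kill the server
nohup jupyter lab --no-browser > jupyterlab.log 2>&1 &
```

Running the server inside tmux or screen achieves the same thing and additionally lets you reattach to the server's output later.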

Run local code in the Jupyter notebook on remote server via kernel

I want to run local code, using local data, on a remote server and get the execution results back in my Jupyter notebook cells.
Not the usual scheme ("run the Jupyter notebook remotely, connect to the remote notebook via SSH tunneling") but something more sophisticated: a custom remote kernel that I can choose from the kernel list, so that local code runs on the remote server seamlessly.
Some packages (like this one -- https://pypi.org/project/remote-kernel) mention that it is possible, but they look dated and come with limited usage instructions.
Does anyone know how to implement this? If so, please be as detailed as possible, thanks!
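Jupyter discovers kernels through kernelspec directories, so the general mechanism behind such packages is a custom kernel.json whose argv launches the kernel over SSH. The sketch below is illustrative only: user@remote-host is a placeholder, and the connection file Jupyter writes is local, so the kernel's ZeroMQ ports would still have to be tunneled to the remote machine, which is exactly the part that packages like remote-kernel automate.

```json
{
  "display_name": "Python 3 (remote-host)",
  "language": "python",
  "argv": [
    "ssh", "user@remote-host",
    "python", "-m", "ipykernel_launcher",
    "-f", "{connection_file}"
  ]
}
```

On Linux, a spec like this would live in ~/.local/share/jupyter/kernels/&lt;name&gt;/kernel.json; jupyter kernelspec list shows the directories Jupyter searches, and {connection_file} is substituted by Jupyter at launch time.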

Do I have to restart the kernel every time I run a Jupyter notebook?

I am completely new to data science and the Jupyter notebook world. Is there any way to start from where I left off without re-running the whole notebook?
I.e., I did some operations on a dataset and got a final_data. Whenever I want to use that final_data after shutting down and reopening, I get NameError: name 'final_data' is not defined. How do I solve this?
I think that's how Jupyter Notebook works. You have to run the whole notebook again. I go to Cell > Run All after opening my notebook.
It depends on whether you are using Google Colab or a plain Jupyter notebook. I think you are using plain Jupyter, but I will discuss my experience with both.
For Google Colab: the runtime is destroyed after inactivity and you have to run everything again from the start.
For Jupyter Notebook: if the notebook tab stays open in your browser, the results will not be cleared over time (due to inactivity) as long as you do not close the tab itself. If you shut down or restart your PC, you will lose the progress (the state built up by the cells you ran), and there is no easy way to save it.
The best approach is to keep the Jupyter notebook tab open; even if you put your computer to sleep (by closing the lid of your laptop), the data will be retained unless you restart or shut down your PC.
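A practical workaround for the NameError is to persist expensive results to disk and reload them in the next session instead of recomputing. A minimal sketch with the standard pickle module (final_data here is just a stand-in for your real result):

```python
import pickle

# Stand-in for the result of an expensive computation
final_data = {"rows": 1000, "mean": 3.5}

# Last cell of the session: save the result before shutting down
with open("final_data.pkl", "wb") as f:
    pickle.dump(final_data, f)

# First cell of the next session: reload instead of recomputing
with open("final_data.pkl", "rb") as f:
    final_data = pickle.load(f)

print(final_data["mean"])  # prints 3.5, recovered from disk
```

IPython's %store magic or pandas' to_pickle/read_pickle do the same job with less ceremony if those tools are already part of your workflow.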

Jupyter Notebook becomes unresponsive after inactivity for a while, does not log to terminal

This is a strange thing that I've noticed only for a particular computer. If I leave the Jupyter Notebook page inactive for a while (without closing the browser page or putting the computer to sleep) and come back to it, the kernel would appear to be completely unresponsive; but it wouldn't say "Kernel Dead" either. Restarting the kernel from the Jupyter Notebook does nothing so I always ended up having to close the command window from which Jupyter Notebook was run, and it goes without saying that all the variables were lost.
Whenever this happens, any activity on the Jupyter Notebook page following its "death" is not logged in the command window that runs it.
I tried searching around on GitHub, which might have been a more appropriate channel, but with no luck.
I'd be happy to provide more info. Thanks!
This might not be a helpful answer, as it does not address the root cause of the problem but rather presents a workaround. The kernel is an independent process from the browser, so you can close the browser tab and reopen it (e.g. http://localhost:8888/) and it connects to the still-running kernel without any problems.
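If the tab has been closed and the URL (or its token) is lost, the running servers can be listed from another terminal with the standard list subcommand:

```shell
# Print the URLs (including tokens) of all running Jupyter servers
jupyter notebook list
# then reopen the printed URL, e.g. http://localhost:8888/?token=..., in the browser
```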

(is) kernel busy in ipython notebook

I'm using an EC2 spot instance (from my Windows laptop to an Ubuntu instance) to run a function that is well beyond my laptop's capabilities. The "kernel busy" dot has been filled for hours. Previously, I could just listen to my laptop: it was obvious when something was running as opposed to the notebook being stuck. Is there any way I can tell now?
If I try something like 1+1 in the box below my function it will also turn into an asterisk, but I can open a new notebook and have zero issues running simple commands in the new notebook.
This is because, although each notebook has multiple cells, it has only one kernel, so the commands in the other cells are queued until the first cell finishes its task. When you open a new notebook, that notebook gets its own kernel, so it can run simple commands quickly without competing with whatever is consuming so much CPU.
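Without a fan to listen to, one way to tell whether the kernel is still making progress is to watch its process: a genuinely busy kernel burns CPU, while a stuck one usually sits near zero. A sketch with standard tools (the pattern assumes an IPython kernel process named ipykernel; the bracket trick keeps grep from matching itself):

```shell
# Show CPU/memory usage of running IPython kernel processes;
# a busy kernel should show sustained high %CPU here and in top/htop
ps aux | grep -i "[i]pykernel"
```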
