Jupyter notebook online and uninterrupted session - python

I'm trying to train a model in a Jupyter notebook on my computer. Unfortunately, it will take several days, possibly more than a week.
Is there a Jupyter notebook somewhere on the Internet that I could start, disconnect from (turn off the laptop), and then come back to after a few days and connect to the still-running session?

What you are looking for is a server: you need to host your Jupyter Notebook session on a remote host. The idea is that your notebook needs to run continuously, so you need a machine that runs continuously. The catch is hardware: if you have specific requirements, such as a graphics card, a cloud provider offering servers with that exact setup will be harder to find (and more expensive) than simply training your model on a server's CPU. A good starting point is to browse Amazon's services and use a free trial to train your model; a minimal sketch of the workflow follows the links below.
Remote GPU machine learning training service on Amazon: https://aws.amazon.com/sagemaker/train/
Jupyter Notebook on remote server config: https://www.digitalocean.com/community/tutorials/how-to-install-run-connect-to-jupyter-notebook-on-remote-server
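
As a rough illustration of that workflow (a sketch only; user@server, the port number, and the log file name are placeholders, and the server is assumed to be a Linux box you can reach over SSH):

# on the server: start a notebook server that survives the SSH session ending
nohup jupyter notebook --no-browser --port=8888 > jupyter.log 2>&1 &

# on your laptop: forward the server's port 8888 to the same local port
ssh -N -f -L localhost:8888:localhost:8888 user@server

You can then open http://localhost:8888 locally, start the long-running cell, and shut the laptop: the kernel keeps computing on the server, and you can reattach the tunnel later to check on it.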

Related

Can I run sections of a local Jupyter Notebook on a remote server?

I am running a Jupyter Notebook on my laptop. Is it possible to run one/two cells of the script on a remote server that I have access to?
The remote server is more powerful, however I have been allocated a limited amount of storage on the server so the bulk of work has to happen on my device.
I have used os.system to run a Python script on the server from Jupyter on my laptop, but it seems inefficient.
Both devices are running Ubuntu.
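
One lightweight pattern for this (a sketch, not a full solution: user@server is a placeholder, and any data files would still need to be copied over or results streamed back separately) is to pipe the code of a single cell to a Python interpreter on the server over SSH, either from a terminal or from a Jupyter %%bash cell:

# run a short snippet on the remote machine and print its output locally
ssh user@server 'python3 -' <<'EOF'
import platform
print("running on:", platform.node())
EOF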

Dedicated Colab VM stops execution 1 hour after closing the tab

As the title says, I have spun up a dedicated VM for Colab and connected it to the notebook, but after I close the tab it runs for only one more hour, even though the documentation states: "Connecting to a custom GCE VM puts you in control of your machine lifecycle. You will still experience disconnections from your VM from interruptions to your connection, but Colab will not alter the VM state: your work and progress will be saved and available when you reconnect."
Hovering over the RAM & Disk section does indeed show that I am connected to the VM.
Is there something I'm doing wrong, or do I have to enable something else in the UI?
Thanks a lot!

Run local code in the Jupyter notebook on remote server via kernel

I want to run local code, using local data, on a remote server and get the execution results back in my Jupyter notebook cells.
Not the usual scheme ("run the Jupyter notebook remotely, connect to the remote notebook via SSH tunneling") but something more sophisticated: a custom remote kernel which I can choose from the kernel list, so that local code runs on the remote server seamlessly.
Some packages (like this one -- https://pypi.org/project/remote-kernel) suggest this is possible, but they look dated and come with limited usage instructions.
Does anyone know how to implement this? If so, please be as detailed as possible, thanks!
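
For context on the mechanism: Jupyter discovers kernels through kernelspec directories, each holding a kernel.json, and conceptually a remote kernel is a kernelspec whose argv launches the kernel process over SSH instead of locally. The sketch below (user@server and the directory name are placeholders) shows the shape of such a spec; note it is incomplete on its own, because the connection file and its five ZeroMQ ports must also be made available to the remote host, which is exactly the plumbing that packages like remote-kernel automate:

# sketch: register a kernelspec whose argv starts the kernel on a remote host
mkdir -p ~/.local/share/jupyter/kernels/remote-python
cat > ~/.local/share/jupyter/kernels/remote-python/kernel.json <<'EOF'
{
  "argv": ["ssh", "user@server",
           "python3", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "Python 3 (remote via SSH)",
  "language": "python"
}
EOF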

How to run a Google Colab notebook from the terminal?

Suppose I have a Google Colab Notebook in an address like below:
https://colab.research.google.com/drive/XYZ
I want to keep it running for 12 hours; however, I also want to turn my computer off. As a workaround, I can connect to our lab's server via SSH, and that server runs all the time. I would like to know whether I can load and run the notebook there.
I found a solution for connecting to a Google Colab session via SSH (the colab_ssh package), but it again requires a running Colab session.
I also tried to browse the link with lynx, but it requires logging in, which that browser doesn't support.
Yes, it is possible. You would first need to download your Colab notebook as an .ipynb file and copy it to your server. Then you can follow one of the guides on connecting to a remotely running Jupyter notebook session, like this one. All you need is the Jupyter notebook software on your server and an SSH client on your local computer.
Edit: I forgot to mention this: to keep your session alive even after closing the SSH connection, you can use tools like screen. The link provides a more detailed explanation, but the general idea is that after connecting to your server, you first create a session like this:
screen -S <session_name>
which will create a new session and attach you to it ("attached" being the term for when you are inside a session). You can then fire up your Jupyter notebook there, and it will keep running even after you close the SSH connection. (Just make sure you don't kill the screen session with Ctrl+a followed by k.) Detaching and reattaching are shown below.
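
For completeness, the standard screen commands for leaving and rejoining the session (nothing here is specific to Jupyter):

# detach from the session, leaving everything running: press Ctrl+a, then d
# later, list the running sessions and reattach to yours by name
screen -ls
screen -r <session_name>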
Now you have an indefinitely running Jupyter notebook session on your server. You can connect to it via
ssh -N -f -L localhost:YYYY:localhost:XXXX remoteuser@remotehost
as described in the first linked guide, use the browser to run a code cell in your Jupyter notebook, and then turn off your laptop without worrying about interrupting your notebook session.
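
If you'd rather skip the browser entirely, a standard Jupyter alternative (not part of the answer above) is to execute the downloaded notebook non-interactively from inside the screen session; XYZ.ipynb stands for whatever you named the downloaded file:

# run every cell headlessly, saving outputs back into the notebook;
# the timeout flag stops long cells from being aborted after the default 30s
jupyter nbconvert --to notebook --execute --inplace --ExecutePreprocessor.timeout=-1 XYZ.ipynb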

Run jupyter notebook on GCP without any dependency on my laptop

I'm currently using GCP to run Jupyter notebooks on the notebook server provided by Google. Every time I start the notebook server from the command line, it shuts down when there is a network interruption or power outage on my end. I'm also very new to GCP.
Is there any way I could run the IPython notebooks on the server and collect the results later, without having to worry about anything else?
Thanks in advance!
Have you tried using GCP's AI Platform Notebooks?
https://cloud.google.com/ai-platform-notebooks/
You can open these notebooks directly in your browser, unlike the older Datalab notebooks (no need to SSH). That should solve your network-interruption issues.
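
If you prefer the command line, such an instance can also be created with gcloud; the sketch below is only indicative (the instance name, machine type, zone, and image family are placeholders, and the exact flags may differ between gcloud versions):

# create a managed notebook VM that runs independently of your laptop
gcloud notebooks instances create my-notebook \
    --vm-image-project=deeplearning-platform-release \
    --vm-image-family=common-cpu-notebooks \
    --machine-type=n1-standard-4 \
    --location=us-central1-b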
