I followed the steps in https://github.com/aws/graph-notebook and was able to run the magic commands from my Mac, but I need to get this configured in JupyterLab.
Here is what I did:
Created a new conda env in JupyterLab
Installed all required packages in conda env
When I run magic commands like %status, I get "UsageError: Line magic function %status not found.". Please note that ipython, graph-notebook, and all other required libraries are installed in the conda environment.
I skipped the last step in the instructions, python -m graph_notebook.start_notebook --notebooks-dir ~/notebook/destination/dir. Running this created a new notebook, which I'm unable to access. (Note: JupyterLab is on a remote server.)
I want to connect to AWS Neptune and visualize graphs in JupyterLab using gremlin queries/magic commands.
Currently, graph-notebook only supports Jupyter Notebook, not JupyterLab, so you will not be able to use the notebook widgets. There is an open issue on adding support for this, so please +1 the feature request to help us prioritize this work: https://github.com/aws/graph-notebook/issues/55
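For what it's worth, in a classic Jupyter notebook the "Line magic function %status not found" error usually just means the graph-notebook extension has not been loaded into the kernel. A minimal sketch, assuming graph-notebook is installed in the kernel's environment and that the extension module path below is correct (it is an assumption on my part, not taken from the question):
%load_ext graph_notebook.magics    # assumed module path; adjust if the package layout differs
%graph_notebook_config             # view the Neptune endpoint configuration
%status                            # should now be recognized instead of raising UsageError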
Related
I'm trying to run TensorFlow 1.15.0 under a specific environment (Python 3.7) in JupyterLab, and this JupyterLab runs on a remote cluster.
I've installed a Python 3.7 kernel under a new environment called "newenv" with the code below:
I have checked the package list with "pip list", and the packages I want are there under the "newenv" environment. However, I don't know how to run the ".ipynb" under this specific environment. I tried changing the kernel to "python3.7", but it doesn't work. "!conda info" shows the following:
I have also tried running "jupyter notebook" after activating newenv, but I got the error below:
I really need somebody who can help me! Many thanks in advance!
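A minimal sketch of the usual fix for this kind of setup, assuming the environment is the "newenv" above and that installing ipykernel into it is acceptable: register the environment as a named Jupyter kernel, then pick that kernel from the JupyterLab launcher or the Kernel menu.
conda activate newenv
pip install ipykernel                  # or: conda install ipykernel
python -m ipykernel install --user --name newenv --display-name "Python 3.7 (newenv)"
jupyter kernelspec list                # the new kernel should appear here; restart JupyterLab and select it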
I am attempting to import cartopy into a notebook but am running into issues. If I do a "conda install cartopy" in my base environment, I get the frozen/flexible solve issue (https://github.com/conda/conda/issues/9367). When I create a new environment and do a "conda install cartopy", everything seems to work: "conda list" shows version 0.18.0 in that environment. Then I open a Jupyter Notebook from within that environment and try to import cartopy, but I get the response "no module named cartopy". I tried fiddling with my environments and settings based on feedback on this page (In which conda environment is Jupyter executing?), but now the error is "no module named numpy"! Can someone please help me understand why the notebook isn't seeing these modules? Thank you.
Are you certain that numpy is installed in this new environment?
Given that you are using the Anaconda distribution of Python, you should be able to view/configure your environment and its installed modules using the Anaconda Navigator. There you can see a full list of all installed modules and, via the terminal/console, launch a Jupyter Notebook not only from within a specific environment, but from within a specific directory!
On the left hand side, you can choose the environment that you want to use to start Jupyter Notebook, and on the right you can view all of the installed modules in that environment. Make sure that jupyter, numpy, and cartopy are all listed as installed packages.
To make sure that I am launching Jupyter from my desired environment, I always launch it directly from the terminal. I "Open Terminal" with my environment, "cd" into the directory that I want to use, and then use the jupyter notebook or jupyter lab command.
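A short sketch of that workflow, with the environment name and directory as placeholders:
conda activate myenv      # hypothetical environment name
conda list                # check that jupyter, numpy and cartopy all appear here
cd ~/projects/maps        # hypothetical working directory
jupyter notebook          # or: jupyter lab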
I would like to use Jupyter Notebook inside PyCharm. The project interpreter is Python 2.7 from a virtual environment inside WSL (Ubuntu 18.04).
The Jupyter package is correctly installed inside the virtual environment (I can run it with jupyter notebook).
My problem is that when I want to use Jupyter Notebook inside PyCharm, I get the following error: "Run Error: Jupyter package is not installed" (see picture).
Any idea what's going on here?
I had this problem in Python 3. Below are the steps I took to resolve the issue; I believe they should work for you too:
I had Jupyter Lab installed. PyCharm only works with Jupyter Notebook. Long story short, if you have Jupyter Lab installed you need to uninstall all your packages using:
$ pip freeze | xargs pip uninstall -y
Restart your computer
Follow Jupyter Notebook installation instructions
Make sure WSL is set up following the PyCharm instructions: WSL PyCharm instructions
In PyCharm, open an .ipynb file. Click the dropdown that says "Managed Jupyter server"; it's right above the text editor. Select "Configure Jupyter server" and check "Configured server".
In your WSL terminal, type jupyter notebook. Copy and paste the URL that looks like http://localhost:8888/?token=874asdf687asd6fasd8f74ds6f4s9d8f7sddf into the configured server box in PyCharm.
That's it. You should be able to run the Jupyter cells in PyCharm now.
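As a rough recap of the terminal side of those steps (package names assumed; your token will differ):
pip freeze | xargs pip uninstall -y    # remove everything, including Jupyter Lab
pip install notebook                   # reinstall classic Jupyter Notebook only
jupyter notebook --no-browser          # prints the http://localhost:8888/?token=... URL to paste into PyCharm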
I have PyCharm 2020.3. For me the issue was that I was using a virtual environment with "inherit global site-packages" enabled. I had Jupyter installed in the global site-packages but NOT in the virtual environment.
Once I installed Jupyter within the virtual environment, Jupyter Notebook worked. I'm not sure why inheriting Jupyter from the global packages wasn't working for me.
The above solution using a designated URL with a token seems to work with older versions of PyCharm. A simpler solution is to upgrade to the latest PyCharm. I no longer had an issue with the auto server using PyCharm 2019.3.2 (Mac).
I had this problem with the Datalore plugin enabled on 2020.2 on Linux, running on bare metal but displaying to a remote X server (which probably doesn't matter). My solution was to disable the Datalore plugin (it's enabled by default for PyCharm Professional).
This way I was still able to use the "managed" auto-start version, which has better integration/debugging than the "configured" option (or at least less hassle).
Note: since it's been a year, my problem is probably different from the OP's.
This happened for me when the interpreter was a remote one. I fixed it by changing the interpreter to one from a local env.
This can be done by selecting Configure Jupyter Server.
I also met this problem, and I solved it.
I created a new project with global site-packages, as shown below.
Then I hit the problem.
I created a new project without global site-packages and installed Jupyter Notebook in the virtualenv.
Then the problem was gone.
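The command-line equivalent of that second attempt would look roughly like this (paths assumed; in PyCharm itself, the same thing is done by leaving "Inherit global site-packages" unchecked when creating the project interpreter):
python -m venv venv           # fresh virtualenv, nothing inherited from global site-packages
venv\Scripts\activate         # Windows; use source venv/bin/activate elsewhere
pip install notebook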
My environment:
Windows 10 Professional
Python 3.7.2
virtualenv 16.4.3
I created a new virtual environment with D:\Python37\Scripts\virtualenv env
Then I activated the virtual environment with env\Scripts\activate
Then I installed jupyter with pip install --upgrade jupyter
Finally, I started jupyter with jupyter notebook
Everything starts up fine, and I create a new Python 3 notebook. Unfortunately, the notebook never connects to the server. I get the following error message in PowerShell:
Replacing stale connection: (token)
In the browser, I get the following error message:
"A connection to the notebook server could not be established. The notebook will continue trying to reconnect. Check your network connection or notebook server configuration."
My two prior virtual environments (with Jupyter) work fine. I've deleted .ipython, .jupyter, AppData\Roaming\jupyter, without any luck.
I've cleared cookies from my browser and have tried a different browser. Nothing works.
I've created two other virtual environments before, and both of those still work.
All Jupyter notebooks in the two working environments start up as untrusted, whereas the new environment starts up as trusted. I'm guessing that I clicked on something and now the notebook is looking to start up in a trusted fashion, which may require HTTPS.
Where do I look to fix this problem?
This appears to be a tornado issue. I found clues here.
Jupyter no connection to server
Jupyter kernel not connecting
I looked at the version of tornado (from the above links) in an environment that was working. It turns out that the version was 5.1.1.
I looked at the version of tornado in an environment that was NOT working. It turns out that the version was 6.0.
I downgraded the version of tornado in my non-working environment to 5.1.1 with the following command.
pip install --upgrade tornado==5.1.1
And now the non-working environment works!
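If you want to check the version yourself before pinning, something like this works in each environment:
pip show tornado    # prints the tornado version installed in the active environment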
Anaconda is pretty good at handling dependencies. I just tried this using Anaconda in the terminal:
# see current envs
conda info -e
# make a new environment; feel free to pin your Python version with python=3.7
conda create -n test
# activate it (on older conda installs: activate test / source activate test)
conda activate test
conda list             # this should appear empty
conda install jupyter  # y to install everything
jupyter notebook       # launch jupyter notebook
Mine comes up as 'trusted'. The method above may not be the most minimalist way of doing things, but at least nothing breaks and you're up and running in no time. I'm using conda version 4.6.2.
Since this is one of the top answers to a Google search on the error:
"A connection to the notebook server could not be established. The notebook will continue trying to reconnect. Check your network connection or notebook server configuration."
This might also have nothing to do with any install or library.
It may just be a proxy setting in your browser or on your system directly.
One solution may be to deactivate the proxy or add an exception for Jupyter's URL.
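If the proxy comes from environment variables rather than a browser-only setting, a quick way to test that theory is to exempt the local addresses before launching Jupyter (the variable name below is the usual convention, not something Jupyter itself requires):
export no_proxy=localhost,127.0.0.1    # PowerShell: $env:NO_PROXY = "localhost,127.0.0.1"
jupyter notebook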
In my case, the situation was different: it was a browser caching issue. I would call jupyter-lab using a batch script and it would just open a tab. Explicitly closing all tabs and then the browser worked for me.
If there was an old instance of another disconnected jupyter-lab, it would somehow not establish a proper connection.
This solution worked for me:
pip uninstall pyzmq
pip install pyzmq==19.0.2
Using pip to install some packages ended up confusing the Jupyter installation. So you can uninstall the packages installed with pip, disable jupyter_contrib_nbextensions, and then try to use conda install where possible.
jupyter labextension disable my-extension
I am working on a setup where several developers, working on different projects, all execute their code on a remote machine, using Jupyter notebook.
Since every project requires a different virtualenv, what happens now is that every developer, for every project, sets up a project-specific virtualenv, installs notebook into it, runs it on a different port, and connects to the remote machine through that port.
Is there a way to have 1 Jupyter notebook running on the remote machine, but be able to choose which virtualenv to use as kernel?
My main consideration is being able to expose only one port on the remote machine, while still being able to use different virtual Python environments for running the notebooks.
I am working on a setup where several developers
If you have many devs working on a remote machine, you really should use JupyterHub; it is made for exactly that, and it is the first step toward easing your pain. If you do not use JupyterHub, things will go wrong.
Once you have JupyterHub installed, your devs will be able to log in with their credentials while exposing only a single port, and will be able to start/stop notebook servers without SSHing in.
Once this is done, you can look at supporting multiple venvs.
In each environment you want to install ipykernel; it is the module that knows how to talk to the notebook. And in each environment you need to issue python -m ipykernel install --user --name=my-env-name, as said in the comments below your post. This registers each env with Jupyter, telling it "Hey, I exist, expose me to your users". You may also decide to install this, which does part of the work automatically for you, but it has some caveats.
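A sketch of that per-environment registration, with the env path and names as placeholders:
source /path/to/project-a-venv/bin/activate
pip install ipykernel
python -m ipykernel install --user --name=project-a --display-name "Python (project-a)"
# repeat for each virtualenv; every registered kernel then shows up in the single Jupyter/JupyterHub instance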
As other commenters have pointed out, you likely want to read Jake's post, and if you have several users you should, almost without question, use JupyterHub.
Is there a way to have 1 Jupyter notebook running on the remote
machine, but be able to choose which virtualenv to use as kernel?
This is how I managed to use multiple kernels in the same Jupyter notebook instance:
conda install nb_conda
nb_conda is a notebook extension that allows you to manage conda environments from your notebook. It also allows you to switch kernels directly from the Kernel menu.
I have noticed that the above command installs it with a few extras (nbpresent, nb_anacondacloud) which can be optionally disabled.
jupyter-nbextension disable nb_anacondacloud --py --sys-prefix
jupyter-serverextension disable nb_anacondacloud --py --sys-prefix
jupyter-nbextension disable nbpresent --py --sys-prefix
jupyter-serverextension disable nbpresent --py --sys-prefix
If you do not yet use conda, you should consider it for your package management and virtualenv needs [source].
I believe that this system does not have many of the pitfalls mentioned in jakevdp's post that @denfromufa mentions, as the notebook extension nb_conda should be dealing with all the internals.
Screenshots
The Conda tab in Jupyter Notebook allows you to manage your environments right from within your notebook.
You can also select which kernel to run a notebook in by using the Change kernel option in the Kernel menu.