Anaconda: Connect to an Environment Created from Another Server

I have the following resources:
Server A with Anaconda installed at D:\Anaconda
Server B with Anaconda installed at D:\Anaconda
Network Attached Share mapped to drive E: on Server A and Server B
I created a conda environment on the NAS from Server A. Executing Python scripts from Server A within that environment works.
I would like Server B to be able to execute scripts against the environment on the NAS. The environment doesn't show up when running conda info --envs, because it wasn't created from Server B.
How do I use an environment on a NAS to execute scripts on a server that did not create the environment?

Add the directory above the environment to the conda config of Server B. If the environment is at E:\conda\envs\shared_env use:
conda config --append envs_dirs E:\conda\envs
You can then check from within both servers that conda resolves the environment by name:
from conda.base.context import locate_prefix_by_name
locate_prefix_by_name('<environment name>')  # returns the environment's full path if conda can find it
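If conda on Server B still cannot resolve the environment, a workaround (a hedged suggestion, assuming the standard Windows conda layout where python.exe sits at the environment root; your_script.py is a placeholder) is to invoke the shared interpreter by its full path, with no activation required:
E:\conda\envs\shared_env\python.exe your_script.py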

Related

How to activate a venv (remote interpreter, VM on GCP connected via SSH)

I would like to activate the venv. I'm using the remote interpreter because PyCharm is connected via SSH to the GCP VM. I used to activate the env by using this command:
On Unix or MacOS, using the bash shell: source /path/to/venv/bin/activate
In local mode there is no trouble with this, but for the remote interpreter I do not know how to find the source. Could you please help me with this problem?
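For reference (a hedged note, not an accepted answer from this thread): activation mostly just puts the venv's bin directory first on PATH, so invoking the venv's interpreter by its full path has the same effect, and PyCharm's remote interpreter settings expect exactly that binary rather than an activated shell. The host below is a placeholder:
ssh user@your-gcp-vm '/path/to/venv/bin/python --version'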

VSCode: how to debug Python in a conda environment in a Docker container on a remote server

I start the Docker container on the remote server, then I connect VSCode to that server via SSH, and finally I attach VSCode to the container (Docker extension installed).
I have selected the interpreter in the conda environment by editing a ./.vscode/launch.json file
When I start debugging the Python program, the packages available in the conda environment are not visible to it.
What am I doing wrong?
thanks
You most likely need to select the interpreter inside VSCode.
First open the Command Palette (Ctrl+Shift+P), then type Python: Select Interpreter and select the environment you want to use.
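If you prefer to pin the choice in the workspace files instead, a rough sketch of the equivalent .vscode/settings.json entry (the conda path below is an assumption for a typical conda-in-Docker layout; adjust it to your container):
{
    "python.defaultInterpreterPath": "/opt/conda/envs/my_env/bin/python"
}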

Mounting a virtual environment via SSHFS on local machine: using its python3 file not working

So I have mounted part of a development server that holds a virtual environment used for development testing. The reason for this is to get access to the installed packages, such as Django REST framework and Django itself, without having to set them up locally (and to be sure to use the same versions as the development server). I know it's perhaps better to use Docker for this, but that's not the case right now.
The way I've done it is by installing SSHFS via an external Homebrew tap (as it's no longer supported in Homebrew core) - via this link https://github.com/gromgit/homebrew-fuse
After that I ran this command in the terminal to mount, via SSH, the specific part of the development server that holds the virtual environment:
sshfs -o ssh_command='ssh -i /Users/myusername/.ssh/id_rsa' myusername@servername:/home/myusername/projectname/env/bin ~/mnt/projectname
It works fine and I have it mounted on my local disk in mnt/projectname.
Now I go into VSCode, open the folder, and select the file called "python3" as my interpreter (which I should, right?). However, this file is just an alias, only 16 bytes in size. I suspect something is wrong here, but I'm not sure how to fix it. Can someone take a look and give some input? I'll attach a screenshot of the mounted directory.
[Screenshot of the virtualenv directory mounted on the local machine]
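A plausible diagnosis (not from the original thread, and assuming a standard venv layout): bin/python3 in a virtualenv is normally a symlink back to the interpreter the venv was created from, which you can confirm from the mounted copy:
ls -l ~/mnt/projectname/python3
If the link target is an absolute server path such as /usr/bin/python3, it cannot resolve on the local machine, which would explain why selecting the mounted file as an interpreter fails.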
The solution to the problem was to use the VSCode extension Remote - SSH and run VSCode directly on the remote machine, where the virtual environment is accessible.

How to use PyCharm to run an application in a remote Spark cluster

I have installed PyCharm on my local system and have configured it to run Spark applications in local mode on Windows.
My Spark cluster is in a remote Ubuntu box.
How can I run a Spark application on the remote Spark cluster, which is on Ubuntu, from my locally installed PyCharm, which is on Windows?
My goal is to run the application on the remote cluster, so workarounds are also welcome.
PyCharm is already set up for this. You want to configure a deployment and a remote interpreter, ideally via SSH.
This allows you to upload your codebase to the cluster (so that the PySpark driver has access to it) while running it from your laptop. The remote interpreter then takes care of resolving dependencies on the cluster.
Have a look here https://www.jetbrains.com/help/pycharm/configuring-remote-interpreters-via-ssh.html and here https://www.jetbrains.com/help/pycharm/creating-a-remote-server-configuration.html.
NB: Before you start configuring the remote interpreter, it's better to install venv or conda on your cluster and create a virtual environment, so that you don't run into dependency conflicts or outdated packages. You then point the remote interpreter config to the Python binary of that environment, such as /app/anaconda3/envs/my_env/bin/python.
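A rough sketch of preparing such an environment on the cluster (the env name, Python version, and packages are assumptions; match them to your cluster and application):
conda create -n my_env python=3.9 pyspark
conda activate my_env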

Virtualenv not recognised in IPython notebook server

I have IPython running from a secured server on an Ubuntu server VM running on my laptop.
Command-line IPython works on the server VM from the virtualenv. I can also start the notebook server on the server VM from the virtualenv without errors.
I can access notebooks from the host laptop and execute code in cells, but if I start the notebook server after activating a virtualenv I can't import any of the Python modules I've installed in the virtualenv.
It looks like the notebook server process is running the system Python but not the version in my virtualenv. Is there a way to tell the notebook server process which virtualenv to use?
Because virtualenv, on activation, prepends its own paths to the beginning of the PATH environment variable, you have two options:
a) create a correct virtualenv on the notebook server and install everything from there (see the sketch below), or
b) modify the PYTHONPATH variable in order to get access to your libraries.
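For option a), a common approach (a hedged sketch using ipykernel, which the original answer does not mention; myenv is a placeholder name) is to register the virtualenv as a kernel so the notebook server can see it:
source /path/to/venv/bin/activate
pip install ipykernel
python -m ipykernel install --user --name myenv
The notebook server will then offer myenv as a selectable kernel, even if the server itself runs under the system Python.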
