I know how to install packages in Anaconda using conda install, and also how to install packages that are on PyPI, as described in the manual.
But how can I permanently include packages/folders into the PYTHONPATH of an Anaconda environment so that code that I am currently working on can be imported and is still available after a reboot?
My current approach is to append to sys.path:
import sys
sys.path.append(r'/path/to/my/package')
which is not really convenient.
Any hints?
I found two answers to my question in the Anaconda forum:
1.) Put the modules into site-packages, i.e. the directory $HOME/path/to/anaconda/lib/pythonX.X/site-packages, which is always on sys.path. This should also work by creating a symbolic link.
2.) Add a .pth file to the directory $HOME/path/to/anaconda/lib/pythonX.X/site-packages. This can be named anything (it just must end with .pth). A .pth file is just a newline-separated listing of the full path-names of directories that will be added to your path on Python startup.
Alternatively, if you only want to link to a particular conda environment then add the .pth file to ~/anaconda3/envs/{NAME_OF_ENVIRONMENT}/lib/pythonX.X/site-packages/
Both are straightforward to set up, and I went for the second option as it is more flexible (see the example .pth file below).
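For example, a .pth file mirroring the sys.path call from the question could look like this (the file name is arbitrary; lines starting with # are skipped):
# my_paths.pth -- each line is a directory that gets added to sys.path
/path/to/my/package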
*** UPDATE:
3.) Use conda develop, i.e. conda develop /path/to/module/, to add the module; this creates a .pth file as described under option 2.).
4.) Create a setup.py in the folder of your package and install it using pip install -e /path/to/package, which is the cleanest option from my point of view, because you can also see all installations using pip list. Note that the option -e allows you to edit the package code. See here for more information; a minimal setup.py sketch follows below.
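A minimal setup.py sketch for this, assuming a setuptools-based package (name and version are placeholders):
# setup.py, placed in the folder of your package
from setuptools import setup, find_packages

setup(
    name='mypackage',   # placeholder name
    version='0.1',      # placeholder version
    packages=find_packages(),
)
After that, pip install -e /path/to/package makes the package importable in editable mode.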
Thanks anyway!
I'm able to include local modules using the following:
conda-develop /path/to/module/
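Note that, as far as I know, conda develop is provided by the conda-build package, so you may need to install that first:
conda install conda-build
conda develop /path/to/module/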
I hope it helps.
The way I do this, which I believe is the most native to conda, is by creating env_vars.sh files in my environment, as per the official documentation here.
For macOS and Linux users, the steps are as follows:
Go to your environment folder (e.g. /miniconda1/env/env_name). $CONDA_PREFIX is the environment variable holding your environment's path.
cd $CONDA_PREFIX
Create the activate.d and deactivate.d directories.
mkdir -p ./etc/conda/activate.d
mkdir -p ./etc/conda/deactivate.d
Inside each respective directory, create one env_vars.sh file. The one in the activate.d directory will set (or export) your environment variables when you conda activate your environment; the file in the deactivate.d directory will unset them when you conda deactivate.
touch ./etc/conda/activate.d/env_vars.sh
touch ./etc/conda/deactivate.d/env_vars.sh
First edit the $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh to export the desired environment variables.
#!/bin/sh
export VAR_A='some-thing-here'
export VAR_B=/path/to/my/file/
Afterwards, edit $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh to unset the env variables when you conda deactivate, like so:
#!/bin/sh
unset VAR_A
unset VAR_B
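For the PYTHONPATH use case from the original question, a minimal sketch of the two files (the package path is a placeholder; note that this simple version discards any PYTHONPATH set outside the environment on deactivation):
#!/bin/sh
# $CONDA_PREFIX/etc/conda/activate.d/env_vars.sh
export PYTHONPATH="/path/to/my/package${PYTHONPATH:+:$PYTHONPATH}"

#!/bin/sh
# $CONDA_PREFIX/etc/conda/deactivate.d/env_vars.sh
unset PYTHONPATH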
Again, the source of my description comes straight from the conda docs here.
Just to add to Cord Kaldemeyer's answer above, for the 2nd option: if you only want to link to a particular conda environment, then add the .pth file to ~/anaconda3/envs/{NAME_OF_ENVIRONMENT}/lib/pythonX.X/site-packages/
Basically the question is already up there. I created my environment for the project with miniconda. Now I want to include some modules, bundled in a directory, into this environment. So what I did was put the directory into /miniconda/env/..../site-packages/mymodule/. When I run the module from the command line with that directory as my current working directory, it works. But as soon as I just activate this conda environment and work in a different directory, it tells me ModuleNotFoundError: No module named 'stdiio'
I hope the question is more or less clear. Any help would be really appreciated.
If your module is installable (e.g., you have a setup.py), then you can activate your Conda env and install using pip:
conda activate myenv
pip install /some/path/to/mymodule
If you are actively developing the module, then use pip install -e instead.
If your module is not installable, but just some source folders with __init__.py files, then another option is to add the containing folder to PYTHONPATH. For example, if your module is in /some/path/to/mymodule, then you would use
export PYTHONPATH="/some/path/to:$PYTHONPATH"
Be careful with PYTHONPATH - you can run into confusing problems if you allow conflicting outside modules to "leak" into your Conda environment (e.g., adding a site-packages from another Python install).
Installation should be the preferred option, and if you need to use PYTHONPATH, set it in an env-specific manner using activation hooks.
If you're using Python on Windows, then it may be best to resolve this by adding PYTHONPATH to the system variables, or if it's already set, then you may want to append the folder path to the variable.
If you don't already know how to access the system variables, or are confused on how to adjust PYTHONPATH for your needs, then you can check out this StackOverflow question, which has several in-depth answers that explain how to set/update PYTHONPATH on Windows:
How to add to the PYTHONPATH in Windows, so it finds my modules/packages?
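As a sketch, one way to do this from a Command Prompt is with setx (the path is a placeholder; the append form assumes PYTHONPATH is already set in the current session, and the new value only takes effect in newly opened consoles):
rem set PYTHONPATH for the current user
setx PYTHONPATH "C:\path\to\my\package"
rem or append to an existing value
setx PYTHONPATH "%PYTHONPATH%;C:\path\to\my\package"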
I am trying to create a "clean" Python virtual environment using conda:
conda create -n somename python=3.7 --no-default-packages
But somehow I have access to all the packages installed in base inside this environment. pip list returns all the Python packages, and I can import those packages in Python.
However, the actual environment's "site-packages" folder does not contain those extra Python modules as the base folder does.
So what should I do to create an independent/separate virtual environment? I am using Windows 10.
I had PYTHONPATH/PYTHONHOME explicitly set in my environment variables; after deleting them, it now works fine.
It sounds silly, but make sure that you are actually activating the new environment. Also make sure to check that which python and which pip refer to the new environment; I've had problems before where tmux makes conda activations silently fail.
I would also check your PYTHONPATH variable
echo $PYTHONPATH
just in case you inherit dist-packages (check your ~/.profile and ~/.bashrc)
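As a quick sanity check, a sketch of verifying that the clean environment is really the one in use (env name taken from the question; on Windows, use where instead of which):
conda activate somename
which python   # should point into .../envs/somename/
which pip
echo $PYTHONPATH   # ideally empty for a clean environment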
I have a RHEL server with Anaconda3 installed. Each user on the system gets 2 GiB of space in the /home/ folder and another large folder on a mounted drive. When a user tries to create a conda environment using conda create -n my_env, all the downloaded .tar files fill up the .conda folder and the installation breaks. Is there a way to specify a custom location for the .conda folder?
Best
Jagan
You can use the --prefix option (documentation).
Option 1:
If you want to create your virtual environment in the current directory, then use
conda create --prefix=envName python=X.X
Option 2: If you want to specify the directory, then give the full path
conda create --prefix=/YourPath/yourEnvName python=x.x
Option 3: If you don't want to explicitly specify the path every time and want all your environments to be stored somewhere else by default, you can set that up in your .condarc file (documentation)
You can do this on the command line using:
conda config --add envs_dirs <path to directory>
envs_dirs in your .condarc file adds an additional location to the environment search path (pkgs_dirs does the same for the package cache).
Came across this while having a similar problem with lack of space in my home directory...
Building on Ajay Bisht's solution, to change the package cache search path, you can set
conda config --add pkgs_dirs <path to directory>/pkgs
as well as
conda config --add envs_dirs <path to directory>/envs
See here https://docs.conda.io/projects/conda/en/latest/user-guide/configuration/use-condarc.html#specify-package-directories-pkgs-dirs
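With both settings added, the relevant part of the resulting .condarc would look something like this (the directory is a placeholder):
envs_dirs:
  - /path/to/directory/envs
pkgs_dirs:
  - /path/to/directory/pkgs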
I had the same disk space problem.
All docs and forums told me to uninstall and reinstall anaconda to a new location...
I didn't want that, so here another approach:
Copy the anaconda folder to the new location. Let's assume the old location is /home/uname/anaconda3 and the new location is /home/uname/mountx/anaconda3:
cp -r /home/uname/anaconda3 /home/uname/mountx/anaconda3
Rename the original anaconda folder (to be sure it's not used later):
mv /home/uname/anaconda3 /home/uname/anaconda3.bak
Replace all occurrences of "/home/uname/anaconda3" with "/home/uname/mountx/anaconda3". Do this in the new anaconda folder, .conda, .bashrc, and your projects. I used PyCharm with no open project to do the replacing; sed in a shell might also work (a sketch follows after these steps).
Start a new anaconda shell and PyCharm to test it.
Remove the renamed original anaconda folder afterwards if everything works:
rm -rf /home/uname/anaconda3.bak
done
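A sketch of the sed-based replacement mentioned in the steps above (GNU grep/sed assumed; -I skips binary files, and -Z with xargs -0 handles spaces in file names; back everything up first):
grep -rIlZ '/home/uname/anaconda3' /home/uname/mountx/anaconda3 ~/.conda ~/.bashrc \
  | xargs -0 sed -i 's|/home/uname/anaconda3|/home/uname/mountx/anaconda3|g'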
worked fine like a PyCharm for me :)
By mistake, I forgot to specify the WORKON_HOME variable before creating my virtual environments, and they were created in the /root/.virtualenvs directory. They worked fine, and I did some testing by activating a certain environment and then running (env)$ pip freeze to see which specific modules were installed there.
So, when I discovered the WORKON_HOME path error, I needed to change the host directory to /usr/local/pythonenv. I created it, moved all the contents of the /root/.virtualenvs directory to /usr/local/pythonenv, and changed the value of the WORKON_HOME variable. Now, activating an environment using the workon command seems to work fine (i.e., the prompt changes to (env)$). However, if I do (env)$ pip freeze, I get a much longer list of modules than before, and it does not include the ones installed in that particular env before the move.
I guess that just moving the files and specifying another dir for WORKON_HOME variable was not enough. Is there some config where I should specify the new location of the host directory, or some config files for the particular environment?
Virtualenvs are not relocatable by default. You can use virtualenv --relocatable <virtualenv> to turn an existing virtualenv into a relocatable one and see if that works. But that option is experimental and not really recommended for use.
The most reliable way is to create new virtualenvs. Use pip freeze -l > requirements.txt in the old ones to get a list of installed packages, create the new virtualenv, and use pip install -r requirements.txt to install the packages in the new one.
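A sketch of that workflow using the virtualenvwrapper commands from the question (environment names are placeholders):
workon oldenv                      # activate the old environment
pip freeze -l > requirements.txt   # record its locally installed packages
deactivate
mkvirtualenv newenv                # creates and activates the new environment
pip install -r requirements.txt    # reinstall the packages there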
I used the virtualenv --relocatable feature. It seemed to work but then I found a different python version installed:
$ . VirtualEnvs/moslog/bin/activate
(moslog)$ ~/VirtualEnvs/moslog/bin/mosloganalisys.py
python: error while loading shared libraries: libpython2.7.so.1.0: cannot open shared object file: No such file or directory
Remember to recreate the same virtualenv tree on the destination host.
I know that virtualenv, if not passed the --no-site-packages argument when creating a new virtual environment, will link the packages in /usr/local/lib/python2.7/site-packages (for Python 2.7) with a newly-created virtual environment. On Ubuntu 12.04 LTS, I have three locations where Python 2.7 packages can be installed (using the default, Ubuntu-supplied Python 2.7 installation):
(1) /usr/lib/python2.7/dist-packages: this has my global installation of ipython, scipy, numpy, matplotlib – packages that I would find difficult and time-consuming to install individually (and all their dependencies) if they were not available via the scipy stack.
(2) /usr/local/lib/python2.7/site-packages: this is empty, and I think it will stay that way on Ubuntu unless I install a package from source.
(3) /usr/local/lib/python2.7/dist-packages: this has very important local packages for astronomy, notably those related to PyRAF, STScI, etc., and they are extremely difficult and time-consuming to install individually.
Note that a global directory such as /usr/lib/python2.7/site-packages does not exist on my system. Note also that my global installation of ipython, scipy, etc. lets me use those packages on-the-fly without having to source/activate a virtual environment every time.
Naturally, I now want to use virtualenv to create one virtual environment in my user home directory which I will source/activate for my future projects. However, I would like this virtual environment, while being created, to link/copy all of my packages in locations (1) and (3) in the list above. The main reason for this is that I don't want to go through the pip install process (if it is even possible) to re-install ipython, scipy, the astro-packages, etc. for this (and maybe other) virtual environments.
Here are my questions:
Is there a way for me to specify to virtualenv that I would like it to link/copy packages in these two dist-packages directories for virtual environments that are created in the future?
When I eventually update my global installation of scipy, ipython, etc. in the two dist-packages directories, will this also update/change the packages that my virtual environment uses (and which it originally got during virtualenv creation)?
If I ever install a package from source on Ubuntu, will it go in /usr/local/lib/python2.7/dist-packages, or /usr/local/lib/python2.7/site-packages?
Thanks in advance for your help!
This might be a legitimate use of PYTHONPATH - an environment variable that virtualenv doesn't touch and which uses the same syntax as PATH. In bash, that would be export PYTHONPATH=/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages in a .bashrc or similar. If you followed this path:
(1) You don't have to tell your virtual environment about this at all; it won't try to change it.
(2) No relinking will be required, and
(3) Packages installed outside of your virtual environment will still go wherever they would have gone (pip install always uses /usr/local/lib/python2.7/dist-packages/ on my Ubuntu). If you install them from within your virtual environment (while it's activated), then of course they'll be put in the virtual environment.
I'm just getting my head around virtualenv, but there seems to be an easier way than mentioned so far.
Since virtualenv 1.7, --no-site-packages has been the default behavior.
Therefore using the --system-site-packages flag to virtualenv is all that is needed to get dist-packages in your path - if you use the tweaked virtualenv shipped by Ubuntu. (This answer and this one give some useful history). I've tested this and it does work.
$ virtualenv --system-site-packages .
I agree with Thomas here - I can't see any action required in virtualenv to see the effect of updates in dist-packages.
Having tested that with python setup.py install, it does (again as Thomas said) still go to dist-packages. You could change that by building your own python, but that's a bit extreme.
PYTHONPATH works for me.
vim ~/.bashrc
add this line below:
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
source ~/.bashrc
In the directory site-packages, create a file dist.pth
In the file dist.pth, put the following:
../dist-packages
Now deactivate and activate your virtualenv. You should be set.
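To confirm it worked, a quick check from inside the activated virtualenv:
python -c "import sys; print(sys.path)"
The dist-packages directory should now appear in the output.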
What you want to achieve here is essentially to add a specific folder (dist-packages) to the Python search path. You have a number of options for this:
Use a path configuration (.pth) file; its entries will be appended to the system path.
Modify PYTHONPATH (entries from it go to the beginning of the system path).
Modify sys.path directly from your Python script, i.e. append the required folders to it.
I think that for this particular case (enabling the global dist-packages folder) the third option is better, because with the first option you have to create a .pth file for every virtualenv you'll be working in (with some external shell script?). It's easy to forget when you distribute your package. The second option requires run-time setup (adding an envvar), which is, again, easy to miss.
Only the third option requires no prerequisites at configure- or run-time and can be distributed without issues (on the same type of system, of course).
You can use a function like this:
def enable_global_distpackages():
    import sys
    sys.path.append('/usr/lib/python2.7/dist-packages')
    sys.path.append('/usr/local/lib/python2.7/dist-packages')
And then in the __init__.py file of your package:
enable_global_distpackages()