How to update an existing Conda environment with a .yml file

How can a pre-existing conda environment be updated with another .yml file? This is extremely helpful when working on projects that have multiple requirement files, e.g. base.yml, local.yml, production.yml, etc.
For example, below is a base.yml file that has conda-forge, conda, and pip packages:
base.yml
name: myenv
channels:
  - conda-forge
dependencies:
  - django=1.10.5
  - pip:
    - django-crispy-forms==1.6.1
The actual environment is created with:
conda env create -f base.yml.
Later on, additional packages need to be added to base.yml. Another file, say local.yml, needs to import those updates.
Previous attempts to accomplish this include:
creating a local.yml file with an import definition:
channels:
dependencies:
  - pip:
    - boto3==1.4.4
imports:
  - requirements/base.
And then run the command:
conda install -f local.yml.
This does not work. Any thoughts?

Try using conda env update:
conda activate myenv
conda env update --file local.yml --prune
--prune uninstalls dependencies which were removed from local.yml, as pointed out in this answer by @Blink.
Or without the need to activate the environment (thanks @NumesSanguis):
conda env update --name myenv --file local.yml --prune
See Updating an environment in Conda User Guide.
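For reference, Conda's YAML format has no imports key; conda env update simply installs what the file lists into the named environment. A minimal sketch of a local.yml it will accept, assuming the environment name from the question and the extra pip dependency the asker wanted:
name: myenv
channels:
  - conda-forge
dependencies:
  - pip:
    - boto3==1.4.4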

The suggested answer is partially correct. You'll need to add the --prune option to also uninstall packages that were removed from the environment.yml.
Correct command:
conda env update -f local.yml --prune

alkamid's answer is on the right lines, but I have found that Conda fails to install new dependencies if the environment is already active. Deactivating the environment first resolves this:
source deactivate;
conda env update -f whatever.yml;
source activate my_environment_name; # Must be AFTER the conda env update line!

Related

Get new commits from a git repo installed with pip in a conda environment when updating

I am using a YAML config to install a package from BitBucket into my conda environment, a subset of the file looks like this:
# config.yaml
name: ll4ma_opt
channels:
  - defaults
  - conda-forge
dependencies:
  - pip
  - pip:
    - git+https://bitbucket.org/robot-learning/ll4ma_util.git
This works great. I do conda env create -f config.yaml and it creates the environment just fine.
My problem is that when I make a change to the BitBucket package ll4ma_util, I do not get those changes in my conda environment, even after doing conda env update -f config.yaml.
I see this output in the terminal when I try to do the update:
Pip subprocess output:
Collecting git+https://bitbucket.org/robot-learning/ll4ma_util.git (from -r /home/adam/ll4ma-opt-sandbox/conda/condaenv.65mn0x4h.requirements.txt (line 3))
Cloning https://bitbucket.org/robot-learning/ll4ma_util.git to /tmp/pip-req-build-5qdqlgww
Resolved https://bitbucket.org/robot-learning/ll4ma_util.git to commit 9673a4ff2025356a4eff72b0ee44e7f02d76b414
The hash shown is in fact the latest commit, but when I try to use the code after the update it's still using the old code and doesn't reflect the changes I made to the ll4ma_util package. The only way I've been successful is to completely remove my environment with conda env remove -n ll4ma_opt and then create it new again.
Is there a way I can force an update of the BitBucket package such that if I installed the package using git and pip in the conda environment, it will pull and use any recent changes from the git repo when I run an update of my conda environment?
As mentioned elsewhere, to have a Pip install that respects file changes (such as repository pulls), one needs the -e flag. In the Conda YAML context, you'd want:
name: ll4ma_opt
channels:
  - defaults
  - conda-forge
dependencies:
  - pip
  - pip:
    - -e git+https://bitbucket.org/robot-learning/ll4ma_util.git
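If you would rather not keep an editable checkout, another option (a hedged sketch, not from the original answer) is to force pip to reinstall the package from the repository after updating the environment:
conda activate ll4ma_opt
pip install --force-reinstall --no-deps git+https://bitbucket.org/robot-learning/ll4ma_util.git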

Activating conda environment during gitlab CI

My .gitlab-ci.yml file looks like this:
anomalydetector:
  image: continuumio/miniconda:4.7.10
  stage: build
  tags:
    - docker
  script:
    - conda env create -f environment.yml
    - conda activate my-env
    - pytest tests/.
On Gitlab, this job starts fine, and the logs read
$ conda env create -f environment.yml
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
==> WARNING: A newer version of conda exists. <==
current version: 4.7.10
latest version: 4.7.11
Ok, so I'm using a conda version later than 4.4, so conda activate should work. However, the job fails with the following:
# To activate this environment, use
#
# $ conda activate my-env
#
# To deactivate an active environment, use
#
# $ conda deactivate
$ conda activate my-env
CommandNotFoundError: Your shell has not been properly configured to use 'conda activate'.
To initialize your shell, run
$ conda init <SHELL_NAME>
I have then tried editing my .gitlab-ci.yml file so that there is a command
conda init bash
but then get the message
==> For changes to take effect, close and re-open your current shell. <==
How can I activate my conda environment in the gitlab CI process?
conda init touches the .bashrc file. To reinitialize the shell you can source it:
- conda create --name myenv
- conda init bash
- source ~/.bashrc # <- !!!
- conda activate myenv
Whether this is better or worse than source activate myenv is a separate discussion, I guess.
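Applied to the job from the question, the full job might look roughly like this (a sketch; the image, environment name, and test path are taken from the question above):
anomalydetector:
  image: continuumio/miniconda:4.7.10
  stage: build
  tags:
    - docker
  script:
    - conda env create -f environment.yml
    - conda init bash
    - source ~/.bashrc
    - conda activate my-env
    - pytest tests/.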
Similarly to Tommy's answer, this needs to be done for Windows PowerShell as well. Contrary to bash, conda activate myenv does not fail in PowerShell; it just has no effect (i.e. the environment is not switched) without calling conda init powershell first, which makes it even more awkward. Reloading the profile in PowerShell is more complicated, since there are six of them [1]. I used:
- conda create --name myenv
- conda init powershell
- "if (test-path $PROFILE.CurrentUserAllHosts) { & $PROFILE.CurrentUserAllHosts}"
- conda activate myenv
Why Conda uses the $PROFILE.CurrentUserAllHosts profile has been asked in an issue [2].
references:
[1] https://devblogs.microsoft.com/scripting/understanding-the-six-powershell-profiles/
[2] https://github.com/conda/conda/issues/8608
Another possibility that you might find more succinct and elegant: directly source the code needed for conda to run with bash. This also has the effect of adding conda to the PATH, if it is not already there.
This is done with
- source <anaconda_root>/etc/profile.d/conda.sh
- conda activate myenv
(Stolen from https://stackoverflow.com/a/60523131/11343)
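In the continuumio/miniconda image used in the question, conda is installed under /opt/conda (an assumption worth verifying for other images), so the concrete lines would be:
- source /opt/conda/etc/profile.d/conda.sh
- conda activate my-env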
Somehow all of these answers failed me. I ended up using conda run instead of activating an environment, which allowed me to run pytest without activating the environment:
- conda run -n <environment-name> python -m pytest <addl-pytest-args>

How to export a conda environment with packages that have been pip installed from github?

I understand I can export conda environments with syntax like so:
conda env export -n my_env -f /somewhere/environment.yml
And import them with:
conda env create -f /somewhere/environment.yml -p /somewhere/else/
However, if there is a package I have installed from my private github, using syntax like so:
(my_env) ~/ $ pip install git+https://github.com/user/my_package.git@master#egg=my_package
Or have this in my requirements.txt, like so:
aiofiles==0.4.0
git+https://github.com/user/my_package.git@master#egg=my_package
chardet==3.0.4
When I do my export, I see this:
name: my_env
channels:
  - defaults
dependencies:
  - ca-certificates=2019.5.15=0
  ...
  - pip:
    - aiofiles==0.4.0
    - my_package # UH OH, NO github INSTRUCTION OR VERSION
    - chardet==3.0.4
This is a problem, because when I try to run:
conda env create -f /somewhere/environment.yml -p /somewhere/else/
I get an error that conda fails to install because it cannot find my_package. And this makes sense, the environment does not tell it to look in github.
How can I ask the conda env export command to be github-pip-installation-aware so that I can faithfully re-create my conda environment? (Or, at least, how can I avoid creating exports that are doomed to fail? That is, this export takes a rather long time; it would be helpful if the export command failed fast rather than spending tens of minutes producing an export that cannot be imported.)
Unlike this similar question, I am not using wheels.
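One common workaround (not from the original post, just a hedged sketch) is to hand-edit the pip section of the exported environment.yml so it carries the git URL again before re-creating the environment:
name: my_env
channels:
  - defaults
dependencies:
  - ca-certificates=2019.5.15=0
  ...
  - pip:
    - aiofiles==0.4.0
    - git+https://github.com/user/my_package.git@master#egg=my_package
    - chardet==3.0.4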

How to transfer Anaconda env installed on one machine to another? [Both with Ubuntu installed]

I have been using Anaconda (4.3.23) on my guest OS, Ubuntu 14.04, which is installed on VMware on my host OS, Windows 8.1. I have set up an environment in Anaconda and have installed many libraries, some of which were very hectic to install (not straightforward pip installs). A few libraries had inner dependencies and had to be built together and from their git sources.
Problem
I am going to use a cloud-based VM (an Azure GPU instance) to use the GPU, but I don't want to get into the hectic installation again, as I don't want to waste money on the time it will take me to install all the packages and libraries again.
Is there any way to transfer/copy my existing env (which has everything already installed) to the cloud VM?
From the very end of this documentation page:
Save packages for future use:
conda list --export > package-list.txt
Reinstall packages from an export file:
conda create -n myenv --file package-list.txt
If conda list --export fails like this ...
Executing conda list --export > package-list.txt creates a file which looks like this:
# This file may be used to create an environment using:
# $ conda create --name <env> --file <this file>
# platform: win-64
_tflow_1100_select=0.0.1=gpu
absl-py=0.5.0=py_0
astor=0.7.1=py_0
...
But creating a new environment by executing conda create -n myenv --file package-list.txt gives me this error:
Solving environment: ...working... failed
PackagesNotFoundError: The following packages are not available from current channels:
- markdown==2.6.11=py_0
...
... then try to use conda env export
According to this discussion, execute the following commands on your source machine:
source activate yourEnvironment
conda env export --no-builds > file.txt
On the target machine execute:
conda env create --file /path/to/file.txt
The file generated by conda env export looks a bit different, but it contains pip packages as well:
name: yourEnvironment
channels:
  - conda-forge
  - defaults
dependencies:
  - absl-py=0.5.0
  ...
  - pip:
    - astroid==2.0.4
    ...
You can try the approach below to move all the packages from one machine to another.
Note: the source and target machines should be the same platform, and the Python version should also be the same.
$ pip install conda-pack
# To package an environment:
## Pack environment my_env into my_env.tar.gz
$ conda pack -n my_env
## Pack environment my_env into out_name.tar.gz
$ conda pack -n my_env -o out_name.tar.gz
## Pack environment located at an explicit path into my_env.tar.gz
$ conda pack -p /explicit/path/to/my_env
After following the above approach, you will end up with a tar.gz file. To install the environment from this archive, follow the steps below.
## To install the environment:
## Unpack environment into directory `my_env`
$ mkdir -p my_env
$ tar -xzf my_env.tar.gz -C my_env
## Use Python without activating or fixing the prefixes. Most Python
## libraries will work fine, but things that require prefix cleanups
## will fail.
$ ./my_env/bin/python
## Activate the environment. This adds `my_env/bin` to your path
$ source my_env/bin/activate
## Run Python from in the environment
(my_env) $ python
## Cleanup prefixes from in the active environment.
## Note that this command can also be run without activating the environment
## as long as some version of Python is already installed on the machine.
(my_env) $ conda-unpack
You can probably get away with copying the whole Anaconda installation to your cloud instance.
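If you try that, a rough sketch (assuming a default ~/anaconda3 install and the same install path on the target VM, since hard-coded prefixes inside the installation may break otherwise) could be:
# On the source machine
tar czf anaconda3.tar.gz -C ~ anaconda3
# Copy to the cloud VM (user@cloud-vm is a placeholder for your Azure instance)
scp anaconda3.tar.gz user@cloud-vm:~
# On the target machine
tar xzf anaconda3.tar.gz -C ~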
According to this GitHub thread (https://github.com/conda/conda/issues/3847), execute the following commands on your source machine:
source activate yourEnvironment
conda env export --no-builds > environment.yml
On the target machine execute:
conda env create -f environment.yml
The file generated by conda env export looks a bit different, but it contains pip packages as well:
name: yourEnvironment
channels:
  - conda-forge
  - defaults
dependencies:
  - absl-py=0.5.0
  ...
  - pip:
    - astroid==2.0.4
    ...
I found the answer from this:
You can export your Anaconda environment using:
conda env export > environment.yml
and recreate it on another machine using:
conda env create -f environment.yml
You can modify the environment.yml as required, because some of the Python libraries may be obsolete or run into version conflicts in future releases.

How can you "clone" a conda environment into the root environment?

I'd like the root environment of conda to copy all of the packages in another environment. How can this be done?
There are options to copy dependency names/urls/versions to files.
Recommendation
Normally it is safer to work from a new environment rather than changing root. However, consider backing up your existing environments before attempting changes. Verify the desired outcome by testing these commands in a demo environment. To back up your root env, for example:
λ conda activate root
λ conda env export > environment_root.yml
λ conda list --explicit > spec_file_root.txt
Options
Option 1 - YAML file
Within the second environment (e.g. myenv), export names+ to a yaml file:
λ activate myenv
λ conda env export > environment.yml
then update the first environment+ (e.g. root) with the yaml file:
λ conda env update --name root --file environment.yml
Option 2 - Cloning an environment
Use the --clone flag to clone environments (see @DevC's post):
λ conda create --name myclone --clone root
This basically creates a direct copy of an environment.
Option 3 - Spec file
Make a spec-file++ to append dependencies from an env (see @Ormetrom):
λ activate myenv
λ conda list --explicit > spec_file.txt
λ conda install --name root --file spec_file.txt
Alternatively, replicate a new environment (recommended):
λ conda create --name myenv2 --file spec_file.txt
See Also
conda env for more details on the env sub-commands.
Anaconda Navigator desktop program for a more graphical experience.
Docs on updated commands. With older conda versions use activate (Windows) and source activate (Linux/Mac OS). Newer versions of conda can use conda activate (this may require some setup with your shell configuration via conda init).
Discussion on keeping conda env
Extras
There appears to be an undocumented conda run option to help execute commands in specific environments.
# New command
λ conda run --name myenv conda list --explicit > spec_file.txt
This command is effective at running commands in environments without the activation/deactivation steps. See the equivalent commands below:
# Equivalent
λ activate myenv
λ conda list --explicit > spec_file.txt
λ deactivate
Note, this is likely an experimental feature, so this may not be appropriate in production until official adoption into the public API.
+Conda docs have changed since the original post; links updated.
++Spec-files only work with environments created on the same OS. Unlike the first two options, spec-files only capture links to conda dependencies; pip dependencies are not included.
To make a copy of your root environment (named base), you can use the following command; it worked for me with Anaconda3-5.0.1:
conda create --name <env_name> --clone base
You can list all the packages installed in a conda environment with the following command:
conda list -n <env_name>
When setting up a new environment where I need the packages from the base environment in the new one (which is often the case), I build an identical conda environment from the prompt by using a spec-file.txt created with:
conda list --explicit > spec-file.txt
The spec file includes the packages of, for example, the base environment.
Then, using the prompt, I install the packages into the new environment:
conda create --name myenv --file spec-file.txt
The packages from base are then available in the new environment.
The whole process is described in the docs:
https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#building-identical-conda-environments
I also ran into the trouble of cloning an environment onto another machine and wanted to provide an answer. The key issue I had was addressing errors when the current environment contains development packages which cannot be obtained directly from conda install or pip install. For these cases I highly recommend conda-pack (see this answer):
pip install conda-pack
or,
conda install conda-pack
Then back up the environment; to use the current environment, just omit the my_env name:
# Pack environment my_env into my_env.tar.gz
$ conda pack -n my_env
# Pack environment my_env into out_name.tar.gz
$ conda pack -n my_env -o out_name.tar.gz
# Pack environment located at an explicit path into my_env.tar.gz
$ conda pack -p /explicit/path/to/my_env
and restore it:
# Unpack environment into directory `my_env`
$ mkdir -p my_env
$ tar -xzf my_env.tar.gz -C my_env
# Use Python without activating or fixing the prefixes. Most Python
# libraries will work fine, but things that require prefix cleanups
# will fail.
$ ./my_env/bin/python
# Activate the environment. This adds `my_env/bin` to your path
$ source my_env/bin/activate
# Run Python from in the environment
(my_env) $ python
# Cleanup prefixes from in the active environment.
# Note that this command can also be run without activating the environment
# as long as some version of Python is already installed on the machine.
(my_env) $ conda-unpack
