The conda docs at http://conda.pydata.org/docs/using/envs.html explain how to share environments with other people.
However, the docs tell us this is not cross platform:
NOTE: These explicit spec files are not usually cross platform, and therefore have a comment at the top such as # platform: osx-64 showing the platform where they were created. This platform is the one where this spec file is known to work. On other platforms, the packages specified might not be available or dependencies might be missing for some of the key packages already in the spec.
NOTE: Conda does not check architecture or dependencies when installing from an explicit specification file. To ensure the packages work correctly, be sure that the file was created from a working environment and that it is used on the same architecture, operating system and platform, such as linux-64 or osx-64.
Is there a good method to share and recreate a conda environment from one platform (e.g. CentOS) on another platform (e.g. Windows)?
This answer assumes that you would like to make sure the same versions of the packages you generally care about are present on different platforms, and that you don't care about the exact versions of every package in the entire dependency tree. Trying to install the exact same version of all packages in the entire dependency tree has a high likelihood of failure, since some conda packages have different dependencies for osx/win/linux. For example, the recipe for otrobopt will install different packages on Win vs. osx/linux, so the environment list would be different.
Recommendation: manually create an environment.yml file and specify or pin only the dependencies that you care about. Let the conda solver do the rest.
Probably worth noting is that conda-env (the tool that you use to manage conda environments) explicitly recommends that you "Always create your environment.yml file by hand."
Then you would just do conda env create --file environment.yml.
Have a look at the readme for conda-env.
Environment files can be quite simple:
name: basic_analysis
dependencies:
  - numpy
  - pandas
Or more complex, where you pin dependencies and specify anaconda.org channels to install from:
name: stats-web
channels:
  - javascript
dependencies:
  - python=3.4   # or 2.7 if you are feeling nostalgic
  - bokeh=0.9.2
  - numpy=1.9
  - nodejs=0.10
  - flask
  - pip:
    - Flask-Testing
In either case, you can create an environment with conda env create --file environment.yml.
NOTE: You may need to use .* as a version suffix if you're using an older version of conda.
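The hand-written-file workflow above can be sketched end to end as follows. The file name and package picks are illustrative, and the conda command itself is shown in a comment rather than executed:

```shell
# Hand-written, cross-platform environment file (package picks are illustrative):
cat > environment.yml <<'EOF'
name: basic_analysis
dependencies:
  - python=3.8
  - numpy
  - pandas
EOF

# In a real session you would now recreate the environment on any platform with:
#   conda env create --file environment.yml
```

Because only top-level dependencies are listed, the conda solver is free to pick platform-appropriate builds and transitive dependencies on each OS.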
Whilst it is possible to create your environment.yml file by hand, you can ensure that your environment works across platforms by exporting with the conda env export --from-history flag.
This will only include packages that you've explicitly asked for, as opposed to including every package in your environment.
For example, if you create an environment and install packages with conda install python=3.8 numpy, conda will install numerous other dependencies as well as python and numpy.
If you then run conda env export > environment.yml, your environment.yml file will include all the additional dependencies conda automatically installed for you.
On the other hand, running conda env export --from-history will just create environment.yml with python=3.8 and numpy and thus will work across platforms.
Answer adapted from the docs.
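For the conda install python=3.8 numpy example above, a --from-history export produces roughly the following, whereas a plain conda env export would additionally pin every transitive dependency (openssl, ca-certificates, etc.) with platform-specific versions and build strings. This is a sketch; the channel list may differ on your setup:

```yaml
name: myenv
channels:
  - defaults
dependencies:
  - python=3.8
  - numpy
```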
For those interested in a solution to maintain a single environment file that can be used in Linux, macOS, and Windows, please check the conda-devenv tool at https://github.com/ESSS/conda-devenv.
conda env export should be used to export your complete environment to a file named my_env.yml.
This is a working solution for the case where, on OS X, the export contains only the prefix instead of the complete dependency list (including pip packages).
Step 1: deactivate the environment if it is activated; otherwise the yml file will be created with only the prefix.
Step 2: run the command below to export:
conda env export -n my_env > my_env.yml
It will export every required dependency, channel and pip install into a yml file that can be shared with others.
Step 3: run the command below to import:
conda env create -n my_env -f my_env.yml
It will recreate the same environment on the other machine.
An aspect missing from the other answers is that the question asker mentions "spec files", not "environment.yml" files. These are different.
Spec file
A spec file lists the exact package URLs and is used to recreate identical environments (on the same platform).
It looks like this:
# This file may be used to create an environment using:
# $ conda create --name <env> --file <this file>
# platform: osx-64
#EXPLICIT
https://repo.anaconda.com/pkgs/free/osx-64/mkl-11.3.3-0.tar.bz2
https://repo.anaconda.com/pkgs/free/osx-64/numpy-1.11.1-py35_0.tar.bz2
https://repo.anaconda.com/pkgs/free/osx-64/openssl-1.0.2h-1.tar.bz2
https://repo.anaconda.com/pkgs/free/osx-64/pip-8.1.2-py35_0.tar.bz2
https://repo.anaconda.com/pkgs/free/osx-64/python-3.5.2-0.tar.bz2
https://repo.anaconda.com/pkgs/free/osx-64/readline-6.2-2.tar.bz2
https://repo.anaconda.com/pkgs/free/osx-64/setuptools-25.1.6-py35_0.tar.bz2
It can be obtained with conda list --explicit from the conda environment of interest.
To create a new environment with it one would use the conda create command:
conda create --name <env_name> --file <spec file path>
environment.yml
The environment.yml file is described well in this answer.
It can be obtained with the following commands from the conda environment of interest:
conda env export to get all packages in the current environment
conda env export --from-history to get only the packages explicitly installed (i.e. not automatically added dependencies)
This question is quite old and conda has developed in the meantime. Perhaps "spec file" originally meant the same thing as an environment.yml file, but I am adding this answer for completeness.
None of these solutions worked for me. The problem the OP raises is about platform-dependent build suffixes in the dependency list, which make the file impossible to use cross-platform.
It turns out the solution is to export the environment with the additional --no-builds option.
Suppose you want to move your environment from macOS to Debian:
1.) Invoke conda env export --no-builds > env_macos.yml
2.) Invoke cp env_macos.yml env_debian.yml
3.) Move env_debian.yml to your Debian host
4.) conda env create -f env_debian.yml
After step 4 there may still be package-resolution issues for certain packages; just remove those entries from the yml file and run step 4 again. Things will work.
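One extra cleanup often needed alongside --no-builds: the exported file contains a machine-specific prefix: line, which can be stripped before sharing. A sketch, using a made-up sample of what such an export might look like (the real conda commands are shown in comments):

```shell
# Illustrative sample of what "conda env export --no-builds" might write
# (versions and paths are made up); in a real session you would run:
#   conda env export --no-builds > env_macos.yml
cat > env_macos.yml <<'EOF'
name: my_env
channels:
  - defaults
dependencies:
  - python=3.8.11
  - numpy=1.20.3
prefix: /Users/me/miniconda3/envs/my_env
EOF

# The prefix: line is machine-specific, so drop it before sharing:
grep -v '^prefix:' env_macos.yml > env_debian.yml

# On the Debian host you would then run:
#   conda env create -f env_debian.yml
```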
I want to export an environment AND its packages, using conda 4.10.
Reading the conda docs, it suggests exporting environments using conda env export > environment.yml. However (and I am not sure whether this is a problem on my end), the resulting file contains no package information:
name: guest
channels:
- defaults
prefix: C:\Anaconda3\envs\guest
After some googling, I learnt to export packages using conda list --export > requirements.txt. This time, there is no environment information.
# This file may be used to create an environment using:
# $ conda create --name <env> --file <this file>
# platform: win-64
ca-certificates=2021.5.30=h5b45459_0
certifi=2021.5.30=py38haa244fe_0
...
How do I export both into one file and use it? Or should I just export two files, create an environment first, and install the packages?
On a side note, how do I make my packages match requirements.txt, that is, remove extra packages, install missing ones, and update/downgrade to the specified versions? Is there a command for this, or should I delete the whole environment and start from scratch?
I am trying to update the package flopy, within a virtual environment called flopyenv using the Anaconda Prompt command line. First, I activate the virtual environment using conda activate flopyenv. Then to update flopy, I've tried conda update flopy. I get the following error:
PackageNotInstalledError: Package is not installed in prefix.
prefix: C:\Users\person\Anaconda3\envs\flopyenv
package name: flopy
which makes sense since the flopy directory was installed in a different directory (C:\Users\person\Anaconda3\envs\flopyenv\lib\site-packages\flopy). Also, I have checked using conda list and flopy is listed in the environment. How do I point conda update to the proper directory to update flopy within the virtual environment?
Edit: Per merv's comment I've included the output below.
(flopyenv) C:\Users\person>conda list -n flopyenv flopy
# packages in environment at C:\Users\person\Anaconda3\envs\flopyenv:
#
# Name Version Build Channel
flopy 3.3.1 pypi_0 pypi
Looks like I used pip to install flopy, not conda, which I guess is why the directories weren't lining up when I tried updating using conda. I was able to successfully update flopy using pip.
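A quick way to spot this situation: in conda list output, pip-installed packages show pypi in the Channel column. The sketch below greps a saved listing for them; the sample listing is made up, and the real commands appear in comments:

```shell
# Illustrative snippet of "conda list" output (versions/builds are made up);
# in a real session you would run:  conda list > pkglist.txt
cat > pkglist.txt <<'EOF'
# Name                    Version                   Build  Channel
flopy                     3.3.1                    pypi_0    pypi
numpy                     1.20.3                   py38_0
EOF

# Packages whose Channel column reads "pypi" were installed by pip,
# so conda update will not manage them:
grep 'pypi$' pkglist.txt

# Those are updated with pip instead, e.g.:
#   pip install --upgrade flopy
```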
Seems like OP figured it out, but it may be worth mentioning that in addition to using pip to update, it might also work to enable the pip_interop_enabled configuration option. I would only do this on a per-environment basis:
conda activate flopyenv
conda config --env --set pip_interop_enabled true
conda update flopy
However, this is still (as of Conda v4.9) considered an experimental feature, AFAIK.
I'm developing a Python library, which depends on multiple packages. I'm struggling to find the most straightforward way of managing all those dependencies with the following constraints:
Some of those dependencies are only available as conda packages (technically the source is available but the build process is not something I want to get into)
Other dependencies are only available via pip
I need to install my own library in editable or developer mode
I regularly need to keep the dependencies up-to-date
My current setup for the initial install:
Create a new conda environment
Install the conda-only dependencies with conda install ...
Install my library with pip install -e .
At this point, some packages were installed and are now managed by conda, others by pip. When I want to update my environment, I need to:
Update the conda part of the environment with conda update --all
Update the pip part of the environment by hand
My problem is that this is unstable: when I update all conda packages, conda ensures the consistency of the packages it manages. However, I can't guarantee that the environment as a whole stays consistent, and I just realized that I was missing some updates because I forgot to check for updates in the pip part of the environment.
What's the best way to do this? I've thought of:
Using conda's pip interoperability feature: this seems to work, but I've had some dubious results, probably because of my use of extras_require.
Since pip can see the conda packages, the initial install is consistent, which means I can simply reinstall everything when I want to update. This works but is not exactly elegant.
The recommendation in the official documentation for managing a Conda environment that also requires PyPI-sourced or pip-installed local packages is to define all dependencies (both Conda and Pip) in a YAML file. Something like:
env.yaml
name: my_env
channels:
  - defaults
dependencies:
  - python=3.8
  - numpy
  - pip
  - pip:
    - some_pypi_only_pkg
    - -e path/to/a/local/pkg
The workflow for updating in such an environment is to update the YAML file (which I would recommend keeping under version control) and then either create a new environment or use
conda env update -f env.yaml
Personally, I would tend to create new envs rather than mutate (update) an existing one, and use minimal constraints (i.e., >=version) in the YAML. When creating a new env, it should automatically pull the latest consistent packages. Plus, one can keep previous instances of the env around in case a regression is needed during the development lifecycle.
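That update workflow can be sketched as follows, reusing the env.yaml shape from above (package names are placeholders). The --prune flag, which also removes packages no longer listed in the file, is a useful companion; the conda command is shown in a comment rather than executed:

```shell
# Single version-controlled file holding both conda and pip dependencies
# (names are placeholders):
cat > env.yaml <<'EOF'
name: my_env
channels:
  - defaults
dependencies:
  - python>=3.8
  - numpy
  - pip
  - pip:
    - some_pypi_only_pkg
    - -e path/to/a/local/pkg
EOF

# To bring an existing environment back in sync with the file you would run:
#   conda env update -f env.yaml --prune
# (--prune also removes packages that are no longer listed in the file)
```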
I would like to use Python for scientific applications and after some research decided that I will use Anaconda as it comes bundled with loads of packages and add new modules using conda install through the cmd is easy.
I prefer to use the 64-bit version for better RAM use and efficiency, but the 32-bit version is needed as well because some libraries are 32-bit only. Similarly, I prefer to use Python 3.5 as that is the future and the way things are going, but loads of libraries are still 2.7-only, which means I need both.
So I would have to install 4 versions of Anaconda (64bit 2.7, 64bit 3.5, 32bit 2.7, 32bit 3.5), each about 380MB. I am aiming to use Jupyter notebook and Spyder as the IDE. I had to switch between versions when required, and I ran into conflicting libraries, path issues and all sorts of weird problems.
So I am planning to do a clean install from scratch. I would like to know if there is a more sensible way to handle this. I use Windows 7 64-bit, if that matters.
Make sure to set the right environment variables (https://github.com/conda/conda/issues/1744)
Create a new environment for 32bit Python 2.7:
set CONDA_FORCE_32BIT=1
conda create -n py27_32 python=2.7
Activate it:
set CONDA_FORCE_32BIT=1
activate py27_32
Deactivate it:
deactivate py27_32
Create one for 64 bit Python 3.5:
set CONDA_FORCE_32BIT=
conda create -n py35_64 python=3.5
Activate it:
set CONDA_FORCE_32BIT=
activate py35_64
The best would be to write the activation commands in a batch file so that you have to type only one command and cannot forget to set the right 32/64 bit flag.
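For example, the activation commands for the 32-bit environment could live in a small batch file like the sketch below (the file name is illustrative, and this is untested):

```bat
@echo off
rem activate_py27_32.bat -- activate the 32-bit Python 2.7 environment
set CONDA_FORCE_32BIT=1
call activate py27_32
```

A matching file for the 64-bit environment would clear the flag (set CONDA_FORCE_32BIT=) before calling activate.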
UPDATE
You don't need to install a full Anaconda distribution for this. Miniconda is enough:
These Miniconda installers contain the conda package manager and Python. Once Miniconda is installed, you can use the conda command to install any other packages and create environments, etc. ...
There are two variants of the installer: Miniconda is Python 2 based and Miniconda3 is Python 3 based. Note that the choice of which Miniconda is installed only affects the root environment. Regardless of which version of Miniconda you install, you can still install both Python 2.x and Python 3.x environments.
I would recommend you to use Miniconda3 64-bit as your root environment.
You can always install a full Anaconda later with:
conda install anaconda
Note that it might downgrade some of your previously installed packages in your active environment.
Setting the Subdirectory Constraint
Conda has a configuration variable subdir that can be used to constrain package searching to platforms (e.g., win-32). I think the most reliable procedure is to create the empty environment, set its subdir, then proceed with the (constrained) installations. For example,
win-32, Python 2.7
conda create -n py27_32
conda activate py27_32
conda config --env --set subdir win-32
conda install python=2.7
win-64, Python 3.7
conda create -n py37_64
conda activate py37_64
conda config --env --set subdir win-64
conda install python=3.7
Alternatively, if you need to, for example, create an environment from a YAML file, but want a win-32 platform, one can use the CONDA_SUBDIR environment variable:
set CONDA_SUBDIR=win-32
conda env create -f env.yaml -n my_env_32
set CONDA_SUBDIR=
conda activate my_env_32
conda config --env --set subdir win-32
The nice thing about this procedure is the variable will now always be set whenever activating the environment, so future changes to the environment will remain within the specified subdirectory.
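The snippets above use Windows cmd syntax (set VAR=value). On macOS/Linux the same CONDA_SUBDIR trick can be applied per-command in a POSIX shell, which keeps the variable from leaking into the rest of the session. A sketch (the env name is illustrative, and the conda command is shown in a comment):

```shell
# Scope CONDA_SUBDIR to a single command instead of the whole session.
# In a real session the command would be, e.g.:
#   CONDA_SUBDIR=win-32 conda env create -f env.yaml -n my_env_32

# Demonstration that the variable only exists for that one command:
scoped=$(CONDA_SUBDIR=win-32 sh -c 'echo "$CONDA_SUBDIR"')
echo "inside the command: $scoped"
echo "after the command:  ${CONDA_SUBDIR:-<unset>}"
```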
Ad Hoc Constraints
It is also possible to specify the platform in the --channel|-c argument:
conda install -c defaults/win-32 --override-channels python=3.7
Here --override-channels is required to ensure that only the provided channel(s) and subdirectory (win-32) are used.
However, setting the subdir on the whole env is likely a more reliable practice.
YAML Constraints
It is also possible to use subdir specifications in a YAML environment definition. However, this is less reliable (see below and comments). For example,
py37_win32.yaml
name: py37_win32
channels:
  - defaults/win-32
dependencies:
  - python=3.7
@Bicudo has tried this and confirms it works, but notes that it does not set any environment-specific constraints on future updates to the environment. Additionally, @Geeocode pointed out that the default subdir can still leak in, which is likely due to conda env create still having access to the global channels configuration during solving (there is no equivalent --override-channels flag for conda env create). Hence, it would be good practice to still set the subdir before and after environment creation, e.g.,
set CONDA_SUBDIR=win-32
conda env create -f py37_win32.yaml
set CONDA_SUBDIR=
conda activate py37_win32
conda config --env --set subdir win-32
Alternatively, beginning with Conda v4.9, one can also specify environment variables as part of the YAML. That is, one can effectively define an environment's CONDA_SUBDIR value at environment creation:
py37_win32.yaml
name: py37_win32
channels:
  - defaults/win-32
dependencies:
  - python=3.7
variables:
  CONDA_SUBDIR: win-32
I just wanted to add to Mike Muller's answer, as I also wanted my IPython to switch between 32-bit and 64-bit.
After setting up the 32-bit or 64-bit environment, use the following command
pip install ipykernel
to install ipykernel on this env. Then assign it with:
python -m ipykernel install --user --name myenv --display-name "Python (myenv)"
here myenv is the name of your new environment. See http://ipython.readthedocs.io/en/stable/install/kernel_install.html for further details on switching kernels.
(starting from a 64-bit conda install with a 64-bit Python environment activated)
set CONDA_SUBDIR=win-32
conda install python
you will see
The following packages will be SUPERSEDED by a higher-priority channel:

  ca-certificates  anaconda/pkgs/main/win-64::ca-certifi~ --> anaconda/pkgs/main/win-32::ca-certificates-2021.7.5-h9f7ea03_1
  openssl          anaconda/pkgs/main/win-64::openssl-1.~ --> anaconda/pkgs/main/win-32::openssl-1.1.1k-hc431981_0
  python           anaconda/pkgs/main/win-64::python-3.9~ --> anaconda/pkgs/main/win-32::python-3.9.5-h53c7b84_3

Proceed ([y]/n)?
Just type "y".
This setting is saved in the file \anaconda\envs\<your env>\.condarc
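For reference, after conda config --env --set subdir win-32, that per-environment .condarc is just a small YAML file, roughly:

```yaml
subdir: win-32
```

Deleting this file (or the line) restores the environment to the installation's default subdirectory.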