How can an anaconda package require two channels to install?

I just tried to install some packages into a fresh environment. I tend to specify channels for each install e.g. conda install -c <channel> <package>, rather than using conda config --add channels <channel name>; conda install <package>. However, I found that certain packages could only be installed when using multiple channels at once. How can this work?
I think I have a fundamental misunderstanding of how packages and channels work. How can a package install require multiple channels? It was my understanding that a particular channel hosted particular packages, e.g. conda-forge hosts x packages and they (and their dependencies) are installable using just conda-forge.
Thanks for any help.

It was my understanding that a particular channel hosted particular packages, e.g. conda-forge hosts x packages and they (and their dependencies) are installable using just conda-forge.
That is not necessarily true. If a lower-level package is required as part of an install but is only hosted on, say, the default channels, it is often easier to just list it as a requirement than to try to get the originator to publish it to multiple channels.
You can always chain together multiple channels in a single conda command as well.
conda install <package> -c defaults -c conda-forge -c <other channel>
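If you end up reusing the same set of channels often, the equivalent persistent setup is a .condarc file (a minimal sketch; the channel names simply mirror the command above, and channels listed first take priority):
# ~/.condarc
channels:
  - defaults
  - conda-forge
  - <other channel>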

Related

Confusion about conda distributions/channels on arm processor

I've been using Anaconda for a few years now, but since I started using a Mac with an M1 processor I have had to deal with a bunch of problems with the installation of some packages, which has left me a little confused about some basic concepts.
For example, I was trying to install Tensorflow, and it turns out the proper way is to install miniforge, and get Tensorflow from the conda-forge channel (which is the default for miniforge), as explained here.
Then, I was wondering whether I could do the same using Anaconda/Miniconda...set up the conda-forge channel as default, and install Tensorflow (or any other arm-compatible package), but I've been told it's not possible.
So, now I'm trying to understand how this all works.
If a Tensorflow version compatible with M1 processors exists in the conda-forge channel (and it does exist), why can't I install it by using Anaconda/Miniconda, after configuring it to use said channel? To phrase it in another way, what is the difference between Anaconda/Miniconda and Miniforge, other than the channels they look into for packages (and, as I understand, some licenses)?
Here there is a similar question, but the answers don't seem to address my main concern (why Anaconda/Miniconda with conda-forge as the default channel is different from Miniforge).
It's not impossible, but you'll have to jump through hoops to get it done.
First, if you have an Anaconda installation, you can't install conda-forge packages into the base environment consistently, because the anaconda package in the base environment of Anaconda will conflict with packages from conda-forge.
Second, since Anaconda is only x86_64 at the moment, you can only install it via Rosetta emulation. After that, you need to tell conda that you need arm64-compatible packages by setting the env variable CONDA_SUBDIR.
CONDA_SUBDIR=osx-arm64 conda create -n native numpy -c conda-forge
will get you a new env with native arm64 packages. However, if you want to update this env, you have to prefix all your conda commands with CONDA_SUBDIR=osx-arm64.
To fix this permanently, you can do the following:
conda activate native
conda config --env --set subdir osx-arm64
which will make conda use osx-arm64 for this environment.
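A quick way to sanity-check that the environment really resolved native builds (not part of the original answer; any compiled package would do as a test):
conda activate native
python -c "import platform; print(platform.machine())"   # prints arm64 for native packages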

Install prebuilt packages from conda-forge (e.g. cartopy) using poetry without relying on conda (using only the channel)

I'm testing poetry and I was wondering if it is possible to install prebuilt packages from conda-forge, such as cartopy, without relying on conda (so keeping a 100% poetry process). I googled this a bit, but the only way I found is to install poetry within a conda venv using pip, install from conda-forge using conda, and then tweak the poetry files to make it aware of the conda venv so that the TOML is written properly.
Packages like cartopy are a pain to install if not from a prebuilt version. If possible, I'd switch my conda stack to a poetry stack if something like poetry add [?conda-forge?] cartopy worked.
Thanks.
Not currently possible. Conda is a generic package manager, not just a Python package manager. Furthermore, there is no dedicated metadata in Conda packages to discriminate whether or not they are Python packages, which I think would be a prerequisite for Poetry to determine whether a Conda package is even valid for installation. Hence, what OP requests cannot be a thing, or at least it would be a major undertaking to make it one.
However, others have requested similar features, so someone hopeful for such functionality could subscribe to notifications on those, or follow the Feature Roadmap.
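For completeness, a sketch of the hybrid workaround the question describes, with Conda supplying the prebuilt binaries and Poetry handling the rest (the environment name and Python version are illustrative; this is not a pure-Poetry process):
conda create -n geo -c conda-forge python=3.11 cartopy
conda activate geo
pip install poetry
poetry config virtualenvs.create false   # have poetry install into the active conda env
poetry install                           # remaining pure-Python deps from pyproject.toml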

Cannot find package on Anaconda navigator after installing it using pip

I followed the instructions here: Can't find package on Anaconda Navigator. What to do next?
I clicked Open terminal from environment on Anaconda navigator, and then used "pip3 install lmfit" in the terminal. But after installing the lmfit package using pip3, I still cannot find it in the conda list. What should I do?
The Problem
At the time of this question, Conda builds of pip had only just started including a pip3 entrypoint,[1] so pip3 is very likely referring to a non-Conda version of Python, and that is where the package was installed. Try checking which pip3 to find out where it went.
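For example, to see which interpreter and pip the shell is actually picking up (on Windows, use where instead of which):
which pip3                 # likely resolves outside the Conda environment
which python
python -m pip --version    # the pip tied to the active environment's Python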
Recommendation
Conda First
Generally, it is preferable to use Conda to install packages in Conda environments, and in this case the package is available via the Conda Forge channel:
conda install -c conda-forge lmfit
Contrary to M. Newville's answer, this recommendation to prefer Conda packages is not about benefiting Conda developers, but instead a rule of thumb to help users avoid creating unstable or unreproducible environments. More info about the risks of mixing pip install and conda install can be found in the post "Using Pip in a Conda Environment".
Nevertheless, it is a fair point that not all packages (and specifically lmfit) are found in the default repository, and that this complicates installation by requiring third-party channels. In fact, because third parties are free to use different build stacks, there are known problems with mixing packages built by Anaconda and those from Conda Forge. However, these issues tend to be rare and limited to compiled code. Additionally, adding trusted channels to a configuration and setting channel priorities heuristically solves the issue.
As for risks in using third-party channels, arbitrary Anaconda Cloud user channels are risky: one should only source packages from channels one trusts (just like anything else one installs). Conda Forge in particular is well reputed, and all feedstocks are freely available on GitHub. Moreover, many Python package builds on Conda Forge are simply wrappers around the PyPI build of the package.
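The "trusted channels plus channel priority" configuration mentioned above might look like this (a sketch of a ~/.condarc; this ordering with strict priority is the usual Conda Forge recommendation):
# ~/.condarc
channel_priority: strict
channels:
  - conda-forge
  - defaults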
PyPI Last
Sometimes it isn't possible to avoid using PyPI. When one must resort to installing from PyPI, it is better practice to use the pip entrypoint from an activated environment, rather than pip3, since only some Conda builds of pip include pip3. For example,
conda activate my_env
pip install lmfit
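If there is any doubt about which pip is first on PATH, invoking pip through the environment's interpreter is unambiguous:
conda activate my_env
python -m pip install lmfit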
Again, following the recommendations in "Using Pip in a Conda Environment", one should operate under the assumption that any subsequent calls to conda (install|upgrade|remove) in the environment could have undefined behavior.
PyPI Only
For the sake of completeness, I will note that a stable way of using Conda that is consistent with the recommendations is to limit Conda to the role of environment creation and use pip for all package installation.
This strategy is perhaps the least burden on the Python-only user, who doesn't want to deal with things like finding the Conda-equivalent package name or searching non-default channels. However, its applicability seems limited to Python-only environments, since other libraries may still need to resort to conda install.
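Under that strategy, Conda provides only the interpreter and pip, and everything else comes from PyPI (a sketch; the environment name and package list are illustrative):
conda create -n py_only python=3.11 pip
conda activate py_only
python -m pip install lmfit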
[1]: Conda Forge and Anaconda started consistently including pip3 entrypoints for the pip module after version 20.2.
Installing a pure Python package such as lmfit with the correct version of pip (pip install lmfit) should be fine.
Conda first is recommended to make the life of the conda maintainers and packagers easier, not the user's life. FWIW, I maintain both kinds of packages,
and there is no reason to recommend conda install lmfit over pip install lmfit.
In fact, lmfit is not in the default anaconda repository, so installing it requires going to a third-party conda channel such as conda-forge. That adds complexity and risk to your conda environment.
Really, pip install lmfit should be fine.

Difference between "conda install" with "-c anaconda" and without it

I am new to python and I am trying to install new packages in Anaconda. I am using anaconda prompt and Windows 10.
Can you please explain what is the difference between conda install with -c anaconda and without it? For example conda install -c anaconda mysqlclient and conda install mysqlclient.
Which is better to use when and why?
conda, as you know, is a package manager that can install packages on your machine. If you do conda install, it needs a place to search for these packages and download them from. For conda, this is solved with the concept of channels, which are, as @David Kabii has pointed out, like repositories that can exist locally, on a network location, or at a URL. By default, conda install will try to download packages from repo.anaconda.com; specifically, on Windows these locations are searched by default:
https://repo.anaconda.com/pkgs/main/win-64
https://repo.anaconda.com/pkgs/main/noarch
https://repo.anaconda.com/pkgs/r/win-64
https://repo.anaconda.com/pkgs/r/noarch
https://repo.anaconda.com/pkgs/msys2/win-64
https://repo.anaconda.com/pkgs/msys2/noarch
More information on the difference can be found in the docs on using default repositories.
Now if you go to www.anaconda.org and search for a package, let's say numpy, you will see that it is available from different channels. You should only worry about those in case a package is not available from the default channels. This you can also check by running conda search <package name> which will list all available versions in the currently configured channels.
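For example (package names here are only illustrative):
conda search numpy                  # searches the currently configured channels
conda search -c bioconda samtools   # additionally searches a third-party channel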
Coming to your question. The -c option specifies an additional channel to search first, which is needed if a package is not available from the default channels. E.g. some bioinformatics tools are only available by specifying -c bioconda. For those packages that are available from the default channels, you should not specify anything, and using -c anaconda will make no difference, as the anaconda channel is only a mirror of the default ones and should not be used (see the channel description):
This channel is used internally for mirroring. You should very much prefer https://repo.anaconda.com, which is conda's default and needs no "-c" setting.
When you use the -c option, you specify the channel from which to get the package. The anaconda channel largely mirrors the default channels, so the results are similar. To use packages built locally, you would use -c local.
Here is a link for more info:
Docs explaining usage of conda install

Does conda env support development dependencies?

Does conda allow you to install a dependency into an environment as a development dependency?
I'm thinking of something like how bower does this with --save-dev
AFAICT, no, it does not. This repo presents some workaround options that might be useful elsewhere:
https://github.com/dazza-codes/conda_container
In short, it supplements a conda install with subsequent pip installs from a requirements.txt and/or a requirements.dev file. Since there can be inconsistencies between conda and pip packages (like different name variants, etc.), there are use cases for having a combination of conda and pip. Also, conda can support a pip array in an environment.yml file, but the version specs for conda vs. pip packages are not compatible. Liberal use of pip check is recommended for any combination of packages from different packaging systems.
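As a sketch of that last point, an environment.yml can delegate development-only dependencies to pip via a requirements.dev file (the project name and versions are illustrative; conda itself still has no --save-dev equivalent):
# environment.yml
name: myproj
dependencies:
  - python=3.11
  - pip
  - pip:
      - -r requirements.dev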
