Conflict resolution in pip vs. conda - python

I assume that pip and conda, despite their differences, are both package managers that check the consistency of the packages installed in an environment. In my case, I have a requirements.txt list of packages to install on top of python=3.6. In my conda virtual environment, I installed them one by one. The strange thing is that when I locate some packages in anaconda.org channels and install them with conda install, conda complains! An example is when I tried to install statistics=1.0.3.5 and got this message in the terminal:
UnsatisfiableError: The following specifications were found to be incompatible with the existing python installation in your environment:

Specifications:
- statistics=1.0.3.5 -> python[version='2.7.*|<3|>=2.7,<2.8.0a0']

Your python: python=3.6
However, when I did it with pip, it worked!
Why is that?
Am I going to bump into a problem down the road with this package?
I read this Stack Overflow post about the difference between pip and conda and tried to understand it from the docs (although not that successfully).

When working with conda virtual environments, installing packages with pip should be the last resort. If a package isn't available through the default channel, try installing from conda-forge first.
The difference between conda and pip is huge (not to mention virtual environments): Conda aims to install a consistent set of packages - which amounts to solving an optimization problem - while pip simply installs a package and its dependencies, regardless of whether they conflict with anything already installed.
However, since you are writing unit tests alongside your code, you'll immediately notice if you run into a problem.
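To see the difference in practice, one can retry the package from the question in a throwaway environment (a sketch; exact solver messages vary by conda version):
conda install statistics=1.0.3.5    # conda's solver rejects this against python=3.6
pip install statistics==1.0.3.5     # pip installs it without re-solving the environment
pip check                           # afterwards, reports the broken requirements pip can detect
pip likely succeeds here because the old PyPI distribution does not declare a Requires-Python constraint, whereas the conda recipe explicitly pins Python below 3.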

Related

How to change pinned packages in mamba package manager

When I try to install a Python package in a Miniconda distribution (mamba 0.14), there is no dependency solution for the installed Python version (shown as a pinned package below).
How can I allow the python package to be downgraded so that a dependency solution exists?
(base) C:\Users\user>mamba install nipy
(...)
Pinned packages:
- python 3.9.4
Encountered problems while solving:
- package nipy-0.4.1-py37hfa6e2cd_1001 requires python >=3.7,<3.8.0a0, but none of the providers can be installed
I have tried the --no-pin parameter with no changes.
Did you try mamba install python=3.7 nipy?
This should downgrade the python package to the required version, allowing you to install nipy and its dependencies.
Changing a Python version in place in a Conda environment has so many downsides (e.g., a complicated solve; almost every package has to be redownloaded; the risk of breaking base) that whenever one requires a different Python version, it is almost always better to create a new environment. Try instead
mamba create -n nipy_env nipy
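Then switch to the new environment before use (activation still goes through conda in older mamba versions such as the one shown):
conda activate nipy_env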

Install prebuilt packages from conda-forge (e.g. cartopy) using poetry without relying on conda (using only the channel)

I'm testing Poetry and I was wondering if it is possible to install prebuilt packages from conda-forge, such as cartopy, without relying on conda (keeping a 100% Poetry process). I googled this a bit, but the only way I found is to install Poetry within a conda venv using pip, install from conda-forge using conda, and then tweak the Poetry files to make them aware of the conda venv so that the TOML is written properly.
Packages like cartopy are a pain to install other than from a prebuilt version. If possible, I'd switch my conda stack to a Poetry stack, if something like poetry add [?conda-forge?] cartopy worked.
Thanks.
Not currently possible. Conda is a generic package manager, not just a Python package manager. Furthermore, there is no dedicated metadata in Conda packages to discriminate whether or not they are Python packages, which I think would be a prerequisite for Poetry to determine whether a Conda package is even valid for installation. Hence, what the OP requests cannot be a thing, or at least it would be a major undertaking to make it one.
However, others have requested similar features, so someone hopeful for such functionality could subscribe to notifications on those, or follow the Feature Roadmap.
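For reference, a sketch of the hybrid workaround the question alludes to (the environment name and versions are placeholders; Poetry must also be told not to create its own virtualenv):
conda create -n geo -c conda-forge python=3.10 cartopy   # prebuilt binaries from conda-forge
conda activate geo
pip install poetry
poetry config virtualenvs.create false                   # reuse the active conda environment
This keeps conda responsible for the hard-to-build binaries while Poetry manages the pure-Python dependencies - it is not the 100% Poetry process the question asks for.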

Cannot find package on Anaconda navigator after installing it using pip

I followed the instructions here: Can't find package on Anaconda Navigator. What to do next?
I clicked Open terminal from environment on Anaconda navigator, and then used "pip3 install lmfit" in the terminal. But after installing the lmfit package using pip3, I still cannot find it in the conda list. What should I do?
The Problem
At the time of this question, Conda builds of pip had only just started including a pip3 entrypoint[1], so pip3 is very likely referring to a non-Conda version of Python, and that is where the package was installed. Try checking which pip3 to find out where it went.
Recommendation
Conda First
Generally, it is preferable to use Conda to install packages in Conda environments, and in this case the package is available via the Conda Forge channel:
conda install -c conda-forge lmfit
Contrary to M. Newville's answer, this recommendation to prefer Conda packages is not about benefiting Conda developers, but instead a rule of thumb to help users avoid creating unstable or unreproducible environments. More info about the risks of mixing pip install and conda install can be found in the post "Using Pip in a Conda Environment".
Nevertheless, it is a fair point that not all packages (and specifically lmfit) are found in the default repository, which complicates installation by requiring third-party channels. In fact, because third parties are free to use different build stacks, there are known problems with mixing packages built by Anaconda and those from Conda Forge. However, these issues tend to be rare and limited to compiled code, and adding trusted channels to one's configuration and setting channel priorities resolves the issue in most cases.
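For example, a common configuration uses the standard conda config commands (adjust to whichever channels you trust):
conda config --add channels conda-forge     # register the channel in .condarc
conda config --set channel_priority strict  # prefer higher-priority channels during solves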
As for the risks of using third-party channels: arbitrary Anaconda Cloud user channels are risky, and one should only source packages from channels one trusts (just like anything else one installs). Conda Forge in particular is well reputed, and all of its feedstocks are freely available on GitHub. Moreover, many Python package builds on Conda Forge are simply wrappers around the PyPI build of the package.
PyPI Last
Sometimes it isn't possible to avoid using PyPI. When one must resort to installing from PyPI, it is better practice to use the pip entrypoint from an activated environment, rather than pip3, since only some Conda builds of pip include pip3. For example,
conda activate my_env
pip install lmfit
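Alternatively, invoking pip as a module is guaranteed to use the active environment's interpreter:
python -m pip install lmfit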
Again, following the recommendations in "Using Pip in a Conda Environment", one should operate under the assumption that any subsequent calls to conda (install|upgrade|remove) in the environment could have undefined behavior.
PyPI Only
For the sake of completeness, I will note that a stable way of using Conda that is consistent with the recommendations is to limit Conda to the role of environment creation and use pip for all package installation.
This strategy is perhaps the least burdensome for the Python-only user, who doesn't want to deal with things like finding the Conda-equivalent package name or searching non-default channels. However, its applicability seems limited to Python-only environments, since non-Python libraries still require conda install.
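A minimal sketch of this strategy (the Python version and requirements file are illustrative):
conda create -n my_env python=3.9    # Conda's only job: create the environment
conda activate my_env
pip install -r requirements.txt      # every package comes from PyPI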
[1]: Conda Forge and Anaconda started consistently including pip3 entrypoints for the pip module after version 20.2.
Installing a pure Python package such as lmfit with the correct version of pip, i.e., pip install lmfit, should be fine.
Conda first is recommended to make life easier for the conda maintainers and packagers, not for the user. FWIW, I maintain both kinds of packages, and there is no reason to recommend conda install lmfit over pip install lmfit.
In fact, lmfit is not in the default Anaconda repository, so installing it requires going to a third-party conda channel such as conda-forge. That adds complexity and risk to your conda environment.
Really, pip install lmfit should be fine.

Installing dependencies from (Conda) environment.yml without Conda?

I currently use Conda to capture my dependencies for a Python project in an environment.yml.
When I build a Docker service from the project, I need to reinstall these dependencies. I would like to get around having to add (mini)conda to my Docker image.
Is it possible to parse environment.yml with pip/pipenv, or to transform it into a corresponding requirements.txt?
(I don't want to leave conda just yet, as this is what MLflow captures when I log models.)
Nope.
conda automatically installs dependencies of conda packages. These are resolved differently by pip, so you'd have to resolve the Anaconda dependency tree in your transformation script.
Many conda packages are non-Python. You couldn't install those dependencies with pip at all.
Some conda packages contain binaries that were compiled with the Anaconda compiler toolchain. Even if the corresponding pip package can compile such binaries on installation, it wouldn't be using the Anaconda toolchain. What you'd get would be fundamentally different from the corresponding conda package.
Some conda packages have fixes applied, which are missing from corresponding pip packages.
I hope this is enough to convince you that your idea won't fly.
Installing Miniconda isn't really a big deal. Just do it :-)
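That said, if one only needs a rough first pass for the pure-Python parts, here is a minimal sketch (assuming PyYAML is installed; it inherits every caveat above and deliberately ignores build strings and non-Python packages):
import re
import yaml  # PyYAML, assumed available

def conda_to_pip(spec):
    # "name=version" (conda's single '=') becomes "name==version"; bare names and
    # specs that already use pip-style operators pass through unchanged. Build
    # strings ("name=version=build") and non-Python packages are NOT handled.
    return re.sub(r"^([A-Za-z0-9_.-]+)=(?=[^=])", r"\1==", spec)

def env_to_requirements(path="environment.yml"):
    """Best-effort translation of a conda environment.yml into pip requirement lines."""
    with open(path) as f:
        env = yaml.safe_load(f)
    reqs = []
    for dep in env.get("dependencies", []):
        if isinstance(dep, str):
            reqs.append(conda_to_pip(dep))
        elif isinstance(dep, dict):
            # entries under the nested "pip:" key are already in pip syntax
            reqs.extend(dep.get("pip", []))
    return reqs

if __name__ == "__main__":
    print("\n".join(env_to_requirements()))
The output still has to be reviewed by hand; it is a starting point, not a replacement for the conda solver.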

Does conda env support development dependencies?

Does conda allow you to install a dependency into an environment as a development dependency?
I'm thinking of something like how bower does this with --save-dev
AFAICT, no, it does not. This repo demonstrates workaround options that might be useful elsewhere:
https://github.com/dazza-codes/conda_container
In short, it supplements a conda install with subsequent pip installs from a requirements.txt and/or a requirements.dev file. Since there can be inconsistencies between conda and pip packages (such as different name variants), there are use cases for combining conda and pip. Also, conda supports a pip array in an environment.yml file, but the version specs for conda and pip packages are not compatible; an illustrative file is sketched below. Liberal use of pip check is recommended for any combination of packages from different packaging systems.
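For illustration, such a combined environment.yml might look like the following (names and versions are placeholders; the nested pip: entries, including -r references, are handed through to pip, though -r path resolution can depend on the conda version):
name: my_env
channels:
  - conda-forge
dependencies:
  - python=3.9
  - numpy
  - pip
  - pip:
    - -r requirements.dev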
