After I run conda env create -f environment.yml, I receive the following warning:
Warning : you have pip-installed dependencies in your environment file, but you do not list pip itself as one of your conda dependencies...
What does this mean and what should I be doing instead?
When creating an environment, the warning disappears once you include - pip explicitly in the YAML file. Yes, it is a bit awkward, because by listing packages under - pip: you have already declared that your environment uses pip packages.
The YAML file would look like:
# Packages omitted for simplicity
name: myenv
channels:
- anaconda
- conda-forge
- defaults
dependencies:
- python
- scipy
- pip
- pip:
  - datetime
When creating a new environment from scratch, this warning can be avoided by explicitly installing pip, for instance with: conda create -n env_with_pip python=3.7 numpy pip
In your environment YAML file, under the list of packages you install through conda, you must also add pip as a package to be installed. This installs pip itself into the environment, so your pip packages can then be installed using that pip.
Previously, pip was shipped with conda, but now you have to install pip explicitly when using conda.
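A quick way to confirm that the environment's own pip is the one being used (a minimal check, assuming a Unix-like shell and the myenv environment from the YAML above):
conda env create -f environment.yml
conda activate myenv
which pip       # should point inside the myenv environment, not to a system-wide pip
pip --version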
Related
I am completely new to Conda and I have been struggling for a day without success.
I would like to use the pvtrace module. The documentation specifically says to run the following commands to install the package:
conda create --name pvtrace-env python=3.7.8
conda activate pvtrace-env
conda install Rtree
pip install pvtrace
I have the following error:
PackagesNotFoundError: The following packages are not available from current channels:
- python=3.7.8
I have Anaconda 3, whose original Python version is 3.9. I installed version 3.7, but it did not solve my problem.
Not sure why, but the defaults (main/anaconda) channels are missing the Python 3.7.8 build specifically. It is available through Conda Forge, so instead try:
## one should prefer to list all known packages at creation
conda create -n pvtrace-env -c conda-forge python=3.7.8 pip rtree
conda activate pvtrace-env
pip install pvtrace
Alternatively, use a YAML file:
pvtrace-env.yaml
name: pvtrace-env
channels:
- conda-forge
dependencies:
- python=3.7.8
- rtree
- pip
- pip:
  - pvtrace
which is used as:
conda env create -n pvtrace-env -f pvtrace-env.yaml
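Either way, a quick sanity check that the package ended up in the new environment (a minimal, generic check):
conda activate pvtrace-env
python -c "import pvtrace"   # no output means the import succeeded
pip show pvtrace             # prints the installed version and location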
I was trying to install pycryptodome and python-jose-cryptodome using pip within an Anaconda 3 environment.
I got this error:
ERROR: Failed building wheel for pycryptodome
I have tried many versions and many solutions (latest versions, a specific version, with Python 3.8 or 3.7, using a requirements file without cache, and even installing it on its own), but nothing worked for me :(. Any solution?
While using pip in an Anaconda environment is allowed and fine, issues may arise when using pip and conda together, as clearly mentioned in the conda docs.
One of the best practices when installing packages in an Anaconda environment is to search for and install them with conda before resorting to pip.
So instead of directly using pip, try the following:
Search for pycryptodome in the Anaconda packages repo:
conda search pycryptodome
pycryptodome is available in the Anaconda repo.
The next step is to install pycryptodome:
conda install -c anaconda pycryptodome
or, if you want to use the conda-forge channel:
conda install -c conda-forge pycryptodome
This should get pycryptodome installed into your env.
To use a requirements.txt file with conda:
conda install --yes --file requirements.txt
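Note that conda expects its own requirements format: one match spec per line, using a single = for version pins rather than pip's ==. A minimal illustrative requirements.txt (the version pin is only an example) could be:
# conda match specs, one per line; lines starting with # are ignored
pycryptodome=3.9
requests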
Summary: Best Practices Checklist When Using Pip in a Conda Environment
Use pip only after conda
install as many requirements as possible with conda, then use pip
pip should be run with --upgrade-strategy only-if-needed (the default)
Do not use pip with the --user argument; avoid all "user" installs
Use conda environments for isolation
create a conda environment to isolate any changes pip makes
environments take up little space thanks to hard links
care should be taken to avoid running pip in the "root" (base) environment
Recreate the environment if changes are needed
once pip has been used, conda will be unaware of the changes
to install additional conda packages it is best to recreate the environment
Store conda and pip requirements in text files
package requirements can be passed to conda via the --file argument
pip accepts a list of Python packages with -r or --requirement
conda env will export or create environments based on a file with conda and pip requirements (a minimal sketch of this workflow follows below)
You can read more about this topic on the Anaconda website and in the conda docs.
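For instance, the "text files" practice might look like this as a shell session (file names and the environment name are illustrative):
# conda-requirements.txt holds everything available from conda channels,
# pip-requirements.txt holds the PyPI-only leftovers
conda create -n myenv --yes --file conda-requirements.txt
conda activate myenv
pip install --upgrade-strategy only-if-needed -r pip-requirements.txt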
I have been using the following conda & Python version:
conda version : 4.6.14
conda-build version : 3.17.8
python version : 3.7.3.final.0
I installed simpletransformers in the following manner:
conda create -n simpletransformers python pandas tqdm
conda activate simpletransformers
conda install pytorch cpuonly -c pytorch
conda install -c anaconda scipy
conda install -c anaconda scikit-learn
pip install transformers
pip install seqeval
pip install tensorboardx
pip install simpletransformers
After doing so, I've been trying to import the classification model without much luck:
import simpletransformers
I get the following error:
ModuleNotFoundError: No module named 'simpletransformers'
Can someone point out where I'm going wrong? I'm using PyCharm as my IDE.
The setup docs work for me on Mac and Ubuntu using Anaconda:
Install Anaconda or Miniconda
Create a new virtual python 3.7 environment and install pandas and tqdm
conda create -n simplet python=3.7 pandas tqdm
conda activate simplet
PyTorch
3a. GPU (use_cuda=True in your model): conda install pytorch cudatoolkit=10.1 -c pytorch
3b. CPU (use_cuda=False in your model): conda install pytorch cpuonly -c pytorch
If you want to use fp16 training on an NVIDIA GPU, install apex (don't use pip).
Install simpletransformers.
pip install simpletransformers
Download the .whl file from "https://pypi.org/project/simpletransformers/#files"
Open command prompt
type pip install "path/simpletransformers-0.13.2-py3-none-any.whl" and hit enter
Check whether the package gets installed.
Note that simpletransformers requires Python '>=3.6'
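A quick way to verify the install before touching the IDE (a minimal check):
pip show simpletransformers                        # prints version and install location
python -c "import simpletransformers; print('ok')"
If the import works on the command line but not in PyCharm, check that the PyCharm project interpreter points at the conda environment you installed into (simpletransformers or simplet above) rather than the base interpreter.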
Whenever I have a package that is not available via Anaconda Cloud, i.e., I have to install from PyPI or GitHub, then I create a YAML environment definition for it. This follows the best practices enumerated in "Using Pip in a Conda Environment".
The advantage of a YAML is that it allows Conda to solve for everything at once and it lets one treat envs as immutable objects (i.e., if you need to alter the env, edit the YAML and recreate). This helps avoid the mess that inevitably seems to result from running a series of conda install, pip install, or conda update commands.
For me this is a multi-stage process, but it has been a reliable workflow for me:
Workflow for Mixed Conda-PyPI Environments
Look at the setup.py or requirements.txt of the non-Conda package. Here it is for simpletransformers.
For each requirement, check Anaconda Cloud (or conda search) to see if it is available as a Conda package (a short shell sketch after this list shows one way to script this check).
If it is available, add it to the YAML file as a (non-pip) dependency. This ensures that everything that can come from Conda does.
Also, keep track of the channels that these packages come from. Note, I will not use a private channel I am unfamiliar with. In this case pytorch, conda-forge, and defaults (i.e., anaconda) suffice.
Include the packages that are PyPI-only under the pip section of the YAML, including the main package of interest (i.e., simpletransformers). Technically, you don't need to include the other dependencies, since pip will pull them in automatically, but I like to keep them explicit so that if I ever update the YAML I might check again if someone ported the PyPI packages to Conda Forge.
Create the env using the YAML
conda env create -n st_env -f simpletransformers.yaml
Check to see if any additional packages were implicitly pulled in as dependencies from PyPI, but were actually available through Conda. Edit the YAML to put these in the Conda dependencies section. In this case, keras is apparently also needed.
Remove the env and recreate it from the updated, maximally-Conda version of the YAML.
Most important: never change the env except through editing the YAML.
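One way to script step 2 of this workflow (a rough sketch; the package list is illustrative, and it relies on conda search exiting non-zero when nothing is found):
for pkg in transformers seqeval tensorboardx simpletransformers; do
  if conda search -c pytorch -c conda-forge "$pkg" > /dev/null 2>&1; then
    echo "$pkg: available from a conda channel"
  else
    echo "$pkg: PyPI only"
  fi
done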
YAML for SimpleTransformers Environment
simpletransformers.yaml
name: st_env
channels:
- pytorch
- conda-forge
- defaults
dependencies:
- python=3.7
- pandas
- tqdm
- cpuonly
- pytorch
- transformers
- scipy
- scikit-learn
- requests
- tensorboardx
- keras
- pip
- pip:
  - seqeval
  - simpletransformers
Install with
conda env create -n st_env -f simpletransformers.yaml
If you have pip installed in your environment, just run pip install simpletransformers in your terminal, or, if you're using a Jupyter notebook, Colab, etc., paste !pip install simpletransformers into your first cell and run it.
Then import simpletransformers
import simpletransformers
I'm working on a (python) project where the choice was to create a virtual environment using virtualenv. However, one of the project dependencies can't be installed through pip on macOS due to this bug: https://github.com/streamlit/streamlit/issues/283
The workaround is to conda install one of the dependencies to bypass the gcc compiler.
How do you conda install something in a virtual environment not created with conda?
I think the easiest approach would be to create a separate conda env of its own.
1) Create a requirements.txt file by doing pip freeze > requirements.txt inside your virtualenv environment
2) Create conda env: conda create --name myenv
3) Activate your environment: source activate myenv
4) Install your dependencies: conda install --file requirements.txt
5) Install the missing dependency: conda install YOUR_MISSING_DEPENDENCY
In the accepted answer (upvoted), you can also change point 1) to export the conda-installed packages instead (compatible with a subsequent conda install, and excluding pip-installed packages that would be unavailable in conda channels, identified by "pypi" in the extended version names that only conda displays):
conda list --export | grep -v pypi > requirements.txt
And if you still want to use pip, the correct syntax to get a package version list in a format compatible with pip install is now:
pip list --format=freeze > requirements.txt
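For reference, the two formats differ in the version separator, which is why each file only feeds its matching installer (the version and build string below are purely illustrative):
# conda list --export line: name=version=build, for conda install --file
numpy=1.21.2=py37h20f2e39_0
# pip list --format=freeze line: name==version, for pip install -r
numpy==1.21.2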
Is there a way to use pip as a 'fallback' option for some packages in a conda environment, like you can have multiple prioritized conda channels?
For normal conda channels, my environment.yml would be as follows:
name: my_env
channels:
- defaults
- conda-forge
dependencies:
- some-package>=1.2.3
Where some-package would be installed from default channels if possible, or conda-forge otherwise. It would fail if neither channel has the appropriate package version.
The environment.yml with pip:
name: my_env
channels:
- defaults
- conda-forge
dependencies:
- pip
- pip:
  - some-package>=1.2.3
Where some-package would always come from pip.
But what I want is something like this:
name: my_env
channels:
- defaults
- conda-forge
dependencies:
- some-package>=1.2.3
- pip
- pip:
  - some-package>=1.2.3
Where the package would come from defaults first, else conda-forge, or else from pip.
However, this gives a ResolvePackageNotFound error before trying pip. Is there any way to achieve this?
I've read somewhere that conda is supposed to be extended to allow satisfying dependencies with pip-installed packages, but I cannot find the reference with a quick search, and I don't think it's a production-ready feature anyway. In any case, what I remember was not conda installing pip packages, but conda accepting already-present packages that had been installed by pip.
Anaconda packages define their dependencies in terms of other Anaconda packages. Therefore, conda resolves dependencies of Anaconda packages within its own world of packages and metadata. Some packages don't even have the same name in Anaconda channels and on PyPI.
In other words: No, I don't think that what you want is possible. You'll have to call pip when you want something installed by pip.
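If you want to approximate that fallback yourself outside the YAML, a rough manual sketch (with an illustrative package name) is to try conda first and only fall back to pip when the solve fails:
conda create -n my_env python pip
conda activate my_env
conda install -c defaults -c conda-forge "some-package>=1.2.3" \
  || pip install "some-package>=1.2.3"   # pip fallback only if conda cannot provide it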