In a conda environment, how to upgrade a pip-installed package? - python

I use AWS SageMaker for ML software development. In SageMaker, there are several conda environments to choose from. I need to upgrade some packages in a conda environment that are pip installed. From my research, pip and conda are not compatible. So what is the best way to upgrade these pip-installed packages?
As an example, the image below shows the conda_tensorflow_p36 environment, where the keras package is pip-installed. I want to upgrade the keras package to the current version. How do I do that?

You need to specify the name of the conda environment to use when upgrading, so change your conda upgrade keras command to:
conda upgrade -n conda_tensorflow_p36 keras
EDIT: Alternatively, the Install External Libraries and Kernels documentation page for SageMaker gives an example script that downloads/installs an entirely new version/instance of miniconda from the notebook. Then any packages you need (including keras) can be installed into that miniconda instance, independent of the versions provided by SageMaker.
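If the package really was installed with pip rather than conda, another common approach (a sketch, not SageMaker-specific advice; the environment name comes from the question above) is to run pip from inside the activated environment so it upgrades that environment's copy of the package:

```shell
# Sketch: upgrade the pip-installed keras inside the SageMaker conda environment.
# "conda_tensorflow_p36" is the environment name from the question.
source activate conda_tensorflow_p36   # or: conda activate conda_tensorflow_p36
pip install --upgrade keras
```

Because pip runs inside the activated environment, it touches that environment's site-packages rather than the base environment's.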

Related

ERROR: Could not find a version that satisfies the requirement and remote connection warning

I am trying to install some packages in a conda environment and I keep getting the following error. At first I thought it had something to do with Jupyter Notebook, as I had installed it recently in my base environment. So I uninstalled Jupyter, but I still can't install the packages I want. Any ideas on what the problem is would be very helpful.
If I try to install keras tuner, for example, with conda, I get:
The keras-tuner package (not keras_tuner) is only available through Conda Forge. Installing Conda Forge packages into an Anaconda base environment is not recommended. Instead, create a new environment and specify exactly the packages you would like to have available. E.g.,
conda create -n foo -c conda-forge python=3.10 tensorflow=2.10 keras=2.10 keras-tuner
Also include the ipykernel package if you intend to use this as a Jupyter notebook kernel.
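Assuming the new environment is meant to be used from Jupyter, a typical follow-up (the environment name foo comes from the example command above) is to register it as a notebook kernel:

```shell
# Register the "foo" environment (created above with ipykernel included)
# as a Jupyter kernel. The display name is just a label shown in the UI.
conda activate foo
python -m ipykernel install --user --name foo --display-name "Python (foo)"
```

After this, "Python (foo)" appears in the Jupyter kernel picker, and notebooks started with it use the packages installed in the foo environment.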

Can poetry and conda consider the dependencies of each other?

In a Python project, I'm using conda and poetry together. conda is used to create the environment and to install poetry and python. The project dependencies are mainly managed using poetry.
However, there are some packages which I would like to install using conda as it is more convenient. Such examples include spyder, ipykernel and packages using CUDA.
The problem is that these conda-installed packages of course pull in dependencies of their own. If I then install other packages using poetry, those dependencies might get replaced by other versions that are incompatible with the packages installed by conda.
Is there any way to have common dependencies which are used by both conda and poetry?
Listing all dependencies of all children recursively is not an option!

Unable to access pip installed packages from anaconda jupyter notebooks

When I try to import tensorflow from Jupyter notebooks, I'm facing an error: No module named 'tensorflow'.
But I have installed tensorflow using the pip command, and it is available at this path: c:\program files\python38\lib\site-packages.
Please tell me how to access packages installed via pip from Jupyter notebooks?
When you installed tensorflow, you had a specific environment active, and that is where tensorflow was installed. If you are using Anaconda and did not specify which environment to make active, it was installed in the base environment. If you want to install tensorflow into a specific environment (let's call it tf), then start the Anaconda prompt and enter conda activate tf. Then install tensorflow with pip in the same window.

My recommendation is to install tensorflow with conda rather than pip. conda installs tensorflow and also installs the CUDA toolkit and the proper version of cuDNN; pip does not do that. If you install tensorflow with conda, I believe it installs version 2.1, CUDA toolkit version 10.1.243, and cuDNN version 7.6.5.
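The steps described above can be sketched as commands (the environment name tf is the example from the answer; the Python version is an assumption, pick whichever your project needs):

```shell
# Create a fresh environment and make it active.
conda create -n tf python=3.8
conda activate tf

# Recommended route: conda pulls in the CUDA toolkit and cuDNN as well.
conda install tensorflow

# Alternative route, run inside the same activated environment:
# pip install tensorflow
```

Either way, the key point is that the install happens while tf is the active environment, so Jupyter kernels pointed at tf will see the package.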

Installation in a new environment: conda install or pip install?

I recently installed tensorflow 2.0 by creating a new environment called "tensorflow". I usually activate that environment before doing any work (e.g., calling jupyter notebook) by typing conda activate tensorflow in the Anaconda prompt.
Sorry, I am a beginner in this. It happens that I need to install a new package there called tensorflow_datasets. I know that I can do pip install tensorflow_datasets, but I am concerned that using pip under this tensorflow conda environment may mess up my tensorflow installation like it did in the past. Can you please advise me on how I should do the installation? Would you recommend conda install tensorflow_datasets?
More generally, once we create a new environment how can we install new packages there?
Many Thanks

Importing custom built pip installed tensorflow to conda environment

I'm a newbie to Python and tensorflow. After installing a custom wheel version of tensorflow 1.5.0 using pip install, I created a conda environment using PyCharm. On activating the environment, I observed that tensorflow isn't one of its packages.
I am just wondering if there is any way I could import the pip-installed tensorflow package into the environment without building it with conda and reinstalling it in the environment. Thanks
Edit: I discovered that you have to install it in the new environment before you can use it.
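That discovery can be expressed as commands (a sketch only: the environment name and wheel filename below are placeholders, since the question does not give them):

```shell
# Activate the conda environment created in PyCharm ("my_env" is a placeholder).
conda activate my_env
# Install the custom-built wheel into this environment
# (the filename is a placeholder for the actual wheel file).
pip install tensorflow-1.5.0-custom.whl
```

Each conda environment has its own site-packages, so a wheel installed outside the environment is invisible to it until it is installed again inside.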
