pip uninstall - what is getting deleted? - python

I have a pipeline using biopython. I installed a different piece of software that also uses biopython, and it overwrote my existing biopython installation with the latest version. Some parts of my pipeline no longer work, I suspect because they use deprecated elements of the earlier biopython version that are no longer usable.
I would therefore like to uninstall biopython and reinstall an earlier version.
When I do pip uninstall biopython, this comes up in my command prompt:
C:\Users\u03132tk>pip uninstall biopython
Uninstalling biopython-1.79:
Would remove:
c:\anaconda3\lib\site-packages\bio\*
c:\anaconda3\lib\site-packages\biopython-1.79.dist-info\*
c:\anaconda3\lib\site-packages\biosql\*
Would not remove (might be manually added):
c:\anaconda3\lib\site-packages\bio\profiler.py
c:\anaconda3\lib\site-packages\bio\~aps\__init__.py
.....many more paths
As I understand it, * is a wildcard. Wouldn't removing c:\anaconda3\lib\site-packages\bio\* (the first path under Would remove) also remove all of the paths under Would not remove that are prefixed with c:\anaconda3\lib\site-packages\bio?
I'm just trying to clarify what it's doing.
Cheers for reading!

c:\anaconda3\lib\site-packages\bio\profiler.py
c:\anaconda3\lib\site-packages\bio\~aps\__init__.py
These files are not part of a standard biopython installation. ~aps looks odd. Possibly a previous installation was killed halfway through and your biopython installation is now broken?
That being said, you are right that c:\anaconda3\lib\site-packages\bio\* would include c:\anaconda3\lib\site-packages\bio\profiler.py. The output is a little confusing here. All files listed under Would not remove (might be manually added): will be kept. Note however that for a clean biopython installation there would be no such entries:
(py39) C:\>pip install biopython
Collecting biopython
Using cached biopython-1.79-cp39-cp39-win_amd64.whl (2.3 MB)
Collecting numpy
Using cached numpy-1.23.2-cp39-cp39-win_amd64.whl (14.7 MB)
Installing collected packages: numpy, biopython
Successfully installed biopython-1.79 numpy-1.23.2
(py39) C:\>pip uninstall biopython
Found existing installation: biopython 1.79
Uninstalling biopython-1.79:
Would remove:
c:\miniconda3\envs\py39\lib\site-packages\bio\*
c:\miniconda3\envs\py39\lib\site-packages\biopython-1.79.dist-info\*
c:\miniconda3\envs\py39\lib\site-packages\biosql\*
Proceed (Y/n)? y
Successfully uninstalled biopython-1.79
(py39) C:\>
So I would suggest that after uninstalling, you simply delete any remaining files in the Bio folder.
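A sketch of that cleanup, using the paths from your output (before deleting anything, make sure the leftover Bio folder really only contains files you no longer need; pip show -f lists the files pip attributes to the package while it is still installed):
C:\Users\u03132tk>pip show -f biopython
C:\Users\u03132tk>pip uninstall biopython
C:\Users\u03132tk>rmdir /s /q c:\anaconda3\lib\site-packages\Bio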
Proof that they are kept
One can reproduce the message you get by just adding a custom file to the Bio folder after installation:
(py39) C:\>pip install biopython
Collecting biopython
Using cached biopython-1.79-cp39-cp39-win_amd64.whl (2.3 MB)
Requirement already satisfied: numpy in c:\miniconda3\envs\py39\lib\site-packages (from biopython) (1.23.2)
Installing collected packages: biopython
Successfully installed biopython-1.79
(py39) C:\>type nul >> C:\miniconda3\envs\py39\lib\site-packages\Bio\profile.py
(py39) C:\>pip uninstall biopython
Found existing installation: biopython 1.79
Uninstalling biopython-1.79:
Would remove:
c:\miniconda3\envs\py39\lib\site-packages\bio\*
c:\miniconda3\envs\py39\lib\site-packages\biopython-1.79.dist-info\*
c:\miniconda3\envs\py39\lib\site-packages\biosql\*
Would not remove (might be manually added):
c:\miniconda3\envs\py39\lib\site-packages\bio\profile.py
Proceed (Y/n)? y
Successfully uninstalled biopython-1.79
(py39) C:\>dir c:\miniconda3\envs\py39\lib\site-packages\bio
Volume in drive C has no label.
Volume Serial Number is F060-053A
Directory of c:\miniconda3\envs\py39\lib\site-packages\bio
18/08/2022 15:16 <DIR> .
18/08/2022 15:16 <DIR> ..
18/08/2022 15:16 0 profile.py
1 File(s) 0 bytes
2 Dir(s) 35.953.205.248 bytes free
(py39) C:\>
Side Note
biopython is part of the Anaconda distribution, so running pip install in the base env will have overwritten a conda-installed package with a pip-installed one, which is bad practice and can cause inconsistencies in your environment. I would suggest modifying the Anaconda base env as little as possible. Instead, you could create an environment that holds all the packages your pipeline needs to run stably. You can even have a .yml file specifying the required package versions so that you can easily recreate it. This has the advantage that if your experiments lead you to a broken state, you can just recreate the environment (which is a lot more painful for the Anaconda base env with its >600 packages).
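For illustration, a minimal environment.yml along those lines could look like this (the package names and versions are placeholders; pin whatever your pipeline actually needs, e.g. the older biopython release that still works for you):
name: pipeline
channels:
  - conda-forge
dependencies:
  - python=3.9
  - biopython=1.78
  - numpy
You can then create it with conda env create -f environment.yml, activate it with conda activate pipeline, and if it ever ends up in a broken state, remove it with conda env remove -n pipeline and recreate it from the file.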

Related

Pip wheel collects 2 versions of a package then pip install gets a conflict

We use a pipeline that first uses pip wheel to collect all the packages needed by the project and then builds a Docker image that calls pip install on the collected wheels.
The issue I am encountering is that when calling pip wheel, pip collects two different versions of a package. This started occurring once a new version of the package became available.
The project has a requirement for an internal library ecs-deployer==10.1.2 and that library has in turn a requirement in the form of: elb-listener>=3.2.1+25,<4
The relevant output of pip wheel with the verbose option says:
Collecting elb-listener>=3.2.1+25,<4
Created temporary directory: /tmp/pip-unpack-zr930807
File was already downloaded /home/user/path/dist/elb_listener-3.2.2+26-py3-none-any.whl
Added elb-listener>=3.2.1+25,<4 from https://internal-repository.com/path/elb_listener/3.2.2%2B26/elb_listener-3.2.2%2B26-py3-none-any.whl#md5=foo (from ecs-deployer==10.1.2->service==1.0.0) to build tracker '/tmp/pip-req-tracker-1tz9t5ls'
Removed elb-listener>=3.2.1+25,<4 from https://internal-repository.com/path/elb_listener/3.2.2%2B26/elb_listener-3.2.2%2B26-py3-none-any.whl#md5=blabla (from ecs-deployer==10.1.2->service==1.0.0) to build tracker '/tmp/pip-req-tracker-1tz9t5ls'
And also:
Collecting elb-listener>=3.2.1+25,<4
Created temporary directory: /tmp/pip-unpack-yfnxim_u
File was already downloaded /home/user/path/dist/elb_listener-3.2.3+27-py3-none-any.whl
Added elb-listener>=3.2.1+25,<4 from https://internal-repository.com/path/elb_listener/3.2.3%2B27/elb_listener-3.2.3%2B27-py3-none-any.whl#md5=bar (from ecs-deployer==10.1.2->service==1.0.0) to build tracker '/tmp/pip-req-tracker-1tz9t5ls'
Then when the pip install is called I get this:
ERROR: Cannot install elb-listener 3.2.2+26 (from /opt/elb_listener-3.2.2+26-py3-none-any.whl) and elb-listener 3.2.3+27 (from /opt/elb_listener-3.2.3+27-py3-none-any.whl) because these package versions have conflicting dependencies.
The conflict is caused by:
The user requested elb-listener 3.2.2+26 (from /opt/elb_listener-3.2.2+26-py3-none-any.whl)
The user requested elb-listener 3.2.3+27 (from /opt/elb_listener-3.2.3+27-py3-none-any.whl)
We use pip 20.2.3 with the option --use-feature=2020-resolver
Is it normal that pip wheel collects several versions of the same package?
If so, can I somehow tell either pip wheel to collect only one of the versions, or pip install to use only the latest version?
If not, is there any other way to solve this problem? I guess changing the requirement to elb-listener>=3.2.1+27,<4 would solve it, but we don't have direct access to that library and it would take a while for the other team to change it.
As per sinoroc's comment, upgrading Python to 3.10 and pip to 21.2.4 solved this particular issue.
As far as I understood, "local version identifiers" such as 3.2.1+25 are far from usual; apparently they are not meant to be used anywhere public (like PyPI), and that might be the reason for all the trouble here. I am really not sure how well they are supported by Python packaging tools, and maybe they confuse the dependency resolution.
Local version identifiers SHOULD NOT be used when publishing upstream projects to a public index server, but MAY be used to identify private builds created directly from the project source. Local version identifiers SHOULD be used by downstream projects when releasing a version that is API compatible with the version of the upstream project identified by the public version identifier, but contains additional changes (such as bug fixes). As the Python Package Index is intended solely for indexing and hosting upstream projects, it MUST NOT allow the use of local version identifiers.
-- "Local version identifiers" section of PEP 440
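If you need a stopgap until the upstream requirement is updated, one thing to try is a constraints file that pins the transitive dependency to a single version for both steps. This is only a sketch (it assumes your direct requirements live in a requirements.txt, pins the newer of the two versions from the logs above, and I have not verified how gracefully pip 20.2.3 handles local version identifiers in constraints):
constraints.txt:
elb-listener==3.2.3+27
pip wheel -r requirements.txt -c constraints.txt -w dist
pip install --no-index --find-links dist -c constraints.txt -r requirements.txt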

How to get rid of non-existing module visible in modules list

I'm working with the Homebrew Python 3.7 distribution on macOS High Sierra. While installing several modules at once via pip install [modules...] I came across several errors that I dealt with at the time.
However, when I now list the installed modules via pip list, the following entries appear at the beginning of the list:
Package Version
--------------------------------- -----------
- r
-BB 0.1
-br 5.4.1
-ock 2.0.0
-ocutils 0.15.1
-r 5.4.1
-stropy 3.2.1
absl-py 0.7.1
... remaining normal modules ...
Those 'modules' seem to be fragments of the names of modules that actually exist. With the usual uninstall procedure, pip uninstall -BB or
pip uninstall '-BB', I cannot remove any of those 'modules'. How do I get rid of them?
It sounds like those packages are dependencies of other packages you have installed.
You can check the dependency tree with pipdeptree.
Look out for the packages you mentioned and see whether they are dependencies of other packages, as sketched below.
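A quick sketch of that check with pipdeptree (using its -r/--reverse and -p/--packages options; docutils below is just a guess at the real package behind the mangled -ocutils entry, so substitute the names you actually care about):
pip install pipdeptree
pipdeptree
pipdeptree -r -p docutils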

Python pip install cannot find module

So I am trying to install pylint using pip. As my work machine is offline, I have downloaded pylint using pip and transferred it using a CD. As part of the pylint download it also brought down astroid, colorama, isort, lazy-object-proxy, mccabe, six, typed-ast and wrapt.
However when running the install for pylint using the following command inside the directory with all the above files in:
python -m pip install --no-index --find-links . -r requirements.txt
This starts to work, with it collecting pylint, isort and a couple of others; however, after collecting astroid it goes to collect lazy-object-proxy (which is in the directory) and gives the following error:
Could not find a version that satisfies the requirement lazy-object-proxy (from astroid<3,>=2.2.0->pylint->-r requirements.txt (line 1)) (from versions: ) No matching distribution found for lazy-object-proxy (from astroid<3,>=2.2.0->pylint->-r requirements.txt (line 1))
The version of lazy-object-proxy downloaded is 1.4.1.
I'm fairly new to this, so maybe there is something in this error that highlights why it doesn't see or accept the version that is downloaded and sitting in the directory; any help would be much appreciated.
The OS is Windows 7, running Python 3.6.0.
NOTE: Even trying to install lazy-object-proxy on its own fails, saying it doesn't exist; it's like it's not there, although it is in the folder.
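For what it's worth, one common gotcha with this kind of offline transfer is that pip download on the online machine fetches wheels for that machine's platform and Python version rather than the target's, which can leave you without a usable lazy-object-proxy wheel. A sketch of forcing wheels for the target described above (64-bit Windows, CPython 3.6; you may additionally need --abi cp36m for some packages):
pip download -r requirements.txt -d wheels --only-binary=:all: --platform win_amd64 --python-version 36 --implementation cp
python -m pip install --no-index --find-links wheels -r requirements.txt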

How to update packages hashes?

I'm sort of new to Python, and I'm using Anaconda3-5.1.0 on Windows 10.
I need to install the package kwant. kwant isn't available for Anaconda on Windows, so I tried to pip install it. After some struggle and reading here I found
kwant-1.3.2-cp36-cp36m-win_amd64.whl
which matches my platform. However, this package needs scipy and numpy, which come with Anaconda, and when I try to pip install kwant I get this:
THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
scipy>=0.14 from https://files.pythonhosted.org/packages/62/e2/364f0bcc641aeff79d743c732769d5dc31a1e78c27699229431412c4b425/scipy-1.1.0-cp36-none-win_amd64.whl#sha256=698c6409da58686f2df3d6f815491fd5b4c2de6817a45379517c92366eea208f (from kwant==1.3.2):
Expected sha256 698c6409da58686f2df3d6f815491fd5b4c2de6817a45379517c92366eea208f
Got 7072c63cb59028a73b639b354c0054525b002ef2d87a1d45ed7cdeba736b5cc6
numpy>=1.8.1 from https://files.pythonhosted.org/packages/af/e4/7d7107bdfb5c33f6cf33cdafea8c27d1209cf0068a6e3e3d3342be6f3578/numpy-1.14.3-cp36-none-win_amd64.whl#sha256=560e23a12e7599be8e8b67621396c5bc687fd54b48b890adbc71bc5a67333f86 (from kwant==1.3.2):
Expected sha256 560e23a12e7599be8e8b67621396c5bc687fd54b48b890adbc71bc5a67333f86
Got 143abb1baa1e5a3427ed09a4f52223aa3947bf76ca25dc4c71da0c2ae663040a
As I said, I have updated the scipy and numpy packages.
So how can I update the hashes so that I can use the kwant package?
Use --no-cache-dir to install:
pip install --no-cache-dir kwant-1.3.2-cp36-cp36m-win_amd64.whl
It tells pip not to use its cache for the install. I also got this error on Windows 10. Have fun fixing it!
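If you do want to refresh the hashes in a requirements file rather than just bypass the cache, pip can compute them from the wheel files you actually have (the filenames below are simply the ones from the error message):
pip hash scipy-1.1.0-cp36-none-win_amd64.whl
pip hash numpy-1.14.3-cp36-none-win_amd64.whl
Each command prints a --hash=sha256:... line that can be pasted into the requirements file.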

Pip is not installing any libraries

I have been trying to install numpy using pip on Windows.
But it doesn't seem to be working.
I tried installing numpy and it told me that a Microsoft C++ package was missing and asked me to install it. I did that and tried re-installing numpy, but it still doesn't seem to work; the install doesn't seem to move forward after this point:
C:\Users\neil>pip install numpy
Collecting numpy
Using cached numpy-1.9.1.tar.gz
Running from numpy source directory.
Installing collected packages: numpy
Running setup.py install for numpy
Then I tried installing scipy; even that doesn't seem to move forward after this point:
C:\Users\neil>pip install scipy
Collecting scipy
Downloading scipy-0.14.1.tar.gz (10.9MB)
100% |################################| 10.9MB 284kB/s eta 0:00:01
You could use Anaconda. Anaconda is a completely free Python distribution (including for commercial use and redistribution). It includes over 195 of the most popular Python packages for science, math, engineering, and data analysis (including numpy and scipy).
After downloading the Anaconda installer, double click on the installer application icon and run it.
Follow the instructions in the installer.
The installer is also capable of running in silent mode, without bringing up the graphical interface. To install Anaconda, type the following command into a command prompt:
> Anaconda-2.x.x-Windows-x86[_64].exe /S /D=C:\Anaconda
Good luck.
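Once Anaconda is installed, numpy and scipy already ship with it, and they can be added or updated from the Anaconda prompt with conda rather than pip, for example:
conda install numpy scipy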
