Python pip doesn't build dependencies during installation

Pip doesn't seem to build dependencies from source on my Ubuntu server, while it always does on my OS X machine. For example, when I install the qiime package into a conda or virtualenv environment (I tried both), it takes only seconds to install a large number of packages that take ages to compile on my Mac.
(qiime)user@server:~$ pip install qiime
Collecting qiime
Collecting qiime-default-reference<0.2.0,>=0.1.2 (from qiime)
Collecting burrito<1.0.0,>=0.9.1 (from qiime)
Collecting pandas>=0.13.1 (from qiime)
Collecting natsort<4.0.0 (from qiime)
Using cached natsort-3.5.6-py2.py3-none-any.whl
Collecting matplotlib!=1.4.2,>=1.1.0 (from qiime)
Collecting numpy>=1.9.0 (from qiime)
Collecting gdata (from qiime)
Collecting scikit-bio<0.3.0,>=0.2.3 (from qiime)
Collecting pynast==1.2.2 (from qiime)
Collecting biom-format<2.2.0,>=2.1.4 (from qiime)
Collecting burrito-fillings<0.2.0,>=0.1.1 (from qiime)
Collecting qcli<0.2.0,>=0.1.1 (from qiime)
Collecting scipy>=0.14.0 (from qiime)
Collecting cogent==1.5.3 (from qiime)
Collecting emperor<1.0.0,>=0.9.51 (from qiime)
Collecting six (from qiime-default-reference<0.2.0,>=0.1.2->qiime)
Using cached six-1.10.0-py2.py3-none-any.whl
Collecting future (from burrito<1.0.0,>=0.9.1->qiime)
Collecting pytz>=2011k (from pandas>=0.13.1->qiime)
Using cached pytz-2015.7-py2.py3-none-any.whl
Collecting python-dateutil (from pandas>=0.13.1->qiime)
Using cached python_dateutil-2.4.2-py2.py3-none-any.whl
Collecting cycler (from matplotlib!=1.4.2,>=1.1.0->qiime)
Using cached cycler-0.9.0-py2.py3-none-any.whl
Collecting pyparsing!=2.0.0,!=2.0.4,>=1.5.6 (from matplotlib!=1.4.2,>=1.1.0->qiime)
Using cached pyparsing-2.0.6-py2.py3-none-any.whl
Collecting IPython (from scikit-bio<0.3.0,>=0.2.3->qiime)
Using cached ipython-4.0.0-py2-none-any.whl
Collecting click (from biom-format<2.2.0,>=2.1.4->qiime)
Using cached click-5.1-py2.py3-none-any.whl
Collecting pyqi (from biom-format<2.2.0,>=2.1.4->qiime)
Collecting decorator (from IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Using cached decorator-4.0.4-py2.py3-none-any.whl
Collecting simplegeneric>0.8 (from IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Collecting pexpect (from IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Collecting traitlets (from IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Using cached traitlets-4.0.0-py2.py3-none-any.whl
Collecting pickleshare (from IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Collecting ptyprocess>=0.5 (from pexpect->IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Collecting ipython-genutils (from traitlets->IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Using cached ipython_genutils-0.1.0-py2.py3-none-any.whl
Collecting path.py (from pickleshare->IPython->scikit-bio<0.3.0,>=0.2.3->qiime)
Using cached path.py-8.1.2-py2.py3-none-any.whl
Installing collected packages: six, qiime-default-reference, future, burrito, pytz, python-dateutil, numpy, pandas, natsort, cycler, pyparsing, matplotlib, gdata, scipy, decorator, simplegeneric, ptyprocess, pexpect, ipython-genutils, traitlets, path.py, pickleshare, IPython, scikit-bio, cogent, pynast, click, pyqi, biom-format, burrito-fillings, qcli, emperor, qiime
Successfully installed IPython-4.0.0 biom-format-2.1.5 burrito-0.9.1 burrito-fillings-0.1.1 click-5.1 cogent-1.5.3 cycler-0.9.0 decorator-4.0.4 emperor-0.9.51 future-0.15.2 gdata-2.0.18 ipython-genutils-0.1.0 matplotlib-1.5.0 natsort-3.5.6 numpy-1.10.1 pandas-0.17.0 path.py-8.1.2 pexpect-4.0.1 pickleshare-0.5 ptyprocess-0.5 pynast-1.2.2 pyparsing-2.0.6 pyqi-0.3.2 python-dateutil-2.4.2 pytz-2015.7 qcli-0.1.1 qiime-1.9.1 qiime-default-reference-0.1.3 scikit-bio-0.2.3 scipy-0.16.1 simplegeneric-0.8.1 six-1.10.0 traitlets-4.0.0
When I try to use the package I get various errors suggesting that pip never actually compiled the dependencies. What should I do about that? For example, let's try to import pandas:
In [1]: import pandas
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-1-d6ac987968b6> in <module>()
----> 1 import pandas
/home/user/.conda/envs/qiime/lib/python2.7/site-packages/pandas/__init__.py in <module>()
11 "pandas from the source directory, you may need to run "
12 "'python setup.py build_ext --inplace' to build the C "
---> 13 "extensions first.".format(module))
14
15 from datetime import datetime
ImportError: C extension: /home/user/.conda/envs/qiime/lib/python2.7/site-packages/pandas/hashtable.so: undefined symbol: PyFPE_jbuf not built. If you want to import pandas from the source directory, you may need to run 'python setup.py build_ext --inplace' to build the C extensions first.
I know I can build everything manually, but I really want to fix pip.

Passing --no-cache-dir to pip during installation seems to solve the issue, though I don't understand what caches have to do with compilation.
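For reference, a minimal sketch of that workaround, plus ways to force a rebuild of a single broken dependency (the pip cache subcommand needs pip >= 20.1; on older pip the cache lives under ~/.cache/pip):
pip install --no-cache-dir qiime    # skip pip's wheel cache so dependencies are rebuilt for this interpreter
pip cache purge                     # or clear the cache first (pip >= 20.1); older pip: rm -rf ~/.cache/pip
pip install --force-reinstall --no-cache-dir --no-binary :all: pandas   # rebuild one dependency from source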

Related

Can't install the mxnet package for using DeepAREstimator from gluonts

I am trying to use DeepAR for forecasting time series. I installed gluonts, but when I import the module I get an error about mxnet being missing.
I am using Python 3.9.7 and numpy 1.20.3.
As I understand it, the error is version-related? Does mxnet only install with numpy 1.16.6? (A possible workaround is sketched after the error output below.)
Error while installing mxnet:
Collecting mxnet
Using cached mxnet-1.7.0.post2-py2.py3-none-win_amd64.whl (33.1 MB)
Collecting graphviz<0.9.0,>=0.8.1
Using cached graphviz-0.8.4-py2.py3-none-any.whl (16 kB)
Requirement already satisfied: requests<2.19.0,>=2.18.4 in c:\users\tred1\anaconda3\lib\site-packages (from mxnet) (2.18.4)
Collecting numpy<1.17.0,>=1.8.2
Using cached numpy-1.16.6.zip (5.1 MB)
Requirement already satisfied: certifi>=2017.4.17 in c:\users\tred1\anaconda3\lib\site-packages (from requests<2.19.0,>=2.18.4->mxnet) (2021.10.8)
Requirement already satisfied: urllib3<1.23,>=1.21.1 in c:\users\tred1\anaconda3\lib\site-packages (from requests<2.19.0,>=2.18.4->mxnet) (1.22)
Requirement already satisfied: idna<2.7,>=2.5 in c:\users\tred1\anaconda3\lib\site-packages (from requests<2.19.0,>=2.18.4->mxnet) (2.6)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in c:\users\tred1\anaconda3\lib\site-packages (from requests<2.19.0,>=2.18.4->mxnet) (3.0.4)
Building wheels for collected packages: numpy
Building wheel for numpy (setup.py): started
Building wheel for numpy (setup.py): finished with status 'error'
Running setup.py clean for numpy
Failed to build numpy
Installing collected packages: numpy, graphviz, mxnet
Attempting uninstall: numpy
Found existing installation: numpy 1.20.3
Uninstalling numpy-1.20.3:
Successfully uninstalled numpy-1.20.3
Running setup.py install for numpy: started
Running setup.py install for numpy: finished with status 'error'
Rolling back uninstall of numpy
Moving to c:\users\tred1\anaconda3\lib\site-packages\numpy-1.20.3.dist-info\
from C:\Users\tred1\anaconda3\Lib\site-packages\~umpy-1.20.3.dist-info
Moving to c:\users\tred1\anaconda3\lib\site-packages\numpy\
from C:\Users\tred1\anaconda3\Lib\site-packages\~umpy
Moving to c:\users\tred1\anaconda3\scripts\f2py-script.py
from C:\Users\tred1\AppData\Local\Temp\pip-uninstall-5bceooxs\f2py-script.py
Moving to c:\users\tred1\anaconda3\scripts\f2py.exe
from C:\Users\tred1\AppData\Local\Temp\pip-uninstall-5bceooxs\f2py.exe
Note: you may need to restart the kernel to use updated packages
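A possible reading of this log: mxnet 1.7.0.post2 pins numpy<1.17, and numpy 1.16.6 provides no prebuilt wheel for Python 3.9, so pip falls back to building numpy from source on Windows and fails. A hedged sketch of a workaround, assuming conda is available and that this environment layout suits you, is to use an older Python for which numpy<1.17 wheels exist:
conda create -n gluonts_env python=3.7   # hypothetical environment name; Python version chosen so numpy 1.16.x wheels exist
conda activate gluonts_env
pip install mxnet gluonts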

How can I downgrade packaging when installing scanpy?

When trying to install scanpy, I get the following error
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
poetry 1.1.11 requires packaging<21.0,>=20.4, but you have packaging 21.2 which is incompatible
To fix this, I tried to downgrade packaging from version 21.2 to 20.9 or 20.8, with
! pip install --upgrade packaging==20.9
or with
! pip uninstall packaging -y
! pip install -I packaging==20.8
but I still get that error, as if the packaging version did not change.
Here is the full block
! pip install --upgrade packaging==20.9
#! pip uninstall packaging -y
#! pip install -I packaging==20.8
#! pip install poetry --upgrade
! pip uninstall scanpy -y
! pip install -I scanpy
and the full output
Collecting packaging==20.9
Using cached packaging-20.9-py2.py3-none-any.whl (40 kB)
Requirement already satisfied: pyparsing>=2.0.2 in /data04/projects04/MarianaBoroni/lbbc_members/lib/conda_envs/diogoamb/lib/python3.9/site-packages (from packaging==20.9) (3.0.4)
Installing collected packages: packaging
Attempting uninstall: packaging
Found existing installation: packaging 21.2
Uninstalling packaging-21.2:
Successfully uninstalled packaging-21.2
Successfully installed packaging-20.9
Found existing installation: scanpy 1.8.2
Uninstalling scanpy-1.8.2:
Successfully uninstalled scanpy-1.8.2
Collecting scanpy
Using cached scanpy-1.8.2-py3-none-any.whl (2.0 MB)
Collecting anndata>=0.7.4
Using cached anndata-0.7.6-py3-none-any.whl (127 kB)
Collecting h5py>=2.10.0
Using cached h5py-3.5.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (4.5 MB)
Collecting scipy>=1.4
Using cached scipy-1.7.1-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.whl (28.5 MB)
Collecting sinfo
Using cached sinfo-0.3.4-py3-none-any.whl
Collecting matplotlib>=3.1.2
Using cached matplotlib-3.4.3-cp39-cp39-manylinux1_x86_64.whl (10.3 MB)
Collecting umap-learn>=0.3.10
Using cached umap_learn-0.5.2-py3-none-any.whl
Collecting numba>=0.41.0
Using cached numba-0.54.1-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.3 MB)
Collecting joblib
Using cached joblib-1.1.0-py2.py3-none-any.whl (306 kB)
Collecting numpy>=1.17.0
Using cached numpy-1.21.4-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.7 MB)
Collecting seaborn
Using cached seaborn-0.11.2-py3-none-any.whl (292 kB)
Collecting natsort
Using cached natsort-8.0.0-py3-none-any.whl (37 kB)
Collecting statsmodels>=0.10.0rc2
Using cached statsmodels-0.13.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (9.9 MB)
Collecting networkx>=2.3
Using cached networkx-2.6.3-py3-none-any.whl (1.9 MB)
Collecting tables
Using cached tables-3.6.1-cp39-cp39-manylinux2010_x86_64.whl (14.3 MB)
Collecting tqdm
Using cached tqdm-4.62.3-py2.py3-none-any.whl (76 kB)
Collecting patsy
Using cached patsy-0.5.2-py2.py3-none-any.whl (233 kB)
Collecting scikit-learn>=0.22
Using cached scikit_learn-1.0.1-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (24.7 MB)
Collecting packaging
Using cached packaging-21.2-py3-none-any.whl (40 kB)
Collecting pandas>=0.21
Using cached pandas-1.3.4-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.5 MB)
Collecting xlrd<2.0
Using cached xlrd-1.2.0-py2.py3-none-any.whl (103 kB)
Collecting kiwisolver>=1.0.1
Using cached kiwisolver-1.3.2-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.6 MB)
Collecting cycler>=0.10
Using cached cycler-0.11.0-py3-none-any.whl (6.4 kB)
Collecting pyparsing>=2.2.1
Using cached pyparsing-3.0.4-py3-none-any.whl (96 kB)
Collecting python-dateutil>=2.7
Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Collecting pillow>=6.2.0
Using cached Pillow-8.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (3.1 MB)
Collecting llvmlite<0.38,>=0.37.0rc1
Using cached llvmlite-0.37.0-cp39-cp39-manylinux2014_x86_64.whl (26.3 MB)
Collecting numpy>=1.17.0
Using cached numpy-1.20.3-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.4 MB)
Collecting setuptools
Using cached setuptools-58.5.3-py3-none-any.whl (946 kB)
Collecting pyparsing>=2.2.1
Using cached pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting pytz>=2017.3
Using cached pytz-2021.3-py2.py3-none-any.whl (503 kB)
Collecting six>=1.5
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting threadpoolctl>=2.0.0
Using cached threadpoolctl-3.0.0-py3-none-any.whl (14 kB)
Collecting pynndescent>=0.5
Using cached pynndescent-0.5.5-py3-none-any.whl
Collecting stdlib-list
Using cached stdlib_list-0.8.0-py3-none-any.whl (63 kB)
Collecting numexpr>=2.6.2
Using cached numexpr-2.7.3-cp39-cp39-manylinux2010_x86_64.whl (471 kB)
Installing collected packages: numpy, threadpoolctl, six, setuptools, scipy, llvmlite, joblib, scikit-learn, pytz, python-dateutil, pyparsing, pillow, numba, kiwisolver, cycler, xlrd, tqdm, stdlib-list, pynndescent, patsy, pandas, packaging, numexpr, natsort, matplotlib, h5py, umap-learn, tables, statsmodels, sinfo, seaborn, networkx, anndata, scanpy
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
poetry 1.1.11 requires packaging<21.0,>=20.4, but you have packaging 21.2 which is incompatible.
Successfully installed anndata-0.7.6 cycler-0.11.0 h5py-3.5.0 joblib-1.1.0 kiwisolver-1.3.2 llvmlite-0.37.0 matplotlib-3.4.3 natsort-8.0.0 networkx-2.6.3 numba-0.54.1 numexpr-2.7.3 numpy-1.21.1 packaging-21.2 pandas-1.3.4 patsy-0.5.2 pillow-8.4.0 pynndescent-0.5.5 pyparsing-3.0.4 python-dateutil-2.8.2 pytz-2021.3 scanpy-1.8.2 scikit-learn-1.0.1 scipy-1.7.1 seaborn-0.11.2 setuptools-58.5.3 sinfo-0.3.4 six-1.16.0 statsmodels-0.13.0 stdlib-list-0.8.0 tables-3.6.1 threadpoolctl-3.0.0 tqdm-4.62.3 umap-learn-0.5.2 xlrd-1.2.0
You could try installing all your packages in one line, e.g. pip install packaging scanpy, and see if the conflicts are resolved. You could also pin your specific versions, e.g. pip install packaging==20.9 scanpy==1.0 (or whatever versions you like), or use version ranges, e.g. pip install "package>=20.9" "package2<1.9" (placeholder names; quote the specifiers so the shell doesn't interpret < and >). Giving pip all the requirements in a single command lets its resolver pick mutually compatible versions instead of overwriting what an earlier command installed.
You could also try a package manager like conda or mamba: create a new environment and install the packages there, since those package managers try to resolve conflicts like this for you.
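A hedged sketch of both suggestions (the version range and the conda-forge channel are assumptions, adjust as needed):
pip install "packaging>=20.4,<21.0" scanpy        # resolve both in one command so pip picks compatible versions
conda create -n scanpy_env -c conda-forge python=3.9 scanpy   # or start from a clean conda environment
conda activate scanpy_env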

Deep Animator won't install correctly

I've run into a problem trying to install a piece of software called "Deep Animator". When I try to install it (pip install deep_animator), torch fails to install, so I can't use the deep_animator tool. This was the output:
Collecting deep_animator
Using cached deep_animator-0.1.1-py3-none-any.whl (22 kB)
Collecting Pillow==5.2.0
Using cached Pillow-5.2.0-cp36-cp36m-win_amd64.whl (1.6 MB)
Collecting scipy==1.1.0
Using cached scipy-1.1.0-cp36-none-win_amd64.whl (31.1 MB)
Collecting torchvision>=0.2.1
Using cached torchvision-0.5.0-cp36-cp36m-win_amd64.whl (1.2 MB)
ERROR: Could not find a version that satisfies the requirement torch>=1.3.0 (from deep_animator) (from versions: 0.1.2, 0.1.2.post1, 0.1.2.post2)
ERROR: No matching distribution found for torch>=1.3.0 (from deep_animator)
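The listed versions (0.1.2, 0.1.2.post1, 0.1.2.post2) suggest that pip cannot find a recent torch wheel on PyPI for this Python/Windows combination. A commonly suggested workaround at the time, offered here only as an assumption (the exact versions and the +cpu tag may need adjusting), was to install torch and torchvision from the official PyTorch wheel index first:
pip install torch==1.4.0+cpu torchvision==0.5.0+cpu -f https://download.pytorch.org/whl/torch_stable.html
pip install deep_animator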

Why 'sudo pip install' if 'pip install' works? (Issues with HelloAnalytics.py)

Google provides a sample HelloAnalytics.py to demonstrate the use of google-api-python-client.
Below the heading "2. Install the client library" they write:
Use pip, the recommended tool for installing Python packages:
sudo pip install --upgrade google-api-python-client
I wonder why I should use sudo while a simple pip (without sudo) seems to work perfectly (on my Mac):
% pip install google-api-python-client
Collecting google-api-python-client
Using cached google_api_python_client-1.9.3-py3-none-any.whl (59 kB)
Collecting six<2dev,>=1.6.1
Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)
Collecting google-auth-httplib2>=0.0.3
Using cached google_auth_httplib2-0.0.3-py2.py3-none-any.whl (6.3 kB)
Collecting google-api-core<2dev,>=1.18.0
Using cached google_api_core-1.20.1-py2.py3-none-any.whl (90 kB)
Collecting uritemplate<4dev,>=3.0.0
Using cached uritemplate-3.0.1-py2.py3-none-any.whl (15 kB)
Collecting httplib2<1dev,>=0.9.2
Using cached httplib2-0.18.1-py3-none-any.whl (95 kB)
Collecting google-auth>=1.16.0
Using cached google_auth-1.17.2-py2.py3-none-any.whl (90 kB)
Collecting protobuf>=3.12.0
Using cached protobuf-3.12.2-cp37-cp37m-macosx_10_9_x86_64.whl (1.3 MB)
Collecting pytz
Using cached pytz-2020.1-py2.py3-none-any.whl (510 kB)
Collecting requests<3.0.0dev,>=2.18.0
Using cached requests-2.23.0-py2.py3-none-any.whl (58 kB)
Requirement already satisfied: setuptools>=34.0.0 in ./venv/lib/python3.7/site-packages (from google-api-core<2dev,>=1.18.0->google-api-python-client) (41.2.0)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
Using cached googleapis_common_protos-1.52.0-py2.py3-none-any.whl (100 kB)
Collecting cachetools<5.0,>=2.0.0
Using cached cachetools-4.1.0-py3-none-any.whl (10 kB)
Collecting pyasn1-modules>=0.2.1
Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting rsa<5,>=3.1.4; python_version >= "3"
Using cached rsa-4.6-py3-none-any.whl (47 kB)
Collecting certifi>=2017.4.17
Using cached certifi-2020.4.5.2-py2.py3-none-any.whl (157 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
Using cached urllib3-1.25.9-py2.py3-none-any.whl (126 kB)
Collecting chardet<4,>=3.0.2
Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting idna<3,>=2.5
Using cached idna-2.9-py2.py3-none-any.whl (58 kB)
Collecting pyasn1<0.5.0,>=0.4.6
Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Installing collected packages: six, cachetools, pyasn1, pyasn1-modules, rsa, google-auth, httplib2, google-auth-httplib2, protobuf, pytz, certifi, urllib3, chardet, idna, requests, googleapis-common-protos, google-api-core, uritemplate, google-api-python-client
Successfully installed cachetools-4.1.0 certifi-2020.4.5.2 chardet-3.0.4 google-api-core-1.20.1 google-api-python-client-1.9.3 google-auth-1.17.2 google-auth-httplib2-0.0.3 googleapis-common-protos-1.52.0 httplib2-0.18.1 idna-2.9 protobuf-3.12.2 pyasn1-0.4.8 pyasn1-modules-0.2.8 pytz-2020.1 requests-2.23.0 rsa-4.6 six-1.15.0 uritemplate-3.0.1 urllib3-1.25.9
Is that sample simply outdated? (They use print without parentheses, even though Python 2.7 appears to be deprecated for google-api-python-client, which has confused others already.)
I fixed that and still get a ModuleNotFoundError: No module named 'oauth2client'. Am I right that the missing sudo is not the cause of that?
Also, they do not explain what a VIEW_ID is or where to get one.
It works without sudo for you because you have Python installed in a location where your user has write permissions. This line from your output shows you are installing into a virtualenv, which your user owns:
Requirement already satisfied: setuptools>=34.0.0 in ./venv/lib/python3.7/site-packages (from google-api-core<2dev,>=1.18.0->google-api-python-client) (41.2.0)
It all depends on how and where Python is installed. The macOS system Python is installed under /Library/, which is owned by root, so installing a new package into it requires sudo.
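For completeness, a sketch of the usual ways to avoid sudo entirely (the venv directory name is arbitrary, and the last line assumes the sample still imports the deprecated oauth2client package, which would explain the ModuleNotFoundError):
python3 -m venv venv                              # install into a virtual environment you own, no sudo needed
source venv/bin/activate
pip install --upgrade google-api-python-client
pip install --user --upgrade google-api-python-client   # or install into your per-user site-packages instead
pip install oauth2client                          # the missing module is a separate, deprecated package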

Error while installing a package that needs XGBoost

I am trying to install the "MLBox" Python package on Anaconda (Python 3.6).
This package needs "xgboost", so I downloaded the wheel file from this link and did a pip install of that wheel file, with no issues. But when I use pip install to install "mlbox", I get this error:
Collecting pandas==0.20.3 (from mlbox)
Using cached pandas-0.20.3-cp36-cp36m-win_amd64.whl
Requirement already satisfied: joblib==0.11 in c:\users\amira ayadi\anaconda3\lib\site-packages (from mlbox)
Collecting scikit-learn==0.19.0 (from mlbox)
Using cached scikit_learn-0.19.0-cp36-cp36m-win_amd64.whl
Requirement already satisfied: Theano==0.9.0 in c:\users\amira ayadi\anaconda3\lib\site-packages (from mlbox)
Collecting xgboost==0.6a2 (from mlbox)
Using cached xgboost-0.6a2.tar.gz
No files/directories in C:\Users\AMIRAA~1\AppData\Local\Temp\pip-build-6ytmh20a\xgboost\pip-egg-info (from PKG-INFO)
I also tried to install xgboost with the Anaconda solution (https://anaconda.org/anaconda/py-xgboost), but I get the same error.
Do you have any ideas?
I am on Windows 10.
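Since a working xgboost wheel is already installed, one hedged option (not guaranteed, because mlbox pins xgboost==0.6a2 exactly) is to install the other pinned dependencies from the log by hand and then install mlbox without letting pip rebuild that xgboost sdist:
pip install pandas==0.20.3 scikit-learn==0.19.0   # versions taken from the log above
pip install --no-deps mlbox                       # skip dependency resolution so the existing xgboost wheel is kept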
