It's my first time using the build system Semaphore, and I'm having trouble installing scipy while doing my build.
Specifically, it's complaining that BLAS and LAPACK are not installed. Despite what these answers suggest, I can't compile any of the Fortran files because Semaphore CI doesn't have a Fortran compiler installed on its machines (nor can I install one, because that requires root).
What is the proper way of installing scipy in this situation?
(If someone has a suggestion of where to place this question on stackexchange, that would also be appreciated. I'm not sure if this question belongs here.)
It seems Travis CI had a similar issue, except they resolved it by pre-installing scipy.
Semaphore CI gives you passwordless sudo in your build environment, so you can use the commands suggested in the official documentation in your build setup, like:
sudo apt-get update
sudo apt-get install python python-dev libatlas-base-dev gcc gfortran g++
sudo pip install scipy
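If you want to sanity-check the result in the same build job, an optional extra step (just a suggestion, assuming the default python on the build machine is the one you target) is:
python -c "import scipy; print(scipy.__version__)"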
I was trying to install ggplot using pip install ggplot. As it turned out, I was missing many essential packages, as stated in SciPy and blas and in SciPy with pip. After running
sudo apt-get install build-essential gfortran libatlas-base-dev python-pip python-dev I didn't get these "Cannot build wheel" errors anymore, luckily.
Now as I am trying just pip install scipy (which I apparently need) it fetches the package information and gets stuck at Running setup.py install for scipy ... /. I can't imagine it takes this long to install a 12 MB package (I waited for 30 minutes). Pip won't list it, so it's not installed. Does it really take so long? Or do I have another problem here? What am I missing?
I am running python3.4 on Ubuntu 14.04.
EDIT:
When trying pip install ggplot it gets stuck at the setup.py install for scipy too...
EDIT2:
It seems installing the scipy package for python3 did the trick. It is just completely unclear why I need so many different packages from different sources just to get it to run. Anyhoo, scipy works and ggplot as well.
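For anyone else hitting this: on Ubuntu 14.04 the Python 3 route would be roughly one of the following (package names are my guess at the usual ones, not necessarily exactly what I ran):
sudo apt-get install python3-scipy        # distro-built scipy for Python 3
sudo python3 -m pip install scipy         # or build via pip, if python3-pip is installed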
Scipy is compiling a lot of stuff. Depending on your computer it might take some time.
try:
pip -v install scipy
pip -vv install scipy
pip -vvv install scipy
(more and more verbose logging output)
Today I created a virtual environment on my server and also installed the scipy package using the pip install command.
But when I run my optimization function on the server, I get this error: ImportError: /data/home/pxu/ve/lib/python2.7/site-packages/scipy/linalg/cython_lapack.so: undefined symbol: zlacn2_
How can I fix this, please?
You'll need to install the SciPy math dependencies via your OS package manager. On Ubuntu (and probably Debian) you can use
sudo apt-get install libatlas-base-dev liblapack-dev
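Since scipy was already built inside the virtualenv before those libraries were present, you may also need to rebuild it so it links against them; something like the following (standard pip flags), run with the virtualenv activated:
pip install --no-cache-dir --force-reinstall scipy   # rebuild scipy now that BLAS/LAPACK are available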
So I have tried installing numpy using Homebrew. While Homebrew said it was installed successfully, the program I run couldn't detect it, and it recommended using apt-get.
So I got apt-get through Fink, but I couldn't install numpy the way I wanted to.
The most relevant answer I found online is here:
http://mrprajesh.blogspot.hk/2009/11/e-couldnt-find-package-on-apt-get.html
But it only covers Linux and I am not sure how to do the same on an OS X machine. Does anyone have experience with this?
Below is the error message. Any help is appreciated.
yings-mbp:madanalysis5 yvonne$ sudo apt-get install python-numpy
Reading Package Lists... Done
Building Dependency Tree... Done
E: Couldn't find package python-numpy
yings-mbp:madanalysis5 yvonne$ sudo apt-get install update
Password:
Reading Package Lists... Done
Building Dependency Tree... Done
E: Couldn't find package update
yings-mbp:madanalysis5 Sam$
In my opinion the best way to install numpy and scipy is to just download Anaconda; even though it's large, everything comes pre-compiled and just works. If you need a smaller package, get Miniconda and run "conda install numpy". If you don't care about space, just get Anaconda.
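For example (assuming conda is already on your PATH after installing Anaconda or Miniconda):
conda install numpy scipy    # pulls in pre-built binaries, no compiler needed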
Many Python packages have build dependencies on non-Python packages. I'm specifically thinking of lxml and cffi, but this dilemma applies to a lot of packages on PyPI. Both of these packages have unadvertised build dependencies on non-Python packages like libxml2-dev, libxslt-dev, zlib1g-dev, and libffi-dev. The websites for lxml and cffi declare some of these dependencies, but it appears that there is no way to figure this out from the command line.
As a result, there are hundreds of questions on SO that take this general form:
pip install foo fails with an error: "fatal error: bar.h: No such file or directory". How do I fix it?
Is this a misuse of pip or is this how it is intended to work? Is there a sane way to know what build dependencies to install before running pip? My current approach (a shell sketch of the loop follows the list) is:
1. I want to install a package called foo.
2. pip install foo
3. foo has a dependency on a Python package bar.
4. If the bar build fails, look at the error message and guess/google what non-Python dependency I need to install.
5. sudo apt-get install libbaz-dev
6. sudo pip install bar
7. Repeat until bar succeeds.
8. sudo pip uninstall foo
9. Repeat the entire process until there are no error messages.
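A rough shell sketch of that loop (foo, bar, and libbaz-dev are placeholders, as above):
sudo pip install foo              # fails; read the error to guess the missing system package
sudo apt-get install libbaz-dev   # install the guessed -dev package
sudo pip install bar              # retry the failing Python dependency until it builds
sudo pip uninstall foo            # remove the half-installed requested package
sudo pip install foo              # start over; repeat until everything builds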
Step #4 is particularly annoying. Apparently pip (version 1.5.4) installs the requested package first, before any dependencies. So if any dependencies fail, you can't just ask pip to install it again, because it thinks it's already installed. There's also no option to install just the dependencies, so you must uninstall the package and then reinstall it.
Is there some more intelligent process for using pip?
This is actually a comment on the answer suggesting apt-get, but I don't have enough reputation points to leave one.
If you use virtualenv a lot, then installing the python-packages through apt-get can become a pain, as you can get mysterious errors when the python packages installed system-wide and the python packages installed in your virtualenv try to interact with each other. One thing that I have found that does help is to use the build-dep feature. To build the matplotlib dependencies, for example:
sudo apt-get build-dep python-matplotlib
And then activate your virtual environment and do pip install matplotlib. It will still go through the build process but many of the dependencies will be taken care of for you.
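Putting that together, the sequence looks roughly like this (the virtualenv path is a placeholder; substitute however you activate your environment):
sudo apt-get build-dep python-matplotlib   # system-level build dependencies
source /path/to/venv/bin/activate          # activate your virtualenv
pip install matplotlib                     # now builds inside the virtualenv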
This is sort of what the CRAN repositories suggest when installing R packages on Ubuntu.
For most popular packages, there is a workaround on recent Ubuntu systems. For example, say I want to install matplotlib. When you run pip install matplotlib, it usually fails because of a missing dependency.
You can use apt-get install python-matplotlib instead. For Python 3, you can use apt-get install python3-matplotlib.
I'm trying to upgrade Scipy from 0.9.0 to 0.12.0. I use the command:
sudo pip install --upgrade scipy
and I get all sorts of errors which can be seen in the pip.log file here and I'm unfortunately not python-savvy enough to understand what's wrong. Any help will be appreciated.
The error messages all state the same thing: you lack BLAS (Basic Linear Algebra Subprograms) on your system, or scipy cannot find it. When installing packages from source on Ubuntu, as you are effectively trying to do with pip, one of the easiest ways to make sure the dependencies are in place is to run the command
$ sudo apt-get build-dep python-scipy
which will install all packages needed to build the python-scipy package. You may in some cases run into the problem that the version of the source package you are trying to install has different dependencies than the version included with Ubuntu, but in your case, I think chances are good that the above command will be sufficient to fetch BLAS for you, headers included.
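With those build dependencies in place, the upgrade you were attempting can then be retried, roughly:
sudo apt-get build-dep python-scipy   # fetch BLAS/LAPACK headers and the rest of the build chain
sudo pip install --upgrade scipy      # retry the upgrade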
I had the same problem upgrading from scipy 0.9 to 0.13.3, and I solved it using the following answer and installing:
sudo apt-get install libblas-dev
sudo apt-get install liblapack-dev
sudo apt-get install gfortran
Make sure libatlas-base-dev and libatlas-sse2-dev are installed; it seems like it can't find your ATLAS library. Also, see this question:
Does Python SciPy need BLAS?
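On Ubuntu that would be something like the following (libatlas-sse2-dev may not exist on newer releases):
sudo apt-get install libatlas-base-dev libatlas-sse2-dev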
I found Adam Klein's instructions for setting up scipy (and friends) in a virtual environment very useful.
One problem I ran into (which was probably my own fault): After all was said and done, I found importing scipy still loaded version 0.9.0, not 0.12.0. The problem was that my sys.path was finding the old system version before the new version.
The fix was to make
/path/to/.virtualenvs/arthur/local/lib/python2.7/site-packages
appear before
/usr/lib/python2.7/dist-packages
in sys.path. If you have virtualenvwrapper installed, then
you can add the path using
add2virtualenv /path/to/.virtualenvs/arthur/lib/python2.7/site-packages
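To confirm which copy of scipy actually gets imported afterwards, a quick diagnostic check is:
python -c "import scipy; print(scipy.__version__); print(scipy.__file__)"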