Python not installing sklearn
I am working with Ubuntu 14. I have downloaded the dpkg package for sklearn and unpacked it. When I try to run sudo python setup.py install, it seems to get stuck in a loop:
compiling C++ sources
C compiler: c++ -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -fPIC
creating build/temp.linux-x86_64-2.7/sklearn/utils/src
compile options: '-Isklearn/utils/src -I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c'
c++: sklearn/utils/src/MurmurHash3.cpp
c++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/sklearn/utils/murmurhash.o build/temp.linux-x86_64-2.7/sklearn/utils/src/MurmurHash3.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/sklearn/utils/murmurhash.so
building 'sklearn.utils.lgamma' extension
compiling C sources
C compiler: x86_64-linux-gnu-gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC
compile options: '-Isklearn/utils/src -I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c'
x86_64-linux-gnu-gcc: sklearn/utils/lgamma.c
x86_64-linux-gnu-gcc: sklearn/utils/src/gamma.c
x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/sklearn/utils/lgamma.o build/temp.linux-x86_64-2.7/sklearn/utils/src/gamma.o -Lbuild/temp.linux-x86_64-2.7 -lm -o build/lib.linux-x86_64-2.7/sklearn/utils/lgamma.so
building 'sklearn.utils.graph_shortest_path' extension
compiling C sources
C compiler: x86_64-linux-gnu-gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC
compile options: '-I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c'
x86_64-linux-gnu-gcc: sklearn/utils/graph_shortest_path.c
In file included from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1761:0,
from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/ndarrayobject.h:17,
from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/arrayobject.h:4,
from sklearn/utils/graph_shortest_path.c:256:
/usr/lib/python2.7/dist-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: warning: #warning "Using deprecated NumPy API, disable it by " "#defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp]
#warning "Using deprecated NumPy API, disable it by " \
^
In file included from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/ufuncobject.h:327:0,
from sklearn/utils/graph_shortest_path.c:257:
/usr/lib/python2.7/dist-packages/numpy/core/include/numpy/__ufunc_api.h:241:1: warning: ‘_import_umath’ defined but not used [-Wunused-function]
_import_umath(void)
^
x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -D_FORTIFY_SOURCE=2 -g -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security build/temp.linux-x86_64-2.7/sklearn/utils/graph_shortest_path.o -Lbuild/temp.linux-x86_64-2.7 -o build/lib.linux-x86_64-2.7/sklearn/utils/graph_shortest_path.so
building 'sklearn.utils.fast_dict' extension
compiling C++ sources
C compiler: c++ -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -fPIC
compile options: '-I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/lib/python2.7/dist-packages/numpy/core/include -I/usr/include/python2.7 -c'
c++: sklearn/utils/fast_dict.cpp
In file included from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/ndarraytypes.h:1761:0,
from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/ndarrayobject.h:17,
from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/arrayobject.h:4,
from sklearn/utils/fast_dict.cpp:320:
/usr/lib/python2.7/dist-packages/numpy/core/include/numpy/npy_1_7_deprecated_api.h:15:2: warning: #warning "Using deprecated NumPy API, disable it by " "#defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp]
#warning "Using deprecated NumPy API, disable it by " \
^
sklearn/utils/fast_dict.cpp: In function ‘PyObject* __pyx_pw_7sklearn_5utils_9fast_dict_1argmin(PyObject*, PyObject*)’:
sklearn/utils/fast_dict.cpp:18786:44: warning: ‘__pyx_v_min_key’ may be used uninitialized in this function [-Wmaybe-uninitialized]
return PyInt_FromLong((long)val);
^
sklearn/utils/fast_dict.cpp:3316:46: note: ‘__pyx_v_min_key’ was declared here
__pyx_t_7sklearn_5utils_9fast_dict_ITYPE_t __pyx_v_min_key;
^
In file included from /usr/lib/python2.7/dist-packages/numpy/core/include/numpy/ufuncobject.h:327:0,
from sklearn/utils/fast_dict.cpp:321:
/usr/lib/python2.7/dist-packages/numpy/core/include/numpy/__ufunc_api.h: At global scope:
/usr/lib/python2.7/dist-packages/numpy/core/include/numpy/__ufunc_api.h:241:1: warning: ‘int _import_umath()’ defined but not used [-Wunused-function]
_import_umath(void)
^
... and it continues on like that.
I have installed numpy, but I did it through Ubuntu's Software Center. When I try to import sklearn within Python I get:
>>> from sklearn.ensemble import RandomForestClassifier
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "sklearn/__init__.py", line 37, in <module>
    from . import __check_build
  File "sklearn/__check_build/__init__.py", line 46, in <module>
    raise_build_error(e)
  File "sklearn/__check_build/__init__.py", line 41, in raise_build_error
    %s""" % (e, local_dir, ''.join(dir_content).strip(), msg))
ImportError: No module named _check_build
___________________________________________________________________________
Contents of sklearn/__check_build:
setup.py                 __init__.py              _check_build.pyx
_check_build.c           setup.pyc                __init__.pyc
___________________________________________________________________________
It seems that scikit-learn has not been built correctly.

If you have installed scikit-learn from source, please do not forget
to build the package before using it: run python setup.py install or
make in the source directory.

If you have used an installer, please check that it is suited for your
Python version, your operating system and your platform.
I have no idea where sklearn/__check_build is located.
My folder in /usr/lib/python2.7/dist-packages is empty, but I can still import numpy within Python. Like I said, I used the Ubuntu Software Center to install numpy, but not sklearn, which I now regret.
I recommend installing sklearn and all its dependencies with the Anaconda distribution: https://www.continuum.io/downloads#_unix
It will be installed together with numpy and other packages; the full list is available here: http://docs.continuum.io/anaconda/pkg-docs
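Once Anaconda (or the smaller Miniconda) is installed and on your PATH, getting scikit-learn with its compiled dependencies should be a single command (a sketch; scikit-learn is in the default channel):

conda install scikit-learn
conda list scikit-learn    # verify which version got installed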
If you want your package manager to handle everything, that usually works, although you won't necessarily get the most recent version.
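For example (a sketch; on Ubuntu 14.04 the Python 2 package should be python-sklearn, but double-check the exact name with apt-cache search scikit):

sudo apt-get install python-sklearn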
Otherwise do something along the lines of
sudo apt-get install build-essential gcc g++ python-dev python3-dev python-scipy python3-scipy
and try to install/compile again. Compiling Python extension modules relies on having a working compilation environment, plus the development headers for Python. I'm not sure those dependencies are 100% exactly right for Ubuntu because I've been using more openSUSE lately, but apt-cache search will turn up the correct package names if I've made a typo.
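Once those build dependencies are in place, a quick sanity check is to rebuild and then import from outside the source tree, since running Python from inside the scikit-learn source directory can itself produce the __check_build error. A sketch (the source path is hypothetical):

cd /path/to/scikit-learn        # hypothetical location of the unpacked source
sudo python setup.py install
cd ~                            # leave the source tree before importing
python -c "import sklearn; print(sklearn.__version__)"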
One of the newer ways to handle environment issues like this is to use Docker images, which lets any developer recreate the environment on any server within a minute. You can pull the image from here.
This can also be done very easily with the datmo CLI tool; we faced these problems ourselves and decided to build it.
Edit: You could install it as follows:
apt-get update; \
apt-get install -y python python-pip \
python-numpy \
python-scipy \
build-essential \
python-dev \
python-setuptools \
libatlas-dev \
libatlas3gf-base
update-alternatives --set libblas.so.3 /usr/lib/atlas-base/atlas/libblas.so.3
update-alternatives --set liblapack.so.3 /usr/lib/atlas-base/atlas/liblapack.so.3
pip install -U scikit-learn
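For example, the same recipe can be tried end to end in a throwaway container before baking it into an image (a sketch, assuming Docker is available and ubuntu:14.04 as the base; the final line just verifies the import):

docker run -it ubuntu:14.04 bash
# then, inside the container:
apt-get update
apt-get install -y python python-pip python-numpy python-scipy \
    build-essential python-dev python-setuptools libatlas-dev libatlas3gf-base
update-alternatives --set libblas.so.3 /usr/lib/atlas-base/atlas/libblas.so.3
update-alternatives --set liblapack.so.3 /usr/lib/atlas-base/atlas/liblapack.so.3
pip install -U scikit-learn
python -c "import sklearn; print(sklearn.__version__)"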
Disclaimer: I work at Datmo
Related
Why is Cython compiled to C much faster than the C++ equivalent [closed]
I sadly can't share the source code due to an NDA, but I think the question would still be interesting regardless.

Context

I have a file (my_cython_file.pyx) which I sped up with Cython in the usual way:

my_cython_file.pyx -transpiling-> my_cython_file.c -compiling-> my_cython_file.so

Since I wanted to use some external C++ libraries, I decided to transpile my code to C++ instead, with minimal changes to the pyx file:

my_cython_file.pyx -transpiling-> my_cython_file.cpp -compiling-> my_cython_file.so

Since the input file was essentially unchanged, I didn't think this would have any impact on the performance. However, the C++ version is about 20 times slower than the C version. I tried to see if anyone had a similar experience online, and it seems like it could be related to the compiler flags. I've been playing around with the flags, but I don't have much experience with compilers and haven't managed to get very far.

I'm using a really basic cythonize call in my setup.py file to carry out both compilation steps, just changing the language from C to C++ and running python setup.py build_ext --inplace. (I've pasted my setup.py file at the end of the question.)

C compilation

running build_ext
building 'my_pkg.my_cython_file' extension
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/path/to/my/venv/include -I/usr/include/python3.6m -c my_pkg/my_cython_file.c -o build/temp.linux-x86_64-3.6/my_pkg/my_cython_file.o
In file included from /usr/include/python3.6m/numpy/ndarraytypes.h:1809:0,
                 from /usr/include/python3.6m/numpy/ndarrayobject.h:18,
                 from /usr/include/python3.6m/numpy/arrayobject.h:4,
                 from my_pkg/my_cython_file.c:624:
/usr/include/python3.6m/numpy/npy_1_7_deprecated_api.h:15:2: warning: #warning "Using deprecated NumPy API, disable it by " "#defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp]
 #warning "Using deprecated NumPy API, disable it by " \
 ^~~~~~~
x86_64-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.6/my_pkg/my_cython_file.o -o build/lib.linux-x86_64-3.6/my_pkg/my_cython_file.cpython-36m-x86_64-linux-gnu.so
copying build/lib.linux-x86_64-3.6/my_pkg/my_cython_file.cpython-36m-x86_64-linux-gnu.so -> my_pkg

C++ compilation

building 'my_pkg.my_cython_file' extension
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/path/to/my/venv/lib/python3.6/site-packages/numpy/core/include -I/usr/include/python3.6m -I/usr/src/algorithms/venv_dev/include/python3.6m -c my_pkg/my_cython_file.cpp -o build/temp.linux-x86_64-3.6/my_pkg/my_cython_file.o
In file included from /usr/include/python3.6m/numpy/ndarraytypes.h:1809:0,
                 from /usr/include/python3.6m/numpy/ndarrayobject.h:18,
                 from /usr/include/python3.6m/numpy/arrayobject.h:4,
                 from my_pkg/my_cython_file.cpp:638:
/usr/include/python3.6m/numpy/npy_1_7_deprecated_api.h:15:2: warning: #warning "Using deprecated NumPy API, disable it by " "#defining NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION" [-Wcpp]
 #warning "Using deprecated NumPy API, disable it by " \
 ^~~~~~~
x86_64-linux-gnu-g++ -pthread -shared -Wl,-O1 -Wl,-Bsymbolic-functions -Wl,-Bsymbolic-functions -Wl,-z,relro -Wl,-Bsymbolic-functions -Wl,-z,relro -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 build/temp.linux-x86_64-3.6/my_pkg/my_cython_file.o -o build/lib.linux-x86_64-3.6/my_pkg/my_cython_file.cpython-36m-x86_64-linux-gnu.so

I was playing around with the compilation flags using extra_compile_args in the setup.py (see file at the end). However, those extra args get appended to the end of the command, so the log is slightly different:

x86_64-linux-gnu-gcc ......... -o /my_cython_file.o -std=c++14 -fopenmp -O3 -ffast-math
x86_64-linux-gnu-g++ -o my_cython_file.cpython-36m-x86_64-linux-gnu.so -std=c++14 -fopenmp -O3 -ffast-math

Questions

So the compilation steps look very similar... could this really be the reason for the 20x runtime difference?

I'm pretty sure that the order of the compilation flags matters, so since Cython is putting the extra_compile_args at the end of the command, do they really have an effect? For instance, there's a -O1 right at the beginning of the command and Cython adds my -O3 at the end. Which one has priority in this case?

x86_64-linux-gnu-g++ ... -O1 .... -o output.so -std=c++14 -fopenmp -O3 -ffast-math
                         ^^^^                              ^^^^
                         default                           from "extra_compile_args"

For the C++ compilation, I'm not really sure why Cython is using x86_64-linux-gnu-gcc first (to build the .o) and then x86_64-linux-gnu-g++ afterwards (to build the .so). Shouldn't it just use g++ for both? Or just run g++ once?

Appendix

Here is the C++ setup.py, for reference:

#! /usr/bin/env python3
# std imports
# from distutils.core import setup
from setuptools import setup, Extension, find_packages
import sys
import os

import numpy as np
from Cython.Build import cythonize

extensions = [
    Extension(
        "*",
        ['my_pkg/*.pyx'],
        extra_compile_args=['-std=c++14', '-fopenmp', '-O3', '-ffast-math'],
        extra_link_args=['-std=c++14', '-fopenmp', '-O3', '-ffast-math'],
    )
]

setup(
    name='my_pkg',
    version='0.1.0',
    author='me',
    packages=find_packages(),
    ext_modules=cythonize(
        extensions,
        compiler_directives={'language_level': '3'},
        gdb_debug=True,
        annotate=True,
        language='c++',
    ),
    url='todo',
    license='todo',
    install_requires=requirements,
)
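On the flag-order point specifically: for GCC and G++ the last -O option on the command line is the one that takes effect, so a trailing -O3 from extra_compile_args should override the earlier -O1/-O2. One way to check empirically which level wins for a given ordering (a sketch; -ftree-loop-vectorize is just one optimization that is off at -O1 and on at -O3 in typical GCC versions):

g++ -O1 -O3 -Q --help=optimizers | grep -- -ftree-loop-vectorize   # [enabled]  => the trailing -O3 won
g++ -O3 -O1 -Q --help=optimizers | grep -- -ftree-loop-vectorize   # [disabled] => the trailing -O1 won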
Error when installing pyminizip on docker
I am getting an error while installing the pyminizip package inside a Docker container (docker version 17.03.1-ce). I am doing it inside a virtual environment with python 2.7.13. I ended up with the message below while trying to install it:

gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -Isrc -Izlib123 -I/usr/local/include/python2.7 -c src/py_minizip.c -o build/temp.linux-x86_64-2.7/src/py_minizip.o
src/py_minizip.c: In function ‘_compress’:
src/py_minizip.c:251: warning: ‘filepathnameinzip’ may be used uninitialized in this function
gcc -pthread -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -Isrc -Izlib123 -I/usr/local/include/python2.7 -c src/zip.c -o build/temp.linux-x86_64-2.7/src/zip.o
In file included from src/zip.c:66:
src/crypt.h:34: error: redefinition of typedef ‘z_crc_t’
src/zip.h:83: note: previous declaration of ‘z_crc_t’ was here
src/zip.c:201: warning: function declaration isn’t a prototype
src/zip.c:203: warning: function declaration isn’t a prototype
error: command 'gcc' failed with exit status 1

Is there any alternative to the pyminizip package? I want to create a password-protected zip, so even if there is an alternative solution, please let me know.
It would be useful to know what Docker image you use... but try installing the Python development package:
sudo apt-get install python-dev
and the libevent libraries:
sudo apt-get install libevent-dev
This issue has been resolved. I had faced it on pyminizip 0.2.2 and 0.2.3, but version 0.2.1 is working fine for me.
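If anyone else hits this, pinning the known-good version explicitly (the one reported above) should be enough:

pip install pyminizip==0.2.1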
Intel Galileo - greenlet.h:8:20: fatal error: Python.h: No such file or directory
I'm running the Linux dev-tools image (link at the end) on my Intel Galileo. I tried to install greenlet, but I got an error stating that Python.h could not be found:

root@clanton:/media/realroot/greenlet-0.4.2# python setup.py install
running install
running build
running build_ext
creating /tmp/tmpuKbWhk/tmp
creating /tmp/tmpuKbWhk/tmp/tmpuKbWhk
i586-poky-linux-uclibc-gcc -m32 -march=i586 -fno-strict-aliasing -O2 -pipe -g -feliminate-unused-debug-types -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -fno-tree-dominator-opts -I/usr/include/python2.7 -c /tmp/tmpuKbWhk/simple.c -o /tmp/tmpuKbWhk/tmp/tmpuKbWhk/simple.o
/tmp/tmpuKbWhk/simple.c:1:6: warning: function declaration isn't a prototype [-Wstrict-prototypes]
building 'greenlet' extension
creating build
creating build/temp.linux-i586-2.7
i586-poky-linux-uclibc-gcc -m32 -march=i586 -fno-strict-aliasing -O2 -pipe -g -feliminate-unused-debug-types -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -fno-tree-dominator-opts -I/usr/include/python2.7 -c greenlet.c -o build/temp.linux-i586-2.7/greenlet.o
In file included from greenlet.c:5:0:
greenlet.h:8:20: fatal error: Python.h: No such file or directory
compilation terminated.
error: command 'i586-poky-linux-uclibc-gcc' failed with exit status 1

I know there are many posts with the same Python.h error, but my problem is that the Linux image I have can't run sudo or apt commands. I need to set up a python-dev environment on the Linux image running on the Galileo board. The link to the Linux dev-tools image I'm using is below.
http://telekinect.media.mit.edu/galileo/image-devtools-1.0.1-2.tar.bz2
P.S. It already has gcc and python2.7.
Errors while compiling Python with SSL support
I'm trying to compile Python 2.7.3 on CentOS 6. Almost everything works OK, except the thing I really need ^^. When I type make, I'm getting the error:

building '_ssl' extension
gcc -pthread -fPIC -fno-strict-aliasing -g -O2 -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I. -IInclude -I./Include -I/usr/local/include -I/root/Python-2.7.3/Include -I/root/Python-2.7.3 -c /root/Python-2.7.3/Modules/_ssl.c -o build/temp.linux-i686-2.7/root/Python-2.7.3/Modules/_ssl.o
gcc -pthread -shared build/temp.linux-i686-2.7/root/Python-2.7.3/Modules/_ssl.o -L/usr/local/lib -lssl -lcrypto -o build/lib.linux-i686-2.7/_ssl.so
*** WARNING: renaming "_ssl" since importing it failed: build/lib.linux-i686-2.7/_ssl.so: undefined symbol: krb5_auth_con_getrcache

And at the end I get a message that the build failed for the _ssl module (something like that). Have you ever faced this problem? I have installed both OpenSSL (0.9.8e fips) and OpenSSL-dev.
Make sure the right paths to openssl-dev (lib and include) are in your Makefile
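For reference, when building CPython 2.7 from source this is usually done by copying Modules/Setup.dist to Modules/Setup and uncommenting the SSL block, with SSL= pointed at the prefix that actually contains the OpenSSL headers and libraries (the path below is only an example, not necessarily where your OpenSSL lives):

SSL=/usr/local/ssl
_ssl _ssl.c \
        -DUSE_SSL -I$(SSL)/include -I$(SSL)/include/openssl \
        -L$(SSL)/lib -lssl -lcrypto

Then re-run make so the _ssl extension is rebuilt against those paths.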
How to set CFLAGS and LDFLAGS to compile pycrypto
I am trying to install the fabric library on an old machine. There are some legacy libraries in /usr/lib, such as libgmp:

(py27)[qrtt1@hcservice app]$ ls /usr/lib | grep gmp
libgmp.a libgmp.so libgmp.so.3 libgmp.so.3.3.3 libgmpxx.a libgmpxx.so libgmpxx.so.3 libgmpxx.so.3.0.5

I have compiled libgmp 5.x in my $HOME/app, and then am trying to install pycrypto (it is a dependency of fab):

CFLGAS=-I/home/qrtt1/app/include LDFLGAS=-L/home/qrtt1/app/lib pip install pycrypto

I observed that none of my include or lib directories show up in the compilation/linking options:

gcc -pthread -fno-strict-aliasing -fwrapv -Wall -Wstrict-prototypes -fPIC -std=c99 -O3 -fomit-frame-pointer -Isrc/ -I/usr/include/ -I/home/qrtt1/app/include/python2.7 -c src/_fastmath.c -o build/temp.linux-i686-2.7/src/_fastmath.o
gcc -pthread -shared build/temp.linux-i686-2.7/src/_fastmath.o -lgmp -o build/lib.linux-i686-2.7/Crypto/PublicKey/_fastmath.so
building 'Crypto.Hash._MD2' extension
gcc -pthread -fno-strict-aliasing -fwrapv -Wall -Wstrict-prototypes -fPIC -std=c99 -O3 -fomit-frame-pointer -Isrc/ -I/home/qrtt1/app/include/python2.7 -c src/MD2.c -o build/temp.linux-i686-2.7/src/MD2.o
gcc -pthread -shared build/temp.linux-i686-2.7/src/MD2.o -o build/lib.linux-i686-2.7/Crypto/Hash/_MD2.so
building 'Crypto.Hash._MD4' extension
gcc -pthread -fno-strict-aliasing -fwrapv -Wall -Wstrict-prototypes -fPIC -std=c99 -O3 -fomit-frame-pointer -Isrc/ -I/home/qrtt1/app/include/python2.7 -c src/MD4.c -o build/temp.linux-i686-2.7/src/MD4.o
gcc -pthread -shared build/temp.linux-i686-2.7/src/MD4.o -o build/lib.linux-i686-2.

How do I assign CFLAGS and LDFLAGS correctly for building pycrypto? I also tried downloading pycrypto-2.5 and installing it directly:

(py27)[qrtt1@hcservice pycrypto-2.5]$ CFLGAS=-I/home/qrtt1/app/include LDFLGAS=-L/home/qrtt1/app/lib python setup.py install

No CFLAGS or LDFLAGS are picked up there either. Maybe something is going wrong with pycrypto-2.5?
Please check what you have typed: it should be CFLAGS and LDFLAGS (not CFLGAS / LDFLGAS):

CFLAGS=-I/home/qrtt1/app/include LDFLAGS=-L/home/qrtt1/app/lib pip install pycrypto
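Equivalently, you can export the correctly spelled variables first and then run pip:

export CFLAGS=-I/home/qrtt1/app/include
export LDFLAGS=-L/home/qrtt1/app/lib
pip install pycrypto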