I am trying to install pysqlite (the Python interface to SQLite). I downloaded the package file (pysqlite-2.5.5.tar.gz) and did the following:
gunzip pysqlite-2.5.5.tar.gz
tar xvf pysqlite-2.5.5.tar
cd pysqlite-2.5.5
python setup.py install
At the last step I have a problem. I get the following error message:
error: command 'gcc' failed with exit status 1
I found that other people have also had this problem.
As far as I understood, that person had the problem because sqlite2 was not installed. But in my case I have sqlite3 (I can run it from the command line).
Maybe I should change some paths in "setup.cfg"? At the moment I have this there:
#define=
#include_dirs=/usr/local/include
#library_dirs=/usr/local/lib
libraries=sqlite3
define=SQLITE_OMIT_LOAD_EXTENSION
And if I type "which sqlite3" I get:
/usr/bin/sqlite3
I saw a similar question here. The answer was "you need sqlite3-dev". But even if that is the case, how do I check whether I have sqlite3-dev? And if I do not have it, how do I get it?
Can anybody please help me with this problem?
Thank you in advance.
For Debian distros I fixed this problem with
sudo apt-get install libsqlite3-dev
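If you want to confirm the headers actually landed and then rebuild, something along these lines should work (assuming the package installed into the usual /usr/include location):
dpkg -L libsqlite3-dev | grep sqlite3.h   # should list /usr/include/sqlite3.h
python setup.py build_ext --force         # rebuild the C extension now that the header is present
python setup.py install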
I was able to resolve the same build error by installing the sqlite-devel package:
sudo yum install sqlite-devel
I had the same problem. I'm using Python 2.4, and neither sqlite3-dev nor libsqlite3-dev is available for CentOS.
yum install python-devel
seems to solve the issue.
how to check if I have "sqlite3-dev"
That depends entirely on which Linux distro you're using -- Fedora, Suse, Ubuntu, Gentoo, Mandrake, or any of the dozens of others out there; there are several packaging strategies and different tools for checking which packages are installed, getting more, and so forth.
So, never ask questions about checking, getting or tweaking packages on Linux without specifying the distribution(s) of interest -- otherwise it is essentially impossible to offer precise, specific help.
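For example (a rough sketch; exact package names vary by distro), on Debian/Ubuntu-style systems you would ask dpkg, and on RPM-based ones you would ask rpm:
dpkg -s libsqlite3-dev    # Debian/Ubuntu: is the SQLite dev package installed?
rpm -q sqlite-devel       # Fedora/CentOS/Suse: the equivalent check on RPM-based systems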
Edit: the simplest way I know of getting details about your Linux distribution (works on all the ones I have at hand to try, but I don't have a particularly wide array...;-):
$ cat /etc/*-release
DISTRIB_CODENAME=hardy
DISTRIB_DESCRIPTION="Ubuntu 8.04.2"
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=8.04
...etc, etc...
This is probably going to be the contents of file /etc/lsb-release, but I'm suggesting the *-release because I think there may be some other such files involved.
Of course, if the need to check your distro applies inside a file or program, reading this file (or files) and locating specific contents will also be quite feasible; but for the purpose of informing would-be helpers about what distro you're using, the cat at the shell prompt is going to be quite sufficient;-).
What version of Python do you have? SQLite support has been integrated into Python since 2.5:
http://docs.python.org/library/sqlite3.html
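A quick way to check whether the bundled module is already available (so no compiling is needed at all) is a one-liner like:
python -c "import sqlite3; print(sqlite3.sqlite_version)"   # works on Python >= 2.5 if the module was built in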
If you insist on compiling it yourself, the package is called sqlite3-devel; you can find it, e.g., here.
You could use yum or apt-get instead.
First type:
sudo yum (or apt-get) search python-sqlite3
You will get something like python-sqlite3dbm.noarch.
Then type:
sudo yum (or apt-get) install python-sqlite3dbm.noarch
This way your OS will install everything you need and you won't get errors.
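As a concrete sketch (note that apt-get itself has no search subcommand; on Debian-based systems apt-cache does the searching):
yum search sqlite               # RPM-based systems
apt-cache search python sqlite  # Debian/Ubuntu
Then install whichever package the search turns up, as above.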
I had the following compile errors on CentOS release 5.6:
src/cache.h:34: error: expected specifier-qualifier-list before 'PyObject_HEAD'
src/cache.h:44: error: expected specifier-qualifier-list before 'PyObject_HEAD'
src/cache.h:61: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'pysqlite_NodeType'
src/cache.h:62: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'pysqlite_CacheType'
src/cache.h:64: error: expected declaration specifiers or '...' before 'PyObject'
src/cache.h:64: error: expected declaration specifiers or '...' before 'PyObject'
src/cache.h:67: error: expected declaration specifiers or '...' before 'PyObject'
src/cache.h:67: error: expected declaration specifiers or '...' before 'PyObject'
Installing python-devel helped me too:
yum install python-devel
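Those errors come from the compiler not finding Python's own headers (Python.h and friends); once python-devel is installed you can sanity-check with something like:
rpm -ql python-devel | grep Python.h   # path will look like /usr/include/python2.4/Python.h, depending on your Python version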
I'm the one who answered the other question :) On systems that use RPM packages, i.e. where you normally use 'yum' to install things, the package is named sqlite3-devel.
On most Debian-based systems (i.e. where you use apt-get to install packages), the package is named libsqlite3-dev.
This is a very typical difference between the two; most other packages follow the same naming convention.
I had the same trouble with gcc failing on Ubuntu Karmic. I fixed it by installing the python-dev package. In my case I'm working with Python 2.4, so I installed the python2.4-dev package. The python-dev package should work for Python 2.6.
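In other words (adjust the version suffix to match your interpreter):
sudo apt-get install python2.4-dev   # or python2.6-dev / python-dev, depending on your Python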
Did you install the python sqlite lib?
sudo apt-get install python-sqlite
You need to install the plugin from http://yum.baseurl.org/download/yum-metadata-parser/
wget -c "http://yum.baseurl.org/download/yum-metadata-parser/yum-metadata-parser-1.1.4.tar.gz"
Then install it:
tar zxf yum-metadata-parser-1.1.4.tar.gz && cd yum-metadata-parser-1.1.4
/path/to/python setup.py install
I recently deleted the default Python version on Fedora 31, installed Python 3.9, and made it the default; now I have multiple versions of Python.
If I type whereis python in my terminal, this list appears:
python: /usr/bin/python /usr/bin/python3.9 /usr/bin/python3.7 /usr/bin/python3.9-config /usr/bin/python3.7m /usr/bin/python3.9-x86_64-config /usr/lib/python3.9 /usr/lib/python2.6 /usr/lib/python3.7 /usr/lib64/python3.9 /usr/lib64/python3.7 /usr/local/bin/python3.7m-config /usr/local/bin/python3.7 /usr/local/bin/python3.7m /usr/local/lib/python3.7 /usr/include/python3.9 /usr/include/python3.7m /usr/share/man/man1/python.1.gz /usr/src/Python-3.7.4/python
If I type pip, I get: ModuleNotFoundError: No module named 'pip'
Multiple packages are also broken, such as dnf, argcomplete, pip, etc.
I cannot update or install anything.
How can I solve this problem?
Grab/download the original Python RPMs for your distro (if they're not still cached under /var ....) and reinstall them that way.
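Roughly, and assuming something is still sitting in the dnf cache (exact paths and file names will differ on your system):
ls /var/cache/dnf/*/packages/ | grep '^python3'                  # anything still cached?
sudo rpm -Uvh --force /var/cache/dnf/*/packages/python3-*.rpm
Otherwise, fetch the python3 and python3-libs RPMs for your release from a mirror (e.g. on another machine) and install them with the same rpm -Uvh --force.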
With Python 3.9 you should use pip3, so install python3-pip.
That should do the trick.
I tried many solutions and none of them worked, so in the end I backed up my data, completely wiped the OS, downloaded the latest version of Fedora, and restored my data onto it.
Thanks for your time.
I ran into this unfortunate situation as well on Fedora 35. dnf, yum, and a bunch of other things broke.
I didn't manage to get Python 3.10 back through dnf, yum, or apt-get. I downloaded the rpm from https://fedora.pkgs.org/35/fedora-x86_64/python3-3.10.0-1.fc35.x86_64.rpm.html. It did require a dependency of python3-libs which I downloaded from: https://fedora.pkgs.org/35/fedora-x86_64/python3-libs-3.10.0-1.fc35.x86_64.rpm.html.
I installed python3-libs first with sudo rpm -i python3-libs-3.10.0-1.fc35.x86_64.rpm --force as there were some file writing conflicts. I ran the same command for the python3.10 rpm with the --force flag as well since there were 2 conflicts. After that, everything worked perfectly! Managed to dodge having to do a full reinstall.
I am trying to build the Boost Python library on my Ubuntu machine. However, when I execute
./b2 --with-python
It always returns errors like:
./boost/python/detail/wrap_python.hpp:57:11: fatal error: pyconfig.h: No such file or directory
# include <pyconfig.h>
^~~~~~~~~~~~
I tried to look up online, e.g., https://github.com/boostorg/build/issues/289
Following their suggestion, I checked my "project-config.jam" and found:
# Python configuration
import python ;
if ! [ python.configured ]
{
using python : 3.7 : /home/lowlimb/anaconda3 :/home/lowlimb/anaconda3/include/python3.7m;
}
This looks correct, so I really don't know how to fix this issue.
Can anyone give me some advice?
In addition to installing the Python dev libs as suggested by the other answers, you can specify the Python include path directly:
CPLUS_INCLUDE_PATH=/usr/include/python3.7 make
Or in your case something like:
CPLUS_INCLUDE_PATH=/home/lowlimb/anaconda3/include/python3.7 ./b2
This worked for me when compiling a project using Boost Python where I got the same error.
pyconfig.h is installed with sudo apt install python-dev
To build with a specific python version, you can do
./bootstrap.sh --with-python=<path to python>
e.g.
./bootstrap.sh --with-python=python3
to use your system's python3 or
./bootstrap.sh --with-python=$VIRTUAL_ENV/bin/python
to use the python from your virtual environment.
In order to build Boost.Python or, more generally, to use Python from C/C++, you need the Python development files:
$ sudo apt install python3.7-dev
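You can verify that the header is in place afterwards with, e.g.:
dpkg -L python3.7-dev | grep pyconfig.h   # should point at something like /usr/include/python3.7m/pyconfig.h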
I am trying to install cartopy on a Windows machine, and have previously installed QGIS and GEOS through OSGeo4W64. Now, when I try installing cartopy, I get the following error:
fatal error: 'geos_c.h' file not found
As mentioned, GEOS is installed and the file is indeed present in that directory. I then tried giving pip the absolute path to the library as a global option, as follows:
pip install --global-option="-Lc:\OSGeo4W64\include"
This, unfortunately, didn't work because pip didn't recognise the -L library option:
error: option -L not recognized.
I tried -I, -l, and -i as well, just to see what would happen, but I got the same error every time. I also found examples of how to pass paths to --global-option, and they used -L and -I without problems. What could I be doing wrong?
Any help would be greatly appreciated.
Which GEOS header file you should be compiling against depends on where you get your GEOS from. If you get it from Christoph Gohlke's excellent binaries, or from conda-forge, Enthought, or Anaconda, I believe they all rename geos_c.h to geos.h. If you get it from other sources, that renaming may not take place.
You can see how conda-forge builds cartopy on Windows at https://github.com/conda-forge/cartopy-feedstock/blob/master/recipe/. The two important files:
https://github.com/conda-forge/cartopy-feedstock/blob/master/recipe/bld.bat
https://github.com/conda-forge/cartopy-feedstock/blob/master/recipe/cartopy.win.patch
Notice how that latter patch file renames the header dependency to geos.h rather than geos_c.h, because it is using the GEOS packaged by conda-forge. You may need to do a similar thing in your situation.
A history on this subject can also be found at https://github.com/SciTools/conda-recipes-scitools/issues/29#issuecomment-66497972.
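As an aside on the original -L error: --global-option flags are handed to setup.py itself, so the -I/-L switches need to follow a build_ext sub-command to be understood. Something along these lines may get them through (an untested sketch; the lib directory name is a guess):
pip install cartopy --global-option=build_ext --global-option="-IC:\OSGeo4W64\include" --global-option="-LC:\OSGeo4W64\lib"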
I want to install PIL on Mavericks using pip, but I get this error:
_imagingft.c:73:10: fatal error: 'freetype/fterrors.h' file not found
#include <freetype/fterrors.h>
^
1 error generated.
error: command 'cc' failed with exit status 1
My Command Line Tools are installed and up to date, and none of the hints I found helped.
How can I get this to compile?
EDIT: I just checked; freetype is also already installed via Homebrew.
Instead of symlinking to a specific version of freetype2, do this:
ln -s /usr/local/include/freetype2 /usr/local/include/freetype
This saves you the trouble of recreating the symlink whenever you upgrade freetype2.
With MacPorts, the solution that worked for me:
sudo port install freetype
sudo ln -s /opt/local/include/freetype2 /opt/local/include/freetype
And then re-run the PIL build process.
I've solved this problem with this symlink:
ln -s /usr/local/Cellar/freetype/2.5.1/include/freetype2 /usr/local/include/freetype
I have freetype already installed via homebrew too.
This is caused by a change in the headers of freetype >= 2.1.5. PIL is not using the correct documented way to include the freetype headers, which causes the build to fail now that freetype finally removed the long-deprecated way of including the headers. This problem is documented right at the top of http://freetype.sourceforge.net/freetype2/docs/tutorial/step1.html:
NOTE: Starting with FreeType 2.1.6, the old header file inclusion scheme is no longer supported. This means that you now get an error if you do something like the following:
#include <freetype/freetype.h>
#include <freetype/ftglyph.h>
Please take this problem upstream to the developers of PIL and advise them to use the documented way of including freetype headers:
#include <ft2build.h>
#include FT_ERRORS_H
After many attempts, I solved this problem by compiling PIL without freetype support. To do that, I simply unlinked freetype with brew unlink freetype and then ran pip install PIL==1.1.7.
I just solved this using the steps described in this Stack Overflow answer.
Seems this is Xcode's fault for installing freetype in strange locations.
Use Pillow where this issue is fixed "for real":
https://github.com/python-pillow/Pillow/commit/c6040f618d8f2706a7b46d1cdf37d1a587f9701f
And where you can report issues and see them addressed in a timely fashion:
https://github.com/python-pillow/Pillow/issues
On my OS X system, I found the .h file in the /opt/local/include/freetype2 directory. So I typed
sudo ln -s /opt/local/include/freetype2/ /usr/local/include/freetype
and it works.
Maybe the best way is to add /opt/local/include to your clang's include path.
On OS X Yosemite, this worked for me:
(virtualenv)
$ ln -s /opt/local/include/freetype2/ /usr/local/include/freetype2
$ pip install pil==1.1.7 --allow-external pil --allow-unverified pil
I'm using Arch Linux and had this issue. In my case I had to manually download and unpack the zip file from https://pypi.python.org/pypi/Pillow/2.2.1#downloads. I then edited the file _imagingft.c to change the include path from freetype/fterrors.h to fterrors.h, since there was no freetype subdirectory of /usr/include/freetype2, where fterrors.h was located. After that, python setup.py install worked fine.
Edit: I should mention this was the solution for installing Pillow, not PIL, but Pillow is just a fork of PIL and it may still be applicable to others with this issue.
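For reference, the manual tweak described above boils down to roughly this, run inside the unpacked Pillow 2.2.1 source tree:
sed -i 's|freetype/fterrors.h|fterrors.h|' _imagingft.c   # point the include directly at fterrors.h
python setup.py install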
If you're still looking for answers like I was after reading this and other googling, you may be interested to see this:
Warning
Pillow >= 2.1.0 no longer supports “import _imaging”. Please use “from PIL.Image import core as _imaging” instead.
from here
By the time you read this, the page will probably have changed, but the text will still be there at least.
I'm not having much success attempting to build pgmagick on OS X Lion with Xcode 4.3.1.
I've installed both ImageMagick and GraphicsMagick, alongside Boost, using the following commands (via Homebrew):
$ brew install graphicsmagick --with-magick-plus-plus
$ brew install imagemagick --with-magick-plus-plus
$ brew install boost --with-thread-unsafe
Then I clone the repo at https://bitbucket.org/hhatto/pgmagick:
$ hg clone https://bitbucket.org/hhatto/pgmagick/src
$ cd pgmagick
$ python setup.py build
However, I always receive the following error:
ld: library not found for -lboost_python
collect2: ld returned 1 exit status
Based on the output on stdout, setup.py is looking in the right place for Boost (/usr/local/lib).
I've also tried easy_install and pip, but with no luck. I'm using Pythonbrew, but have also disabled it and tried the stock Python install -- still no success.
Any suggestions on how I can fix the problem, or further diagnose the issue?
According to my own reproduction of this issue with brew 0.9 and OS X 10.6.8, the problem is that --with-thread-unsafe isn't being honored by the current brew formula. You can verify this by opening the formula with brew edit boost and checking whether the option appears anywhere in it.
Because of this, libboost_python-mt.a and libboost_python-mt.dylib are being built instead of libboost_python.a and libboost_python.dylib.
The easiest ways to fix this are to edit your pgmagick setup.py to replace boost_lib="boost_python" with boost_lib="boost_python-mt" (as pointed out here) or to follow the instructions and patch here. It's otherwise a known issue.
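On OS X that one-line edit can also be done with BSD sed, something like this (back up setup.py first):
sed -i '' 's/boost_lib="boost_python"/boost_lib="boost_python-mt"/' setup.py
python setup.py build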
The boost_python lib inside /usr/local/lib/ is named libboost_python-mt.a and libboost_python-mt.dylib, since the default build has multi-threading support enabled.
Grep for boost_lib="boost_python" under the ELSE condition in setup.py and replace it with boost_lib="boost_python-mt"; that will fix the "not found" issue.
Alternatively, it's OK to symlink the "-mt" version to libboost_python.a, as described here for Linux Boost, which no longer appends the '-mt' suffix since 1.42.
(Ignore this one: you could also run "with-boost-python=boost_python-mt python setup.py install".)
You could probably append '--with-boost-python=boost_python-mt' to extra_compile_args inside setup.py, to achieve the same goal.
Furthermore, you could install pgmagick through pip in managed environments. See http://rohanradio.com/blog/2011/12/02/installing-pgmagick-on-os-x/
Note that as of July 2014 the Boost Python library is a separate Homebrew package called boost-python.
5254f8f510fb30484f8fac8be3d38e388a4392e2
Author: Tim D. Smith <git#tim-smith.us>
Date: Sat Jul 19 15:37:25 2014 -0700
Split out Boost.Python
You need to install it separately to get the libboost_python shared library.
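That is (the formula name as of that commit; it may differ on newer Homebrew versions):
brew install boost-python
ls /usr/local/lib/libboost_python*   # the library should show up here afterwards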
Does setting DYLD_FALLBACK_LIBRARY_PATH=/usr/local/lib in the environment before the build help?
e.g.
$ export DYLD_FALLBACK_LIBRARY_PATH=/usr/local/lib
$ hg clone https://bitbucket.org/hhatto/pgmagick/src
$ cd pgmagick
$ python setup.py build
I've submitted a pull request to Homebrew to build Boost with both the mt and non-mt (threaded and thread-unsafe) binaries, which are required to build pgmagick.
It turns out this is a rather common problem. Until the patch is accepted, you can check out or use my formula for Boost to build pgmagick.