After using the netCDF4-python package without any trouble, I needed to enable parallel file access. As I could not find suitable ready-made combinations of HDF5 and netCDF C libraries built against the same MPI library, I decided to build each of the packages from source with OpenMPI. However, importing the netCDF4 package fails due to unresolved symbols:
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
in
----> 1 import netCDF4

~/anaconda3/lib/python3.6/site-packages/netCDF4-1.5.1.2-py3.6-linux-x86_64.egg/netCDF4/__init__.py in <module>
      1 # init for netCDF4. package
      2 # Docstring comes from extension module _netCDF4.
----> 3 from ._netCDF4 import *
      4 # Need explicit imports for names beginning with underscores
      5 from ._netCDF4 import __doc__, __pdoc__

ImportError: /usr/local/lib/libnetcdf.so.15: undefined symbol: H5Pset_dxpl_mpio
I tried installing the netcdf4-python package via pip install, and it imports fine for serial file access, but it still fails to open a file with parallel=True, stating that it requires a parallel-enabled netcdf-c.
I am installing HDF5 with
export NCPROCS=4
export CC=mpicc
./configure --prefix=/usr/local/ --enable-parallel --enable-hl
make check
sudo make install
and netCDF-C with
export NCPROCS=4
export CC=mpicc
./configure --prefix=/usr/local/ --enable-parallel-tests
make check
sudo make install
and netCDF4-python using nc-config with
export CC=mpicc
python setup.py install
Each step recognizes the parallel functionality. Am I missing a linking step somewhere in the build process, or why can't libnetcdf.so.15 find the symbols from the HDF5 library?
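One way to narrow this down is to check which HDF5 library libnetcdf actually resolves at runtime, and whether that library exports the missing symbol. A diagnostic sketch (the library paths assume the /usr/local prefix used above; the exact .so names may differ on your system):

```shell
# Which HDF5 does libnetcdf actually resolve at runtime?
ldd /usr/local/lib/libnetcdf.so.15 | grep -i hdf5

# Does that HDF5 build export the parallel I/O symbol?
nm -D /usr/local/lib/libhdf5.so | grep H5Pset_dxpl_mpio

# Does nc-config agree that parallel support was compiled in?
nc-config --has-parallel
```

If ldd shows a serial libhdf5 (for example one shipped by the distribution or by Anaconda) being picked up instead of the parallel build, pointing LD_LIBRARY_PATH at /usr/local/lib before importing netCDF4 is worth trying.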
I am having some issues using CoreMLTools, so I am going to try to debug it using my own GitHub repo. I forked the original project without modifying anything, but I get an error when I try to import it.
This works perfectly fine:
!pip install coremltools
import coremltools as ct
But this:
!pip install git+https://github.com/[Owner]/[Repo Name].git
import coremltools as ct # Identical Forked Copy of CoreMLTools
has the following error:
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-4-1b9428219e3f> in <module>
1 #!pip install coremltools
2 get_ipython().system('pip install git+https://github.com/[Owner]/[Repo].git')
----> 3 import coremltools as ct
8 frames
/usr/local/lib/python3.7/dist-packages/coremltools/converters/mil/frontend/milproto/load.py in <module>
25 from coremltools.converters.mil.mil.block import curr_block, curr_opset_version
26 from coremltools.converters.mil.mil.ops.registry import SSAOpRegistry as _SSAOpRegistry
---> 27 from coremltools.libmilstoragepython import _BlobStorageReader as BlobReader
28 from coremltools.proto import (
29 MIL_pb2 as pm,
ModuleNotFoundError: No module named 'coremltools.libmilstoragepython'
EDIT:
The following link seems to be a C++ file that does something with the module that can't be found.
https://github.com/apple/coremltools/blob/973eae67f2f273a29e80a9b009987516a070a58b/milstoragepython/MilStoragePython.cpp
coremltools does not build the libmilstoragepython extension if you do a pip install of its source code - that just runs setup.py, which doesn't perform any of the C++ compilation steps.
Instead, what you have to do is build the entire project manually and install the wheel file it generates. The main documentation for this is their BUILDING.md.
It looks like you're on some flavor of Linux, so you're probably going to need the following dependencies:
C++ compiler with C++17 support (gcc-8 or newer)
conda (either miniconda or Anaconda)
cmake (either through package manager or installed from pip/conda)
zsh
libuuid-devel (your distro might call this uuid-devel or similar)
# You need conda installed and configured for zsh
# https://docs.conda.io/en/latest/miniconda.html
# substitute your own coremltools fork here
git clone https://github.com/apple/coremltools/
cd coremltools
# You can specify whatever version of Conda-supported Python you want this for in the range of...
# 3.7, 3.8, 3.9
# I also assume you want debug mode
zsh -i scripts/build.sh --python=3.7 --dist --debug
# this will output many cmake messages, but hopefully it succeeds
After this, you will have a directory called build/dist that contains a .whl file that can be installed
ls build/dist
coremltools-6.0b2-cp37-none-manylinux1_x86_64.whl
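The wheel can then be installed directly; the filename below is the one from the listing above and will vary with the version and the Python you built for:

```shell
# install the locally built wheel (use pip from the same Python you built against)
pip install build/dist/coremltools-6.0b2-cp37-none-manylinux1_x86_64.whl

# sanity check: the compiled extension module should now be importable
python -c "import coremltools.libmilstoragepython"
```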
I failed to mention that this was running in Google Colab. After running:
! /content/coremltools/milstoragepython/MilStorage.cpp
I got a permission denied message. It seems you can't just execute an arbitrary C++ source file in Colab (it would need to be compiled first).
I have a problem when I'm trying to use OpenCV (v3.1.0) in Python (v3.4). To make things more complicated, OpenCV is built with the "contrib" package and Qt support (v5.5).
This is what I did (this has some pitfalls on its own, but those are out of the scope of this question):
Install required software
I installed WinPython (v3.4.4, 64 Bit version)
I installed cmake (v3.6.0)
Download sources
I downloaded the Qt sources from the GIT repo
I downloaded the OpenCV sources from the GIT repo
I downloaded the OpenCV "contrib" sources from the GIT repo
build sources -> RELEASE, 64 Bit (!)
build Qt from sources
build OpenCV from sources (with "contrib" modules)
cmake (OpenCV 3.1.0 for Python 3) already took care of copying "cv2.pyd" to the "your/python/folder/Lib/site-packages" directory.
Now I have the problem that calling "import cv2" from Python gives the following error:
>>> import cv2
Traceback (most recent call last):
File "<pyshell#0>", line 1, in <module>
import cv2
ImportError: DLL load failed: Module not found
How can I find out which module is missing and how to solve it?
What you probably don't want to do is blindly add every path that might be needed to your environment variables.
There is a way to find out what is missing:
Download DependencyWalker
open your "cv2.pyd" with the dependency walker
Analyze your file (starts automatically when you select your file)
Hit "F9" so that it shows the full paths of the required DLLs
Check which DLL files are missing
Copy the missing DLLs into the folders where they are searched for
For me, it was the case that python/cv2.pyd searched for all the Qt DLLs in the folder where cv2.pyd itself is located.
I just copied them there and that was it.
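An alternative to copying the DLLs around is to prepend their directory to PATH before importing cv2, so the Windows loader can find them. A minimal sketch; the Qt path here is a hypothetical example you would replace with your own build location:

```python
import os

# Hypothetical location of the Qt DLLs from your build; adjust to your system
qt_bin = r"C:\Qt\5.5\msvc2013_64\bin"

# Prepend it so the Windows loader finds the Qt DLLs when cv2.pyd is loaded
os.environ["PATH"] = qt_bin + os.pathsep + os.environ.get("PATH", "")

# import cv2  # should now resolve its Qt dependencies
```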
For Windows:
Step 1. Download the Python installer from https://www.python.org/downloads/, install Python by double-clicking the file, and check the option to add Python to PATH during installation.
Step 2. Download the file "numpy-1.15.4+mkl-cp27-cp27m-win32.whl" from the unofficial site.
Go to this file's location, open a command prompt, and type:
pip install numpy-1.15.4+mkl-cp27-cp27m-win32.whl
Now go to python IDLE and type:
import numpy
It should give no error.
Step 3. Download the file "scipy-1.2.1-cp27-cp27m-win32.whl" from the unofficial site.
Go to this file's location, open a command prompt, and type:
pip install scipy-1.2.1-cp27-cp27m-win32.whl
Now go to python IDLE and type:
import scipy
It should give no error.
Step 4. Download the file "opencv_python-2.4.13.7-cp27-cp27m-win32.whl" from the unofficial site.
Go to this file's location, open a command prompt, and type:
pip install opencv_python-2.4.13.7-cp27-cp27m-win32.whl
Now go to python IDLE and type:
import cv2
It should give no error, which means OpenCV installed successfully on Windows.
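To confirm all three wheels landed correctly in one go, a small check like the following works (it tries each import and reports the result instead of stopping at the first failure):

```python
# Quick sanity check that each wheel installed; any failure prints MISSING
for name in ("numpy", "scipy", "cv2"):
    try:
        __import__(name)
        print("{}: installed".format(name))
    except ImportError:
        print("{}: MISSING".format(name))
```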
I'm trying to install a Node-based web server on a Cortex-A7 embedded system. The repo is pulled via git and I need to run npm install to install the node modules.
The server uses sqlite3, but that package fails during installation, specifically at the build stage, because Python cannot find the tarfile module.
node -v // 4.3.1
npm -v // 2.14.12
root@imx6ul-var-dart:~/gateway-server# npm install
> sqlite3@3.1.1 install /home/root/gateway-server/node_modules/sqlite3
> node-pre-gyp install --fallback-to-build
Traceback (most recent call last):
File "./extract.py", line 2, in <module>
import tarfile
ImportError: No module named tarfile
deps/action_before_build.target.mk:13: recipe for target 'Release/obj/gen/sqlite-autoconf-3090100/sqlite3.c' failed
make: *** [Release/obj/gen/sqlite-autoconf-3090100/sqlite3.c] Error 1
make: Leaving directory '/home/root/gateway-server/node_modules/sqlite3/build'
I'm aware that many users experience issues with gyp/node-gyp, but here it seems it's actually the Python script extract.py that fails when trying to import tarfile ... which should be a core module.
I haven't been able to find references to this in my searches, and frankly I'm not a Python guy. Ideas?
For anyone finding this after the fact: it seems you have to install the python-modules package (in my case, with opkg) to get all the Python standard libraries. I don't know why, but this brings in tarfile and its dependency zlib.
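Once python-modules is installed, the same import that extract.py performs can be checked directly, which confirms the build will get past this point:

```python
# extract.py fails at "import tarfile"; verify the stdlib module is now present
import tarfile

# tarfile needs zlib support for .tar.gz archives, so check that too
import zlib

print("tarfile loaded from:", tarfile.__file__)
```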
I am trying to export a GeoTiff with Blender using the Blender Python API (based on Python 3), so I've decided to install GDAL on Ubuntu (14.04). What I would like is to get the module as a standalone folder that I could put in the modules directory of Blender (/home/user/.config/blender/2.73/scripts/modules).
The thing is, I've run into several different problems trying to install GDAL. I've tried to install from source (GDAL 2.0.0) from here: Official PyPI GDAL.
I ran sudo apt-get install libgdal-dev gdal-bin (I list it here because it may be important)
When I am in the extracted GDAL folder, using python setup.py build && python setup.py install, the library installs to /usr/local/lib/python2.7/dist-packages/osgeo. However, when I run python from the command line, from osgeo import osr returns ImportError: No module named _gdal
Following GDAL via pip, I used pip (pip install GDAL) to install the library, and the folder it went to was /usr/lib/python3/dist-packages/osgeo (found using pip show). Again, running python3 and trying to import results in the same error. Of course, when I copy each folder into the Blender module directory, I get the same error in the Blender Python console.
So I decided to compile the sources using ./configure --with-python && make && make install in the source folder. I then copied the folder GDAL-x.x.x/build/lib.linux-x86_64-3.4/osgeo to the Blender modules directory and this time got the following error when importing: ImportError: /home/yvesu/.config/blender/2.73/scripts/modules/osgeo/_gdal.so: undefined symbol: _Py_ZeroStruct.
Trying to compile with python3 using python3 setup.py build returns the error error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
EDIT 1:
I think I've found the solution: I went to the directory swig/python (not found in a GDAL-1.11.0 folder but in a gdal-1.11.0 folder; I can't remember where I downloaded it from), ran python3 setup.py build && python3 setup.py install, and could finally find the folder in /usr/local/lib/python3.4/dist-packages/GDAL-1.11.0-py3.4-linux-x86_64.egg/osgeo. When I put this osgeo folder into the Blender modules directory, I was able to import osgeo in Blender. I will report if anything goes wrong.
I think I've listed all my attempts at installing GDAL on Ubuntu. Can anyone point me in the right direction? Do you think it is even possible to install it as a standalone module, or do I need linked libraries through LD_LIBRARY_PATH?
Here is the solution I've found:
Download the GDAL sources (v2.0.0 is the current stable release) from ftp://ftp.remotesensing.org/gdal/2.0.0/ or http://download.osgeo.org/gdal/2.0.0/ and untar them
Go to the directory gdal-2.0.0/swig/python
Run python3 setup.py build && python3 setup.py install
Finally, find the module folder; on Ubuntu it is /usr/local/lib/python3.4/dist-packages/GDAL-2.0.0-py3.4-linux-x86_64.egg/osgeo
I can now use it in Blender (after copying it into the modules directory)
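The steps above as a single shell sketch; the version numbers, egg path, and Blender path are the ones from this setup (Ubuntu 14.04, Python 3.4, Blender 2.73) and will differ on other systems:

```shell
# fetch and unpack the GDAL 2.0.0 sources (mirror URLs may change)
wget http://download.osgeo.org/gdal/2.0.0/gdal-2.0.0.tar.gz
tar xzf gdal-2.0.0.tar.gz

# build and install only the Python bindings
cd gdal-2.0.0/swig/python
python3 setup.py build && sudo python3 setup.py install

# copy the resulting osgeo package into Blender's modules directory
cp -r /usr/local/lib/python3.4/dist-packages/GDAL-2.0.0-py3.4-linux-x86_64.egg/osgeo \
      ~/.config/blender/2.73/scripts/modules/
```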
I am trying to follow this tutorial and getting an error when I do the following:
(DataVizProject) $ pip install -r requirements.txt
It gives me a big error log, the last few lines of which are:
C:\python\new-coder\dataviz\DataVizProj\build\numpy\numpy\distutils\system_info.py:1422: UserWarning:
Lapack (http://www.netlib.org/lapack/) sources not found.
Directories to search for the sources can be specified in the
numpy/distutils/site.cfg file (section [lapack_src]) or by setting
the LAPACK_SRC environment variable.
warnings.warn(LapackSrcNotFoundError.__doc__)
error: Unable to find vcvarsall.bat
----------------------------------------
Cleaning up...
Command python setup.py egg_info failed with error code 1 in C:\python\new-coder
\dataviz\DataVizProj\build\numpy
I can tell it doesn't work because when I do the following, the imports fail:
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named numpy
>>> import matplotlib
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named matplotlib
>>>
Thanks a lot!
@Hasnain, when you are using Python on Windows, you will eventually see this error for some packages.
You have three options when it happens (in order of relevance):
1 - Try to download an MSI file. It will install the library without any problems. For numpy specifically you can download it here (http://www.lfd.uci.edu/~gohlke/pythonlibs/#numpy)
2 - You can download the whole source and try python setup.py install, modifying the package if it raises some errors.
3 - You can compile your own library for your operating system.
Many such files can be found here (http://www.lfd.uci.edu/~gohlke/pythonlibs/)
The method recommended in that tutorial works well on Unix systems. If you are on Windows, you will go through a lot of trouble trying to build numpy from source with pip. I will save you some time: follow the official recommendation and try one of the binary installs listed on the official SciPy website. I personally recommend the Anaconda or Enthought distribution.
Usually when installing packages on Windows, Python searches for Visual Studio 2008 by default. You can either install it or use the MinGW compiler.
If you decide to use MinGW, you should edit your distutils.cfg file in the Python27\Lib\distutils directory:
[build]
compiler = mingw32
[build_ext]
compiler = mingw32