Illegal instruction: 4 when importing python plugins - python

I tried to install the hoomd_script molecular dynamics software on my iMac (a pre-2009 iMac running OS X El Capitan v10.11.3). I compiled it successfully, but when I import hoomd_script in Python 2.7.12, Python crashes completely and I get the error:
Illegal instruction: 4.
I have installed all the prerequisite packages (including boost, sphinx, git, mpich2, numpy, cmake, pkg-config, and sqlite) using conda.
I ran python -vc 'hoomd_script' to test, and the result is here. I tried reinstalling all the packages, including conda, and recompiling hoomd, but nothing changed. How can I fix this? Thanks!

As stated on the HOOMD-blue web page, the conda builds require a CPU with AVX support (2011 or newer). The illegal instruction occurs because the binary is trying to execute an instruction that your processor does not support.
Compiling hoomd from a clean build directory on your own system should produce a binary that your machine can execute. Note that conda-provided prerequisite libraries are difficult to work with; I recommend MacPorts or Homebrew instead.
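A quick way to confirm this diagnosis is to check whether the CPU advertises AVX at all. This is a sketch, not part of HOOMD itself; it covers Linux (/proc/cpuinfo) and macOS (sysctl), and falls back to False elsewhere:

```python
# Sketch: check whether the CPU advertises AVX before installing conda
# builds of HOOMD-blue (which require it). Platform handling is minimal.
import platform
import subprocess

def cpu_has_avx():
    system = platform.system()
    if system == "Linux":
        # /proc/cpuinfo lists feature flags as space-separated tokens
        with open("/proc/cpuinfo") as f:
            return "avx" in f.read().lower().split()
    if system == "Darwin":
        # macOS exposes CPU feature flags through sysctl
        out = subprocess.run(["sysctl", "-n", "machdep.cpu.features"],
                             capture_output=True, text=True).stdout
        return "avx" in out.lower().split()
    return False

print("CPU supports AVX:", cpu_has_avx())
```

On a pre-2009 iMac this should print False, which matches the "Illegal instruction: 4" crash.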

How to use pandas on M1 mac? (without rosetta or changing to x86 environment in any other way)

The last time I worked on a Python project was less than two months ago, and everything worked fine. I'm not sure whether I broke something on my Mac while working on another project, but now, when I try to run Python files that used to run perfectly, the following error appears:
dlopen(/opt/homebrew/lib/python3.9/site-packages/pandas/_libs/interval.cpython-39-darwin.so, 0x0002): tried: '/opt/homebrew/lib/python3.9/site-packages/pandas/_libs/interval.cpython-39-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e')), '/usr/local/lib/interval.cpython-39-darwin.so' (no such file), '/usr/lib/interval.cpython-39-darwin.so' (no such file)
I understand there is an issue with the architecture x86 vs arm so I tried seeing what platform the terminal is on with:
python -c 'import platform; print(platform.platform())'
which confirmed it was arm64.
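A slightly fuller diagnostic (a stdlib-only sketch) distinguishes the hardware architecture from the interpreter's own build, which is the distinction that matters here: an x86_64 Python running under Rosetta reports arm64 hardware but loads x86_64 extension modules.

```python
# Print the details relevant to the arm64-vs-x86_64 mismatch: the
# hardware/kernel architecture, the platform the interpreter was built
# for, and the pointer size of this Python process.
import platform
import struct
import sysconfig

print("machine:", platform.machine())             # hardware architecture
print("python build:", sysconfig.get_platform())  # what Python was built for
print("pointer size:", struct.calcsize("P") * 8, "bit")
```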
After some googling and looking at similar issues such as "Trouble installing Pandas on new MacBook Air M1", it seems it would be possible to run the project in an x86 environment. However, as already mentioned, it worked fine before, and there seems to have been no update since. What could have happened that pandas (and perhaps other libraries) no longer works on arm, and how can it be reverted?
You should try using Miniforge.
Its description from its GitHub repository:
This repository holds a minimal installer for Conda specific to conda-forge. Miniforge allows you to install the conda package manager with the following features pre-configured:
Its main feature that is useful here:
An emphasis on supporting various CPU architectures (x86_64, ppc64le, and aarch64 including Apple M1).
The process I use:
1. Create a conda environment; I usually go with Python 3.9.
2. Install packages from conda; most are available, but some are not.
3. After installing everything possible with miniforge, I use pip for the remaining packages.
This workflow has worked well for me, and I hope it helps you.
I wanted to utilize native M1 performance, and I think you will be able to see the difference.
By default, miniforge only downloads arm-compatible builds of Python packages. So far I have not faced any major issues working with most data science libraries, except PyTorch.
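The steps above can be sketched as shell commands. This is a setup fragment, not meant to run verbatim; the environment name and package list are illustrative:

```shell
# 1. Create a fresh environment with an arm64-native Python
conda create -n ds-env python=3.9

# 2. Activate it and install whatever conda-forge provides natively
conda activate ds-env
conda install pandas numpy

# 3. Fall back to pip for anything conda-forge doesn't carry
pip install some-remaining-package
```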

Installing python package without a C++ compiler

I'm trying to get the HDBSCAN package to run on a Windows 7 machine with no C++ compiler. Installing a compiler is not an option, unfortunately.
I read that some packages have precompiled wheel files that require no compiler to install. The installation notes state "Binary wheels for a number of platforms are available thanks to the work of Ryan Helinski". However, there is no mention of where those can be found. My questions are then:
How do I obtain the .whl file for the HDBSCAN package?
Is it possible to simply compile and install on another machine and copy the result over? If so, what must the machine on which I compile have in common with the one on which the code must run? Can I build on a Windows 10 machine, or does it have to be Win7 as well? Do the same Windows updates need to have been run on both, etc.?
It looks like there is no pre-built wheel distribution published on PyPI by the maintainers of the project themselves.
As already mentioned by other contributors, one can get such wheel distributions from a third party such as Christoph Gohlke's "Unofficial Windows Binaries for Python Extension Packages".
It is also, of course, possible to build such a wheel yourself on one machine and then reuse it on another. As far as I know, on Windows both machines need the same Python interpreter (minor) version and the same bitness (both 32-bit or both 64-bit). The exact version of the operating system should not matter (Windows 10 to Windows 7 or the other way around should work).
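The compatibility constraints above correspond to the tags baked into a wheel's filename. A stdlib-only sketch of how to read them off a given machine (so you can compare build and target machines):

```python
# Derive the wheel-tag pieces that must match between the machine that
# builds a wheel and the machine that installs it: the CPython minor
# version, the bitness of the interpreter, and the platform tag.
import struct
import sys
import sysconfig

py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"  # e.g. cp37
bitness = struct.calcsize("P") * 8                              # 32 or 64
platform_tag = sysconfig.get_platform().replace("-", "_")       # e.g. win_amd64

print(py_tag, bitness, platform_tag)
```

If `py_tag`, `bitness`, and `platform_tag` agree on both machines, the wheel should install on the target.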
I ended up getting this to work by compiling on another machine and copying the package from it. It was critical that the required packages were of the same version on both machines, so I simply set up a new conda environment with the same package versions as the target machine, installed hdbscan with pip there, and copied it over.
I was worried about Windows version compatibility, but this worked even though I installed on Windows 10 and moved to a machine running Win7.
Here is the site where you can download the wheel file: https://www.lfd.uci.edu/~gohlke/pythonlibs/
(for Python 3.5 and newer)
or PyPI for older versions.
Then run python -m pip install thefiledownloadedforyourpythonversion.whl
For the second question: yes, you can, but it is rather complicated and you should avoid it when you can :)
I just checked: Python 3.8 is written in C, so you need a C compiler, not a C++ one. MinGW is one, and TinyCC Win32 is another (a small one that compiles quickly but produces slow executables). Look also into this list of free C or C++ compilers.
A possibility (which could take several days of work) might be to use WSL or a Linux emulator such as JSLinux (it runs in a web browser). Then you could (painfully) build a cross compiler (starting first with tinycc-win32, then compiling an old cross GCC 3 compiler, then compiling with that a newer C++ GCC, etc.).
But the reality is that your real issue is non-technical: why are you not allowed to install and use a C++ compiler such as MinGW? Get permission (and resources) to install one.
Alternatively, consider installing a Linux distribution (ensure you are allowed to do so). Most of them ship a recent Python and GCC.
Is it possible to simply compile+install on another machine and copy it?
This is called cross-compilation and is possible in general. The point is to be permitted to do it. You should find relevant cross-compilers for your situation.
If you are allowed to, you could even use a live Linux USB stick.
Some C compilers written in Python do exist. You could use them to cross-compile tinycc for Win32; then you would have a C compiler for Win32 and could compile an old GCC with it, and so on. QEMU also exists for Windows, so you could run Linux with a cross-GCC compiler inside QEMU.

install natgrid in matplotlib in the environment of python 2.7

I'm quite new to Python. The version I'm currently using is 2.7. I need to use the function mncontour in Minuit, which requires installing natgrid as an additional toolkit for matplotlib. I downloaded natgrid 0.2.1, which contains a file named setup.py. I ran setup.py through the Python shell without hitting any error, but it seems the installation did not succeed. Does anyone have any idea how the installation can be done? Many thanks.
Liang
Could you please provide some more information on this topic:
What operating system are you working with? (Some offer more support than others for Python.)
Have you installed the Python headers and a C/C++ compiler in your environment? (Numerical libraries might require native code to speed up computation.)
Have you tried a package manager for Python (such as easy_install or pip)? Both work on Windows and Unixes, and usually download and install all the packages needed to make your module work.
A piece of setup.py's output before it finished would help us help you greatly.
Python for Windows [NT - 7] is compiled with either Cygwin or MinGW, so you need not only the Python environment but also said compiler and the Python headers. If you want a more point-and-click install, there's a professor at a university who maintains a good, up-to-date repository of scientific Python modules that depend on native extensions (among which numpy, scipy, and matplotlib):
http://www.lfd.uci.edu/~gohlke/pythonlibs/
Manuals to set up MinGW and Python:
https://docs.python.org/2/using/windows.html
MinGW can be downloaded from here:
http://www.mingw.org/
The best option for installing natgrid is conda:
conda install -c jochym natgrid=0.2
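After installing, a non-fatal check can confirm whether the toolkit is actually importable. This sketch assumes the toolkit installs as mpl_toolkits.natgrid (matplotlib's usual toolkit layout):

```python
# Report whether the natgrid toolkit is importable without raising if
# it (or matplotlib itself) is absent; find_spec returns None for a
# missing module instead of raising an ImportError.
import importlib.util

try:
    spec = importlib.util.find_spec("mpl_toolkits.natgrid")
except ModuleNotFoundError:
    spec = None  # matplotlib itself is not installed

print("natgrid installed:", spec is not None)
```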

VLFeat installation for Python

I am new to Python and I want to install VLFeat on Ubuntu (13.04).
I am using Eclipse 3.8. For Python, I have installed the PyDev extension on Eclipse.
I have installed Numpy, but I don't know how to install VLFeat. I tried to use their website, but I can't get anything for Python. I have downloaded packages, but I don't know how to install them.
The Menpo Project provides a wrapper around VLFeat: it's called cyvlfeat.
To install cyvlfeat, we strongly suggest you use conda:
conda install -c menpo cyvlfeat
If you don't want to use conda, your mileage will vary. In particular, you must satisfy the linking/compilation requirements for the package, which include the vlfeat dynamic library.
In other words, the nice thing about installing with conda is that it will install (and link) the VLFeat dependencies as well.
It may not include all functionality of VLFeat. Current state as of March 2017:
fisher: fisher
generic: set_simd_enabled, get_simd_enabled, cpu_has_avx, cpu_has_sse3, cpu_has_sse2, get_num_cpus, get_max_threads, set_num_threads, get_thread_limit
hog: hog
kmeans: kmeans, kmeans_quantize, ikmeans, ikmeans_push, hikmeans, hikmeans_push
sift: dsift, sift
Relevant reading
Dev blog by Simmi Mourya. This includes descriptions and usage examples.
More alternatives:
vlfeat-ctypes: minimal VLFeat interface for python
pyvlfeat fork by jchazalon: A high-level Python wrapper around a subset of the VLFeat library [more recently updated than the original]
Note about the IDE
Installing Python packages should be independent of the IDE (Eclipse + PyDev in the OP's case), as long as the interpreter and library paths are correctly set up.
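One quick way to verify that the IDE is configured against the interpreter you think it is (stdlib only):

```python
# Print which interpreter is running and where it searches for
# packages; run this inside the IDE and compare with the terminal.
import sys

print("interpreter:", sys.executable)
for p in sys.path:
    print("search path:", p)
```

If the two runs disagree, the IDE's interpreter or library paths need fixing before any install will be visible to it.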
Note about conda
It is not required to install the Anaconda distribution in order to use conda. The much lighter Miniconda is enough.
Assuming you are getting VLFeat from the Python Package Index, the instructions are:
Download the distribution, extract it, open a command prompt, and type:
$ python setup.py install

Lift the fog of confusion over installs, or How to install libsndfile on OSX 10.7.3 Lion?

This is a long post, so I'm putting a short summary with the question at the top.
Lion has been out long enough that I can finally install most of the Python libraries I need from binary installers. I'm stuck when it comes to libsndfile. What is the best way to install libsndfile and, if it matters, the rest of this stuff?
Read on if you need more details...
I'm working on a program that uses the following, developing under OSX 10.7.3 initially; I also need to be able to build cross-platform on WinXP and Win7:
Python 2.7.2
Numpy 1.6.1
SciPy 0.10.1
matplotlib 1.1.0
SciPy.scikits.samplerate
SciPy.scikits.audiolab
PortMidi
pyinstaller
I have all of these installed on my OSX dev machine, and everything works under the debugger. When I try to build with pyinstaller I run into trouble, because my installs are a historical hodgepodge of binary installers, builds from source, easy_install, pip, and Homebrew, with a couple of files pulled from a useless MacPorts install. Some of the libraries were installed when I was running the Lion beta, when it was tricky to get some things working under Lion.
My thinking is that Lion has been out long enough that it should be possible to do a cleaner installation, and that should simplify things going forward, especially with pyinstaller. I created a fresh Lion VM and did the following:
1. Installed Python 2.7.2 Mac OS X 64-bit/32-bit x86-64/i386 from binaries.
2. Installed numpy-1.6.1-py2.7-python.org-macosx10.6 from binaries.
3. Installed scipy-0.10.1-py2.7-python.org-macosx10.6 from binaries.
4. Installed matplotlib-1.1.0-py2.7-python.org-macosx10.6 from binaries.
5. Installed Xcode 4.3.1 and downloaded command tools
6. Installed libsamplerate-0.1.8 from source; required by scikits.samplerate
So far everything has gone fine, although I'm not sure about i386 vs x86_64 architectures for libsamplerate; I may need to go back and install it once for each architecture and then create a universal binary.
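That per-architecture build-and-merge step might look roughly like the following. This is a hedged sketch of a typical autoconf + lipo workflow, not a tested recipe; the prefixes and flags are illustrative:

```shell
# Build libsamplerate once per architecture (install prefixes are illustrative)
CFLAGS="-arch i386" ./configure --prefix=/tmp/lsr-i386 && make && make install
make distclean
CFLAGS="-arch x86_64" ./configure --prefix=/tmp/lsr-x86_64 && make && make install

# Merge the two single-architecture libraries into one universal binary
lipo -create /tmp/lsr-i386/lib/libsamplerate.dylib \
             /tmp/lsr-x86_64/lib/libsamplerate.dylib \
     -output libsamplerate.dylib
```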
Next up is libsndfile, which is required by scikits.audiolab. This one is trouble, as I find an ever-expanding web of dependencies:
libiconv
ncurses
expat
gettext
glib
pkg-config
libFLAC
libogg
libvorbis
Gettext was a pain: the stpncpy error I was getting has been posted about in several places, but finding the patches that actually fix the problem was a bit tedious.
Even after apparently building all of the dependencies I've listed, libsndfile still won't build without errors and I'm stuck.
Looking at the amount of time I've spent failing to get libsndfile working, I'm starting to question the basic approach. I seem to not have the ability to figure out all the errors and make the appropriate changes in finite time.
I find myself heading back down the package-manager path: easy_install, pip, and Homebrew. Some things only MacPorts seems to handle, but MacPorts messes with Python in ways I can't accept, and pyinstaller hates MacPorts. If I go this route, I may or may not get things working again; even if I do, I'm concerned that I won't know which pieces were actually required, or be able to maintain the dev environment over time.
