I currently have Xcode (along with the command line tools) and gfortran from HPC installed on my Yosemite system, and would like to replace HPC's gfortran with Homebrew's (because I'm having trouble building Python packages with the HPC gfortran).
What are the steps to accomplish this?
I want to be sure that:
1. HPC's gfortran is gone (I just want one Fortran compiler);
2. Apple's tools still work (for Xcode, Swift, OS X and iOS development, etc.); and
3. I have a working version of gfortran that reliably builds Python packages.
Can this be done? I see, for example, that Homebrew's gfortran is now packaged as part of gcc, so it looks like I'll end up with either two versions of gcc (which I'd like to avoid) or one that doesn't play well with Xcode.
What are the steps to accomplish 1-3?
I recently worked through this very same problem. This is how I got it working on my Yosemite system.
Some things about Homebrew and the command line tools compilers:
The compiler you get through Homebrew will overwrite the existing one if it is installed to the same place. A Fortran compiler that you obtained from the command line tools will not be affected. (Homebrew will warn you about this before doing anything.)
The binaries for the compilers you get through the command line tools are in /usr/bin.
The compilers (and anything else) you obtain using Homebrew are stored in the Cellar (/usr/local/Cellar), and the executables are linked into the directory /usr/local/bin.
What to do?
Modify the path:
I only needed to use the compilers I got through Homebrew, so I moved /usr/local/bin to the top of my $PATH; this ensures that the /usr/local/bin directory is searched before /usr/bin.
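For example (a sketch; adjust for your shell and startup file):
set path = (/usr/local/bin $path)    # tcsh, in ~/.cshrc
export PATH="/usr/local/bin:$PATH"   # bash, in ~/.bash_profile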
Use aliases:
You can create an alias for each compiler so that you can use them interchangeably as you wish. To create an alias you will have to modify your shell's rc file. On my system I use tcsh, and to create an alias for the Homebrew compiler I would add something like this to my ~/.cshrc file:
alias brewgfortran '/usr/local/bin/gfortran'
This now executes the Homebrew gfortran-4.9 executable that is stored in my Cellar and linked to /usr/local/bin/gfortran.
If you absolutely want to be rid of the Apple compilers, you can of course remove them from the /usr/bin directory completely. I don't think this is the best idea, though. I have shown you a few ways to avoid using them, and if you ever needed them for some reason you would be SOL. I can't say whether the other tools you need will work without them, as I have never used any of those, but I know this setup will build Python packages for you (sorry, I hope this still helps).
NOTE: If you are not using the same shell as me, the syntax for the alias is a little different; see man alias for your shell, or the example below.
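For instance, the Bash equivalent of the alias above, placed in ~/.bashrc, would be:
alias brewgfortran='/usr/local/bin/gfortran'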
I am building Python 3.10 from source on Ubuntu 18.04, following instructions from several web links, primarily the Python website (https://devguide.python.org/setup) and RealPython (https://realpython.com/installing-python/#how-to-build-python-from-source-code). I extracted Python-3.10.0.tgz into /opt/Python3.10. I have three questions.
First, the Python website says to use ./configure --with-pydebug and RealPython says to use ./configure --enable-optimizations --with-ensurepip=install. Another source says to include --enable-shared and --enable-unicode=ucs4. Which of these is best? Should I use all of those flags?
Second, I currently have Python 3.6 and Python 3.8 installed, in several directories under /usr. Following the directions I have seen on the web, I am building in /opt/Python3.10. I assume that make altinstall (the final build step) will take care of installing the build in the usual folders under /usr, but that's not clear. Should I use ./configure --prefix=directory, even though none of the web sources mentions doing that?
Finally, how much does --enable-optimizations slow down the install process?
This is my first time building Python from source, and it will help to clear these things up. Thanks for any help.
Welcome to the world of Python build configuration! I'll go through the command line options to ./configure one by one.
--with-pydebug is for core Python developers, not for developers (like you and me) who just use Python. It creates debugging symbols and slows down execution. You don't need it.
--enable-optimizations is good for performance in the long run, at the expense of lengthening the compile, possibly threefold (or more), depending on your system. However, it results in faster execution, so I would use it in your situation.
--with-ensurepip=install is good. You want the most up-to-date version of pip.
--enable-shared is maybe not a good idea in your case, so I'd recommend not using it here. Read "Difference between static and shared libraries?" to understand the difference. Basically, since you'll possibly be installing to a non-system path (/opt/local; see below) that almost certainly isn't on your system's search path for shared libraries, you'll very likely run into problems down the road. A static build has all the pieces in one place, so you can install and run it from anywhere. This comes at the expense of size (the python binary will be rather large), but it is great for non-sysadmins. Even if you end up installing to /usr/local, I would argue that static is better/easier than shared.
--enable-unicode=ucs4 is optional, and may not be compatible with your system. You don't need it. ./configure is smart enough to figure out what Unicode settings are best. This option is left over from build instructions that are quite a few versions out of date.
--prefix: I would suggest you use --prefix=/opt/local if that directory already exists and is in your $PATH, or if you know how to edit your $PATH in ~/.bashrc. Otherwise, use /usr/local or $HOME. /usr/local is the designated system-wide location for local software installs (i.e., stuff that doesn't come with Ubuntu) and is likely already on your $PATH. $HOME is always an option that doesn't require the use of sudo, which is great from a security perspective. You'll need to add /home/your_username/bin to your $PATH if it isn't already present.
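Putting the recommendations together, a reasonable invocation might look like this (a sketch, assuming you settle on --prefix=/opt/local):
./configure --prefix=/opt/local --enable-optimizations --with-ensurepip=install
make -j"$(nproc)"
sudo make altinstall
Using make altinstall rather than make install installs the binary as python3.10 only, without creating an unversioned python3 link that could shadow your existing interpreters.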
What I'm trying to do is ship my code to a remote server that may have a different Python version installed and/or may not have the packages my app requires.
Right now, to achieve such portability, I have to build a relocatable virtualenv with the interpreter and code. That approach has some issues (for example, you have to manually copy a bunch of libraries into your virtualenv, since --always-copy doesn't work as expected) and is generally slow.
There's (in theory) a way to build Python itself statically.
I wonder if I could pack interpreter with my code into one binary and run my application as module. Something like that: ./mypython -m myapp run or ./mypython -m gunicorn -c ./gunicorn.conf myapp.wsgi:application.
There are two ways you could go about solving your problem:
1. Use a static builder, like freeze, pyinstaller, or py2exe
2. Compile using Cython
This answer explains how to do it using the second approach, since the first method is not portable across platforms and Python versions, and it has been explained in other answers. Also, using programs like pyinstaller typically results in huge file sizes, while using Cython will result in a much smaller file.
First, install cython.
sudo -H pip3 install cython
Then, you can use Cython to generate a C file from the Python .py file
(in reference to https://stackoverflow.com/a/22040484/5714445)
cython example_file.py --embed
Use GCC to compile it after determining your current Python version (note: the commands below assume you are compiling for Python 3):
PYTHONLIBVER=python$(python3 -c 'import sys; print(".".join(map(str, sys.version_info[:2])))')$(python3-config --abiflags)
gcc -Os $(python3-config --includes) example_file.c -o output_bin_file $(python3-config --ldflags) -l$PYTHONLIBVER
You will now have a binary file output_bin_file, which is what you are looking for
Other things to note:
Change example_file.py to whatever file you are actually trying to compile.
Cython is normally used with C-type variable definitions for static memory allocation, to speed up Python programs. In your case, however, you will still be using traditional Python definitions.
If you are using additional libraries (OpenCV, for example), you might have to provide the directory to them using -L and then specify the name of the library using -l in the GCC flags. For more information on this, please refer to GCC flags.
The above method might not work for Anaconda Python, as you will likely have to install a version of GCC that is compatible with your conda Python.
You might wish to investigate Nuitka. It takes Python source code and converts it into C++ API calls, then compiles that into an executable binary (ELF on Linux). It has been around for a few years now and supports a wide range of Python versions.
You will probably also get a performance improvement if you use it. Recommended.
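A typical invocation looks something like this (a sketch; it assumes Nuitka is installed via pip, and myapp.py is a hypothetical entry point):
python3 -m pip install nuitka
python3 -m nuitka --standalone myapp.py
The --standalone flag bundles the interpreter and your dependencies into a self-contained output directory that you can copy to the target server.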
You're probably looking for something like Freeze, which is able to compile your Python application with all its libraries into a static binary:
PyPi page of Freeze
Python Wiki page of Freeze
Sourceforge page of Freeze
If you are on a Mac you can use py2app to create a .app bundle, which starts your Django app when you double-click on it.
I described how to bundle Django and CherryPy into such a bundle at https://moosystems.com/articles/14-distribute-django-app-as-native-desktop-app-01.html
In the article I use pywebview to display your Django site in a local application window.
Freeze options:
https://pypi.python.org/pypi/bbfreeze/1.1.3
http://cx-freeze.sourceforge.net/
However, your target server should have the environment you want, or you should at least be able to 'create' it; if you can't, you should build your software to match the environment that is there.
I found this handy guide on how to install a custom version of Python into a virtualenv, assuming you have SSH access: https://stackoverflow.com/a/5507373/5616110
In a virtualenv, you should be able to pip install anything, and you shouldn't need to worry about sudo privileges. Of course, having those, and access to a package manager like apt, makes everything a lot easier.
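A sketch of that approach, once the custom interpreter is built (its install path here is hypothetical):
virtualenv -p $HOME/opt/python2.7/bin/python my_env
source my_env/bin/activate
pip install gunicorn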
I have created a Docker image that relies on Nuitka and a custom statically linked Python 3.10 to create a static binary.
I did not test it extensively; if you have the chance, please let me know if it works for your use case.
You can check it at:
https://github.com/joaompinto/docker-build-python-static-bin
Maybe a stupid question, but I was wondering where Python's distutils gets its compiler options from. It gets some linked directories wrong, and I want to correct that once and for all.
I know there should be a prefix/lib/pythonver/distutils/distutils.cfg, but I can't find any distutils.cfg anywhere on the computer. Obviously I haven't got a local setup.cfg or any $HOME/.pydistutils.cfg.
I'm using the Enthought 64-bit distribution, version 7.3 (Python 2.7) on Mac OS X 10.8.3
I actually export them to the environment, just like for autotools' configure:
export CC=/usr/local/bin/clang
export CFLAGS=-I${HOME}/include
export LDFLAGS=-lboost
If you also need to override the linker separately:
export LDSHARED="/usr/local/bin/clang -shared"
And if you don't like exporting the settings to your environment, do something like this for a one-time setting:
CC=/usr/local/bin/clang CFLAGS=-I${HOME}/include python setup.py build
If you want to find out what the default options were when Python was built, use python-config --<flag>, where <flag> is one of cflags, ldflags, libs, or includes.
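For example, to see the compile and link settings your interpreter was built with:
python-config --cflags
python-config --ldflags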
Compiler options are taken from CPython's Makefile; in other words, they are the same as the ones used to compile Python itself. You can override most of them on the command line, as Evert described.
The global distutils.cfg is something that a sysadmin can create to set default options, not a file that is installed with Python.
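For reference, a minimal sketch of such a file (the same syntax also works in a per-user ~/.pydistutils.cfg; the directories below are placeholders):
[build_ext]
include_dirs = /usr/local/include
library_dirs = /usr/local/lib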
I'm currently doing some embedded systems programming. This was set up by somebody else a few years ago. So now I'm looking to upgrade to Python 2.7.2 to make things simpler because I have already run into two cases where what I coded wasn't supported.
What is currently running:
: uname -a
Linux host1 2.6.18-6-486 #1 Sun Feb 10 22:06:33 UTC 2008 i586 GNU/Linux
: python -V
Python 2.4.4
: pyversions -i
python2.4
So right now only 2.4 is installed.
I untarred Python 2.7.2, and when I go to that directory and run python setup.py install --home=/home/jhemilian, it seems that Python 2.4 doesn't know the with...as statement syntax:
host1:/home/jhemilian/src/Python-2.7.2: python setup.py install --home=/home/jhemilian
  File "setup.py", line 361
    with open(tmpfile) as fp:
            ^
SyntaxError: invalid syntax
Before I go figuring this out, I first have a question: Python itself is being used to install Python? What if I didn't have a version of Python installed already? I know it's shipped with most Linux distributions, but hypothetically, how does a seeming catch-22 like that work?
What I am looking to do is install Python 2.7 in a benign location, keeping the python command pointing at Python 2.4 just in case the "legacy" software I'm running depends on it, and then run python2.7 myscript.py et cetera when I want to use one of my newer scripts. Feel free to comment if there is a cleaner, more practical (or even safer!) way to do this.
I don't think it would make much sense to go replacing all the with statements with compatible try blocks. I've looked through the READMEs and online documentation, but I can't seem to find a way to install Python without already having Python. Note that I DO NOT have an internet connection, although I could get one if desirable or necessary. It would be great if somebody could point me in the right direction. Thanks!!
It's all right there in the README...
You don't need to use Python to install it; in fact, you shouldn't. Just:
./configure
make
make install
If you want to install in a specific dir, just follow what the README says:
Installing
To install the Python binary, library modules, shared library modules
(see below), include files, configuration files, and the manual page,
just type
make install
This will install all platform-independent files in subdirectories of
the directory given with the --prefix option to configure or to the
`prefix' Make variable (default /usr/local). All binary and other
platform-specific files will be installed in subdirectories if the
directory given by --exec-prefix or the `exec_prefix' Make variable
(defaults to the --prefix directory) is given.
If DESTDIR is set, it will be taken as the root directory of the
installation, and files will be installed into $(DESTDIR)$(prefix),
$(DESTDIR)$(exec_prefix), etc.
All subdirectories created will have Python's version number in their
name, e.g. the library modules are installed in
"/usr/local/lib/python<version>/" by default, where <version> is the
<major>.<minor> release number (e.g. "2.1"). The Python binary is
installed as "python<version>" and a hard link named "python" is
created. The only file not installed with a version number in its
name is the manual page, installed as "/usr/local/man/man1/python.1"
by default.
If you want to install multiple versions of Python see the section
below entitled "Installing multiple versions".
The only thing you may have to install manually is the Python mode for
Emacs found in Misc/python-mode.el. (But then again, more recent
versions of Emacs may already have it.) Follow the instructions that
came with Emacs for installation of site-specific files.
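In your case, since you want the python command to keep pointing at 2.4, a minimal side-by-side install might look like this (a sketch; the prefix is just an example location):
./configure --prefix=$HOME/python2.7
make
make install
Installing under a private prefix leaves the system interpreter untouched, and you can then run your newer scripts with $HOME/python2.7/bin/python2.7 myscript.py.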
EDIT: virtualenv is apparently for already-installed Python versions. Disregard this recommendation.
I think what you want is virtualenv.
I haven't used it myself, but I understand this is what it's meant for.
From the website:
virtualenv is a tool to create isolated Python environments.
The basic problem being addressed is one of dependencies and versions, and indirectly permissions. Imagine you have an application that needs version 1 of LibFoo, but another application requires version 2. How can you use both these applications? If you install everything into /usr/lib/python2.7/site-packages (or whatever your platform's standard location is), it's easy to end up in a situation where you unintentionally upgrade an application that shouldn't be upgraded.
EDIT: Upon review, I think you want Alberto's answer, so I voted him up for visibility.
Short version: how can I get rid of the multiple-versions-of-Python nightmare?
Long version: over the years, I've used several versions of Python and, what is worse, several extensions to Python (e.g. pygame, pylab, wxPython...). Each time it was on a different setup, with different OSes, sometimes different architectures (like my old PowerPC Mac).
Nowadays I'm using a Mac (OS X 10.6 on x86-64), and it's a dependency nightmare each time I want to revive a script older than a few months. Python itself already comes in three different flavours in /usr/bin (2.5, 2.6, 3.1), but I had to install 2.4 from MacPorts for pygame, and something else (I cannot remember what) forced me to install all three others from MacPorts as well, so at the end of the day I'm the happy owner of seven (!) instances of Python on my system.
But that's not the problem. The problem is that none of them has the right (i.e. the same set of) libraries installed; some of them are 32-bit, some 64-bit, and now I'm pretty much lost.
For example, right now I'm trying to run a three-year-old script (not written by me) which used matplotlib/numpy to draw a real-time plot within a rectangle of a wxWidgets window. But I'm failing miserably: py26-wxpython from MacPorts won't install, stock Python has wxWidgets included but also has some conflict between 32-bit and 64-bit, and it doesn't have numpy... what a mess!
Obviously, I'm doing things the wrong way. How do you usually cope with all that chaos?
I solve this using virtualenv. I sympathise with wanting to avoid further layers of nightmare abstraction, but virtualenv is actually amazingly clean and simple to use. You literally do this (command line, Linux):
virtualenv my_env
This creates a new python binary and library location, and symlinks to your existing system libraries by default. Then, to switch paths to use the new environment, you do this:
source my_env/bin/activate
That's it. Now if you install modules (e.g. with easy_install), they get installed into the lib directory of the my_env directory. They don't interfere with existing libraries, you don't get weird conflicts, and stuff doesn't stop working in your old environment. They're completely isolated.
To exit the environment, just do
deactivate
If you decide you made a mistake with an installation, or you don't want that environment anymore, just delete the directory:
rm -rf my_env
And you're done. It's really that simple.
virtualenv is great. ;)
Some tips:
on Mac OS X, use only the Python installation in /Library/Frameworks/Python.framework.
whenever you use numpy/scipy/matplotlib, install the Enthought Python Distribution.
use virtualenv and virtualenvwrapper to keep those "system" installations pristine; ideally, use one virtual environment per project, so each project's dependencies are fulfilled. And, yes, that means potentially a lot of code will be replicated in the various virtual envs (see the sketch after this list).
That seems like a bigger mess indeed, but at least things work that way. Basically, if one of the projects works in a virtualenv, it will keep working no matter what upgrades you perform, since you never change the "system" installs.
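The per-project workflow with virtualenvwrapper looks roughly like this (a sketch; it assumes virtualenvwrapper is installed and its shell script is sourced in your rc file, and the project name is a placeholder):
mkvirtualenv my_project
workon my_project
pip install numpy matplotlib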
Take a look at virtualenv.
What I usually do is try to (progressively) keep up with the Python versions as they come along (once all of the external dependencies have correct versions available).
Most of the time the Python code itself can be transferred as-is, with only minor modifications needed.
My biggest Python project at work (15,000+ LOC) has now been on Python 2.6 for a few months (upgrading everything from Python 2.5 did take most of a day, due to installing/checking 10+ dependencies...).
In general, I think this is the best strategy with most of the interdependent components in the free software stack (think of the dependencies in the Linux software repositories): keep your versions (semi-)current, or at least progressing at the same pace.
install the Python versions you need, preferably from source
when you write a script, include the full Python version in it (such as #!/usr/local/bin/python2.6)
I can't see what could go wrong.
If something does, it's probably MacPorts' fault anyway, not yours (one of the reasons I don't use MacPorts anymore).
I know I'm probably missing something and this will get downvoted, but please leave at least a little comment in that case, thanks :)
I use the MacPorts version for everything, but as you note a lot of the default versions are bizarrely old. For example, vim omnicomplete in Snow Leopard has python25 as a dependency. A lot of Python-related ports have old dependencies, but you can usually flag the newer version at build time, for example port install vim +python26 instead of port install vim +python. Do a dry run before installing anything to see if you are pulling in, for example, the whole of python24 when it isn't necessary. Check Portfiles often, because the naming conventions left over from when DarwinPorts was getting off the ground leave something to be desired. In practice I just leave everything in the default /opt... folders of MacPorts, including a copy of the entire framework with duplicates of PyObjC, etc., and just stick with one version at a time, retaining the option to return to the system default if things break unexpectedly. Which is all perhaps a bit too much work to avoid using virtualenv, which I've been meaning to get around to using.
I've had good luck using Buildout. You set up a list of which eggs and which versions you want. Buildout then downloads and installs private versions of each for you. It makes a private "python" binary with all the eggs already installed. A local "nosetests" makes things easy to debug. You can extend the build with your own functions.
On the down side, Buildout can be quite mysterious. Do "buildout -vvvv" for a while to see exactly what it's doing and why.
http://www.buildout.org/docs/tutorial.html
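A minimal buildout.cfg along those lines might look like this (a sketch using the zc.recipe.egg recipe; the egg names are placeholders):
[buildout]
parts = py

[py]
recipe = zc.recipe.egg
interpreter = python
eggs =
    nose
    lxml
Running bin/buildout then generates a bin/python with those eggs already on its path.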
At least under Linux, multiple Pythons can co-exist fairly happily. I use Python 2.6 on a CentOS system that needs Python 2.4 to be the default for various system things. I simply compiled and installed Python 2.6 into a separate directory tree (and added the appropriate bin directory to my path), which was fairly painless. It's then invoked by typing "python2.6".
Once you have separate pythons up and running, installing libraries for a specific version is straightforward. If you invoke the setup.py script with the python you want, it will be installed in directories appropriate to that python, and scripts will be installed in the same directory as the python executable itself and will automagically use the correct python when invoked.
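For example, to install a library for a side-by-side 2.6 rather than the system default (the package directory is hypothetical):
cd some_package
python2.6 setup.py install
The package lands in the 2.6 tree's site-packages, leaving the system Python untouched.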
I also try to avoid using too many libraries. When I only need one or two functions from a library (e.g. scipy), I'll often see if I can just copy them into my own project.