Installing Multiple Python Distributions, Windows

I am using PythonXY (2.7, 32-bit) and the official Python (2.7, 32-bit).
Normally it is recommended to install according to Python version, for example C:\python27. But since they are both Python 2.7, can I arbitrarily change the base name (for example C:\pythonxy27)?
When using Python extras like pylauncher, or when utilizing the setuptools user-site, will they automatically recognize my custom installation locations (they can easily differentiate C:\python27 and C:\python33), or will both installations compete for the python27 namespace? (Specifically when installing 3rd-party packages to the user-site, which is normally located at \APPDATA\Python\PythonVer.)

As far as setuptools/distribute are concerned, the python installer will handle custom locations for you. As long as you don't move that directory, all should be fine.
As for Pylauncher:
Things are not quite so clean. Pylauncher has simple configuration and call parameters (shebang lines in particular) that can handle version/platform selection quite well (2.7 vs 3.3, and 32-bit vs 64-bit).
As for the scenario in question (two different deployments where both are based on 32bit Python 2.7), pylauncher will attempt to guess which installation you wanted. If it is picking the wrong installation, there is some debugging information you can review to tune pylauncher's selection.
If an environment variable PYLAUNCH_DEBUG is set (to any value), the
launcher will print diagnostic information
It does not seem like there is a portable way to configure this; it will have to be done per system (once you have your installations configured, you can set an alias that will be recognized on the shebang line, as sketched below).
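For illustration, a rough sketch of such a per-machine alias, assuming the launcher honours a [commands] section in its py.ini file (the section name, the ini location, and the xy27 alias are assumptions to verify against your pylauncher version's documentation):
; %LOCALAPPDATA%\py.ini (assumed location)
; map an alias to the second 2.7 installation
[commands]
xy27=C:\pythonxy27\python.exe
A script whose first line is #!xy27 would then be routed to that interpreter, and setting PYLAUNCH_DEBUG (for example, set PYLAUNCH_DEBUG=1 before running py myscript.py) shows which interpreter the launcher actually picked.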
Virtualenv and friends
I have also found (after struggling with pylauncher-focused solutions) that virtualenv addresses many of the deployment-isolation hurdles. At the time of posting, working with virtualenv was not nearly as intuitive on Windows as in a Linux shell environment, but I have discovered support packages like virtualenvwrapper which handle a lot of the ugly batch-file interfaces very nicely.
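As a sketch of that workflow on Windows, assuming the virtualenvwrapper-win port (the package name and the xyproject environment name are illustrative):
REM install the Windows wrapper scripts
pip install virtualenvwrapper-win
REM create and activate an isolated environment
mkvirtualenv xyproject
workon xyproject
REM packages now install into the environment, not the global site-packages
pip install requests
deactivate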
Final Notes
Originally, I was also handling Python globally with admin accounts. Forcing myself to stay within my user home directory (C:/Users/username), utilizing Python user-site configurations, and making good use of IPython have all given me a much better interactive command-line experience.
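For reference, the user-site mentioned above can be inspected and used with standard tooling (somepackage is a placeholder):
REM print the per-user site-packages directory (maps to %APPDATA%\Python\... on Windows)
python -m site --user-site
REM install into the user-site instead of the global installation
pip install --user somepackage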

Related

Python virtualenv with relative paths

I would like to install a python virtualenv with relative paths, so that I can move the virtualenv to another machine (that has the same operating system).
I googled for a solution; some suggest using the --portable option of the virtualenv command, but it is not available on my system. Maybe it is outdated?
Apart from changing the paths by hand, is there any other official way to create a portable virtualenv?
I am planning to use this on OSX and Linux (without mixing them of course).
The goal of virtualenv is to create isolated Python environments.
The basic problem being addressed is one of dependencies and versions, and indirectly permissions.
Creating portable installations is not the supported use case:
Created python virtual environments are usually not self-contained. A complete python packaging is usually made up of thousands of files, so it’s not efficient to install the entire python again into a new folder. Instead virtual environments are mere shells, that contain little within themselves, and borrow most from the system python (this is what you installed, when you installed python itself). This does mean that if you upgrade your system python your virtual environments might break, so watch out.
You could always try to run a virtual environment on another machine that has the same OS, version and packages. But be warned: having tried it myself in the past, it is very fragile and prone to weird errors.
There are other tools to do what you are looking for depending on your use case and target OS (e.g. single executable, MSI installer, Docker image, etc.). See this answer.

Python Manual/Isolated/Portable Windows Installation

I thought it was an easy question, but I spent a lot of Google time trying to find the answer, with no luck. Hope you can help me.
My company has a large SW system on Windows which is portable: copy some folders, add some folders to the Windows path, and you are ready to go.
No registry, no dll in system directory, no shortcuts, Nothing!
I want to start using Python 3.x in our system in the same paradigm. I also want the ability to add pip/conda 3rd-party packages to this distribution from time to time.
I don't want to install the Python MSI on all the systems.
I don't want to pack it into a standalone executable with tools like py2exe or PyInstaller, or use a special Python distribution like PyWin32.
Somehow, I couldn't find a formal official solution for that.
The closest thing I found was here, but pip is not supported, Python is minimal, and the system isolation is only "almost".
3.8. Embedded Distribution
New in version 3.5.
The embedded distribution is a ZIP file containing a minimal Python
environment. It is intended for acting as part of another application,
rather than being directly accessed by end-users.
When extracted, the embedded distribution is (almost) fully isolated
from the user’s system, including environment variables, system
registry settings, and installed packages. The standard library is
included as pre-compiled and optimized .pyc files in a ZIP, and
python3.dll, python36.dll, python.exe and pythonw.exe are all
provided. Tcl/tk (including all dependants, such as Idle), pip and the
Python documentation are not included.
Note The embedded distribution does not include the Microsoft C
Runtime and it is the responsibility of the application installer to
provide this. The runtime may have already been installed on a user’s
system previously or automatically via Windows Update, and can be
detected by finding ucrtbase.dll in the system directory. Third-party
packages should be installed by the application installer alongside
the embedded distribution. Using pip to manage dependencies as for a
regular Python installation is not supported with this distribution,
though with some care it may be possible to include and use pip for
automatic updates. In general, third-party packages should be treated
as part of the application (“vendoring”) so that the developer can
ensure compatibility with newer versions before providing updates to
users.
Any ideas?
Thanks.
How about... installing Python on one machine and replicating that installation on other computers?
Usually, I install Python in a Windows VirtualBox machine (Microsoft usually gives these away for free for trials or for testing old Internet Explorer versions).
Then I copy the Python directory to my Windows machine (the real host) and it usually works. This makes it possible to use various Python versions.
Did you try completing the Python embedded distribution? It usually does not come with Tkinter, but I was once able to copy the missing files into this distribution in a way that works. Try it too.
You can install pip with get-pip.py
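A minimal sketch of that bootstrap against an extracted embedded distribution (the C:\myapp\python path is illustrative; note that embedded distributions from 3.6 onward ship a pythonXY._pth file in which the "import site" line must be uncommented before pip-installed packages become visible):
REM fetch the pip bootstrapper next to the embedded interpreter
curl -L https://bootstrap.pypa.io/get-pip.py -o C:\myapp\python\get-pip.py
REM first edit C:\myapp\python\pythonXY._pth and uncomment "import site"
C:\myapp\python\python.exe C:\myapp\python\get-pip.py
REM packages now install alongside the embedded distribution
C:\myapp\python\python.exe -m pip install requests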

Python embeddable zip

With the 3.5.0 release, Python.org has introduced a distribution billed as an embeddable zip file.
Unfortunately the zipped file comes without a help file (not even a readme). The download page on Python.org just lists it among the downloads.
Apparently this is a portable Python distribution. It is anyway quite different in structure and size from the standard distribution using the installer.
I realised that it is possible to install pip with get-pip.py and, thanks to pip, it is a breeze to add many other application packages, though I am still unable to add Tkinter (adjust slashes according to your shell):
curl https://www.python.org/ftp/python/3.x.x/python-3.x.x-embed-amd64.zip > epython.zip
unzip -o epython.zip -d env1
curl -L https://bootstrap.pypa.io/get-pip.py > env1/get-pip.py
env1/python env1/get-pip.py
Add what you need, e.g. django:
env1/python -m pip install django
Given the size (6.5 MB for the 3.5.1 x64 build), I think it can be convenient as a means to create isolated environments.
In fact the general Python documentation says that
the embedded distribution is (almost) fully isolated from the user’s system, including environment variables, system registry settings, and installed packages
Given this, on Windows there are now two isolated Python environments, the second being the standard virtualenv. The same process with virtualenv looks like this:
virtualenv env2
and for django it would be:
env2/Scripts/python -m pip install django
Comparing the contents of env1 and env2, they appear to have the same files. The only significant difference is Tkinter, which is anyway not very significant for desktop apps.
What is the difference between Python virtualenv and the Python embeddable zip?
Specifically, what is the difference between the isolated web app created with the embeddable zip (env1) and with virtualenv (env2)?
As you can see from the documentation, it is mainly meant for running Python-based applications on ms-windows and for embedding Python in an application. They left out tkinter, maybe to keep the size down?
Comparing it to a virtualenv doesn't make much sense, I think. They have completely different use cases.
In the ms-windows world, applications are generally distributed as monolithic independent entities. In contrast, basically every UNIX flavor has a working package management system which makes it easier to have packages that depend on others. So if you install a python-based app in UNIX, the package management system will basically install Python for you if it isn't installed yet. On ms-windows this doesn't work. Several Python distributions for ms-windows have sprung up because (for technical reasons) compiling and setting up stuff on ms-windows is painful [1] compared to UNIX. So having an embeddable Python could make sense for people who want to distribute Python-based programs or who want to embed Python into their application.
In general though I recommend that ms-windows users install either Canopy or Anaconda because they come with most of the external modules that you'll be likely to need.
Edit As of 2020, the python.org distribution has come a long way; you don't need a special compiler for it anymore, and more and more modules distribute precompiled binaries for ms-windows on PyPI. So my recommendation for ms-windows users has changed: use the python.org releases of Python.

#!/usr/bin/python and #!/usr/bin/env python, which to support?

What should the shebang for a Python script look like?
Some people support #!/usr/bin/env python because it can find the Python interpreter intelligently. Others support #!/usr/bin/python, because python is now the default interpreter path in most GNU/Linux distributions.
What are the benefits of the two variants?
The Debian Python Policy states:
The preferred specification for the Python interpreter is /usr/bin/python or /usr/bin/pythonX.Y. This ensures that a Debian installation of python is used and all dependencies on additional python modules are met.
Maintainers should not override the Debian Python interpreter using /usr/bin/env python or /usr/bin/env pythonX.Y. This is not advisable as it bypasses Debian's dependency checking and makes the package vulnerable to incomplete local installations of python.
Note that Debian/Ubuntu use the alternatives system to manage which version /usr/bin/python actually points to. This has been working very nicely across a lot of python versions at least for me (and I've been using python from 2.3 to 2.7 now), with excellent transitions across updates.
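For illustration, assuming python really is registered with the alternatives system on your release (it is not on every Debian/Ubuntu version), you can inspect and switch the target like this:
# show which interpreters are registered and which one /usr/bin/python points to
update-alternatives --display python
# interactively pick a different registered interpreter
sudo update-alternatives --config python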
Note that I've never used pip. I want automatic security upgrades, so I install all my python needs via aptitude. Using the official Debian/Ubuntu packages keeps my system much cleaner than messing around with the python installation myself.
Let me emphasize one thing. The above recommendation refers to system installation of python applications. It makes perfect sense to have these use the system-managed version of python. If you are actually playing around with your own, customized installation of python that is not managed by the operating system, using the env variant probably is the correct way of saying "use the user-preferred python", instead of hard-coding either the system python installation (which would be /usr/bin/python) or any user-custom path.
Using env python will cause your programs to behave differently if you call them from e.g. a python virtualenv.
This can be desired (e.g. you are writing a script meant to work only in your virtualenv). And it can be problematic (you write a tool for yourself and expect it to work the same even within a virtualenv; it may suddenly fail there because packages are missing).
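A quick way to see the difference for yourself (demo_env is just an illustrative name):
# create and activate a virtualenv, then compare what each shebang style would run
virtualenv demo_env
source demo_env/bin/activate
which python            # the interpreter #!/usr/bin/env python would pick (demo_env/bin/python)
ls -l /usr/bin/python   # the interpreter a hardcoded #!/usr/bin/python would pick
deactivate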
My humble opinion is that you should use the env variant. env is specified by POSIX and thus found on pretty much every system, while directly specifying /usr/bin/python breaks on many occasions, e.g. virtualenv setups.
I use #!/usr/bin/env python because the default install location on OS X is NOT /usr/bin. This also applies to users who like to customize their environment -- /usr/local/bin is another common place where you might find a python distribution.
That said, it really doesn't matter too much. You can always test the script with whatever python version you want: /usr/bin/strange/path/python myscript.py. Also, when you install a script via setuptools, the shebang seems to get replaced by the sys.executable which installed that script -- I don't know about pip, but I would assume it behaves similarly.
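You can check this for any installed console script; flake8 here is just an example of some pip/setuptools-installed tool:
# print the shebang that setuptools/pip wrote into the wrapper script
head -1 "$(which flake8)"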
As you note, they probably both work on linux. However, if someone has installed a newer version of python for their own use, or some requirement makes people keep a particular version in /usr/bin, the env allows the caller to set up their environment so that a different version will be called through env.
Imagine someone trying to see if python 3 works with the scripts. They'll add the python3 interpreter first in their path, but want to keep the default on the system running on 2.x. With a hardcoded path that's not possible.

How to maintain long-lived python projects w.r.t. dependencies and python versions?

short version: how can I get rid of the multiple-versions-of-python nightmare?
long version: over the years, I've used several versions of python, and what is worse, several extensions to python (e.g. pygame, pylab, wxPython...). Each time it was on a different setup, with different OSes, sometimes different architectures (like my old PowerPC mac).
Nowadays I'm using a mac (OSX 10.6 on x86-64) and it's a dependency nightmare each time I want to revive a script older than a few months. Python itself already comes in three different flavours in /usr/bin (2.5, 2.6, 3.1), but I had to install 2.4 from macports for pygame, something else (cannot remember what) forced me to install all three others from macports as well, so at the end of the day I'm the happy owner of seven (!) instances of python on my system.
But that's not the problem; the problem is that none of them has the right (i.e. same set of) libraries installed, some of them are 32-bit, some 64-bit, and now I'm pretty much lost.
For example right now I'm trying to run a three-year-old script (not written by me) which used to use matplotlib/numpy to draw a real-time plot within a rectangle of a wxwidgets window. But I'm failing miserably: py26-wxpython from macports won't install, stock python has wxwidgets included but also has some conflict between 32 bits and 64 bits, and it doesn't have numpy... what a mess!
Obviously, I'm doing things the wrong way. How do you usually cope with all that chaos?
I solve this using virtualenv. I sympathise with wanting to avoid further layers of nightmare abstraction, but virtualenv is actually amazingly clean and simple to use. You literally do this (command line, Linux):
virtualenv my_env
This creates a new python binary and library location, and symlinks to your existing system libraries by default. Then, to switch paths to use the new environment, you do this:
source my_env/bin/activate
That's it. Now if you install modules (e.g. with easy_install), they get installed to the lib directory of the my_env directory. They don't interfere with existing libraries, you don't get weird conflicts, stuff doesn't stop working in your old environment. They're completely isolated.
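For example (numpy is just a stand-in for whatever module you need):
# with my_env activated, installs land under my_env/lib/... rather than the system site-packages
easy_install numpy
# or, with pip available in the environment:
pip install numpy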
To exit the environment, just do
deactivate
If you decide you made a mistake with an installation, or you don't want that environment anymore, just delete the directory:
rm -rf my_env
And you're done. It's really that simple.
virtualenv is great. ;)
Some tips:
on Mac OS X, use only the python installation in /Library/Frameworks/Python.framework.
whenever you use numpy/scipy/matplotlib, install the enthought python distribution
use virtualenv and virtualenvwrapper to keep those "system" installations pristine; ideally use one virtual environment per project, so each project's dependencies are fulfilled. And, yes, that means potentially a lot of code will be replicated in the various virtual envs.
That seems like a bigger mess indeed, but at least things work that way. Basically, if one of the projects works in a virtualenv, it will keep working no matter what upgrades you perform, since you never change the "system" installs.
Take a look at virtualenv.
What I usually do is try to (progressively) keep up with the Python versions as they come along (and once all of the external dependencies have correct versions available).
Most of the time the Python code itself can be transferred as-is, with only minor modifications needed.
My biggest Python project at work (15,000+ LOC) has now been on Python 2.6 for a few months (upgrading everything from Python 2.5 took most of a day, due to installing / checking 10+ dependencies...).
In general I think this is the best strategy with most of the interdependent components in the free software stack (think the dependencies in the linux software repositories): keep your versions (semi)-current (or at least: progressing at the same pace).
Install the python versions you need, preferably from source.
When you write a script, include the full python version in the shebang (such as #!/usr/local/bin/python2.6).
I can't see what could go wrong.
If something does, it's probably macports fault anyway, not yours (one of the reasons I don't use macports anymore).
I know I'm probably missing something and this will get downvoted, but please leave at least a little comment in that case, thanks :)
I use the MacPorts version for everything, but as you note a lot of the default versions are bizarrely old. For example vim omnicomplete in Snow Leopard has python25 as a dependency. A lot of python related ports have old dependencies but you can usually flag the newer version at build time, for example port install vim +python26 instead of port install vim +python. Do a dry run before installing anything to see if you are pulling, for example, the whole of python24 when it isn't necessary. Check portfiles often because the naming convention as Darwin ports was getting off the ground left something to be desired. In practice I just leave everything in the default /opt... folders of MacPorts, including a copy of the entire framework with duplicates of PyObjC, etc., and just stick with one version at a time, retaining the option to return to the system default if things break unexpectedly. Which is all perhaps a bit too much work to avoid using virtualenv, which I've been meaning to get around to using.
I've had good luck using Buildout. You set up a list of which eggs and which versions you want. Buildout then downloads and installs private versions of each for you. It makes a private "python" binary with all the eggs already installed. A local "nosetests" makes things easy to debug. You can extend the build with your own functions.
On the down side, Buildout can be quite mysterious. Do "buildout -vvvv" for a while to see exactly what it's doing and why.
http://www.buildout.org/docs/tutorial.html
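A rough sketch of a minimal buildout.cfg, assuming the zc.recipe.egg recipe (the part name and egg list are purely illustrative; see the tutorial above for the real options):
[buildout]
parts = py

[py]
# builds bin/python with the listed eggs importable, plus their console scripts (e.g. bin/nosetests)
recipe = zc.recipe.egg
eggs =
    nose
    numpy
interpreter = python
After bootstrapping, running bin/buildout produces the private interpreter and scripts described above.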
At least under Linux, multiple pythons can co-exist fairly happily. I use Python 2.6 on a CentOS system that needs Python 2.4 to be the default for various system things. I simply compiled and installed python 2.6 into a separate directory tree (and added the appropriate bin directory to my path) which was fairly painless. It's then invoked by typing "python2.6".
Once you have separate pythons up and running, installing libraries for a specific version is straightforward. If you invoke the setup.py script with the python you want, it will be installed in directories appropriate to that python, and scripts will be installed in the same directory as the python executable itself and will automagically use the correct python when invoked.
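A sketch of that approach (the /opt/python2.6 prefix is just an example location):
# build a second python into its own tree, leaving the system python untouched
./configure --prefix=/opt/python2.6
make && make install
export PATH=/opt/python2.6/bin:$PATH
# install a library for that specific interpreter only
/opt/python2.6/bin/python2.6 setup.py install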
I also try to avoid using too many libraries. When I only need one or two functions from a library (eg scipy), I'll often see if I can just copy them to my own project.
