In my application I would like to use:
packageA, which requires packageX==1.3
packageB, which requires packageX==1.4
packageX==1.5
How can I install multiple versions of packageX with pip to handle this situation?
pip won't help you with this.
You can tell it to install a specific version, but it will override the other one. On the other hand, using two virtualenvs will let you install both versions on the same machine, but not use them at the same time.
Your best bet is to install both versions manually, by putting them in your Python path under different names.
But if your two libs expect them to have the same name (and they should), you will have to modify them so they pick up the version they need with some import alias such as:
import dependencyname_version as dependencyname
There is currently no clean way to do this. The best you can hope for is that this hack works.
I'd rather ditch one of the two libs and replace it with an equivalent, or patch it to accept the new version of the dependency and give the patch back to the community.
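For illustration, a hedged sketch of that manual approach using pip's --target option (the vendor directory names are placeholders):

pip install packageX==1.3 --target vendor/packagex_13
pip install packageX==1.4 --target vendor/packagex_14

Each consumer then has to point sys.path at the copy it needs:

import sys
sys.path.insert(0, "vendor/packagex_13")
import packageX  # resolves to 1.3 in this process

Note that a single interpreter still resolves import packageX to whichever copy comes first on sys.path, which is why the renaming/alias hack above is needed if both versions must coexist in one process.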
Download the source for each package and install each in its own separate folder. For example, I had version 1.10 of a package but wanted to switch to the dev version for some work. I downloaded the source for the dev module:
git clone https://github.com/networkx/networkx.git
cd networkx
I created a folder for this version:
mkdir /home/username/opt/python

Then I set the PYTHONPATH env var:

export PYTHONPATH=/home/username/opt/python/lib/python2.7/site-packages/

Next, I installed it using:

python setup.py install --prefix=/home/username/opt/python
Now, since my PYTHONPATH points to this other site-packages folder, when I run python on the command line and import the new module, it works. To switch back, remove the new folder from PYTHONPATH.
>>> import networkx as nx
>>> nx.__version__
'2.0.dev_20151209221101'
An ugly workaround I use with Python in Blender: I'll install (and keep off the path) a like version of Python and use subprocess to have the other version do the needed work. Blender's Python tends to get a little temperamental if you do much more than install pandas and scipy. I've tried using virtualenvs with Blender, but that tends to break things.
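As a hedged sketch of that subprocess hand-off (the interpreter path and worker script are hypothetical):

import subprocess

# have the separately installed interpreter do the heavy lifting
output = subprocess.check_output(
    ["/opt/otherpython/bin/python", "worker.py", "input.csv"])
print(output)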
Also, on the off chance you are using Blender for data visualization, you are going to want to add a config folder to your version-number folder; this keeps all of your addons in that folder, makes it far more portable, and makes it far less likely to mess up other installs of Blender. Many people who make addons for Blender are not 'programmers', so those savvy people will often do some very hackish things, and this has been the best workaround I've been able to use.
Another workaround (and this has so many flags on the play that it should disqualify me from touching a keyboard) is to manually locate the __init__ file and manually add it to globals with importlib... this comes with risks. Some modules will play alright when you do this; others will throw fits that lead to extra-special troubleshooting sessions. Keep it to like versions and it does cut down on the issues. I've had 'alright' luck using this to import modules from behind virtualenvs, but there is a reason I use subprocess calls when working with Blender's Python.
import importlib.util

def importfromfilelocation(x, y, z):
    # e.g. x = 'tk', y = "tkinter", z = r'C:\pyth
    # build a module spec pointing at the file on disk (z is the path to the module's __init__ file)
    spec = importlib.util.spec_from_file_location(y, z)
    print(spec)
    # create the module from the spec and execute it
    mod_alis = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod_alis)
    # expose the loaded module in globals under the requested alias
    globals()[str(x)] = mod_alis
Another "workaround" is to use IPC/RPC and run isolated packages in services. If the dependencies are on the different libraries, maybe can separate by the usage of the packages.
Related
I'm a Java/Scala dev transitioning to Python for a work project. To dust off the cobwebs on the Python side of my brain, I wrote a webapp that acts as a front-end for Docker when doing local Docker work. I'm now working on packaging it up and, as such, am learning about setup.py and virtualenv. Coming from the JVM world, where dependencies aren't "installed" so much as downloaded to a repository and referenced when needed, the way pip handles things is a bit foreign. It seems like best practice for production Python work is to first create a virtual environment for your project, do your coding work, then package it up with setup.py.
My question is, what happens on the other end when someone needs to install what I've written? They too will have to create a virtual environment for the package but won't know how to set it up without inspecting the setup.py file to figure out what version of Python to use, etc. Is there a way for me to create a setup.py file that also creates the appropriate virtual environment as part of the install process? If not — or if that's considered a "no" as this respondent stated to this SO post — what is considered "best practice" in this situation?
You can think of virtualenv as an isolated environment for everything you install using pip. It is a simple way to handle different versions of Python and its packages. For instance, say you have two projects which use the same packages but need different versions of them. By using virtualenv you can isolate those two projects and install different versions of the packages separately, rather than on your working system.
Now, let's say you want to work on a project with your friend. In order to have the same packages installed, you have to somehow share which packages and versions your project depends on. If you are delivering a reusable package (a library), then you need to distribute it, and this is where setup.py helps. You can learn more in the Quick Start.
However, if you work on a web site, all you need is to put the library versions into a separate file. Best practice is to create separate requirements files for tests, development and production. To see the format of the file, run pip freeze: you will be presented with a list of the packages currently installed on the system (or in the virtualenv). Put that into a file and you can install it later on another PC, into a completely clean virtualenv, using pip install -r development.txt.
And one more thing: please do not pin strict versions of packages the way pip freeze shows them; most of the time you want >= some minimum X.X version. The good news is that pip handles dependencies on its own, so you do not have to list transitive dependencies there; pip will sort them out.
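For example, the round trip looks like this:

pip freeze > development.txt
pip install -r development.txt

and inside development.txt, prefer a loose pin such as somepackage>=1.2 (the name is hypothetical) over the exact somepackage==1.2.3 that pip freeze emits.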
Speaking of deployment, you may want to check out tox, a tool for managing virtualenvs. It helps a lot with deployment.
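As a rough sketch, a minimal tox.ini might look like this (the test command is an assumption; substitute your own runner):

[tox]
envlist = py27

[testenv]
deps = -rrequirements.txt
commands = python -m unittest discover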
Python's default package path always points to the system environment, which needs administrator access to install into. Virtualenv is able to localise the installation to an isolated environment.
For deployment/distribution of a package, you can choose to:
Distribute the source code. The user needs to run python setup.py install, or
Pack your Python package and upload it to PyPI or a custom devpi server, so the user can simply run pip install <yourpackage> (see the sketch below).
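A hedged sketch of the second route (twine is an assumption; it must be installed separately):

python setup.py sdist    # build a source distribution into dist/
twine upload dist/*      # upload it to PyPI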
However, as you can see from the issue above: without virtualenv, the user needs administrator access to install any Python package.
In addition, the PyPI package world contains a fair number of badly tested packages that don't work out of the box.
Note: virtualenv itself is actually a hack to achieve isolation.
If I install all packages using
python setup.py install --prefix=~/.local
how can I make Python read my packages from there and not from the system wide version? I tried editing PYTHONPATH to put ~/.local/lib/python2.x/site-packages/ first, but it does not help.
I thought that ~/.local is guaranteed to be read first in Python 2.6 and later versions. Is this true? Is something special required to make it true? That would solve the issue. Right now it seems that PYTHONPATH paths get incorporated into sys.path but are in the list after the system-wide site-packages directories, making the system wide version get used instead of the one in ~/.local.
Using pip is not an option unfortunately.
It is recommended that you use virtualenv with a proper activation script, which will take care of setting up PYTHONPATH properly, without messing up the system-wide Python installation in any way:
http://www.virtualenv.org/en/latest/index.html
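A minimal sketch of that route (the env path is hypothetical):

virtualenv ~/envs/myenv
source ~/envs/myenv/bin/activate
python setup.py install    # now installs into the virtualenv, not system-wide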
short version: how can I get rid of the multiple-versions-of-python nightmare?
long version: over the years, I've used several versions of python, and what is worse, several extensions to python (e.g. pygame, pylab, wxPython...). Each time it was on a different setup, with different OSes, sometimes different architectures (like my old PowerPC mac).
Nowadays I'm using a Mac (OS X 10.6 on x86-64) and it's a dependency nightmare each time I want to revive a script more than a few months old. Python itself already comes in three different flavours in /usr/bin (2.5, 2.6, 3.1), but I had to install 2.4 from MacPorts for pygame, and something else (I cannot remember what) forced me to install all three others from MacPorts as well, so at the end of the day I'm the happy owner of seven (!) instances of Python on my system.
But that's not the problem. The problem is that none of them has the right (i.e. the same set of) libraries installed; some of them are 32-bit, some 64-bit, and now I'm pretty much lost.
For example, right now I'm trying to run a three-year-old script (not written by me) which used matplotlib/numpy to draw a real-time plot within a rectangle of a wxWidgets window. But I'm failing miserably: py26-wxpython from MacPorts won't install, stock Python has wxWidgets included but also has some conflict between 32-bit and 64-bit, and it doesn't have numpy... what a mess!
Obviously, I'm doing things the wrong way. How do you usually cope with all that chaos?
I solve this using virtualenv. I sympathise with wanting to avoid further layers of nightmare abstraction, but virtualenv is actually amazingly clean and simple to use. You literally do this (command line, Linux):
virtualenv my_env
This creates a new python binary and library location, and symlinks to your existing system libraries by default. Then, to switch paths to use the new environment, you do this:
source my_env/bin/activate
That's it. Now if you install modules (e.g. with easy_install), they get installed to the lib directory of the my_env directory. They don't interfere with existing libraries, you don't get weird conflicts, stuff doesn't stop working in your old environment. They're completely isolated.
To exit the environment, just do
deactivate
If you decide you made a mistake with an installation, or you don't want that environment anymore, just delete the directory:
rm -rf my_env
And you're done. It's really that simple.
virtualenv is great. ;)
Some tips:
on Mac OS X, use only the python installation in /Library/Frameworks/Python.framework.
whenever you use numpy/scipy/matplotlib, install the enthought python distribution
use virtualenv and virtualenvwrapper to keep those "system" installations pristine; ideally use one virtual environment per project, so each project's dependencies are fulfilled. And, yes, that means potentially a lot of code will be replicated in the various virtual envs.
That seems like a bigger mess indeed, but at least things work that way. Basically, if one of the projects works in a virtualenv, it will keep working no matter what upgrades you perform, since you never change the "system" installs.
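For illustration, a typical virtualenvwrapper session (the project name is hypothetical):

mkvirtualenv projectA    # create and activate an env for this project
pip install numpy        # installs only into projectA
workon projectA          # switch back to it in a later shell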
Take a look at virtualenv.
What I usually do is try to (progressively) keep up with Python versions as they come along (once all of the external dependencies have correct versions available).
Most of the time the Python code itself can be transferred as-is with only minor needed modifications.
My biggest Python project at work (15,000+ LOC) has now been on Python 2.6 for a few months (upgrading everything from Python 2.5 took most of a day, due to installing / checking 10+ dependencies...).
In general I think this is the best strategy with most of the interdependent components in the free-software stack (think of the dependencies in the Linux software repositories): keep your versions (semi-)current, or at least progressing at the same pace.
install the Python versions you need, preferably from source
when you write a script, include the full Python version in its shebang line (such as #!/usr/local/bin/python2.6), as sketched below
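A tiny sketch of that convention (the interpreter path must match where you installed that version):

#!/usr/local/bin/python2.6
import sys
print(sys.version)  # confirms which interpreter actually ran the script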
I can't see what could go wrong.
If something does, it's probably MacPorts' fault anyway, not yours (one of the reasons I don't use MacPorts anymore).
I know I'm probably missing something and this will get downvoted, but please leave at least a little comment in that case, thanks :)
I use the MacPorts version for everything, but as you note a lot of the default versions are bizarrely old. For example, vim omnicomplete in Snow Leopard has python25 as a dependency. A lot of Python-related ports have old dependencies, but you can usually flag the newer version at build time, for example port install vim +python26 instead of port install vim +python.
Do a dry run before installing anything to see if you are pulling in, for example, the whole of python24 when it isn't necessary. Check portfiles often, because the naming convention from when DarwinPorts was getting off the ground left something to be desired.
In practice I just leave everything in the default /opt... folders of MacPorts, including a copy of the entire framework with duplicates of PyObjC, etc., and just stick with one version at a time, retaining the option to return to the system default if things break unexpectedly. Which is all perhaps a bit too much work to avoid using virtualenv, which I've been meaning to get around to using.
I've had good luck using Buildout. You set up a list of which eggs and which versions you want. Buildout then downloads and installs private versions of each for you. It makes a private "python" binary with all the eggs already installed. A local "nosetests" makes things easy to debug. You can extend the build with your own functions.
On the down side, Buildout can be quite mysterious. Do "buildout -vvvv" for a while to see exactly what it's doing and why.
http://www.buildout.org/docs/tutorial.html
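For reference, a minimal buildout.cfg along those lines might look like this (the egg list is hypothetical):

[buildout]
parts = py

[py]
recipe = zc.recipe.egg
eggs =
    nose
    myapp
interpreter = python

Running bin/buildout then generates a bin/python with those eggs already importable.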
At least under Linux, multiple pythons can co-exist fairly happily. I use Python 2.6 on a CentOS system that needs Python 2.4 to be the default for various system things. I simply compiled and installed python 2.6 into a separate directory tree (and added the appropriate bin directory to my path) which was fairly painless. It's then invoked by typing "python2.6".
Once you have separate pythons up and running, installing libraries for a specific version is straightforward. If you invoke the setup.py script with the python you want, it will be installed in directories appropriate to that python, and scripts will be installed in the same directory as the python executable itself and will automagically use the correct python when invoked.
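A hedged sketch of that recipe (the prefix is arbitrary):

./configure --prefix=/opt/python2.6   # from the CPython source tree
make && make install
/opt/python2.6/bin/python2.6 setup.py install   # library lands in that python's tree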
I also try to avoid using too many libraries. When I only need one or two functions from a library (eg scipy), I'll often see if I can just copy them to my own project.
First: I'm running Macports. No problems with that, except:
/opt/local/Library/Frameworks/Python.framework/Versions/2.6/bin
which is the value of sys.exec_prefix for my MacPorts Python, even though:
/opt/local/lib/python2.6/site-packages/
seems to be quite a logical place to put things, /opt/local being the macports --prefix, as it were. Why does easy_install put things in this odd Frameworks/Python.framework thing?
More importantly, can I use the methods here to ensure that all my system-wide Python stuff, particularly the scripts I really want in /opt/local/bin, things I use all over the place like (i|b)python for example, stays accessible?
So first I'd make the observation that the directory
/Library/Frameworks/Python.framework/Versions/2.6/...
is where you can find the OS X provided version of Python, so I'd guess that what the MacPorts developers want to do is replicate the OS X Python directory structure but keep it as far away from the OS X version of Python as possible. Also, I think replicating the structure allows you to install third-party extensions outside of the MacPorts modules. I've done that before, so I know it's possible; it's just that you still need to make sure:
When installing the modules, make sure the version that the installer encounters when doing a PATH search is the Python installation you want to install to (i.e. so a module meant for the MacPorts Python doesn't end up in the OS X Python). The page you cited earlier spells out the procedure an installer goes through when trying to find a Python installation to install to, so you have some hints there about what to look into if you have problems installing third-party modules.
Once you have installed your modules, you'll have to keep track of them yourself, because MacPorts won't do that for you. On the bright side, everything should be in the site-packages folder of the Python installation, I think.
This question from the MacPorts FAQ kind of lays the groundwork for their philosophy of directory organization, I think. Also see this question.
I'm in a bit of a discussion with some other developers on an open source project. I'm new to python but it seems to me that site-packages is meant for libraries and not end user applications. Is that true or is site-packages an appropriate place to install an application meant to be run by an end user?
Once you get to the point where your application is ready for distribution, package it up for your favorite distributions/OSes in a way that puts your library code in site-packages and executable scripts on the system path.
Until then (i.e. for all development work), don't do any of the above: save yourself major headaches and use zc.buildout or virtualenv to keep your development code (and, if you like, its dependencies as well) isolated from the rest of the system.
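As a hedged sketch of that packaging split (all names hypothetical), a setup.py can route library code to site-packages and executables onto the path:

from setuptools import setup

setup(
    name="myapp",
    version="1.0",
    packages=["myapp"],         # library code, installed into site-packages
    scripts=["bin/myapp-cli"],  # executable script, installed onto the system path
)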
We do it like this.
Most stuff we download is in site-packages. They come from pypi or Source Forge or some other external source; they are easy to rebuild; they're highly reused; they don't change much.
Most stuff we write is in other locations (usually under /opt, or c:\opt) AND is included in the PYTHONPATH.
There's no great reason for keeping our stuff out of site-packages. However, our feeble excuse is that our stuff changes a lot. Pretty much constantly. To reinstall in site-packages every time we think we have something better is a bit of a pain.
Since we're testing out of our working directories or SVN checkout directories, our test environments make heavy use of PYTHONPATH.
The development use of PYTHONPATH bled over into production. We use a setup.py for production installs, but install to an alternate home under /opt and set the PYTHONPATH to include /opt/ourapp-1.1.
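For illustration, a sketch of that alternate-home install (mirroring the path above):

python setup.py install --home=/opt/ourapp-1.1
export PYTHONPATH=/opt/ourapp-1.1/lib/python   # where --home puts pure-Python modules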
The program run by the end user is usually somewhere in their path, with most of the code in the module directory, which is often in site-packages.
Many python programs will have a small script located in the path, which imports the module, and calls a "main" method to run the program. This allows the programmer to do some upfront checks, and possibly modify sys.path if needed to find the needed module. This can also speed up load time on larger programs, because only files that are imported will be run from bytecode.
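A minimal sketch of such a wrapper script (module and function names hypothetical):

#!/usr/bin/env python
# thin launcher placed on the user's PATH
import sys

# upfront checks or sys.path tweaks could go here
from myapp import main  # the real code lives in site-packages

if __name__ == "__main__":
    sys.exit(main())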
Site-packages is for libraries, definitely.
A hybrid approach might work: you can install the libraries required by your application in site-packages and then install the main module elsewhere.
If you can turn part of the application to a library and provide an API, then site-packages is a good place for it. This is actually how many python applications do it.
But from a user's or administrator's point of view, that isn't actually the problem. The problem is how we can manage the installed stuff: after I have installed it, how can I upgrade and uninstall it?
I use Fedora. If I use the Python that came with it, I don't like installing things to site-packages outside the RPM system. In some cases I have built an RPM myself to install things.
If I build my own python outside RPM, then I naturally want to use python's mechanisms to manage it.
A third way is to use something like easy_install to install such a thing, for example as a user, into the home directory.
So
Allow packaging for distributions.
Allow selecting the python to use.
Allow using python installed by distribution where you don't have permissions to site-packages.
Allow using python installed outside distribution where you can use site-packages.