Reasons to use distutils when packaging C/Python project

I have an open source project containing both Python and C code. I'm wondering whether distutils is of any use to me, because I'm planning to build an Ubuntu/Debian package. The C code is not something I could or would want to use as a Python extension; the C and Python programs communicate with each other over TCP/IP through localhost.
So the bottom line: while I'm learning packaging, will using distutils-specific files only confuse me further, since I can't use my C code as a Python extension? Or should I split the C and Python functionality into separate projects, to be able to understand the packaging concepts better?

distutils can be used to install end-user programs, but it's most useful for Python libraries, as it can create source packages and also install them in the correct place. For libraries I would say it's more or less required.
But for an end-user Python program you can also use make or whatever else you like and are used to: you don't need to install any code into Python's site-packages directory, you don't need to put your code on PyPI, and it doesn't need to be importable from other Python code.
I don't think distutils is either more or less complicated to use for installing an end-user program than other tools. All such install/packaging tools are hella-complex, as Cartman would have said.

Because it gives you a unified python setup.py install command? distutils or setuptools? Whatever, just use one of those.
For development it's also really useful, because you don't have to care about where to find this or that dependency. As long as it's standard Python or a basic system library, setup.py should find it for you. With setup.py, you no longer need ./configure stuff or ugly autotools to create huge Makefiles. It just works (tm)
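To make that concrete, here is a minimal sketch of a setup.py for the Python half of a project like the one described (the package and script names are made up; the C program would be built and packaged separately, e.g. by your Debian packaging):

#!/usr/bin/python
# Minimal sketch, assuming a Python package "myapp" and a launcher
# script in bin/myapp-gui; the C daemon is handled outside distutils.
from distutils.core import setup

setup(name = 'myapp',
      version = '0.1',
      description = 'GUI frontend talking to the C daemon over localhost',
      packages = ['myapp'],
      scripts = ['bin/myapp-gui'])

# install with: python setup.py install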

Related

Where in a virtualenv should I put a custom library?

I have a virtualenv that serves a few separate projects. I'd like to write a utility library that each of these projects can use. It seems to make sense to stick that in the virtualenv. By all means shoot me down now but please give an alternative if you do.
Assuming I'm not completely crazy though, where's the best place to stick my library?
My virtualenv sticks everything I install with pip in lib/python2.7/site-packages. I wonder if it would make more sense to follow suit, or to hack in a separate home (further up the directory tree) so that if things ever clash, my work isn't overwritten by pip (et al.).
If your project follows standard packaging practice with setuptools, then all you have to do is run python setup.py develop inside each virtualenv you want the library to be used from. A .egg-link file pointing at your package will be created, and your other projects will then use it much like any other installed package, with the added benefit that your latest changes are available to all of them at the same time (if that's your intention). If not, then you can either run python setup.py install or keep multiple versions at different locations on the file system.
To get yourself started, take a look at Getting Started With setuptools and setup.py (you can skip the part about registering your package on PyPI if this is your private work).
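As a minimal sketch (the library name is hypothetical), the shared library only needs a small setup.py, after which you run the develop command in each virtualenv:

# setup.py for the shared utility library - a minimal sketch
from setuptools import setup, find_packages

setup(name = 'myutils',            # hypothetical library name
      version = '0.1',
      packages = find_packages())

# then, with each virtualenv activated:
#   python setup.py develop
# this drops a myutils.egg-link into that virtualenv's site-packages,
# pointing back at your source tree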
Another relevant Stack Overflow thread: setup.py examples? The Hitchhiker's Guide to Packaging can also be quite useful.

python dependency + deployment tool?

Does anybody know of a tool to handle module dependencies + deployment in Python?
Details:
By "handle", I mean:
list,
keep track of, and
bundle up a zip/installable file for me.
Make it trivial to redeploy on another system (i.e. include all modules at the correct versions in the deploy file, without having to go somewhere to get them *).
Alerts me if I am about to do something which changes the environment.
It must follow module dependencies all the way, not just one level deep.
Plus some stuff I probably haven't thought of.
I'm not talking about Virtualenv, Fabric, pip freeze** and (I don't think) Paver.
This evening I tried to count the modules that Pylons depends on. After a detour into Snakefood and Graphviz, the answer is A LOT. 100+ (and Snakefood did not get them all).
As I'm getting more and more into Python, handling this problem manually is starting to take up more of my time than I would like, and it's unreliable.
If it matters, I use Python 2.7 on Windows 7.
* I know this will introduce some artifacts.
** Combining virtualenv and pip freeze goes some way to solving this, but it's still not what I am looking for.
setuptools plus PyPI is made for that. setuptools is an enhanced distutils with which you can specify dependencies. For example, in the setup function:
install_requires = ['simplejson>=2.0,==dev'],
will pull in that dependency when you use easy_install.
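In context, a minimal setup.py using this might look like the following sketch (the project name is made up, and I've dropped the ,==dev part, which was setuptools' old syntax for preferring a development version):

# setup.py - a sketch of declaring dependencies with setuptools
from setuptools import setup, find_packages

setup(name = 'mytool',                         # hypothetical project name
      version = '0.1',
      packages = find_packages(),
      install_requires = ['simplejson>=2.0'])  # fetched from PyPI on install

# easy_install (or pip) then resolves and installs simplejson for you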
Since you are on Windows, take a look at py2exe.
Something of interest from the py2exe FAQ:
How does py2exe decide which modules you need?
To determine which modules should go in the final .exe file, py2exe does a recursive search of the script that you are packaging to find its dependencies and, in turn, all of their dependencies.
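For illustration, a minimal py2exe setup.py looks like this (the script name is hypothetical; this is the classic Python 2 era API):

# setup.py - minimal py2exe sketch
from distutils.core import setup
import py2exe  # importing py2exe registers its distutils command

setup(console = ['myscript.py'])  # build a console myscript.exe

# build with: python setup.py py2exe
# the dist/ directory then contains the .exe plus the modules py2exe found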

Best practice to install dependencies?

I am thinking about a good way to ship my application, which is a Python package. Installing my package is easy thanks to Python's distutils.
The trouble comes with the dependencies my package relies on. If the dependencies are Python packages I can again deal with them easily using distutils, but what about non-Python packages? Some of them even need a lot of care while building and installing, since very special compiler flags need to be set and so forth...
If I want to automate the installation procedure for the user what is the best way to go about it?
Write a makefile that downloads and installs the dependencies
Write a script that installs the dependencies
No automation is best: simply write a manual that tells the user how to install the dependencies
???
Thanks in advance for any answers or suggestions.
We have a project named Kivy ( http://kivy.org/ ) that has exactly the same issue. At an early stage, we made an all-in-one package that included an installer for every dependency, but on Windows that left the user with a lot of "Next >" buttons to click, one per dependency. So now we manage the dependencies ourselves.
Except for Linux (since all our deps are already packaged on "linux"), we have taken the approach of maintaining a "portable-deps" zipfile for each platform. We then have a script that:
Downloads the portable-deps zip
Includes the latest version of our project
Adds a launcher script in the root directory
Zips the root directory and renames the archive to project-<version>-<platform>.zip
There is a special case for Mac OS X, where the zip is a dmg with a little UI.
The good part is that users don't have to care about deps, and developers know exactly which binaries are delivered with the project :)
For reference, we have build_portable commands for distutils:
https://github.com/kivy/kivy/blob/master/kivy/tools/packaging/osx/build.py
https://github.com/kivy/kivy/blob/master/kivy/tools/packaging/win32/build.py
The most important thing to help you decide is to consider your audience.
Are they technically inclined and likely to be comfortable following instructions specifying how to build the dependencies themselves? If so, go with (3). If not, writing a Python or shell script, or a makefile, to automate the task may be the way to go. Pick whichever you feel most comfortable writing.
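If you do go the script route, here is a rough sketch of what a dependency bootstrap could look like, in the spirit of the portable-deps approach above (Python 2 era; the URL and names are invented):

# fetch_deps.py - rough sketch of a dependency bootstrap
import urllib
import zipfile

DEPS_URL = 'http://example.com/portable-deps-win32.zip'  # hypothetical

def fetch_deps(target='deps'):
    # download the pre-built dependency bundle to a temporary file
    zip_path, _ = urllib.urlretrieve(DEPS_URL)
    # unpack it next to the project
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target)

if __name__ == '__main__':
    fetch_deps()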

xcopy python deployment

I'm new to Python and I'm writing my first program. Once it's finished, I would like to be able to run it from the source code on a Windows or Mac machine. My program has dependencies on third-party modules.
I read about virtualenv, but I don't think it helps me, because it's not relocatable and not cross-platform (see Making Environments Relocatable, http://pypi.python.org/pypi/virtualenv).
The best scenario would be to install the third-party modules locally in my project, i.e. an xcopy installation.
I would be really surprised if Python didn't support this easily, especially since it promotes simplicity and frictionless programming.
You can do what you want; you just have to make sure that the directory containing your third-party modules is on the Python path.
There's no requirement to install modules system-wide.
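For example, a minimal sketch: keep the third-party modules in a vendor/ directory (the name is arbitrary) next to your entry script and prepend it to sys.path before importing anything from it:

# main.py - sketch of an "xcopy" layout with bundled third-party modules
import os
import sys

# prepend the bundled directory so it wins over any system-wide copies
HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(HERE, 'vendor'))

import somethirdparty  # now resolved from ./vendor first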
Note that while packaging your whole app with py2exe may not be an option, you can use it to make a simple launcher environment: you write a script that imports your module/package and launches the main() entry point, package that with py2exe, and keep your application code outside it, as plain Python code or an egg. I do something similar, where I read a .pth text file to learn which paths to add to sys.path in order to import my application code.
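A sketch of what that launcher trick might look like (the file and module names are hypothetical):

# launcher.py - packaged with py2exe; the real application stays outside
import sys

# read extra import paths from a plain-text .pth-style file
for line in open('application.pth'):
    line = line.strip()
    if line and not line.startswith('#'):
        sys.path.insert(0, line)

import myapp       # found via one of the paths added above
myapp.main()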
Simply put, that's generally not how Python works: modules are installed site-wide and used that way. Are you familiar with pip and/or easy_install? Those plus PyPI let you install dependencies automatically, whatever you need.
If you want to create a standalone executable, you'd typically use py2exe, py2app or something like that; then you have no dependency on Python at all.
I also found out about zc.buildout, which can be used to include dependencies in an automated way.

How to use Python distutils?

I wrote a quick program in Python to add a GTK GUI to a CLI program. I was wondering how I can create an installer using distutils. Since it's just a GUI front-end for a command-line app, it only works on *nix anyway, so I'm not worried about it being cross-platform.
My main goal is to create a .deb package for Debian/Ubuntu users, but I don't understand make/configure files. I've primarily been a web developer up until now.
Edit: Does anyone know of a project that uses distutils, so I could see it in action and, you know, actually try building it?
Here are a few useful links
Ubuntu Python Packaging Guide
This guide is very helpful; I don't know how I missed it during my initial wave of googling. It even walks you through packaging up an existing Python application.
The Ubuntu MOTU Project
This is the official package-maintaining project at Ubuntu. Anyone can join, and there are lots of tutorials and info about creating packages of all types, including the above Python packaging guide.
"Python distutils to deb?" - Ars Technica Forum discussion
According to this conversation, you can't just use distutils: it doesn't follow the Debian packaging format (or something like that). I guess that's why you need dh_make, as seen in the Ubuntu packaging guide.
"A bdist_deb command for distutils"
This one has some interesting discussion (it's also how I found the Ubuntu guide) about concatenating a zip file and a shell script to create some kind of universal executable (anything with Python and bash, that is). Weird. Let me know if anyone finds more info on this practice, because I've never heard of it.
Description of the deb format and how distutils fits in - python mailing list
See the distutils simple example. That's basically what it looks like, except real install scripts usually contain a bit more information. I have not seen any that are fundamentally more complicated, though. In essence, you just give it a list of what needs to be installed. Sometimes you need to give it some mapping dicts, since the source and installed trees might not be the same.
Here is a real-life (anonymized) example:
#!/usr/bin/python
from distutils.core import setup
setup(name = 'Initech Package 3',
      description = "Services and libraries ABC, DEF",
      author = "That Guy, Initech Ltd",
      author_email = "that.guy@initech.com",
      version = '1.0.5',
      package_dir = {'Package3' : 'site-packages/Package3'},
      packages = ['Package3', 'Package3.Queries'],
      data_files = [
          ('/etc/Package3', ['etc/Package3/ExternalResources.conf'])
      ])
apt-get install python-stdeb
Python to Debian source package conversion utility
This package provides some tools to produce Debian packages from Python packages via a new distutils command, sdist_dsc. Automatic defaults are provided for the Debian package, but many aspects of the resulting package can be customized via a configuration file.
pypi-install will query the Python Package Index (PyPI) for a package, download it, create a .deb from it, and then install the .deb.
py2dsc will convert a distutils-built source tarball into a Debian source package.
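For example, with a standard setup.py in place, usage looks like this (commands as named in the stdeb description above; the package name is hypothetical):

python setup.py --command-packages=stdeb.command sdist_dsc   # produce a Debian source package
py2dsc mypackage-0.1.tar.gz                                  # or convert an existing sdist
pypi-install mypackage                                       # or fetch from PyPI, build a .deb, install it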
Most Python programs use distutils. Django is one - see http://code.djangoproject.com/svn/django/trunk/setup.py
You should also read the documentation, as it's very comprehensive and has some good examples.
I found the following tutorial to be very helpful. It's shorter than the distutils documentation and explains how to set up a typical project step by step.
distutils really isn't all that difficult once you get the hang of it. It's really just a matter of putting in some meta-information (program name, author, version, etc.) and then selecting which files you want to include. For example, here's a sample distutils setup.py module from a decently complex Python library:
Kamaelia setup.py
Note that this doesn't deal with any data files or whatnot, so YMMV.
On another note, I agree that the distutils documentation is probably some of Python's worst documentation. It is extremely thorough in some areas, but neglects some really important information in others.
