Robust way to ensure other people can run my Python program

I wish to place a Python program on GitHub and have other people download and run it on their computers with assorted operating systems. I am relatively new to Python but have used it enough to notice that getting the various versions of all the included modules to work together can be problematic. I just discovered requirements.txt (generated with pipreqs and applied with the command pip install -r /path/to/requirements.txt), but I was very surprised to notice that requirements.txt does not actually state which version of Python is being used, so obviously it is not a complete solution on its own. So my question is: what set of specifications/files/something else is needed to ensure that someone downloading my project will actually be able to run it with the fewest possible problems?
EDIT: My plan was to be guided by whichever answer got the most upvotes. But so far, after 4 answers and 127 views, not a single answer has even one upvote. If some of the answers are no good, it would be useful to see comments explaining why.

Have you considered setting up a setup.py file? It's a handy way of bundling all of your... well, setup into a single location. All your user has to do is (a) clone your repo and (b) run pip install . to execute the setup.py.
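As a rough illustration, a minimal setup.py might look like the sketch below; the project name, version, and dependency list are placeholders, not taken from the question:

from setuptools import setup, find_packages

setup(
    name="myproject",               # placeholder project name
    version="0.1.0",
    packages=find_packages(),       # picks up your package directories automatically
    python_requires=">=3.6",        # declare the interpreter versions you support
    install_requires=[
        "requests>=2.20",           # example dependency; substitute your own list
    ],
)

With this in place, pip install . installs the project together with its declared dependencies in one step.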
There's a great Stack Overflow discussion about this.
There is also a handy example setup.py written by the author of requests.
This should cover most use cases. If you want to make your project truly distributable, you'll want to look into publishing it on PyPI, the official package index.
Beyond that, if you're asking how to make a program "OS independent", there is no one-size-fits-all answer; it depends on what your code does, and it requires researching how your particular code interacts with each OS.

There are many, many ways to do this. I'll skate over the principles behind each, and its use case.
1. A python environment
There are many ways to do this: pipenv, conda, requirements.txt, etc.
With some of these you can specify Python versions explicitly. With others, just document the range of Python versions you know it works with - for example, if you're using Python 3.7, it's unlikely not to support 3.6, since there are only one or two minor changes; 3.8 should work as well.
Another similar method is setup.py. These are generally used to distribute libraries - like PyInstaller (another solution I'll mention below), numpy, wxPython, PyQt5, etc. - for import/command-line use. The Python packaging guide is quite useful, and there are loads of tutorials out there (google "python setup.py tutorial"). You can also specify requirements in these files.
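For instance, conda can pin the interpreter itself. A hedged sketch of an environment.yml (the environment name and packages are illustrative):

name: myproject            # placeholder environment name
dependencies:
  - python=3.7             # pins the Python version, which requirements.txt cannot do
  - numpy
  - requests

Anyone who clones the repo can then run conda env create -f environment.yml to reproduce the environment, interpreter included.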
2. A container
Docker is the big one. If you haven't heard of it, I'll be surprised. A quick google for a summary turns up this article, which I'll quote part of:
So why does everyone love containers and Docker? James Bottomley, formerly Parallels' CTO of server virtualization and a leading Linux kernel developer, explained VM hypervisors, such as Hyper-V, KVM, and Xen, all are "based on emulating virtual hardware. That means they're fat in terms of system requirements."
Containers, however, use shared operating systems. This means they are much more efficient than hypervisors in system resource terms. Instead of virtualizing hardware, containers rest on top of a single Linux instance. This means you can "leave behind the useless 99.9 percent VM junk, leaving you with a small, neat capsule containing your application,"
That should summarise it for you. (Note you don't need a specific OS for containers.)
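To make that concrete, a minimal Dockerfile for a Python script might look like this sketch (main.py and the interpreter version are placeholders):

FROM python:3.7-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "main.py"]

Your users then only need Docker itself: docker build -t myapp . followed by docker run myapp.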
3. An executable file
There are two main tools that do this at the time of writing: PyInstaller and cx_Freeze. Both are actively developed, and both are open source.
You take your script, and the tool compiles it to bytecode, finds the imports, copies those, and creates a portable Python environment that runs your script on the target system without the end user needing Python.
Personally, I prefer PyInstaller - I'm one of the developers. PyInstaller provides all of its functionality through a command-line script, and supports most libraries that you can think of - and it is extendable to support more. cx_Freeze requires a setup script.
Both tools support Windows, Linux, macOS, and more. PyInstaller can create single-file executables or a one-folder bundle, whereas cx_Freeze only supports one-folder bundles. PyInstaller 3.6 supports Python 2.7 and 3.5-3.7, but 4.0 won't support Python 2. cx_Freeze dropped Python 2 support as of its last major release (6.0, I think).
Anyway, enough about the tools' features; you can look into those yourself (see https://pyinstaller.org and https://cx-freeze.readthedocs.io for more info).
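As a quick illustration, a typical PyInstaller run looks like this (the script name is a placeholder):

pip install pyinstaller
pyinstaller --onefile your_script.py    # --onefile bundles everything into one executable

The resulting executable ends up in the dist/ folder and can be shipped to users who don't have Python installed.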
When using this distribution method, you usually provide source code on the GitHub repo, a couple of exes (one for each platform) ready for download, and instructions on how to build the code into an executable file.

The best tool I have used so far for this is Pipenv. Not only does it unify and simplify the whole pip+virtualenv workflow for you, the developer, but it also guarantees that the exact versions of all dependencies (including Python itself) are met when other people run your project with it.
The project website does a pretty good job of explaining how to use the tool, but, for completeness' sake, I'll give a short explanation here.
Once you have Pipenv installed (for instance, by running pip install --user pipenv), you can go to your project directory and run pipenv --python 3.7, and Pipenv will create a new virtualenv for your project along with a Pipfile and a Pipfile.lock (more on them later). If you go ahead and run pipenv install -r requirements.txt, it will install all your packages. Now you can do pipenv shell to activate your new virtualenv, or pipenv run python your_main_file.py to simply run your project.
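Putting those steps together (the path and script name are placeholders):

pip install --user pipenv             # install Pipenv itself
cd path/to/your/project
pipenv --python 3.7                   # create the virtualenv plus Pipfile/Pipfile.lock
pipenv install -r requirements.txt    # import your existing requirements
pipenv run python your_main_file.py   # run the project inside the virtualenv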
Now let's take a look at the contents of your Pipfile. It should be something resembling this:
[packages]
Django = "*"
djangorestframework = "*"
iso8601 = "*"
graypy = "*"
whitenoise = "*"
[requires]
python_version = "3.7"
This file holds the human-readable specification of your project's dependencies (note that it specifies the Python version too). If your requirements.txt had pinned versions, your Pipfile could pin them too, but you can safely wildcard them, because the exact versions are stored in Pipfile.lock. From now on you can run things like pipenv update to update your dependencies - and don't forget to commit Pipfile and Pipfile.lock to your VCS.
Once people clone your project, all they have to do is run pipenv install and Pipenv will take care of the rest (it may even install the correct version of Python for them).
I hope this was useful. I'm not affiliated in any way with Pipenv, just wanted to share this awesome tool.

If your program isn't GUI-heavy, or has a web GUI, then you can share the code using Google Colaboratory.
https://colab.research.google.com/
Everyone can run it with the same environment. No need for installation.

If converting all your Python scripts into one executable would help you, then my answer below may be useful.
I have been developing a large desktop application purely in Python for three years. It is a GUI-based tool built on top of the PyQt library (Python bindings for the Qt C++ framework).
I am currently using the py2exe packaging library: it is a distutils extension which lets you build standalone Windows executable programs (32-bit and 64-bit) from Python scripts. All you have to do is:
Install py2exe: pip install py2exe
Create a setup.py script: it specifies the contents of the final EXE (name, icon, author, data files, shared libraries, etc.)
Execute: python setup.py py2exe
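A minimal py2exe setup.py looks roughly like this sketch (myapp.py is a placeholder script name):

from distutils.core import setup
import py2exe  # registers the py2exe command with distutils

setup(console=["myapp.py"])  # use windows=[...] instead for a GUI app without a console

Running python setup.py py2exe then drops the executable and its supporting files into the dist folder.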
I am also using the Inno Setup software to create an installer: creating shortcuts, setting environment variables, icons, etc.

I'll give you a very brief summary of some of the existing solutions for Python packaging that you may choose from (knowledge is power):
Follow the guidelines provided at Structuring Your Project. These conventions are widely accepted by the Python community, and they're usually a good starting point for newcomers to Python. By following these guidelines, Pythonistas looking at your project/source on GitHub or similar places will know straight away how to install it. Also, uploading your project to PyPI, as well as adding CI, will be painless if you follow those rules.
Once your project is structured properly according to standard conventions, the next step might be using one of the available freezers, in case you'd like to ship your end-users a package they can install without being forced to have Python on their machines. Be aware, though, that these tools won't provide you any code protection; said otherwise, extracting the original Python code from the final artifacts would be trivial in all cases.
If you still want to ship your project to your users without forcing them to install any dev dependency, and you also care about code protection (so you don't want to consider any of the existing freezers), you might use tools such as Nuitka, Shed Skin, Cython or similar ones. Usually, reversing code from the artifacts produced by these tools isn't trivial at all. Cracking protection, on the other hand, is a different matter, and unless you avoid shipping a physical binary to your end-user, you can't do much about it other than slowing attackers down. :)
Also, in case you need to use other languages in your Python project, another classic link that comes to mind is https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages; adding the build systems of such tools to CI by following the rules of point 1 would be pretty easy.
That said, I'd suggest sticking to bullet point 1, as I know that will be more than good enough to get you started; that particular point should also cover many of the existing use cases for "standard" Python projects.
While this is not intended to be a full guide, by following these points you'll be able to publish your Python project to the masses in no time.

I think you can use Docker with your Python project; see https://github.com/celery/celery/tree/master/docker for an example.
Follow the files there, and I think you can figure out how to write a Dockerfile for your own Python scripts.

Because it is missing from the other answers, I would like to add one completely different aspect:
Unit testing. Or testing in general.
Usually, it is good to have one known-good configuration. Depending on what the dependencies of the program are, you might have to test different combinations of packages. You can do that in an automated fashion with e.g. tox, or as part of a CI/CD pipeline.
There is no general rule for which combinations of packages should be tested, but usually Python 2/3 compatibility is a major issue. If you have strong dependencies on packages with major version differences, you might want to consider testing against those different versions.
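A minimal tox.ini for testing against two interpreters might look like this sketch (the environment list and test command are assumptions, not from the question):

[tox]
envlist = py27,py37

[testenv]
deps = pytest
commands = pytest

Running tox then creates a fresh virtualenv for each listed interpreter and runs the test suite in it.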

Related

Is there an easy solution to "No module named [...]" in Python 3? [duplicate]


Include a PIP module in my project so that user doesn't have to download it

Edit: I am self taught, I don't know the right terms.
This question is probably a duplicate, but I am not able to find the original. I need to include a Python pip package, say numpy, in my application. I don't want the user to run pip install -r requirements.txt; I want to include the module when the user downloads the application.
You don't have to include pip when you build the app, because all dependencies will be included in the app's resources at build time. I am not sure which kind of app you are building, but in general all dependencies are compiled to .pyc files and bundled when the app is built.
It's a whole world to explore, honestly. You need to look into the subjects of deployment and distribution; those will depend on the target operating systems. I would suggest investigating Docker, which allows you to package the OS (some Unix), runtime and dependencies into one "thing".
Vendoring
The only way I can think of, besides pip, to do this with pure source code would be to vendor the code you need. This means downloading the source code of the package (like numpy) and including it in your project.
This comes with many potential issues, including:
licensing issues
installation problems if there are C extension modules that need to be compiled
having to manually update the files when new releases come out
increased download size for your source code
etc.
I would not recommend doing this unless there is a hard technical requirement, because it's a real hassle to deal with. If there is a particular reason, beyond having to run one extra command, why you want to avoid pip, mentioning it would help in addressing this question.
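For completeness, a sketch of how vendored code is usually wired up; the vendor directory layout is an assumption, not part of the question:

# main.py - give a local 'vendor' directory priority on the import path
import os
import sys

sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "vendor"))

import numpy  # now resolved from ./vendor/numpy if present

Note this only works cleanly for pure-Python packages; numpy in particular ships compiled extensions, which is exactly the compilation issue listed above.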
Binary distribution
Also, depending on the app, you could look into something like PyInstaller to make a single .exe or binary file out of your app, so users don't even need Python or your dependencies. But be warned: this has its own set of complexities to watch out for, like having to build for every target platform (Windows, Mac and Linux).

Python and package installation while the application is installed

I have done some image-processing work using Python 3.5, OpenCV, scikit modules, etc. for an Unreal Engine game application.
I manually installed Python and the other modules using pip on my Windows system.
Now, when a user installs the application, I want Python and those modules to be installed automatically along with it.
I looked at PyInstaller, which turns a .py file into an application file, but unfortunately I could not understand how to make it do what I want.
Thank you for any piece of advice.
First, let me say Python packaging has improved a lot over the years, but it is still considered very hard compared to other languages such as Go.
Generally, I see two ways to bring your application to your users:
either make a Python package, or create an installable package for an operating system.
A Python package means you could upload it somewhere (e.g. PyPI) and your users could pip install your_package. This involves a lot of work. A good starting point would be:
https://packaging.python.org/tutorials/packaging-projects/
The second option is to create an installer, e.g. for Windows.
There are several tools out there, like the already-mentioned PyInstaller; more are listed on this page: https://docs.python-guide.org/shipping/freezing/
Also, there is a new option called PyOxidizer ( https://pyoxidizer.readthedocs.io/en/stable/overview.html ).
At work we used cx_Freeze, which worked OK.
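For reference, a minimal cx_Freeze setup script looks roughly like this (the name and entry script are placeholders):

from cx_Freeze import setup, Executable

setup(
    name="myapp",                         # placeholder application name
    version="0.1",
    description="Example frozen app",
    executables=[Executable("app.py")],   # placeholder entry script
)

Running python setup.py build then produces a self-contained folder under build/ that can be zipped up or fed to an installer builder.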
Unfortunately, there is no easy way. Have a look at the options, and then decide on one.

Python testing / TeamCity integration / sane package management

My colleagues and I are currently taking baby steps to automate the testing of an embryonic Python codebase, but we've run into some problems with environment setup and package management. Any help is appreciated, as we've not done this before with Python (and it looks a bit... fractured).
Requirements:
Tests are runnable via a script (Nose looks quite good)
It runs on Windows machines
It is runnable via TeamCity as well as on standard developer machines. Good TeamCity reporting/integration would be a bonus.
We should be able to invoke the scripts and get correct, repeatable results on multiple machines.
All dependencies/package requirements are met in a simple, repeatable fashion (we do this with our main codebase using Ruby and Bundler, and are struggling to repeat the trick with Python). If people have to go about manually installing eggs / using easy_install etc., it's going to be hellish. You should just be able to invoke a script that says "please make sure these dependencies are accounted for, then run our tests".
Ideally the workflow should work like this (ignoring how we install/get python for the moment):
Windows machine syncs up to our SCM
Machine runs a script to ensure that all python dependencies (Shapely etc.) are accounted for
Machine is able to invoke a script that runs nose or some other test runner
The script returns a value to indicate whether the build failed
Bonus points question:
We are willing to install Python on each dev machine / build agent rather than checking it in to source control, though it would be nice if we could just check it in and forget about it. Our best bet on this front thus far is just to check the Python install directory into SCM along with the pythonxx.dll found in Windows/System32, but I'm not sure if this is a flawed approach.
We've spotted Movable Python and Portable Python. Any idea what the best approach is? Like I said, we are willing to just bite the bullet and install Python on each machine using an .msi if this is not viable.
Cheers!
In the absence of anything better, we prodded at the problem via trial and error. At the moment, our codebase is tiny, so there may be flaws in this approach that have yet to be exposed, but we're going with it for the moment.
In the end, we:
Added PortablePython 2.7 to SCM. This just-works(tm).
Added curl to SCM
Created a Windows .bat file that invokes curl and pulls down and installs Python setuptools. We then install pip via setuptools (meta-packaging ahoy). Pip is then run against a requirements file. The pip requirements file contains nose (a Python test runner) and teamcity-nose (TeamCity CI integration for nose), amongst other things.
We then invoked the test runner, which automagically hoovered up our tests without any hassle. Not bad!
Also going to check out virtualenv in future.
Edit (N years into the future):
Pip comes as standard in Python 2.7.9 and beyond. The easiest way to do things these days (IMO) is something like:
use pip out of the box
install virtualenv
create a virtualenv for the project & activate it
install dependencies using pip (occasionally you have to fall back to easy_install, unfortunately)
disco.
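On Windows, that workflow looks roughly like this (the requirements file name is an assumption):

pip install virtualenv
virtualenv venv                     # create an isolated environment in .\venv
venv\Scripts\activate               # on Unix: source venv/bin/activate
pip install -r requirements.txt     # install dependencies into the venv only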
Also, don't use Portable Python unless you really need it. Really wish we'd not settled for that.

Best practices for Python deployment -- multiple versions, standard install locations, packaging tools etc

Many posts on different aspects of this question but I haven't seen a post that brings it all together.
First, a subjective statement: it seems like the simplicity we experience when working with the Python language is shot to pieces when we move outside the interpreter and start grappling with deployment issues. How best to have multiple versions of Python on the same machine? Where should packages be installed? Distutils vs. setuptools vs. pip, etc. It seems like the Zen of Python is being abused pretty badly when it comes to deployment. I'm feeling eerie echoes of the "DLL hell" experience on Windows.
Do the experts agree on some degree of best practice on these questions?
Do you run multiple versions of Python on the same machine? How do you remain confident that they can co-exist, and that the newer version doesn't break the assumptions of other processes that rely on the earlier version (scripts provided by the OS vendor, for example)? Is this safe? Does virtualenv suffice?
What are the best choices for locations for different components of the Python environment (including 3rd party packages) on the local file system? Is there a strict or rough correspondence between locations for many different versions of Unixy and Windows OS's that can be relied upon?
And the murkiest corner of the swamp -- what install tools do you use (setuptools, distutils, pip etc.) and do they play well with your choices re: file locations, Python virtual environments, Python path etc.
These sound like hard questions. I'm hopeful the experienced Pythonistas may have defined a canonical approach (or two) to these challenges. Any approach that "hangs together" as a system that can be used with confidence (feeling less like separate, unrelated tools) would be very helpful.
I've found that virtualenv is the only reliable way to configure and maintain multiple environments on the same machine. It even has a way of packaging up an environment and installing it on another machine.
For package management I always use pip, since it works so nicely with virtualenv. It also makes it easy to install and upgrade packages from a variety of sources, such as git repositories.
I agree this is quite a broad question, but I'll try to address its many parts anyway.
About your subjective statement: I don't see why the simplicity and elegance of Python would imply that packaging and deployment matters should suddenly become simple. Some things related to packaging are simple, others are not, others could be. It would be best for users if we had one complete, robust and easy packaging system, but it hasn't turned out that way. distutils was created and then its development paused; setuptools was created and added new solutions and new problems; distribute was forked from setuptools because of social problems; and finally distutils2 was created to make one official, complete library. (More on Differences between distribute, distutils, setuptools and distutils2?) The situation is far from ideal for developers and users, but we are working on making it better.
How best to have multiple versions of Python on the same machine? Use your package manager if you're on a modern OS, use "make altinstall" if you compile from source on UNIX, or use the similar non-conflicting installation scheme if you compile from source on Windows. As a Debian user, I know that I can call individual versions by using "pythonX.Y", and that which versions the defaults ("python" and "python3") point to is decided by the Debian developers. A few OSes have started to break the assumption that python == python2, so there is a PEP in progress to bless or condemn that: http://www.python.org/dev/peps/pep-0394/ Windows seems to lack a way to use one Python version as the default, so there's another PEP: http://www.python.org/dev/peps/pep-0397/
Where should packages be installed? Using distutils, I can install projects into my user site-packages directory (see PEP 370 or docs.python.org). What exactly is the question?
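As an aside, the per-user scheme of PEP 370 is what pip's --user flag targets; the package name below is just an example:

pip install --user requests   # installs into the PEP 370 user site-packages

This requires no administrator rights and leaves the system Python untouched.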
Parallel installation of different versions of the same project is not supported. It would need a PEP to discuss the changes to the import system and the packaging tools. Before someone starts that discussion, using virtualenv or buildout works well enough.
I don’t understand the question about the location of “components of the Python environment”.
I mostly use system packages (i.e. using the Aptitude package manager on Debian). To try out projects, I clone their repository. If I need something that's not available with Aptitude, I install it (or put a .pth file pointing to the repo) in my user site-packages directory. I don't need a custom PYTHONPATH, but I have changed the location of my user site-packages with PYTHONUSERBASE. I don't like the magic and the eggs concept in setuptools/distribute, so I don't use them. I've started to use virtualenv and pip for one project, though (they use setuptools under the cover, but I made a private install so my global Python does not have setuptools).
One resource for this area is the book Expert Python Programming by Tarek Ziade. I'm ambivalent about the quality of the book, but the topics covered are just what you're focusing on.
