Python - packaging a source distribution

I'm currently writing a Python program and I want to distribute it to some end users (and developers). I would like to reduce the number of steps necessary to run the program to a minimum.
My use case is relatively simple. I'd like the process/tool/whatever to:
A) Download the list of packages required for the application to work.
B) Run a list of Python scripts, sequentially (e.g. create the database and then run migrations).
I understand that distlib does some of this already. However, I find the documentation kind of confusing: there seems to be an API for installing scripts, but not one for executing them automatically.
Ideally I would specify a list of scripts, and a list of dependencies and have the program install them automatically.

Maybe the best way to tackle this would be to use make with a Makefile (https://www.gnu.org/software/make/).
Distlib, via the setup.py file, would help make the Makefile more readable by giving names to your Python scripts. And you could use make's target/dependency system to execute tasks sequentially.
If you want to stick to Python, you could also use Luigi (https://luigi.readthedocs.io/en/stable/), but it seems like overkill here.

OK, so I ended up writing my own thing, based on how I wanted the interface to look. The code that installs the application looks like this:
from installtools import setup
scripts = ['create_database.py', 'run_migrations.py']
setup("Shelob", "requirements.txt", scripts)
The full script can be found here: https://gist.github.com/fdemian/808c2b95b4521cd87268235e133c563f
Since pip doesn't have a public API (and isn't likely to get one in the near future), the script uses the subprocess module to call:
pip install -r [requirements_file_path]
After that, it calls the specified Python scripts, one by one. While it is probably not very robust, as a stopgap solution it seems to do the trick.

Related

Proper Chef way to use Poise installed Python/Ruby

We are trying to use Poise to manage runtimes for Python and Ruby on our CentOS 7 servers. From my understanding this works with other recipes, but I can't figure out what the "right" way is to link the binaries to the standard bin locations (/usr/bin/, etc.). So far I have been unable to find a way to do this as part of the standard process - only by digging around to figure out where they were installed and then adding those links as a separate step later in the recipe, which seems like a major hack.
In other words, in a recipe that copies some scripts requiring Python 3 to the server, adding the following looks like it installs Python 3:
python_runtime '3'
But the scripts (which cannot be changed) will never know that Python 3 exists.
Everything obviously works fine if I just install Python 3 using yum - which Poise actually appears to do as well on CentOS.
I am relatively new to Chef, but I have checked with our other devops team members and done a lot of searching and we couldn't figure out how this is officially supposed to be done. We aren't looking for more hacks as we can obviously do that, but what is the "Chef" way to do this?
Thanks in advance.
Unfortunately just linking the binaries wouldn't really help you much since by default on CentOS it will use the SCL packages which require some special environment variables to operate. If you want it to use the "normal" system you can do this:
python_runtime '3' do
  provider :system
end
However that will probably fail because there is no EL7 distro package for Python 3. If you want to continue using SCL packages but have them look like normal binaries, maybe try something like this:
file '/usr/local/bin/python' do # or .../python3 if you prefer
  owner 'root'
  group 'root'
  mode '755'
  content "#!/bin/sh\nexec scl enable rh-python35 -- python3 \"$@\""
end
Or something like that. That still hardwires the fact that it is SCL under the hood and which SCL package is being used, which is not lovely, but the fully generic form (while doable) is a lot more complex.

Do I make an egg when I want to deploy a Python script with custom imports to a new host machine?

Kind of new to Python. I have been working through some development and writing some scripts in PyCharm. Anyway, I have been doing a lot of custom installs on my local machine, leveraging things like pythoncom or pyHook. Those may not be on my target machine, so instead of just running the Python script, I am thinking I should turn it into an egg file that I just run on the machine instead?
I figure it would just do what a jar does and build all the imports etc. into the egg so it is self-contained and will run correctly.
Is that true? Is that the best course of action? I was tempted to do something like npm install but for Python, though a lot of the things within my defined modules aren't really available through pip but are instead on SourceForge.
How would I best approach this? Is making an egg the way I am supposed to do it, or will I want to build some sort of required structure (sort of like a package.json file for npm) and then just run that when I clone a repo?
I was hoping to do it all by way of a repo, but I don't know what the Python development/distribution standard is.

distutils setup script under linux - permission issue

So I created a setup.py script for my Python program with distutils, and I think it behaves a bit strangely. First off, it installs all data_files into /usr/local/my_directory by default, which is a bit weird since this isn't really a common place to store data, is it?
I changed the path to /usr/share/my_directory/. But now I'm not able to write to the database inside that directory, and I can't set the required permissions from within setup.py either, since the actual database file has not been created when I run it.
Is my approach wrong? Should I use another tool for distributing?
Because at least for Linux, writing a simple setup sh script seems easier to me at the moment.
The immediate solution is to invoke setup.py with --prefix=/the/path/you/want.
A better approach would be to include the data as package_data. This way it will be installed alongside your Python package, and you'll find it much easier to manage (finding paths, etc.).
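A sketch of what that could look like (the package name `mypkg` and the data file path here are placeholders):

```python
# setup.py -- illustrative sketch; "mypkg" and "data/app.db" are made-up names
from distutils.core import setup

setup(
    name="my_program",
    version="1.0",
    packages=["mypkg"],
    # files listed in package_data are installed inside the package directory,
    # so they can be located relative to the package at runtime
    package_data={"mypkg": ["data/app.db"]},
)
```

At runtime you can then find the file with something like `os.path.join(os.path.dirname(mypkg.__file__), "data", "app.db")` instead of hard-coding an install prefix. Note that files under site-packages are usually not writable by ordinary users either, so for a database that must be written to, a common approach is to copy the pristine file to a per-user location (e.g. somewhere under the user's home directory) on first run.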

How to distribute a Python program with external libraries

I have a Python program that uses the following libraries: Tkinter, NumPy, SciPy, SymPy, and Matplotlib. And probably it will include more libraries in the near future while being developed.
How can I distribute such a program to Mac, Windows, and Linux users, without requiring users to install the right version of each library, and hopefully by just downloading a single file and executing it?
What I initially wanted was compiling the program into a static binary program, but it seems that it's not an easy goal.
Probably I can ask users to install Python, but that's the maximum requirement that I can ask for them, and I want to avoid it if possible.
I don't care about hiding the code at all; in the end I will distribute both the code and the program. What I want is to make the program very easy for any user to download and run it.
Even advice such as 'a Python program is not suitable for this kind of distribution' is welcome. I have had a fair amount of experience distributing C programs, but I don't know what to expect with a Python program.
For convenience, you could try something like PyInstaller.
It will package all the needed modules into one folder or one executable, as you like. And it works on all major platforms (note that it does not cross-compile, so you have to build on each target platform).
The simple command to make a directory containing an executable file and all the needed libraries is
$ pyinstaller --onedir --name=directory_name --distpath="path_to_put_that_directory" "path to your main_program.py"
You can change --onedir to --onefile to turn that folder into a single executable file containing everything it needs to run.
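One caveat with --onefile: bundled data files are unpacked to a temporary directory at run time, so paths relative to the script stop working. A commonly used helper for this (sketched here; `icon.png` is just an example file name) looks like:

```python
import os
import sys

def resource_path(relative_path):
    """Resolve a data file both in development and in a PyInstaller bundle."""
    # PyInstaller's --onefile mode unpacks bundled files into a temporary
    # directory and records its location in sys._MEIPASS; in development
    # we fall back to the current directory
    base_path = getattr(sys, "_MEIPASS", os.path.abspath("."))
    return os.path.join(base_path, relative_path)

icon_file = resource_path("icon.png")  # placeholder file name
```

This keeps the same code path working whether the program is run from source or from the frozen executable.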
You can use Setuptools to do the packaging.
It creates eggs, which are roughly the Python equivalent of jars.
https://pypi.python.org/pypi/setuptools#using-setuptools-and-easyinstall
https://pythonhosted.org/setuptools/setuptools.html
You can have a look at py2exe (Windows only), even though you risk the application becoming bigger than it already is, and some packages need to be installed manually.

Easy, straightforward way to package a Python program for Debian?

I'm having trouble navigating the maze of distribution tools for Python and Debian: cdbs, debhelper, python-support, python-central, and so on.
My application is a fairly straightforward one - a single Python package (a directory containing modules and an __init__.py), a script for running the program (script.py), and some icons (.png) and menu items (.desktop files).
From these files, how can I construct a simple, clean .deb file from scratch without using the nonsensical tools listed above?
I'm mainly targeting Ubuntu, but would like it if the package worked on straight Debian.
python-stdeb should work for you. It's in Debian testing/unstable and Ubuntu (Lucid onwards):
apt-get install python-stdeb
It is less a shortcut method than a tool that tries to generate as much of the source package as possible. It can actually build a package that both works properly and is almost standards-compliant. If you want your package to meet the quality standards for inclusion in Debian, Ubuntu, etc., you will need to fill out files like debian/copyright and so on.
As much as people claim cdbs is really easy, I'd like to point out that the rules file Nick mentioned could easily have been done with debhelper7. Not to forget, dh7 can be customized far more easily than cdbs can.
#!/usr/bin/make -f
%:
	dh $@
Note: You should check whether your package meets the Debian Policy, Debian Python Policy, etc before you submit to Debian. You will actually need to read documents for that - no shortcut.
First, the answer is that there is no straightforward way to make a dpkg, and the documentation is parceled out in a million tiny morsels from as many places. However, the Ubuntu Python Packaging Guide is pretty useful.
For simple packages (ones easy to describe to setuptools), the steps are pretty simple once you have a debian control structure set up:
Run setup.py sdist --prune, and also make sure to set --dist-dir to something reasonable
Invoke dpkg-buildpackage with the proper options for your package (probably -b at least)
You will need a debian/rules file for buildpackage to function, but luckily the work is done for you if you use cdbs; you'll want something very similar to:
#!/usr/bin/make -f
DEB_PYTHON_SYSTEM := pysupport
include /usr/share/cdbs/1/rules/debhelper.mk
include /usr/share/cdbs/1/class/python-distutils.mk
If you're not using distutils, you might want to take a look at the DebianPython/Policy page on the wiki (under "CDBS + the hard way"). There is a pycentral option for DEB_PYTHON_SYSTEM as well, which you can google if you want more information.
