Proper Chef way to use Poise-installed Python/Ruby

We are trying to use Poise to manage runtimes for Python and Ruby on our CentOS 7 servers. From my understanding this works with other recipes, but I can't figure out the "right" way to link the binaries to the standard bin locations (/usr/bin/, etc.). So far I have been unable to find a way to do this as part of the standard process - only by digging around to figure out where the runtimes were installed and then adding those links as a separate step later in the recipe, which feels like a major hack.
In other words, adding the following to a recipe whose scripts (copied to the server) require Python 3 does appear to install Python 3:
python_runtime '3'
But the scripts (which cannot be changed) will never know that Python 3 exists.
Everything obviously works fine if I just install Python 3 using yum - which Poise actually appears to do as well on CentOS.
I am relatively new to Chef, but I have checked with our other devops team members and done a lot of searching, and we couldn't figure out how this is officially supposed to be done. We aren't looking for more hacks - we can obviously do that ourselves - but what is the "Chef" way to do this?
Thanks in advance.

Unfortunately, just linking the binaries wouldn't really help you much, since by default on CentOS it will use the SCL packages, which require some special environment variables to operate. If you want it to use the "normal" system packages, you can do this:
python_runtime '3' do
  provider :system
end
However, that will probably fail because there is no EL7 distro package for Python 3. If you want to continue using SCL packages but have them look like normal binaries, maybe try something like this:
file '/usr/local/bin/python' do # or .../python3 if you prefer
  owner 'root'
  group 'root'
  mode '755'
  # Thin wrapper that forwards all arguments to the SCL-provided python3
  content "#!/bin/sh\nexec scl enable rh-python35 -- python3 \"$@\""
end
Or something like that. That still hardwires the fact that it is SCL under the hood and which SCL package is being used, which is not lovely, but the fully generic form (while doable) is a lot more complex.

Related

How to manage multiple IronPython versions?

I have the following issue:
I have some software that installs a particular version of IronPython into the GAC. But I need to install a newer version of IronPython without affecting the GAC. Hence the need to somehow use pyenv on Windows with IronPython.
I am not from a programming background, more of a brick and mortar background, so please bear with me here.
[pyenv-win][1] doesn't support IronPython yet, and given my background, I have no idea how to modify a Git repository and then install it (I'm trying to learn all that, but first I need to set this environment up so that I don't mess things up; it's a vicious cycle :P).
I downloaded a copy of the code and was looking into how it addresses different Python versions.
It seems that in [this file][2] there are variables:
mirror, mirrorEnvPath, listEnv
that point to the location of the exe that pyenv-win uses to install and maintain Python versions.
So I somehow need to add an IronPython mirror location and another array to the list, something like this:
ironmirror = "https://github.com/IronLanguages/ironpython2/releases/download"
and add a line to listEnv
Array("ipy-2.7.9", ironmirror&"/ipy-2.7.9/", "IronPython-2.7.9.msi", "x64")_
That's as far as I can get. If someone could help me put all this together, that would be nice. I have no idea how to run this modified code from my local hard drive. I would also like to somehow add this functionality to the package on GitHub so that others can use it too. Also, I am not sure whether I am allowed to look into the code that others have shared and modify it - this is a new world for me. Apologies if that's the case.
Any help is appreciated.
[1]: https://github.com/pyenv-win/pyenv-win
[2]: https://github.com/pyenv-win/pyenv-win/blob/master/pyenv-win/libexec/pyenv-install.vbs

Python - packaging a source distribution

I'm currently writing a Python program and I want to distribute it to some end users (and developers). I would like to reduce the number of steps necessary to run the program to a minimum.
My use case is relatively simple. I'd like the process/tool/whatever to:
A) Download the list of packages required for the application to work.
B) Run a list of Python scripts sequentially (e.g. create the database and then run migrations).
I understand that distlib does some of this already. However, I find the documentation kind of confusing: there seems to be an API to install scripts, but not one to execute them automatically.
Ideally I would specify a list of scripts, and a list of dependencies and have the program install them automatically.
Maybe the best way to tackle this would be to use make with a Makefile (https://www.gnu.org/software/make/).
Distlib, via the setup.py file, would help you make it more readable by giving names to some Python scripts. And you could make use of make's target/dependency system to execute tasks sequentially.
If you want to stick to python, you could also use Luigi (https://luigi.readthedocs.io/en/stable/) but it seems like overkill here.
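If the Luigi route sounds interesting, here is a minimal, hypothetical sketch of the two tasks from the question - the class names, flag files, and task bodies are placeholders I made up, not code from the project:
import luigi

class CreateDatabase(luigi.Task):
    # A marker file tells Luigi this task has already run
    def output(self):
        return luigi.LocalTarget('db_created.flag')

    def run(self):
        # ... create the database here ...
        with self.output().open('w') as f:
            f.write('done')

class RunMigrations(luigi.Task):
    # Declaring the dependency makes Luigi run CreateDatabase first
    def requires(self):
        return CreateDatabase()

    def output(self):
        return luigi.LocalTarget('migrations_done.flag')

    def run(self):
        # ... run the migrations here ...
        with self.output().open('w') as f:
            f.write('done')

if __name__ == '__main__':
    luigi.build([RunMigrations()], local_scheduler=True)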
OK, so I ended up writing my own thing, based on how I wanted the interface to look. The code that installs the application looks like this:
from installtools import setup
scripts = ['create_database.py', 'run_migrations.py']
setup("Shelob", "requirements.txt", scripts)
The full script can be found here: https://gist.github.com/fdemian/808c2b95b4521cd87268235e133c563f
Since pip doesn't have a public API (and isn't likely to have one in the near future), the script uses the subprocess API to call:
pip install -r [requirements_file_path]
After that, it calls the specified Python scripts one by one. While it is probably not very robust, as a stopgap solution it seems to do the trick.
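For reference, a bare-bones sketch of what such a helper could look like - this is my own illustration of the approach, not the code from the gist, and the error handling is deliberately minimal:
import subprocess
import sys

def setup(app_name, requirements_file, scripts):
    # Install dependencies by shelling out to pip (it has no public API)
    subprocess.check_call(
        [sys.executable, '-m', 'pip', 'install', '-r', requirements_file]
    )
    # Run the setup scripts sequentially; check_call stops on the first failure
    for script in scripts:
        print("[{}] running {}".format(app_name, script))
        subprocess.check_call([sys.executable, script])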

Do I make an egg when I want to deploy a Python script with custom imports to a new host machine?

Kind of new to Python. I have been working through some development and have been working on some scripts in PyCharm. Anyway, I have been doing a lot of custom installs on my local machine, leveraging things like pythoncom or pyHook. Those may not be on my target machine, so instead of just running the Python script, I am thinking I should turn it into an egg file that I just run on the machine instead?
I figure it would do what a jar does and build all the imports etc. into the egg so it is self-contained and will run correctly.
Is that true? Is that the best course of action? I was tempted to do something like npm install but for Python, though a lot of the things my modules depend on aren't really found on pip but instead on SourceForge.
How would I best do this? Is making an egg the way I am supposed to do it, or will I want to build some sort of required structure (sort of like a package.json file for npm, which I then just use when I clone the repo)?
I was hoping to do it all by way of a repo, but I don't know what the Python development/distribution standard is.

Easy, straightforward way to package a Python program for Debian?

I'm having trouble navigating the maze of distribution tools for Python and Debian: cdbs, debhelper, python-support, python-central, blah blah blah...
My application is a fairly straightforward one - a single Python package (a directory containing modules and an __init__.py), a script for running the program (script.py), some icons (.png), and menu items (.desktop files).
From these files, how can I construct a simple, clean .deb file from scratch without using the nonsensical tools listed above?
I'm mainly targeting Ubuntu, but I'd like it if the package also worked on straight Debian.
python-stdeb should work for you. It's in Debian testing/unstable and Ubuntu (Lucid onwards):
apt-get install python-stdeb
It is less a shortcut method than a tool that tries to generate as much of the source package as possible. It can actually build a package that both works properly and is almost standards-compliant. If you want your package to meet the quality standards for inclusion in Debian, Ubuntu, etc., you will need to fill out files like debian/copyright, etc.
As much as people claim cdbs is really easy, I'd like to point out that the rules file Nick mentioned could easily have been done with debhelper7. Not to forget, dh7 can be customized far more easily than cdbs can.
#!/usr/bin/make -f
%:
	dh $@
Note: You should check whether your package meets the Debian Policy, Debian Python Policy, etc. before you submit it to Debian. You will actually need to read the documents for that - no shortcut.
First, the answer is that there is no straightforward way to make a dpkg, and the documentation is parceled out in a million tiny morsels from as many places. However, the Ubuntu Python Packaging Guide is pretty useful.
For simple packages (ones easy to describe to setuptools), the steps are pretty simple once you have a debian control structure set up:
Run setup.py sdist --prune and also make sure to set --dist-dir to something reasonable
Invoke dpkg-buildpackage with the proper options for your package (probably -b at least)
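For the setup.py side of those steps, a minimal setup.py for a layout like the one in the question might look roughly like this - the project name, icon, and .desktop paths are placeholders, not taken from the question:
from setuptools import setup

setup(
    name='myapp',                # placeholder project name
    version='0.1.0',
    packages=['myapp'],          # the single Python package directory
    scripts=['script.py'],       # the launcher script mentioned in the question
    data_files=[
        ('share/icons/hicolor/48x48/apps', ['icons/myapp.png']),  # example icon
        ('share/applications', ['myapp.desktop']),                # example menu entry
    ],
)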
You will need a debian/rules file for buildpackage to work from, but luckily the work is done for you if you use cdbs; you'll want something very similar to:
#!/usr/bin/make -f
DEB_PYTHON_SYSTEM := pysupport
include /usr/share/cdbs/1/rules/debhelper.mk
include /usr/share/cdbs/1/class/python-distutils.mk
If you're not using distutils, you might want to take a look at the DebianPython/Policy page on the wiki (under "CDBS + the hard way"). There is also a pycentral option for DEB_PYTHON_SYSTEM, which you can google if you want more information.

What possibilities exist to build an installer for a Windows application on Linux (install target = Windows, build environment = Linux)?

After playing around with NSIS (Nullsoft Scriptable Install System) for a few days, I really feel the pain its use brings me. No wonder - the authors claim its scripting implementation is a "mixture of PHP and Assembly".
So I hope there is something better for writing installation procedures that get Windows programs installed, while creating the installation package on Linux.
But I have not found anything yet. WiX looks promising, but does not really seem to run on Linux, and Python can create .msi files - but only when running on Windows.
IzPack is out of the game because it requires Java for the installer to run on the target system.
Our app to be installed is a Python app (and I'm even thinking about scripting the whole install myself in Python).
Any other ideas?
Forgot to say: free/open-source apps preferred.
Not only because of cost, but because of the power to control and adjust everything.
We might be willing to pay for professional support if it helps us reach our goals quickly, but we also want full control over the build system.
You may be interested in BitRock.
You might try looking at InstallAnywhere, but it may require Java.
Try running InnoSetup under Wine. It should work unless you have some very specific needs. InnoSetup is open source, BTW.
It seems that PyInstaller might do the trick. I'm also looking for something like what you need. I have not tried it yet...
