I have the following problem. I need to distribute our own version of python with some magic in it. In order to do this, the process is the following:
I build the python interpreter (on a redhat linux)
install it somewhere
tar.gz the whole thing
when it's time to make the user-package, unpack the tar.gz into the directory which will become the user package
tar.gz the user package directory
put the tar.gz on the web
This is the method I have to use. Good, bad? I don't know; I have little experience as a packager, and in any case I can't propose a change. This is the way they've always done it.
It turns out that when the user unpacks this tar.gz on SuSE and tries to run the Python setuptools (which was installed together with Python), the hashlib module raises an exception. What I found out is that when building Python on RedHat, the configure script finds the OpenSSL library, which in turn makes it skip building shamodule.c, md5.c and so on, and instead compile _hashopenssl.c (the _hashlib module) against the OpenSSL library. Apparently the OpenSSL 0.9.7 on SuSE and the 0.9.8 on RedHat are somehow different, so for some reason the _hashlib module raises an ImportError when imported on SuSE. That leads hashlib to try importing _md5, _sha and _sha256, which are not there, because on RedHat there was no reason to compile them (since OpenSSL was jolly good there).
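For what it's worth, a quick way to see which of these modules a given interpreter can actually import is a little diagnostic like this (just a sketch; the module list mirrors hashlib's fallback order on Python 2.x):

# Try hashlib's backends in the order hashlib itself falls back through.
for name in ("_hashlib", "_md5", "_sha", "_sha256", "_sha512"):
    try:
        __import__(name)
        print("%s: OK" % name)
    except ImportError as exc:
        print("%s: missing (%s)" % (name, exc))

Run under the RedHat build and on the SuSE target, the output shows exactly which backends each side is missing.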
Does anyone know how to solve this problem? As I said, my experience as a packager is the bare minimum, so any hints and proposals are welcome, and I will try to deploy them as far as our legacy process allows.
"Does anyone know how to solve this problem?"
You can't, really. If it's not the OpenSSL library that's a problem, it could be the C library itself, or some other critical component. Your best solutions are either:
(A) Build a version of Python for each operating system you wish to support, or
(B) Rework your code to use the native system Python on each platform.
A third alternative is to create a completely self-contained build environment, so that when building under RedHat you're not using the system OpenSSL library but one you built yourself. This will work for just about everything other than the C library, but it can be tricky to set up. The idea is to minimize the coupling between your package and the system libraries.
If you're only supporting RedHat and SUSE, you could conceivably pursue option (A) by crafting an appropriate spec file and building binary packages for each platform. This would be a nice way to package everything up.
You should consider distributing a source RPM of your version of Python, rather than a binary tarball. You can take an existing Python release and repackage it with a patch of your changes. There's more detail on how to do this in the RPM Book.
I want to package a Python module containing Python source and a native C++ library. cppyy is used to dynamically generate the bindings, so the library is really just a normal library. The build system for the library is Meson and should not be replaced. The whole thing is in a git repository. I only care about Linux.
My question is how to get from this to “pip install url_to_package builds/installs everything.” in the least complicated way possible.
What I’ve tried:
Extending setuptools with a custom build command:
…that executes meson compile and copies the result into the right place. But pip install performs its work in some random split-off temporary directory, and I can't find my C++ sources from there (a rough sketch of this attempt follows after this list).
The Meson python module:
…can build my library and install files directly into some python env. Does not work with pip and has very limited functionality.
Wheels:
…are incredibly confusing and overkill for me. I will likely be the only user of this module. Actually, all I want is to easily use the module in projects that live in different directories…
Along the way, I also came across different CMake solutions, but those are disqualified because of my build system choice. What should I do?
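For reference, the custom-command attempt looked roughly like this (a sketch only; the package name mypkg, the library name libfoo.so and the build directory are made up). Resolving every path relative to setup.py via __file__, rather than relying on whatever working directory pip happens to use, is the detail that is easy to get wrong:

import os
import shutil
import subprocess
from setuptools import setup
from setuptools.command.build_py import build_py

# Anchor all paths to this file, not to the current working directory.
HERE = os.path.dirname(os.path.abspath(__file__))

class MesonBuildPy(build_py):
    def run(self):
        build_dir = os.path.join(HERE, "builddir")
        if not os.path.isdir(build_dir):
            subprocess.check_call(["meson", "setup", build_dir, HERE])
        subprocess.check_call(["meson", "compile", "-C", build_dir])
        build_py.run(self)  # copies the pure-Python sources first
        # then drop the freshly built library into the package being installed
        shutil.copy2(os.path.join(build_dir, "libfoo.so"),
                     os.path.join(self.build_lib, "mypkg", "libfoo.so"))

setup(name="mypkg", packages=["mypkg"], cmdclass={"build_py": MesonBuildPy})

This assumes the C++ sources are actually present in whatever tree pip builds from (i.e. they ship with the sdist, or pip builds in-place).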
I am new to Python having come from a proprietary compiled language (Xojo) that produces self-contained executables.
I understand that Python is an interpreted language. I understand that it requires an interpreter (let's stick with CPython) and presumably a number of accessory frameworks/C libraries in order to run. What I don't understand is why it's so hard to create a folder containing the interpreter and all required files and libraries and simply bundle these up with my script for distribution.
I have discovered that there are a bunch of tools that attempt to do this (py2app, cx_freeze, etc.), but many of them seem either broken, unmaintained or really buggy.
I guess my question is: is there any documentation that describes the exact things I need to bundle with a “Hello World” script to get it running? This seems to be a really straightforward problem to solve but it hasn’t been (which suggests that it is far more complex than I appreciate).
My understanding is that PyInstaller works fine for making a single exe for distribution. But barring packaging tools like that, in general, there isn't an obvious "bare minimum"; the modules don't have documented dependencies, so it's usually best to ship the whole standard library.
Typically, if you need a redistributable version, you use the embedded Python zip redistributable, shipping Python alongside your main application.
The exact list of files/libraries depends on how the Python interpreter was built. On Windows, for example, you can obtain CPython binaries built with Visual Studio, Cygwin or MinGW-w64. They have different dependencies, of course. On Linux distributions, Python is normally installed by default.
Below is the list of .dll and .exe files that you can find in the official CPython embeddable package for Windows:
libcrypto-1_1-x64.dll
libssl-1_1-x64.dll
python.exe
pythonw.exe
python3.dll
python37.dll
sqlite3.dll
vcruntime140.dll
The total size of this ZIP release is only 6.7 MB, so it would be easy to bundle it with your main executable. You can use whatever bundler is at hand, not necessarily one designed for Python. Quoting from the documentation:
extracting the embedded distribution to a subdirectory of the application installation is sufficient to provide a loadable Python interpreter.
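For instance, a host application might invoke the bundled interpreter along these lines (a sketch; the python-embed directory name and app.py are hypothetical):

import os
import subprocess

here = os.path.dirname(os.path.abspath(__file__))
# the embeddable ZIP was extracted into the python-embed subdirectory
embedded = os.path.join(here, "python-embed", "python.exe")
subprocess.check_call([embedded, os.path.join(here, "app.py")])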
I feel the absolute best way for beginners to experience Python is Thonny and an ESP32.
A very good way to get started with Python is to use Anaconda https://www.anaconda.com/distribution/#download-section - this distribution contains the CPython interpreter and the most commonly used packages. For quite a while you will get by without installing more packages.
To make a simple distributable piece of code, just include a requirements.txt along with your code, listing the packages (and versions) you use.
More on that here: https://www.idiotinside.com/2015/05/10/python-auto-generate-requirements-txt/
pip freeze dumps every package in your running environment (a superset of what your code actually needs), so you would ideally go with the second, smarter option in the link: pipreqs.
So, in short: along with your code, an additional requirements.txt is all you need; people can then install all the required packages with
pip install -r requirements.txt
and they are good to go to run your code.
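A requirements.txt is just a plain text file listing one distribution per line, optionally pinned to versions; for example (names and versions purely illustrative):

numpy==1.16.4
requests>=2.20,<3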
For advanced scenarios you might want to look up creating virtual environments using conda.
What is a conda environment?
https://docs.conda.io/projects/conda/en/latest/user-guide/concepts.html#conda-environments
How to create/manage a conda environment
https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
All the best in your Python journey!
I finally managed to install numpy, but it seems to only work in Python 2.6. I don't know how to install it in the 2.7 folder (I've been trying for hours, but I'm just a beginner developer in my first months). Anyway, if I use Python 2.7 and append the absolute path to sys.path, could there be problems?
Any suggestions?
Thank you.
It could partially work but this is a bad idea. Just don't do it. Even if it seems to work, it may not. And if it really does, then it will fail randomly in the future.
These are the potential problems that come into my mind:
Extensions (those written in C, C++ etc.) are specific to a particular Python version; and numpy has a few extensions, AFAICS. It will work only if you don't use any of them (i.e. use pure Python modules);
Python compiles modules into bytecode, and the bytecode is specific to a particular Python version. If you use modules from the python2.6 directory in python2.7, the compiled files will collide. I doubt this will cause major problems beyond the fact that they will be recompiled every time the Python version is switched (see the quick check below);
Python code can be version-specific. That's unlikely across minor versions (though, for example, Python 2 and 3 have serious differences), but it can still happen. In other words, the modules installed for Python 2.6 can actually be a bit different from those for Python 2.7;
If you change the loading order, Python 2.7 may start loading some standard modules from Python 2.6. It could work, or it could cause random breakage;
It will make all modules installed for Python 2.6 visible. That can cause a few random switches somewhere with unpredictable results. I doubt there's anything specific to that version, but some modules may actually decide to use some kind of deprecated interface upon finding it visible.
There could be more. You may actually try but be prepared that you may waste a lot of time trying to find out why something does not work as expected later.
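To illustrate the bytecode point above, compare the magic number each interpreter stamps into its compiled files; run this quick check under both versions:

import imp
print(repr(imp.get_magic()))  # differs between python2.6 and python2.7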
And unless I'm missing something, I think installing numpy for Python 2.7 mostly involves running the setup using Python 2.7, like:
python2.7 setup.py clean
python2.7 setup.py build
python2.7 setup.py install
Depending on your particular install/system, it may be preferable to use the package manager, binary bundle or a tool like pip instead. If you'd like more details on how to enforce Python2.7 with those, you'd have to tell us which one is of your choice.
short version: how can I get rid of the multiple-versions-of-Python nightmare?
long version: over the years, I've used several versions of python, and what is worse, several extensions to python (e.g. pygame, pylab, wxPython...). Each time it was on a different setup, with different OSes, sometimes different architectures (like my old PowerPC mac).
Nowadays I'm using a mac (OSX 10.6 on x86-64) and it's a dependency nightmare each time I want to revive script older than a few months. Python itself already comes in three different flavours in /usr/bin (2.5, 2.6, 3.1), but I had to install 2.4 from macports for pygame, something else (cannot remember what) forced me to install all three others from macports as well, so at the end of the day I'm the happy owner of seven (!) instances of python on my system.
But that's not the problem. The problem is that none of them has the right (i.e. the same) set of libraries installed, some of them are 32-bit, some 64-bit, and now I'm pretty much lost.
For example, right now I'm trying to run a three-year-old script (not written by me) which used matplotlib/numpy to draw a real-time plot within a rectangle of a wxWidgets window. But I'm failing miserably: py26-wxpython from MacPorts won't install, stock Python has wxWidgets included but also has some conflict between 32-bit and 64-bit, and it doesn't have numpy... what a mess!
Obviously, I'm doing things the wrong way. How do you usually cope with all that chaos?
I solve this using virtualenv. I sympathise with wanting to avoid further layers of nightmare abstraction, but virtualenv is actually amazingly clean and simple to use. You literally do this (command line, Linux):
virtualenv my_env
This creates a new python binary and library location, and symlinks to your existing system libraries by default. Then, to switch paths to use the new environment, you do this:
source my_env/bin/activate
That's it. Now if you install modules (e.g. with easy_install), they get installed to the lib directory of the my_env directory. They don't interfere with existing libraries, you don't get weird conflicts, stuff doesn't stop working in your old environment. They're completely isolated.
To exit the environment, just do
deactivate
If you decide you made a mistake with an installation, or you don't want that environment anymore, just delete the directory:
rm -rf my_env
And you're done. It's really that simple.
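If you ever want to confirm which environment you're in, ask the interpreter itself from a Python prompt:

import sys
print(sys.executable)  # my_env/bin/python while the env is active
print(sys.prefix)      # points inside my_env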
virtualenv is great. ;)
Some tips:
on Mac OS X, use only the python installation in /Library/Frameworks/Python.framework.
whenever you use numpy/scipy/matplotlib, install the Enthought Python Distribution
use virtualenv and virtualenvwrapper to keep those "system" installations pristine; ideally use one virtual environment per project, so each project's dependencies are fulfilled. And, yes, that means potentially a lot of code will be replicated in the various virtual envs.
That seems like a bigger mess indeed, but at least things work that way. Basically, if one of the projects works in a virtualenv, it will keep working no matter what upgrades you perform, since you never change the "system" installs.
Take a look at virtualenv.
What I usually do is try to (progressively) keep up with Python versions as they come along (and once all of the external dependencies have correct versions available).
Most of the time the Python code itself can be transferred as-is, with only minor modifications needed.
My biggest Python project at work (15,000+ LOC) has now been on Python 2.6 for a few months (upgrading everything from Python 2.5 took most of a day, due to installing/checking 10+ dependencies...).
In general I think this is the best strategy with most of the interdependent components in the free software stack (think of the dependencies in the Linux software repositories): keep your versions (semi-)current, or at least progressing at the same pace.
install the Python versions you need, ideally from source
when you write a script, put the full Python version in its shebang line (such as #!/usr/local/bin/python2.6)
I can't see what could go wrong.
If something does, it's probably MacPorts' fault anyway, not yours (one of the reasons I don't use MacPorts anymore).
I know I'm probably missing something and this will get downvoted, but please leave at least a little comment in that case, thanks :)
I use the MacPorts version for everything, but as you note, a lot of the default versions are bizarrely old. For example, vim's omnicomplete in Snow Leopard has python25 as a dependency. A lot of Python-related ports have old dependencies, but you can usually request the newer version at build time, for example port install vim +python26 instead of port install vim +python. Do a dry run before installing anything to see if you are pulling in, for example, the whole of python24 when it isn't necessary. Check portfiles often, because the naming conventions from when DarwinPorts was getting off the ground leave something to be desired. In practice I just leave everything in the default /opt... folders of MacPorts, including a copy of the entire framework with duplicates of PyObjC etc., and just stick with one version at a time, retaining the option to return to the system default if things break unexpectedly. Which is all perhaps a bit too much work to avoid using virtualenv, which I've been meaning to get around to using.
I've had good luck using Buildout. You set up a list of which eggs and which versions you want. Buildout then downloads and installs private versions of each for you. It makes a private "python" binary with all the eggs already installed. A local "nosetests" makes things easy to debug. You can extend the build with your own functions.
On the down side, Buildout can be quite mysterious. Do "buildout -vvvv" for a while to see exactly what it's doing and why.
http://www.buildout.org/docs/tutorial.html
At least under Linux, multiple pythons can co-exist fairly happily. I use Python 2.6 on a CentOS system that needs Python 2.4 to be the default for various system things. I simply compiled and installed python 2.6 into a separate directory tree (and added the appropriate bin directory to my path) which was fairly painless. It's then invoked by typing "python2.6".
Once you have separate pythons up and running, installing libraries for a specific version is straightforward. If you invoke the setup.py script with the python you want, it will be installed in directories appropriate to that python, and scripts will be installed in the same directory as the python executable itself and will automagically use the correct python when invoked.
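You can also ask each interpreter where it puts libraries; running this under python2.6 and the system python shows that the two trees don't overlap:

import sys
from distutils.sysconfig import get_python_lib

print(sys.executable)    # which python this is
print(get_python_lib())  # its own site-packages directory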
I also try to avoid using too many libraries. When I only need one or two functions from a library (e.g. scipy), I'll often see if I can just copy them into my own project.
I need to make some Python applications for a work project. The target platform is AIX 5.3.
My question is: What version of Python should I be using?
My requirements are:
The Python version must be easy to install on the target machines. Others will do that according to instructions that I write, so no compiling from source or anything like that.
The Python version must have ncurses or curses support (I'm making a form handler).
I've found two different precompiled versions of Python for AIX, but one (2.1.something) didn't include the curses module, and the other (2.3.4, RPM format) had prerequisites that I failed to fulfill.
Any help would be greatly appreciated.
Use the AS Package of Python 2.6.3.7 from ActiveState. They have a binary package for AIX on their download site.
If you don't have an AIX machine to test it on, the install works the same way on Solaris or Linux, so you could write your documentation based on that. Basically, you gunzip the tarball, unpack the archive with tar, change into the unpacked directory, run a shell script to install it, tell the script what directory to place it in, and wait.
Normally this would be used to install into a user directory, without superuser permissions, but you could install it anywhere that you like. You might also need to edit the system profile in order to make sure that all users can find the Python binary.
I suggest the latest Python 2.6, because it has a lot of bugfixes, and there is now a critical mass of 3rd party libraries ported to it. Also, the standard library includes a lot of useful stuff that you used to have to collect separately. Curses is in the standard library of Python 2.6.
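As a quick smoke test that curses is usable (a minimal sketch, not a form handler):

import curses

def main(stdscr):
    stdscr.addstr(0, 0, "curses works")  # draw a line of text
    stdscr.refresh()
    stdscr.getch()  # wait for a keypress before exiting

curses.wrapper(main)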
Make sure to avoid Python 3.1 since it has not yet matured enough and provides few benefits for most business applications development.
I'd compile it from source myself and tell them where to download it from in the instructions.
We've used ActiveState's Python as well as Pware's compiled version. Both have worked well. For AS, we've used 2.5 and 2.6. For Pware, just 2.6. Both 2.5 and 2.6 from AS support curses on our machine.
I've compiled from source but usually wind up having trouble with ctypes or SSL. Currently I have the Frankenstein option going: ActiveState's Python 2.6 installed, but with a couple of .so files pulled out of Pware's build. I'm using GCC since we've never ponied up for a compiler, but depending on what you need from Python, it's definitely doable if I can do it.
I will mention that AS Python claims to be 100% compatible with standard Python and it has been for everything we've done so far (mostly web applications).