Cython Build Service - python

I'm working on porting a Python package over to Cython, but would rather provide as many pre-compiled binary packages of it as possible so that users don't need to have Cython on their system.
I've had no luck finding one, so it probably doesn't exist, but is there a Cython package build service available somewhere? Basically, I want to be able to build for Windows, Linux, Mac, ARM in both x86 and x64 varieties, which means I need to create at least 8 separate builds. I'd certainly be willing to either pay for or go through the hassle of setting up an automated build system that would do that for me, on demand.
Also, I don't currently own a Mac and would rather not have to buy one just for the sake of building this package.

I don't think this question is specific to Cython, but rather any Python module which has Extensions.
Several of the Continuous Integration services allow you to create build "artifacts". You can use this feature to automate your builds.
I've used Appveyor to do this for Windows (https://www.appveyor.com/) - see an example project here: https://ci.appveyor.com/project/snorfalorpagus/appveyordemo
I'm aware that Travis can do this for Linux and OS X, although I've not tried this myself.
I've found that using Anaconda greatly simplifies things (http://conda.pydata.org/miniconda.html).
I'm not aware of a single system that does everything but that may just be because I've never looked.

Related

Releasing for Ubuntu

I've built a few pieces of C++ software I want to release for Ubuntu. What ways are there and what can you recommend? Is building .deb files and setting up an apt repo for them the best way? What about make install, is it considered an acceptable way to install software?
By far simplest for me, and perhaps most transparent for the user, would be to just have a github repository in which one could run make install to get all programs installed at one go.
Do I always install the binaries into /usr/bin?
One of the programs contains Python 3 library code, should that be installed in /usr/lib/python3/dist-packages? (I don't want to create a pip package, that would make the installation harder -- and waste more of my time.) The program also contains Python 3 examples/tutorials intended for the user to tweak and learn from, where do I install those? Do I create a ~/my-prog-tutorial-dir/ to put them in? If so: how should I name that directory?
Edit: if I simply release the statically linked binaries in a tarball, what will break eventually? Libc? Are there any major application APIs that usually change between Ubuntu LTSs? I only use pthreads, X11 and OpenGL so I suspect statically linked binaries could be a fairly stable option?
In general, building a binary package will make your software much easier for your users to install and keep up to date. The same goes for python packages. There are generally tools to generate apt packages from pip packages, so you can just list your python code as a dependency of your binary package(s).
You may see packaging and installers as a waste of your time, but only providing a source distribution wastes your users' time. Users don't want to constantly have to check github for new versions, and they often don't want to have to install all of your build dependencies if they just want to use your software. If your software is targeted towards developers this may be less of an issue, but it's still extra work that your users have to go through.
As for examples, the general convention is to put those in /usr/share/doc/myprogram/samples or a samples directory in your python package.
I was asked to expand my comment into an answer, so here it is.
The project I was talking about is called Woodpecker hash Bruteforce, and I distribute it as plain archived executables for Mac OS, Windows and Linux.
Woodpecker hash Bruteforce has only two dependencies I have to care about (the users don't need to install anything): OpenSSL and Botan - libraries to do hashing. I've got two virtual machines on my Mac where I build the project and several scripts to automate the process. I'm using Docker (in collaboration with VirtualBox) and VMware Fusion.
Above I said the users don't need to worry about any third-party libraries because everything's linked statically with the executable: you just download the appropriate file from the official website, unarchive it (if needed), sudo chmod +x the executable and that's it!
This works on any version of Linux, including Ubuntu (this is where I perform the build) and Kali Linux.
The best way to release software for Ubuntu depends on the software itself and its target audience, as Miles Budnek already pointed out.
Your goal is to lower the barriers to software usage. If you are targeting developers of your software (i.e., you develop source files that are supposed to be edited by others) or you are developing a piece of code meant to be included in other projects (e.g., gnulib), it is probably best to just provide sources and documentation.
In any other case that I can currently imagine (including when you are targeting developers), providing precompiled binaries is the better option. Ideally the software would be included in Ubuntu itself; How to get my software into Ubuntu? provides a lot of useful information on that, as suggested by Mark K.
Getting software into Debian or Ubuntu can be difficult and may require a large amount of time (you have to respect a lot of policies that you may not be aware of and you have to find a sponsor) and you will soon learn that a key point is to use a decent and popular build system for your software (e.g., autotools, cmake, distutils, ...) as mentioned in the Debian Upstream Guide. Being compliant with the guide will also be beneficial for users of other distributions.
In general I suggest to proceed in this order:
provide sources;
use a common build system (from the point of view of system administrators, i.e., people installing the software, autotools works best in my experience for Posix systems);
create a binary package (please keep in mind that you have to maintain it or your users will likely encounter binary incompatibilities);
add the package to a private repository (I suggest aptly for this task);
try to get the package into the distribution of choice (please keep in mind the maintenance costs).
Another option, which I do not suggest, is to provide statically linked builds. This reduces the possibility of binary incompatibilities, but increases the cost of bug fixing (e.g., if the bug is in a dependency) and of security updates, as explained in this and the following comments. Another reason to avoid static linking is when several implementations of the same ABI exist in order to exploit hardware acceleration (e.g., OpenGL), but you can also mix static and dynamic linking.
Finally, you may also provide a container, such as Docker, to ship your software and all its dependencies: your users will just need Docker to run your application very portably. However, this is probably overkill in most situations, and whether it is a practical solution depends on your application and target audience.

Is Python support for FreeBSD as good as for say CentOS/Ubuntu/other linux flavors?

The development environment, we use, is FreeBSD. We are evaluating Python for developing some tools/utilities. I am trying to figure out if all/most python packages are available for FreeBSD.
I tried using a CentOS/Ubuntu and it was fairly easy to install python as well as packages (using pip). On FreeBSD, it was not as easy but may be I'm not using the correct steps or am missing something.
We've some tools/utilities on FreeBSD that run locally and I want Python to interact with them - hence, FreeBSD.
Any inputs/pointers would be really appreciated.
Regards
Sharad
The assumption that powerful and high-profile existing Python tools use a lot of different Python packages almost always holds true. We have used FreeBSD in our company for quite some time, together with a lot of Python-based tools (web frameworks, py-supervisor, etc.), and we never ran into the issue that a certain tool would not run on FreeBSD or not be available for FreeBSD.
So to answer your question:
Yes, all/most python packages are available on FreeBSD
One caveat:
The FreeBSD ports system is really great and will manage all compatibility and dependency issues for you. If you are using it (you probably should), then you might want to avoid pip. We had a problem in the past where the package manager for Ruby did not play well with the ports database and installed a lot of incompatible gems. This was a temporary issue with rubygems, but it gave us a real headache. We have tended to install everything from ports since then and to avoid third-party package managers like composer, pip, gems, etc. Often the ports invoke those package managers themselves, but with additional arguments that ensure dependencies are not broken.
Is Python support for FreeBSD as good as for say CentOS/Ubuntu/other linux flavors?
It is probably better than on other OSes, but I'm a FreeBSD-bigot.
However! As Freitags says, you do not want to use pip (nor gem, I might add). All of these language-specific packaging systems were born out of developers' frustration with the various inadequacies of OS-specific packagers.
Had the world been using BSD, neither pip nor gem would have been necessary.
Why am I singing this paean here? To warn you that you might not find some obscure Python package already ported -- despite it being available via pip. Packages of any prominence are ported (here is the current list), but something less known might not be.
Do not despair -- create a port yourself using any of the existing examples and FreeBSD Handbook. It is very easy to do and, if you submit it to FreeBSD, it will already be there the next time you need it.
Best of luck.

Packaging a proprietary Python library for multiple OSs

I am developing a proprietary Python library. The library is currently Windows-only, and I want to also make it available for other platforms (Linux & Mac).
The library is currently shipped to the (paying) customers in the form of a Zip file. This archive contains the compiled Python code for the implementation of my library, plus all dependencies. By placing the Zip file on his PYTHONPATH, the customer can use the library from his Python scripts.
Shipping a Zip file with all dependencies included has the following advantages:
No internet access or administrator privileges are required to install the library.
The customer does not have to worry about installing / managing dependencies of my library.
It also has the disadvantage that the customer is not (easily) able to use his own versions of my library's dependencies.
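For illustration, the customer-side mechanics described above can be sketched in a few lines: Python can import modules directly from a Zip archive placed on sys.path (which is what adding it to PYTHONPATH does). The archive and module names below are invented:

```python
# Sketch: importing a library from a Zip archive without any installation.
# "mylib.zip" and the module it contains are hypothetical stand-ins.
import sys
import zipfile

# Create a tiny stand-in archive containing one module, for illustration.
with zipfile.ZipFile("mylib.zip", "w") as zf:
    zf.writestr("mylib.py", "VERSION = '1.0'\n")

# Same effect as putting the archive on PYTHONPATH.
sys.path.insert(0, "mylib.zip")

import mylib
print(mylib.VERSION)  # prints: 1.0
```

No administrator privileges or internet access are involved at any point, which matches the advantages listed above.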
Even though I am not generating an EXE, I am using py2exe to obtain the distributable Zip file for my library. This "hack" is very convenient, as py2exe allows me to simply say which packages I require and does the work of performing a dependency analysis of the required libraries for me. py2exe automatically generates the Zip file with my (compiled) library code, and all dependencies.
Unfortunately, py2exe is only available for Windows. I also need to be able to build on Linux & Mac, and hence to change the build process to not use py2exe.
My questions are:
Is it considered a bad idea in the Python community to ship one large Zip file with all dependencies? From what I have seen, it seems to be an unusual approach, at the least.
Is it possible to distribute the library in a form that allows for an offline installation without administrator privileges using other tools, such as setuptools?
My insight into the state of the art in Python regarding these matters is limited, so I would appreciate advice from someone with more experience in the subject.
A hard requirement is that I can only ship binary distributions of my library, as this is a proprietary product. I looked at Distutils and Setuptools, where the recommended approach seems to be to simply ship all sources.
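For what it's worth, the collect-and-zip part of the py2exe step can be approximated cross-platform with the standard library alone, e.g. compileall plus zipfile, shipping only bytecode (.pyc) files. This is a rough sketch under invented names ("proplib"), and it does not replicate py2exe's automatic dependency analysis:

```python
# Sketch: build a binary-only (bytecode) Zip distribution of a package.
# The package "proplib" and all paths here are hypothetical.
import compileall
import pathlib
import zipfile

# Stand-in source tree, for illustration.
pathlib.Path("src/proplib").mkdir(parents=True, exist_ok=True)
pathlib.Path("src/proplib/__init__.py").write_text("ANSWER = 42\n")

# Compile every .py under src/ to a .pyc next to it (legacy=True writes
# importable foo.pyc instead of the __pycache__ layout).
compileall.compile_dir("src", legacy=True, quiet=1)

# Collect only the .pyc files into the distributable archive.
with zipfile.ZipFile("proplib-dist.zip", "w") as zf:
    for pyc in pathlib.Path("src").rglob("*.pyc"):
        zf.write(pyc, pyc.relative_to("src"))
```

Note that .pyc files are tied to the Python major.minor version they were compiled with, so you would need one archive per supported interpreter version.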
Many thanks!

Python development setup

So, I'd like to start serious Python development, and it's proven to be a big pain. I'm not worried at all about the language itself; I like it well enough and will have no problem picking it up. But the ecosystem is driving me crazy.
First I tried to get up and running under Windows. I gave up on that after a few days, as 90% of packages don't include Windows support or install instructions. So I switched to Mac OS X, which people said was good for development.
More frustration ensues. I'd like to use Python as a MATLAB replacement and tool development platform, so spyderlib seems like an excellent tool. But now I've been busy trying to build PyQt on my Mac for two days, to no avail, and I'm starting to question the wisdom of it all. Following several guides to the letter invariably ends in cryptic errors. For which platform was this dependency built? What arcane compiler flags need to be set? I don't know and I don't care; why doesn't the installer figure it out? Oh wait, there isn't any... I want to USE these tools, not first completely reverse engineer them to find out how to build them.
There is a vast amount of implied knowledge in all the documentation I can find on these matters, both with regard to Unix and Pythonic quirks. Is there any way to scale this mountain, in a place with a manageable learning curve? Right now I have no idea what I'm doing. Or should I go back to Windows and try to coerce the Unix packages I need into cooperation?
On Mac OS X, you can get spyder with macports. This should build everything needed.
If you prefer Windows, take a look at Python(x,y). It has a bunch of scientific tools pre-built, including spyder.
Finally, the Enthought Python distribution is worth considering for scientific work.
Have you tried ActivePython?
Why battle with compiling the modules yourself when you can get the pre-built packages from PyPM?
pypm install pyqt4 matplotlib scipy numpy
From my experience the best platform for kind of project you're describing is Linux. There you just install the libs you need from package manager and that's it. Binary packages, so compiling is not required.
If you want to stick with MacOS X, you should install either MacPorts or Fink. It's usually easy to use. Problem is, that things like Qt take forever to compile. But you won't be doing that very often.
As for installing Python modules, the best is pip, a very nice replacement for easy_install that does much more. It is especially useful if you want a virtualenv setup.
This is nearly the exact opposite of my experience with Python on Windows. Python itself installs with a binary installer, most add-on packages support easy_install, others provide binary installers of their own. The IDE I use is SciTE, which uses the old DOS install model - copy the files to a directory and run the SciTE.exe file. If you get a source distribution of a Python package, go to the directory containing setup.py and run python setup.py install. Maybe that's the implied knowledge you're talking about.
You can also find many unofficial Windows binaries at http://www.lfd.uci.edu/~gohlke/pythonlibs/.
I switched to Mac a few years ago and found that it took me quite a while of googling to properly install all the packages I needed for Python development. While I installed everything I made a list of the steps required to setup a functional system that may be appropriate for you as well. I usually use NetCDF4, HDF5, Numpy, Matplotlib, f2py, and Fortran in combination with Python. I published my list of 22 setup-steps for installing from source on my website. Installing from source is somewhat more time-consuming than using macports and fink, but enables you to have a working environment that is optimized for your system.

Differences between Framework and non-Framework builds of Python on Mac OS X

Question
What are the differences between a Framework build and a non-Framework build (i.e., standard UNIX build) of Python on Mac OS X? Also, what are the advantages and disadvantages of each?
Preliminary Research
Here is the information that I found prior to posting this question:
[Pythonmac-SIG] Why is Framework build of Python needed
B. Grainger: "I seem to recall that a Framework build of Python is needed if you want to do anything with the native Mac GUI. Is my understanding correct?"
C. Barker: "Pretty much -- to access the Mac GUI, an app needs to be in a proper Mac application bundle. The Framework build supplies that."
Apple Developer Connection: Framework Definition
"A framework is a bundle (a structured directory) that contains a dynamic shared library along with associated resources, such as nib files, image files, and header files. When you develop an application, your project links to one or more frameworks. For example, iPhone application projects link by default to the Foundation, UIKit, and Core Graphics frameworks. Your code accesses the capabilities of a framework through the application programming interface (API), which is published by the framework through its header files. Because the library is dynamically shared, multiple applications can access the framework code and resources simultaneously. The system loads the code and resources of a framework into memory, as needed, and shares the one copy of a resource among all applications."
Framework Programming Guide: What are Frameworks?
"Frameworks offer the following advantages over static-linked libraries and other types of dynamic shared libraries:
Frameworks group related, but separate, resources together. This grouping makes it easier to install, uninstall, and locate those resources.
Frameworks can include a wider variety of resource types than libraries. For example, a framework can include any relevant header files and documentation.
Multiple versions of a framework can be included in the same bundle. This makes it possible to be backward compatible with older programs.
Only one copy of a framework’s read-only resources reside physically in-memory at any given time, regardless of how many processes are using those resources. This sharing of resources reduces the memory footprint of the system and helps improve performance."
Background
Prior to Mac OS X 10.6 Snow Leopard, I hadn't thought much about this, as I simply would download and install the Python 2.6.2 Mac Installer Disk Image, which is a framework build, and go about my business using virtualenv, pip, etc. However, with the changes in Snow Leopard to 64-bit, gcc, etc., I've noticed some issues that have made me want to build/compile Python 2.6.2+ myself from source, which leads me to my question of the differences and advantages/disadvantages of building Python as a MacOSX|Darwin framework.
You've already listed all important advantages of making a framework (congratulations for excellent research and reporting thereof!); the only flip side is that it's harder to arrange to build one properly, but if you take your clues from the examples in the installer you quote, it should be doable.
BTW, what's wrong with the system Python that comes with Snow Leopard? I haven't upgraded from Leopard yet (long story... I do have the "family license" upgrade DVD, but need Snow Leopard to fix some things before I can upgrade), so I have no first-hand experience with that yet, but I do know it's a 2.6 build and it comes in both 32-bit and 64-bit versions... so why do you need to build your own framework?
There is another difference: typically the Framework installation provided by the installer from python.org has several architectures.
$ file libpython2.7.dylib
libpython2.7.dylib: Mach-O universal binary with 2 architectures
libpython2.7.dylib (for architecture i386): Mach-O dynamically linked shared library i386
libpython2.7.dylib (for architecture x86_64): Mach-O 64-bit dynamically linked shared library x86_64
If you install from source and you do not deliberately change this, your libpython has only one architecture.
I have had cases where the two architectures actually resulted in problems (at least I believe that this was the reason), namely when installing the HDF5 python bindings (h5py).
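A quick way to see which slice of a (possibly universal) build you are actually running, sketched with standard-library calls only:

```python
# Check which architecture your interpreter is running as. A universal
# framework build contains several slices; this reports the active one.
import platform
import struct

print(platform.machine())               # e.g. 'x86_64', 'arm64', 'i386'
print(struct.calcsize("P") * 8, "bit")  # pointer width: 32 or 64
```

Comparing this against the architectures reported by `file libpython2.7.dylib` can help diagnose binding problems like the h5py one mentioned above.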
And there is yet another difference: some tools require the framework installation. For instance PyQt, and in particular sip. While it is possible to install sip and PyQt even for the non-framework version of python, it is much more complicated.
As for the decision what to prefer, I still do not know. At the moment, I went for the non-framework option, but I must say, that it also caused me some headache.
If you are going to ship your code (have it running on another machine), you'd better use the system version of python otherwise your program behavior will be undefined on the other machines.
I use Macports on 10.6, which makes it very simple to install multiple versions of python and switch between them and Apple's version:
sudo port install python26
sudo port install python_select
sudo python_select -l
The most recent version of python26 is 2.6.2, and compiles and runs fine on 10.6.1:
trac.macports.org/browser/trunk/dports/lang/python26/Portfile
Framework builds are owned by the 'root' account when installed. A source build will be owned by the account installing it. The advantage (and disadvantage) of having ownership of the Python installation is that you don't need to change accounts to modify it.
A small difference is that Framework builds are built against the EditLine library. Source builds are usually compiled against the Readline library. Depending upon which library Python is compiled against, the readline module in the standard library works slightly differently. See 'man python' on Mac OS X for more details on this.
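A commonly used (if informal) way to check which library your readline module was built against is to inspect its docstring; a small sketch:

```python
# Heuristic check: libedit-backed builds mention "libedit" in the module
# docstring, GNU Readline builds do not. This is a convention, not an API.
import readline

doc = readline.__doc__ or ""
backend = "libedit" if "libedit" in doc else "GNU readline"
print(backend)
```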
There is a nice buildout for automating the compile of Python 2.4, 2.5 and 2.6 from source on Mac OS X, which is explained here. This will compile against a custom build of readline. However, the usefulness of scripting the source install is that you can make additional tweaks to your custom Python builds, e.g. installing essential distributions such as virtualenv, or harder to install distributions such as PIL.
