Cython, how to include 'external' modules and compile a single library? - python

I have created a plugin for a piece of software. The plugin is written in Python and I wish to distribute it in the form of a shared library (.so) for Mac OS (only). My code imports a number of packages that may not be installed on the 'target' Macs, and I want to avoid requiring the user to download those packages, as they may not have pip on their Mac and so on...
The simplest solution would be to 'build' a shared library with all the dependencies/modules included, so that the user just downloads it, activates the plugin in the software, and everything works automatically.
Is that possible?
I have done a lot of searching and reading on the web. I've looked at PyInstaller, but I don't want to create a stand-alone executable, just a single shared library with imports from packages such as Crypto, zeroconf, twisted, etc.
If not, what are the alternatives?
I have tried Cython, but only my plugin code gets compiled into the library, so when I try to use it on a computer without the necessary packages I get an import error.
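For reference, what I did with Cython is just the standard cythonize build, roughly like this (simplified; 'plugin.pyx' stands in for my actual module):

    # setup.py - minimal Cython build (sketch). This compiles only my own
    # plugin code into a .so; imported packages like Crypto are untouched.
    from setuptools import setup
    from Cython.Build import cythonize

    setup(
        name="plugin",
        ext_modules=cythonize("plugin.pyx"),  # "plugin.pyx" is a placeholder
    )

Building with python setup.py build_ext --inplace gives me plugin.so, but the imports inside it are still resolved at run time on the target machine.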
Is there a way to 'tell' Cython to compile all the dependencies of the code into a single shared library?
As I mentioned above, PyInstaller could work for a stand-alone executable, but in my case it needs to be a shared library...
Many thanks!

Related

How to install python modules using cppyy?

I want to package a Python module containing Python source and a native C++ library. cppyy is used to dynamically generate the bindings, so the library is really just a normal library. The build system for the library is Meson and should not be replaced. The whole thing is in a git repository. I only care about Linux.
My question is how to get from this to “pip install url_to_package builds/installs everything.” in the least complicated way possible.
What I’ve tried:
Extending setuptools with a custom build command:
…that executes meson compile and copies the result into the right place (sketched below, after this list). But pip install performs its work in some random split-off temporary directory, and I can't find my C++ sources from there.
The Meson python module:
…can build my library and install files directly into some Python env. It does not work with pip and has very limited functionality.
Wheels:
…are incredibly confusing and overkill for me. I will likely be the only user of this module. Actually, all I want is to easily use the module in projects that live in different directories…
Along the way, I also came across different CMake solutions, but those are disqualified because of my build system choice. What should I do?
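The custom build command from the first item above looked roughly like this (a simplified sketch; package names, paths and the library name are placeholders):

    # setup.py - sketch of wrapping the meson build in a custom setuptools
    # command ("mypkg", "builddir" and "libmylib.so" are placeholders).
    import shutil
    import subprocess

    from setuptools import setup
    from setuptools.command.build_py import build_py


    class MesonBuildPy(build_py):
        def run(self):
            # Configure and compile the native library (assumes a fresh
            # build directory; meson errors out if it is already configured).
            subprocess.check_call(["meson", "setup", "builddir"])
            subprocess.check_call(["meson", "compile", "-C", "builddir"])
            # Copy the shared library into the package so it gets installed.
            shutil.copy("builddir/libmylib.so", "mypkg/")
            super().run()


    setup(
        name="mypkg",
        packages=["mypkg"],
        package_data={"mypkg": ["*.so"]},
        cmdclass={"build_py": MesonBuildPy},
    )

Under pip's build isolation this runs in a temporary copy of the project, and the C++ sources are not there, which is exactly where it breaks for me.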

Run a python script in an environment where a library may not be present

I have a python script where I am including a third party library:
from docx import Document
Now, I need to run this script in an environment where bare-bones python is present but not this library.
Installing this library in the target environment is beyond my scope. I tried using distutils but couldn't get far with it. The target environment just needs to run the script, not install a package.
I am from a Java background, and in Java I would have just exported a jar file which would have included all the libraries I needed. I need to do something similar with Python.
Edit: With distutils, I tried creating a setup.py:
    from distutils.core import setup

    setup(name='mymodule',
          version='1.0',
          py_modules=['mymodule', 'docx'])  # py_modules takes module names as strings
But I am not sure this works.
PyInstaller won't work if you can't make a .pyc file, and you cannot make a .pyc file unless your code runs without fatal errors.
You could have the import in a try block that catches ImportError, but that will result in NameErrors wherever the package is referenced. Long story short: if the package is integral to the script, no amount of avoiding it will fix your problem. You need the dependencies.
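To make that concrete, here is the pattern and where it fails (a minimal sketch):

    # Swallowing the ImportError only moves the failure.
    try:
        from docx import Document
    except ImportError:
        pass  # the script still starts...

    doc = Document()  # ...but dies here: NameError: name 'Document' is not defined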
You said installing the package is beyond your scope; well then, it is time to expand your scope. docx is an open-source package you can find on GitHub.
You can download it and run its setup.py.
You can include the modules for docx in your application. Just distribute them together.
But docx depends on lxml, which has operating-system-level dependencies, and you need to run setup on that too. You can't just copy it to the target machine.
I'm not sure PyInstaller supports docx, especially as it has the non-Python dependency.
Really using pip or easy_install is the way to go.
PyInstaller is a program that converts (packages) Python programs into stand-alone executables, under Windows, Linux, Mac OS X, Solaris and AIX.

Python deployment with third-party libraries

I want to deploy an executable (.exe) of my Python 2.7 project with everything included. I have looked at PyInstaller and py2exe, but the problem is that my project uses a lot of third-party packages that are not supported by default. What is the best choice for such cases? Is there any other distribution packager that could be used?
Thank you
The executable-creation packages should be able to grab third-party packages if they're installed. Sometimes you have to specify what to include if the library abuses Python's import system or isn't a "pure Python" package. For example, I would sometimes have to specifically include lxml to get py2exe to pick it up properly.
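Forcing py2exe to pick up lxml looked roughly like this (a sketch from memory; the exact submodule list can vary by version, and "myapp.py" is a placeholder):

    # setup.py - explicitly include lxml's compiled submodules that
    # py2exe's import analysis tends to miss.
    from distutils.core import setup
    import py2exe  # importing registers the "py2exe" command

    setup(
        console=["myapp.py"],
        options={"py2exe": {"includes": ["lxml.etree", "lxml._elementpath"]}},
    )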
The py2exe project for Python 2 hasn't been updated in quite a long time, so I would certainly recommend one of the alternatives: PyInstaller, cx_freeze or bb_freeze.
I have only seen issues with MSVCP90.dll when using non-pure-Python packages, such as wxPython. Normally you can add it in your setup.py to include it. If that doesn't work, you could also add it using an installer utility like NSIS. Or you may just have to state in your README that your app depends on Microsoft's C++ redistributable and include a link to it.
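"Adding it in setup.py" usually means shipping the CRT files via data_files, roughly like this (a sketch; the local path to the DLL and manifest is an assumption):

    # setup.py snippet - ship the VC90 runtime beside the exe so the
    # target machine doesn't need the redistributable preinstalled.
    import glob
    from distutils.core import setup
    import py2exe  # importing registers the "py2exe" command

    setup(
        windows=["myapp.py"],  # placeholder entry script
        data_files=[
            ("Microsoft.VC90.CRT", glob.glob(r"vc90\*.*")),  # msvcp90.dll + manifest
        ],
    )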

Packaging a proprietary Python library for multiple OSs

I am developing a proprietary Python library. The library is currently Windows-only, and I want to also make it available for other platforms (Linux & Mac).
The library is currently shipped to the (paying) customers in the form of a Zip file. This archive contains the compiled Python code for the implementation of my library, plus all dependencies. By placing the Zip file on his PYTHONPATH, the customer can use the library from his Python scripts.
Shipping a Zip file with all dependencies included has the following advantages:
No internet access or administrator privileges are required to install the library.
The customer does not have to worry about installing / managing dependencies of my library.
It also has the disadvantage that the customer is not (easily) able to use his own versions of my library's dependencies.
Even though I am not generating an EXE, I am using py2exe to obtain the distributable Zip file for my library. This "hack" is very convenient, as py2exe allows me to simply say which packages I require and does the work of performing a dependency analysis of the required libraries for me. py2exe automatically generates the Zip file with my (compiled) library code, and all dependencies.
Unfortunately, py2exe is only available for Windows. I need to be able to build on Linux & Mac as well, hence I have to change the build process to not use py2exe.
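For context, the core of what I would have to replace can be reproduced with just the standard library, roughly like this (a simplified sketch; "mylib" is a placeholder, it does none of py2exe's dependency analysis, and the .pyc files only work with the interpreter version that built them):

    # build_zip.py - byte-compile my package tree and zip it, so the
    # customer can put the archive on PYTHONPATH. zipimport loads foo.pyc
    # from the spot where foo.py would live, so no sources are shipped.
    import pathlib
    import py_compile
    import zipfile

    SRC = pathlib.Path("mylib")           # package source tree
    OUT = pathlib.Path("dist/mylib.zip")  # what the customer receives
    OUT.parent.mkdir(parents=True, exist_ok=True)

    with zipfile.ZipFile(OUT, "w") as zf:
        for src in SRC.rglob("*.py"):
            pyc = src.with_suffix(".pyc")
            py_compile.compile(str(src), cfile=str(pyc), doraise=True)
            zf.write(pyc, pyc.relative_to(SRC.parent))

The customer then runs something like PYTHONPATH=dist/mylib.zip python script.py; the missing piece is py2exe's dependency analysis, which gets the dependencies into the same tree.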
My questions are:
Is it considered a bad idea in the Python community to ship one large Zip file with all dependencies? From what I have seen, it seems to be an unusual approach, at the least.
Is it possible to distribute the library in a form that allows for an offline installation without administrator privileges using other tools, such as setuptools?
My insight into the state of the art in Python regarding these matters is limited, so I would appreciate advice from someone with more experience in the subject.
A hard requirement is that I can only ship binary distributions of my library, as this is a proprietary product. I looked at Distutils and Setuptools, where the recommended approach seems to be to simply ship all sources.
Many thanks!

How to properly deploy python webserver application with extension deps?

I developed my first webserver app in Python.
It's a bit unusual, because it depends not only on Python modules (like tornado) but also on some proprietary C++ libs wrapped using SWIG.
And now it's time to deliver it (to a Linux platform).
Due to the dependency on the C++ lib, just sending the sources with a requirements.txt does not seem enough. The only workaround would be to have an identical Linux installation to ensure binary compatibility of the lib. But in that case there would be problems with LD_LIBRARY_PATH etc.
Another option is to write a setup.py to create an sdist and then deploy it with pip install.
Unfortunately, that would mean I have to kill all instances of the server before installing my package. A workaround would be to use a virtualenv for each instance, though.
But maybe I'm missing something much simpler?
If you need the package to be installed by some user, the easiest way is to write a setup.py, but not just with a simple setup() call like most installers use. If you look at some packages, they have very complicated setup.py scripts which build many things, including C extensions, with installation scripts for many external dependencies.
You can solve the LD_LIBRARY_PATH problem like this: if your application has an entry point, such as a script you install into Python's bin directory (or the system /usr/bin), you can override the path there, e.g. export LD_LIBRARY_PATH="/my/path:$LD_LIBRARY_PATH".
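Alternatively, because the dynamic linker reads LD_LIBRARY_PATH only once at process startup, the entry-point script itself can set the variable and re-exec, roughly like this (a sketch; all paths and module names are placeholders):

    #!/usr/bin/env python
    # Entry-point wrapper: inject the private lib dir into LD_LIBRARY_PATH
    # and re-exec so the dynamic linker picks up the new search path.
    import os
    import sys

    LIB_DIR = "/opt/myapp/lib"  # where the wrapped C++ libs live

    if LIB_DIR not in os.environ.get("LD_LIBRARY_PATH", "").split(os.pathsep):
        os.environ["LD_LIBRARY_PATH"] = (
            LIB_DIR + os.pathsep + os.environ.get("LD_LIBRARY_PATH", "")
        )
        os.execv(sys.executable, [sys.executable] + sys.argv)

    from myapp.server import main  # SWIG-wrapped server package (placeholder)
    main()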
If your package is a system service, like some server or daemon, you can write a system package, for example a Debian package or an RPM. Debian has a lot of scripts and mechanisms for declaring dependencies on other packages.
So if you need some system libraries, you list them in the package source and Debian will install them when your package is installed. For example, if your package depends on SWIG and other -dev modules, your C extension will be built properly.
