How do I turn a Python program into an .egg file?
Setuptools is the software that creates .egg files. It's an extension of the distutils package in the standard library.
The process involves creating a setup.py file; running python setup.py bdist_egg then creates the .egg package.
Also, if you need to build an .egg package from a single-file .py app, check this link: EasyInstall - Packaging others' projects as eggs.
Python has its own package for creating distributions, called distutils. However, instead of using distutils’ setup function, we’re using setuptools’ setup. We’re also using setuptools’ find_packages function, which will automatically look for any packages in the current directory and add them to the egg. To create said egg, you’ll need to run the following from the command line:
c:\Python34\python.exe setup.py bdist_egg
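For reference, a minimal setup.py along those lines might look like the sketch below; the project name and version are placeholders rather than anything from the question:

# setup.py -- minimal sketch using setuptools
from setuptools import setup, find_packages

setup(
    name='myproject',          # placeholder project name
    version='0.1.0',           # placeholder version
    packages=find_packages(),  # pick up every package under the current directory
)

With that in place, the bdist_egg command above writes the built egg into a dist/ sub-directory.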
As the title says, I was wondering if Google Cloud Functions (where I currently have some pure python code) support cython'd modules?
I guess, more specifically, I'm asking about how I would use said modules? It's a private project, I'm using cython via setup.py and cythonize(files) which creates a bunch of shared object modules (example.cpython-38-darwin.so, example1.cpython-38-darwin.so, example2.cpython-38-darwin.so).
Those are all for Mac, so won't work on Firebase.
Is there any way to get Cloud Functions to run the setup.py and compile some files? Or, better yet, is there some way to pre-compile those files for the appropriate OS and just deploy the shared libs?
I know a variety of libraries I'm installing via pip on Cloud Functions use Cython under the hood, but I don't really know the process of creating a wheel or other pip dependency...
You need to specify cython as a build-time dependency for your private project by adding a pyproject.toml file like:
[build-system]
requires = ["cython"]
Then when installing your package with the modern version of pip in the Cloud Functions runtime, cython will be installed into the build environment before your setup.py script is run.
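One caveat worth flagging (my addition, not part of the original answer): once a [build-system] table lists requires, pip no longer injects its default build requirements, so a setup.py that imports setuptools usually wants setuptools (and wheel) listed explicitly as well, roughly like:

[build-system]
requires = ["setuptools", "wheel", "cython"]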
I appear to have been able to (eventually) solve this... I might have several unnecessary steps, but I think they improve my overall build system (again, with the intention of being able to use cython'd shared libraries on Firebase).
From Docker (or in my case, a Linux VM), in my private repo, I cythonize the important code, and turn everything into a wheel. From here, I run auditwheel show over the wheel, to check if it adheres to the manylinux1 tag (or whichever manylinux I want). In this case, it did adhere to manylinux1 off the bat, so there was no need to repair the wheel or do any shenanigans this time.
... .py # Other irrelevant .py files
magic.py # Source code that needs to be cython'd
setup.py
Simplified setup.py:
from setuptools import setup, find_packages
from Cython.Build import cythonize

setup(
    name='magiclib',
    version='0.1.0',
    packages=find_packages(),
    ext_modules=cythonize(
        "magic.py",
        compiler_directives={'language_level': 3}
    )
)
Running python setup.py bdist_wheel creates a wheel named dist/magiclib-0.1.0-cp37-cp37m-linux_x86_64.whl
From here, I run auditwheel show dist/magiclib-0.1.0-cp37-cp37m-linux_x86_64.whl which shows me that the code already adheres to the manylinux1 tag, but I nonetheless run auditwheel repair dist/magiclib-0.1.0-cp37-cp37m-linux_x86_64.whl which creates wheelhouse/magiclib-0.1.0-cp37-cp37m-manylinux1_x86_64.whl.
At this point, I bring this wheel into my GCF project, and use:
pip install -t magiclib magiclib-0.1.0-cp37-cp37m-manylinux1_x86_64.whl
which basically unzips the wheel into a sub-directory that I can vendor and deploy to Google Cloud and call from my Functions.
Works fine on some of my simple code, and I'll be experimenting with some more involved code.
How can I build an .egg file that includes a '.so' file?
Here is my file tree:
--setup.py
--__init__.py
--mcpack.py
--mcpack.so
How can I make a package with mcpack.so in it?
I've tried using package_data; it turns out the .so file is packaged, but it can't be found from the Python code.
Put it in native_libs.txt and build the egg with bdist_egg, see this section of the documentation. You should, however, switch to wheels instead of eggs.
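As a variation on the package_data route the question already tried (not the native_libs.txt route above), a minimal sketch might look like this; it assumes the files are moved into a package directory named mcpack, so treat the layout and names as illustrative:

# setup.py -- illustrative sketch, assuming an mcpack/ package directory
from setuptools import setup, find_packages

setup(
    name='mcpack',
    version='0.1.0',
    packages=find_packages(),
    package_data={'mcpack': ['*.so']},  # bundle the prebuilt shared object
    zip_safe=False,                     # extract on install so the .so is loadable from disk
)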
Shared object files are only distributable if you can guarantee that the exact same toolchain used to build your .so is also present on every other distribution that is going to run it. This is highly unlikely and would break the distributable nature of egg files, unless you really only want to target a specific version of a Linux distribution.
Your egg, if installed on any distribution other than the one the .so was built on, will not be usable.
If you want to build a distributable package containing native code have a look at wheel files and for wheels that can be distributed to all Linux OSs, have a look at manylinux.
I have a simple script that has a dependency on dnspython for parsing zone files. I would like to distribute this script as a single .py that users can run just so long as they have 2.6/2.7 installed. I don't want to have the user install dependencies site-wide as there might be conflicts with existing packages/versions, nor do I want them to muck around with virtualenv. I was wondering if there was a way to embed a package like dnspython inside the script (gzip/base64) and have that script access that package at runtime. Perhaps unpack it into a dir in /tmp and add that to sys.path? I'm not concerned about startup overhead, I just want a single .py w/ all dependencies included that I can distribute.
Also, there would be no C dependencies to build, only pure python packages.
Edit: The script doesn't have to be a .py. Just so long as it is a single executable file.
You can package multiple Python files up into a .egg. Egg files are essentially just zip archives with well defined metadata - look at the setuptools documentation to see how to do this. Per the docs you can make egg files directly executable by specifying the entry point. This would give you a single executable file that can contain your code + any other dependencies.
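As a plainly swapped-in illustration (this uses the well-known console_scripts group, which installs a command rather than making the egg itself directly runnable), a sketch might be:

# setup.py -- hedged sketch; every name below is a placeholder
from setuptools import setup, find_packages

setup(
    name='zonetool',
    version='0.1.0',
    packages=find_packages(),
    install_requires=['dnspython'],      # the dependency from the question
    entry_points={
        'console_scripts': [
            'zonetool = zonetool.cli:main',   # hypothetical module:function
        ],
    },
)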
EDIT: Nowadays I would recommend building a pex to do this. pex is basically an executable zip file with non-stdlib dependencies. It doesn't contain a Python distribution (like py2app/py2exe do) but holds everything else, and can be built with a single command-line invocation. https://pex.readthedocs.org/en/latest/
The simplest way is just to put your Python script, named __main__.py, together with its pure-Python dependencies into a zip archive.
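For instance, a rough sketch of building such an archive with the stdlib zipfile module (file names are placeholders; the with-statement needs Python 2.7+):

# build_zip.py -- illustrative sketch only
import zipfile

with zipfile.ZipFile('myapp.zip', 'w') as zf:
    zf.write('__main__.py')          # entry point executed by "python myapp.zip"
    zf.write('dns/__init__.py')      # one file of a vendored pure-Python dependency
    # ...add the rest of each vendored package the same way

Running python myapp.zip then executes the archive's __main__.py.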
Otherwise PyInstaller could be used to produce a stand-alone executable.
Please don't do this, and if you do, DO NOT make a habit of it.
pydns is BSD-licensed, but if you try to "embed" a GPL module in this way you could get into trouble.
You can learn to use setuptools and you will be much happier in the long run.
setuptools will handle the installation of the dependencies you identified (I'm not sure whether the pydns you are using is pure Python, so you might create problems for your users if you try to bundle it yourself without knowing their environment).
You can set up a URL or PyPI entry so that people can upgrade your script with easy_install -U.
I have questions about egg files in Python.
I have a lot of Python code organized by package, and I'm trying to create egg files.
I'm following the instructions, but they are very generic.
According to that, it seems I need to have a setup.py file.
Would you please tell me what I need to put into setup.py file and where it should reside?
I suppose it's enough to create setup.py and then run "setup.py bdist_egg" to get an egg file. Could you please confirm?
Is it possible to include only .pyc files in the egg file?
Having an .egg file, how can I just run the code from it without unpacking, the way java -jar <jar file> does?
You are reading the wrong documentation. You want this: https://setuptools.readthedocs.io/en/latest/setuptools.html#develop-deploy-the-project-source-in-development-mode
Creating setup.py is covered in the distutils section of Python's standard library documentation, here. The main difference (for Python eggs) is that you import setup from setuptools, not distutils.
Yep. That should be right.
I don't think so. .pyc files can be version- and platform-dependent. You might be able to open the egg (they should just be zip files) and delete the .py files, leaving the .pyc files, but it wouldn't be recommended.
I'm not sure. That might be “Development Mode”. Or are you looking for some “py2exe” or “py2app” mode?
For #4, the closest thing to starting java with a jar file for your app is a new feature in Python 2.6, executable zip files and directories.
python myapp.zip
Where myapp.zip is a zip containing a __main__.py file which is executed as the script file to be executed. Your package dependencies can also be included in the file:
__main__.py
mypackage/__init__.py
mypackage/somelibfile.py
You can also execute an egg, but the incantation is not as nice:
# Bourne shell and derivatives (Linux/OSX/Unix)
PYTHONPATH=myapp.egg python -m myapp
rem Windows
set PYTHONPATH=myapp.egg
python -m myapp
This puts the myapp.egg on the Python path and uses the -m argument to run a module. Your myapp.egg will likely look something like:
myapp/__init__.py
myapp/somelibfile.py
And Python will run __init__.py (you should check that __name__ == '__main__' in your app for command-line use).
Egg files are just zip files, so you might be able to add __main__.py to your egg with a zip tool and make it executable in Python 2.6, running it as python myapp.egg instead of the PYTHONPATH incantation above.
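A throwaway sketch of that idea with the stdlib zipfile module (the egg name is a placeholder):

import zipfile

# Append an entry point to an existing egg so "python myapp.egg" runs it (Python 2.6+).
egg = zipfile.ZipFile('myapp.egg', 'a')
egg.write('__main__.py')   # executed when the archive itself is passed to python
egg.close()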
More information on executable zip files including how to make them directly executable with a shebang can be found on Michael Foord's blog post on the subject.
I think you should use Python wheels for distribution instead of eggs now.
Wheels are the new standard of python distribution and are intended to
replace eggs. Support is offered in pip >= 1.4 and setuptools >= 0.8.
I'm trying to get a package installed on Google App Engine. The package relies rather extensively on pkg_resources, but there's no way to run setup.py on App Engine.
There's no platform-specific code in the source, however, so it's no problem to just zip up the source and include those in the system path. And I've gotten a version of pkg_resources installed and working as well.
The only problem is getting the package actually registered with pkg_resources so when it calls iter_entry_points it can find the appropriate plugins.
What methods do I need to call to register modules on sys.path with all the appropriate metadata, and how do I figure out what that metadata needs to be?
Yes, for setuptools-based libraries you'll need to deploy the library's "Egg" metadata along with it. The easiest way I've found is to deploy a whole virtualenv environment containing your project and the required libraries.
I did this process manually and added this code to main.py to initialize the site-packages folder in a way that pkg_resources will work:
import site
site.addsitedir('lib/python2.5/site-packages')
However, you could try appengine-monkey which automates most of this for you.
Create a setup.py for the package just as you would normally, and then use "setup.py sdist --formats=zip" to build your source zip. The built source zip will include an .egg-info metadata directory, which will then be findable by pkg_resources. Alternately, you can use bdist_egg for all your packages.
On your local development system, run python setup.py bdist_egg, which will create a Zip archive with the necessary metadata included. Add it to your sys.path, and it should work properly.
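A rough sketch of what using that egg at runtime might look like; the egg file name, path, and entry-point group below are assumptions for illustration only:

import sys

# Put the bdist_egg-built archive (with its metadata) on the path before importing.
sys.path.insert(0, 'lib/mypackage-0.1.0-py2.5.egg')   # hypothetical path

import pkg_resources

# iter_entry_points can now discover plugins registered in the egg's metadata.
for entry_point in pkg_resources.iter_entry_points('mypackage.plugins'):   # hypothetical group
    plugin = entry_point.load()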