Let's say I am building an event-driven architecture with Python microservices. I have a stream package handling all interactions with the streaming platform. It is intended for internal use only, within the company.
stream/
    __init__.py
    produce.py
    process.py
    security.py
What are my options for sharing this package between all my Python microservices?
Should I publish my stream package so my projects can install it?
Is there some kind of Gradle for Python, including its multi-project feature?
You can package your Python code into reusable packages.
Poetry is a popular, modern tool for managing Python packages.
Poetry, and other Python package managers like pip, can install private packages directly from an internal file server or a Git URL.
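For example, assuming the stream package lives in a private Git repository (the URL and tag below are placeholders), either tool can install it straight from Git:

pip install git+https://git.mycompany.example/platform/stream.git@v0.1.0
poetry add git+https://git.mycompany.example/platform/stream.git#v0.1.0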
You don't have to publish your Python modules to install them with pip. See Packaging Python Projects in the Python documentation to learn how to create installable packages (wheels, .whl) from your project, and stop once you've got the .whl file (you don't have to upload it to the public package index). Then you can put the wheels either into a company-internal Python package index (e.g., Artifactory), into an object storage service, or into Git LFS, from where your other microservices can access and install the package during build with
pip install /path/to/your/stream-0.0.1-py3-none-any.whl
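To produce that wheel in the first place, one common route (an assumption here; setup.py bdist_wheel works as well) is the PyPA build frontend:

pip install build
python -m build    # writes dist/stream-0.0.1-py3-none-any.whl among other artifacts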
I'm currently refactoring my monolith into microservices. To make them communicate, each service has a client module containing clients that call the other services per request. I want to manage the different packages as easily as possible, so I created one repository which is my package. Each folder/module in it is a service with the modules it needs.
What I want to achieve is to simply call pip install package[subpackage] and have it install only that specific module of the package. I chose one big package over many small packages because of a naming problem: most services have generic names for which pip packages with those names already exist.
Repository of the package:
repo/
    payments/
        client/
        models/
    auth/
        client/
        models/
    setup.py
Is there a way to specify what each submodule needs for installation, like an install_requires per module?
Is there another good approach I should take? I know some companies do this in Java, where each module is its own "package" but they all live under a company package. Maybe Python has a better solution for this.
You can add additional requirements using:
setup(
    name="package",
    version=__version__,
    install_requires=["stuff", "every", "package", "needs"],
    extras_require={
        "subpackage": ["dependency_1", "dependency_2"]
    }
)
You can then run
pip install package[subpackage]
to install the extra dependencies. Note that extras only control which dependencies get installed; the code for all subpackages is still shipped and installed.
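Putting it together for the repo layout above, a minimal setup.py could look like this sketch (the distribution name and the per-service dependencies are assumptions):

from setuptools import setup, find_packages

setup(
    name="company-services",           # hypothetical distribution name
    version="0.1.0",
    packages=find_packages(),          # picks up payments/, auth/, and their submodules (each needs an __init__.py)
    install_requires=["requests"],     # assumed: needed by every service client
    extras_require={
        "payments": ["stripe"],        # assumed per-service extras
        "auth": ["pyjwt"],
    },
)

Consumers would then run pip install company-services[payments] to pull in only the payments-specific dependencies. In some shells (e.g., zsh) the brackets need quoting: pip install "company-services[payments]".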
In my organization, we:
- split each server into its own repository,
- have a 'mono repo' that has all of the servers as git submodules,
- make each server pip-installable from an internal pip server,
- have a deployment repo that pins the versions of the servers we are using, as well as all the deployment scripts, such as pushing Docker containers to local testing environments and Terraform scripts to deploy to AWS.
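The version pinning in the deployment repo can be as simple as a requirements file pointing at the internal pip server (the URL and service names here are hypothetical):

--index-url https://pypi.internal.example/simple/
payments-service==1.4.2
auth-service==2.0.1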
I have a repository which contains protocol buffer messages and services, originally written for a Golang project. I need to write a Python microservice but I'm not 100% clued up on how Python dependency management works. I need to import the definitions into my new Python project and compile them to Python.
How should I import my protocol buffer definitions repository into my Python project and compile the protobufs for Python? I can't copy the .pb files into my new project as they're shared amongst a number of other projects.
Protobuf is installable via pip:
$ pip install protobuf
Read Installing Python Modules in the documentation if you need help on using pip.
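Installing the runtime is only half of it, though. To compile the shared .proto definitions for Python, one common approach (an assumption here, since the question doesn't show the repo layout) is to pull the definitions repository in as a git submodule and run protoc via grpcio-tools; the paths and file name below are placeholders:

$ pip install grpcio-tools
$ mkdir -p generated
$ python -m grpc_tools.protoc \
    --proto_path=proto-definitions \
    --python_out=generated \
    --grpc_python_out=generated \
    proto-definitions/events.proto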
In Python, we do pip install to install external libraries, modules, etc.
If I have to create a reusable Python module that can be used by several different APIs in an enterprise, is there a known way to create and ship this module so that consuming applications just install it somehow and import the module, rather than taking the source code from a common repository, creating a local module out of it, and then doing a module import?
Can someone educate me on the best practices we have in Python for this use case?
The flow is quite simple:
1. You create your lib.
2. You define a setup.py file (versioning is important here).
3. You build your lib.
4. You upload it to a PyPI server (either the public one or your private one).
5. Other applications simply bump the version and pip install it from either the public or your private PyPI server (there is a flag in the pip command-line tool to switch to another server).
You should start by learning either distutils or setuptools (my favourite) for points 2 and 3.
For a private PyPI server, you have to set it up yourself. I've actually never done it myself, but I assume it can't be hard. Google "pypi server".
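Concretely, steps 3-5 of the flow above might look like this (the server URL and package name are placeholders):

python -m build                                                       # step 3: build sdist + wheel
twine upload --repository-url https://pypi.internal.example/ dist/*   # step 4: upload to your server
pip install --index-url https://pypi.internal.example/simple/ mylib   # step 5: install in a consumer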
I am trying to build a package for an app in Python. It uses sklearn, pandas, numpy, boto, and some other scientific modules from Anaconda. Being very inexperienced with Python packaging, I have various questions:
1- I have some confidential .py files in my project which I don't want anyone to be able to see. In Java I would have defined private files and classes, but I am completely lost in Python. What is the "good practice" for dealing with these private modules? Can anyone link me to a tutorial?
2- What is the best way to package my app? I don't want to publish anything on PyPI; I only need it to execute on Google App Engine, for instance. I tried building a standalone package with PyInstaller but could not finish it because numpy and the other scipy packages make it hard. Is there a simple way to privately package Python projects made with Anaconda?
3- Since I want to build more apps in the near future, should I try to make sub-packages in order to reuse them in other apps?
The convention is to lead with a single underscore (_) if something is internal. Note that this is only a convention: if someone really wants to use it, they still can. Your code is not strictly confidential.
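A small illustration of the convention (the module and function names here are made up):

# mypackage/_internal.py: the leading underscore marks this module as private
def _helper():
    return "internal detail"

# mypackage/api.py: the intended public surface
from mypackage._internal import _helper   # still importable: underscores are advisory only

def public_function():
    return _helper().upper()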
Take a look at http://python-packaging-user-guide.readthedocs.org/en/latest/. You don't need to publish to PyPI to create a Python package that works with tools such as pip. You can create a project with a setup.py file and a requirements.txt file and then use pip to install your package from wherever you have it (e.g., a local directory or a repository on GitHub). If you take this approach, pip will install all the dependencies you list.
If you want to reuse your package, just include it in requirements.txt and in the install_requires parameter in setup.py (see http://python-packaging-user-guide.readthedocs.org/en/latest/requirements/). For example, if you install your package with pip install git+https://github.com/myname/mypackage.git, then you could include git+https://github.com/myname/mypackage.git in your requirements.txt file in future projects.
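For instance, a future project's requirements.txt could pin the shared package to a released tag (the repository and tag are hypothetical):

# requirements.txt: pin the shared package to a tag
git+https://github.com/myname/mypackage.git@v1.0.0#egg=mypackage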
I developed my first webserver app in Python.
It's a bit unusual, because it depends not only on Python modules (like tornado) but also on some proprietary C++ libs wrapped using SWIG.
And now it's time to deliver it (to a Linux platform).
Due to the dependency on the C++ libs, just sending the sources with a requirements.txt does not seem enough. The only workaround would be to have the exact same Linux installation to ensure binary compatibility of the libs. But in that case there will be problems with LD_LIBRARY_PATH etc.
Another option is to write a setup.py to create an sdist and then deploy it with pip install.
Unfortunately, that would mean I have to kill all instances of the server before installing my package. A workaround would be to use a virtualenv for each instance, though.
But maybe I'm missing something much simpler?
If you need the package to be installed by some user, the easiest way is to write a setup.py, but not just with the simple setup function most installers use. If you look at some packages, they have very complicated setup.py scripts which build many things, including C extensions, and run installation scripts for many external dependencies.
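As a sketch of what that can look like for a SWIG-wrapped C++ lib (all names and paths below are assumptions), setuptools can run SWIG itself when it sees .i interface files among the sources:

from setuptools import setup, Extension

setup(
    name="mywebserver",                    # hypothetical project name
    version="0.1.0",
    py_modules=["mywebserver"],
    ext_modules=[
        Extension(
            "_native",                                   # the SWIG-wrapped extension module
            sources=["src/native.i", "src/native.cpp"],  # the .i file triggers SWIG in build_ext
            swig_opts=["-c++"],
            libraries=["proprietarylib"],                # the proprietary C++ dependency
            library_dirs=["/opt/company/lib"],
        )
    ],
)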
You can solve the LD_LIBRARY_PATH problem like this: if your application has an entry point, i.e., some script which you save in Python's bin directory (or the system /usr/bin), you override LD_LIBRARY_PATH there, like export LD_LIBRARY_PATH="/my/path:$LD_LIBRARY_PATH".
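For example, a tiny launcher script shipped as the entry point (the paths and module name are placeholders):

#!/bin/sh
# Prepend the directory holding the proprietary C++ libs, then start the server.
export LD_LIBRARY_PATH="/opt/company/lib:$LD_LIBRARY_PATH"
exec python -m mywebserver "$@"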
If your package is a system service, like a server or daemon, you can write a system package, for example a Debian package or an RPM. Debian has a lot of scripts and mechanisms for declaring dependencies between packages.
So, if you need some system libraries, you list them in the package source and Debian will install them when your package is installed. For example, if your package declares build dependencies on SWIG and the relevant -dev modules, your C extension will be built properly.
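In Debian terms, those dependencies are declared in debian/control; a hypothetical excerpt for this server might read:

Source: mywebserver
Build-Depends: debhelper (>= 9), python3-dev, swig

Package: mywebserver
Depends: ${misc:Depends}, ${shlibs:Depends}, python3
Description: Tornado web server with SWIG-wrapped C++ extensions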