I want to write a daemon in Python which gets started via systemd.
I want to use Type=notify; this way I don't have to do the double-fork magic.
According to the docs:
The reference implementation for this notification is provided by libsystemd-daemon.so
... how do I do this in Python?
You could probably use the sdnotify Python module, which is a pure-Python implementation of the sd_notify protocol. The protocol is rather simple, so the module implementation is quite short.
To use the watchdog machinery you should add WatchdogSec=<smth> to the unit file, and then send WATCHDOG=1 messages on a regular basis from your service. Check the Restart= option as well.
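For example, a minimal sketch with sdnotify, assuming a unit file with Type=notify and WatchdogSec=30 (the do_work() function is a placeholder for your service loop):

    import time
    import sdnotify

    notifier = sdnotify.SystemdNotifier()

    # ... perform initialization (open sockets, load config, etc.) ...

    notifier.notify("READY=1")  # tell systemd that startup is complete

    while True:
        do_work()                      # placeholder for one unit of service work
        notifier.notify("WATCHDOG=1")  # keep-alive ping, well within WatchdogSec
        time.sleep(10)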
Use the systemd-python package:
https://pypi.org/project/systemd-python/
It is from the official systemd developers and is actively maintained.
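For example (a minimal sketch; systemd.daemon.notify() mirrors sd_notify(3)):

    from systemd import daemon

    daemon.notify("READY=1")                     # startup complete
    daemon.notify("STATUS=Processing requests")  # free-form status text
    daemon.notify("WATCHDOG=1")                  # keep-alive, if WatchdogSec= is set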
I would like to load-test a SignalR service using Locust. I found that the following library can send and receive SignalR requests: https://pypi.org/project/signalrcore/
Now, according to the Locust docs, the next step would be to write a custom client for Locust that can send SignalR requests. But there is the following warning:
Any protocol libraries that you use must be gevent-friendly (use the Python socket module or some other standard library function like subprocess), or your calls are likely to block the whole Locust/Python process.

Some C libraries cannot be monkey patched by gevent, but allow for other workarounds. For example, if you want to use psycopg2 to performance test PostgreSQL, you can use psycogreen.
I am a beginner in Python, so I don't understand exactly what this means. The library "signalrcore" I am using is 100% synchronous. Does it mean I can't use it with Locust?
I also found a fork of signalrcore that uses asyncio. Should I use that fork instead and just make sure all my SignalR calls are non-blocking?
Thanks!
SignalRCore seems to use requests and websocket-client under the hood, both of which are gevent-friendly. I can't say for sure, but I'd give it a 90% probability that it will work "out of the box" :)
If you do use the asyncio fork, you'd need to do some magic yourself. At least I have never combined asyncio with gevent.
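If it does work out of the box, a custom Locust user could look roughly like this. This is only a sketch: the host, hub path, and method name are placeholders, and the events.request.fire keyword arguments follow the current Locust API for custom clients.

    import time
    from locust import User, task, between
    from signalrcore.hub_connection_builder import HubConnectionBuilder

    class SignalRUser(User):
        wait_time = between(1, 3)
        host = "http://localhost:5000"  # placeholder

        def on_start(self):
            self.connection = (
                HubConnectionBuilder()
                .with_url(self.host + "/chathub")  # placeholder hub path
                .build()
            )
            self.connection.start()

        @task
        def send_message(self):
            start = time.perf_counter()
            exception = None
            try:
                self.connection.send("SendMessage", ["user", "hello"])  # placeholder method
            except Exception as e:
                exception = e
            # Report the call to Locust's statistics machinery.
            self.environment.events.request.fire(
                request_type="signalr",
                name="SendMessage",
                response_time=(time.perf_counter() - start) * 1000,
                response_length=0,
                response=None,
                context={},
                exception=exception,
            )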
I'm not super familiar with asyncio, but I was hoping there would be some easy way to use asyncio.Queue to push log messages to a queue instead of writing them to disk, and then have a worker on a thread wait for these queue events and write them to disk when resources are available. This seems pretty widely necessary, as logging is a huge bottleneck in a lot of code but isn't always needed in real time. Are there any pre-existing packages for this, or can anyone with more experience write a short example script to get me started? NOTE: This needs to interface with existing code, so packaging it all in a class would probably be preferred.
It's handled in the standard library in recent Python versions. See this post for information, and the official documentation. This functionality predates asyncio, and so doesn't use it (and doesn't especially need to).
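The classes in question are logging.handlers.QueueHandler and logging.handlers.QueueListener (available since Python 3.2). A minimal sketch:

    import logging
    import logging.handlers
    import queue

    log_queue = queue.Queue(-1)  # unbounded queue
    queue_handler = logging.handlers.QueueHandler(log_queue)
    file_handler = logging.FileHandler("app.log")
    listener = logging.handlers.QueueListener(log_queue, file_handler)

    root = logging.getLogger()
    root.addHandler(queue_handler)  # existing code keeps logging as usual
    root.setLevel(logging.INFO)

    listener.start()  # a background thread drains the queue and does the disk I/O
    root.info("this call returns immediately; the listener thread writes it out")
    listener.stop()   # on shutdown, flush any remaining records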
For Python 2.7, you can use the logutils package which provides equivalent functionality.
I was hoping to implement an SVN communicator in my Python program so that any file being worked on is automatically stored in the user's SVN repository without any user interaction (the username and password are already provided, so Python takes care of storage). Are there any libraries that can handle this kind of communication?
Thanks!
There are Python bindings for SVN. They follow the C API, so they present a fairly low-level interface that is not very "Pythonic". I'm not sure how easy they are to install these days. I've tried to use them in the past, and found that it takes some digging into the C API documentation to figure out how to make them work.
pysvn provides a more "Pythonic" API. I've used this, and found it very simple in comparison.
There is a pysvn project, which provides a Python interface to various SVN tasks. You could use it to invoke the svn commit operation for the user action you want to act upon.
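A minimal sketch of the commit case with pysvn (paths and credentials are placeholders):

    import pysvn

    client = pysvn.Client()

    # Supply the stored credentials programmatically when the server asks,
    # so no user interaction is needed.
    client.callback_get_login = lambda realm, username, may_save: (
        True, "myuser", "mypassword", True
    )

    client.checkin(["/path/to/working_copy/file.txt"],
                   "Automatic commit from the application")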
I am about to begin a project where I will likely use PyQt or Pyside.
I will need to interface with a buggy 3rd-party piece of server software that provides C++ and Java APIs. The Java APIs are a lot easier to use because you get exceptions, whereas with the C++ libraries you get segfaults. Also, Python bindings to the Java APIs come automatically with Jython, whereas Python bindings for the C++ APIs don't exist.
So, how would a CPython PyQt client application be able to communicate with these Java APIs? How would you go about it?
Would you have another separate Java process on the client that serializes / pickles objects and communicates with the PyQt process over a socket?
I don't want to re-invent the wheel... is there some sort of standard interface for these kinds of things? Some technology I should look into? RPC, CORBA, etc.?
Thanks,
~Eric
If you want to maintain complete isolation and increase your robustness (so the 3rd-party library going down doesn't take your client with it, which I'd recommend given that it's buggy), then something like CORBA is the way forward. Don't forget that Java ships with a CORBA implementation as standard, so you just need to generate the client-side proxy from the IDL.
SWIG may be of interest if you want to run stuff in-process. It simplifies the binding of components written in different languages; note in particular that it generates bindings for both Python and Java.
If the criterion is not reinventing the wheel, there are the SimpleXMLRPCServer and xmlrpclib modules in the standard library. They should work in Jython too.
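A minimal sketch of that bridge, with the Java API call stubbed out (the module names are the Python 2-era ones, matching Jython):

    # server.py -- run under Jython, wrapping the 3rd-party Java API
    from SimpleXMLRPCServer import SimpleXMLRPCServer

    def get_status(item_id):
        # Call into the Java API here and return only plain values;
        # XML-RPC can only marshal basic types (str, int, list, dict, ...).
        return {"id": item_id, "status": "ok"}  # placeholder result

    server = SimpleXMLRPCServer(("localhost", 8000))
    server.register_function(get_status)
    server.serve_forever()

    # client.py -- run under CPython, inside the PyQt application
    import xmlrpclib

    proxy = xmlrpclib.ServerProxy("http://localhost:8000/")
    print proxy.get_status(42)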
We have a collection of Unix scripts (and/or Python modules) that each perform a long-running task. I would like to provide a web interface for them that does the following:
Asks for relevant data to pass into scripts.
Allows for starting/stopping/killing them.
Allows for monitoring the progress and/or other information provided by the scripts.
Possibly some kind of logging (although the scripts already do logging).
I do know how to write a server that does this (e.g. by using Python's built-in HTTP server/JSON), but doing this properly is non-trivial and I do not want to reinvent the wheel.
Are there any existing solutions that allow for maintaining asynchronous server-side tasks?
Django is great for writing web applications, and the subprocess module (subprocess.Popen and .communicate()) is great for executing shell scripts. You can give it stdin, stdout, and stderr streams for communication if you want.
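A minimal sketch (the script path is a placeholder):

    import subprocess

    proc = subprocess.Popen(
        ["/usr/local/bin/long_task.sh", "--verbose"],  # placeholder script
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    stdout, stderr = proc.communicate(input=b"data the script asked for\n")
    print(proc.returncode, stdout.decode(), stderr.decode())

    # proc.terminate() / proc.kill() cover the stop/kill requirement, and
    # proc.poll() lets a web view check whether the task is still running.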
Answering my own question, I recently saw the announcement of Celery 1.0, which seems to do much of what I am looking for.
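For reference, a minimal sketch with the modern Celery API (much newer than the 1.0 release mentioned above; it assumes a Redis broker running at the given URL):

    # tasks.py
    import subprocess
    from celery import Celery

    app = Celery("tasks",
                 broker="redis://localhost:6379/0",
                 backend="redis://localhost:6379/0")

    @app.task
    def run_script(script, args):
        # Run one of the long-running scripts as a background job.
        proc = subprocess.Popen([script] + args,
                                stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        out, err = proc.communicate()
        return {"returncode": proc.returncode, "stdout": out.decode()}

    # From the web application:
    #   result = run_script.delay("/usr/local/bin/long_task.sh", ["--verbose"])
    #   result.ready() and result.get() give the monitoring hooks for the UI.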
I would use SGE, but I think it could be overkill for your needs...