Scheduling events in a WSGI framework - python

I've written a Python app that reads a database of tasks, and schedule.enter()s those tasks at various intervals. Each task reschedules itself as it executes.
I'd like to integrate this app with a WSGI framework, so that tasks can be added or deleted in response to HTTP requests. I assume I could use XML-RPC to communicate between the framework process and the task engine, but I'd like to know if there's a framework that has built-in event scheduling which can be modified via HTTP.

Sounds like what you really want is Celery. It's a Python-based distributed task queue with various task behaviours, including periodic and crontab-style scheduling.
Prior to version 2.0 it depended on Django, but that dependency has since been reduced to an integration plugin.
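As a rough illustration of what that schedule configuration looks like in recent Celery versions (the broker URL, task, and schedule names below are placeholders, not from the question):

# tasks.py -- minimal Celery periodic-task sketch (illustrative names)
from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', broker='redis://localhost:6379/0')  # assumes a local Redis broker

@app.task
def refresh_task():
    # placeholder for the work each scheduled task performs
    pass

app.conf.beat_schedule = {
    'refresh-every-30s': {
        'task': 'tasks.refresh_task',
        'schedule': 30.0,                       # run every 30 seconds
    },
    'refresh-nightly': {
        'task': 'tasks.refresh_task',
        'schedule': crontab(hour=3, minute=0),  # run daily at 03:00
    },
}
# Start with:  celery -A tasks worker --beat --loglevel=info

If schedules need to be added or removed over HTTP, extensions such as django-celery-beat keep the schedule in a database so it can be edited at runtime.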

Related

Simple websocket server in Python for publishing

I have a running CLI application in Python that uses threads to execute some workers. Now I am writing a GUI using electron for this application. For simple requests/responses I am using gRPC to communicate between the Python application and the GUI.
I am, however, struggling to find a proper publishing mechanism to push data to the GUI: gRPC's integrated streaming won't work, since it uses generators; as already mentioned, my longer, blocking tasks are executed using threads (subclasses of threading.Thread). I'd also like to emit certain events (e.g., progress) from within those threads.
I then found Flask's SocketIO implementation, which, however, blocks when run and is thus not really suited for what I have in mind - I'd again have to run two processes (Flask and my CLI application)...
Another package I've found is websockets, but I can't get my head around how to implement the producer() function they mention in the patterns section.
My last idea would be to deploy a broker-based message system like Redis, or simply fall back to the brokerless zmq, which is a bit of a hassle to set up for the GUI application.
So the simple question:
Is there an easy framework that lets me create a server "task" in Python that I can pass messages to for publishing?
For anyone struggling with concurrency in Python:
No, there isn't any simple framework. IMHO Python's concurrency handling is a bit of a mess (compared to languages like Go, where concurrency is built in). There are multiple major packages implementing it, one of them asyncio, but most of them are incompatible with one another. I ended up using a solution similar to the one proposed in this question.
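For reference, the producer pattern from the websockets documentation that the question mentions can be sketched roughly like this, with an asyncio.Queue bridging the worker threads and the event loop (all names below are illustrative, not taken from the original application):

# ws_publish.py -- rough sketch of the websockets "producer" pattern
import asyncio
import json
import threading
import time

import websockets

loop = None      # the asyncio loop, set once the server is running
events = None    # asyncio.Queue created on that loop
clients = set()  # currently connected websocket clients


async def handler(websocket):
    # register each connected client and keep the connection open
    # (older websockets versions pass an extra `path` argument here)
    clients.add(websocket)
    try:
        await websocket.wait_closed()
    finally:
        clients.discard(websocket)


async def broadcaster():
    # the "producer" side: drain the queue and push each event to every client
    while True:
        message = json.dumps(await events.get())
        for ws in list(clients):
            try:
                await ws.send(message)
            except websockets.ConnectionClosed:
                clients.discard(ws)


def publish(event):
    # safe to call from worker threads (e.g. a threading.Thread subclass)
    loop.call_soon_threadsafe(events.put_nowait, event)


def demo_worker():
    # stands in for a long-running worker emitting progress events
    for i in range(5):
        time.sleep(1)
        publish({"progress": i})


async def main():
    global loop, events
    loop = asyncio.get_running_loop()
    events = asyncio.Queue()
    threading.Thread(target=demo_worker, daemon=True).start()
    async with websockets.serve(handler, "localhost", 8765):
        await broadcaster()


if __name__ == "__main__":
    asyncio.run(main())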

Django asynchronous tasks locally

I have a web application that runs locally only (it does not run on a remote server). The web application is basically just a UI to adjust settings and see some information about the main application. A web UI was chosen over a native application for portability and ease of development. I want to be able to start and stop the main application through a button in the web application. However, I couldn't find a suitable way to start an asynchronous, managed task locally. I saw there is a library called Celery, but it seems to be aimed at a distributed environment, which mine is not.
My main need is to be able to start/stop the task, as well as to check whether the task is running (so I can display that in the UI). Is there any way to achieve this?
Celery can work just fine locally. Distributed is just someone else's computer, after all :)
You will have to install all the same requirements and the like. You can kick off workers by hand, or as a service, just as described in the Celery docs.
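A minimal local sketch, assuming a local Redis broker and result backend (module, task, and URL names are illustrative):

# tasks.py -- minimal local Celery setup
from celery import Celery

app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1')
app.conf.task_track_started = True   # lets the UI distinguish "running" from "queued"

@app.task
def main_application():
    # placeholder for the long-running main application
    ...

# From the web UI's view code (illustrative):
#   result = main_application.delay()                   # start
#   state = app.AsyncResult(result.id).state            # 'PENDING' / 'STARTED' / 'SUCCESS'
#   app.AsyncResult(result.id).revoke(terminate=True)   # stop the running task
#
# Start a local worker with:  celery -A tasks worker --loglevel=info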

What am I missing from running background tasks in-process? [Python, Gevent, Flask]

I am writing a Gevent/Flask server in Python. Some of the requests my Flask app takes need to run in the background; there is an endpoint for the client to poll the server for the task's result.
If you search the wisdom of the Internet for the best way to do this, everybody seems to be in favor of setting up one or several worker processes such as Celery or RQ, with a message queue or store such as RabbitMQ or Redis.
My app is small and my deployment is modest. This seems like too much of a hassle for me. I already have cooperative multitasking with Gevent, so I thought I'd just create a greenlet to do the background work in-process, that is, within the Flask app process itself.
This is not the mainstream solution, so my question is: Am I missing something? What am I missing? Is there something in this solution that makes it particularly bad?
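For reference, the in-process approach described above can be sketched roughly like this, assuming the app is served by a gevent worker (e.g. gunicorn -k gevent) so that spawned greenlets share the event loop with request handlers; all route and function names are illustrative:

# app.py -- sketch of in-process background work with gevent + Flask
import uuid
import gevent
from flask import Flask, jsonify

app = Flask(__name__)
results = {}  # task_id -> result (lives only in this process)

def slow_job(task_id):
    gevent.sleep(10)             # placeholder for the real work; must yield cooperatively
    results[task_id] = "done"

@app.route("/start", methods=["POST"])
def start():
    task_id = str(uuid.uuid4())
    results[task_id] = None
    gevent.spawn(slow_job, task_id)   # runs in-process, alongside request handlers
    return jsonify(task_id=task_id), 202

@app.route("/poll/<task_id>")
def poll(task_id):
    return jsonify(result=results.get(task_id))

The usual caveats with this approach are that CPU-bound or non-yielding work will block request handling, and that results are lost if the process restarts or if the deployment grows to multiple worker processes.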

Django asynchronous call

I'm wondering if there is a way to make a request to my Django backend that runs asynchronously. At page load, I need to kick off a process that takes ~30 seconds, but while it is running I cannot perform any other actions on the page that require a response from Django (specifically waiting on data for jqGrids).
Is there an easy way to tell Django that certain methods should be run asynchronously?
Django does not have a native way to run asynchronous tasks, but you could look at Celery and use a django-celery task.
Celery web: http://www.celeryproject.org/
Django-Celery web: https://pypi.python.org/pypi/django-celery
You need to use Celery. Celery is an asynchronous task queue/job queue based on distributed message passing. You can read more about Celery here.
This is a great tutorial for setting up Celery.
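As a rough sketch of what that looks like with a current Celery/Django setup (task and view names are illustrative, and a Celery app must already be configured for the project):

# tasks.py -- offload the ~30 second job to a worker
from celery import shared_task

@shared_task
def build_grid_data(user_id):
    # placeholder for the ~30 second process
    ...

# views.py
from django.http import JsonResponse
from .tasks import build_grid_data

def start_processing(request):
    result = build_grid_data.delay(request.user.id)  # returns immediately
    return JsonResponse({"task_id": result.id})

def check_processing(request, task_id):
    state = build_grid_data.AsyncResult(task_id).state
    return JsonResponse({"state": state})

The page can then poll check_processing and populate the jqGrids once the task reports success.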

Celery tasks functions - web server vs remote server

I want to send tasks from a web server (running Django) to a remote machine that hosts a RabbitMQ server and some workers I implemented with Celery.
If I follow the standard Celery approach, it seems I have to share the code between both machines, which means replicating the workers' logic in the web app code.
So:
Is there a best practice for this? Since the code is redundant, I am thinking about using a git submodule (replicated in both the web app repo and the workers repo).
Should I use something other than Celery, then?
Am I missing something?
One way to manage this is to keep your workers in your Django project. Django and Celery play nicely together, allowing you to use parts of your Django project in your Celery app. http://celery.readthedocs.org/en/latest/django/first-steps-with-django.html
Deploying it this way means your web application simply never uses the modules involved with your Celery workers, and on your Celery machine your Django views and such are never used. This usually only results in a couple of megabytes of unused Django application code...
You can use send_task. It takes the same parameters as apply_async, but you only have to give the task name. You can send tasks without loading the task module in Django:
app.send_task('tasks.add', args=[2, 2], kwargs={})
http://celery.readthedocs.org/en/latest/reference/celery.html#celery.Celery.send_task
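A minimal sketch of the web-server side, assuming only the broker (and optionally a result backend) is reachable from the web machine; the URLs below are placeholders:

# On the web server: a Celery app that only knows the broker, no task code imported
from celery import Celery

app = Celery(broker='amqp://guest@remote-rabbitmq-host//',
             backend='rpc://')  # backend only needed if you want the result back

# Dispatch purely by task name; the implementation lives only on the worker machine
result = app.send_task('tasks.add', args=[2, 2], kwargs={})
print(result.get(timeout=10))  # -> 4, fetched via the result backend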
