I'm working on a Django project in which I will need multithreading and multiprocessing concepts (sending and receiving data from other servers such as a PACS server, I/O threads, ...).
The question I have: is Django capable of multithreading/multiprocessing?
Thank you
By far the most popular tool in the Django world for doing this is Celery. Here is a good intro tutorial.
There are some more lightweight packages like Dramatiq and django-db-queue, which are intended for use cases where the configuration associated with Celery could be considered overkill.
You could, of course, roll your own with the threading module, as sketched out in this answer.
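A minimal roll-your-own sketch with the standard threading module (the `send_to_pacs` function and study ID are hypothetical), showing how a Django view could hand off a long-running I/O task and return immediately:

```python
import threading

def send_to_pacs(study_id):
    # Hypothetical long-running I/O task (e.g. talking to a PACS server).
    print(f"uploading study {study_id}")

def start_upload(study_id):
    # daemon=True so a hung worker never blocks interpreter shutdown.
    worker = threading.Thread(target=send_to_pacs, args=(study_id,), daemon=True)
    worker.start()
    return worker  # the caller (e.g. a Django view) returns immediately

thread = start_upload(42)
thread.join()  # only for demonstration; a view would not join
```

Note that threads started like this live inside the web worker process, so for anything that must survive a restart a real task queue such as Celery is still the safer choice.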
I have a CLI application running in Python that uses threads to execute some workers. Now I am writing a GUI for this application using Electron. For simple requests/responses I am using gRPC to communicate between the Python application and the GUI.
I am, however, struggling to find a proper publishing mechanism to push data to the GUI: gRPC's integrated streaming won't work since it uses generators; as already mentioned, my longer, blocking tasks are executed in threads (subclasses of threading.Thread). I'd also like to emit certain events (e.g., progress) from within those threads.
Then I found Flask-SocketIO, which, however, runs as a blocking server and thus is not really suited for what I have in mind: I'd again have to run two processes (Flask and my CLI application)...
Another package I found is websockets, but I can't get my head around how to implement the producer() function that its documentation mentions in the patterns section.
My last idea would be to deploy a broker-based message system like Redis, or simply fall back to brokerless zmq, which is a bit of a hassle to set up for the GUI application.
So the simple question:
Is there any easy framework that allows me to create a server task in Python that I can pass messages to for publishing?
For anyone struggling with concurrency in Python:
No, there isn't any simple framework. IMHO, Python's concurrency handling is a bit of a mess (compared to other languages like Go, where concurrency is built in). There are multiple major packages implementing it, one of them asyncio, but most of them are incompatible with each other. I ended up using a solution similar to the one proposed in this question.
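As a sketch of the thread-to-async bridging problem using nothing beyond the standard library: worker threads can hand events to a running asyncio loop with asyncio.run_coroutine_threadsafe, and an async consumer (e.g. a websockets handler's producer side) reads them from an asyncio.Queue. The event names and progress values here are made up for illustration:

```python
import asyncio
import threading

events: asyncio.Queue = None  # created inside the running loop

async def consumer():
    # In a real app this coroutine would be a websockets handler
    # forwarding each event to the GUI.
    received = []
    for _ in range(3):
        received.append(await events.get())
    return received

def worker(loop):
    # Runs in a plain threading.Thread; pushes progress events
    # into the event loop thread-safely, preserving order.
    for pct in (10, 50, 100):
        asyncio.run_coroutine_threadsafe(events.put({"progress": pct}), loop)

async def main():
    global events
    events = asyncio.Queue()
    loop = asyncio.get_running_loop()
    threading.Thread(target=worker, args=(loop,)).start()
    return await consumer()

print(asyncio.run(main()))  # [{'progress': 10}, {'progress': 50}, {'progress': 100}]
```

This is essentially the producer() pattern from the websockets docs, with the producer fed from an ordinary thread rather than a coroutine.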
The core logic lives in a Python project, which is ready, and I am creating the UI using the Django framework. The Python project uses ZMQ for messaging.
Is it wise to connect Django to ZMQ (from the Python project) to send and receive values? I am dealing with multi-agent systems, and the core project is all about controlling a few hardware devices in buildings.
Yes; it's in fact a recognized pattern for solutions like the one you're working on.
I gathered some links for you that show different approaches:
Task queuing in Django with ZeroMQ
Long Running Tasks in Web App/Django
This one is about RabbitMQ, but since RabbitMQ is also a MOM (Message-Oriented Middleware), it might be worth taking a look at this article: Django and asynchronous jobs
You might find interesting this package/application too: Django-ztask
I hope you find the answer you're looking for.
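As a minimal sketch of the pattern (assuming the pyzmq package; the endpoint name and message fields are hypothetical, and a real deployment would use tcp:// between the Django process and the agent process rather than inproc://), the Django side PUSHes commands that the core project PULLs:

```python
import zmq

ctx = zmq.Context.instance()

# Worker side (the core project): binds and pulls commands.
pull = ctx.socket(zmq.PULL)
pull.bind("inproc://commands")  # hypothetical endpoint; use tcp:// across processes

# Django side (e.g. inside a view): connects and pushes a command.
push = ctx.socket(zmq.PUSH)
push.connect("inproc://commands")
push.send_json({"device": "hvac-1", "action": "set_temp", "value": 21})

print(pull.recv_json())  # {'device': 'hvac-1', 'action': 'set_temp', 'value': 21}
```

PUSH/PULL gives you fire-and-forget task dispatch; if the Django side needs replies from the agents, a REQ/REP or DEALER/ROUTER pair is the usual next step.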
I need a queue to send data from Ruby to Python.
The system is an application with a Ruby frontend and a Python backend, and I'd rather not add another complicated piece. If it were Ruby-only I'd just go with delayed_job, but Ruby→Python is harder.
So:
I'm looking for a simple database-backed queue (similar to delayed_job) for Python, for which I'm planning to hack a Ruby 'producer' part.
Or just surprise me with a solution I haven't thought of yet.
Maybe you could have a look at Celery.
Pretty old question, but just for anyone stumbling across this question now and looking for a simple answer that isn't Celery:
django-background-tasks is based on Ruby's DelayedJob.
Django Background Task is a database-backed work queue for Django,
loosely based around Ruby's DelayedJob library. This project was
adopted and adapted from this repo.
To avoid conflicts on PyPI we renamed it to django-background-tasks
(plural). For an easy upgrade from django-background-task to
django-background-tasks, the internal module structure were left
untouched.
In Django Background Task, all tasks are implemented as functions (or
any other callable).
There are two parts to using background tasks:
creating the task functions and registering them with the scheduler
set up a cron task (or long-running process) to execute the tasks
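Since delayed_job-style queues are just rows in a shared table, the Ruby→Python handoff can be sketched with nothing but a database. This is an illustration with a hypothetical jobs schema using sqlite3, not django-background-tasks' actual schema; in the real system the Ruby producer would simply INSERT into the same table the Python worker polls:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE jobs (id INTEGER PRIMARY KEY, payload TEXT, done INTEGER DEFAULT 0)"
)

# Producer side (in the real system, Ruby inserts this row).
conn.execute(
    "INSERT INTO jobs (payload) VALUES (?)",
    (json.dumps({"task": "resize", "image": 7}),),
)
conn.commit()

def claim_next_job(conn):
    """Fetch the oldest unfinished job and mark it done (worker side)."""
    row = conn.execute(
        "SELECT id, payload FROM jobs WHERE done = 0 ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    conn.execute("UPDATE jobs SET done = 1 WHERE id = ?", (row[0],))
    conn.commit()
    return json.loads(row[1])

print(claim_next_job(conn))  # {'task': 'resize', 'image': 7}
```

A production queue would also need locking so two workers can't claim the same row, which is exactly the part delayed_job and django-background-tasks handle for you.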
I need to develop a real production web service in Python that will be used by a client application written in another programming language.
By "real production web service" I mean that this web service will be used in a critical environment, where failure of the service could cause major problems.
Could someone suggest which library to use in order to build such a web service with Python?
I know that Python has the built-in SimpleXMLRPCServer, but I don't know its quality or whether it's appropriate for real production usage.
Python has been used to develop production-grade web services. There are numerous frameworks for doing so (Django, Twisted, etc.).
You expect certain quality attributes from production-grade servers, such as availability and scalability. For mission-critical applications, availability becomes especially important. Your application architecture and development practices may influence these attributes more than the frameworks you use to build with. You can plan for extensive fault tolerance, redundant systems, and various other strategies to improve availability.
This applies to building applications with Python frameworks too.
Twisted is a very good framework for developing networking and web applications. There are other frameworks available in Python too, for example Tornado.
You can go through the Twisted docs and also the following blog posts, which can help in understanding Twisted better.
Twisted in 60 seconds series
A very good twisted introduction
I have been exploring Twisted basics and have posted a few notes on my blog.
Twisted docs:
http://twistedmatrix.com/documents/10.1.0/web/howto/xmlrpc.html
Python: deferToThread XMLRPC Server - Twisted - Cherrypy?
http://nullege.com/codes/search/SimpleXMLRPCServer.SimpleXMLRPCDispatcher/all/1
http://code.activestate.com/recipes/526625-twisted-xml-rpc-server-with-basic-http-authenticat/
http://www.artima.com/weblogs/viewpost.jsp?thread=156396
Some projects along this line:
http://freshmeat.net/projects/python-xmlrpc-server-w-ssl-authentication
Django:
https://launchpad.net/django-xmlrpc
http://djangosnippets.org/snippets/2078/
http://www.drdobbs.com/184405364
http://www.davidfischer.name/2009/06/django-with-jsonrpc-and-xmlrpc/
Others:
http://www.f4ntasmic.com/2009/03/simple-xmlrpc-server.html
I hope this helps. :)
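For reference, the stdlib server the question mentions works out of the box (Python 3 moves it to the xmlrpc.server module). A minimal sketch, running server and client in one script for demonstration; a production setup would still need proper threading/process supervision around it:

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

# Port 0 lets the OS pick a free port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), allow_none=True, logRequests=False)
server.register_function(lambda a, b: a + b, "add")

# Serve in a background thread so the same script can act as the client.
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
client = ServerProxy(f"http://127.0.0.1:{port}")
print(client.add(2, 3))  # 5
server.shutdown()
```

Out of the box this server handles one request at a time; mixing in socketserver.ThreadingMixIn (or using Twisted's XML-RPC support from the links above) is the usual step toward handling concurrent clients.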
We have a collection of Unix scripts (and/or Python modules) that each perform a long-running task. I would like to provide a web interface for them that does the following:
Asks for relevant data to pass into scripts.
Allows for starting/stopping/killing them.
Allows for monitoring the progress and/or other information provided by the scripts.
Possibly some kind of logging (although the scripts already do logging).
I do know how to write a server that does this (e.g. by using Python's built-in HTTP server/JSON), but doing this properly is non-trivial and I do not want to reinvent the wheel.
Are there any existing solutions that allow for maintaining asynchronous server-side tasks?
Django is great for writing web applications, and the subprocess module (subprocess.Popen and .communicate()) is great for executing shell scripts. You can give it stdin, stdout, and stderr streams for communication if you want.
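A minimal sketch of that combination (the wrapper function is made up for illustration, and `cat` stands in for one of the real long-running scripts): launch the script with Popen, feed it input on stdin, and capture its output:

```python
import subprocess

def run_script(command, payload):
    """Run a shell command, passing `payload` on stdin and
    returning (stdout, stderr, returncode)."""
    proc = subprocess.Popen(
        command,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        text=True,
    )
    out, err = proc.communicate(payload)
    return out, err, proc.returncode

# In a Django view you would call this with the user-supplied form data.
out, err, code = run_script(["cat"], "hello from the web form\n")
print(out)  # hello from the web form
```

Note that .communicate() blocks until the script exits, so for genuinely long-running tasks you'd launch the Popen from a worker (or a task queue) rather than directly inside the request handler, and poll proc.poll() for progress.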
Answering my own question, I recently saw the announcement of Celery 1.0, which seems to do much of what I am looking for.
I would use SGE (Sun Grid Engine), but I think it could be overkill for your needs...