Adding twisted code to a pygtk app - python

I have a simple pygtk app that uses urllib2. What changes should I make to add working Twisted code?
The pbgtk2.py example is confusing.

You switch from using the GTK main loop to the corresponding Twisted reactor (gtk2reactor), or you run Twisted in a separate thread with reactor.run(installSignalHandlers=0) and stay with the GTK main loop.
You decide whether to defer the urllib2 call to its own thread or to rewrite that code using Twisted's HTTP client libraries; a minimal sketch of one combination follows below.
You go to the Twisted mailing list or IRC channel and ask for help.
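Here's a minimal sketch of one combination (the gtk2 reactor plus deferring the urllib2 call to a thread); fetch_url, on_response, and the URL are placeholder names, and the GUI setup itself is omitted:

# Install the gtk2 reactor before any other Twisted imports so that
# Twisted and GTK share a single main loop.
from twisted.internet import gtk2reactor
gtk2reactor.install()

from twisted.internet import reactor, threads
import urllib2

def fetch_url(url):
    # Blocking urllib2 call, pushed onto Twisted's thread pool.
    return urllib2.urlopen(url).read()

def on_response(body):
    # Runs back in the reactor (GTK) thread; safe to update widgets here.
    print(len(body))

d = threads.deferToThread(fetch_url, "http://example.com/")
d.addCallback(on_response)

reactor.run()  # drives both the Twisted reactor and the GTK main loop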

Related

Simple websocket server in Python for publishing

I have a running CLI application in Python that uses threads to execute some workers. Now I am writing a GUI for this application using Electron. For simple requests/responses I am using gRPC to communicate between the Python application and the GUI.
I am, however, struggling to find a proper publishing mechanism to push data to the GUI: gRPC's integrated streaming won't work since it uses generators; as already mentioned, my longer, blocking tasks are executed using threads (subclasses of threading.Thread). I'd also like to emit certain events (e.g. progress) from within those threads.
Then I found Flask's SocketIO implementation, which, however, blocks, and so isn't really suited for what I have in mind - I'd again have to run two processes (Flask and my CLI application)...
Another package I've found is websockets, but I can't get my head around how to implement the producer() function that they mention in their patterns.
My last idea would be to deploy a broker-based message system like Redis, or simply fall back to the brokerless zmq, which is a bit of a hassle to set up for the GUI application.
So the simple question:
Is there any easy framework that lets me create a server "task" in Python that I can pass messages to for publishing?
For anyone struggling with concurrency in Python:
No, there isn't any simple framework. IMHO Python's concurrency handling is a bit of a mess (compared to other languages like golang, where concurrency is built in). There are multiple major packages implementing it, one of them asyncio, but most of them are incompatible with one another. I've ended up using a solution similar to the one proposed in this question.
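For what it's worth, here is a rough sketch of the websockets producer() pattern bridged to worker threads through an asyncio queue. The publish() helper, port, and single-GUI-client assumption are mine, and the exact handler signature differs between websockets versions:

import asyncio
import threading
import websockets

loop = asyncio.new_event_loop()
queue = None  # created on the server loop in main()

def publish(message):
    # Call this from any worker thread; the put is scheduled onto the server's loop.
    loop.call_soon_threadsafe(queue.put_nowait, message)

async def handler(websocket):
    # The "producer" pattern: forward queued messages to the connected client.
    while True:
        await websocket.send(await queue.get())

async def main():
    global queue
    queue = asyncio.Queue()
    async with websockets.serve(handler, "localhost", 8765):
        await asyncio.Future()  # run forever

def run_server():
    asyncio.set_event_loop(loop)
    loop.run_until_complete(main())

# Run the websocket server in a background thread next to the CLI workers.
threading.Thread(target=run_server, daemon=True).start()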

How to use pycurl with Twisted Python?

I am writing an application using pycurl and need to make it work with Twisted. I've been looking at either making pycurl somehow compatible with the Twisted framework, or using an existing Twisted library instead. Twisted Web has been suggested to me, but there is no direct mapping of functions from pycurl to Twisted Web. Can anyone point me in the right direction?
Edit: One solution is to run pycurl in another thread, but I would prefer to use the Twisted framework, or a non-blocking pycurl, so I don't have to create another thread.
For any blocking function, if there's no asynchronous alternative, Twisted lets you run it on another thread and get the result back as a Deferred.
from twisted.internet import threads
# Run the blocking pycurl call in Twisted's thread pool; the result
# arrives through the Deferred's callback chain.
d = threads.deferToThread(pycurl.some_function)
d.addCallback(callback)
See "Integrating blocking code with Twisted" in Generating Deferreds.
Have you considered using Twisted's web.client.Agent? It is a fairly basic agent, but it integrates very well with the Twisted event loop.
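For reference, here is roughly what a non-blocking GET through Agent looks like (the URL and the on_body callback are placeholders):

from twisted.internet import reactor
from twisted.web.client import Agent, readBody

agent = Agent(reactor)

def on_body(body):
    print(body)
    reactor.stop()

# Issue a non-blocking GET; the response arrives through Deferred callbacks.
d = agent.request(b"GET", b"http://example.com/")
d.addCallback(readBody)   # collect the response body as bytes
d.addCallback(on_body)

reactor.run()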

Flask alternatives to achieve true multi-threading?

I have implemented a multi-threaded web server using the Flask micro framework. Basically, my server has a task queue and a thread pool, so it can handle multiple requests. Since Flask is implemented in Python and Python threads are not truly concurrent, my web app is a bit laggy.
Are there any alternatives to Flask that overcome this multi-threading limitation?
I came across this question and was a little disappointed that nobody had pointed out how Flask (and most Python web apps) are meant to be deployed. See: http://flask.pocoo.org/docs/deploying/#deployment
My preferred deployment option is the super-simple Tornado, which works equally well on Linux and Windows (if I am deploying it alongside existing websites, or even as a hybrid deployment as part of an existing site, I usually use IIS Application Request Routing [ARR] as a reverse proxy to Tornado). I've also used gevent on both with great success.
Tornado is an open source version of the scalable, non-blocking web server and tools that power FriendFeed. Because it is non-blocking and uses epoll, it can handle thousands of simultaneous standing connections, which means it is ideal for real-time web services. Integrating this service with Flask is straightforward:
So, if your Flask application is in yourapplication.py, you might create another file called tornado_web.py and use it to serve your application like so:
from tornado.wsgi import WSGIContainer
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from yourapplication import app

# Wrap the Flask WSGI app so Tornado's non-blocking server can serve it.
http_server = HTTPServer(WSGIContainer(app))
http_server.listen(5000)
IOLoop.instance().start()
via: http://flask.pocoo.org/docs/deploying/wsgi-standalone/#tornado
This isn't Flask's fault; it is a limitation of the Python interpreter (the global interpreter lock), so any framework that you use will be subject to it.
But there is a great way to avoid this problem. To have true concurrency you can use a pool of processes instead of threads. The multiprocessing module provides an API that is compatible with that of the threading module, but it creates child processes for the workers. I have used this module to create background workers for Flask applications and found it to work very well.
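As a rough illustration (the route and heavy_task function are invented), a pool of worker processes can be shared by the Flask views:

from multiprocessing import Pool
from flask import Flask

app = Flask(__name__)

def heavy_task(n):
    # CPU-bound work; runs in a child process, so the GIL is not a bottleneck.
    return sum(i * i for i in range(n))

@app.route("/compute/<int:n>")
def compute(n):
    # apply() blocks this request until a worker finishes;
    # apply_async() would return a handle you can poll later instead.
    return str(pool.apply(heavy_task, (n,)))

if __name__ == "__main__":
    pool = Pool(processes=4)  # pool of worker *processes*, not threads
    app.run(threaded=True)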
There is a newer framework trending now that is also robust enough for production; it is implemented in Python and is easy to understand. Please do have a look at it:
FastAPI

What's a good module for writing an HTTP web service interface for a daemon?

To give a little background, I'm writing (or am going to write) a daemon in Python for scheduling tasks to run at user-specified dates. The scheduler daemon also needs to have a JSON-based HTTP web service interface (buzzword mania, I know) for adding tasks to the queue and monitoring the scheduler's status. The interface needs to receive requests while the daemon is running, so they either need to run in a separate thread or cooperatively multitask somehow. Ideally the web service interface should run in the same process as the daemon, too.
I could think of a few ways to do it, but I'm wondering if there's some obvious module out there that's specifically tailored for this kind of thing. Any suggestions about what to use, or about the project in general are quite welcome. Thanks! :)
Check out the BaseHTTPServer module -- a basic HTTP server bundled with Python.
http://docs.python.org/library/basehttpserver.html
You can spin up a second thread and have it serve your requests for you very easily (probably < 30 lines of code). And it all runs in the same process and Python interpreter space, so it can access all your objects, etc.
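A quick sketch of that approach (the module was renamed http.server in Python 3; the status payload and port are invented):

import json
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Runs in the server thread, but can read the daemon's in-process objects.
        body = json.dumps({"status": "running"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

server = HTTPServer(("127.0.0.1", 8080), StatusHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
# ... the daemon's scheduling loop keeps running in the main thread ...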
I'm not sure I understand your question properly, but take a look at Twisted
Just about any Python web framework would be useful here.
You could pick one like CherryPy, which is small enough to integrate into your system. CherryPy also includes a pure-Python WSGI server suitable for production.
Its performance may not be as good as Apache's, but it's already very stable.
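A minimal sketch of what a JSON status endpoint might look like in CherryPy (the class name, payload, and port are placeholders):

import cherrypy

class SchedulerApi(object):
    @cherrypy.expose
    @cherrypy.tools.json_out()
    def status(self):
        # A real daemon would report its queue state here.
        return {"status": "running", "pending_tasks": 0}

if __name__ == "__main__":
    # quickstart() blocks; use cherrypy.engine.start() instead if the daemon
    # needs to keep its own loop in the main thread.
    cherrypy.quickstart(SchedulerApi(), "/", {"global": {"server.socket_port": 8080}})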
Don't reinvent the wheel!
Run jobs via a cron script, and create a separate web interface using, for example, Django or Tornado.
Connect them via a database. Even SQLite will do the job if you don't need to scale to more machines.

cherrypy and wxpython

I'm trying to make a CherryPy application with a wxPython UI. The problem is that both libraries use their own blocking event loops. Is there a way for this to work? If I have the wx UI start CherryPy, is that going to lock up the UI?
See my answer at CherryPy interferes with Twisted shutting down on Windows
In short, CherryPy runs the main loop by default, but it definitely doesn't need to. Stop using quickstart and call engine.start without engine.block, and CherryPy will run in its own threads and leave the main thread for your other framework to control.
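A rough sketch of that arrangement (the HelloApp class and frame title are placeholders):

import cherrypy
import wx

class HelloApp(object):
    @cherrypy.expose
    def index(self):
        return "hello from cherrypy"

# Mount and start CherryPy without blocking: engine.start() spins up the
# server threads and returns, unlike quickstart()/engine.block().
cherrypy.tree.mount(HelloApp(), "/")
cherrypy.engine.start()

# The main thread now belongs to wx.
app = wx.App(False)
frame = wx.Frame(None, title="CherryPy + wxPython")
frame.Show()
app.MainLoop()

# Shut the server down cleanly when the UI exits.
cherrypy.engine.exit()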
If you use threading, you should be able to start the CherryPy server in one thread and run wxPython in the other. This article on the wxPython wiki (http://wiki.wxpython.org/LongRunningTasks) has some info on threading, and the CherryPy server source code (http://www.cherrypy.org/browser/trunk/cherrypy/wsgiserver/__init__.py) has some documentation on how the server works and, possibly, how you could get it to interact with threads.
One way to decouple them would be to start them up as two separate processes and have them communicate via some kind of IPC mechanism. You might have to write a small adaptor to have them speak a common protocol.
Since you're doing CherryPy, you might also be able to expose a control interface via HTTP which the wx GUI can use to drive your server.
I would encourage you to take a look at the Calibre (e-book manager) source. It is written in PyQt, but uses CherryPy to allow people to view their library from outside their LAN.
