How to make Flask communicate with Flask across two different machines? - python

I have a Flask app on one machine, and a second machine that needs to run some queries. The second machine doesn't render any pages; it will just be doing things behind the scenes for the first app. If I create a Flask app on the second machine to control those queries, how do I communicate with it from the first app? Is making a second Flask app with an API the right approach, or is there a simpler way to do this?

You communicate with it like you would with any other HTTP server: by making HTTP requests. Python has the built-in urllib, or you could consider the easy-to-use requests library.
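As a rough sketch, the first app could call an endpoint exposed by the second machine with requests like this (the host, port, and /run-query route are made up for illustration):
import requests

def run_remote_query(query):
    # http://second-machine:5000/run-query is an assumed address and route
    resp = requests.post("http://second-machine:5000/run-query",
                         json={"query": query}, timeout=10)
    resp.raise_for_status()
    return resp.json()

print(run_remote_query("SELECT count(*) FROM users"))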
If all the second machine is doing is running background tasks, there's no reason to set up another Flask app. You can use a task queue such as Celery instead, or an RPC library such as Pyro.
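If you go the Celery route instead, a minimal sketch could look like this (the Redis broker URL and the run_query task are assumptions for illustration):
# tasks.py, shared by both machines
from celery import Celery

celery_app = Celery("tasks",
                    broker="redis://broker-host:6379/0",
                    backend="redis://broker-host:6379/0")

@celery_app.task
def run_query(sql):
    # the real query logic lives here, on the second machine
    return "ran: " + sql
The first machine enqueues work with run_query.delay("..."), and a worker started on the second machine with celery -A tasks worker picks the job up from the broker and stores the result in the backend.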

Related

How does server-side rendering work with a non-Node.js backend on Heroku?

I have been developing a Python app that serves a React frontend with server-side rendering.
Locally, this has worked fine, as I'm able to run two servers on separate ports to handle different parts of my application. My Python backend receives the initial request and then sends an HTTP request to my Node.js server, which does my server-side rendering. The result is then sent back to my Python backend, which injects the server-rendered frontend into the HTML that is sent to the client.
However, Heroku limits applications to a single, dynamically generated port. This limits me to only running one web server which means I'm no longer able to run my Node.js server to do my server-side rendering. I have thought of some gimmicky ways to make this work, but I don't want to have to create an entirely new app on Heroku just to run the Node.js server I need.
I'm not sure how I can make this work with these limitations in place so I'm hoping I can learn some alternative ways to make this work on Heroku. What are some viable workarounds to handle this problem?
I think you need to create two separate apps on Heroku (even though you don't want to); as far as I know there are no other options available on Heroku.
I use Heroku for an SSR application running on two apps: one for the frontend (React) and one for the backend (Node.js). Works like a charm.

Best way to make a web interface for a Python script

I've made a small Python script to scrape the web. I would like to make a nice and simple web interface where the user can enter data to search for and have the results displayed as a list.
I understand that there are many different ways to do that, but I don't know which one would be best in my case.
I would like something:
Really simple and light
Running locally with as few dependencies as possible
So far I've been thinking about:
A NodeJS server displaying content and executing the script
A web framework in Python (web.py, Flask, Django..?)
A local webserver (XAMPP) and cgi
Please note that I don't know much about web development in Python, but I'm a bit used to NodeJS.
What would you recommend?
Thanks, Victor
Personally I prefer gevent, Bottle or Flask, with some front-end framework like Bootstrap or Framework7.
gevent easily makes it asynchronous and has WebSockets built right in, and Bottle is the easiest (and fastest) way to build a web app or API.
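For example, a minimal Bottle front end for a script like this can be a single file (the scrape() function and the form are placeholders for your own code):
from bottle import Bottle, request

app = Bottle()

FORM = '<form action="/search"><input name="q"><button>Search</button></form>'

def scrape(query):
    # stand-in for your existing scraping code
    return ["result 1 for " + query, "result 2 for " + query]

@app.get("/")
def index():
    return FORM

@app.get("/search")
def search():
    query = request.query.q or ""
    items = "".join("<li>%s</li>" % r for r in scrape(query))
    return FORM + "<ul>" + items + "</ul>"

app.run(host="localhost", port=8080)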
Socket.IO is a very easy way to send data between websites and scripts.
The website connects to the Socket.IO server, and inside the server the Python script can be executed.

Flask REST API: how to switch to using sockets from AJAX?

I've been trying websocket-client and socketio-client with no luck so far. The broad picture of what I want to accomplish is this:
Currently, I have a Flask REST API that has both a web front-end and a command line interface, and it handles several different sets of file uploads/downloads. Both communicate with the server using HTTP requests: the web front-end via jQuery AJAX, and the CLI via Python requests. I would like to switch to using sockets so that database changes from one client appear on all of them. I have been able to get Flask-SocketIO working between my jQuery code and the Flask server, but I'm struggling to get any client libraries working from the CLI portion. Is there an easy-to-use Python library for sockets, similar to requests, that I should be using for this transition, or am I going in a totally wrong direction with making this switch?
Another option, though I'm unsure of its viability, would be to try to keep the REST API for the CLI and have sockets for the web interface. That sounds very messy, though.
After doing a lot of searching and messing around with various libraries, the one that was easiest to get up and running for connecting a command-line tool with a Flask-SocketIO web app was socketIO-client.
This repository came in handy for the issues where I was struggling to understand how to correctly use the waits to receive info on the client side.
Once I've finished the project in a few weeks, I will come back and add more details so people finding this in the future can have an easier time getting this set up.
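In the meantime, here is a rough sketch of the kind of socketIO-client usage that pattern involves (the event names and port are illustrative, not from the original project):
from socketIO_client import SocketIO

def on_db_change(*args):
    # called whenever the Flask-SocketIO server emits 'db_change'
    print("database changed:", args)

with SocketIO("localhost", 5000) as socketio:
    socketio.on("db_change", on_db_change)
    socketio.emit("cli_upload", {"filename": "report.csv"})
    socketio.wait(seconds=5)  # give the server time to respond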

Flask alternatives to achieve true multi-threading?

I have implemented a multi-threaded web server using the Flask micro-framework. Basically, my server has a task queue and a thread pool, so it can handle multiple requests. Since Flask is implemented in Python and Python threads are not truly concurrent, my web app is a bit laggy.
Are there any alternatives to Flask that overcome this multi-threading issue?
I came across this question and I was a little disappointed nobody had pointed out how Flask (and most Python web apps) are meant to be deployed. See: http://flask.pocoo.org/docs/deploying/#deployment
My preferred deployment option is the super-simple Tornado which works equally well on Linux and Windows (if I am deploying it alongside existing websites, or even a hybrid deployment as part of an existing site, I usually use IIS Application Request Routing [ARR] as a Reverse Proxy to Tornado). I've also used gevent on both with great success.
Tornado is an open source version of the scalable, non-blocking web server and tools that power FriendFeed. Because it is non-blocking and uses epoll, it can handle thousands of simultaneous standing connections, which means it is ideal for real-time web services. Integrating this service with Flask is straightforward:
So, if your Flask application is in yourapplication.py, you might create another file called tornado_web.py and use it to serve your application like so:
from tornado.wsgi import WSGIContainer
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from yourapplication import app
http_server = HTTPServer(WSGIContainer(app))
http_server.listen(5000)
IOLoop.instance().start()
via: http://flask.pocoo.org/docs/deploying/wsgi-standalone/#tornado
This isn't Flask's fault, it is a limitation in the Python interpreter, so any framework that you use will be subject to it.
But there is a great way to avoid this problem. To get true concurrency you can use a pool of processes instead of threads. The multiprocessing module provides an API that is compatible with that of the threading module, but it creates child processes for the workers. I have used this module to create background workers for Flask applications and found it to work very well.
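Here is a minimal sketch of that pattern with Flask and multiprocessing (the cpu_heavy task and the route are illustrative only):
from multiprocessing import Pool
from flask import Flask, jsonify

app = Flask(__name__)
pool = None  # created under __main__ so child processes don't re-create it

def cpu_heavy(n):
    # CPU-bound work runs in a worker process, so it is not held back
    # by the GIL of the web process
    return sum(i * i for i in range(n))

@app.route("/compute/<int:n>")
def compute(n):
    result = pool.apply(cpu_heavy, (n,))  # blocks this request, not the whole server
    return jsonify(result=result)

if __name__ == "__main__":
    pool = Pool(processes=4)
    app.run(threaded=True)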
There is a newer framework trending now that is also robust for production; it is implemented in Python and is easy to understand. Please do have a look at it:
FastAPI

Should I use Orbited or gevent for integrating comet functionality into a Django app

I have been working with Django for some time now and have written several apps on a setup that uses Apache 2, mod_wsgi, and a PostgreSQL database on Ubuntu.
I have an app that uses X-Sendfile to serve files from Apache via a Django view, and also allows users to upload files via a form. All of this is working great, but I now want to ramp up the features (and the complexity, I'm sure) by allowing users to chat and to see when new files have been uploaded without refreshing their browser.
As I want this to be scalable, I don't want to poll continually with AJAX, as this is going to get very heavy with large numbers of users.
I have read more posts, sites and blogs than I can count on integrating comet functionality into a Django app, but there are so many different opinions out there on how to do this that I am now completely confused.
Should I be using Orbited, gevent, or Socket.IO?
Where does Tornado fit into this debate?
I want the messages to also be stored in the database, so do I need any special configuration to prevent my application from blocking when writing to the database?
Will running a chat server with Django have any impact on my ability to serve files from Apache?
I'd recommend using WebSockets for bidirectional realtime communication. Keep running Django as is and run a WebSocket server on another port. As far as your database blocking, yes, you'll need to keep that in mind as you write your WebSocket server and either use a non-blocking database driver, or address that in some way.
Client-side you'll want to use Socket.IO or web-socket-js to get a Flash fallback for older browsers that don't support WebSockets natively.
For the server, I would lean towards gevent or Tornado, personally. For gevent there are gevent-websocket and gevent-socketio; for Tornado you get built-in WebSocket support and can use tornadio if you want to use Socket.IO. Eventlet and Twisted both support WebSockets as well. There is also a pretty cool new project called Autobahn which is built on Twisted, and meinheld has WebSocket middleware you can use.
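As a rough sketch of the gevent option, a standalone gevent-websocket echo server running on its own port next to Django could look like this (the port and the echo logic are placeholders for your chat code):
from gevent.pywsgi import WSGIServer
from geventwebsocket.handler import WebSocketHandler

def ws_app(environ, start_response):
    ws = environ.get("wsgi.websocket")  # set by WebSocketHandler on upgrade
    if ws is None:
        start_response("400 Bad Request", [("Content-Type", "text/plain")])
        return [b"Expected a WebSocket request"]
    while True:
        message = ws.receive()  # blocks only this greenlet
        if message is None:     # client disconnected
            break
        ws.send(message)        # echo back; swap in your chat logic here
    return []

# Django keeps serving normal pages under Apache/mod_wsgi; this process only
# handles the WebSocket traffic on its own port.
WSGIServer(("0.0.0.0", 9000), ws_app, handler_class=WebSocketHandler).serve_forever()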
WebSockets are pretty exciting, and as such there are tons of great posts out there on the subject. I found these posts useful:
http://gehrcke.de/2011/06/the-best-and-simplest-tools-to-create-a-basic-websocket-application-with-flash-fallback-and-python-on-the-server-side/
http://codysoyland.com/2011/feb/6/evented-django-part-one-socketio-and-gevent/
http://toastdriven.com/blog/2011/jul/31/gevent-long-polling-you/
http://blog.jupo.org/post/8858247674/real-time-web-apps-with-django-and-websockets/
Instead of Apache + X-Sendfile you could use Nginx + X-Accel-Redirect. That way you can run a gevent/wsgi/django server behind Nginx with views that provide long-polling. No need for a separate websockets server.
I've used both Apache + X-Sendfile and Nginx + X-Accel-Redirect to serve (access-protected) content on Webfaction without any problems.
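For reference, the Django side of the X-Accel-Redirect approach is just a view that sets a header (this sketch assumes an internal Nginx location called /protected/ mapped to the files on disk):
from django.http import HttpResponse

def download(request, filename):
    # do your permission checks on request.user here before handing off
    response = HttpResponse()
    response["Content-Disposition"] = 'attachment; filename="%s"' % filename
    response["X-Accel-Redirect"] = "/protected/%s" % filename  # Nginx serves the bytes
    return response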
