Maybe it is simple, but I haven't found a satisfactory answer yet.
I have a Python application that collects data over a CAN bus (temperature, weight, ...), and I want to visualize it with Angular.
On the one hand, I have the Python application that cyclically reads the CAN-bus data and writes it to the console; on the other hand, I have a small Angular application that, as a first step, contains a simple table.
Now I want to fill the table every 10 seconds with data from the Python application instead of printing it to the console.
How can I connect the two?
My first thought was a simple file where I save the values from Python and read them with Angular.
My second thought was a database, but I think that is too much for only a few values.
So is there a direct way to access the Python data from Angular?
The basic idea is to create an API in Python and let Angular consume it.
Then there is the question of whether you want to keep backup data on the Python side; if so, save it to a DB or file and use that as the response for Angular.
If you want to do some fancy real-time stuff, maybe look into long polling or an HTTP event stream.
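For the event-stream option, here is a minimal Flask sketch of a Server-Sent Events endpoint; read_can_values(), the route name, and the values are hypothetical placeholders for your CAN-reading code:

```python
# Minimal SSE sketch with Flask; read_can_values() stands in for the real
# CAN-bus read, and the values shown are made up.
import json
import time

from flask import Flask, Response

app = Flask(__name__)

def read_can_values():
    return {"temperature": 21.5, "weight": 3.2}  # placeholder values

@app.route("/stream")
def stream():
    def generate():
        while True:
            # SSE frames have the form "data: <payload>\n\n"
            yield f"data: {json.dumps(read_can_values())}\n\n"
            time.sleep(10)
    return Response(generate(), mimetype="text/event-stream")
```

On the Angular side, the browser's built-in EventSource can subscribe to such a stream without any extra library.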
There are several ways you can access Python data from an Angular application:
One way is to use a REST API. You can create a REST API in Python
using a web framework like Flask or Django, and then use Angular's
HTTP client to make requests to the API and retrieve the data.
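A minimal sketch of the Flask variant, assuming your CAN-reading loop can expose its latest values through a function (get_latest_readings() here is a hypothetical placeholder):

```python
# Flask REST endpoint returning the most recent CAN-bus values as JSON.
from flask import Flask, jsonify

app = Flask(__name__)

def get_latest_readings():
    # placeholder: return the newest values from your CAN-bus loop
    return {"temperature": 21.5, "weight": 3.2}

@app.route("/api/readings")
def readings():
    return jsonify(get_latest_readings())

if __name__ == "__main__":
    app.run(port=5000)
```

Angular's HttpClient could then poll /api/readings every 10 seconds, e.g. with an RxJS interval.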
Another option is to use WebSockets. You can use a Python library
like asyncio or websockets to set up a WebSocket server, and then
use Angular's WebSocket client to connect to the server and receive
updates in real-time.
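A rough sketch with the third-party websockets package (pip install websockets); read_sensors() and the 10-second interval are illustrative:

```python
# Pushes the latest readings to every connected client every 10 seconds.
import asyncio
import json

import websockets

def read_sensors():
    return {"temperature": 21.5, "weight": 3.2}  # placeholder values

async def push_readings(websocket):
    while True:
        await websocket.send(json.dumps(read_sensors()))
        await asyncio.sleep(10)

async def main():
    async with websockets.serve(push_readings, "localhost", 8765):
        await asyncio.Future()  # run forever

if __name__ == "__main__":
    asyncio.run(main())
```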
You can also use a message queue like RabbitMQ or ZeroMQ to allow
your Python and Angular applications to communicate with each other
asynchronously.
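To make the message-queue idea concrete, here is a toy ZeroMQ pub/sub sketch with pyzmq (pip install pyzmq); note that a browser cannot speak ZeroMQ directly, so the subscriber would typically be a web process that relays the data on to Angular. The port and payload are made up:

```python
# Run one instance with the argument "pub" and another with "sub".
import sys
import time

import zmq

def publisher():
    pub = zmq.Context().socket(zmq.PUB)
    pub.bind("tcp://*:5556")
    while True:
        pub.send_json({"temperature": 21.5, "weight": 3.2})  # placeholder
        time.sleep(10)

def subscriber():
    sub = zmq.Context().socket(zmq.SUB)
    sub.connect("tcp://localhost:5556")
    sub.setsockopt_string(zmq.SUBSCRIBE, "")  # subscribe to everything
    while True:
        print(sub.recv_json())

if __name__ == "__main__":
    publisher() if sys.argv[1:] == ["pub"] else subscriber()
```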
Overall, the best approach will depend on your specific requirements and how you want to structure your application. A REST API is a good choice if you need to retrieve data from the Python application on demand, while WebSockets or a message queue can be used for real-time communication and updates.
I am in the process of making a web application that essentially takes a webcam stream from the client via their browser and sends it in real time to a Python server (probably Flask) that processes the frames and sends a response back to the user. The backend has to be capable of handling streams from multiple clients simultaneously.
I am trying to grasp the framework for this entire application. What I have in mind is the following:
The user accesses the webcam via their browser (e.g. using WebcamJS), and the frames are sent from the frontend to the backend through a WebSocket. The task here is to establish a seamless handshake between the multiple clients and their processing requests.
There is a need for concurrency if the processing is to be done in real time: multiple instances of the same image-processing algorithm need to run at once. My idea is to use multiple threads for this, or is there a better way? Is this even feasible, given that the image-processing algorithm (a trained model) takes some time to load? It has to stay initialized at the backend rather than start from scratch on every request.
The response from the image-processing algorithm needs to get back to the frontend, and the process goes on.
What I really need help with is drawing out the complete framework for this implementation. Any suggestions on modules/frameworks to use, with some example implementations, would be greatly appreciated.
Thank you.
You can use Flask for your web server and Keras to process the video frames.
The standard-library multiprocessing module will also help you handle multiple feeds at once.
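One rough sketch of the multiprocessing idea, keeping a long-lived worker pool so the trained model is loaded once per worker process instead of on every request; load_model() and process_frame() are hypothetical stand-ins for your Keras code:

```python
from multiprocessing import Pool

_model = None

def load_model():
    # stand-in for something like keras.models.load_model("model.h5")
    return lambda frame: {"detections": []}

def init_worker():
    global _model
    _model = load_model()  # expensive load, done once at worker startup

def process_frame(frame_bytes):
    return _model(frame_bytes)  # placeholder for the real inference

if __name__ == "__main__":
    pool = Pool(processes=4, initializer=init_worker)
    # A Flask/WebSocket handler would submit incoming frames like this:
    result = pool.apply_async(process_frame, (b"fake-frame",))
    print(result.get())
```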
Say that I have a Python Flask server running which has a backend script that produces a number and a string.
How can I pass the number and string from the backend to a script that runs client-side, so that the user's machine runs it rather than the server?
Example:
The backend script data_producer.py produces "asdaslkdjasdlksja" and 18 from its functions.
I want to pass "asdaslkdjasdlksja" and 18 to a Brython or JavaScript embed in the HTML (a <script> tag) so that they can be processed in the browser and the results sent back to my server.
Edit: I realized that I can just use Jinja2's "{{ }}" when rendering a template, so that I can use the data in an HTML script embed.
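A minimal sketch of that Jinja2 approach (the values are hard-coded here where data_producer.py would supply them):

```python
# Interpolates server-side values into a <script> block at render time.
from flask import Flask, render_template_string

app = Flask(__name__)

TEMPLATE = """
<script>
  // Filled in by Jinja2 when the template renders.
  const text = {{ text|tojson }};
  const number = {{ number }};
  console.log(text, number);
</script>
"""

@app.route("/")
def index():
    return render_template_string(TEMPLATE, text="asdaslkdjasdlksja", number=18)
```

The tojson filter is worth using for strings so they arrive properly quoted and escaped.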
The question you asked is too broad. It's almost equivalent to asking how you can connect two computers. Since you haven't even specified a data type, the first thing that comes to mind is sockets, which is about as low-level as you can get.
A higher-level and more appropriate approach would be an HTTP REST server (with flask-RESTful), since you already have a Flask server running.
However, there are a million other ways to transfer data between two Python scripts, from WebSockets, WebRTC, SSH, and the newer IPFS to even email. Most of them would probably be overkill, so I suggest you build a simple REST server and have the client send GET or POST requests to it.
After looking at the new edit, I still think a REST API is the best option, since you can easily make GET or POST requests using the Fetch API in JavaScript. In Brython you can use ajax to do the same thing.
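A minimal flask-RESTful sketch of such a server (pip install flask-restful); the endpoint name and payload shape are made up for illustration:

```python
from flask import Flask, request
from flask_restful import Api, Resource

app = Flask(__name__)
api = Api(app)

class Data(Resource):
    def get(self):
        # return whatever data_producer.py computes
        return {"text": "asdaslkdjasdlksja", "number": 18}

    def post(self):
        # receive the results the browser sends back
        payload = request.get_json()
        return {"received": payload}, 201

api.add_resource(Data, "/data")

if __name__ == "__main__":
    app.run()
```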
I have a running Python application that needs to receive some data and process it, and I also have a PHP server that can get that data. I want to send JSON data from PHP to my Python app.
Is there any way other than running a Python web server and sending the data to it, or inserting into a DB and reading it back with Python?
Thanks.
I tried using the Python CherryPy web server.
@Niklas D: It would be easier to answer your question if you gave some more context about the application or use case you want to solve.
Some further possibilities are:
Glue code (I have only done this between C++ and Python, never PHP and Python, but you should be able to find examples on the internet, e.g. https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages#PHP)
Messaging Systems like RabbitMQ, ActiveMQ, ZeroMQ, etc.
Redis (I know you said no database, but Redis provides publish/subscribe features, https://redis.io/commands/pubsub, which let you write to Redis on one side and receive the data on the other side without polling the DB all the time, which I guess is your objection to using a database). It is a bit easier to set up and use than a messaging system; see the sketch after this list.
A TCP connection between the Python and PHP applications: https://medium.com/swlh/lets-write-a-chat-app-in-python-f6783a9ac170
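Here is a sketch of the Redis option on the Python side (pip install redis); the channel name is arbitrary, and PHP would publish with its own Redis client:

```python
# Blocks on the channel and processes each JSON message as it arrives.
import json

import redis

r = redis.Redis(host="localhost", port=6379)
pubsub = r.pubsub()
pubsub.subscribe("php-to-python")

for message in pubsub.listen():
    if message["type"] == "message":
        data = json.loads(message["data"])
        print("received:", data)  # hand off to your processing code here
```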
If you want to send data to a Python application using web protocols, i.e. POST and GET requests, then you need to create a Python web app to receive and handle those requests. That in turn needs to run on a web server, or you could build serverless functions to handle it; see https://serverless.com/
If you want the Python application to fetch the data instead, i.e. have the Python app send POST and GET requests to your PHP app to ask for the JSON payload, you can build it with Python's standard urllib.request module (https://docs.python.org/3/library/urllib.request.html) or, better still, the Requests package (http://docs.python-requests.org/en/master/).
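A short sketch of that pull approach with Requests (the URL is hypothetical):

```python
# The Python app asks the PHP server for the JSON payload.
import requests

resp = requests.get("https://example.com/api/data.php", timeout=10)
resp.raise_for_status()  # fail loudly on HTTP errors
data = resp.json()
print(data)
```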
Or you could save the JSON to a file on disk and then open it with your Python app. You'd need to set up scheduling, or make your PHP app execute Python code on the server... This last suggestion is a bad idea; please don't, unless your app is isolated and not publicly accessible, or you know how to lock down your security.
I am creating an application that basically has multiple connections to a third-party chat streaming API (socket-based).
The way it works is - Every user has an account on my app and another account on the third party app. He gives me an access token for the third party chat app and I connect to the third party API to stream his chats. This happens for hundreds of users.
I need to create a socket connection pool for every user and run parallel threads. I am using a python library(for that API) and am able to achieve real time feeds for single users. How do I implement an asynchronous socket connection pool in Python or NodeJS? I have a Linux micro instance on EC2 and I need to run this application for 1000 users.
I am exploring Redis+Tornado to implement this. Are there any better alternatives?
This will be messy, and there are a couple of things to consider.
If you are going to use multiple threads, remember that you can only run as many per CPU as the OS permits; rather go with multiprocessing.
If you go async with long-polling processes, they can block other clients' requests from being processed.
Solution
When your application absolutely needs to be real-time, I would suggest WebSockets for server-client interaction.
Then, from each client's request, start a single process that listens to/polls your streaming API, using multiprocessing in Python. You will essentially create a separate process for each client.
Now, to make your WebSocketHandler and background API streamer interact with each other, you can use the Observer pattern (https://en.wikipedia.org/wiki/Observer_pattern) to notify the WebSocket handler when you have received data from the API.
Make sure that you assign a unique ID to every client, and that you only post data to the intended client when using WebSockets.
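A bare-bones sketch of that Observer pattern; the class and the registration points in the comments are illustrative, not tied to any particular framework:

```python
# Stream listeners call notify(); WebSocket handlers register callbacks.
class StreamObservable:
    def __init__(self):
        self._observers = {}  # client_id -> list of callbacks

    def subscribe(self, client_id, callback):
        self._observers.setdefault(client_id, []).append(callback)

    def unsubscribe(self, client_id):
        self._observers.pop(client_id, None)

    def notify(self, client_id, data):
        for callback in self._observers.get(client_id, []):
            callback(data)

stream = StreamObservable()
# In the WebSocket handler's open():  stream.subscribe(client_id, self.write_message)
# In the background API process, on new data:  stream.notify(client_id, data)
```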
EDIT:
Web:
Also, on your question regarding Tornado: it is a good lightweight framework for a moderate number of users, maybe 1,000. For anything more than that, I would suggest looking at Django, as it will let you be more productive, and there are lots of tools out there that the community has developed over time.
Database:
Redis is a good choice if you need a very fast NoSQL DB; also have a look at MongoDB. If you require a multi-region DB, I would suggest Cassandra or CouchDB because of their partitioned nodes.
I'm developing an application with Django. The part of my application I am designing is supposed to take a keyword, pull a bunch of data using various APIs and scrapers, and then send that data to the client to be displayed in a table. Obviously I am caching all of this for performance reasons.
I don't fully grok Celery yet, and would like to know whether it is the best way to handle this task. I need to cache all of this in a database, so should I use an AMQP result backend or a database result backend?
Also, what would be the best way to send status updates on the task to the client (browser)? If possible, I would like to send incremental results to the client so that the data appears as it is processed.