I'm having a hard time figuring out how to solve a problem with a little project.
Basically I have a Django application, and alongside it an external Python script running as a separate process. I would like to create a system where, each time a form in my Django app is submitted, the submitted data is sent to the external Python application.
The external Python service should receive the data, read it, and, depending on who the user is and what they submitted, perform some tasks and then send back a response.
Here is what I thought of: 1) Connect the external Python app to the same database Django is using, so that when the form is submitted it is saved to the database and the data can be 'shared' with the second Python service. The problem with this solution is that the second app would need to poll the database every second and perform a lot of queries, which would hurt performance. 2) Create an API endpoint, so that the external Python app connects to the endpoint and fetches the data saved in the database from there. The problem is the same as with the first solution. Would a service like Redis or RabbitMQ help in this case?
Importing the external Python process into my Django app is not a solution; it needs to stay separate from the Django app. An important requirement here is speed: when new data is submitted, it needs to reach the second Python app in the shortest time possible.
That said, I'm open to any advice or possible solution to this problem, thanks in advance :)
You could use a microservices architecture to build this. Instead of sharing a database between the two applications, you have them communicate with each other through web requests: Django sends a request to your other app with the relevant data, and the other server responds with the results.
Usually one would use something like Flask (a synchronous server) or Sanic (an asynchronous server) to receive and reply, but you can also look into something like Nameko. I would also recommend looking into Docker, as you'll eventually need it once you set up more of these microservices.
The idea (using Flask, for example) is to create an endpoint that does some computation on your data and returns the result to the Django server.
computation.py
from flask import Flask
from flask import request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def computation():
    data = request.get_json()
    print(data)
    return f"Hey! {data}"

app.run(host="0.0.0.0", port=8090)
The Django server then simply sends a request to your server application.
django_mock.py
import requests
req = requests.post('http://0.0.0.0:8090/', json={"data": "Hello"})
print(req.text)
The above will print out on the computation.py app:
{'data': 'Hello'}
and will print out on the django_mock.py example:
Hey! {'data': 'Hello'}
You should build an API. The 2nd app would then act as an application server, and the 1st app, when it receives a form submission from the user, would persist the data to the DB and then make an API call to the 2nd app. You would include key information in the API request that identifies the record in the DB.
You can use Django (e.g. DRF) or Flask to implement a simple API server in Python.
Now, this requires your app server to be up and running all the time. What if it's down? What should the 1st app do? If you need this level of resilience, then you need to decouple these apps in some way: either the 1st app implements some kind of backoff/retry if it can't reach the 2nd app, or you use a reliable queueing mechanism (something like Amazon SQS).
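As a rough illustration of the backoff/retry option, the 1st app could wrap its API call in something like the sketch below. This is only a sketch: the endpoint URL, the payload field, and the retry limits are assumptions, not part of either app.

import time
import requests

def notify_processing_app(record_id, max_attempts=5):
    """POST the ID of the newly saved record to the 2nd app, retrying with
    exponential backoff if it is temporarily unreachable."""
    for attempt in range(max_attempts):
        try:
            resp = requests.post('http://processing-app:8090/api/process',  # assumed URL
                                 json={'record_id': record_id},
                                 timeout=5)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if attempt == max_attempts - 1:
                raise  # give up after the last attempt
            time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, ...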
Related
I'm trying to learn Google App Engine (and general web app programming) by building a simple app that periodically polls a radio station RSS feed (~1 request/min), writes the result to a database, and updates a Spotify playlist with the current song. I am using Python with the Flask framework for the web app. I have a simple front-end site which is able to implement the Spotify authentication protocol; however, I am now struggling with the best way to poll information from the RSS feed in the background. I have looked into using the deferred task workflow with Google App Engine Task Queues, but it seems like cron might be a better option for something this simple. The Google App Engine cron docs say to implement a URL call, which is then handled in my app. Is this handled by my Flask URL handlers (i.e. routes), or by the App Engine handlers? My initial thought was that it would look something like this:
In the cron.yaml file:
cron:
- description: "Poll Song RSS"
  url: /playlistupdate
  schedule: every 1 minute
And then in my routes.py I would have a route to do the work:
@app.route('/playlistupdate')
def playlistupdate():
    # <Send HTTP request to RSS site, store results in SQLite db, add song to spotify playlist via Spotify API>
Is this the right idea? Or am I missing something about how the cron flow should work? What happens if a user tries to go to http://[MY_HOSTNAME]/playlistupdate?
Any help on what my options are for a simple background polling flow like this, and how it would work with the Flask framework would be much appreciated. Thanks in advance.
Yes, use cron.
Yes, you could implement the handler exactly as you describe.
Yes, you should secure the endpoint; otherwise anyone would be able to access it.
See here for a way to secure the endpoint.
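For illustration, one commonly used way to secure an App Engine cron endpoint is to check the X-Appengine-Cron header, which App Engine adds to requests coming from its cron service and strips from external traffic (another option is restricting the handler to admins in app.yaml). A minimal sketch of the Flask route, with the actual work left out:

from flask import Flask, request, abort

app = Flask(__name__)

@app.route('/playlistupdate')
def playlistupdate():
    # Reject any caller that is not the App Engine cron service.
    if request.headers.get('X-Appengine-Cron') != 'true':
        abort(403)
    # ... poll the RSS feed, store the result, update the Spotify playlist ...
    return 'ok'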
I have a running Python application that needs to receive some data and process it, and I also have a PHP server that can get this data. I want to send JSON data from PHP to my Python app.
Is there any way other than running a Python web server and sending the data to it, or inserting into a DB and reading from the DB with Python?
Thanks.
I tried using the CherryPy web server for Python.
@Niklas D It would be easier to answer your question if you could give some more context about the application or use case you want to solve.
Some further possibilities are:
Glue code (I have never done it with Python and PHP, only C++ with Python, but you should be able to find examples on the internet, e.g. https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages#PHP )
Messaging Systems like RabbitMQ, ActiveMQ, ZeroMQ, etc.
Redis (I know you said no database, but Redis provides publish/subscribe features, https://redis.io/commands/pubsub , which let you write to Redis from one side and receive the data on the other side without polling the DB all the time, which is the issue you have with a regular database, I guess). It's a bit easier to set up and use than a messaging system; see the sketch after this list.
A TCP connection between the Python and PHP applications, e.g. https://medium.com/swlh/lets-write-a-chat-app-in-python-f6783a9ac170
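To make the Redis publish/subscribe option more concrete, here is a minimal sketch of the Python side using the redis-py package. The channel name is made up for the example; the PHP side would PUBLISH its JSON onto the same channel (for instance via phpredis or Predis).

import json
import redis

r = redis.Redis(host='localhost', port=6379)
pubsub = r.pubsub()
pubsub.subscribe('php-to-python')  # channel name is an arbitrary choice for this sketch

for message in pubsub.listen():
    if message['type'] != 'message':
        continue  # skip the subscribe confirmation entries
    data = json.loads(message['data'])
    print('received from PHP:', data)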
If you want to send data to a Python application using web protocols, i.e. POST and GET requests etc., then you need to create a Python web app to receive and handle those requests, which in turn needs to run on a web server. Alternatively you could build serverless functions to handle this, see https://serverless.com/
If you want the Python application to pull the data instead, i.e. have the Python app send POST and GET requests etc. to your PHP app to ask for the JSON payload, you can build it with Python's standard library https://docs.python.org/3/library/urllib.request.html or, better still, use the Requests package http://docs.python-requests.org/en/master/
Or you could have the PHP app save the JSON to disk and then open the file with your Python app. You'd need to set up scheduling, or make your PHP app execute Python code on the server... This last suggestion is a bad idea; please don't do it unless your app is isolated and not publicly accessible, or you know how to lock down your security.
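As a minimal sketch of that pull-based option (the PHP endpoint URL is made up for the example), the Python side could be as simple as:

import requests

# Ask the PHP app for the JSON payload; the URL is an assumption for this sketch.
resp = requests.get('http://php-server.example.com/export.php', timeout=10)
resp.raise_for_status()
payload = resp.json()  # parse the JSON body returned by the PHP script
print(payload)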
I've been working on a project recently where basically I need to make a motor spin for a few seconds at certain times throughout the day, and those times can be customised using your phone.
So far I have followed many tutorials and done a lot of browsing, and I've managed to have my Pi Zero host its own network (using nginx, hostapd and dnsmasq), which you can connect to on your phone and go to 192.168.4.1 to access an index.html page in /var/www/html/.
I also have a Python script which, when run, turns one of the GPIO pins on for a few seconds and then off again, and this GPIO pin is in turn connected to the motor.
The trouble I am having is setting up the rest of the web side, where you can connect to the network, go to a page, enter 2 or 3 different times, submit them, and then have the Python script run at those times.
Since I've set up the Pi as an access point, I'm not sure how to reverse that and allow it to connect to wifi again without ruining the access point and the current setup, so I'm not sure if there's an easy way to download any packages or modules I may need.
Anyway, any help anyone could give me would be incredibly useful - many thanks!!
Unless you are planning for a production web server, for a simple application like displaying sensor status or controlling a sensor via a web page there is a simpler solution for beginners and for Python programmers. Since you are using Python, you don't have to use PHP, and you probably don't need Nginx at this stage. In my experience there are two ways to do it.
1) Using http.server
The 'simple' way is to serve the web page using Python's standard library module http.server, which provides a built-in socket-based HTTP server. It is less intuitive to set it up for GET/POST requests/responses; it is too long to describe fully here, but I have a blog post on how to do it here, and a minimal sketch is shown below.
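As a rough idea of what option 1) looks like (the port and the data variable are just placeholders; the blog post linked above goes into more detail), a minimal GET-only server could be:

from http.server import BaseHTTPRequestHandler, HTTPServer

data = 200  # the sensor reading you want to expose

class SensorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Render a tiny HTML page containing the current value of `data`.
        body = "<h1>My Sensor Web Page</h1><p>My sensor reading is {}</p>".format(data)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

HTTPServer(("0.0.0.0", 8000), SensorHandler).serve_forever()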
2) Using Flask web development micro framework
Flask allows you to set up HTML templates, handle routes, and run a web server easily within a Python environment. You need to install the Flask package for Python web development. The simplest Flask code that addresses your question of serving data to a web page would be:
from flask import Flask, render_template_string

app = Flask(__name__)
data = 200  # assuming this is the data you want to show in your web page

@app.route('/')
def index():
    return render_template_string('''
        <h1>My Sensor Web Page</h1>
        <p>My sensor reading is {}</p>
    '''.format(data))

if __name__ == '__main__':
    app.run(debug=True)
Launch your browser and point it to http://localhost:5000 (Flask's default port), and you should see the data rendered as a web page per our simple example code.
What you will need to do is either import your code into this Flask example or integrate it into the example, and pass the data you want to display to the render_template_string function.
I would suggest using a web framework to host the "website". Flask is one that I have used for similar applications. Since this method allows you to directly call Python functions in response to HTTP requests, it should be fairly easy to implement what you are trying to do.
As a bonus, you can use Flask with nginx, but I really don't think you need it for this specific application.
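To tie that back to the original question, a minimal sketch of a Flask route that triggers your existing GPIO code might look like the following. The pin number, route name, and spin duration are assumptions for the example; the idea is simply that hitting the URL (e.g. from your form) calls the Python function directly.

import threading
import time

from flask import Flask
import RPi.GPIO as GPIO  # available on the Pi

app = Flask(__name__)
MOTOR_PIN = 18  # assumed BCM pin number, adjust to your wiring

def spin_motor(seconds=3):
    # Turn the motor pin on for a few seconds, then off again.
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(MOTOR_PIN, GPIO.OUT)
    GPIO.output(MOTOR_PIN, GPIO.HIGH)
    time.sleep(seconds)
    GPIO.output(MOTOR_PIN, GPIO.LOW)

@app.route('/spin', methods=['POST'])
def spin():
    # Run the motor in a background thread so the HTTP response returns immediately.
    threading.Thread(target=spin_motor).start()
    return 'Motor triggered'

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)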
I am currently learning how to use Django. I have a standalone Python script that I want to communicate with my Django app. However, I have no clue how to go about doing this. My Django app has a login function and a database with usernames and passwords. I want my Python script to talk to my app, verify the person's username and password, and also get some account info like the person's name. How do I go about doing this? I am very new to web apps and I am not really sure where to begin.
Some clarifications: my standalone Python program is so that the user can access some information about their account. I am not trying to use the script for login functionality; my Django app already handles this. I am just trying to find a way to verify that they have said account.
For example: if you have a flashcards web app and you want the user to have a program locally on their computer to access their flashcards, they need to log in and download the cards from the web app. So wouldn't the standalone program need to communicate with the app to get login information and access to the cards on that account somehow? That's what I am trying to accomplish.
If I understand you correctly, you're looking to have an external program communicate with your server. To do this, the server needs to expose an API (Application Programming Interface) that the external program talks to. That interface will receive a message and return a response.
The request will need to have two things:
identifying information for the user - usually a secret key - so that other people can't access the user's data.
a query of some sort indicating what kind of information to return.
The server will get the request, validate the user's secret key, process the query, and return the result.
It's pretty easy to do in Django. Set up a URL like /api/cards and a view; have the view process the request and return the response. These days such back-and-forth messages are usually encoded in JSON, an easy way to encapsulate and send data. Google around with the terms django, api, and json and you'll find a lot of what you need.
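As a very small sketch of such an endpoint (the URL wiring, field names, and the choice to check the username/password directly instead of a separate secret key are all assumptions for the example), a Django view could look like:

# views.py
import json

from django.contrib.auth import authenticate
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt  # the standalone script has no CSRF cookie; consider token auth for production
def api_account_info(request):
    payload = json.loads(request.body)
    user = authenticate(username=payload.get('username'),
                        password=payload.get('password'))
    if user is None:
        return JsonResponse({'error': 'invalid credentials'}, status=401)
    return JsonResponse({'username': user.username,
                         'first_name': user.first_name,
                         'last_name': user.last_name})

The standalone script would then simply POST JSON to that URL with the requests library and read the JSON response.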
I have a Django web application which shows a website displaying some data. The application consists of HTML pages and views to display this data, which I am storing in a SQLite DB.
At the end of the day a third party needs to connect to this web application and upload binary data to it. What is the best way to host this service: as an independent Python web server, as part of the Django application, or some other way?
Any suggestions would be appreciated!
If the uploading doesn't occur too often, why not just create a Django POST/PUT view for the job that simply accepts the file over HTTP? With the information you've provided, I cannot see why this simple solution wouldn't be up to the task.
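For illustration, a minimal sketch of such a view (the URL wiring, the form field name, and writing to a local directory are assumptions; in practice you would probably save to a model's FileField or to object storage):

# views.py
import os

from django.http import HttpResponseNotAllowed, JsonResponse
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt  # the third party is not a browser session; add proper auth in production
def upload_binary(request):
    if request.method != 'POST':
        return HttpResponseNotAllowed(['POST'])
    uploaded = request.FILES.get('payload')  # 'payload' is an assumed field name
    if uploaded is None:
        return JsonResponse({'error': 'no file provided'}, status=400)
    destination_path = os.path.join('/tmp', uploaded.name)
    with open(destination_path, 'wb') as destination:
        for chunk in uploaded.chunks():  # stream to disk without loading it all into memory
            destination.write(chunk)
    return JsonResponse({'status': 'ok', 'size': uploaded.size})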