I am using the Heroku platform to host a simple webhook, written in Python with Flask. The cloud system I use sends a POST request to my webhook with some data, and I should respond.
I would like to keep a log of all requests and responses.
Sometimes the cloud system sends the same request to my webhook again (for whatever reason - I didn't respond within the specified time, or something like that).
I would like to distinguish such requests. If I simply log what I receive, I may end up with something like this in my log:
1) Starting processing request (1st request)
2) Starting processing request (2nd request)
3) Calculating response (1st request)
4) Calculating response (2nd request)
I would like to add a unique identifier to each line, something like a session ID, so I can filter all entries that belong to the same process.
How is this possible?
Here is the sample code:
# import of necessary modules
from flask import Flask, request

app = Flask(__name__)
app.config.from_json('config.json')

@app.route("/", methods=['GET', 'POST'])
def index():
    body = request.data
    signature = request.headers['x-pyrus-sig']
    secret = str.encode(app.config['SECRET_KEY'])
    if _is_signature_correct(body, secret, signature):
        return _prepare_response(body.decode('utf-8'))

def _is_signature_correct(message, secret, signature):
    ...
    # Verifying that request is from trusted source

def _prepare_response(body):
    ...
    # Sending the response

if __name__ == "__main__":
    app.run()
It seems that instead of trying to find a unique session id from the system, it's better to generate your own.
Something like this:
How can I generate a unique ID in Python?
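For example, a minimal sketch (my own illustration, using Python's uuid and logging modules around the handler from the question) could tag every log line of a request with a generated ID:

import logging
import uuid
from flask import Flask, request

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)

@app.route("/", methods=['GET', 'POST'])
def index():
    request_id = uuid.uuid4().hex  # unique ID generated for this incoming request
    logging.info("[%s] Starting processing request", request_id)
    body = request.data
    # ... signature check and response calculation from the question go here ...
    logging.info("[%s] Calculating response", request_id)
    return "OK"

Filtering the log on the bracketed ID then gives you all entries belonging to one request.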
I have encountered an issue: I have to create a cookie in the backend, which I will later use when sending a request from the frontend. Both apps are on the same domain. This is the general idea behind it: https://levelup.gitconnected.com/secure-frontend-authorization-67ae11953723.
Frontend - Sending GET request to Backend
@app.get('/')
async def homepage(request: Request, response_class=HTMLResponse):
    keycloak_code = 'sksdkssdk'
    data = {'code': keycloak_code}
    url_post = 'http://127.0.0.1:8002/keycloak_code'
    post_token = requests.get(url=url_post, json=data)
    return 'Sent'

if __name__ == '__main__':
    uvicorn.run(app, host='local.me.me', port=7999, debug=True)
Backend
#app.get("/keycloak_code")
def get_tokens(response: Response, data: dict):
code = data['code']
print(code)
....
requests.get(url='http://local.me.me:8002/set')
return True
#app.get("/set")
async def createcookie(response: Response):
r=response.set_cookie(key='tokic3', value='helloworld', httponly=True)
return True
if __name__ == '__main__':
uvicorn.run(app, host='local.me.me', port=8002, log_level="debug")
When I open the browser and access http://local.me.me:8002/set, I can see that the cookie is created.
But when I make a GET request from my frontend to the backend at the same URL, the request is received (I can see it in the terminal), but the backend does not create the cookie. Does anyone know what I might be doing wrong?
I have tried different implementations from FastAPI docs, but none has similar use cases.
127.0.0.1 and localhost (or local.me.me in your case) are two different domains (and origins). Hence, when making a request you need to use the same domain you used for creating the cookie. For example, if the cookie was created for local.me.me domain, then you should use that domain when sending the request. See related posts here, as well as here and here.
You also seem to have a second FastAPI app (listening on a different port) acting as your frontend (as you say). If that's what you are trying to do, you would need to use Session Objects in Python's requests module, or preferably, use a Client instance from the httpx library, in order to persist cookies across requests. The advantage of httpx is that it offers an asynchronous API as well, using httpx.AsyncClient(). You can find more details and examples in this answer, as well as here and here.
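For illustration, a minimal sketch (assuming both endpoints live on local.me.me:8002 as in the question; this is not the original answer's code) of persisting cookies across requests with httpx could look like this:

import asyncio
import httpx

async def main():
    # One client instance keeps its cookie jar for the whole session.
    async with httpx.AsyncClient(base_url="http://local.me.me:8002") as client:
        await client.get("/set")                 # server sets the 'tokic3' cookie here
        print(client.cookies.get("tokic3"))      # cookie is now stored on the client
        r = await client.get("/set")             # later requests send it back automatically
        print(r.request.headers.get("cookie"))

asyncio.run(main())

The same idea applies to requests.Session() in synchronous code: create the session once and reuse it for every call, so the cookies set by the server are sent back.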
This is a two-part question: I have seen individual pieces discussed, but can't seem to get the recommended suggestions to work together. I want to create a web service to store images and their metadata passed from a caller, and run a test call from Postman to make sure it is working. So to pass an image (Drew16.jpg) to the web service via Postman, it appears I need something like this:
For the web service, I have some python/flask code to read the request (one of many variations I have tried):
from flask import Flask, jsonify, request, render_template
from flask_restful import Resource, Api, reqparse
...
def post(self, name):
    request_data = request.get_json()
    userId = request_data['UserId']
    type = request_data['ImageType']
    image = request.files['Image']
Had no problem with the data portion and straight JSON but adding the image has been a bugger. Where am I going wrong on my Postman config? What is the actual set of Python commands for reading the metadata and the file from the post? TIA
Pardon the almost blog post. I am posting this because while you can find partial answers in various places, I haven't run across a complete post anywhere, which would have saved me a ton of time. The problem is you need both sides to the story in order to verify either.
So I want to send a request using Postman to a Python/Flask web service. It has to have an image along with some metadata.
Here are the settings for Postman (URL, Headers):
And Body:
Now on to the web service. Here is a bare bones service which will take the request, print the metadata and save the file:
from flask import Flask, request
app = Flask(__name__)
# POST - just get the image and metadata
@app.route('/RequestImageWithMetadata', methods=['POST'])
def post():
    request_data = request.form['some_text']
    print(request_data)
    imagefile = request.files.get('imagefile', '')
    imagefile.save('D:/temp/test_image.jpg')
    return "OK", 200

app.run(port=5000)
Enjoy!
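If you also want to exercise the endpoint without Postman, a small test sketch with the requests library (field names taken from the service above; the URL and file path are assumptions for a local run) might look like this:

import requests

url = "http://127.0.0.1:5000/RequestImageWithMetadata"
with open("Drew16.jpg", "rb") as f:
    resp = requests.post(
        url,
        data={"some_text": "some metadata"},                    # ends up in request.form
        files={"imagefile": ("Drew16.jpg", f, "image/jpeg")},   # ends up in request.files
    )
print(resp.status_code, resp.text)

Note that you should let requests build the multipart Content-Type header (including the boundary) itself rather than setting it by hand.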
Make sure `request.files['Image']` contains the image you are sending and follow http://flask.pocoo.org/docs/1.0/patterns/fileuploads/ to save the file to your file system. Something like
file = request.files['Image']
file.save('./test_image.jpg')
might do what you want, though you will have to work out the details of how the file should be named and where it should be placed.
I'm building a small tool (no UI) that will accept webhooks from several services, reformat the content, and then send standardized content to another webhook.
Example: Stripe's webhook has lots of raw data, but I want to ping another webhook with a summary. So I want to take the raw data Stripe sends me, reformat it into a simple string, and then send it to another webhook.
I imagine the script doing this:
# 1. receive webhook sent to URL with ?service=stripe param
# 2. parse URL to get service param
# 3. parse data received in payload
# 4. use some of the data received in new string
# 5. send string to new webhook
I'd love to host this on GAE. I've built lots of projects with Django, but since this doesn't need a UI or database, that seems heavy. I'd love any help. I've got the GAE project set up, and this much going:
import web #using web.py
import logging
urls = (
    "/.*", "hooks",
)
app = web.application(urls, globals())

class hooks:
    # 1. DONE receive webhook sent to URL with ?service=stripe param
    def POST(self):
        # 2. parse URL to get service param
        # ...
        # service_name = [parsed service name]
        # 3. DONE parse data received in payload
        data = web.data()
        logging.info(data)
        # 4. DONE use some of the data received in new string
        # (I've got the reformatting scripts already written)
        # data_to_send = reformat(data, service_name)
        # 5. send data_to_send as payload to new webhook
        # new_webhook_url = 'http://example.com/1823123/'
        # CURL for new webhook is: curl -X POST -H 'Content-Type: application/json' 'http://example.com/1823123/' -d '{"text": data_to_send}'
        return 'OK'

app = app.gaerun()
So on GAE, is there a preferred method for (2) parsing incoming URL and (5) sending a webhook?
I'm not familiar with web.py. Many GAE apps are based on webapp2.
For parsing URLs with webapp2, you create routes. Here is a simple route that I created for processing PayPal IPNs:
(r'/ipn/(\S+)/(\w+)', website.ProcessIPN)
I then have a handler that processes this route:
class ProcessIPN(webapp2.RequestHandler):
    def post(self, user, secret):
        payload = json.loads(self.request.body)
        ...
The route is a regular expression that captures two parts of the URL, and these are passed as the two parameters of the handler (user and secret). Assuming your payload is JSON, you can easily get it with webapp2 as indicated above.
For sending a webhook, you need to use urlfetch.
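For example, a rough sketch (assuming the GAE Python 2 standard environment; the helper name is mine, and the target URL is the one from the question's curl example) could be:

import json
from google.appengine.api import urlfetch

def send_webhook(data_to_send):
    # POST the reformatted string as JSON to the receiving webhook.
    result = urlfetch.fetch(
        url='http://example.com/1823123/',
        payload=json.dumps({'text': data_to_send}),
        method=urlfetch.POST,
        headers={'Content-Type': 'application/json'})
    return result.status_code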
Given how simple your use case is, I recommend not using web.py and instead using webapp2.
I'm working on my first Flask app (version 0.10.1), and also my first Python (version 3.5) app. One of its pieces needs to work like this:
Submit a form
Run a Celery task (which makes some third-party API calls)
When the Celery task's API calls complete, send a JSON post to another URL in the app
Get that JSON data and update a database record with it
Here's the relevant part of the Celery task:
if not response['errors']:  # response comes from the Salesforce API call
    # do something to notify that the task was finished successfully
    message = {'flask_id': flask_id, 'sf_id': response['id']}
    message = json.dumps(message)
    print('call endpoint now and update it')
    res = requests.post('http://0.0.0.0:5000/transaction_result/', json=message)
And here's the endpoint it calls:
@app.route('/transaction_result/', methods=['POST'])
def transaction_result():
    result = jsonify(request.get_json(force=True))
    print(result.flask_id)
    return result.flask_id
So far I'm just trying to get the data and print the ID, and I'll worry about the database after that.
The error I get though is this: requests.exceptions.ConnectionError: None: Max retries exceeded with url: /transaction_result/ (Caused by None)
My reading indicates that my data might not be coming over as JSON, hence the force=True on the get_json call, but even this doesn't seem to work. I've also tried doing the same request in CocoaRestClient, with a Content-Type header of application/json, and I get the same result.
Because both of these attempts break, I can't tell if my issue is in the request or in the attempt to parse the response.
First of all, request.get_json(force=True) returns a Python object (or None if silent=True), while jsonify converts objects into a JSON response. You are then trying to access .flask_id on that, which cannot work. Moreover, even after removing the redundant jsonify call, you'll have to change result.flask_id to result['flask_id'].
So, eventually the code should look like this:
@app.route('/transaction_result/', methods=['POST'])
def transaction_result():
    result = request.get_json()
    return result['flask_id']
And you are absolutely right to use a REST client to test the route: it greatly simplifies testing by reducing the number of moving parts. One well-known problem when sending requests from a Flask app to the same app is running that app under the development server with only one thread. In that case an internal request is always blocked, because the single thread is busy serving the outermost request and cannot handle the internal one. However, since you are sending the request from the Celery task, that is not likely your scenario.
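(As a side note, not part of the original answer: if you ever do need an app to call one of its own endpoints under the development server, running it with more than one thread is a common workaround.)

# Sketch only: let the dev server handle concurrent requests.
if __name__ == '__main__':
    app.run(port=5000, threaded=True)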
UPD: In the end, the actual cause was the IP address 0.0.0.0. Changing it to the real address solved the problem.
I am running a web server based on Flask, which serves a versioned resource (e.g. the installation file of some versioned program). I want to serve my HTTP client the resource only if it does not already have the current version available. If there is a new version, I want the client to download the resource and install it.
My Flask server looks like this:
import json
import redis
import math
import requests
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route('/version', methods=['GET', 'POST'])
def getversion():
    r_server = redis.Redis("127.0.0.1")
    if request.method == 'POST':
        jsonobj_recieve = request.data
        data = json.loads(jsonobj_recieve)
        currentversion = r_server.hget('version')
        if data == currentversion:
            pass  # code to return an 'ok'
        else:
            pass  # code to return 'not ok'; also should send the updated file to the client
    else:
        return r_server.hget('version')

if __name__ == '__main__':
    app.run(
        debug=True,
        host="127.0.0.1",
        port=80
    )
My client is very basic:
import sys
import json
import requests
url="http://127.0.0.1/version"
jsonobj=json.dumps(str(sys.argv[1]))
print jsonobj
r=requests.post(url,data=jsonobj)
I will likely have to recode the entire client; this is not a problem, but I really have no idea where to start.
Requirements Review
You have a web app serving a versioned resource. It can be e.g. a file with an application.
You have a client which shall fetch the resource only if the version on the server differs from the version it already has locally.
The client is aware of the version string of the resource it has.
The client shall be able to learn the new version string when a new version is available.
HTTP-like design of your solution
If you want to allow downloading the application only when the client does not already have it, the following design could be used:
Use the etag header. It usually contains a string describing the unique state of the resource served at that URL. In your case it could be the current version number of your application.
In your request, use the "if-none-match" header, providing the version number of the application present at the client. This will result in HTTP status code 304 Not Modified in case your client and server share the same version of the resource. If the versions differ, the server simply provides the content of the resource and the client uses it. The response shall also declare the current version of the resource in the etag header, and your client shall take note of it, or learn the new version name from other sources (like the downloaded file itself).
This design follows HTTP principles.
Flask serving the resource and declaring the version in etag
This focuses on showing the principle; you will need to elaborate on providing the real content of the resource.
from flask import Flask, Response, request
import werkzeug.exceptions

app = Flask(__name__)

class NotModified(werkzeug.exceptions.HTTPException):
    code = 304
    def get_response(self, environment):
        return Response(status=304)

@app.route('/download/app')
def downloadapp():
    currver = "1.0"
    if request.if_none_match and currver in request.if_none_match:
        raise NotModified
    def generate():
        yield "app_file_part 1"
        yield "app_file_part 2"
        yield "app_file_part 3"
    return Response(generate(), headers={"etag": currver})

if __name__ == '__main__':
    app.run(debug=True)
Client getting the resource only if it is new
import requests
ver = "1.0"
url = "http://localhost:5000/download/app"
req = requests.get(url, headers={"If-None-Match": ver})
if req.status_code == 200:
    print "new content of resource", req.content
    new_ver = req.headers["etag"]
else:
    print "resource did not change since last time"
Alternative solution for the web part using a web server (e.g. NGINX)
Assuming the resource is a static file which updates only occasionally, you should be able to configure your web server, e.g. NGINX, to serve that resource and declare an explicit value for the etag header set to the version string.
Note that, as it was not requested, this alternative solution is not elaborated here (and was not tested).
The client implementation would not need to change (here it pays off that the design follows HTTP concepts).
There are multiple ways of achieving this but as this is a Flask app, here's one using HTTP.
If the version is OK, just return a relevant status code, like a 200 OK. You can add a JSON response in the body if that's necessary. If you return a string with Flask, the status code will be 200 OK and you can inspect that in your client.
If the version differs, return the URL where the file is located. The client will then have to download the file. That's pretty simple using requests. Here's a typical example of downloading a file with a streaming request:
import requests

def get(url, chunk_size=1024):
    """ Download a file in chunks of n bytes """
    fn = url.split("/")[-1]  # if your url is complicated, use urlparse.
    stream = requests.get(url, stream=True)
    with open(fn, "wb") as local:
        for chunk in stream.iter_content(chunk_size=chunk_size):
            if chunk:
                local.write(chunk)
    return fn
This is very simplified. If your file is not static and cannot live on the server (like software update patches probably shouldn't) then you'll have to figure out a way to get the file from a database or generate it on the fly.
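For completeness, here is a rough server-side sketch of the two cases described above (the route name, version constant, and download URL are assumptions, not taken from the original answer):

from flask import Flask, jsonify, request

app = Flask(__name__)
CURRENT_VERSION = "1.0"   # assumed constant; a real app would read this from config or Redis

@app.route('/version', methods=['POST'])
def check_version():
    client_version = (request.get_json(silent=True) or {}).get('version')
    if client_version == CURRENT_VERSION:
        return "ok"       # 200 OK: the client already has the current version
    # Version differs: point the client at the file to download.
    return jsonify({"download_url": "http://127.0.0.1/static/app-" + CURRENT_VERSION + ".bin"})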