I'm building a small tool (no UI) that will accept webhooks from several services, reformat the content, and then send standardized content to another webhook.
Example: Stripe's webhook has lots of raw data, but I want to ping another webhook with a summary. So I want to take the raw data Stripe sends me, reformat it into a simple string, and then send it to another webhook.
I imagine the script doing this:
# 1. receive webhook sent to URL with ?service=stripe param
# 2. parse URL to get service param
# 3. parse data received in payload
# 4. use some of the data received in new string
# 5. send string to new webhook
I'd love to host this on GAE. I've built lots of projects with Django, but since this doesn't need a UI or database, that seems heavy. I'd love any help. I've got the GAE project set up and have this going:
import web  # using web.py
import logging

urls = (
    "/.*", "hooks",
)

app = web.application(urls, globals())

class hooks:
    # 1. DONE receive webhook sent to URL with ?service=stripe param
    def POST(self):
        # 2. parse URL to get service param
        # ...
        # service_name = [parsed service name]

        # 3. DONE parse data received in payload
        data = web.data()
        logging.info(data)

        # 4. DONE use some of the data received in new string
        # (I've got the reformatting scripts already written)
        # data_to_send = reformat(data, service_name)

        # 5. send data_to_send as payload to new webhook
        # new_webhook_url = 'http://example.com/1823123/'
        # CURL for new webhook is: curl -X POST -H 'Content-Type: application/json' 'http://example.com/1823123/' -d '{"text": data_to_send}'
        return 'OK'

app = app.gaerun()
So on GAE, is there a preferred method for (2) parsing incoming URL and (5) sending a webhook?
I'm not familiar with web.py. Many GAE apps are based on webapp2.
For parsing URLs with webapp2, you create routes. Here is a simple route that I created for processing PayPal IPNs:
(r'/ipn/(\S+)/(\w+)', website.ProcessIPN)
I then have a handler that processes this route:
class ProcessIPN(webapp2.RequestHandler):
    def post(self, user, secret):
        payload = json.loads(self.request.body)
        ...
The route is a regular expression that captures two parts of the URL, and these are passed as the two parameters of the handler (user and secret). Assuming your payload is JSON, you can easily read it with webapp2 as shown above.
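If you'd rather keep the ?service=stripe query parameter from your original plan instead of capturing the service from the path, webapp2 can read it from the request object. A rough sketch (the handler and route names are just illustrative):

import json
import webapp2

class Hooks(webapp2.RequestHandler):
    def post(self):
        # 2. read the ?service=stripe query parameter
        service_name = self.request.get('service')
        # 3. parse the JSON payload sent by the service
        data = json.loads(self.request.body)
        # 4. data_to_send = reformat(data, service_name)  # your existing reformatting code
        self.response.write('OK')

app = webapp2.WSGIApplication([(r'/.*', Hooks)])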
For sending a webhook, you need to use urlfetch.
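For example, a rough sketch of step (5), reusing the example URL and the data_to_send variable from the question:

import json
import logging
from google.appengine.api import urlfetch

def send_webhook(data_to_send):
    # 5. POST the reformatted string as JSON to the downstream webhook
    result = urlfetch.fetch(
        url='http://example.com/1823123/',
        payload=json.dumps({'text': data_to_send}),
        method=urlfetch.POST,
        headers={'Content-Type': 'application/json'})
    if result.status_code != 200:
        logging.warning('Webhook delivery failed with status %s', result.status_code)
    return result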
Given how simple your use case is, I recommend not using web.py and instead using webapp2.
Related
I'm new to Flask and I'm creating a website with the following functionality: I want to retrieve the user's location and store it in a database in order to build some analytics about the website audience (and eventually also include some degree of personalisation based on the user's location).
In order to do so I'm using an external API (https://ipinfo.io), and my goal is to send a GET request to this API with the user's IP address every time somebody logs in to my website.
My current approach is this:
def get_my_ip():
    # Get user IP address
    ip_address = request.remote_addr
    # Retrieve user location data
    location_details = get_ip_location_details(ip_address="80.57.40.203", access_token=ACCESS_TOKEN)
    # Store data in the database
    webview = WebsiteView(**location_details)
    db.session.add(webview)
    db.session.commit()
And I'm calling this function in all my routes, alongside the function that renders the templates.
# Home route
@app.route("/")
def home():
    get_my_ip()
    return render_template('home.html')
This actually works, but sometimes the website sends multiple requests to the external API when somebody logs in or refreshes the page (even though the user's interaction with the website is the same and all the requests are successful).
I would like to store this information only once each time somebody logs in to the website. The only idea I have so far is putting constraints on the database so the same user isn't tracked in multiple records, but I would also like to send as few requests as possible to the ipinfo API.
Is it possible to tell the website to wait for the response to a request before sending new ones?
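For example, would something like this sketch work (assuming my app has a secret_key set so Flask's session is usable; the function and model names are the ones from my snippet above)?

from flask import request, session

def get_my_ip():
    # Only look up and store the location once per session/login
    if session.get('location_recorded'):
        return
    ip_address = request.remote_addr
    location_details = get_ip_location_details(ip_address=ip_address, access_token=ACCESS_TOKEN)
    webview = WebsiteView(**location_details)
    db.session.add(webview)
    db.session.commit()
    session['location_recorded'] = True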
I'm using an OpenAPI 3.0 specification (swagger.yml) and use Swagger Codegen to create the corresponding Python Flask application stubs. This is how I run the application to expose my Swagger API:
app = connexion.App(__name__, specification_dir='./swagger/')
app.app.json_encoder = encoder.JSONEncoder
app.add_api('swagger.yaml', arguments={'title': 'My Test API'})
# add CORS support to send Access-Control-Allow-Origin header
CORS(app.app)
So far so good. The application logic is handled within the generated Python stubs, which are linked via x-openapi-router-controller: swagger_server.controllers.user_controller.
However, I now need to access HTTP-request-specific information within the application itself, for example to react differently based on the HTTP_CLIENT_IP address.
How can I obtain that information within my controller endpoint?
Use Flask's request context.
For example, to get the HTTP_CLIENT_IP, use:
from flask import request
http_client_ip = request.remote_addr
You can read more about request in the Flask documentation.
There are two related links addressing the same issue of request header parameters and how connexion does not forward them to custom controllers. I ended up manually accessing the headers via:
access_token = connexion.request.headers['access_token']
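For example, a minimal sketch of a controller function (the name user_get is only an illustration of a stub generated from the spec):

import connexion
from flask import request

def user_get():
    # IP address of the client making the HTTP request
    client_ip = request.remote_addr
    # Raw request headers via connexion's request proxy
    token = connexion.request.headers.get('access_token')
    return {'client_ip': client_ip, 'token_present': token is not None}, 200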
I am using the Heroku platform to host a simple webhook, which is written in Python and uses Flask. The cloud system I use sends a POST request to my webhook with some data, and I should respond.
I would like to keep a log of all requests and responses.
Sometimes the cloud system sends the same request to my webhook again (for whatever reason: I didn't send a response within the specified time, or something like that).
I would like to distinguish such requests. If I simply log what I receive, I may see something like this in my log:
1) Starting processing request (1st request)
2) Starting processing request (2nd request)
3) Calculating response (1st request)
4) Calculating response (2nd request)
I would like to add a unique identifier to each line, something like a session id, so I can filter all entries that belong to the same process.
How is this possible?
Here is the sample code:
# import of necessary modules
import modules

app = Flask(__name__)
app.config.from_json('config.json')

@app.route("/", methods=['GET', 'POST'])
def index():
    body = request.data
    signature = request.headers['x-pyrus-sig']
    secret = str.encode(app.config['SECRET_KEY'])
    if _is_signature_correct(body, secret, signature):
        return _prepare_response(body.decode('utf-8'))

def _is_signature_correct(message, secret, signature):
    ...
    # Verifying that request is from trusted source

def _prepare_response(body):
    ...
    # Sending the response

if __name__ == "__main__":
    app.run()
It seems that instead of trying to find a unique session id from the system, it's better to generate your own.
Something like this:
How can I generate a unique ID in Python?
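For example, a minimal sketch of how that could look in the index handler from the question, using the standard uuid module so every incoming request gets its own id in the log:

import logging
import uuid

@app.route("/", methods=['GET', 'POST'])
def index():
    # One unique id per incoming request, prefixed to every log line
    request_id = uuid.uuid4().hex
    logging.info('[%s] Starting processing request', request_id)
    body = request.data
    signature = request.headers['x-pyrus-sig']
    secret = str.encode(app.config['SECRET_KEY'])
    logging.info('[%s] Calculating response', request_id)
    if _is_signature_correct(body, secret, signature):
        return _prepare_response(body.decode('utf-8'))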
I am trying to learn how to inform ServiceNow via a REST API that I updated a record in MySQL (AWS RDS) using a Python script (on AWS EC2, Windows Server 2012), so that it can fetch that updated record. What particular Python libraries / modules should I learn to put me on the right track?
Currently my Python script and MySQL RDS are communicating just fine.
I am still in the phase of trying to get a better understanding of REST API and AWS EC2.
Any other AWS, ServiceNow, or Python related information that could be shared would be greatly appreciated.
The ServiceNow table REST API is very straightforward so inserting a record into an arbitrary table with Python is a breeze. For example:
#Need to install requests package for python
#easy_install requests
import requests
# Set the request parameters
url = 'https://[instance name].service-now.com/api/now/table/[table name]'
# Eg. User name="admin", Password="admin" for this code sample.
user = 'admin'
pwd = 'admin'
# Set proper headers
headers = {"Content-Type":"application/json","Accept":"application/json"}
# Do the HTTP request - this is fake data
response = requests.post(url, auth=(user, pwd), headers=headers, data="[your json string of fields and values]")
# Check for HTTP codes other than 200 or 201 (a successful insert returns 201 Created)
if response.status_code not in (200, 201):
    print('Status:', response.status_code, 'Headers:', response.headers, 'Error Response:', response.json())
    exit()
# Decode the JSON response into a dictionary and use the data
data = response.json()
print(data)
The REST API Explorer in ServiceNow is very useful for building and testing queries. It even generates sample code. You can search for REST API Explorer in the navigator to find it.
Another option is to create a Scripted REST API in ServiceNow to create a custom URL that you can hit to achieve the notification. This is good if you don't need to persist the data and just want to be notified.
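If you go the Scripted REST API route, calling it from your Python script is just another requests.post; the resource path and payload below are purely hypothetical and depend on how you define the resource in ServiceNow:

import requests

# Hypothetical Scripted REST API resource path defined in ServiceNow
url = 'https://[instance name].service-now.com/api/x_acme/notify/record_updated'
payload = {'table': 'my_mysql_table', 'record_id': '12345'}  # illustrative fields
response = requests.post(url, auth=('admin', 'admin'),
                         headers={'Content-Type': 'application/json'},
                         json=payload)
print(response.status_code, response.text)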
I'm trying to download data from an external API. There's going to be a lot of downloads, so I want to use pipelines for easier parallelization. The way the API is set up, I can make a request to start a download job, and pass a postback url in that request. When the download job finishes, their API sends a POST to the given url. I want to do the following:
class DownloadPipeline(pipeline.Pipeline):
    async = True
    public_callbacks = True

    def run(self, filename):
        postback = self.get_callback_url()
        # make API request with postback as a param

    def callback(self):
        # Read data from the POST
However, all the docs I've read online only have examples of GET requests on the callback url, where data is passed through a query string on the URL. Is there a way to read POST data instead?
Looks like both POST and GET call over to run_callback() ... so you should be able to do either.