How to receive json POST request using web2py - python

I'm very new to web2py and I'm trying to use it to test a remote server's application (I send an HTTP POST request using python requests with a file to process, and expect a counter POST request with a JSON report, which I want to display in the shell or save to a file). I found the following code for a similar issue with XML data:
# Controller code:
def index():
    response.headers['content-type'] = 'text/xml'
    xml = request.body.read()  # retrieve the raw POST data
    if len(xml) == 0:
        xml = '<?xml version="1.0" encoding="utf-8" ?><root>no post data</root>'
    return response.render(dict(xml=XML(xml)))

# View code:
{{=xml}}
but I can't make the proper changes to adapt it for my purpose.
So the question is: how can I simply receive JSON data via a POST request and save it directly to my computer, or display it somehow, using web2py? No buttons, upload fields or databases needed; I only want to get the data from the incoming request.

In your client:
import requests
# ...
your_url = 'http://domain.com/app/controllerfile/j'
r = requests.post(your_url, json=jsonData)
In a web2py controller file 'controllerfile':
import json

def j():
    with open("json.txt", "a") as f:
        json.dump(request.vars, f)
    return
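If you prefer to work with the raw request body rather than request.vars, a minimal sketch of the controller might look like this (assuming the client sends Content-Type: application/json; the json.txt file name just mirrors the answer above):
import json

def j():
    # read the raw POST body and parse it ourselves
    raw = request.body.read()
    data = json.loads(raw) if raw else {}
    # append the parsed data to a local file
    with open("json.txt", "a") as f:
        json.dump(data, f)
        f.write("\n")
    # return the parsed data as a string so it is also visible in the HTTP response
    return json.dumps(data)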

Related

How to send application/json data with image file in python flask?

I am using the send_from_directory function.
@app.route('/get_data', methods=["GET"])
def get_data():
    path = os.path.join(app.config['UPLOAD_FOLDER'], 'uploads')
    return send_from_directory(path, "cat.jpg")
An HTTP response should contain one type of data; what to do depends on what your client does.
For example, you can return JSON to the client that contains a URL for the image, and the client then fetches that image.
Another option is to return the image and put the JSON data in an HTTP header, or, as in the first option, have an HTTP header that contains a URL for the JSON data and make the client fetch the JSON later.
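A minimal sketch of the second option (returning the image body and attaching the JSON metadata in a custom response header); the header name X-Metadata, the upload folder and the file name are assumptions for illustration:
import json
import os
from flask import Flask, send_from_directory

app = Flask(__name__)
app.config['UPLOAD_FOLDER'] = '/srv/data'

@app.route('/get_data', methods=["GET"])
def get_data():
    path = os.path.join(app.config['UPLOAD_FOLDER'], 'uploads')
    # send_from_directory returns a Response object, so extra headers can be attached
    response = send_from_directory(path, "cat.jpg")
    response.headers['X-Metadata'] = json.dumps({"name": "cat.jpg", "label": "cat"})
    return response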
Good luck :)

Python Falcon - get POST data

I'm trying to use the falcon package in my project. The problem is that I haven't found a way to get the body data from the HTTP POST request.
I used code from an example, but req.stream.read() doesn't return JSON as expected.
The code is:
raw_json = req.stream.read()
result.json(raw_json, encoding='utf-8')
resp.body = json.dumps(result_json, encoding='utf-8')
How to get the POST data?
Thanks for any help
In Falcon 2, if you work with JSON, use req.media.
For example:
import falcon
from json import dumps

class Resource(object):
    def on_post(self, req, resp, **kwargs):
        result = req.media
        # do your job
        resp.body = dumps(result)

api = falcon.API()
api.add_route('/test', Resource())
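As a quick usage check, a hedged client-side test might look like this (assuming the app above is served locally, e.g. via gunicorn on port 8000):
import requests

# POST a small JSON payload to the /test route defined above
r = requests.post('http://127.0.0.1:8000/test', json={"hello": "world"})
print(r.json())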
A little digging into the problem led to the linked issue on GitHub. It states that the falcon framework, at least in version 0.3 under Python 2, didn't parse POSTed data as a string even if it was properly escaped. We could use more information on what data you are trying to send in the POST request and in what format it is being sent: as plain text, with the header Content-Type: application/json, or through an HTML form.
While the exact issue is not clear from the question, I would still suggest trying bounded_stream instead of stream, as in:
raw_json = req.bounded_stream.read()
result.json(raw_json, encoding='utf-8')
resp.body = json.dumps(result_json, encoding='utf-8')
since the official documentation suggests using bounded_stream under uncertain conditions, such as when Content-Length is undefined or 0, or when header information is missing altogether.
bounded_stream is described as the following in the official falcon documentation.
File-like wrapper around stream to normalize certain differences between the native input objects employed by different WSGI servers. In particular, bounded_stream is aware of the expected Content-Length of the body, and will never block on out-of-bounds reads, assuming the client does not stall while transmitting the data to the server.
Falcon receives the HTTP request data as a buffer object passed in by the WSGI wrapper that receives the data from the client, and it's possible that, for performance reasons, it doesn't parse that data into a more usable data structure.
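A minimal hedged sketch of an on_post handler that reads the body via bounded_stream and parses it with the standard json module (the /raw route and echo behaviour are illustrative):
import json
import falcon

class RawJSONResource(object):
    def on_post(self, req, resp):
        raw = req.bounded_stream.read()  # bytes; empty if there is no body
        data = json.loads(raw) if raw else {}
        resp.body = json.dumps(data)  # echo the parsed payload back

api = falcon.API()
api.add_route('/raw', RawJSONResource())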
Big thanks to Ryan (and Prateek Jain) for the answer.
The solution is simply to put app.req_options.auto_parse_form_urlencoded=True. For example:
import falcon

class ThingsResource(object):
    def on_post(self, req, resp):
        value = req.get_param("value", required=True)
        # do something with value

app = falcon.API()
app.req_options.auto_parse_form_urlencoded = True
things = ThingsResource()
app.add_route('/things', things)
The field you're looking for is somewhat confusingly named, but it's req.media:
Returns a deserialized form of the request stream. When called, it will attempt to deserialize the request stream using the Content-Type header as well as the media-type handlers configured via falcon.RequestOptions.
If the request is JSON, req.media already contains a python dict.
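If the body might be missing or malformed, a small hedged sketch like the following guards the req.media access (the /media route is just an illustration; the exact exception raised varies by Falcon version, but it derives from falcon.HTTPBadRequest):
import json
import falcon

class MediaResource(object):
    def on_post(self, req, resp):
        try:
            data = req.media  # a python dict when the body is a JSON object
        except falcon.HTTPBadRequest:
            # malformed or missing JSON body
            resp.status = falcon.HTTP_400
            resp.body = json.dumps({"error": "expected a JSON body"})
            return
        resp.body = json.dumps({"received": data})

api = falcon.API()
api.add_route('/media', MediaResource())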
I have added changes in request.py in the falcon framework to parse application/x-www-form-urlencoded and multipart/form-data.
I have raised pull request - https://github.com/falconry/falcon/pull/1236 but it is not yet merged in master.
Check this - https://github.com/branelmoro/falcon
I have added new code to parse POST, PUT and DELETE application/x-www-form-urlencoded and multipart/form-data.
Text fields will be available in the req.form_data dictionary, and uploaded file buffer streams will be available in the req.files dictionary.
I hope this will help with accessing POST and GET parameters separately, and we will be able to upload files as well.
A good thing about the change is that it does not load the entire uploaded file into memory.
Below is sample code showing how to use POST, PUT and DELETE with application/x-www-form-urlencoded and multipart/form-data:
import falcon

class Resource(object):
    def on_post(self, req, resp):
        # req.form_data will return a dictionary of text field names and their values
        print(req.form_data)
        # req.files will return a dictionary of file field names and
        # their buffer-class FileStream objects as values
        print(req.files)
        # Suppose we are uploading an image.jpg in the `pancard` file field; then
        # req.files["pancard"] will be a FileStream buffer object.
        # We can use the set_max_upload_size method to set the maximum allowed
        # file size, say 1 MB = 1*1024*1024 bytes, for this file
        req.files["pancard"].set_max_upload_size(1 * 1024 * 1024)
        # We can use the uploadto method to upload the file to the required path
        # (note: an absolute file path is required).
        # This method returns a boolean - `True` on successful upload;
        # if the upload is unsuccessful it returns `False` and sets an error on the file.
        path = "/tmp/" + req.files["pancard"].name
        response = req.files["pancard"].uploadto(path)
        print(response)
        # Once the file is uploaded successfully, we can check its size
        print(req.files["pancard"].size)
        # If the file is not uploaded successfully, we can check its error
        print(req.files["pancard"].error)
        resp.body = "Done file upload"
        resp.status = falcon.HTTP_200

# falcon.API instances are callable WSGI apps
app = falcon.API()
things = Resource()
# things will handle POST requests to the '/post_path' URL path
app.add_route('/post_path', things)
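As a hedged usage example against that route (assuming the app runs locally on port 8000; the field names match the code above):
import requests

# multipart/form-data upload: one file field and one text field
files = {"pancard": open("image.jpg", "rb")}
data = {"comment": "test upload"}  # text field, available via req.form_data
r = requests.post("http://127.0.0.1:8000/post_path", data=data, files=files)
print(r.status_code, r.text)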
Do let me know if you have any doubts.
So far, for me, bounded_stream.read() and stream.read() both return the posted data as type str. The only way around the issue that I have found so far is:
def on_post(self, req, resp):
    posted_data = json.loads(req.stream.read())
    print(str(type(posted_data)))
    print(posted_data)
Loading the string into a JSON dict once the posted data is received is the only solution I can come up with.
Here's something I used while designing an API.
import falcon
import json

class VerifierResource():
    def on_post(self, req, resp):
        credentials = json.loads(req.stream.read())
        if credentials['username'] == USER \
                and credentials['passwd'] == PASSWORD:
            resp.body = json.dumps({"status": "verified"})
        else:
            resp.body = json.dumps({"status": "invalid"})

api = falcon.API()
api.add_route('/verify', VerifierResource())
This returns serialized JSON in the corresponding response body.
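A hedged client-side test of that resource might look like this (assuming the service runs locally on port 8000 and that USER/PASSWORD are defined server-side; the credentials are placeholders):
import requests

r = requests.post('http://127.0.0.1:8000/verify',
                  json={"username": "alice", "passwd": "secret"})
print(r.json())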
Here is a sample way to get the media from the body; I use it to get the body in the POST method:
def on_post(self, req, resp):
    arguments = {}
    # get the body media in the POST method
    body = req.get_media()
    if 'something' in body:
        arguments['something'] = body['something']
Send the body with the matching Content-Type media type and print resp, or use it in your code; if you want to send a JSON body, your request should supply the JSON parameters.

Python - Accept POST (raw body) data

I am completely new to Python. I am using GitLab, which offers a system hook feature where I can specify a URL and it will send event details as JSON POST data. When I create a RequestBin URL and provide it in GitLab's system hook, then on any event such as project creation, it sends the event details, and I can see them in RequestBin as shown in the snapshot below.
Now, I want to fetch this JSON data into a variable so I can process it as needed, but I'm not sure how to read that data.
I've seen some posts that explain how to read JSON data, but as you can see in the screenshot below, FORM/POST PARAMETERS shows as None. It's the raw body that contains all the details (in JSON format):
I have tried reading the data using Java and it works with the code shown below:
String recv;
String recvbuff = "";
BufferedReader buffread = new BufferedReader(new InputStreamReader(request.getInputStream()));
while ((recv = buffread.readLine()) != null)
    recvbuff += recv;
buffread.close();
System.out.println(recvbuff);
out.println(recvbuff);
Looking for something similar in Python.
I would suggest using CherryPy. It's a neat Python library that allows you to build a simple webserver application, and it fits your use case pretty nicely: it can easily accept JSON requests (http://docs.cherrypy.org/en/latest/basics.html#dealing-with-json).
If you write a file called myserver.py with the following code:
#!/usr/bin/python3
import cherrypy

class Root(object):
    @cherrypy.expose
    @cherrypy.tools.json_in()
    def index(self):
        data = cherrypy.request.json
        # You can manipulate your json data here as you wish
        print(data['name'])

if __name__ == '__main__':
    cherrypy.quickstart(Root(), '/')
You can simply launch the server with the command line:
python3 myserver.py
And test it with the following curl command:
curl -H "Content-Type: application/json" -X POST http://127.0.0.1:8080 -d '{"name": "test", "path": "/"}'
You will then see test printed in your server log.
Your Flask application doesn't return any data, so you're not going to see anything returned. You need to return something like:
return "test data"
Your screenshot only shows the request, not the response. You sent no form-encoded parameters, which is why it shows "None".
The correct Content-type for JSON is: application/json
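For reading the raw JSON body in Flask, which the answer above alludes to, a minimal sketch might look like this (the /hook route name is an assumption for illustration):
import json
from flask import Flask, request

app = Flask(__name__)

@app.route('/hook', methods=["POST"])
def hook():
    # request.get_json() parses the raw body when Content-Type is application/json;
    # force=True also parses it when that header is missing or different
    data = request.get_json(force=True)
    print(data)
    return json.dumps({"received": True})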

Webapp2 request application/octet-stream file upload

I am trying to upload a file to S3 from a form via AJAX. I am using Fine Uploader (http://fineuploader.com/) on the client side and webapp2 on the server side. It sends the parameter as qqfile in the request, and I can see the image data in the request headers, but I have no idea how to get it back in browsers that do not use multipart encoding.
This is how I was doing it with a standard HTML form POST with multipart encoding:
image = self.request.POST["image"]
This gives me the image name and the image file.
Currently I have only been able to get the image filename back, not the data, with:
image = self.request.get_all('image')
[u'image_name.png']
When using the POST I get a warning about the content headers being application/octet-stream:
<NoVars: Not an HTML form submission (Content-Type: application/octet-stream)>
How do I implement a BlobStoreHandler in webapp2 outside of GAE?
The code in your question is not very clear to me, but you can work from an example.
Have a look at this article from Nick Johnson. He implements a dropbox service using app engine blobstore and Plupload in the client : http://blog.notdot.net/2010/04/Implementing-a-dropbox-service-with-the-Blobstore-API-part-2
I ended up using Fine Uploader (http://fineuploader.com/), which sends a multipart-encoded form to my handler endpoint.
Inside the handler I could simply reference the POST data and then read the FieldStorage object into a cStringIO object.
image = self.request.POST["qqfile"]
imgObj = cStringIO.StringIO(image.file.read())

# Connect to S3...

# create s3 Key
key = bucket.new_key("%s" % uuid.uuid4())

# guess mimetype for headers
file_mime_type = mimetypes.guess_type(image.filename)
if file_mime_type[0]:
    file_headers = {"Content-Type": "%s" % file_mime_type[0]}
else:
    file_headers = None

key.set_contents_from_string(imgObj.getvalue(), headers=file_headers)
key_str = key.key
# return JSON response with key and append to s3 url on front end
Note: qqfile is the parameter Fine Uploader uses.
I am faking the progress, but that is OK for my use case; there is no need for a BlobStoreUploadHandler.
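To round out the elided comment at the end of that handler, a hedged sketch of returning the JSON response from the webapp2 handler might look like this (key_str comes from the code above; the bucket URL is an assumption about your S3 setup):
import json

# ... inside the webapp2.RequestHandler's post() method, after the upload:
self.response.headers['Content-Type'] = 'application/json'
self.response.write(json.dumps({
    "success": True,  # Fine Uploader typically expects a success flag in the response
    "key": key_str,
    # hypothetical public URL; the real bucket/domain depends on your S3 configuration
    "url": "https://my-bucket.s3.amazonaws.com/%s" % key_str,
}))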

sending data to xml api of webservice

I'm trying to write a Python script that interacts with a web service that uses an XML API. The request method is POST.
Usually I would write a request of the form request(url, data, headers); however, in the case of an XML API that does not work. Also, something like data.encode('utf-8') or urllib.urlencode(data) would not work, as the data is not a dict.
In this case, the data is XML, so how am I supposed to send it over?
[EDIT]
When I send a string of XML I get urllib2.HTTPError: HTTP Error 415: Unsupported Media Type. Is there another way I'm supposed to send the data?
Also, the API I am using is the Google Contacts API. I'm trying to write a script that adds a contact to my Gmail account.
You probably need to set the proper Content-Type header; for XML it would probably be:
application/xml
So something like this should get you going:
request = urllib2.Request('http://xml_api.example.com')
request.add_header('Content-Type', 'application/xml')
response = urllib2.urlopen(request, xml_data_string)
Hope that helps :)
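If you can use the requests library (already used in the first question above) instead of urllib2, an equivalent hedged sketch would be (the URL is a placeholder; the Google Contacts API additionally requires an Authorization header, omitted here):
import requests

xml_data_string = '<?xml version="1.0" encoding="utf-8"?><entry>...</entry>'
headers = {'Content-Type': 'application/xml'}
# placeholder URL; substitute the real API endpoint
response = requests.post('http://xml_api.example.com', data=xml_data_string, headers=headers)
print(response.status_code, response.text)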
