How to view Terminal output when running Requests in Python?

I have a simple Python script that POSTs a local file to a given URL via Requests:
import requests
url = 'http://myWebsite.com/extension/extension/extension'
files = {'file': open("myLocalFile.csv")}
r = requests.post(url, files=files)
print r.headers
When I run a cURL command in Terminal that does the exact same thing:
curl -k -F docfile=@myLocalFile.csv http://myWebsite.com/extension/extension/extension
I get the output:
{"success":true, "data":{"uploaded":39, "errors":0, "unchanged":39, "skipped":0, "updated":0, "created":0, "failed":[]}, "numRows":1}
which indicates that I have successfully uploaded the file. How can I view this same output when I run my python script? I want to be able to parse through this output and check to see if "success" is true/false.
Sorry for the weird formatting, I tried to fix it but couldn't :(

print r.text
gives you the response body, which you can then parse to check whether success is true or false.
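For example, a minimal sketch, assuming the endpoint returns the JSON shown above, continuing from the r in the question:
result = r.json()  # parse the JSON response body into a dict
if result['success']:
    print 'upload succeeded'
else:
    print 'upload failed'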

Not having used the library before, the example on their home page (python-requests) seems helpful.
More specifically:
print(r.json()) #possibly print(r.content)
# prints {u'private_gists': 419, u'total_private_repos': 77, ...}
Also see this question

Related

How to get information without using pyCURL (requests)

I need to implement the following curl:
curl -k https://somelinkhere/get_info -d auth_info='{"session_id":"blablablaidhere"}' -d params='{"id":"12345"}'
Currently I have the following code. It is working, but not exactly as I need. I need to get json content from the reply, just one parameter.
url = 'https://somelinkhere/get_info'
data = {'auth_info':'{"session_id":"blablablaidhere"}', 'params':'{"id":"12345"}'}
response = requests.post(url, data=data)
res = response.content
print res
Now it returns
'�Z[o�6�+���֖���ې�0�{h�`
AK�M�"����o�9�dI���t��#RI<��"�GD�D��.3MDeN��
��hͣw�fY)SW����`0�{��$���L��Zxvww����~�qA��(�u*#��݅Pɣ����Km���'
etc.
What i need is to output
res['balance_info']['balance']
If I launch cURL (provided above) from the command line, I get the following:
{"balance_info":{"enabled":"Y","balance":"0.55000","out_date_format":"MM-DD-YYYY","password":"12345","blocked":"N"
But I do not know how to get this parameter using a Python script.
As in the documentation, the content property gives the binary version of the response.
You'll need to get the decoded version of the response using .text and then load it as JSON.
import json

response = requests.post(url, data=data)
# load it as JSON
item = json.loads(response.text)
And now you can access your keys as:
item['balance_info']['balance']
What you get is a gzipped JSON string.
You have to decompress before reading json. Then you can use the response as a python dict.
Something like res = json.loads(zlib.decompress(response.content))
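A slightly fuller sketch, assuming url and data are defined as in the question and the body really is gzip-compressed (zlib.decompress needs an extra wbits argument to accept a gzip header):
import json
import zlib
import requests

response = requests.post(url, data=data)
# 16 + zlib.MAX_WBITS tells zlib to expect a gzip header
decompressed = zlib.decompress(response.content, 16 + zlib.MAX_WBITS)
res = json.loads(decompressed)
print res['balance_info']['balance']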
Here is an example using Requests:
>>> import requests
>>> r = requests.post('https://somelinkhere/get_info',
...                   data={"id": "12345"})
See also the documentation to install Requests in your virtualenv.

unable to post file+data using python-requests

I'm able to post a file using curl:
curl -X POST -i -F name='barca' -F country='spain' -F file=@/home/messi/Desktop/barca.png 'http://localhost:8080/new_org/hel/concerts'
Which I can get (file) as
curl -X GET -H 'Accept: image/png' 'http://localhost:8080/new_org/hel/concerts/<id or name of entity>'
But when I tried the same thing using requests.post, I got an error. Does anybody know why this happens? (I get a POST error when the file pointer is not at the end; when the file pointer is at the end, I get a 200 response but the file is not posted.)
import requests
url = 'http://localhost:8080/new_org/hel/concerts'
file = dict(file=open('/home/messi/Desktop/barca.png', 'rb'))
data = dict(name='barca', country='spain')
response = requests.post(url, files=file, data=data)
Error (from Usergrid), with response code 400:
{u'duration': 0,
u'error': u'illegal_argument',
u'error_description': u'value is null',
u'exception': u'java.lang.IllegalArgumentException',
u'timestamp': 1448330119021}
https://github.com/apache/usergrid
I believe the problem is that Python is not sending a content-type field for the image that you are sending. I traced through the Usergrid code using a debugger and saw that curl is sending the content-type for the image and Python is not.
I was able to get this exact code to work on my local Usergrid:
import requests
url = 'http://10.1.1.161:8080/test-organization/test-app/photos/'
files = { 'file': ('13.jpg', open('/Users/dave/Downloads/13.jpg', 'rb'), 'image/jpeg')}
data = dict(name='barca', country='spain')
response = requests.post(url, files=files, data=data)
It is possible that Waken Meng's answer did not work because of the syntax of the files variable, but I'm no Python expert.
I ran into a problem before when I tried to upload image files. Then I read the docs and did this part:
You can set the filename, content_type and headers explicitly:
Here is how I define the file_data:
file_data = [('pic', ('test.png', open('test.png', 'rb'), 'image/png'))]
r = requests.post(url, files=file_data)
file_data should be a list: [(param_name, (file_name, file, content_type))]
This works for me, hope it can help you.

python requests.get is not working while curl does work

I can basically do the following in bash: (works fine)
COOKIE=mycookies
curl -b $COOKIE http://localhost:8080/data
but not in python (I followed: http://docs.python-requests.org/en/latest/user/quickstart/),
cookies = dict(cookies_are='mycookies')
response = requests.get(url='http://localhost:8080/data', cookies=cookies)
print response.status_code
print response.text
I keep getting
<li>Unauthenticated</li>
So... it might seem silly, and I can't tell whether this is just an error in how you posted the question, but it seems you are building the cookie dictionary incorrectly and using the wrong value.
cookies = dict(cookies_are='mycookies')
should be more like
import os, json
cookies = os.environ['COOKIE']
cookies = json.loads(cookies)
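Whatever the source of the value, requests expects cookies as a plain dict of names and values. A minimal sketch with a placeholder cookie name (the real name is whatever the server set and whatever curl's -b option was sending):
import requests

# 'sessionid' is a placeholder; use the cookie name the server actually expects
cookies = {'sessionid': 'mycookies'}
response = requests.get('http://localhost:8080/data', cookies=cookies)
print response.status_code
print response.text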

Emulating a cURL command with Python

I've got a cURL command that does what I need, and I'm trying to translate it into python. Here's the cURL:
curl http://example.com:1234/faye -d 'message={"channel":"/test","data":"hello world"}'
This talks to a Faye server and publishes a message to the channel /test. This works. I'm trying to do that same publishing from within Python. I've looked at this and this, and neither of them helped me; I get a 400 error with both of those methods. Here's some of the stuff I've tried from within the Python shell:
import urllib2, json, requests
addr = 'http://example.com:1234/faye'
data = {'message': {'channel': '/test', 'data': 'hello from python'}}
data_as_json = json.dumps(data)
requests.post(addr, data=data)
requests.post(addr, params=data)
requests.post(addr, data=data_as_json)
requests.post(addr, params=data_as_json)
req = urllib2.Request(addr, data)
urllib2.urlopen(req)
req = urllib2.Request(addr, data_as_json)
urllib2.urlopen(req)
# All these things give 400 errors
Unfortunately I can't wireshark the connection since it's over an SSH tunnel (so everything's encrypted and on the wrong ports). Using the --trace option from cURL I can see that it's not url-encoding the data, so I know I don't need to do that. I also really don't want to Popen cURL itself.
message in this case is the name of a POST variable, and shouldn't be included in the JSON.
Thus, what you actually want to do is this:
import urllib
data = urllib.urlencode({'message': json.dumps({'channel': '/test', 'data': 'hello from python'})})
conn = urllib2.urlopen('http://example.com:1234/faye', data=data)
print conn.read()
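The same fix works with requests, which the question was already using: put the JSON string inside the message form field and let requests url-encode it. A minimal sketch:
import json
import requests

addr = 'http://example.com:1234/faye'
payload = {'message': json.dumps({'channel': '/test', 'data': 'hello from python'})}
r = requests.post(addr, data=payload)
print r.status_code
print r.text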

Python 3 script to upload a file to a REST URL (multipart request)

I am fairly new to Python and am using Python 3.2. I am trying to write a Python script that will pick a file from the user's machine (such as an image file) and submit it to a server using a REST-based invocation. The Python script should invoke a REST URL and submit the file when the script is called.
This is similar to multipart POST that is done by browser when uploading a file; but here I want to do it through Python script.
If possible I do not want to add any external libraries to Python and would like to keep it a fairly simple Python script using the core Python install.
Can someone guide me or share a script example that achieves what I want?
The Requests library is what you need. You can install it with pip install requests.
http://docs.python-requests.org/en/latest/user/quickstart/#post-a-multipart-encoded-file
>>> url = 'http://httpbin.org/post'
>>> files = {'file': open('report.xls', 'rb')}
>>> r = requests.post(url, files=files)
A RESTful way to upload an image would be to use a PUT request if you know what the image URL is:
#!/usr/bin/env python3
import http.client
h = http.client.HTTPConnection('example.com')
h.request('PUT', '/file/pic.jpg', open('pic.jpg', 'rb'))
print(h.getresponse().read())
upload_docs.py contains an example of how to upload a file as multipart/form-data with basic HTTP authentication. It supports both Python 2.x and Python 3.
You could also use requests to post files as multipart/form-data:
#!/usr/bin/env python3
import requests
response = requests.post('http://httpbin.org/post',
                         files={'file': open('filename', 'rb')})
print(response.content)
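Since the question asks to avoid external libraries, here is a minimal sketch, not taken from any of the answers above, of a multipart/form-data upload using only the standard library; httpbin.org and the field name file are placeholders:
#!/usr/bin/env python3
import os
import uuid
import urllib.request

def upload_file(url, field_name, path, content_type='application/octet-stream'):
    # build the multipart/form-data body by hand with a random boundary
    boundary = uuid.uuid4().hex
    with open(path, 'rb') as f:
        file_data = f.read()
    body = (
        ('--%s\r\n' % boundary).encode()
        + ('Content-Disposition: form-data; name="%s"; filename="%s"\r\n'
           % (field_name, os.path.basename(path))).encode()
        + ('Content-Type: %s\r\n\r\n' % content_type).encode()
        + file_data
        + ('\r\n--%s--\r\n' % boundary).encode()
    )
    request = urllib.request.Request(url, data=body)
    request.add_header('Content-Type',
                       'multipart/form-data; boundary=%s' % boundary)
    with urllib.request.urlopen(request) as response:
        return response.read()

print(upload_file('http://httpbin.org/post', 'file', 'report.xls'))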
You can also use unirest. Sample code:
import unirest

# consume a sync POST request
def consumePOSTRequestSync():
    params = {'test1': 'param1', 'test2': 'param2'}
    # we need to pass a dummy open file because unirest does not
    # provide a way to switch between application/x-www-form-urlencoded
    # and multipart/form-data
    params['dummy'] = open('dummy.txt', 'r')
    url = 'http://httpbin.org/post'
    headers = {"Accept": "application/json"}
    # call the POST service with headers and params
    response = unirest.post(url, headers=headers, params=params)
    print "code:" + str(response.code)
    print "******************"
    print "headers:" + str(response.headers)
    print "******************"
    print "body:" + str(response.body)
    print "******************"
    print "raw_body:" + str(response.raw_body)

# post sync request multipart/form-data
consumePOSTRequestSync()
You can check out this post http://stackandqueue.com/?p=57 for more details
