Send Netscape Cookie with Python

How can I load a Netscape cookie file to authenticate to a website's REST API with Python (requests with a session, pycurl, or something else)?
Similar to curl -b ${home}.cookie
-b, --cookie (HTTP) Pass the data to the HTTP server as a cookie. It is supposedly the data previously received from the server in a "Set-Cookie:" line. The data should be in the format "NAME1=VALUE1; NAME2=VALUE2".
import requests

proxi = {'http': 'http://proxy',
         'https': 'http://proxy'}
url = 'http://192.196.1.98:8080/a/changes/?q=status:new'
r = requests.get(url, proxies=proxi)  # cookies=cookie
print(r.status_code)
print(r.json())
print(r.headers)
print(r.request.headers)
print(r.text)

Related

Python Requests API call not working

I'm having an issue converting a working cURL call to an internal API into a Python requests call.
Here's the working cURL call:
curl -k -H 'Authorization:Token token=12345' 'https://server.domain.com/api?query=query'
I then attempted to convert that call into a working python requests script here:
#!/usr/bin/env python
import requests

url = 'https://server.domain.com/api?query=query'
headers = {'Authorization': 'Token token=12345'}
r = requests.get(url, headers=headers, verify=False)
print(r)
I get an HTTP 401 or 500 error depending on how I change the headers variable around. What I do not understand is how my Python request is any different from the cURL request. They are both being run from the same server, as the same user.
Any help would be appreciated.
Hard to say without knowing your API, but you may have a redirect that curl is honoring and requests is not (or at least isn't sending the headers on after the redirect).
Try using a Session object to ensure all requests (and redirects) carry your header.
#!/usr/bin/env python
import requests

url = 'https://server.domain.com/api?query=query'
headers = {'Authorization': 'Token token=12345'}
# start a session
s = requests.Session()
# add headers to the session
s.headers.update(headers)
# use the session to perform a GET request
r = s.get(url)
print(r)
I figured it out; it turns out I had to specify the "Accept" header value. The working script looks like this:
#!/usr/bin/env python
import requests

url = 'https://server.domain.com/api?query=query'
headers = {'Accept': 'application/app.app.v2+json', 'Authorization': 'Token token=12345'}
r = requests.get(url, headers=headers, verify=False)
print(r.json())

how to disable SSL authentication in python 3

I am new to Python. I have a script that tries to POST something to a site. How do I disable SSL certificate verification in the script?
In Python 2, with requests, you can use
requests.get('https://kennethreitz.com', verify=False)
but I don't know how to do it in Python 3.
import urllib.parse
import urllib.request

url = 'https://something.com'
headers = {'APILOGIN': "user",
           'APITOKEN': "passwd"}
values = {"dba": "Test API Merchant", "web": "", "mids.mid": "ACH"}
data = urllib.parse.urlencode(values)
data = data.encode('utf-8')  # data should be bytes
req = urllib.request.Request(url, data, headers)
with urllib.request.urlopen(req) as response:
    the_page = response.read()
See "Verifying HTTPS certificates" in the urllib.request documentation. In older Python 3 releases, not specifying either cafile or capath in your call to urlopen meant HTTPS connections were not verified; since Python 3.4.3, urlopen verifies certificates by default, so you have to opt out explicitly.
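A sketch of opting out with the standard library, reusing the placeholder URL from the question: pass an SSL context that has verification disabled.

```python
import ssl
import urllib.request

# Since Python 3.4.3 urlopen verifies HTTPS certificates by default.
# To skip verification (e.g. for a self-signed test server), pass an
# SSL context with verification disabled.
ctx = ssl._create_unverified_context()

req = urllib.request.Request('https://something.com')  # placeholder URL
with urllib.request.urlopen(req, context=ctx) as response:
    the_page = response.read()
```

Note that requests works unchanged in Python 3, so requests.get(url, verify=False) is still an option there too.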

how to make post request in python

Here is the curl command:
curl -H "X-API-TOKEN: <API-TOKEN>" 'http://foo.com/foo/bar' --data #
let me explain what goes into data
POST /foo/bar
Input (request JSON body)
Name Type
title string
body string
So, based on this, I figured:
curl -H "X-API-TOKEN: " 'http://foo.com/foo/bar' --data '{"title":"foobar","body": "This body has both "double" and 'single' quotes"}'
Unfortunately, I am not able to figure that out either (like curl from the CLI), though I would like to use Python to send this request.
How do I do this?
With the standard Python httplib and urllib libraries (Python 2) you can do:
import httplib, urllib

headers = {'X-API-TOKEN': 'your_token_here',
           'Content-Type': 'application/x-www-form-urlencoded'}
payload = urllib.urlencode({'title': 'value1', 'body': 'value2'})
conn = httplib.HTTPConnection("foo.com")
conn.request("POST", "/foo/bar", payload, headers)
response = conn.getresponse()
print response.status, response.reason
Or, if you want to use the nice HTTP library called Requests:
import requests

headers = {'X-API-TOKEN': 'your_token_here'}
payload = {'title': 'value1', 'body': 'value2'}
r = requests.post("http://foo.com/foo/bar", data=payload, headers=headers)
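Since the API expects a JSON body, and the mixed quotes are exactly what made the curl attempt awkward, the json= parameter of requests (available since 2.4.2) is the simplest route: it serializes the dict, escapes the embedded quotes, and sets the Content-Type header. A sketch using the placeholder endpoint and token from the question:

```python
import requests

# json= serializes the dict to a JSON body and sets the Content-Type
# header, sidestepping the shell-quoting problem from the curl attempt.
headers = {'X-API-TOKEN': 'your_token_here'}
payload = {'title': 'foobar',
           'body': 'This body has both "double" and \'single\' quotes'}
r = requests.post('http://foo.com/foo/bar', json=payload, headers=headers)
print(r.status_code)
```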

How to execute this request using python

I wish to execute this RESTful request using Python:
curl -XPOST -u 'userid:password' -H 'Content-Type: application/json' -H 'X-Forwarded-For: 100.100.0.144' -k 'http://myurl' -d 'jsonObject'
What I have till now
import urllib2

def make_payment_request():
    manager = urllib2.HTTPPasswordMgrWithDefaultRealm()
    manager.add_password(None, 'https://url', 'userid', 'passwrd')
    handler = urllib2.HTTPBasicAuthHandler(manager)
    director = urllib2.OpenerDirector()
    director.add_handler(handler)
    req = urllib2.Request('https://url', headers={'Accept': 'application/json'})
    result = director.open(req)
    # To get say the content-length header
    print "this is result", result
I am getting the response as None. Also, how do I add the JSON object to the request?
Use requests
import requests
import json

url = 'http://myurl'
headers = {'X-Forwarded-For': '100.100.0.144'}
user = 'user'
password = 'sekret'
data = {'foo': 'bar'}
r = requests.post(url, data=json.dumps(data),
                  auth=(user, password), headers=headers)
I am getting 'requests.exceptions.SSLError: [Errno 1] _ssl.c:504:
error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate
verify failed
Just like a web browser, requests will try to validate SSL certificates. In a browser, when the certificate doesn't validate you get a warning and can then continue to the site if you choose.
For requests, you can achieve the same by passing verify=False to ignore SSL verification errors:
r = requests.post(url, verify=False, data=json.dumps(data),
                  auth=(user, password), headers=headers)

How do I set cookies using Python urlopen?

I am trying to fetch an HTML page using Python's urlopen.
I am getting this error:
HTTPError: HTTP Error 302: The HTTP server returned a redirect error that would lead to an infinite loop
The code:
from urllib2 import Request, urlopen

request = Request(url)
response = urlopen(request)
I understand that the server redirects to another URL and that it is looking for a cookie.
How do I set the cookie it is looking for so I can read the html?
Here's an example from Python documentation, adjusted to your code:
import cookielib, urllib2
cj = cookielib.CookieJar()
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
request = urllib2.Request(url)
response = opener.open(request)
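For Python 3, the same recipe works with the renamed modules (a sketch with a placeholder URL): cookielib became http.cookiejar and urllib2 became urllib.request.

```python
# Python 3 spelling of the same recipe: the HTTPCookieProcessor stores
# the cookie the server sets on the redirect, so the follow-up request
# carries it and the redirect loop breaks.
import http.cookiejar
import urllib.request

cj = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cj))
request = urllib.request.Request('http://example.com')  # placeholder URL
response = opener.open(request)
print(response.status)
```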
