In my application, I have an API at localhost:8000/api/v0.1/save_with_post.
I've also written a Python script that sends a POST request to that API.
### My script
import requests

url = 'http://localhost:8000/api/v0.1/save_with_post'
myobj = {'key': 'value'}
x = requests.post(url, data=myobj)
Is it possible to view the headers and body of the request in Chrome rather than debugging my application code?
You want Postman.
With Postman you can either generate a request to your service from Postman itself, or set up Postman as a proxy so you can see the requests that your API client is generating and the responses from the server.
If you want to view the response headers from the POST request, have you tried:
>>> x.headers
Or you could just add headers yourself to your POST request as so:
h = {"Content-Type": "application/xml", ("etc")}
x = requests.post(url, data = myobj, headers = h)
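As an aside, if the goal is just to see exactly what was sent, the Response object keeps a reference to the prepared request, so you can print the outgoing headers and body directly without any external tool (a small sketch reusing the names above):

x = requests.post(url, data=myobj, headers=h)
print(x.request.headers)  # the headers requests actually sent
print(x.request.body)     # the encoded request body, e.g. 'key=value'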
Well, I don't know if there is a way to view the request in Chrome DevTools directly (I don't think there is); however, I know there are two alternatives for seeing the request body and the response:
1 - Use Selenium with a Chrome webdriver
This lets you run Chrome automated from Python. You can then open a test page and run JavaScript in it to issue your POST request (a minimal sketch is shown right after the links below).
See these for more info on how to do this:
1. https://selenium-python.readthedocs.io/getting-started.html
2. Getting the return value of Javascript code in Selenium
You will need the selenium-requests library to use the requests library together with Selenium:
https://pypi.org/project/selenium-requests/
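For example, a minimal sketch with selenium-requests (untested; it assumes Chrome and chromedriver are installed, and reuses the URL and payload from the question):

from seleniumrequests import Chrome

driver = Chrome()
response = driver.request('POST', 'http://localhost:8000/api/v0.1/save_with_post',
                          data={'key': 'value'})
print(response.status_code)
print(response.request.headers)  # the headers that were actually sent with the request
driver.quit()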
2 - Use Wireshark
This program lets you see all the traffic going through your network card, so you can monitor every request going back and forth. However, because Wireshark captures everything your network card sends or receives, it may be hard to pick out the specific request you want.
I am trying to write an Azure Function to manage SSO between two services. The first service hosts the link to the HTTP-triggered Azure Function, which should respond with the formatted SAML Response that then gets sent to the consumption URL as a POST, but I can only make GET requests with the azure.functions.HttpResponse method that Azure Functions needs for its output (unless I'm wrong).
Alternatively, I've tried setting the cookie I get back after sending the SAML Response with the Python requests library, but the consumption URL doesn't seem to care that the cookie is there and just brings me back to the login page.
The SP in this situation is Absorb LMS, and I can confirm that the SAML Response is formatted correctly, because submitting it from an HTML form works fine (I've also tried returning that form as the body of the azure.functions.HttpResponse, but I just get HTTP errors that I can't make heads or tails of).
import requests
import azure.functions as func
headers = {
    'Content-Type': 'application/x-www-form-urlencoded'
}
body = {"SAMLResponse": *b64 encoded saml response and assertion*}
response = requests.post(url=*acs url*, headers=headers, data=body)
headers = response.headers
headers['Location'] = *acs url*
return func.HttpResponse(headers=headers, status_code=302)
I'm relatively new to Python so excuse any errors or misconceptions I may have. I've done hours and hours of research and have hit a stopping point.
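For what it's worth, the form-based variant mentioned above (returning an auto-submitting HTML form so the browser itself POSTs the SAMLResponse to the ACS URL) might look roughly like the sketch below. build_saml_post_response is just an illustrative helper name, and acs_url and saml_response_b64 stand in for the values elided above:

import azure.functions as func

def build_saml_post_response(acs_url: str, saml_response_b64: str) -> func.HttpResponse:
    # the browser submits this form to the ACS URL and keeps whatever session
    # cookie the SP sets, which a server-side requests.post cannot do for it
    html = f"""
    <html>
      <body onload="document.forms[0].submit()">
        <form method="POST" action="{acs_url}">
          <input type="hidden" name="SAMLResponse" value="{saml_response_b64}" />
        </form>
      </body>
    </html>"""
    return func.HttpResponse(html, mimetype="text/html", status_code=200)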
I'm using the Requests library to pull data from a website that requires a login. I was initially successful logging in through a session.post(payload)/session.get and had a [200] response. Once I tried to view the JSON data that was beyond the login, I hit a [403] response. Long story short, I can make it work by logging in through a browser, inspecting the web elements to find the current session cookie, and then defining the headers in Requests to pass along that exact cookie with session.get.
My question is: is it possible to set/generate/find this cookie through Python after logging in? After logging in and out a few times, I can see that some components of the cookie stay the same while others do not. The website I'm using is Garmin Connect.
Any and all help is appreciated.
If your issue is about logging in, you can use a session object. It stores the cookies it receives and sends them on subsequent requests, so it generally handles the cookies for you. Here is an example:
s = requests.Session()
# all cookies received will be stored in the session object
s.post('http://www...', data=payload)
s.get('http://www...')
Furthermore, with the requests library, you can get a cookie from a response, like this:
url = 'http://example.com/some/cookie/setting/url'
r = requests.get(url)
r.cookies
But you can also send cookies back to the server on subsequent requests, like this:
url = 'http://httpbin.org/cookies'
cookies = dict(cookies_are='working')
r = requests.get(url, cookies=cookies)
I hope this helps!
Reference: How to use cookies in Python Requests
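Applied to your flow, a rough sketch would look like this (the Garmin URLs are placeholders -- Garmin Connect's real sign-in involves an SSO exchange, so the exact endpoints will differ):

import requests

s = requests.Session()
# payload is the login form data from your existing code;
# the Set-Cookie headers from the login response are stored on the session
s.post('https://connect.garmin.com/signin', data=payload)      # placeholder login URL
# subsequent requests on the same session reuse those cookies automatically
r = s.get('https://connect.garmin.com/some/json/endpoint')     # placeholder data URL
print(s.cookies.get_dict())  # inspect what the session is currently holding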
Basically, I am trying to look at the HTTP redirect data when getting a link with the Selenium webdriver.
With python requests I would do it like this:
r = requests.get(link, allow_redirects=False)
match = re.search(r'some regex', r.headers['Location'])
But now the site is behind Cloudflare protection, so simple HTTP requests do not work anymore.
Any idea how I could look into the redirect headers with selenium on python?
Another option might be to inject the selenium cookie into the request, but that does not seem as robust.
More details on the redirects:
- I send a GET request to URL_A
--> I receive a redirect response to URL_B (this is the one I want)
- URL_B is another redirect response to URL_C (I do not want that)
Basically I end up on URL_C, but I want to know URL_B, so I have to look into the redirect response headers somehow with Selenium.
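One possibility (a rough, untested sketch) is to enable Chrome's performance log through Selenium and read the redirect response that Chrome recorded for URL_A; the log entries are Chrome DevTools Protocol JSON, so the exact fields may need adjusting:

import json
from selenium import webdriver

options = webdriver.ChromeOptions()
options.set_capability('goog:loggingPrefs', {'performance': 'ALL'})
driver = webdriver.Chrome(options=options)
driver.get(link)  # link is URL_A from your existing code

for entry in driver.get_log('performance'):
    message = json.loads(entry['message'])['message']
    # Chrome reports the response that caused a redirect inside requestWillBeSent
    redirect = message.get('params', {}).get('redirectResponse')
    if message.get('method') == 'Network.requestWillBeSent' and redirect:
        location = redirect['headers'].get('location') or redirect['headers'].get('Location')
        print(redirect['url'], '->', location)
driver.quit()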
I'm trying to get data from
https://www.biman-airlines.com/bookings/flight_selection.aspx
For example, when I choose a flight from Dhaka (DAC) to Sylhet (ZYL), it goes to
https://www.biman-airlines.com/bookings/flight_selection.aspx?TT=RT&SS=&RT=&FL=on&DC=DAC&AC=ZYL&AM=2018-01&AD=09&DC=&AC=&AM=&AD=&DC=&AC=&AM=&AD=&DC=&AC=&AM=&AD=&RM=2018-01&RD=10&PA=1&PT=&PC=&PI=&CC=&NS=&CD=&FS=B4B9631
and shows the flight information
but when I try to perform the same GET request using Python, it shows no info.
here is my code:
import requests
print(requests.get('https://www.biman-airlines.com/bookings/flight_selection.aspx?TT=RT&SS=&RT=&FL=on&DC=DAC&AC=ZYL&AM=2018-01&AD=09&DC=&AC=&AM=&AD=&DC=&AC=&AM=&AD=&DC=&AC=&AM=&AD=&RM=2018-01&RD=10&PA=1&PT=&PC=&PI=&CC=&NS=&CD=&FS=').text)
What am I doing wrong?
Thanks in advance for any help.
> but when I try to perform the same GET request using Python, it shows no info. What am I doing wrong?
The request returns no info because there is no cookie data in the Python HTTP request.
If you check the HTTP request in the browser's debug window, you can see there is a cookie sent along with the request -- the cookie identifies who the client is and tells the server "Hi, server, I'm a valid user".
It's a reasonable guess that, in this biman-airlines.com case, the server checks the cookie and returns results only if the cookie is valid.
Thus, you need to add your own Cookie header in the Python code:
# The cookie below is just for example, you would get your own cookie once visiting the website.
headers = {
'Cookie': 'chocolateChip=nbixfy44dvziejjdxd2wmzs3; BNI_bg_zapways=0000000000000000000000009301a8c000005000; ASPSESSIONIDSQDCSSDT=PFJPADACFOGBDMONPBHPMFJN'
}
print(requests.get('https://www.biman-airlines.com/bookings/flight_selection.aspx?TT=RT&SS=&RT=&FL=on&DC=DAC&AC=ZYL&AM=2018-01&AD=09&DC=&AC=&AM=&AD=&DC=&AC=&AM=&AD=&DC=&AC=&AM=&AD=&RM=2018-01&RD=10&PA=1&PT=&PC=&PI=&CC=&NS=&CD=&FS=B4B9631', headers=headers).text)
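Alternatively, here is a small sketch of letting a requests.Session collect the cookies itself by loading the booking page first, instead of copying them out of the browser (whether this is enough may depend on how the FS token in the query string is generated):

import requests

s = requests.Session()
# the first visit stores whatever Set-Cookie headers the site returns
s.get('https://www.biman-airlines.com/bookings/flight_selection.aspx')
# the same session then sends those cookies with the search request
result = s.get('https://www.biman-airlines.com/bookings/flight_selection.aspx'
               '?TT=RT&SS=&RT=&FL=on&DC=DAC&AC=ZYL&AM=2018-01&AD=09'
               '&RM=2018-01&RD=10&PA=1')  # query string shortened; use the full one from the question
print(result.text)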
I have a problem with a simple authorization and upload API script.
When authorized, the client receives several cookies, including a PHPSESSID cookie (in the browser).
I use the requests.post method with form data for authorization:
r = requests.post(url, headers=self.headers, data=formData)
self.cookies = requests.utils.dict_from_cookiejar(r.cookies)
Headers are used for a custom User-Agent only.
Authorization is 100% fine (there is a logout link on the page).
Later, I try to upload data using the authorized session cookies:
r = requests.post(url, files=files, data=formData, headers=self.headers, cookies=self.cookies)
But the site rejects the request. If we compare the requests from the script and from Google Chrome (using Wireshark), there is no difference in the request body.
The only difference is that 2 cookies are sent by the requests class, while Google Chrome sends 7.
Update: double-checked, the first request receives 7 cookies. The post method just ignores half of them...
My mistake in the code was that I was assigning the cookies from each subsequent API request to the session cookies dictionary. On each request after login, the stored cookies were 'reset' by the cookies from the latest response; that was the problem. Since the auth cookies are assigned only on the login request, they were lost on the next request.
Now, after each authorized request, I use update() instead of assignment:
self.cookies.update(requests.utils.dict_from_cookiejar(r.cookies))
This solves my issue; the upload works fine!
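In short, the difference between the broken and the fixed version:

# before: every response overwrote the stored cookies, so the auth cookies from login were lost
self.cookies = requests.utils.dict_from_cookiejar(r.cookies)

# after: cookies from each response are merged into the ones collected at login
self.cookies.update(requests.utils.dict_from_cookiejar(r.cookies))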