How to view what Python Requests is doing in the browser

I just started experimenting with the Requests library in Python to interact with different sites. However, sometimes I want to see whether the POST requests I'm sending are actually working. Is there any way to open a browser and see what is actually happening when I send a POST request?
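Not from the original thread, but a common way to check this without a browser: POST to httpbin.org, which echoes back exactly what it received, or switch on wire-level logging in http.client (which requests uses under the hood). A minimal sketch:

    import requests

    # httpbin.org echoes the request back, so the response shows exactly
    # what the server received from your POST
    resp = requests.post("https://httpbin.org/post", data={"user": "alice", "pw": "secret"})
    print(resp.json())  # echoed form fields, headers, origin IP

    # alternatively, print the raw request/response lines on the wire
    # (must be set before the request is made)
    import http.client
    http.client.HTTPConnection.debuglevel = 1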

Python Requests: get all other requests after sending a POST

I am logging in to a website with Python Requests by sending a POST with the required data.
I am trying to capture the other HTTP requests that fire after that initial POST.
Is there a way to do it?
If I log in manually in a browser, I can see (in the network tab) all the other requests that are sent after logging in, following that first POST; I want to grab them all.
I assume that when you log in, a new HTML page is returned to your web browser.
While that page renders, additional files such as images or JavaScript are requested from the server. With Selenium you can automate user interactions with a web browser and log the traffic, as sketched below.
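A minimal sketch of that approach using the selenium-wire plugin (the login URL is a placeholder; the driver.requests attribute is selenium-wire's, not plain Selenium's):

    from seleniumwire import webdriver  # pip install selenium-wire

    driver = webdriver.Chrome()
    driver.get("https://example.com/login")
    # ... fill in and submit the login form here ...

    # every request the browser made (HTML, images, JS, XHR) is recorded
    for request in driver.requests:
        if request.response:
            print(request.method, request.url, request.response.status_code)
    driver.quit()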

Selenium Python: get data from an HTTP request

I am running automation with Selenium and Python on the Opera web driver. When I open the specific page I need, a request is sent to the server; it is authenticated with anti-content, which blocks me from requesting it myself, so the only solution is to grab the JSON returned after the browser sends the request. I checked selenium-wire, but I don't think it fits my needs. Is there another way to do this? Any suggestions?
You can try Titanium Web Proxy. It is a proxy server, can be installed via a NuGet package, and can be used together with Selenium.
// C#, inside a proxy event handler such as BeforeResponse(object sender, SessionEventArgs e)
string body = await e.GetResponseBodyAsString();
References:
https://github.com/justcoding121/Titanium-Web-Proxy/issues/176
https://www.automatetheplanet.com/webdriver-capture-modify-http-traffic/#tab-con-9
Note that some pages are deliberately built so that requests to them cannot be automated.
That detection runs in JavaScript, and there are companies that provide it and shut off access for bots.
So I'm sorry I cannot solve your problem; I tried to do the same as you, and there is no way.

Comparing the requests module vs Selenium in Python

I made a program with Selenium that automates posting comments on some blogs' content. I'm not familiar with Python's requests module (I've been working with it for just a week). What I'm wondering is: my Selenium program is a bit slow at page loading, because it loads everything from ads to images and videos. If I rewrote my program with the requests module, would it use less data and run faster than the Selenium version?
I searched this issue on some forum sites; generally they say the requests module is a bit faster, but not always. I also couldn't find any information comparing the data usage of these modules.
Please don't just give me a thumbs down; I need this answer with details.
Selenium is used for web automation: clicking web elements and sending keys to input boxes.
To speed up Selenium, use headless mode, so that visual components such as ads are not loaded and the work runs faster; see Selenium's documentation to learn more about headless mode. A sketch follows.
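A minimal headless-mode sketch, assuming Chrome and a current Selenium release (the exact flag name varies across Chrome versions):

    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    options = Options()
    options.add_argument("--headless=new")  # run Chrome without a visible window
    driver = webdriver.Chrome(options=options)
    driver.get("https://example.com")
    print(driver.title)
    driver.quit()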
requests, by contrast, is used for HTTP methods like GET and POST; learn more from the requests documentation.
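For comparison, the same kind of work in requests is a couple of lines (httpbin.org is used here purely as an echo service):

    import requests

    r = requests.get("https://httpbin.org/get")  # GET: fetch a resource
    print(r.status_code, r.headers["Content-Type"])

    r = requests.post("https://httpbin.org/post", json={"key": "value"})  # POST: send data
    print(r.json())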
If the blogging site has a public API, then you can use the requests module.
If you are new to APIs, I recommend watching this YouTube video:
https://youtu.be/GZvSYJDk-us
For example, to create issues on GitHub you can use the GitHub API, as sketched below.
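A sketch against the real GitHub REST endpoint; the repository and token are placeholders:

    import requests

    token = "ghp_yourtokenhere"  # a personal access token with repo scope
    url = "https://api.github.com/repos/octocat/hello-world/issues"
    payload = {"title": "Found a bug", "body": "Details here."}
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.github+json",
    }
    resp = requests.post(url, json=payload, headers=headers)
    print(resp.status_code, resp.json().get("html_url"))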
But to comment on a blogging site that has no public API, you need to use Selenium.
requests sends and receives data directly from the server hosting a particular service, so it is fast.
Selenium, on the other hand, works through the web browser.
When you use requests, you can perform an action directly, without having to simulate a series of clicks or keystrokes.
Selenium allows you to control a browser and execute actions on a webpage; the requests library is for making HTTP requests.
So, if you know how to write your program for posting comments using just the HTTP API, I'd go with requests; Selenium would be an overhead in this case.
If you are proficient with HTTP requests and verbs (you know how to make a POST request to a server with the requests library), then choose requests; if you want to test your script against the rendered page, use Selenium or BeautifulSoup. A sketch of the requests approach follows.
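Every URL and form field in this sketch is hypothetical; you would discover the real ones by watching the browser's network tab while commenting manually:

    import requests

    session = requests.Session()  # keeps the login cookie for later requests
    session.post("https://blog.example.com/login",
                 data={"user": "me", "pass": "secret"})
    resp = session.post("https://blog.example.com/posts/123/comments",
                        data={"comment": "Nice article!"})
    print(resp.status_code)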

Simulate an active session on a website with Python

I'm looking for a way to simulate an active session on a website using Python. What I mean by that is I want to create software which makes the website think that an actual user with an actual browser has the site open. I've found urllib and its request.urlopen method, but it seems that this only reads the content at the URL and closes the connection. Thanks for any suggestions.
You can try simulating the browser's requests to get the cookies necessary for authentication; Chrome Dev Tools and the requests Python library will do the job.
Some websites handle sessions another way, but I believe the majority use cookies set through POST requests.
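A minimal sketch of that idea with requests.Session (the login URL and form fields are placeholders; copy the real ones from the POST you see in Dev Tools):

    import time

    import requests

    session = requests.Session()  # persists cookies across requests
    session.post("https://example.com/login",
                 data={"username": "me", "password": "secret"})

    # hit a lightweight page periodically so the server keeps the session alive
    while True:
        r = session.get("https://example.com/account")
        print("session alive:", r.status_code)
        time.sleep(300)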

Python requests and my browser aren't giving me the same result

I was trying to extract data from a website with the Python requests library.
When I sent the request from my browser, I got the data.
When I sent the request from the requests library, I got a captcha.
So I guess the website doesn't want me to extract its data, and that's fine, but it made me curious:
how can I get different results from the same request? The headers were the same, and the URLs were the same too.
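One answer the thread leaves open: even when the visible headers match, sites can still tell clients apart by cookies set by earlier JavaScript, by the TLS fingerprint, or by header ordering. A first thing to try is mirroring the browser's full header set; the URL and header values below are placeholders copied from a typical Chrome session:

    import requests

    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                      "AppleWebKit/537.36 (KHTML, like Gecko) "
                      "Chrome/120.0.0.0 Safari/537.36",
        "Accept-Language": "en-US,en;q=0.9",
    }
    resp = requests.get("https://example.com/data", headers=headers)
    print(resp.status_code)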
