Is it possible to use selenium and requests at the same time? - python

I am thinking of creating a web automation script in Python. Basically it will open a browser using the Selenium WebDriver, click a few buttons, then fill in and submit a form using the requests post method, and then continue using Selenium again. So, in short, I am asking whether we are able to use both Selenium and Python requests interchangeably?

Of course you can! I use both libraries interchangeably in the same code file, and it is very helpful.
For example, first I use the requests library to fetch the webpage, then I use Selenium whenever I have to change a specific parameter in the page (like selecting a radio button, inserting form credentials, etc.), and then, depending on the complexity of the source code, I either use BeautifulSoup or continue using Selenium.
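A common pattern when mixing the two is to log in with Selenium and then copy the browser's cookies into a requests.Session, so later requests calls stay authenticated. A minimal sketch, where the URL and element IDs are placeholders and not from the question:
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/login")  # placeholder URL
driver.find_element(By.ID, "username").send_keys("login")  # placeholder element IDs
driver.find_element(By.ID, "password").send_keys("pwd")
driver.find_element(By.ID, "submit").click()

# Copy the browser's cookies into a requests session so it stays logged in
session = requests.Session()
for cookie in driver.get_cookies():
    session.cookies.set(cookie["name"], cookie["value"])

# Fill in and submit a form with a plain HTTP POST (placeholder URL and fields)
response = session.post("https://example.com/form", data={"field": "value"})
print(response.status_code)

# Hand control back to Selenium afterwards
driver.get("https://example.com/next-page")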

Related

Is there a way to make python open a webpage?

I'm new to Python and programming and wanted to make a simple program that opens a webpage after its execution. How can this be done?
Yes, the most common way to do this is with selenium and a webdriver manager. If you don't need to open the whole webpage and just need the HTML, use beautifulsoup4 and requests.
It depends on what you mean by making Python open a webpage.
You can either call your default browser to open the URL with something like the following:
firefox.exe <url>
Or you can create an application using Qt to show the webpage in "plain" Python: https://pythonspot.com/pyqt5-webkit-browser/
If you need to interact with the page through your program, see the links in the answer mentioning selenium.
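If all you need is the first option (launching the user's default browser), the standard-library webbrowser module does this without hard-coding a browser executable; a minimal sketch:
import webbrowser

# Open the URL in the user's default browser; returns True if a browser could be launched
webbrowser.open("https://www.example.com")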

Creating a script that takes live data from a website (for now) and displays it

This isn't really a specific question, I'm sorry for that. I'm trying to create a script that would take real-time data from another site (from a table tag, to be exact), make it an array, and display it somewhere. I've created a simple Python script:
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
import requests
import time

# Start Chrome and log in to the site
driver = webdriver.Chrome('C:/drivers/chromedriver.exe')
driver.set_page_load_timeout(10)  # timeout in seconds, as a number rather than a string
driver.get("link to the site")
driver.find_element_by_id("username-real").send_keys("login")
driver.find_element_by_id("pass-real").send_keys("pwd")
# XPath attribute selectors use @class, not #class
driver.find_element_by_xpath('//input[@class="button-login"]').submit()

# here potentially a for loop that would refresh every second:
for elem in driver.find_elements_by_xpath('//*[@class="table-body"]'):
    pass  # do something with each matched element
As you can see it's pretty simple: it opens the Chrome webdriver, logs in to the website, and does something with the table. I didn't try to properly get the data yet because I don't like this method.
I was wondering if there's another way to do it, without running the webdriver, like some console-only application? I'm pretty lost as to what I should look into in order to create a script like that. Another programming language? Some kind of framework or method?
If you want to use Selenium, you have to use a WebDriver. See it as a "connection" between your program and Google Chrome. If you can use Safari, you can use Selenium without any WebDrivers that have to be installed manually.
If you want to use other tools, I can recommend BeautifulSoup. It's basically an HTML parser which looks into the HTML code of the webpage. With BS you don't have to install any drivers, and you can use it from Python.
Another method I'm thinking of is downloading the HTML text of the webpage and searching through the file locally, but I wouldn't recommend it.
For webpages, Selenium is really the way to go; I often use it for my own projects. A requests + BeautifulSoup version of the table scrape is sketched below.
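A minimal sketch of that requests + BeautifulSoup approach, assuming the table can be fetched without a browser; the URL, CSS selector, and any login handling are placeholders that depend on the actual site:
import requests
from bs4 import BeautifulSoup

# Fetch the page over plain HTTP (placeholder URL; add authentication if the site requires it)
response = requests.get("https://example.com/table-page")
soup = BeautifulSoup(response.text, "html.parser")

# Collect every table row into a list of lists
rows = []
for tr in soup.select("table.table-body tr"):  # placeholder selector
    rows.append([td.get_text(strip=True) for td in tr.find_all("td")])

print(rows)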

Programmatically access and modify a .aspx page

I am working on a project which needs to programmatically access and update a .aspx (ASP.NET) page. Specifically, I need to automatically access this page, interact with several HTML and JavaScript elements (click checkboxes, enter text in form fields, "click" buttons), and reload the page. Also, while the page is accessed, there is information being sent back and forth between the client and the server.
What is the most efficient way to go about this? I am most likely going to write something in bash + Python, but I am not sure that is the best tool for the job.
Thanks
The optimal solution for your problem is to use Selenium with Python.
The selenium package is used to automate web browser interaction from Python.
pip install -U selenium
You can read the documentation to get familiar with the Selenium Webdriver API.
You cannot edit pages that are hosted by others, but you can mimic a user's interactions (and the requests they trigger) using Selenium.
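A minimal sketch of that kind of interaction, assuming hypothetical element IDs on the .aspx page (swap in the real locators from the page's HTML):
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/page.aspx")  # placeholder URL

# Click a checkbox, type into a form field, and press a button (placeholder IDs)
driver.find_element(By.ID, "chkOption").click()
driver.find_element(By.ID, "txtName").send_keys("some text")
driver.find_element(By.ID, "btnSubmit").click()

# ASP.NET postbacks reload the page; locate the elements again afterwards as needed
driver.refresh()
driver.quit()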

Submit form on internet website

I want to build a Python script to submit some forms on internet websites, such as a form to automatically publish an item on a site like eBay.
Is it possible to do this with BeautifulSoup, or is that only for parsing websites?
Is it possible to do it with Selenium, but quickly, without actually opening the browser?
Are there any other ways to do it?
Look at the requests library. Also, check out the Chrome developer tools (Network tab) to see the requests fly by. There is also a utility called Postman, where you can "design" queries and then generate code in many different flavors (including Python's requests library).
BeautifulSoup is for parsing HTML.
You can use Selenium with PhantomJS to do this without the browser opening. You have to use the Keys portion of Selenium to send data to the form to be submitted. It is also worth noting that this method will not work if there are CAPTCHAs on the form.
The mechanize library can fill and submit forms.
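Illustrating the requests suggestion above: once you know what the form actually sends (from the developer tools or Postman), you can post the same fields directly. A minimal sketch with a placeholder URL and field names:
import requests

# Field names and URL are placeholders; copy the real ones from the form's network request
form_data = {
    "title": "My item",
    "price": "10.00",
}
response = requests.post("https://example.com/submit-form", data=form_data)
print(response.status_code)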

Python and webbrowser form fill

Hello, how can I make changes in my web browser with Python, like filling in forms and pressing Submit?
What libraries should I use? And maybe some of you have examples?
Using urllib does not make any changes in the opened browser for me.
urllib is not intended to do anything with your browser, but rather to get contents from URLs.
To fill in forms and that kind of thing, have a look at mechanize; to scrape webpages, consider using pyquery.
Selenium is great for this. It's a browser automation tool that you can use to launch a browser (any major browser or a 'headless' one), navigate to a url, and interact with the page.
It's used primarily for testing web code against multiple browsers, but is also very useful for 'scraping' pages and automating mundane tasks.
Here are the python docs: http://selenium-python.readthedocs.org/en/latest/index.html
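A minimal Selenium sketch of filling in a form and pressing Submit, where the URL and element names are placeholders to be taken from the real page's HTML:
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/contact")  # placeholder URL

# Fill in the form fields and press the submit button (placeholder element names)
driver.find_element(By.NAME, "email").send_keys("user@example.com")
driver.find_element(By.NAME, "message").send_keys("Hello from Python!")
driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

driver.quit()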
