Python Selenium can't click on button in a pop-up

I am playing around with Selenium on https://www.autozone.com/, trying to fill out the "Add Vehicle" form using Selenium and Python.
First I click on the "Add Vehicle" button:
URL = "https://www.autozone.com/"
ADD_VEHICLE_XPATH = "/html/body/div[1]/div/div[2]/div[2]/header/div[2]/div/div/div[2]/div/button"
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
driver.implicitly_wait(WAIT_TIME)
driver.get(URL)
add_vehicle_button = driver.find_element(By.XPATH, ADD_VEHICLE_XPATH)
add_vehicle_button.click()
Then a pop-up window appears, and I try to locate the button for the year dropdown:
YEAR_BUTTON_XPATH = "/html/body/div[4]/div[3]/div/div[2]/div/div/div[1]/div/div[2]/div[1]/div/div/div[1]/div/button"
year_button = driver.find_element(By.XPATH, YEAR_BUTTON_XPATH)
This throws a NoSuchElementException.
According to this script:
def is_in_iframe():
    driver.execute_script("""
        function iniFrame() {
            if (window.location !== window.parent.location) {
                // The page is in an iFrame
                document.write("The page is in an iFrame");
            } else {
                // The page is not in an iFrame
                document.write("The page is not in an iFrame");
            }
        }
        // Calling the iniFrame function
        iniFrame();""")
I am not in an iframe after clicking on add vehicle
I have also checked the names of all windows before and after clicking add vehicle, there is only ever 1 window and it is the same before and after clicking.
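For reference, the same checks can be done directly from Python without document.write (a minimal sketch using only standard Selenium calls; it assumes driver is already on the page after the click):
# Number of top-level browser windows/tabs
print(len(driver.window_handles))
# Number of iframes in the current document; if any exist you may need driver.switch_to.frame(...)
print(len(driver.find_elements(By.TAG_NAME, "iframe")))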
Some things to note:
I have tried both Python's time.sleep() and Selenium waits
The div that contains the pop-up and all its buttons does not show up until I click Add Vehicle, and after that it shows up near the bottom
The xpath I'm using is unique to that button
What are some other ways I can try to locate the button? Please let me know if I need to add any more code or description.

You need to improve your locators.
You also need to wait for elements to become clickable before clicking them.
The following code opens the "Add Vehicle" dialog and selects the year 2020:
from selenium import webdriver
from selenium.webdriver import DesiredCapabilities
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC

options = Options()
options.add_argument("start-maximized")

# Don't wait for the full page load, only until the DOM is ready
caps = DesiredCapabilities().CHROME
caps["pageLoadStrategy"] = "eager"

webdriver_service = Service(r'C:\webdrivers\chromedriver.exe')
driver = webdriver.Chrome(options=options, desired_capabilities=caps, service=webdriver_service)
wait = WebDriverWait(driver, 10)

url = "https://www.autozone.com/"
driver.get(url)

wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "button[data-testid='deskTopVehicle-menu-lg']"))).click()
wait.until(EC.element_to_be_clickable((By.ID, "yearheader"))).click()
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "button[data-testid='yearheader-dropdown-list-item-4']"))).click()
That site contains several JavaScript scripts that make the page load take a long time, so I added a special setting so that the driver does not wait for the full load. That is what
caps = DesiredCapabilities().CHROME
caps["pageLoadStrategy"] = "eager"
is for.
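As a side note, the same eager strategy can also be set through the Chrome Options object, which avoids DesiredCapabilities entirely (a small sketch, assuming a Selenium 4 installation where Options exposes page_load_strategy):
options = Options()
options.add_argument("start-maximized")
options.page_load_strategy = "eager"  # don't wait for the full page load
driver = webdriver.Chrome(options=options, service=webdriver_service)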

Related

Unable to click to the next page while crawling with Selenium

I'm trying to scrape the list of services from this site but am not able to click to the next page.
This is what I've tried so far using Selenium & bs4:
#attempt1
next_pg_btn = browser.find_elements(By.CLASS_NAME, 'ui-lib-pagination_item_nav')
next_pg_btn.click() # nothing happens
#attemp2
browser.find_element(By.XPATH, "//div[@role = 'button']").click() # nothing happens
#attempt3 - saw in some stackoverflow post that sometimes we need to scroll to the
#bottom of page to have the button clickable, so tried that
browser.execute_script("window.scrollTo(0,2500)")
browser.find_element(By.XPATH, "//div[@role = 'button']").click() # nothing happens
I'm not so experienced with scraping, please advise how to handle this and where I'm going wrong.
Thanks
Several issues with your code:
You used the wrong locators.
You usually need to wait for the element to be loaded before clicking it. But if you perform other actions on the page before clicking the pagination, this is not needed, since by the time you scrape the page content the web elements have already been loaded.
The pagination button is at the bottom of the page, so you need to scroll the page to bring the pagination button into the visible viewport.
After scrolling, a short delay should be added, as you can see in the code below.
Now the pagination element can be clicked.
The following code works:
import time
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC

options = Options()
options.add_argument("start-maximized")

webdriver_service = Service(r'C:\webdrivers\chromedriver.exe')
driver = webdriver.Chrome(options=options, service=webdriver_service)
wait = WebDriverWait(driver, 10)

url = "https://www.tamm.abudhabi/en/life-events/individual/HousingProperties"
driver.get(url)

pagination = wait.until(EC.presence_of_element_located((By.CLASS_NAME, "ui-lib-pagination__item_nav")))
# Accessing this property scrolls the element into view as a side effect
pagination.location_once_scrolled_into_view
time.sleep(0.5)
pagination.click()
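If you need to keep paging through the results, the same scroll-and-click can be repeated in a loop. A rough sketch (the stopping condition is hypothetical; check what the site actually does on the last page, e.g. whether the nav button disappears or becomes disabled):
while True:
    nav_buttons = driver.find_elements(By.CLASS_NAME, "ui-lib-pagination__item_nav")
    if not nav_buttons:
        break  # assumed: no nav button left means we are on the last page
    next_btn = nav_buttons[-1]  # assumed: the last nav element is the "next" arrow
    next_btn.location_once_scrolled_into_view
    time.sleep(0.5)
    next_btn.click()
    time.sleep(1)  # crude wait for the next page of results to render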

selenium python element select from dropdown menu

Trying to select multiple elements from a dropdown menu via Selenium in Python.
The website is the one in the URL below, but a TimeoutException error is occurring.
I have tried the Inspect tool in Google Chrome. //label[@for="inputGenre"]/parent::div//select[@placeholder="Choose a Category"] gives exactly the select tag that I need. But unfortunately, with Selenium I cannot locate any element within this tag. Any ideas why the error occurs?
The code is below:
slect_element = Select(WebDriverWait(driver, 10).until(EC.element_located_to_be_selected((By.XPATH, '//label[@for="inputGenre"]/parent::div//select[@placeholder="Choose a Category"]'))))
slect_element.select_by_index(1)
slect_element.select_by_value('23')
The strange thing is that it is possible to locate it and get its text values with the code below:
drp_menu = driver.find_elements(By.XPATH, '//label[@for="inputGenre"]/parent::div//div[@class="dropdown-main"]/ul/li')
print(len(drp_menu))
ls_categories = []
for i in drp_menu:
    ls_categories.append(i.get_attribute('innerText'))
The print gives 15 elements, and get_attribute('innerText') gives the text of each option element.
Anyway, thanks a lot @Prophet
That select element is hidden, so it can't be driven through Selenium's Select class the way normal select elements are used to pick drop-down menu items.
Here we need to open the drop-down as we open any other element, by clicking it, then select the desired options and click the Search button.
So, these lines open that drop-down and select 2 options in the drop-down menu:
wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@class='form-group'][contains(.,'Category')]//div[@class='dropdown-display-label']"))).click()
wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@class='form-group'][contains(.,'Category')]//li[@data-value='23']"))).click()
wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@class='form-group'][contains(.,'Category')]//li[@data-value='1']"))).click()
The result so far is that both categories are selected in the drop-down menu. Finally, clicking the Search button submits the search.
The entire code is:
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
options = Options()
options.add_argument("start-maximized")
webdriver_service = Service(r'C:\webdrivers\chromedriver.exe')
driver = webdriver.Chrome(service=webdriver_service, options=options)
wait = WebDriverWait(driver, 20)
url = "https://channelcrawler.com/"
driver.get(url)
wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@class='form-group'][contains(.,'Category')]//div[@class='dropdown-display-label']"))).click()
wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@class='form-group'][contains(.,'Category')]//li[@data-value='23']"))).click()
wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@class='form-group'][contains(.,'Category')]//li[@data-value='1']"))).click()
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "button[type='submit']"))).click()

Python - Selenium is complaining about element not being scrolled into view after scrolling to that element

I have the following code which is supposed to scroll down the page and then click a button. When I run my script, I can see that the page does scroll until the element is at the very bottom of the page, but then the script fails when it gets time to click on that button and I get this error:
selenium.common.exceptions.ElementNotInteractableException: Message: Element could not be scrolled into view
I have tried using these methods:
driver.execute_script("arguments[0].scrollIntoView();", element)
driver.execute_script("arguments[0].scrollIntoView();", driver.find_element_by_xpath(xpath of element))
actions.move_to_element(element)
All of these methods have the same result: the page will scroll to the element, but when it is time to click on the element it complains that the element could not be scrolled into view.
CODE:
hide_partial_rows_button_per_100 = driver.find_element_by_xpath('//button[@id="per_poss_toggle_partial_table"]')
#driver.execute_script("arguments[0].scrollIntoView();",driver.find_element_by_xpath('//button[@id="per_poss_toggle_partial_table"]'))
#driver.execute_script("arguments[0].scrollIntoView();",hide_partial_rows_button_per_100)
actions.move_to_element(hide_partial_rows_button_per_100)
actions.perform
hide_partial_rows_button_per_100.click()
LINK TO PAGE I'M WORKING ON: https://www.basketball-reference.com/players/v/valanjo01.html
Someone on this site had a similar question, but they were using JavaScript instead of Python; they added a time.sleep(1) between scrolling to the element and clicking on it, but this did not work for me.
As said above, I've tried both the driver.execute_script("arguments[0].scrollIntoView();",element) and actions.move_to_element(element) methods and both work for scrolling, but when it is time to click on the element it complains that it cannot be scrolled into view.
You can use the location_once_scrolled_into_view property to perform the scrolling here.
The following code worked for me:
from selenium import webdriver
from selenium.webdriver import DesiredCapabilities
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
options = Options()
options.add_argument("start-maximized")
options.add_argument('--disable-notifications')
caps = DesiredCapabilities().CHROME
caps["pageLoadStrategy"] = "eager"
webdriver_service = Service(r'C:\webdrivers\chromedriver.exe')
driver = webdriver.Chrome(options=options, desired_capabilities=caps, service=webdriver_service)
wait = WebDriverWait(driver, 20)
url = "https://www.basketball-reference.com/players/v/valanjo01.html"
driver.get(url)
element = wait.until(EC.presence_of_element_located((By.XPATH, '//button[@id="per_poss_toggle_partial_table"]')))
# Accessing this property scrolls the element into view
element.location_once_scrolled_into_view
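If you also want to click the button after the scroll, one option (a sketch, assuming the button becomes clickable once it is inside the viewport) is a short pause followed by an explicit clickability wait:
import time
time.sleep(0.5)  # let the page settle after the scroll
wait.until(EC.element_to_be_clickable((By.XPATH, '//button[@id="per_poss_toggle_partial_table"]'))).click()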

An element is visible but I'm still not able to click on it like I click on others like it

I click on a specific button on a page, but for some reason there is one button that I can't click on, even though it's positioned exactly like the other elements like it that I can click on.
As you will notice, the code below opens a page and then clicks through to another page; this step is needed because only then are you redirected to the real URL that has the //int.
import datetime
import time
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
with open('my_user_agent.txt') as f:
    my_user_agent = f.read()

headers = {
    'User-Agent': my_user_agent
}

options = Options()
options.set_preference("general.useragent.override", my_user_agent)
options.set_preference("media.volume_scale", "0.0")
options.page_load_strategy = 'eager'
driver = webdriver.Firefox(options=options)

today = datetime.datetime.now().strftime("%Y/%m/%d")
driver.get(f"https://int.soccerway.com/matches/{today}/")

driver.find_element(by=By.XPATH, value="//div[contains(@class,'language-picker-trigger')]").click()
time.sleep(3)
driver.find_element(by=By.XPATH, value="//li/a[contains(@href,'https://int.soccerway.com')]").click()
time.sleep(3)

try:
    WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.XPATH, "//a[contains(@class,'tbl-read-more-btn')]")))
    driver.find_element(by=By.XPATH, value="//a[contains(@class,'tbl-read-more-btn')]").click()
    time.sleep(0.1)
except:
    pass

WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.XPATH, "//div[@data-exponload='']//button[contains(@class,'expand-icon')]")))
for btn in driver.find_elements(by=By.XPATH, value="//div[@data-exponload='']//button[contains(@class,'expand-icon')]"):
    btn.click()
    time.sleep(0.1)
I've tried adding btn.location_once_scrolled_into_view before each click to make sure the button is correctly in the click position, but the problem still persists.
I also tried using the options mentioned here:
Selenium python Error: element could not be scrolled into view
But the error kept occurring, and I couldn't understand what the flaw in my case was.
Error text:
selenium.common.exceptions.ElementNotInteractableException: Message: Element <button class="expand-icon"> could not be scrolled into view
Stacktrace:
RemoteError#chrome://remote/content/shared/RemoteError.jsm:12:1
WebDriverError#chrome://remote/content/shared/webdriver/Errors.jsm:192:5
ElementNotInteractableError#chrome://remote/content/shared/webdriver/Errors.jsm:302:5
webdriverClickElement#chrome://remote/content/marionette/interaction.js:156:11
interaction.clickElement#chrome://remote/content/marionette/interaction.js:125:11
clickElement#chrome://remote/content/marionette/actors/MarionetteCommandsChild.jsm:204:29
receiveMessage#chrome://remote/content/marionette/actors/MarionetteCommandsChild.jsm:92:31
Edit 1:
I noticed that the error only happens when the element is colored orange (orange means that one of that competition's games is happening right now, in other words it is live).
But the button is still the same, it keeps the same element, so I don't know why it's not being clicked.
Edit 2:
If you open the browser normally, or without the settings I put in my code, the orange elements are loaded already expanded, but with the settings I need to use they don't come expanded. So please use the settings from my code so that the page opens the same way.
What you are missing here is wrapping the click command in the loop that opens those sections with a try-except block.
The following code works. I tried running it several times:
import datetime
import time
from selenium import webdriver
from selenium.webdriver import DesiredCapabilities
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
options = Options()
options.add_argument("start-maximized")

caps = DesiredCapabilities().CHROME
caps["pageLoadStrategy"] = "eager"

webdriver_service = Service(r'C:\webdrivers\chromedriver.exe')
driver = webdriver.Chrome(service=webdriver_service, options=options, desired_capabilities=caps)
wait = WebDriverWait(driver, 10)

today = datetime.datetime.now().strftime("%Y/%m/%d")
driver.get(f"https://int.soccerway.com/matches/{today}/")

wait.until(EC.element_to_be_clickable((By.XPATH, "//div[contains(@class,'language-picker-trigger')]"))).click()
time.sleep(5)
wait.until(EC.element_to_be_clickable((By.XPATH, "//li/a[contains(@href,'https://int.soccerway.com')]"))).click()
time.sleep(5)

try:
    wait.until(EC.element_to_be_clickable((By.XPATH, "//a[contains(@class,'tbl-read-more-btn')]")))
    driver.find_element(By.XPATH, "//a[contains(@class,'tbl-read-more-btn')]").click()
    time.sleep(0.1)
except:
    pass

wait.until(EC.element_to_be_clickable((By.XPATH, "//div[@data-exponload='']//button[contains(@class,'expand-icon')]")))
# Expand only the sections that are not currently live ("status-playing")
for btn in driver.find_elements(By.XPATH, "//div[@data-exponload='' and not(contains(@class,'status-playing'))]//button[contains(@class,'expand-icon')]"):
    btn.click()
    time.sleep(0.1)
UPD
We need to expand only the closed sections; the already opened sections should stay open. In that case the click will always work without throwing exceptions. To do that we just add that condition to the locator: click only the buttons that are not inside a section whose status is currently playing.
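For completeness, the try-except approach mentioned above would look roughly like this (a sketch; instead of filtering the live sections out via the locator, it simply skips any button Selenium cannot click):
from selenium.common.exceptions import ElementNotInteractableException
for btn in driver.find_elements(By.XPATH, "//div[@data-exponload='']//button[contains(@class,'expand-icon')]"):
    try:
        btn.click()
    except ElementNotInteractableException:
        pass  # already-expanded (live) sections can't be clicked; skip them
    time.sleep(0.1)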

Changing country while scraping Aliexpress using python selenium

I'm working on a scraping project for AliExpress, and I want to change the "ship to" country using Selenium, for example change Spain to Australia, click the Save button, and then scrape the page. I already found an answer and it worked; I just don't know how to save the selection by clicking the Save button using Selenium. Any help is highly appreciated. This is the code I'm using for this task:
country_button = driver.find_element_by_class_name('ship-to')
country_button.click()
country_buttonn = driver.find_element_by_class_name('shipping-text')
country_buttonn.click()
WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.XPATH, "//li[@class='address-select-item ']//span[@class='shipping-text' and text()='Australia']"))).click()
Well, there are 2 pop-ups there that you need to close first in order to access any other elements. Then you can select the desired shipment destination. I used WebDriverWait for all those commands to make the code stable. I also used scrolling to bring the desired destination button into view before clicking on it, and finally clicked the Save button.
The code below works.
Just pay attention that after selecting a new destination the pop-ups can appear again.
import time
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.action_chains import ActionChains
options = Options()
options.add_argument("--start-maximized")
s = Service(r'C:\webdrivers\chromedriver.exe')
driver = webdriver.Chrome(options=options, service=s)
url = 'https://www.aliexpress.com/'
wait = WebDriverWait(driver, 10)
actions = ActionChains(driver)
driver.get(url)
try:
    wait.until(EC.element_to_be_clickable((By.XPATH, "//div[contains(@style,'display: block')]//img[contains(@src,'TB1')]"))).click()
except:
    pass
try:
    wait.until(EC.element_to_be_clickable((By.XPATH, "//img[@class='_24EHh']"))).click()
except:
    pass

wait.until(EC.element_to_be_clickable((By.CLASS_NAME, "ship-to"))).click()
wait.until(EC.element_to_be_clickable((By.CLASS_NAME, "shipping-text"))).click()

ship_to_australia_element = driver.find_element(By.XPATH, "//li[@class='address-select-item ']//span[@class='shipping-text' and text()='Australia']")
actions.move_to_element(ship_to_australia_element).perform()
time.sleep(0.5)
ship_to_australia_element.click()

wait.until(EC.element_to_be_clickable((By.XPATH, "//button[@data-role='save']"))).click()
I mostly used XPath locators here. CSS Selectors could be used as well
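For example, two of the locators above could be rewritten as CSS Selectors (equivalent locators, shown only as an illustration):
# XPath //button[@data-role='save'] as a CSS Selector:
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "button[data-role='save']"))).click()
# XPath //img[@class='_24EHh'] as a CSS Selector:
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "img._24EHh"))).click()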
