Selenium with Python: how to click dojo combobox values

I'm navigating a JS-heavy webpage with Selenium and I need to be able to interact with a dojo component on the page. The page I'm looking at has a dojo dijit form with a combobox that has subject names for my university. I want to expose and iteratively click on every item in the list in order to scrape the course names for that subject when it redirects. The list items are exposed when the dropdown arrow button is clicked.
The URL I'm automating: http://sis.rutgers.edu/soc/#subjects?semester=12020&campus=NB,NK,CM&level=U,G
I'm inspecting the element for the dropdown button and copying the XPath.
dropdownButton = driver.find_element_by_xpath('//*[@id="widget_dijit_form_FilteringSelect_0"]/div[1]/input')
Running this yields:
NoSuchElementException: Message: no such element: Unable to locate element:
{"method":"xpath","selector":"//*[@id="widget_dijit_form_FilteringSelect_0"]/div[1]/input"}
EDIT: I've made some progress; it turns out the element wasn't rendered by the time find_element_by_xpath was called. I added a wait to my program, and now Selenium is able to locate and click the dropdown button.
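For reference, here is roughly what the fix looks like (a minimal sketch, assuming Chrome and the same XPath copied from dev tools above):

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("http://sis.rutgers.edu/soc/#subjects?semester=12020&campus=NB,NK,CM&level=U,G")
# wait until the dropdown arrow is actually rendered and clickable before clicking it
dropdownButton = WebDriverWait(driver, 10).until(
    EC.element_to_be_clickable((By.XPATH, '//*[@id="widget_dijit_form_FilteringSelect_0"]/div[1]/input')))
dropdownButton.click()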

Use WebDriverWait to wait for the required element conditions. The dropdown disappears on any action on the page, so to get an option locator you can do one of the following:
all options are loaded after the first expand, so you can search for an option element by text in Chrome dev tools and build a locator from that
pause the script and inspect the element.
You can also look up best practices for writing locators.
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
wait = WebDriverWait(driver, 10)

with driver:
    driver.get("http://sis.rutgers.edu/soc/#subjects?semester=12020&campus=NB,NK,CM&level=U,G")
    # expand the dropdown once so all options get rendered
    wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "#filteringSelectDiv .dijitArrowButtonInner"))).click()
    # collect all option texts in a single JavaScript call
    options = driver.execute_script(
        'return [...arguments[0]].map(e=>e.textContent)',
        wait.until(EC.presence_of_all_elements_located(
            (By.CSS_SELECTOR, ".dijitComboBoxMenuPopup .dijitMenuItem[item]"))))
    for option in options:
        # type the option into the filtering input and confirm with TAB
        driver.find_element_by_css_selector(".dijitInputInner").clear()
        driver.find_element_by_css_selector(".dijitInputInner").send_keys(option, Keys.TAB)
        wait.until(lambda d: d.execute_script("return document.readyState === 'complete'"))
        # collect data

Related

Cannot locate form-control object to send_keys using python Selenium

I am trying to navigate a scheduling website to eventually auto populate a schedule using the following script:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
# Create a Chrome webdriver
driver = webdriver.Chrome(r'C:\Users\chromedriver_win32\chromedriver.exe')
# Navigate to https://www.qgenda.com/
driver.get('https://www.qgenda.com/')
# Wait for the page to load
driver.implicitly_wait(5) # 5 seconds
# You can now interact with the page using the webdriver
# Locate the sign in button
sign_in_button = driver.find_element(By.XPATH,'/html/body/div[1]/div/header[3]/div/div[3]/div/div/div/div/a')
# Click the sign in button
sign_in_button.click()
# Find the input element
input_email = driver.find_element(By.XPATH,'//*[@id="Input_Email"]')
# Send text
input_email.send_keys('Josh')
However, I cannot seem to find the Input_Email object. I've tried all the XPaths and IDs that make sense and also tried waiting until the object is clickable, with no luck. Would really appreciate some guidance on this.
I was expecting Selenium to find the HTML form box and pass in the text, but instead I get an error:
NoSuchElementException: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="Input_Email"]"}
even though the XPath definitely exists.
The XPath seems fine. I am guessing you need an explicit wait or implicit wait to ensure the page is fully loaded before locating the element.
Another thing I would like to point out: since the login URL is available, locating the sign in button seems redundant. You can access the login page directly via driver.get('https://login.qgenda.com/')
For instance,
from selenium.webdriver.common.by import By
from selenium.webdriver.support.wait import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driver.get('https://login.qgenda.com/')
input_email = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.XPATH, '//*[@id="Input_Email"]'))
)
input_email.send_keys('Josh')
You can read more about explicit waits in the Selenium documentation.

Clicking on button not working in selenium + scrapy

I want to scrape links to news articles using scrapy + selenium. The website I am using uses a 'Load more' button, so I obviously want selenium to click on this button to load all articles.
I have looked for similar questions and tried various options already such as
element = driver.find_element(By.XPATH, value='//*[@id="fusion-app"]/main/div/div/div/div/div[4]/div/div/button')
driver.execute_script("arguments[0].click();", element)
and
element = WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".ais-InfiniteHits-loadMore")))
ActionChains(driver).move_to_element(element).click().perform()
All to no avail. I've also inserted some print statements in between to check whether the code actually runs, and that seems to work fine; I think it's just a matter of the button not being located or clicked.
This is the html of the button btw:
<button class="ais-InfiniteHits-loadMore">Load more </button>
And when I print element, this is what I get: <selenium.webdriver.remote.webelement.WebElement (session="545716eef622a12bdbeddef99e02bdef", element="551741ec-4616-4bd4-b8fd-57c2f4bffb00")>
Is someone able to help me out? Thank you in advance.
options = webdriver.ChromeOptions()
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()), options=options)
driver.maximize_window()
wait = WebDriverWait(driver, 30)
driver.get('https://www.businessoffashion.com/search/?q=Louis+Vuitton&f=Articles%2CFashion+Shows%2CNews')
# dismiss the pop-up first, then click the "Load more" button via JavaScript
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, 'button.ab-close-button'))).click()
elem = wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".ais-InfiniteHits-loadMore")))
driver.execute_script("arguments[0].click()", elem)
You hit two different errors, a pop-up and an element click interception, when you can just use JavaScript to click that element.
Import:
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from webdriver_manager.chrome import ChromeDriverManager
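Since the question is about loading all articles rather than just the next batch, one option is to keep clicking until the button stops appearing. A rough sketch of that loop, assuming the same driver, wait, and selector as above (a TimeoutException here simply means there is nothing left to load):

from selenium.common.exceptions import TimeoutException

# keep clicking "Load more" until it can no longer be found/clicked
while True:
    try:
        elem = wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".ais-InfiniteHits-loadMore")))
    except TimeoutException:
        break  # no more "Load more" button, all articles are loaded
    driver.execute_script("arguments[0].click()", elem)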

Unable to select a menu item with selenium

I am making a screen scraper using the Selenium Python library and I have already made some code so that I can log in. For some reason I am now stuck on the main menu and can't select any of the options. I have tried using CSS Selector, Class Name, and XPATH and none have been able to select any of the possible choices. No matter what happens, I always get a TimeoutException even with a long delay.
The portion of the page I am trying to scrape from is here.
The relevant code is as follows:
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
# Open browser and go to the Webex login page
driver = webdriver.Chrome()
driver.get('https://admin.webex.com')
delay = 10 # seconds
long_delay = 20
# Login portion removed
# Menu selection goes here.
# I have tried the following with no luck
# The following lines produce a TimeoutException error
# Selecting menu item
WebDriverWait(driver, long_delay).until(EC.presence_of_element_located((By.XPATH, "span[@class='left-nav-item__link']"))).click()
WebDriverWait(driver, long_delay).until(EC.presence_of_element_located((By.XPATH, '//mch-left-nav-item-group[4]/ul/mch-left-nav-item[3]/li/span'))).click()
WebDriverWait(driver, long_delay).until(EC.presence_of_element_located((By.XPATH, "//webex-root/webex-main[@class='control-hub-container']//webex-sidebar/mch-left-nav/nav/mch-left-nav-item-group[4]/ul/mch-left-nav-item[3]//span[@class='left-nav-item__link']"))).click()
WebDriverWait(driver, long_delay).until(EC.presence_of_element_located((By.CSS_SELECTOR, "span[class='left-nav-item__link']"))).click()
WebDriverWait(driver, delay).until(EC.presence_of_element_located((By.CSS_SELECTOR, "[aria-label] mch-left-nav-item-group:nth-of-type(4) mch-left-nav-item:nth-of-type(3) .left-nav-item__link"))).click()
# Selecting group of items
WebDriverWait(driver, delay).until(EC.presence_of_all_elements_located((By.CLASS_NAME, 'left-nav-item')))
WebDriverWait(driver, delay).until(EC.presence_of_all_elements_located((By.CSS_SELECTOR, 'li.left-nav-item')))
# Selecting parent
WebDriverWait(driver, long_delay).until(EC.presence_of_element_located((By.CSS_SELECTOR, "li[data-test-name='calling']")))
Does anyone have an idea why I'm not able to select any element?
Try this XPath for the third one:
WebDriverWait(driver, long_delay).until(EC.presence_of_element_located((By.XPATH, "//span[@class='left-nav-item__link']"))).click()
To anyone having this issue:
When a new tab is opened while you are using Selenium, even though you are viewing the new page, Selenium is not. You have to switch to the correct window. Something like the following can help with that.
driver.switch_to.window(driver.window_handles[NUMBER])
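For example, to switch to the most recently opened tab and back again (a small sketch; which index you need depends on how many windows are open):

# switch to the newest window/tab
driver.switch_to.window(driver.window_handles[-1])
# ... interact with the new tab ...
# switch back to the original window
driver.switch_to.window(driver.window_handles[0])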

Cannot click a button because Selenium is unable to locate an element

I have tried to make my script click the Purchase/Buy Family button on the Spotify Checkout Page. No matter what class, CSS, XPath, ID, or whatever I put in, it's just saying it could not find the object.
This is the button. It's not in an iframe:
<div class="sc-fzXfOu cvoJMt">
<button id="checkout_submit" class="Button-oyfj48-0 kaOWUo sc-fzXfOv tSdMK">
Buy Premium Family
</button>
</div>
My code:
time.sleep(3)
buy = driver.find_element_by_xpath("/html/body/div[3]/div/div/div/div/div/div/div[3]/div/div/div[2]/form/div[2]/button").click()
I am able to click the button by a different XPath:
driver.find_element_by_xpath("//button[@id='checkout_submit']").click()
Edit:
Your XPath (/html/body/div[3]/div/div/div/div/div/div/div[3]/div/div/div[2]/form/div[2]/button) also works for me, but only when I load the page initially and there is no change in the DOM.
It doesn't work when some new event or error is displayed and the DOM structure changes.
Why use such a brittle absolute XPath when the element has distinguishable attributes?
The problem here is that the form is not static: you have to wait for all elements to load.
The page also loads with a cookie-consent pane that can mask the elements you want to act on.
The best approach in that case is to accept the cookies first and then run all the actions you need.
Try to adapt this code to your needs; it ran successfully in my test.
If you run this code, you have to pause execution to log in first, once the driver gets the page.
from selenium import webdriver
from selenium.webdriver.common.action_chains import ActionChains
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.remote.webelement import WebElement

# change this line
path_driver = "your_path_to_chrome_driver"

by, buy_selector, cookies_selector = By.CSS_SELECTOR, 'button#checkout_submit', "button#onetrust-accept-btn-handler"

driver = webdriver.Chrome(path_driver)
driver.maximize_window()
actions = ActionChains(driver)
wait = WebDriverWait(driver, 10)

driver.get("https://www.spotify.com/us/purchase/offer/premium-family/?country=US")

# wait for the buy button to load
sls = wait.until(EC.presence_of_all_elements_located((by, buy_selector)))
if sls:
    # get the accept-cookies button element and click it
    cookies_accept = driver.find_element_by_css_selector(cookies_selector)
    if isinstance(cookies_accept, WebElement):
        cookies_accept.click()
    # get the buy button element, move to it and click
    buy = driver.find_element_by_css_selector(buy_selector)
    if isinstance(buy, WebElement) and buy.is_displayed() and buy.is_enabled():
        actions.move_to_element(buy).click(buy).perform()

How to click same button in another section while scraping using selenium

So I'm scraping with Selenium and I want to click the 'next' button in the 'Defensive' section, but the code I wrote clicks 'next' in 'Summary'.
Here's the URL for you to try:
https://www.whoscored.com/Regions/252/Tournaments/2/Seasons/7361/Stages/16368/PlayerStatistics/England-Premier-League-2018-2019
It does select 'Defensive', and I can see it selected in the window, but the next page doesn't appear. On clicking 'Summary' I found out the 'next' click is actually happening there.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
browser= webdriver.Chrome(executable_path ="C:\Program Files (x86)\Google\Chrome\chromedriver.exe")
browser.get('https://www.whoscored.com/Regions/252/Tournaments/2/Seasons/7361/Stages/16368/PlayerStatistics/England-Premier-League-2018-2019')
browser.find_element_by_xpath("""//*[@id="stage-top-player-stats-options"]/li[2]/a""").click()
element = WebDriverWait(browser, 20).until(EC.presence_of_element_located((By.XPATH, """//*[@id="next"]""")))
browser.execute_script("arguments[0].click();", element)
The XPath for the next button is not unique on this page. Try this:
element = WebDriverWait(browser, 20).until(EC.presence_of_element_located((By.XPATH, "//*[@id='stage-top-player-stats-defensive']//a[@id='next']")))
browser.execute_script("arguments[0].click();", element)
or
element = WebDriverWait(browser, 20).until(EC.presence_of_element_located((By.XPATH, "//*[@id='stage-top-player-stats-defensive']//a[@id='next']")))
element.click()
For each tab (Summary, Defensive, ...) a new next button with the same id=next is added to the DOM.
Select Defensive and you will see there are two next buttons with the same id=next; select Offensive and there will be three.
With a basic id=next selector you always click the first next button, the one from the Summary tab. Because you're clicking via JavaScript, nothing appears to happen; try to click with Selenium's click method instead and you will get an error.
To solve the problem, adjust your selector to be more specific to the DOM: #statistics-paging-defensive #next.
Also, when you first open the page a cookie-acceptance screen appears and blocks the page; you can use a method like the one below to dismiss it.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
import selenium.common.exceptions as EX

def accept_cookies():
    try:
        WebDriverWait(browser, 20)\
            .until(EC.element_to_be_clickable((By.CSS_SELECTOR, "button.qc-cmp-button")))\
            .click()
    except (EX.NoSuchElementException, EX.TimeoutException):
        pass

# ...
baseUrl = 'https://www.whoscored.com/Regions/252/Tournaments/2/Seasons/7361/Stages/16368/PlayerStatistics/England-Premier-League-2018-2019'
browser = webdriver.Chrome(executable_path=r"C:\Program Files (x86)\Google\Chrome\chromedriver.exe")
wait = WebDriverWait(browser, 20)
browser.get(baseUrl)
accept_cookies()
# switch to the Defensive tab, then click its own "next" button
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "[href='#stage-top-player-stats-defensive']"))).click()
next_button = wait.until(
    EC.element_to_be_clickable((By.CSS_SELECTOR, "#statistics-paging-defensive #next")))
next_button.click()
Your element locators must be unique
Avoid using the XPath wildcard * as it will cause performance degradation and prolonged element lookup timings
Avoid using JavascriptExecutor for clicking: a well-behaved Selenium test should do what a real user does, and I doubt a real user would open the browser console and type something like document.getElementById('next').click(); they would use the mouse
Given all of the above, you should come up with a selector which uniquely identifies the next button on the Defensive tab, which would be something like:
//div[@id='statistics-paging-defensive']/descendant::a[@id='next']
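A minimal sketch of how that locator might be used with an explicit wait and a native click (assuming browser is the webdriver instance created earlier in this thread):

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# wait until the Defensive tab's own "next" link is clickable, then click it as a real user would
next_button = WebDriverWait(browser, 20).until(
    EC.element_to_be_clickable((By.XPATH, "//div[@id='statistics-paging-defensive']/descendant::a[@id='next']")))
next_button.click()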
References:
XPath Tutorial
XPath Axes
XPath Operators & Functions
