Python Selenium click()

I am trying to click on the navigation tab 'MPass'; however, it doesn't work.
https://www.gv.com.sg/GVMovies
nav = driver.find_element_by_css_selector('ul.nav.nav-tabs')
tabs = nav.find_elements_by_css_selector('a.ng-binding')[1]
tabs.click()
I get an error.

Try using full XPath:
I am using the Firefox web driver (geckodriver):
from selenium import webdriver

site = 'https://www.gv.com.sg/GVMovies'

# Provide the path to the geckodriver executable
driver = webdriver.Firefox(executable_path=r'C:\Drivers\geckodriver.exe')
driver.get(site)

try:
    driver.find_element_by_xpath("/html/body/div[3]/div[2]/div/div/div[2]/div/ul/li[2]/a").click()
except Exception as e:
    # The click failed; the exception is swallowed here
    pass

Make sure you are waiting for the element to load and become visible in the DOM, and then use find_element_by_css_selector:
selector = driver.find_element_by_css_selector("li[select=\"changeTab('M Pass Movies')\"]")
selector.click()
or
selector = driver.find_elements_by_css_selector("li[class='tab-pane ng-isolate-scope']")[0]
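A hedged sketch of such an explicit wait (the XPath is an assumption about the tab markup and may need adjusting):
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Wait up to 10 seconds for the 'M Pass Movies' tab link to become clickable, then click it
wait = WebDriverWait(driver, 10)
tab = wait.until(EC.element_to_be_clickable((By.XPATH, "//ul[contains(@class, 'nav-tabs')]//a[contains(., 'M Pass')]")))
tab.click()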

You could just invoke click on the a tag containing 'M Pass Movies' via JavaScript and bypass the element in front of it:
nav = driver.find_element_by_css_selector('ul.nav.nav-tabs')
tab = nav.find_element_by_xpath("//*[text()='M Pass Movies']")
driver.execute_script("arguments[0].click();", tab)
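If you'd rather keep a WebDriver-level click instead of a JavaScript one, an ActionChains move-and-click is another option to try (a sketch, reusing the tab element located above):
from selenium.webdriver.common.action_chains import ActionChains

# Move the mouse to the tab so it is in view and interactable, then click it
ActionChains(driver).move_to_element(tab).click(tab).perform()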

Related

Python selenium not working for css attribute selector

I have this code:
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get('https://xxx')
input()  # pause to do some stuff like login, then manually unpause
driver.find_element(By.CSS_SELECTOR, '*[data-xyz="valImLookingFor"]')
If I inspect the element in Chrome (the same Chrome tab that Selenium opened) and type document.querySelector('*[data-xyz="valImLookingFor"]') into the console, it finds the element correctly, but Selenium isn't able to find it. What is wrong?
Try:
driver.find_element(By.CSS_SELECTOR, '[data-xyz*="valImLookingFor"]')
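For context, [data-xyz="valImLookingFor"] only matches when the attribute equals that value exactly, while [data-xyz*="valImLookingFor"] matches any element whose data-xyz contains it as a substring. If the element is instead added to the DOM after the initial page load, an explicit wait may be what is missing; a minimal sketch reusing the selector above:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Wait up to 10 seconds for the element to be present before using it
element = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.CSS_SELECTOR, '[data-xyz*="valImLookingFor"]')))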

Selenium locator only works with inspect tab open

I am trying to scrape names and odds for a project I am working on with Selenium 4, and I am just having an issue with the locator.
When I use driver.find_element(By.XPATH), the XPath I give it only seems to work when I have the inspect window open on that particular page. When I close the inspect window, I get a NoSuchElementException: no such element: Unable to locate element error.
Code:
from selenium import webdriver
from selenium.webdriver.common.by import By
driver = webdriver.Chrome()
url = 'https://www.bet365.com.au/#/AC/B13/C1/D50/E2/F163/'
driver.get(url)
Player1 = driver.find_element(By.XPATH, '/html/body/div[1]/div/div[4]/div[3]/div/div/div/div[1]/div/div/div[2]/div/div/div[1]/div[2]/div/div[1]/div[2]/div[1]/div/div[2]/div/div[1]/div').text
Player2 = driver.find_element(By.XPATH, '/html/body/div[1]/div/div[4]/div[3]/div/div/div/div[1]/div/div/div[2]/div/div/div[1]/div[2]/div/div[1]/div[2]/div[1]/div/div[2]/div/div[2]/div').text
Odds1 = driver.find_element(By.XPATH, '/html/body/div[1]/div/div[4]/div[3]/div/div/div/div[1]/div/div/div[2]/div/div/div[1]/div[2]/div/div[2]/div[2]/span').text
Odds2 = driver.find_element(By.XPATH, '/html/body/div[1]/div/div[4]/div[3]/div/div/div/div[1]/div/div/div[2]/div/div/div[1]/div[2]/div/div[3]/div[2]/span').text
print(f'{Player1}\t{Odds1}')
print(f'{Player2}\t{Odds2}')
Run just the section from Player1 onwards with the inspect window open and without it open. Hopefully you'll be able to replicate the issue.
I also ran:
try:
    if driver.find_element(By.XPATH, '/html/body/div[1]/div/div[4]/div[3]/div/div/div/div[1]/div/div/div[2]/div/div/div[1]/div[2]/div/div[1]/div[2]/div[1]/div/div[2]/div/div[1]/div').text:
        print("yay")
except:
    print("nay")
and it showed the same situation: the element couldn't be found without the inspect window open. See the attached image for where I got the XPaths from.
Many thanks in advance!

Can't find element by XPath or other selectors

I am making an automated JKLM Bomb Party bot with Selenium in Python (to prank my friends). When it is given a link to a private JKLM room, it goes there and confirms the username, but then gets stuck on the "join game" button (I get a TimeoutException).
driver = webdriver.Safari()
driver.get(link)
element = WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.XPATH, "/html/body/div[2]/div[3]/form/div[2]/input")))
element.submit()
element1 = WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.XPATH, "//button[@class='styled joinRound']")))
element1.click()
I have tried Absolute XPath:
/html/body/div[2]/div[3]/div[1]/div[1]/button
Relative XPATH:
//button[@class='styled joinRound']
And Class Name:
styled joinRound
Along with Tag Name and CSS selector.
Any help would be greatly appreciated.
HTML I am trying to access and click on:
<button class="styled joinRound" data-text="joinGame">Join game</button>
I believe you may need to switch to the iframe first in Selenium. I had success with this:
import selenium.webdriver

def main():
    driver = selenium.webdriver.Firefox()
    driver.get('https://jklm.fun/DKCY')
    # The "Join game" button lives inside an iframe, so switch into it first
    driver.switch_to.frame(0)
    xpath = '//div[@class="seating"]/div[@class="join"]/button'
    els = driver.find_elements_by_xpath(xpath)
    if els is None or len(els) == 0:
        print('failed to find element')
        return
    els[0].click()

if __name__ == '__main__':
    main()
See:
https://www.tutorialspoint.com/how-to-handle-frames-in-selenium-with-python
https://selenium-python.readthedocs.io/navigating.html#moving-between-windows-and-frames
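With Selenium's explicit waits, a hedged variant of the same idea looks like this (the frame and button locators are reused from above and may need adjusting):
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
driver.get('https://jklm.fun/DKCY')
wait = WebDriverWait(driver, 10)
# Wait for the game iframe to appear and switch into it in one step
wait.until(EC.frame_to_be_available_and_switch_to_it((By.TAG_NAME, 'iframe')))
# Wait for the "Join game" button to become clickable, then click it
wait.until(EC.element_to_be_clickable((By.XPATH, '//button[@class="styled joinRound"]'))).click()
# Switch back to the top-level document afterwards
driver.switch_to.default_content()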

Selenium click() method on angular.js site

I am scraping an angular.js site. My initial link has a search button. I find by xpath and click with no issues. After I click search, I want to be able to click each of the athletes in the table to go to their info pages, but I am not having success with the click method. The links are attached to their names.
from selenium import webdriver
from selenium.common.exceptions import TimeoutException
TIMEOUT = 5
driver = webdriver.Firefox()
driver.set_page_load_timeout(TIMEOUT)
url = 'https://n.rivals.com/search#?formValues=%7B%22sport%22:%22Football%22,%22recruit_year%22:2021,%22offer_and_visit_type%22:%5B%22Offer%22%5D,%22prospect_profiles.prospect_colleges.offer%22:true,%22page_number%22:1,%22page_size%22:50%7D'
try:
    driver.get(url)
except TimeoutException:
    pass

search_button = driver.find_element_by_xpath('//*[@id="articles"]/div/div[2]/div/div/div[1]/form/div[2]/div[5]/button')
search_button.click()

# below is where I tried, but could not get to click
first_athlete = driver.find_element_by_xpath('//*[@id="content_"]/td[1]/div[2]/a')
first_athlete.click()
It works if you remove the last /a in the XPath:
first_athlete = driver.find_element_by_xpath('//*[#id="content_"]/td[1]/div[2]')
first_athlete.click()
If you have the athletes' names, you can use a CSS selector as well:
athlete = driver.find_element_by_css_selector('#content_ > td > div > a[href*="donovan-jackson"]')
athlete.click()
This selector gives you a unique web element for each player, since the href contains the player's name.
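If the goal is to visit every athlete in the results, one approach is to collect the profile links first and then load them one by one (a sketch; the selector without the href filter is an assumption about the results table markup):
# Collect all profile links from the results table, then visit each page
links = driver.find_elements_by_css_selector('#content_ > td > div > a')
hrefs = [link.get_attribute('href') for link in links]
for href in hrefs:
    driver.get(href)
    # ... scrape the athlete's info page here ...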
Thanks

Python + Selenium can't find element by XPath

I am trying to click on the "chercher" button on the left of the page (around the middle).
url = "https://www.fpjq.org/repertoires/repertoire-des-medias/"
driver = webdriver.Firefox()
driver.get(url)
time.sleep(2)
driver.find_element_by_xpath('//*[#id="recherche"]/input[3]').click()
However, it can't find the element. I copy-pasted the XPath, so I am not sure why it's not working.
Thanks.
That's because the required button is located inside an iframe, and to be able to click it you need to switch to that iframe first:
url = "https://www.fpjq.org/repertoires/repertoire-des-medias/"
driver = webdriver.Firefox()
driver.get(url)
time.sleep(2)
driver.switch_to.frame(driver.find_element_by_tag_name("iframe"))
driver.find_element_by_xpath('//*[#id="recherche"]/input[3]').click()
Also note that using time.sleep() is not good practice; you can implement an explicit wait (WebDriverWait) instead.
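A hedged sketch of the same flow with explicit waits in place of time.sleep() (locators reused from above):
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
driver.get("https://www.fpjq.org/repertoires/repertoire-des-medias/")
wait = WebDriverWait(driver, 10)
# Wait for the iframe to appear and switch into it in one step
wait.until(EC.frame_to_be_available_and_switch_to_it((By.TAG_NAME, "iframe")))
# Wait for the "chercher" button to become clickable, then click it
wait.until(EC.element_to_be_clickable((By.XPATH, '//*[@id="recherche"]/input[3]'))).click()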
