Selenium GDPR NoSuchElementException - python

I want to scrape some data from "https://www.techadvisor.co.uk/review/wearable-tech/". I figured out that looping through the pages with BeautifulSoup does not work, which is why I tried to open the page with Selenium. However, the "Accept All" button of the GDPR blocker cannot be located.
I tried:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
browser = webdriver.Chrome()
browser.get("https://www.techadvisor.co.uk/review/wearable-tech/")
# button = browser.find_element_by_xpath('/html/body/div/div[3]/div[5]/button[2]')
# WebDriverWait(browser, 20).until(EC.element_to_be_clickable((By.XPATH, "html/body/div/div[3]/div[5]/button[2]"))).click()
I always receive NoSuchElementException.
To be honest, I found the XPath really weird, but I got it from Chrome's Inspect tool.
Every solution proposal or tip is appreciated :)

The Accept All button is inside an iframe, so you need to switch to the iframe first in order to click the button.
Induce WebDriverWait() with frame_to_be_available_and_switch_to_it() and the following CSS selector.
Then induce WebDriverWait() with element_to_be_clickable() and the following XPath selector.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
browser = webdriver.Chrome()
browser.get("https://www.techadvisor.co.uk/review/wearable-tech/")
# wait for the consent iframe and switch into it, then click Accept All
WebDriverWait(browser,10).until(EC.frame_to_be_available_and_switch_to_it((By.CSS_SELECTOR,"iframe[id^='sp_message_iframe']")))
WebDriverWait(browser, 10).until(EC.element_to_be_clickable((By.XPATH, "//button[text()='Accept All']"))).click()
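Since the banner lives inside an iframe, one extra step worth adding (a sketch, assuming the rest of the page is usable once the banner is accepted) is to switch back to the top-level document before scraping, for example by handing the rendered HTML to BeautifulSoup as originally intended:
from bs4 import BeautifulSoup

# leave the consent iframe and go back to the main document
browser.switch_to.default_content()
# hand the rendered page over to BeautifulSoup for scraping
soup = BeautifulSoup(browser.page_source, "html.parser")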

I know the question is old,
but I would like to provide my own solution!
The first step is to identify the "id" of the frame you are actually viewing, and then you need to move the focus onto it!
driver.switch_to.frame(driver.find_element_by_xpath('//*[@id="gdpr-consent-notice"]'))
cookies = driver.find_element_by_xpath('/html/body/app-root/app-theme/div/div/app-notice/app-theme/div/div/app-home/div/div[3]/div[2]/a[3]/span')
cookies.click()
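A slightly more robust variant of the same idea (a sketch, assuming the same gdpr-consent-notice frame id and button path) is to wait for the frame and the button instead of locating them immediately:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

wait = WebDriverWait(driver, 20)
# wait until the consent iframe is available, then switch into it
wait.until(EC.frame_to_be_available_and_switch_to_it((By.XPATH, '//*[@id="gdpr-consent-notice"]')))
# wait for the cookies button to become clickable before clicking it
wait.until(EC.element_to_be_clickable((By.XPATH, '/html/body/app-root/app-theme/div/div/app-notice/app-theme/div/div/app-home/div/div[3]/div[2]/a[3]/span'))).click()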

Clicking on button not working in selenium + scrapy

I want to scrape links to news articles using scrapy + selenium. The website I am using uses a 'Load more' button, so I obviously want selenium to click on this button to load all articles.
I have looked for similar questions and tried various options already such as
element = driver.find_element(By.XPATH, value='//*[@id="fusion-app"]/main/div/div/div/div/div[4]/div/div/button')
driver.execute_script("arguments[0].click();", element)
and
element = WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".ais-InfiniteHits-loadMore")))
ActionChains(driver).move_to_element(element).click().perform()
All to no avail. I've also inserted some print statements in between to check whether the code runs, and that seems fine; I think it's just a matter of the button not being located/clicked.
This is the html of the button btw:
<button class="ais-InfiniteHits-loadMore">Load more </button>
And when I print element, this is what I get: <selenium.webdriver.remote.webelement.WebElement (session="545716eef622a12bdbeddef99e02bdef", element="551741ec-4616-4bd4-b8fd-57c2f4bffb00")>
Is someone able to help me out? Thank you in advance.
options = webdriver.ChromeOptions()
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()), options=options)
driver.maximize_window()
wait = WebDriverWait(driver, 30)
driver.get('https://www.businessoffashion.com/search/?q=Louis+Vuitton&f=Articles%2CFashion+Shows%2CNews')
# close the pop-up first, then click the Load more button via JavaScript
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, 'button.ab-close-button'))).click()
elem = wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".ais-InfiniteHits-loadMore")))
driver.execute_script("arguments[0].click()", elem)
You hit two different errors, a pop-up and an element click interception, when you can simply use JavaScript to click that element.
Imports:
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from webdriver_manager.chrome import ChromeDriverManager
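Since the goal is to load all articles, a minimal sketch (reusing the driver and wait from the snippet above, and assuming the Load more button stops being clickable once everything is loaded) is to keep clicking in a loop:
from selenium.common.exceptions import TimeoutException

while True:
    try:
        # wait for the Load more button and click it via JavaScript
        elem = wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".ais-InfiniteHits-loadMore")))
        driver.execute_script("arguments[0].click()", elem)
    except TimeoutException:
        # no clickable Load more button left, so all articles are loaded
        break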

Selenium WebDriver find_element_by_xpath not working for text

I'm trying to click on a link on a webpage that has no ID and no individual class. The only thing to lock it down to is the text 'Sessions'.
I have tried:
driver.find_element_by_xpath("//*[contains(text(),'Sessions')]");
driver.find_element_by_xpath("//*[text()='Sessions']");
Both come back with "No such element".
Edit: I have also tried driver.find_element_by_link_text which also didn't work.
I've tried using the full xpath:
/html/body/div/div/div[1]/div/nav/a[3]
To no avail.
That is link text since it's inside an anchor tag, so use this:
driver.find_element_by_link_text('Sessions').click()
or
A much better approach is to use explicit waits:
wait = WebDriverWait(driver, 10)
element = wait.until(EC.element_to_be_clickable((By.LINK_TEXT, 'Sessions')))
element.click()
If you want to use explicit waits, you need the imports below:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
If the above gives you NoSuchElementException, I would suspect the element is inside an iframe (in the screenshot the first tag I can see is body); if that is the case, you would need to switch to the iframe first and then continue with this web element.
Code
wait = WebDriverWait(driver, 10)
wait.until(EC.frame_to_be_available_and_switch_to_it((By.XPATH, "iframe xpath here")))
wait.until(EC.element_to_be_clickable((By.PARTIAL_LINK_TEXT, "Sessions"))).click()
Imports:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
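If you do switch into an iframe this way, a small follow-up worth remembering (a general sketch, not specific to this page) is to switch back once you're done:
# return to the top-level document after working inside the iframe
driver.switch_to.default_content()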

How to click same button in another section while scraping using selenium

So I'm scraping using selenium and I want to click 'next' button in 'Defensive' section but the code I wrote clicks 'next' on 'Summary'.
Here's the url for you to try :
https://www.whoscored.com/Regions/252/Tournaments/2/Seasons/7361/Stages/16368/PlayerStatistics/England-Premier-League-2018-2019
So it's selecting 'Defensive' and I can see it selected in the window, but the next page doesn't appear. By clicking 'Summary' I found out that the next function is actually happening there instead.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
browser = webdriver.Chrome(executable_path=r"C:\Program Files (x86)\Google\Chrome\chromedriver.exe")
browser.get('https://www.whoscored.com/Regions/252/Tournaments/2/Seasons/7361/Stages/16368/PlayerStatistics/England-Premier-League-2018-2019')
browser.find_element_by_xpath("""//*[@id="stage-top-player-stats-options"]/li[2]/a""").click()
element = WebDriverWait(browser, 20).until(EC.presence_of_element_located((By.XPATH, """//*[@id="next"]""")))
browser.execute_script("arguments[0].click();", element)
The XPath for the next button is not unique on this page. Try this:
element = WebDriverWait(browser, 20).until(EC.presence_of_element_located((By.XPATH, "//*[@id='stage-top-player-stats-defensive']//a[@id='next']")))
browser.execute_script("arguments[0].click();", element)
or
element = WebDriverWait(browser, 20).until(EC.presence_of_element_located((By.XPATH, "//*[@id='stage-top-player-stats-defensive']//a[@id='next']")))
element.click()
For each tab (Summary, Defensive, ...) a new next button with the same id=next is added to the DOM.
Select Defensive and you will see there are two next buttons with the same id=next; select Offensive and there will be three.
With the basic id=next selector you always click the first next button, the one from the Summary tab. Because you're clicking it with JavaScript, nothing appears to happen; try to click it with Selenium's click method instead and you will get an error.
To solve the problem, adjust your selector to be more specific to the DOM - #statistics-paging-defensive #next.
Also, when you first open the page there's a cookies acceptance screen that blocks the page; you can use a method like the one below to skip it.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
import selenium.common.exceptions as EX

def accept_cookies():
    try:
        WebDriverWait(browser, 20)\
            .until(EC.element_to_be_clickable((By.CSS_SELECTOR, "button.qc-cmp-button")))\
            .click()
    except (EX.NoSuchElementException, EX.TimeoutException):
        pass

# ...
browser = webdriver.Chrome(executable_path=r"C:\Program Files (x86)\Google\Chrome\chromedriver.exe")
wait = WebDriverWait(browser, 20)
browser.get('https://www.whoscored.com/Regions/252/Tournaments/2/Seasons/7361/Stages/16368/PlayerStatistics/England-Premier-League-2018-2019')
accept_cookies()
wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, "[href='#stage-top-player-stats-defensive']"))).click()
next_button = wait.until(
    EC.element_to_be_clickable((By.CSS_SELECTOR, "#statistics-paging-defensive #next")))
next_button.click()
Your element locators must be unique.
Avoid using XPath wildcards - * - as they cause performance degradation and prolonged element lookup timings.
Avoid using JavascriptExecutor for clicking: a well-behaved Selenium test should do what a real user does, and I doubt a real user would open the browser console and type something like document.getElementById('next').click(); they would use the mouse.
Assuming all of the above, you should come up with a selector which uniquely identifies the next button on the Defensive tab, which would be something like the one below, used in the sketch that follows:
//div[@id='statistics-paging-defensive']/descendant::a[@id='next']
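As a minimal sketch (reusing the browser and WebDriverWait imports from the question), that XPath can be combined with an explicit wait and a native click:
# wait until the Defensive tab's next button is clickable
next_button = WebDriverWait(browser, 20).until(
    EC.element_to_be_clickable(
        (By.XPATH, "//div[@id='statistics-paging-defensive']/descendant::a[@id='next']")))
# real-user style click instead of a JavaScript click
next_button.click()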
References:
XPath Tutorial
XPath Axes
XPath Operators & Functions

How to click on the popup element in chrome using Selenium and Python

While testing my automation code, I ran into a horrible web page.
When I click an element, the page presents a new browser window and an alert.
After that, I can't do anything because the alert cannot be dismissed.
How can I get through this?
My environment is as following:
Python 3.6.7
Selenium 3.141.0
Please try this.
from selenium import webdriver
driver = webdriver.Chrome()
driver.get('https://www.kebhana.com/foreign/index.do')
el = driver.find_element_by_xpath('//*[@id="header"]/div[2]/div/div[2]/div/div/ul/li[21]/ul/li[2]/ul/li[4]/a')
driver.execute_script("arguments[0].click();",el)
Then, you can see a new browser window with an alert.
And I can't find any solution to dismiss that alert.
If you have some brilliant way to handle the alert, please show me.
Here is the logic to switch to the new window and then accept the alert.
# this will switch to the new window
driver.switch_to.window(driver.window_handles[-1])
# now accept the alert
driver.switch_to.alert.accept()
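If you then need to keep working in the original window, a short follow-up sketch (assuming the original window handle is still the first entry in window_handles) is:
# dismiss the alert instead, if accepting it is not what you want
# driver.switch_to.alert.dismiss()

# switch back to the original window once the alert is handled
driver.switch_to.window(driver.window_handles[0])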
The element is a Layer Message and part of the HTML DOM, so to locate and click/dismiss it you have to induce WebDriverWait for element_to_be_clickable(), and you can use either of the following solutions:
Using CSS_SELECTOR:
WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.CSS_SELECTOR, "a#opbLayerMessage0_OK[href$='HanaBank']"))).click()
Using XPATH:
WebDriverWait(driver, 20).until(EC.element_to_be_clickable((By.XPATH, "//a[#id='opbLayerMessage0_OK' and contains(#href, 'HanaBank')]"))).click()
Note: You have to add the following imports:
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC

How to click on the list of the <li> elements in an <ul> elements with selenium in python?

I tried to select 2002 in a dropdown menu.
It doesn't work at any rate.
I used the XPath
driver.find_element_by_xpath("html/body/main/div/form/div[3]/div[1]/section/div[3]/fieldset/div[7]/dl[1]/dd/ul/li[1]/a").click()
but it doesn't work. I tried all the solutions I found...
How can I select this?
If you're able to open the dropdown but unable to click on an item, you should try using explicit waits with WebDriverWait to wait until the element is visible and clickable, as below:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
element = WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.CSS_SELECTOR, "ul#ulBirthYear a[data-value='2002']")))
element.click()
Or
element = WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.LINK_TEXT, "2002")))
element.click()
First of all, try to avoid using absolute XPATH.
Use something like this:
'//ul[@id="uiBirthYear"]/li/a[@data-value="2002"]'
Also ensure that the DOM is fully built before you try to get/click this element.
Try to set an implicit wait
driver.implicitly_wait(10)
or an explicit wait (read more: http://selenium-python.readthedocs.io/waits.html)
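For example, a minimal sketch combining a relative XPath with an explicit wait (assuming the ulBirthYear id from the earlier answer) could look like this:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

wait = WebDriverWait(driver, 10)
# wait until the 2002 option in the birth-year dropdown is clickable, then click it
wait.until(EC.element_to_be_clickable(
    (By.XPATH, '//ul[@id="ulBirthYear"]/li/a[@data-value="2002"]'))).click()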
