How do I click a button on cookies pop up using Selenium? - python

Hi, I want to click 'Save Services' using Selenium on this website to make the cookie pop-up disappear: https://www.hugoboss.com/uk/home. However, I receive a TimeoutException.
import numpy as np
import pandas as pd
import time
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driverfile = r'C:\Users\Main\Documents\Work\Projects\extra\chromedriver'
driver = webdriver.Chrome(executable_path=driverfile)
driver.get("https://www.hugoboss.com/uk/men/")
WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.XPATH,"//button[contains(text(),'SAVE SERVICES')]"))).click()
Further information: when I try to find the button's XPath by its attribute, //button[@data-testid='uc-save-button'], in the inspect-element finder, it returns 0 results, as if the button does not exist.
I ran len(driver.window_handles) 10 seconds after the page loaded and it returned 1, meaning Selenium could only see one window open.

Your element is in a shadow root. Find your element in devtools, scroll up, and you'll see the #shadow-root node in the DOM, attached to its host element (the div with id usercentrics-root).
To get into the shadow root, an easy way is to get its host element and then use JS to get the shadow-root object.
Within that returned object you can find your button, as sketched below:
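A minimal sketch of that JS route, using the host element and button selector from the page above (with recent Chrome/chromedriver, execute_script returns a shadow-root object you can search):
# Sketch: grab the shadow host, ask the browser for its shadowRoot via JS,
# then look up the button inside the returned object.
host = driver.find_element(By.CSS_SELECTOR, "#usercentrics-root")
shadow = driver.execute_script("return arguments[0].shadowRoot", host)
shadow.find_element(By.CSS_SELECTOR, "button[data-testid='uc-save-button']").click()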
Edit: updated the code from the original answer. This runs for me:
driver = webdriver.Chrome()
driver.implicitly_wait(10)
url = "https://www.hugoboss.com/uk/home"
driver.get(url)
# the cookie banner lives in a shadow root attached to div#usercentrics-root
shadowRoot = driver.find_element(By.XPATH, "//div[@id='usercentrics-root']").shadow_root
shadowRoot.find_element(By.CSS_SELECTOR, "button[data-testid='uc-save-button']").click()
#########
Update: pip list tells me I'm using selenium 4.1.3.

Related

Use Python Selenium to extract span text

Hi, I'm new to Selenium and web scraping and I need some help.
I'm trying to scrape one site and I don't know how to get the span class.
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
import time
PATCH = "/Users/bobo/Downloads/chromedriver"
driver = webdriver.Chrome(PATCH)
driver.get("https://neonet.pl")
print(driver.title)
search = driver.find_element_by_class_name("inputCss-input__label-263")
search.send_keys(Keys.RETURN)
time.sleep(5)
I'm trying to extract this span:
<span class="inputCss-input__label-263">Szukaj produktu</span>
I can see that you are trying to search for something in the search bar.
First, I recommend you use the XPath instead of the class name. Here is a simple technique to get the XPath of any element on a webpage:
right-click / Inspect Element / select the element picker (the mouse-in-a-box icon in the upper left) / click the element on the webpage / it will show you the corresponding HTML directly / right-click the highlighted HTML / Copy / Copy XPath.
Here is a code example that searches for an element on the webpage. I also included WebDriverWait because sometimes the code runs too fast and can't find the next element, so the wait makes the code pause until the element is visible:
from selenium import webdriver
from selenium.webdriver.common.by import By
from time import sleep
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driver = webdriver.Chrome(executable_path="/Users/bobo/Downloads/chromedriver")
driver.get("https://neonet.pl") #loading page
wait = WebDriverWait(driver, 20) #defining webdriver wait
search_word = 'iphone\n' # \n is going to act as an enter key
wait.until(EC.visibility_of_element_located((By.XPATH, '//*[@id="root"]/main/div[1]/div[4]/div/div/div[2]/div/button[1]'))).click() #clicking on cookies popup
wait.until(EC.visibility_of_element_located((By.XPATH, '//*[@id="root"]/main/header/div[2]/button'))).click() #clicking on search button
wait.until(EC.visibility_of_element_located((By.XPATH, '//*[@id="root"]/aside[2]/section/form/label/input'))).send_keys(search_word) #searching on input button
print('done!')
sleep(10)
Hope this helped you!
wait=WebDriverWait(driver,10)
driver.get('https://neonet.pl')
elem = wait.until(EC.visibility_of_element_located((By.XPATH, "//span[contains(@class,'inputCss-input')]"))).text
print(elem)
To output the text of that span, you use .text on the Selenium WebElement.
Imports:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
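For completeness, a minimal self-contained sketch of the snippet above with the driver setup filled in (assuming chromedriver is available on your PATH):
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driver = webdriver.Chrome()
driver.get('https://neonet.pl')
wait = WebDriverWait(driver, 10)
# wait until the label span is visible, then read its text
elem = wait.until(EC.visibility_of_element_located((By.XPATH, "//span[contains(@class,'inputCss-input')]"))).text
print(elem)
driver.quit()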

Python 2.7 Selenium No Such Element on Website

I'm trying to do some web scraping from a betting website.
As part of the process, I have to click on the different buttons under the "Favourites" section on the left side to select different competitions.
Let's take the ENG Premier League button as an example. I identified the button in the developer tools.
The XPath is //*[@id="SportMenuF"]/div[3] and the ID is 91.
My code for clicking on the button is as follows:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
chrome_path = r"C:\Python27\Scripts\chromedriver_win32\chromedriver.exe"
driver = webdriver.Chrome(chrome_path)
driver.get("URL Removed")
content = driver.find_element_by_xpath('//*[@id="SportMenuF"]/div[3]')
content.click()
Unfortunately, I always get this error message when I run the script:
"no such element: Unable to locate element:
{"method":"xpath","selector":"//*[#id="SportMenuF"]/div[3]"}"
I have tried different identifiers such as CSS selector, ID and, as shown in the example above, the XPath. I tried using waits and explicit conditions, too. None of this has worked.
I also attempted scraping some values from the website without any success:
from selenium import webdriver
from selenium.webdriver.common.by import By
chrome_path = r"C:\Python27\Scripts\chromedriver_win32\chromedriver.exe"
driver = webdriver.Chrome(chrome_path)
driver.get("URL removed")
content = driver.find_elements_by_class_name('price-val')
for entry in content:
    print entry.text
Same problem, nothing shows up.
The website embeds an iframe from a different website. Could this be the cause of my problems? I tried scraping directly from the iframe URL, too, which didn't work either.
I would appreciate any suggestions.
Sometimes elements are either hiding behind an iframe, or they haven't loaded yet.
For the iframe check, try:
driver.switch_to.frame(0)
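If the content does sit in an iframe, here is a sketch of the full round trip (the frame locator is a placeholder; it assumes By has been imported as in the snippet further down):
frame = driver.find_element(By.TAG_NAME, "iframe")  # placeholder locator for the frame
driver.switch_to.frame(frame)
# ... locate and interact with elements inside the iframe here ...
driver.switch_to.default_content()  # switch back to the main page afterwards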
For the wait check, try:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
element = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.XPATH, '-put the x-path here-')))

Cannot select element on page with any kind of selector with python selenium

I'm trying to simply fill a login form, but no matter how much I try, I just cannot. I've been trying for two days with all kinds of selectors, nothing. Here is my code:
# -*- coding: iso-8859-2 -*-
from __future__ import print_function
import pyautogui, sys
import time
import random
import subprocess
import os
import urllib
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait as wait
from selenium.webdriver.common.keys import Keys
options = webdriver.ChromeOptions()
options.add_argument("user-data-dir=C:\Users\Administrator\AppData\Local\Google\Chrome\User Data") #Path to your chrome profile
options.add_argument("disable-infobars")
options.add_argument("--start-maximized")
driver = webdriver.Chrome(executable_path="C:\\chromedriver.exe", chrome_options=options)
driver.get("https://awario.com/login?r=%2F")
wait(driver, 20).until(EC.element_to_be_clickable((By.XPATH, "//*[@id='loginform-email']"))).click()
wait(driver, 20).until(EC.element_to_be_clickable((By.XPATH, "//*[@id='loginform-email']"))).send_keys('email')
wait(driver, 20).until(EC.element_to_be_clickable((By.XPATH, "//*[@id='loginform-password']"))).send_keys('pass')
I recommend you run and debug the script locally using the Selenium server (Remote WebDriver) approach, as in this guide: http://selenium-python.readthedocs.io/getting-started.html#using-selenium-with-remote-webdriver
The console window running the Selenium server will print out a log of every operation on the browser, as below:
INFO - Executing: [find elements: By.cssSelector: a[ng-if*="uxdRegion"]])
INFO - Done: [find elements: By.cssSelector: a[ng-if*="uxdRegion"]]
INFO - Executing: [get element attribute: 0 [css selector: a[ng-if*="uxdRegion"]], href])
INFO - Done: [get element attribute: 0 [] -> css selector: a[ng-if*="uxdRegion"]], href]
It stops running and prints the log when it hits an exception.
From the log you can easily tell which step of the script failed and why.
FYI, the way of using Selenium in your script above is what's called 'directconnect'.
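As a rough sketch of that Remote WebDriver setup (assuming a Selenium standalone server is already listening locally on the default port 4444):
from selenium import webdriver
# assumes a Selenium server is running at localhost:4444; its console will then
# log every find/click/send_keys command the script executes
options = webdriver.ChromeOptions()
driver = webdriver.Remote(command_executor="http://localhost:4444/wd/hub", options=options)
driver.get("https://awario.com/login?r=%2F")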
I tried your XPath "//*[@id='loginform-email']" on your login page on my laptop at a normal browser size, and it finds 3 matching elements.
The first and second are not visible; they are used for small browser sizes, such as opening your site on a mobile device. Only the third one was visible in my test.
For Selenium, an element that is not visible is never clickable.
In your script you used the API EC.element_to_be_clickable(). Because there is more than one matching element, it uses the first one, which is not visible and therefore not clickable, so EC.element_to_be_clickable() never returns an element for the subsequent click.
You can try a stricter locator, as in the code below, on desktop with a maximized browser window to verify whether my answer is correct.
css_locator_email = ".top-index-section + div #loginform-email"
wait(driver, 20).until(EC.element_to_be_clickable((By.CSS_SELECTOR, css_locator_email))).click()
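Alternatively, a sketch of filtering for the visible match yourself (this helper logic is my own illustration, not part of the original answer):
# find every element matching the original XPath and keep the one that is displayed
emails = driver.find_elements(By.XPATH, "//*[@id='loginform-email']")
visible_email = next(el for el in emails if el.is_displayed())
visible_email.send_keys('email')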

click() doesn't work in selenium

I am currently using Selenium with Python and my WebDriver is Firefox.
I tried the click event but it doesn't work.
Website: www.cloudsightapi.com/api
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.common.exceptions import TimeoutException
import time
from selenium.webdriver.common.action_chains import ActionChains
import os
driver = webdriver.Firefox()
driver.get("http://cloudsightapi.com/api")
wait = WebDriverWait(driver, 10)
element = wait.until(EC.element_to_be_clickable((By.ID, "dropzoneTarget")))
element.click()
Please help!
Clicking via javascript worked for me:
element = wait.until(EC.element_to_be_clickable((By.ID, "dropzoneTarget")))
driver.execute_script("arguments[0].click();", element)
Now, the other problem is that clicking that element would only get you into more trouble: a file-upload popup will open, and you cannot control it via Selenium.
A common way to approach the problem is to find the file input and set its value to the absolute path of the file you want to upload; see:
How to upload file ( picture ) with selenium, python
In your case the input is hidden; make it visible and send the path to it:
element = wait.until(EC.presence_of_element_located((By.CSS_SELECTOR, "input.dz-hidden-input[type=file]")))
# make the input visible
driver.execute_script("arguments[0].style = {};", element)
element.send_keys("/absolute/path/to/image.jpg")

Unable to submit keys using selenium with python

Following is the code which I'm trying to run:
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
import os
import time
#Create a new firefox session
browser=webdriver.Firefox()
browser.maximize_window()
#navigate to app's homepage
browser.get('http://demo.magentocommerce.com/')
#get searchbox and clear and enter details.
browser.find_element_by_css_selector("a[href='/search']").click()
search=browser.find_element_by_class_name('search-input')
search.click()
time.sleep(5)
search.click()
search.send_keys('phones'+Keys.RETURN)
However, I'm unable to submit 'phones' using send_keys.
Am I going wrong somewhere?
Secondly, is it possible to always use XPath to locate an element and not rely on id/class/CSS selectors etc.?
The input element you are interested in has the search_query class name. To make it work without using hardcoded time.sleep() delays, use an Explicit Wait and wait for the search input element to be visible before sending keys to it. Working code:
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
browser = webdriver.Firefox()
browser.maximize_window()
wait = WebDriverWait(browser, 10)
browser.get('http://demo.magentocommerce.com/')
browser.find_element_by_css_selector("a[href='/search']").click()
search = wait.until(EC.visibility_of_element_located((By.CLASS_NAME, "search-query")))
search.send_keys("phones" + Keys.RETURN)
