I am writing a script in Python using Selenium.
This code should switch focus to an iframe on the webpage.
But instead, I get this error:
selenium.common.exceptions.NoSuchElementException: Message: Unable to locate element: //iframe[@id="lightbox_iframe_4"]
Here is the code:
iframe = browser.find_element("xpath", '//iframe[@id="lightbox_iframe_4"]')
browser.switch_to.frame(iframe)
Please help: how do I get the iframe element?
Here are all the imported modules:
from selenium import *
from time import sleep
from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
I have tried adding other modules and changing browser. to driver./webdriver., but nothing helped.
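A minimal sketch of one likely fix, assuming the id lightbox_iframe_4 is correct and the frame is added dynamically: quote the XPath so the Python string does not break, and wait for the frame before switching to it (the URL below is a placeholder, since the question does not include one).
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
browser = webdriver.Firefox()
browser.get("https://example.com")  # placeholder URL
# Wait until the iframe exists, then switch to it in one step
WebDriverWait(browser, 10).until(
    EC.frame_to_be_available_and_switch_to_it((By.XPATH, '//iframe[@id="lightbox_iframe_4"]'))
)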
Related
I was trying to access the search bar of this website: https://www.kissanime.ru
using Selenium. I tried it using XPath, class name, and CSS selector, but every time this error pops up in the terminal:
selenium.common.exceptions.NoSuchElementException: Message: Unable to
locate element: //*[@id="keyword"]
My approach to the problem was:
from selenium import webdriver
from selenium.webdriver.support.select import Select
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
driver=webdriver.Firefox()
driver.get("https://kissanime.ru/")
driver.maximize_window()
search = driver.find_element_by_xpath('//*[@id="keyword"]')
search.send_keys("boruto")
search.send_keys(Keys.RETURN)
Try adding an explicit wait:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions
search = WebDriverWait(driver, 10).until(expected_conditions.visibility_of_element_located((By.ID, "keyword")))
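The wait returns the located element, so the rest of the question's flow can be chained onto it. A short sketch, reusing the keyword locator from the question:
search = WebDriverWait(driver, 10).until(
    expected_conditions.visibility_of_element_located((By.ID, "keyword"))
)
search.send_keys("boruto")
search.send_keys(Keys.RETURN)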
Alternatively, add an implicit wait to avoid the race condition:
driver.implicitly_wait(20)
I've been trying to scrape some information from this e-commerce website with Selenium. However, when I access the website I need to accept cookies to continue. This only happens when the bot accesses the website, not when I do it manually. When I try to find the corresponding element by XPath, exactly as I find it when I inspect the page manually, I always get this error message:
selenium.common.exceptions.StaleElementReferenceException: Message: stale element reference: element is not attached to the page document
My code is mentioned below.
import time
import pandas
# pip install selenium
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.common.exceptions import TimeoutException
from bs4 import BeautifulSoup #pip install beautifulsoup4
PATH = "/Users/Ziye/Desktop/Python/chromedriver"
delay = 15
driver = webdriver.Chrome(PATH)
driver.implicitly_wait(10)
driver.get("https://en.zalando.de/women/?q=michael+michael+kors+taschen")
driver.find_element_by_xpath('//*[@id="uc-btn-accept-banner"]').click()
This is the HTML corresponding to the "That's OK" button. The XPath is as above.
<button aria-label="" id="uc-btn-accept-banner" class="uc-btn uc-btn-primary">
That’s OK <span id="uc-optin-timer-display"></span>
</button>
Does anyone know where my mistake lies?
You should add an explicit wait for this button:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.wait import WebDriverWait
driver = webdriver.Chrome(executable_path='/snap/bin/chromium.chromedriver')
driver.implicitly_wait(10)
driver.get("https://en.zalando.de/women/?q=michael+michael+kors+taschen")
wait = WebDriverWait(driver, 15)
wait.until(EC.element_to_be_clickable((By.XPATH, '//*[@id="uc-btn-accept-banner"]')))
driver.find_element_by_xpath('//*[@id="uc-btn-accept-banner"]').click()
Your locator is correct.
As a CSS selector, you can use .uc-btn-footer-container .uc-btn.uc-btn-primary
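A small follow-up sketch: element_to_be_clickable returns the element, so the click can be chained directly onto the wait, and the CSS selector above can be swapped in for the XPath.
wait = WebDriverWait(driver, 15)
wait.until(EC.element_to_be_clickable((By.XPATH, '//*[@id="uc-btn-accept-banner"]'))).click()
# or, with the CSS selector mentioned above:
# wait.until(EC.element_to_be_clickable((By.CSS_SELECTOR, ".uc-btn-footer-container .uc-btn.uc-btn-primary"))).click()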
I am using Selenium in Python and Colab.
I have some code that works in Spyder and extracts elements, but it gives me an error in Colab:
NoSuchElementException: Message: no such element: Unable to locate element:
What is a possible explanation, and is it possible to fix this problem?
Try using this method of waiting before an element is accessed:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support import expected_conditions as EC
from selenium.common.exceptions import NoSuchElementException
driver = webdriver.Chrome(executable_path='path')
waitshort = WebDriverWait(driver, 0.5)
wait = WebDriverWait(driver, 20)
waitLonger = WebDriverWait(driver, 100)
visible = EC.visibility_of_element_located
driver.get('website')
element = wait.until(visible((By.XPATH,'element_xpath'))).click()
I have already tried several methods to click a link on a specific website with the help of Selenium. All of them result in the following error message:
ElementClickInterceptedException: Message: element click intercepted: Element LMGP06050001 is not clickable at point (159, 364). Other element would receive the click: ...
(Session info: chrome=89.0.4389.90)
What am I doing wrong?
The goal is to reach the following site and grab several pieces of data from there:
https://www.lipidmaps.org/data/LMSDRecord.php?LMID=LMGP06050001
Below is my code so far:
from time import sleep
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
PATH = "C:\\Users\\xxxxxx\\anaconda3\\chromedriver.exe"
browser = webdriver.Chrome(PATH)
browser.get("https://www.lipidmaps.org/data/structure/LMSDSearch.php?Mode=ProcessClassSearch&LMID=LMGP0605")
link = browser.find_element_by_link_text("LMGP06050001")
browser.implicitly_wait(5)
link.click()
The reason you faced that issue is that a cookie pop-up appeared, and you need to accept the cookies first.
Use WebDriverWait() and wait for element_to_be_clickable():
browser.get("https://www.lipidmaps.org/data/structure/LMSDSearch.php?Mode=ProcessClassSearch&LMID=LMGP0605")
# Accept the cookie banner
WebDriverWait(browser,20).until(EC.element_to_be_clickable((By.CSS_SELECTOR,"button#cookie_notice_accept"))).click()
WebDriverWait(browser,20).until(EC.element_to_be_clickable((By.LINK_TEXT,"LMGP06050001"))).click()
You need to import the libraries below:
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
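Putting the answer together as one runnable sketch, reusing the chromedriver path from the question:
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
PATH = "C:\\Users\\xxxxxx\\anaconda3\\chromedriver.exe"
browser = webdriver.Chrome(PATH)
browser.get("https://www.lipidmaps.org/data/structure/LMSDSearch.php?Mode=ProcessClassSearch&LMID=LMGP0605")
# Accept the cookie banner first, then click the link
WebDriverWait(browser, 20).until(EC.element_to_be_clickable((By.CSS_SELECTOR, "button#cookie_notice_accept"))).click()
WebDriverWait(browser, 20).until(EC.element_to_be_clickable((By.LINK_TEXT, "LMGP06050001"))).click()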
from selenium import webdriver
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from webdriver_manager.chrome import ChromeDriverManager
import chromedriver_binary
from selenium.webdriver.common.keys import Keys
import time
element = driver.find_element_by_xpath('//*[@id="username"]').send_keys(user_name)
This is how the code looks (I've tried XPath, id, class name, etc.), but I always get an error:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="username"]"}
(Session info: chrome=80.0.3987.87)
I am trying to get this done on here.
Any ideas, please?
I noticed that the given form is within an iframe, so switching to the iframe is required before interacting with any element inside it.
The following code works for me:
driver.switch_to.frame(driver.find_element_by_xpath('(//iframe[@title="Registration form"])[1]'))
element = driver.find_element_by_xpath('//*[@id="username"]').send_keys("username")
Feel free to use an explicit wait if required.
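For example, a sketch of that explicit wait, assuming the same iframe title and a 10-second timeout:
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
# Wait for the iframe, switch into it, then wait for the username field
WebDriverWait(driver, 10).until(
    EC.frame_to_be_available_and_switch_to_it((By.XPATH, '(//iframe[@title="Registration form"])[1]'))
)
WebDriverWait(driver, 10).until(
    EC.visibility_of_element_located((By.ID, "username"))
).send_keys("username")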