I'm trying to click links on a website; there's a full page of them. I tried using is_displayed(), which came back "true", but I still got the error below. I've encountered this error before on other projects; it happens because Selenium doesn't see the link. I tried putting a scroll-down step in the code, but that only works so many times, since the page-down ends up scrolling too far.
What other options do I have to get the link visible to click on?
Code:
href1 = driver.find_element_by_xpath("//*[@id='divDesktopResults']//div//div//div//a[@href='" + link + "']")
if href1.is_displayed():
    print('true')
    href1.click()
else:
    print('False')
Error:
selenium.common.exceptions.WebDriverException: Message: unknown error: Element <a class="popimg" data-toggle="popover" style="text-decoration:underline;margin-right:20px;" data-content="<img style='max-width:250px;' src='/Home/GetPng?ID=D218098469' ></a>" data-html="true" data-trigger="hover" href="#pdfviewer?ID=D218098469">...</a> is not clickable at point (441, 514). Other element would receive the click: <div class="row">...</div>
(Session info: chrome=66.0.3359.181)
(Driver info: chromedriver=2.38.552522 (437e6fbedfa8762dec75e2c5b3ddb86763dc9dcb),platform=Windows NT 10.0.16299 x86_64)
Edit:
Solved it another way: I put all the link hrefs in an array and used driver.get() instead of clicking each link.
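For reference, a minimal sketch of that workaround (the container id comes from the XPath above; everything else is an assumption):
# Hypothetical sketch: collect all the hrefs first, then navigate to each
# one directly, which sidesteps the overlapping-element problem entirely.
links = [a.get_attribute("href")
         for a in driver.find_elements_by_xpath("//*[@id='divDesktopResults']//a")]
for link in links:
    driver.get(link)   # go straight to the target page
    # ... scrape whatever is needed here ...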
Selenium is complaining not that the link can't be seen, but that something else is effectively on top of the link: clicking at that point on the screen would hit the "row" div instead. This would imply a CSS/layout issue where multiple elements sit on top of each other. You could potentially confirm this by giving the link a really high z-index so that it sits on top of everything else and re-running your test.
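If changing the CSS isn't an option, two common workarounds are to scroll the element into view before clicking it, or to click it via JavaScript, which ignores whatever is overlapping it. A rough sketch, assuming href1 has already been located as above:
# Option 1: scroll the element to the centre of the viewport, then click normally
driver.execute_script("arguments[0].scrollIntoView({block: 'center'});", href1)
href1.click()
# Option 2: click via JavaScript, bypassing the element that sits on top
driver.execute_script("arguments[0].click();", href1)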
I have developed an application using Angular, Node.js/Express.js and MySQL. The application has a login page that is displayed when any user visits the link. After logging in, the user is taken to the home page with a lot of other pages that are displayed in the navbar. I am trying to perform an automation test on the application and I am getting this error when I try to make my automated test click on one of the links in the navbar.
TypeError: 'WebElement' object is not subscriptable
After trying different things, that error still wasn't resolved, and I then ended up with a different error: Message: element not interactable
I have found the element using the console in Chrome, and it is also clickable when I perform the click action in the console with $x("//a")[1].click() or $$("[class = 'navbar-nav ml-auto'] li a")[1].click(), but when I run the script in Selenium it throws the error. Can anybody tell me how I can make the navbar item clickable in Selenium?
Here's my code:
from selenium import webdriver
browser = webdriver.Chrome()
browser.get('http://name-of-the-local-server:3000')
browser.implicitly_wait(3)
email_element = browser.find_element_by_css_selector("input[name = 'usermail']")
password_element = browser.find_element_by_css_selector("input[name = 'passcode']")
submit_btn_element = browser.find_element_by_css_selector("button[name = 'loginButton']")
email_element.send_keys("someone@example.com")
password_element.send_keys("pass123")
submit_btn_element.click()
#executes perfectly until here
navbar_element = browser.find_elements_by_css_selector("[class = 'navbar-nav ml-auto'] li a")
navbar_element[1].click()
EDIT: I have added the bit of HTML that I want to click; by default the app loads the home page.
<a _ngcontent-imt-c24="" routerlinkactive="active" routerlink="/admin"
class="nav-link" href="/admin"><i _ngcontent-imt-c24="" class="bi bi-people"
style="font-size: 1.3rem;"></i> Admin</a>
<i _ngcontent-sun-c24="" class="bi bi-people" style="font-size: 1.3rem;"></i>
Any help would be appreciated. Thanks!
Instead of this :
navbar_element = browser.find_element_by_xpath("//a")
use this
navbar_element = browser.find_elements_by_xpath("//a")
find_element will return a single web element, whereas find_elements will return a list of web elements.
You are using navbar_element[0].click(); since the result is stored in a list, this will click on the first web element.
Or iterate over the list like this:
from selenium.webdriver import ActionChains  # ActionChains needs to be imported

for link in navbar_element:
    ActionChains(driver).move_to_element(link).click().perform()
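Another option for the element not interactable error is an explicit wait until the navbar links are actually visible before clicking. A minimal sketch (the CSS selector is taken from the question; the wait time is an assumption):
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

wait = WebDriverWait(browser, 10)
# wait until the navbar links are visible, then click the second one
links = wait.until(EC.visibility_of_all_elements_located(
    (By.CSS_SELECTOR, "[class = 'navbar-nav ml-auto'] li a")))
links[1].click()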
I'm trying to scrape the promotion information for each product on a website by clicking on the product and going to its detail page. When the spider clicks on a product, the site asks it to log in, so I tried the following code:
def __init__(self):
    self.driver = webdriver.Chrome(executable_path='/usr/bin/chromedriver')
    ...

def start_scraping(self, response):
    self.driver.get(response.url)
    self.driver.find_element_by_id('fm-login-id').send_keys('iamgooglepenn')
    self.driver.find_element_by_id('fm-login-password').send_keys('HelloWorld1_')
    self.driver.find_element_by_class_name('fm-button fm-submit password-login').click()
    ...
However, I get a NoSuchElementException when I run it.
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element: {"method":"css selector","selector":"[id="fm-login-id"]"}
'spider_exceptions/NoSuchElementException': 14,
The HTML of the login page is as follows:
<div class='input-plain-wrap input-wrap-loginid'>
<input id='fm-login-id' class='fm-text' name='fm-login-id'...>
event
</div>
So I'm pretty sure the id should be 'fm-login-id'. The only reason I can think of that might cause this issue is that the login page is a popup: it pops up in the middle of the main page. Looking at the HTML of the site, I can see that the login form seems to be a new HTML document:
<!DOCTYPE html>
<html>event
....
</html>
I'm not sure if this is the issue, and if so, how to fix it? Also, is there other reasons that might've caused the issue?
The popup will have an ID. You might have to add f'#{popup_id}' to the end of response.url. Like this URL: https://stackoverflow.com/questions/62906380/nosuchelementexception-when-using-selenium-python/62906409#62906409. It contains #62906409 because 62906409 is the ID of an element in the page.
The login page is inside an iframe; you need to switch to it first:
#switch to the iframe first
self.driver.switch_to.frame(self.driver.find_element_by_id('J_loginIframe'))
self.driver.find_element_by_id('fm-login-id').send_keys('iamgooglepenn')
self.driver.find_element_by_id('fm-login-password').send_keys('HelloWorld1_')
And for the login button you can't use .find_element_by_class_name, because that method only takes a single class name. This element has multiple class names, so use .find_element_by_css_selector like below:
#submit button
self.driver.find_element_by_css_selector('.fm-button.fm-submit.password-login').click()
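After submitting the form you will usually want to leave the iframe again before locating elements on the main page; a small sketch of that step (this is generic Selenium, nothing specific to this site):
# switch back to the main document once the login form has been submitted,
# otherwise later find_element calls will keep searching inside the iframe
self.driver.switch_to.default_content()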
The login content seems to be nested in an iframe element (if you trace it all the way to the top, you should find an iframe with id="sufei-dialog-content"), which means you need to switch to that iframe before selecting your desired element; otherwise it will not work.
First you will need to use driver.switch_to.frame("sufei-dialog-content"), and then select your element with driver.find_element_by_name() or whatever you had.
A similar issue can be found here: Selenium and iframe in html
Just a simple mistake:
<div class='input-plain-wrap input-wrap-loginid'>
<input id='fm-login-id class='fm-text' name='fm-login-id'...>
event
</div>
is actually supposed to be:
<div class='input-plain-wrap input-wrap-loginid'>
<input id='fm-login-id' class='fm-text' name='fm-login-id'...>
event
</div>
You forgot a single-quote.
Have you tried driver.find_element_by_name('fm-login-id')?
You should try finding the elements by their XPaths. You just have to inspect the element, right-click on it and copy its XPath. The XPath of the first <input ... is //*[#id="fm-login-id"].
Really Need help from this community!
When I am attempting to scrape the dynamic content from a travel website, the prices and related vendor info can be obtained only if I click on the "View Prices" button on the site. So I am considering using a for loop to click on all the "View Prices" buttons before I do my scraping with Selenium.
The question is that every single button can be clicked through browser.find_element_by_xpath().click(), but when I create a list that includes all the button XPaths, an error pops up:
Code Block :
browser = webdriver.Chrome("C:/Users/Owner/Downloads/chromedriver_win32/chromedriver.exe")
url = "https://www.cruisecritic.com/cruiseto/cruiseitineraries.cfm?port=122"
browser.get(url)
#print(browser.find_element_by_css_selector(".has-price.meta-link.show-column").text)
ButtonList = ["//div[@id='find-a-cruise-full-results-container']/div/article/ul/li[3]/span[2]",
              "//div[@id='find-a-cruise-full-results-container']/div/article[2]/ul/li[3]/span[2]",
              "//div[@id='find-a-cruise-full-results-container']/div/article[3]/ul/li[3]/span[2]"]
for button in ButtonList:
    browser.implicitly_wait(20)
    browser.find_element_by_xpath(str(button)).click()
Error Stack Trace :
WebDriverException: unknown error: Element <span class="label hidden-xs-down" data-title="...">View All Prices</span> is not clickable at point (862, 12). Other element would receive the click: ...
(Session info: chrome=63.0.3239.132)
(Driver info: chromedriver=2.35.528161 (5b82f2d2aae0ca24b877009200ced9065a772e73),platform=Windows NT 10.0.16299 x86_64)
My question is: how can I click on all the buttons on the web page before scraping? Or is there any other way to scrape dynamic content when we have to click a certain button to get the data into Python?
Really appreciate the help from the community!
You might need to use a relative XPath instead of the absolute one you are using.
It might also be the case that the data you want is only partially loaded at the moment you perform the click.
Methods to try:
Increase the wait time
Change the XPath / use a relative XPath
Splinter - you can use it as a higher-level wrapper around the regular browser calls (see the sketch after this list)
You need to check whether the data is actually present in the DOM when you make the calls. If it isn't, waiting until the complete page has loaded will help you out.
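A minimal Splinter sketch of the same click loop, assuming the page and selectors from the question (Splinter wraps the browser in a simpler API):
from splinter import Browser

browser = Browser('chrome')  # uses chromedriver under the hood
browser.visit("https://www.cruisecritic.com/cruiseto/cruiseitineraries.cfm?port=122")

# click every "View All Prices" button found on the page
for button in browser.find_by_xpath('//span[@data-title="View All Prices"]'):
    button.click()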
Use the following code to click on each price button; if you want, you can also introduce an implicit wait.
for one_row_view_price in browser.find_elements_by_xpath('//span[@data-title="View All Prices"]'):
    one_row_view_price.click()
Let me know if your bot is able to click on the price buttons.
Thanks
Happy Coding
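Since the original error was Element ... is not clickable at point, it can also help to scroll each button into view before clicking it. A rough sketch under that assumption:
for one_row_view_price in browser.find_elements_by_xpath('//span[@data-title="View All Prices"]'):
    # bring the button to the centre of the viewport so nothing overlaps it
    browser.execute_script("arguments[0].scrollIntoView({block: 'center'});", one_row_view_price)
    one_row_view_price.click()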
Here is a function designed on the basis of your requirement:
import re

def click_handler(xpath):
    # Normalise quotes so the XPath can be embedded in the JS strings below
    xpath = re.sub('"', "'", xpath)
    # Find the total number of matching elements on the webpage
    total_element = browser.execute_script("""
        var elements = document.evaluate("%s",
                                          document,
                                          null,
                                          XPathResult.UNORDERED_NODE_SNAPSHOT_TYPE,
                                          null);
        return elements.snapshotLength;
        """ % xpath
    )
    # Check if there is any element
    if total_element:
        # Iterate over all found elements and click each one via JavaScript
        for element_pos in range(total_element):
            browser.execute_script("""
                var elements = document.evaluate("%s",
                                                 document,
                                                 null,
                                                 XPathResult.UNORDERED_NODE_SNAPSHOT_TYPE,
                                                 null);
                var im = elements.snapshotItem(%d);
                im.click();
                """ % (xpath, element_pos)
            )
            print("***" + str(element_pos + 1) + " elements clicked")
        print("\n****************************")
        print("Total " + str(total_element) + " elements clicked")
        print("****************************\n")
    # Inform the user that no element was found on the webpage
    else:
        print("\n****************************")
        print("Element not found on webpage")
        print("****************************\n")

click_handler('//span[@data-title="View All Prices"]')
I'm new to Selenium and have struggled with this one for a few hours.
I have an HTML page that contains an icon and a stream view (both images). The browser view is at 100% and I would like to make it smaller;
this can be done by changing the browser zoom, modifying a CSS property, or clicking on the image.
My code starts by opening the browser and waiting until all elements have loaded:
driver = webdriver.Firefox()
driver.get('http://' + user +':'+password+'#'+camera_address)
driver.maximize_window()
driver.implicitly_wait(15) # seconds
I tried to do so via zoom:
driver.execute_script("document.body.style.zoom='0.4'")
That didn't work.
Tried to do it via Selenium -- find element and change CSS:
myDynamicElement = driver.find_elements_by_id("stream")
or find the image and click it with:
driver.find_element_by_xpath('//img[@src="../pics/button_downsize_27x27px.gif"]').click()
or
driver.find_elements_by_xpath('//*[@img]').click()
or
driver.find_elements_by_tag_name('img').click()
The HTML page looks like this:
<img src="/pics/button_downsize_27x27px.gif" width="27" height="27" border="0" title="Scale down to 800 px width" alt="Scale down to 800 px width">
<td colspan="3" align="center"><img id="stream" src="/mjpg/video.mjpg" width="2560" height="1920" border="0" alt="If no image is displayed, there might be too many viewers, or the browser configuration may have to be changed. See help for detailed instructions on how to do this."><br></td>
Tried with @title and @alt and even contains() but nothing works.
What am I doing wrong?! How can I find and click this image (/pics/button_downsize_27x27px.gif)?
To click on your desired image, please use the reference code below.
new WebDriverWait(driver, 10).until(ExpectedConditions.visibilityOfAllElementsLocatedBy((By.cssSelector("a"))));
List<WebElement> elem = driver.findElements(By.tagName("a"));
for(WebElement el : elem){
String element = el.getAttribute("src");
if(element.contains("/pics/button_downsize_27x27px.gif")){
el.click();
break;
}
}
It is in Java; you can implement the same thing in Python.
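A rough Python equivalent of the same idea (note that it iterates over img elements rather than a tags, since the src attribute lives on the image; the wait time is an assumption):
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# wait until the images on the page are visible
WebDriverWait(driver, 10).until(
    EC.visibility_of_all_elements_located((By.TAG_NAME, "img")))

# click the first image whose src points at the downsize button
for el in driver.find_elements_by_tag_name("img"):
    src = el.get_attribute("src") or ""
    if "/pics/button_downsize_27x27px.gif" in src:
        el.click()
        break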
Thank you all. In the end I did a little workaround, since I saw that in Chrome there is a small bug and in Firefox I cannot find the element.
So I retrieved the URL again after all elements had loaded and modified it so that the JavaScript function would be triggered:
myElem = WebDriverWait(driver, 10).until(EC.presence_of_element_located((By.NAME, 'indexMain')))
driver.get(driver.current_url+'&size=8')
Thank you all for trying; this one was a challenge since the element has no id/name/class/CSS and the browser could not find it via XPath.
I was trying to open Stack Overflow, search for a query, and then click the search button.
Almost everything went fine, except that I was not able to click the submit button.
I encountered this error:
WebDriverException: unknown error: Element ... is not clickable at point (608, 31). Other element would receive the click:
(Session info: chrome=60.0.3112.101)
(Driver info: chromedriver=2.29.461591 (62ebf098771772160f391d75e589dc567915b233),platform=Windows NT 6.1.7601 SP1 x86)
browser=webdriver.Chrome()
browser.get("https://stackoverflow.com/questions/19035186/how-to-select-element-with-selenium-python-xpath")
z=browser.find_element_by_css_selector(".f-input.js-search-field")#use .for class and replace space with .
z.send_keys("geckodriver not working")
submi=browser.find_element_by_css_selector(".svg-icon.iconSearch")
submi.click()
<button type="submit" class="btn js-search-submit">
<svg role="icon" class="svg-icon iconSearch" width="18" height="18" viewBox="0 0 18 18">
<path d="..."></path>
</svg>
</button>
You are trying to click on the svg. That icon is not clickable, but the button is.
So changing the selector to .btn.js-search-submit will work.
Use below code to click on submit button:
browser.find_element_by_css_selector(".btn.js-search-submit").click()
Click the element with the right locator; your button locator is wrong. The rest of the code looks good.
Try this:
browser=webdriver.Chrome()
browser.get("https://stackoverflow.com/questions/19035186/how-to-select-element-with-selenium-python-xpath")
z=browser.find_element_by_css_selector(".f-input.js-search-field")#use .for class and replace space with .
z.send_keys("geckodriver not working")
submi=browser.find_element_by_css_selector(".btn.js-search-submit")
submi.click()