How can I click on the link here using id, class, and href with Python Selenium? I'm using PyCharm [closed] - python

How can I click on the link here using id, class, and href with Python Selenium? I'm using PyCharm. *The href is changeable.
I tried:
driver.find_element_by_xpath("//div[@id='result_26']//a[@class='a-link-normal s-access-detail-page s-color-twister-title-link a-text-normal']/@href").click()

You are clicking the href attribute instead of the <a> tag.
Assuming your XPath is correct, just remove the /@href from the end and it should work.
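For illustration, a minimal sketch of the corrected call; the result_26 id and the class string are taken from the question, and the older find_element_by_xpath API is kept to match it:

# Locate the <a> element itself (no /@href at the end), then click it.
link = driver.find_element_by_xpath(
    "//div[@id='result_26']"
    "//a[@class='a-link-normal s-access-detail-page s-color-twister-title-link a-text-normal']"
)
link.click()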

Related

Can't find the h3 tag in HTML using BeautifulSoup; how do I find the h3 tag in the HTML code? [closed]

I am trying to scrape data from a website, specifically the title, which uses an h3 tag. But when I try to scrape it, the result is None.
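No answer is shown here, but for context, a minimal sketch of the kind of lookup being described; the URL is a placeholder, and if the page builds its content with JavaScript, the lookup returns None exactly as in the question:

import requests
from bs4 import BeautifulSoup

# Fetch and parse the page; the URL below is only a placeholder.
html = requests.get("https://example.com").text
soup = BeautifulSoup(html, "html.parser")

# find() returns None when no <h3> exists in the raw HTML,
# e.g. when the title is rendered later by JavaScript.
title = soup.find("h3")
print(title.get_text(strip=True) if title else None)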

How to show Python output on a webpage using HTML [closed]

Hello,
I am a newbie.
I know I can use <p> to show text on a webpage using HTML, but I have a file named my.py, and it contains the code
print("Helo")
So now I want to show this "Helo" on the screen. How can I do that? 🙂
You can follow this Tutorial to create a basic web server that will display any HTML you want if that's what you're looking for.
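As a rough sketch of that idea, here is a tiny Flask app; Flask is only one possible choice, not necessarily what the linked tutorial uses:

from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # Whatever the script computes can be returned as HTML.
    output = "Helo"
    return f"<p>{output}</p>"

if __name__ == "__main__":
    app.run()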

HTML - How to scrape elements that are not visible using Python? [closed]

I'm using Beautiful Soup to scrape a webpage.
I am trying to scrape data from https://painel-covid19.saude.ma.gov.br/vacinas, but the tags in the output are empty. In Inspect Element I can see the data, but not in the page source. You can see the code is hidden in . How can I retrieve it using Python? Can someone help me?
The issue isn't "not visible". The issue is that the data is being filled in by JavaScript code. You won't see the data unless you execute the JavaScript on the page. You can do that with the selenium package, which runs a copy of Chrome to do the rendering.
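A minimal sketch of that approach; the fixed 10-second wait is an arbitrary assumption, and the selectors for the actual figures would still need to be worked out:

import time
from selenium import webdriver
from bs4 import BeautifulSoup

driver = webdriver.Chrome()
driver.get("https://painel-covid19.saude.ma.gov.br/vacinas")

# Give the page's JavaScript time to fill in the data.
time.sleep(10)

# page_source now holds the rendered HTML, which BeautifulSoup can parse.
soup = BeautifulSoup(driver.page_source, "html.parser")
driver.quit()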

Selenium Python: click on button doesn't work [closed]

In the photo you can see two buttons: SIGN UP, which is where the site opens by default as soon as you load it, and the one I want to click, LOGIN. The site is full of Java, and I know that solving this problem will help me later. Hopefully someone can help me; I would appreciate it a million.
To click the <a> tag with the LOGIN text:
driver.find_element_by_xpath("//a[text()='LOGIN']").click()
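If the click still fails, it is often a timing issue; a possible variant with an explicit wait, reusing the driver from above (the 10-second timeout is an arbitrary assumption):

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Wait until the LOGIN link is clickable before clicking it.
wait = WebDriverWait(driver, 10)
wait.until(EC.element_to_be_clickable((By.XPATH, "//a[text()='LOGIN']"))).click()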

How can I loop through the elements of an AngularJS drop-down menu using Python? [closed]

This is the HTML code for the drop-down menu with the list of cities:
I just want to loop through all of the cities using Python. BTW, I am trying to create a web scraper, and there is no API for the site :(
Just for anyone with the same issue: you can use the XPath and find the index pattern.
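A rough sketch of that idea; the //ul[@class='dropdown-menu']/li XPath is only a placeholder, since the question's HTML was not included, and the real pattern from the page would have to be substituted:

# Placeholder XPath for the menu items; replace with the page's real pattern.
options = driver.find_elements_by_xpath("//ul[@class='dropdown-menu']/li")

# Re-locate each city by its index in the pattern and read its text.
for i in range(1, len(options) + 1):
    city = driver.find_element_by_xpath(f"//ul[@class='dropdown-menu']/li[{i}]")
    print(city.text)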
