I'm having a problem when trying to log in to target.com with Selenium. I have tried both the Firefox and Chromium webdrivers. The login always succeeds in a browser that I open manually, but with Selenium it always fails.
The error happens when I submit the login form. It says "error T83072242".
I have attached the AJAX response that I get here.
After doing some analysis, I came to the conclusion that this variable in the request header is what causes the error. When I replace it with the value from the other browser (the one I opened manually), the AJAX request succeeds.
So, how can I make the Selenium-driven browser behave like a normal browser?
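For illustration, here is a minimal sketch of the kind of "behave like a normal browser" tweaks I mean, assuming Chrome is being driven (these are generic Chromium/Selenium options and a CDP call, nothing specific to target.com, and I am not sure they cover whatever check produces T83072242):

# Hedged sketch: hide the most obvious automation fingerprints Chrome exposes.
# Assumes Chrome/chromedriver; the site may use additional checks this does not cover.
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_experimental_option("excludeSwitches", ["enable-automation"])  # drop the --enable-automation switch
options.add_experimental_option("useAutomationExtension", False)
options.add_argument("--disable-blink-features=AutomationControlled")      # keep navigator.webdriver from being set to true
options.add_argument("user-agent=<copy the UA string from the manually opened browser>")  # placeholder value

driver = webdriver.Chrome(options=options)
# Chromium-only CDP call: override navigator.webdriver before any page script runs
driver.execute_cdp_cmd(
    "Page.addScriptToEvaluateOnNewDocument",
    {"source": "Object.defineProperty(navigator, 'webdriver', {get: () => undefined})"},
)
driver.get("https://www.target.com/")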
Pardon my English.
This is what the alert looks like. It reads: "The proxy it3.prmsrvs.com is requesting a username and password."

I have a program in Selenium that automates a certain amount of stuff on a website.
For some reason (this has been encountered also by my colleagues when using the browser normally without automation) there is the possibility of a pop-up showing in the webpage at a random point.
My program's objective is to load a page, get a list of all the elements corresponding to a specific tag and then click on them one by one. They all open in a new tab.
The pop-up can appear in the page after closing one of the tabs.
This pop-up asks for a login but all I have to do is dismiss it and the page will keep working like always.
Now, I've seen people using driver.switch_to.alert.dismiss() but this doesn't seem to work on this page.
I checked the function on a very basic js alert and the dismiss works perfectly, so I think it's the type of alert that is the issue.
I can't inspect the page, so I don't know how to retrieve the iframe of this alert (I saw that suggested as a possible solution online).
What should I do?
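For context, a hedged sketch of the direction I am considering: because a proxy-authentication prompt is a browser/OS-level dialog rather than a JavaScript alert, Selenium's alert API may simply not see it, so an OS-level keyboard library such as pyautogui (an assumption on my part) could send Escape to dismiss it:

# Hedged sketch: fall back to an OS-level Escape key press when there is no JS alert to dismiss.
# Assumes the proxy-auth dialog has keyboard focus and that Escape dismisses it,
# which matches the behaviour described above ("all I have to do is dismiss it").
import pyautogui
from selenium.common.exceptions import NoAlertPresentException

def dismiss_popup_if_any(driver):
    try:
        driver.switch_to.alert.dismiss()   # handles ordinary JavaScript alerts/confirms
    except NoAlertPresentException:
        pyautogui.press("esc")             # browser-native dialogs are invisible to Selenium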
I'm writing Python code to, at first, get the full source code of a web page in order to scrape it later. But when I try to get the source code, I see the following message ("If you're seeing this message, that means JavaScript has been disabled on your browser, please enable JS to make this app work") together with partial HTML code. Also, when I press F12 to view the Elements panel, the entire code appears; meanwhile, pressing Ctrl+U to view the source yields the same result as fetching it with the Python script below:
import requests
from bs4 import BeautifulSoup
source = requests.get(link).text                 # raw HTML exactly as the server returns it
soup = BeautifulSoup(source, 'lxml').prettify()
I've seen questions similar to mine, but none of them had a satisfactory solution; for example, it was recommended to use Selenium to open the web page and then work with it, but that would take additional time. JS is enabled in my browser.
It is as you have seen in the other answers: you have to use Selenium (or another browser automation tool) to get JavaScript rendering. The web page you are trying to access uses client-side rendering, which means that the first thing the server sends when you access the URL is a bunch of JavaScript code. The browser then executes that JavaScript to build the DOM of the web page.
You say that JavaScript is enabled in your browser, but that has nothing to do with your Python code. The library you are using, requests, sends an HTTP GET request to the server to fetch the web page, and the server replies as it would to any other request: with the JavaScript that knows how to render the page. That's why you need something like Selenium, which runs a real browser instead of making a plain HTTP request.
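To make that concrete, here is a minimal sketch of the Selenium route; headless Chrome and the five-second wait are assumptions, and link is the same URL variable used with requests above:

# Minimal sketch: let a real browser execute the page's JavaScript, then scrape the rendered DOM.
import time
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

options = Options()
options.add_argument("--headless")        # render without opening a visible window

driver = webdriver.Chrome(options=options)
driver.get(link)                          # same URL you passed to requests.get()
time.sleep(5)                             # crude wait for rendering; a WebDriverWait on a known element is better

rendered_html = driver.page_source        # the DOM after the JavaScript has run
driver.quit()

soup = BeautifulSoup(rendered_html, 'lxml').prettify()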
I am trying to read the website's cookies after logging in, but the following code cannot read them.
After the code runs, the driver_cookies I get is just an empty list. However, I can find the cookies manually in Chrome Developer Tools.
import os
import time
from selenium.webdriver.common.by import By

driver.find_element(by=By.ID, value='login_id').send_keys(login_id)
driver.find_element(by=By.ID, value='password').send_keys(password)
driver.find_element(by=By.ID, value='login_btn').click()

# busy-wait until the login redirect lands on the target page
while driver.current_url != "https://www.theWebsiteThatWeGoTo.com":
    continue
time.sleep(3)

# debug: the cookies can be found in Developer Tools after the code pauses here,
# but the following call just cannot read them
os.system("pause")

driver.switch_to.window(driver.window_handles[0])
driver_cookies = driver.get_cookies()
print(driver_cookies)
Something I think I need to mention: the website that the login page redirects to uses a different protocol, because it is meant for an Electron application, so Chrome cannot load that page directly (it shows ERR_SSL_VERSION_OR_CIPHER_MISMATCH). I don't know if that is the reason.
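One direction I am considering, as a hedged sketch: get_cookies() only returns cookies scoped to the page currently loaded in the driver, so if the final page never loads, the list can come back empty. Reading every cookie in the browser through the DevTools Protocol sidesteps that (execute_cdp_cmd is Chromium-only and, I believe, requires Selenium 4):

# Hedged sketch, Chrome/Chromium only: read all cookies stored in the browser,
# not just those scoped to the currently loaded page.
all_cookies = driver.execute_cdp_cmd("Network.getAllCookies", {})
for cookie in all_cookies.get("cookies", []):
    print(cookie["domain"], cookie["name"], cookie["value"])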
I am looking for some help with scraping www.mobile.de, as I get an "Access Denied" page.
A regular spider results in the attached picture.
So far I have tried/recognized:
I am not blocked, since I can open the page in Firefox/Chrome
I allowed cookies
I used the same header as used currently by Firefox
I used a referer
I enabled/disabled "Obey robots.txt"
I used Splash to activate/render Javascript
So right now I cannot work out how the page detects that my program is a bot, or how to avoid that.
https://ibb.co/7RsMkM3
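For reference, a minimal sketch of the kind of check that can narrow this down outside Scrapy: send the request with plain requests and a full browser-like header set (the values below are illustrative, not my exact Firefox headers); if it still comes back 403, the block is probably happening below the HTTP layer, e.g. TLS or JavaScript fingerprinting:

# Hedged sketch: test whether browser-like headers alone are enough to avoid the "Access Denied" page.
import requests

headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/115.0",
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.5",
    "Referer": "https://www.google.com/",
    "Upgrade-Insecure-Requests": "1",
}

response = requests.get("https://www.mobile.de/", headers=headers)
print(response.status_code)   # a 403 here, despite browser-identical headers, points to fingerprinting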
Please note, this question is about Python 3.5.2 and only Python answers will be accepted, unless this can definitely be handled in Java? I am automating a process as part of an internal project. Everything works just fine using the IE webdriver, but not with the PhantomJS webdriver (which is expected due to its limited functionality). However, a work-around/solution is required.
When opening the internal site, a Windows Security login dialog box comes up prompting for a username and password, to be confirmed with 'OK'. With the IE webdriver, it is handled just fine with:
loginAlert = driver.switch_to_alert()
loginAlert.authenticate(username, password)
The JavaScript:
driver.execute_script("window.confirm = function(){return true;}")
being run before loading the page that gives the prompt doesn't seem to confirm the login alert, for either PhantomJS or IE. Even if it did, it wouldn't type in the login details. As mentioned, it's a Windows Security prompt from the browser, not a page element.
Once logged in, the page is reloaded with an ASP.NET_SessionId cookie which expires once the session ends. I've tried logging in through IE and then adding the cookie into PhantomJS, but the domains don't seem to match up.
I've tried using:
driver.save_screenshot(filename)
to see what's happening in PhantomJS. This works with the IE driver, but with PhantomJS only a transparent image is saved. The whole http://username:pass@site.com approach doesn't work for either the IE or the PhantomJS driver; the URL can't be loaded/used when this is done.
How can the Windows Security login dialog be handled, or worked around? I tried looking into alternatives, such as pyvirtualdisplay, but found no information on how to get this working with Python 3 on Windows.
I have also tried setting a custom authentication header via the PhantomJS desired capabilities, but that doesn't seem to do anything for this either.
I have also tried using ActionChains; however, they don't work while the alert is present (in either the IE or the PhantomJS driver). An UnexpectedAlertPresentException is thrown, and even if it is caught and you try to perform the actions, the alert seems to close once the exception is caught.
My bad!
Whilst the username:pass@domain.com approach didn't work in the IE webdriver, it did work in the PhantomJS webdriver.
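For completeness, a sketch of the form that worked with the PhantomJS driver; the values are illustrative, special characters in the credentials need URL-escaping, and this assumes a Selenium version that still ships the PhantomJS driver:

# Sketch of the approach that worked with PhantomJS: embed the basic-auth credentials in the URL.
from urllib.parse import quote
from selenium import webdriver

username = "DOMAIN\\user"      # illustrative; escape special characters before embedding
password = "secret"
url = "http://{}:{}@site.com/".format(quote(username, safe=""), quote(password, safe=""))

driver = webdriver.PhantomJS()
driver.get(url)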
However, the website has limited browser compatibility - it doesn't load properly in either Chrome or Firefox; it is IE-specific.
PhantomJS seems to handle the site the same way as Chrome / Firefox based on page source comparisons.
As such, I am trying to find a way to make the current IE driver invisible / hidden.
I have found:
headless-selenium-for-win using Python
However, despite the user there saying they got it to work, when I try to initialize the driver it just hangs; the code doesn't proceed and no error messages are provided.
Asking another question regarding this.