Restoring Firefox Session in Selenium - python

I am currently automating a website and have a test which checks the functionality of the Remember Me option.
My test script logs in, entering a valid username and password, and checks the Remember Me checkbox before clicking the Log In button.
To test this functionality I save the cookies to a file using pickle, close the browser, and then reopen the browser (loading the cookies file).
def closeWebsite(self, saveCookies=False):
    if saveCookies:
        pickle.dump(self.driver.get_cookies(), open('cookies.pkl', 'wb'))
    self.driver.close()

def openWebsite(self, loadCookies=False):
    desired_caps = {}
    desired_caps['browserName'] = 'firefox'
    profile = webdriver.FirefoxProfile(firefoxProfile)
    self.driver = webdriver.Firefox(profile)
    self.driver.get(appUrl)
    if loadCookies:
        for cookie in pickle.load(open('cookies.pkl', 'rb')):
            self.driver.add_cookie(cookie)
However, when I do this, the new browser is not logged in. I understand that every time you open the browser a new session is created, and that this session ID can be obtained using driver.session_id
Is it possible, in the openWebsite method, to load a driver and specify the session ID?
When I test this manually, the Remember Me option works as expected. I'm just having trouble understanding how Selenium handles this case.

For starters, you're loading the page before adding the cookies. Although there is the potential for them to arrive before the page needs / queries them, this isn't correct, let alone reliable.
Yet if you try to set the cookies before any page has loaded, you will get an error.
The solution seems to be this:
First of all, you need to be on the domain that the cookie will be valid for. If you are trying to preset cookies before you start interacting with a site and your homepage is large / takes a while to load, an alternative is to find a smaller page on the site, [...]
In other words:
Navigate to your home page, or a small entry page on the same domain as appUrl (no need to wait until it has fully loaded).
Add your cookies.
Load appUrl. From then on you should be fine.
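Those three steps can be sketched as a small helper. This is a minimal sketch, not the asker's original code: restore_session, entry_url, and cookie_path are hypothetical names, and driver can be any Selenium WebDriver instance.

```python
import pickle

def restore_session(driver, entry_url, app_url, cookie_path="cookies.pkl"):
    """Reopen a logged-in session in the order described above."""
    # 1. Land on the cookie's domain first; a small, fast page is enough.
    driver.get(entry_url)
    # 2. Add the saved cookies while on the right domain; add_cookie raises
    #    InvalidCookieDomainException when the current domain does not match.
    with open(cookie_path, "rb") as f:
        for cookie in pickle.load(f):
            driver.add_cookie(cookie)
    # 3. Only now load the real page, with the session cookies in place.
    driver.get(app_url)
```

Called as, say, restore_session(webdriver.Firefox(), "https://example.com/favicon.ico", appUrl), this keeps the add_cookie calls between the two page loads, which is the whole point of the ordering above.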

Related

Python Selenium cannot read cookies after redirecting to another website

I am trying to read the website's cookies after logging in, but the following code cannot read them.
After the code runs, the driver_cookies I get is just an empty list. However, I can manually find the cookies in Chrome DevTools.
driver.find_element(by=By.ID, value='login_id').send_keys(login_id)
driver.find_element(by=By.ID, value='password').send_keys(password)
driver.find_element(by=By.ID, value='login_btn').click()

# busy-wait until the login redirect lands on the target page
while driver.current_url != "https://www.theWebsiteThatWeGoTo.com":
    continue
time.sleep(3)

# debug: the cookies can be found in the development tools once the code
# has run to here, but the following code just cannot read them
os.system("pause")

driver.switch_to.window(driver.window_handles[0])
driver_cookies = driver.get_cookies()
print(driver_cookies)
Something I think I need to mention: the website that the login page redirects to uses a different protocol, because it serves an Electron application, so Chrome cannot directly load the page (it shows ERR_SSL_VERSION_OR_CIPHER_MISMATCH). I don't know whether that is the reason.

Allow facebook cookies to track me through multiple sessions in selenium

I'm working on scraping data using Selenium, for academic research that will test how certain user behaviors across Facebook and the web affect the ads they see.
For this, I need a kind of fake user which will first interact with Facebook, then visit some sites carrying Facebook cookies, allowing Facebook to continue tracking its behavior, and then go back to Facebook.
I haven't done much web development, and it seems I'm confused about how exactly to keep and load cookies for this scenario.
I've been trying to save and load cookies using the following code snippets:
# saving
pickle.dump(driver.get_cookies(), cookiesfile)

# loading
cookies = pickle.load(cookiesfile)
for cookie in cookies:
    driver.add_cookie(cookie)
On Facebook, this will either produce an error popup telling me to reload, or redirect me to the login page. On other sites, even ones that explicitly state they have Facebook trackers, it causes an InvalidCookieDomainException.
What am I doing wrong?
Instead of handling cookies yourself, I would recommend using ChromeOptions to persist the browser session. This can be more helpful in maintaining local storage as well as cookies.
The next time you open a browser session, the Chrome instance will load the previous "profile" and continue maintaining it.
options = webdriver.ChromeOptions()
options.add_argument('user-data-dir={}'.format(<path_to_a_folder_reserved_for_browser_data>))
driver = webdriver.Chrome(executable_path=<chromedriver_exe_path>, options=options)

Selenium doesn't keep cache valid

I'm working on Python software with Selenium. The problem is that I want my script to save cookies after logging in. I save cookies using both the pickle module and the argument below:
opts.add_argument("user-data-dir=cachedD")
But when I quit the browser and launch it again, going to the same URL where it left off, the website redirects to the login page again. The website uses Moodle, and I guess its cookies expire after quitting the browser. How can I save cookies and continue where it left off? I should add that there is at most a 15-second gap between the two launches.
You're potentially not using the flag correctly.
With this flag you specify a folder path. If you review this page:
--user-data-dir
Directory where the browser stores the user profile.
That link may not look right, but the Chromium page says that's the right list.
Historically, I've had success with:
.add_argument(r"user-data-dir=C:\Temp")
If that is still not working as you expect, there are a few other things you can look at.
Review this page: cookies can be deleted when you close your browser. You'll want to verify the value of this option.
Another check is to open your chromedriver via Selenium and go to chrome://version/ . From there you can review what you're running, and you'll see there are a lot more flags enabled by default. Check that these match how you want your browser to behave.
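One frequent cause of the profile not persisting is handing Chrome a relative path, which resolves against whatever the current working directory happens to be when the script runs. A small sketch (the helper name profile_argument is hypothetical) that normalizes the path and creates the folder before building the flag:

```python
import os

def profile_argument(profile_dir):
    """Build a user-data-dir argument from an absolute, existing folder so
    Chrome can persist cookies and local storage between runs."""
    path = os.path.abspath(profile_dir)
    os.makedirs(path, exist_ok=True)
    return "user-data-dir={}".format(path)
```

Used as opts.add_argument(profile_argument("cachedD")), this guarantees every launch points at the same absolute folder, regardless of where the script was started from.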

How to login to a website using Python/Selenium?

from selenium import webdriver
from selenium.common.exceptions import TimeoutException
from selenium.webdriver.support.ui import WebDriverWait  # available since 2.4.0
from selenium.webdriver.support import expected_conditions as EC  # available since 2.26.0

browser = webdriver.Chrome('C:/Users/xyz/Downloads/chromedriver.exe')

# Define all variables required
urlErep = browser.get('http://www.erepublik.com')
xPathToSubmitButton = "//*[@id='login_form']/div[1]/p[3]/button"
urlAlerts = 'https://www.erepublik.com/en/main/messages-alerts/1'
one = 1
xPathToAlerts = "//*[@id='deleteAlertsForm']/table/tbody/tr[%d]/td[3]/p" % one

def logintoerep():
    email = browser.find_element_by_id("citizen_email")
    password = browser.find_element_by_id("citizen_password")
    email.send_keys('myemail')
    password.send_keys('mypassword')
    browser.find_element_by_xpath(xPathToSubmitButton).click()

logintoerep()
logintoerep()
The code above is what I wrote using Selenium to log in to erepublik.com.
My main goal is to verify some information on eRepublik.com whenever someone fills in a Google Form, and then complete an action based on the form data. I'm trying to log in to eRepublik using Selenium, and each run of the script (which I need running 24/7, so that the script is run whenever the form gets a new response) creates a new window; after I've logged in to the website 10-20 times, it asks for a captcha, which Selenium can't complete. In my existing browser window I'm already logged in, so I don't have to worry about the captcha and can just run my code.
How can I get around this problem? I need the script to be able to log in every time on its own, but the captcha won't allow that. The best solution would be to use Selenium on my existing browser window, but it doesn't allow that.
Is it possible to copy some settings from my normal browser window to the Selenium-run browser window so that it logs in automatically every time instead?
I'm open to any suggestions, as long as they let me verify and complete a few minor actions on the website I've linked.
You can attach your Chrome profile to Selenium tests:
options = webdriver.ChromeOptions()
options.add_argument("user-data-dir=C:\\Path") #Path to your chrome profile
browser = webdriver.Chrome(executable_path="C:\\Users\\chromedriver.exe", chrome_options=options)
First off, CAPTCHAs are meant to do exactly that: repel robots/scripts from brute-forcing, or doing repeated actions on certain app features (e.g: login/register flows, send messages, purchase flows, etc.). So you can only go around... never through.
That being said, you can simulate the logged-in state by doing one of the following:
loading the authentication cookies required for the user to be logged in (usually it's only one cookie with a token of some sorts);
loading a custom profile in the browser that already has that user logged in;
use some form of basic auth when navigating to that specific URL (if the web-app has any logic to support this);
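The first option, planting the auth cookie directly, can be sketched like this. The cookie name session_token is hypothetical: inspect your site's real auth cookie in the browser's devtools first, and note the driver must already be on the cookie's domain before add_cookie is called.

```python
def inject_auth_cookie(driver, base_url, token, cookie_name="session_token"):
    """Simulate the logged-in state by planting the auth cookie directly."""
    driver.get(base_url)  # must be on the cookie's domain before add_cookie
    driver.add_cookie({"name": cookie_name, "value": token})
    driver.get(base_url)  # reload; requests now carry the auth cookie
```

The token itself would typically come from a previous real login, saved the same way the cookies are pickled elsewhere on this page.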
Recommended approach: in most companies (at least in my experience), there is usually a specific cookie or flag that you can set to disable CAPTCHAs for testing purposes. If this is not the case, talk to your PM/devs about creating such a feature to permit testing of your web app.
Don't want to advertise my own content, but I think I tackled this topic best HERE. Maybe it can help further.
Hope you solve the problem. Cheers!

Selenium - use session from current Chrome instance

I'm trying to automate some form filling for a web app. Users have to login to the application and then start filling up pages of forms. I have the following Python script using Selenium that can open a window to my application:
from selenium import webdriver

driver = webdriver.Chrome("C:\\Python\\Selenium\\Chrome\\chromedriver.exe")
# driver.add_cookie() needs a cookie dict, e.g. {'name': ..., 'value': ...},
# and can only be called once the browser is on the cookie's domain
driver.set_page_load_timeout(30)
driver.get("myurl/formpages")
driver.maximize_window()
driver.quit()
However, when Selenium starts the Chrome window, the user is not logged in. I want to bypass the need to log in every time.
On another Chrome window, I am already logged in as my test user. So whenever I go to the url on my Chrome window, I am already logged in and don't have to log in again.
Is there any way to pass this data into my Selenium script so that it uses the session currently on my existing Chrome instance, therefore not having to log in via the script?
Unfortunately, this is not possible: Selenium cannot attach to an already-running browser session (frustrating, I agree, once you realize the effort it would involve, for each browser!).
I've written code before that reads the username and password out of a CSV. This is bad because it's plaintext, but you can also hash the information if you like and handle that in your code.
So to recap: there are mediocre workarounds, but no native way to handle this in Selenium :(
Selenium by default creates a temporary Chrome profile each time you start a new instance and deletes that profile once you quit the instance. If you change the Chrome options in the Selenium driver to use a custom Chrome profile and allow that profile to save cookies, you will be able to log in without typing your login details each time.
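As a sketch of that approach (the helper name, the folder path, and the deferred import are all choices of this example, not part of the original answer):

```python
def persistent_profile_options(profile_dir, options=None):
    """Point Chrome at a persistent profile folder instead of the throwaway
    profile Selenium creates, so cookies survive driver.quit()."""
    if options is None:
        from selenium import webdriver  # deferred so any options object works
        options = webdriver.ChromeOptions()
    options.add_argument("user-data-dir={}".format(profile_dir))
    return options
```

A later run started with webdriver.Chrome(options=persistent_profile_options(r"C:\SeleniumProfile")) reuses whatever cookies the previous run left in that folder, which is what makes the login stick.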
