I am using Selenium in Python with Chrome as the browser. I enabled cookies on the host server and in the code, and it works perfectly when I run it locally. However, when I run it from Azure DevOps I encounter an issue: the logs show that the webdriver disables the cookies.
Here is the code I use to enable cookies:
chrome_options.add_argument("--user-data-dir=C:\\Users\\" + getpass.getuser()+"\\AppData\\Local\\Google\\Chrome\\User Data\\Default")
What am I missing?
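A minimal sketch of the profile handling, assuming the pipeline agent runs under a service account that has no Chrome profile of its own (the temp-directory fallback and the --profile-directory split are assumptions about the setup, not part of my pipeline):

import getpass
import tempfile

from selenium import webdriver

chrome_options = webdriver.ChromeOptions()

# --user-data-dir should point at the "User Data" folder itself;
# the profile inside it ("Default") is selected with --profile-directory.
user_data = "C:\\Users\\" + getpass.getuser() + "\\AppData\\Local\\Google\\Chrome\\User Data"
chrome_options.add_argument("--user-data-dir=" + user_data)
chrome_options.add_argument("--profile-directory=Default")

# Assumed CI fallback: a throwaway profile directory, since the Azure DevOps
# agent's account typically has no "User Data" folder to reuse.
# chrome_options.add_argument("--user-data-dir=" + tempfile.mkdtemp())

driver = webdriver.Chrome(options=chrome_options)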
Related
I am writing automation for one of the sites. I can access the site when I open it manually in Chrome, in a normal or incognito window, but when I try to launch it using automation in any browser I get a "site can not be reached" error.
Below is my simple code -
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

ChromeOptions options = new ChromeOptions();
// Hide the "Chrome is being controlled by automated test software" banner.
options.AddExcludedArgument("enable-automation");
options.AddAdditionalOption("useAutomationExtension", false);
options.AddArgument("--incognito");
IWebDriver driver = new ChromeDriver(options);
Console.WriteLine("Test MY Site");
driver.Manage().Timeouts().ImplicitWait = TimeSpan.FromSeconds(60);
driver.Manage().Window.Maximize();
driver.Manage().Cookies.DeleteAllCookies();
driver.Navigate().GoToUrl("https://example.net");
I tried with Selenium C#, Selenium Python, Cypress, etc., but I get the same error.
I also tried to launch the site in Chrome, Edge, Firefox, and Safari, and got the same error there too.
Note: I am using the latest version of Chrome, the latest version of Selenium, and Windows 10. I also tried after downgrading Selenium and Chrome.
I have also tried setting the proxy manually and automatically, and also with no proxy setting. The internet connection is not the issue at all, since the site works fine when I use it manually in Chrome.
I found that undetected-chromedriver helps in cases like this, but I could not find an exe for undetected-chromedriver in order to launch the site.
Please help
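For reference, undetected-chromedriver is a Python package rather than a standalone exe, so there is no separate driver download; a minimal sketch of its usual usage (the URL is the placeholder from the code above):

import undetected_chromedriver as uc

# uc.Chrome() downloads and patches its own chromedriver binary at runtime,
# so no separate exe is needed.
driver = uc.Chrome()
driver.get("https://example.net")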
I want to use a proxy for Selenium Python with the Firefox webdriver. I can assign Firefox a proxy through the normal browser GUI, but I don't know how to do it in Selenium with Python. The proxy uses port 8080 and does not need authentication.
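A minimal sketch of the preference-based approach with Firefox; the 127.0.0.1 host is an assumption, since only the port 8080 is given above:

from selenium import webdriver

options = webdriver.FirefoxOptions()
options.set_preference("network.proxy.type", 1)  # 1 = manual proxy configuration
options.set_preference("network.proxy.http", "127.0.0.1")  # assumed proxy host
options.set_preference("network.proxy.http_port", 8080)
options.set_preference("network.proxy.ssl", "127.0.0.1")  # same proxy for HTTPS traffic
options.set_preference("network.proxy.ssl_port", 8080)

driver = webdriver.Firefox(options=options)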
I attempted to follow this post: "How to set proxy with authentication in selenium chromedriver python?", to use an authenticated proxy with Chrome WebDriver. It doesn't work, and I get ERR_TUNNEL_CONNECTION_FAILED from Chrome. Is there a new solution, or does the code from the linked Stack Overflow post need to be updated? I am using ChromeDriver v81 for Windows.
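One commonly suggested workaround, distinct from the linked post's approach, is the selenium-wire package, which handles the proxy credentials itself; a sketch with placeholder host and credentials:

from seleniumwire import webdriver  # pip install selenium-wire

# Placeholder credentials and host; substitute the real authenticated proxy.
seleniumwire_options = {
    "proxy": {
        "http": "http://user:password@proxy.example.com:8080",
        "https": "https://user:password@proxy.example.com:8080",
    }
}

driver = webdriver.Chrome(seleniumwire_options=seleniumwire_options)
driver.get("https://httpbin.org/ip")  # should report the proxy's address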
I've written a script in Python for a bot that runs using Selenium. Until now I used the normal Selenium webdriver, but now I want to move to Selenium Standalone Server.
I have two servers on DigitalOcean; I want to use one as the Selenium standalone server and the other to send the requests.
On the main server I run java -jar ~/selenium/selenium-server-standalone-3.14.0.jar to start the Selenium server, and it works.
On the second server I have my scripts, but I can't figure out how to allow the connection.
My options for running the webdriver are:
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('user-data-dir=/var/www/users/' + Setup.db + '/cookies')
chrome_options.add_argument('--headless')
chrome_options.add_argument('--no-sandbox')
chrome_options.add_argument('--lang=en')

# Pass the Chrome options to the remote end instead of the stray 'CHROME' argument.
browser = webdriver.Remote(command_executor='http://MAIN_SERVER_IP:4444/wd/hub',
                           desired_capabilities=DesiredCapabilities.CHROME,
                           options=chrome_options)
But I can't run the webdriver. I can see that the two servers are "talking", because when I try to run the script, on the main server I read Only local connections are allowed. So there must be some problem with the firewall or the settings, but I don't know what to do.
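To narrow down whether the firewall is at fault, a quick standard-library check that can run from the second server (MAIN_SERVER_IP is the placeholder from the code above):

import urllib.request

# /wd/hub/status is a standard endpoint on Selenium Standalone 3.x.
# A timeout here means port 4444 is blocked between the two droplets;
# a JSON response means the hub is reachable and the problem is elsewhere.
with urllib.request.urlopen("http://MAIN_SERVER_IP:4444/wd/hub/status", timeout=5) as resp:
    print(resp.read().decode())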
I am experiencing a very strange behaviour when testing Chrome via selenium webdriver.
Instead of navigating to pages as one would expect, the get command leads only to the download of tiny files (with no file type, or .aspx files) from the target site.
Importantly, this behaviour only occurs when I pass chrome_options as an argument to the Chrome driver.
The same testing scripts work flawlessly with the Firefox driver.
Code:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options  # tried with and without

proxy = '127.0.0.1:9951'  # connects to a proxy application

chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument('--proxy-server=%s' % proxy)
driver = webdriver.Chrome(chrome_options=chrome_options)
driver.get('http://whatismyip.com')  # scheme added; get() needs a full URL, not a bare domain
This leads to the automatic download of a file called download (no file extension, size 2 bytes), while calling other sites results in the download of small .aspx files.
All of this happens while the browser page remains blank and no interaction with elements takes place; the site is not loaded at all.
No error message is thrown, except "element not found".
This is really strange.
Additional info:
I run Debian Wheezy 32-bit and use Python 2.7.
Any suggestions on how to solve this issue?
I tried your code and captured the traffic on localhost using a SOCKS v5 proxy through SSH. It is definitely sending data through the proxy, but no data is coming back. I have confirmed the proxy works using Firefox.
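If the proxy application on 127.0.0.1:9951 is in fact a SOCKS proxy, which the test above suggests but the question does not confirm, Chrome needs the scheme spelled out, since a bare host:port is treated as an HTTP proxy:

from selenium import webdriver

chrome_options = webdriver.ChromeOptions()
# Explicit socks5:// scheme; without it Chrome assumes an HTTP proxy.
chrome_options.add_argument("--proxy-server=socks5://127.0.0.1:9951")
driver = webdriver.Chrome(chrome_options=chrome_options)
driver.get("http://whatismyip.com")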
I'm running Google Chrome on Ubuntu 14.04 LTS 64-bit. My Chrome browser gives me the following message when I try to configure a proxy in its settings menu:
When running Google Chrome under a supported desktop environment, the
system proxy settings will be used. However, either your system is not
supported or there was a problem launching your system configuration.
But you can still configure via the command line. Please see man
google-chrome-stable for more information on flags and environment
variables.
Unfortunately I don't have a man page for google-chrome-stable.
I also discovered that, according to the Selenium documentation, Chrome uses the system-wide proxy settings, and that it is unknown how to set the proxy in Chrome programmatically: http://docs.seleniumhq.org/docs/04_webdriver_advanced.jsp#using-a-proxy
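That documentation note appears to be outdated: the same --proxy-server flag used earlier in this thread can be passed programmatically through ChromeOptions, which sidesteps the system settings entirely. A minimal sketch with a placeholder host and port:

from selenium import webdriver

PROXY = "203.0.113.10:8080"  # placeholder host:port

chrome_options = webdriver.ChromeOptions()
# The command-line flag works even when the desktop environment's
# system proxy settings are unavailable, as the message above describes.
chrome_options.add_argument("--proxy-server=http://" + PROXY)
driver = webdriver.Chrome(options=chrome_options)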