This question already has answers here:
- How to get around python requests SSL and proxy error? (2 answers)
- Python requests with proxy results in SSLError WRONG_VERSION_NUMBER (2 answers)
Closed 2 years ago.
I am trying to connect to websites via proxies, using the Python requests library.
proxies = {
    "http": "http://x.x.x.x:pn",
    "https": "https://x.x.x.x:pn"
}
request_result = requests.get("https://www.iplocation.net/find-ip-address", proxies=proxies)
This configuration is correct, according to the documentation. I have one proxy from a paid service and I have tried several others that are available on the web. I have also configured these proxies in the Windows settings and used them in the browser, where they work.
When making the request through Python I get this error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.iplocation.net', port=443): Max retries exceeded with url: /find-ip-address (Caused by SSLError(SSLError(1, '[SSL: UNKNOWN_PROTOCOL] unknown protocol (_ssl.c:852)'),))
This post says I should check the TLS version; my environment supports TLS 1.2. This post says I should have pyOpenSSL and idna installed, which I do. Why does it say unknown protocol? The requests library should be able to connect over HTTPS.
If I change the URL in the get() call to use the http protocol, I sometimes receive:
(Caused by ProxyError('Cannot connect to proxy.', RemoteDisconnected('Remote end closed connection without response',)))
On other occasions I receive a 502 HTTP response.
It is quite frustrating because it should be working. I understand it is a hard question to answer with limited information. Can you give me any suggestions as to what might be wrong?
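For what it's worth, a common cause of exactly this UNKNOWN_PROTOCOL / WRONG_VERSION_NUMBER error is the https:// scheme in the "https" proxy entry: the scheme describes how the client talks to the proxy itself, and most HTTP proxies expect plain HTTP from the client even when tunnelling HTTPS via CONNECT. A sketch, keeping the question's placeholder address:

```python
# Both entries use the http:// scheme. The scheme says how the client talks
# to the proxy itself; HTTPS requests are still tunnelled through it with
# CONNECT. Using "https://" here makes the client attempt a TLS handshake
# with a plain-HTTP proxy port, which typically yields WRONG_VERSION_NUMBER
# or "unknown protocol" errors.
proxies = {
    "http": "http://x.x.x.x:pn",   # placeholder host:port, as in the question
    "https": "http://x.x.x.x:pn",  # note: http://, not https://
}

# With a real proxy filled in:
# requests.get("https://www.iplocation.net/find-ip-address",
#              proxies=proxies, timeout=10)
```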
Related
I'm currently working on an Imgur bot using Python and the requests library with a rotating proxy. I run multiple instances of this bot, but sometimes some of them get this error:
HTTPSConnectionPool(host='api.imgur.com', port=443): Max retries exceeded with url: /3/credits (Caused by ProxyError('Cannot connect to proxy.', RemoteDisconnected('Remote end closed connection without response')))
Here is how the session is defined:
session = requests.Session()
session.proxies = {"http": cre["Proxy"]}
I have no idea why it crashes, since only some of the instances I run get this error; the others work fine.
Thanks, mates!
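One possible factor (an assumption, since cre["Proxy"] is not shown): session.proxies only maps the http scheme, while api.imgur.com is reached over HTTPS, so those requests may not be routed through the proxy at all. Mapping both schemes makes the routing explicit:

```python
import requests

session = requests.Session()

# Stand-in for cre["Proxy"]; assumed to be a full URL such as
# "http://host:port". With only the "http" key set, HTTPS requests
# are not sent through the proxy.
proxy_url = "http://127.0.0.1:8080"
session.proxies = {"http": proxy_url, "https": proxy_url}
```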
Maybe your proxies are configured incorrectly?
I think you are running into some form of "requesting too much too fast".
Try delaying some requests (by, say, 3 seconds).
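The delay suggested above can be sketched as a small retry helper with a growing pause between attempts (function name and numbers are illustrative):

```python
import time

def fetch_with_backoff(fetch, attempts=3, base_delay=3.0):
    """Call fetch(); on failure, wait and retry with a growing delay."""
    for attempt in range(attempts):
        try:
            return fetch()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts, surface the error
            # waits base_delay, 2*base_delay, 4*base_delay, ... seconds
            time.sleep(base_delay * (2 ** attempt))

# usage with the bot's request, e.g.:
# fetch_with_backoff(lambda: session.get("https://api.imgur.com/3/credits"))
```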
So, I was sending a request using the requests library in Python 3.9.1. The problem is that when I try to use an HTTP proxy, it gives me this error:
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='google.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000002B08D6BC9A0>: Failed to establish a new connection:
[WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond')))
This is my code; I would appreciate any help:
import requests
for proxy in open('proxies.txt', 'r').readlines():
    proxies = {
        'http': f'http://{proxy}',
        'https': f'http://{proxy}'
    }
    e = requests.get('https://google.com/robots.txt', proxies=proxies)
    open('uwu.txt', 'a').write(e.text)
    print(e.text)
I am pretty sure it is not a problem with my proxies, as they are good private proxies with 100 GB of bandwidth (from zenum.io).
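One thing worth checking before blaming the proxies themselves: readlines() keeps the trailing newline, so each proxy URL ends up as "http://host:port\n". A sketch of the loop with the lines stripped (the helper name is illustrative):

```python
def load_proxy_maps(lines):
    """Turn raw lines from proxies.txt into requests-style proxy dicts,
    stripping the trailing newline that readlines() leaves in place."""
    maps = []
    for raw in lines:
        proxy = raw.strip()
        if not proxy:  # skip blank lines
            continue
        maps.append({
            "http": f"http://{proxy}",
            "https": f"http://{proxy}",
        })
    return maps

# with requests installed:
# for proxies in load_proxy_maps(open("proxies.txt").readlines()):
#     r = requests.get("https://google.com/robots.txt",
#                      proxies=proxies, timeout=10)
```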
I'm having a problem trying to make a GET request to a specific site using Python.
My code:
import requests
url = 'https://www.beneficiossociais.caixa.gov.br/consulta/beneficio/04.01.00-00_00.asp'
r = requests.get(url, verify=False)
The error:
SSLError: HTTPSConnectionPool(host='www.beneficiossociais.caixa.gov.br', port=443): Max retries exceeded with url: /consulta/beneficio/04.01.00-00_00.asp (Caused by SSLError(SSLError("bad handshake: SysCallError(-1, 'Unexpected EOF')")))
The server you are trying to reach is practically broken. Apart from supporting long obsolete and insecure SSL 2 and SSL 3 and only supporting TLS 1.0 as kind of secure enough protocol version it only supports ciphers which are considered insecure or weak. Since these ciphers are disabled in Python and sometimes not even compiled into current OpenSSL versions (i.e. cannot be enabled with hacks like this) any TLS handshake with this broken server fails. Ignoring certificate errors as you do will not help since it is not a certificate problem which causes the handshake to fail but the lack of shared ciphers. For more see the SSLLabs report of this site.
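If connecting to such a broken server is unavoidable, one known workaround is to mount a custom adapter that lowers OpenSSL's security level for that one host. This is a sketch, not a guarantee: whether the handshake then succeeds still depends on which ciphers your OpenSSL build was compiled with.

```python
import ssl

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.ssl_ import create_urllib3_context

class LegacyTLSAdapter(HTTPAdapter):
    """Adapter that re-enables ciphers OpenSSL disables by default,
    for talking to one broken legacy server only."""

    def init_poolmanager(self, *args, **kwargs):
        ctx = create_urllib3_context()
        # SECLEVEL=1 re-enables weak ciphers; only do this for a host
        # you have no choice about.
        ctx.set_ciphers("DEFAULT@SECLEVEL=1")
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
        kwargs["ssl_context"] = ctx
        return super().init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount("https://www.beneficiossociais.caixa.gov.br", LegacyTLSAdapter())
# r = session.get(url, verify=False)
```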
I'm trying to retrieve the geocoordinates of a given address using the herepy package in Python. As I'm working behind a network proxy, I've initialized the proxy environment variables:
import os
import herepy
os.environ['http_proxy'] = proxy
os.environ['HTTP_PROXY'] = proxy
geocoderApi = herepy.GeocoderApi(HERE_AppID, HERE_AppCode)
response = geocoderApi.free_form('200 S Mathilda Sunnyvale CA')
However, I'm getting the SSLError below when I run the code. Does anyone have any idea what went wrong?
SSLError: HTTPSConnectionPool(host='geocoder.cit.api.here.com', port=443): Max retries exceeded with url: /6.2/geocode.json?searchtext=200+S+Mathilda+Sunnyvale+CA&app_id=xxxxxxxxxx&app_code=xxxxxxxxxxxxCaused by SSLError(SSLError("bad handshake: Error([('SSL routines', 'tls_process_server_certificate', 'certificate verify failed')],)",),))
The herepy package is not provided or supported by HERE. We provide standard REST APIs in addition to our JavaScript, Android, and iOS SDKs.
Please use urllib3, requests, or other standard HTTP libraries in Python that provide means to handle proxies.
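Following that suggestion, the geocoder endpoint visible in the error message can be called directly with requests, which routes through a proxy via the proxies argument. A sketch with placeholder credentials and proxy URL; if the corporate proxy re-signs TLS traffic (a common cause of "certificate verify failed"), pointing verify at the proxy's CA bundle may be needed:

```python
# Placeholders: substitute real credentials and the real corporate proxy.
HERE_AppID = "your-app-id"
HERE_AppCode = "your-app-code"
proxy = "http://proxy.example.com:8080"

params = {
    "searchtext": "200 S Mathilda Sunnyvale CA",
    "app_id": HERE_AppID,
    "app_code": HERE_AppCode,
}
# Map both schemes so HTTPS requests also go through the proxy.
proxies = {"http": proxy, "https": proxy}

# with requests installed:
# r = requests.get("https://geocoder.cit.api.here.com/6.2/geocode.json",
#                  params=params, proxies=proxies,
#                  verify="/path/to/corporate-ca-bundle.pem", timeout=10)
# coords = r.json()
```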
I'm trying to deploy a simple Python Telegram bot to Heroku. It web-crawls 'http://www.wordreference.com/sinonimos/' using requests and bs4 and extracts synonyms for a given word. But once uploaded to the server it is unable to establish a connection; it raises the following exception:
requests.exceptions.ConnectionError:
HTTPConnectionPool(host='www.wordreference.com', port=80): Max retries
exceeded with url: /sinonimos/Calor (Caused by
NewConnectionError('<urllib3.connection.HTTPConnection object at
0x7fb3aeed80b8>: Failed to establish a new connection: [Errno 101] Network
is unreachable',))
It works perfectly when executed locally. I also tried crawling other URLs on Heroku, and they work fine, so I guess that if it were a port-binding issue, I wouldn't have gotten a response for the other URLs either. I tried using a free proxy server in case free accounts have limitations on certain addresses, as shown here:
proxies = {'http': 'http://40.141.163.122:8080'}
page='http://www.wordreference.com/sinonimos/{}'.format(word)
r = requests.get(page,proxies=proxies)
It worked for a while, and then stopped, without raising any exceptions to the logs. I contacted Heroku support, but they haven't replied yet, so if anyone can shed some light on the issue so I stop flying in the dark, it'd be appreciated.
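Since the bot eventually stopped without raising anything to the logs, it may help to add an explicit timeout and catch request failures so they always surface. A minimal sketch (the function name and the print-based logging are illustrative):

```python
import requests

def fetch_synonyms_page(word, proxies=None, timeout=10):
    """Fetch the wordreference page for `word`, turning silent hangs and
    proxy failures into logged, recoverable errors."""
    url = "http://www.wordreference.com/sinonimos/{}".format(word)
    try:
        r = requests.get(url, proxies=proxies, timeout=timeout)
        r.raise_for_status()
        return r.text
    except requests.exceptions.RequestException as exc:
        # ProxyError, ConnectionError, Timeout and HTTP errors all land here
        print("request for {!r} failed: {}".format(word, exc))
        return None
```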