Max retries exceeded while using rotating proxy with Python requests

I'm currently working on an Imgur bot using Python and the requests library with a rotating proxy. I run multiple instances of this bot, but sometimes some of them get this error:
HTTPSConnectionPool(host='api.imgur.com', port=443): Max retries exceeded with url: /3/credits (Caused by ProxyError('Cannot connect to proxy.', RemoteDisconnected('Remote end closed connection without response')))
Here is how the session is defined:
session = requests.Session()
session.proxies = {"http": cre["Proxy"]}
I have no idea why it crashes, since only some of the instances I run get this error. The others work fine.
Thanks, mates!

Maybe your proxies are configured incorrectly?
I think you are running into some form of "requesting too much, too fast".
Try delaying some requests (by about 3 seconds), as in the sketch below.
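A minimal sketch of that delay-and-retry idea, assuming a plain time.sleep between attempts; the proxy address and the helper function are placeholders, not the bot's actual code:

import time
import requests

session = requests.Session()
# Placeholder proxy URL; the real address/credentials come from the rotating-proxy provider.
# HTTPS traffic is normally still tunneled through an http:// proxy URL.
session.proxies = {
    "http": "http://user:pass@proxy.example.com:8080",
    "https": "http://user:pass@proxy.example.com:8080",
}

def get_with_delay(url, retries=3, delay=3):
    """Try the request a few times, sleeping between failed attempts."""
    for _ in range(retries):
        try:
            return session.get(url, timeout=10)
        except requests.exceptions.ProxyError:
            time.sleep(delay)  # back off before the next attempt
    return None

resp = get_with_delay("https://api.imgur.com/3/credits")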

Related

Connecting with Python library using a proxy [duplicate]

This question already has answers here:
How to get around python requests SSL and proxy error? (2 answers)
Python requests with proxy results in SSLError WRONG_VERSION_NUMBER (2 answers)
Closed 2 years ago.
I am trying to connect to websites via proxies. I am using the Python library requests.
proxies = {
    "http": "http://x.x.x.x:pn",
    "https": "https://x.x.x.x:pn"
}
request_result = requests.get("https://www.iplocation.net/find-ip-address", proxies=proxies)
This configuration is correct according to the documentation. I have one proxy from a paid site, and I have tried several others that are available on the web. I have also configured these proxies in the Windows settings and used them in the browser, and they work.
When making the request through Python I get this error:
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.iplocation.net', port=443): Max retries exceeded with url: /find-ip-address (Caused by SSLError(SSLError(1, '[SSL: UNKNOWN_PROTOCOL] unknown protocol (_ssl.c:852)'),))
This post says I should check the TLS version; my environment supports TLS 1.2. This post says I should have pyOpenSSL and idna installed, which I do. Why does it say unknown protocol? The requests library should be able to connect over HTTPS.
If I change the URL in the get() function to use the HTTP protocol, I sometimes receive:
(Caused by ProxyError('Cannot connect to proxy.', RemoteDisconnected('Remote end closed connection without response',)))
On other occasions I receive a 502 HTTP response.
It is quite frustrating because it should be working. I understand it is a hard question to answer with limited information. Can you give me any suggestions of what might be wrong?
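One thing worth double-checking here (an assumption about this setup, not something confirmed in the post) is the scheme of the proxy URLs: many HTTP proxies tunnel HTTPS via CONNECT but do not speak TLS themselves, so the "https" entry often still uses an http:// proxy URL. A sketch with the same placeholder address:

import requests

# Both entries point at the same http:// proxy; the proxy tunnels HTTPS via CONNECT.
# x.x.x.x:pn is a placeholder for the real proxy address and port.
proxies = {
    "http": "http://x.x.x.x:pn",
    "https": "http://x.x.x.x:pn",
}

request_result = requests.get("https://www.iplocation.net/find-ip-address",
                              proxies=proxies, timeout=10)
print(request_result.status_code)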

Requests giving errors while using HTTP proxies

So, I was sending a request using the requests library in Python 3.9.1. The problem is, when I tried to use an HTTP proxy, it gave me this error:
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='google.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x000002B08D6BC9A0>: Failed to establish a new connection:
[WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond')))
This is my code; I would appreciate any help:
import requests

for proxy in open('proxies.txt', 'r').readlines():
    proxies = {
        'http': f'http://{proxy}',
        'https': f'http://{proxy}'
    }
    e = requests.get('https://google.com/robots.txt', proxies=proxies)
    open('uwu.txt', 'a').write(e.text)
    print(e.text)
I am pretty sure it is not a problem with my proxies, as they are good private proxies with 100 GB of bandwidth (from zenum.io).
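One detail in the posted loop that is easy to miss (an observation, not confirmed as the cause of the WinError 10060) is that readlines() keeps the trailing newline on each line, so it ends up inside the proxy URL. A more defensive sketch that strips it and skips proxies that fail:

import requests

with open('proxies.txt') as f:
    # strip() removes the trailing newline that readlines() would leave on each entry
    proxy_list = [line.strip() for line in f if line.strip()]

for proxy in proxy_list:
    proxies = {
        'http': f'http://{proxy}',
        'https': f'http://{proxy}',  # HTTP proxies typically tunnel HTTPS via CONNECT
    }
    try:
        e = requests.get('https://google.com/robots.txt', proxies=proxies, timeout=10)
    except requests.exceptions.ProxyError as exc:
        print(f'{proxy} failed: {exc}')
        continue
    with open('uwu.txt', 'a') as out:
        out.write(e.text)
    print(e.text)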

Python Requests - Hiding 'Max retries exceeded with url'

I'm using requests to send a POST to an API through proxies; however, I'm trying to keep the console output clean, not spammed with errors saying the proxy 'fails to connect'. Is there a way to hide errors like this one?
HTTPSConnectionPool(host='test.com', port=443): Max retries exceeded with url: /v1/ (Caused by ProxyError('Cannot connect to proxy.', OSError(0, 'Error')))
I set a timeout as well, but nothing seems to work and my console output is still being spammed with that error.
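One common way to keep that out of the console (a sketch, assuming the message comes from an unhandled requests exception rather than something printed elsewhere) is to catch the proxy/connection errors around the call and handle them quietly; the URL, payload, and proxy address below are placeholders:

import requests

def quiet_post(url, data, proxies):
    """Return the response, or None if the proxy fails, without letting a traceback print."""
    try:
        return requests.post(url, data=data, proxies=proxies, timeout=10)
    except (requests.exceptions.ProxyError,
            requests.exceptions.ConnectionError,
            requests.exceptions.Timeout):
        return None  # swallow the proxy failure instead of spamming the console

resp = quiet_post('https://test.com/v1/', {'key': 'value'},
                  {'http': 'http://x.x.x.x:port', 'https': 'http://x.x.x.x:port'})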

Where is the default `server.socket_queue_size` configured in CherryPy in Python?

I have a CherryPy server which processes requests using a thread pool of, say, 10 threads.
If I send many requests in parallel (~200 processes constantly sending), I start seeing three types of errors in my client-side logs:
HTTPConnectionPool(host='localhost', port=8080):
Max retries exceeded with url: /my-url
(Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x1095da780>:
Failed to establish a new connection: [Errno 61] Connection refused'))
BrokenPipeError(32, 'Broken pipe'))
and
('Connection aborted.', ConnectionResetError(54, 'Connection reset by peer'))
Why 3 different types, btw?
I suppose I see these errors because there were too many requests sent.
I adjusted server.socket_queue_size to match the number of parallel requests I send, and it started working fine. My server configuration looks like this:
cherrypy.config.update({
    'server.socket_port': 8080,
    'server.socket_host': 'localhost',
    'server.thread_pool': 10,
    'server.socket_queue_size': 200
})
However, I'm struggling to find the default value of socket_queue_size in CherryPy. What is it? cherrypy.config['server.socket_queue_size'] gives me nothing unless I set the value myself.
So, what is the default value? How do I determine a reasonable socket_queue_size?
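If it helps, the server defaults appear to live as attributes on the cherrypy.server object rather than as entries in cherrypy.config (an assumption worth verifying against your installed version), so one way to check is simply to print the attribute:

import cherrypy

# cherrypy.config only holds values you set explicitly, which is why the
# dictionary lookup above returns nothing; the server adapter keeps the default.
print(cherrypy.server.socket_queue_size)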

Making requests through tor, requests.exceptions.ConnectionError Errno 61: Connection Refused

I'm trying to make a simple request to a what's-my-IP site while connected to Tor, but no matter what I try I continue to get this error:
requests.exceptions.ConnectionError: SOCKSHTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /get (Caused by NewConnectionError('<urllib3.contrib.socks.SOCKSHTTPSConnection object at 0x1018a7438>: Failed to establish a new connection: [Errno 61] Connection refused'))
I've looked at a lot of posts on here with similar issues, but I can't seem to find a fix that works.
This is my current code, but I've tried multiple approaches and it's the same error every time:
import requests

def main():
    proxies = {
        'http': 'socks5h://127.0.0.1:9050',
        'https': 'socks5h://127.0.0.1:9050'
    }
    r = requests.get('https://httpbin.org/get', proxies=proxies)
    print(r.text)

if __name__ == '__main__':
    main()
Well, the error says Max retries exceeded with url:, so possibly too many requests have been made from the Tor exit node's IP. Try it with a new Tor identity and see if that works.
If you wanted to, you could catch the exception and retry in a loop every few seconds (see the sketch below), but this may lead to that IP address being refused by the server for longer.
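A minimal sketch of that catch-and-retry idea, assuming the Tor SOCKS proxy from the question and requests installed with SOCKS support (requests[socks]); the retry counts and waits are arbitrary:

import time
import requests

PROXIES = {
    'http': 'socks5h://127.0.0.1:9050',
    'https': 'socks5h://127.0.0.1:9050'
}

def get_with_retries(url, attempts=5, wait=10):
    """Retry the request through Tor a few times, sleeping between failed attempts."""
    for _ in range(attempts):
        try:
            return requests.get(url, proxies=PROXIES, timeout=30)
        except requests.exceptions.ConnectionError:
            time.sleep(wait)  # wait before retrying (ideally after requesting a new identity)
    return None

r = get_with_retries('https://httpbin.org/get')
if r is not None:
    print(r.text)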
