Request Max Retries TOR - python

I am trying to connect to Tor's SOCKS proxy on the localhost loopback and send data through it.
The address I am using is:
127.0.0.1:9050
I am using the following script to do this:
import requesocks, requests
session = requesocks.session()
session.proxies = {'http': 'socks5://127.0.0.1:9050',
                   'https': 'socks5://127.0.0.1:9050'}
print session.get("https://api.ipify.org?format=json").json()
It is supposed to retrieve my IP and print it. However, it gives the following error:
Max retries exceeded with url: https://api.ipify.org/?format=json
I can verify that Tor is up and running. What could be causing this exception?

I got it working. I had to install the Tor "expert" bundle and add the exe to my PATH. Thank you
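For reference, requesocks is deprecated; plain requests can talk to Tor's SOCKS port once the socks extra is installed (pip install requests[socks]). A minimal sketch of the same check, assuming a standalone Tor service (not the browser) is listening on 127.0.0.1:9050, and using socks5h so DNS resolution also goes through Tor:
import requests

# Assumes a standalone Tor service is listening on 127.0.0.1:9050.
session = requests.session()
session.proxies = {
    'http': 'socks5h://127.0.0.1:9050',   # socks5h: let Tor resolve DNS as well
    'https': 'socks5h://127.0.0.1:9050',
}

# Should print the Tor exit node's IP rather than your own.
print(session.get("https://api.ipify.org?format=json").json())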

Related

proxy works fine with http but not https

I wanted to use proxies with Python requests, but when I run code like this
req = requests.get("https://httpbin.org/ip",
                   proxies={'https': 'user:pass#host:port',
                            'http': 'user:pass#host:port'})
print(req.content)
I get this error
HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /ip (Caused by ProxyError('Cannot connect to proxy.', TimeoutError(10060, 'A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond', None, 10060, None)))
but if I use "http://httpbin.org/ip" instead of "https://httpbin.org/ip",
it works fine.
Similarly, if I run this code
proxies = {'http': 'user:pass#host:port'}
req = requests.get("https://lumtest.com/myip.json", proxies=proxies)
print(req.content)
I get my own IP address, which means the proxy is not being used. But if I use this,
which is the same URL just without the s in https, so it runs over HTTP:
proxies = {'http': 'user:pass#host:port'}
req = requests.get("http://lumtest.com/myip.json", proxies=proxies)
print(req.content)
I get the proxy's IP, which means it is working fine.
Switching the s in HTTP/HTTPS doesn't bother me, but on some websites, when I use the proxies over HTTP,
I get a different response:
b''
instead of the response I want. Without proxies the request works fine over either HTTP or HTTPS,
but with the proxies it only works over HTTP and doesn't give me a valid response.
I hope someone can help me because I have been trying to solve this forever.
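requests only routes a request through the proxy entry whose key matches the URL scheme, so a proxies dict with only an 'http' key leaves https:// requests going direct; that is consistent with lumtest.com showing your own IP over HTTPS. A minimal sketch with both keys set, assuming the proxy supports HTTPS tunnelling (CONNECT) and that credentials go in the standard user:pass@host:port form (the values below are placeholders):
import requests

# Placeholder proxy URL; substitute real credentials, host and port.
proxy_url = "http://user:pass@proxy.example.com:8080"

# Both keys are needed: requests matches the proxy entry against the URL scheme,
# so an 'http'-only dict makes https:// requests bypass the proxy entirely.
proxies = {
    "http": proxy_url,
    "https": proxy_url,   # HTTPS traffic is tunnelled through the proxy via CONNECT
}

resp = requests.get("https://lumtest.com/myip.json", proxies=proxies, timeout=30)
print(resp.content)   # should now show the proxy's IP for HTTPS as well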

Make a python 3 request using proxy

I want to make an anonymous web request using python 3.
I've tried a few suggestions, such as: Make requests using Python over Tor
I've managed to get a fake IP using this snippet:
Installation
pip install requests requests[socks]
Basic usage
import requests

def get_tor_session():
    session = requests.session()
    # Tor uses the 9050 port as the default socks port
    session.proxies = {'http': 'socks5://127.0.0.1:9150',
                       'https': 'socks5://127.0.0.1:9150'}
    return session

# Make a request through the Tor connection
# IP visible through Tor
session = get_tor_session()
print(session.get("http://httpbin.org/ip").text)
# Above should print an IP different than your public IP

# Following prints your normal public IP
print(requests.get("http://httpbin.org/ip").text)
But that only works on port 9150, and only while the Tor Browser is running.
I want to make a request without the Tor Browser, as I want to Dockerize the whole thing.
I've read about SOCKS5, and as you can see I've installed it, but when I make a request on port 9050 with the same snippet I get:
requests.exceptions.ConnectionError:
SOCKSHTTPConnectionPool(host='httpbin.org', port=80): Max retries
exceeded with url: /ip (Caused by
NewConnectionError(': Failed to establish a new connection: [WinError
10061] No connection could be made because the target machine actively
refused it',))
How can I solve it?
Thanks!
10061 is 'connection refused'.
That means there was nothing listening on the port you tried to connect to: either no service is up and running there (no open port), or a firewall on the target IP is blocking it.
You can test that port with telnet:
telnet `IP` `PORT`
And also check this port issue on Windows: here
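If telnet isn't available, the same reachability check can be done from Python by opening a plain TCP connection to the port Tor should be listening on; just a sketch, assuming Tor is expected on 127.0.0.1:9050:
import socket

# connect_ex returns 0 if something accepted the connection, otherwise an
# error number (e.g. 10061 on Windows when the connection is refused).
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(5)
    result = s.connect_ex(("127.0.0.1", 9050))

print("port 9050 is open" if result == 0 else "connection failed, error %d" % result)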
I was also facing this issue; in my case my Tor service was not running. I was using kalitorify, which is a transparent proxy, and whenever it was on I was not able to use normal sites such as Google search, so to use those sites I would turn off my kalitorify service, which also turns off the Tor service.
So if you're also using that, check it as well.

Python: How can I use urllib or requests modules from a corporate domain (firewall, proxy, cntlm etc)

I am trying to do the following:
from urllib.request import urlopen
data = urlopen("https://www.duolingo.com/users/SaifullahS6").read()
I get the following error:
URLError: <urlopen error [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>
Similarly, when I try this:
import requests
session = requests.Session()
data = {"login": "SaifullahS6", "password": "mypassword"}
req = requests.Request('POST', "https://www.duolingo.com/login", data=data,
                       cookies=session.cookies)
prepped = req.prepare()
returned = session.send(prepped)
I get:
ConnectionError: HTTPSConnectionPool(host='www.duolingo.com', port=443): Max retries exceeded with url: /login (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x000000000E6948D0>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond',))
I am not sure how to give details of my internet connection.
I'm at work and I know we have a corporate proxy.
We have Windows Firewall turned on, but I have checked that python and pythonw are ticked in the "Domain" column of the control panel for allowing a program through the firewall.
When I ping google.co.uk from a command shell, all four requests time out, but I can access it from a browser.
In the Internet Options control panel, I click on the Connections tab and then LAN settings. "Automatically detect settings" is turned on, and so is "Use a proxy server for your LAN", with "Address" set to "localhost" and "Port" to 3128. This is cntlm. I set it up once to download Python packages, and it appears to still be active, because I have just managed to update one of my packages.
I don't even need a direct answer to my question; at this point I'll just settle for some clarity on what is actually going on behind the scenes. Any help much appreciated!
For the first case above (urllib module), I solved it by inserting the following lines before the data = urlopen(...).read() line:
proxies = { "http": "localhost:3128",
"https": "localhost:3128"}
proxy = urllib.request.ProxyHandler(proxies)
opener = urllib.request.build_opener(proxy)
urllib.request.install_opener(opener)
For the second case (requests module), everything was the same except the last line:
proxies = { "http": "localhost:3128",
"https": "localhost:3128"}
returned = session.send(prepped, proxies=proxies)
Hope this note helps others who come across this page.
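A related option, not part of the answer above: both urllib and requests also read the standard proxy environment variables by default, so pointing HTTP_PROXY and HTTPS_PROXY at the local cntlm listener avoids passing proxies explicitly in every call. A sketch, assuming cntlm is still listening on localhost:3128:
import os
import requests

# Point the standard proxy environment variables at the local cntlm listener.
os.environ["HTTP_PROXY"] = "http://localhost:3128"
os.environ["HTTPS_PROXY"] = "http://localhost:3128"

# requests (and urllib.request.urlopen) pick these up automatically,
# so no explicit proxies argument is needed.
print(requests.get("https://www.duolingo.com/users/SaifullahS6").status_code)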

Fetching a .onion domain with requests

I'm trying to access the following domain nzxj65x32vh2fkhk.onion using requests.
I have Tor running and I have configured the session object's proxies correctly.
import requests
session = requests.session()
session.proxies = {'http': 'socks5://localhost:9050',
                   'https': 'socks5://localhost:9050'}
print(session.get('http://httpbin.org/ip').text) # prints {"origin": "67.205.146.164" }
print(requests.get('http://httpbin.org/ip').text) # prints {"origin": "5.102.254.76" }
However when I try to access the URL with the .onion domain I get the following error:
session.get('http://nzxj65x32vh2fkhk.onion/all')
ConnectionError: SOCKSHTTPConnectionPool(host='nzxj65x32vh2fkhk.onion', port=80): Max retries exceeded with url: /all (Caused by NewConnectionError('<requests.packages.urllib3.contrib.socks.SOCKSConnection object at 0x7f5e8c2dbbd0>: Failed to establish a new connection: [Errno -2] Name or service not known',))
I also tried to replace localhost with 127.0.0.1 as suggested in one of the answers. The result is the same unfortunately.
Performing the same request using urllib2 works just fine.
import socks, socket, urllib2

def create_connection(address, timeout=None, source_address=None):
    sock = socks.socksocket()
    sock.connect(address)
    return sock

socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, '127.0.0.1', 9050)
socket.socket = socks.socksocket
socket.create_connection = create_connection

print(urllib2.urlopen('http://nzxj65x32vh2fkhk.onion/all').read())  # Prints the URL's contents
cURL also retrieves the contents of the page correctly.
I'm using Python 2.7.13, requests 2.13.0 & PySocks 1.6.7. Tor is running through a docker container with the following command:
sudo docker run -it -p 8118:8118 -p 9050:9050 -d dperson/torproxy
What am I doing wrong here? What do I need to do to make requests recognize the .onion URLs?
The solution is to use the socks5h scheme, which enables remote DNS resolving: hostname resolution is done by the proxy (inside the Tor network) instead of locally, which is what .onion addresses require. See https://github.com/kennethreitz/requests/blob/e3f89bf23c53b98593e4248054661472aacac820/requests/packages/urllib3/contrib/socks.py#L158
The following code works as expected:
import requests
session = requests.session()
session.proxies = {'http': 'socks5h://localhost:9050',
                   'https': 'socks5h://localhost:9050'}
print(session.get('http://httpbin.org/ip').text) # prints {"origin": "67.205.146.164" }
print(requests.get('http://httpbin.org/ip').text) # prints {"origin": "5.102.254.76" }
print(session.get('http://nzxj65x32vh2fkhk.onion/all').text) # Prints the contents of the page
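The file linked above is urllib3's SOCKS contrib module, which requests delegates to; the same socks5 vs socks5h distinction can be seen at that level with SOCKSProxyManager (PySocks must be installed). A small sketch, not part of the original answer:
from urllib3.contrib.socks import SOCKSProxyManager

# socks5h:// asks the proxy (Tor) to resolve hostnames, which is what makes
# .onion addresses reachable; socks5:// would try to resolve them locally.
proxy = SOCKSProxyManager('socks5h://localhost:9050')
resp = proxy.request('GET', 'http://httpbin.org/ip')
print(resp.data)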

Python requests returns “cannot connect to proxy & error 10061”

I have developed a desktop client using PyQt4; it connects to my web service with the requests library. requests is one of the most useful HTTP clients, so I assumed there would be no problem. My desktop client worked fine until something strange happened.
I use the following code to send requests to my server:
response = requests.get(url, headers=self.getHeaders(), timeout=600, proxies={}, verify=False)
where the headers only include an auth token.
def getHeaders(self, additional=None):
    headers = {
        'Auth-Token': HttpBasicClient.UserAuthToken,
    }
    if additional is not None:
        headers.update(additional)
    return headers
I cannot connect to my web service; every HTTP request raises the same error, "'Cannot connect to proxy.', error(10061, '')". For example:
GET Url: http:// api.fangcloud.com/api/v1/user/timestamp
HTTPSConnectionPool(host='api.fangcloud.com', port=443): Max retries exceeded with url: /api/v1/user/timestamp (Caused by ProxyError('Cannot connect to proxy.', error(10061, '')))
This API does nothing but return my server's timestamp. When I copy the URL into Chrome on the same machine, in the same environment, it returns the correct response, yet my desktop client only returns the error. Is there anything wrong with the requests library?
I googled this connection error 10061 ("No connection could be made because the target machine actively refused it"). It may be caused by the web server rejecting the TCP connection.
The client sends a SYN packet to the server targeting the port (80 for HTTP). A server that is running a service on port 80 will respond with a SYN ACK, but if it is not, it will respond with a RST ACK. Your client reaches the server, but not the intended service. This is one way a server could “actively refuse” a connection attempt.
But why? My client worked fine before, and Chrome still works. I use no proxy on my machine. Is there anything I'm missing?
I notice there is a whitespace in the URL; is that correct?
I tested it in IPython with requests, and the response was:
{
"timestamp": 1472760770,
"success": true
}
For HTTP and HTTPS.
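One more thing worth checking, given that the traceback shows a ProxyError even though proxies={} is passed: requests still merges in proxies from environment variables and, on Windows, the system Internet Options unless trust_env is disabled, so a stale or broken system proxy can produce exactly this 10061 error. A sketch of how to rule that out, reusing the URL from the question:
import requests

session = requests.Session()
# Ignore proxy settings from environment variables / Windows Internet Options.
session.trust_env = False

resp = session.get(
    "https://api.fangcloud.com/api/v1/user/timestamp",
    timeout=600,
    verify=False,  # mirrors the original call; drop it if the certificate is valid
)
print(resp.status_code, resp.text)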
