Python simple_salesforce proxy usage

I'm using the Python simple_salesforce module from this example: https://pypi.python.org/pypi/simple-salesforce. Specifically:
from simple_salesforce import Salesforce
proxies = {
    "http": "http://10.10.1.10:3128"
}
sf = Salesforce(username='myemail@example.com.sandbox', password='password', security_token='token', sandbox=True, proxies=proxies)
It's failing with the error below:
requests.exceptions.ConnectionError: ('Connection aborted.', error(111, 'Connection refused'))
If I don't use the proxy, it works fine, but my requirement is to go through the proxy.
Any suggestions?

Adding the following to the beginning of the program solved this problem. I was also using urllib2, and these environment variables take care of forwarding its requests through the proxy as well. To answer my own question: if your proxy's hostname and port are xyz1-pqr01.abc.company.com and 3128, then
import os
os.environ['http_proxy'] = 'http://xyz1-pqr01.abc.company.com:3128'
os.environ['https_proxy'] = 'http://xyz1-pqr01.abc.company.com:3128'
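Put together, a minimal sketch of the fix (the proxy host/port and Salesforce credentials are placeholders carried over from above):
import os

# Set the proxy environment variables before any HTTP traffic happens
# (placeholder host/port from above).
os.environ['http_proxy'] = 'http://xyz1-pqr01.abc.company.com:3128'
os.environ['https_proxy'] = 'http://xyz1-pqr01.abc.company.com:3128'

from simple_salesforce import Salesforce

# requests, which simple_salesforce uses internally, honours these
# variables, so the explicit proxies argument becomes optional.
sf = Salesforce(username='myemail@example.com.sandbox', password='password',
                security_token='token', sandbox=True)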

Related

Python SSL Context with wrapped Requests

I'm trying to connect to a websocket server that is protected by CloudFlare, using an Upgrade: websocket header. The expected result is 101 Switching Protocols. Using a raw socket, I was able to connect to the server, but with several intermittent issues, such as an SSLv3 handshake failure or the server giving no response at all.
import ssl
import socket
socketch = ssl._create_unverified_context().wrap_socket(socket.socket(), server_hostname='unpkg.com')
socketch.connect(('unpkg.com', 443))
socketch.sendall(b'''GET / HTTP/1.1\r
Host: identity.o2.co.uk.zainvps.tk\r
User-Agent: cpprestsdk/2.9.0\r
Upgrade: websocket\r
Connection: Upgrade\r
Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==\r
Sec-WebSocket-Version: 13\r\n\r
''')
print(socketch.recv(10000))
print('')
Using a raw socket is unstable, so I think it's better to use the requests module.
import requests
heading = {'Host':'identity.o2.co.uk.zainvps.tk','Connection':'upgrade','Upgrade':'websocket','Sec-Websocket-Version':'13','Sec-Websocket-Key':'dGhlIHNhbXBsZSBub25jZQ=='}
r = requests.get('https://unpkg.com', headers=heading)
print(r.status_code)
Using requests, the server responds with a 403 status code, which means the request is rejected by the CloudFlare protection, whereas the raw socket gets the correct 101 status code. I'm assuming this is because the wrapped socket presents the expected SSL hostname via server_hostname.
Can this idea also be implemented inside requests.Session()?
UPDATE 1:
Someone mentioned using the CloudScraper module to bypass the CloudFlare protection. CloudScraper still returns a 403 status code with the custom headers:
import cloudscraper
scraper = cloudscraper.create_scraper()
url = 'https://unpkg.com'
sc = scraper.get(url, headers={'Host': 'usaws1.sshstores.vip',
                               'Connection': 'upgrade',
                               'Upgrade': 'websocket',
                               'Sec-WebSocket-Key': 'dGhlIHNhbXBsZSBub25jZQ==',
                               'Sec-WebSocket-Version': '13'})
print(sc.status_code)
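For reference, the idea being asked about (pinning the TLS SNI value independently of the Host header) can be sketched with a custom transport adapter. This is an assumption rather than a confirmed bypass: it relies on urllib3 forwarding a server_hostname pool keyword to the HTTPS connection (the same mechanism requests_toolbelt's HostHeaderSSLAdapter builds on), and CloudFlare may still answer 403:
import requests
from requests.adapters import HTTPAdapter

class SNIAdapter(HTTPAdapter):
    # Hypothetical sketch: pin the SNI value for every HTTPS connection.
    def __init__(self, server_hostname, **kwargs):
        self._server_hostname = server_hostname
        super().__init__(**kwargs)

    def init_poolmanager(self, *args, **kwargs):
        # urllib3 passes server_hostname through to the TLS handshake
        # in place of the hostname taken from the URL.
        kwargs['server_hostname'] = self._server_hostname
        super().init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount('https://', SNIAdapter('unpkg.com'))
r = session.get('https://unpkg.com', headers=heading)  # heading as above
print(r.status_code)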

Python: How can I use urllib or requests modules from a corporate domain (firewall, proxy, cntlm etc)

I am trying to do the following:
from urllib.request import urlopen
data = urlopen("https://www.duolingo.com/users/SaifullahS6").read()
I get the following error:
URLError: <urlopen error [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>
Similarly, when I try this:
import requests
session = requests.Session()
data = {"login": "SaifullahS6", "password": "mypassword"}
req = requests.Request('POST', "https://www.duolingo.com/login", data=data,
                       cookies=session.cookies)
prepped = req.prepare()
returned = session.send(prepped)
I get:
ConnectionError: HTTPSConnectionPool(host='www.duolingo.com', port=443): Max retries exceeded with url: /login (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x000000000E6948D0>: Failed to establish a new connection: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond',))
I am not sure how to give details of my internet connection.
I'm at work and I know we have a corporate proxy.
We have Windows Firewall turned on, but I have checked that python and pythonw are ticked in the "Domain" column of the control panel for allowing a program through the firewall.
When I ping google.co.uk from a command shell, all four requests time out, but I can access it from a browser.
In the Internet Options control panel, I click on the Connections tab and then LAN settings; I have "Automatically detect settings" turned on, as well as "Use a proxy server for your LAN", with "Address" set to "localhost" and "Port" to 3128. This is cntlm. I set it up once to download Python packages, and it appears to still be active, because I have just managed to update one of my packages.
I don't even need a direct answer to my question; at this point I'll just settle for some clarity on what is actually going on behind the scenes. Any help much appreciated!
For the first case above (the urllib module), I solved it by inserting the following lines before the data = urlopen(...).read() line (note the import urllib.request, which is needed for ProxyHandler):
import urllib.request
proxies = {"http": "localhost:3128",
           "https": "localhost:3128"}
proxy = urllib.request.ProxyHandler(proxies)
opener = urllib.request.build_opener(proxy)
urllib.request.install_opener(opener)
For the second case (the requests module), everything was the same except the last line:
proxies = {"http": "localhost:3128",
           "https": "localhost:3128"}
returned = session.send(prepped, proxies=proxies)
Hope this note helps others who come across this page.
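For completeness, a variant for the requests case (a sketch under the same cntlm assumptions): proxies can be stored on the session so that plain session.get()/session.post() calls pick them up automatically. Note that session.send() does not merge session-level settings, which is why the fix above passes proxies explicitly:
import requests

session = requests.Session()
# Proxies set here apply to session.get()/session.post(), but not to
# session.send(prepped), which needs them passed explicitly.
session.proxies = {"http": "localhost:3128",
                   "https": "localhost:3128"}
r = session.get("https://www.duolingo.com/users/SaifullahS6")
print(r.status_code)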

Fetching a .onion domain with requests

I'm trying to access the following domain nzxj65x32vh2fkhk.onion using requests.
I have Tor running, and I have configured the session object's proxies correctly.
import requests
session = requests.session()
session.proxies = {'http': 'socks5://localhost:9050',
                   'https': 'socks5://localhost:9050'}
print(session.get('http://httpbin.org/ip').text)  # prints {"origin": "67.205.146.164"}
print(requests.get('http://httpbin.org/ip').text)  # prints {"origin": "5.102.254.76"}
However when I try to access the URL with the .onion domain I get the following error:
session.get('http://nzxj65x32vh2fkhk.onion/all')
ConnectionError: SOCKSHTTPConnectionPool(host='nzxj65x32vh2fkhk.onion', port=80): Max retries exceeded with url: /all (Caused by NewConnectionError('<requests.packages.urllib3.contrib.socks.SOCKSConnection object at 0x7f5e8c2dbbd0>: Failed to establish a new connection: [Errno -2] Name or service not known',))
I also tried to replace localhost with 127.0.0.1 as suggested in one of the answers. The result is the same unfortunately.
Performing the same request using urllib2 works just fine.
import socks, socket, urllib2

def create_connection(address, timeout=None, source_address=None):
    sock = socks.socksocket()
    sock.connect(address)
    return sock

socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, '127.0.0.1', 9050)
socket.socket = socks.socksocket
socket.create_connection = create_connection
print(urllib2.urlopen('http://nzxj65x32vh2fkhk.onion/all').read())  # Prints the URL's contents
cURL also retrieves the contents of the page correctly.
I'm using Python 2.7.13, requests 2.13.0 & PySocks 1.6.7. Tor is running through a docker container with the following command:
sudo docker run -it -p 8118:8118 -p 9050:9050 -d dperson/torproxy
What am I doing wrong here? What do I need to do to make requests recognize the .onion URLs?
The solution is to use the socks5h scheme, which makes the proxy resolve hostnames remotely; this is required here because .onion names cannot be resolved by the local DNS resolver. See https://github.com/kennethreitz/requests/blob/e3f89bf23c53b98593e4248054661472aacac820/requests/packages/urllib3/contrib/socks.py#L158
The following code works as expected:
import requests
session = requests.session()
session.proxies = {'http': 'socks5h://localhost:9050',
                   'https': 'socks5h://localhost:9050'}
print(session.get('http://httpbin.org/ip').text)  # prints {"origin": "67.205.146.164"}
print(requests.get('http://httpbin.org/ip').text)  # prints {"origin": "5.102.254.76"}
print(session.get('http://nzxj65x32vh2fkhk.onion/all').text)  # Prints the contents of the page

Python requests returns “cannot connect to proxy & error 10061”

I have developed a desktop client using PyQt4; it connects to my web service with the requests library. Requests is probably one of the most useful HTTP clients, so I assumed it would be no problem. My desktop client worked fine until something strange happened.
I use the following code to send requests to my server:
response = requests.get(url, headers=self.getHeaders(), timeout=600, proxies={}, verify=False)
where the headers only include an auth token:
def getHeaders(self, additional=None):
    headers = {
        'Auth-Token': HttpBasicClient.UserAuthToken,
    }
    if additional is not None:
        headers.update(additional)
    return headers
I cannot connect to my web service; every HTTP request fails with the same error, "'Cannot connect to proxy.', error(10061, '')". For example:
GET Url: http:// api.fangcloud.com/api/v1/user/timestamp
HTTPSConnectionPool(host='api.fangcloud.com', port=443): Max retries exceeded with url: /api/v1/user/timestamp (Caused by ProxyError('Cannot connect to proxy.', error(10061, '')))
This API does nothing but return the timestamp of my server. When I copy the URL into Chrome on the same machine, in the same environment, it returns the correct response; my desktop client, however, only gets the error. Is there anything wrong with the requests lib?
I googled connection error 10061 ("No connection could be made because the target machine actively refused it"). It can be caused by the web server rejecting the TCP connection:
The client sends a SYN packet to the server targeting the port (80 for HTTP). A server that is running a service on port 80 will respond with a SYN ACK, but if it is not, it will respond with a RST ACK. Your client reaches the server, but not the intended service. This is one way a server could “actively refuse” a connection attempt.
But why? My client worked fine before, and Chrome still works. I use no proxy on my machine. Is there anything I've missed?
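For what it's worth, the refusal itself is easy to reproduce with a bare socket (a minimal Python 3 sketch; the port is arbitrary and assumed to have nothing listening on it):
import socket

try:
    # With no listener on the port, the OS answers the SYN with RST ACK
    # and the connect call fails immediately.
    socket.create_connection(('127.0.0.1', 50007), timeout=5)
except ConnectionRefusedError as exc:
    print(exc)  # WinError 10061 on Windows, errno 111 on Linux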
I notice there is a white space in the URL; is that correct?
I tested it in IPython with requests, and the response was:
{
    "timestamp": 1472760770,
    "success": true
}
for both HTTP and HTTPS.
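One more thing worth checking (an assumption on my part, since you report no proxy on your machine): requests picks up proxy settings from the HTTP_PROXY/HTTPS_PROXY environment variables and, on Windows, from the Internet Options registry settings, even when proxies={} is passed. A quick sketch to rule that out:
import requests

session = requests.Session()
# trust_env=False stops requests from reading proxies out of the
# environment or the OS configuration; proxies={} alone does not.
session.trust_env = False
response = session.get('http://api.fangcloud.com/api/v1/user/timestamp',
                       timeout=600, verify=False)
print(response.status_code)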

How to make python Requests work via SOCKS proxy

I'm using the great Requests library in my Python script:
import requests
r = requests.get("http://example.com")
print(r.text)
I would like to use a SOCKS proxy; how can I do that? Requests seems to support only HTTP proxies.
The modern way:
pip install -U 'requests[socks]'
then
import requests
resp = requests.get('http://go.to',
                    proxies=dict(http='socks5://user:pass@host:port',
                                 https='socks5://user:pass@host:port'))
In case someone has tried all of these older answers and is still running into problems like:
requests.exceptions.ConnectionError:
SOCKSHTTPConnectionPool(host='myhost', port=80):
Max retries exceeded with url: /my/path
(Caused by NewConnectionError('<requests.packages.urllib3.contrib.socks.SOCKSConnection object at 0x106812bd0>:
Failed to establish a new connection:
[Errno 8] nodename nor servname provided, or not known',))
It may be because, by default, requests is configured to resolve DNS queries on the local side of the connection.
Try changing your proxy URL from socks5://proxyhost:1234 to socks5h://proxyhost:1234. Note the extra h (it stands for hostname resolution).
The PySocks package's default is to do remote resolution, and I'm not sure why requests made their integration this obscurely divergent, but here we are.
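In full, the working form looks like this (hostname and port are placeholders):
import requests

# socks5h:// delegates DNS resolution to the proxy itself.
proxies = {'http': 'socks5h://proxyhost:1234',
           'https': 'socks5h://proxyhost:1234'}
print(requests.get('http://example.com', proxies=proxies).status_code)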
As of requests version 2.10.0, released on 2016-04-29, requests supports SOCKS.
It requires PySocks, which can be installed with pip install pysocks.
Example usage:
import requests
proxies = {'http': "socks5://myproxy:9191"}
requests.get('http://example.org', proxies=proxies)
You need to install PySocks; my version is 1.0 and the code works for me:
import socket
import socks
import requests
ip = 'localhost'  # change to your proxy's IP
port = 1080       # change to your proxy's port (placeholder)
socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, ip, port)
socket.socket = socks.socksocket
url = u'http://ajax.googleapis.com/ajax/services/search/images?v=1.0&q=inurl%E8%A2%8B'
print(requests.get(url).text)
Once the SOCKS5 pull request is merged into python requests, it will be as simple as using a proxies dictionary:
Update: the PR has already been merged.
# SOCKS5 proxy for HTTP/HTTPS
proxies = {
    'http': "socks5://myproxy:9191",
    'https': "socks5://myproxy:9191"
}

headers = {}

url = 'http://example.com/'
res = requests.get(url, headers=headers, proxies=proxies)
See SOCKS Proxy Support
Another option, in case you cannot wait for requests to be ready and cannot use requesocks (for example on Google App Engine, due to the lack of the pwd built-in module), is to use PySocks, which was mentioned above:
Grab the socks.py file from the repo and put a copy in your root folder;
Add import socks and import socket;
At this point, configure and bind the socket before using it with urllib2, as in the following example:
import urllib2
import socket
import socks

socks.set_default_proxy(socks.SOCKS5, "myprivateproxy.example", port=9050)
socket.socket = socks.socksocket
res = urllib2.urlopen(url).read()
You can just run your script with the https_proxy environment variable.
Install SOCKS support if necessary:
pip install PySocks
pip install 'requests[socks]'
Set the environment variable:
export https_proxy=socks5://<hostname or ip>:<port>
Run your script. This example makes requests and shows the IP address; only the HTTPS request goes through the proxy, since only https_proxy is set:
echo Your real IP
python -c 'import requests;print(requests.get("http://ipinfo.io/ip").text)'
echo IP with socks-proxy
python -c 'import requests;print(requests.get("https://ipinfo.io/ip").text)'
# SOCKS5 proxy for HTTP/HTTPS
proxiesDict = {
    'http': "socks5://1.2.3.4:1080",
    'https': "socks5://1.2.3.4:1080"
}

# SOCKS4 proxy for HTTP/HTTPS
proxiesDict = {
    'http': "socks4://1.2.3.4:1080",
    'https': "socks4://1.2.3.4:1080"
}

# HTTP proxy for HTTP/HTTPS
proxiesDict = {
    'http': "1.2.3.4:1080",
    'https': "1.2.3.4:1080"
}
I installed pysocks and monkey patched create_connection in urllib3, like this (note that urllib3 itself has to be imported for the patch to work):
import socks
import socket
import urllib3

socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS4, "127.0.0.1", 1080)

def create_connection(address, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, socket_options=None):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    """
    host, port = address
    if host.startswith('['):
        host = host.strip('[]')
    err = None
    for res in socket.getaddrinfo(host, port, 0, socket.SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socks.socksocket(af, socktype, proto)
            # If provided, set socket level options before connecting.
            # This is the only addition urllib3 makes to this function.
            urllib3.util.connection._set_socket_options(sock, socket_options)
            if timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
            sock.connect(sa)
            return sock
        except socket.error as e:
            err = e
            if sock is not None:
                sock.close()
                sock = None
    if err is not None:
        raise err
    raise socket.error("getaddrinfo returns an empty list")

# monkeypatch
urllib3.util.connection.create_connection = create_connection
I could do this on Linux.
$ pip3 install --user 'requests[socks]'
$ https_proxy=socks5://<hostname or ip>:<port> python3 -c \
> 'import requests;print(requests.get("https://httpbin.org/ip").text)'
