I am using SSL tunneling with a proxy server to connect to a target server. I use HTTP to connect to the proxy server and HTTPS to connect to the target server. The SSL tunneling works as it should and I can exchange HTTPS messages with the remote server, but there is a problem. The proxy server returns a header in its reply to urllib2's request to establish the SSL tunnel that I need to see, but I don't see a way to get access to it using urllib2 (Python 2.7.3).
I suppose I could theoretically implement the SSL tunneling handshake myself, but that would get me way deeper into the protocol than I want to be (or with which I feel comfortable).
Is there a way to get access to the reply using urllib2 when establishing the SSL tunnel?
UPDATE:
Here is the code that uses the proxy server to connect to the target server (the proxy server and the target server's URLs are not the actual ones):
import urllib2

proxy_handler = urllib2.ProxyHandler({'https': 'http://proxy.com'})
url_opener = urllib2.build_opener(proxy_handler)
request = urllib2.Request('https://target_server.com/')
response = url_opener.open(request)
print response.headers.dict
I used Wireshark to look at the message traffic. Wireshark won't show me the bodies of the messages exchanged with the target server because they are encrypted, but I can see the body of the SSL tunnel handshake. I can see the header that I'm interested in coming back from the proxy server.
How are you calling the HTTPS page? Are you using
resp = urllib2.urlopen('https://target_server.com/')
resp.info().headers
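Note that resp.info().headers only exposes the headers of the final HTTPS response from the target server. The proxy's reply to the CONNECT request is read and thrown away inside httplib.HTTPConnection._tunnel, so one workaround is to override that private method and keep a copy of the header lines. This is only a rough sketch against Python 2.7.3's private httplib internals (an unsupported API), with made-up class names:

import httplib
import socket
import urllib2


class TunnelAwareHTTPSConnection(httplib.HTTPSConnection):
    # Raw header lines from the most recent CONNECT reply, kept on the class
    # so they stay reachable after urllib2 is done with the connection object.
    last_tunnel_reply = []

    def _tunnel(self):
        # Send the CONNECT request the same way the stock 2.7.3 implementation does.
        self._set_hostport(self._tunnel_host, self._tunnel_port)
        self.send("CONNECT %s:%d HTTP/1.0\r\n" % (self.host, self.port))
        for header, value in self._tunnel_headers.iteritems():
            self.send("%s: %s\r\n" % (header, value))
        self.send("\r\n")

        response = self.response_class(self.sock, strict=self.strict, method=self._method)
        (version, code, message) = response._read_status()
        if code != 200:
            self.close()
            raise socket.error("Tunnel connection failed: %d %s" % (code, message.strip()))

        # The stock _tunnel() discards these lines; collect them instead.
        lines = []
        while True:
            line = response.fp.readline()
            if line in ('\r\n', '\n', ''):
                break
            lines.append(line.rstrip())
        TunnelAwareHTTPSConnection.last_tunnel_reply = lines


class TunnelAwareHTTPSHandler(urllib2.HTTPSHandler):
    def https_open(self, req):
        return self.do_open(TunnelAwareHTTPSConnection, req)


proxy_handler = urllib2.ProxyHandler({'https': 'http://proxy.com'})
url_opener = urllib2.build_opener(proxy_handler, TunnelAwareHTTPSHandler)
response = url_opener.open('https://target_server.com/')
print TunnelAwareHTTPSConnection.last_tunnel_reply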
Related
I am trying to understand how HTTP/3 works. Ultimately, my goal is to send an HTTP/3 request to a host through a proxy and receive the response back.
The host I am trying to reach only accepts HTTP/3 connections.
There is a library that takes care of the heavy lifting of initiating an HTTP/3 connection; however, it doesn't demonstrate how a proxy can be used:
https://github.com/aiortc/aioquic/blob/main/examples/http3_client.py
I am running the following file after cloning the repo like this:
python3 examples/http3_client.py 'https://www.truepeoplesearch.com/'
Doing so does route the request via HTTP/3 using the QUIC protocol. How can I send the same request through a proxy, given the proxy's IP, port, username, and password?
I'm trying to send an HTTPS request through an HTTPS tunnel. That is, my proxy expects HTTPS for the CONNECT. It also expects a client certificate.
I'm using Requests' proxy features.
import requests

url = "https://some.external.com/endpoint"

with requests.Session() as session:
    response = session.get(
        url,
        proxies={"https": "https://proxy.host:4443"},
        # client certificates expected by proxy
        cert=(cert_path, key_path),
        verify="/home/savior/proxy-ca-bundle.pem",
    )
    with response:
        ...
This works, but with some limitations:
I can only set client certificates for the TLS connection with the proxy, not for the external endpoint.
The proxy-ca-bundle.pem only verifies the server certificates in the TLS connection with the proxy. The server certificates from the external endpoint are seemingly ignored.
Is there any way to use requests to address these two issues? I'd like to set a different set of CAs for the external endpoint.
I also tried using http.client and HTTPSConnection.set_tunnel but, as far as I can tell, its tunnel is done through HTTP and I need HTTPS.
Looking at the source code, it doesn't seem like requests currently supports this "TLS in TLS", i.e. providing two sets of client certificates/CA bundles for a proxied request.
We can use PycURL instead, which simply wraps libcurl:
from io import BytesIO

import pycurl

url = "https://some.external.com/endpoint"

buffer = BytesIO()
curl = pycurl.Curl()
curl.setopt(curl.URL, url)
curl.setopt(curl.WRITEDATA, buffer)

# proxy settings
curl.setopt(curl.HTTPPROXYTUNNEL, 1)
curl.setopt(curl.PROXY, "https://proxy.host")
curl.setopt(curl.PROXYPORT, 4443)
curl.setopt(curl.PROXY_SSLCERT, cert_path)
curl.setopt(curl.PROXY_SSLKEY, key_path)
curl.setopt(curl.PROXY_CAINFO, "/home/savior/proxy-ca-bundle.pem")

# endpoint verification
curl.setopt(curl.CAINFO, "/home/savior/external-ca-bundle.pem")

try:
    curl.perform()
except pycurl.error:
    pass  # log or re-raise
else:
    status_code = curl.getinfo(curl.RESPONSE_CODE)
PycURL will use the PROXY_* settings to establish a TLS connection to the proxy and send it an HTTP CONNECT request. Then it'll establish a new TLS session through the proxy connection to the external endpoint and use the CAINFO bundle to verify that server's certificates.
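Once perform() returns successfully, the response body is whatever libcurl wrote into the buffer, for example:

body = buffer.getvalue()      # raw response bytes written via WRITEDATA
text = body.decode("utf-8")   # assuming the endpoint returns UTF-8 text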
I have developed a desktop client using PyQt4; it connects to my web service with the requests library. requests is probably one of the most useful HTTP clients, so I thought there should be no problem. My desktop client worked all right until something strange happened.
I use the following code to send a request to my server:
response = requests.get(url, headers=self.getHeaders(), timeout=600, proxies={}, verify=False)
where the headers only include an auth token:
def getHeaders(self, additional=None):
    headers = {
        'Auth-Token': HttpBasicClient.UserAuthToken,
    }
    if additional is not None:
        headers.update(additional)
    return headers
I cannot connect to my web service; all the HTTP requests raise the same error, "'Cannot connect to proxy.', error(10061, '')". For example:
GET Url: http:// api.fangcloud.com/api/v1/user/timestamp
HTTPSConnectionPool(host='api.fangcloud.com', port=443): Max retries exceeded with url: /api/v1/user/timestamp (Caused by ProxyError('Cannot connect to proxy.', error(10061, '')))
This API does nothing but return the timestamp of my server. When I copy the URL into Chrome on the same machine with the same environment, it returns the correct response. However, my desktop client only returns an error. Is there anything wrong with the requests library?
I googled this problem of connection error 10061 ("No connection could be made because the target machine actively refused it"). This may be caused by the web server rejecting the TCP connection.
The client sends a SYN packet to the server targeting the port (80 for HTTP). A server that is running a service on port 80 will respond with a SYN ACK, but if it is not, it will respond with a RST ACK. Your client reaches the server, but not the intended service. This is one way a server could “actively refuse” a connection attempt.
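A quick way to see whether the refusal really happens at the TCP level is to try the same connection with a bare socket. This is only a diagnostic sketch, using the host and port from the error message:

import socket

try:
    sock = socket.create_connection(("api.fangcloud.com", 443), timeout=10)
    sock.close()
    print "TCP connection succeeded; the refusal happens higher up (e.g. a proxy setting)"
except socket.error as exc:
    # On Windows a refused connection surfaces as error 10061, matching the ProxyError above.
    print "TCP connection failed: %s" % exc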
But why? My client worked all right before, and Chrome still works. I use no proxy on my machine. Is there anything I'm missing?
I notice there is a whitespace in the URL; is that correct?
I tested it in IPython with requests, and the response was:
{
    "timestamp": 1472760770,
    "success": true
}
For HTTP and HTTPS.
When using Python 2.7's urllib2, I do not seem to be able to retrieve a resource from an HTTPS server while using an SSL-secured proxy server, i.e. the following:
CLIENT ---- (HTTPS) ---> PROXY ---- (HTTPS) ---> SERVER
Of course, to get through the proxy server one uses CONNECT. Any ideas?
Alternative question: when using CONNECT, one needs to set up a completely independent second SSL session inside the tunnel, right? How could one do that in Python, as simply calling ssl.wrap_socket does not do the trick?
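For comparison, with a plain (unencrypted) connection to the proxy the second SSL session is straightforward: send the CONNECT request yourself and then call ssl.wrap_socket on the same socket. The sketch below uses placeholder host names and skips error handling and certificate verification; it does not cover the HTTPS-to-the-proxy case, which is exactly where naively nesting wrap_socket falls short:

import socket
import ssl

proxy_addr = ("proxy.example.com", 3128)   # placeholder proxy
target_host = "target_server.com"          # placeholder target

# Plain TCP connection to the proxy, then an explicit CONNECT request.
sock = socket.create_connection(proxy_addr)
sock.sendall("CONNECT %s:443 HTTP/1.1\r\nHost: %s:443\r\n\r\n" % (target_host, target_host))

# Read the proxy's reply up to the blank line; expect "HTTP/1.x 200 ...".
reply = ""
while "\r\n\r\n" not in reply:
    chunk = sock.recv(4096)
    if not chunk:
        break
    reply += chunk
print reply.splitlines()[0]

# The second, independent SSL session runs inside the tunnel.
tls = ssl.wrap_socket(sock)
tls.sendall("GET / HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n" % target_host)
print tls.recv(4096)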
I am creating a proxy server in Python, based on BaseHTTPServer.
What it does is create a connection to a Squid proxy, identify the browser request (GET, CONNECT, POST, etc.), add a Proxy-Authorization header to it, and then forward the request to the Squid proxy.
The problem is, as I understand it, when I send a CONNECT request I should relay all the corresponding traffic to the Squid proxy. But, as I can see in Wireshark, the Squid proxy doesn't reply to the 'Client Hello' part of the handshake, which I think is because the Squid proxy doesn't understand the binary SSL data that I am simply forwarding to it.
How do I process HTTPS requests in this case?
The code is more or less similar to TinyHTTPProxy: http://www.oki-osk.jp/esc/python/proxy/
RFC 2817 defines the CONNECT method. It is different from other HTTP methods in that the receiving proxy (your Python proxy) is directed to establish a raw TCP tunnel directly to the destination host (called the authority in the RFC).
A proxy can make no assumptions about the data that will be sent over that tunnel; it will not necessarily be HTTP – the client can use the tunnel to speak any protocol it likes. Indeed, SSL ≠ HTTP.
You have two options:
Open a TCP connection directly to the requested destination host.
Make a CONNECT request to your upstream proxy (Squid). This is within spec:
It may be the case that the proxy itself can only reach the
requested origin server through another proxy. In this case, the
first proxy SHOULD make a CONNECT request of that next proxy,
requesting a tunnel to the authority. A proxy MUST NOT respond
with any 2xx status code unless it has either a direct or tunnel
connection established to the authority.
Make sure that your request includes the required Host header.
CONNECT www.google.com:443 HTTP/1.1
Host: www.google.com:443
Proxy-Authorization: ...
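A rough sketch of option 2 inside a BaseHTTPServer handler is shown below; the upstream Squid address and the credentials are placeholders, and error handling is left out:

import base64
import select
import socket
from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer

UPSTREAM = ("squid.example.com", 3128)            # placeholder upstream Squid proxy
CREDENTIALS = base64.b64encode("user:password")   # placeholder proxy credentials


class TunnelingProxyHandler(BaseHTTPRequestHandler):
    def do_CONNECT(self):
        # self.path holds "host:port" from the browser's CONNECT request line.
        upstream = socket.create_connection(UPSTREAM)
        upstream.sendall(
            "CONNECT %s HTTP/1.1\r\n"
            "Host: %s\r\n"
            "Proxy-Authorization: Basic %s\r\n"
            "\r\n" % (self.path, self.path, CREDENTIALS))

        # Wait for Squid's "200 Connection established" before answering the browser.
        reply = ""
        while "\r\n\r\n" not in reply:
            chunk = upstream.recv(4096)
            if not chunk:
                return
            reply += chunk
        self.connection.sendall(reply)

        # From here on, just shuttle raw bytes in both directions; the TLS handshake
        # (Client Hello included) happens end to end through the two tunnels.
        sockets = [self.connection, upstream]
        while True:
            readable, _, _ = select.select(sockets, [], [], 60)
            if not readable:
                return
            for s in readable:
                data = s.recv(8192)
                if not data:
                    return
                (upstream if s is self.connection else self.connection).sendall(data)


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TunnelingProxyHandler).serve_forever()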