How to know if I used HTTP while HTTPS was available - Python

I am trying to write a script that checks whether HTTPS was available when I used HTTP.
My idea was to collect all of the HTTP links and use urllib2 to open a connection to the server over HTTPS, as follows (please ignore any syntax problems; I have simplified the code so it is easier to understand the problem itself):
count = 0
for packet in trafficPackets:
    if packet["http.host"] != None:
        if https_supported(packet["ip.dest"]):
            count += 1
where https_supported is the following function:
from urllib2 import urlopen

def https_supported(ip):
    try:
        if len(urlopen("https://" + ip).read()) > 0:
            return True
    except:
        return False
    return False
I tried to run the code on a small traffic file which contains an HTTP connection to a site that supports HTTPS, but the result was unexpected: it always returned zero.
Where did I go wrong? Does anyone have an idea of how I can do it?
Thank you!

Using the exact same code with the http.host field instead of the IP seems to work.
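For reference, a minimal sketch of the same check written with the requests library and driven by the hostname from http.host rather than the destination IP; one plausible reason the hostname works where the raw IP does not is that HTTPS servers commonly rely on SNI and certificate name checks. The example host below is only a placeholder:

import requests

def https_supported(host, timeout=5):
    # Returns True if an HTTPS connection to the given hostname succeeds.
    try:
        resp = requests.get("https://" + host, timeout=timeout)
        return resp.status_code < 500
    except requests.exceptions.RequestException:
        return False

print(https_supported("www.python.org"))  # True for a host that serves HTTPS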

Related

Python Request Package Close Connection Method Does not Work

It is the first time I am working with a REST API in a Jupyter notebook and I don't know what I am doing wrong here. When I try to execute the following code in a cell, the cell runs forever without throwing any errors. At first I did not include the close method from the requests package, but then I thought the problem might be the open connection. However, including the close method also did not help. Do you know what could be the reason?
import requests

api_key = "exampletoken"
header = {'authorization': "Bearer {}".format(api_key)}
payload = {}

r = requests.post('exampleurl', headers=header, data=payload)
r.close()
Thanks in advance!
"runs forever without throwing any errors."
By default requests does not time out, so it can wait an infinite amount of time. This might cause the behavior you described and would mean the server did not respond. To figure out whether that is the cause, set a timeout, for example:
r = requests.post('exampleurl', headers=header, data=payload, timeout=180)
which will raise an exception after 180 seconds (i.e. 3 minutes) if it does not get a response. If you want to know more about timeouts in requests, I suggest reading the realpython.com tutorial.
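As a small follow-up sketch, catching the specific Timeout exception makes the failure visible in the notebook instead of leaving the cell hanging; the URL and token below are placeholders of my own, not values from the question:

import requests

try:
    r = requests.post("https://example.com/api",  # placeholder URL
                      headers={"authorization": "Bearer exampletoken"},
                      data={},
                      timeout=180)
    print(r.status_code)
except requests.exceptions.Timeout:
    print("The server did not respond within 180 seconds.")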

Why proxy doesn't work with python's requests?

I tried this code; the proxies are good, but I still see my own IP in the logger when using them. Please help; my requests version is 2.19.1. Some of my code:
for [...]:
    url = ''
    proxy = {'http': 'http://' + get_proxy()}  # works
    useragent = {'User-Agent': get_useragent()}
    try:
        r = requests.get(url, headers=useragent, proxies=proxy)
        print('sent')
    except:
        print('error')
So I can skip bad proxies, but the good ones don't work (I want to change my IP with them, but I actually see my own IP).
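One detail worth checking, offered as a sketch rather than a definite diagnosis: requests picks a proxy by the scheme of the URL being requested, so a proxies dict that only has an 'http' entry is ignored for https:// URLs and those requests go out directly with your own IP. Assuming a hypothetical get_proxy() helper like the one above:

import requests

def get_proxy():
    return "203.0.113.10:8080"  # hypothetical proxy address, for illustration only

proxy_addr = "http://" + get_proxy()
proxies = {"http": proxy_addr, "https": proxy_addr}  # cover both schemes

r = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10)
print(r.text)  # should show the proxy's IP rather than your own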

Why does requests.get() not raise when the server can't be found?

In the following code snippet, I know for a fact that https://asdasdasdasd.vm:8080/v2/api-docs does not exist. It fails a DNS lookup. Unfortunately, the get() never seems to return, raise, or time out. My logs have only "A" in them. I would expect A C D or A B D, but I only ever see A in the logs.
try:
    sys.stderr.write("A")
    resp = requests.get("https://asdasdasdasd.vm:8080/v2/api-docs", timeout=1.0)
    sys.stderr.write("B")
except:
    sys.stderr.write("C")
sys.stderr.write("D")
sys.stderr.flush()
return swag
(That URL is not sanitized for this post. That's actually the URL I'm trying to use while working on this question.)
What am I missing here?
EDIT - I have also tried specifying the timeout as (1.0,1.0) but the behavior did not change.
EDIT2 - Per suggestions below, I ran my code from the python and ipython consoles. The code behaves as I expect (ACD). Of course, in my real application, I am not running this code from the command line. I don't know how this matters, but the method containing the code is being invoked by a web service. Specifically, a Swagger endpoint. With my browser, I hit an endpoint that's supposed to return our Swagger documentation. The endpoint (which uses flask_swagger) invokes init_swagger(...). init_swagger() calls my method with a Swagger object. That's it. How this matters, I cannot say. It doesn't make any sense to me that something outside of my method should somehow be able to mess with my exception handling.
The only thing I can think of is that Swagger has jacked with the requests class. But now it is dinner time and I am going home.
The following code for me returns A, C, D
import requests
from requests.exceptions import ConnectionError

try:
    print("A")
    resp = requests.get("https://asdasdasdasd.vm:8080/v2/api-docs", timeout=1.0)
    print("B")
except ConnectionError:
    print("C")
print("D")
This is because the host cannot be resolved for me. If I swap it out for localhost...
resp = requests.get("http://localhost/v2/api-docs", timeout=1.0)
...then I see an A, followed by a period of time before C and D show.
From reading the comments, I know what is up...
builtins has a ConnectionError that can be used without importing anything. Requests doesn't use this exception; instead it uses the one found in requests.exceptions. If you wish to catch the ConnectionError, you must catch the correct exception, or it will drop out and not execute the except clause.
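A quick sketch to show that the two names really are unrelated classes (requests' ConnectionError derives from RequestException, not from the builtin of the same name), which is why catching the builtin one lets the requests exception propagate:

import builtins
import requests.exceptions

print(requests.exceptions.ConnectionError is builtins.ConnectionError)            # False
print(issubclass(requests.exceptions.ConnectionError, builtins.ConnectionError))  # False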

Module to control SSL handshake in python?

Is there a module to control the SSL handshake in Python, both client side and server side? Python's default ssl module is great, but it does the handshake automatically. I was wondering if there is a module that would allow me to do it manually, similar to this:
import SSLManuel
import socket

s = socket.socket()
s.connect(("server.com", 9999))
ClientHello = SSLManuel.generateClientHelloMessage(ssl=TLSv1_2, cipher="ECDHE-RSA-AES128-GCM-SHA256", server="www.server.com")
s.send(ClientHello)
ServerHello = s.recv()  # this would receive the server hello

# This would verify the certificate of the server
if SSLManuel.check_cert(ServerHello) == True:
    PreMasterKey = SSLManuel.generatePreMasterKey()
    ClientKeyExchange = SSLManuel.generateClientKeyExchange(PreMasterKey)
    ChangeCipherSpec = SSLManuel.generateChangeCipherSpec()
    ClientFinished = SSLManuel.generateClientFinished()
    SessionKey = SSLManuel.generateMasterKey(PreMasterKey)
    s.send(ClientKeyExchange)
    s.send(ChangeCipherSpec)
    s.send(ClientFinished)
    ServerFinished = s.recv()
    # This will check if the server is ready to communicate securely.
    if SSLManuel.checkServerFinished(ServerFinished) == True:
        # I can now use the SessionKey to encrypt data to and from the server
        s.send(SSLManuel.encrypt(SessionKey, "GET / HTTP/1.0\n\n"))
        response = s.recv()
        print(SSLManuel.decrypt(SessionKey, response))
I hope the naming conventions used in this example can help you understand what I'm trying to accomplish. Most of my knowledge of SSL comes from This Article. I have tried to write my own but have failed and I can't seem to find any module that will allow me to do this.
There are several pure-Python implementations of SSL/TLS. Any of them will allow you to do this:
https://github.com/pyca/tls
https://github.com/DinoTools/python-flextls
https://github.com/tomato42/tlslite-ng (maintained fork of https://github.com/trevp/tlslite)
As far as I understand your question, your aim is to improve your understanding of the protocol. I would personally use the last one (tlslite-ng) for this purpose, because it has extensive inline documentation. tlslite.tlsconnection.handshakeClientAnonymous is a good starting point for your investigation; the function eventually calls _handshakeClientAsyncHelper to perform the actual handshake.
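For orientation, a minimal client-side sketch with tlslite-ng: handshakeClientCert() drives the whole handshake (ClientHello through Finished) in one call, so stepping into it with a debugger lets you watch each of the messages from the question being built. The host name is just a placeholder, and this assumes tlslite-ng is installed:

import socket
from tlslite import TLSConnection

sock = socket.create_connection(("www.example.com", 443))
connection = TLSConnection(sock)
connection.handshakeClientCert()  # full TLS handshake as a client, no client certificate supplied
connection.send(b"GET / HTTP/1.0\r\nHost: www.example.com\r\n\r\n")
print(connection.recv(4096))
connection.close()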

Find out if the current machine is on aws in python

I have a Python script that runs on AWS machines, as well as on other machines.
The functionality of the script depends on whether or not it is on AWS.
Is there a way to programmatically discover whether or not it runs on AWS? (Maybe using boto?)
If you want to do that strictly using boto, you could do:
import boto.utils
md = boto.utils.get_instance_metadata(timeout=.1, num_retries=0)
The timeout specifies how long the HTTP client will wait for a response before timing out. The num_retries parameter controls how many times the client will retry the request before giving up and returning an empty dictionary.
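Putting that together, a short sketch of how the returned dictionary can be turned into the on/off-AWS check (the variable names are only illustrative):

import boto.utils

md = boto.utils.get_instance_metadata(timeout=.1, num_retries=0)
is_on_aws = len(md) > 0  # an empty dict means the metadata service was unreachable
print(is_on_aws)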
You can easily use the AWS SDK and check for an instance ID.
Besides that, you can check the AWS IP ranges; see this link:
https://forums.aws.amazon.com/ann.jspa?annID=1701
I found a way, using:
import requests

try:
    instance_id_resp = requests.get('http://169.254.169.254/latest/meta-data/instance-id')
    is_on_aws = True
except requests.exceptions.ConnectionError as e:
    is_on_aws = False
I tried some of the above, and when not running on Amazon I had trouble accessing 169.254.169.254. Maybe it has something to do with the fact that I'm outside the US.
In any case, here's a piece of code that worked for me:
def running_on_amazon():
    import urllib2
    import socket

    # I'm using curlmyip.com, but there are other websites that provide the same service
    ip_finder_addr = "http://curlmyip.com"
    f = urllib2.urlopen(ip_finder_addr)
    my_ip = f.read(100).strip()

    host_addr = socket.gethostbyaddr(my_ip)
    my_public_name = host_addr[0]
    amazon = (my_public_name.find("aws") >= 0)
    return amazon  # returns a boolean value
