Subdomain brute forcer in Python

I attempted to create a subdomain brute forcer in Python, but my code doesn't work. There is probably a better way to do it; I just need to be pointed in the right direction on how to go about it.
import sys
import socket
import requests

host = "paypal.com"
sublist = ["cpanel.", "admin.", "manager.", "secure."]

try:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    status = s.connect_ex((host, 80))
    if status == 0:
        print(host + " is up!")
    else:
        print(host + " is down!")
    s.close()
except socket.error:
    print(host + " is not reachable")

def checklist():
    try:
        for lines in sublist:
            check = requests.get("http://" + lines + host).status_code
            if check == 200:
                print("Found: " + lines + host)
    except Exception:
        print("Error")

checklist()
It just prints "Error" in the terminal, so I don't know whether it is actually checking the subdomains against the host.
How can I loop through the list, check each subdomain against the site to see if it is available, and then display it in the terminal?
The error I get without the except clause:
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='cpanel.paypal.com', port=443): Max retries exceeded with url: / (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -2] Name or service not known',))

The problem is that you're sending requests too fast; also, you can try using headers to disguise your Python request:
import time

try:
    time.sleep(1.5)
    check = requests.get("http://" + lines + host).status_code
except requests.exceptions.ConnectionError:
    check = "Connection refused by host"
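Putting that together, here is a minimal sketch of the whole loop (reusing the `host` and `sublist` from the question) that catches the connection error per request, so a subdomain that fails to resolve is skipped instead of aborting the entire run:

import time
import requests

host = "paypal.com"
sublist = ["cpanel.", "admin.", "manager.", "secure."]

for sub in sublist:
    url = "http://" + sub + host
    try:
        # pause between requests so the target is not hammered
        time.sleep(1.5)
        status = requests.get(url, timeout=5).status_code
        if status == 200:
            print("Found: " + sub + host)
    except requests.exceptions.ConnectionError:
        # DNS failure or refused connection: this subdomain does not respond, move on
        continue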

How can I handle a [WinError 10057] error in Python?

Exercise 1: Change the socket program socket1.py to prompt the user
for the URL so it can read any web page. You can use split('/') to
break the URL into its component parts so you can extract the host
name for the socket connect call. Add error checking using try and
except to handle the condition where the user enters an improperly
formatted or non-existent URL.
import socket

url = input('name:')
word = url.split('/')
host = word[2]
print(host)

try:
    mysock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    mysock.connect(('host', 80))
    mysock.send(('GET ' + url + ' HTTP/1.0\r\n\r\n').encode())
except:
    print("Try your best")

while True:
    data = mysock.recv(512)
    if len(data) < 1:
        break
    print(data.decode(), end='')
mysock.close()
OSError: [WinError 10057] A request to send or receive data was disallowed because the socket is not connected and (when sending on a datagram socket using a sendto call) no address was supplied
Would you please help me? Why does the code return this error, and how can I resolve it without adding any new functions?
mysock.connect(('host', 80))
Should be
mysock.connect((host, 80))
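For context: `'host'` (with quotes) is a literal five-character string, so the connect fails, the bare `except` swallows that failure, and the later `recv` then runs on a socket that was never connected, which is what produces WinError 10057. A minimal corrected sketch of the connect-and-send portion (same variable names as the question) would be:

mysock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
mysock.connect((host, 80))  # connect to the host parsed from the URL, not the literal string 'host'
mysock.send(('GET ' + url + ' HTTP/1.0\r\n\r\n').encode())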

HTTP proxy in Python: when and how exactly should I close the connection?

For a school project I need to write a proxy server in Python. My proxy server works and shows the page in the browser, but the problem is that it doesn't close connections after the page is shown and no more requests are sent. The problem happens specifically after CONNECT requests that start a proxy tunnel, so I don't know when I should close the connection between the client and the server.
When and how should I close the connection between them?
def get_data(sock):
    data = b''
    data_add = b'test'
    try:
        while len(data_add) != 0:
            # receive data from web server
            data_add = sock.recv(4096)
            data += data_add
    except Exception as e:
        print("2:" + str(e) + " ")
    return data

def handle_connect_command(client_socket, my_socket):
    request = b'test'
    try:
        send_data(client_socket, b'HTTP/1.1 200 OK\r\n\r\n')
        while True:
            request = get_data(client_socket)
            send_data(my_socket, request)
            response = get_data(my_socket)
            send_data(client_socket, response)
    except Exception as e:
        print("5:" + str(e))
    print("Connection lost")
    client_socket.close()
    my_socket.close()

def threaded(client_socket):
    my_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    request = b'Test'
    try:
        while len(request) > 0:
            # data received from client
            request = get_data(client_socket)
            web_server, port, command = analyze_request(request)
            print(web_server + ' ' + str(port))
            my_socket = connect_to_server(web_server, port)
            if command.lower() == "connect":
                handle_connect_command(client_socket, my_socket)
                break
            else:
                send_data(my_socket, request)
                response = get_data(my_socket)
                my_socket.close()
                send_data(client_socket, response)
    except Exception as e:
        print("6:" + str(e))
    # connection closed
    client_socket.close()
    my_socket.close()
Assuming you're using plain sockets, you can simply run:
server.quit()
or
session.close()
if you're using requests, after creating a server object.
The matter of WHEN to close the connection is something we would need to see your code for; logically, you would close the connection when no more interaction needs to take place.
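As a concrete illustration of that last point in the question's own terms (a sketch only, reusing the question's `get_data` and `send_data` helpers): one common approach is to treat an empty read from either side of the CONNECT tunnel as "no more interaction" and close both sockets at that point.

def handle_connect_command(client_socket, my_socket):
    try:
        send_data(client_socket, b'HTTP/1.1 200 OK\r\n\r\n')
        while True:
            request = get_data(client_socket)
            if not request:          # client closed its side of the tunnel
                break
            send_data(my_socket, request)
            response = get_data(my_socket)
            if not response:         # web server closed its side of the tunnel
                break
            send_data(client_socket, response)
    finally:
        # no more interaction: shut down both ends of the tunnel
        client_socket.close()
        my_socket.close()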
If you are using the Requests library (and you should), you can do this:
with requests.Session() as session:
    session.get('target_url')
This will close the connection automatically when everything in the with block completes.

Check internet availability before sending the data to database

I am new to Python 2.7. I am writing a program where I need to check the availability of the internet connection on my Wi-Fi (sometimes it disconnects) before I proceed to send data to the database. Sending the data to the database should be skipped if there is no internet connection. How can I do that? Is the following the correct way of doing this?
import urllib

# Perhaps check internet availability first
try:
    import httplib
except ImportError:
    import http.client as httplib

def have_internet():
    conn = httplib.HTTPConnection("www.google.com", timeout=5)
    try:
        conn.request("HEAD", "/")
        conn.close()
        return True
    except:
        conn.close()
        return False

# send data to database
data = {'date': date_mmddyyyy, 'time': time_hhmmss, 'airtemperature': temperature_c,
        'humidity': humidity_c, 'watertemperature': watertemp_c, 'phsolution': pHvalue_c,
        'waterlevel': distance_c, 'CO2 concentration': CO2_c, 'TDS value': tds_c}
result = firebase.put('Region 1', 'Parameter Reading', data)
result2 = requests.post(firebase_url + '/' + reading_location + '/History/Parameter Reading.json',
                        data=json.dumps(data))
print('All parameter records are inserted.\nResult Code = ' + str(result2.status_code) + ',' + result2.text)
I've used the requests module for this.
In the event of a network problem (e.g. DNS failure, refused connection, etc), Requests will raise a ConnectionError exception.
So you could do the following:
import requests

def is_connected():
    try:
        requests.get("http://google.com", timeout=5)
        return True
    except requests.exceptions.ConnectionError:
        return False
Note that it may raise other exceptions, but this should be enough to start.
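As a rough usage sketch tied back to the question (the names `data`, `firebase`, `firebase_url`, and `reading_location` come from the question's code and are assumed to already be defined), you could guard the database write with that check and skip it when the connection is down:

if is_connected():
    # connection looks fine: push the readings to Firebase
    result = firebase.put('Region 1', 'Parameter Reading', data)
    result2 = requests.post(firebase_url + '/' + reading_location + '/History/Parameter Reading.json',
                            data=json.dumps(data))
    print('All parameter records are inserted. Result Code = ' + str(result2.status_code))
else:
    # no internet connection: skip the database update this cycle
    print('No internet connection, skipping database update.')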
As suggested by @FranciscoCouzo, you can just try the connection and see what happens. But suppose you want a smaller sanity check before even delving into the database portion of your code. If you know the port number of your database server (1433, for instance), you can try a connect and then reset that connection. You still have to deal with losing your Wi-Fi connection as you work, but this is a lightweight way to know it's okay to start.
import socket
import struct

def is_alive(host, port):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.settimeout(5)
        s.connect((host, port))
        # SO_LINGER with a zero timeout makes close() reset the connection immediately
        s.setsockopt(socket.SOL_SOCKET, socket.SO_LINGER,
                     struct.pack("ii", 1, 0))
    except OSError:
        return False
    finally:
        s.close()
    return True

print(is_alive("example.com", 1433))

Cache Proxy Server in Python

I have a homework assignment which involves implementing a proxy cache server in Python. The idea is to write the web pages I access to temporary files on my local machine and then access them as requests come in if they are stored. Right now the code looks like this:
from socket import *
import sys

def main():
    # Create a server socket, bind it to a port and start listening
    tcpSerSock = socket(AF_INET, SOCK_STREAM)  # Initializing socket
    tcpSerSock.bind(("", 8030))                # Binding socket to port
    tcpSerSock.listen(5)                       # Listening for page requests

    while True:
        # Start receiving data from the client
        print 'Ready to serve...'
        tcpCliSock, addr = tcpSerSock.accept()
        print 'Received a connection from:', addr
        message = tcpCliSock.recv(1024)
        print message

        # Extract the filename from the given message
        print message.split()[1]
        filename = message.split()[1].partition("/")[2]
        print filename
        fileExist = "false"
        filetouse = "/" + filename
        print filetouse

        try:  # Check whether the file exists in the cache
            f = open(filetouse[1:], "r")
            outputdata = f.readlines()
            fileExist = "true"
            # ProxyServer finds a cache hit and generates a response message
            tcpCliSock.send("HTTP/1.0 200 OK\r\n")
            tcpCliSock.send("Content-Type:text/html\r\n")
            for data in outputdata:
                tcpCliSock.send(data)
            print 'Read from cache'
        except IOError:  # Error handling for file not found in cache
            if fileExist == "false":
                c = socket(AF_INET, SOCK_STREAM)  # Create a socket on the proxy server
                hostn = filename.replace("www.", "", 1)
                print hostn
                try:
                    c.connect((hostn, 80))  # https://docs.python.org/2/library/socket.html
                    # Create a temporary file on this socket and ask port 80 for
                    # the file requested by the client
                    fileobj = c.makefile('r', 0)
                    fileobj.write("GET " + "http://" + filename + "HTTP/1.0\r\n")
                    # Read the response into buffer
                    buffr = fileobj.readlines()
                    # Create a new file in the cache for the requested file.
                    # Also send the response in the buffer to client socket and the
                    # corresponding file in the cache
                    tmpFile = open(filename, "wb")
                    for data in buffr:
                        tmpFile.write(data)
                        tcpCliSock.send(data)
                except:
                    print "Illegal request"
            else:  # File not found
                print "404: File Not Found"

        tcpCliSock.close()  # Close the client and the server sockets

main()
To test my code, I run the proxy cache on my localhost and set my browser proxy settings accordingly.
However, when I run this code and try to access Google with Chrome, I'm greeted with an error page saying ERR_EMPTY_RESPONSE.
Stepping through the code with the debugger made me realize it's failing on this line:
c.connect((hostn, 80))
and I have no idea why. Any help would be greatly appreciated.
P.S. I'm testing this with Google Chrome, Python 2.7, and Windows 10
You cannot use a name on connect. Connect expects an IP address to connect to.
You can get the socket information you need to build the connection using getaddrinfo(). In my pure-python-whois package I used the following code to create a connection:
def _openconn(self, server, timeout, port=None):
    port = port if port else 'nicname'
    try:
        for srv in socket.getaddrinfo(server, port, socket.AF_UNSPEC,
                                      socket.SOCK_STREAM, 0, socket.AI_ADDRCONFIG):
            af, socktype, proto, _, sa = srv
            try:
                c = socket.socket(af, socktype, proto)
            except socket.error:
                c = None
                continue
            try:
                if self.source_addr:
                    c.bind(self.source_addr)
                c.settimeout(timeout)
                c.connect(sa)
            except socket.error:
                c.close()
                c = None
                continue
            break
    except socket.gaierror:
        return False
    return c
Note that this isn't great code: the loop doesn't really make use of the different address alternatives, and you should only break out of the loop once you have established a connection. However, it should work as an illustration of how to use getaddrinfo().
EDIT:
You are also not cleaning your hostname correctly. I get /www.example.com/ when I try accessing http://www.example.com/, which obviously won't resolve. I'd suggest using a regular expression to get the host and file name for your cache.
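As an illustration of that suggestion (a sketch only; the pattern and the `request_line` variable are my own, not part of the original answer), you could pull the host and path out of the request line like this:

import re

# e.g. request_line = "GET http://www.example.com/index.html HTTP/1.1"
match = re.match(r'^\w+\s+(?:https?://)?([^/\s]+)(/\S*)?', request_line)
if match:
    hostn = match.group(1)        # "www.example.com" - use for the connect call
    path = match.group(2) or '/'  # "/index.html" - use as the cache file name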

Empty response from server when connecting with python socket

I am trying to connect to URL https://www.ssehl.co.uk/HALO/publicLogon.do in Python.
The simple solution using requests fails:
import requests
r = requests.get('https://www.ssehl.co.uk/HALO/publicLogon.do')
print r.text
with this error:
File "c:\Python27\lib\site-packages\requests\adapters.py", line 327, in send
raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='www.ssehl.co.uk', port=443): Max retries exceeded with url: /HALO/publicLogon.do (Caused by <class 'httplib.BadStatusLine'>: '')
so I tried to get the raw response from the server using the socket library:
import socket  # for sockets
import sys     # for exit

# create an INET, STREAMing socket
try:
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
except socket.error:
    print 'Failed to create socket'
    sys.exit()

print 'Socket Created'

host = 'www.ssehl.co.uk'
port = 443

try:
    remote_ip = socket.gethostbyname(host)
except socket.gaierror:
    # could not resolve
    print 'Hostname could not be resolved. Exiting'
    sys.exit()

# Connect to remote server
s.connect((remote_ip, port))
print 'Socket Connected to ' + host + ' on ip ' + remote_ip

# Send some data to remote server
message = "GET /HALO/publicLogon.do HTTP/1.1\r\n\r\n"
try:
    # Set the whole string
    s.sendall(message)
except socket.error:
    # Send failed
    print 'Send failed'
    sys.exit()

print 'Message send successfully'

# Now receive data
reply = s.recv(4096)
print reply
will output:
Socket Created
Socket Connected to www.ssehl.co.uk on ip 161.12.7.194
Message send successfully
Reply:
After the reply there is some garbage which I can't paste; however, here is a Sublime console screenshot:
Screenshot
Is there any way to get a 200 response from the server, just like a browser?
For some reason, when you use either Python's built-in tools (urllib2, requests, httplib) or even command-line tools (curl, wget) over HTTPS, the server misbehaves and gives an erroneous response.
However when you request the page over regular http, it works fine, for example:
import urllib2
print urllib2.urlopen('http://www.ssehl.co.uk/HALO/publicLogon.do').getcode()
prints out
>> 200
My guess is that their servers are configured wrong and your browser somehow deals with it silently.
It worked for me when I used port 80. So:
port = 80
There must be some problem when talking to HTTPS servers this way through Python...
Also, you are sending the wrong request: you are not sending the Host header. Fixed request:
message = "GET /HALO/publicLogon.do HTTP/1.1\r\nHost: %s\r\n\r\n" % host
So here is working code.
I think the problem exists because port 443 is encrypted, and the plain socket code does not perform any TLS handshake.
You should use ssl.wrap_socket if you want to support HTTPS.
See http://docs.python.org/2/library/ssl.html for details.
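A minimal sketch of how the raw-socket request from the question could be made over TLS (this uses the newer ssl.create_default_context() API rather than the older ssl.wrap_socket mentioned above, and Python 3 syntax):

import socket
import ssl

host = 'www.ssehl.co.uk'
port = 443

context = ssl.create_default_context()
with socket.create_connection((host, port), timeout=10) as raw_sock:
    # wrap the plain TCP socket in TLS; server_hostname enables SNI and certificate checks
    with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
        request = "GET /HALO/publicLogon.do HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n" % host
        tls_sock.sendall(request.encode())
        reply = b''
        while True:
            chunk = tls_sock.recv(4096)
            if not chunk:
                break
            reply += chunk
print(reply.decode('utf-8', errors='replace'))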
