Is there a way in Python to check the HTTP response codes (200, 301, 404) returned by every host in a specified IP range (1.1.1.1 - 1.1.1.254)? Could it even be done multi-threaded?
P.S. I found out that the status code is available via the "HTTPResponse.status" attribute (http://docs.python.org/library/httplib.html), but how would I check a whole IP range with it?
P.P.S. Maybe it would be a good idea to first check whether port 80 is open and only test the hosts with an open port. I think that would speed things up considerably, because out of 254 IPs perhaps only 30 have port 80 open.
You can simply try to issue a normal GET request for the root path of each host, with a short timeout (or a longer one if you want it to wait more), and then run the addresses through a process pool's map.
import httplib
from multiprocessing import Pool

def test_ip(addr):
    conn = httplib.HTTPConnection(addr, timeout=1)
    try:
        conn.request("GET", "/")
    except Exception:
        # Report any connection failure as 408 (request timeout).
        return addr, httplib.REQUEST_TIMEOUT
    else:
        resp = conn.getresponse()
        return addr, resp.status
    finally:
        conn.close()

p = Pool(20)  # 20 worker processes
results = p.map(test_ip, ["1.1.1.%d" % d for d in range(1, 255)], chunksize=10)
print results
# [('1.1.1.1', 408), ('1.1.1.2', 408), ...]
Adjust Pool size and chunksize to suit.
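If you want to follow up on the port-80 idea from the question, here is a rough sketch that pre-filters the range before the status check. The helper name port_80_open and the 0.5-second timeout are my own choices, not part of the answer above; p and test_ip are reused from the snippet above.
import socket

def port_80_open(addr):
    # Plain TCP connect with a short timeout; True only if something accepts on port 80.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(0.5)
    try:
        s.connect((addr, 80))
        return True
    except socket.error:
        return False
    finally:
        s.close()

candidates = ["1.1.1.%d" % d for d in range(1, 255)]
open_flags = p.map(port_80_open, candidates, chunksize=10)  # reuse the Pool from above
open_hosts = [addr for addr, is_open in zip(candidates, open_flags) if is_open]
results = p.map(test_ip, open_hosts, chunksize=10)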
I have a Python script which retrieves the measured data from a smart plug so that I can visualize it on my Raspberry Pi.
This call fetches the data:
send_hs_command("192.168.1.26", 9999, b'{"emeter":{"get_realtime":{}}}')
and this is the function definition:
def send_hs_command(address, port, cmd):
    data = b""
    tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        tcp_sock.connect((address, port))
        tcp_sock.send(encrypt(cmd))
        data = tcp_sock.recv(2048)
    except socket.error:
        print(time.asctime(time.localtime(time.time())), "Socket closed.", file=sys.stderr)
    finally:
        tcp_sock.close()
    return data
My problem is that if I take the Smart Plug somewhere else, it gets a new IP address, which means I have to keep editing it in my Python script. This is not an option for me. What would be the simplest solution? Thanks
I don't have a Pi to run this on.
If the IP address of the target (the Smart Plug) is variable, can you not use a predetermined host name (the one stored in its '/etc/hostname') instead?
The socket library provides a few handy functions for this.
You can first use gethostbyaddr to get the host name if you don't already have that information. From that point onward you can use the known host name with create_connection to establish connections.
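A minimal sketch of that idea, assuming the plug's name actually resolves on your network (for example, the router registers it in local DNS) and reusing the encrypt helper from the question:
import socket

# One-time lookup while the plug's current address is still known.
hostname, _aliases, _addresses = socket.gethostbyaddr("192.168.1.26")

# Later, connect by name instead of by a hard-coded IP.
with socket.create_connection((hostname, 9999), timeout=5) as tcp_sock:
    tcp_sock.send(encrypt(b'{"emeter":{"get_realtime":{}}}'))
    data = tcp_sock.recv(2048)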
However, if you want something more dynamic, I'd suggest using the MAC address as the key.
Please be advised that running scapy (which may depend on tcpdump) on a Raspberry Pi can be CPU intensive.
Please take a look at the following snippet:
import socket
import time
import sys
from scapy.all import *

def send_hs_command(address, port, cmd):
    data = b""
    tcp_sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        tcp_sock.connect((address, port))
        tcp_sock.send(encrypt(cmd))
        data = tcp_sock.recv(2048)
    except socket.error:
        print(time.asctime(time.localtime(time.time())), "Socket closed.", file=sys.stderr)
    finally:
        tcp_sock.close()
    print(data)
    return data

def get_ip_from_mac():
    # Match ARP requests
    packet_list = sniff(filter="arp", count=10)  # increase number of arp counts
    for i in packet_list:
        # Show all ARP requests
        # print(i[Ether].src, "is broadcasting IP", i[ARP].psrc)
        if i[ARP].hwsrc == '00:0c:29:b6:f4:be':  # target MAC address
            return (True, i[ARP].psrc)
    return (False, '')

def main():
    result = get_ip_from_mac()
    if result[0] == True:
        print("Succeeded to reach server")
        send_hs_command(result[1], 22, b'{"emeter":{"get_realtime":{}}}')
    else:
        # logic to retry or graciously fail
        print("Failed to reach server")

if __name__ == "__main__":
    main()
I am new to Python 2.7. I am writing a program where I need to check whether my Wi-Fi has internet access (sometimes the connection drops) before I send data to the database over the internet. The database write should be skipped if there is no internet connection. How can I do that? Is the way I'm doing it below correct?
import urllib

# Perhaps check internet availability first
try:
    import httplib
except:
    import http.client as httplib

def have_internet():
    conn = httplib.HTTPConnection("www.google.com", timeout=5)
    try:
        conn.request("HEAD", "/")
        conn.close()
        return True
    except:
        conn.close()
        return False
# send data to database
data = {'date': date_mmddyyyy, 'time': time_hhmmss, 'airtemperature': temperature_c,
        'humidity': humidity_c, 'watertemperature': watertemp_c, 'phsolution': pHvalue_c,
        'waterlevel': distance_c, 'CO2 concentration': CO2_c, 'TDS value': tds_c}
result = firebase.put('Region 1', 'Parameter Reading', data)
result2 = requests.post(firebase_url + '/' + reading_location + '/History/Parameter Reading.json',
                        data=json.dumps(data))
print 'All parameter records are inserted.\nResult Code = ' + str(result2.status_code) + ',' + result2.text
I've used the requests module for this.
In the event of a network problem (e.g. DNS failure, refused connection, etc), Requests will raise a ConnectionError exception.
So you could do the following:
import requests

def is_connected():
    try:
        requests.get("http://google.com", timeout=5)
        return True
    except requests.exceptions.ConnectionError:
        return False
Note that it may raise other exceptions, but this should be enough to start.
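Applied to the code in the question, a rough sketch of the guard might look like this, assuming firebase, firebase_url, reading_location, data, json, and requests are the objects already set up in your script:
# Guard the database write with the connectivity check above.
if is_connected():
    result = firebase.put('Region 1', 'Parameter Reading', data)
    result2 = requests.post(firebase_url + '/' + reading_location + '/History/Parameter Reading.json',
                            data=json.dumps(data))
else:
    # No internet: skip this round and try again on the next reading.
    print('No internet connection, skipping this upload.')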
As suggested by @FranciscoCouzo, you can just attempt the connection and see what happens. But suppose you want a smaller sanity check before even delving into the database portion of your code. If you know the port number of your database server (1433, for instance), you can try a connect and then reset that connection. You still have to deal with losing your Wi-Fi connection as you work, but this is a lightweight way to know it's okay to start.
import socket
import struct

def is_alive(host, port):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.settimeout(5)
        s.connect((host, port))
        # SO_LINGER with a zero timeout makes close() reset the connection (RST)
        # instead of performing a full shutdown handshake.
        s.setsockopt(socket.SOL_SOCKET, socket.SO_LINGER,
                     struct.pack("ii", 1, 0))
    except socket.error:  # socket.error also works on Python 3, where it aliases OSError
        return False
    finally:
        s.close()
    return True

print(is_alive("example.com", 1433))
I am writing a small Python script for checking haproxy. What the script does is connect to the haproxy socket and "poll" for stats.
#!/usr/bin/env python
import socket
import sys

my_socket = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
try:
    my_socket.connect("/var/run/haproxy/haproxy.sock")
except socket.error:
    print "cant connect to socket"
    sys.exit(1)

my_socket.send("show stat\n")
response = my_socket.recv(1024)
print response
What I wish to do is exit the script with exit code 1 if there is no response from the socket, meaning haproxy does not output the stats. Is it possible to somehow evaluate whether an answer has been received?
By default the socket will be in blocking mode and recv() will block until data is received or the connection is closed.
If you can assume that the proxy will respond within a certain amount of time, you can set a timeout on the client socket. The timeout is the number of seconds to wait for a socket operation to complete; if the operation has not completed by then, an exception is raised:
my_socket.settimeout(5.0)  # 5 seconds. Set this after connecting.
try:
    response = my_socket.recv(1024)
    print response
except socket.timeout:
    print 'timed out waiting for response from proxy'
    my_socket.close()
    sys.exit(1)
That's one way and it's probably the easiest way. You could also look at the select() module which provides functions that will let your client wait for the socket to become readable, which indicates that there is data to be read, or that the socket has been closed. It really depends on what behaviour you want. Example using select():
import select

r, _, _ = select.select([my_socket], [], [], 5.0)
if r:
    response = my_socket.recv(1024)
    print response
else:
    print 'Nothing received from proxy in 5 seconds'
    my_socket.close()
    sys.exit(1)
My requirement is to generate hundreds of HTTP POST requests per second. I am doing it using urllib2.
def send():
    req = urllib2.Request(url)
    req.add_data(data)
    response = urllib2.urlopen(req)

while datetime.datetime.now() <= ftime:
    p = Process(target=send, args=[])
    p.start()
    time.sleep(0.001)
The problem is that on some iterations this code throws one of the following exceptions:
HTTP 503 Service Unavailable.
URLError: <urlopen error [Errno -2] Name or service not known>
I have tried using requests (HTTP for Humans) as well, but I am having proxy issues with that module: requests seems to send HTTP packets to the proxy server even when the target machine is on the same LAN, and I don't want packets to go through the proxy server.
The simplest way to limit the number of concurrent connections is to use a thread pool:
#!/usr/bin/env python
from itertools import izip, repeat
from multiprocessing.dummy import Pool  # use threads for I/O bound tasks
from urllib2 import urlopen

def fetch(url_data):
    try:
        return url_data[0], urlopen(*url_data).read(), None
    except EnvironmentError as e:
        return url_data[0], None, str(e)

if __name__ == "__main__":
    pool = Pool(20)  # use 20 concurrent connections
    params = izip(urls, repeat(data))  # use the same data for all urls; urls and data come from your own code
    for url, content, error in pool.imap_unordered(fetch, params):
        if error is None:
            print("done: %s: %d" % (url, len(content)))
        else:
            print("error: %s: %s" % (url, error))
503 Service Unavailable is a server-side error; the server might be failing to handle the load.
Name or service not known is a DNS error. If you need to make many requests, install/enable a local caching DNS server.
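If the 503s come from overload, one option is to retry them with a short backoff inside the worker. This is only a sketch; the fetch_with_retry name, retry count, and delay are my own choices, not part of the answer above:
import time
from urllib2 import urlopen, HTTPError

def fetch_with_retry(url_data, retries=3, delay=1.0):
    # Retry only on 503; other errors are reported immediately.
    for attempt in range(retries):
        try:
            return url_data[0], urlopen(*url_data).read(), None
        except HTTPError as e:
            if e.code != 503 or attempt == retries - 1:
                return url_data[0], None, str(e)
            time.sleep(delay * (attempt + 1))  # simple linear backoff
        except EnvironmentError as e:
            return url_data[0], None, str(e)
You would then pass fetch_with_retry to pool.imap_unordered in place of fetch.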
I'm not able to connect to the server; it prints "Connecting to ports..." and then just says "Sockets timed out."
My program is due tomorrow and it'd be nice to have this actually work.
EDITED CODE: Now it only prints "Connecting to ports...", nothing else.
import socket, string, time, random, re, urllib2, cookielib, smtplib, os

class Pibot:  # main class
    def __init__(self):  # basic information to allow for the rest of the program to work.
        self.server = 'irc.evilzone.org'
        self.port = 6667
        self.botname = 'pibot'
        self.chan = 'test'
        self.owner = 'Josh.H'
        self.nick = "bawt"
        self.irc = None
        self.data = ''

    def iConnect(self):  # tries to connect to the server and allows the user to see if it failed to connect.
        print ("Connecting to ports...")
        print self.data
        time.sleep(3)
        try:
            self.irc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            self.irc.connect((self.server, self.port))
        except (socket.error, socket.herror, socket.gaierror):
            print "Failed to connect to Ports"

    def iStart(self):
        # Not guaranteed to send all your data, isn't checking the return values
        # however this function iStart is used to send the NICK of the bot and the USER to the server through particle data
        # it then auto joins the channel
        # in future development I'd like to get acquainted with Twisted or IRCutils as they allow it to be quite powerful and less buggy
        self.irc.send('NICK %s\r\n' % self.nick)
        self.irc.send("USER %s %s bla :%s\r\n" % ("Ohlook", 'itsnotmy', 'Realname'))
        time.sleep(4)
        self.irc.send("JOIN #%s\r\n" % self.chan)
        self.data = self.irc.recv(4096)

    def MainLoop(self, iParse=0):  # MainLoop is used to make the commands executable ie !google !say etc;
        try:
            while True:
                # This method sends a ping to the server and if it pings it will send a pong back
                # in other clients they keep receiving till they have a complete line however mine does not as of right now
                # The PING command is used to test the presence of an active client or
                # server at the other end of the connection. Servers send a PING
                # message at regular intervals if no other activity detected coming
                # from a connection. If a connection fails to respond to a PING
                # message within a set amount of time, that connection is closed. A
                # PING message MAY be sent even if the connection is active.
                # PONG message is a reply to PING message. If parameter <server2> is
                # given, this message will be forwarded to given target. The <server>
                # parameter is the name of the entity who has responded to PING message
                # and generated this message.
                self.data = self.irc.recv(4096)
                if self.data.find('PING') != -1:
                    self.irc.send(( "PONG %s \r\n" ) % (self.recv.split() [ 1 ])) #Possible overflow problem
                if self.data.find("!google") != -1:
                    # googles the search term and displays the first 5 results
                    # format = !google: <Search Term>
                    # One thing that I noticed is that it will print on a separate line without the header
                    # In the next Update I would have fixed this.
                    fin = data.split(':')
                    if not fin:
                        irc.send("PRIVMSG #%s :syntax'^google :search term\r\n'" % chan)
                    else:
                        # In the next version to avoid overflow I will create another if statement and edit the search code
                        # However I am using what xgoogle has recommended.
                        fin = fin[3].strip()
                        gs = GoogleSearch(fin)
                        gs.results_per_page = 5
                        results = gs.get_results()
                        for result in results:
                            irc.send("PRIVMSG #%s :%s\r\n" % (chan, result.url.encode("utf8")))
                ###############################################################################################################################
                # No exception checking here, these functions can and will fail in time and in later versions will need to be edited.
                # If hellboundhackers changes this code may break
                # This function takes a quote from the header of hellboundhackers
                # it first looks at the header of the User agent then the header of the website (HBH) and reads it then prints
                # the quote when QUOTEM is recognized in the irc closes the connection to the website and deletes the cookie
                ###############################################################################################################################
                if "QUOTEM" in self.data:
                    # Pulls a quote from HBH
                    cj = cookielib.CookieJar()
                    opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cj))
                    opener.addheaders.append(('User-agent', 'Mozilla/4.0'))
                    opener.addheaders.append(('Referer', 'http://www.hellboundhackers.org/index.php'))
                    resp = opener.open('http://www.hellboundhackers.org/index.php')
                    r = resp.read()
                    resp.close()
                    del cj, opener
                    da = re.findall("Enter; width:70%;'>(.*)", r)
                    self.irc.send("PRIVMSG #%s :%s\r\n" % (chan, da[0]))  # Note Possible overflow
                if "!whoareyou" in self.data:
                    # bot info allows users on IRC to see which commands are currently working
                    self.irc.send("PRIVMSG #%s :I am %s, I was created By:%s \r\n" % (self.chan, self.nick, self.owner))
                    self.irc.send("PRIVMSG #%s :I was written in Python 27, and edited with IDLE\r\n" % self.chan)
                    self.irc.send("PRIVMSG #%s :The Classes used are socket, string, time, re, urllib2, cookielib\r\n" % self.chan)
                    self.irc.send("PRIVMSG #%s :As well as some functions from various other sources(supybot,twisted,xgoogle)\r\n" % self.chan)
                    self.irc.send("PRIVMSG #%s :type ^commands for a list of things I can do\r\n" % self.chan)
        except (socket.error, socket.timeout):
            print "Sockets timed out."

bot = Pibot()
bot.iConnect()
bot.MainLoop()
Side note: no errors are raised.
Any help is greatly appreciated. Also, I am just learning, so don't flame me. :(
EDIT 2: I have fixed most of the problems and am now getting this error:
Traceback (most recent call last):
  File "L:\txtbot.py", line 119, in <module>
    bot.MainLoop()
  File "L:\txtbot.py", line 64, in MainLoop
    self.irc.send(( "PONG %s \r\n" ) % (self.recv.split() [ 1 ])) #Possible overflow problem
AttributeError: Pibot instance has no attribute 'recv'
It seems you're never passing the connection information to the socket:
self.irc = socket.socket()
I think it should be something like this:
self.irc = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.irc.connect((self.server, self.port))
In iConnect you're just creating a socket, not connecting it to the server. You need to use socket.create_connection.
Lumping together socket.error and socket.timeout is also not a good idea, as it can be misleading when debugging. And you should print the actual error, not just a generic message; it will help you figure out what's wrong.
You don't call iStart anywhere. If I remember my IRC correctly, you need to send your nick information before it will send you any data back.
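For illustration only, a rough sketch of what iConnect could look like with socket.create_connection and a real error message (the 10-second timeout and the wording are my own choices, not from the original post):
def iConnect(self):
    print "Connecting to %s:%d..." % (self.server, self.port)
    try:
        # create_connection resolves the host, connects, and applies the timeout in one call.
        self.irc = socket.create_connection((self.server, self.port), timeout=10)
    except socket.error as e:
        print "Failed to connect: %s" % e  # print the real error, not a generic message
        raise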