How to run code blocks in parallel in Python

I am testing a large number of devices, so I am writing a testing program to automate the task.
Please bear with me, because I am new to Python and to programming in general.
I want to be able to run the try and except blocks below in parallel, so that I don't have to wait 10 seconds for every new device that gets checked (picture over 1000 devices).
so basically:
try:
    preflash = urllib.request.urlopen("http://10.10.10.2", timeout=10).getcode()
    print("Web page status code:", preflash)
    print("FAIL")
    sys.exit(0)
except urllib.error.URLError:
    correct = urllib.request.urlopen("http://192.168.100.5", timeout=10).getcode()
    print("Web page status code:", correct)
    print("IP address: 192.168.100.5 is reachable")
These two checks should run in parallel: if 10.10.10.2 is reachable first, the program shuts down, but if 192.168.100.5 is reachable first, the program should continue (see the sketch after the reference code below).
Here is the full program, just for your reference.
Any help is much appreciated, and please go easy on me, cheers!
print(100*"#")
try:
preflash = urllib.request.urlopen("http://10.10.10.2", timeout=10).getcode()
print("Web page status code:", preflash)
print("FAIL")
sys.exit(0)
except urllib.error.URLError:
correct = urllib.request.urlopen("http://192.168.100.5", timeout=10).getcode()
print("Web page status code:", correct)
print("IP address: 192.168.100.5 is reachable")
print(100*"#")
print("Fetching device variables")
time.sleep(3)
print(100*"-")
# Declare url String
url_str = 'http://192.168.100.2/globals.xml'
# open webpage and read values
xml_str = urllib.request.urlopen(url_str).read()
# Parses XML doc to String for Terminal output
xmldoc = minidom.parseString(xml_str)
# Finding the neccassary Set points/ Sollwerte from the xmldoc
time.sleep(0.5)
# prints the order_number from the xmldoc
order_number = xmldoc.getElementsByTagName('order_number')
odr_nmr = order_number[0].firstChild.nodeValue
print("The Order number of the current device is:", odr_nmr)
print(100*"-")

Related

Selenium testing connection

What my code is supposed to do (run inside Docker): my code uses the Mysterium network (pretty much just a VPN, but decentralized) to generate a list of active nodes, then cycles through each one and runs my web scraper. If a node fails the Have_Internet function, that node gets added to the blacklist.
Issue 1: I want to generate a new node -> test the connection (either a simple ping test or a short network-strength test) -> run Selenium. I only want to run Selenium if the network test passes; otherwise, blacklist that node and try again. I cannot run the Have_Internet function in an else clause, because then it never runs for some reason. I read that the else clause only runs if the try succeeds, which is what I want, but it never ran, so I took it out.
Issue 2: I keep checking my blacklist file and it never has anything in it, so either the code block responsible for writing it fails or my network test is garbage.
Issue 3: I used some bash commands in my Python code because I could not figure out a simple alternative. If you could turn them into Python lines, I would be very happy (see the sketch after the code below).
PS: Any other general feedback is more than welcome. I am still pretty new to Python. :)
Cheers in advance.
def Run_Selenium():
    # Doing stuff
    pass

def Random_Node():
    # Doing stuff
    pass

def Have_Internet():
    # httplib is the Python 2 name; on Python 3 use: import http.client as httplib
    conn = httplib.HTTPSConnection("8.8.8.8", timeout=5)
    try:
        conn.request("HEAD", "/")
        return True
    except Exception:
        return False
    finally:
        conn.close()

if __name__ == "__main__":
    while True:
        # Get a new node
        random_node = Random_Node()
        Temp_Random_Node = Random_Node()
        try:
            print("Running new myst session", flush=True)
            os.system("/opt/myst/myst connection down")  # stops the VPN
            #time.sleep(5)
            #New_Myst_Command = () # makes node command
            os.system("/opt/myst/myst connection up " + Temp_Random_Node)  # starts a new connection with the node
            #time.sleep(5) # time to wait while the dVPN starts
        # Catch errors when connecting to the new node
        except Exception as e:
            print(e, flush=True)
            print("Failed to connect to " + Temp_Random_Node + ", moving on anyway and adding it to the blacklist", flush=True)
            os.system("echo " + Temp_Random_Node + " >> /root/Dockerdata/blacklist.txt")
        Connected = Have_Internet()
        if Connected:
            print("Going to try to run selenium", flush=True)
            time.sleep(random.randrange(1, 20))  # 4, 305 # amount of time to wait before loading the page again
            try:
                Run_Selenium()
                print("Just ran a successful run of selenium", flush=True)
            except Exception as e:
                print("Selenium failed")
                print(e, flush=True)
        else:
            print("Failed network check")
            os.system("echo " + random_node + " > /root/Dockerdata/blacklist.txt")  # note: > truncates the file, >> appends

How to stop a Python program if a certain URL is reachable

Can anybody help me with this small dilemma? I want to stop the Python program if the IP address 10.10.10.2 is reachable WITHIN 10 SECONDS. If it is not reachable within 10 seconds, the program should handle the exception and continue. If 10.10.10.2 is reachable, it should print "This IP address is reachable, you are using the wrong device, please disconnect". I thought about putting a sys.exit(1) after the except, but I'm constantly getting errors. I am very new to Python, or any programming language for that matter, so any example snippets and help are much appreciated.
import pandas as pd
from xml.dom import minidom
import urllib.request
import time
from urllib.error import HTTPError

print(100*"#")
try:
    preflash = urllib.request.urlopen("http://10.10.10.2", timeout=10).getcode()
    print("Web page status code:", preflash)
    print("IP address: 10.10.10.2 is reachable")
except urllib.error.URLError:
    correct = urllib.request.urlopen("http://192.168.100.5", timeout=10).getcode()
    print("Web page status code:", correct)
    print("IP address: 192.168.100.5 is reachable")
print(100*"#")
# Declare the URL string
url_str = 'http://192.168.100.2/globals.xml'
# Open the web page and read its values
xml_str = urllib.request.urlopen(url_str).read()
# Parse the XML doc for terminal output
xmldoc = minidom.parseString(xml_str)
# Find the necessary set points (Sollwerte) in the xmldoc
time.sleep(0.5)
# Print the order_number from the xmldoc
order_number = xmldoc.getElementsByTagName('order_number')
print("The Order number of the current device is:", order_number[0].firstChild.nodeValue)
print(100*"-")
The output of the Python program looks like this:
Web page status code: 200
IP address: 10.10.10.2 is reachable, the program will shut down in 5 seconds
####################################################################################################
The Order number of the current device is: 58184
----------------------------------------------------------------------------------------------------
The program needs to shut down if 10.10.10.2 is reachable.
Quite stupid of me: all I needed to do was add a sys.exit(1) call before the except.
import sys  # needed for sys.exit()

try:
    preflash = urllib.request.urlopen("http://10.10.10.2", timeout=10).getcode()
    print("Web page status code:", preflash)
    print("IP address: 10.10.10.2 is reachable")
    sys.exit(1)
except urllib.error.URLError:
    correct = urllib.request.urlopen("http://192.168.100.5", timeout=10).getcode()
    print("Web page status code:", correct)
    print("IP address: 192.168.100.5 is reachable")

Python proxy server fails to connect to host

I'm making a Python proxy server for a school assignment and I've got the code below. When I run it in my command prompt and attempt to connect to Google, the code doesn't make it past connecting the server socket, but the page still loads. I honestly have no idea why it doesn't even get through the connection step. Thoughts?
EDIT: Yes, there have been other homework posts about this, but none of them seem to have addressed the fact that the sys.exit() on line 8 ends the script (to my knowledge, anyway), and whenever we comment it out, the script still does not get past connecting the server socket and hits the "Illegal request" exception.
from socket import *
from urllib2 import HTTPError  # Used for 404 Not Found error
import sys
import requests

if len(sys.argv) <= 1:
    print 'Usage : "python ProxyServer.py server_ip"\n[server_ip : It is the IP Address Of Proxy Server]'
    #sys.exit(2)

# POST request extension
print 'Fetching webpage using POST'
r = requests.post('http://httpbin.org/post', data={'key': 'value'})
print 'Printing webpage body'
print r.text

print 'Creating and binding socket for proxy server'
# Create a server socket, bind it to a port and start listening
tcpServerSock = socket(AF_INET, SOCK_STREAM)
# Fill in start.
tcpServerSock.bind(('', 8888))
tcpServerSock.listen(10)  # the number is the maximum number of connections we want to have
# Fill in end.

while 1:
    # Start receiving data from the client
    print 'Ready to serve...'
    tcpClientSock, addr = tcpServerSock.accept()
    print 'Received a connection from:', addr
    # Fill in start.
    message = tcpClientSock.recv(4096)  # receive data with buffer size 4096
    # Fill in end.
    print 'Printing message'
    print message
    # Extract the filename from the given message
    print message.split()[1]
    filename = message.split()[1].partition("/")[2]
    print '\n'
    print 'Printing file name'
    print filename
    fileExist = "false"
    filetouse = "/" + filename
    print '\n'
    print 'Printing file to use'
    print filetouse
    print '\n'
    try:
        # Check whether the file exists in the cache
        f = open(filetouse[1:], "r")
        outputdata = f.readlines()
        fileExist = "true"
        # ProxyServer finds a cache hit and generates a response message
        tcpClientSock.send("HTTP/1.0 200 OK\r\n")
        tcpClientSock.send("Content-Type:text/html\r\n")
        # Fill in start.
        for x in range(0, len(outputdata)):
            tcpClientSock.send(outputdata[x])
        # Fill in end.
        print 'Read from cache\n'
    # Error handling for file not found in cache
    except IOError:
        if fileExist == "false":
            # Create a socket on the proxy server
            # Fill in start.
            print 'Creating server socket\n'
            c = socket(AF_INET, SOCK_STREAM)
            # Fill in end.
            hostn = filename
            #hostn = filename.replace("www.","",1)
            print 'Printing host to connect'
            print hostn
            print '\n'
            print 'Attempting to connect to hostn\n'
            try:
                # Connect to the socket on port 80
                # Fill in start.
                c.connect((hostn, 80))  # port 80 is used for http web pages
                # Fill in end.
                # Create a temporary file on this socket and ask port 80
                # for the file requested by the client
                fileobj = c.makefile('r', 0)
                fileobj.write("GET " + "http://" + filename + " HTTP/1.0\n\n")  # note the space before HTTP/1.0; the original was missing it
                # Show what request was made
                print "GET " + "http://" + filename + " HTTP/1.0"
                # Read the response into buffer
                # Fill in start.
                buff = fileobj.readlines()  # reads until EOF and returns a list with the lines read
                # Fill in end.
                # Create a new file in the cache for the requested file.
                # Also send the response in the buffer to the client socket
                # and the corresponding file in the cache
                tmpFile = open("./" + filename, "wb")  # creates the temp file for the requested file
                # Fill in start.
                for x in range(0, len(buff)):
                    tmpFile.write(buff[x])  # writes the buffer response into the temp file (cache)
                    tcpClientSock.send(buff[x])  # sends the response saved in the buffer to the client
                # Fill in end.
                tmpFile.close()
            except:
                print "Illegal request\n"
        else:
            # HTTP response message for file not found
            # Fill in start.
            print 'File not found'
            # Fill in end.
    # 404 not found error handling
    except HTTPError as e:
        print 'The server couldn\'t fulfill the request.'
        print 'Error code: ', e.code
    # Close the client and the server sockets
    tcpClientSock.close()
    # Fill in start.
    tcpServerSock.close()  # note: closing the listening socket here ends the server after one request
    # Fill in end
I'm aware this question is old, and Jose M's assignment is probably long past due.
if len(sys.argv) <= 1: checks for an additional argument that needs to be passed, which is the IP of the server. Commenting out the exit essentially removes the error checking.
A fix for the code above is to change the bind call (line 20 in the original script) from tcpServerSock.bind(('', 8888)) to tcpServerSock.bind((sys.argv[1], 8888)).
You must then call the script correctly: python ProxyServer.py 127.0.0.1.
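
Put together, the start of the script would then look roughly like this (a sketch of the suggested fix, keeping the question's variable names and port 8888):

import sys
from socket import socket, AF_INET, SOCK_STREAM

if len(sys.argv) <= 1:
    print 'Usage : "python ProxyServer.py server_ip"'
    sys.exit(2)  # restore the exit so a missing argument aborts early

tcpServerSock = socket(AF_INET, SOCK_STREAM)
tcpServerSock.bind((sys.argv[1], 8888))  # bind to the IP passed on the command line
tcpServerSock.listen(10)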

Infinite running server-side python script?

I want to replace cron jobs for "keeping" my program alive, because cron calls it at every XX interval whether or not the script is already running, creating duplicate entries.
I investigated the issue and had a few approaches. One was to modify my program so it checks whether it is already running and closes itself. The one I went with was to detach it completely from cron by having the script call itself over and over again with execfile, which works exactly how I want, except for the following problem:
RuntimeError: maximum recursion depth exceeded
Is there a way to keep the program in an "infinite loop" without getting a stack overflow?
Here is my code. It's a program that checks mails and converts them into MySQL DB entries.
imap = imaplib.IMAP4(hst)
try:
    imap.login(usr, pwd)
except Exception as e:
    errormsg = e
    time.sleep(30)
    print "IMAP error: " + str(errormsg)
    execfile('/var/www/html/olotool/converter.py')
    raise IOError(e)

# Authentication & fetch step
while True:
    time.sleep(5)
    '''
    The script will always result in an error if there
    are no mails left to check in the inbox. It then
    goes into sleep mode and relaunches itself to check
    if new mails have arrived.
    '''
    try:
        imap.select("Inbox")  # Tell IMAP where to go
        result, data = imap.uid('search', None, "ALL")
        latest = data[0].split()[-1]
        result, data = imap.uid('fetch', latest, '(RFC822)')
        raw = data[0][1]  # This contains the mail data
        msg = email.message_from_string(raw)
    except Exception as e:
        disconnect(imap)
        time.sleep(60)
        execfile('/var/www/html/olotool/converter.py')
        raise IOError(e)
I solved the problem myself in the only way I see possible right now.
First I changed the exception handler in the code above:

except Exception as e:
    disconnect(imap)
    print "Converter: No messages left"
    os._exit(0)
    # This is a special case, since this exception is
    # not an error, so os._exit(0) gives no false positives
As you can see, I refrain from using execfile now. Instead, I wrote a controller script that checks the status of my converter.py and launches it if it is not already running:
while True:
    presL = os.popen('pgrep -lf python').read()
    print "________________________________________"
    print "Starting PIDcheck"
    print "Current Processes: "
    print presL  # Check processes
    # find() is presumably a small re.search helper; it is not shown in the post
    presRconverter = find('\d{7} python converter.py', presL)
    if presRconverter:
        # Store the PID
        convPID = find('\d{7}', presRconverter)
        print "Converter is running at PID: " + convPID
    else:
        print "PID Controller: Converter not running"
        try:
            print "PID Controller: Calling converter"
            subprocess.check_call('python converter.py', shell=True)
        except subprocess.CalledProcessError as e:
            errormsg = e
            print "Couldn't call Converter Module"
            sendMail(esender, ereceiver, esubject, etext, server)
            print "Error notification sent"
            raise IOError(e)
        # If we got this far without an error, the call was successful
        print "PID Controller: Call successful"
    print "________________________________________"
    time.sleep(60)
This method does not raise RuntimeError: maximum recursion depth exceeded. It also provides you with a nohup.out file, if you run the controller with nohup python converter.py, where you can see any problems, for error handling.
I hope I could help anyone running into the same issue.
Something along the lines of this should work without having to resort to subprocess checking and such:
import imaplib
import email
import time
import logging  # imports added; hst, usr and pwd are placeholders from the question

def check_mail_loop():
    imap = imaplib.IMAP4(hst)
    # Build some function to log in and, in the event of an error,
    # sleep for n seconds and call the login function again.
    imap.login(usr, pwd)
    while True:
        try:
            imap.select("Inbox")
            result, data = imap.uid('search', None, "ALL")
            if result and data:
                latest = data[0].split()[-1]
                result, data = imap.uid('fetch', latest, '(RFC822)')
                raw = data[0][1]  # This contains the mail data
                msg = email.message_from_string(raw)
            time.sleep(5)
        except SomeRelevantException as e:  # e.g. imaplib.IMAP4.error
            logging.exception(e)  # logging.log() needs a level argument; logging.exception also records the traceback
            time.sleep(60)
In the event of some random error that you didn't foresee, use a process control manager like supervisord or monit.
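
As a lighter-weight alternative to parsing pgrep output or running a supervisor, the original cron duplication problem can also be solved with an exclusive lock file, so a second instance exits immediately (a sketch using fcntl, Linux-only; the lock path is illustrative):

import fcntl
import sys

lockfile = open("/var/run/converter.lock", "w")
try:
    # Take an exclusive, non-blocking lock; this fails if another instance holds it
    fcntl.flock(lockfile, fcntl.LOCK_EX | fcntl.LOCK_NB)
except IOError:
    print("Converter already running, exiting.")
    sys.exit(0)

# ... run the normal mail-checking loop here; the lock is released when the process exits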

Python - User input to CGI via threading and reading a file

Look at the bottom of this post for the final working code.
It's a working Python/CGI script which can get user input to a CGI script by calling another script, which then sends its commands through a local socket.
Original post:
As far as I know, there isn't any way to send user input directly to a Python/CGI script which has already sent its header, like warning the user under specific circumstances and waiting for a confirmation.
Neither have I been able to find any published solutions to this.
If I'm wrong, please correct me.
I currently have a Python script which can connect to servers, upload firmware, reboot, re-connect, change a few configuration files and such.
Sometimes it would help a lot if the user could send input to the script without having to re-launch it and execute it from the beginning. Re-connecting over a 2G network takes too long.
I'm thinking that it must be possible to send user input to another script, which then posts it to a file which the first/main script is watching until it receives the input.
It would also be nice if the user were able to stop the execution of the script with a stop/kill input command.
For the stop/kill command, the main script would need two threads. If it did not have them, it would not know it should stop while a process such as a large file upload is being executed, before the upload is completed.
At the same time, I think multiple users should be able to use the script at the same time. Therefore, a unique ID must be generated every time the main script launches.
Here's how I think it could be made:

Main script gets called:
- A global variable with a unique session ID is generated and sent to the client.
- Thread 1:
  - pexpect spawns a "tail -F /var/www/cgi/tmp_cmd.log"
- Thread 2:
  - Thread status "Busy"
  - Connects to the network element
  - Does its usual stuff until it reaches a point where the user needs to interact
  - Prints the message to the user and waits for Thread 1, with a timeout of x seconds
  - Thread status "Ready"

Second script gets called by the user through AJAX, with 2 headers (session ID & input):
- The session ID and user input are saved to "/var/www/cgi/tmp_cmd.log"
- Execution of the input script ends

Main script:
- Thread 1:
  - User input received
  - Waits for Thread 2's status to become "Ready", or ignores the status if the command equals "kill" etc.
  - Sends the user input (single line) and starts Thread 1 from the beginning
- Thread 2:
  - Thread status "Busy"
  - Input received and the process stops/continues
  - Thread status "Ready"
I have already made a script for connecting, uploading files, and running commands. However, it cannot receive user input.
I could really use some good help, or someone to tell me how to approach this.
Of course, whenever the script has been completed, I will post it here or on Pastebin and link to it for other people to use. :)
Final code
With help from the post below, I finally have working code.
It could have used threads, but stopping/cancelling processes turned out to be much easier for me to figure out.
Client - cgi_send.py
#!/usr/bin/python
import sys, cgi, cgitb, socket
cgitb.enable()

TASKS_DIR = "/var/www/cgi-bin/tmp"

def main():
    global TASKS_DIR
    url = cgi.FieldStorage()
    cmd = str(url.getvalue('cmd'))
    sessionId = str(url.getvalue('session'))
    socketLocation = TASKS_DIR + '/%s.socket' % sessionId
    print 'End script Cancel task'
    print '<form action=""><input type="hidden" name="session" id="session" value="'+sessionId+'" /><input type="text" name="cmd" id="cmd" value="" /><input type="submit" value="Fun!" />'
    try:
        sock = socket.socket(socket.AF_UNIX)
        sock.setblocking(0)
        sock.connect(socketLocation)
        sock.send(cmd)
        sock.close()
        print '<br />Command sent: ' + cmd
    except IOError:
        print '<br /><b>Operation failed.</b><br /> Could not write to socket: ' + socketLocation
        sock.close()
    sys.exit()

if __name__ == '__main__':
    sys.stdout.write("Content-type:text/html;charset=utf-8\r\n\r\n")
    sys.stdout.write('<!DOCTYPE html>\n<html><head><title>Test</title></head><body>')
    main()
    print '</body></html>'
    sys.exit()
Server
#!/usr/bin/python
import sys, os, socket, uuid, time, multiprocessing

# Options
TASKS_DIR = "/var/www/cgi-bin/tmp/"

def main():
    sessionId = str(uuid.uuid4())
    print 'Session ID: ' + sessionId
    sys.stdout.write('<br />Send test command')
    sys.stdout.flush()
    address = os.path.join(TASKS_DIR, '%s.socket' % sessionId)
    sock = socket.socket(socket.AF_UNIX)
    sock.setblocking(0)
    sock.settimeout(.1)
    sock.bind(address)
    sock.listen(1)
    taskList = [foo_task, foo_task, foo_task]
    try:
        for task in taskList:
            print "<br />Starting new task"
            runningTask = multiprocessing.Process(target=task)
            runningTask.daemon = True  # Needed to make KeyboardInterrupt possible when testing in a shell
            runningTask.start()
            while runningTask.is_alive():
                conn = None
                try:
                    conn, addr = sock.accept()
                    data = conn.recv(100).strip()
                except socket.timeout:
                    # nothing ready from a client
                    continue
                except socket.error, e:
                    print "<br />Connection Error from client"
                else:
                    print "<br />" + data
                    sys.stdout.flush()
                    conn.close()
                    if data == "CANCEL":
                        # temp way to cancel our task
                        print "<br />Cancelling current task."
                        runningTask.terminate()
                    elif data == "QUIT":
                        print "<br />Quitting entire process."
                        runningTask.terminate()
                        taskList[:] = []
                finally:
                    if conn:
                        conn.close()
    except (KeyboardInterrupt, SystemExit):
        print '\nReceived keyboard interrupt, quitting threads.'
    finally:
        sock.close()
        os.remove(address)

def foo_task():
    i = 1
    while i <= 10:
        print "<br />Waiting for work... " + str(i)
        sys.stdout.flush()
        i = i + 1
        time.sleep(1)

if __name__ == '__main__':
    sys.stdout.write("Content-type:text/html;charset=utf-8\r\n\r\n")
    sys.stdout.write('<!DOCTYPE html>\n<html><head><title>Test</title></head><body>')
    main()
    print '</body></html>'
    sys.exit()
A CGI script is a pretty primitive operation. It works basically the same as any normal script you run from your command shell: an HTTP request is made to the web server, and the server starts a new process, passing the request data to the script via environment variables and stdin. At this point, it's like a normal script.
A script can't get any more input unless it is actively looking for input by some means, so you are correct in assuming that once the headers are sent, the web client can no longer directly send more input, because the request is already in progress, and the response is already in progress as well.
A thread watching a file is one way to introduce a control loop into the script. Another is to open a UNIX socket at a path based on your unique ID for each instance, and to have a thread sitting on the socket waiting for input. You would then pass the ID back to the web client, and the client could call the second script with that ID, which tells it the proper UNIX socket path to send control commands to, i.e.:
/tmp/script-foo/control/<id>.socket
You actually might only need one thread. Your main thread could simply loop, checking for information on the socket while monitoring the current operation being run in a thread or subprocess. It might look like this in pseudocode:
uid = generate_unique_id()
sock = socket.socket(AF_UNIX)
sock.bind('/tmp/script-foo/control/%s.socket' % uid)
# and set other sock options like timeout

taskList = [a, b, c]
for task in taskList:
    runningTask = start task in thread/process
    while runningTask is running:
        if new data on socket, with timeout N ms:
            if command == restart:
                kill runningTask
                taskList = [a, b, c]
                break
            else:
                process command
When the web client sends a command via AJAX to your second script, it might look like this in pseudocode:
jobid = request.get('id')
cmd = request.get('cmd')
sock = socket.socket(socket.AF_UNIX)
sock.connect('/tmp/script-foo/control/%s.socket' % jobid)
sock.sendall(cmd)
sock.close()
Update
Based on your code update, here is a working example of what I was suggesting:
import sys
import os
import socket
import uuid
import time

# Options
TASKS_DIR = "."

def main():
    sessionId = str(uuid.uuid4())
    print 'Session ID: ' + sessionId
    sys.stdout.write('<br />Send test command')
    sys.stdout.flush()
    address = os.path.join(TASKS_DIR, '%s.socket' % sessionId)
    sock = socket.socket(socket.AF_UNIX)
    sock.setblocking(0)
    sock.settimeout(.1)
    sock.bind(address)
    sock.listen(1)
    fakeTasks = [foo_task, foo_task, foo_task]
    try:
        for task in fakeTasks:
            # pretend we started a task
            runningTask = task()
            # runningTask = Thread(target=task)
            # runningTask.start()
            # while runningTask.is_alive():
            while runningTask:
                conn = None
                try:
                    conn, addr = sock.accept()
                    data = conn.recv(100).strip()
                except socket.timeout:
                    # nothing ready from a client
                    continue
                except socket.error, e:
                    print "<br />Connection Error from client"
                else:
                    print "<br />" + data
                    sys.stdout.flush()
                    conn.close()
                    # for the thread version, you will need some
                    # approach to kill or interrupt it.
                    # This is just simulating.
                    if data == "CANCEL":
                        # temp way to cancel our task
                        print "<br />Cancelling current task."
                        runningTask = False
                    elif data == "QUIT":
                        print "<br />Quitting entire process."
                        runningTask = False
                        fakeTasks[:] = []
                finally:
                    if conn:
                        conn.close()
    finally:
        sock.close()
        os.remove(address)

def foo_task():
    print 'foo task'
    return True

if __name__ == '__main__':
    sys.stdout.write("Content-type:text/html;charset=utf-8\r\n\r\n")
    sys.stdout.write('<!DOCTYPE html>\n<html><head><title>Test</title></head><body>')
    main()
    print '</body></html>'
    sys.exit()
Instead of using a 10-second global timeout, you set it to something small, like 100 ms. The code loops over each task and starts it (eventually in a thread), and then tries to accept a socket connection. If there is no connection within 100 ms, it times out and continues the loop, checking whether the task is done. At any point, a client can connect and issue either a "CANCEL" or a "QUIT" command; the socket will accept the connection, read it, and react.
You can see how you do not need multiple threads here for the solution. The only threading or subprocess you need is to run the task.
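
For quick testing outside the browser, a command can also be pushed into the running server's control socket with a few lines of Python (<session-id> is a placeholder for the Session ID the server page prints; with TASKS_DIR = "." the socket file sits in the server's working directory):

import socket
sock = socket.socket(socket.AF_UNIX)
sock.connect('./<session-id>.socket')  # substitute the printed Session ID
sock.send('CANCEL')  # or 'QUIT'
sock.close()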
