Waiting for the previous function to be completed in Python

Is there any solution in Python that lets a function execute only after the previous one has finished?
Here is one of the ideas I'm using now, but it does not solve the problem when files are larger and the program needs more time.
def copy_to_jumphost(self):
    try:
        if self.connect():
            stdin, stdout, stderr = self.client.exec_command(
                'sshpass -p %s scp -r %s@%s:%s/' % (self.password, self.username, self.hostname, self.log_path)
                + self.lista[self.file_number].rstrip() + ' '
                + '/home/%s/' % (self.username) + self.lista[self.file_number].rstrip())
    except (AttributeError, TypeError) as e:
        print("Error occurred:", e)

try:
    if self.connect():
        if self.copy_to_jumphost():
            ftp_client = self.client.open_sftp()
            ftp_client.get(filepath, self.localpath)
            print("Success! \nFile copied to %s" % (self.localpath))
        else:
            time.sleep(5)
            ftp_client = self.client.open_sftp()
            ftp_client.get(filepath, self.localpath)
            print("Success but needed some time! \nFile copied to %s" % (self.localpath))
except (AttributeError, TypeError) as e:
    print("Error occurred:", e)
The perfect situation for me would be a way, in the else branch, to wait for copy_to_jumphost() to finish, because time.sleep(5) will fail whenever I need to copy larger files.
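One option, assuming the code uses Paramiko's exec_command (which returns immediately), is to block on the channel's exit status so the remote scp has actually finished before the SFTP download starts. A minimal sketch along those lines:

def copy_to_jumphost(self):
    try:
        if self.connect():
            cmd = ('sshpass -p %s scp -r %s@%s:%s/' % (self.password, self.username, self.hostname, self.log_path)
                   + self.lista[self.file_number].rstrip() + ' '
                   + '/home/%s/' % (self.username) + self.lista[self.file_number].rstrip())
            stdin, stdout, stderr = self.client.exec_command(cmd)
            # recv_exit_status() blocks until the remote command terminates,
            # so large files simply take longer instead of racing the download.
            return stdout.channel.recv_exit_status() == 0
    except (AttributeError, TypeError) as e:
        print("Error occurred:", e)
    return False

With that in place, the else branch (and the time.sleep(5)) is no longer needed, because a True return already means the copy has completed on the jumphost.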

Related

How to fine-tune redundant code in Python

Every time my Python script runs, I need to create two sets of databases. The code below looks redundant to me, since I initialize the variable ext twice. Can anyone suggest a better alternative?
def create_datasets(database, ext):
    try:
        dataset = "bq --location=US mk -d " + database + ext
        try:
            return_cd, out, err = run_sys_command(dataset)
        except Exception as e:
            print(e)
    except Exception as e:
        print(e)
        raise

ext = ''
create_datasets(database, ext)
ext = '_stg'
create_datasets(database, ext)
Use a loop?
for ext in ['', '_stg']:
    create_datasets(database, ext)
About your function:
def create_datasets(database, ext):
    try:
        dataset = f"bq --location=US mk -d {database}{ext}"
        return_cd, out, err = run_sys_command(dataset)
    except Exception as e:  # <- you should catch a more specific exception!
        print(e)
Any Exception raised in your function is caught and handled by the inner try block, so the outer one seems redundant.
def create_datasets(database, ext):
    try:
        dataset = "bq --location=US mk -d " + database + ext
        return_cd, out, err = run_sys_command(dataset)
    except Exception as e:
        print(e)
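On the "catch a more specific exception" point: if run_sys_command is a wrapper around subprocess (an assumption, the question does not show it), catching the concrete failure types keeps unrelated bugs from being silently printed. A rough sketch:

import subprocess

def create_datasets(database, ext):
    dataset = f"bq --location=US mk -d {database}{ext}"
    try:
        # Hypothetical replacement for run_sys_command, shown only to make the
        # failure modes explicit.
        result = subprocess.run(dataset, shell=True, capture_output=True,
                                text=True, check=True)
        return result.returncode, result.stdout, result.stderr
    except subprocess.CalledProcessError as e:  # bq ran but exited non-zero
        print(e.stderr)
    except OSError as e:                        # the command could not be started at all
        print(e)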

How to Fix Except Error in Python on Linux

I'm trying to execute a Python script on Linux, but I keep getting this error on the except line. Can someone figure this out?
$ python pygeo_ip.py
def search(self):
    message = ''
    result_count = 0
    gip = pygeoip.GeoIP('GeoLIteCity.dat')
    ip = self.ip_textbox.text()
    try:
        ip = socket.gethostbyname(str(ip))
        message = "Host: %s Is Currently Available" % (str(ip))
    except socket.error, e:
        message = "Host: %s Is Currently Unavailable" % (key, val)
        result_count += 1
        msg_box("SeArCh CoMpLeTe", "%d REsults Were Found For %s"
                % (result_count, str(ip))
    except Exception, e:  # <------- Error
        msg_box("", str(e))
        msg_box("Search Complete", "No Results Were Found For %s" % (str(ip))
        return
Error:
File "pygeo_ip.py", line 142
except Exception, e:
^
SyntaxError: invalid syntax
Pretty sure (without having tested anything) your problem is a missing closing bracket on the msg_box call just before that line.
The line should read:
msg_box("SeArCh CoMpLeTe", "%d REsults Were Found For %s" % (result_count, str(ip)) )

Making my python-requests GETs faster [duplicate]

This question already has answers here:
What is the fastest way to send 100,000 HTTP requests in Python?
(21 answers)
Closed 6 years ago.
I have a Python script with a lot of exception handlers. I'm trying to make around 50,000 requests, and it is very slow right now. I also want the script to keep running, so I added almost every exception requests has, mostly related to ConnectionError and the like.
Is there a way to make this script much faster and more modular?
for i in range(50450000,50500000):
    try:
        try:
            try:
                try:
                    try:
                        try:
                            try:
                                try:
                                    try:
                                        try:
                                            try:
                                                try:
                                                    check_response = 'http://www.barneys.com/product/adidas--22human-race-22-nmd-sneakers-'+str(i)+'.html'
                                                    make_requests = requests.get(check_response,headers=headers).text
                                                    soup = BeautifulSoup(make_requests)
                                                    try:
                                                        main_wrapper = soup.find('h1',attrs={'class':'title'}).text
                                                        print main_wrapper + ' ' + str(i)
                                                    except AttributeError:
                                                        arr.append(check_response)
                                                        with open('working_urls.json','wb') as outfile:
                                                            json.dump(arr,outfile,indent=4)
                                                except requests.exceptions.InvalidURL:
                                                    continue
                                            except requests.exceptions.InvalidSchema:
                                                continue
                                        except requests.exceptions.MissingSchema:
                                            continue
                                    except requests.exceptions.TooManyRedirects:
                                        continue
                                except requests.exceptions.URLRequired:
                                    continue
                            except requests.exceptions.ConnectTimeout:
                                continue
                        except requests.exceptions.Timeout:
                            continue
                    except requests.exceptions.SSLError:
                        continue
                except requests.exceptions.ProxyError:
                    continue
            except requests.exceptions.HTTPError:
                continue
        except requests.exceptions.ReadTimeout:
            continue
    except requests.exceptions.ConnectionError:
        continue
First, please replace all these ugly try/except blocks with a single one, like:
for i in range(50450000,50500000):
    try:
        check_response = 'http://www.barneys.com/product/adidas--22human-race-22-nmd-sneakers-'+str(i)+'.html'
        make_requests = requests.get(check_response,headers=headers).text
        soup = BeautifulSoup(make_requests)
        try:
            main_wrapper = soup.find('h1',attrs={'class':'title'}).text
            print main_wrapper + ' ' + str(i)
        except AttributeError:
            arr.append(check_response)
            with open('working_urls.json','wb') as outfile:
                json.dump(arr,outfile,indent=4)
    except requests.exceptions.InvalidURL:
        continue
    except requests.exceptions.InvalidSchema:
        continue
    except requests.exceptions.MissingSchema:
        continue
    ...
And if all you do is continue in every case, use the base class RequestException. It becomes:
try:
    check_response = 'http://www.barneys.com/product/adidas--22human-race-22-nmd-sneakers-'+str(i)+'.html'
    make_requests = requests.get(check_response,headers=headers).text
    soup = BeautifulSoup(make_requests)
    try:
        main_wrapper = soup.find('h1',attrs={'class':'title'}).text
        print main_wrapper + ' ' + str(i)
    except AttributeError:
        arr.append(check_response)
        with open('working_urls.json','wb') as outfile:
            json.dump(arr,outfile,indent=4)
except requests.exceptions.RequestException:
    pass
Maybe not faster, but for sure far easier to read!
As for the speed issue, you should consider using threads/processes. Take a look at the threading and multiprocessing modules.
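As a sketch of that threading suggestion, using concurrent.futures from the standard library: the URL pattern and the h1 check come from the question, while the worker count and the placeholder headers are assumptions.

import concurrent.futures
import json
import requests
from bs4 import BeautifulSoup

headers = {'User-Agent': 'Mozilla/5.0'}  # placeholder -- use the headers from your script

def check(i):
    url = 'http://www.barneys.com/product/adidas--22human-race-22-nmd-sneakers-' + str(i) + '.html'
    try:
        html = requests.get(url, headers=headers, timeout=10).text
        title = BeautifulSoup(html, 'html.parser').find('h1', attrs={'class': 'title'})
        # As in the question: a page without <h1 class="title"> goes into the result list
        return url if title is None else None
    except requests.exceptions.RequestException:
        return None

# Run up to 20 requests at a time instead of one after another
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
    arr = [url for url in pool.map(check, range(50450000, 50500000)) if url is not None]

with open('working_urls.json', 'w') as outfile:
    json.dump(arr, outfile, indent=4)

Writing the JSON once at the end also avoids rewriting the whole file on every hit, which was another source of slowness in the original loop.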

Checking a condition while handling exceptions in Python

This is part of my code in Python. I want to check the error message, and if it is HTTPError() I want to add the host to the file ok.txt, but it doesn't work. What is the problem here?
except urllib2.URLError, e:
    print '%-15s\t%15r' % (url.strip(), e)
    if e == 'HTTPError()':
        OK.write('%-15s' % (url.strip()) + '\n')
        OK.flush()
When I run whole script the output is something like this:
http://a.com HTTPError()
http://b.com URLError(timeout('timed out',),)
http://c.com URLError(timeout('timed out',),)
http://d.com URLError(error(111, 'Connection refused'),)
http://e.com 200
Use isinstance() to check whether or not your error is of type HTTPError:
except urllib2.URLError as e:  # use "as e" instead of the old-style comma delimitation
    print '%-15s\t%15r' % (url.strip(), e)
    if isinstance(e, HTTPError):
        OK.write('%-15s' % (url.strip()) + '\n')
        OK.flush()
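Alternatively, since urllib2.HTTPError is a subclass of URLError, you can give it its own except clause ahead of the generic one. A sketch built around the question's variables:

try:
    response = urllib2.urlopen(url)
except urllib2.HTTPError as e:
    # The server answered with an HTTP error status, so record the host in ok.txt
    print '%-15s\t%15r' % (url.strip(), e)
    OK.write('%-15s' % (url.strip()) + '\n')
    OK.flush()
except urllib2.URLError as e:
    # Lower-level failure: timeout, connection refused, DNS error, ...
    print '%-15s\t%15r' % (url.strip(), e)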

Printing ASCII text from Telnet in Python 3.0

So I'm unable to print all of the information that I see after issuing a "help" command. Do I need to change the length passed to skt.recv()? Or is there a way to simply print all of the data that comes through? It seems like there has to be a way to handle printing data of unknown length, or am I approaching this the wrong way?
Thanks.
#!/usr/bin/python
host = '192.168.1.50'
port = 23
msg = "help\r"
msg2 = "y\r"

import socket
import sys
import time

try:
    skt = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
except socket.error, e:
    print("Error creating socket: %s" % e)
    sys.exit(1)

try:
    skt.connect((host,port))
except socket.gaierror, e:
    print("Address-related error connecting to server: %s" % e)
    sys.exit(1)
except socket.error, e:
    print("Error connecting to socket: %s" % e)
    time.sleep(15)
    skt.connect((host,port))
    sys.exit(1)

try:
    print(skt.send(msg))
    skt.send('help\r')
    print("SEND: %s" % msg)
except socket.error, e:
    print("Error sending data: %s" % e)
    sys.exit(1)

while 1:
    try:
        buf = skt.recv(50000000000)
        if(len(buf)):
            print(buf)
            if 'AMX' in buf:
                print("Length buff")
            if 'AMX' in buf:
                print(skt.send(msg))
                #print("first wait")
                #print("RECV: %s" % buf)
                #time.sleep(9)
                #print("second wait")
                sys.exit(1)
    except socket.error, e:
        print("Error receiving data: %s" % e)
        sys.exit(1)
    if not len(buf):
        break
    sys.stdout.write(buf)
Have you considered using telnetlib, rather than re-inventing the wheel? :)
Example:
import telnetlib
HOST = "192.168.1.50"
tn = telnetlib.Telnet(HOST)
tn.write("help\n")
print tn.read_all()
So the telnetlib module makes things easier and streamlines the process. No sense in reinventing the wheel.
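And if the device asks the y/n question hinted at by msg2 in the question, read_until can wait for the prompt before answering. A sketch; the prompt string is a guess and would need to match what the device actually prints:

import telnetlib

HOST = "192.168.1.50"

tn = telnetlib.Telnet(HOST, 23, timeout=10)
tn.write("help\r")
tn.read_until("(y/n)", timeout=5)   # hypothetical prompt text
tn.write("y\r")
print tn.read_all()                 # blocks until the device closes the connection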
