Python Script not running ".sh" file - python

I have written a simple Python script that is supposed to run a ".sh" file when it is executed.
The problem is that the script runs, but it does not start the ".sh" file. When I execute the ".sh" file manually (over an SSH session with PuTTY) it does the job, but not when I use my Python script. So, what do I have to change in my script in order for it to work?
I will post the methods below so you can get a better idea. I am using Python 3.3.5 on Oracle Linux 6.8.
This is the method that calls the ".sh" file; I have used Popen.
def runPrestage2Stage(self):
    import time
    time.sleep(60)
    reload(Queries)
    if "FINISHED" in Queries.statusS2S:
        # run batch file
        p = Popen(["sh", "/u01/app/Oracle_ODI_11/oracledi/agent/bin/start_prestage2stage_simple.sh"], stdout=PIPE, stderr=PIPE)
        stdout, stderr = p.communicate()
        print("Prestage2Stage has started")
    elif 'ERROR' in Queries.statusS2S:
        print("Can not start Prestage Converter, please check runtime status of Stage Converter")
    else:
        print("Prestage2Stage Converter cannot be started")
Part of the main method that calls runPrestage2Stage:
DailyLoadsAutomation.DailyLoads.runPrestage2Stage(self)
load_paraprak_prestage = True
count2 = 0
while load_paraprak_prestage:
    reload(Queries)
    sleep(300)  # waits 300 seconds (5 minutes) and re-checks load status
    if "FINISHED" in Queries.statusPreStage:
        load_paraprak_prestage = False
    else:
        load_paraprak_prestage = True
    if count2 == 8:
        import sys
        sys.exit()
    else:
        count2 += 1
print("PreStage is finished")
When I run the script, it prints "Prestage2Stage has started" and "PreStage is finished", as it is supposed to, but it does not run the ".sh" file.
Any idea what is wrong?

Usually this is related to user rights or errors in the path. You can replace your .sh script with a simple one like echo "I am running" and see whether it can be accessed and executed. Since this is on Linux, make sure you have given execution rights to your .sh script via chmod.
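For example, a quick diagnostic from Python could look like this (a rough sketch; the script path is the one from the question, and the point is simply to surface the exit code and stderr instead of discarding them):

import os
from subprocess import Popen, PIPE

script = "/u01/app/Oracle_ODI_11/oracledi/agent/bin/start_prestage2stage_simple.sh"

# Does the file exist, and is it executable by the user running this Python script?
print("exists:", os.path.exists(script), "executable:", os.access(script, os.X_OK))

# Run it and print whatever it writes, so errors are not silently swallowed
p = Popen(["sh", script], stdout=PIPE, stderr=PIPE)
out, err = p.communicate()
print("stdout:", out.decode())
print("stderr:", err.decode())
print("exit code:", p.returncode)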

You can run the .sh file by importing the os module like this:
import os
os.system('sh filename.sh')
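Note that os.system only returns an exit status and hides the script's own output; if the script keeps failing silently, a subprocess-based call (a small sketch with the same placeholder filename) at least lets you check that status explicitly:

import subprocess

# call() blocks until the script finishes and returns its exit status;
# a non-zero value usually means the script failed
status = subprocess.call(['sh', 'filename.sh'])
print("exit status:", status)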

Related

How to terminate one python script when many python scripts are running?

Hello guys, I have three Python scripts running at the same time. I want to terminate (kill) one of them from another Python file. In other words, if we run many Python scripts at the same time, how can we terminate or kill one or two of them? Is it possible with the os or subprocess modules? I tried to use them, but they kill all of the Python scripts by killing python.exe.
FirstSc.py
UserName = input("Enter your username = ")
if UserName == "Alex":
    # Terminate or Kill the Python file at this address: C:\MyScripts\FileTests\SecondSc.py
SecondSc.py
while True:
    print("Second app is running ...")
ThirdSc.py
while True:
    print("Third app is running ...")
Thanks guys, I got good answers. Now, if we have a batch file like SecBatch.bat instead of SecondSc.py, how do we do this? It means we have these files and run FirstSc.py and SecBatch.bat at the same time:
FirstSc.py in the directory D:\MyFiles\FirstSc.py:
UserName = input("Enter your username = ")
if UserName == "Alex":
    # 1) How to print the SecBatch.bat contents, i.e. print:
    #      CALL C:\MyProject\Scripts\activate.bat
    #      python C:\pyFiles\ThirdSc.py
    # 2) Terminate or kill SecBatch.bat
    # 3) Terminate or kill ThirdSc.py
SecBatch.bat in the directory C:\MyWinFiles\SecBatch.bat, which activates a Python virtual environment and then runs a Python script at C:\pyFiles\ThirdSc.py:
CALL C:\MyProject\Scripts\activate.bat
python C:\pyFiles\ThirdSc.py
ThirdSc.py in the directory C:\pyFiles\ThirdSc.py:
from time import sleep
while True:
    print("Third app is running ...")
    sleep(2)
I would store the PID of each script in a standard location. Assuming you are running on Linux, I would put them in /var/run/. Then you can use os.kill(pid, 9) to do what you want. Some example helper functions would be:
import os
import sys

def store_pid():
    pid = os.getpid()
    # Get the name of the script
    # Example: /home/me/test.py => test
    script_name = os.path.basename(sys.argv[0]).replace(".py", "")
    # Write to /var/run/test.pid
    with open(f"/var/run/{script_name}.pid", "w") as f:
        f.write(str(pid))

def kill_by_script_name(name):
    # Check that the PID file is there
    pid_file = f"/var/run/{name}.pid"
    if not os.path.exists(pid_file):
        print("Warning: cannot find PID file")
        return
    with open(pid_file) as f:
        # The following might throw ValueError if the PID file contains stray characters
        pid = int(f.read().strip())
    os.kill(pid, 9)
Later in FirstSc:
if UserName == "Alex":
kill_by_script_name("SecondSc")
kill_by_script_name("ThirdSc")
NOTE: The code is not tested :) but it should point you in the right direction (at least for one common way to solve this problem).
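For this to work, each script that should be killable has to record its own PID at startup. A minimal sketch (assuming the helpers above are saved in a module named, say, pid_helpers.py; that module name is not part of the original answer):

# SecondSc.py
from pid_helpers import store_pid  # hypothetical module holding the helpers above

store_pid()  # writes /var/run/SecondSc.pid so FirstSc can find and kill this process
while True:
    print("Second app is running ...")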
You may be able to terminate a Python process by the name of the script file using system commands such as taskkill (or pkill on Linux systems). However, a better way to accomplish this would be (if possible) to have FirstSc.py, or whatever script is doing the killing, launch the other scripts using subprocess.Popen(). Then you can call terminate() on it to end the process:
import subprocess
# Launch the two scripts
# You may have to change the Python executable name
second_script = subprocess.Popen(["python", "SecondSc.py"])
third_script = subprocess.Popen(["python", "ThirdSc.py"])
UserName = input("Enter your username = ")
if UserName == "Alex":
second_script.terminate()
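As a side note, terminate() is a polite request (SIGTERM on POSIX); if the child ignores it, you can follow up with kill(). A small sketch of that pattern (not part of the original answer):

import subprocess

proc = subprocess.Popen(["python", "SecondSc.py"])

proc.terminate()              # ask the process to stop (SIGTERM on POSIX)
try:
    proc.wait(timeout=5)      # give it a few seconds to exit cleanly
except subprocess.TimeoutExpired:
    proc.kill()               # force it to stop (SIGKILL on POSIX)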

Understanding subprocess module in python when running program from python script

I usually run a program from my openSUSE Linux terminal by typing ./run file_name. This brings up a series of options that I can choose from by typing a numeric value 0-9 and hitting Return on my keyboard. Now I want to do this from a Python script automatically. My example below is not working, but I can't understand where I'm failing or how to debug it:
import subprocess
p = subprocess.Popen(["/path/to/program/run", file_name], stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=False)
print "Hello"
out, err = p.communicate(input='0\r\n')
print out
print err
for line in p.stdout.readlines():
    print line
The output of this program is just
>> Hello
>>
i.e. it then seems to freeze (I have no idea what's actually happening!). I would have expected to see what I see when I run ./run file_name and hit 0 and then Return directly in my terminal, but I assure you this is not the case.
What can I do to debug my code?
Edit 1: as suggested in comments
import subprocess
fileName = 'test_profile'
p = subprocess.Popen(["/path/to/program/run", fileName], stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=False)
print "Hello"
for line in iter(p.stdout.readline, ""):
    print line
will indeed return the stdout of my program!
communicate waits for the completion of the program. For example:
import subprocess
p = subprocess.Popen(["cut", "-c2"], stdin=subprocess.PIPE, stdout=subprocess.PIPE,shell=False)
out, err = p.communicate(input='abc')
print("Result: '{}'".format(out.strip()))
# Result: 'b'
It sounds like you have a more interactive script, in which case you probably should try out pexpect
import pexpect
child = pexpect.spawn('cut -c2')
child.sendline('abc')
child.readline() # repeat what was typed
print(child.readline()) # prints 'b'

Python running synchronously? Running one executable at a time

Trying to use Python to control numerous compiled executables, but running into timing issues! I need to be able to run two executables simultaneously, and also be able to 'wait' until an executable has finished prior to starting another one. Also, some of them require superuser. Here is what I have so far:
import os
sudoPassword = "PASS"
executable1 = "EXEC1"
executable2 = "EXEC2"
executable3 = "EXEC3"
filename = "~/Desktop/folder/"
commandA = filename+executable1
commandB = filename+executable2
commandC = filename+executable3
os.system('echo %s | sudo %s; %s' % (sudoPassword, commandA, commandB))
os.system('echo %s | sudo %s' % (sudoPassword, commandC))
print ('DONESIES')
Assuming that os.system() waits for the executable to finish prior to moving to the next line, this should run EXEC1 and EXEC2 simultaneously, and after they finish run EXEC3...
But it doesn't. Actually, it even prints 'DONESIES' in the shell before commandB even finishes...
Please help!
Your script will still execute all 3 commands sequentially. In shell scripts, the semicolon is just a way to put more than one command on one line. It doesn't do anything special, it just runs them one after the other.
If you want to run external programs in parallel from a Python program, use the subprocess module: https://docs.python.org/2/library/subprocess.html
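A minimal sketch of that idea (the executable paths are placeholders): Popen starts a program and returns immediately, and wait() blocks until it finishes.

import subprocess

# Start both programs without waiting for either to finish
a = subprocess.Popen(["/path/to/EXEC1"])
b = subprocess.Popen(["/path/to/EXEC2"])

# Block until both have exited, then run the third program
a.wait()
b.wait()
subprocess.call(["/path/to/EXEC3"])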
Use subprocess.Popen to run multiple commands in the background. If you just want the programs' stdout/stderr to go to the screen (or get dumped completely), it's pretty straightforward. If you want to process the output of the commands... that gets more complicated. You'd likely start a thread per command.
But here is the case that matches your example:
import os
import subprocess as subp
sudoPassword = "PASS"
executable1 = "EXEC1"
executable2 = "EXEC2"
executable3 = "EXEC3"
filename = os.path.expanduser("~/Desktop/folder/")
commandA = os.path.join(filename, executable1)
commandB = os.path.join(filename, executable2)
commandC = os.path.join(filename, executable3)
def sudo_cmd(cmd, password):
    p = subp.Popen(['sudo', '-S'] + cmd, stdin=subp.PIPE)
    p.stdin.write(password + '\n')
    p.stdin.close()
    return p
# run A and B in parallel
exec_A = sudo_cmd([commandA], sudoPassword)
exec_B = sudo_cmd([commandB], sudoPassword)
# wait for A before starting C
exec_A.wait()
exec_C = sudo_cmd([commandC], sudoPassword)
# wait for the stragglers
exec_B.wait()
exec_C.wait()
print ('DONESIES')

running a python script on server

I have a Python script on the server:
#!/usr/bin/env python
import cgi
import cgitb; #cgitb.enable()
import sys, os
from subprocess import call
import time
import subprocess
form = cgi.FieldStorage()
component = form.getvalue('component')
command = form.getvalue('command')
success = True
print """Content-Type: text/html\n"""
if component=="Engine" and command=="Start":
try:
process = subprocess.Popen(['/usr/sbin/telepath','engine','start'], shell=False, stdout=subprocess.PIPE)
print "{ans:12}"
except Exception, e:
success = False
print "{ans:0}"
When I run this script and add the component and command parameters to be "Engine" and "Start" respectively - it starts the process and prints to the shell
"""Content-Type: text/html\n"""
{ans:12}
but most importantly - it starts the process!
However, when I run the script by POSTing to it, it returns {ans:12} but does not run the process, which was the whole intention in the first place. Any logical explanation?
I suspect it's one of two things. Firstly, your process is probably running, but your Python code doesn't handle the output, so do:
process = subprocess.Popen(['/usr/sbin/telepath','engine','start'], shell=False, stdout=subprocess.PIPE)
print process.stdout.read()
This is the most likely explanation, and it accounts for why you see the output from the command line and not the browser. Secondly, because the script is run through the browser as the user apache and not with your user ID, check the permissions on /usr/sbin/telepath.
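A small sketch combining both checks (the telepath command comes from the question; the permission check and output handling are illustrative):

import os
import subprocess

cmd = '/usr/sbin/telepath'

# Can the CGI user (e.g. apache) actually execute the binary?
if not os.access(cmd, os.X_OK):
    print "{ans:0}"
else:
    process = subprocess.Popen([cmd, 'engine', 'start'], shell=False,
                               stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = process.communicate()  # read the output so the pipe cannot fill up and block
    print out
    print err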

How can I tell whether screen is running?

I am trying to run a Python program to see if the screen program is running. If it is, then the program should not run the rest of the code. This is what I have and it's not working:
#!/usr/bin/python
import os
var1 = os.system('screen -r > /root/screenlog/screen.log')
fd = open("/root/screenlog/screen.log")
content = fd.readline()
while content:
    if content == "There is no screen to be resumed.":
        os.system('/etc/init.d/tunnel.sh')
        print "The tunnel is now active."
    else:
        print "The tunnel is running."
fd.close()
I know there are probably several things here that don't need to be and quite a few that I'm missing. I will be running this program in cron.
from subprocess import Popen, PIPE
def screen_is_running():
    out = Popen("screen -list", shell=True, stdout=PIPE).communicate()[0]
    return not out.startswith("This room is empty")
Maybe the error message that you redirect in the first os.system call is written to standard error instead of standard output. You should try replacing this line with:
var1 = os.system ('screen -r 2> /root/screenlog/screen.log')
Note the 2> to redirect standard error to your file.
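Alternatively, you can skip the temporary log file entirely and capture screen's output in Python with subprocess (a sketch; the exact wording screen prints when nothing is running differs between versions, so check it on your system first):

from subprocess import Popen, PIPE
import os

# Ask screen for a list of sessions and capture both output streams
p = Popen(['screen', '-list'], stdout=PIPE, stderr=PIPE)
out, err = p.communicate()

if "No Sockets found" in out:  # adjust to whatever your screen version prints
    os.system('/etc/init.d/tunnel.sh')
    print "The tunnel is now active."
else:
    print "The tunnel is running."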
