subprocess.popen seems to fail when run from crontab - python

I'm running a script from crontab that will just ssh and run a command and store the results in a file.
The function that seems to be failing is subprocess.popen.
Here is the python function:
def _executeSSHCommand(sshcommand, user, node):
    '''
    Simple function to execute an ssh command on a remote node.
    '''
    sshunixcmd = '/usr/bin/ssh %s@%s \'%s\'' % (user, node, sshcommand)
    process = subprocess.Popen([sshunixcmd],
                               shell=True,
                               stdout=subprocess.PIPE,
                               stderr=subprocess.PIPE)
    process.wait()
    result = process.stdout.readlines()
    return result
When it's run from the command line, it executes correctly; from cron it seems to fail with the error message below.
Here are the crontab entries:
02 * * * * /home/matt/scripts/check-diskspace.py >> /home/matt/logs/disklog.log
Here are the errors:
Sep 23 17:02:01 timmy CRON[13387]: (matt) CMD (/home/matt/scripts/check-diskspace.py >> /home/matt/logs/disklog.log)
Sep 23 17:02:01 timmy CRON[13386]: (CRON) error (grandchild #13387 failed with exit status 2)
I'm going blind trying to find exactly where I have gone so wrong. Any ideas?

The cron PATH is very limited. You should either use the absolute path to ssh (/usr/bin/ssh) or set PATH as the first line of your crontab.
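For example, something like this at the top of the crontab (the PATH value shown is just a typical default, not taken from the question):
# typical default PATH; adjust to your system
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
02 * * * * /home/matt/scripts/check-diskspace.py >> /home/matt/logs/disklog.log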

You probably need to pass ssh the -i argument to tell ssh to use a specific key file. The problem is that your environment is not set up to tell ssh which key to use.
The fact that you're using python here is a bit of a red herring.
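A rough sketch of how the original function could pass an explicit key (the helper name and key path here are assumptions, not from the question):
import subprocess

def _executeSSHCommandWithKey(sshcommand, user, node, keyfile='/home/matt/.ssh/id_rsa'):
    # hypothetical variant of the question's helper; the key path is an assumption
    # pointing ssh at a specific identity file so the call does not depend on the cron environment
    cmd = ['/usr/bin/ssh', '-i', keyfile, '%s@%s' % (user, node), sshcommand]
    process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    out, err = process.communicate()
    return out.splitlines()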

For everything ssh-related in Python, you might consider using paramiko. Using it, the following code should do what you want.
import paramiko

def _executeSSHCommand(ssh_command, user, node):
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.connect(node, username=user)
    # exec_command returns (stdin, stdout, stderr); index 1 is stdout
    stdout = client.exec_command(ssh_command)[1]
    return stdout.readlines()

When running python scripts from cron, the environment PATH can be a hangup, as user1652558 points out.
To expand on this answer with example code to add custom PATH values to the environment for a subprocess call:
import os
import subprocess
#whatever user PATH values you need
my_path = "/some/custom/path1:/some/custom/path2"
#append the custom values to the current PATH settings
my_env = os.environ.copy()
my_env["PATH"] = my_path + ":" + my_env["PATH"]
#subprocess call
resp = subprocess.check_output([cmd], env=my_env, shell=True)

Related

Running Command from Python. Works with os.system but not subprocess.run

I have been stuck on this problem for quite a while now. I have this command line that I want to run through Python:
Users\name.lastname\Desktop\TESTER\Latitude 5431\Latitude-5431-46KCM_Win10_1.0_A01.exe /s /e=C:Users\name.lastname\Desktop\TESTER\Latitude 5431
This should run the .exe and then extract the files to the specified folder. I tried it with os.system and it worked, but when I run it with
import subprocess
x = '"' + "\\Users\\name.lastname\\Desktop\\TESTER\\Latitude 5431\\Latitude-5431-46KCM_Win10_1.0_A01.exe" + '" ' + "/s /e=C:Users\\name.lastname\\Desktop\\TESTER\\Latitude 5431"
p1 = subprocess.run(x, shell=True)
it only shows me 'tips' like these, with no error message, and the .exe is not executed:
Pass command line arguments directly to vendor installer.
Turn the return code to success if required
Latitude-5431-46KCM_Win10_1.0_A01.exe /factoryinstall /passthrough D:\Sample.xml C:\log\FI.log
Change from the default log location to C:\my path with spaces\log.txt
Latitude-5431-46KCM_Win10_1.0_A01.exe /l="C:\my path with spaces\log.txt"
Force update to continue, even on "soft" qualification errors
Latitude-5431-46KCM_Win10_1.0_A01.exe /s /f
Try running without shell=True as it makes things more complicated than it helps:
import subprocess
prog = r"C:\Users\name.lastname\Desktop\TESTER\Latitude 5431\Latitude-5431-46KCM_Win10_1.0_A01.exe"
args = ["/s", r"/e=C:\Users\name.lastname\Desktop\TESTER\Latitude 5431"]
subprocess.run([prog]+args)
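If you also want to see why the installer is not running, a variation like this (a sketch assuming Python 3.7+ for capture_output) surfaces the exit code and any output:
import subprocess

prog = r"C:\Users\name.lastname\Desktop\TESTER\Latitude 5431\Latitude-5431-46KCM_Win10_1.0_A01.exe"
args = ["/s", r"/e=C:\Users\name.lastname\Desktop\TESTER\Latitude 5431"]

# capture_output collects stdout/stderr so a silent failure still leaves a trace
result = subprocess.run([prog] + args, capture_output=True, text=True)
print(result.returncode)
print(result.stdout)
print(result.stderr)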

How can I execute commands using the subprocess function

I want to execute these commands in a Python script:
Open cmd as administrator
cd C:\elastic_stack\logstash-7.6.2
.\bin\logstash -f C:/Users/Asus/Desktop/flask_project_part2/project/logstash_file.conf
This is what I'm trying to do, but it doesn't execute the last command with the config file:
import os, subprocess
from subprocess import *
os.chdir("C:\\Users")
cmd = subprocess.Popen(["runas", "/noprofile", "/user:Administrator", "|", "cd", "C:/elastic_stack/logstash-7.6.2"], shell=True)
cmd.subprocess.run(["./bin/logstash", "-f", "C:/Users/Asus/Desktop/flask_project_part2/project/logstash_file.conf"], shell=True)
You have a combination of forward and backward slashes, but I am assuming that you have installed ELK on a Windows machine.
Unfortunately I do not have access to a Windows machine, so I didn't get a chance to test the code, but it should look roughly like this:
import os
import subprocess

# desired path
target_dir = os.path.join("C:" + os.sep, "elastic_stack", "logstash-7.6.2")

# small check
if os.path.isdir(target_dir):
    os.chdir(target_dir)
else:
    print("pathname does not refer to an existing directory")

# current working directory
print(os.getcwd())

# start logstash directly; os.system returns the command's exit code (0 means OK)
os.system(".\\bin\\logstash -f C:\\Users\\Asus\\Desktop\\flask_project_part2\\project\\logstash_file.conf")

# if you need the output after starting logstash, this works ONLY in Python 3
process = subprocess.run([".\\bin\\logstash", "-f", "C:\\Users\\Asus\\Desktop\\flask_project_part2\\project\\logstash_file.conf"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output = process.stdout
errors = process.stderr
PS:
You can use os.sep when constructing absolute or relative paths; that way the separator is system agnostic.
The shell argument defaults to False, which means no system shell is started; with shell=True, a system shell is spun up first to run the command.
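A small sketch of both points (hostname and host.txt are just stand-ins):
import os
import subprocess

# os.sep keeps the separator system agnostic
print(os.path.join("C:" + os.sep, "elastic_stack", "logstash-7.6.2"))

# shell=False (the default) runs the program directly, with no shell in between
subprocess.run(["hostname"])

# shell=True spins up the system shell first, so shell built-ins and redirection work
subprocess.run("hostname > host.txt", shell=True)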

redirect stdout to file using variable python 3

I want to redirect the output of shell commands to a file using the variable "path", but it is not working
import os, socket, shutil, subprocess

host = os.popen("hostname -s").read().strip()
path = "/root/" + host
if os.path.exists(path):
    print(path, "Already exists")
else:
    os.mkdir(path)
    print("Directory", path, "Created")
os.system("uname -a" > path/'uname') # I want to redirect the output of shell commands to a file using the variable "path" but it is not working
os.system("df -hP" > path/'df')
I think the problem is the bare > and / symbols in the os.system command...
Here is a Python 2.7 example with os.system that does what you want:
import os
path="./test_dir"
command_str="uname -a > {}/uname".format(path)
os.system(command_str)
Here's a very minimal example using subprocess.run. Also, search StackOverflow for "python shell redirect", and you'll get this result right away:
Calling an external command in Python
import subprocess

def run(filename, command):
    with open(filename, 'wb') as stdout_file:
        process = subprocess.run(command, stdout=subprocess.PIPE, shell=True)
        stdout_file.write(process.stdout)
        return process.returncode

run('test_out.txt', 'ls')
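A slightly simpler variant (a sketch, not from the answer above) passes the open file object straight to stdout, so the child process writes into the file directly:
import subprocess

def run(filename, command):
    with open(filename, 'wb') as stdout_file:
        # no PIPE needed; the child writes straight into the file
        process = subprocess.run(command, stdout=stdout_file, shell=True)
    return process.returncode

run('test_out.txt', 'ls')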

setting session variable for paramiko session

Does anyone know how to make environment variables registered for
exec_command calls when using SSHClient?
I'm using a basic script that instantiates the SSHClient class, connects to another computer using the connect method, then sends out commands using the exec_command method. However, none of the environment variables seem to be registered when I try to issue commands. I can do basic things like 'ls' and see the stdout, but when trying to run installed programs, the fact that the environment variables are missing makes it impossible to run them. Using ssh in the command line to do the same thing works, as the environment variables for the user are set.
#!/usr/bin/python
import paramiko

ssh = paramiko.SSHClient()
ssh.connect('mymachine', username='myname', password='pass')
stdin, stdout, stderr = ssh.exec_command('cd /myfolder/path')
stdin, stdout, stderr = ssh.exec_command('ls')
....
....
ssh.close()
Note: I can't change my directory in paramiko, so I appended the cd command to the follow-up command in a single ssh.exec_command('cd /dddd/ddd;ls'). I have given ls as an example, but my actual follow-up command is different.
Since release 2.1.0 (2016-12-09), you can pass a dictionary of environment variables to exec_command:
import paramiko
paramiko.util.log_to_file("paramiko.log")
ssh = paramiko.SSHClient()
k = paramiko.RSAKey.from_private_key_file("<private_key_file>")
ssh.connect(<hostname>,username=<username>,pkey=k)
env_dict={"LC_TELEPHONE":"ET_HOME","LC_MEASUREMENT":"MILES_APART"}
stdin, stdout, stderr = ssh.exec_command('echo $LC_TELEPHONE; echo "..."; echo $LC_MEASUREMENT', environment=env_dict)
print stdout.read()
output:
ET_HOME
...
MILES_APART
But why did I choose LC_TELEPHONE and LC_MEASUREMENT? Because those are two of the few environment variables that the target host's sshd config allows me to set:
grep AcceptEnv /etc/ssh/sshd_config
output:
AcceptEnv LANG LC_CTYPE LC_NUMERIC LC_TIME LC_COLLATE LC_MONETARY LC_MESSAGES
AcceptEnv LC_PAPER LC_NAME LC_ADDRESS LC_TELEPHONE LC_MEASUREMENT
AcceptEnv LC_IDENTIFICATION LC_ALL
In other words, this doesn't work:
env_dict={"HELLO":"WORLD","GOODBYE":"CRUEL_WORLD"}
stdin , stdout, stderr = ssh.exec_command("echo $HELLO; echo '...'; echo $GOODBYE")
print stdout.read()
output:
...
As the documentation warns, the environment variables are silently rejected
http://docs.paramiko.org/en/2.1/api/client.html
http://docs.paramiko.org/en/2.1/api/channel.html#paramiko.channel.Channel.set_environment_variable
If you cannot control the target server's sshd config, putting the environment variables into a file and sourcing it works:
stdin , stdout, stderr = ssh.exec_command("cat .set_env;source .set_env; echo $HELLO; echo '...'; echo $GOODBYE")
print stdout.read()
output:
# begin .set_env
HELLO="WORLD"
GOODBYE="CRUEL_WORLD"
# end .set_env
WORLD
...
CRUEL_WORLD
#!/usr/bin/python
import paramiko
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.WarningPolicy)
client.connect(myhostname, theport, myuser, thepass)
stdin,stdout,stderr = client.exec_command('cd /tmp;pwd;ls -al')
#returns your output
print stdout.read()
which all works fine for me. If you have special environment variables, you might
have to set them on the remote command prompt. Maybe it helps if you write the
variables into a myEnv file and then call
stdin, stdout, stderr = client.exec_command('source ./myEnv')
Have you tried something like that?
You can do: client.exec_command(..., get_pty=True).
This will make paramiko allocate a pseudo terminal, similar to ssh.
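A minimal sketch of that call, reusing a connected client like the ones above:
# get_pty=True asks the server to allocate a pseudo-terminal for the command,
# similar to what the interactive ssh client does
stdin, stdout, stderr = client.exec_command('ls -al', get_pty=True)
print(stdout.read())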
I ran into this problem too. Besides the approaches above, I also fixed it by prefixing the command with the PATH entries it needs, e.g.:
...
bin_paths = '/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin'
path_prefix = 'PATH=$PATH:%s && ' % bin_paths
command = path_prefix + command
ssh.exec_command(command=command)

Execute .R script within Python using Rscript.exe shell

I have an .R file saved locally at the following path:
Rfilepath = "C:\\python\\buyback_parse_guide.r"
The command for RScript.exe is:
RScriptCmd = "C:\\Program Files\\R\\R-2.15.2\\bin\\Rscript.exe --vanilla"
I tried running:
subprocess.call([RScriptCmd,Rfilepath],shell=True)
But it returns 1 -- and the .R script did not run successfully. What am I doing wrong? I'm new to Python so this is probably a simple syntax error... I also tried these, but they all return 1:
subprocess.call('"C:\Program Files\R\R-2.15.2\bin\Rscript.exe"',shell=True)
subprocess.call('"C:\\Program Files\\R\\R-2.15.2\\bin\\Rscript.exe"',shell=True)
subprocess.call('C:\Program Files\R\R-2.15.2\bin\Rscript.exe',shell=True)
subprocess.call('C:\\Program Files\\R\\R-2.15.2\\bin\\Rscript.exe',shell=True)
Thanks!
The RScriptCmd needs to be just the executable, no command line arguments. So:
RScriptCmd = "\"C:\\Program Files\\R\\R-2.15.2\\bin\\Rscript.exe\""
Then the Rfilepath can actually be all of the arguments - and renamed:
RArguments = "--vanilla \"C:\\python\\buyback_parse_guide.r\""
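Putting the two pieces together might then look like this (a sketch using the same paths as the question):
import subprocess

RScriptCmd = "\"C:\\Program Files\\R\\R-2.15.2\\bin\\Rscript.exe\""
RArguments = "--vanilla \"C:\\python\\buyback_parse_guide.r\""

# with shell=True the quoted executable path and its arguments are passed to the shell as one command line
retcode = subprocess.call(RScriptCmd + " " + RArguments, shell=True)
print(retcode)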
It looks like you have a similar problem to mine. I had to reinstall RScript to a path which has no spaces.
See: Running Rscript via Python using os.system() or subprocess()
This is how I worked out the communication between Python and Rscript:
part in Python:
import subprocess

p = subprocess.Popen(["path/to/RScript.exe", "path/to/Script.R", "Arg1"], stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE)
out = p.communicate()
outValue = out[0]
outValue contains the output value after executing Script.R
part in the R-Script:
args <- commandArgs(TRUE)
argument1 <- as.character(args[1])
...
write(output, stdout())
output is the variable to send to Python
