Unable to background a Python Flask Service via SubProcess library - python

I am trying to background a Python Flask-based service via nohup on a Red Hat Linux server, using the subprocess library, and here is the issue:
I have a Python Flask service at location: /home/user/service_location
Name of the service: service.py
I have to start this service from another Python file on the same server.
Location of the starter script: /home/user/service_regen
Name of this script: restart_services.py
After spending much time on it, here is what is inside restart_services.py now:
working_directory = '/home/user/service_location'
command = 'nohup python service.py &'
ps = subprocess.Popen([command], cwd=working_directory, shell=True, stdout=open('/dev/null', 'w'), stderr=open('logfile.log', 'a'))
stdout, stderr = ps.communicate()
print("stdout: {}, stderr: {}".format(str(stdout),str(stderr)))
But this still does not start the service!
I am also open to any other solution or alternative if a feasible solution using subprocess does not exist.
The output of the command does not matter to me, so you don't need to provide a solution for that. I just want the service to be started and backgrounded via nohup using Python.

I think it could be due to the command you pass in, or the absolute path.
The following works on my side; please try it out:
import os
import subprocess

working_directory = '/home/user/service_location'
ab_path = os.path.abspath(working_directory)
# With shell=True, pass the command as a single string. Splitting it
# into a list (e.g. with shlex.split) while keeping shell=True would
# make the shell run only the first element, 'nohup', on its own.
command = 'nohup python service.py &'
ps = subprocess.Popen(command, cwd=ab_path, shell=True,
                      stdout=open('/dev/null', 'w'),
                      stderr=open('logfile.log', 'a'))
stdout, stderr = ps.communicate()
print("stdout: {}, stderr: {}".format(stdout, stderr))

Okay, after putting five hours into it, here is what a senior colleague of mine came up with, which is now working:
command = 'nohup python3.6 service.py &'
The exact reason this works is not clear to us at the moment. It may have something to do with the Python distributions and PATHs on this particular server (the code posted above seemed to work just fine on one of our dev servers). Strangely, both servers have the same distribution installed in the same way!
Python 3.6.0 |Continuum Analytics, Inc.| (default, Dec XX XXXX, 12:22:00)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux
Type "help", "copyright", "credits" or "license" for more information.
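For reference, on Python 3 there is also a subprocess-only way to background a child without nohup or a shell at all: start_new_session=True puts the child in its own session, detached from the controlling terminal, so it survives the parent exiting. A minimal runnable sketch (a sleep one-liner stands in for ["python", "service.py"] with cwd='/home/user/service_location' from the question):

```python
import os
import subprocess
import sys

# Stand-in for the real service command; redirect output so the child
# never blocks on a full pipe.
with open(os.devnull, "w") as devnull:
    ps = subprocess.Popen(
        [sys.executable, "-c", "import time; time.sleep(2)"],
        stdout=devnull, stderr=subprocess.STDOUT,
        start_new_session=True)  # detach from the controlling terminal

# Crucially, do not call ps.communicate() afterwards: it blocks until
# the child exits, which defeats the point of backgrounding.
print("started pid", ps.pid)
```

Note that the communicate() call in the snippets above waits for the child, which is one reason a backgrounding script can appear to hang.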

Related

subprocess.call always returns an error when script is run by cron

I have a simple Python script which I am using to automate updates to a DHCP config file.
The idea is that it puts the new config file in the dhcpd directory, runs a check, and if that returns OK it can restart the service. My code looks like this:
syslog.syslog(syslog.LOG_INFO, 'INFO: file copied to /etc/dhcp/conf.d')
return_code = subprocess.call(['dhcpd -t -cf /etc/dhcp/dhcpd.conf'], shell=True)
if return_code != 0:
    print('dhcp config test failed, exiting script')
    syslog.syslog(syslog.LOG_ERR, 'ERROR: dhcp config test failed, exiting script')
    sys.exit()
else:
    print('dhcp config test passed restarting service')
    syslog.syslog(syslog.LOG_INFO, 'INFO: config check passed, restarting service')
    return_code = subprocess.call(['service', conf['service_name'], 'restart'])
    if return_code != 0:
        print('dhcpd service failed to restart')
        syslog.syslog(syslog.LOG_ERR, 'ERROR: dhcpd service failed to restart')
    else:
        print('dhcpd service restarted')
        syslog.syslog(syslog.LOG_INFO, 'INFO: service restarted')
        email_results()
This script is kicked off by a cron job. When it runs, it always fails at this bit:
print('dhcp config test failed, exiting script')
If I run the script manually it always works fine and continues to the end as expected.
If I open the Python shell and run the important commands by hand, they seem to work fine:
python3
Python 3.5.2 (default, Nov 23 2017, 16:37:01)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import subprocess
>>> return_code = subprocess.call(['dhcpd -t -cf /etc/dhcp/dhcpd.conf'], shell=True)
Internet Systems Consortium DHCP Server 4.3.3
Copyright 2004-2015 Internet Systems Consortium.
All rights reserved.
For info, please visit https://www.isc.org/software/dhcp/
Config file: /etc/dhcp/dhcpd.conf
Database file: /var/lib/dhcp/dhcpd.leases
PID file: /var/run/dhcpd.pid
>>> print(return_code)
0
I have tried using "shell=True" and also tried without.
I have also tried subprocess.check_call with the same results.
Where am I going wrong here?
Use absolute paths instead of bare command names like dhcpd in your script.
To reproduce the problem, check whether your script still works when you run it after setting an empty PATH.
The arguments to subprocess should be either an array of strings or a single string. Passing in an array of a single string is an error, though it might happen to work on some platforms which are fundamentally broken anyway.
You want
return_code = subprocess.call(['dhcpd', '-t', '-cf', '/etc/dhcp/dhcpd.conf']) # shell=False implicitly
or
return_code = subprocess.call('dhcpd -t -cf /etc/dhcp/dhcpd.conf', shell=True)
but really, you should avoid shell=True whenever possible; see also Actual meaning of 'shell=True' in subprocess
And of course, if dhcpd is not in the PATH that you get from cron, you want to update the PATH correspondingly, or use an explicit hard-coded path like /usr/sbin/dhcpd (I generally recommend the former).
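The PATH point can be demonstrated with a short Python 3 sketch, using `true` as a stand-in for dhcpd (command resolution works the same way for any program):

```python
import shutil
import subprocess

# Resolve an absolute path while a full PATH is still available, as it
# is in an interactive shell.
abs_true = shutil.which("true")

# With an empty PATH -- similar to cron's stripped-down environment --
# a bare command name cannot be resolved:
try:
    subprocess.call(["true"], env={"PATH": ""})
    bare_name_resolved = True
except FileNotFoundError:
    bare_name_resolved = False

# An absolute path works regardless of PATH:
rc = subprocess.call([abs_true], env={"PATH": ""})
```

This mirrors the cron situation: the same script fails with the restricted PATH and succeeds once the path is absolute.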

stdin reading blocking when running sbt with python subprocess.Popen()

I'm launching sbt via Popen(), and stdin reading in my Python process stops working.
Here is an example:
On the first line I launch Popen; on the second line I try to browse through the history with an arrow key. For some time this does not work, printing ^[[A instead.
$ python
Python 2.7.10 (default, Jul 13 2015, 12:05:58)
[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import subprocess; f = open("/dev/null", "rw"); subprocess.Popen(["sbt"], stdout=f, stderr=f, stdin=f)
<subprocess.Popen object at 0x10fc03950>
>>> ^[[A^[[A^[[A^[[A^[[A^[[A^[[A^[[A^[[Aimport subprocess; f = open("/dev/null", "rw"); subprocess.Popen(["sbt"], stdout=f, stderr=f, stdin=f)
This seems to only happen with sbt.
Any idea why, and how to bypass this behavior?
Thanks
My guess is that sbt misbehaves when there is no pseudo-tty to interact with the user (probably because of jline).
Hence, let's use a python module to run the commands in a pseudo-tty. Install pexpect via pip (pip3 install pexpect for Python 3.x).
Then, run the following:
import pexpect, sys
f = open("sbt.log", "w+")
# Use `spawnu` for Python3, `spawn` otherwise
sbt = pexpect.spawnu("sbt -Dsbt.log.noformat=true \"version\" \"another-command\"", logfile=f)
# Do whatever is needed while sbt is running
# Force the process to expect EOF and file to be written
sbt.expect(pexpect.EOF)
Tested in Python 3.4.3 (Gentoo Linux) and works.

Source shell script and access exported variables from os.environ

I have a shell script where certain parameters are being set like:
k.sh:
export var="value"
export val2="value2"
Then I have a Python script where I call the shell script and want to use these environment variables:
ex1.py:
import subprocess
import os
subprocess.call("source k.sh",shell=True)
print os.environ["var"]
But I am getting a KeyError.
How can I use these shell variables in my Python script?
subprocess.call starts a shell in a new process, which calls source. There is no way to modify the environment within your process from a child process.
You could source k.sh and run a Python one-liner to print the contents of os.environ as JSON. Then use json.loads to convert that output back into a dict in your main process:
import subprocess
import json

output = subprocess.check_output(
    ". ~/tmp/k.sh && python -c 'import os, json; print(json.dumps(dict(os.environ)))'",
    shell=True)
env = json.loads(output)
print(env["var"])
yields
value
If you want to set the environment and then run a Python script, well, set the environment and run the Python script using runner:
runner:
#! /bin/bash
. k.sh
exec ex1.py
and that's it.
ex1.py:
#import subprocess
import os
#subprocess.call("source k.sh",shell=True)
print os.environ["var"]
As pointed out by chepner, your subprocess runs in its own process, so setting environment variables there cannot affect the parent.
Working with environment variables has to be done prior to launching the Python script.
For instance:
C:\Users\anton\Desktop\githubs>echo %x%
y
C:\Users\anton\Desktop\githubs>python
Python 2.7.3 (default, Apr 10 2012, 23:31:26) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> os.environ['x']
'y'
>>>
Sourcing your environment variables prior to launching the script will propagate them down to it. Alternatively, run the source and the commands that need the variables in a single shell invocation, so the sourced variables are still in scope, for instance:
import subprocess

output = subprocess.check_output(". ./k.sh && echo $var", shell=True)
Two separate subprocess calls each get their own shell, so variables sourced in the first are gone by the second. As mentioned, sourcing before launch is the simplest option.
/u/unutbu already answered this question. However, I fixed a couple of bugs in his code:
def run_external_script(script):
    if is_windows():
        command = script + " && python -c \"import os; print dict(os.environ)\""
    else:
        command = ". " + script + " && python -c 'import os; print dict(os.environ)'"
    output = subprocess.Popen(command, stdout=subprocess.PIPE,
                              stderr=subprocess.STDOUT, shell=True).communicate()[0]
    r = re.compile('{.*}')
    m = r.search(output)
    try:
        env = eval(m.group(0))
    except:
        error("Something went wrong in " + script)
        error(output)
    return env
There are a couple of small differences:
This code works on both Windows and Linux.
I replaced subprocess.check_output with subprocess.Popen; check_output requires Python 2.7.
When I ran his code, the stdout of the script would also get printed into the output variable, so I used a regex to grab just the dictionary (everything between two braces, such as {'var1': 1, 'var2': 'x'}).
Instead of using json, I used Python's eval. There is a chance of injection (for example {; exit(1); }), so use it at your own risk.
If you're on Windows, you can set an environment variable with PowerShell:
os.system("powershell.exe [System.Environment]::SetEnvironmentVariable('key', 'value','User')")
It's worth noting that before your application can use the variable, you will need to restart the application so that it reads the updated environment.
Then you can access it with:
yourValue = os.environ['key']

Can python shell have some pre-input?

When testing in the Python shell, I always have to type some imports, like:
Python 2.5.4 (r254:67916, Jun 24 2010, 15:23:27)
[GCC 4.4.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> import datetime
Can someone help me automate this? I want to run some command that enters a Python shell with these imports already done, leaving the shell waiting for me to continue typing commands.
Thanks.
Try:
python -i -c "import sys; import datetime;"
More info:
-i : inspect interactively after running script; forces a prompt even
if stdin does not appear to be a terminal; also PYTHONINSPECT=x
&
-c cmd : program passed in as string (terminates option list)
Create a file with the commands you want to execute during startup, and set the environment variable PYTHONSTARTUP to the location of that file. The interactive interpreter will then load and execute that file. See http://docs.python.org/tutorial/interpreter.html#the-interactive-startup-file
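The startup-file approach can be sketched in shell (the path ~/.pythonstartup is just a convention, not fixed):

```shell
# Collect the imports you always want into a startup file.
cat > ~/.pythonstartup <<'EOF'
import sys
import datetime
EOF

# Point PYTHONSTARTUP at it; every interactive `python` session will now
# run those imports before showing the first prompt.
export PYTHONSTARTUP=~/.pythonstartup
```

Put the export line in your shell profile so it applies to every session.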
On a sidenote, you might want to consider ipython as an improved Python shell when working in interactive mode.

os.system() execute command under which linux shell?

I am using /bin/tcsh as my default shell.
However, the tcsh-style command os.system('setenv VAR val') doesn't work for me, but os.system('export VAR=val') works.
So my question is: how can I know which shell os.system() runs commands under?
I was just reading Executing BASH from Python, then 17.1. subprocess — Subprocess management — Python v2.7.3 documentation, and I saw the executable argument; it seems to work:
$ python
Python 2.7.1+ (r271:86832, Sep 27 2012, 21:16:52)
[GCC 4.5.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> print os.popen("echo $0").read()
sh
>>> import subprocess
>>> print subprocess.call("echo $0", shell=True)
/bin/sh
0
>>> print subprocess.Popen("echo $0", stdout=subprocess.PIPE, shell=True).stdout.read()
/bin/sh
>>> print subprocess.Popen("echo $0", stdout=subprocess.PIPE, shell=True, executable="/bin/bash").stdout.read()
/bin/bash
>>> print subprocess.Popen("cat <(echo TEST)", stdout=subprocess.PIPE, shell=True).stdout.read()
/bin/sh: Syntax error: "(" unexpected
>>> print subprocess.Popen("cat <(echo TEST)", stdout=subprocess.PIPE, shell=True, executable="/bin/bash").stdout.read()
TEST
Hope this helps someone,
Cheers!
These days you should be using the subprocess module instead of os.system(). According to the documentation there, the default shell is /bin/sh. I believe os.system() works the same way.
Edit: I should also mention that the subprocess module allows you to set the environment available to the executing process through the env parameter.
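A small sketch of that env parameter (Python 3; the child shell sees exactly the mapping passed in, nothing inherited beyond it):

```python
import os
import subprocess

# Pass an explicit environment to the child shell; PATH is forwarded so
# the shell can still locate external commands if it needs any.
out = subprocess.check_output(
    "echo $VAR", shell=True,
    env={"VAR": "val", "PATH": os.environ.get("PATH", "")})
print(out.decode().strip())
```

Variables not included in the mapping simply do not exist for the child, which makes env useful for reproducing a clean environment.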
os.system() just calls the system() system call ("man 3 system"). On most *nixes this means you get /bin/sh.
Note that export VAR=val is technically not standard syntax (though bash understands it, and I think ksh does too). It will not work on systems where /bin/sh is actually the Bourne shell. On those systems you need to export and set as separate commands. (This will work with bash too.)
If your command is a shell script, the file is executable, and the file begins with "#!", you can pick your shell.
#!/bin/zsh
Do Some Stuff
You can write this file and then execute it with subprocess.Popen(filename,shell=True) and you'll be able to use any shell you want.
Also, be sure to read this about os.system and subprocess.Popen.
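The executable argument shown in the transcript above can be condensed into a runnable Python 3 sketch (assuming /bin/bash is installed):

```python
import subprocess

# With shell=True the command is interpreted by /bin/sh by default;
# `executable` substitutes a different shell for the same command string.
default_shell = subprocess.check_output(
    "echo $0", shell=True).decode().strip()
bash_shell = subprocess.check_output(
    "echo $0", shell=True, executable="/bin/bash").decode().strip()
print(default_shell, bash_shell)
```

Bash-only syntax such as process substitution works only once the executable is switched, exactly as the transcript shows.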
