I am trying to invoke a shell script using Python's subprocess module.
The shell script activates a virtualenv using virtualenvwrapper and in turn invokes a Python script.
That last Python script needs libraries installed in the virtualenv, and it is crashing.
I tried activating the virtualenv again in the Python script, but it was of no use.
Parent Python code:
command = "/home/arman/analysis_server/new_analysis/run"
output = subprocess.Popen([command], shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
The run script:
#!/bin/bash
export WORKON_HOME=~/Envs
source /usr/local/bin/virtualenvwrapper.sh
workon analytics
python /home/arman/analysis_server/new_analysis/AnalysisWrapper.py
AnalysisWrapper.py:
cmd = "python /home/arman/analysis_server/new_analysis/DataHandlerWrapper.py " + instrument + " &"
subprocess.Popen(cmd, shell=True, executable='/bin/bash', stdout=out, stderr=out)
The DataHandlerWrapper.py script needs the virtualenv, but it is crashing.
I think your issue is that each Popen call spawns its own subshell, so activating the virtualenv in one subprocess and trying to use it in another is never going to work.
If there's nothing happening in between, you could try chaining your commands into the same process:
command = "/home/arman/analysis_server/new_analysis/run && python /home/arman/analysis_server/new_analysis/DataHandlerWrapper.py " + instrument + " &"
Related
I am trying to run bash commands on Spark via Python.
My simple current code is as follows:
import subprocess
print("Test start")
subprocess.Popen("conda install numpy=1.15.2 -n python35env--yes", shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).wait()
print("Test end")
The above code executes successfully, and I do see both print statements, "Test start" and "Test end", but the numpy package is not getting installed in the python35env environment.
Am I missing anything important for running the conda install bash command on the Spark worker nodes, or even on the driver node?
You can try running it from the directory containing the conda executable. By default, subprocess runs the command from the current working directory.
subprocess.Popen("conda install numpy=1.15.2 -n python35env --yes", shell=True, stdout=subprocess.PIPE,
                 stderr=subprocess.PIPE, cwd='your conda executable path')
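An alternative sketch, in case the working directory is not the issue: call conda by an absolute path and pass the arguments as a list, so no shell parsing is involved. The /opt/conda/bin/conda path is an assumption; adjust it to your installation (which conda will tell you):
import subprocess

# /opt/conda/bin/conda is an assumed install location -- adjust as needed
result = subprocess.run(
    ["/opt/conda/bin/conda", "install", "numpy=1.15.2",
     "-n", "python35env", "--yes"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(result.returncode)       # 0 on success
print(result.stderr.decode())  # conda's error output, if any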
Using subprocess.
Example:
import subprocess
result = subprocess.getoutput('<bash command>')
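For instance, a quick hedged illustration (getoutput runs the command through the shell and returns its combined stdout and stderr as a string; the command shown is only a sample):
import subprocess

result = subprocess.getoutput('conda list numpy')
print(result)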
OS X 10.13.6, Python 3.6
I am trying to run the following command from a jupyter notebook:
from subprocess import Popen, PIPE, STDOUT

vpn_cmd = '''
sudo openvpn
--config ~/Downloads/configs/ipvanish-US-Chicago-chi-a49.ovpn
--ca ~/Downloads/configs/ca.ipvanish.com.crt'''
proc = Popen(vpn_cmd.split(), stdout=PIPE, stderr=STDOUT)
stdout, stderr = proc.communicate()
print(stdout.decode())
But I get the error:
sudo: openvpn: command not found
What I've tried:
added export PATH="/usr/local/sbin:$PATH" to my ~/.bash_profile and can run the sudo openvpn command from my terminal
edited my sudoers file so sudo no longer prompts for a password
called sudo which openvpn and tried adding /usr/local/sbin/openvpn to my sys.path within python
not splitting vpn_cmd and setting shell=True
tried packaging it in a test.py script and executing from the terminal, but it just hangs at the proc.communicate() line
specified the full path for the --config and --ca flags
So far, nothing has fixed this. I can run openvpn from my terminal just fine. It seems like a simple path issue, but I can't figure out what I need to add to my Python path. Is there something particular about the Jupyter notebook kernel?
Jupyter probably isn't picking up your personal .bashrc settings, depending also on how you are running it. Just hardcode the path or augment the PATH in your Python script instead.
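For the second option, a minimal sketch of augmenting PATH from Python, assuming openvpn lives in /usr/local/sbin as in the question; child processes inherit os.environ, so subsequent lookups will find the binary:
import os

# prepend the directory so processes spawned from the notebook can find openvpn
os.environ["PATH"] = "/usr/local/sbin:" + os.environ["PATH"]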
With shell=False you don't get the tildes expanded, so you should change those to os.environ["HOME"], or make sure you know which directory you run this from and use relative paths.
You should not be using Popen() if run() can do what you require.
import os
import subprocess

home = os.environ["HOME"]
r = subprocess.run(
    ['sudo', '/usr/local/sbin/openvpn',
     '--config', home + '/Downloads/configs/ipvanish-US-Chicago-chi-a49.ovpn',
     '--ca', home + '/Downloads/configs/ca.ipvanish.com.crt'],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)
print(r.stdout)
I want to run a Python script at boot on Ubuntu 14.04 LTS.
My rc.local file is as follows:
sudo /home/hduser/morey/zookeeper-3.3.6/bin/zkServer.sh start
echo "test" > /home/hduser/test3
sudo /home/hduser/morey/kafka/bin/kafka-server-start.sh /home/hduser/morey/kafka/config/server.properties &
echo "test" > /home/hduser/test1
/usr/bin/python /home/hduser/morey/kafka/automate.py &
echo "test" > /home/hduser/test2
exit 0
Everything except my Python script is working fine, even the echo statement after the Python script, but the Python script itself doesn't seem to run.
My Python script is as follows:
import sys
from subprocess import Popen, PIPE, STDOUT
cmd = ["sudo", "./sbt", "project java-examples", "run"]
proc = Popen(cmd, shell=False, stdout=PIPE, stdin=PIPE, stderr=STDOUT)
proc.communicate(input='1\n')
proc.stdin.close()
which works perfectly fine if executed individually.
I went through the following questions (link) and did a lot of research, but couldn't find a solution.
Edit: the echo statements are for testing purposes only. The second actual command (not counting the echo statements) starts a server which keeps running, and the Python script also starts a listener which runs in an infinite loop, if that is any help.
The Python script tries to launch ./sbt. Are you sure what the current directory is when rc.local runs? The rule is: always use absolute paths in system scripts.
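A hedged sketch of that fix: give sbt an absolute path and pin the working directory explicitly, so nothing depends on where rc.local happens to run. The /home/hduser/morey location is an assumption based on the question; adjust it to where sbt actually lives:
from subprocess import Popen, PIPE, STDOUT

cmd = ["sudo", "/home/hduser/morey/sbt", "project java-examples", "run"]
proc = Popen(cmd, cwd="/home/hduser/morey",
             stdout=PIPE, stdin=PIPE, stderr=STDOUT)
proc.communicate(input=b"1\n")  # bytes under Python 3; use '1\n' under Python 2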
Do not run the Python script in the background; run it in the foreground, and do not exit from its parent script. Better yet, call another script from rc.local that does all the work of the echo statements and the script launching.
Run that script from rc.local, not in the background (no &).
You do not need sudo, as rc.local is run as root.
If you want to run a Python script at system boot, there is an alternate solution which I have used.
1: Create an sh file, e.g. sample.sh, and paste in the following content:
#!/bin/bash
clear
python yourscript.py
2: Now add a cron job that runs at reboot. On Linux you can do it as follows:
a: Run crontab -e (if cron is missing, install it with sudo apt-get install cron)
b: Add the line @reboot /full/path/to/sh/file > /home/path/error.log 2>&1
Then restart your device.
I'm trying to run a Python script from Python using the subprocess module, executing a script sequentially.
I'm doing this on UNIX, but before I launch Python in a new shell I need to execute a command (ppack_gnu) that sets up the environment for Python (and prints some lines to the console).
The thing is that when I run this command from Python's subprocess, the process hangs and waits for the command to finish, whereas when I do it in the UNIX console it jumps to the next line automatically.
Examples below:
From UNIX:
[user1#1:~]$ ppack_gnu; echo 1
You appear to be in prefix already (SHELL=/opt/soft/cdtng/tools/ppack_gnu/3.2/bin/bash)
1
[user1#1:~]$
From PYTHON:
processes.append(Popen("ppack_gnu; echo 1", shell=True, stdin=subprocess.PIPE))
This will print "Entering Gentoo Prefix /opt/soft/cdtng/tools/ppack_gnu/3.2 - run 'bash -l' to source full bash profiles" in the Python console and then hang...
Popen() does not hang: it returns immediately while ppack_gnu may be still running in the background.
The fact that you see the shell prompt does not mean that the command has returned:
⟫ echo $$
9302 # current shell
⟫ bash
⟫ echo $$
12131 # child shell
⟫ exit
⟫ echo $$
9302 # current shell
($$ -- PID of the current shell)
Even in bash, you can't change environment variables of the parent shell (without gdb or similar hacks); that is why the source command exists.
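A small sketch illustrating the same point from Python: a child process cannot modify its parent's environment, so os.environ is unchanged after the child exits:
import os
import subprocess

subprocess.run("export FOO=bar", shell=True)  # runs in a child shell
print(os.environ.get("FOO"))                  # None: the parent is unaffected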
stdin=PIPE suggests that you want to pass commands to the shell started by ppack_gnu. Perhaps you need to add process.stdin.flush() after the corresponding process.stdin.write(b'command\n').
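A minimal sketch of driving the prefix shell through its stdin, assuming ppack_gnu reads commands line by line once started:
from subprocess import Popen, PIPE

proc = Popen("ppack_gnu", shell=True, stdin=PIPE)
proc.stdin.write(b"echo 1\n")
proc.stdin.flush()   # make sure the line actually reaches the child
proc.stdin.close()   # send EOF so the prefix shell can exit
proc.wait()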
I use the Python module pysftp to connect to a remote server. Below you can see the Python code:
import pysftp
import sys
import sqr_common
srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2="APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash starts an interactive process that never exits, so my second command will never be executed. Can anyone tell me how to choose a shell (for example bash) on the remote server and execute a command in it? Is there any pysftp function that allows me to choose the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you need to run only a single command, you can run it using the bash -c switch:
bash -c "echo 123"
You can run multiple commands separated by ;:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, create a shell script file (a .bash file) on the remote machine and execute it:
bash myscript.bash
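Applied to the question, a hedged sketch of running such a command through pysftp's execute(); the host and credentials are the placeholders from the question:
import pysftp

srv = pysftp.Connection(host="xxxxxx", username="xxxx", password="xxxxx")
# single-quote the inner command so the remote login shell does not
# expand $APSHOME before bash -c runs it
output = srv.execute("/usr/bin/bash -c 'APSHOME=/all/aps/msc_2012; export APSHOME; echo $APSHOME'")
print(output)  # execute() returns the command's output lines
srv.close()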