Running bash commands with Python subprocess

I have a variable in my .bash_profile such as id=12345, for which I defined the alias
alias obs="echo $id"
since the id will change over time.
Now what I want to do is call this alias in my Python script for different purposes. My default shell is bash, so I have tried the following, based on suggestions found on the web:
import subprocess
subprocess.call('obs', shell=True, executable='/bin/bash')
subprocess.call(['/bin/bash', '-i', '-c', 'obs'])
subprocess.Popen('obs', shell=True, executable='/bin/bash')
subprocess.Popen(['/bin/bash', '-i', '-c', 'obs'])
However, none of them seems to work. What am I doing wrong?

.bash_profile is not read by Popen and friends.
Environment variables are available for your script, though (via os.environ).
You can use export in your Bash shell to export a value as an environment variable, or use env:
export MY_SPECIAL_VALUE=12345
python -c "import os; print(os.environ['MY_SPECIAL_VALUE'])"
# or
env MY_SPECIAL_VALUE=12345 python -c "import os; print(os.environ['MY_SPECIAL_VALUE'])"
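If you really need what is in .bash_profile itself, you can also source it explicitly in the child shell. A minimal sketch, assuming id is defined in ~/.bash_profile; note that bash does not expand aliases in non-interactive shells, so it is more reliable to echo the variable than to call the obs alias:
import subprocess

# Source .bash_profile in a child bash, then print the variable it defines.
out = subprocess.check_output(['/bin/bash', '-c', 'source ~/.bash_profile; echo "$id"'])
print(out.decode().strip())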

Related

How to activate virtualenv using python subprocess module?

I am trying to invoke a shell script using Python's subprocess module.
The shell script activates a virtualenv using virtualenvwrapper and in turn invokes a Python script.
That last Python script needs libraries installed in the virtualenv, and it is crashing.
I tried activating the virtualenv again in that script, but to no avail.
Parent Python code:
command = "/home/arman/analysis_server/new_analysis/run"
output = subprocess.Popen([command], shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
The run script:
#!/bin/bash
export WORKON_HOME=~/Envs
source /usr/local/bin/virtualenvwrapper.sh
workon analytics
python /home/arman/analysis_server/new_analysis/AnalysisWrapper.py
AnalysisWrapper.py:
cmd = "python /home/arman/analysis_server/new_analysis/DataHandlerWrapper.py " + instrument + " &"
subprocess.Popen(cmd, shell=True, executable='/bin/bash', stdout=out, stderr=out)
The DataHandlerWrapper.py script needs the virtualenv's libraries, but it crashes.
I think your issue is that each Popen call spawns its own subshell, so activating the virtualenv in one subprocess and trying to use it in another is never going to work.
If there's nothing happening in between, you could try chaining your commands into the same process:
command = "/home/arman/analysis_server/new_analysis/run && python /home/arman/analysis_server/new_analysis/DataHandlerWrapper.py " + instrument + " &"

How can I launch the GIMP command line from a Python script?

From this Stack Overflow thread (https://stackoverflow.com/questions/4443...mmand-line), I have extracted this command line:
gimp-console -idf --batch-interpreter python-fu-eval -b "import sys;sys.path=['.']+sys.path;import batch;batch.run('./images')" -b "pdb.gimp_quit(1)"
It works perfectly well.
Now I would like to run this command from a Python script. Usually I use subprocess.Popen, but this time it does not work and I get this message:
"batch command experienced an execution error"
How can I launch the GIMP command line from a Python script?
One easy way to resolve this is to put your GIMP startup command into a bash script, say startgimp.sh:
#!/bin/bash
#set your path to GIMP or cd into the folder where you installed GIMP
gimp-console -idf --batch-interpreter python-fu-eval -b "import sys;sys.path=['.']+sys.path;import batch;batch.run('./images')" -b "pdb.gimp_quit(1)"
then from Python simply call the bash script like so:
import subprocess
subprocess.call(["bash","/path/to/your/script/startgimp.sh"])
If you are able to make the .sh script executable, e.g. chmod +x startgimp.sh then you can skip the bash part and just do subprocess.call("/path/to/your/script/startgimp.sh")
Some caveats:
This assumes you're on a UNIX-based system.
I used subprocess.call, so this WILL block while waiting for GIMP to complete. Use subprocess.Popen as you did if you don't want that.
I don't have GIMP to try this out, but you could also try splitting your GIMP command into separate elements of the list you pass to subprocess and see if that works,
e.g. subprocess.call(["gimp-console", "-idf", "--batch-interpreter", "python-fu-eval" and so on); a fully split version is sketched below.
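For reference, here is the fully split version of the same command as an argument list. This is a sketch that merely restates the batch command above, untested without GIMP installed:
import subprocess

# The same GIMP batch invocation, passed as an argument list instead of a shell string.
subprocess.call([
    "gimp-console", "-idf",
    "--batch-interpreter", "python-fu-eval",
    "-b", "import sys;sys.path=['.']+sys.path;import batch;batch.run('./images')",
    "-b", "pdb.gimp_quit(1)",
])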

Python3 Run Alias Bash Commands

I have the following code that works great to run the ls command. I also have a bash alias, alias ll='ls -alFGh'. Is it possible to get Python to run the bash alias directly, without me having to load my bash alias file, parse it, and then run the full expanded command myself?
import subprocess
command = "ls" # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None, shell=True)
#Launch the shell command:
output = process.communicate()
print (output[0])
Trying with command = "ll", the output I get is:
/bin/sh: ll: command not found
b''
You cannot. When you run a Python process it has no knowledge of any shell alias. There are only two simple ways of passing text from a parent to a child process (other than IPC): the command line, and environment (i.e. exported) variables. Bash does not support exporting aliases.
From the bash man page: "For almost every purpose, aliases are superseded by shell functions."
Bash does support exporting functions, so I suggest you make your alias a simple function instead. That way it is exported from shell to python to shell. For example:
In the shell:
ll() { ls -l; }
export -f ll
In python:
import subprocess
command = "ll" # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None, shell=True)
output = process.communicate()
print(output[0].decode()) # Required if using Python 3
Since you are using the print() function I have assumed you are using Python 3, in which case you need the .decode(), since communicate() returns bytes objects.
With a bit of hackery it is possible to create and export shell functions from Python as well, as sketched below.
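A sketch of that hackery, assuming a post-Shellshock bash, which passes exported functions through environment variables named BASH_FUNC_<name>%% (an internal convention that may vary between bash versions):
import os
import subprocess

# Build an environment containing a shell function, the way "export -f ll" would.
env = dict(os.environ)
env['BASH_FUNC_ll%%'] = '() { ls -alFGh; }'
subprocess.call(['bash', '-c', 'll'], env=env)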

How to customize virtualenv shell prompt

How do you define a custom prompt to use when activating a Python virtual environment?
I have a bash script for activating a virtualenv I use when calling specific Fabric commands. I want the shell prompt to say something like "(fab)" so I can easily distinguish it from other shells I have open. Following this example, I've tried:
#!/bin/bash
script_dir=`dirname $0`
cd $script_dir
/bin/bash -c ". .env/bin/activate; PS1='(fab) '; exec /bin/bash -i"
but there's no change to the prompt. What am I doing wrong?
The prompt is set in the virtualenv's activate script (located in the bin folder under the virtualenv). If you only want to change the prompt some of the time, you could set an environment variable before calling activate (make sure to clear it in the corresponding deactivate file). If you simply want the prompt to be different all the time, you can do that right in activate at the line that looks like
PS1="(virtualenvname) $PS1"
(on Windows, activate.bat uses set "PROMPT=(virtualenvname) %PROMPT%" instead).
If you're using virtualenvwrapper, you could do all of this in the postactivate and postdeactivate scripts as well.
I couldn't find any way to do this via a script executed as a child process. Calling a separate bash process seems to forget any previously set PS1. However, it turned out to be trivial if I just sourced the script:
#!/bin/bash
script_dir=`dirname $0`
cd $script_dir
. .env/bin/activate
PS1="(fab) "
It appears the
exec /bin/bash -i
is resetting the PS1 variable. When I run
export PS1="foo "; bash
it resets it too. Curiously, when I look into the bash sources (shell.c and variables.c) it appears to use
set_if_not ("PS1", primary_prompt);
to initialize it. But I'm not exactly sure what happens between this and main(). Giving up.
I tried this on Cygwin and on Linux (Red Hat CentOS) as well, and found a solution for both.
CYGWIN
After some investigation I found that the problem is that PS1 is set by /etc/bash.bashrc, which overrides the PS1 environment variable. So you need to prevent that file from being read, using --norc:
/bin/bash -c ". .env/bin/activate; PS1='(fab) ' exec /bin/bash -i --norc"
or
/bin/bash -c ". .env/bin/activate; export PS1='(fab) '; exec /bin/bash -i --norc"
LINUX
Here it is simpler:
/bin/bash -c ". .env/bin/activate; PS1='(fab) ' exec /bin/bash -i"
or
/bin/bash -c ". .env/bin/activate; export PS1='(fab) '; exec /bin/bash -i"
If the script you are calling does not export its variables (and I suppose it does not) and the variables it sets do not appear in the environment, then you could try something like this:
/bin/bash -c "PS1='(fab) ' exec /bin/bash --rcfile .env/bin/activate; "
I hope this helps!
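Since the question started from Python and Fabric, the same trick can be driven from a Python script. A sketch, assuming the activate script lives at .env/bin/activate as in the question:
import subprocess

# Spawn an interactive bash with the virtualenv activated and a custom prompt.
# --norc stops /etc/bash.bashrc or ~/.bashrc from overwriting PS1 afterwards.
subprocess.call(['/bin/bash', '-c',
                 ". .env/bin/activate; PS1='(fab) ' exec /bin/bash -i --norc"])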

How to execute remote pysftp commands under a specific shell

I use the Python module pysftp to connect to a remote server. Below you can see the Python code:
import pysftp
import sys
import sqr_common
srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2="APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash starts a shell that never exits, so my second command is never executed. Can anyone tell me how to choose a shell on the remote server, for example bash, and execute a command in it? Is there any pysftp function that allows me to choose the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you need to run only a single command, you can use bash's -c switch:
bash -c "echo 123"
You can run multiple commands, separated by semicolons:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, create a shell script file (a .bash file) on the remote machine and execute it:
bash myscript.bash
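Putting that together with pysftp, a sketch using the same placeholder host and credentials as the question; execute() runs a single command remotely and returns its output lines:
import pysftp

srv = pysftp.Connection(host="xxxxxx", username="xxxx", password="xxxxx")
# Run the export and a check in one bash invocation, since each execute()
# call runs in a fresh remote process.
output = srv.execute("/usr/bin/bash -c 'APSHOME=/all/aps/msc_2012; export APSHOME; echo $APSHOME'")
print(output)
srv.close()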
