How can I open a shell with a command in Python?

The following command opens a new shell and opens nano in it, when I type it into bash:
gnome-terminal -e "bash -c 'nano test; bash'"
So I tried the same in my python code with subprocess:
import subprocess
command = "gnome-terminal"
args = " -e \"bash -c 'nano test; bash'\""
subprocess.call([command, args])
I have already tried many combinations. Basically I just want to open a shell with a specific file in nano.
At first I thought this would be one of the easiest steps, but it turned out to be very difficult. I don't know if the problem is due to the quoting/escaping or if it's a common problem with passing variables the way I am used to in shells. So it might rather be a question for AskUbuntu or the Unix section... not sure...

The args should be the same set of individual strings you use on the command line. It's easier to think about if you construct the list all at once. gnome-terminal is the command, and it takes two arguments. (The second argument is more commonly thought of as the argument to the -e option, but from the caller's perspective, it's just two arguments. gnome-terminal itself is the one that groups them together as an option/argument pair.)
command = ["gnome-terminal", "-e", "bash -c 'nano test; bash'"]
subprocess.call(command)
Note that you could just pass a single string and let the shell sort it out, but the explicit argument list is superior:
subprocess.call('''gnome-terminal -e "bash -c 'nano test; bash'"''', shell=True)
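If the file to edit varies, a minimal sketch that builds the argument list from a variable might look like this (the filename variable is hypothetical, not from the question):
import subprocess

# Hypothetical: take the file to open from a variable instead of hard-coding it.
filename = "test"
inner = "nano {0}; bash".format(filename)  # command run inside the terminal
command = ["gnome-terminal", "-e", "bash -c '{0}'".format(inner)]
subprocess.call(command)
# Note: this simple formatting breaks if the filename contains quotes or spaces.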

Related

Python Script input via Bash Shell

I have a Python script which I am calling from a bash script, and this bash script is called from cron:
#!/bin/bash
set -o errexit
set -o xtrace
echo "Verify/Update Firmware"
/usr/bin/python -u /usr/bin/Update.py
Now when this Python script runs it asks for some input (from the keyboard), but I am not able to capture it. How can my Python script get its input in this scenario?
The Python script looks like this:
ip = raw_input('Enter IP for Switch')
tn = telnetlib.Telnet ( ip, 23, 600 )
For giving command-line arguments to a bash script you can use $1, $2, $3, etc. The tutorial here talks about this: http://linuxcommand.org/lc3_wss0120.php
For the Python part you can use something like argparse to do this quite nicely. This also has loads of tutorials out there.
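If you can modify Update.py, a minimal sketch of taking the IP as a command-line option instead of prompting might look like this (the --ip option name is an assumption, not part of the original script):
#!/usr/bin/env python
# Sketch of Update.py taking the switch IP as an option instead of prompting.
# The --ip option name is an assumption, not from the original script.
import argparse
import telnetlib

parser = argparse.ArgumentParser()
parser.add_argument('--ip', required=True, help='IP of the switch')
args = parser.parse_args()

tn = telnetlib.Telnet(args.ip, 23, 600)
The bash wrapper would then pass the address on the command line, e.g. /usr/bin/python -u /usr/bin/Update.py --ip 10.0.0.1 (the address is just a placeholder).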
For a single line of input use this:
echo "input" | command arg1 arg2
For multiple lines write the expected input to a file, then redirect the input:
command arg1 arg2 < inputfile
This is not guaranteed to work; it depends on many details.
Please consider the risk of blindly giving input without reading what the program wants.
For a more sophisticated solution check the expect utility.
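If you ever drive the prompting script from Python instead of bash, the same idea is to write the answer to the child's stdin. A minimal sketch, with the path and IP value as placeholders and the same caveats as above:
import subprocess

# Feed the answer to the raw_input() prompt over the child's stdin.
# The script path and the IP value here are placeholders.
p = subprocess.Popen(['/usr/bin/python', '-u', '/usr/bin/Update.py'],
                     stdin=subprocess.PIPE)
p.communicate('10.0.0.1\n')  # roughly equivalent to: echo "10.0.0.1" | Update.py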

Calling alias Command from python script

I need to run an OpenFOAM command from an automated Python script.
My python code contains the lines
subprocess.Popen(['OF23'], shell=True)
subprocess.Popen(['for i in *; do surfaceConvert $i file_path/$i.stlb; done'], shell=True)
where OF23 is a shell command defined as an alias:
alias OF23='export PATH=/usr/lib64/openmpi/bin/:$PATH;export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc'
This script runs the OpenFOAM command in a terminal, and file_path defines the stl files which are converted to binary format.
But when I run the script, I get 'OF23' is not defined.
How do I make my script run the alias command and then perform the OpenFOAM file conversion command?
That's not going to work, even once you've resolved the alias problem. Each Python subprocess.Popen is run in a separate subshell, so the effects of executing OF23 won't persist to the second subprocess.Popen.
Here's a brief demo:
import subprocess
subprocess.Popen('export ATEST="Hello";echo "1 $ATEST"', shell=True)
subprocess.Popen('echo "2 $ATEST"', shell=True)
output
1 Hello
2
So whether you use the alias, or just execute the aliased commands directly, you'll need to combine your commands into one subprocess.Popen call.
Eg:
subprocess.Popen('''export PATH=/usr/lib64/openmpi/bin/:$PATH;
export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;
source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc;
for i in *;
do surfaceConvert $i file_path/$i.stlb;
done''', shell=True)
I've used a triple-quoted string so I can insert linebreaks, to make the shell commands easier to read.
Obviously, I can't test that exact command sequence on my machine, but it should work.
You need to issue shopt -s expand_aliases to activate alias expansion. From bash(1):
Aliases are not expanded when the shell is not interactive, unless the expand_aliases shell option is set using shopt [...]
If that does not help, check if the shell executed from your Python program is actually Bash (e.g. by echoing $BASH).
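For example, a quick check from Python (this only shows whether the child shell is bash; $BASH stays empty under a plain /bin/sh such as dash):
import subprocess

# $BASH and $BASH_VERSION are only set when the command is interpreted by bash.
subprocess.call('echo "shell: $BASH version: $BASH_VERSION"', shell=True)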
If your command may use bash-isms then you could pass the executable parameter, otherwise /bin/sh is used. To expand aliases, you could use @Michael Jaros' suggestion:
#!/usr/bin/env python
import subprocess
subprocess.check_call("""
shopt -s expand_aliases
OF23
for i in *; do surfaceConvert $i file_path/$i.stlb; done
"""], shell=True, executable='/bin/bash')
If you already have a working bash-script then just call it as is.
Though to make it more robust and maintainable, you could convert to Python the parts that provide the most benefit, e.g., here's how you could emulate the for-loop:
#!/usr/bin/env python
import os
import subprocess

for entry in os.listdir('.'):
    subprocess.check_call(['/path/to/surfaceConvert', entry,
                           'file_path/{entry}.stlb'.format(entry=entry)])
It allows filenames to contain shell meta-characters such as spaces.
To configure the environment for a child process, you could use Popen's env parameter e.g., env=dict(os.environ, ENVVAR='value').
It is possible to emulate source bash command in Python but you should probably leave the parts that depend on it in bash-script.
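As an illustration of the env parameter, here is a sketch that sets the two paths from the alias for the conversion step (note: OpenFOAM's etc/bashrc sets more variables than these two, so this only shows the mechanism, not a full replacement for OF23):
import os
import subprocess

# Extend the inherited environment instead of exporting inside a subshell.
child_env = dict(os.environ)
child_env['PATH'] = '/usr/lib64/openmpi/bin/:' + child_env.get('PATH', '')
child_env['LD_LIBRARY_PATH'] = ('/usr/lib64/openmpi/lib/:'
                                + child_env.get('LD_LIBRARY_PATH', ''))

subprocess.check_call('for i in *; do surfaceConvert $i file_path/$i.stlb; done',
                      shell=True, env=child_env)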

Error in check_call() subprocess, executing 'mv' unix command: "Syntax error: '(' unexpected"

I'm making a python script for Travis CI.
.travis.yml
...
script:
- support/travis-build.py
...
The python file travis-build.py is something like this:
#!/usr/bin/env python
from subprocess import check_call
...
check_call(r"mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder", shell=True)
...
When Travis building achieves that line, I'm getting an error:
/bin/sh: 1: Syntax error: "(" unexpected
I've tried a lot of different ways to write it, but I always get the same result. Any idea?
Thanks in advance!
Edit
My current directory layout:
- my_project/final_folder/
- cmake-3.0.2-Darwin64-universal/
- fileA
- fileB
- fileC
With this command I'm trying to move all the current files fileA, fileB and fileC, excluding the my_project and cmake-3.0.2-Darwin64-universal folders, into ./my_project/final_folder. If I execute the command in a Linux shell it does what I want, but not through check_call().
Note: I can't move the files one by one, because there are many others.
I don't know which shell Travis is using by default because I don't specify it; I only know that if I write the command directly in my .travis.yml:
.travis.yml
...
script:
# Here is the previous Travis code
- mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder
...
It works. But if I use the script, it fails.
I found this command from the following issue:
How to use 'mv' command to move files except those in a specific directory?
You're using the bash feature extglob to exclude the entries you don't want to move. You'll need to enable it for the pattern to exclude those two entries.
The Python subprocess module explicitly uses /bin/sh when you use shell=True, and /bin/sh doesn't enable bash features like this by default (it's a compliance thing that keeps it closer to the original sh).
If you want bash to interpret the command, you have to pass it to bash explicitly, for example using:
subprocess.check_call(["bash", "-O", "extglob", "-c", "mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder"])
I would not choose to do the job in this manner, though.
Let me try again: in which shell do you expect your syntax !(...) to work? Is it bash? Is it ksh? I have never used it, and a quick search for a corresponding bash feature led nowhere. I suspect your syntax is just wrong, which is what the error message is telling you. In that case, your problem is entirely independent from Python and the subprocess module.
If a special shell you have on your system supports this syntax, you need to make sure that Python is using the same shell when invoking your command. It tells you which shell it has been using: /bin/sh. This is usually just a link to the real shell executable. Does it point to the same shell you have tested your command in?
Edit: the SO solution you referenced contains the solution in the comments:
Tip: Note however that using this pattern relies on extglob. You can
enable it using shopt -s extglob (If you want extended globs to be
turned on by default you can add shopt -s extglob to .bashrc)
Just to demonstrate that different shells might deal with your syntax in different ways, first using bash:
$ !(uname)
-bash: !: event not found
And then, using /bin/dash:
$ !(uname)
Linux
The argument to a subprocess method must be a list of command-line arguments (unless you pass shell=True). Use e.g. shlex.split() to split the string into the correct command-line arguments:
import shlex, subprocess
subprocess.check_call( shlex.split("mv !(...)") )
EDIT:
So, the goal is to move files/directories, with the exception of some file(s)/directory(ies). By playing around with bash, I could get it to work like this:
mv `ls | grep -v -e '\(exclusion1\|exclusion2\)'` my_project
So in your situation that would be:
mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` ./my_project/final_folder
This could go into the subprocess.check_call(..., shell=True) and it should do what you expect it to do.
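Put together, that might look like this (a sketch using the directory names from the question):
import subprocess

# One shell string, so the backticks and the grep pattern are handled by /bin/sh.
cmd = (r"mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` "
       r"./my_project/final_folder")
subprocess.check_call(cmd, shell=True)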

How to force os.system() to use bash instead of shell

I've tried what's told in How to force /bin/bash interpreter for oneliners
By doing
os.system('GREPDB="my command"')
os.system('/bin/bash -c \'$GREPDB\'')
However, no luck. Unfortunately I need to run this command with bash, and subprocess isn't an option in this environment; I'm limited to Python 2.4. Any suggestions to point me in the right direction?
Both commands are executed in different subshells.
Setting variables in the first system call does not affect the second system call.
You need to put the two commands into one string (combining them with ;).
>>> import os
>>> os.system('GREPDB="echo 123"; /bin/bash -c "$GREPDB"')
123
0
NOTE: You need to use "$GREPDB" instead of '$GREPDB'. Otherwise it is interpreted literally instead of being expanded.
If you can use subprocess:
>>> import subprocess
>>> subprocess.call('/bin/bash -c "$GREPDB"', shell=True,
... env={'GREPDB': 'echo 123'})
123
0
The solution below still initially invokes a shell, but it switches to bash for the command you are trying to execute:
os.system('/bin/bash -c "echo hello world"')
I use this:
subprocess.call(["bash","-c",cmd])
(OK, ignore this; I had not noticed that subprocess isn't an option for the asker.)
subprocess.Popen(cmd, shell=True, executable='/bin/bash')
I searched for this for some days and finally found code that really works:
import subprocess

def bash_command(cmd):
    subprocess.Popen(['/bin/bash', '-c', cmd])

code = "abcde"
# you can use echo options such as -e
bash_command('echo -ne "' + code + '"')
Output:
abcde

package creation file does not take value from .bashrc file

I am a newbie to Python, and for GUIs I use wxPython.
My issue is this:
I have to create a debian file for two types of products (say product 1 and product 2). That can be done by running the README.package.creation file. For "product1" we have to change, in ".bashrc":
Product = product1
After that we have to do "make clean" in a new terminal (otherwise the changes in .bashrc will not take effect, i.e. "product" may not be equal to "product1" if we don't follow this procedure), then we have to run ./Readme.package.creation. Readme.package.creation then automatically takes the product type as "product1".
If I do this manually it works fine, but if I do it through the GUI, the Readme.package.creation file will not pick up the product type. From Python a null value is sent.
Please help me solve my issue.
My code is:
subprocess.call("sed -i '/export PRODUCT/d' .bashrc", shell=True)
subprocess.call("sed -i '/export BOARD=TYpe/ a\ export PRODUCT=product1' .bashrc", shell=True)
os.chdir("/home/x/y/z")
subprocess.call("make clean", shell=True)
os.chdir("/home/x/main/src/package")
subprocess.call("sed -i 's/re.build -f -gui -p all/re.build -gui -p all -svn no/' README.package.creation", shell=True)
subprocess.call("gksu debian", shell=True)
subprocess.Popen("xfce4-terminal -e 'bash -c \"./README.package.creation -u %s\";sleep 10'" % (str(u_name)),shell=True)
After that, how do I follow the same procedure for product 2 as well?
EDIT:
How about os.environ in Python?
I have tried to change it with os.putenv, and then with os.environ, but it does not seem to work.
Try:
import os
os.environ['product']='product1'
subprocess.call("make clean", shell=True)
and so on
Your problem is very simple, and so is the solution:
In the subprocess.Popen(...), change the call from:
subprocess.Popen("xfce4-terminal -e 'bash -c \"./README.package.creation -u %s\";sleep 10'" % (str(u_name)),shell=True)
to:
subprocess.Popen("xfce4-terminal -e 'bash -c \"source ~/.bashrc; ./README.package.creation -u %s\";sleep 10'" % (str(u_name)),shell=True)
Essentially, you're asking bash to source the .bashrc file before calling the package creation command.
Another illustration:
sgulati@precise:~$ cat /tmp/1.sh
export A=100
sgulati@precise:~$ python -c "import subprocess
print subprocess.Popen(['bash', '-c', 'source /tmp/1.sh; echo \$A'], stdout=subprocess.PIPE).stdout.read()"
100
In this example, I declare the variable A=100 in /tmp/1.sh, source it and then execute echo $A. Because of source /tmp/1.sh, the value of A is known when echo $A is executed.
Please note that the syntax in my example is Python 2.7.3 syntax, but the concept is pretty much the same no matter how you go about it.
