Python calling shell script - no directory? - python

In a python script I call a bash script as follows:
subprocess.Popen(["./scan.sh", dir])
Inside that script, there is,
find $1 -name "*013*.txt" > found_files.txt
For some reason, the dir argument from python is translated into a version with quotes inside the bash script. Printing 'dir' in python yields the exact path as the user typed it:
~/Desktop/Files
however, find fails with
find: '~/Desktop/Files' no such directory
Running scan.sh manually with ~/Desktop/Files as the argument works fine. How come quotes are being put around it...?

There aren't. What's happening is that the ~ is not being interpreted, as it's the shell's job to do so. Use os.path.expanduser() to expand the path before passing it to subprocess.
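For instance, a minimal sketch of expanding the path on the Python side before handing it to the script (scan.sh and the path are the ones from the question):

import os
import subprocess

dir = "~/Desktop/Files"                 # the path exactly as the user typed it
expanded = os.path.expanduser(dir)      # e.g. /home/<user>/Desktop/Files
subprocess.Popen(["./scan.sh", expanded])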

Use $HOME. The ~ is not expanded inside single quotes or double quotes.
python -c "import subprocess;subprocess.Popen(['./scan.sh', '~'])"
python -c "import subprocess;subprocess.Popen(['./scan.sh', '$HOME'])"
My scan.sh contains:
#!/bin/sh
echo =$1=
The first one prints =~=, the second prints =/Users/jomo=.


determine file type of a file without extension

I want to use pygmentize to highlight some script files (python/bash/...) without extension. But pygmentize requires me to specify the lexer using -l. It does not automatically identify the file type from the content.
I have the following options at hand, but none of them work now:
use file -b --mime-type. But this command outputs x-python and x-shellscript instead of python and bash, and I don't know the mapping rules
use vim -e -c 'echo &ft|q' the_file. For any file, with or without an extension, vim has a mechanism to guess the file type. But it doesn't work here, since the output goes to the vim window and disappears after q.
What can I do?
@Samborski's method works fine in the normal case, but it does not work in Python's subprocess.check_output since no pts is allocated. If you use nvim, you can use this more straightforward way:
HOME=_ nvim --headless -es file <<EOF
call writefile([&ft], "/dev/stdout")
EOF
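A sketch of driving that nvim invocation from Python with subprocess.check_output, assuming Python 3.4+ for the input= keyword; the Ex command is fed on stdin instead of a heredoc, HOME=_ is set the same way as above, and the function name guess_filetype is just for illustration:

import os
import subprocess

def guess_filetype(path):
    # Same Ex command as the heredoc above, passed on stdin.
    script = 'call writefile([&ft], "/dev/stdout")\n'
    out = subprocess.check_output(
        ["nvim", "--headless", "-es", path],
        input=script.encode(),
        env=dict(os.environ, HOME="_"),
    )
    return out.decode().strip()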
You can use vim this way:
vim -c ':silent execute ":!echo " . &ft . " > /dev/stdout"' -c ':q!' the_file
It simply constructs the command to run in the shell via string concatenation.

run python script from a .bat file and using Powershell

I run several python scripts on my W10 server and I'm looking for a way to open them all together.
I run my scripts manually by opening a PowerShell in the script folder and executing pythonw script.py
I have tried several ways, with no result...
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& 'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe C:\Users\pc2\script.py'"
also tested:
powershell.exe -noexit -command "'pythonw C:\Users\pc2\script.py'"
And:
powershell -command "& {&'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe ' C:\Users\pc2\script.py}"
None of the above does anything... they should be opening the script.py file using pythonw.
Can anyone point me in the correct direction to get this done?
To execute a single script:
powershell -c "pythonw 'C:\Users\pc2\script1.py'"
Note: The enclosing '...' around the file path are only needed if the latter contains shell metacharacters such as whitespace.
Alternatively, use ""..."" when calling from cmd.exe (batch file), and `"...`" when calling from PowerShell.
To execute multiple scripts, in sequence:
powershell -c "& { $Args | % { pythonw $_ } } 'C:\Users\pc2\script1.py' 'C:\Users\pc2\script2.py'"
Note: This only works when calling from cmd.exe (batch file).
(From within PowerShell, simply execute what's inside "..." directly)
As for what you tried, focusing just on what command string PowerShell ended up seeing:
& 'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe C:\Users\pc2\script.py'
The problem is that you're passing the entire command line to &, whereas it only expects the executable name;
C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe C:\Users\pc2\script.py is obviously not a valid executable name / path.
'pythonw C:\Users\pc2\script.py'
By virtue of being enclosed in '...', this is a string literal, which PowerShell simply outputs as such - no command is ever executed.
& {&'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe ' C:\Users\pc2\script.py}
Since you're using a script block ({ ... }), you're passing C:\Users\pc2\script.py as an argument, so you'd have to refer to it as such inside the block, either via the automatic $Args variable, or via a parameter defined in a param() block.
(As an aside: your file path has an extraneous trailing space, but PowerShell seems to ignore that.)
From the powershell console itself, you can directly run like this:
PS C:\Python27> python C:\Users\pc2\script.py
If necessary, edit the PATHEXT environment variable in the PowerShell profile:
$env:PATHEXT += ";.py"
In a batch file you can also put the same command to execute the Python script. Save the file as something.bat:
python C:\Users\pc2\script.py
Hope it helps.

How to execute a bash script from a Python string in Jupyter Notebook?

I am using Jupyter Notebook and would like to execute a bash script from a Python string. I have a Python cell that creates the bash script, which I then need to print, copy to another cell, and run. Is it possible to use something like exec("print('hello world!')")?
Here is an example of my bash script:
%%bash -s "$folder_dir" "$name_0" "$name_1" "$name_2" "$name_3" "$name_4" "$name_5" "$name_6" "$name_7" "$name_8" "$name_9" "$name_10" "$name_11"
cd $1
ds9 ${2} ${3} ${4} ${5} ${6} ${7} ${8} ${9} ${10} ${11} ${12} ${13}
If not possible, then how can I go to a different directory, and run
ds9 dir1 dir2 dir3 ...
within my Jupyter Notebook since I only can initialize dir's using python. Note that the number of dir's is not fixed every time I run my code. ds9 is just a command to open multiple astronomical images at the same time.
I know I can save my bash script to a .sh file and execute that, but I am looking for a classier solution.
The subprocess module is the Right Thing for invoking external software -- in a shell or otherwise -- from Python.
import subprocess
folder_dir="/" # your directory
names=["name_one", "name_two"] # this is your list of names you want to open
subprocess.check_call(
    ['cd "$1" || exit; shift; exec ds9 "$@"', "_", folder_dir] + names,
    shell=True)
How it works (Python)
When passing a Python list with shell=True, the first item in that list is the script to run; the second is the value of $0 when that script is running, and subsequent items are values for $1 and onward.
Note that this runs sh -c '...' with your command. sh is not bash, but (in modern systems) a POSIX sh interpreter. It's thus important not to use bash-only syntax in this context.
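As a tiny illustration of that mapping (just a sketch; echo stands in for the real script, and the arguments are made up):

import subprocess

# Prints: 0=_ 1=/some/dir 2=name_one
subprocess.check_call(
    ['echo "0=$0 1=$1 2=$2"', "_", "/some/dir", "name_one"],
    shell=True)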
How it works (Shell)
Let's go over this line-by-line:
cd "$1" || exit # try to cd to the directory passed as $1; abort if that fails
shift # remove $1 from our argument list; the old $2 is now $1, &c.
exec ds9 "$@" # replace the shell in memory with the "ds9" program (as an efficiency
# ...measure), with our argument list appended to it.
Note that "$1" and "$@" are quoted. Any unquoted expansion would be string-split and glob-expanded; without the quotes, you wouldn't be able to open files with spaces in their names.

Calling alias Command from python script

I need to run an OpenFOAM command by automatized python script.
My python code contains the lines
subprocess.Popen(['OF23'], shell=True)
subprocess.Popen(['for i in *; do surfaceConvert $i file_path/$i.stlb; done'], shell=True)
where OF23 is a shell alias defined as
alias OF23='export PATH=/usr/lib64/openmpi/bin/:$PATH;export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc'
This script runs the OpenFOAM command in the terminal, and file_path defines where the stl files converted to binary format are written.
But when I run the script, I get an error that 'OF23' is not defined.
How do I make my script run the alias command and also perform the subsequent OpenFOAM file conversion command?
That's not going to work, even once you've resolved the alias problem. Each Python subprocess.Popen is run in a separate subshell, so the effects of executing OF23 won't persist to the second subprocess.Popen.
Here's a brief demo:
import subprocess
subprocess.Popen('export ATEST="Hello";echo "1 $ATEST"', shell=True)
subprocess.Popen('echo "2 $ATEST"', shell=True)
output
1 Hello
2
So whether you use the alias, or just execute the aliased commands directly, you'll need to combine your commands into one subprocess.Popen call.
Eg:
subprocess.Popen('''export PATH=/usr/lib64/openmpi/bin/:$PATH;
export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;
source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc;
for i in *;
do surfaceConvert $i file_path/$i.stlb;
done''', shell=True)
I've used a triple-quoted string so I can insert linebreaks, to make the shell commands easier to read.
Obviously, I can't test that exact command sequence on my machine, but it should work.
You need to issue shopt -s expand_aliases to activate alias expansion. From bash(1):
Aliases are not expanded when the shell is not interactive, unless the expand_aliases shell option is set using shopt [...]
If that does not help, check if the shell executed from your Python program is actually Bash (e.g. by echoing $BASH).
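For example, a quick sketch of that check from Python: if $BASH prints empty, shell=True is not running bash.

import subprocess

# With shell=True, Python runs /bin/sh -c '...'; this shows whether that sh is bash.
subprocess.check_call('echo "BASH=$BASH"', shell=True)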
If your command may use bash-isms then you could pass the executable parameter, otherwise /bin/sh is used. To expand aliases, you could use @Michael Jaros' suggestion:
#!/usr/bin/env python
import subprocess
subprocess.check_call("""
shopt -s expand_aliases
OF23
for i in *; do surfaceConvert $i file_path/$i.stlb; done
""", shell=True, executable='/bin/bash')
If you already have a working bash-script then just call it as is.
Though to make it more robust and maintainable, you could convert to Python the parts that provide the most benefit, e.g., here's how you could emulate the for-loop:
#!/usr/bin/env python
import os
import subprocess

for entry in os.listdir('.'):
    subprocess.check_call(['/path/to/surfaceConvert', entry,
                           'file_path/{entry}.stlb'.format(entry=entry)])
It allows filenames to contain shell meta-characters such as spaces.
To configure the environment for a child process, you could use Popen's env parameter e.g., env=dict(os.environ, ENVVAR='value').
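A sketch of that env approach with the two paths from the alias above; note this only covers the two exported variables (whatever the sourced OpenFOAM bashrc sets up is not reproduced here), and the surfaceConvert arguments are placeholders:

import os
import subprocess

env = dict(os.environ)
env["PATH"] = "/usr/lib64/openmpi/bin/:" + env.get("PATH", "")
env["LD_LIBRARY_PATH"] = "/usr/lib64/openmpi/lib/:" + env.get("LD_LIBRARY_PATH", "")

# Placeholder file names, just to show where env= goes.
subprocess.check_call(["surfaceConvert", "input.stl", "file_path/input.stl.stlb"],
                      env=env)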
It is possible to emulate source bash command in Python but you should probably leave the parts that depend on it in bash-script.

Error in check_call() subprocess, executing 'mv' unix command: "Syntax error: '(' unexpected"

I'm making a python script for Travis CI.
.travis.yml
...
script:
- support/travis-build.py
...
The python file travis-build.py is something like this:
#!/usr/bin/env python
from subprocess import check_call
...
check_call(r"mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder", shell=True)
...
When Travis building achieves that line, I'm getting an error:
/bin/sh: 1: Syntax error: "(" unexpected
I just tried a lot of different forms to write it, but I get the same result. Any idea?
Thanks in advance!
Edit
My current directory layout:
- my_project/final_folder/
- cmake-3.0.2-Darwin64-universal/
- fileA
- fileB
- fileC
With this command I'm trying to move all the current files fileA, fileB and fileC, excluding the my_project and cmake-3.0.2-Darwin64-universal folders, into ./my_project/final_folder. If I execute this command in a Linux shell it does what I want, but not through the check_call() command.
Note: I can't move the files one by one, because there are many others
I don't know which shell Travis is using by default because I don't specify it; I only know that if I write the command in my .travis.yml:
.travis.yml
...
script:
# Here is the previous Travis code
- mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder
...
It works. But If I use the script, it fails.
I found this command from the following issue:
How to use 'mv' command to move files except those in a specific directory?
You're using the bash feature extglob to exclude the files you don't want to move. You'll need to enable it in order to have it exclude the two entries you're specifying.
The python subprocess module explicitly uses /bin/sh when you use shell=True, which doesn't enable the use of bash features like this by default (it's a compliance thing to make it more like original sh).
If you want to get bash to interpret the command; you have to pass it to bash explicitly, for example using:
subprocess.check_call(["bash", "-O", "extglob", "-c", "mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder"])
I would not choose to do the job in this manner, though.
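One alternative, sticking to plain Python instead of extglob (a sketch based on the directory layout given in the question):

import os
import shutil

# Skip the two directories named in the question; move everything else.
exclude = {"my_project", "cmake-3.0.2-Darwin64-universal"}
for entry in os.listdir("."):
    if entry not in exclude:
        shutil.move(entry, os.path.join("my_project", "final_folder", entry))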
Let me try again: in which shell do you expect your syntax !(...) to work? Is it bash? Is it ksh? I have never used it, and a quick search for a corresponding bash feature led nowhere. I suspect your syntax is just wrong, which is what the error message is telling you. In that case, your problem is entirely independent form python and the subprocess module.
If a special shell you have on your system supports this syntax, you need to make sure that Python is using the same shell when invoking your command. It tells you which shell it has been using: /bin/sh. This is usually just a link to the real shell executable. Does it point to the same shell you have tested your command in?
Edit: the SO solution you referenced contains the solution in the comments:
Tip: Note however that using this pattern relies on extglob. You can
enable it using shopt -s extglob (If you want extended globs to be
turned on by default you can add shopt -s extglob to .bashrc)
Just to demonstrate that different shells might deal with your syntax in different ways, first using bash:
$ !(uname)
-bash: !: event not found
And then, using /bin/dash:
$ !(uname)
Linux
The argument to a subprocess.something method must be a list of command-line arguments. Use e.g. shlex.split() to split the string into the correct command-line arguments:
import shlex, subprocess
subprocess.check_call( shlex.split("mv !(...)") )
EDIT:
So, the goal is to move files/directories, with the exemption of some file(s)/directory(ies). By playing around with bash, I could get it to work like this:
mv `ls | grep -v -e '\(exclusion1\|exclusion2\)'` my_project
So in your situation that would be:
mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` my_project
This could go into the subprocess.check_call(..., shell=True) and it should do what you expect it to do.
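For completeness, a sketch of dropping that grep-based command into check_call (the raw string keeps the backslashes intact for grep):

import subprocess

cmd = r"mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` my_project"
subprocess.check_call(cmd, shell=True)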
