I am trying to run a Python script from the Windows cmd. When I run it under Linux I put
python myscript.py filename??.txt
and it goes through the files numbered filename01.txt to filename18.txt, and it works.
I tried to run it from cmd like
python myscript.py filename*.txt
or
python myscript.py filename**.txt
but it didn't work. If I run the script on one single file in the Windows cmd, it works.
Do you have any clue where the problem could be?
Thanks!
The Unix shell expands a file path pattern into the matching file names, then passes the result to the program (python myscript.py).
In Windows cmd, this does not happen.
See glob.glob if you want to get the list of files that match a pattern.
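For example, a minimal sketch of the top of myscript.py (the fallback to the literal argument mimics bash's default behaviour for non-matching patterns):
import sys
from glob import glob

filenames = []
for pattern in sys.argv[1:]:
    # fall back to the literal argument if nothing matches, as bash does by default
    filenames.extend(glob(pattern) or [pattern])
for filename in filenames:
    print(filename)  # replace with your per-file processing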
Those wildcards are expanded at the shell (i.e. bash) level before your Python script runs.
So the problem doesn't reside in Python, but in the shell you are using on Windows.
You could try PowerShell on Windows, or bash via Cygwin.
try this:
FOR %X IN (filename*.txt) DO (python myscript.py %X)
Edit: you can create a .bat file with this and try it.
setlocal EnableDelayedExpansion
set files=
FOR %%X IN (filename*.txt) DO set files=!files! %%X
echo %files%
python myscript.py %files%
From a batch file:
for %%f in ("filename*.txt") do python myscript.py "%%~nxf"
%%f will hold a reference to each of the files; for each of them, your script is executed. %%~nxf expands to the name and extension of the file.
From the command line, replace %% with a single %.
EDITED - I misunderstood the problem. Next try.
In Windows, there is no default expansion of wildcard arguments (see here). So, to get the same result, you will need a batch file. It will concatenate the list of files and pass it to your Python script:
@echo off
setlocal enabledelayedexpansion
set "fileList="
for %%f in ("*.txt") do set "fileList=!fileList! "%%f""
python myscript.py !fileList!
endlocal
For more reusable code, use something like the following (the script calls are only echoed to the screen, to show the effect of the parameters and to avoid unneeded execution; remove the echo when it works as intended):
@echo off
setlocal enableextensions
call :glob "*.txt" true fileList
echo python myscript.py %fileList%
echo.
call :glob "*.txt" false fileList
echo python myscript.py %fileList%
exit /b
:glob pattern useFullPath outputList
setlocal enabledelayedexpansion
if /i "%~2"=="true" (set "_name=%%%%~ff") else (set "_name=%%%%~nxf")
set "_list="
for %%f in ("%~1") do set "_list=!_list! "%_name%""
endlocal & if not "%~3"=="" set "%~3=%_list%"
As falsetru notes, on Windows the shell doesn't expand the wildcards for you, so the correct answer is glob.glob(). You should iterate over all the command-line arguments and expand each one. This works fine on Linux/UNIX too, because expanding an argument without any wildcards in it (which is what the shell gives you) yields the unchanged filename. So something like this, using lazy evaluation to handle a potentially large number of arguments:
from sys import argv
from glob import glob
from itertools import chain, islice
for name in chain.from_iterable(glob(name) for name in islice(argv, 1, None)):
    print(name)  # do something with each file
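Invoked as python myscript.py filename??.txt, this now behaves the same under cmd and bash: bash expands the pattern before Python starts, and on Windows glob() expands it inside the script.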
I run several Python scripts on my W10 server and I'm looking for a way to launch them all together.
I currently run my scripts manually, opening a PowerShell in the script folder and executing pythonw script.py.
I have tried several ways, with no result...
PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& 'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe C:\Users\pc2\script.py'"
also tested:
powershell.exe -noexit -command "'pythonw C:\Users\pc2\script.py'"
And:
powershell -command "& {&'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe ' C:\Users\pc2\script.py}"
None of the above does anything... they should be opening the script.py file using pythonw.
Can anyone point me in the correct direction to get this done?
To execute a single script:
powershell -c "pythonw 'C:\Users\pc2\script1.py'"
Note: The enclosing '...' around the file path are only needed if the latter contains shell metacharacters such as whitespace.
Alternatively, use ""..."" when calling from cmd.exe (batch file), and `"...`" when calling from PowerShell.
To execute multiple scripts, in sequence:
powershell -c "& { $Args | % { pythonw $_ } } 'C:\Users\pc2\script1.py' 'C:\Users\pc2\script2.py'"
Note: This only works when calling from cmd.exe (batch file).
(From within PowerShell, simply execute what's inside "..." directly)
As for what you tried, focusing just on what command string PowerShell ended up seeing:
& 'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe C:\Users\pc2\script.py'
The problem is that you're passing the entire command line to &, whereas & expects only the executable name/path;
C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe C:\Users\pc2\script.py is obviously not a valid executable name / path.
'pythonw C:\Users\pc2\script.py'
By virtue of being enclosed in '...', this is a string literal, which PowerShell simply outputs as such - no command is ever executed.
& {&'C:\Users\pc2\AppData\Local\Programs\Python\Python36-32\pythonw.exe ' C:\Users\pc2\script.py}
Since you're using a script block ({ ... }), you're passing C:\Users\pc2\script.py as an argument, so you'd have to refer to it as such inside the block, either via the automatic $Args variable, or via a parameter defined in a param() block.
(As an aside: your file path has an extraneous trailing space, but PowerShell seems to ignore that.)
From the PowerShell console itself, you can run it directly like this:
PS C:\Python27> python C:\Users\pc2\script.py
If necessary, edit the PATHEXT environment variable in your PowerShell profile:
$env:PATHEXT += ";.py"
You can also put the same command in a batch file to run the Python script. Save the file as something.bat:
python C:\Users\pc2\script.py
Hope it helps.
I have a Python script (e.g. test.py) and a commands.txt file which contains calls to a custom bash function (e.g. my_func) along with its parameters, e.g.
my_func arg1 arg2; my_func arg3 arg4; my_func arg5 arg6;
This function is included in the ~/.bash_profile.
What I have tried to do is:
subprocess.call(['.', 'path/to/commands.txt'], shell=True)
I know this is not the best approach, in terms of setting the shell argument to True, but I cannot seem to get it to work even this way. What I get when I run test.py is:
my_func: command not found
You will need to invoke bash directly, and instruct it to process both files.
At the command-line, this is:
bash -c '. ~/.bash_profile; . commands.txt'
Wrapping it in python:
subprocess.call(['bash', '-c', '. ~/.bash_profile; . commands.txt'])
You could also source ~/.bash_profile at the top of commands.txt. Then you'd be able to run a single file.
It may make sense to extract the function to a separate file, since .bash_profile is intended to be used by login shells, not like this.
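For example, a minimal sketch, assuming you move my_func into a hypothetical my_funcs.sh next to commands.txt:
import subprocess
# Source the function definitions first, then the commands that call them
subprocess.call(['bash', '-c', '. my_funcs.sh; . commands.txt'])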
If the first line of your commands.txt file has the correct shebang (for example #!/bin/bash), making the file executable (chmod +x commands.txt) is enough:
subprocess.call("path/to/commands.txt")
I am using a Jupyter Notebook and would like to execute a bash script from a Python string. I have a Python cell creating the bash script, which I then need to print, copy to another cell, and run. Is it possible to use something like exec("print('hello world!')")?
Here is an example of my bash script:
%%bash -s "$folder_dir" "$name_0" "$name_1" "$name_2" "$name_3" "$name_4" "$name_5" "$name_6" "$name_7" "$name_8" "$name_9" "$name_10" "$name_11"
cd $1
ds9 ${2} ${3} ${4} ${5} ${6} ${7} ${8} ${9} ${10} ${11} ${12} ${13}
If that is not possible, then how can I go to a different directory and run
ds9 dir1 dir2 dir3 ...
within my Jupyter Notebook, since I can only initialize the dirs using Python? Note that the number of dirs is not fixed every time I run my code. ds9 is just a command that opens multiple astronomical images at the same time.
I know I can save my bash script to a .sh file and execute that, but I am looking for a classier solution.
The subprocess module is the Right Thing for invoking external software -- in a shell or otherwise -- from Python.
import subprocess
folder_dir="/" # your directory
names=["name_one", "name_two"] # this is your list of names you want to open
subprocess.check_call(
    ['cd "$1" || exit; shift; exec ds9 "$@"', "_", folder_dir] + names,
    shell=True)
How it works (Python)
When passing a Python list with shell=True, the first item in that list is the script to run; the second is the value of $0 when that script is running, and subsequent items are values for $1 and onward.
Note that this runs sh -c '...' with your command. sh is not bash, but (in modern systems) a POSIX sh interpreter. It's thus important not to use bash-only syntax in this context.
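A minimal sketch of that argument mapping on a POSIX system (the argument strings here are placeholders):
import subprocess
# $0 receives "zero", $1 receives "first arg"; quoting keeps the space intact.
# Prints: script=zero first=first arg
subprocess.check_call(['echo "script=$0 first=$1"', 'zero', 'first arg'], shell=True)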
How it works (Shell)
Let's go over this line-by-line:
cd "$1" || exit # try to cd to the directory passed as $1; abort if that fails
shift # remove $1 from our argument list; the old $2 is now $1, &c.
exec ds9 "$@" # replace the shell in memory with the "ds9" program (as an efficiency
# ...measure), with our argument list appended to it.
Note that "$1" and "$@" are quoted. Any unquoted expansion would be string-split and glob-expanded; without the quotes, you won't be able to open files with spaces in their names.
I need to run an OpenFOAM command from an automated Python script.
My python code contains the lines
subprocess.Popen(['OF23'], shell=True)
subprocess.Popen('for i in *; do surfaceConvert $i file_path/$i.stlb; done', shell=True)
where OF23 is a shell alias defined as
alias OF23='export PATH=/usr/lib64/openmpi/bin/:$PATH;export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc'
This runs the OpenFOAM command in a terminal, and file_path defines where the stl files converted to binary format are written.
But when I run the script, I am getting 'OF23' is not defined.
How do I make my script run the alias command and then perform the OpenFOAM file-conversion command?
That's not going to work, even once you've resolved the alias problem. Each Python subprocess.Popen runs in a separate subshell, so the effects of executing OF23 won't persist into the second subprocess.Popen.
Here's a brief demo:
import subprocess
subprocess.Popen('export ATEST="Hello";echo "1 $ATEST"', shell=True)
subprocess.Popen('echo "2 $ATEST"', shell=True)
output
1 Hello
2
So whether you use the alias, or just execute the aliased commands directly, you'll need to combine your commands into one subprocess.Popen call.
Eg:
subprocess.Popen('''export PATH=/usr/lib64/openmpi/bin/:$PATH;
export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;
source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc;
for i in *;
do surfaceConvert $i file_path/$i.stlb;
done''', shell=True)
I've used a triple-quoted string so I can insert linebreaks, to make the shell commands easier to read.
Obviously, I can't test that exact command sequence on my machine, but it should work.
You need to issue shopt -s expand_aliases to activate alias expansion. From bash(1):
Aliases are not expanded when the shell is not interactive, unless the expand_aliases shell option is set using shopt [...]
If that does not help, check if the shell executed from your Python program is actually Bash (e.g. by echoing $BASH).
If your command may use bash-isms, you can pass the executable parameter; otherwise /bin/sh is used. To expand aliases, you could use @Michael Jaros' suggestion:
#!/usr/bin/env python
import subprocess
subprocess.check_call("""
shopt -s expand_aliases
OF23
for i in *; do surfaceConvert $i file_path/$i.stlb; done
"""], shell=True, executable='/bin/bash')
If you already have a working bash-script then just call it as is.
Though to make it more robust and maintainable, you could convert to Python the parts that provide the most benefit; e.g., here's how you could emulate the for-loop:
#!/usr/bin/env python
import os
import subprocess

for entry in os.listdir('.'):
    subprocess.check_call(['/path/to/surfaceConvert', entry,
                           'file_path/{entry}.stlb'.format(entry=entry)])
It allows filenames to contain shell meta-characters such as spaces.
To configure the environment for a child process, you could use Popen's env parameter e.g., env=dict(os.environ, ENVVAR='value').
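For example, a minimal sketch of that approach using the paths from the alias above (the input/output file names are placeholders):
import os
import subprocess

env = dict(os.environ)
env['PATH'] = '/usr/lib64/openmpi/bin/:' + env.get('PATH', '')
env['LD_LIBRARY_PATH'] = '/usr/lib64/openmpi/lib/:' + env.get('LD_LIBRARY_PATH', '')
# 'input.stl' and 'file_path/input.stl.stlb' are placeholder names
subprocess.check_call(['surfaceConvert', 'input.stl', 'file_path/input.stl.stlb'], env=env)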
It is possible to emulate the source bash command in Python, but you should probably leave the parts that depend on it in the bash script.
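If you do want to emulate source, a sketch (assumes a GNU env that supports -0 for unambiguous splitting):
import subprocess

def source_env(script_path):
    # Have bash source the script, then dump the resulting environment NUL-separated
    out = subprocess.check_output(
        ['bash', '-c', 'source "$1" > /dev/null 2>&1 && env -0', '_', script_path])
    return dict(item.split('=', 1)
                for item in out.decode().split('\0') if '=' in item)

# e.g. os.environ.update(source_env('/opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc'))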
I'm on Windows. I am trying to write a Python 2.x script (let's call it setup.py) which would enable the following scenario:
User runs cmd to open a console window
In that console window, user runs setup.py
User finds themselves in the same console window, but now the cmd running there has had its environment (env. variables) modified by setup.py
setup.py modifies the environment by adding a new environment variable FOO with value foo, and by prepending something to PATH.
On Linux, I would simply use os.exec*e to replace the Python process with a shell with the environment configured.
I tried the same approach on Windows (like os.exec*e(os.environ['ComSpec'])), but it doesn't work, the environment of the newly executed cmd is messed up like this:
Running just set doesn't list FOO and doesn't show the effect on PATH. Running set FOO, however, shows FOO=foo, and echo %FOO% echoes foo.
Running set PATH or echo %PATH% shows the modified PATH variable. Running set path or echo %path% shows the value without the modification (even though env. vars are normally case insensitive on Windows).
If I type exit, the console remains hanging in some state, not accepting input, until I hit Ctrl+C. After that, it apparently returns to the cmd which originally called setup.py.
So clearly, os.exec*e doesn't work for this scenario on Windows. Is there a different way to achieve what I want? Is there a combination of subprocess.Popen() flags which would enable me to exit the calling Python process and leave the called cmd running, ideally in the same console? Or would accessing CreateProcess through ctypes help?
If necessary, I would settle for launching a new console window and closing the old one, but I certainly can't afford having the old console window hang in frozen state, waiting for a newly created one to close.
There's a much simpler solution if it's acceptable to use a Windows batch file in addition to your script, since the batch file runs in the calling process, and can therefore modify its environment.
Given a file setup.bat, which looks like this...
@echo off
for /f "tokens=*" %%a in ('python setup.py') do %%a
...and a file setup.py which looks like this...
import os
print 'set FOO=foo'
print 'set PATH=%s;%s' % ('C:\\my_path_dir', os.environ['PATH'])
...and assuming python.exe is in the PATH, then calling setup.bat from the command line will set the environment variables in the calling process, while still allowing you to make the setup.py script as complicated as you like, as long as it prints the commands you want to execute to stdout.
Update based on comments
If your setup.py has multiple modes of operation, you could make the setup.bat a generic wrapper for it. Suppose instead setup.bat looks like this...
@echo off
if "%1" == "setenv" (
for /f "tokens=*" %%a in ('python setup.py %1') do %%a
) else (
python setup.py %*
)
...and setup.py looks like this...
import sys
import os
if len(sys.argv) > 1 and sys.argv[1] == 'setenv':
print 'set FOO=foo'
print 'set PATH=%s;%s' % ('C:\\my_path_dir', os.environ['PATH'])
else:
print "I'm gonna do something else with argv=%r" % sys.argv
...would that not suffice?