Unable to run shell commands with * using python subprocess module - python

I am not able to run any command that contains a * sign using the Python subprocess module.
I am using the call this way:
subprocess.Popen(
    'cp /etc/varnida_sys/* /tmp/bucket/'.split(),
    stdout=subprocess.PIPE).communicate()[0]
For this I am getting:
cp: cannot stat ‘/etc/varnida_sys/*’: No such file or directory
Why does this error occur? There is one file inside the directory: /etc/varnida_sys/genders
My investigation says that wildcards like * need some special handling. I am getting errors in all commands that contain *.
PS: I am not getting errors when I run the same command through paramiko from a remote host.

* is only understood by a shell (which expands it to a list of files), so you need to pass shell=True to Popen(). Also, there's no need to split the command; you can use a string:
subprocess.Popen("cp /etc/varnida_sys/* /tmp/bucket/",
stdout=subprocess.PIPE, shell=True).communicate()[0]
As @triplee has suggested below, it's better to use a convenience wrapper for this task, e.g. subprocess.call():
subprocess.call("cp /etc/varnida_sys/* /tmp/bucket/", shell=True)

Related

python subprocess won't play nicely with gsutil copy/move commands

In Python I'm using subprocess to call gsutil copy and move commands, but am currently unable to select multiple extensions.
The same gsutil command works at the terminal, but not in python:
cmd_gsutil = "sudo gsutil -m mv gs://xyz-ms-media-upload/*.{mp4,jpg} gs://xyz-ms-media-upload/temp/"
p = subprocess.Popen(cmd_gsutil, shell=True, stderr=subprocess.PIPE)
output, err = p.communicate()
If, say, there are four filetypes to move but the bucket is empty, the gsutil error returned from the terminal is:
4 files/objects could not be transferred.
Whereas the error returned when run through subprocess is:
1 files/objects could not be transferred.
So clearly subprocess is mucking up the command somehow...
I could always inefficiently repeat the command for each of the filetypes, but would prefer to get to the bottom of this!
It seems /bin/sh (the default shell) doesn't support the {mp4,jpg} brace-expansion syntax.
Pass executable='/bin/bash' to run it as a bash command instead.
You could also run the command without the shell, e.g., using the glob or fnmatch modules to get the filenames and construct the gsutil command. Note: in that case you should pass the command as a list instead.
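A minimal sketch of the bash route, reusing the command string from the question:
import subprocess

cmd_gsutil = "sudo gsutil -m mv gs://xyz-ms-media-upload/*.{mp4,jpg} gs://xyz-ms-media-upload/temp/"
# executable='/bin/bash' makes the {mp4,jpg} brace expansion work
p = subprocess.Popen(cmd_gsutil, shell=True, executable='/bin/bash',
                     stderr=subprocess.PIPE)
output, err = p.communicate()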

Running a PowerShell cmdlet in a Python script

I have a Python script and I want to run a PowerShell cmdlet. I've looked online and the only thing I can find is running a PowerShell script, but I feel like writing a cmdlet to a script and then dot sourcing it for execution would take a lot longer than needed.
I've tried using subprocess.Popen in the following way:
cmd = subprocess.Popen(['C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe', ps_cmdlet])
But ps_cmdlet is a python string variable with a powershell cmdlet as its value. So, I'm obviously getting a "No such file or directory" error. Is there any way to run a powershell cmdlet in a python script without using things like IronPython?
Thanks!
This works rather well:
import subprocess
pl = subprocess.Popen(['powershell', 'get-process'], stdout=subprocess.PIPE).communicate()[0]
print(pl.decode('utf-8'))
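The same pattern works when the cmdlet is held in a variable, closer to the ps_cmdlet in the question; a minimal sketch (Get-ChildItem is just a placeholder cmdlet):
import subprocess

ps_cmdlet = 'Get-ChildItem C:\\'  # placeholder; substitute whatever cmdlet you need
out = subprocess.Popen(['powershell', '-Command', ps_cmdlet],
                       stdout=subprocess.PIPE).communicate()[0]
print(out.decode('utf-8'))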
Try the following (ps_cmdlet is a python string):
subprocess.call(ps_cmdlet)
Edit: here is an example that will output your machine's IP configuration:
ps_cmdlet = 'ipconfig'
subprocess.call(ps_cmdlet)
Another edit: another way that works for me is:
ps_cmdlet = 'whatever command you would enter in powershell'
p = subprocess.Popen(ps_cmdlet, stdout=subprocess.PIPE)
p.communicate()
import subprocess
process = subprocess.Popen([r"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe", "get-process"],
                           shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
process_output = process.stdout.read().splitlines()
The above script helps in executing PowerShell cmdlets from Python.

How to run a python script from the console as if called from the command prompt?

The python script I would use (source code here) would parse some arguments when called from the command line. However, I have no access to the Windows command prompt (cmd.exe) in my environment. Can I call the same script from within a Python console? I would rather not rewrite the script itself.
%run is a magic in IPython that runs a named file inside IPython as a program almost exactly like running that file from the shell. Quoting from %run? referring to %run file args:
This is similar to running at a system prompt python file args, but with the advantage of giving you IPython's tracebacks, and of loading all variables into your interactive namespace for further use (unless -p is used, see below).
The only downside is that the file to be run must be in the current working directory or somewhere along the PYTHONPATH. %run won't search $PATH.
%run takes several options which you can learn about from %run?. For instance: -p to run under the profiler.
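For example, in an IPython session the call might look like this (reusing the importer.py name and placeholder arguments from the os.system answer below):
In [1]: %run importer.py arguments_go_here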
If you can make system calls, you can use:
import os
os.system("importer.py arguments_go_here")
You want to spawn a new subprocess.
There's a module for that: subprocess
Examples:
Basic:
import sys
from subprocess import Popen
p = Popen([sys.executable, r"C:\test.py"])
Getting the subprocess's output:
import sys
from subprocess import Popen, PIPE
p = Popen([sys.executable, r"C:\test.py"], stdout=PIPE)
stdout = p.stdout
print(stdout.read())
See the subprocess API Documentation for more details.
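Since the script in the question parses command-line arguments, here is a minimal sketch of passing them through (the script path and argument values are placeholders):
import sys
from subprocess import Popen, PIPE

# placeholder path and arguments; sys.executable is the running Python interpreter
p = Popen([sys.executable, r"C:\test.py", "arg1", "arg2"], stdout=PIPE)
print(p.communicate()[0])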

How to use '>>' in popen subprocess

I'm trying to run the command from a python file (2.7):
p = subprocess.Popen("sha256sum file1.zip >> file2.sha")
But I got an error that the file '>>' does not exist.
I tried:
p = subprocess.Popen("sha256sum file1.zip >> file2.sha".split())
But the >> is still a problem.
Of course, if I run the command at the prompt it runs OK and puts the output into the file file2.sha.
I know I can add stdout to the Popen, but I was wondering if there is a way to run it as simply as running it from the command line.
Thanks.
You can pass values for the stdin and stdout of the child process to Popen like so:
subprocess.Popen("sha256sum file1.zip", stdout = file("file2.sha", "a"))
Note the file needs to be opened in append mode to achieve the same behaviour as >>.
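A slightly fuller sketch of the same idea, closing the file once the child has finished:
import subprocess

# equivalent of: sha256sum file1.zip >> file2.sha
with open("file2.sha", "a") as out:
    p = subprocess.Popen(["sha256sum", "file1.zip"], stdout=out)
    p.wait()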
I think you should use the shell=True argument to Popen:
If shell is True, the specified command will be executed through the
shell. This can be useful if you are using Python primarily for the
enhanced control flow it offers over most system shells and still want
convenient access to other shell features such as shell pipes,
filename wildcards, environment variable expansion, and expansion of ~
to a user’s home directory.
subprocess.Popen("sha256sum file1.zip >> file2.sha", shell=True)

Starting module shell command from python subprocess module

I'm trying to run a VNC server, but in order to do it I first need to run 'module load vnc'.
If I call which module in a loaded bash shell, the command is not found in the PATH, but at the same time it's available. It looks like the command is a built-in.
In other words, it looks like I need to execute two commands at once, module load vnc;vncserver :8080 -localhost, and I'm writing a script to start it from Python.
I have tried different variants with subprocess.Popen, like
subprocess.Popen('module load vnc;vncserver :8080 -localhost', shell=True)
which returns exit code 127, or command not found.
subprocess.Popen('module load vnc;vncserver :8080 -localhost', shell=False)
showing
File "<path>/subprocess.py", line 621, in __init__
    errread, errwrite)
OSError: [Errno 2] No such file or directory
If I specify shell=True, it executes via /bin/sh, but I need it to run via /bin/bash.
Specifying executable='/bin/bash' doesn't help, as it loads a new bash shell but starts the command as a string rather than as a process, i.e. I see in the ps list exactly the same command I would like to start.
Would you please advise how to start this command from the subprocess module? Is it possible to have it started with shell=False?
Environment Modules usually just modifies a couple of environment variables for you. It's usually possible to skip the module load whatever step altogether and just not depend on those modules. I recommend
subprocess.Popen(['/possibly/path/to/vncserver', ':8080', '-localhost'],
                 env={'WHATEVER': 'you', 'MAY': 'need'})
instead of loading the module at all.
If you do insist on using this basic method, then you want to start bash yourself with Popen(['bash',....
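A minimal sketch of that approach (whether the module function is available to a non-interactive bash depends on your site's setup, so treat this as an assumption):
import subprocess

# start bash explicitly and let it run both commands in one shell;
# assumes bash defines the `module` function in this context
subprocess.Popen(['bash', '-c', 'module load vnc && vncserver :8080 -localhost'])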
If you want to do it with shell=False, just split this into two subprocess calls.
subprocess.check_call('module load vnc'.split())
subprocess.Popen('vncserver :8080 -localhost'.split())
You can call module from a Python script. The module command is provided by the environment-modules software, which also provides a python.py initialization script.
Evaluating this script in a Python script enables the module python function. If environment-modules is installed in /usr/share/Modules, you can find this script at /usr/share/Modules/init/python.py.
The following code enables the module python function:
import os
exec(open('/usr/share/Modules/init/python.py').read())
Thereafter you can load your module and start your application:
module('load', 'vnc')
subprocess.Popen(['vncserver', ':8080', '-localhost'])
