I'm attempting to create a looped video file by calling ffmpeg from Python's subprocess module. Here's the part that's giving me problems:
import subprocess as sp
sp.Popen(['ffmpeg', '-f', 'concat', '-i', "<(for f in ~/Desktop/*.mp4; do echo \"file \'$f\'\"; done)", "-c", "copy", "~/Desktop/sample3.mp4"])
With the above code I'm getting the following error:
<(for f in /home/delta/Desktop/*.mp4; do echo "file '$f'"; done): No such file or directory
I did find a similarly phrased question here, but I'm not sure how its solution might apply to my issue.
Following the advice in the comments and looking elsewhere I ended up changing the code to this:
sp.Popen("ffmpeg -f concat -i <(for f in ~/Desktop/*.mp4; do echo \"file \'$f\'\"; done) -c copy ~/Desktop/sample3.mp4",
shell=True, executable="/bin/bash")
--which works fine.
If you need to parameterize input and output files, consider breaking out your parameters:
import os

# sample variables
inputDirectory = os.path.expanduser('~/Desktop')
outputDirectory = os.path.expanduser('~/dest.mp4')  # despite the name, this is the output file

sp.Popen(['''ffmpeg -f concat -i <(for f in "$1"/*; do
              echo "file '$f'";
            done) -c copy "$2" ''',
          'bash',            # this becomes $0
          inputDirectory,    # this becomes $1
          outputDirectory,   # this becomes $2
         ], shell=True, executable="/bin/bash")
...as this ensures that your code won't do anything untoward even when given an input directory with a hostile name like /uploads/$(rm -rf ~)'$(rm -rf ~)'. (ffmpeg is likely to fail to parse an input file with such a name, since we'd need to know its escaping rules to quote it correctly, but it's far better for ffmpeg to fail than to execute arbitrary code.)
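If you'd rather avoid the shell (and process substitution) entirely, another option is to build the concat list file in Python and hand it to ffmpeg directly. A minimal sketch, with illustrative paths:

import glob
import os
import subprocess as sp
import tempfile

input_directory = os.path.expanduser('~/Desktop')          # illustrative
output_file = os.path.expanduser('~/Desktop/sample3.mp4')  # illustrative

# Write the "file '...'" lines the concat demuxer expects into a temp file.
# (File names containing single quotes would still need escaping per ffmpeg's rules.)
with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as listfile:
    for f in sorted(glob.glob(os.path.join(input_directory, '*.mp4'))):
        listfile.write("file '%s'\n" % f)

# -safe 0 lets the concat demuxer accept absolute paths in the list file.
sp.check_call(['ffmpeg', '-f', 'concat', '-safe', '0',
               '-i', listfile.name, '-c', 'copy', output_file])
os.unlink(listfile.name)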
I am using Inkscape to take a single-page PDF file as input and output an SVG file. The following works from the command line:
c:\progra~1\Inkscape\inkscape -z -f "N:\pdf_skunkworks\inflation-report-may-2018-page0.pdf" -l "N:\pdf_skunkworks\inflation-report-may-2018-page0.svg"
where -z is short for --without-gui, -f specifies the input file, and -l is short for --export-plain-svg.
I could not get the equivalent to work from Python, either passing the command line as one long string or as separate arguments. stderr and stdout show no error; they both print None.
import subprocess #import call,subprocess
#completed = subprocess.run(["c:\Progra~1\Inkscape\Inkscape.exe",r"-z -f \"N:\pdf_skunkworks\inflation-report-may-2018-page0.pdf\" -l \"N:\pdf_skunkworks\inflation-report-may-2018-page0.svg\""])
completed = subprocess.run(["c:\Progra~1\Inkscape\Inkscape.exe","-z", r"-f \"N:\pdf_skunkworks\inflation-report-may-2018-page0.pdf\"" , r"-l \"N:\pdf_skunkworks\inflation-report-may-2018-page0.svg\""])
print ("stderr:" + str(completed.stderr))
print ("stdout:" + str(completed.stdout))
Just to test the OS plumbing I wrote some VBA code (my usual language), and this works:
Sub TestShellToInkscape()
'* Tools->References->Windows Script Host Object Model (IWshRuntimeLibrary)
Dim sCmd As String
sCmd = "c:\progra~1\Inkscape\inkscape -z -f ""N:\pdf_skunkworks\inflation-report-may-2018-page0.pdf"" -l ""N:\pdf_skunkworks\inflation-report-may-2018-page0.svg"""
Debug.Print sCmd
Dim oWshShell As IWshRuntimeLibrary.WshShell
Set oWshShell = New IWshRuntimeLibrary.WshShell
Dim lProc As Long
lProc = oWshShell.Run(sCmd, 0, True)
End Sub
So I'm obviously doing something silly in the Python code. I'm sure an experienced Python programmer could solve this easily.
Swap your slashes:
import subprocess #import call,subprocess
completed = subprocess.run(['c:/Progra~1/Inkscape/Inkscape.exe',
'-z',
'-f', r'N:/pdf_skunkworks/inflation-report-may-2018-page0.pdf' ,
'-l', r'N:/pdf_skunkworks/inflation-report-may-2018-page0.svg'])
print ("stderr:" + str(completed.stderr))
print ("stdout:" + str(completed.stdout))
Windows accepts forward slashes in path names, so Python can pass them through unchanged, whereas the backslashes in ordinary (non-raw) Python string literals act as escape prefixes.
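As an aside, the reason completed.stderr and completed.stdout printed None in the question is that subprocess.run does not capture output unless asked to. A sketch (Python 3.7+, same illustrative paths):

import subprocess

completed = subprocess.run(
    ['c:/Progra~1/Inkscape/Inkscape.exe',
     '-z',
     '-f', 'N:/pdf_skunkworks/inflation-report-may-2018-page0.pdf',
     '-l', 'N:/pdf_skunkworks/inflation-report-may-2018-page0.svg'],
    capture_output=True,  # fill in completed.stdout / completed.stderr
    text=True)            # decode the captured bytes to str

print("returncode:", completed.returncode)
print("stderr:", completed.stderr)
print("stdout:", completed.stdout)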
I want to extract the scene-change timestamps using ffmpeg's scene change detection. I have to run it on a few hundred videos, so I wanted to use a Python subprocess to loop over the contents of a folder.
My problem is that the command I was using to get these values for a single video involves redirecting the output to a file, which doesn't seem to be an option from inside a subprocess call.
This is my code:
p=subprocess.check_output(["ffmpeg", "-i", sourcedir+"/"+name+".mpg","-filter:v", "select='gt(scene,0.4)',showinfo\"","-f","null","-","2>","output"])
With this one, ffmpeg complains that it needs an output file.
output = "./result/"+name
p=subprocess.check_output(["ffmpeg", "-i", sourcedir+"/"+name+".mpg","-filter:v", "select='gt(scene,0.4)',metadata=print:file=output","-an","-f","null","-"])
This one gives me no error but doesn't create the file.
This is the original command that I use directly with ffmpeg:
ffmpeg -i input.flv -filter:v "select='gt(scene,0.4)',showinfo" -f null - 2> ffout
I just need the output of this command to be written to a file. Does anyone see how I could make it work?
Is there a better way than subprocess, or just another way? Would it be easier in C?
You can redirect the stderr output directly from Python without any need for shell=True, which can lead to shell injection.
It's as simple as:
with open(output_path, 'w') as f:
    subprocess.check_call(cmd, stderr=f)
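Applied to the command in the question, a minimal sketch might look like this (the paths and names are illustrative; ./result must already exist):

import subprocess

sourcedir = "videos"              # illustrative; the question loops over a folder
name = "example"
output_path = "./result/" + name

cmd = ["ffmpeg", "-i", sourcedir + "/" + name + ".mpg",
       "-filter:v", "select='gt(scene,0.4)',showinfo",
       "-an", "-f", "null", "-"]

# showinfo writes its per-frame report to stderr, so redirect stderr to the file.
with open(output_path, "w") as f:
    subprocess.check_call(cmd, stderr=f)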
Things are easier in your case if you use the shell argument of subprocess, and it should behave the same. With shell=True you can pass the command as a single string rather than a list of args.
cmd = "ffmpeg -i {0} -filter:v \"select='gt(scene,0.4)',showinfo\" -f {1} - 2> ffout".format(inputName, outputFile)
p=subprocess.check_output(cmd, shell=True)
If you need to pass different arguments, you can format the string as shown above.
I have renamed a CSS class in a number of (Python/Django) templates. The CSS files, however, are spread across multiple files in multiple directories. I have a Python snippet that starts at the root dir and then recursively updates all the CSS files.
from os import walk, curdir
import subprocess
COMMAND = "find %s -iname *.css | xargs sed -i s/[Ff][Oo][Oo]/bar/g"
test_command = 'echo "This is just a test. DIR: %s"'
def renamer(command):
print command # Please ignore the print commands.
proccess = subprocess.Popen(command.split(), stdout = subprocess.PIPE)
op = proccess.communicate()[0]
print op
for root, dirs, files in walk(curdir):
if root:
command = COMMAND % root
renamer(command)
It doesn't work, gives:
find ./cms/djangoapps/contentstore/management/commands/tests -iname *.css | xargs sed -i s/[Ee][Dd][Xx]/gurukul/g
find: paths must precede expression: |
Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression]
find ./cms/djangoapps/contentstore/views -iname *.css | xargs sed -i s/[Ee][Dd][Xx]/gurukul/g
find: paths must precede expression: |
Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression]
When I copy and run the same command (printed above), find doesn't error out, and sed either gets no input files or it works.
What is wrong with the Python snippet?
You're not trying to run a single command, but a shell pipeline of multiple commands, and you're trying to do it without invoking the shell. That can't possibly work. The way you're doing this, | is just one of the arguments to find, which is why find is telling you that it doesn't understand that argument with that "paths must precede expression: |" error.
You can fix that by adding shell=True to your Popen.
But a better solution is to do the pipeline in Python and keep the shell out of it. See Replacing Older Functions with the subprocess Module in the docs for an explanation, but I'll show an example.
Meanwhile, you should never use split to split a command line. The best solution is to write the list of separate arguments instead of joining them up into a string just to split them out. If you must do that, use the shlex module; that's what it's for. But in your case, even that won't help you, because you're inserting random strings verbatim, which could easily have spaces or quotes in them, and there's no way anything—shlex or otherwise—can reconstruct the data in the first place.
So:
from subprocess import Popen, PIPE

pfind = Popen(['find', root, '-iname', '*.css'], stdout=PIPE)
pxargs = Popen(['xargs', 'sed', '-i', 's/[Ff][Oo][Oo]/bar/g'],
               stdin=pfind.stdout, stdout=PIPE)
pfind.stdout.close()  # let find receive SIGPIPE if xargs exits early
output, _ = pxargs.communicate()
But there's an even better solution here.
Python has os.walk to do the same thing as find, you can simulate xargs easily, but there's really no need to do so, and it has its own re module to use instead of sed. So, why not use them?
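A minimal sketch of that pure-Python route, doing the same case-insensitive foo-to-bar substitution as the sed expression in the question:

import os
import re

pattern = re.compile('foo', re.IGNORECASE)  # equivalent to s/[Ff][Oo][Oo]/.../g

for root, dirs, files in os.walk(os.curdir):
    for filename in files:
        if not filename.lower().endswith('.css'):
            continue
        path = os.path.join(root, filename)
        with open(path) as f:
            contents = f.read()
        with open(path, 'w') as f:
            f.write(pattern.sub('bar', contents))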
Or, conversely, bash is much better at driving and connecting up simple commands than Python, so if you'd rather use find and sed instead of os.walk and re.sub, why write the driving script in Python in the first place?
The problem is the pipe. To use a pipe with the subprocess module, you have to pass shell=True.
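For example, a sketch of that shell=True variant (the injection caveat discussed above still applies):

import subprocess

root = "."  # e.g. the directory yielded by the question's os.walk loop

command = "find %s -iname '*.css' | xargs sed -i 's/[Ff][Oo][Oo]/bar/g'" % root
process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
output = process.communicate()[0]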
I'm trying to write data to files in a chroot environment. Since I'm a non-root user, the only way I can communicate with the chroot is the schroot command.
Currently I'm using the following trick to write the data.
$ schroot -c chroot_session -r -d /tmp -- bash -c "echo \"$text\" > file.txt"
But I'm sure this will give me a lot of grief if text has special characters, quotes, etc. So what's a better way of sending $text into the chroot? Most probably I'll be using the above command through a Python script. Is there a simpler method?
Kinda hackish, but…
import os
import ConfigParser

c = ConfigParser.RawConfigParser()
c.readfp(open(os.path.join('/var/lib/schroot/session', chroot_session), 'r'))
chroot_basedir = c.get(chroot_session, 'mount-location')

# The joined path must be relative, or os.path.join discards chroot_basedir.
with open(os.path.join(chroot_basedir, 'tmp/file.txt'), 'w') as fp:
    fp.write(text)
Okay, so privileges don't let you get in by any method other than schroot, huh?
import subprocess
import sys

p = subprocess.Popen(['schroot', '-c', name, '-r', 'tee', '/tmp/file.txt'],
                     stdin=subprocess.PIPE,
                     stdout=open('/dev/null', 'w'),
                     stderr=sys.stderr)
p.stdin.write(text)   # on Python 3, text must be bytes (e.g. text.encode())
p.stdin.close()
rc = p.wait()
assert rc == 0
You can use Python to write $text into a file somewhere Python has write permission,
then copy that file into the chroot as file.txt.
I want to run a piped linux/bash command line from Python, which first tars files and then splits the tar stream. The command would look something like this in bash:
> tar -cvf - path_to_archive/* | split -b 20m -d -a 5 - "archive.tar.split"
I know that I could execute it using subprocess by setting shell=True and submitting the whole command as a string, like so:
import subprocess
subprocess.call("tar -cvf - path_to_archive/* | split -b 20m -d -a 5 - 'archive.tar.split'", shell=True)
...but for security reasons I would like to find a way to skip the shell=True part and use the non-shell form, which takes a list of strings rather than a full command line and cannot handle the pipe character on its own. Is there any solution for this in Python? I.e., is it possible to set up linked pipes somehow, or some other solution?
If you want to avoid using shell=True, you can manually use subprocess pipes.
from subprocess import Popen, PIPE
p1 = Popen(["tar", "-cvf", "-", "path_to_archive"], stdout=PIPE)
p2 = Popen(["split", "-b", "20m", "-d", "-a", "5", "-", "'archive.tar.split'"], stdin=p1.stdout, stdout=PIPE)
output = p2.communicate()[0]
Note that if you do not use the shell, you will not have access to expansion of globbing characters like *. Instead you can use the glob module.
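A sketch of that, using glob to expand path_to_archive/* the way the shell would have (the names are the ones from the question):

from glob import glob
from subprocess import Popen, PIPE

files = glob("path_to_archive/*")  # the expansion the shell would normally do

p1 = Popen(["tar", "-cvf", "-"] + files, stdout=PIPE)
p2 = Popen(["split", "-b", "20m", "-d", "-a", "5", "-", "archive.tar.split"],
           stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()
output = p2.communicate()[0]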
tar can split itself:
tar -c -M -L 1000000 -F name-script.sh -f split.tar largefile1 largefile2 ...
name-script.sh
#!/bin/bash
echo "${TAR_ARCHIVE/_part*.tar/}"_part"${TAR_VOLUME}".tar >&"${TAR_FD}"
To re-assemble:
tar -x -M -F name-script.sh -f split.tar
Add this to your Python program.
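Called from Python, that might look like the sketch below (the file names are the illustrative ones above, and name-script.sh is assumed to be executable):

import subprocess

subprocess.check_call([
    "tar", "-c", "-M",
    "-L", "1000000",           # volume size, in units of 1024 bytes
    "-F", "name-script.sh",    # the volume-naming script shown above
    "-f", "split.tar",
    "largefile1", "largefile2",
])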
Is there any reason you can't use tarfile? http://docs.python.org/library/tarfile.html
import tarfile
tar = tarfile.open("sample.tar.gz")
tar.extractall()
tar.close()
Just write to a file-like object using tarfile rather than invoking subprocess.
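Since the question is about creating and splitting an archive, here is a sketch of that side in pure Python (chunk size and names are illustrative):

import tarfile

# Create the archive without any external tools.
with tarfile.open("archive.tar", "w") as tar:
    tar.add("path_to_archive")

# Split it into 20 MB pieces, similar to split -b 20m.
chunk_size = 20 * 1024 * 1024
with open("archive.tar", "rb") as src:
    index = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        with open("archive.tar.split%05d" % index, "wb") as dst:
            dst.write(chunk)
        index += 1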
Shameless plug: I wrote a subprocess wrapper for easier command piping in Python:
https://github.com/houqp/shell.py
Example:
shell.ex("tar -cvf - path_to_archive") | "split -b 20m -d -a 5 - 'archive.tar.split'"