Run two ffmpeg commands one after another - python

I need to run two ffmpeg commands, one after the other, i.e. wait until the first command has finished and then run the second. The first command is
ffmpeg -threads 8 -i D:\imagesequence\dpx\brn_055.%04d.dpx D:\imagesequence\dpx\test2.mov
and the second is
ffmpeg -i D:/imagesequence/background.jpg -vf "movie='D\:/imagesequence/dpx/thumbnail.jpg' [watermark]; [in][watermark] overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/3 [out]" D:/imagesequence/dpx/final_with_text_mod_04.jpg
What is the best way to accomplish this in Python?

You don't have to do anything more than call ffmpeg twice with Python's subprocess module; each call waits until its command has finished before returning, so the second command starts only after the first is done. This is already the default behaviour:
import subprocess
execstr1 = 'ffmpeg -x -y -z ...'  # first command line (placeholder options)
execstr2 = 'ffmpeg -a -b -c ...'  # second command line (placeholder options)
# check_output blocks until the process exits, so the second
# command only starts once the first one has finished
out1 = subprocess.check_output(execstr1, shell=True)
out2 = subprocess.check_output(execstr2, shell=True)
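Applied to the two commands from the question, a minimal sketch (paths taken verbatim from the question; argument lists are used instead of shell=True so no extra quoting is needed, and each run() call blocks until its command finishes):
import subprocess
# First command: build test2.mov from the DPX image sequence
subprocess.run(
    ['ffmpeg', '-threads', '8',
     '-i', r'D:\imagesequence\dpx\brn_055.%04d.dpx',
     r'D:\imagesequence\dpx\test2.mov'],
    check=True)  # raise CalledProcessError if ffmpeg fails
# Second command starts only after the first has finished
subprocess.run(
    ['ffmpeg', '-i', 'D:/imagesequence/background.jpg',
     '-vf', ("movie='D\\:/imagesequence/dpx/thumbnail.jpg' [watermark]; "
             "[in][watermark] overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/3 [out]"),
     'D:/imagesequence/dpx/final_with_text_mod_04.jpg'],
    check=True)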

Related

subprocess not creating output file of ffmpeg command

I am trying to run, from Python, an ffmpeg command that records my screen and creates an .mp4 file of the recording. The command works when I run it in my shell, but not when I run it from a Python script using subprocess.
The issue is that when it runs under subprocess, the output.mp4 file is not created.
Here is the command:
timeout 10 ffmpeg -video_size 1920x1080 -framerate 60 -f x11grab -i :0.0+0,0 -f alsa -ac 2 -i pulse -acodec aac -strict experimental output.mp4
Here is the python code:
os.chdir('/home/user/Desktop/myProject/')
subprocess.run('timeout 5 ffmpeg -video_size 1920x1080 -framerate 60 -f x11grab -i :0.0+0,0 -f alsa -ac 2 -i pulse -acodec aac -strict experimental out.mp4')
Is there an additional configuration to add so that subprocess can write output files?
subprocess.run returns a CompletedProcess object. You should assign that to a variable and then print the command's output and errors (most likely ffmpeg reports an error and never tries to write the file, but you currently don't see that).
Additionally, you have to either set the keyword argument shell to True or use shlex.split; otherwise the command will not be split into arguments correctly. shlex.split is the preferred way, as the subprocess documentation notes:
Providing a sequence of arguments is generally preferred, as it allows
the module to take care of any required escaping and quoting of
arguments (e.g. to permit spaces in file names).
And you do not want to convert the string into a list of arguments by hand!
There is also no need to stop ffmpeg from the outside (another reason why your file might not get written). Use the built-in command-line option -t for that.
import shlex
import subprocess
import os
os.chdir('/home/user/Desktop/myProject/')
p = subprocess.run(shlex.split("ffmpeg -video_size 1920x1080 -framerate 60 -f x11grab -i :0.0+0,0 -f alsa -ac 2 -i pulse -acodec aac -strict experimental -t 00:00:05 out.mp4"), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print(p.stdout)
Instead of using timeout you may use the -t option as posted here.
Add -t 00:00:05 argument, and remove the timeout:
subprocess.run('ffmpeg -video_size 1920x1080 -framerate 60 -f x11grab -i :0.0+0,0 -f alsa -ac 2 -i pulse -acodec aac -strict experimental -t 00:00:05 out.mp4')
I think it's more elegant to use a command-line argument than an external timeout to terminate the process.
On Windows, for historical reasons, you can pass in a single string without shell=True and it will work. For portable code, you need to either specify shell=True or refactor the command into a list of arguments (which is generally recommended wherever feasible).
Note also that subprocess.run() has keyword arguments both for setting a timeout and for specifying the working directory for the subprocess.
subprocess.run(
    ['ffmpeg', '-video_size', '1920x1080', '-framerate', '60',
     '-f', 'x11grab', '-i', ':0.0+0,0', '-f', 'alsa',
     '-ac', '2', '-i', 'pulse', '-acodec', 'aac',
     '-strict', 'experimental', 'out.mp4'],
    cwd='/home/user/Desktop/myProject/',  # current working directory
    timeout=5,                            # timeout
    check=True                            # check for errors
)
With check=True you will get an exception if the command fails; the timeout will raise an exception if the command times out, regardless of whether you set check=True.
Without more information about what failed, it's hard to say exactly how to fix your problem; but with this you should at least get enough information in the error messages to guide you.
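If you prefer to handle those failures rather than let the exceptions propagate, here is a hedged sketch (the audio options are omitted for brevity, and capture_output requires Python 3.7+):
import subprocess
try:
    subprocess.run(
        ['ffmpeg', '-video_size', '1920x1080', '-framerate', '60',
         '-f', 'x11grab', '-i', ':0.0+0,0',
         '-t', '00:00:05', 'out.mp4'],
        cwd='/home/user/Desktop/myProject/',
        timeout=10,              # safety net in case ffmpeg hangs
        check=True,              # raise CalledProcessError on a non-zero exit
        capture_output=True)     # keep ffmpeg's stderr for the error message
except subprocess.CalledProcessError as e:
    print('ffmpeg failed:', e.stderr.decode(errors='replace'))
except subprocess.TimeoutExpired:
    print('ffmpeg did not finish within the timeout')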

Running subprocess with spaces in options in Python

I tried to search for an answer for a while, but so far I did not find anything for my specific case. I want to run this command in Python:
ssh -o ConnectTimeout=3 -o ProxyCommand="ssh -q -W %h:%p bastion.host.com" host.com "screen -dmS TEST /bin/bash --login -c 'yes | script.sh --option-1 value1 -option2 value2 2>&1 | tee output.log'"
this is my code:
import subprocess
server_command = "screen -dmS TEST /bin/bash --login -c 'yes | script.sh --option-1 value1 -option2 value2 2>&1 | tee output.log'"
command = ['ssh', '-o', 'ConnectTimeout=3', 'ProxyCommand="ssh -q -W %h:%p bastion.host.com"', 'host.com', server_command]
p = subprocess.Popen(command, stdin=None, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False)
stdout, stderr = p.communicate(input=None)
Everything was working (screen was spawned with script running) until I added option with spaces: ProxyCommand="ssh -q -W %h:%p bastion.host.com".
After that I get error:
>>> print(stderr)
b'ssh: Could not resolve hostname ProxyCommand="ssh -q -W %h:%p bastion.host.com": Name or service not known\r\n'
How can I pass this option to my command?
Your SSH command contains invalid arguments: ProxyCommand is an option, so it needs to be preceded by -o, same as ConnectTimeout (and, as noted by Charles Duffy, the redundant quotes inside that option string need to be removed, since the command is not passed to the shell):
server_command = 'screen -dmS TEST /bin/bash --login -c \'yes | script.sh --option-1 value1 -option2 value2 2>&1 | tee output.log\''
command = ['ssh', '-o', 'ConnectTimeout=3', '-o', 'ProxyCommand=ssh -q -W %h:%p bastion.host.com', 'host.com', server_command]
In general when your command line contains spaces and/or quotes and is passed to another command, it may be necessary to shell-quote it. The Python function shlex.quote automates this. In your case it’s not necessary because you (correctly) manually quoted the command you’re passing to screen inside server_command. Alternatively you could have written the following:
import shlex
script_command = 'yes | script.sh --option-1 value1 -option2 value2 2>&1 | tee output.log'
server_command = f'screen -dmS TEST /bin/bash --login -c {shlex.quote(script_command)}'
— Note the absence of manual quotes inside the shell command line. The advantage over manual quoting is that this will also work with nested levels of shell quoting, e.g. when nesting command invocations.
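As a quick illustration of what shlex.quote produces, and of the nested-quoting advantage just mentioned, a small sketch (the printed strings are what each shell level would receive):
import shlex
script_command = 'yes | script.sh --option-1 value1 -option2 value2 2>&1 | tee output.log'
server_command = f'screen -dmS TEST /bin/bash --login -c {shlex.quote(script_command)}'
# The inner command reaches the remote shell as a single quoted argument:
print(server_command)
# If the whole line had to pass through yet another shell level
# (for example an extra ssh hop run with shell=True), quote it again:
print(shlex.quote(server_command))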

Trying to send commands via Subprocess to operate with ffmpeg

I am trying to build a script that converts video files via ffmpeg inside Python 3.
Via Windows PowerShell I successfully obtained the desired result via the following command:
ffmpeg -i test.webm -c:v libx264 converted.mp4
However, if I try to repeat the same operation inside Python via the following code:
import subprocess
from os import getcwd
print(getcwd()) # current directory
subprocess.call(["ffmpeg", " -f test.webm -c libx264 converted.mp4"])
I get the following error:
Output #0, mp4, to ' -f test.webm -c libx264 converted.mp4':
Output file #0 does not contain any stream
I am in the correct folder where the files are. Is there a better method to execute shell commands via Python, preferably one that works on different platforms?
Try this:
import shlex
import subprocess
# use the options that worked in PowerShell: -i for the input file, -c:v for the video codec
cmd = shlex.split("ffmpeg -i test.webm -c:v libx264 converted.mp4")
subprocess.call(cmd)
You need to pass each argument as a separate element of the list (that is how argv works), or let the shell do the splitting:
subprocess.call("ffmpeg -i test.webm -c:v libx264 converted.mp4", shell=True)

subprocess.call cannot send stdout to ffmpeg

My code is in Python. It calls the espeak command to generate .wav audio, then calls ffmpeg to convert the WAV file to MP3.
But subprocess.call cannot pipe stdout from espeak to ffmpeg the way this shell command does:
espeak -f myfile --stdout | ffmpeg -i - final.mp3
The example:
subprocess.call(["espeak", "test text to speak", "--stdout", "|"]+("ffmpeg -i - -vn -y -ar 22050 -ac 1 -ab 16k -af volume=2 -f mp3 mp3OutFile.mp3").split(" "))
What is the mistake? How can I fix it?
The pipeline you wrote is handled by the shell, and won't work (as written) unless you use shell=True. Instead of doing that, you should construct the pipeline in Python, which is pretty simple with subprocess:
import subprocess
# espeak writes the WAV data to its stdout...
p1 = subprocess.Popen(['espeak', '-f', 'myfile', '--stdout'], stdout=subprocess.PIPE)
# ...and ffmpeg reads it from stdin ('-i -' means "read input from stdin")
p2 = subprocess.Popen(['ffmpeg', '-i', '-', 'final.mp3'], stdin=p1.stdout)
p1.stdout.close()  # the pipe is already attached to p2 and unneeded in this process
p2.wait()
p1.wait()
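If buffering the whole WAV file in memory is acceptable, a simpler variant (a sketch, not part of the original answer) uses subprocess.run and its input= argument instead of wiring the pipe by hand; the -y flag just lets ffmpeg overwrite an existing output file without prompting:
import subprocess
# Capture espeak's WAV output as bytes...
wav = subprocess.run(['espeak', '-f', 'myfile', '--stdout'],
                     check=True, stdout=subprocess.PIPE).stdout
# ...then feed it to ffmpeg on stdin.
subprocess.run(['ffmpeg', '-y', '-i', '-', 'final.mp3'],
               input=wav, check=True)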

using os.system for multiple line commands

I am trying to run shell code from a Python file to submit another Python file to a computing cluster. The shell code is as follows:
#BSUB -J Proc[1]
#BSUB -e ~/logs/proc.%I.%J.err
#BSUB -o ~/logs/proc.%I.%J.out
#BSUB -R "span[hosts=1]"
#BSUB -n 1
python main.py
But when I run it from Python like the following, I can't get it to work:
from os import system
system('bsub -n 1 < #BSUB -J Proc[1];#BSUB -e ~/logs/proc.%I.%J.err;#BSUB -o ~/logs/proc.%I.%J.out;#BSUB -R "span[hosts=1]";#BSUB -n 1;python main.py')
Is there something I'm doing wrong here?
If I understand correctly, all the #BSUB stuff is text that should be fed to the bsub command as input; bsub is run locally, then runs those commands for you on the compute node.
In that case, you can't just do:
bsub -n 1 < #BSUB -J Proc[1];#BSUB -e ~/logs/proc.%I.%J.err;#BSUB -o ~/logs/proc.%I.%J.out;#BSUB -R "span[hosts=1]";#BSUB -n 1;python main.py
That's interpreted by the shell as "run bsub -n 1 and read from a file named OH CRAP A COMMENT STARTED AND NOW WE DON'T HAVE A FILE TO READ!"
You could fix this with MOAR HACKERY (using echo or here strings, taking on further unnecessary dependencies on shell execution). But if you want to feed stdin input, the best solution is to use a more powerful tool for the task, the subprocess module:
import subprocess
# Open a process (no shell wrapper) that we can feed stdin to
proc = subprocess.Popen(['bsub', '-n', '1'], stdin=subprocess.PIPE)
# Feed the command series you needed to stdin, then wait for the process to complete
# Per Michael Closson, you can't use semicolons; bsub requires newlines
proc.communicate(b'''#BSUB -J Proc[1]
#BSUB -e ~/logs/proc.%I.%J.err
#BSUB -o ~/logs/proc.%I.%J.out
#BSUB -R "span[hosts=1]"
#BSUB -n 1
python main.py
''')
# Assuming the exit code is meaningful, check it here
if proc.returncode != 0:
    pass  # Handle a failed process launch here
This avoids a shell launch entirely (removing the issue with needing to deal with comment characters at all, along with all the other issues with handling shell metacharacters), and is significantly more explicit about what is being run locally (bsub -n 1) and what is commands being run in the bsub session (the stdin).
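On Python 3.5+, the same idea can be written a little more compactly with subprocess.run, which sends the stdin data via input= and can do the exit-status check for you (a sketch under the same assumptions about the bsub invocation):
import subprocess
job_script = b'''#BSUB -J Proc[1]
#BSUB -e ~/logs/proc.%I.%J.err
#BSUB -o ~/logs/proc.%I.%J.out
#BSUB -R "span[hosts=1]"
#BSUB -n 1
python main.py
'''
# input= feeds the job script to bsub's stdin; check=True raises
# CalledProcessError if bsub exits with a non-zero status.
subprocess.run(['bsub', '-n', '1'], input=job_script, check=True)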
The #BSUB directives are parsed by the bsub binary, which doesn't support ; as a delimiter. You need to use newlines. This worked for me.
#!/usr/bin/python
import subprocess
# Open a process (no shell wrapper) that we can feed stdin to
proc = subprocess.Popen(['bsub', '-n', '1'], stdin=subprocess.PIPE)
# Feed the job script to stdin, then wait for the process to complete
job_script = b"""#!/bin/sh
#BSUB -J mysleep
sleep 101
"""
proc.communicate(job_script)
*** So obviously I got the Python code from @ShadowRanger; +1 his answer. I would have posted this as a comment to his answer if SO supported Python code in comments.
