How can I create a way to bulk cut audio? - python

I want to create an automated way to cut .mp3 files to 45 seconds.
So far I have been able to use ffmpeg to cut the audio to 45 seconds with this command:
ffmpeg -t 45 -i input.mp3 -acodec copy output.mp3
However, this does not actually speed anything up; if I have to run the command for each file I might as well use Audacity. I know that I should be able to use a .bat file to loop over the files, but I don't know how to set up the loop. In Python I would create a list of the file names in my directory with listdir:
fileNames = listdir(path)
and then create a for loop:
(something like
import subprocess
# assumes the script runs inside the directory that was listed,
# and that listdir already includes the .mp3 extension in fileName
i = 1
for fileName in fileNames:
    subprocess.run(['ffmpeg', '-t', '45', '-i', fileName, '-acodec', 'copy', str(i) + '.mp3'])
    i += 1
that)
However I don't know how to create something like this in a .bat file. Some help with this, or a way to achieve this in python, would be much appreciated.

You can try the script below. Save the code into a *.bat file in the folder containing your mp3 files and run it; it will process all of them.
@echo off
setlocal enableextensions enabledelayedexpansion
set /a count = 1
for %%f in (*.mp3) do (
    set "output=!count!.mp3"
    ffmpeg -t 45 -i "%%f" -acodec copy "!output!"
    set /a count+=1
)
endlocal
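Since you asked about a Python route as well, here is a minimal sketch of the same loop using subprocess (assuming you run it from the folder containing the .mp3 files and that ffmpeg is on your PATH; the "trimmed" output folder is just an example name to avoid overwriting inputs):

import subprocess
from pathlib import Path

out_dir = Path('trimmed')          # example output folder so inputs are not clobbered
out_dir.mkdir(exist_ok=True)

# trim every .mp3 in the current folder to its first 45 seconds
for i, mp3 in enumerate(Path('.').glob('*.mp3'), start=1):
    subprocess.run(['ffmpeg', '-t', '45', '-i', str(mp3),
                    '-acodec', 'copy', str(out_dir / f'{i}.mp3')],
                   check=True)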

Related

Passing ffmpeg command containing % through Python subprocess

I'm trying to build a GUI with Tkinter where a set of images is converted, via press of a button, to an .mp4 video.
When I run the following from the command line, all is well:
> "ffmpeg -r 5 -i ptimage%03d -crf 20 animation.mp4"
However, in Python, the following gives me an error that I think is related to passing the % in the argument:
commandString = "ffmpeg -r 5 -i ptimage%03d -crf 20 animation.mp4"
args = shlex.split(commandString)
p = subprocess.run(args)
The error I get is ptimage%03d: No such file or directory. I'm 99% sure I'm running the command from the right directory; when I run the same command replacing ptimage%03d with ptimage000.jpg, a specific image in the list, I get a (really short) video successfully.
I've tried escaping the % with \%, but that doesn't help.
Any ideas?
You omitted the file extension. Use ptimage%03d.jpg, not ptimage%03d. With ptimage%03d, ffmpeg expects files named ptimage000, ptimage001, etc.
ffmpeg -framerate 5 -i ptimage%03d.jpg -crf 20 animation.mp4
Unrelated notes: some players (YouTube excluded) can't handle such a low frame rate, so consider adding the -r 10 output option. The same goes for the chroma subsampling: consider adding the -vf format=yuv420p output option.
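For what it's worth, the % itself needs no escaping when the command is passed as a list to subprocess; a minimal sketch of the corrected call (assuming the images are named ptimage000.jpg, ptimage001.jpg, ...):

import subprocess

# the %03d pattern is passed through to ffmpeg literally; only the
# missing .jpg extension caused the "No such file or directory" error
subprocess.run(['ffmpeg', '-framerate', '5', '-i', 'ptimage%03d.jpg',
                '-crf', '20', 'animation.mp4'], check=True)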

FFmpeg same Filename for output but without extension with python

I'm kind of a noob with Python, but I managed to make this code to encode multiple video files from a folder to H.265. Everything works fine except for the output name.
Actually the output name keeps the old file extension along with the new one, like this: "MyMovie.mov.mp4". I want it to be named "MyMovie.mp4". Is there any way to exclude the original file extension from the output file?
import os, sys, re
from pathlib import Path

input_folder = '/content/drive/My Drive/Videos'
output_folder = '/content/drive/My Drive/Videos/X265'
quality_setting = '30'
file_type = 'mp4'
my_suffixes = (".mp4", ".mov", ".mkv", ".avi", ".ts", ".flv", ".webm", ".wmv", ".mpg", ".m4v", ".f4v")

Path(output_folder).mkdir(parents=True, exist_ok=True)
for filename in os.listdir(input_folder):
    if filename.endswith(my_suffixes):
        cmd = !ffmpeg -v quiet -stats -hwaccel cuvid -i "$input_folder/{filename}" -metadata comment="X265-QF$quality_setting-AAC" -c:v hevc_nvenc -preset:v slow -rc vbr -cq $quality_setting -c:a aac -b:a 160k "$output_folder/{filename}.$file_type"
PS: This code is used on Google Colab; that's why I need this in Python.
In your for loop you check whether filename ends with one of the extensions listed above, so filename already contains something like output.mp4. Then in cmd you append .$file_type (which is set to mp4) to the end of the output path. I think you should remove that last part so you keep only the extension already contained in filename.
I found a way: I used rpartition to cut the file extension.
cmd = !ffmpeg -v quiet -stats -hwaccel cuvid -i "$input_folder/{filename}" -metadata comment="X265-QF$quality_setting-AAC" -c:v hevc_nvenc -preset:v slow -rc vbr -cq $quality_setting -c:a aac -b:a 160k "$output_folder/{filename.rpartition('.')[0]}.$file_type"
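As a side note, os.path.splitext (or pathlib's .stem) is the more idiomatic way to strip an extension, and it splits on the last dot just as rpartition('.')[0] does. A small sketch:

import os
from pathlib import Path

filename = 'MyMovie.mov'
stem = os.path.splitext(filename)[0]   # 'MyMovie'
# or, with pathlib:
stem = Path(filename).stem             # 'MyMovie'
output_name = f'{stem}.mp4'            # 'MyMovie.mp4'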

Show Filename in Video ffmpeg batch script

I have a folder with around 10 different mov files. I would like to add the filename as text on each of the videos using ffmpeg in a bat file. Could someone help me achieve this please?
EDIT:
I have tried using
#ECHO OFF&Setlocal EnableDelayedExpansion
Set INPUT=E:\\Users\\Oli\\Documents\\Projects\\v1.3.0\\downloads3
Set OUTPUT=E:\\Users\\Oli\\Documents\\Projects\\v1.3.0\\downloads3
for %%a in ("%INPUT%\*.*") DO (
set "filename=%%~na"
ffmpeg -i "%%a" -vf "drawtext=text=!fileName:.= !:x=105:y=120:fontfile=E:\\Users\\Oli\\Documents\\Projects\\v1.3.0\\downloads3\\impact.ttf:fontsize=25:fontcolor=white" -b:v 1M -r 60 -b:a 320k -ar 48000 -crf 17 "%%~na.mov"
)
But it gives me the error:
Cannot find a valid font for the family Sans
[AVFilterGraph @ 0000026eb75a9f40] Error initializing filter 'drawtext' with args 'text=FileName1'
Error reinitializing filters!
Failed to inject frame into filter network: No such file or directory
Error while processing the decoded data for stream #0:0
Let's get rid of the variable assignment and simply use variable expansion to set the name. Also, though it will still work, remove the secondary backslash because it is not needed and looks ugly. Lastly, always wrap set variables for paths in double quotes. Give this a try.
#echo off
set "INPUT=E:\Users\Oli\Documents\Projects\v1.3.0\downloads3"
set "OUTPUT=E:\Users\Oli\Documents\Projects\v1.3.0\downloads3"
for %%a in ("%INPUT%\*.*") do (
ffmpeg -i "%%~a" -vf "drawtext=text=%%~na:x=105:y=120:fontfile=%~dp0impact.ttf:fontsize=25:fontcolor=white" -b:v 1M -r 60 -b:a 320k -ar 48000 -crf 17 "%%~na.mov"
)
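If you would rather drive this from Python, here is a rough sketch of the same loop (assumptions: ffmpeg is on PATH, impact.ttf sits next to the script, and the filenames contain no quotes or colons; the "_titled" suffix is just an example to avoid overwriting inputs):

import subprocess
from pathlib import Path

# hypothetical font location: next to this script; adjust as needed
font = Path(__file__).with_name('impact.ttf')
# drawtext treats ':' as an option separator, so escape the drive-letter
# colon and use forward slashes in the font path
font_arg = str(font).replace('\\', '/').replace(':', r'\:')

for mov in Path('.').glob('*.mov'):
    drawtext = (f"drawtext=text='{mov.stem}':x=105:y=120:"
                f"fontfile={font_arg}:fontsize=25:fontcolor=white")
    out = mov.with_name(mov.stem + '_titled.mov')   # avoid clobbering the input
    subprocess.run(['ffmpeg', '-i', str(mov), '-vf', drawtext,
                    '-b:v', '1M', '-r', '60', '-b:a', '320k', '-ar', '48000',
                    '-crf', '17', str(out)], check=True)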

Turn .pyw Python-script to batch process

I've got an existing .pyw-script (with a GUI) and want to turn it into a batch process.
The script itself converts a PDF to a new PDF (for backup), but it's a bit annoying because I can only process one file at a time.
Here are the inputs:
Input-Filepath
Output-Filepath
Is there a way to pass a folder path so that it converts all the existing files inside?
You could automate it using your favourite console script. I like bash:
The scripts are untested!
A version that expects: <thescript> inputfile1 inputfile2 ... inputfileN and outputs inputfileN_out.pdf
for ((i = 1; i <= $#; ++i)); do
    inputfile="${!i}"
    outputfile="${inputfile}_out.pdf"
    <your python file> "$inputfile" "$outputfile"
done
And here is a version that takes a folder and processes all pdf files found and outputs pdffilename_out.pdf
while read -d$'\0' -r inputfile; do
    outputfile="${inputfile}_out.pdf"
    <your python file> "$inputfile" "$outputfile"
done < <(find "$1" -type f -iname '*.pdf' -print0)
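Since the existing script is already Python, you could also do the looping there. A rough sketch (convert_pdf and the mytool module are hypothetical stand-ins for however your .pyw script exposes its conversion):

from pathlib import Path

from mytool import convert_pdf   # hypothetical import of your conversion function

def convert_folder(folder):
    # convert every PDF in the folder, writing <name>_out.pdf alongside each input
    for pdf in Path(folder).glob('*.pdf'):
        convert_pdf(str(pdf), str(pdf.with_name(pdf.stem + '_out.pdf')))

convert_folder('/path/to/pdfs')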

"gsutil rm" command using STDIN

I use gsutil in a Linux environment for managing files in GCS. I enjoy being able to use the command
gsutil -m cp -I gs://...
preceded by some other command to pass the STDIN to gsutil for uploading files; in doing so, I can maintain a local list of files that have been uploaded or generate specific patterns to upload and hand them off.
I would like to be able to do a similar command like
gsutil -m rm -I gs://...
to scrub files similarly. Presently, I build a big list of files to remove and run it with the following code:
while read line
do
gsutil rm gs://...
done < "$myfile.txt"
This is extraordinarily slow compared to the multithreaded "gsutil -m rm..." command, and enabling the -m flag has no effect when you have to process files one at a time from a list. I also experimented with just running
gsutil -m rm gs://.../* # remove everything
<my command> | gsutil -m cp -I gs://.../ # put back the pieces that I want
but this involves recopying a lot of a data and wastes a lot of time; the data is already there and just needs to have some removed. Any thoughts would be appreciated. Also, I don't have a lot of flexibility on either end with renaming files; otherwise, a quick rename before uploading would handle all of this.
As an interim solution, since we don't have a -I option for rm right now, how about just building a string of all the objects you want to delete in your loop and then using a single gsutil -m rm to delete them all at once? You could also do this with a simple Python script that invokes the gsutil command as a separate process.
Expanding on your earlier example, maybe something like the following (disclaimer: my bash-fu isn't the greatest, and I haven't tested this):
objects=''
while read line
do
    objects="$objects gs://$line"
done < "$myfile.txt"
gsutil -m rm $objects
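And the Python variant the answer alludes to might look like this (a sketch assuming the object paths, one per line without the gs:// prefix, live in myfile.txt):

import subprocess

# read the object list and hand the whole batch to one multithreaded rm call
with open('myfile.txt') as f:
    objects = ['gs://' + line.strip() for line in f if line.strip()]

subprocess.run(['gsutil', '-m', 'rm'] + objects, check=True)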
For anyone wondering, I wound up doing as Zach Wilt indicated above. For reference, I was removing on the order of a couple thousand files from each of 5 directories, so roughly 10,000 files. Doing this without the "-m" switch was taking upwards of 30 minutes; with the "-m" switch, it takes less than 30 seconds. Zoom!
For a robust example: I am using this to update Google Cloud Storage files to match local files. On the current day, I have a program that dumps lots of files that are incremental, and also a handful that are "rolled up". After a week, the incremental files get scrubbed locally automatically, but the same should happen in GCS to save the space. Here's how to do this:
#!/bin/bash
# get the full date strings for touch
start=`date --date='-9 days' +%x`
end=`date --date='-8 days' +%x`
# other vars
mon=`date --date='-9 days' +%b | tr '[A-Z]' '[a-z]'`
day=`date --date='-9 days' +%d`
# display start and finish times
echo "Cleaning files from $start"
# update start and finish times
touch --date="$start" /tmp/start1
touch --date="$end" /tmp/end1
# repeat for all servers
for dr in "dir1" "dir2" "dir3" ...
do
    # list files in range and build retention file
    find /local/path/$dr/ -newer /tmp/start1 ! -newer /tmp/end1 > "$dr-local.txt"
    # get list of all files from appropriate folder on GCS
    gsutil ls gs://gcs_path/$mon/$dr/$day/ > "$dr-gcs.txt"
    # formatting the host list file
    sed -i "s|gs://gcs_path/$mon/$dr/$day/|/local/path/$dr/|" "$dr-gcs.txt"
    # build sed command file to delete matches
    while read line
    do
        echo "\|$line|d" >> "$dr-del.txt"
    done < "$dr-local.txt"
    # run command file to strip lines for files that need to remain
    sed -f "$dr-del.txt" <"$dr-gcs.txt" >"$dr-out.txt"
    # convert local names to GCS names
    sed -i "s|/local/path/$dr/|gs://gcs_path/$mon/$dr/$day/|" "$dr-out.txt"
    # new variable to hold string
    del=""
    # convert newline separated file to one long string
    while read line
    do
        del="$del$line "
    done < "$dr-out.txt"
    # remove all files matching the final output
    gsutil -m rm $del
    # cleanup files
    rm "$dr-local.txt"
    rm "$dr-gcs.txt"
    rm "$dr-del.txt"
    rm "$dr-out.txt"
done
You'll need to modify this to fit your needs, but it is a concrete and working method for deleting files locally and then synchronizing the change to Google Cloud Storage. Thanks again to @Zach Wilt.