Invalid syntax while concatenating mpeg files on Windows - Python

I am trying to concatenate all mpeg files into one new file on Windows 7. I adjusted the environment variables and ran the code from the Python shell, but it gives invalid syntax. Any help? I am new to Python and the ffmpeg library.
My code:
ffmpeg -f concat -i <(for f in glob.glob("*.mpeg"); do echo "file '$PWD/$f'"; done) -c copy output.mpeg
Thanks

Your example code is a mix of Python code and Bash code, so it can't run in a Python shell or in a Bash shell :)
On Linux it works in Bash as two commands (Windows probably doesn't have the printf command):
printf "file '%s'\n" *.mpeg > input.txt
ffmpeg -f concat -i input.txt -c copy output.mpeg
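For reference, the input.txt this produces is just one file '...' line per clip, which is the format ffmpeg's concat demuxer expects (the paths here are illustrative):
file '/home/user/videos/clip1.mpeg'
file '/home/user/videos/clip2.mpeg'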
Python version which doesn't need Bash:
#!/usr/bin/env python3

import os
import sys
import glob
import subprocess

# get Current Working Directory (CWD)
pwd = os.getcwd()

# get list of files
if len(sys.argv) > 1:
    #filenames = sys.argv[1:]           # Linux: the shell expands the glob
    filenames = glob.glob(sys.argv[1])  # Windows: expand the glob ourselves
else:
    filenames = glob.glob("*.mpeg")
#print(filenames)

# generate "input.txt" file
with open("input.txt", "w") as f:
    for name in filenames:
        f.write("file '{}/{}'\n".format(pwd, name))
        #f.write("file '{}'\n".format(name))

# run ffmpeg
subprocess.run('ffmpeg -f concat -i input.txt -c copy output.mpeg', shell=True)
And you can run it with or without an argument, e.g. "*.wav" (on Linux, quote the pattern so the shell doesn't expand it; Windows cmd passes it through literally):
python script.py "*.wav"
(tested only on Linux)
printf (and other Bash commands) are available for Windows from GnuWin32.

Related

Meaning of the -u and -f flags inside os.system in Python

I am trying to run C executables, say c4.5 and c4.5rules, each of which takes a single command line argument, from inside Python code using os.system. I am using C executables written by someone else, and I don't have the original .c files. I need to write something like the code below to make sure the executable is doing okay:
import sys
import os
dataset = sys.argv[1]
os.system(f"/home/Dev/c4.5 -u -f {dataset}")
os.system(f"/home/Dev/c4.5rules -u -f {dataset}")
os.system(f"/home/Dev/c4.5rules -u -f {dataset} > Temp")
f = open('Temp')
...
And if I remove the -u and -f, it does not work.
Why is that happening, and what is the use of -u and -f?
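In case it helps: if the goal is only to capture the executable's output, subprocess can do it without the Temp file. A minimal sketch, reusing the question's (unverified) paths and flags:
import subprocess
import sys

dataset = sys.argv[1]

# same call as above, but capture stdout directly instead of
# redirecting to a Temp file and reopening it
result = subprocess.run(["/home/Dev/c4.5rules", "-u", "-f", dataset],
                        capture_output=True, text=True)
output = result.stdout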

Execute a Python program with multiple files - Python - Bash

I have hundreds of XML files and I would like to parse them into CSV files. I have already written this program.
To execute the Python program I use this command (in MS VS Code):
python ConvertXMLtoCSV.py -i Alarm120.xml -o Alarm120.csv
My question is: how do I change this script to integrate a sort of for loop that executes the program for each XML file?
UPDATE
If my files and folders are organized like in the picture:
I tried this and executed the .bat file in Windows 10, but it does nothing:
#!/bin/bash
for xml_file in XML_Files/*.xml
do
    csv_file=${xml_file/.xml/.csv}
    python ConvertXMLtoCSV.py -i XML_Files/$xml_file -o CSV_Files/$csv_file
done
Ideally the for loop would be included inside your ConvertXMLtoCSV.py itself. You can use this to find all xml files in a given directory:
for file in os.listdir(directory_path):
    if file.endswith(".xml"):
        # And here you can do your conversion
You could change the arguments given to the script to be the path of the directory the xml files are located in and the path for an output folder for the .csv files. For renaming, you can leave the files with the same name but give the .csv extension. i.e.
csv_name = file.replace(".xml", ".csv")
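Putting those pieces together, a minimal sketch of what the integrated loop could look like (the argument handling and the convert_one helper are illustrative assumptions, since the rest of ConvertXMLtoCSV.py isn't shown):
import os
import sys

def convert_one(xml_path, csv_path):
    # placeholder for the existing conversion logic in ConvertXMLtoCSV.py
    ...

xml_dir = sys.argv[1]  # directory containing the .xml files
csv_dir = sys.argv[2]  # output directory for the .csv files

for file in os.listdir(xml_dir):
    if file.endswith(".xml"):
        csv_name = file.replace(".xml", ".csv")
        convert_one(os.path.join(xml_dir, file),
                    os.path.join(csv_dir, csv_name))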
If you want to keep your Python script as-is (process one file), and add the looping externally in bash, you could do:
#!/bin/bash
for xml_file in *.xml
do
    csv_file=${xml_file/.xml/.csv}
    python ConvertXMLtoCSV.py -i $xml_file -o $csv_file
done
After discussion, it appears that you wish to use an external script so as to leave the original ConvertXMLtoCSV.py unmodified (it is required by other projects), and that although you tagged bash in the question, you were not in fact able to invoke Python from bash in your setup.
This being the case, Rolv Apneseth's answer can be adapted so that the looping is done in Python, but in a separate script (let's suppose it is called convert_all.py) which runs the unmodified ConvertXMLtoCSV.py as an external process. This way, ConvertXMLtoCSV.py is still set up to process only one file each time it is run.
To call an external process, you could either use os.system or subprocess.Popen, so here are two options.
Using os.system:
import os
import sys

directory_path = sys.argv[1]

for file in os.listdir(directory_path):
    if file.endswith(".xml"):
        csv_name = file.replace(".xml", ".csv")
        # build full paths so the script also works when directory_path
        # is not the current working directory
        xml_path = os.path.join(directory_path, file)
        csv_path = os.path.join(directory_path, csv_name)
        os.system(f'python ConvertXMLtoCSV.py -i {xml_path} -o {csv_path}')
Note: for versions of Python too old to support f-strings, that last line could be changed to
os.system('python ConvertXMLtoCSV.py -i {} -o {}'.format(xml_path, csv_path))
Using subprocess.Popen:
import os
import subprocess
import sys

directory_path = sys.argv[1]

for file in os.listdir(directory_path):
    if file.endswith(".xml"):
        csv_name = file.replace(".xml", ".csv")
        p = subprocess.Popen(['python', 'ConvertXMLtoCSV.py',
                              '-i', os.path.join(directory_path, file),
                              '-o', os.path.join(directory_path, csv_name)])
        p.wait()
You could then run it using some command such as:
python convert_all.py C:/Users/myuser/Desktop/myfolder
or whatever the folder is where you have the XML files.

Rsync include/exclude don't work in Python subprocess

I use rsync to move files from my home computer to a server. Here's the command I use to transfer and update only the files under directories matching a glob. I execute this command from the toplevel/ directory of the directory structure shown below.
rsync -r --progress --include='**201609*/***' --exclude='*' -avzh files/ user@server.edu:/user/files
Here's what the file structure of the working directory on my home file looks like:
- toplevel
  - items
  - files
    - 20160315
    - 20160910
      - dir1
        - really_cool_file1
    - 20160911
      - dir2
This works fine, and the file structure on user@server.edu:/user/files is the same as on my home computer.
I wrote a Python script to do this, and it doesn't work: it also transfers files/20160315, which is not what I want.
#!/usr/bin/env python3

import os
from subprocess import run

os.chdir("toplevel")

command_run = ["rsync", "-r",
               "--progress",
               "--include='**201609*/***'",
               "--exclude='*'",
               "-avzh",
               "files/", "user@server.edu:/user/files"]
run(command_run, shell=False, check=True)
What's going on here? I had the same problem when command_run was a string, and I passed it to subprocess.run() with shell=True.
Some of those quotes are removed by the shell before being passed to the called process. You need to remove them yourself if you call the program with the default shell=False. This little script will tell you what your parameters need to look like:
test.py
#!/usr/bin/env python3
import sys
print(sys.argv)
And then running it with your command line:
~/tmp $ ./test.py -r --progress --include='**201609*/***' --exclude='*' -avzh files/ user@server.edu:/user/files
['./test.py', '-r', '--progress', '--include=**201609*/***', '--exclude=*', '-avzh', 'files/', 'user@server.edu:/user/files']
~/tmp $
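Applied to the original script, the fix is simply to drop the inner quotes from the --include and --exclude arguments, since no shell is there to strip them:
command_run = ["rsync", "-r",
               "--progress",
               "--include=**201609*/***",
               "--exclude=*",
               "-avzh",
               "files/", "user@server.edu:/user/files"]
run(command_run, shell=False, check=True)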

Use curl to download multiple files

I have to use cURL on Windows from a Python script. My goal is: using the Python script, get all files from a remote directory ... preferably into a local directory. After that I will compare each file with files stored locally. I am able to get one file at a time, but I need to get all of the files from the remote directory.
Could someone please advise how to get multiple files?
I use this command:
curl.exe -o file1.txt sftp:///dir1/file1.txt -k -u user:password
Thanks
I haven't tested this, but I think you could just try launching each shell command as a separate process to run them simultaneously. Obviously, this might be a bad idea if you have a large set of files, so you might need to manage that more carefully (see the Pool sketch after the code). Here's some untested code; you'd need to edit the cmd variable in the get_file function, of course.
from multiprocessing import Process
import subprocess

def get_file(filename):
    cmd = '''curl.exe -o {} sftp:///dir1/{} -k -u user:password'''.format(filename, filename)
    subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)  # run the shell command

if __name__ == '__main__':  # guard is required for multiprocessing on Windows
    files = ['file1.txt', 'file2.txt', 'file3.txt']
    for filename in files:
        p = Process(target=get_file, args=(filename,))  # create a process which passes filename to get_file()
        p.start()
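One way to manage that more carefully is a multiprocessing.Pool, which caps how many downloads run at once. A sketch (the pool size of 4 is an arbitrary choice, and the curl command is the same placeholder as above):
from multiprocessing import Pool
import subprocess

def get_file(filename):
    cmd = 'curl.exe -o {} sftp:///dir1/{} -k -u user:password'.format(filename, filename)
    subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)

if __name__ == '__main__':
    files = ['file1.txt', 'file2.txt', 'file3.txt']
    with Pool(4) as pool:  # at most 4 downloads at a time
        pool.map(get_file, files)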

Pass arguments to a Python script from a PowerShell script

I have a Python script which is invoked as given below:
mypython.py -m Filename -i Filepath
I want to execute this Python script from a PowerShell script.
I tried the following:
$myscript='$filepath\mypython.py'
$myargs="-m Filename -i Filepath"
& python.exe $myscript $myargs
But I get an error from mypython.py which says that not all arguments were supplied.
I got it myself. The problem is that PowerShell passes $myargs to python.exe as a single string argument, so mypython.py sees one argument instead of four.
Just pass the arguments like below:
& python.exe $myscript -m "Filename" -i "Filepath"
This solves the issue.
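To see what happened, the same sys.argv-printing trick from the rsync answer above works here too (argv_test.py is a throwaway name):
# argv_test.py - print exactly what the script receives
import sys
print(sys.argv)
Running & python.exe argv_test.py $myargs prints ['argv_test.py', '-m Filename -i Filepath'] (one combined argument), while & python.exe argv_test.py -m "Filename" -i "Filepath" prints the four arguments separately.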
