The File Generated by the Command Line Doesn't Exist - Python

I'm using the SoX command on Windows to convert .wav files to spectrogram .png files. Run directly from cmd it works, but in the Python script the call returns 1 and the image doesn't exist in the specified directory.
import subprocess

rootdir = r'C:\Users\Heba\output.wav'
wave_path = rootdir
wave_image_path = wave_path.replace(".wav", ".png")
cmdstring = 'sox "{}" -n spectrogram -r -o "{}"'.format(wave_path, wave_image_path)
subprocess.call(cmdstring, shell=True)
It doesn't print any errors either. Any help, please?
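For debugging this kind of failure, one way to see why the call returns 1 is to capture what sox writes to stderr instead of discarding it; a minimal sketch using subprocess.run (Python 3.7+), reusing the paths above:

import subprocess

wave_path = r'C:\Users\Heba\output.wav'
wave_image_path = wave_path.replace(".wav", ".png")
cmdstring = 'sox "{}" -n spectrogram -r -o "{}"'.format(wave_path, wave_image_path)

# capture output so sox's error message is visible instead of lost
result = subprocess.run(cmdstring, shell=True, capture_output=True, text=True)
print(result.returncode)
print(result.stderr)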

Related

How to zip all outputs (*.csv and *.tiff) in jupyter notebook

I use this command
! zip -r results.zip . -i *.csv *.pdf
in a Jupyter Notebook (Python 3.7.10) in order to zip all the output files, but it shows
'zip' is not recognized as an internal or external command, operable program or batch file.
Can anyone suggest what I'm missing?
I think what you are looking for is to run the shell command from Python, e.g. with os.system().
If you are using Linux:
import os
os.system("zip -r results.zip . -i *.csv *.pdf")
# OR
import subprocess
subprocess.Popen("zip -r results.zip . -i *.csv *.pdf", shell=True)  # shell=True runs the string through the shell, like os.system
If you aren't on Linux but on Windows, there is a standard-library module called zipfile you can use:
from zipfile import ZipFile
import os

files = os.listdir(".")  # "." is the working folder; or you could alternatively use glob (see below)
with ZipFile('output.zip', 'w') as myzip:
    for file in files:
        if file.endswith(".csv") or file.endswith(".pdf"):  # "or", not "and": no name ends with both
            myzip.write(file)
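As the comment in the code above hints, glob can replace the listdir-plus-endswith filtering; a short sketch of that variant:

import glob
from zipfile import ZipFile

# glob expands the patterns itself, so no endswith() checks are needed
with ZipFile('output.zip', 'w') as myzip:
    for file in glob.glob("*.csv") + glob.glob("*.pdf"):
        myzip.write(file)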
On modern Windows the built-in zip tool is tar (bsdtar).
It is not well documented, nor as easy to use as most third-party zippers, so you would need to wrap the OS call yourself.
Generally the following should be good enough:
tar -a -cf results.zip *.csv *.pdf
However, if files of one type or the other are missing, the command still completes for the group that exists, but with a very cryptic response:
Tar: : Couldn't visit directory: No such file or directory
Tar: Error exit delayed from previous errors.
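Wrapping that call from Python might look like the sketch below; it assumes the built-in tar (bsdtar) is on PATH, as it is on Windows 10 1803 and later:

import subprocess

# bsdtar handles the *.csv / *.pdf patterns itself, since cmd.exe does not expand wildcards
result = subprocess.run("tar -a -cf results.zip *.csv *.pdf",
                        shell=True, capture_output=True, text=True)
if result.returncode != 0:
    print(result.stderr)  # this is where the cryptic tar messages land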

Execute python program with multiple files - Python - Bash

I have hundreds of XML files and I would like to parse them into CSV files. I have already written this program.
To execute the Python program I use this command (in VS Code):
python ConvertXMLtoCSV.py -i Alarm120.xml -o Alarm120.csv
My question is: how can I change this script to add a for loop that executes the program for each XML file?
UPDATE
If my files and folders are organized like in the picture (an XML_Files folder for the input and a CSV_Files folder for the output):
I tried this, saved it as a .bat file, and executed it on Windows 10, but it does nothing:
#!/bin/bash
for xml_file in XML_Files/*.xml
do
    csv_file=${xml_file/.xml/.csv}
    python ConvertXMLtoCSV.py -i XML_Files/$xml_file -o CSV_Files/$csv_file
done
Ideally the for loop would be included inside your ConvertXMLtoCSV.py itself. You can use this to find all xml files in a given directory:
import os

for file in os.listdir(directory_path):
    if file.endswith(".xml"):
        # ...and here you can do your conversion
You could change the arguments given to the script to be the path of the directory the XML files are located in and the path of an output folder for the .csv files. For naming, you can keep each file's name but give it the .csv extension, e.g.:
csv_name = file.replace(".xml", ".csv")
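Put together, that suggestion might look something like the sketch below, where convert() is a hypothetical stand-in for the conversion logic already in ConvertXMLtoCSV.py:

import os
import sys

def convert(xml_path, csv_path):
    """Hypothetical placeholder for the existing XML-to-CSV logic."""
    ...

input_dir = sys.argv[1]   # directory containing the .xml files
output_dir = sys.argv[2]  # directory to write the .csv files into

for file in os.listdir(input_dir):
    if file.endswith(".xml"):
        csv_name = file.replace(".xml", ".csv")
        convert(os.path.join(input_dir, file), os.path.join(output_dir, csv_name))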
If you want to keep your Python script as-is (processing one file), and add the looping externally in bash, you could do:
#!/bin/bash
for xml_file in *.xml
do
    csv_file=${xml_file/.xml/.csv}
    python ConvertXMLtoCSV.py -i "$xml_file" -o "$csv_file"
done
After discussion, it appears that you wish to use an external script so as to leave the original ConvertXMLtoCSV.py unmodified (as required by other projects). Although you tagged bash in the question, it turned out that you were not actually able to invoke python from bash in your setup.
This being the case, it is possible to adapt Rolv Apneseth's answer so that you do the looping in Python, but inside a separate script (let's suppose that this is called convert_all.py), which then runs the unmodified ConvertXMLtoCSV.py as an external process. This way, the ConvertXMLtoCSV.py will still be set up to process only one file each time it is run.
To call an external process, you could either use os.system or subprocess.Popen, so here are two options.
Using os.system:
import os
import sys

directory_path = sys.argv[1]

for file in os.listdir(directory_path):
    if file.endswith(".xml"):
        csv_name = file.replace(".xml", ".csv")
        os.system(f'python ConvertXMLtoCSV.py -i {file} -o {csv_name}')
Note: for versions of Python too old to support f-strings (pre-3.6), that last line could be changed to
        os.system('python ConvertXMLtoCSV.py -i {} -o {}'.format(file, csv_name))
Using subprocess.Popen:
import os
import subprocess
import sys

directory_path = sys.argv[1]

for file in os.listdir(directory_path):
    if file.endswith(".xml"):
        csv_name = file.replace(".xml", ".csv")
        p = subprocess.Popen(['python', 'ConvertXMLtoCSV.py',
                              '-i', file,
                              '-o', csv_name])
        p.wait()
You could then run it using some command such as:
python convert_all.py C:/Users/myuser/Desktop/myfolder
or whatever the folder is where you have the XML files.

Sox in subprocess not creating output file

I am writing a Python script to mix multiple audio files into one combined file. I am using the Sox command within a subprocess call; however, it is not creating a mixed output file. Does anyone know how to fix this?
subprocess.call('sox -m %s %s' % (' '.join(audios), 'audio_file.mp3'), shell=True)
audios is a list of the mp3 files I have, each written as a full path (e.g. "./Files/Music/a.mp3"), and audio_file.mp3 is the mixed audio file I wish to create. I have included the line !pip install sox at the top of the script.
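For what it's worth, passing an argument list avoids shell quoting issues, and check=True makes Python raise if sox exits non-zero, which surfaces the problem; a sketch with example input paths (note that pip install sox installs the pysox wrapper, not the SoX binary itself, which must be installed separately):

import subprocess

audios = ["./Files/Music/a.mp3", "./Files/Music/b.mp3"]  # example inputs

# check=True raises CalledProcessError if sox fails, instead of failing silently
subprocess.run(["sox", "-m", *audios, "audio_file.mp3"], check=True)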

Invalid syntax while concatenating mpeg files on Windows - Python

I am trying to concatenate all mpeg files together into one new file on Windows 7. I adjusted the environment variables and am running the code from the Python shell, but it gives invalid syntax. Any help? I am new to Python and the ffmpeg library.
My code:
ffmpeg -f concat -i <(for f in glob.glob("*.mpeg"); do echo "file '$PWD/$f'"; done) -c copy output.mpeg
Thanks
Your example code is a mix of Python code and Bash code, so it can't run in the Python shell nor in the Bash shell :)
On Linux it works in Bash as two commands:
(Windows probably doesn't have a printf command)
printf "file '%s'\n" *.wav > input.txt
ffmpeg -f concat -i input.txt -c copy output.mpeg
Python version which doesn't need Bash:
#!/usr/bin/env python3

import os
import sys
import glob
import subprocess

# get Current Working Directory (CWD)
pwd = os.getcwd()

# get list of files
if len(sys.argv) > 1:
    #filenames = sys.argv[1:]           # Linux
    filenames = glob.glob(sys.argv[1])  # Windows
else:
    filenames = glob.glob("*.mpg")

#print(filenames)

# generate "input.txt" file
with open("input.txt", "w") as f:
    for name in filenames:
        f.write("file '{}/{}'\n".format(pwd, name))
        #f.write("file '{}'\n".format(name))

# run ffmpeg
subprocess.run('ffmpeg -f concat -i input.txt -c copy output.mpeg', shell=True)
And you can run it with or without an argument, e.g. "*.wav":
python script.py *.wav
(tested only on Linux)
printf (and other Bash commands) for Windows: GnuWin32
More on GnuWin32

Use curl to download multiple files

I have to use cURL on Windows from a Python script. My goal is to use the Python script to get all files from a remote directory, preferably into a local directory. After that I will compare each file with the files stored locally. I am able to get one file at a time, but I need to get all of the files from the remote directory.
Could someone please advise how to get multiple files?
I use this command:
curl.exe -o file1.txt sftp:///dir1/file1.txt -k -u user:password
Thanks
I haven't tested this, but I think you could just try launching each shell command as a separate process to run them simultaneously. Obviously, this might be a bad idea if you have a large set of files, so you might need to manage that more carefully. Here's some untested code, and you'd need to edit the 'cmd' variable in the get_file function, of course.
from multiprocessing import Process
import subprocess

def get_file(filename):
    cmd = '''curl.exe -o {} sftp:///dir1/{} -k -u user:password'''.format(filename, filename)
    subprocess.check_output(cmd, shell=True, stderr=subprocess.STDOUT)  # run the shell command

if __name__ == '__main__':  # required on Windows so child processes can re-import the module safely
    files = ['file1.txt', 'file2.txt', 'file3.txt']
    for filename in files:
        p = Process(target=get_file, args=(filename,))  # create a process which passes filename to get_file()
        p.start()
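If the set of files grows large, as cautioned above, a thread pool is one way to cap how many transfers run at once; a sketch reusing the same hypothetical get_file() from above:

from concurrent.futures import ThreadPoolExecutor

files = ['file1.txt', 'file2.txt', 'file3.txt']

# at most 4 downloads run concurrently; the rest queue up
with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(get_file, files))  # list() forces completion and re-raises any errors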
