Show if folders contain files or not in Python

I have a folder containing subfolders and I'd like to check if these subfolders contain files or not.
Output should look like
C:...\2001 contains files
C:...\2002 contains no files
and so on.
So far, I have only found advice on how to delete empty folders, but I want to keep them since they might be filled later.
(I started coding 2 months ago but the people at my company think I'm some kind of IT wizard so I'd be very thankful if someone could help.)

This would work:
from os import listdir

path = "C:.../"
subfolders = listdir(path)
folders_with_files = []
for subfolder in subfolders:
    if listdir(path + subfolder) != []:
        folders_with_files.append(subfolder)
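To produce the exact "contains files" / "contains no files" report from the question, a variant of the above (a sketch; `report_subfolders` is a made-up helper, and the root path stays a placeholder) that also skips anything in `path` that isn't a directory:

```python
import os

def report_subfolders(path):
    # Return (name, has_files) for each subfolder; plain files in path are skipped
    report = []
    for entry in sorted(os.listdir(path)):
        full = os.path.join(path, entry)
        if os.path.isdir(full):
            has_files = len(os.listdir(full)) > 0
            print(full, "contains files" if has_files else "contains no files")
            report.append((entry, has_files))
    return report
```

Using `os.path.join` also avoids relying on the path ending with a separator.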

Related

Python glob - differentiate between similar filenames in different directories

I'm using glob recursively to find all Excel files in a directory, including files located in subdirectories. After that, I stem the whole path of each file and only put the Excel name on a list for later usage. However, the problem I'm having is when I encounter two different Excel files that bear the same name.
I want the ability to differentiate between them on my stemmed list.
For example, this is what happens now with 3 files in the directory, 2 of which bear the same name:
mainDirectory/subDirectory1/file1.xlsx
mainDirectory/subDirectory2/file1.xlsx
mainDirectory/subDirectory2/file2.xlsx
The list would be:
file1
file2
file1
And it creates a lot of problems for me afterwards, so I need help on how to add something to each pair of duplicates to make the names unique, and later on to remove it.
I'm thinking maybe to add the parent sub directory up, something like this:
subDirectory1/file1
file2
subDirectory2/file1
Is there an option in glob to deal with this issue? Or anything else?
Here is my code:
import glob
import pathlib

excel_list = []
file_list = []
f = []
file_list = glob.glob(path + "/**" + "/*.xlsx", recursive=True)
f.extend(file_list)
for x in f:
    x = pathlib.Path(x).stem
    excel_list.append(x)
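Since glob itself has no option for this, one approach (a sketch; `disambiguated_stems` is a made-up helper name) is to keep the plain stem when it is unique and prefix the parent directory only for duplicates, which matches the list the question asks for:

```python
import pathlib
from collections import Counter

def disambiguated_stems(paths):
    # Count how often each stem occurs across all paths
    stems = [pathlib.Path(p).stem for p in paths]
    counts = Counter(stems)
    result = []
    for p, stem in zip(paths, stems):
        if counts[stem] > 1:
            # Duplicate: prefix the immediate parent directory name
            result.append(f"{pathlib.Path(p).parent.name}/{stem}")
        else:
            result.append(stem)
    return result
```

Removing the prefix later is then just `name.split("/")[-1]`.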

Remove images in multiple folders (Python)

I want to write a python script to randomly keep only some images in multiple folders.
I am new to python, and I am trying to find the solution. However, I could not find a good one to start with yet.
I would appreciate it if anyone could help me. Thank you.
This might help you. It first retrieves the list of all directories, and afterwards removes random files so that only n files remain. Note: path_to_all_images_folder has to be declared.
import os
import random

def keep_n_dir(directory, n):
    files = os.listdir(directory)  # You retrieve the list of names of files
    if len(files) > n:  # If you already have fewer than n files, you do nothing
        diff = len(files) - n
        files_to_delete = random.sample(files, k=diff)  # Randomly sample files to delete
        for file in files_to_delete:
            os.remove(os.path.join(directory, file))  # Delete the surplus files

directories = os.listdir(path_to_all_images_folder)
directories = [os.path.join(path_to_all_images_folder, folder) for folder in directories]
for directory in directories:
    if os.path.isdir(directory):
        keep_n_dir(directory, n)
ATTENTION! This code deletes the other files from the directory; it keeps only n of them.
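If the folders may also contain non-image files, a variant worth considering (a sketch; the extension set and the optional seed parameter are assumptions) only samples among image files and leaves everything else alone:

```python
import os
import random

# Assumed set of image extensions; adjust to your data
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".bmp"}

def keep_n_images(directory, n, seed=None):
    # Only consider files whose extension marks them as images
    images = [f for f in os.listdir(directory)
              if os.path.splitext(f)[1].lower() in IMAGE_EXTS]
    if len(images) <= n:
        return []
    rng = random.Random(seed)  # Seedable for reproducible runs
    to_delete = rng.sample(images, k=len(images) - n)
    for name in to_delete:
        os.remove(os.path.join(directory, name))
    return to_delete
```

Returning the deleted names makes it easy to log what was removed.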

looping through many folders inside other folders in python

I'm new to Python. I have to open a folder that contains several folders, and inside each of those folders there is one folder containing the files I want to read. So I need to open the main folder, loop through the folders inside it, open the folder in each of them, and then open the files. I don't really know how to start; can anyone please help me get started?
I'm trying the following but got really stuck:
path_to_json = r'main folder'
for file_name in os.listdir(path_to_json):
    if file_name.endswith(".json"):
        print(file_name)
    else:
        current_path = "".join((path_to_json, "/", file_name))
        if os.path.isdir(current_path):
            scan_folder(current_path)
os.walk is a good way to do this:
import os

for root_dir, _, file_names in os.walk(starting_directory):
    for file_name in file_names:
        if file_name.endswith('.json'):
            print(f'{root_dir}/{file_name}')
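The same traversal can be written with pathlib's rglob, which recurses like os.walk (a sketch; `find_json_files` is a made-up helper wrapping it):

```python
from pathlib import Path

def find_json_files(starting_directory):
    # rglob("*.json") descends into every subdirectory, like os.walk
    return sorted(Path(starting_directory).rglob("*.json"))
```

Each result is a Path object carrying the full path, so no manual joining is needed.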

python3 - filter os.walk subdirectories and retrieve file name+paths

I need help getting a list of file names and locations within specific sub-directories. My directory is structured as follows:
C:\folder
-->\2014
----->\14-0023
-------->\(folders with files inside)
----->\CLOSED
-------->\14-0055!
----------->\(folders with files inside)
-->\2015
----->\15-0025
-------->\(folders with files inside)
----->\CLOSED
-------->\15-0017!
----------->\(folders with files inside)
I would like to get a list of files and their paths ONLY if they are within CLOSED.
I have tried writing multiple scripts and searching questions on SO, but have not been able to come up with something that retrieves the list I want. While there seem to be questions related to my trouble, such as Filtering os.walk() dirs and files, they don't quite have the same requirements as mine, and I've thus far failed to adapt code I've found on SO for my purpose.
For example, here's some sample code from another SO thread I found that I tried to adapt for my purpose.
l = []
include_prefixes = ['CLOSED']
for dir, dirs, files in os.walk(path2, topdown=True):
    dirs[:] = [d for d in dirs if d in include_prefixes]
    for file in files:
        l.append(os.path.join(dir, file))
^the above got me an empty list...
After a few more failures, I thought to just get a list of the correct folder paths and make another script to iterate within THOSE folders.
l = []
regex = re.compile(r'\d{2}-\d{4}', re.IGNORECASE)
for root, subFolders, files in os.walk(path2):
    try:
        subFolders.remove(regex)
    except ValueError:
        pass
    for subFolder in subFolders:
        l.append(os.path.join(root, subFolder))
^Yet I still failed and just got all the file paths in the directory. No matter what I do, I can't seem to force os.walk to (a) remove specific subdirs from its list of subdirs and then (b) loop through those subdirs to get the file names and paths I need.
What should I fix in my example code? Or is there entirely different code that I should consider?
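One way that avoids pruning altogether (a sketch; `files_under_closed` is a made-up helper, and it assumes the folder is named exactly CLOSED) is to walk everything and keep a file only when CLOSED appears among its path components:

```python
import os

def files_under_closed(root):
    # Collect files whose path contains a CLOSED directory component at any depth
    results = []
    for dirpath, dirnames, filenames in os.walk(root):
        parts = os.path.normpath(dirpath).split(os.sep)
        if "CLOSED" in parts:
            for name in filenames:
                results.append(os.path.join(dirpath, name))
    return results
```

The first attempt in the question failed because pruning `dirs` to only CLOSED at the top level removes 2014 and 2015, so the walk never descends far enough to reach a CLOSED folder.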

Python folder recursion

I'm getting lost... I'll post what I tried, but it didn't work.
So I have to get through 3 levels of folders.
Let's say there is the main folder (label it Main for this) with 100 subfolders of the main folder (labeled 1-100), but I need to get inside those subfolders (labeled A, B, C, D, E, etc. for what I need; I won't go deeper than D) and read the .txt files inside the subfolders A, B, C, D.
So
Main ---> 1-100 ---> A, B, C, D for each of 1-100 ---> read the .txt files
import os
import glob

os.chdir("/Main")
for folders in glob.glob("20*"):
    print(folders)

I tried adding this extra code to get into the subfolders:

for folder in folders:
    glob.glob("tracking_data*")
    print(folder)
Which didn't give me what I needed. The first snippet works fine; the second was supposed to give me a list of tracking data folders.
I know what I need to do, and am probably overcomplicating it with my lack of previous programming. Thank you
I'm not sure I understand what it is you wish to do, but assuming you want to iterate over two levels of nested folders and search for .txt files, this code should do it:
if __name__ == '__main__':
    import os

    rootdir = '/Main'
    txt_files = []
    _, dirs_1_100, _ = next(os.walk(rootdir))       # top level only: 1-100
    for dir_1_100 in dirs_1_100:
        path_1_100 = os.path.join(rootdir, dir_1_100)
        _, dirs_A_E, _ = next(os.walk(path_1_100))  # second level only: A-E
        for dir_A_E in dirs_A_E:
            path_A_E = os.path.join(path_1_100, dir_A_E)
            _, _, files = next(os.walk(path_A_E))
            for requested_file in files:
                if not requested_file.endswith('.txt'):
                    continue
                txt_files.append(os.path.join(path_A_E, requested_file))
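Since os.walk already recurses on its own, the three nested walks above can be collapsed into one (a sketch; `collect_txt_files` is a made-up helper, and `rootdir` stands for the Main folder):

```python
import os

def collect_txt_files(rootdir):
    # One walk visits Main, every 1-100 folder, and every A-D folder beneath it
    txt_files = []
    for dirpath, _, filenames in os.walk(rootdir):
        for name in filenames:
            if name.endswith('.txt'):
                txt_files.append(os.path.join(dirpath, name))
    return txt_files
```

This also keeps working if the nesting depth ever changes.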
