I've tried using os.path.abspath(file) as well as Path.absolute(file) to get the paths of .png files I'm working on, which live on a separate drive from the project folder that contains the code. The result from the following script is "Project Folder for the code/filename.png", whereas what I need is the path to the folder that the .png is actually in:
for root, dirs, files in os.walk(newpath):
    for file in files:
        if not file.startswith("."):
            if file.endswith(".png"):
                number, scansize, letter = file.split("-")
                filepath = os.path.abspath(file)
                # replace weird backslash effects
                correctedpath = filepath.replace(os.sep, "/")
                newentry = [number, file, correctedpath]
                textures.append(newentry)
I've read other answers on here that seem to suggest that the project file for the code can't be in the same directory as the folder that is being worked on. But that isn't the case here. Can someone kindly point out what I'm not getting? I need the absolute path because the purpose of the program will be to write the paths for the files into text files.
You could use pathlib.Path.rglob here to recursively get all the pngs:
As a list comprehension:
from pathlib import Path
search_dir = "/path/to/search/dir"
# This creates a list of tuples with `number` and the resolved path
paths = [(p.name.split("-")[0], p.resolve()) for p in Path(search_dir).rglob("*.png")]
Alternatively, you can process them in a loop:
paths = []
for p in Path(search_dir).rglob("*.png"):
    number, scansize, letter = p.name.split("-")
    # more processing ...
    paths.append([number, p.resolve()])
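Since the stated goal is to write the file paths into text files, a minimal sketch of that final step might look like this (the output filename textures.txt is just an assumed example, and paths is the list built by either snippet above):

with open("textures.txt", "w") as out:
    for number, full_path in paths:
        # one "number <TAB> absolute path" line per texture
        out.write(f"{number}\t{full_path}\n")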
I just recently wrote something like what you're looking for.
This code relies on the assumption that your files are at the end of the path; it's not suitable for finding a directory. There's also no need for a nested loop.
DIR = "your/full/path/to/direcetory/containing/desired/files"
def get_file_path(name, template):
"""
#:param template: file's template (txt,html...)
#return: The path to the given file.
#rtype: str
"""
substring = f'{name}.{template}'
for path in os.listdir(DIR):
full_path = os.path.join(DIR, path)
if full_path.endswith(substring):
return full_path
The result from
for root, dirs, files in os.walk(newpath):
is that files contains just the filenames, without any directory path. When Python is given a bare filename it resolves it against the current working directory (your project folder), which is why os.path.abspath(file) points there. The directory that actually contains each file is root (which is newpath at the top level and the relevant subdirectory otherwise), so use os.path.join to add the directory path to the found filenames:
filepath = os.path.join(root, file)
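A minimal sketch of the original loop with that fix applied (assuming newpath and textures are defined as in the question):

import os

for root, dirs, files in os.walk(newpath):
    for file in files:
        if not file.startswith(".") and file.endswith(".png"):
            number, scansize, letter = file.split("-")
            # join the walk root with the filename before taking the absolute path
            filepath = os.path.abspath(os.path.join(root, file))
            correctedpath = filepath.replace(os.sep, "/")
            textures.append([number, file, correctedpath])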
In case you want to find the png files in subdirectories the easiest way is to use glob:
import glob

newpath = r'D:\Images'
file_paths = glob.glob(newpath + "/**/*.png", recursive=True)

for file_path in file_paths:
    print(file_path)
I am writing a Python script that takes user input in the form of a date, e.g. 20180829, which will be a subdirectory name. It then uses the os.walk function to walk through a specific directory, and once it reaches the directory that was passed in, it looks at all the directories within it and recreates that directory structure in a different location.
My directory structure will look something like this:
|dir1
|-----|dir2|
|-----------|dir3
|-----------|20180829
|-----------|20180828
|-----------|20180827
|-----------|20180826
So dir3 will have a number of subfolders which will all be named in the date format. I need to be able to copy the directory structure of just the directory that is passed in at the start, e.g. 20180829, and skip the rest of the directories.
I have been looking online for a way to do this, but all I can find are ways of excluding directories from the os.walk function, like in the thread below:
Filtering os.walk() dirs and files
I also found a thread that allows me to print out the directory paths that I want, but will not let me create the directories I want:
Python 3.5 OS.Walk for selected folders and include their subfolders.
The following is the code I have. It prints out the correct directory structure, but it creates the entire directory structure in the new location, which I don't want it to do.
includes = '20180828'
inputpath = Desktop
outputpath = Documents

for startFilePath, dirnames, filenames in os.walk(inputpath, topdown=True):
    endFilePath = os.path.join(outputpath, startFilePath)
    if not os.path.isdir(endFilePath):
        os.mkdir(endFilePath)
    for filename in filenames:
        if (includes in startFilePath):
            print(includes, "+++", startFilePath)
            break
I am not sure if I understand what you need, but I think you are overcomplicating a few things. If the code below doesn't help you, let me know and we will think about other approaches.
I ran this to create an example like yours:
# setup example project structure
import os
import sys

PLATFORM = 'windows' if sys.platform.startswith('win') else 'linux'
DESKTOP_DIR = \
    os.path.join(os.path.join(os.path.expanduser('~')), 'Desktop') \
    if PLATFORM == 'linux' \
    else os.path.join(os.path.join(os.environ['USERPROFILE']), 'Desktop')

example_dirs = ['20180829', '20180828', '20180827', '20180826']
for _dir in example_dirs:
    path = os.path.join(DESKTOP_DIR, 'dir_from', 'dir_1', 'dir_2', 'dir_3', _dir)
    os.makedirs(path, exist_ok=True)
And here's what you need.
# do what you want to do
dir_from = os.path.join(DESKTOP_DIR, 'dir_from')
dir_to = os.path.join(DESKTOP_DIR, 'dir_to')
target = '20180828'

for root, dirs, files in os.walk(dir_from, topdown=True):
    for _dir in dirs:
        if _dir == target:
            path = os.path.join(root, _dir).replace(dir_from, dir_to)
            os.makedirs(path, exist_ok=True)
            continue
In Python, I want to list all the files in the current directory ONLY. I do not want files listed from any subdirectory or parent.
There do seem to be similar solutions out there, but they don't seem to work for me. Here's my code snippet:
import os
for subdir, dirs, files in os.walk('./'):
    for file in files:
        # do some stuff
        print file
Let's suppose I have 2 files, holygrail.py and Tim inside my current directory. I have a folder as well and it contains two files - let's call them Arthur and Lancelot - inside it. When I run the script, this is what I get:
holygrail.py
Tim
Arthur
Lancelot
I am happy with holygrail.py and Tim. But the two files, Arthur and Lancelot, I do not want listed.
Just use os.listdir and os.path.isfile instead of os.walk.
Example:
import os
files = [f for f in os.listdir('.') if os.path.isfile(f)]
for f in files:
    # do something
But be careful when applying this to another directory, like
files = [f for f in os.listdir(somedir) if os.path.isfile(f)]
which would not work, because f is not a full path but is relative to the current directory.
Therefore, for filtering on another directory, do os.path.isfile(os.path.join(somedir, f))
(Thanks Causality for the hint)
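For example, a small sketch of the corrected version for another directory (somedir here is just a placeholder path):

import os

somedir = "/some/other/dir"  # placeholder
files = [f for f in os.listdir(somedir)
         if os.path.isfile(os.path.join(somedir, f))]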
You can use os.listdir for this purpose. If you only want files and not directories, you can filter the results using os.path.isfile.
example:
files = os.listdir(os.curdir) #files and directories
or
files = filter(os.path.isfile, os.listdir( os.curdir ) ) # files only
files = [ f for f in os.listdir( os.curdir ) if os.path.isfile(f) ] #list comprehension version.
import os
destdir = '/var/tmp/testdir'
files = [ f for f in os.listdir(destdir) if os.path.isfile(os.path.join(destdir,f)) ]
You can use os.scandir(), a new function in the stdlib starting from Python 3.5.
import os
for entry in os.scandir('.'):
    if entry.is_file():
        print(entry.name)
It's faster than os.listdir(); in fact, os.walk() is implemented using os.scandir().
You can use the pathlib module.
from pathlib import Path
x = Path('./')
print(list(filter(lambda y:y.is_file(), x.iterdir())))
This can be done with os.walk().
Tested on Python 3.5.2:
import os
for root, dirs, files in os.walk('.', topdown=True):
    dirs.clear()  # with topdown true, this will prevent walk from going into subs
    for file in files:
        # do some stuff
        print(file)
Remove the dirs.clear() line and the files in subfolders are included again.
Update with references:
os.walk is documented here; it talks about the triple of lists being created and the effects of topdown.
.clear() is documented here for emptying a list.
So by clearing the relevant list from os.walk you can affect its result to suit your needs.
import os
for subdir, dirs, files in os.walk('./'):
    for file in files:
        # do some stuff
        print file
You can improve this code with del dirs[:], which will look like the following:
import os
for subdir, dirs, files in os.walk('./'):
    del dirs[:]
    for file in files:
        # do some stuff
        print file
Or even better, point os.walk at the current working directory:
import os
cwd = os.getcwd()
for subdir, dirs, files in os.walk(cwd, topdown=True):
    del dirs[:]  # remove the sub directories.
    for file in files:
        # do some stuff
        print file
Instead of os.walk, just use os.listdir.
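For instance, a quick sketch of that idea for files in the current directory only:

import os

files_only = [f for f in os.listdir('.') if os.path.isfile(f)]
print(files_only)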
To list files in a specific folder excluding files in its sub-folders with os.walk use:
_, _, file_list = next(os.walk(data_folder))
Following up on Pygirl's and Flimm's use of pathlib (a really helpful reference, btw): their solution included the full path in the result, so here is a solution that outputs just the file names:
from pathlib import Path
p = Path(destination_dir) # destination_dir = './' in original post
files = [x.name for x in p.iterdir() if x.is_file()]
print(files)
Is there a way to return a list of all the subdirectories in the current directory in Python?
I know you can do this with files, but I need to get the list of directories instead.
Do you mean immediate subdirectories, or every directory right down the tree?
Either way, you could use os.walk to do this:
os.walk(directory)
will yield a tuple for each subdirectory. The first entry in the 3-tuple is a directory name, so
[x[0] for x in os.walk(directory)]
should give you all of the subdirectories, recursively.
Note that the second entry in the tuple is the list of child directories of the entry in the first position, so you could use this instead, but it's not likely to save you much.
However, you could use it just to give you the immediate child directories:
next(os.walk('.'))[1]
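If you also want the full paths of those immediate children, you can join the names back onto the root yourself (a small sketch):

import os

root, dirnames, _ = next(os.walk('.'))
immediate_subdir_paths = [os.path.join(root, d) for d in dirnames]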
Or see the other solutions already posted, using os.listdir and os.path.isdir, including those at "How to get all of the immediate subdirectories in Python".
You could just use glob.glob
from glob import glob
glob("/path/to/directory/*/", recursive = True)
Don't forget the trailing / after the *.
Much nicer than the above, because you don't need several os.path.join() calls and you get the full path directly (if you wish). You can do this in Python 3.5 and above:
subfolders = [ f.path for f in os.scandir(folder) if f.is_dir() ]
This will give the complete path to the subdirectory.
If you only want the name of the subdirectory use f.name instead of f.path
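For example, to collect just the names (same assumptions about folder as above):

subfolder_names = [f.name for f in os.scandir(folder) if f.is_dir()]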
https://docs.python.org/3/library/os.html#os.scandir
Slightly OT: in case you need all subfolders recursively and/or all files recursively, have a look at this function, which is faster than os.walk & glob and will return a list of all subfolders as well as all files inside those (sub-)subfolders: https://stackoverflow.com/a/59803793/2441026
In case you want only all subfolders recursively:
def fast_scandir(dirname):
    subfolders = [f.path for f in os.scandir(dirname) if f.is_dir()]
    for dirname in list(subfolders):
        subfolders.extend(fast_scandir(dirname))
    return subfolders
Returns a list of all subfolders with their full paths. This again is faster than os.walk and a lot faster than glob.
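Example usage (the directory path is only a placeholder):

all_subfolders = fast_scandir(r"C:\some\folder")  # placeholder path
print(len(all_subfolders), "subfolders found")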
An analysis of all functions
tl;dr:
- If you want to get all immediate subdirectories for a folder use os.scandir.
- If you want to get all subdirectories, even nested ones, use os.walk or - slightly faster - the fast_scandir function above.
- Never use os.walk for only top-level subdirectories, as it can be hundreds(!) of times slower than os.scandir.
If you run the code below, make sure to run it once first so that your OS will have accessed the folder, discard those results, and then run the test; otherwise the results will be skewed.
You might want to mix up the function calls, but I tested it, and it did not really matter.
All examples will give the full path to the folder. The pathlib example returns it as a (Windows)Path object.
The first element of os.walk will be the base folder. So you will not get only subdirectories. You can use fu.pop(0) to remove it.
None of the results will use natural sorting. This means results will be sorted like this: 1, 10, 2. To get natural sorting (1, 2, 10), please have a look at https://stackoverflow.com/a/48030307/2441026
Results:
os.scandir took 1 ms. Found dirs: 439
os.walk took 463 ms. Found dirs: 441 -> it found the nested one + base folder.
glob.glob took 20 ms. Found dirs: 439
pathlib.iterdir took 18 ms. Found dirs: 439
os.listdir took 18 ms. Found dirs: 439
Tested with W7x64, Python 3.8.1.
# -*- coding: utf-8 -*-
# Python 3
import time
import os
from glob import glob
from pathlib import Path

directory = r"<insert_folder>"
RUNS = 1

def run_os_walk():
    a = time.time_ns()
    for i in range(RUNS):
        fu = [x[0] for x in os.walk(directory)]
    print(f"os.walk\t\t\ttook {(time.time_ns() - a) / 1000 / 1000 / RUNS:.0f} ms. Found dirs: {len(fu)}")

def run_glob():
    a = time.time_ns()
    for i in range(RUNS):
        fu = glob(directory + "/*/")
    print(f"glob.glob\t\ttook {(time.time_ns() - a) / 1000 / 1000 / RUNS:.0f} ms. Found dirs: {len(fu)}")

def run_pathlib_iterdir():
    a = time.time_ns()
    for i in range(RUNS):
        dirname = Path(directory)
        fu = [f for f in dirname.iterdir() if f.is_dir()]
    print(f"pathlib.iterdir\ttook {(time.time_ns() - a) / 1000 / 1000 / RUNS:.0f} ms. Found dirs: {len(fu)}")

def run_os_listdir():
    a = time.time_ns()
    for i in range(RUNS):
        dirname = Path(directory)
        fu = [os.path.join(directory, o) for o in os.listdir(directory) if os.path.isdir(os.path.join(directory, o))]
    print(f"os.listdir\t\ttook {(time.time_ns() - a) / 1000 / 1000 / RUNS:.0f} ms. Found dirs: {len(fu)}")

def run_os_scandir():
    a = time.time_ns()
    for i in range(RUNS):
        fu = [f.path for f in os.scandir(directory) if f.is_dir()]
    print(f"os.scandir\t\ttook {(time.time_ns() - a) / 1000 / 1000 / RUNS:.0f} ms.\tFound dirs: {len(fu)}")

if __name__ == '__main__':
    run_os_scandir()
    run_os_walk()
    run_glob()
    run_pathlib_iterdir()
    run_os_listdir()
import os
d = '.'
[os.path.join(d, o) for o in os.listdir(d)
 if os.path.isdir(os.path.join(d, o))]
Python 3.4 introduced the pathlib module into the standard library, which provides an object oriented approach to handle filesystem paths:
from pathlib import Path
p = Path('./')
# All subdirectories in the current directory, not recursive.
[f for f in p.iterdir() if f.is_dir()]
To recursively list all subdirectories, path globbing can be used with the ** pattern.
# This will also include the current directory '.'
list(p.glob('**'))
Note that a single * as the glob pattern would include both files and directories non-recursively. To get only directories, a trailing / can be appended but this only works when using the glob library directly, not when using glob via pathlib:
import glob
# These three lines return both files and directories
list(p.glob('*'))
list(p.glob('*/'))
glob.glob('*')
# Whereas this returns only directories
glob.glob('*/')
So Path('./').glob('**') matches the same paths as glob.glob('**/', recursive=True).
Pathlib is also available on Python 2.7 via the pathlib2 module on PyPI.
If you need a recursive solution that will find all the subdirectories in the subdirectories, use walk as proposed before.
If you only need the current directory's child directories, combine os.listdir with os.path.isdir
Listing Out only directories
print("\nWe are listing out only the directories in current directory -")
directories_in_curdir = list(filter(os.path.isdir, os.listdir(os.curdir)))
print(directories_in_curdir)
Listing Out only files in current directory
files = list(filter(os.path.isfile, os.listdir(os.curdir)))
print("\nThe following are the list of all files in the current directory -")
print(files)
I prefer using filter (https://docs.python.org/2/library/functions.html#filter), but this is just a matter of taste.
d='.'
filter(lambda x: os.path.isdir(os.path.join(d, x)), os.listdir(d))
Implemented this using python-os-walk. (http://www.pythonforbeginners.com/code-snippets-source-code/python-os-walk/)
import os
print("root prints out directories only from what you specified")
print("dirs prints out sub-directories from root")
print("files prints out all files from root and directories")
print("*" * 20)
for root, dirs, files in os.walk("/var/log"):
print(root)
print(dirs)
print(files)
You can get the list of subdirectories (and files) in Python 2.7 using os.listdir(path)
import os
os.listdir(path) # list of subdirectories and files
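If only the subdirectories are wanted, the same call can be filtered (a sketch, with path as above):

import os

subdirs = [name for name in os.listdir(path)
           if os.path.isdir(os.path.join(path, name))]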
Since I stumbled upon this problem using Python 3.4 and Windows UNC paths, here's a variant for this environment:
from pathlib import WindowsPath
def SubDirPath(d):
    return [f for f in d.iterdir() if f.is_dir()]

subdirs = SubDirPath(WindowsPath(r'\\file01.acme.local\home$'))
print(subdirs)
Pathlib is new in Python 3.4 and makes working with paths under different OSes much easier:
https://docs.python.org/3.4/library/pathlib.html
Although this question was answered a long time ago, I want to recommend using the pathlib module, since it is a robust way to work on both Windows and Unix OSes.
So to get all paths in a specific directory including subdirectories:
from pathlib import Path
paths = list(Path('myhomefolder', 'folder').glob('**/*.txt'))
# all sorts of operations
file = paths[0]
file.name
file.stem
file.parent
file.suffix
etc.
Copy-paste friendly in IPython:
import os
d='.'
folders = list(filter(lambda x: os.path.isdir(os.path.join(d, x)), os.listdir(d)))
Output from print(folders):
['folderA', 'folderB']
Thanks for the tips, guys. I ran into an issue with softlinks (infinite recursion) being returned as dirs. Softlinks? We don't want no stinkin' soft links! So...
This rendered just the dirs, not softlinks:
>>> import os
>>> inf = os.walk('.')
>>> [x[0] for x in inf]
['.', './iamadir']
Here are a couple of simple functions based on Blair Conrad's example -
import os
def get_subdirs(dir):
    """Get a list of immediate subdirectories"""
    return next(os.walk(dir))[1]

def get_subfiles(dir):
    """Get a list of immediate subfiles"""
    return next(os.walk(dir))[2]
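Example usage (using the current directory as a placeholder):

print(get_subdirs("."))   # immediate subdirectory names
print(get_subfiles("."))  # immediate file names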
This is how I do it.
import os
for x in os.listdir(os.getcwd()):
    if os.path.isdir(x):
        print(x)
Building upon Eli Bendersky's solution, use the following example:
import os
test_directory = <your_directory>
for child in os.listdir(test_directory):
    test_path = os.path.join(test_directory, child)
    if os.path.isdir(test_path):
        print test_path
        # Do stuff to the directory "test_path"
where <your_directory> is the path to the directory you want to traverse.
With full path and accounting for path being ., .., \\, ..\\..\\subfolder, etc:
import os, pprint
pprint.pprint([os.path.join(os.path.abspath(path), x[0])
               for x in os.walk(os.path.abspath(path))])
The easiest way:
from pathlib import Path
from glob import glob
current_dir = Path.cwd()
all_sub_dir_paths = glob(str(current_dir) + '/*/') # returns list of sub directory paths
all_sub_dir_names = [Path(sub_dir).name for sub_dir in all_sub_dir_paths]
This answer didn't seem to exist already.
directories = [ x for x in os.listdir('.') if os.path.isdir(x) ]
I had a similar question recently, and I found out that the best answer for Python 3.6 (as user havlock added) is to use os.scandir. Since there seems to be no solution using it, I'll add my own. First, a non-recursive solution that lists only the subdirectories directly under the root directory.
def get_dirlist(rootdir):
    dirlist = []
    with os.scandir(rootdir) as rit:
        for entry in rit:
            if not entry.name.startswith('.') and entry.is_dir():
                dirlist.append(entry.path)
    dirlist.sort()  # Optional, in case you want sorted directory names
    return dirlist
The recursive version would look like this:
def get_dirlist(rootdir):
    dirlist = []
    with os.scandir(rootdir) as rit:
        for entry in rit:
            if not entry.name.startswith('.') and entry.is_dir():
                dirlist.append(entry.path)
                dirlist += get_dirlist(entry.path)
    dirlist.sort()  # Optional, in case you want sorted directory names
    return dirlist
Keep in mind that entry.path yields the full path to the subdirectory (absolute if rootdir is given as an absolute path). In case you only need the folder name, you can use entry.name instead. Refer to os.DirEntry for additional details about the entry object.
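For instance, a variant of the non-recursive version that collects only names could look like this (a sketch; get_dirnames is a hypothetical helper, not part of the original answer):

import os

def get_dirnames(rootdir):
    # Same idea as get_dirlist above, but stores entry.name instead of entry.path.
    with os.scandir(rootdir) as rit:
        names = [entry.name for entry in rit
                 if not entry.name.startswith('.') and entry.is_dir()]
    names.sort()  # Optional
    return names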
Using os.walk:
sub_folders = []
for dir, sub_dirs, files in os.walk(test_folder):
    sub_folders.extend(sub_dirs)
This will list all subdirectories right down the file tree.
import pathlib
def list_dir(dir):
    path = pathlib.Path(dir)
    dir = []
    try:
        for item in path.iterdir():
            if item.is_dir():
                dir.append(item)
                dir = dir + list_dir(item)
        return dir
    except FileNotFoundError:
        print('Invalid directory')
pathlib is new in version 3.4
Function to return a List of all subdirectories within a given file path. Will search through the entire file tree.
import os
def get_sub_directory_paths(start_directory, sub_directories):
    """
    This method iterates through all subdirectory paths of a given
    directory to collect all directory paths.

    :param start_directory: The starting directory path.
    :param sub_directories: A List that all subdirectory paths will be
        stored to.
    :return: A List of all sub-directory paths.
    """
    for item in os.listdir(start_directory):
        full_path = os.path.join(start_directory, item)
        if os.path.isdir(full_path):
            sub_directories.append(full_path)
            # Recursive call to search through all subdirectories.
            get_sub_directory_paths(full_path, sub_directories)

    return sub_directories
Use os.path.isdir as a filter function over os.listdir(). Something like this:
filter(os.path.isdir, [os.path.join(os.path.abspath('PATH'), p) for p in os.listdir('PATH/')])
This function, given a parent directory, iterates over all its directories recursively and prints all the filenames it finds inside. Quite useful.
import os
def printDirectoryFiles(directory):
    for filename in os.listdir(directory):
        full_path = os.path.join(directory, filename)
        if not os.path.isdir(full_path):
            print(full_path + "\n")

def checkFolders(directory):
    dir_list = next(os.walk(directory))[1]
    # print(dir_list)
    for dir in dir_list:
        print(dir)
        checkFolders(directory + "/" + dir)
    printDirectoryFiles(directory)

main_dir = "C:/Users/S0082448/Desktop/carpeta1"
checkFolders(main_dir)
input("Press enter to exit ;")
We can get a list of all the folders by using os.walk():
import os
path = os.getcwd()
pathObject = os.walk(path)
This pathObject is a generator object, and we can get a list from it with
arr = [x for x in pathObject]
arr is of the form [('current directory', [folders in current directory], [files in current directory]), ('subdirectory', [folders in subdirectory], [files in subdirectory]), ...]
We can get a list of all the subdirectories by iterating through arr and printing the middle element:
for i in arr:
    for j in i[1]:
        print(j)
This will print all the subdirectories.
To get all the files:
for i in arr:
    for j in i[2]:
        print(i[0] + "/" + j)
By joining multiple solutions from here, this is what I ended up using:
import os
import glob
def list_dirs(path):
    return [os.path.basename(x) for x in filter(
        os.path.isdir, glob.glob(os.path.join(path, '*')))]
There are a lot of nice answers out there, but if you came here looking for a simple way to get a list of all files or folders at once, you can take advantage of the find command that the OS offers on Linux and Mac, which is much faster than os.walk:
import os
all_files_list = os.popen("find path/to/my_base_folder -type f").read().splitlines()
all_sub_directories_list = os.popen("find path/to/my_base_folder -type d").read().splitlines()
OR
import os
def get_files(path):
    all_files_list = os.popen(f"find {path} -type f").read().splitlines()
    return all_files_list

def get_sub_folders(path):
    all_sub_directories_list = os.popen(f"find {path} -type d").read().splitlines()
    return all_sub_directories_list
This should work, as it also prints a directory tree:
import os
import pathlib
def tree(directory):
    print(f'+ {directory}')
    print("There are " + str(len(os.listdir(os.getcwd()))) +
          " folders in this directory;")
    for path in sorted(directory.glob('*')):
        depth = len(path.relative_to(directory).parts)
        spacer = '    ' * depth
        print(f'{spacer}+ {path.name}')
This should list all the directories in a folder using the pathlib library. path.relative_to(directory).parts gets the path elements relative to the given directory.
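Example call (assuming the current working directory is the folder to list):

import pathlib

tree(pathlib.Path.cwd())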