How to get svn directory status using pysvn - python

I'm trying to get the svn status (unversioned, normal, ...) of a particular directory using pysvn's Client.status(dirname) function.
But as the docs say, pysvn returns an array with the status of the files within the directory, not of the directory itself.
Is there another way to obtain this information?
Have a nice day.
Ouille

Sorry, found it.
Client.status(dirname) returns an array whose last element is the directory itself.
So myclient.status(dirname)[-1].text_status returns the directory status.
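Wrapped up as a small helper, that looks roughly like this (a sketch: `directory_status` is my own name, and it relies on the last-element behaviour described above):

```python
def directory_status(client, dirname):
    """Return the svn text status of the directory itself.

    pysvn's Client.status(dirname) reports every item under dirname;
    the directory's own entry is the last element of that list.
    """
    entries = client.status(dirname)
    return entries[-1].text_status

# Usage (requires pysvn):
#   import pysvn
#   print(directory_status(pysvn.Client(), "/path/to/working/copy"))
```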
Have a nice day.
Ouille.


How can I read & write "comments" metadata from .mp4 files with python?

I'm trying to add comments to my video files based on the date they were created (e.g. the age of my kids in the video) if they have no comments yet. I want to read the comments section in the file description of each file to make sure it's empty, then add a comment based on the file's creation time. Simple enough to do manually in Windows Explorer (right-click -> Properties -> Details -> Description section -> Comments).
I know how to get some metadata from most files with stat() like the creation date but I have not managed to get to the comments sections of .mp4 files.
from pathlib import Path

testDir = r"C:\temp\test"
current_dir = Path(testDir)
for current_file in current_dir.iterdir():
    info = current_file.stat()
    print(info.st_mtime)
    print(info.comments)  # This just throws an "'os.stat_result' object has no attribute 'comments'" error
Thanks to @StarGeek for pointing me in the right direction: the ExifTool he suggested has a Python wrapper called PyExifTool that allows controlling ExifTool from Python.
I'm sharing my solution here in case anybody else is interested:
import exiftool

vidFile = r"C:\temp\test\2019-09-02 19.52.14.mp4"
with exiftool.ExifTool() as et:
    vidComment = et.get_tag("comment", vidFile)
    if vidComment is None or vidComment == "":
        newComment = '-comment="written by Pyexiftool"'
        et.execute(bytes(newComment, 'utf-8'), bytes(vidFile, 'utf-8'))
ExifTool needs to be downloaded, the executable renamed to just exiftool.exe, and its location referenced in PATH.
PyExiftool needs to be present and imported.
One caveat: the first time a comment is written it will not show up in Windows Explorer (I don't know about Mac/Linux) even though it is present in the metadata. I don't know why that is. However, after manually setting the comment to anything, it can be changed by ExifTool and will be visible in Windows Explorer.
Good enough for me at the moment: I can select and change the comments of all files in a folder with one manual operation, then let Python change the comments to something useful.

GitPython: retrieve changed files between commits in specific sub directory

The repo structure looks like this:
- folder_a
- folder_b
- folder_c
- ...
I am particularly interested in the files that changed in a specific commit, but only the ones in folder_a. My solution is
for filename, details in commit.stats.files.items():
    if not filename.startswith('folder_a'):
        continue
    # ...
but it seems the performance is not quite good if there are a great number of files in the other folders. Is there any better way to skip the files I don't care about?
If I understand correctly: you want stats on the modifications from a commit, restricted to one specific subfolder.
Using plain git:
git show [commit] --stat folder_a
will display exactly what you want.
Have a look at what git.show('<commit identifier>', '--stat', 'folder_a') returns in your Python script.
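To get the same pathspec filtering without parsing --stat output, here is a stdlib sketch using subprocess (the function name is my own, and I've swapped in git's plumbing command diff-tree, which lists changed file names directly); git itself filters the entries, so Python never iterates over the other folders:

```python
import subprocess

def changed_files_in_subdir(repo_path, commit, subdir):
    """List the files changed by `commit` under `subdir` only.

    `git diff-tree --name-only -r <commit> -- <subdir>` makes git
    apply the pathspec, so only matching paths ever reach Python.
    """
    out = subprocess.run(
        ["git", "-C", repo_path, "diff-tree",
         "--no-commit-id", "--name-only", "-r", commit, "--", subdir],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line for line in out.splitlines() if line]
```

With GitPython you could similarly call repo.git.diff_tree(...) and split the result.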

Get changed files using gitpython

I want to get a list of changed files of the current git-repo. The files, that are normally listed under Changes not staged for commit: when calling git status.
So far I have managed to connect to the repository, pull it and show all untracked files:
from git import Repo

repo = Repo(pk_repo_path)
o = repo.remotes.origin
o.pull()
print(repo.untracked_files)
But now I want to show all files that have changes (not committed). Can anybody push me in the right direction? I looked at the names of the methods of repo and experimented for a while, but I can't find the correct solution.
Obviously I could call repo.git.status and parse the files, but that isn't elegant at all. There must be something better.
Edit: Now that I think about it, a function that tells me the status of a single file would be more useful. Like:
print(repo.get_status(path_to_file))
>>untracked
print(repo.get_status(path_to_another_file))
>>not staged
for item in repo.index.diff(None):
    print(item.a_path)
or to get just the list:
changedFiles = [item.a_path for item in repo.index.diff(None)]
repo.index.diff() returns git.diff.Diffable described in http://gitpython.readthedocs.io/en/stable/reference.html#module-git.diff
So function can look like this:
def get_status(repo, path):
    changed = [item.a_path for item in repo.index.diff(None)]
    if path in repo.untracked_files:
        return 'untracked'
    elif path in changed:
        return 'modified'
    else:
        return "don't care"
Just to follow up on @ciasto piekarz's question: depending on what you want to show:
repo.index.diff(None)
lists only files that have not been staged
repo.index.diff('HEAD')
lists only files that have been staged
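Combining those two diffs with untracked_files gives something close to the get_status the question asked for (a sketch: the name file_status and the returned labels are my own, and matching is done on a_path as in the answer above):

```python
def file_status(repo, path):
    """Classify one path in a GitPython repo.

    index.diff(None) covers unstaged changes, index.diff('HEAD')
    covers staged ones, and untracked_files covers new files.
    """
    if path in repo.untracked_files:
        return 'untracked'
    if any(d.a_path == path for d in repo.index.diff(None)):
        return 'not staged'
    if any(d.a_path == path for d in repo.index.diff('HEAD')):
        return 'staged'
    return 'unmodified'

# Usage (requires GitPython):
#   from git import Repo
#   repo = Repo('/path/to/repo')
#   print(file_status(repo, 'some/file.txt'))
```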

Turn subversion path into walkable directory

I have a subversion repo, i.e. "http://crsvn/trunk/foo" ... I want to walk this directory, or for starters simply get a directory listing.
The idea is to create a script that will do mergeinfo on all the branches in "http://crsvn/branches/bar" and compare them to trunk to see if each branch has been merged.
So the first problem I have is that I cannot walk or do
os.listdir('http://crsvn/branches/bar')
I get "the value label syntax is incorrect" (mentioning the URL)
You can use PySVN. In particular, the pysvn.Client.list method should do what you want:
import pysvn

svncl = pysvn.Client()
entries = svncl.list("http://rabbitvcs.googlecode.com/svn/trunk/")
# Gives you a generator of directory paths:
dirs = (entry[0].repos_path for entry in entries if entry[0].kind == pysvn.node_kind.dir)
list(dirs)
No checkout needed. You could even specify a revision to work on, to ensure your script can ignore other people working on the repository while it runs.
listdir takes a path and not a URL. It would be nice if Python could be aware of the structure on a remote server, but I don't think that is the case.
If you were to check out your repository locally first, you could easily walk the directories using Python's functions.

Determine if a listing is a directory or file in Python over FTP

Python has a standard library module ftplib to run FTP communications. It has two means of getting a listing of directory contents. One, FTP.nlst(), will return a list of the contents of a directory given a directory name as an argument. (It will return the name of a file if given a file name instead.) This is a robust way to list the contents of a directory but does not give any indication whether each item in the list is a file or directory. The other method is FTP.dir(), which gives a string formatted listing of the directory contents of the directory given as an argument (or of the file attributes, given a file name).
According to a previous question on Stack Overflow, parsing the results of dir() can be fragile (different servers may return different strings). I'm looking for some way to list just the directories contained within another directory over FTP, though. To the best of my knowledge, scraping for a d in the permissions part of the string is the only solution I've come up with, but I guess I can't guarantee that the permissions will appear in the same place between different servers. Is there a more robust solution to identifying directories over FTP?
Unfortunately FTP doesn't have a command to list just folders, so parsing the results you get from ftp.dir() would be "best".
A simple app assuming a standard result from ls (not a Windows FTP server):
from ftplib import FTP

ftp = FTP(host, user, passwd)
lines = []
ftp.dir(lines.append)  # dir() prints to stdout by default; collect the lines instead
for r in lines:
    if r.upper().startswith('D'):
        print(r[58:])  # starting point of the name; the offset may vary between servers
Standard FTP Commands
Custom FTP Commands
If the FTP server supports the MLSD command, then please check that answer for a couple of useful classes (FTPDirectory and FTPTree).
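With ftplib that looks roughly like this (Python 3.3+ exposes MLSD directly as FTP.mlsd(); list_subdirs is my own helper name, and it assumes the server sends the standard "type" fact):

```python
from ftplib import FTP

def list_subdirs(ftp, path=""):
    """Return the subdirectory names in `path` via MLSD (RFC 3659).

    FTP.mlsd() yields (name, facts) pairs; a directory carries
    facts['type'] == 'dir', so no listing-string parsing is needed.
    """
    return sorted(name for name, facts in ftp.mlsd(path)
                  if facts.get("type") == "dir")

# Usage:
#   ftp = FTP(host, user, passwd)
#   print(list_subdirs(ftp, "/pub"))
```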
Another way is to assume everything is a directory and try to change into it. If this succeeds it is a directory, but if it throws an ftplib.error_perm it is probably a file. You can then catch the exception. Sure, this isn't really the safest, but neither is parsing the crazy string for leading 'd's.
Example
import ftplib

def processRecursive(ftp, directory):
    ftp.cwd(directory)
    # put whatever you want to do in each directory here
    # when you have called processRecursive with a file,
    # the cwd above will fail and you will return via the caller's except
    # get the files and directories contained in the current directory
    filenames = []
    ftp.retrlines('NLST', filenames.append)
    for name in filenames:
        try:
            processRecursive(ftp, name)
        except ftplib.error_perm:
            # put whatever you want to do with files here
            pass
    # put whatever you want to do after processing the files
    # and sub-directories of a directory here
    ftp.cwd('..')  # go back up so the relative names keep working
