How can I read a file from an Egnyte folder through Python?

I also want to move this file to another folder and write an output file to a third folder. All of the folders are on Egnyte and I am using Python.
client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "User_Name": "Password"})
folder = client.folder("/Shared/Data/Individuals/Input")
The client is used to log in to Egnyte.
The folder object holds the path to the file; the file name is "abc.txt".
How can I read this file and then move it to "/Shared/Data/Individuals/Checked"?
And after the data is processed, how can I save the output file to the other location, "/Shared/Data/Individuals/output"?

It's an old post but here is how you would download/read files from a folder.
client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "access_token": "OAuth token"})
file = client.file("/Shared/Data/Individuals/Input/abc.txt")
file_resource = file.download()
file_resource.read()
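To cover the move and output parts of the question, here is a rough sketch built on the snippet above. The file-level move() and upload() calls are assumptions about the python-egnyte SDK, so check its documentation for the exact method names and argument forms.
import io
import egnyte

client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "access_token": "OAuth token"})

# Read the input file (as in the snippet above)
source = client.file("/Shared/Data/Individuals/Input/abc.txt")
data = source.download().read()

# Move the original into the Checked folder (method name assumed; see the SDK docs)
source.move("/Shared/Data/Individuals/Checked/abc.txt")

# Process the data and upload the result to the output folder (upload() assumed to take a file-like object)
result = data.upper()  # placeholder for the real processing step
client.file("/Shared/Data/Individuals/output/abc_out.txt").upload(io.BytesIO(result))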

Egnyte has a Desktop Connect (2.0) client. I downloaded and installed it on my system, and it mounts an Egnyte drive on my PC, so now I can access Egnyte files from my code just like any other local files.
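For example, with the mapped drive (the drive letter below is a placeholder; use whatever mount point your Egnyte install shows):
# Read the file straight from the mapped Egnyte drive as if it were local
egnyte_root = r"Z:\Shared\Data\Individuals"   # placeholder drive letter / mount point
with open(egnyte_root + r"\Input\abc.txt") as f:
    data = f.read()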

Related

Flask - sent zipfile contains absolute paths

My Flask app has a function that compresses log files in a directory into a zip file and then sends that file to the user to download. The compression works, except that when the client receives the zip file, it contains a series of folders matching the absolute paths of the original files that were zipped on the server. The zip file that was created in the server's static folder, however, does not.
Zip file contents in the static folder: "log1.bin, log2.bin"
Zip file contents received by the user: "/home/user/server/data/log1.bin, /home/user/server/data/log2.bin"
I don't understand why using "send_file" seems to make this change to the zip file contents and fills the received zip file with subfolders. The actual data in the received zip file does match the sent zip file, but the user has to click through several directories to get to the files. What am I doing wrong?
@app.route("/download")
def download():
    os.chdir(data_dir)
    if os.path.isfile("logs.zip"):
        os.remove("logs.zip")
    log_dir = os.listdir('.')
    log_zip = zipfile.ZipFile('logs.zip', 'w')
    for log in log_dir:
        log_zip.write(log)
    log_zip.close()
    return send_file("logs.zip", as_attachment=True)
Using send_from_directory(directory, "logs.zip", as_attachment=True) fixed everything. It appears this call is better for serving up static files.
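For reference, a sketch of the route with that fix applied; app and data_dir come from the question's context, and passing arcname to ZipFile.write keeps the archive entries relative without needing a per-request os.chdir:
import os
import zipfile
from flask import send_from_directory

@app.route("/download")
def download():
    zip_path = os.path.join(data_dir, "logs.zip")
    if os.path.isfile(zip_path):
        os.remove(zip_path)
    with zipfile.ZipFile(zip_path, "w") as log_zip:
        for log in os.listdir(data_dir):
            if log != "logs.zip":
                # arcname stores the entry under its bare name, so no absolute paths land in the archive
                log_zip.write(os.path.join(data_dir, log), arcname=log)
    return send_from_directory(data_dir, "logs.zip", as_attachment=True)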

How to find the location of downloaded files on Heroku

I am using youtube-dl with Python and Flask to download YouTube videos and return them using the send_file function.
When running locally, I have been using the following to get the file path:
username = getpass.getuser()
directories = os.listdir(rf'C:\\Users\\{username}')
I then download the video with YouTube-dl:
youtube_dl.YoutubeDL().download([link])
I then search the directory for the file based on the video code:
files = [file for file in directories]
code = link.split('v=')[1]
for file in files:
    if file.endswith('.mp4') is True:
        try:
            code_section = file.split('-')[1].split('.mp4')[0]
            if code in code_section:
                return send_file(rf'C:\\Users\\{username}\\{file}')
        except:
            continue
Finally, I return the file:
return send_file(rf'C:\\Users\\{username}\\{file}')
This works locally to find the downloaded file, but on Heroku it doesn't - the directory simply doesn't exist. How would I find where the file is downloaded? Is there a function I can call? Or is there a set path it would go to?
Or alternatively, is there a way to set the download location with YouTube-dl?
Since Heroku runs Linux and not Windows, you could download your files to the current working directory and then send them from there.
The main tweak would be setting up some options for your YoutubeDL object:
import os
import youtube_dl

opts = {
    "outtmpl": f"{os.getcwd()}/%(title)s.%(ext)s"
}
youtube_dl.YoutubeDL(opts).download([link])
That will download the file to your current working directory.
Then you can just send it from your working directory using return send_file(file).
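Putting it together, here is a sketch of a Flask route under that approach; app is the question's Flask app, the "/fetch" route and the ?link= query parameter are placeholders, and extract_info/prepare_filename are the standard youtube-dl calls for resolving the downloaded file's path, so no directory scan is needed:
import os
import youtube_dl
from flask import request, send_file

@app.route("/fetch")
def fetch():
    link = request.args["link"]
    opts = {"outtmpl": os.path.join(os.getcwd(), "%(title)s.%(ext)s")}
    with youtube_dl.YoutubeDL(opts) as ydl:
        # Download the video and ask youtube-dl where it wrote the file
        info = ydl.extract_info(link, download=True)
        file_path = ydl.prepare_filename(info)
    return send_file(file_path, as_attachment=True)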

Upload one file to multiple folders on Google Drive via the API with Python

I want to upload a file to multiple folders on Google Drive via the API, but only one file should be stored, not a separate copy for each folder (one file appearing in several folders).
For an example of doing this by hand, see: Add the Same File to Multiple Folders in Google Drive without Copying
Could you please help me? Thank you!
When inserting a file into a folder, you need to specify the correct folder ID in the parents property of the file. Using Python:
folder_id = '0BwwA4oUTeiV1TGRPeTVjaWRDY1E'
file_metadata = {
    'name': 'photo.jpg',
    'parents': [folder_id]
}
media = MediaFileUpload('files/photo.jpg',
                        mimetype='image/jpeg',
                        resumable=True)
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id').execute()
print('File ID: %s' % file.get('id'))
As further mentioned in Files: insert, setting the parents[] property in the request body will put the file in all of the provided folders. If no folders are provided in the parents[] field, the file will be placed in the default root folder.
Hope that helps!
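To make that last point concrete, here is a sketch that lists every target folder ID in parents so the single uploaded file appears in all of them. The folder IDs are placeholders, and drive_service is assumed to be your own authorized Drive v3 service object.
from googleapiclient.http import MediaFileUpload

folder_ids = ['FOLDER_ID_1', 'FOLDER_ID_2']   # placeholder folder IDs
file_metadata = {
    'name': 'photo.jpg',
    'parents': folder_ids  # one file, listed in all of these folders
}
media = MediaFileUpload('files/photo.jpg', mimetype='image/jpeg', resumable=True)
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id, parents').execute()
print('File ID: %s, parents: %s' % (file.get('id'), file.get('parents')))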

Is it possible to search within files on an FTP server in Python?

Right now, this is all I have:
import ftputil
a_host = ftputil.FTPHost("ftp_host", "username", "pass")  # log in to the FTP server
for dirname, subdirs, files in a_host.walk("/"):  # walk the directory tree
    for f in files:
        fullpath = a_host.path.join(dirname, f)
        if fullpath.endswith('html'):
            pass  # stuck here: how do I search inside this file?
So I can log in to my FTP server and .walk through my files.
The thing I am not able to manage is that when .walk finds an HTML file, I also want to search it for a string.
For example: on my FTP server there are an index.html and a something.txt file.
I want .walk to find the index.html file and then search index.html for 'my string'.
Thanks
FTP is a protocol for file transfer only. It cannot by itself execute remote commands, which would be needed to search the files on the remote server (there is a SITE command, but it usually cannot be used for this purpose because it is either not implemented or restricted to a few subcommands).
This means your only option with FTP is to download the file and search it locally, i.e. transfer the file to the local system, open it there and look for the string.
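A minimal sketch of that approach, using ftputil's host.open() to read each remote HTML file so it can be searched locally ('my string' is the text from the question, and the connection details are the question's placeholders):
import ftputil

with ftputil.FTPHost("ftp_host", "username", "pass") as a_host:
    for dirname, subdirs, files in a_host.walk("/"):
        for f in files:
            fullpath = a_host.path.join(dirname, f)
            if fullpath.endswith('html'):
                # Transfer the file contents and search them on the local side
                with a_host.open(fullpath, 'r') as remote_file:
                    if 'my string' in remote_file.read():
                        print('found in', fullpath)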

Issue while uploading a file with ftputil

I am struggling to upload a file to my FTP server. Please advise what is wrong in the code below:
host: someserver.com
path: ./my_folder/at_this_server
target: 'test.pdf'
with ftputil.FTPHost(ftp_settings['host'],
                     ftp_settings['user'],
                     ftp_settings['password'],
                     ftp_settings['port']) as ftp_host:
    safe_chdir(ftp_host, ftp_settings['path'])  # change FTP dir
    ftp_host.upload_if_newer('local_test.pdf', 'test.pdf')
The upload_if_newer() (or upload()) call executes successfully, but I don't see any uploaded file in the FTP folder.
UPDATE
I found that the file is uploaded to host + "/my_folder" only, instead of host + "/my_folder/at_this_server".
1) Check the result of ftp_host.upload_if_newer('local_test.pdf', 'test.pdf'). If it is True, then the file was copied.
2) Are you sure the safe_chdir function is correct? You can check that the current directory on the FTP server changed using ftp_host.getcwd(). Try uploading the file using a full target path instead of changing the FTP directory (see the sketch below).
3) Check access rights.
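For point 2, a minimal sketch that skips the chdir and passes a full remote target path to upload_if_newer; ftp_settings is the dict from the question, the default FTP port is assumed, and the remote directory names come from the question's path:
import posixpath
import ftputil

with ftputil.FTPHost(ftp_settings['host'],
                     ftp_settings['user'],
                     ftp_settings['password']) as ftp_host:
    remote_dir = 'my_folder/at_this_server'
    if not ftp_host.path.isdir(remote_dir):
        ftp_host.makedirs(remote_dir)            # create the target directory if it is missing
    target = posixpath.join(remote_dir, 'test.pdf')
    copied = ftp_host.upload_if_newer('local_test.pdf', target)
    print('copied:', copied, '->', ftp_host.path.abspath(target))
Printing the result and the absolute remote path makes it obvious whether the upload happened and exactly where the file ended up.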
