Issue while uploading a file with ftputil - Python

I am struggling to upload a file to my FTP server. Please advise what is wrong with the code below:
host: someserver.com
path: ./my_folder/at_this_server
target: 'test.pdf'
import ftputil

with ftputil.FTPHost(ftp_settings['host'],
                     ftp_settings['user'],
                     ftp_settings['password'],
                     ftp_settings['port']) as ftp_host:
    safe_chdir(ftp_host, ftp_settings['path'])  # change FTP dir
    ftp_host.upload_if_newer('local_test.pdf', 'test.pdf')
The upload_if_newer() (or upload()) call executes successfully, but I don't see any uploaded file in the FTP folder.
UPDATE
I found that the file is uploaded to host + "/my_folder" only, instead of host + "/my_folder/at_this_server".

1) Check the result of ftp_host.upload_if_newer('local_test.pdf', 'test.pdf'). If it is True, the file was copied.
2) Are you sure the safe_chdir function is correct? You can check that the current directory on the FTP server actually changed using ftp_host.getcwd(). Try uploading the file using a full remote path instead of changing the FTP directory, as shown in the sketch after this list.
3) Check access rights.
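A minimal sketch of point 2, assuming the same ftp_settings dict and folder layout as in the question (the exact remote path here is only illustrative):
import ftputil

with ftputil.FTPHost(ftp_settings['host'],
                     ftp_settings['user'],
                     ftp_settings['password']) as ftp_host:
    print(ftp_host.getcwd())  # verify which remote directory you are actually in
    # upload with a full remote path instead of relying on a chdir
    remote_path = 'my_folder/at_this_server/test.pdf'
    copied = ftp_host.upload_if_newer('local_test.pdf', remote_path)
    print(copied)  # True means the file was actually transferred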


Flask - sent zipfile contains absolute paths

My Flask app has a function that compresses log files in a directory into a zip file and then sends the file to the user to download. The compression works, except that when the client receives the zipfile, the zip contains a series of folders matching the absolute paths of the original files that were zipped on the server. However, the zipfile that was created in the server's static folder does not.
Zipfile contents in the static folder: "log1.bin, log2.bin"
Zipfile contents that were sent to the user: "/home/user/server/data/log1.bin, /home/user/server/data/log2.bin"
I don't understand why using send_file seems to make this change to the zip file contents and fills the received zip file with subfolders. The actual contents of the received zip file do in fact match the contents of the sent zip file in terms of data, but the user has to click through several directories to get to the files. What am I doing wrong?
import os
import zipfile
from flask import send_file

@app.route("/download")
def download():
    os.chdir(data_dir)
    if os.path.isfile("logs.zip"):
        os.remove("logs.zip")
    log_dir = os.listdir('.')
    log_zip = zipfile.ZipFile('logs.zip', 'w')
    for log in log_dir:
        log_zip.write(log)
    log_zip.close()
    return send_file("logs.zip", as_attachment=True)
Using send_from_directory(directory, "logs.zip", as_attachment=True) fixed everything. It appears this call is better suited for serving static files.
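A minimal sketch of the fixed route, assuming the same data_dir and zip-building code as in the question:
from flask import send_from_directory

@app.route("/download")
def download():
    # ... build logs.zip inside data_dir exactly as above ...
    return send_from_directory(data_dir, "logs.zip", as_attachment=True)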

How to find location of downloaded files on Heroku

I am using youtube-dl with Python and Flask to download YouTube videos and return them using the send_file function.
When running locally, this is what I have been using to get the file path:
import getpass
import os

username = getpass.getuser()
directories = os.listdir(rf'C:\Users\{username}')
I then download the video with YouTube-dl:
youtube_dl.YoutubeDL().download([link])
I then search the directory for the file based on the video code:
files = [file for file in directories]
code = link.split('v=')[1]
for file in files:
    if file.endswith('.mp4'):
        try:
            code_section = file.split('-')[1].split('.mp4')[0]
            if code in code_section:
                return send_file(rf'C:\Users\{username}\{file}')
        except IndexError:  # filename without a '-' separator
            continue
Finally, I return the file:
return send_file(rf'C:\Users\{username}\{file}')
to find the location of the downloaded file. But on Heroku this doesn't work: the directory simply doesn't exist. How would I find where the file is downloaded? Is there a function I can call? Or is there a set path it would go to?
Or alternatively, is there a way to set the download location with YouTube-dl?
Since Heroku runs Linux and not Windows, you could download your files to your current working directory and then just send them from there.
The main tweak is setting up some options for your YoutubeDL instance:
import os

opts = {
    "outtmpl": f"{os.getcwd()}/%(title)s.%(ext)s"
}
youtube_dl.YoutubeDL(opts).download([link])
That will download the file into your current working directory.
Then you can just send it from there with return send_file(file).
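Putting the pieces together, a minimal sketch, assuming a Flask app and a link that contains a v= query parameter (the fetch_video name and the -%(id)s part of the template are illustrative choices, not part of the original code):
import os
import youtube_dl
from flask import send_file

def fetch_video(link):
    # embed the video id in the filename so the file can be found afterwards
    opts = {"outtmpl": f"{os.getcwd()}/%(title)s-%(id)s.%(ext)s"}
    youtube_dl.YoutubeDL(opts).download([link])
    video_id = link.split('v=')[1]
    for name in os.listdir('.'):
        if name.endswith('.mp4') and video_id in name:
            return send_file(os.path.abspath(name), as_attachment=True)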

Unable to download files from S3 because a junk string is appended to the key name

I am trying to download all the files and folders from a specific folder in an S3 bucket into a specific local folder. When I try to download, I get the exception:
No such file or directory
I tried different code snippets to perform the same task but could not solve the issue. Here is the code snippet:
def download(self):
    s3_bucket = self.resource.Bucket(self.bucket)
    for s3_object in s3_bucket.objects.filter(Prefix=self.folder):
        local_path = os.path.join(self.local_path, s3_object.key)
        if not os.path.exists(os.path.dirname(local_path)):
            os.makedirs(local_path)
        key = str(s3_object.key)
        if not key.endswith('/'):
            logging.info("Downloading {}".format(key))
            s3_bucket.download_file(s3_object.key, key)
    return json.dumps({"msg": "data is downloaded"})
But it gives me the error:
IOError: [Errno 2] No such file or directory:
'temp/abc/metrics_data.csv.CDBdfD3f'
The actual key of the file is temp/abc/metrics_data.csv, but I don't know where the ".CDBdfD3f" suffix comes from. Please guide.
I know it's a bit late, but I've had the same issue: the directory you are writing your file to does not exist. The random suffix is not part of your key; boto3 downloads to a temporary file named after the target plus a random suffix and renames it once the transfer finishes, and creating that temporary file fails when the target directory is missing. In my case it was a case-sensitivity problem in the path; once I put in the correct path, everything worked as expected.
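A minimal sketch of the corrected loop, assuming the same self.resource, self.bucket, self.folder and self.local_path attributes as in the question; it creates the parent directory of the local path and downloads to local_path rather than to the raw key:
import os

def download(self):
    s3_bucket = self.resource.Bucket(self.bucket)
    for s3_object in s3_bucket.objects.filter(Prefix=self.folder):
        key = s3_object.key
        if key.endswith('/'):  # skip "directory" placeholder objects
            continue
        local_path = os.path.join(self.local_path, key)
        # create the parent directory, not the file path itself
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        s3_bucket.download_file(key, local_path)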

How can I read a file from an Egnyte folder through Python?

I also want to move this file to another folder and write an output file to yet another folder. All folders are on Egnyte, and I am using Python.
import egnyte

client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "User_Name": "Password"})
folder = client.folder("/Shared/Data/Individuals/Input")
The client is used to log in to Egnyte.
folder holds the location of the file; now, how do I read the file in this folder? The file name is "abc.txt".
How can I read this file and then move it to "/Shared/Data/Individuals/Checked"?
And after the data is processed, how do I save the output file to the other location, "/Shared/Data/Individuals/output"?
It's an old post, but here is how you would download/read a file from a folder.
client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "access_token": "OAuth token"})
file = client.file("/Shared/Data/Individuals/Input/abc.txt")
file_resource = file.download()
file_resource.read()
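To cover the move step as well, here is a minimal sketch. It assumes the egnyte SDK's file resources expose a move method taking the destination path; that call is an assumption, so verify it against the SDK version you have installed:
import egnyte

client = egnyte.EgnyteClient({"domain": "apidemo.egnyte.com",
                              "access_token": "OAuth token"})

source = client.file("/Shared/Data/Individuals/Input/abc.txt")
data = source.download().read()  # read the file contents into memory

# move() is an assumption here; check the SDK documentation
source.move("/Shared/Data/Individuals/Checked/abc.txt")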
Egnyte also has a Desktop Connect (2.0) client, which I downloaded onto my system. After installation it mounts an Egnyte drive on my PC, and now I can access Egnyte files from my code just like any other local files.

Is it possible to search in files on FTP in Python?

Right now this is all I have:
import ftputil

a_host = ftputil.FTPHost("ftp_host", "username", "pass")  # log in to the FTP server
for (dirname, subdirs, files) in a_host.walk("/"):  # walk the directory tree
    for f in files:
        fullpath = a_host.path.join(dirname, f)
        if fullpath.endswith('html'):
            pass  # stuck here
So I can log in to my FTP server and walk through my files.
What I am not able to manage is, when the walk finds an HTML file, to also search it for the string I want.
For example:
on my FTP server there are an index.html and a something.txt file.
I want .walk to find the index.html file, and then search index.html for 'my string'.
Thanks
FTP is a protocol for file transfer only. By itself it has no ability to execute remote commands, which would be needed to search the files on the remote server (there is a SITE command, but it usually cannot be used for such a purpose because it is either not implemented or restricted to only a few subcommands).
This means your only option with FTP is to download the file and search it locally, i.e. transfer the file to the local system, open it there, and look for the string.
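A minimal sketch of that approach, assuming the same credentials as in the question and HTML files small enough to read into memory; ftputil can open remote files directly, so no temporary local copy is needed:
import ftputil

with ftputil.FTPHost("ftp_host", "username", "pass") as a_host:
    for dirname, subdirs, files in a_host.walk("/"):
        for f in files:
            fullpath = a_host.path.join(dirname, f)
            if fullpath.endswith('.html'):
                # transfers the file contents and searches them locally
                with a_host.open(fullpath, 'r') as remote_file:
                    if 'my string' in remote_file.read():
                        print('found in', fullpath)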
