ftplib.error_perm: 550 Requested action not taken - python

Got this error while running my script:
ftplib.error_perm: 550 Requested action not taken.
from ftplib import FTP

ftp = FTP()
HOST = 'some host here'
PORT = 'some port here'
ftp.connect(host=HOST, port=PORT)
ftp.login(user="some_user", passwd="pass")
out = 'ftp_files/'
filenames = ftp.nlst()
for file in filenames:
    file = file[::-1]
    file = file.split(' ')[0]
    file = file[::-1]  # file name is ready
    with open(out + file, 'wb') as f:
        ftp.retrbinary('RETR ' + file, f.write)
ftp.close()
I've changed the password, username, host and port in this example; they are correct in my real script.
Does anybody know what the problem could be?

It fails only on one file, which is empty. I still don't know whether that was the problem, but I will look for another solution in my project; the current FTP server isn't stable.
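Since the 550 comes from just one file, a defensive variant of the loop (a sketch only, not a guaranteed fix) is to catch ftplib.error_perm per file and keep going; it reuses ftp, out and filenames from the snippet above:

import ftplib

for file in filenames:
    name = file.rsplit(' ', 1)[-1]  # same "last token" extraction as the reverse/split trick above
    try:
        with open(out + name, 'wb') as f:
            ftp.retrbinary('RETR ' + name, f.write)
    except ftplib.error_perm as e:
        print('skipping', name, '-', e)  # e.g. the empty file that triggers the 550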

Related

Python paramiko library - Put files from local server to remote server (CSV file) [duplicate]

Aim: I am trying to use SFTP through Paramiko in Python to upload files to a server PC.
What I've done: To test that functionality, I am using my localhost (127.0.0.1) IP. To achieve that, I created the following code with the help of Stack Overflow suggestions.
Problem: The moment I run this code and enter the file name, I get "IOError : Failure", despite handling that error:
import paramiko as pk
import os

userName = "sk"
ip = "127.0.0.1"
pwd = "1234"
client = ""
try:
    client = pk.SSHClient()
    client.set_missing_host_key_policy(pk.AutoAddPolicy())
    client.connect(hostname=ip, port=22, username=userName, password=pwd)
    print '\nConnection Successful!'
# This exception takes care of authentication errors and exceptions
except pk.AuthenticationException:
    print 'ERROR : Authentication failed because of irrelevant details!'
# This exception will take care of the rest of the errors and exceptions
except:
    print 'ERROR : Could not connect to %s.' % ip
local_path = '/home/sk'
remote_path = '/home/%s/Desktop' % userName
# File upload
file_name = raw_input('Enter the name of the file to upload :')
local_path = os.path.join(local_path, file_name)
ftp_client = client.open_sftp()
try:
    ftp_client.chdir(remote_path)  # Test if remote path exists
except IOError:
    ftp_client.mkdir(remote_path)  # Create remote path
    ftp_client.chdir(remote_path)
ftp_client.put(local_path, '.')  # At this point, you are in remote_path in either case
ftp_client.close()
client.close()
Can you point out where the problem is and how to resolve it?
Thanks in advance!
The second argument of SFTPClient.put (remotepath) is a path to a file, not a folder.
So use file_name instead of '.':
ftp_client.put(local_path, file_name)
... assuming you are already in remote_path, as you call .chdir earlier.
To avoid a need for .chdir, you can use an absolute path:
ftp_client.put(local_path, remote_path + '/' + file_name)
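Putting the fix together, a minimal end-to-end sketch (the file name file.txt is hypothetical; the host, credentials and paths are taken from the question):

import os
import paramiko as pk

client = pk.SSHClient()
client.set_missing_host_key_policy(pk.AutoAddPolicy())
client.connect(hostname='127.0.0.1', port=22, username='sk', password='1234')

sftp = client.open_sftp()
remote_path = '/home/sk/Desktop'
local_path = os.path.join('/home/sk', 'file.txt')
# remotepath is a full file path, so no chdir is needed
sftp.put(local_path, remote_path + '/file.txt')
sftp.close()
client.close()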

Download large files using pysftp

I have a file >500 MB to download over an SFTP connection. I tried using pysftp and I am getting the error SSHException: Server connection dropped:
import pysftp
import sys

myHostname = "dbfiles.xyz.org"
myUsername = "XXXX"
myPassword = "YYYY"
cnopts = pysftp.CnOpts()
cnopts.hostkeys = None
with pysftp.Connection(host=myHostname, username=myUsername, password=myPassword, cnopts=cnopts) as sftp:
    print("Connection successfully established ... ")
    localFilePath = 'c:/....'
    remoteFilePath = sftp.listdir('/folder/')
    for filename in remoteFilePath:
        if 'string_to_match' in filename:
            local_path = localFilePath + filename
            print(filename)
            print(local_path)
            sftp.get("folder/" + filename, local_path)
I get SSHException: Server connection dropped: EOF after about 18 MB of the file has been downloaded. Is there any way I can put a limit on the amount of data downloaded, or delay the get process, in order to get the full file? I tried several approaches, but because of the large file size I am unable to download the complete file. Any help appreciated.
Go to sftp_file.py in the Paramiko library and change MAX_REQUEST_SIZE to 1024. It worked for me. You can find the file here:
/home//.local/lib/python3.8/site-packages/paramiko/sftp_file.py
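If you would rather not edit the installed package, the same change can be applied at runtime by overriding the class attribute before downloading; a sketch (the monkey-patch approach is my suggestion, not part of the original answer):

import paramiko.sftp_file

# SFTPFile.MAX_REQUEST_SIZE controls the chunk size paramiko requests per read;
# shrinking it (1024 is the value suggested above) can help with flaky servers
paramiko.sftp_file.SFTPFile.MAX_REQUEST_SIZE = 1024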

to check whether a file exists in remote server with a loop

I am working on an application where I have to upload a file to the remote server.
I then wait for the file to process and sftp the processed file back to the original server.
I have managed to copy the file to the remote server using the paramiko module.
How do I achieve the following?
Create a criterion for checking, in a loop, whether the result file has been generated, and
Only proceed with the SFTP transfer once the file has been created in the remote folder.
Here is the code I have tried so far:
s = open("sea" + str(UID), 'w')
s.write(outtext)
s.close()
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("XXXX", username="XXXX", password="XXXXX")
sftp = ssh.open_sftp()
currentfile=pwd + "/sea" + str(UID)
print currentfile
destinationfile="/srv/sftp/smc-sftp-DEFAULT/inbox/sea" + str(UID)
sftp.put(currentfile,destinationfile)
outputfile="/srv/sftp/smc-sftp-DEFAULT/outbox/"
finalfile="/sea" + str(UID) + ".res"
while True:
try:
print(sftp.stat(outputfile+finalfile))
print('file exists')
sftp.get(outputfile+finalfile,pwd + "/sea" + str(UID) + ".res")
break
except IOError:
print('copying file')
continue
sftp.close()
ssh.close()
The issue is now solved.
It was a mistake in defining outputfile, which had an extra "/".
I changed it to the value below.
outputfile="/srv/sftp/smc-sftp-DEFAULT/outbox"

Ftp in python 3.6

I want to transfer some data from Python to an FTP server which doesn't require a login... From C++ I can transfer data without authentication, so the FTP server is working well. I want to do the same thing in Python...
import ftplib

ftp = ftplib.FTP('192.168.1.21')
filename = "test.html"
myfile = open('/Users/Univers/Desktop/test.html', 'rb')
ftp.storlines('STOR ' + filename, myfile)
ftp.quit()
Something like that, but it returns:
ftplib.error_perm: 530 User cannot log in.
Thanks
I think you need to:
from ftplib import FTP
ftp = FTP('127.0.0.1') # connect to host, default port
ftp.login()  # user anonymous, passwd anonymous@ by default
ftp.retrlines('LIST') # list directory contents
ftp.quit()
I don't see ftp.login() in your script.
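Applied to your upload snippet, a minimal sketch (assuming the server really does accept anonymous logins and uploads):

import ftplib

ftp = ftplib.FTP('192.168.1.21')
ftp.login()  # anonymous login, no credentials
with open('/Users/Univers/Desktop/test.html', 'rb') as myfile:
    ftp.storbinary('STOR test.html', myfile)  # storbinary suits files opened in 'rb' mode
ftp.quit()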

FTP server breaks connection before download is complete

Sometimes, the FTP server closes the connection before the file is completely downloaded.
Here is my code:
import ftplib
import fnmatch
import os

ftp = ftplib.FTP(site)
ftp.login(user, pw)
ftp.cwd(dir)
remotefiles = ftp.nlst()
for file in remotefiles:
    if fnmatch.fnmatch(file, match_text):
        if os.path.exists(file):
            if True: print file, 'already fetched'
        else:
            if True: print 'Downloading', file
            local = open(file, 'wb')
            try:
                ftp.retrbinary('RETR ' + file, local.write)
            finally:
                local.close()
            if True: print 'Download done.'
You can specify a timeout parameter in the FTP constructor and set it to 0 or to a very large value like sys.maxint.
class ftplib.FTP([host[, user[, passwd[, acct[, timeout]]]]])
Additionally, you can turn on debugging to see what's going on behind the scenes.
ftp = ftplib.FTP(site, user, pw, timeout=0)
ftp.set_debuglevel(2)
Hope this helps.
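If the server keeps dropping mid-transfer regardless, another option (my addition, not part of the original answer) is to resume an interrupted download with the rest argument of retrbinary, provided the server supports the REST command:

import os

def resume_download(ftp, remote_name, local_name):
    # Restart the transfer from however many bytes we already have on disk.
    offset = os.path.getsize(local_name) if os.path.exists(local_name) else 0
    with open(local_name, 'ab') as local:
        ftp.retrbinary('RETR ' + remote_name, local.write, rest=offset)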
