FTP in Python 3.6

I want to transfer some data from Python to an FTP server which doesn't require a login. From C++ I can transfer data without authentication, so the FTP server is working fine. I want to do the same thing in Python:
import ftplib

ftp = ftplib.FTP('192.168.1.21')
filename = "test.html"
myfile = open('/Users/Univers/Desktop/test.html', 'rb')
ftp.storlines('STOR ' + filename, myfile)
ftp.quit()
Something like that, but it returns:
ftplib.error_perm: 530 User cannot log in.
Thanks

I think you need to:
from ftplib import FTP
ftp = FTP('127.0.0.1') # connect to host, default port
ftp.login() # user anonymous, passwd anonymous@ by default
ftp.retrlines('LIST') # list directory contents
ftp.quit()
I don't see ftp.login() in your script.
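As a minimal sketch, the upload from the question with an explicit (anonymous) login added might look like this, assuming the server accepts anonymous connections:
import ftplib

ftp = ftplib.FTP('192.168.1.21')
ftp.login()  # anonymous login; pass user/passwd here if the server expects them
filename = "test.html"
with open('/Users/Univers/Desktop/test.html', 'rb') as myfile:
    ftp.storlines('STOR ' + filename, myfile)
ftp.quit()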


Python paramiko library - Put files from local server to remote server (CSV file) [duplicate]

Aim: I am trying to use SFTP through Paramiko in Python to upload files to a server PC.
What I've done: To test that functionality, I am using my localhost (127.0.0.1) IP. To achieve that, I created the following code with the help of Stack Overflow suggestions.
Problem: The moment I run this code and enter the file name, I get "IOError: Failure", despite handling that error.
import paramiko as pk
import os

userName = "sk"
ip = "127.0.0.1"
pwd = "1234"
client = ""

try:
    client = pk.SSHClient()
    client.set_missing_host_key_policy(pk.AutoAddPolicy())
    client.connect(hostname=ip, port=22, username=userName, password=pwd)
    print '\nConnection Successful!'
# This exception takes care of authentication errors and exceptions
except pk.AuthenticationException:
    print 'ERROR : Authentication failed because of irrelevant details!'
# This exception takes care of the rest of the errors and exceptions
except:
    print 'ERROR : Could not connect to %s.' % ip

local_path = '/home/sk'
remote_path = '/home/%s/Desktop' % userName

# File upload
file_name = raw_input('Enter the name of the file to upload :')
local_path = os.path.join(local_path, file_name)
ftp_client = client.open_sftp()
try:
    ftp_client.chdir(remote_path)   # Test if remote path exists
except IOError:
    ftp_client.mkdir(remote_path)   # Create remote path
    ftp_client.chdir(remote_path)
ftp_client.put(local_path, '.')     # At this point, you are in remote_path in either case
ftp_client.close()
client.close()
Can you point out where the problem is and how to resolve it?
Thanks in advance!
The second argument of SFTPClient.put (remotepath) is a path to a file, not a folder.
So use file_name instead of '.':
ftp_client.put(local_path, file_name)
... assuming you are already in remote_path, as you call .chdir earlier.
To avoid a need for .chdir, you can use an absolute path:
ftp_client.put(local_path, remote_path + '/' + file_name)
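Putting it together, the upload part of the script might look like this sketch (using the absolute-path form; client, file_name, local_path and remote_path are the variables from the question):
# Sketch of the corrected upload section
ftp_client = client.open_sftp()
try:
    ftp_client.chdir(remote_path)    # test whether the remote path exists
except IOError:
    ftp_client.mkdir(remote_path)    # create it if it does not
ftp_client.put(local_path, remote_path + '/' + file_name)  # remote path must be a file path, not '.'
ftp_client.close()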

ftplib.error_perm: 550 Requested action not taken

Got this error while running my script:
ftplib.error_perm: 550 Requested action not taken.
from ftplib import FTP

ftp = FTP()
HOST = 'some host here'
PORT = 'some port here'
ftp.connect(host=HOST, port=PORT)
ftp.login(user="some_user", passwd="pass")

out = 'ftp_files/'
filenames = ftp.nlst()

for file in filenames:
    file = file[::-1]
    file = file.split(' ')[0]
    file = file[::-1]  # file name is ready
    with open(out + file, 'wb') as f:
        ftp.retrbinary('RETR ' + file, f.write)

ftp.close()
I've changed the password, username, host and port in this example; they are correct in the real script.
Does anybody know what the problem could be?
It fails only on one file, which is empty. I still don't know if this was the problem, but I will look for another solution in my project. The current FTP server isn't stable.
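As a hypothetical workaround (the empty file is only a suspicion, not a confirmed cause), the error can be caught per file so the loop keeps going:
import ftplib

# ftp, filenames and out are set up as in the script above
for file in filenames:
    file = file.split(' ')[-1]  # same result as the reverse/split/reverse trick: keep the last token
    try:
        with open(out + file, 'wb') as f:
            ftp.retrbinary('RETR ' + file, f.write)
    except ftplib.error_perm as e:
        print('Skipping %s: %s' % (file, e))  # e.g. a file the server rejects with 550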

Python ftplib upload text file with commands to server: error 502 command not recognized [duplicate]

I would like to make a script to upload a file to FTP.
How would the login system work? I'm looking for something like this:
ftp.login=(mylogin)
ftp.pass=(mypass)
And any other sign in credentials.
Use ftplib; you can write it like this:
import ftplib
session = ftplib.FTP('server.address.com','USERNAME','PASSWORD')
file = open('kitten.jpg','rb') # file to send
session.storbinary('STOR kitten.jpg', file) # send the file
file.close() # close file and FTP
session.quit()
Use ftplib.FTP_TLS instead if your FTP host requires TLS.
To retrieve it, you can use urllib.urlretrieve (urllib.request.urlretrieve in Python 3):
import urllib
urllib.urlretrieve('ftp://server/path/to/file', 'file')
EDIT:
To find out the current directory, use FTP.pwd():
FTP.pwd(): Return the pathname of the current directory on the server.
To change the directory, use FTP.cwd(pathname):
FTP.cwd(pathname): Set the current directory on the server.
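For example, a quick sketch using both calls around an upload (the remote directory name here is made up):
import ftplib

session = ftplib.FTP('server.address.com', 'USERNAME', 'PASSWORD')
print(session.pwd())            # e.g. '/' right after login
session.cwd('upload/images')    # hypothetical remote directory
with open('kitten.jpg', 'rb') as file:
    session.storbinary('STOR kitten.jpg', file)
session.quit()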
ftplib now supports context managers, so I guess it can be made even easier:
from ftplib import FTP
from pathlib import Path
file_path = Path('kitten.jpg')
with FTP('server.address.com', 'USER', 'PWD') as ftp, open(file_path, 'rb') as file:
ftp.storbinary(f'STOR {file_path.name}', file)
No need to close the file or the session
You will most likely want to use the ftplib module for Python:
import ftplib

ftp = ftplib.FTP()
host = "ftp.site.uk"
port = 21
ftp.connect(host, port)
print(ftp.getwelcome())
try:
    print("Logging in...")
    ftp.login("yourusername", "yourpassword")
except:
    print("Failed to log in")
This logs you into an FTP server. What you do from there is up to you. Your question doesn't indicate any other operations that really need doing.
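If the goal is the file upload from the question, a minimal continuation after a successful login could look like this (the file name is an assumption):
# Continuing with the ftp object from the snippet above; 'report.txt' is a hypothetical local file
with open("report.txt", "rb") as fh:
    ftp.storbinary("STOR report.txt", fh)
ftp.quit()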
Try this:
#!/usr/bin/env python
import os
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('hostname', username="username", password="password")
sftp = ssh.open_sftp()
localpath = '/home/e100075/python/ss.txt'
remotepath = '/home/developers/screenshots/ss.txt'
sftp.put(localpath, remotepath)
sftp.close()
ssh.close()
To avoid the encryption error, you can also try the commands below:
import ftplib

ftp = ftplib.FTP_TLS("ftps.dummy.com")
ftp.login("username", "password")
ftp.prot_p()
file = open("filename", "rb")
ftp.storbinary("STOR filename", file)
file.close()
ftp.close()
ftp.prot_p() ensures that your connection is encrypted.
I just answered a similar question here
IMHO, if your FTP server is able to communicate with Fabric, please use Fabric. It is far better than doing raw FTP.
I have an FTP account from dotgeek.com so I am not sure if this will work for other FTP accounts.
#!/usr/bin/python
from fabric.api import run, env, sudo, put

env.user = 'username'
env.hosts = ['ftp_host_name', ]  # such as ftp.google.com

def copy():
    # assuming I have wong_8066.zip in the same directory as this script
    put('wong_8066.zip', '/www/public/wong_8066.zip')
Save the file as fabfile.py and run fab copy locally.
yeukhon#yeukhon-P5E-VM-DO:~$ fab copy2
[1.ai] Executing task 'copy2'
[1.ai] Login password:
[1.ai] put: wong_8066.zip -> /www/public/wong_8066.zip
Done.
Disconnecting from 1.ai... done.
Once again, if you don't want to input password all the time, just add
env.password = 'my_password'
You can use the function below. I haven't tested it yet, but it should work fine. Remember that the destination is a directory path, whereas the source is the complete file path.
import ftplib
import os

def uploadFileFTP(sourceFilePath, destinationDirectory, server, username, password):
    myFTP = ftplib.FTP(server, username, password)
    if destinationDirectory not in [name for name, data in list(myFTP.mlsd())]:
        print("Destination directory does not exist. Creating it first")
        myFTP.mkd(destinationDirectory)
    # Change the working directory
    myFTP.cwd(destinationDirectory)
    if os.path.isfile(sourceFilePath):
        fh = open(sourceFilePath, 'rb')
        myFTP.storbinary('STOR %s' % os.path.basename(sourceFilePath), fh)
        fh.close()
    else:
        print("Source file does not exist")

Files are corrupted after transferring them to an FTP server using ftplib

I've got a problem with the ftplib library when I'm uploading .gz files.
The script was working before, but somewhere in one of my thousands of edits I changed something that is causing corrupted files to be transferred.
The file is successfully transferred to the FTP server; however, that file won't open on the FTP server because it is corrupted.
The files to be transferred are fine before the transfer. Also, if the files are not compressed, the transfer has no issues. It's something with the way it reads .gz files.
Can anyone let me know what is wrong with the code?
for filename in dir:
    os.system("gzip %s/%s" % (Path, filename))
    time.sleep(5)  # Wait a few seconds for the file to compress
    zip_filename = filename + '.gz'
    try:
        # Connect to the host
        ftp = ftplib.FTP(host)
        # Log in to the FTP server
        ftp.login(username, password)
        # Transfer the file
        myfile = open(zip_filename, 'rb')
        ftp.storlines("STOR temp/" + zip_filename, myfile)
        myfile.close()
    except ftplib.all_errors as e:
        print(e)
The issue was the use of storlines. In this case, storbinary needs to be used, since .gz files are binary.
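A sketch of the corrected transfer, with storlines swapped for storbinary (same variables as in the loop above):
# Binary-safe upload of the compressed file
myfile = open(zip_filename, 'rb')
ftp.storbinary("STOR temp/" + zip_filename, myfile)
myfile.close()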

SFTP in Python? (platform independent)

I'm working on a simple tool that transfers files to a hard-coded location with the password also hard-coded. I'm a Python novice, but thanks to ftplib, it was easy:
import ftplib

info = ('someuser', 'password')    # hard-coded

def putfile(file, site, dir, user=(), verbose=True):
    """
    upload a file by ftp to a site/directory
    login hard-coded, binary transfer
    """
    if verbose: print 'Uploading', file
    local = open(file, 'rb')
    remote = ftplib.FTP(site)
    remote.login(*user)
    remote.cwd(dir)
    remote.storbinary('STOR ' + file, local, 1024)
    remote.quit()
    local.close()
    if verbose: print 'Upload done.'

if __name__ == '__main__':
    site = 'somewhere.com'    # hard-coded
    dir = './uploads/'        # hard-coded
    import sys, getpass
    putfile(sys.argv[1], site, dir, user=info)
The problem is that I can't find any library that supports SFTP. What's the normal way to do something like this securely?
Edit: Thanks to the answers here, I've gotten it working with Paramiko, and this is the syntax.
import paramiko
host = "THEHOST.com" #hard-coded
port = 22
transport = paramiko.Transport((host, port))
password = "THEPASSWORD" #hard-coded
username = "THEUSERNAME" #hard-coded
transport.connect(username = username, password = password)
sftp = paramiko.SFTPClient.from_transport(transport)
import sys
path = './THETARGETDIRECTORY/' + sys.argv[1] #hard-coded
localpath = sys.argv[1]
sftp.put(localpath, path)
sftp.close()
transport.close()
print 'Upload done.'
Thanks again!
Paramiko supports SFTP. I've used it, and I've used Twisted. Both have their place, but you might find it easier to start with Paramiko.
You should check out pysftp (https://pypi.python.org/pypi/pysftp). It depends on paramiko, but wraps the most common use cases in just a few lines of code.
import pysftp
import sys

path = './THETARGETDIRECTORY/' + sys.argv[1]    # hard-coded
localpath = sys.argv[1]
host = "THEHOST.com"                            # hard-coded
password = "THEPASSWORD"                        # hard-coded
username = "THEUSERNAME"                        # hard-coded

with pysftp.Connection(host, username=username, password=password) as sftp:
    sftp.put(localpath, path)

print 'Upload done.'
Here is a sample using pysftp and a private key.
import pysftp

def upload_file(file_path):
    private_key = "~/.ssh/your-key.pem"  # can use the password keyword in Connection instead
    srv = pysftp.Connection(host="your-host", username="user-name", private_key=private_key)
    srv.chdir('/var/web/public_files/media/uploads')  # change directory on the remote server
    srv.put(file_path)   # To download a file, replace put with get
    srv.close()          # Close the connection
pysftp is an easy-to-use SFTP module that utilizes paramiko and pycrypto. It provides a simple interface to SFTP. Other things that you can do with pysftp which are quite useful:
data = srv.listdir() # Get the directory and file listing in a list
srv.get(file_path) # Download a file from remote server
srv.execute('pwd') # Execute a command on the server
More commands and about PySFTP here.
If you want easy and simple, you might also want to look at Fabric. It's an automated deployment tool like Ruby's Capistrano, but simpler and of course for Python. It's built on top of Paramiko.
You might not want to do 'automated deployment', but Fabric would suit your use case perfectly nonetheless. To show you how simple Fabric is, the fab file and command for your script would look like this (not tested, but 99% sure it will work):
fab_putfile.py:
from fabric.api import *

env.hosts = ['THEHOST.com']
env.user = 'THEUSER'
env.password = 'THEPASSWORD'

def put_file(file):
    put(file, './THETARGETDIRECTORY/')  # it's copied into the target directory
Then run the file with the fab command:
fab -f fab_putfile.py put_file:file=./path/to/my/file
And you're done! :)
fsspec is a great option for this; it offers a filesystem-like implementation of SFTP.
from fsspec.implementations.sftp import SFTPFileSystem

fs = SFTPFileSystem(host=host, username=username, password=password)

# list a directory
fs.ls("/")

# open a file
with fs.open(file_name) as file:
    content = file.read()
Also worth noting that fsspec uses paramiko in the implementation.
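Uploading works similarly; a short sketch, assuming the same credentials and made-up file paths:
from fsspec.implementations.sftp import SFTPFileSystem

fs = SFTPFileSystem(host=host, username=username, password=password)

# copy a local file to the remote side (both paths are hypothetical)
fs.put("local/kitten.jpg", "/remote/kitten.jpg")

# or write remote content directly
with fs.open("/remote/notes.txt", "wb") as f:
    f.write(b"Some content.")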
If you are using an RSA key, refer here.
Snippet:
import pysftp
import paramiko
from base64 import decodebytes
keydata = b"""AAAAB3NzaC1yc2EAAAADAQABAAABAQDl"""
key = paramiko.RSAKey(data=decodebytes(keydata))
cnopts = pysftp.CnOpts()
cnopts.hostkeys.add(host, 'ssh-rsa', key)
with pysftp.Connection(host=host, username=username, password=password, cnopts=cnopts) as sftp:
    with sftp.cd(directory):
        sftp.put(file_to_sent_to_ftp)
Twisted can help you with what you are doing; check out their documentation, there are plenty of examples. It is also a mature product with a big developer/user community behind it.
There are a bunch of answers that mention pysftp, so in the event that you want a context-manager wrapper around pysftp, here is a solution with even less code that ends up looking like the following when used:
path = "sftp://user:p#ssw0rd#test.com/path/to/file.txt"
# Read a file
with open_sftp(path) as f:
s = f.read()
print s
# Write to a file
with open_sftp(path, mode='w') as f:
f.write("Some content.")
The (fuller) example: http://www.prschmid.com/2016/09/simple-opensftp-context-manager-for.html
This context manager happens to have auto-retry logic baked in in the event you can't connect the first time around (which surprisingly happens more often than you'd expect in a production environment...)
The context manager gist for open_sftp: https://gist.github.com/prschmid/80a19c22012e42d4d6e791c1e4eb8515
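For reference, a wrapper along those lines can be sketched with contextlib and paramiko; this is only an approximation of the idea for the URL format shown above, not the code from the gist, and it leaves out the retry logic:
from contextlib import contextmanager
from urllib.parse import urlparse
import paramiko

@contextmanager
def open_sftp(url, mode='r'):
    # Parse sftp://user:password@host/path and yield an open remote file handle
    parts = urlparse(url)
    transport = paramiko.Transport((parts.hostname, parts.port or 22))
    transport.connect(None, parts.username, parts.password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    remote_file = sftp.open(parts.path, mode)
    try:
        yield remote_file
    finally:
        remote_file.close()
        sftp.close()
        transport.close()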
Paramiko is so slow. Use subprocess and the shell instead; here is an example:
import subprocess
import sys

remote_file_name = "filename"
remotedir = "/remote/dir"
localpath = "/local/file/dir"

ftp_cmd_p = """
#!/bin/sh
lftp -u username,password sftp://ip:port <<EOF
cd {remotedir}
lcd {localpath}
get {filename}
EOF
"""

subprocess.call(ftp_cmd_p.format(remotedir=remotedir,
                                 localpath=localpath,
                                 filename=remote_file_name),
                shell=True, stdout=sys.stdout, stderr=sys.stderr)
PyFilesystem with its sshfs is one option. It uses Paramiko under the hood and provides a nicer platform-independent interface on top.
import fs
sf = fs.open_fs("sftp://[user[:password]@]host[:port]/[directory]")
sf.makedir('my_dir')
or
from fs.sshfs import SSHFS
sf = SSHFS(...
Here's a generic function that will download any given SFTP URL to a specified path:
from urllib.parse import urlparse
import paramiko

url = 'sftp://username:password@hostname/filepath.txt'

def sftp_download(url, dest):
    url = urlparse(url)
    with paramiko.Transport((url.hostname, 22)) as transport:
        transport.connect(None, url.username, url.password)
        with paramiko.SFTPClient.from_transport(transport) as sftp:
            sftp.get(url.path, dest)
Call it with
sftp_download(url, "/tmp/filepath.txt")
