I'm running a script for an FTP server in Python with pyftpdlib, like this:
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer
def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('user', '12345', '.', perm='elradfmwM')

    handler = FTPHandler
    handler.authorizer = authorizer

    address = ('localhost', 2121)
    server = FTPServer(address, handler)

    # start ftp server
    server.serve_forever()

if __name__ == '__main__':
    main()
Whenever I run this code, it keeps the command line busy running server.serve_forever(); execution never gets past that call.
So my question is: how do you add a user while the server is running, without shutting the server down?
Do I need to create another script to call this one, and then another to add a user? Will they need to be threaded so both can run without one locking up the other?
I've tried to look through the tutorials on this module, but I haven't found any examples or seen any solutions. I'm willing to switch to a different module to run an FTP server if needed, as long as I can control everything.
I'm going to be adding users from a database a lot, then removing them, so I don't want to take the server down and restart it every time I need to add someone.
In case anyone is interested in this topic, I think I have found a solution.
What I did was save the above code in a file called "create_ftp_server.py" and change "main" from a function to a class.
#First .py module
import logging

from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer

class ftp_server:
    def __init__(self):
        self.authorizer = DummyAuthorizer()
        self.authorizer.add_user('admin', 'password', '.', perm='elradfmwM')

    def run(self):
        self.handler = FTPHandler
        self.handler.authorizer = self.authorizer
        self.address = ('localhost', 21)
        self.server = FTPServer(self.address, self.handler)
        logging.basicConfig(filename='pyftpd.log', level=logging.INFO)
        self.server.serve_forever()

    def add_user(self, user, passwd, loc, privi):
        self.authorizer.add_user(str(user), str(passwd), str(loc), perm=str(privi))
Then I made another module to call and run this file (really this is just personal taste, as you could just add this to the same module):
#Second .py module
import thread
import create_ftp_server
this_ftp = create_ftp_server.ftp_server()
thread.start_new_thread(this_ftp.run,())
thread.start_new_thread(this_ftp.add_user,('user','password',".",'elradfmwM'))
All you have to do now is decide how you want to call the thread that adds the user.
Note: I tried to use the newer threading module but couldn't get any real results with it. This option works well enough for me, though.
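For reference, the thread module used above is Python 2 only (it became _thread in Python 3). A rough equivalent with the higher-level threading module, assuming the same create_ftp_server.ftp_server class from above, might look like this:

import threading

import create_ftp_server

this_ftp = create_ftp_server.ftp_server()

# run() blocks inside serve_forever(), so it gets its own thread;
# the (non-daemon) thread also keeps the process alive once the
# main thread is done.
server_thread = threading.Thread(target=this_ftp.run)
server_thread.start()

# add_user() only touches the authorizer, so it can be called from
# the main thread while the server keeps running.
this_ftp.add_user('user', 'password', '.', 'elradfmwM')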
You can do it with a custom SITE command, like this:

from pyftpdlib.handlers import FTPHandler, TLS_FTPHandler

proto_cmds = FTPHandler.proto_cmds.copy()
# Define SITE ADDUSER USERNAME PASSWORD HOME PRIVS
proto_cmds.update(
    {'SITE ADDUSER': dict(perm='m', auth=True, arg=True,
                          help='Syntax: SITE <SP> ADDUSER USERNAME PASSWORD HOME PRIVS <SP>.')}
)

class siteadd(TLS_FTPHandler):
    proto_cmds = proto_cmds

    def ftp_SITE_ADDUSER(self, user, passwd, loc, privi):
        self.authorizer.add_user(str(user), str(passwd), str(loc), perm=str(privi))
and change the handler line from
self.handler = FTPHandler
to
self.handler = siteadd
Check whether this works: connect to your FTP server and run the SITE HELP command to see all the available commands.
You can run the FTPd in daemon mode, too. Look here for an example.
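For completeness, here is a rough sketch of how a client could call the new command with the standard library's ftplib, assuming the server above is listening on localhost:21 and the 'admin' account has the 'm' permission required by the proto_cmds entry (the new user's details are placeholders):

from ftplib import FTP

ftp = FTP()
ftp.connect('localhost', 21)
ftp.login('admin', 'password')

# Arguments mirror ftp_SITE_ADDUSER(user, passwd, loc, privi) above.
print(ftp.sendcmd('SITE ADDUSER newuser secret /srv/ftp/newuser elradfmwM'))

ftp.quit()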
Related
I am trying to run multiple FTP server instances on the same localhost using the following code:
# "create_multiple_ftp_servers.py"
from multiprocessing import Pool
import sys
sys.path.insert(1, r'C:\Users\Desktop\PythonCodes')
import create_ftp_server
ftp_server_dict = {'ftp1': ['127.0.0.1', 'test', 'test#123', r'C:\Users\Desktop\ftpdir', 1024],
'ftp2': ['127.0.0.1', 'test', 'test#123', r'C:\Users\Desktop\ftpdir', 1025]}
for k, v in ftp_server_dict.items():
with Pool(5) as p:
server_instance = p.map(create_ftp_server.ftp, [v,])
p.close()
Then the following "create_ftp_server.py" file is used to create and run an instance of the FTP server:
import os

from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer

def ftp(data):
    # Instantiate a dummy authorizer for managing 'virtual' users
    authorizer = DummyAuthorizer()

    # Define a new user having full r/w permissions and a read-only
    # anonymous user
    authorizer.add_user(data[1], data[2], data[3], perm='elradfmwMT')
    authorizer.add_anonymous(os.getcwd())

    # Instantiate FTP handler class
    handler = FTPHandler
    handler.authorizer = authorizer

    # Define a customized banner (string returned when client connects)
    handler.banner = "pyftpdlib based ftpd ready."

    # Specify a masquerade address and the range of ports to use for
    # passive connections. Uncomment in case you're behind a NAT.
    # handler.masquerade_address = '151.25.42.11'
    handler.passive_ports = range(60000, 65535)

    # Instantiate FTP server class and listen on the given address/port
    address = (data[0], data[4])
    server = FTPServer(address, handler)

    # set a limit for connections
    server.max_cons = 256
    server.max_cons_per_ip = 5

    # start ftp server
    server.serve_forever()
After running the above code I expect two FTP server instances to be running on localhost ('127.0.0.1'), on ports 1024 and 1025 respectively.
But after running it, only one FTP server instance is running. Can somebody please let me know how to resolve this issue?
The serve_forever() call is what is causing your issue: it never returns, because it runs an infinite loop. Since ftp() never returns, the first p.map() call blocks forever and the loop never gets around to starting the second server.
The only way I see to handle this is to run your FTP servers in background threads, one thread per server, or to use a concurrency library like gevent (which ultimately does the same thing as background threads, albeit a bit lighter).
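A minimal sketch of the background-thread approach, reusing the ftp() function and the dictionary from the question:

import threading

import create_ftp_server

ftp_server_dict = {'ftp1': ['127.0.0.1', 'test', 'test#123', r'C:\Users\Desktop\ftpdir', 1024],
                   'ftp2': ['127.0.0.1', 'test', 'test#123', r'C:\Users\Desktop\ftpdir', 1025]}

threads = []
for name, config in ftp_server_dict.items():
    # Each server blocks inside serve_forever(), so each one gets its own thread.
    t = threading.Thread(target=create_ftp_server.ftp, args=(config,), name=name)
    t.start()
    threads.append(t)

# Keep the main thread alive while the servers run.
for t in threads:
    t.join()

One caveat: ftp() assigns the authorizer onto the shared FTPHandler class (handler.authorizer = authorizer), so two servers in the same process end up sharing whichever authorizer was set last; giving each server its own FTPHandler subclass avoids that.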
I am trying to use pyftpdlib to serve files via FTP. I was able to run a sample server, but I am looking for a way to change the working directory for a user right after they log in.
The current situation is that after login, PWD always returns '/' (the root):
ftp> pwd
257 "/" is the current directory.
I want the server to automatically change its directory to, let's say, 'test' after the user logs in, so that PWD after login returns:
ftp> pwd
257 "/test" is the current directory.
I looked in the documentation and saw that there is an on_login() handler, but no examples of how to use it.
Any ideas will be helpful.
I am adding sample code:
import os

from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer

def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('user', '12345', homedir='c:\\temp', perm='elradfmwMT')

    handler = FTPHandler
    handler.authorizer = authorizer

    address = ('', 21)
    server = FTPServer(address, handler)
    server.serve_forever()

if __name__ == '__main__':
    main()
I also have a folder c:\temp\test that I want to switch to automatically, without setting it as the root. The root should remain c:\temp.
Just change the homedir parameter to your desired path:
def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('user', 'pwd', homedir='/test', perm='elradfmwMT')

    handler = FTPHandler
    handler.authorizer = authorizer

    server = FTPServer(('', 2121), handler)
    server.serve_forever()

if __name__ == "__main__":
    main()
Reference: https://pyftpdlib.readthedocs.io/en/latest/api.html
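If the root should stay at c:\temp and only the working directory should change after login, as the question asks, another option is to override the on_login() callback mentioned in the question. A rough sketch, assuming a test subdirectory exists inside the user's home directory:

from pyftpdlib.handlers import FTPHandler

class ChdirOnLoginHandler(FTPHandler):
    def on_login(self, username):
        # Called by pyftpdlib after a successful login; change this
        # session's working directory. '/test' is a virtual path under
        # the user's root, i.e. c:\temp\test on disk in this setup.
        self.fs.chdir(self.fs.ftp2fs('/test'))

# then plug it in instead of the plain FTPHandler:
# handler = ChdirOnLoginHandler
# handler.authorizer = authorizer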
I would like to know if there is a way to detect, in Python, that someone has disconnected from my FTP server with pyftpdlib. I know that there is a function called 'on_logout', but I don't know how to use it.
Thank you for your answers!
Create your own implementation of FTPHandler and use it with your server instance.
Note, though, that on_logout is triggered in response to the user explicitly logging out. Most FTP clients do not do that; they simply disconnect. For that case, use on_disconnect.
class MyFTPHandler(FTPHandler):
    def on_logout(self, username):
        print("%s logged out" % username)

    def on_disconnect(self):
        print("disconnected")

handler = MyFTPHandler
# ...
server = FTPServer(('', 21), handler)
I want to import some functions from a script, let's call it controller.py, into a Flask app that exposes them as a web service. Let's call the Flask app api.py.
The problem is that controller.py contains a pyserial port declaration.
controller.py:
import serial

ser = serial.Serial('COM31', 9600, timeout=2)

def serial_function(foo):
    ser.write(foo)
    reply = ser.read()
    return reply
api.py:
from flask import Flask
import controller as cont

app = Flask(__name__)

@app.route('/function/<foo>', methods=['GET'])
def do_function(foo):
    data = cont.serial_function(foo)
    return data

if __name__ == '__main__':
    app.run('0.0.0.0', 80, True)
But I got this error:
raise SerialException("could not open port %s: %s" % (self.portstr, ctypes.WinError()))
serial.serialutil.SerialException: could not open port COM31: [Error 5] Access is denied.
It seems that Flask is trying to import controller.py over and over again, and the serial port gets re-initialized.
Is there some way I can achieve what I'm trying to do as described above?
The main problem with your code is that the Serial object is created directly at module level. Because of this, each time you import your module, Python loads/interprets the file and, in doing so, executes all the code found at the root level of the module (you can refer to this excellent answer if you want to dig deeper: What does if __name__ == "__main__": do?).
Moreover, in debug mode Flask starts 2 processes (one that monitors the source code and, when it changes, restarts the second one, which is the one that actually handles the requests), and in production mode you may create many more threads or processes when starting the server. Either way, your module is imported at least twice, hence the conflict when opening the serial port.
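As an aside, if the double import only bites you in debug mode, the reloader process can be turned off; use_reloader is a standard Flask/Werkzeug option, not something from the question:

if __name__ == '__main__':
    # use_reloader=False keeps Flask in a single process, so the module
    # (and therefore the serial port) is only initialized once.
    app.run('0.0.0.0', 80, debug=True, use_reloader=False)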
A possible solution would be to remove the initialization of the serial port from your module and use the context manager syntax in your method:
def serial_function(foo):
    with serial.Serial('COM31', 9600, timeout=2) as ser:
        ser.write(foo)
        reply = ser.read()
        return reply
This way, you will open (and close) your serial port at each read.
But, you'll still have to deal with concurrent access if you have multiple clients making requests to your webserver simultaneously.
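A simple way to deal with that concurrent access is a module-level lock, sketched here with threading.Lock (this assumes the default threaded Flask server; it would not help across multiple worker processes):

import threading

import serial

_serial_lock = threading.Lock()

def serial_function(foo):
    # Only one request at a time may open and use the port.
    with _serial_lock:
        with serial.Serial('COM31', 9600, timeout=2) as ser:
            ser.write(foo)
            reply = ser.read()
            return reply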
EDIT:
If, as you say in a comment, you need to open your serial port only once, you'll need to encapsulate it in a specific object (probably using a singleton pattern) that is responsible for opening the serial port if it is not already open:
class SerialProxy:
    def __init__(self):
        self.__serial = None

    def serial_function(self, foo):
        # Lazily open the port on first use, then reuse it afterwards.
        if self.__serial is None:
            self.__serial = serial.Serial('COM31', 9600, timeout=2)

        self.__serial.write(foo)
        reply = self.__serial.read()
        return reply
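Usage from the Flask side would then look roughly like this (creating the proxy once at module level in controller.py is the point: every request in the process reuses the same instance):

# controller.py, after the class definition
proxy = SerialProxy()


# api.py
from flask import Flask

import controller as cont

app = Flask(__name__)

@app.route('/function/<foo>', methods=['GET'])
def do_function(foo):
    # All requests share cont.proxy, so the port is opened at most once.
    return cont.proxy.serial_function(foo)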
I want to start a simple web server locally, then launch a browser with a URL it has just started serving. This is roughly what I'd like to write:
from wsgiref.simple_server import make_server
import webbrowser

srv = make_server(...)
srv.blocking = False
srv.serve_forever()
webbrowser.open_new_tab(...)

try:
    srv.blocking = True
except KeyboardInterrupt:
    pass

print 'Bye'
The problem is that I couldn't find a way to set a blocking option for the wsgiref simple server. By default it is blocking, so the browser would be launched only after I stop the server; if I launch the browser first, the request is not handled yet. I'd prefer to use an HTTP server from the standard library, not an external one like Tornado.
You either have to spawn a thread running the server, so you can continue with your control flow, or you have to use 2 Python processes.
Untested code, but you should get the idea:
class ServerThread(threading.Thread):
    def __init__(self, port):
        threading.Thread.__init__(self)
        self.port = port

    def run(self):
        # build the server with self.port here
        srv = make_server(...)
        srv.serve_forever()

if '__main__' == __name__:
    ServerThread(...).start()
    webbrowser.open_new_tab(...)
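A more complete, runnable variant of the same idea; the placeholder WSGI app, port 8000, and the daemon flag are my own additions, not from the question:

import threading
import webbrowser
from wsgiref.simple_server import make_server

PORT = 8000

def app(environ, start_response):
    # Minimal placeholder WSGI app, just so there is something to serve.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'Hello from wsgiref\n']

def serve():
    srv = make_server('127.0.0.1', PORT, app)
    srv.serve_forever()

if __name__ == '__main__':
    # daemon=True lets the process exit cleanly even though
    # serve_forever() never returns.
    t = threading.Thread(target=serve, daemon=True)
    t.start()

    webbrowser.open_new_tab('http://127.0.0.1:%d/' % PORT)

    try:
        # Keep the main thread alive until the user hits Ctrl+C.
        while t.is_alive():
            t.join(timeout=1.0)
    except KeyboardInterrupt:
        pass
    print('Bye')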