The Python documentation includes an example of creating an HTTP server:
def run(server_class=HTTPServer, handler_class=BaseHTTPRequestHandler):
    server_address = ('', 8000)
    httpd = server_class(server_address, handler_class)
    httpd.serve_forever()
A RequestHandler class is provided to the Server, which then takes care of instantiating the handler automatically.
Let's say I want to pass in custom parameters to the request handler when it's created. How can and should I do that?
More specifically, I want to pass in parameters from the command line, and having to access sys.argv inside the request handler class seems unnecessarily clunky.
It seems like this should be possible by overriding parts of the Server class, but I feel like I'm overlooking a simpler and better solution.
I solved this in my code using "partial application".
The example is written using Python 3, but partial application works the same way in Python 2:
import sys
from functools import partial
from http.server import HTTPServer, BaseHTTPRequestHandler

class ExampleHandler(BaseHTTPRequestHandler):
    def __init__(self, foo, bar, qux, *args, **kwargs):
        self.foo = foo
        self.bar = bar
        self.qux = qux
        # BaseHTTPRequestHandler calls do_GET **inside** __init__ !!!
        # So we have to call super().__init__ after setting attributes.
        super().__init__(*args, **kwargs)

    def do_HEAD(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.end_headers()

    def do_GET(self):
        self.do_HEAD()
        self.wfile.write('{!r} {!r} {!r}\n'
                         .format(self.foo, self.bar, self.qux)
                         .encode('utf8'))

# We "partially apply" the first three arguments to the ExampleHandler ...
handler = partial(ExampleHandler, sys.argv[1], sys.argv[2], sys.argv[3])
# ... then pass it to HTTPServer as normal:
server = HTTPServer(('', 8000), handler)
server.serve_forever()
This is very similar to a class factory, but in my opinion it has a couple of subtle advantages:
partial objects are much easier to introspect for what's inside them than nested classes defined and returned by factory functions: the wrapped callable and the pre-attached arguments sit right on the object (see the sketch below).
partial objects can be serialized with pickle in modern Python, whereas nested class definitions inside factory functions cannot (at least not without going out of your way to write a __reduce__ method on the class to make it possible).
In my limited experience, explicitly "pre-attaching" arguments with partial to an otherwise normal, Pythonic class definition carries less cognitive load: it is easier to read, understand, and verify for correctness than a nested class definition with the wrapping function's parameters buried somewhere inside it.
The only real disadvantage is that many people are unfamiliar with partial - but in my experience it is better for everyone to become familiar with partial anyway, because it has a way of popping up as an easy, composable solution in many places, sometimes unexpectedly, as it does here.
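A quick illustration of the first two points, using the handler from the example above (the argument values are placeholders):

import pickle
from functools import partial

handler = partial(ExampleHandler, 'foo-value', 'bar-value', 'qux-value')

# The wrapped class and the pre-attached arguments are plain attributes:
print(handler.func)      # the ExampleHandler class itself
print(handler.args)      # ('foo-value', 'bar-value', 'qux-value')
print(handler.keywords)  # {}

# The partial object can also be pickled, as long as the wrapped class is
# importable at module level:
restored = pickle.loads(pickle.dumps(handler))
print(restored.args)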
Use a class factory:
def MakeHandlerClassFromArgv(init_args):
    class CustomHandler(BaseHTTPRequestHandler):
        def __init__(self, *args, **kwargs):
            super(CustomHandler, self).__init__(*args, **kwargs)
            do_stuff_with(self, init_args)
    return CustomHandler

if __name__ == "__main__":
    server_address = ('', 8000)
    HandlerClass = MakeHandlerClassFromArgv(sys.argv)
    httpd = HTTPServer(server_address, HandlerClass)
    httpd.serve_forever()
At the time of this writing, all the answers here essentially stick to the (very awkward) intention the author of the socketserver module seems to have had: that the handler passed in be a class (i.e. a constructor). Really, the only thing required of the handler is that it be callable, so we can work around the socketserver API by making instances of our handler class callable and having them run the superclass's __init__ code when called. In Python 3:
import http.server

class MyHandler(http.server.BaseHTTPRequestHandler):
    def __init__(self, message):
        self.message = message

    def __call__(self, *args, **kwargs):
        """Handle a request."""
        super().__init__(*args, **kwargs)

    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(self.message.encode("utf-8"))
This keeps the superclass "constructor" call out of __init__ which eliminates the possibility of dispatching a request (from the superclass's constructor) before the subclass's constructor is finished. Note that the __init__ override must be present to divert execution even if it's not needed for initialization; an empty implementation using pass would work.
With this design the weird interface is hidden and using the API looks more natural:
handler = MyHandler("Hello world")
server = http.server.HTTPServer(("localhost", 8000), handler)
server.serve_forever()
I would just comment on Thomas Orozco's answer, but since I can't...
Perhaps this will help others who also run into this problem. Before Python 3, Python had "old-style" classes, and BaseHTTPRequestHandler seems to be one of them. So the factory should look like
def MakeHandlerClassFromArgv(init_args):
    class CustomHandler(BaseHTTPRequestHandler, object):
        def __init__(self, *args, **kwargs):
            do_stuff_with(self, init_args)
            super(CustomHandler, self).__init__(*args, **kwargs)
    return CustomHandler
to avoid errors like TypeError: must be type, not classobj.
Why not just subclass the RequestHandler?
class RequestHandler(BaseHTTPRequestHandler):
    a_variable = None

class Server(HTTPServer):
    def serve_forever(self, variable):
        self.RequestHandlerClass.a_variable = variable
        HTTPServer.serve_forever(self)

def run(server_class=Server, handler_class=RequestHandler):
    server_address = ('', 8000)
    httpd = server_class(server_address, handler_class)
    variable = sys.argv
    httpd.serve_forever(variable)
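Any handler method can then read the value from the class attribute; for example, a do_GET on the RequestHandler above might look like this (the response body is illustrative):

class RequestHandler(BaseHTTPRequestHandler):
    a_variable = None

    def do_GET(self):
        # a_variable was set on the class by Server.serve_forever()
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.end_headers()
        self.wfile.write(repr(self.a_variable).encode('utf-8'))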
Subclassing the HTTPServer is another option. Variables set on the server are accessible in the request handler methods via self.server.context. It basically works like this:
class MyHTTPServer(HTTPServer):
    def __init__(self, *args, **kwargs):
        HTTPServer.__init__(self, *args, **kwargs)
        self.context = SomeContextObject()

class MyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        context = self.server.context
        ...

# Drawback: you cannot pass the context in at construction time, but you can
# set it up inside MyHTTPServer.__init__.
server = MyHTTPServer(('', port), MyHandler)
server.serve_forever()
If you do not need instance properties but only class properties, you could use this approach:
def run(server_class=HTTPServer, handler_class=BaseHTTPRequestHandler):
    server_address = ('', 8000)
    httpd = server_class(server_address, handler_class)
    httpd.RequestHandlerClass.my_custom_variable = "hello!"
    httpd.serve_forever()
or maybe you could:
def run(server_class=HTTPServer, handler_class=BaseHTTPRequestHandler):
    server_address = ('', 8000)
    httpd = server_class(server_address, handler_class)
    httpd.my_custom_variable = "hello!"
    httpd.serve_forever()
and retrieve in your RequestHandler with:
self.server.my_custom_variable
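For example (a minimal sketch; the handler class name and response body are illustrative):

class MyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # my_custom_variable was set on the server instance in run()
        value = self.server.my_custom_variable
        self.send_response(200)
        self.send_header('Content-type', 'text/plain')
        self.end_headers()
        self.wfile.write(value.encode('utf-8'))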
Using a lambda is a pretty simple way to create a new function that takes the request handler args and constructs your custom handler with them.
Here I want to pass a variable that will be used in do_POST() and set the directory used by SimpleHTTPRequestHandler, so the setup call is:
HTTPServer(('', 8001), lambda *_: _RequestHandler("[1, 2]", *_, directory=sys.path[0]))
Full program:
from http.server import HTTPServer, SimpleHTTPRequestHandler
import sys

class _RequestHandler(SimpleHTTPRequestHandler):
    def __init__(self, x, *args, **kwargs):
        self.x = x  # NEEDS TO HAPPEN BEFORE super().__init__()
        super().__init__(*args, **kwargs)

    def _set_headers(self):
        self.send_response(200)
        self.send_header('Content-type', 'application/json')
        self.end_headers()

    def do_POST(self):
        print("POST")
        length = int(self.headers.get('content-length'))
        message = self.rfile.read(length).decode('utf-8')
        print(message)
        self._set_headers()
        self.wfile.write(self.x.encode('utf-8'))

def run_server():
    server_address = ('', 8001)
    httpd = HTTPServer(server_address,
                       lambda *_: _RequestHandler("[1, 2]", *_, directory=sys.path[0]))
    print('serving http://localhost:8001')
    httpd.serve_forever()

if __name__ == '__main__':
    run_server()
Never do it with a global. Use the factory described in other answers.
CONFIG = None

class MyHandler(BaseHTTPRequestHandler):
    def __init__(self, ...
        self.config = CONFIG  # CONFIG is now 'stuff'

if __name__ == "__main__":
    global CONFIG
    CONFIG = 'stuff'
    server_address = ('', 8000)
    httpd = HTTPServer(server_address, MyHandler)
    httpd.serve_forever()
(except maybe in the privacy of your own home)
Related
I have the following two Python classes:
import socketserver

class MyServer(socketserver.ThreadingMixIn, socketserver.TCPServer):
    def __init__(self, server_address):
        socketserver.TCPServer.__init__(self, server_address, MyTcpHandler)
        self.allow_reuse_address = True
        self.serve_forever()

class MyTcpHandler(socketserver.BaseRequestHandler):
    data = ""

    def handle(self):
        self.data = self.request.recv(BUFF_SIZE).strip()
        if self.data == b"shutdown":
            self.request.close()
            import threading
            threading.Thread(target=SERVER.shutdown).start()
Thus, when the client sends "shutdown", the server itself should shut down. As a workaround, I set the global variable SERVER to the MyServer object, then call SERVER.shutdown in another thread, as shown above.
But using a global variable is ugly as hell. So how can I communicate directly from the request handler with the socket server instead?
The server is available as self.server, according to the Python docs: https://docs.python.org/2/library/socketserver.html#SocketServer.BaseRequestHandler.handle
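So the request handler can trigger the shutdown without any global; a minimal sketch based on the handler from the question (BUFF_SIZE is the question's own constant):

import socketserver
import threading

class MyTcpHandler(socketserver.BaseRequestHandler):
    def handle(self):
        data = self.request.recv(BUFF_SIZE).strip()
        if data == b"shutdown":
            self.request.close()
            # self.server is the server instance that created this handler.
            # shutdown() must not run on the thread executing serve_forever(),
            # so hand it off to a short-lived helper thread.
            threading.Thread(target=self.server.shutdown).start()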
While looking over the new features in Python 3.x, I was intrigued by the asyncio library being added. After looking at the reference documentation, I decided to play around with it a little bit.
This worked well until I tried to make it work for multiple clients, and keep a list of all active/connected clients. This introduced a cyclic dependency between the server class and the session class.
Now, I've tried a few different ways to resolve this; however, there doesn't appear to be any way for me to get at this data directly from the server class through any method/functional call.
While I have been able to work around this by using a "lazy" import, it seems like this may be indicative of either a poor design, a lack of understanding of the library itself, or a combination of both.
Code wise, I have a small sample put together. Am I missing an obvious solution here, or does my organization need to change to better support the functionality provided by the asyncio library?
__main__.py:
from network.server import Server

def main(args=None):
    s = Server()
    try:
        s.run()
    except KeyboardInterrupt:
        pass
    s.close()

if __name__ == "__main__":
    main()
server.py:
import asyncio
from network.session import Session

class Server:
    sessionList = []

    def __init__(self):
        self.handler = None
        self.loop = asyncio.get_event_loop()
        self.coro = self.loop.create_server(Session, 'localhost', 1234)

    def run(self):
        self.handler = self.loop.run_until_complete(self.coro)
        print('Server Running On: {}'.format(self.handler.sockets[0].getsockname()))
        self.loop.run_forever()

    def close(self):
        self.handler.close()
        self.loop.run_until_complete(self.handler.wait_closed())
        self.loop.close()

    @staticmethod
    def add_session(session):
        Server.sessionList.append(session)

    @staticmethod
    def del_session(session):
        Server.sessionList.remove(session)
session.py:
import asyncio

class Session(asyncio.Protocol):
    def __init__(self):
        from network.server import Server
        self._transport = None
        Server.add_session(self)

    def connection_made(self, transport):
        self._transport = transport
        self._transport.write('Echo Server Example\r\n\r\n'.encode())

    def data_received(self, data):
        self._transport.write(data)

    def eof_received(self):
        self._transport.close()

    def connection_lost(self, exc):
        from network.server import Server
        Server.del_session(self)
        if exc is not None:
            self._transport.close()
You may pass the server instance into the Session constructor:
self.loop.create_server(lambda: Session(self), 'localhost', 1234)
Storing sessionList as a global object is not best practice.
I recommend saving it as self.sessionList = [] in Server.__init__ and converting both add_session and del_session from staticmethods into regular methods, as sketched below.
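A minimal sketch of that refactoring, based on the code in the question (only the relevant parts are shown):

# server.py
class Server:
    def __init__(self):
        self.sessionList = []
        self.handler = None
        self.loop = asyncio.get_event_loop()
        # The protocol factory passes this Server instance to every new Session.
        self.coro = self.loop.create_server(lambda: Session(self), 'localhost', 1234)

    def add_session(self, session):
        self.sessionList.append(session)

    def del_session(self, session):
        self.sessionList.remove(session)

# session.py
class Session(asyncio.Protocol):
    def __init__(self, server):
        self._server = server
        self._transport = None
        self._server.add_session(self)

    def connection_lost(self, exc):
        self._server.del_session(self)
        if exc is not None:
            self._transport.close()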
CLIENT:
#!/usr/bin/env python
from twisted.internet import reactor, protocol

class EchoClient(protocol.Protocol):
    def __init__(self, arg):
        self.arg = arg

    def connectionMade(self):
        self.transport.write("hello, world!")

    def dataReceived(self, data):
        print "Server said:", data
        self.transport.loseConnection()

    def connectionLost(self, reason):
        print "connection lost"

class EchoFactory(protocol.ClientFactory):
    protocol = EchoClient

    def buildProtocol(self, address):
        proto = protocol.ClientFactory.buildProtocol(self, address, 12)
        self.connectedProtocol = proto
        return proto

    def clientConnectionFailed(self, connector, reason):
        print "Connection failed - goodbye!"
        reactor.stop()

    def clientConnectionLost(self, connector, reason):
        print "Connection lost - goodbye!"
        reactor.stop()

def main():
    f = EchoFactory()
    reactor.connectTCP("localhost", 8000, f)
    reactor.run()

if __name__ == '__main__':
    main()
SERVER:
#!/usr/bin/env python
from twisted.internet import reactor, protocol
from twisted.application import service, internet

class Echo(protocol.Protocol):
    def dataReceived(self, data):
        self.transport.write(data)

def main():
    factory = protocol.ServerFactory()
    factory.protocol = Echo
    reactor.listenTCP(8000, factory)
    reactor.run()

if __name__ == '__main__':
    main()
ERROR:
exceptions.TypeError: buildProtocol() takes exactly 2 arguments (3 given)
QUESTION:
How can I get the EchoClient class in the CLIENT to accept parameters and assign instance variables (such as arg in the EchoClient constructor above)? As noted below, it was previously suggested that I override the buildProtocol function, but my attempt at doing so has led me to the above error. I am not really sure where to go from here. I suppose my question can be generalized to: how can I add instance variables to a protocol?
you wrote:
def buildProtocol(self, address):
    proto = protocol.ClientFactory.buildProtocol(self, address, 12)
that is, you are overriding ClientFactory.buildProtocol and calling the parent class with a different signature than it knows how to handle.
Passing data from the factory to the client is only a little tricky. You can provide any __init__ you want to the factory, but twisted creates instances of IProtocol itself. Fortunately, most factories assign themselves to the factory attribute of the protocol, once it's ready to go:
class MyClientProtocol(protocol.Protocol):
    def connectionMade(self):
        # use self.factory here:
        self.transport.write(self.factory.arg)

class MyClientFactory(protocol.ClientFactory):
    protocol = MyClientProtocol

    def __init__(self, arg):
        self.arg = arg
In fact, the whole Protocol/Factory arrangement exists to support this kind of use. But be mindful: many instances of Protocol will share a single instance of their factory, so use the factory for configuration and manage per-connection state in the protocol.
It's certainly possible that the standard family of Protocol/Factory implementations doesn't suit your needs, and that's also reasonable, as long as you fully implement the IProtocol and IProtocolFactory interfaces. The base classes exist because they handle most of the common cases for you, not because they are the only possible implementation.
It's not clear from your question what exactly you tried and what exactly the error was, but in any case you have to do two steps:
Make EchoClient's constructor take whatever arguments you need it to take and initialise whatever fields you need it to initialise.
Override the buildProtocol method in your factory to supply those arguments to your protocol (see the sketch below).
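For step 2, a minimal sketch using the EchoClient constructor from the question (the argument value 12 is illustrative):

class EchoFactory(protocol.ClientFactory):
    def __init__(self, arg):
        self.arg = arg

    def buildProtocol(self, address):
        # Construct the protocol ourselves so we control its constructor
        # arguments, then attach the factory the way Twisted's default does.
        proto = EchoClient(self.arg)
        proto.factory = self
        return proto

def main():
    f = EchoFactory(12)
    reactor.connectTCP("localhost", 8000, f)
    reactor.run()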
I am building an app with wxPython and XML-RPC. I need the window to perform an action every time the XML-RPC server receives a request.
How can I do this without blocking the main window?
I tried threads, but it didn't work; I also tried calling the thread's run method in the Frame's constructor, but that didn't work either.
Sorry for the language
I hope to be clear
Thanks
Here's an example of a threaded XMLRPC server using SimpleXMLRPCServer. Note the wx.CallAfter to call into the wx main thread and the "return 0" (though you can configure the server so that return values of None are OK.)
from SimpleXMLRPCServer import SimpleXMLRPCServer
import threading
import wx

class XMLRPCServerThread(threading.Thread):
    def __init__(self, remoteObject, host='localhost', port=8000):
        self.remoteObject = remoteObject
        self.host = host
        self.port = port
        threading.Thread.__init__(self)

    def stop(self):
        self.server.shutdown()

    def run(self):
        self.server = SimpleXMLRPCServer((self.host, self.port), logRequests=False)
        self.server.register_instance(self.remoteObject)
        self.server.serve_forever()

class MyRemoteCalls(object):
    def __init__(self, obj):
        self.obj = obj

    def exampleCall(self, arg):
        wx.CallAfter(self.obj.method, arg)
        return 0

def getRPCThread(obj, host='localhost', port=8000):
    remoteObj = MyRemoteCalls(obj)
    rpcThread = XMLRPCServerThread(remoteObj, host, port)
    rpcThread.start()
    return rpcThread
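How this might be wired into a frame is sketched below; the frame class and its method name are illustrative and not part of the example above:

import wx

class MyFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="XML-RPC demo")
        # Start the XML-RPC server thread; exampleCall() requests end up
        # invoking self.method on the GUI thread via wx.CallAfter.
        self.rpcThread = getRPCThread(self)
        self.Bind(wx.EVT_CLOSE, self.onClose)

    def method(self, arg):
        # Runs on the wx main thread for each incoming request.
        pass

    def onClose(self, event):
        self.rpcThread.stop()
        self.Destroy()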
I am adding a feature to my current project that will allow network admins to install the software to the network. I need to code a DNS server in Python that will allow me to redirect to a certain page if the request address is in my list. I was able to write the server, just not sure how to redirect.
Thank you. I am using Python 2.6 on Windows XP.
There's a little, simple example here that can easily be adapted to make all kinds of "mini fake DNS servers". Note that absolutely no "redirect" is involved (that's not how DNS works): rather, the request is for a domain name, and the result of that request is an IP address. If what you want to do is drastically different from translating names to addresses, then maybe what you need is not actually a DNS server...?
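That example isn't reproduced here, but the general shape of such a "mini fake DNS server" can be sketched with the dnslib package (the intercepted names, the answer address, and the UDP-only handling are all illustrative assumptions):

import socket
from dnslib import DNSRecord, RR, QTYPE, A

REDIRECT_IP = "192.0.2.10"      # illustrative address to answer with
NAMES = {"example.com."}        # illustrative list of names to intercept

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 53))

while True:
    data, addr = sock.recvfrom(512)
    request = DNSRecord.parse(data)
    reply = request.reply()
    if str(request.q.qname) in NAMES and QTYPE[request.q.qtype] == "A":
        # Answer intercepted names with our own address.
        reply.add_answer(RR(request.q.qname, QTYPE.A, rdata=A(REDIRECT_IP), ttl=60))
    # Names not in the list get an empty (no-answer) reply here; a real
    # server would forward them to an upstream resolver instead.
    sock.sendto(reply.pack(), addr)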
Using circuits and dnslib, here's a full recursive DNS server written in Python in only 143 lines of code:
#!/usr/bin/env python
from __future__ import print_function

from uuid import uuid4 as uuid

from dnslib import CLASS, QR, QTYPE
from dnslib import DNSHeader, DNSQuestion, DNSRecord

from circuits.net.events import write
from circuits import Component, Debugger, Event
from circuits.net.sockets import UDPClient, UDPServer


class lookup(Event):
    """lookup Event"""


class query(Event):
    """query Event"""


class response(Event):
    """response Event"""


class DNS(Component):
    def read(self, peer, data):
        record = DNSRecord.parse(data)
        if record.header.qr == QR["QUERY"]:
            return self.fire(query(peer, record))
        return self.fire(response(peer, record))


class ReturnResponse(Component):
    def response(self, peer, response):
        return response


class Client(Component):
    channel = "client"

    def init(self, server, port, channel=channel):
        self.server = server
        self.port = int(port)

        self.transport = UDPClient(0, channel=self.channel).register(self)
        self.protocol = DNS(channel=self.channel).register(self)
        self.handler = ReturnResponse(channel=self.channel).register(self)


class Resolver(Component):
    def init(self, server, port):
        self.server = server
        self.port = port

    def lookup(self, qname, qclass="IN", qtype="A"):
        channel = uuid()

        client = Client(
            self.server,
            self.port,
            channel=channel
        ).register(self)

        yield self.wait("ready", channel)

        self.fire(
            write(
                (self.server, self.port),
                DNSRecord(
                    q=DNSQuestion(
                        qname,
                        qclass=CLASS[qclass],
                        qtype=QTYPE[qtype]
                    )
                ).pack()
            )
        )

        yield (yield self.wait("response", channel))

        client.unregister()
        yield self.wait("unregistered", channel)
        del client


class ProcessQuery(Component):
    def query(self, peer, query):
        qname = query.q.qname
        qtype = QTYPE[query.q.qtype]
        qclass = CLASS[query.q.qclass]

        response = yield self.call(lookup(qname, qclass=qclass, qtype=qtype))

        record = DNSRecord(
            DNSHeader(id=query.header.id, qr=1, aa=1, ra=1),
            q=query.q,
        )

        for rr in response.value.rr:
            record.add_answer(rr)

        yield record.pack()


class Server(Component):
    def init(self, bind=("0.0.0.0", 53)):
        self.bind = bind

        self.transport = UDPServer(self.bind).register(self)
        self.protocol = DNS().register(self)
        self.handler = ProcessQuery().register(self)


class App(Component):
    def init(self, bind=("0.0.0.0", 53), server="8.8.8.8", port=53,
             verbose=False):
        if verbose:
            Debugger().register(self)

        self.resolver = Resolver(server, port).register(self)
        self.server = Server(bind).register(self)


def main():
    App().run()


if __name__ == "__main__":
    main()
Usage:
By default this example binds to 0.0.0.0:53, so you will need to do something like:
sudo ./dnsserver.py
Otherwise change the bind parameter.
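For example, to run it on an unprivileged port instead, the App could be started like this (the port number is illustrative):

# instead of App().run() in main():
App(bind=("0.0.0.0", 10053)).run()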
Here is a DNS server/proxy written in Python that works for me:
http://thesprawl.org/projects/dnschef/
I wrote a DNS server using the Python Twisted library for NameOcean.net. You can see examples at https://twistedmatrix.com/documents/16.5.0/names/howto/custom-server.html.