By default SimpleHTTPServer sends its own headers.
I've been trying to figure out how to send my own headers and found this solution. I tried adapting it to my (very) simple proxy:
    import SocketServer
    import SimpleHTTPServer
    import urllib

    PORT = 8000

    class Proxy(SimpleHTTPServer.SimpleHTTPRequestHandler):
        headers = ['Date: Wed, 29 Oct 2014 15:54:43 GMT', 'Server: Apache',
                   'Accept-Ranges: bytes', 'X-Mod-Pagespeed: 1.6.29.7-3566',
                   'Vary: Accept-Encoding', 'Cache-Control: max-age=0, no-cache',
                   'Content-Length: 204', 'Connection: close',
                   'Content-Type: text/html']

        def end_headers(self):
            print "Setting custom headers"
            self.custom_headers()
            SimpleHTTPServer.SimpleHTTPRequestHandler.end_headers(self)

        def custom_headers(self):
            for i in self.headers:
                key, value = i.split(":", 1)
                self.send_header(key, value)

        def do_GET(self):
            self.copyfile(urllib.urlopen(self.path), self.wfile)

    httpd = SocketServer.ForkingTCPServer(('', PORT), Proxy)
    httpd.serve_forever()
But end_headers() doesn't set the custom headers (confirmed in Wireshark).
Given a list of headers like the one in my little snippet, how can I overwrite SimpleHTTPServer's default headers and serve my own?
I think you're missing something in do_GET().
SimpleHTTPServer also calls
self.send_response(200)
See the following code, or better, read the source of the SimpleHTTPServer module:
    self.send_response(200)
    self.send_header("Content-type", ctype)
    fs = os.fstat(f.fileno())
    self.send_header("Content-Length", str(fs[6]))
    self.send_header("Last-Modified", self.date_time_string(fs.st_mtime))
    self.end_headers()
I think you should override the send_head() method for what you want to do, and read the source of SimpleHTTPServer.
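For illustration, here is a minimal Python 3 sketch of sending custom headers in the right order: status line first, then the custom headers, then end_headers(). The header names and values are placeholders, not the ones from the question. Note also that storing the list in a class attribute called `headers` would be shadowed, because the base class overwrites self.headers with the parsed *request* headers - one reason the snippet in the question never sees its own list.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Placeholder headers for illustration. Don't call the attribute
# `headers` -- the base class replaces self.headers with the parsed
# *request* headers when the request comes in.
CUSTOM_HEADERS = [
    ('Cache-Control', 'max-age=0, no-cache'),
    ('X-Custom-Header', 'example'),
]

class CustomHeaderHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b'<html><body>ok</body></html>'
        self.send_response(200)              # status line first
        for key, value in CUSTOM_HEADERS:    # then the custom headers
            self.send_header(key, value)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()                   # blank line ends the headers
        self.wfile.write(body)

# Quick in-process check on an ephemeral port
httpd = HTTPServer(('127.0.0.1', 0), CustomHeaderHandler)
threading.Thread(target=httpd.serve_forever, daemon=True).start()
resp = urllib.request.urlopen('http://127.0.0.1:%d/' % httpd.server_address[1])
content = resp.read()
print(resp.headers['X-Custom-Header'])
httpd.shutdown()
```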
Related
I have seen code like this that shows how to use a proxy for Python requests:
    import requests

    proxies = {
        'http': 'http://localhost:7777',
        'https': 'http://localhost:7777',
    }

    requests.get('http://example.org', proxies=proxies)
    requests.get('https://example.org', proxies=proxies)
But I am wondering: how can we make a very simple proxy server in Python that would be able to respond to GET requests?
You can find many examples of how to do it - even in questions on Stack Overflow.
Some of them use the standard module socket (but it doesn't look simple).
Others use the standard module http, but they show code for Python 2, which used different module names.
Version for Python 3
    import http.server
    import socketserver
    import urllib.request

    class MyProxy(http.server.SimpleHTTPRequestHandler):
        def do_GET(self):
            print(self.path)
            url = self.path
            self.send_response(200)
            self.end_headers()
            self.copyfile(urllib.request.urlopen(url), self.wfile)

    # --- main ---

    PORT = 7777
    httpd = None
    try:
        socketserver.TCPServer.allow_reuse_address = True  # solution for `OSError: [Errno 98] Address already in use`
        httpd = socketserver.TCPServer(('', PORT), MyProxy)
        print(f"Proxy at: http://localhost:{PORT}")
        httpd.serve_forever()
    except KeyboardInterrupt:
        print("Pressed Ctrl+C")
    finally:
        if httpd:
            httpd.shutdown()
            #httpd.socket.close()
Test it using the page httpbin.org:
    import requests

    proxies = {
        'http': 'http://localhost:7777',
        'https': 'http://localhost:7777',
    }

    response = requests.get('http://httpbin.org/get', proxies=proxies)
    print(response.text)

    response = requests.get('http://httpbin.org/get?arg1=hello&arg2=world', proxies=proxies)
    print(response.text)
But it works only for HTTP.
For HTTPS it may need to wrap the socket using the module ssl.
And it works only with GET.
For POST, PUT, DELETE, etc. it would need do_POST, do_PUT, do_DELETE, etc., each with different code.
EDIT:
    def do_POST(self):
        url = self.path

        # - post data -
        content_length = int(self.headers.get('Content-Length', 0))  # <--- size of data
        if content_length:
            content = self.rfile.read(content_length)  # <--- data itself
        else:
            content = None

        req = urllib.request.Request(url, method="POST", data=content)
        output = urllib.request.urlopen(req)

        # ---

        self.send_response(200)
        self.end_headers()
        self.copyfile(output, self.wfile)
But if you need a local proxy only to test your code, then you could use:
Python module/program mitmproxy (Man-In-The-Middle Proxy)
not Python, not free (but works 30 days for free), with a nice GUI: Charles Proxy
more complex: OWASP ZAP, Burp Suite (Community Edition)
I'm trying out some PHP on my PC and made a little Python server to host the files. One problem:
it can't do POST - I always get error 501. I've heard that you can implement POST in these servers, but I didn't find out how to do this. Can someone help?
Here's my current server:
    import http.server
    import socketserver

    PORT = 8080
    Handler = http.server.SimpleHTTPRequestHandler

    with socketserver.TCPServer(("", PORT), Handler) as httpd:
        print("serving at port", PORT)
        httpd.serve_forever()
This is the script I personally use for when I need this kind of functionality:
    #!/usr/bin/env python3
    import os
    import logging

    try:
        import http.server as server
    except ImportError:
        # Handle Python 2.x
        import SimpleHTTPServer as server

    class HTTPRequestHandler(server.SimpleHTTPRequestHandler):
        """
        SimpleHTTPServer with the added bonus of:
        - handling PUT requests
        - logging headers in GET requests
        """
        def do_GET(self):
            server.SimpleHTTPRequestHandler.do_GET(self)
            logging.warning(self.headers)

        def do_PUT(self):
            """Save a file following a HTTP PUT request"""
            filename = os.path.basename(self.path)

            # Don't overwrite files
            if os.path.exists(filename):
                self.send_response(409, 'Conflict')
                self.end_headers()
                reply_body = '"%s" already exists\n' % filename
                self.wfile.write(reply_body.encode('utf-8'))
                return

            file_length = int(self.headers['Content-Length'])
            with open(filename, 'wb') as output_file:
                output_file.write(self.rfile.read(file_length))
            self.send_response(201, 'Created')
            self.end_headers()
            reply_body = 'Saved "%s"\n' % filename
            self.wfile.write(reply_body.encode('utf-8'))

    if __name__ == '__main__':
        server.test(HandlerClass=HTTPRequestHandler)
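To exercise the PUT path end-to-end without touching real files, here is a self-contained sketch: it re-implements the same save-on-PUT idea in compact form, binds to an ephemeral port, and writes into a temporary directory. The class name, port handling, and filename are illustrative.

```python
import os
import tempfile
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

class PutHandler(SimpleHTTPRequestHandler):
    """Compact version of the save-on-PUT idea from the script above."""
    def do_PUT(self):
        filename = os.path.basename(self.path)
        if os.path.exists(filename):            # don't overwrite files
            self.send_response(409, 'Conflict')
            self.end_headers()
            return
        length = int(self.headers['Content-Length'])
        with open(filename, 'wb') as f:
            f.write(self.rfile.read(length))
        self.send_response(201, 'Created')
        self.end_headers()

os.chdir(tempfile.mkdtemp())                    # write into a scratch dir
httpd = HTTPServer(('127.0.0.1', 0), PutHandler)
threading.Thread(target=httpd.serve_forever, daemon=True).start()

req = urllib.request.Request(
    'http://127.0.0.1:%d/upload.txt' % httpd.server_address[1],
    data=b'hello', method='PUT')
status = urllib.request.urlopen(req).status
print(status)                                   # 201 the first time
httpd.shutdown()
```

A second PUT to the same path would get 409 Conflict, mirroring the don't-overwrite check above.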
But perhaps a more fitting and simpler script would be the following, as found on Flavio Copes' blog:
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class handler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header('Content-type', 'text/html')
            self.end_headers()
            message = "Hello, World! Here is a GET response"
            self.wfile.write(bytes(message, "utf8"))

        def do_POST(self):
            self.send_response(200)
            self.send_header('Content-type', 'text/html')
            self.end_headers()
            message = "Hello, World! Here is a POST response"
            self.wfile.write(bytes(message, "utf8"))

    with HTTPServer(('', 8000), handler) as server:
        server.serve_forever()
I'm trying to create a simple HTTP server that will receive POST messages and provide a simple response. I'm using the standard HTTPServer with Python. The client connects using a Session(), which should use a persistent connection, but after each POST I see the message below in the debug log showing that the connection is dropping.
    INFO:urllib3.connectionpool:Resetting dropped connection:
    DEBUG:urllib3.connectionpool:"GET / HTTP/1.1" 200 None
The client works properly when I try it with Apache, so I believe the issue is in my simple server configuration. How can I configure the simple HTTP server to work with persistent connections?
Simple Server Python Code:
    from http.server import HTTPServer, BaseHTTPRequestHandler
    from io import BytesIO
    import time
    import datetime
    import logging

    class SimpleHTTPRequestHandler(BaseHTTPRequestHandler):
        def _set_response(self):
            self.send_response(200)
            self.send_header('Content-type', 'text/html')
            self.send_header("Connection", "keep-alive")
            self.send_header("keep-alive", "timeout=5, max=30")
            self.end_headers()

        def do_GET(self):
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b'Hello, world!')

        def do_POST(self):
            content_length = int(self.headers['Content-Length'])
            body = self.rfile.read(content_length)
            curr_time = datetime.datetime.now()
            data = ('{"msgid":"0x0002", "timestamp": "' + str(curr_time) + '", "message":"Test http response from Raspberry Pi HTTP server"}').encode()
            self.send_response(200)
            self.end_headers()
            response = BytesIO()
            #response.write(b'This is POST request. ')
            #response.write(b'Received: ')
            response.write(data)
            self.wfile.write(response.getvalue())

    print("Simple HTTP Server running...")
    logging.basicConfig(level=logging.DEBUG)
    httpd = HTTPServer(('', 8000), SimpleHTTPRequestHandler)
    httpd.serve_forever()
Client Python code:
    #!/usr/bin/env python
    # Using same TCP connection for all HTTP requests
    import os
    import json
    import time
    import datetime
    import logging
    import requests
    from requests.auth import HTTPBasicAuth

    logging.basicConfig(level=logging.DEBUG)
    start_time = time.time()

    def get_data(limit):
        session = requests.Session()
        url = "http://localhost:8000"
        for i in range(10):
            curr_time = datetime.datetime.now()
            data = '{"msgid":"0x0001", "timestamp": "' + str(curr_time) + '", "message":"Test http message from Raspberry Pi"}'
            print("Sending Data: " + data)
            response = session.post(url.format(limit), data)
            #response_dict = json.loads(response.text)
            print("Received Data: " + response.text)

    if __name__ == "__main__":
        limit = 1
        get_data(limit)
        print("--- %s seconds ---" % (time.time() - start_time))
You aren't actually setting the Connection header in your POST handler. For persistent connections to work, you'll also need to set the Content-Length header in the response so that the client knows how many bytes of the HTTP body to read before reusing the connection.
Try this POST handler, adapted from your code:
    def do_POST(self):
        content_length = int(self.headers['Content-Length'])
        body = self.rfile.read(content_length)

        # Process the request here and generate the entire response
        response_data = b'{"stuff": 1234}'

        # Send the response
        self.send_response(200)
        self.send_header("Connection", "keep-alive")
        self.send_header("Content-Length", str(len(response_data)))
        self.end_headers()

        # Write _exactly_ the number of bytes specified by the
        # 'Content-Length' header
        self.wfile.write(response_data)
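One detail worth making explicit: BaseHTTPRequestHandler defaults to protocol_version = 'HTTP/1.0', under which the server closes the connection after every response regardless of the headers you send. Setting it to 'HTTP/1.1' on the handler class is what actually enables keep-alive. A self-contained sketch (ephemeral port, in-process client; the JSON payload is just an example):

```python
import http.client
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler

class KeepAliveHandler(BaseHTTPRequestHandler):
    # BaseHTTPRequestHandler defaults to HTTP/1.0, which closes the
    # connection after every response no matter what headers you send.
    protocol_version = 'HTTP/1.1'

    def do_POST(self):
        length = int(self.headers['Content-Length'])
        body = self.rfile.read(length)
        response_data = b'{"stuff": 1234}'
        self.send_response(200)
        # Mandatory under HTTP/1.1 keep-alive: the client needs it to
        # know where this body ends and the next response begins.
        self.send_header('Content-Length', str(len(response_data)))
        self.end_headers()
        self.wfile.write(response_data)

httpd = HTTPServer(('127.0.0.1', 0), KeepAliveHandler)
threading.Thread(target=httpd.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection('127.0.0.1', httpd.server_address[1])
conn.request('POST', '/', body=b'one')
first = conn.getresponse().read()
conn.request('POST', '/', body=b'two')   # reuses the same TCP connection
second = conn.getresponse().read()
conn.close()
httpd.shutdown()
```

With HTTP/1.1 the server also sends the Connection handling implicitly, so the explicit keep-alive headers above become optional.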
I am trying to write a basic "echo" HTTP server that writes back the raw data it receives in the request. How can I get the request data as a string?
This is my program:
    #!/usr/bin/env python
    from http.server import HTTPServer, BaseHTTPRequestHandler

    class RequestHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            print('data', self.rfile.readall())
            self.send_response(200)
            self.send_header('Content-Type', 'text/html')
            self.end_headers()
            message = 'Hello Client!'
            self.wfile.write(bytes(message, 'utf8'))
            return

    def server_start():
        address = ('', 1992)
        httpd = HTTPServer(address, RequestHandler)
        httpd.serve_forever()

    server_start()
Error:
    AttributeError: '_io.BufferedReader' object has no attribute 'readall'
If it's a GET request there won't be a body, so the data is going to be sent in the URL:
    from http.server import HTTPServer, BaseHTTPRequestHandler
    from urllib.parse import urlparse  # in Python 2 this was the module urlparse

    class RequestHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            parsed_path = urlparse(self.path)
            print(parsed_path.query)
            ...
Otherwise, you should implement a POST method if you want to send any more complex object as data (take a look at different HTTP methods if you're not familiar with them).
A POST method would be something like:

    def do_POST(self):
        content_length = int(self.headers['Content-Length'])
        post_body = self.rfile.read(content_length)

Note that rfile is a plain buffered reader over the socket and has no readall() method; read exactly Content-Length bytes instead (a bare read() would block waiting for the client to close the connection). Here you have a nice gist with some examples!
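Putting it together, a minimal echo server for the question's use case might look like the following sketch (the handler name is illustrative, and the demo client runs in-process on an ephemeral port rather than the question's port 1992):

```python
import threading
import urllib.request
from http.server import HTTPServer, BaseHTTPRequestHandler

class EchoHandler(BaseHTTPRequestHandler):
    """Write the raw POST body straight back to the client."""
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        body = self.rfile.read(length)       # exactly the request body
        self.send_response(200)
        self.send_header('Content-Type', 'application/octet-stream')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

httpd = HTTPServer(('127.0.0.1', 0), EchoHandler)
threading.Thread(target=httpd.serve_forever, daemon=True).start()

req = urllib.request.Request(
    'http://127.0.0.1:%d/' % httpd.server_address[1], data=b'raw payload')
echoed = urllib.request.urlopen(req).read()  # urllib sends data= as a POST
print(echoed)
httpd.shutdown()
```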
I want something like BaseHTTPRequestHandler, except that I don't want it to bind to any sockets; I want to handle the raw HTTP data to and from it myself. Is there a good way I can do this in Python?
To clarify: I want a class that receives raw TCP data from Python (NOT from a socket), processes it, and returns TCP data as a response (to Python again). So this class will handle the TCP handshaking and will have methods that override what I send on HTTP GET and POST, like do_GET and do_POST. In short, I want something like the server infrastructure that already exists, except that all raw TCP packets are passed in Python rather than through operating-system sockets.
BaseHTTPRequestHandler derives from StreamRequestHandler, which basically reads from the file self.rfile and writes to self.wfile, so you can derive a class from BaseHTTPRequestHandler and supply your own rfile and wfile, e.g.
    import StringIO
    from BaseHTTPServer import BaseHTTPRequestHandler

    class MyHandler(BaseHTTPRequestHandler):

        def __init__(self, inText, outFile):
            self.rfile = StringIO.StringIO(inText)
            self.wfile = outFile
            BaseHTTPRequestHandler.__init__(self, "", "", "")

        def setup(self):
            pass

        def handle(self):
            BaseHTTPRequestHandler.handle(self)

        def finish(self):
            BaseHTTPRequestHandler.finish(self)

        def address_string(self):
            return "dummy_server"

        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-type", "text/html")
            self.end_headers()
            self.wfile.write("<html><head><title>WoW</title></head>")
            self.wfile.write("<body><p>This is a Total Wowness</p>")
            self.wfile.write("</body></html>")

    outFile = StringIO.StringIO()
    handler = MyHandler("GET /wow HTTP/1.1", outFile)
    print ''.join(outFile.buflist)
Output:
    dummy_server - - [15/Dec/2009 19:22:24] "GET /wow HTTP/1.1" 200 -
    HTTP/1.0 200 OK
    Server: BaseHTTP/0.3 Python/2.5.1
    Date: Tue, 15 Dec 2009 13:52:24 GMT
    Content-type: text/html

    <html><head><title>WoW</title></head><body><p>This is a Total Wowness</p></body></html>
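For completeness, here is a sketch of the same trick against Python 3's http.server. It assumes that driving the handler through handle_one_request() directly is acceptable; the class name, dummy peer address, and response body are illustrative. rfile and wfile become in-memory BytesIO objects, so no socket is involved at all.

```python
import io
from http.server import BaseHTTPRequestHandler

class FakeSocketHandler(BaseHTTPRequestHandler):
    """Feed raw HTTP request bytes in, collect raw response bytes out,
    with no socket involved (Python 3 version of the idea above)."""

    def __init__(self, request_bytes):
        self.rfile = io.BytesIO(request_bytes)
        self.wfile = io.BytesIO()
        self.client_address = ('127.0.0.1', 0)   # dummy peer address
        self.handle_one_request()                # parse and dispatch

    def log_message(self, fmt, *args):
        pass                                     # keep stderr quiet

    def do_GET(self):
        body = b'<html><body><p>This is a Total Wowness</p></body></html>'
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

handler = FakeSocketHandler(b'GET /wow HTTP/1.1\r\nHost: dummy\r\n\r\n')
raw_response = handler.wfile.getvalue()
print(raw_response.decode())
```

The status line still says HTTP/1.0 because that is the handler's default protocol_version; set protocol_version = 'HTTP/1.1' on the class if you need otherwise.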