How can I modify this code so that it sends two separate requests? The requests would be in this order:
Request1 :
HEAD http://google.com
Host: google.com
... wait for reply from google server ...
Request2 :
GET http://yahoo.com HTTP/1.1
User-Agent: mozilla
Accept: */*
... second request is whatever the browser sent, while the first request is static (the same) for all requests ...
The code I’m trying to modify is:
from twisted.web import proxy, http

class SnifferProxy(proxy.Proxy):
    def allContentReceived(self):
        print "Received data..."
        print "method = %s" % self._command
        print "action = %s" % self._path
        print "ended content manipulation\n\n"
        return proxy.Proxy.allContentReceived(self)

class ProxyFactory(http.HTTPFactory):
    protocol = SnifferProxy

if __name__ == "__main__":
    from twisted.internet import reactor
    reactor.listenTCP(8080, ProxyFactory())
    reactor.run()
The Twisted proxy would be connecting to another external proxy.
Any help is appreciated.
I think you can get what you want by adding the call to the Proxy.allContentReceived method as a callback to a HEAD request using Agent.
from twisted.internet import reactor
from twisted.web import proxy, http
from twisted.web.client import Agent
from twisted.web.http_headers import Headers

agent = Agent(reactor)

class SnifferProxy(proxy.Proxy):
    def allContentReceived(self):
        def cbHead(result):
            print "got response for HEAD"
        def doProxiedRequest(result):
            proxy.Proxy.allContentReceived(self)
        # I assumed self._path, but it looks like the OP wants to do the
        # HEAD request to the same path always
        PATH = "http://foo.bar"
        d = agent.request(
            'HEAD', PATH, Headers({'User-Agent': ['twisted']}), None)
        d.addCallback(cbHead)
        d.addCallback(doProxiedRequest)
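To actually run this, the factory and reactor setup from the question applies unchanged:

class ProxyFactory(http.HTTPFactory):
    protocol = SnifferProxy

if __name__ == "__main__":
    reactor.listenTCP(8080, ProxyFactory())
    reactor.run()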
I am trying to implement a client and server using the tornado_http2 API in Python, but the server never receives messages from the client.
I have checked that the server starts correctly with this command, and I got this result:
(mmsx-TPjM8MGB-py3.9) xx#ITLP071: 7 (master) ~/dev/mmsx/tornado_http2/demo$ proxy=127.0.0.1:8443; curl --http2-prior-knowledge -d "bla bla" -X POST https://localhost:8443/ -E test.crt
curl: (60) SSL certificate problem: self signed certificate
More details here: https://curl.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.
And from the server output:
(mmsx-TPjM8MGB-py3.9) xx#ITLP071: 130 (master) ~/dev/mmsx/tornado_http2/demo$ poetry run python server_test.py
[I 220722 04:02:37 server_test:30] starting
[W 220722 04:02:41 iostream:1517] SSL Error on 7 ('127.0.0.1', 60040): [SSL: TLSV1_ALERT_UNKNOWN_CA] tlsv1 alert unknown ca (_ssl.c:1123)
The connection is not fully established (curl rejects the self-signed certificate, which I have not managed to resolve yet), but at least the server reacts.
To the request from the client, however, I get no response at all.
Please find my server code below:
import logging
import os
import ssl

from tornado.ioloop import IOLoop
from tornado.options import parse_command_line
from tornado.web import Application, RequestHandler
from tornado_http2.server import Server

class MainHandler(RequestHandler):
    def get(self):
        self.write("Hello world")

    def post(self):
        self.write("bla bla")

def main():
    parse_command_line()
    ssl_ctx = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
    ssl_ctx.load_cert_chain(
        os.path.join(os.path.dirname(__file__), 'test.crt'),
        os.path.join(os.path.dirname(__file__), 'test.key'))
    app = Application([('/hello', MainHandler)], debug=True)
    server = Server(app, ssl_options=ssl_ctx)
    port = 8443
    address = "127.0.0.1"
    server.listen(port, address)
    logging.info("starting")
    IOLoop.instance().start()

if __name__ == '__main__':
    main()
And my client code:
from tornado_http2.curl import CurlAsyncHTTP2Client as HTTP2Client
import asyncio

URI = "http:127.0.0.1:8443/hello"

class Test():
    def __init__(self):
        self.__client = HTTP2Client(force_instance=True)

    async def send(self):
        global URI
        body = "body"
        response = await self.__client.fetch(URI, method='POST', body=body,
                                             validate_cert=False)
        print(response)

def main():
    asyncio.run(Test().send())

if __name__ == "__main__":
    main()
I started the server in one terminal and the client in another; the client console should display the result of the request.
Thanks for your help!
OK, I have found it.
It is a bug in the tornado_http2 API: the event loop has to be created before the HTTP2Client class is instantiated, otherwise it does not work.
If the client code is replaced with the following, it works:
from tornado_http2.curl import CurlAsyncHTTP2Client as HTTP2Client
import asyncio
from tornado.httpclient import AsyncHTTPClient
from tornado.ioloop import IOLoop

class Test():
    def __init__(self):
        self.__client = HTTP2Client(force_instance=True)

    async def send(self):
        uri = "https://127.0.0.1:8443/hello"
        response = await self.__client.fetch(uri, validate_cert=False)
        print(response.body.decode('utf-8'))

def run_asyncio():
    # Create and install the event loop *before* Test() instantiates
    # HTTP2Client, instead of letting asyncio.run() create it afterwards.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        return loop.run_until_complete(Test().send())
    finally:
        loop.close()
        asyncio.set_event_loop(None)

def main():
    run_asyncio()

if __name__ == "__main__":
    main()
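With the server from the question running, this client now issues a plain GET to /hello (no method or body arguments), so it should print the Hello world response from MainHandler.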
Hopefully it will help someone =).
I have seen code like this that shows how to use a proxy with the Python requests library.
import requests

proxies = {
    'http': 'http://localhost:7777',
    'https': 'http://localhost:7777',
}

requests.get('http://example.org', proxies=proxies)
requests.get('https://example.org', proxies=proxies)
But I am wondering: how can we make a very simple proxy server in Python that is able to respond to GET requests?
You can find many examples of how to do it, even in questions on Stack Overflow.
Some of them use the standard module socket (but it doesn't look simple).
Others use the standard module http, but they show code for Python 2, which used different names.
Version for Python 3
import http.server
import socketserver
import urllib.request

class MyProxy(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        print(self.path)
        url = self.path
        self.send_response(200)
        self.end_headers()
        self.copyfile(urllib.request.urlopen(url), self.wfile)

# --- main ---

PORT = 7777
httpd = None
try:
    socketserver.TCPServer.allow_reuse_address = True  # solution for `OSError: [Errno 98] Address already in use`
    httpd = socketserver.TCPServer(('', PORT), MyProxy)
    print(f"Proxy at: http://localhost:{PORT}")
    httpd.serve_forever()
except KeyboardInterrupt:
    print("Pressed Ctrl+C")
finally:
    if httpd:
        httpd.shutdown()
        #httpd.socket.close()
Test it using the page httpbin.org:
import requests

proxies = {
    'http': 'http://localhost:7777',
    'https': 'http://localhost:7777',
}

response = requests.get('http://httpbin.org/get', proxies=proxies)
print(response.text)

response = requests.get('http://httpbin.org/get?arg1=hello&arg2=world', proxies=proxies)
print(response.text)
But it works only for HTTP.
For HTTPS, the browser instead sends a CONNECT request and expects the proxy to tunnel the encrypted bytes (see the sketch after the EDIT below).
And it works only with GET.
For POST, PUT, DELETE, etc., it would need do_POST, do_PUT, do_DELETE, etc., with different code.
EDIT:
def do_POST(self):
    url = self.path
    # - post data -
    content_length = int(self.headers.get('Content-Length', 0))  # <--- size of data
    if content_length:
        content = self.rfile.read(content_length)  # <--- data itself
    else:
        content = None
    req = urllib.request.Request(url, method="POST", data=content)
    output = urllib.request.urlopen(req)
    # ---
    self.send_response(200)
    self.end_headers()
    self.copyfile(output, self.wfile)
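And for HTTPS, here is a rough, untested sketch of what a do_CONNECT method for MyProxy could look like. It needs import select and import socket at the top of the file, and it simply relays the already-encrypted bytes in both directions, so nothing is decrypted on the proxy:

def do_CONNECT(self):
    # For CONNECT requests, self.path holds "host:port"
    host, _, port = self.path.rpartition(':')
    try:
        upstream = socket.create_connection((host, int(port)))
    except OSError:
        self.send_error(502)
        return
    self.send_response(200, 'Connection Established')
    self.end_headers()
    sockets = [self.connection, upstream]
    while True:
        # Wait until either side has data, then copy it to the other side
        readable, _, errored = select.select(sockets, [], sockets, 60)
        if errored or not readable:
            break
        finished = False
        for s in readable:
            data = s.recv(4096)
            if not data:
                finished = True
                break
            other = upstream if s is self.connection else self.connection
            other.sendall(data)
        if finished:
            break
    upstream.close()
    self.close_connection = True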
But if you need a local proxy only to test your code, then you could use:
- the Python module/program mitmproxy (Man-In-The-Middle Proxy),
- the non-Python, non-free (but free to use for 30 days) Charles Proxy, which has a nice GUI,
- the more complex OWASP ZAP or Burp Suite (Community Edition).
I'm trying to create a simple HTTP server that will receive POST messages and provide a simple response. I'm using the standard HTTPServer with Python. The client connects using a Session(), which should use a persistent connection, but after each POST I see the message below in the debug output indicating that the connection is dropping.
INFO:urllib3.connectionpool:Resetting dropped connection:
DEBUG:urllib3.connectionpool:"GET / HTTP/1.1" 200 None
The client works properly when I try it with Apache, so I believe the issue is in my simple server configuration. How can I configure the simple HTTP server to work with persistent connections?
Simple Server Python Code:
from http.server import HTTPServer, BaseHTTPRequestHandler
from io import BytesIO
import time
import datetime
import logging

class SimpleHTTPRequestHandler(BaseHTTPRequestHandler):
    def _set_response(self):
        self.send_response(200)
        self.send_header('Content-type', 'text/html')
        self.send_header("Connection", "keep-alive")
        self.send_header("keep-alive", "timeout=5, max=30")
        self.end_headers()

    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'Hello, world!')

    def do_POST(self):
        content_length = int(self.headers['Content-Length'])
        body = self.rfile.read(content_length)
        curr_time = datetime.datetime.now()
        data = ('{"msgid":"0x0002", "timestamp": "' + str(curr_time) + '", "message":"Test http response from Raspberry Pi HTTP server"}').encode()
        self.send_response(200)
        self.end_headers()
        response = BytesIO()
        #response.write(b'This is POST request. ')
        #response.write(b'Received: ')
        response.write(data)
        self.wfile.write(response.getvalue())

print("Simple HTTP Server running...")
logging.basicConfig(level=logging.DEBUG)
httpd = HTTPServer(('', 8000), SimpleHTTPRequestHandler)
httpd.serve_forever()
Client Python code:
#!/usr/bin/env python
# Using same TCP connection for all HTTP requests
import os
import json
import time
import datetime
import logging
import requests
from requests.auth import HTTPBasicAuth

logging.basicConfig(level=logging.DEBUG)
start_time = time.time()

def get_data(limit):
    session = requests.Session()
    url = "http://localhost:8000"
    for i in range(10):
        curr_time = datetime.datetime.now()
        data = '{"msgid":"0x0001", "timestamp": "' + str(curr_time) + '", "message":"Test http message from Raspberry Pi"}'
        print("Sending Data: " + data)
        response = session.post(url.format(limit), data)
        #response_dict = json.loads(response.text)
        print("Received Data: " + response.text)

if __name__ == "__main__":
    limit = 1
    get_data(limit)
    print("--- %s seconds ---" % (time.time() - start_time))
You aren't actually setting the Connection header in your POST handler. In order for persistent connections to work, you'll also need to set the Content-Length header in the response so that the client knows how many bytes of the HTTP body to read before reusing the connection.
Try this POST handler, adapted from your code:
def do_POST(self):
    content_length = int(self.headers['Content-Length'])
    body = self.rfile.read(content_length)
    # Process the request here and generate the entire response
    response_data = b'{"stuff": 1234}'
    # Send the response
    self.send_response(200)
    self.send_header("Connection", "keep-alive")
    self.send_header("Content-Length", str(len(response_data)))
    self.end_headers()
    # Write _exactly_ the number of bytes specified by the
    # 'Content-Length' header
    self.wfile.write(response_data)
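One more thing worth checking: BaseHTTPRequestHandler speaks HTTP/1.0 by default, and HTTP/1.0 drops the connection after every response regardless of the Connection header. Setting the handler's protocol_version enables HTTP/1.1 keep-alive (at which point an accurate Content-Length on every response becomes mandatory):

class SimpleHTTPRequestHandler(BaseHTTPRequestHandler):
    # Respond with HTTP/1.1 so clients can keep the connection open;
    # every response must then carry a correct Content-Length.
    protocol_version = "HTTP/1.1"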
How can I access the remote peer IP in this Twisted HTTP client example (from the Twisted docs)?
Working with this example:
from sys import argv
from pprint import pformat

from twisted.internet.task import react
from twisted.web.client import Agent, readBody
from twisted.web.http_headers import Headers

def cbRequest(response):
    #print 'Response version:', response.version
    #print 'Response code:', response.code
    #print 'Response phrase:', response.phrase
    #print 'Response headers:'
    #print pformat(list(response.headers.getAllRawHeaders()))
    poweredby = response.headers.getRawHeaders("X-Powered-By")
    server = response.headers.getRawHeaders("Server")
    print poweredby
    print server
    d = readBody(response)
    d.addCallback(cbBody)
    return d

def cbBody(body):
    print 'Response body:'
    #print body

def main(reactor, url=b"http://www.example.com/"):
    agent = Agent(reactor)
    d = agent.request(
        'GET', url,
        Headers({'User-Agent': ['Twisted Web Client Example']}),
        None)
    d.addCallback(cbRequest)
    return d

react(main, argv[1:])
After searching on the Internet and SO, I found that it can be read from:
self.xmlstream.transport.getHandle().getpeername()
or
self.transport.getPeer()
However, I don't know which class "self" refers to, or where to put it in the example code.
Any help? Tips? Ideas?
Thanks,
It is possible to get the address, though you have to hack through some layers of abstraction and touch a private attribute:
from __future__ import print_function

from twisted.web.client import Agent
from twisted.internet.task import react
from twisted.internet.protocol import Protocol
from twisted.internet.defer import Deferred

class ReadAddress(Protocol):
    def __init__(self):
        self.result = Deferred()

    def connectionMade(self):
        self.result.callback(self.transport._producer.getPeer())

def readAddress(response):
    p = ReadAddress()
    response.deliverBody(p)
    return p.result

@react
def main(reactor):
    a = Agent(reactor)
    d = a.request(b"GET", b"http://www.google.com/")
    d.addCallback(readAddress)
    d.addCallback(print)
    return d
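When run, readAddress fires with the remote endpoint of the connection, so this should print something like an IPv4Address carrying the server's host and port.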
Ideally, there would be a simpler (public!) interface to retrieve information like this. It would be excellent if you could file a feature request in the Twisted tracker.
I was trying to get the HTTP POST request body by using t.p.basic.LineReceiver but failed. My code is listed below:
from twisted.internet import reactor, protocol
from twisted.protocols import basic

class PrintPostBody(basic.LineReceiver):
    def __init__(self):
        self.line_no = 0

    def lineReceived(self, line):
        print '{0}: {1}'.format(str(self.line_no).rjust(3), repr(line))
        self.line_no += 1

    def connectionLost(self, reason):
        print "conn lost"

class PPBFactory(protocol.ServerFactory):
    protocol = PrintPostBody

def main():
    f = PPBFactory()
    reactor.listenTCP(80, f)
    reactor.run()

if __name__ == '__main__':
    main()
But when I made an HTTP POST request to that machine on port 80, only the HTTP request headers were printed out.
Sample output:
0: 'POST / HTTP/1.0'
1: 'Host: ###.##.##.##'
2: 'Referer: http://#.#####.###/?ssid=0&from=0&bd_page_type=1&uid=wiaui_1292470548_2644&pu=sz%40176_229,sz%40176_208'
3: 'Content-Length: 116'
4: 'Origin: http://#.#####.###'
5: 'Content-Type: application/x-www-form-urlencoded'
6: 'Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5'
7: 'User-Agent: Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/534.11 (KHTML, like Gecko) Chrome/9.0.565.0 Safari/534.11'
8: 'Accept-Encoding: gzip,deflate,sdch'
9: 'Accept-Language: en-US,en;q=0.8'
10: 'Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3'
11: 'Via: 1.1 #####.###.###.##:8080 (squid/2.6.STABLE21)'
12: 'X-Forwarded-For: ###.##.###.###'
13: 'Cache-Control: max-age=0'
14: 'Connection: keep-alive'
15: ''
So the connection was not closed here but the POST body was not received either.
I have tested the network condition by running sudo nc -l 80 and it did print out the HTTP POST request body.
So, how could I get the HTTP POST request body using Twisted?
Thank you very much.
I suspect you didn't see the request body printed out because it didn't contain any newlines or end with a newline. So it got into the parse buffer of your PrintPostBody instance and sat there forever, waiting for a newline to indicate that a full line had been received. LineReceiver won't call the lineReceived callback until a full line is received.
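If you did want to stay with LineReceiver, one possible workaround (a sketch, assuming the body really does arrive without a trailing newline) is to switch to raw mode once the blank line that ends the headers arrives:

class PrintPostBody(basic.LineReceiver):
    def __init__(self):
        self.line_no = 0

    def lineReceived(self, line):
        print '{0}: {1}'.format(str(self.line_no).rjust(3), repr(line))
        self.line_no += 1
        if not line:
            # The blank line ends the headers; the body that follows may
            # contain no newline at all, so stop line-buffering.
            self.setRawMode()

    def rawDataReceived(self, data):
        print 'body: {0}'.format(repr(data))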
Instead, you can let Twisted Web do this parsing for you:
from twisted.web.server import Site  # Site is a server factory for HTTP
from twisted.web.resource import Resource
from twisted.internet import reactor

class PrintPostBody(Resource):  # Resources are what Site knows how to deal with
    isLeaf = True  # Disable child lookup

    def render_POST(self, request):  # Define a handler for POST requests
        print request.content.read()  # Get the request body from this file-like object
        return ""  # Define the response body as empty

reactor.listenTCP(80, Site(PrintPostBody()))
reactor.run()
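Running this (as root, since it listens on port 80) and sending a request such as curl --data 'key=value' http://localhost/ should print the raw body key=value on the server console.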