Sniffing HTTP packets using Scapy - Python

I'm using Python 2.7.15, Scapy and scapy-http on Windows.
I want to sniff all the HTTP packets and extract the HTML pages that were sent.
This is the code I'm using:
from scapy.all import *
import scapy_http.http

def printPacket(packet):
    # scapy-http adds an 'HTTP' layer on top of TCP segments it recognizes
    if packet.haslayer('HTTP'):
        print '=' * 50
        packet.show()  # show() prints the packet itself; no need to print its return value

sniff(prn=printPacket)
But for some reason it only captures some of the HTTP packets (when I use the browser I don't see any packets), and I don't see any HTML code in the ones that it does print.

I think that's because some of the traffic is HTTPS (= HTTP + TLS). Your function expects an HTTP application layer, but in HTTPS traffic that layer is encapsulated and encrypted inside TLS, so it is never matched.
To sniff HTTPS, you can use this: https://github.com/tintinweb/scapy-ssl_tls (I haven't tried it yet).
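For the plaintext HTTP you can capture, here is a minimal sketch of how to see the HTML bodies (assuming scapy-http's HTTPResponse layer; the filter and field access are illustrative, and a body can still be split across several TCP segments):
from scapy.all import *
import scapy_http.http as http

# The BPF filter keeps TLS traffic (port 443) out of the callback entirely.
def print_html(packet):
    if packet.haslayer(http.HTTPResponse) and packet.haslayer(Raw):
        print(packet[Raw].load)  # the response body, e.g. the HTML

sniff(filter="tcp port 80", prn=print_html)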

Related

Is it possible to recreate a request from the packets programmatically?

For a script I am making, I need to be able to see the parameters that are sent with a request.
This is possible through Fiddler, but I am trying to automate the process.
Here are some screenshots to start with. As you can see in the first picture of Fiddler, I can see the URL of a request and the parameters sent with that request.
I tried to do some packet sniffing with scapy with the code below to see if I can get a similar result, but what I get is in the second picture. Basically, I can get the source and destination of a packet as IP addresses, but the packets themselves are just bytes.
import time
from scapy.all import AsyncSniffer

def sniffer():
    t = AsyncSniffer(prn=lambda x: x.summary(), count=10)
    t.start()
    time.sleep(8)
    results = t.results
    print(len(results))
    print(results)
    print(results[0])
From my understanding, after we establish a TCP connection, the request is broken down into several IP packets and then sent over to the destination. I would like to be able to replicate the functionality of Fiddler, where I can see the url of the request and then the values of parameters being sent over.
Would it be feasible to recreate the information of a request through only the information gathered from the packets?
Or is this difference because the sniffing is done on Layer 2, and then maybe Fiddler operates on Layer 3/4 before/after the translation into IP packets is done, so it actually sees the content of the original request itself and the result of the combination of packets? If my understanding is wrong, please correct me.
Basically, my question boils down to: "Is there a python module I can use to replicate the features of Fiddler to identify the destination url of a request and the parameters sent along with that request?"
The sniffed traffic is HTTPS traffic - therefore just by sniffing you won't see any details on the HTTP request/response because it is encrypted via SSL/TLS.
Fiddler is a proxy with HTTPS interception, which is something totally different from sniffing traffic at the network level. This means that to the client application Fiddler "mimics" the server, and to the server Fiddler mimics the client. This allows Fiddler to decrypt the requests/responses and show them to you.
If you want to perform request interception at the Python level, I would recommend using mitmproxy instead of Fiddler. This proxy can also perform HTTPS interception, and since it is written in Python it is much easier to integrate into your Python environment.
Alternatively, if you just want to see the request/response details of a Python program, it may be easier to set the log level appropriately. See for example this question: Log all requests from the python-requests module
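A minimal sketch of the mitmproxy route, run as an addon script with mitmdump -s show_params.py (the file name is made up; the request hook and the pretty_url/query/urlencoded_form request fields are part of mitmproxy's addon API):
from mitmproxy import http

def request(flow: http.HTTPFlow) -> None:
    # Called for every intercepted request; print the URL and the
    # parameters, similar to what Fiddler's inspector shows.
    print(flow.request.pretty_url)
    print(dict(flow.request.query))                # URL query parameters
    if flow.request.urlencoded_form:
        print(dict(flow.request.urlencoded_form))  # POST form fields
The client then has to be pointed at the proxy (by default http://localhost:8080), and for HTTPS interception it must trust mitmproxy's CA certificate.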

Python - dig ANY equivalent with scapy module

I want to use the Python module Scapy to perform the equivalent of this command:
dig ANY google.com @8.8.4.4 +notcp
I've made a simple code example:
from scapy.all import *
a = sr(IP(dst="8.8.4.4")/UDP(sport=RandShort(),dport=53)/DNS(qd=DNSQR(qname="google.com",qtype="ALL",qclass="IN")))
print str(a[0])
It sends and receives a packet, but when I sniffed the traffic, the response said "Server failure".
Wireshark Screenshot - scapy
Wireshark Screenshot - dig
Sniffing the dig command itself looks nearly the same, but dig gets a correct response, and it also does not send another ICMP Destination unreachable packet; that only comes up when sending the query with Scapy.
If you need more information, feel free to ask.
Maybe someone can help me with this..
EDIT:
Maybe the ICMP Destination unreachable packet was sent because 8.8.4.4 tries to send the response to my sport, which is closed? But why does dig work, then?!
Got the Python code working with Scapy:
srp(Ether()
    / IP(src="192.168.1.101", dst="8.8.8.8")
    / UDP(sport=RandShort(), dport=53)
    / DNS(rd=1,
          qd=DNSQR(qname="google.com", qtype="ALL", qclass="IN"),
          ar=DNSRROPT(rclass=3000)),  # EDNS0 OPT record; rclass holds the advertised UDP payload size
    timeout=1, verbose=0)
In Wireshark we can see now a correct response:
Wireshark Screenshot
But I'm still getting the ICMP Destination unreachable packet, and I don't know why.
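Two things seem to be going on here. The working version adds an EDNS0 OPT record (the DNSRROPT above); dig includes one by default, which appears to be why dig got a proper answer while the plain Scapy query got Server failure. The ICMP Destination unreachable, on the other hand, is expected with Scapy: it sends over a raw socket, bypassing the kernel's UDP stack, so when the DNS reply arrives at the ephemeral source port no socket is bound there and the kernel answers with ICMP port unreachable. dig owns a real UDP socket, so it never triggers this. A hedged workaround sketch (the port number is an arbitrary choice): bind a throwaway UDP socket to the source port so the kernel accepts the reply quietly, while Scapy still captures it off the wire:
import socket
from scapy.all import *

# With a real socket bound to the sport, the kernel delivers the DNS reply
# to it instead of answering with ICMP port-unreachable. sr() still sees
# the reply because it sniffs at the IP layer.
sport = 40053
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", sport))

ans, unans = sr(IP(dst="8.8.8.8")
                / UDP(sport=sport, dport=53)
                / DNS(rd=1,
                      qd=DNSQR(qname="google.com", qtype="ALL", qclass="IN"),
                      ar=DNSRROPT(rclass=3000)),
                timeout=2, verbose=0)
sock.close()
ans[0][1][DNS].show()  # the decoded DNS answer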

Python Scapy - Loading HTTP from a file

I'm using this extension for Scapy to detect and analyze HTTP packets. It works great, but when I save the HTTP packets to a pcap file with wrpcap and then load them with rdpcap, I don't get the same packets back: the HTTP layer is detected, but not the HTTP requests. The same thing happens when I do this:
from scapy.all import *
from scapy_http.http import *

packets = sniff(count=10, lfilter=lambda p: HTTPRequest in p)
wrpcap('file.pcap', packets)
restored = rdpcap('file.pcap')
print len([x for x in restored if HTTPRequest in x])  # prints 0
Why is this happening? How can I recover the packets?
I am very new to Python in general and Scapy in particular, but is this what you are looking for?
from scapy.all import *

def http_request(pkt):
    if pkt.haslayer('HTTPRequest'):  # use 'HTTPResponse' for response packets
        pkt.show()
        exit(0)  # omit to show more than the first packet

pkts = rdpcap('/root/Desktop/example_network_traffic.pcap')
for p in pkts:
    http_request(p)

# For sniffing packets:
# sniff(prn=http_request)
I think the problem may be the way Scapy exports packets. When I run your code and inspect the packet in Wireshark, the protocol is listed as TCP. When I use Wireshark to capture the same type of packet, it lists the protocol as HTTP. If I export the packet from Wireshark and read it using rdpcap, I get the results you are looking for, i.e. the HTTPRequest/HTTPResponse layers. I don't know this for a fact, but I checked the Berkeley Packet Filter syntax, and it doesn't list HTTP as a protocol. If Scapy is based on BPF syntax and doesn't use the HTTP protocol, maybe it exports the packet with a protocol of TCP and scapy-http just parses the Raw load during sniff(). Just a guess.
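A hedged workaround along those lines: after rdpcap, push each packet's undissected TCP payload back through scapy-http's HTTP dissector by hand (assuming, as guessed above, that the pcap preserved the packets only down to the Raw layer):
from scapy.all import *
from scapy_http.http import HTTP, HTTPRequest

restored = rdpcap('file.pcap')

count = 0
for pkt in restored:
    if pkt.haslayer(Raw):
        # Re-dissect the raw TCP payload as HTTP.
        http_layer = HTTP(pkt[Raw].load)
        if http_layer.haslayer(HTTPRequest):
            count += 1
print count  # should count the requests again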

Get requests from website

I'm trying to intercept all the requests made when a website loads, to get a certain file, like what you see in Firefox's network monitor. Can I do that in Python? Sorry for being so vague. I'd like to get all the URLs that the website requests, as you can see in the picture: the favicon, JS files, XML files, etc.
So you probably need a packet sniffer like tcpdump. The best Python sniffer I know is Scapy. Here is an example of how HTTP may be sniffed with it:
http://www.r00tsec.com/2013/12/simple-sniffer-http-request-and-http.html
Note that this trick won't work with HTTPS. Also, packet sniffing usually requires root privileges on the host system.
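In case that link goes away, here is a minimal sketch of the idea using the scapy-http add-on (the Host and Path fields belong to its HTTPRequest layer; the port-80 filter assumes plaintext HTTP traffic):
from scapy.all import *
import scapy_http.http as http

# Print the URL of every plaintext HTTP request on the wire, roughly
# the request list Firefox's network monitor shows.
def show_url(pkt):
    if pkt.haslayer(http.HTTPRequest):
        req = pkt[http.HTTPRequest]
        print(req.Host + req.Path)

sniff(filter="tcp port 80", prn=show_url)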

Send packet to webserver?

I'm just getting into Python, and I don't know how to send packets.
Can anyone tell me how I can send packets to a web server? I would also like to choose the size of the packet myself.
I'm using Python 2.7.
You can use the httplib module to easily create HTTP requests. Or, if you want a bare-bones approach, you can use the socket module to manually create a TCP connection and send whatever bytes you like (see the sketch after the httplib example below).
from httplib import HTTPConnection

conn = HTTPConnection("example.com")  # placeholder host
...  # open the request first, e.g. with conn.putrequest(...)
conn.putheader("Content-Length", "512")
conn.endheaders()
conn.send(data)
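And a hedged sketch of the bare socket route (host and body size are placeholders). Note that TCP, not your code, ultimately decides how the bytes are segmented into packets on the wire; to control individual packet sizes you would need raw sockets or a tool like Scapy:
import socket

# Build a raw HTTP request by hand; the body is padded to the size we pick.
body = "x" * 512
request = ("POST / HTTP/1.1\r\n"
           "Host: example.com\r\n"
           "Content-Length: %d\r\n"
           "Connection: close\r\n"
           "\r\n" % len(body)) + body

s = socket.create_connection(("example.com", 80))
s.sendall(request)
print s.recv(4096)  # first chunk of the server's response
s.close()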
multipart/form-data example:
http://www.voidspace.org.uk/python/cgi.shtml#upload upload_test.py
