How to set timeout for ProxyAgent? - python

The client.Agent class has a connection timeout argument:
agent = client.Agent(reactor, connectTimeout=timeout, pool=pool)
How can this timeout be set when using client.ProxyAgent?
auth = base64.b64encode("%s:%s" % (username, password))
headers['Proxy-Authorization'] = ["Basic " + auth.strip()]
endpoint = endpoints.TCP4ClientEndpoint(reactor, host, port)
agent = client.ProxyAgent(endpoint, reactor=reactor, pool=pool)

The TCP4ClientEndpoint you pass to ProxyAgent can be initialized with a timeout.
auth = base64.b64encode("%s:%s" % (username, password))
headers['Proxy-Authorization'] = ["Basic " + auth.strip()]
endpoint = endpoints.TCP4ClientEndpoint(reactor, host, port, timeout=yourTimeout)
agent = client.ProxyAgent(endpoint, reactor=reactor, pool=pool)
This assumes you want to set the timeout for connecting to the proxy. If you wanted to set the timeout the proxy itself uses when connecting to the upstream HTTP server, that is not something you can control from the client.
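For completeness, here is a minimal self-contained sketch that combines the two snippets above, in the same Python 2 style; the proxy host, port, and credentials are placeholders, and HTTPConnectionPool is only needed if you want the pool argument shown in the question.
import base64

from twisted.internet import endpoints, reactor
from twisted.web.client import HTTPConnectionPool, ProxyAgent
from twisted.web.http_headers import Headers

host, port = "proxy.example.com", 8080      # placeholder proxy address
username, password = "user", "secret"       # placeholder credentials

pool = HTTPConnectionPool(reactor)
auth = base64.b64encode("%s:%s" % (username, password))
headers = Headers({"Proxy-Authorization": ["Basic " + auth.strip()]})

# timeout= bounds only the TCP connection attempt to the proxy itself.
endpoint = endpoints.TCP4ClientEndpoint(reactor, host, port, timeout=30)
agent = ProxyAgent(endpoint, reactor=reactor, pool=pool)
d = agent.request("GET", "http://example.com/", headers=headers)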

It looks like client.ProxyAgent doesn't have a connectTimeout property:
class ProxyAgent(_AgentBase):
    """
    An HTTP agent able to cross HTTP proxies.

    @ivar _proxyEndpoint: The endpoint used to connect to the proxy.

    @since: 11.1
    """

    def __init__(self, endpoint, reactor=None, pool=None):
        if reactor is None:
            from twisted.internet import reactor
        _AgentBase.__init__(self, reactor, pool)
        self._proxyEndpoint = endpoint

    def request(self, method, uri, headers=None, bodyProducer=None):
        """
        Issue a new request via the configured proxy.
        """
        # Cache *all* connections under the same key, since we are only
        # connecting to a single destination, the proxy:
        key = ("http-proxy", self._proxyEndpoint)

        # To support proxying HTTPS via CONNECT, we will use key
        # ("http-proxy-CONNECT", scheme, host, port), and an endpoint that
        # wraps _proxyEndpoint with an additional callback to do the CONNECT.
        return self._requestWithEndpoint(key, self._proxyEndpoint, method,
                                         _URI.fromBytes(uri), headers,
                                         bodyProducer, uri)
ProxyAgent inherits from the same class Agent does (_AgentBase) and not from Agent itself.

Related

Send message to existing TCP connection using Twisted

I am writing a TCP server to listen for TCP packets containing status information from remote machines. The remote machines keep the TCP connection alive once established.
Here are the salient parts of my code:
#!/usr/bin/python
from twisted.internet import reactor, protocol

class FactoryProcess(protocol.Protocol):
    def dataReceived(self, data):
        # Process received data
        pass

    def send_data(self, message):
        # Reply to message etc
        self.transport.write(message)

factory = protocol.ServerFactory()
factory.protocol = FactoryProcess
reactor.listenTCP(8256, factory)
reactor.run()
The machines can connect and send their data, and I can send acknowledgements back in the send_data block.
So far, so good.
I cannot understand how to asynchronously send data to one of the devices from outside the Protocol code. Clearly, I need to somehow access an instance of the Factory class for the specific connection I wish to use but I cannot see how.
Keep safe and many thanks.
EDIT: After @notorious.no provided a very helpful example, I changed my code to capture the IP addresses, ports, and connection objects of the connected devices:
from twisted.internet import endpoints, protocol, reactor

device_ips = []
device_ports = []
connections = []

class ChatProtocol(protocol.Protocol):
    def connectionMade(self):
        global device_ips, device_ports, connections
        # Append client
        self.factory.clientList.append(self)
        print('client connected. Connection Count = ' + str(len(self.factory.clientList)))
        connections.append(self)
        ip, port = self.transport.client
        device_ips.append(ip)
        device_ports.append(port)
        print('ips:' + str(device_ips) + ', ports:' + str(device_ports) + ', connections:' + str(connections))

    def connectionLost(self, _):
        # Remove client
        self.factory.clientList.remove(self)
        print('client lost. Connection Count = ' + str(len(self.factory.clientList)))

    def dataReceived(self, data):
        print('Data received:' + str(data))
        # Send message to all connected clients
        for client in self.factory.clientList:
            if client == self:
                continue
            client.transport.write(data)

class ChatFactory(protocol.Factory):
    protocol = ChatProtocol
    clientList = []

def main():
    epServer = endpoints.serverFromString(reactor, "tcp:8123")
    epServer.listen(ChatFactory())
    reactor.run()

main()
When I run this and then connect two test devices I get:
client connected. Connection Count = 1
ips:['redacted'], ports:[54182], connections:[<__main__.ChatProtocol instance at 0x7f5a835afcd0>]
client connected. Connection Count = 2
ips:['redacted', 'redacted'], ports:[54182, 57437], connections:[<__main__.ChatProtocol instance at 0x7f5a835afcd0>, <__main__.ChatProtocol instance at 0x7f5a835c2140>]
So now I have lists of connected device IPs and ports, and presumably I can use the connection objects to send a message to one of them asynchronously when needed. Please could you advise how I can do this?
Keep safe...
Not really sure what you mean by "devices from outside the Protocol code", but I assume you mean accessing other clients that have connected to the same server (please comment if that's not the case). One thing you can do is keep a list of connected protocols in the factory object. Factory.buildProtocol (by default, unless you override it) will set the factory attribute on the protocol.
from twisted.internet import endpoints, protocol, reactor

class ChatProtocol(protocol.Protocol):
    def connectionMade(self):
        # Append client
        self.factory.clientList.append(self)
        print(len(self.factory.clientList))

    def connectionLost(self, _):
        # Remove client
        self.factory.clientList.remove(self)
        print(len(self.factory.clientList))

    def dataReceived(self, data):
        # Send message to all connected clients
        for client in self.factory.clientList:
            if client == self:
                continue
            client.transport.write(data)

class ChatFactory(protocol.Factory):
    protocol = ChatProtocol
    clientList = []

def main():
    epServer = endpoints.serverFromString(reactor, "tcp:8256:interface=0.0.0.0")
    epServer.listen(ChatFactory())
    reactor.run()

main()
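To send to one specific device from outside the Protocol code, one option (a sketch, not part of the original answer; the send_to helper and the peer address below are illustrative) is to hold a reference to the ChatFactory instance and look the client up by its peer address before writing to its transport:
from twisted.internet import endpoints, reactor

# Assumes the ChatProtocol/ChatFactory definitions from the answer above.
factory = ChatFactory()

def send_to(ip, port, message):
    # Find the protocol instance for a given peer and write to its transport.
    for client in factory.clientList:
        peer = client.transport.getPeer()
        if (peer.host, peer.port) == (ip, port):
            client.transport.write(message)
            return True
    return False

def main():
    epServer = endpoints.serverFromString(reactor, "tcp:8256")
    epServer.listen(factory)
    # "Outside the Protocol" code: schedule a targeted send 30 seconds from now.
    reactor.callLater(30, send_to, "192.0.2.10", 54182, b"status?")
    reactor.run()

main()
Any code running in the reactor (timed calls, other protocols, or reactor.callFromThread from another thread) can use the same lookup to write to a particular connection.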

Disabling gunicorn certificate validation

I'm creating an application with Flask and I'm using gunicorn as my application server. I enabled verification of the client's certificate, and I would like to know if there is a way to disable validation of the client's certificate for a specific user, or if there is a way to serve two addresses: one that uses HTTPS and another that uses HTTP.
gunicorn configuration
import ssl

bind = "0.0.0.0:8080"
ca_certs = "certs/ca-crt.pem"
certfile = "certs/server-crt.pem"
keyfile = "certs/server-key.pem"
cert_reqs = ssl.CERT_REQUIRED
worker_class = 'proto_worker.CustomSyncWorker'

# proto_worker.py (presumably; the module referenced by worker_class above)
from gunicorn.workers.sync import SyncWorker
import werkzeug.serving
import OpenSSL

class CustomSyncWorker(SyncWorker):
    def handle_request(self, listener, req, client, addr):
        cert = client.getpeercert()
        try:
            key = client.get_password()
        except:
            key = ''
        headers = dict(req.headers)
        #headers['CERT'] = dict(cert)
        headers['CERT'] = str(cert) + str(key)
        req.headers = list(headers.items())
        super(CustomSyncWorker, self).handle_request(listener, req, client, addr)
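One possible direction, sketched here as an assumption rather than taken from the question: make the client certificate optional at the TLS layer and enforce it selectively inside the Flask app, since ssl.CERT_OPTIONAL requests a certificate (and validates it if presented) but still accepts clients that do not send one.
# gunicorn config sketch (assumption): request but do not require a client cert,
# then decide per user/route in the application whether one is mandatory.
import ssl

bind = "0.0.0.0:8080"
ca_certs = "certs/ca-crt.pem"
certfile = "certs/server-crt.pem"
keyfile = "certs/server-key.pem"
cert_reqs = ssl.CERT_OPTIONAL  # instead of ssl.CERT_REQUIRED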

How to send UDP requests via asyncio in python 3

I would like to send UDP messages to my InfluxDB server using asyncio. This is my synchronous code:
import socket

import requests

from logger import get_logger

log = get_logger(__name__)

class InfluxDB(object):
    def __init__(self, username, password, host, port, udp=True, udp_port=4444):
        log.info("InfluxDBClient()")
        self.udp = udp
        if self.udp:
            self.udp_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            self.udp_port_tuple = (host, udp_port)
        self.session = requests.session()
        self.session.auth = (username, password)
        self.base_url = "http://{}:{}".format(host, port)

    def _query(self, payload, endpoint="/query"):
        log.debug(payload)
        return self.session.post(self.base_url + endpoint, params=payload)

    def _write(self, db, data, endpoint="/write"):
        log.debug("{} \t {}".format(db, data))
        return self.session.post(self.base_url + endpoint, params={"db": db}, data=data)

    def write_udp(self, data_str):
        if self.udp:
            return self.udp_socket.sendto(data_str.encode("utf-8"), self.udp_port_tuple)
        else:
            raise Exception("UDP disabled")
How do I rewrite the "write_udp" function using the async/await syntax?
-- edit 7/5/2018 --
To be more specific, I am unsure how to reference the asyncio equivalent of "socket". Presumably there is an asyncio-based socket that I would reference as something like self.udp_socket = socket.socket_asyncio, and then I would fire off messages via await socket.socket_asyncio.sendto(....); what I need to understand is how to reference this asyncio-based socket object.
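One way to do this (a sketch, assuming Python 3.7+ and that opening a fresh datagram endpoint per call is acceptable) is asyncio's create_datagram_endpoint, which returns a transport whose sendto plays the role of the raw socket's sendto:
import asyncio

async def write_udp_async(host, udp_port, data_str):
    loop = asyncio.get_running_loop()
    # Open a UDP "connection" aimed at the InfluxDB host/port.
    transport, _protocol = await loop.create_datagram_endpoint(
        asyncio.DatagramProtocol, remote_addr=(host, udp_port))
    try:
        # With remote_addr set above, no explicit address is needed here.
        transport.sendto(data_str.encode("utf-8"))
    finally:
        transport.close()

# Usage (hypothetical measurement line):
# asyncio.run(write_udp_async("localhost", 4444, "cpu value=0.64"))
For high message rates you would normally create the endpoint once during setup and reuse the transport, rather than opening one per message.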

Python + Twisted + FtpClient + SOCKS

I just started using Twisted. I want to connect to an FTP server and perform some basic operations (use threading if possible). I am using this example.
Which does the job quite well. The question is how to add a SOCKS4/5 proxy usage to the code? Can somebody please provide a working example? I have tried this link too.
But,
# Copyright (c) Twisted Matrix Laboratories.
# See LICENSE for details.
"""
An example of using the FTP client
"""
# Twisted imports
from twisted.protocols.ftp import FTPClient, FTPFileListProtocol
from twisted.internet.protocol import Protocol, ClientCreator
from twisted.python import usage
from twisted.internet import reactor, endpoints

# Socks support test
from socksclient import SOCKSv4ClientProtocol, SOCKSWrapper
from twisted.web import client

# Standard library imports
import string
import sys
try:
    from cStringIO import StringIO
except ImportError:
    from StringIO import StringIO


class BufferingProtocol(Protocol):
    """Simple utility class that holds all data written to it in a buffer."""
    def __init__(self):
        self.buffer = StringIO()

    def dataReceived(self, data):
        self.buffer.write(data)


# Define some callbacks
def success(response):
    print 'Success! Got response:'
    print '---'
    if response is None:
        print None
    else:
        print string.join(response, '\n')
    print '---'

def fail(error):
    print 'Failed. Error was:'
    print error

def showFiles(result, fileListProtocol):
    print 'Processed file listing:'
    for file in fileListProtocol.files:
        print ' %s: %d bytes, %s' \
            % (file['filename'], file['size'], file['date'])
    print 'Total: %d files' % (len(fileListProtocol.files))

def showBuffer(result, bufferProtocol):
    print 'Got data:'
    print bufferProtocol.buffer.getvalue()


class Options(usage.Options):
    optParameters = [['host', 'h', 'example.com'],
                     ['port', 'p', 21],
                     ['username', 'u', 'webmaster'],
                     ['password', None, 'justapass'],
                     ['passive', None, 0],
                     ['debug', 'd', 1],
                    ]

# Socks support
def wrappercb(proxy):
    print "connected to proxy", proxy
    pass

def run():
    def sockswrapper(proxy, url):
        dest = client._parse(url)  # scheme, host, port, path
        endpoint = endpoints.TCP4ClientEndpoint(reactor, dest[1], dest[2])
        return SOCKSWrapper(reactor, proxy[1], proxy[2], endpoint)

    # Get config
    config = Options()
    config.parseOptions()
    config.opts['port'] = int(config.opts['port'])
    config.opts['passive'] = int(config.opts['passive'])
    config.opts['debug'] = int(config.opts['debug'])

    # Create the client
    FTPClient.debug = config.opts['debug']
    creator = ClientCreator(reactor, FTPClient, config.opts['username'],
                            config.opts['password'], passive=config.opts['passive'])
    #creator.connectTCP(config.opts['host'], config.opts['port']).addCallback(connectionMade).addErrback(connectionFailed)

    # Socks support
    proxy = (None, '1.1.1.1', 1111, True, None, None)
    sw = sockswrapper(proxy, "ftp://example.com")
    d = sw.connect(creator)
    d.addCallback(wrappercb)
    reactor.run()

def connectionFailed(f):
    print "Connection Failed:", f
    reactor.stop()

def connectionMade(ftpClient):
    # Get the current working directory
    ftpClient.pwd().addCallbacks(success, fail)

    # Get a detailed listing of the current directory
    fileList = FTPFileListProtocol()
    d = ftpClient.list('.', fileList)
    d.addCallbacks(showFiles, fail, callbackArgs=(fileList,))

    # Change to the parent directory
    ftpClient.cdup().addCallbacks(success, fail)

    # Create a buffer
    proto = BufferingProtocol()

    # Get short listing of current directory, and quit when done
    d = ftpClient.nlst('.', proto)
    d.addCallbacks(showBuffer, fail, callbackArgs=(proto,))
    d.addCallback(lambda result: reactor.stop())

# this only runs if the module was *not* imported
if __name__ == '__main__':
    run()
I know the code is wrong; I need a solution.
Okay, so here's a solution (gist) that uses python's built-in ftplib, as well as the open source SocksiPy module.
It doesn't use Twisted, and it doesn't explicitly use threads, but using threads and communicating between them is pretty easily done with threading.Thread and Queue.Queue from Python's standard library.
Basically, we need to subclass ftplib.FTP to support substituting our own create_connection method and add proxy configuration semantics.
The "main" logic just configures an FTP client that connects via a localhost socks proxy, such as one created by ssh -D localhost:1080 socksproxy.example.com, and downloads a source snapshot for GNU autoconf to the local disk.
import ftplib
import socket
import socks  # socksipy (https://github.com/mikedougherty/SocksiPy)


class FTP(ftplib.FTP):
    def __init__(self, host='', user='', passwd='', acct='',
                 timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
                 proxyconfig=None):
        """Like ftplib.FTP constructor, but with an added `proxyconfig` kwarg

        `proxyconfig` should be a dictionary that may contain the following
        keys:

        proxytype - The type of the proxy to be used. Three types
                    are supported: PROXY_TYPE_SOCKS4 (including socks4a),
                    PROXY_TYPE_SOCKS5 and PROXY_TYPE_HTTP
        addr -      The address of the server (IP or DNS).
        port -      The port of the server. Defaults to 1080 for SOCKS
                    servers and 8080 for HTTP proxy servers.
        rdns -      Should DNS queries be performed on the remote side
                    (rather than the local side). The default is True.
                    Note: This has no effect with SOCKS4 servers.
        username -  Username to authenticate with to the server.
                    The default is no authentication.
        password -  Password to authenticate with to the server.
                    Only relevant when username is also provided.
        """
        self.proxyconfig = proxyconfig or {}
        ftplib.FTP.__init__(self, host, user, passwd, acct, timeout)

    def connect(self, host='', port=0, timeout=-999):
        '''Connect to host. Arguments are:
        - host: hostname to connect to (string, default previous host)
        - port: port to connect to (integer, default previous port)
        '''
        if host != '':
            self.host = host
        if port > 0:
            self.port = port
        if timeout != -999:
            self.timeout = timeout
        self.sock = self.create_connection(self.host, self.port)
        self.af = self.sock.family
        self.file = self.sock.makefile('rb')
        self.welcome = self.getresp()
        return self.welcome

    def create_connection(self, host=None, port=None):
        host, port = host or self.host, port or self.port

        if self.proxyconfig:
            phost, pport = self.proxyconfig['addr'], self.proxyconfig['port']
            err = None
            for res in socket.getaddrinfo(phost, pport, 0, socket.SOCK_STREAM):
                af, socktype, proto, canonname, sa = res
                sock = None
                try:
                    sock = socks.socksocket(af, socktype, proto)
                    sock.setproxy(**self.proxyconfig)
                    if self.timeout is not socket._GLOBAL_DEFAULT_TIMEOUT:
                        sock.settimeout(self.timeout)
                    sock.connect((host, port))
                    return sock
                except socket.error as _:
                    err = _
                    if sock is not None:
                        sock.close()
            if err is not None:
                raise err
            else:
                raise socket.error("getaddrinfo returns an empty list")
        else:
            sock = socket.create_connection((host, port), self.timeout)
        return sock

    def ntransfercmd(self, cmd, rest=None):
        size = None
        if self.passiveserver:
            host, port = self.makepasv()
            conn = self.create_connection(host, port)
            try:
                if rest is not None:
                    self.sendcmd("REST %s" % rest)
                resp = self.sendcmd(cmd)
                # Some servers apparently send a 200 reply to
                # a LIST or STOR command, before the 150 reply
                # (and way before the 226 reply). This seems to
                # be in violation of the protocol (which only allows
                # 1xx or error messages for LIST), so we just discard
                # this response.
                if resp[0] == '2':
                    resp = self.getresp()
                if resp[0] != '1':
                    raise ftplib.error_reply, resp
            except:
                conn.close()
                raise
        else:
            raise Exception("Active transfers not supported")

        if resp[:3] == '150':
            # this is conditional in case we received a 125
            size = ftplib.parse150(resp)
        return conn, size


if __name__ == '__main__':
    ftp = FTP(host='ftp.gnu.org', user='anonymous', passwd='guest',
              proxyconfig=dict(proxytype=socks.PROXY_TYPE_SOCKS5, rdns=False,
                               addr='localhost', port=1080))

    with open('autoconf-2.69.tar.xz', mode='w') as f:
        ftp.retrbinary("RETR /gnu/autoconf/autoconf-2.69.tar.xz", f.write)
To elaborate why I asked some of my original questions:
1) Do you need to support active transfers or will PASV transfers be sufficient?
Active transfers are much harder to do via a socks proxy because they require the use of the PORT command. With the PORT command, your ftp client tells the FTP server to connect to you on a specific port (e.g., on your PC) in order to send the data. This is likely to not work for users behind a firewall or NAT/router. If your SOCKS proxy server is not behind a firewall, or has a public IP, it is possible to support active transfers, but it is complicated: It requires your SOCKS server (ssh -D does support this) and client library (socksipy does not) to support remote port binding. It also requires the appropriate hooks in the application (my example throws an exception if passiveserver = False) to do a remote BIND instead of a local one.
2) Does it have to use twisted?
Twisted is great, but I'm not the best at it, and I haven't found a really great SOCKS client implementation. Ideally there would be a library out there that allowed you to define and/or chain proxies together, returning an object that implements the IReactorTCP interface, but I have not found anything like that just yet.
3) Is your socks proxy behind a VIP or just a single host directly connected to the Internet?
This matters because of the way PASV transfer security works. In a PASV transfer, the client asks the server to provide a port to connect in order to start a data transfer. When the server accepts a connection on that port, it SHOULD verify the client is connected from the same source IP as the connection that requested the transfer. If your SOCKS server is behind a VIP, it is less likely that the outbound IP of the connection made for the PASV transfers will match the outbound IP of the primary communication connection.

Proxy username/password with Twisted

I'm trying to use Twisted's ProxyAgent class to connect to a proxy server and make HTTP requests, however the server requires a username and password. Is it possible to specify these credentials to the server using ProxyAgent?
endpoint = TCP4ClientEndpoint(reactor, host, port)
agent = ProxyAgent(endpoint)
# Maybe need to pass auth credentials in the header here?
body = agent.request("GET", path)
Figured out the problem: the Proxy-Authorization field has to be set in the headers:
import base64
from twisted.internet import reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
from twisted.web.client import ProxyAgent
from twisted.web.http_headers import Headers

endpoint = TCP4ClientEndpoint(reactor, host, port)
agent = ProxyAgent(endpoint)
headers = {}
auth = base64.b64encode("%s:%s" % (username, password))
headers["Proxy-Authorization"] = ["Basic " + auth.strip()]
body = agent.request("GET", path, Headers(headers))
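Note that agent.request actually returns a Deferred that fires with a Response object rather than the body itself; in newer Twisted releases (13.1+) you can chain twisted.web.client.readBody to retrieve the body, roughly like this (the printBody callback is just an illustration):
from twisted.web.client import readBody

def printBody(body):
    print body  # the raw response bytes

d = agent.request("GET", path, Headers(headers))
d.addCallback(readBody)
d.addCallback(printBody)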
