How to implement a username/password authenticator for an RPyC server - python

I am trying to secure my RPyC server connections with a username and password. The documentation does show an example, but it is very brief and gives no details on how exactly the password is passed from the client side. Has anyone figured out how to do this? Thanks in advance.

Answering my own question:
I had to override some of RPyC's internals on the client side to achieve the desired behaviour. I don't know if a much cleaner solution exists, but this one seems workable.
Server:
import rpyc
from rpyc.utils.authenticators import AuthenticationError

def magic_word_authenticator(sock):
    if sock.recv(5).decode() != "Ma6ik":
        raise AuthenticationError("wrong magic word")
    return sock, None

class SecuredService(rpyc.Service):
    def exposed_secured_op(self):
        return 'Secret String'

rpyc.ThreadedServer(
    service=SecuredService, hostname='localhost',
    port=18812, authenticator=magic_word_authenticator
).start()
Client:
import rpyc
import traceback

class AuthSocketStream(rpyc.SocketStream):
    @classmethod
    def connect(cls, *args, authorizer=None, **kwargs):
        # The authorizer callback gets the raw socket so it can send the
        # credentials before the RPyC protocol handshake starts.
        stream_obj = super().connect(*args, **kwargs)
        if callable(authorizer):
            authorizer(stream_obj.sock)
        return stream_obj

def rpyc_connect(host, port, service=rpyc.VoidService, config={}, ipv6=False, keepalive=False, authorizer=None):
    s = AuthSocketStream.connect(
        host, port, ipv6=ipv6, keepalive=keepalive,
        authorizer=authorizer
    )
    return rpyc.connect_stream(s, service, config)

print('With correct authorizer')
conn1 = rpyc_connect(
    'localhost', 18812, authorizer=lambda sock: sock.send('Ma6ik'.encode())
)
print(conn1.root.secured_op())

print('With wrong authorizer')
conn2 = rpyc_connect(
    'localhost', 18812, authorizer=lambda sock: sock.send('Invalid'.encode())
)
try:
    conn2.root
except Exception:
    print(traceback.format_exc())

print('With no authorizer')
conn3 = rpyc_connect(
    'localhost', 18812
)
try:
    conn3.root
except Exception:
    print(traceback.format_exc())
Client Console Log:
With correct authorizer
Secret String
With wrong authorizer
Traceback (most recent call last):
File "/home/client.py", line 40, in <module>
conn2.root
File "/usr/lib/python3.10/site-packages/rpyc/core/protocol.py", line 507, in root
self._remote_root = self.sync_request(consts.HANDLE_GETROOT)
File "/usr/lib/python3.10/site-packages/rpyc/core/protocol.py", line 474, in sync_request
return self.async_request(handler, *args, timeout=timeout).value
File "/usr/lib/python3.10/site-packages/rpyc/core/async_.py", line 101, in value
self.wait()
File "/usr/lib/python3.10/site-packages/rpyc/core/async_.py", line 48, in wait
self._conn.serve(self._ttl)
File "/usr/lib/python3.10/site-packages/rpyc/core/protocol.py", line 387, in serve
data = self._channel.poll(timeout) and self._channel.recv()
File "/usr/lib/python3.10/site-packages/rpyc/core/channel.py", line 55, in recv
header = self.stream.read(self.FRAME_HEADER.size)
File "/usr/lib/python3.10/site-packages/rpyc/core/stream.py", line 260, in read
raise EOFError("connection closed by peer")
EOFError: connection closed by peer
With no authorizer
Traceback (most recent call last):
File "/home/client.py", line 52, in <module>
conn3.root
File "/usr/lib/python3.10/site-packages/rpyc/core/protocol.py", line 507, in root
self._remote_root = self.sync_request(consts.HANDLE_GETROOT)
File "/usr/lib/python3.10/site-packages/rpyc/core/protocol.py", line 474, in sync_request
return self.async_request(handler, *args, timeout=timeout).value
File "/usr/lib/python3.10/site-packages/rpyc/core/async_.py", line 101, in value
self.wait()
File "/usr/lib/python3.10/site-packages/rpyc/core/async_.py", line 48, in wait
self._conn.serve(self._ttl)
File "/usr/lib/python3.10/site-packages/rpyc/core/protocol.py", line 387, in serve
data = self._channel.poll(timeout) and self._channel.recv()
File "/usr/lib/python3.10/site-packages/rpyc/core/channel.py", line 55, in recv
header = self.stream.read(self.FRAME_HEADER.size)
File "/usr/lib/python3.10/site-packages/rpyc/core/stream.py", line 260, in read
raise EOFError("connection closed by peer")
EOFError: connection closed by peer
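
Since the question title mentions username/password specifically: the same pattern extends naturally to that case. Below is a rough, untested sketch of a username/password variant of the server-side authenticator. USERS is just a placeholder credential store I made up for illustration; in a real setup you would compare salted hashes and run the connection over SSL rather than send plaintext credentials.
from rpyc.utils.authenticators import AuthenticationError

USERS = {"alice": "s3cret"}  # placeholder credential store (illustrative only)

def userpass_authenticator(sock):
    # The client's authorizer sends "username:password" right after connecting.
    data = sock.recv(256).decode().strip()
    username, _, password = data.partition(":")
    if USERS.get(username) != password:
        raise AuthenticationError("invalid credentials")
    return sock, None

# Client side, reusing rpyc_connect() from above:
# conn = rpyc_connect('localhost', 18812,
#                     authorizer=lambda sock: sock.send(b'alice:s3cret'))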

Related

How to reconnect a websocket connection websocket-client

I've been trying to write code that collects crypto data from Binance. Binance automatically disconnects after 24 hours. Is there any way for me to reconnect after a disconnection? I believed run_forever would take care of that for me, but it dies when an error is thrown. I will be running this program on a server 24/7. I would also like to be notified on disconnect, perhaps through a Telegram/Discord bot I can build; where would I put the code that sends that notification?
This is the error I get.
Traceback (most recent call last):
File "exchanges/binance/binance_ticker.py", line 97, in <module>
start()
File "exchanges/binance/binance_ticker.py", line 94, in start
rel.dispatch()
File "/home/pyjobs/.local/lib/python3.8/site-packages/rel/rel.py", line 205, in dispatch
registrar.dispatch()
File "/home/pyjobs/.local/lib/python3.8/site-packages/rel/registrar.py", line 72, in dispatch
if not self.loop():
File "/home/pyjobs/.local/lib/python3.8/site-packages/rel/registrar.py", line 81, in loop
e = self.check_events()
File "/home/pyjobs/.local/lib/python3.8/site-packages/rel/registrar.py", line 232, in check_events
self.callback('read', fd)
File "/home/pyjobs/.local/lib/python3.8/site-packages/rel/registrar.py", line 125, in callback
self.events[etype][fd].callback()
File "/home/pyjobs/.local/lib/python3.8/site-packages/rel/listener.py", line 108, in callback
if not self.cb(*self.args) and not self.persist and self.active:
File "/home/pyjobs/.local/lib/python3.8/site-packages/websocket/_app.py", line 349, in read
op_code, frame = self.sock.recv_data_frame(True)
File "/home/pyjobs/.local/lib/python3.8/site-packages/websocket/_core.py", line 401, in recv_data_frame
frame = self.recv_frame()
File "/home/pyjobs/.local/lib/python3.8/site-packages/websocket/_core.py", line 440, in recv_frame
return self.frame_buffer.recv_frame()
File "/home/pyjobs/.local/lib/python3.8/site-packages/websocket/_abnf.py", line 352, in recv_frame
payload = self.recv_strict(length)
File "/home/pyjobs/.local/lib/python3.8/site-packages/websocket/_abnf.py", line 373, in recv_strict
bytes_ = self.recv(min(16384, shortage))
File "/home/pyjobs/.local/lib/python3.8/site-packages/websocket/_core.py", line 524, in _recv
return recv(self.sock, bufsize)
File "/home/pyjobs/.local/lib/python3.8/site-packages/websocket/_socket.py", line 122, in recv
raise WebSocketConnectionClosedException(
websocket._exceptions.WebSocketConnectionClosedException: Connection to remote host was lost.
My code:
import websocket
import rel

uri = "wss://stream.binance.com:9443/ws/!ticker@arr"

def on_message(ws, message):
    print(message)

def on_error(ws, error):
    print(error)
    write_logs(error)  # write_logs is defined elsewhere in the project

def on_close(ws, close_status_code, close_msg):
    print("### closed ###")
    write_logs(str(close_status_code) + str(close_msg))
    start()  # try to reconnect when the connection closes

def on_open(ws):
    print("Opened connection")

def start():
    websocket.enableTrace(True)
    ws = websocket.WebSocketApp(uri,
                                on_open=on_open,
                                on_message=on_message,
                                on_error=on_error,
                                on_close=on_close)
    ws.run_forever(dispatcher=rel)  # Set the dispatcher to automatic reconnection.
    rel.signal(2, rel.abort)  # Keyboard Interrupt
    rel.dispatch()

start()
Does the comment on this line, ws.run_forever(dispatcher=rel) # Set the dispatcher to automatic reconnection., mean that automatic reconnection depends on the rel module? And how do the rel module and the dispatcher argument work together?
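
Not from the original thread, but one way to get reconnection without depending on rel at all is to wrap run_forever in a retry loop: run_forever returns once the connection drops, so looping around it with a small delay gives automatic reconnects. A minimal sketch reusing the question's callback names:
import time
import websocket

def run_with_retries(uri, on_open, on_message, on_error, on_close, delay=5):
    # run_forever() blocks until the socket closes or errors out,
    # so looping around it is a simple automatic-reconnect strategy.
    while True:
        ws = websocket.WebSocketApp(uri,
                                    on_open=on_open,
                                    on_message=on_message,
                                    on_error=on_error,
                                    on_close=on_close)
        try:
            ws.run_forever()
        except KeyboardInterrupt:
            break
        except Exception as exc:
            print("websocket crashed:", exc)
        time.sleep(delay)  # back off briefly before reconnecting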

Python XMLRPC: Cannot allow None even after allow_none = True

I'm trying to write a simple application that communicates using RPCs. I'm using Python 3.7's xmlrpc.
This is my server code:
import sys
import threading
from xmlrpc.server import SimpleXMLRPCServer

MY_ADDR = ("localhost", int(sys.argv[1]))
HOST_ADDR = ("localhost", int(sys.argv[2]))

class RpcServer(threading.Thread):
    def __init__(self):
        threading.Thread.__init__(self)
        self.port = MY_ADDR[1]
        self.addr = MY_ADDR[0]
        # serve other hosts using this
        self.server = SimpleXMLRPCServer((self.addr, self.port))
        self.server.register_function(self.recv_ops)

    def run(self):
        self.server.serve_forever()

    def recv_ops(self, sender, op):
        print("Sender ", sender, " sent: ", op)
And this is what I'm using as my client's code:
import xmlrpc.client

def send_ops(host_addr, op):
    # contact the other host using this
    proxy_addr = "http://{addr}:{port}/".format(addr=host_addr[0], port=host_addr[1])
    client_proxy = xmlrpc.client.ServerProxy(proxy_addr, allow_none=True)
    resp = client_proxy.recv_ops(MY_ADDR, op)
    ...

send_ops(HOST_ADDR, ("d", ii, last_line[ii]))  # THE RPC CALL I MAKE
Despite setting allow_none=True, I keep getting this:
Exception in thread Thread-1:
Traceback (most recent call last):
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/threading.py", line 865, in run
self._target(*self._args, **self._kwargs)
File "nb.py", line 102, in editor
send_ops(HOST_ADDR, ("d", ii, last_line[ii]))
File "nb.py", line 63, in send_ops
resp = client_proxy.recv_ops(MY_ADDR, op)
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/xmlrpc/client.py", line 1112, in __call__
return self.__send(self.__name, args)
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/xmlrpc/client.py", line 1452, in __request
verbose=self.__verbose
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/xmlrpc/client.py", line 1154, in request
return self.single_request(host, handler, request_body, verbose)
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/xmlrpc/client.py", line 1170, in single_request
return self.parse_response(resp)
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/xmlrpc/client.py", line 1342, in parse_response
return u.close()
File "/usr/local/Cellar/python/3.7.0/Frameworks/Python.framework/Versions/3.7/lib/python3.7/xmlrpc/client.py", line 656, in close
raise Fault(**self._stack[0])
xmlrpc.client.Fault: <Fault 1: "<class 'TypeError'>:cannot marshal None unless allow_none is enabled">
What's tripping me up is that the server on the other side actually receives the message (without any None):
Sender ['localhost', 8001] sent: ['d', 4, 'o']
What am I missing here? Any help would be appreciated.
Thanks!
In your server class, add allow_none=True to your SimpleXMLRPCServer instantiation.
self.server = SimpleXMLRPCServer((self.addr, self.port), allow_none=True)
The allow_none and encoding parameters are passed on to xmlrpc.client and control the XML-RPC responses that will be returned from the server.
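To make the symmetry explicit, here is a small self-contained sketch (mine, not from the original answer) with allow_none=True on both the server and the client proxy, so None can be marshalled in either direction; the port and function name are illustrative.
from xmlrpc.server import SimpleXMLRPCServer
import xmlrpc.client
import threading

def recv_ops(sender, op):
    print("Sender", sender, "sent:", op)
    return None  # returning None is only legal because allow_none=True below

server = SimpleXMLRPCServer(("localhost", 8001), allow_none=True, logRequests=False)
server.register_function(recv_ops)
threading.Thread(target=server.serve_forever, daemon=True).start()

proxy = xmlrpc.client.ServerProxy("http://localhost:8001/", allow_none=True)
print(proxy.recv_ops(["localhost", 8001], ["d", 4, None]))  # prints None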

How can I correct an 'only one socket usage of each address error' caused by a process being spawned within a class?

Currently, I am trying to develop a server framework that passes messages from Twitch to other machines on a local network. I have a class called Server, and below is a rudimentary example that demonstrates the problem I am running into. The issue is that the twitch_socket is being created twice and bound to the address/port. My expected result is that the sockets would be shared between the child processes of the Server class. How can I modify the class, or even get rid of it entirely, so that the processes are able to share sockets between them?
import multiprocessing
import socket
import re
from BotPass import PASSWORD

def send_message(socketobj, message):
    'Sends a str as bytes through socket'
    message = message.encode()
    socketobj.sendall(message)

def recv_message(socketobj):
    'Receives a str as bytes through socket'
    return socketobj.recv(2048).decode()

class Server:
    'Handles receiving messages from twitch and directs messages from clients'
    twitch_socket = socket.socket()
    twitch_socket.connect(('irc.chat.twitch.tv', 6667))
    send_message(twitch_socket, 'PASS %s\r\n' % (PASSWORD))
    send_message(twitch_socket, 'NICK %s\r\n' % ('squid_coin_bot'))
    send_message(twitch_socket, 'JOIN #jtv\r\n')
    send_message(twitch_socket, 'CAP REQ :twitch.tv/commands\r\n')

    server_socket = socket.socket()
    server_socket.bind(('', 9999))

    work_queue = multiprocessing.Queue()
    # Queue of messages from twitch
    worker_queue = multiprocessing.Queue()
    # Queue of free client socket objects
    result_queue = multiprocessing.Queue()
    # Queue of what to send back to twitch

    def start():
        accept_process = multiprocessing.Process(target=Server.accept_connections)
        # *This is most likely where the issue is occurring*
        accept_process.daemon = True
        accept_process.start()

    def accept_connections():
        ''
        Server.server_socket.listen(10)
        while 1:
            (clientsocket, clientaddr) = Server.server_socket.accept()
            # What I believe I am referencing here is the server socket which is inherent to the Server class
            if re.match(r'192\.168\.\d{1,3}\.\d{1,3}', clientaddr[0])\
                    or clientaddr[0] == '127.0.0.1':
                Server.worker_queue.put(clientsocket)
            else:
                clientsocket.close()

Server.start()
input()
Output in Console:
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "C:\Program Files\Python36\lib\multiprocessing\spawn.py", line 105, in spawn_main
exitcode = _main(fd)
File "C:\Program Files\Python36\lib\multiprocessing\spawn.py", line 114, in _main
prepare(preparation_data)
File "C:\Program Files\Python36\lib\multiprocessing\spawn.py", line 225, in prepare
_fixup_main_from_path(data['init_main_from_path'])
File "C:\Program Files\Python36\lib\multiprocessing\spawn.py", line 277, in _fixup_main_from_path
run_name="__mp_main__")
File "C:\Program Files\Python36\lib\runpy.py", line 263, in run_path
pkg_name=pkg_name, script_name=fname)
File "C:\Program Files\Python36\lib\runpy.py", line 96, in _run_module_code
mod_name, mod_spec, pkg_name, script_name)
File "C:\Program Files\Python36\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "C:\twitch-market\server.py", line 18, in <module>
class Server:
File "C:\twitch-market\server.py", line 27, in Server
server_socket.bind(('', 9999))
OSError: [WinError 10048] Only one usage of each socket address (protocol/network address/port) is normally permitted
Add this before the bind: server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
This is because the previous execution has left the socket in a TIME_WAIT state, and it can't be immediately reused. The SO_REUSEADDR flag tells the kernel to reuse a local socket in TIME_WAIT state without waiting for its natural timeout to expire.
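For concreteness, this is where the option would go in the class above (the placement is my sketch, not from the original answer); it has to be set after creating the socket and before bind():
server_socket = socket.socket()
server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)  # allow rebinding a port stuck in TIME_WAIT
server_socket.bind(('', 9999))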

Can't initiate get request in web.py

I'm trying to run two servers using web.py and initiate calls from one to the other. Both servers start normally, but when I try to call a URL, the stack trace below is thrown.
import web

urls = (
    '/ping', 'Ping',
    '/acqlock/+(.*)', 'Acquire',
)

class MSA(web.application):
    def run(self, port=8081, *middleware):
        func = self.wsgifunc(*middleware)
        return web.httpserver.runsimple(func, ('127.0.0.1', port))

app = MSA(urls, globals())

if __name__ == "__main__":
    app.run(port=8081)

class Acquire:
    def GET(self, resource_name):
        print resource_name
        response = app.request('http://127.0.0.1:8080/acqlock/' + resource_name, method='GET')
        return response
But I keep getting this error after calling /acqlock:
Traceback (most recent call last):
File "C:\Python27\lib\site-packages\web\wsgiserver\__init__.py", line 1245, in communicate
req.respond()
File "C:\Python27\lib\site-packages\web\wsgiserver\__init__.py", line 775, in respond
self.server.gateway(self).respond()
File "C:\Python27\lib\site-packages\web\wsgiserver\__init__.py", line 2018, in respond
response = self.req.server.wsgi_app(self.env, self.start_response)
File "C:\Python27\lib\site-packages\web\httpserver.py", line 306, in __call__
return self.app(environ, xstart_response)
File "C:\Python27\lib\site-packages\web\httpserver.py", line 274, in __call__
return self.app(environ, start_response)
File "C:\Python27\lib\site-packages\web\application.py", line 279, in wsgi
result = self.handle_with_processors()
File "C:\Python27\lib\site-packages\web\application.py", line 249, in handle_with_processors
return process(self.processors)
File "C:\Python27\lib\site-packages\web\application.py", line 246, in process
raise self.internalerror()
File "C:\Python27\lib\site-packages\web\application.py", line 515, in internalerror
parent = self.get_parent_app()
File "C:\Python27\lib\site-packages\web\application.py", line 500, in get_parent_app
if self in web.ctx.app_stack:
AttributeError: 'ThreadedDict' object has no attribute 'app_stack'
Use the requests library for this.
import requests
response = requests.request(method='GET', url='http://127.0.0.1:8080/acqlock/' + resource_name)
Note: you have used port 8080 in the URL even though you have hosted the web.py app on port 8081.
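For context, the GET handler might then look roughly like this (my sketch; OTHER_PORT stands for whichever port the second web.py instance actually listens on):
import requests

OTHER_PORT = 8080  # assumption: the port the other instance really listens on

class Acquire:
    def GET(self, resource_name):
        # Call the other web.py instance over plain HTTP instead of app.request(),
        # which only dispatches within the local application.
        resp = requests.get('http://127.0.0.1:%d/acqlock/%s' % (OTHER_PORT, resource_name))
        return resp.text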

Autobahn Python Errno 99 cannot assign requested address

While trying to set up a WebSocket server, I encountered the following error.
The same code works fine with a LAN IP ('192.168.x.x') but fails with a public IP/domain name.
Here is the error trace
Traceback (most recent call last):
File "WSServer.py", line 19, in <module>
server = loop.run_until_complete(coro)
File "/usr/lib64/python3.4/asyncio/base_events.py", line 208, in run_until_complete
return future.result()
File "/usr/lib64/python3.4/asyncio/futures.py", line 243, in result
raise self._exception
File "/usr/lib64/python3.4/asyncio/tasks.py", line 319, in _step
result = coro.send(value)
File "/usr/lib64/python3.4/asyncio/base_events.py", line 579, in create_server
% (sa, err.strerror.lower()))
OSError: [Errno 99] error while attempting to bind on address ('121.6.x.x', 9000): cannot assign requested address
Python Server Code:
from autobahn.asyncio.websocket import WebSocketServerProtocol

class MyServerProtocol(WebSocketServerProtocol):
    def onMessage(self, payload, isBinary):
        print("message received")
        self.sendMessage(payload, isBinary)

if __name__ == '__main__':
    import asyncio
    from autobahn.asyncio.websocket import WebSocketServerFactory

    factory = WebSocketServerFactory()
    factory.protocol = MyServerProtocol

    loop = asyncio.get_event_loop()
    coro = loop.create_server(factory, '121.6.x.x', 9000)
    server = loop.run_until_complete(coro)

    try:
        loop.run_forever()
    except KeyboardInterrupt:
        pass
    finally:
        server.close()
        loop.close()
Could the issue be related to the server settings, e.g. the hostname or SELinux?
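
No answer was recorded here, but for what it's worth: "cannot assign requested address" usually means the public IP is not configured on any local network interface (typical when the machine sits behind NAT). A common workaround, offered as my own suggestion rather than something from this thread, is to bind to all interfaces and let port forwarding expose the server publicly:
# Sketch: bind on all interfaces instead of the (unassigned) public IP.
coro = loop.create_server(factory, '0.0.0.0', 9000)
server = loop.run_until_complete(coro)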