Run function in another thread - python

So say there are two running codes: script1 and script2.
I want script2 to be able to run a function in script1.
script1 will be some kind of background process that will run "forever".
The point is to be able to make an API for a background process, E.G. a server.
The unclean way to do it would be to have script2 write its orders to a file that script1 reads and executes with exec(). However, I would like to use a module or something cleaner, because then I would be able to pass classes back and forth and not only text.
EDIT: example:
script1:
def dosomething(args):
    # do something
    return information

while True:
    pass  # Do something in a loop
script2:
# "import" the background process
print(backgroundprocess.dosomething(["hello", (1, 2, 3)]))
The execution would look like this:
Run script1
Run script2 in a parallel window

Summary
The XMLRPC modules are designed for this purpose.
The docs include a worked out example for a server (script1) and a client (script2).
Server Example
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.server import SimpleXMLRPCRequestHandler

class RequestHandler(SimpleXMLRPCRequestHandler):
    rpc_paths = ('/RPC2',)

# Create server
with SimpleXMLRPCServer(('localhost', 8000),
                        requestHandler=RequestHandler) as server:
    server.register_introspection_functions()

    # Register pow() function; this will use the value of
    # pow.__name__ as the name, which is just 'pow'.
    server.register_function(pow)

    # Register a function under a different name
    def adder_function(x, y):
        return x + y
    server.register_function(adder_function, 'add')

    # Register an instance; all the methods of the instance are
    # published as XML-RPC methods (in this case, just 'mul').
    class MyFuncs:
        def mul(self, x, y):
            return x * y

    server.register_instance(MyFuncs())

    # Run the server's main loop
    server.serve_forever()
Client Example
import xmlrpc.client
s = xmlrpc.client.ServerProxy('http://localhost:8000')
print(s.pow(2,3)) # Returns 2**3 = 8
print(s.add(2,3)) # Returns 5
print(s.mul(5,2)) # Returns 5*2 = 10
# Print list of available methods
print(s.system.listMethods())
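Applied back to the original question, script1 could expose dosomething over XML-RPC from a background thread while its main loop keeps running. This is a minimal sketch, not the poster's actual code; the threading setup is an assumption, and note that XML-RPC delivers tuples as lists on the other side:
# script1 (sketch)
import threading
import time
from xmlrpc.server import SimpleXMLRPCServer

def dosomething(args):
    # do something
    return args  # placeholder: echo the arguments back

server = SimpleXMLRPCServer(('localhost', 8000))
server.register_function(dosomething)
# Serve RPC requests in a background thread so the loop below keeps running
threading.Thread(target=server.serve_forever, daemon=True).start()

while True:
    time.sleep(1)  # Do something in a loop

# script2 (sketch)
import xmlrpc.client

backgroundprocess = xmlrpc.client.ServerProxy('http://localhost:8000')
print(backgroundprocess.dosomething(["hello", (1, 2, 3)]))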

Related

Python dictionary is empty while imported to another module

This is a project template so I can't change much...
I will omit what I believe are irrelevant parts:
# file server.py
import functions
import json
import socket

funcs = {}

class JSONRPCServer:
    """The JSON-RPC server."""

    def __init__(self, host, port):
        self.host = host
        self.port = port
        self.sock = None

    def register(self, name, function):
        """Registers a function."""
        funcs[name] = function

    (...)

if __name__ == "__main__":
    # Test the JSONRPCServer class
    server = JSONRPCServer('0.0.0.0', 8000)

    # Register functions
    server.register('hello', functions.hello)
    server.register('greet', functions.greet)
    server.register('add', functions.add)
    server.register('sub', functions.sub)
    server.register('mul', functions.mul)
    server.register('div', functions.div)
    print(funcs)

    # Start the server
    server.start()
Here this will print all my functions inside the funcs dict.
I have another file that needs the contents of funcs but for testing I have this:
#file test.py
from server import funcs
print(funcs)
This prints an empty dictionary. How do I make it so that funcs keeps its values across these two files?
When you run the test.py file, anything within the
if __name__ == "__main__":
block of server.py isn't being run, since server.py isn't the main file (when you run server.py directly, it IS going into that block and therefore filling up the funcs dict). Therefore, none of those server.register calls run when you run test.py, and hence your funcs dict is empty.
Maybe put that piece of code with all the register calls in a different function, and call that directly?
When you directly run server.py, it prints the populated funcs because __name__ == "__main__" evaluates to true. This doesn't happen when server.py is imported by another module, though.
Also, as good practice you should add a function to server.py to fetch funcs instead of relying on a global.
You can also refactor the logic inside the __main__ block into a function (e.g. start_server) so that any module can start the server by just calling this function, as sketched below.
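A minimal, self-contained sketch of that refactor; the names register_all and start_server are illustrative, and the lambdas stand in for the real functions module:
# server.py (sketch)
funcs = {}

def register_all():
    funcs['add'] = lambda x, y: x + y
    funcs['sub'] = lambda x, y: x - y

# Module-level call: runs on import, so every importer sees a populated dict.
register_all()

def start_server():
    print(funcs)
    # ... create the JSONRPCServer and serve ...

if __name__ == "__main__":
    start_server()
With this layout, from server import funcs in test.py prints the populated dictionary, because register_all() runs whenever server.py is imported.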

Access global instance modified inside main() function of a server from different modules

I have a server that contains a class which performs an expensive computation
during its initialization. I want to initialize this class once, inside the main() method of the server module, before starting the server. Then, I want other modules that import the server module to be able to retrieve the instance of this class.
Example (the sleep emulates the server running)
import time

# I want to store the shared instance in this global variable
shared_instance = None

class Shared:
    def __init__(self):
        # Expensive computation that I only want to run once
        pass

def main():
    global shared_instance
    shared_instance = Shared()  # Now shared_instance is not None anymore
    print(shared_instance)
    print("Starting server...")
    time.sleep(1000)

if __name__ == '__main__':
    main()
When I run this server it prints:
<__main__.Shared object at 0x000001865A3C4320>
Starting server...
Now I have another module that should be able to see the instance:
import server
print(server.shared_instance)
However, shared_instance is not '<__main__.Shared object at 0x000001865A3C4320>' as expected; it is None. Could you please tell me what I'm doing wrong, and how I can solve this issue and achieve this functionality?
Many thanks
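This is the same pitfall as in the previous question: running server.py executes its code under the module name __main__, while import server creates a second, separate module object whose shared_instance was never assigned. A minimal sketch of one common fix, keeping the entry point trivial so that only one server module object ever exists (run.py and other_module.py are illustrative names):
# run.py (sketch): the entry point only imports server and starts it
import server
server.main()

# other_module.py (imported while the server is running)
import server
print(server.shared_instance)  # same module object, so the instance is visible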

What's the closest I can get to calling a Python function using a different Python version?

Say I have two files:
# spam.py
import library_Python3_only as l3

def spam(x, y):
    return l3.bar(x).baz(y)
and
# beans.py
import library_Python2_only as l2
...
Now suppose I wish to call spam from within beans. It's not directly possible since both files depend on incompatible Python versions. Of course I can Popen a different python process, but how could I pass in the arguments and retrieve the results without too much stream-parsing pain?
Here is a complete example implementation using subprocess and pickle that I actually tested. Note that you need to use protocol version 2 explicitly for pickling on the Python 3 side (at least for the combo Python 3.5.2 and Python 2.7.3).
# py3bridge.py
import sys
import pickle
import importlib
import io
import traceback
import subprocess

class Py3Wrapper(object):
    def __init__(self, mod_name, func_name):
        self.mod_name = mod_name
        self.func_name = func_name

    def __call__(self, *args, **kwargs):
        p = subprocess.Popen(['python3', '-m', 'py3bridge',
                              self.mod_name, self.func_name],
                             stdin=subprocess.PIPE,
                             stdout=subprocess.PIPE)
        stdout, _ = p.communicate(pickle.dumps((args, kwargs)))
        data = pickle.loads(stdout)
        if data['success']:
            return data['result']
        else:
            raise Exception(data['stacktrace'])

def main():
    try:
        target_module = sys.argv[1]
        target_function = sys.argv[2]
        args, kwargs = pickle.load(sys.stdin.buffer)
        mod = importlib.import_module(target_module)
        func = getattr(mod, target_function)
        result = func(*args, **kwargs)
        data = dict(success=True, result=result)
    except Exception:
        st = io.StringIO()
        traceback.print_exc(file=st)
        data = dict(success=False, stacktrace=st.getvalue())
    pickle.dump(data, sys.stdout.buffer, 2)

if __name__ == '__main__':
    main()
The Python 3 module (using the pathlib module for the showcase)
# spam.py
import pathlib

def listdir(p):
    return [str(c) for c in pathlib.Path(p).iterdir()]
The Python 2 module using spam.listdir
# beans.py
import py3bridge
delegate = py3bridge.Py3Wrapper('spam', 'listdir')
py3result = delegate('.')
print py3result
Assuming the caller is Python 3.5+, you have access to the nicer subprocess module. Perhaps you could use subprocess.run and communicate via pickled Python objects sent through stdin and stdout. There would be some setup to do, but no parsing on your side, or mucking with strings etc.
Here's an example of Python 2 code calling Python 3 via subprocess.Popen
# python3_args is the command line for the Python 3 process;
# func_args are the arguments to send to it over stdin
p = subprocess.Popen(python3_args, stdin=subprocess.PIPE, stdout=subprocess.PIPE)
stdout, stderr = p.communicate(pickle.dumps(func_args))
result = pickle.loads(stdout)
You could create a simple script like this:
import sys
import my_wrapped_module
import json
params = sys.argv
script = params.pop(0)
function = params.pop(0)
print(json.dumps(getattr(my_wrapped_module, function)(*params)))
You'll be able to call it like this:
pythonx.x wrapper.py myfunction param1 param2
This is obviously a security hazard though, be careful.
Also note that if your params are anything other than strings, you'll have some issues, so maybe think about transmitting the params as a JSON string and converting it with json.loads() in the wrapper, as sketched below.
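A minimal sketch of that JSON variant; my_wrapped_module is carried over from the answer above, and the single JSON-argument convention is an assumption:
# wrapper.py (sketch)
import sys
import json
import my_wrapped_module

function = sys.argv[1]
args = json.loads(sys.argv[2])  # e.g. '[1, 2]' decodes to the list [1, 2]
print(json.dumps(getattr(my_wrapped_module, function)(*args)))
Called like: pythonx.x wrapper.py myfunction '[1, 2]'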
It's possible to use the multiprocessing.managers module to achieve what you want. It does require a small amount of hacking though.
Given a module that has functions you want to expose, you need to create a Manager that can create proxies for those functions.
The manager process that serves proxies to the py3 functions:
from multiprocessing.managers import BaseManager
import spam

class SpamManager(BaseManager):
    pass

# Register a way of getting the spam module.
# You can use the exposed arg to control what is exposed.
# By default only "public" functions (without a leading underscore) are exposed,
# but it can only ever expose functions or methods.
SpamManager.register("get_spam", callable=(lambda: spam), exposed=["add", "sub"])

# Specifying the address as localhost means the manager is only visible to
# processes on this machine.
manager = SpamManager(address=('localhost', 50000), authkey=b'abc',
                      serializer='xmlrpclib')
server = manager.get_server()
server.serve_forever()
I've redefined spam to contain two functions called add and sub.
# spam.py
def add(x, y):
    return x + y

def sub(x, y):
    return x - y
The client process that uses the py3 functions exposed by the SpamManager:
from __future__ import print_function
from multiprocessing.managers import BaseManager

class SpamManager(BaseManager):
    pass

SpamManager.register("get_spam")

m = SpamManager(address=('localhost', 50000), authkey=b'abc',
                serializer='xmlrpclib')
m.connect()
spam = m.get_spam()

print("1 + 2 = ", spam.add(1, 2))  # prints 1 + 2 = 3
print("1 - 2 = ", spam.sub(1, 2))  # prints 1 - 2 = -1
spam.__name__  # AttributeError -- spam is a module, but its __name__
               # attribute is not exposed
Once set up, this form gives an easy way of accessing functions and values. It also allows these functions and values to be used in much the same way you might use them if they were not proxies. Finally, it allows you to set a password on the server process so that only authorised processes can access the manager. Since the manager is long running, a new process doesn't have to be started for each function call you make.
One limitation is that I've used the xmlrpclib module rather than pickle to send data back and forth between the server and the client. This is because python2 and python3 use different default pickle protocols. You could fix this by adding your own client to multiprocessing.managers.listener_client that uses an agreed-upon protocol for pickling objects.

Connect DBUS Service method to another method

I have 4 DBUS service Python scripts (a.py, b.py, c.py, d.py) and run them in main.py.
The reason I want to merge the DBUS services is the memory cost per running process: about 2.0% of memory per DBUS service, and I will be creating about 15 DBUS services.
main.py
#!/usr/bin/env python
import sys
import gobject
from dbus.mainloop.glib import DBusGMainLoop

listofdbusfilenames = ['a', 'b', 'c', 'd']

def importDbusServices():
    for dbusfilename in listofdbusfilenames:
        globals()[dbusfilename] = __import__(dbusfilename)

def callservices():
    for dbusfilename in listofdbusfilenames:
        globals()[dbusfilename + '_var'] = eval(dbusfilename + '.ServiceClass()')

if __name__ == '__main__':
    importDbusServices()
    DBusGMainLoop(set_as_default = True)
    callservices()
    loop = gobject.MainLoop()
    loop.run()
a.py wants to get the return value of a method from b.py;
b.py will get the return values of methods from c.py and d.py:
-a.py
--|b.py
----|c.py
----|d.py
The problems:
I can't call the other service's method via get_dbus_method; an Introspect error appears.
I tried a signal and receiver, BUT that takes longer than get_dbus_method.
My DBUS service format:
import dbus
import dbus.service

class ServiceClass(dbus.service.Object):
    def __init__(self):
        busName = dbus.service.BusName('test.a', bus = dbus.SystemBus())
        dbus.service.Object.__init__(self, busName, '/test/a')

    @dbus.service.method('test.a')
    def aMethod1(self):
        # get the b.py method value via 'get_dbus_method' here
        return  # value from b.py method
Is there any other way to get the method directly?
Thanks in advance for reading and responding. :D
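Since main.py constructs every ServiceClass in the same process, one option is to skip DBUS for the internal calls and pass a plain Python reference between the services. A minimal sketch under that assumption (the b_service constructor argument and the bMethod1 name are illustrative):
# a.py (sketch)
import dbus
import dbus.service

class ServiceClass(dbus.service.Object):
    def __init__(self, b_service):
        busName = dbus.service.BusName('test.a', bus=dbus.SystemBus())
        dbus.service.Object.__init__(self, busName, '/test/a')
        self.b_service = b_service  # direct reference, no bus round trip

    @dbus.service.method('test.a')
    def aMethod1(self):
        # call b's method as an ordinary Python method
        return self.b_service.bMethod1()
main.py would then construct b's ServiceClass first and pass it into a's constructor (and likewise c and d into b's).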

twisted: How to send and receive the same object with Perspective Broker?

I have a simple 'echo' PB client and server where the client sends an object to the server, which echoes the same object back to the client:
The client:
from twisted.spread import pb
from twisted.internet import reactor
from twisted.python import util
from amodule import aClass
factory = pb.PBClientFactory()
reactor.connectTCP("localhost", 8282, factory)
d = factory.getRootObject()
d.addCallback(lambda object: object.callRemote("echo", aClass()))
d.addCallback(lambda response: 'server echoed: '+response)
d.addErrback(lambda reason: 'error: '+str(reason.value))
d.addCallback(util.println)
d.addCallback(lambda _: reactor.stop())
reactor.run()
The server:
from twisted.application import internet, service
from twisted.internet import protocol
from twisted.spread import pb
from amodule import aClass

class RemoteClass(pb.RemoteCopy, aClass):
    pass

pb.setUnjellyableForClass(aClass, RemoteClass)

class PBServer(pb.Root):
    def remote_echo(self, a):
        return a

application = service.Application("Test app")

# Prepare managers
clientManager = internet.TCPServer(8282, pb.PBServerFactory(PBServer()))
clientManager.setServiceParent(application)

if __name__ == '__main__':
    print "Run with twistd"
    import sys
    sys.exit(1)
The aClass is a simple class implementing Copyable:
from twisted.spread import pb

class aClass(pb.Copyable):
    pass
When I run the above code, I get this error:
twisted.spread.jelly.InsecureJelly: Module __builtin__ not allowed (in type __builtin__.RemoteClass).
In fact, the object is sent to the server without any problem since it was secured with pb.setUnjellyableForClass(aClass, RemoteClass) on the server side, but once it gets returned to the client, that error is raised.
I am looking for an easy way to send and receive my objects between the two peers.
Perspective broker identifies classes by name when talking about them over the network. A class gets its name in part from the module in which it is defined. A tricky problem with defining classes in a file that you run from the command line (ie, your "main script") is that they may end up with a surprising name. When you do this:
python foo.py
The module name Python gives to the code in foo.py is not "foo" as one might expect. Instead it is something like "__main__" (which is why the if __name__ == "__main__": trick works).
However, if some other part of your application later tries to import something from foo.py, then Python re-evaluates its contents to create a new module named "foo".
Additionally, the classes defined in the "__main__" module of one process may have nothing to do with the classes defined in the "__main__" module of another process. This is the case in your example, where __main__.RemoteClass is defined in your server process but there is no RemoteClass in the __main__ module of your client process.
So, PB gets mixed up and can't complete the object transfer.
The solution is to keep the amount of code in your main script to a minimum, and in particular to never define things with names there (no classes, no function definitions).
However, another problem is the expectation that a RemoteCopy can be sent over PB without additional preparation. A Copyable can be sent, creating a RemoteCopy on the peer, but this is not a symmetric relationship. Your client also needs to allow this by making a similar (or different) pb.setUnjellyableForClass call.
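A minimal sketch of both fixes, moving the class into a shared module; amodule.py is the name the question already uses, and registering the class as its own RemoteCopy is an assumption about the desired behaviour:
# amodule.py (sketch)
from twisted.spread import pb

class aClass(pb.Copyable, pb.RemoteCopy):
    pass

# Registering here means both client and server import this line and
# thus both allow aClass to be received as well as sent.
pb.setUnjellyableForClass(aClass, aClass)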
