I am writing a Python script that generates a list of integers and is supposed to send this list to a concurrently running C++ program that is constantly awaiting input. What is a quick and painless way to design the link between these two languages?
I have considered possibilities such as:
writing the Python result to a file, while listening for file updates in C++ and loading the file as soon as new data is available;
using the Python pexpect package and including a command-line interface in the C++ program, so that it receives the data through cin.
Both possibilities seem a bit too hacky to me. Are there better options, ones that would not require becoming an expert in writing C++ extension modules for Python?
You could set up a socket in the Python app and write the data to its output stream, and have the C++ program listen on a socket and read the data from its input stream.
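For the Python side, here is a minimal sketch that sends the list of integers as one newline-terminated JSON message over TCP; the port (5000) and the framing are illustrative assumptions, not fixed by the question:

import json
import socket

numbers = [1, 2, 3, 5, 8, 13]

# Connect to the (assumed) C++ listener; newline-delimited JSON is one
# simple framing choice so the receiver knows where a message ends.
with socket.create_connection(("localhost", 5000)) as conn:
    conn.sendall((json.dumps(numbers) + "\n").encode("utf-8"))

On the C++ side you would accept the connection, read up to the newline, and parse the JSON array (a library such as nlohmann/json makes that straightforward).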
I have a script running on my Raspberry Pi; the script is started by a command from a PHP page. It has multiple if statements, and now I would like to pass new arguments to the script without stopping it. I found lots of information on passing arguments to a Python script, but not on whether it is possible to pass new arguments while the script is already running. Thanks in advance!
The best option, in my view, is to drive your script from a configuration file.
Some simple YAML will do. Then, in a separate thread, watch the hash of the file: if it changes, somebody has updated the file and you should re-read it and adjust your inputs.
Basically you have that observer running constantly.
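Here is a minimal sketch of that observer, assuming PyYAML is installed and a hypothetical config file named config.yaml:

import hashlib
import threading
import time

import yaml  # PyYAML, assumed to be installed

CONFIG_PATH = "config.yaml"  # hypothetical file name

def file_hash(path):
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def watch_config(on_change, interval=1.0):
    last = file_hash(CONFIG_PATH)
    while True:
        time.sleep(interval)
        current = file_hash(CONFIG_PATH)
        if current != last:
            last = current
            with open(CONFIG_PATH) as f:
                on_change(yaml.safe_load(f))

def apply_new_args(config):
    # Re/adjust your inputs (e.g. the values your if statements test) here
    print("new arguments:", config)

# The observer runs in a background thread for the life of the script
threading.Thread(target=watch_config, args=(apply_new_args,), daemon=True).start()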
You need some sort of IPC mechanism, really. As you are executing/updating the script from a PHP application, I'd suggest you look into something like ZeroMQ, which supports both Python and PHP and will allow you to do a quick and dirty Pub/Sub implementation.
The basic idea is: treat your Python script as a subscriber to messages coming from the PHP application, which publishes them as and when needed. To achieve this, you'll want to start your Python "script" once and leave it running in the background, listening for messages on ZeroMQ. Something like this should get you going:
import zmq

context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:5555")

while True:
    # Wait for the next message from your PHP application
    message = socket.recv()
    print("Received a message: %s" % message)
    # Here you should do the work you need to do in your script
    # Once you are done, tell the PHP application you are done
    socket.send(b"Done and dusted")
Then, in your PHP application, you can use something like the following to send a message to your Python service
$context = new ZMQContext();
// Socket to talk to server
$requester = new ZMQSocket($context, ZMQ::SOCKET_REQ);
$requester->connect("tcp://localhost:5555");
$requester->send("ALL THE PARAMS TO SEND TO YOUR PYTHON SCRIPT");
$reply = $requester->recv();
Note: I found the above examples with a quick Google search (and amended them slightly for educational purposes), but they aren't tested and are purely meant to get you started. For more information, visit ZeroMQ and php-zmq.
Have fun.
I'm currently doing a project in which I'm making an ADS-B flight radar on an LED matrix, controlled by a Raspberry Pi. I've found a program called dump1090 which receives and decodes the data from my SDR receiver. I can find lots of examples of how to forward that data to a webserver or whatever, but I can't seem to find anything on how to programmatically listen to the data dump1090 produces. Does anyone know how you can programmatically receive dump1090's data in order to use it in a program? (Any language would do, but perhaps Python would be the most obvious choice.)
You should be able to start dump1090 from a programming language of your choice (C/C++/Java/Python/etc.) and read its stdout pipe.
Personally, on the Raspberry Pi, I find Python nicer to use since it's easier to test and iterate without needing to compile. Python provides the subprocess package, which allows you to run dump1090 (or any other application) from within Python and look at its output (using subprocess.check_output('dump1090'), for example). Have a look at check_output and the Popen options to see what works best with your application.
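For a long-running process like dump1090, note that check_output won't return until the program exits, so a Popen loop is more useful; a minimal sketch, assuming dump1090 is on the PATH:

import subprocess

# Start dump1090 and capture its stdout
proc = subprocess.Popen(
    ["dump1090"],
    stdout=subprocess.PIPE,
    universal_newlines=True,  # decode bytes to text
)

# Messages arrive one per line; handle each as it comes in
for line in proc.stdout:
    line = line.strip()
    if line:
        print("received:", line)  # parse/use the message here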
Currently, I have two programs, one running in Ruby and the other in Python. I need to read a file in Ruby, but first I need a library written in Python to parse the file. Currently, I use XMLRPC to have the two programs communicate. Porting the Python library to Ruby is out of the question. However, I have read that using XMLRPC has some performance overhead. Recently, I read that another solution for the Ruby-Python conundrum is the use of pipes. So I tried to experiment with that. For example, I wrote this master script in Ruby:
(0..2).each do
  slave = IO.popen(['python', 'slave.py'], 'r+')
  slave.write "master"
  slave.close_write            # signal EOF so the slave's read() returns
  line = slave.readline
  while line do
    sleep 1
    p eval line                # evaluate the slave's output as a Ruby literal
    break if slave.eof
    line = slave.readline
  end
end
The following is the Python slave:
import sys

cmd = sys.stdin.read()
while cmd:
    x = cmd
    for i in range(0, 5):
        # Emit a Ruby hash literal for the master to eval
        print "{'%i'=>'%s'}" % (i, x)
        sys.stdout.flush()
    cmd = sys.stdin.read()
Everything seems to work fine:
~$ ruby master.rb
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
{"0"=>"master"}
{"1"=>"master"}
{"2"=>"master"}
{"3"=>"master"}
{"4"=>"master"}
My question is: is it really feasible to use pipes for passing objects between Ruby and Python? One consideration is that there may be multiple instances of master.rb running. Will concurrency be an issue? Can pipes handle extensive operations and objects passed between the two? If so, would pipes be a better alternative to RPC?
Yes. No. If you implement it, yes. Depends on what your application needs.
Basically, if all you need is simple data passing, pipes are fine; if you need to be constantly calling functions on objects in your remote process, then you'll probably be better off using some form of existing RPC instead of reinventing the wheel. Whether that should be XMLRPC or something else is another matter.
Note that RPC has to use some underlying IPC mechanism, which could well be pipes, but might also be sockets, message queues, shared memory, whatever.
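One practical note if you stick with pipes: eval'ing lines that come from another process is risky, and a structured wire format such as JSON avoids that. A minimal sketch of a slave that emits JSON lines instead of Ruby literals (the master would then use JSON.parse(line) instead of eval):

import json
import sys

# Read the command from the master, then answer with JSON lines,
# which the master can parse safely instead of eval'ing.
cmd = sys.stdin.read()
for i in range(5):
    print(json.dumps({str(i): cmd}))
    sys.stdout.flush()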
I'm writing a simple web-based front-end for a Python console program that runs on a local machine. The interactions are extremely simple. Essentially, the web front-end needs to:
1. Accept input from the user (through an AJAX form or something).
2. Pass this input to the Python program and run it.
3. Display the output of the Python console program while it is running, until it terminates.
The first two can be accomplished quite easily (though suitable AJAX library recommendations would be helpful).
Question: What Javascript library would I need to accomplish No. 3?
Remarks:
I am aware of packages like AJAXterm and Shellinabox, but instead of a full shell, I just want to display the output of the Python console program; essentially I'd like a way to pipe Python's stdout to a web page in real time.
You are probably looking for a comet implementation or another server-push protocol, because of the unpredictable timing of the Python code's output. So on the server side you have a thread that reads from your Python process's stdout and pushes the output out to your client via comet.
cometd may be your best bet for the client & server components.
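As a concrete starting point, here is a minimal sketch using Server-Sent Events, a simpler relative of comet; it assumes Flask on the server and a hypothetical console_program.py whose stdout is relayed to the browser:

import subprocess

from flask import Flask, Response  # Flask is an assumption; any framework that can stream works

app = Flask(__name__)

@app.route("/stream")
def stream():
    def generate():
        # Launch the console program (hypothetical name) and relay its stdout
        proc = subprocess.Popen(
            ["python", "console_program.py"],
            stdout=subprocess.PIPE,
            universal_newlines=True,
        )
        for line in proc.stdout:
            yield "data: %s\n\n" % line.rstrip()  # SSE message framing

    return Response(generate(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(threaded=True)

On the browser side, no special library is needed: new EventSource("/stream") subscribes to the stream, and each message's event.data can be appended to the page.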
I have a scenario where a GUI (developed in VB) sends commands to a target system application developed in C (Xilinx).
A user on a PC sends the commands to the target using the GUI.
But now I need to remove the GUI and want to send commands (i.e. call C functions in the target system application) using Python.
I found some useful info on ctypes but don't know the exact procedure, so I keep getting stuck.
Question:
Can anyone tell me the exact procedure to develop a Python script for this? I know a fair amount of Python scripting.
Invoke a SWIG wrapper to make C functions look like Python functions. Take a look at this example.
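Since the question mentions ctypes, here is an alternative minimal sketch that skips SWIG, assuming the C functions have been compiled into a shared library; the library name (libtarget.so) and the function (send_command) are hypothetical stand-ins for the target application's API:

import ctypes

# Load the compiled C code as a shared library (hypothetical name)
lib = ctypes.CDLL("./libtarget.so")

# Declare the signature of a hypothetical C function: int send_command(int code);
lib.send_command.argtypes = [ctypes.c_int]
lib.send_command.restype = ctypes.c_int

result = lib.send_command(42)
print("send_command returned", result)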