LLDB: Equivalent of process connect in python - python

I am currently executing:
process connect <url>
from the lldb command line to make a connection. What I would like to do is include this in a Python script to automate making the connection.
What is the python api I need to use?

In the SB API this is SBTarget::ConnectRemote. It can't be an SBProcess method because you don't have a process before you call this...
Note, you probably don't want to provide your own SBListener to SBTarget::ConnectRemote. It (and all the other process-creation calls that take an SBListener) will use the Debugger's listener if you pass an invalid (default-constructed) listener.
For instance, if you are writing a Python command to do the connect, you'll want to let the regular Debugger's event handler deal with process events after you've connected.
On the off chance you want to try handling process events yourself, the example:
http://llvm.org/svn/llvm-project/lldb/trunk/examples/python/process_events.py
will get you started.
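A minimal sketch of the call from a Python script might look like the following. The connection URL is a placeholder for whatever you pass to "process connect" on the command line; passing a default-constructed SBListener and a None plugin name follows the advice above, letting the Debugger's own listener and the default process plugin take over:

```python
import lldb

# Hypothetical URL; substitute whatever you pass to "process connect".
url = "connect://localhost:1234"

debugger = lldb.SBDebugger.Create()
target = debugger.CreateTarget("")   # an empty target is fine here
error = lldb.SBError()
# Pass an invalid (default-constructed) SBListener so the Debugger's
# own listener handles process events, as recommended above.
process = target.ConnectRemote(lldb.SBListener(), url, None, error)
if not error.Success():
    print("connect failed: %s" % error.GetCString())
```

This assumes the script is run inside lldb (or with lldb's Python module on the path), since `import lldb` only works there.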

Related

Best (easiest) way to execute a function (with response) in another python thread?

I have a little Python script running on a Raspberry Pi (which is hooked up to detect if something is delivered to my mailbox) to send me a Telegram message with a snapshot of my mailbox contents.
Up until now this has been a single monolithic script which handled GPIO interaction (led lights and threaded_callbacks for reed_contacts), picamera and the telegram messaging.
But the Telegram bot I was using (telepot) is no longer supported, which is why I am looking to incorporate another Python Telegram bot implementation (python-telegram-bot), as well as migrate the script to Python 3, since Python 2 has also been discontinued.
But in doing so, I think I will need to split up the script, since python-telegram-bot does not run non-blocking in a calling script.
In my old script I could still continue with the main program after calling MessageLoop(bot, handler).run_as_thread() (which spawns a separate background thread for update checking). But with python-telegram-bot, no instruction after updater.start_polling(); updater.idle() is evaluated until the bot is stopped.
I think my best bet in migrating the script is splitting it into two separate scripts which communicate with each other: one script which handles the interaction with picamera & GPIO, and another one solely for user interaction via Telegram.
For example, the command to request a picture of the actual mailbox contents is received by the telegram_script. The telegram_script should then tell the low_level_script to execute the capture() function and wait for the return/result of this function (to make sure the picture is saved/updated before the telegram_script tries to send it).
My question is, how do I communicate between the two?
What is the best/easiest way in python to execute a function in the low_level_script with the result returned to the telegram_script?
I think it depends on how you want to structure your system. If you have one script that runs as two processes using the multiprocessing module, you could use a pipe or a queue to communicate between them.
If you have two very independent scripts, you could look at using a Unix domain socket instead.
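The pipe approach can be sketched roughly as below. The capture() stub and the "/tmp/mailbox.jpg" path are hypothetical stand-ins for the low-level script's real camera code; the point is that the telegram side sends a command and blocks on recv() until the result comes back, which guarantees the picture is saved before it is sent:

```python
from multiprocessing import Process, Pipe

def capture():
    # Stand-in for the real picamera capture; returns where the
    # snapshot was saved (hypothetical path).
    return "/tmp/mailbox.jpg"

def low_level_worker(conn):
    # Wait for commands from the telegram side, run them,
    # and send the result back over the pipe.
    while True:
        cmd = conn.recv()
        if cmd == "capture":
            conn.send(capture())
        elif cmd == "quit":
            break

if __name__ == "__main__":
    telegram_end, low_level_end = Pipe()
    worker = Process(target=low_level_worker, args=(low_level_end,))
    worker.start()

    # The telegram script asks for a picture and blocks until it is ready.
    telegram_end.send("capture")
    print(telegram_end.recv())

    telegram_end.send("quit")
    worker.join()
```

In a real split you would run the two ends in the two scripts (at which point a Unix domain socket or a multiprocessing.connection.Listener is the closer fit), but the request/response shape stays the same.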

Passing arguments to a running python script

I have a script running on my Raspberry Pi; this script is started by a command from a PHP page. I have multiple if statements, and now I would like to pass new arguments to the script without stopping it. I found lots of information about passing arguments to a Python script at startup, but not about whether it is possible to pass new arguments while the script is already running. Thanks in advance!
The best option in my view is to use a configuration file as the input to your script. Some simple YAML will do. Then, in a separate thread, observe the hash of the file: if it changes, that means somebody has updated your file and you must re-read and adjust your inputs.
Basically you have that observer running constantly in the background.
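A minimal sketch of that observer, using only the standard library (the YAML parsing itself is left out, and the config path is hypothetical):

```python
import hashlib
import os
import threading
import time

CONFIG_PATH = "config.yaml"   # hypothetical location of your input file

def file_hash(path):
    # Hash the whole file; cheap enough for a small config.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def watch_config(path, on_change, interval=1.0):
    # Poll the file's hash and call on_change whenever it differs.
    last = file_hash(path)
    while True:
        time.sleep(interval)
        current = file_hash(path)
        if current != last:
            last = current
            on_change(path)

def reload_inputs(path):
    print("config changed, re-reading %s" % path)  # parse the YAML here

# Run the observer in a daemon thread so it never blocks the script.
if os.path.exists(CONFIG_PATH):
    threading.Thread(target=watch_config,
                     args=(CONFIG_PATH, reload_inputs),
                     daemon=True).start()
```

Comparing a file's modification time (os.stat().st_mtime) is a cheaper alternative to hashing if the config may grow large.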
You need some sort of IPC mechanism really. As you are executing/updating the script from a PHP application, I'd suggest you look into something like ZeroMQ, which supports both Python and PHP and will allow you to do a quick-and-dirty Pub/Sub implementation.
The basic idea is: treat your Python script as a subscriber to messages coming from the PHP application, which publishes them as and when needed. To achieve this, you'll want to start your Python "script" once and leave it running in the background, listening for messages on ZeroMQ. Something like this should get you going:
import zmq
context = zmq.Context()
socket = context.socket(zmq.REP)
socket.bind("tcp://*:5555")
while True:
    # Wait for the next message from your PHP application
    message = socket.recv()
    print("Received a message: %s" % message)
    # Here you should do the work you need to do in your script
    # Once you are done, tell the PHP application you are done
    socket.send(b"Done and dusted")
Then, in your PHP application, you can use something like the following to send a message to your Python service
$context = new ZMQContext();
// Socket to talk to server
$requester = new ZMQSocket($context, ZMQ::SOCKET_REQ);
$requester->connect("tcp://localhost:5555");
$requester->send("ALL THE PARAMS TO SEND TO YOUR PYTHON SCRIPT");
$reply = $requester->recv();
Note, I found the above examples using a quick google search (and amended slightly for educational purposes), but they aren't tested, and purely meant to get you started. For more information, visit ZeroMQ and php-zmq
Have fun.

How to toggle process state via command line?

We are trying to create a simple command line utility that tracks some metrics over an unknown period of time (start/stop triggered via command line externally).
For example,
python metrics-tool.py --start-collect
... run some additional commands external to metrics-tool ...
python metrics-tool.py --stop-collect
Does anyone have any ideas or suggestions on how an application can receive a command for a "second time"? Is this even possible, or a good way to do this?
It almost sounds like this should be a service, configurable at runtime by an endpoint?
This does sound more like a service which can be started and stopped, via Systemd (or Supervisor) for example.
Using Systemd means there's no need to daemonise your Python process yourself: https://stackoverflow.com/a/30189540/736221
You can of course do that if you want to, if you're using something other than Systemd: https://pagure.io/python-daemon

Window socket programming with Python

I'm trying to implement a Windows socket using Python.
Mostly, everything has so far been solved using ctypes.windll.ws2_32 and the pywin32 lib.
However, I haven't been able to find out how to translate the following C++ code into Python, and I wonder if anyone is kind enough to help:
LRESULT WINAPI AsyncSocketProc(
    __in HWND hwnd,
    __in UINT uMsg,
    __in WPARAM wParam,
    __in LPARAM lParam
)
{
    switch (uMsg) {
    case WM_CREATE:
        // ...
        break;
    case WM_SOCKET: {  // WM_SOCKET is basically an int constant
        switch (WSAGETSELECTEVENT(lParam)) {
        case FD_ACCEPT:
            // accepting new conn
            break;
        case FD_READ:
            // receiving data
            break;
        }
        break;
    }
    }
}
In the above code, I couldn't find Python's equivalent for WSAGETSELECTEVENT.
For the FD_ACCEPT, FD_READ, I could find them inside win32file package (of pywin32 lib)
Lastly, the reason I'm trying to implement this Windows socket programming is that the C++ version of the Windows socket server (above) is non-blocking for an application of mine, but Python's built-in select.select is blocking. So I'm trying to see whether I can port the C++ version to Python and whether it works.
EDITED:
I would like to clarify that the socket server works as a 'plug in' to an existing C++ program, which doesn't support threading.
The socket server needs to wait (indefinitely) for clients to connect so it needs to continuously listen.
So using a normal Python socket or select.select would entail a while loop (otherwise, how could it act as a server continuously listening for events? Please correct me if I'm wrong), which would block the main program.
Somehow, using the Windows socket server callback above, the main program is not blocked. And this is the main reason why I'm trying to port it to Python.
The socket server is preferably in Python because many related libs the server needs are written in Python.
Thanks a lot.
Have a look at the socket module instead. It already contains all the code you need to work with sockets without using the win32 API.
[EDIT] You can write multi-threaded code that can handle several connections. Just accept the connection, then start a new thread, hand it the connection, and let it read the data in a while 1: data = conn.recv(1024) ... kind of loop.
That said, Python also has a module for just that: SocketServer
[EDIT2] You say
the socket server works as a 'plug in' to an existing program, which doesn't support threading.
It's a bit hard to help with so little information but think about it this way:
You can run the socket server loop in a new thread. This code is isolated from the rest of your app, so it doesn't matter whether the other code uses/supports threads. This solves your "endless loop" problem.
Now this socket server loop will get connections from clients. My guess is that the clients will call methods from the rest of the app and here, things get hairy.
You need a way to synchronize these calls. In other places (like all UI frameworks), there is a single thread which runs any UI calls (drawing something, creating the UI, responding to user input).
But if I understand you correctly, then you can in fact modify the "main loop" of the existing app and let it do more things (like listening to new connections). If you can do this, then there is a way out:
Create a new thread for the socket server as described above. When the server gets a connection, spawn a new thread that talks to the client. When the client sends commands, create "work objects" (see command pattern) and put them into a queue.
In the main loop, you can look into the queue. If something is in there, pop the work objects and call their run() method.
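The work-object queue described above can be sketched as follows (the Work class and client_handler are illustrative names, not from any particular library). The client-facing thread only ever puts objects into the queue; the existing app's main loop pops them and runs them, so all "real" work stays on the main thread:

```python
import queue
import threading

work_queue = queue.Queue()

class Work:
    # Minimal command-pattern object: the client's request is captured
    # here and executed later, on the main thread.
    def __init__(self, func, *args):
        self.func = func
        self.args = args

    def run(self):
        return self.func(*self.args)

def client_handler():
    # In the real app this thread would talk to a socket client and
    # turn its commands into work objects.
    work_queue.put(Work(print, "hello from a client"))

threading.Thread(target=client_handler).start()

# Inside the existing app's main loop:
while True:
    try:
        work = work_queue.get(timeout=0.5)
    except queue.Empty:
        break   # nothing pending; the real loop would carry on
    work.run()
```

queue.Queue is thread-safe, so no extra locking is needed around put() and get().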
You don't need or want to port this code. This code is specific to how the WIN32 API notifies native code that a socket operation has completed. It doesn't apply in Python.
The equivalent in python would be, roughly, to paste the "accepting new conn" code in wherever your python code accepts a new connection. And paste the "receiving data" code wherever your python code receives data.
You can also use select; just keep in mind that the semantics are somewhat the reverse of async sockets. With async sockets, you start an operation whenever you want and you get a callback when it completes. With select, it tells you when to perform an operation such that it completes immediately.
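In particular, select need not block at all: with a zero timeout it just polls, so it can be called once per iteration of the host program's existing main loop. A small sketch:

```python
import select
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0 = any free port
server.listen(5)
server.setblocking(False)

# A zero timeout makes select return immediately, so this call can sit
# inside the host program's main loop without ever blocking it.
readable, _, _ = select.select([server], [], [], 0)
for sock in readable:
    # select said the accept will complete immediately.
    conn, addr = sock.accept()
    # ... receive data from conn here ...
print(readable)   # [] until a client actually connects
```

This gives roughly the behaviour of the WSAAsyncSelect-style callback: the main loop stays responsive, and socket work happens only when select reports something is ready.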

How can I detect what other copy of Python script is already running

I have a script. It uses GTK. And I need to know if another copy of the script starts. If it does, the window will extend.
Please tell me how I can detect this.
You could use a D-Bus service. Your script would start a new service if none is found running in the current session, and otherwise send a D-Bus message to the running instance (which can contain "anything", including strings, lists, and dicts).
The GTK-based library libunique (missing Python bindings?) uses this approach in its implementation of "unique" applications.
You can use a PID file to determine if the application is already running (just search for "python daemon" on Google to find some working implementations).
If you detected that the program is already running, you can communicate with the running instance using named pipes.
The new copy could search for running copies, fire a SIGUSR1 signal, and trigger a callback in your running process that then handles all the magic.
See the signal library for details and the list of things that can go wrong.
I've done that in several ways depending on the scenario.
In one case my script had to listen on a TCP port, so I'd just check whether the port was available: if it was, this was a new copy. This was sufficient for me, but in certain cases, if the port is already in use, it might be because some other kind of application is listening on that port. You can use OS calls to find out who is listening on the port, or try sending data and checking the response.
In another case I used a PID file. Just decide on a location and a filename, and every time your script starts, read that file to get a PID. If a process with that PID is running, it means another copy is already there. Otherwise, create that file and write your process ID into it. This is pretty simple. If you are using Django then you can simply use Django's daemonizer: "from django.utils import daemonize". Otherwise you can use this script: http://www.jejik.com/articles/2007/02/a_simple_unix_linux_daemon_in_python/
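The PID-file check can be sketched like this (the file location is hypothetical, and this is POSIX-only since it uses os.kill with signal 0 as an existence check):

```python
import os

PID_FILE = "/tmp/myscript.pid"   # hypothetical location

def another_copy_running():
    # Read the PID left behind by a previous start, if any.
    try:
        with open(PID_FILE) as f:
            pid = int(f.read().strip())
    except (FileNotFoundError, ValueError):
        return False               # no file, or garbage in it
    try:
        os.kill(pid, 0)            # signal 0: existence check only
    except ProcessLookupError:
        return False               # stale PID file; process is gone
    except PermissionError:
        return True                # process exists, owned by someone else
    return True

if not another_copy_running():
    with open(PID_FILE, "w") as f:
        f.write(str(os.getpid()))
```

Note there is a small race between the check and the write; for a single-user desktop script that is usually acceptable, but file locking (e.g. fcntl.flock on the PID file) closes the gap.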