Pretty much what the title says: I would like to be able to connect to a Python process running under paster or uwsgi and use pdb functionality.
Using winpdb, you can attach to a running process like this:
1. Insert
import rpdb2; rpdb2.start_embedded_debugger('mypassword')
inside your script.
2. Launch your script (through paster or uwsgi) as usual.
3. Run winpdb.
4. Click File > Attach.
5. Type in the password (e.g. "mypassword") and select the process.
To detach, click File > Detach. The script will continue to run and can be attached to again later.
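For example, in a WSGI app served by paster or uwsgi, the call goes at module level so it runs once when the process imports the module. A minimal sketch, assuming a hypothetical entry module (the app itself is a placeholder):

import rpdb2

# By default this blocks until winpdb attaches (and times out after a
# few minutes if nothing connects), then execution continues normally.
rpdb2.start_embedded_debugger('mypassword')

def application(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'debuggable process\n']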
Normally, I would use "blender -P script.py" to run a python script. In this case, a new blender process is started to execute the script. What I am trying to do now is to run a script using a blender process that is already running, instead of starting a new one.
I have not seen any source on this issue so far, which makes me concerned about whether this approach is actually feasible.
Any help would be appreciated.
Blender isn't designed to be started from the CLI and then keep receiving more commands from the CLI as it runs. It does, however, include a text editor that can open text files and run a text block as a Python script, and it also includes a Python console that can be used to interactively type in commands while Blender is running. You may also find this addon useful, as it lets you run a text block in the Python console; that leaves you with an interactive session containing the variables as they exist at the end of the script's execution.
There is a CLI option to run Blender as a Python console: blender --python-console. The GUI does not get updated while this console is running, so you could open and exec several scripts; when you exit the console, Blender will update its GUI and allow interactive use, or, if you start in background mode with -b, it will quit when you exit the console.
My solution was to launch Blender from the console with a Python script (blender --python script.py) that contains a while loop and creates a server socket to receive requests to run specific code. The loop prevents Blender from opening the GUI, and the socket handles multiple requests inside the same Blender process.
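A minimal sketch of that approach; the port and the tiny exec-based protocol are arbitrary choices of mine, not anything Blender-specific:

# Launched with: blender --python script.py
import socket

import bpy  # available because this runs inside Blender's Python

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('localhost', 5000))
server.listen(1)

while True:  # this loop is what keeps the GUI from taking over
    conn, _ = server.accept()
    data = conn.recv(4096)
    if data == b'quit':
        conn.close()
        break
    try:
        exec(data.decode('utf-8'))  # run the request in this Blender
        conn.sendall(b'ok')
    except Exception as exc:
        conn.sendall(str(exc).encode('utf-8'))
    conn.close()

server.close()

A client can then send e.g. bpy.ops.wm.save_mainfile() as text and have the already-running instance execute it.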
I'm developing a cassandra storage finder for graphite-api.
graphite-api is installed via pip and run via gunicorn, so I can't just call the script with a debugger, but I want to use interactive debugging.
When I import pdb in my storage finder and set a breakpoint, the code halts there, but how can I now connect to the pdb session running headless inside the script?
Or is my approach to this debugging problem wrong, and does this have to be done in a completely different way?
pdb runs its prompt on the process's stdin/stdout, which gunicorn owns, so plain pdb is not what you want here. Have a look at rpdb or other remote debugging solutions.
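For example, rpdb serves the pdb prompt over a TCP socket (127.0.0.1:4444 by default) instead of over stdin/stdout. A minimal sketch inside a storage finder (the class here is a placeholder):

import rpdb

class CassandraFinder(object):  # hypothetical finder
    def find_nodes(self, query):
        rpdb.set_trace()  # the worker halts here until a client connects
        ...

You can then connect to the halted gunicorn worker with nc 127.0.0.1 4444 (or telnet) and use the usual pdb commands.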
I am trying to use Fabric to run commands on a remote machine.
This works fine until the command on the remote machine is interactive. In that case, Fabric returns the interactive shell and forces me to type the required input myself, while I am trying to send a command that does everything remotely so I can automate the procedure.
Example:
from fabric.api import *
env.hosts=['myhost.mydomain']
env.user='root'
run('test1/myapp; exectask; exit')
I run a CLI application on the remote machine that uses an interactive shell: it waits for me to type the command (exectask), and once that is done, I type the exit command to quit.
What happens now is that the app launches, Fabric shows me the interactive UI, and I still need to type exectask and, afterwards, exit myself.
How can I tell Fabric to run the app, pass the command to its interactive shell, and then send exit to quit?
I see that Fabric has the prompt feature, but that is for asking the user to input data, while I want to just pass the commands in and get the result back.
You might want to look into pexpect, a pure-Python module that works like expect. Essentially, it allows your program to spawn an external program or process and then control it as if a human were interacting with it. You program in what output the program should expect to see (hence the name) and what action(s) to take in response.
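A minimal sketch for the case in the question, assuming key-based SSH auth and that the app shows a prompt ending in '> ' (the host, path and prompt are placeholders):

import pexpect

child = pexpect.spawn('ssh root@myhost.mydomain test1/myapp')
child.expect('> ')            # wait for the app's interactive prompt
child.sendline('exectask')    # type the command for us
child.expect('> ')            # wait until it has finished
child.sendline('exit')        # quit the app
child.expect(pexpect.EOF)     # the session is closed
print(child.before.decode())  # output captured since the last expect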
I have a Flask web application. A user can send a request, which my script processes, and then it runs another Python script in the console with some parameters, like this:
import os
os.system('python start.py -u 100 -p 122224')
All works well, but now I want to control all running copies of my script: start, stop and pause them.
How can I do this without resorting to hacks?
Check out the subprocess and multiprocessing modules. The first one lets you execute an external application and keep a handle to it. To use the second one you'll need to call Python code rather than an external command, but its management capabilities are much wider.
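For example, with subprocess every launched copy is a Popen object that you can keep around (say, in a dict keyed by a job id) and signal later. A minimal sketch; pausing via signals is POSIX-only:

import signal
import subprocess

# start: non-blocking, returns a handle to the running copy
proc = subprocess.Popen(['python', 'start.py', '-u', '100', '-p', '122224'])

# pause and resume (POSIX only)
proc.send_signal(signal.SIGSTOP)
proc.send_signal(signal.SIGCONT)

# stop: ask it to terminate, then reap it
proc.terminate()
proc.wait()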
I want to execute a Python script on several (15+) remote machines using SSH. After invoking the script/command, I need to disconnect the SSH session and keep the processes running in the background for as long as they are needed.
I have used Paramiko and PySSH in the past, so I have no problem using them again. The only thing I need to know is how to disconnect an SSH session in Python (since normally the local script would wait for each remote machine to finish processing before moving on).
This might work, or something similar:
ssh user@remote.host 'nohup python scriptname.py &'
Basically, have a look at the nohup command.
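Since you already know Paramiko, here is a minimal sketch of the same idea from Python (the hostnames and username are placeholders, and key-based auth is assumed):

import paramiko

hosts = ['remote1.example.com', 'remote2.example.com']

for host in hosts:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username='user')
    # redirecting all streams lets the command return immediately
    client.exec_command('nohup python scriptname.py > /dev/null 2>&1 &')
    client.close()  # disconnect; the remote process keeps running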
On Linux machines, you can run the script with 'at'.
echo "python scriptname.py" ¦ at now
If you are going to perform repetitive tasks on many hosts, for example deploying software and running setup scripts, you should consider using something like Fabric:
Fabric is a Python (2.5 or higher) library and command-line tool for
streamlining the use of SSH for application deployment or systems
administration tasks.
It provides a basic suite of operations for executing local or remote
shell commands (normally or via sudo) and uploading/downloading files,
as well as auxiliary functionality such as prompting the running user
for input, or aborting execution.
Typical use involves creating a Python module containing one or more
functions, then executing them via the fab command-line tool.
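A minimal fabfile.py sketch for this use case (Fabric 1.x style, matching the quote above; the script name is a placeholder):

from fabric.api import env, run

env.user = 'user'

def start_script():
    # pty=False plus the redirects keep the process alive after
    # Fabric disconnects
    run('nohup python scriptname.py > /dev/null 2>&1 &', pty=False)

Running fab -H host1,host2 start_script then executes it on each host in turn.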
You can even use tmux in this scenario.
As per the tmux documentation:
tmux is a terminal multiplexer. It lets you switch easily between several programs in one terminal, detach them (they keep running in the background) and reattach them to a different terminal. And do a lot more
From a tmux session, you can run a script, quit the terminal, log in again and check back; tmux keeps the session alive until the server restarts.
How to configure tmux on a cloud server
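For example, a typical cycle looks like this (the session name is an arbitrary choice):

tmux new -s myscript        # start a named session
python scriptname.py        # run the script inside it
# detach with Ctrl-b then d, and log out; the script keeps running
tmux attach -t myscript     # reattach later from a new login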