Run Python script in Python environment? - python

In the terminal when starting Python, can I run a Python script from inside the Python environment?
I know I can run it from bash, but I don't know if I can run it in the Python environment. The purpose is to see, when the script goes wrong, the values of the variables at that time.

You have two options for that (neither of which is precisely what you're asking, but both are proper ways to achieve the desired outcome).
First, the pdb module:
import pdb; pdb.set_trace()
This enters the debugger at whatever point you place this code. Useful for seeing variables.
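For illustration, a minimal sketch (the function and values here are made up):

import pdb

def divide(a, b):
    pdb.set_trace()  # execution pauses here; inspect a and b, then step or continue
    return a / b

divide(10, 0)

At the (Pdb) prompt you can print variables (p a), step to the next line (n), or continue (c).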
Second, running the command with -i:
$ python -i script.py
This drops you into the full interpreter after execution, with all variables intact.
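For example, with a throwaway script like this (buggy.py is a made-up name):

# buggy.py
values = [1, 2, 3]
total = sum(values)
raise RuntimeError("something went wrong")

Running python -i buggy.py prints the traceback, then leaves you at a >>> prompt where values and total are still defined for inspection.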

Related

Python pdb on python script run as package

I have a python program that I usually run as a part of a package:
python -m mymod.client
in order to deal with relative imports inside "mymod/client.py". How do I run this with pdb, the Python debugger? The following does not work:
python -m pdb mymod.client
It yields the error:
Error: mymod.client does not exist
EDIT #1 (to address the question possibly being a duplicate)
My question isn't really about running two modules simultaneously with Python; rather, it is about how to use pdb on a Python script that has relative imports inside it, which one usually deals with by running the script with "python -m".
Restated, my question could then be: how do I use pdb on such a script without having to change the script itself just to have it run with pdb (i.e., preserving the relative imports inside the script as much as possible)? Shouldn't this be possible, or am I forced to refactor in some way if I want to use pdb? If so, what would be the minimal changes to the structure of the script that I'd have to introduce to allow me to leverage pdb?
In summary, I don't care how I run the script, just so long as I can get it working with pdb without changing its internal structure (relative imports, etc.) too much.
I think I have a solution.
Run it like this:
python -m pdb path/mymod/client.py arg1 arg2
that will run it as a script, but will not treat it as a package.
At the top of client.py, the first line should be:
import mymod
That will get the package itself loaded.
I am still playing with this, but it seems to work so far.
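A sketch of the layout this workaround assumes (mymod and the paths are placeholders):

# Layout:
#   path/
#     mymod/
#       __init__.py
#       client.py
#
# First line of client.py:
import mymod  # loads the package so intra-package imports resolve

# Then invoke it as:
#   python -m pdb path/mymod/client.py arg1 arg2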
This is not possible. Though unstated in the documentation, Python will not chain two modules via the -m command-line option. (Note that pdb later gained this ability: as of Python 3.7, python -m pdb -m mymod.client works.)

Python - running a script as a service with a name other than 'python'

I have a set of python scripts which I run as daemon services. These all work great, but when all the scripts are running and I use top -u <USER>, I see all my scripts running as python.
I would really like to know which script is running under which process id. So is there any way to execute a python script as a different process name?
I'm stuck here, and I'm not even sure what terms to Google. :-)
Note: I'm using Ubuntu Linux. Not sure if the OS matters or not.
Try using setproctitle. It should work fine on Linux.
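A minimal sketch, assuming the third-party setproctitle package is installed (pip install setproctitle); the title string is whatever name you want top/ps to display:

from setproctitle import setproctitle

setproctitle("my-daemon")  # this process now shows up as "my-daemon" in top/ps

# ... the rest of the daemon's work ...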
I don't have a Linux system here to test this on appropriately, but if the above doesn't work, you should be able to use the same trick used for things like gzip etc.
The script has to declare at the top what should run it, like this:
#!/usr/local/bin/python
Use a softlink like this:
ln -s /usr/local/bin/python ~/bin/myutil
Then just change your script to
#!/home/yourname/bin/myutil
(note that the kernel does not expand ~ in shebang lines, so spell out the full path) and it should show up that way instead. You may need to use a hard link instead of a soft link.
Launching a python script by its own name (via file associations and/or shell magic) is not very portable, but you can use similar methods on nearly any OS.
The easiest way to get this is using a shebang. The first line of your python script should be:
#!/usr/bin/python
or
#!/usr/bin/python3
depending upon whether you use python or python3
and then assign executable permissions to the script as follows:
chmod +x <scriptname>
and then run the script as
./scriptname
This will show up as scriptname in top.
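A throwaway script to verify this (the idle loop just keeps the process alive long enough to check top):

#!/usr/bin/python3
import time

while True:
    time.sleep(60)  # keep the process around so you can inspect it in top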

How can I debug Python code without running a script (using Eclipse)?

I've noticed how easy it is to debug a Python script from Eclipse. Simply set breakpoints and run a Python script from the debug menu. But is it possible to start a Python interactive interpreter instead of running a particular Python script, while still having Eclipse break on breakpoints? This would make it so much easier to test functions.
Thanks for any help
Still looking for a simple(ish) way to start the debugger in Eclipse/PyDev that lets me use the interactive debugger. None of the answers so far is acceptable.
You can explicitly write code to set up a breakpoint in your script, and then "remote debug". That means having pydevd on the pythonpath of your script wherever it is running, and running the Eclipse PyDev remote debugger on your dev box. If it's all happening on the same machine, this is fairly straightforward.
If not, you'll need to specify the hostname of the dev machine running the python remote debugger in the call to settrace(). You'll also need pydevd available on the machine running the script.
I have got this working in the past without having to install eclipse+pydevd on the machine running the script. It's not totally straightforward, and if you go that route I'd recommend checking that the pydevd versions match or at least you know they're compatible. Otherwise you end up wasting time debugging the debugger.
For details see: Pydev Remote Debugger
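A minimal sketch of the settrace() call, assuming pydevd is importable and the PyDev debug server is listening on its default port 5678:

import pydevd

# host is only needed when the script runs on a different machine than
# Eclipse; suspend=True pauses execution at this line.
pydevd.settrace("localhost", port=5678, suspend=True)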
What about this: in the script, write a function, say onlyForTest. Each time you write a new function and want to test it, put a call to it inside onlyForTest with the arguments it needs. Then open the interactive Python shell, import the script, call the onlyForTest function, and check the result.
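A sketch of that pattern (all the names here are made up):

def new_function(x, y):
    return x + y

def onlyForTest():
    # call whatever you are currently working on, with canned arguments
    print(new_function(2, 3))

# In the interactive shell:
#   >>> import myscript
#   >>> myscript.onlyForTest()
#   5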

python multithreading issue in cronjob

I have a python program that uses the ThreadPool for multithreading. The program is one step in a shell script. When I execute the shell script manually on the command line, the entire flow works as expected. However, when I execute the shell script as a cronjob, it appears that the flow goes to the next steps before the python multithreading steps are completely finished.
Inside the python program, I do call AsyncResult.get(timeout) to wait for all the results to come back before moving on.
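Roughly the pattern being described, as a sketch (the task and the timeout value are placeholders):

from multiprocessing.pool import ThreadPool

def work(item):
    return item * 2  # stand-in for the real task

pool = ThreadPool(4)
async_results = [pool.apply_async(work, (i,)) for i in range(10)]
pool.close()

# get() blocks for each result, re-raises worker exceptions, and raises
# multiprocessing.TimeoutError if the timeout expires.
results = [r.get(timeout=60) for r in async_results]
pool.join()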
Run your program via batch(1) (see the output of the command man batch) as well. If that works OK, but the cron version does not, then it is almost certainly a problem with your environment variable setup. To verify that, run printenv from your interactive shell to inspect your environment there. Then do the same thing inside the crontab (you will just need to temporarily set up an extra cron entry for it). Try setting the variables in your shell script before invoking Python.
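To capture what cron actually sees, a temporary entry could run something like this (the output path is arbitrary):

import os

# Dump the environment this process inherited, for diffing against the
# output of printenv in your interactive shell.
with open("/tmp/cron_env.txt", "w") as f:
    for key in sorted(os.environ):
        f.write("%s=%s\n" % (key, os.environ[key]))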
On the other hand, if it doesn't work via batch(1) either, it could be something to do with the files that your code has open. Try running your shell script with input redirected from /dev/null and output going to a file:
$ /usr/local/bin/myscript </dev/null >|/tmp/outfile.txt 2>&1
Try setting TERM=xterm (or whatever value the env command reports in your terminal) in your crontab.

Wrap all commands entered within a Bash-Shell with a Python script

What I'd like to have is a mechanism by which all commands I enter in a Bash terminal are wrapped by a Python script. The Python script executes the entered command, but adds some additional magic (for example, setting "dynamic" environment variables).
Is that possible somehow?
I'm running Ubuntu and Debian Squeeze.
Additional explanation:
I have a property file which changes dynamically (some scripts may alter it at any time). I need the properties from that file as environment variables in all my shell scripts. Of course I could parse the property file somehow from the shell, but I prefer using an object-oriented style for that (especially for writing), as can be done with Python (and ConfigObj).
Therefore I want to wrap all my scripts with that Python script (without having to modify the scripts themselves), which hands these properties down to all the shell scripts.
This is my current use case, but I can imagine I'll find additional cases to which I can extend my wrapper later on.
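For illustration, a sketch of the property-file side, assuming the configobj package and an INI-style file (the path is made up); it prints export lines that a shell hook could eval:

from configobj import ConfigObj

config = ConfigObj("/etc/myapp/props.ini")
for key, value in config.items():
    print('export %s="%s"' % (key, value))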
The perfect way to wrap every command that is typed into a Bash shell is to change the variable PROMPT_COMMAND inside .bashrc. For example, if I want to do some Python stuff before every command, as asked in my question:
.bashrc:
# ...
PROMPT_COMMAND="python mycoolscript.py; $PROMPT_COMMAND;"
export PROMPT_COMMAND
# ...
Now the script mycoolscript.py runs before every prompt, i.e., around every command.
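Note that mycoolscript.py runs as a child process, so it cannot modify the shell's environment directly; to set variables it would have to print them for the shell to eval, as the DEBUG-trap answer below does. A hypothetical sketch of what it might contain:

import os

# Do per-prompt work here; anything printed appears above each new prompt.
print("cwd is now", os.getcwd())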
Use Bash's DEBUG trap. Let me know if you need me to elaborate.
Edit:
Here's a simple example of the kinds of things you might be able to do:
$ cat prefix.py
#!/usr/bin/env python
print "export prop1=foobar"
print "export prop2=bazinga"
$ cat propscript
#!/bin/bash
echo $prop1
echo $prop2
$ trap 'eval "$(prefix.py)"' DEBUG
$ ./propscript
foobar
bazinga
You should be aware of the security risks of using eval.
I don't know of anything that does exactly this, but here are two things that might help:
http://sourceforge.net/projects/pyshint/
The IPython shell has some functionality to execute shell commands in the interpreter.
There is no direct way you can do it.
But you can write a Python script that emulates a Bash terminal, and use the subprocess module to execute commands the way you like.
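A rough sketch of that idea (the prompt text and the "magic" hook are placeholders):

import subprocess

while True:
    try:
        command = input("wrapped$ ")
    except EOFError:
        break
    # ... insert your extra magic here (set env vars, log the command, etc.) ...
    subprocess.run(command, shell=True)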
