Wrap all commands entered within a Bash shell with a Python script

What I'd like to have is a mechanism so that all commands I enter in a Bash terminal are wrapped by a Python script. The Python script executes the entered command, but adds some additional magic (for example, setting "dynamic" environment variables).
Is that possible somehow?
I'm running Ubuntu and Debian Squeeze.
Additional explanation:
I have a property file which changes dynamically (some scripts may alter it at any time). I need the properties from that file as environment variables in all my shell scripts. Of course I could parse the property file from the shell somehow, but I prefer an object-oriented style for that (especially for writing), as can be done with Python (and ConfigObj).
Therefore I want to wrap all my scripts with that Python script (without having to modify the scripts themselves), so that it hands these properties down to all the shell scripts.
This is my current use case, but I can imagine finding additional cases to which I can extend my wrapper later on.

The most convenient way to wrap every command typed into a Bash shell is to change the PROMPT_COMMAND variable inside .bashrc. For example, if I want to do some Python stuff before every command, as asked in my question:
.bashrc:
# ...
PROMPT_COMMAND="python mycoolscript.py; $PROMPT_COMMAND"
export PROMPT_COMMAND
# ...
Bash now runs mycoolscript.py just before it prints each prompt, i.e. effectively before every command you type.
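As a rough sketch of what mycoolscript.py could do for the property-file use case (the file name props.ini and the ConfigObj usage are assumptions, not part of the answer above): note that a plain python call from PROMPT_COMMAND runs in a child process, so for the variables to actually reach the interactive shell you would have to eval the script's output, e.g. PROMPT_COMMAND='eval "$(python mycoolscript.py)"'.
#!/usr/bin/env python
# Sketch: read the dynamically changing property file and print
# export statements for the shell to eval.
from configobj import ConfigObj  # assumes the configobj package is installed

config = ConfigObj("props.ini")  # hypothetical property file
for key, value in config.items():
    print('export %s="%s"' % (key, value))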

Use Bash's DEBUG trap. Let me know if you need me to elaborate.
Edit:
Here's a simple example of the kinds of things you might be able to do:
$ cat prefix.py
#!/usr/bin/env python
print "export prop1=foobar"
print "export prop2=bazinga"
$ cat propscript
#!/bin/bash
echo $prop1
echo $prop2
$ trap 'eval "$(./prefix.py)"' DEBUG
$ ./propscript
foobar
bazinga
You should be aware of the security risks of using eval.
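Tying this back to the property file from the question: since the DEBUG trap fires before every command, prefix.py could re-read the file each time, something like this sketch (props.ini and the ConfigObj usage are assumptions; shlex.quote keeps arbitrary values safe to eval):
#!/usr/bin/env python3
# Sketch: emit the current properties as shell export statements,
# quoting the values so they are safe to eval.
import shlex
from configobj import ConfigObj  # assumes the configobj package is installed

config = ConfigObj("props.ini")  # hypothetical property file
for key, value in config.items():
    print("export {}={}".format(key, shlex.quote(str(value))))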

I don't know of an existing tool that does this, but two things might help you:
http://sourceforge.net/projects/pyshint/
The IPython shell has some functionality to execute shell commands in the interpreter.
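For example, in an IPython session you can run shell commands with ! and set environment variables with the %env magic (a quick illustration):
In [1]: !echo "hello from the shell"
hello from the shell

In [2]: %env MY_PROP=foobar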

There is no direct way you can do it.
But you can write a Python script that emulates a Bash terminal, and use the subprocess module to execute the commands the way you like.
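A very rough sketch of such a wrapper loop, re-reading a hypothetical props.ini before every command and passing its values along as environment variables:
#!/usr/bin/env python3
# Sketch of a minimal "wrapped" shell: read a command, refresh the
# environment from a property file, then run the command through bash.
import os
import subprocess
from configobj import ConfigObj  # assumes the configobj package is installed

while True:
    try:
        command = input("wrapped$ ")
    except EOFError:
        break
    if command.strip() in ("exit", "quit"):
        break
    env = os.environ.copy()
    env.update({k: str(v) for k, v in ConfigObj("props.ini").items()})  # hypothetical file
    subprocess.run(command, shell=True, env=env, executable="/bin/bash")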

Related

How to run a Python script setting the location of the modules in the shell?

I would like to run a Python script setting in the shell where the interpreter must look for the modules.
Suppose that myscript.py contains only:
import mymodule ; mymodule.myfunction()
But mymodule is in /home/user/hello, whereas myscript.py is, say, in /home/user/Desktop. I want to run on a terminal something like:
$ python /home/user/Desktop/myscript.py LOCATION_OF_THE_MODULES=/home/user/hello.
Would it be possible? I think an alternative solution would be to define the location in the code itself at import time, but this is not what I am looking for. I want to set the location through a variable in the shell.
So, I've been exploring your question a little, and it turns out this isn't really a Python question but a shell question: there is a way to do it, but since Python can't hop from a script into interactive mode on its own, we can't do it with a plain Python script alone. However, the python command has some extra options we can use;
see:
python3 -h
for more info.
Specifically, there are two interesting options, -i and -c, which stand for interactive mode and command string respectively. That way we can load the modules with -c and hop into interactive mode with -i, like so:
python3 -i -c "import os"
Obviously, we need to make it more advanced so it can load multiple modules without extra Python scripting, which means building the actual command that runs Python and loads the scripts you want. There is a problem, though: because we have to assemble that command in the shell, it may run into incompatibilities, since not all shells have the same syntax. There might be another, lower-level answer to this problem, but I couldn't get to it; however, I will leave a Bash script for reference so you can use it and/or edit it so it works best with your shell.
FINAL_VAR=""
cd "$1"
for f in *.py; do
    FINAL_VAR+="import ${f%.py}"$'\n'
done
python3 -i -c "$FINAL_VAR"
Usage steps:
Copy and save the script
Give it run permissions (chmod +x file_name.sh)
Run it this way: ./file_name.sh "/full/path/to/your/modules"
It will load all the .py files and will hop into an interactive Python shell for your use
Note: You might want to change the last line so it works accordingly to your Python installation
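If you'd rather not build the import string in the shell at all, a pure-Python variant of the same idea could look like this sketch (the script name and exact behaviour are assumptions): it imports every .py file found in the given directory and then drops into an interactive console via the standard code module.
#!/usr/bin/env python3
# Sketch: import every module in a directory, then go interactive.
import code
import importlib
import pathlib
import sys

module_dir = pathlib.Path(sys.argv[1]).resolve()
sys.path.insert(0, str(module_dir))

namespace = {}
for py_file in module_dir.glob("*.py"):
    namespace[py_file.stem] = importlib.import_module(py_file.stem)

code.interact(local=namespace)
You would run it as python3 load_and_go.py /full/path/to/your/modules (load_and_go.py being a placeholder name).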

Applying source on Python script to export environment variables?

Context:
I'm working with a few developers, some on OSX and some on Linux. We use a collection of bash scripts to load environment variables, build docker images etc. The OSX users are using zsh instead of bash, so some tricks (specifically how we're getting the running script's directory) aren't compatible. What works on Linux (but not OSX) is:
$ cat ./scripts/env_script.sh
THIS_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
# ... more variables ...
$ source ./scripts/env_script.sh
$ echo $THIS_DIR
# the absolute path to the scripts directory
Some cursory testing shows this doesn't work on OSX, presumably because BASH_SOURCE doesn't work the same with zsh. Fine.
Goal:
We're all working with Python, and we would like to replace our bash scripts with Python.
The same pattern is very easy in Python.
from pathlib import Path
THIS_DIR = str(Path(__file__).parent)
Not the cleanest, but it works anywhere. Ideally we could run this and share THIS_DIR with the shell that called the script. Something like
$ <answer to my question> ./scripts/env_script.py
$ echo $THIS_DIR
# absolute path to the scripts directory
Question:
Suppose we rewrote our ./scripts/env_script.sh into ./scripts/env_script.py and collected a list of variables we would like to share with the calling environment.
That is, we wrote the Python script above.
Is it possible to do something akin to source ./scripts/env_script.py so that we can export the variables from the Python script into the calling shell?
What I've tried:
I found a potential duplicate here, though the answer is basically "Python runs in a sub-process; source would only work if your shell is Python", which makes sense but doesn't answer the question.
If there isn't a nice way to exchange these variables, then another option would basically be to have the Python script spit the variables to stdout and then eval them within my shell, i.e. eval $(./scripts/env_script.py). I guess this could work, but it feels wrong, at least because I was always taught that eval is evil.
I spent a little more time on the second solution since I could imagine it working (although I'm not happy with it). Basically, I rewrite ./scripts/env_script.py to be
#!/usr/bin/env python3
# Path is imported under a leading underscore so the loop below skips it
from pathlib import Path as _Path

FOO = 'whatever'
THIS_DIR = str(_Path(__file__).parent)
# and so on...

# prints all "normal" (non-underscore) variables to stdout as NAME="value"
def_vars = list(locals().items())
for name, val in def_vars:
    if name.startswith("_"):
        continue
    print(f'{name}="{val}"')
So running it normally would output
$ ./scripts/env_script.py
FOO="whatever"
THIS_DIR="..."
...
$ echo $FOO
# nothing
Then, to jam those variables into the calling shell, we wrap that with eval and it does the job
$ eval $(./scripts/env_script.py)
$ echo $FOO
whatever
I guess with some further tweaking, the footer at the end of env_script.py could be made a little more robust and then moved outside of the script. Some sort of alias could be defined to make it syntactically nicer, so that calling it could look like pysource ./scripts/env_script.py. Still, I'm not satisfied with this answer.
Edit: I did the tweaking and wrote a downloadable script here.
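For completeness, such a pysource helper could be as small as a shell function in .bashrc (pysource is just the name suggested above; this is a sketch, not the downloadable script):
pysource() {
    eval "$(python3 "$1")"
}
After that, pysource ./scripts/env_script.py sets FOO and THIS_DIR in the calling shell.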

Execute python code from within a shell script

I want to have some python code run within a shell script. I don't want to rely on an external file to be ran. Is there any way to do that?
I did a ton of googling, but there aren't any clear answers. This code is what I found... but it relies on an external Python script being run. I want it all within one file.
python python_script.py
You can use a so-called "here document":
#!/usr/bin/env bash
echo "hello from bash"
python3 - <<'EOF'
print("hello from Python 3")
EOF
The single quotes around the first EOF prevent the usual expansions and command substitutions in the shell script.
If you want those to happen, simply remove them.
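For instance, with the quotes removed the shell expands variables inside the here document before Python ever sees the text (a small sketch):
#!/usr/bin/env bash
greeting="hello from bash"
# Unquoted EOF: the shell expands $greeting inside the here document
python3 - <<EOF
print("$greeting, as seen from Python")
EOF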
If you mean within a BASH shell script without executing any external dependencies, I am afraid you're out of luck, since BASH only interprets its own scripting language.
Your question is somewhat like asking "Can I run a Java .class file without the JVM"? Obviously, you will always have the external dependency of the JRE/JVM. This is the same case, you depend on the external Python compiler and interpreter.
You do have the option of including the Python script inline, but it would still require the python executable.
This works:
python -c 'print("Hi")'
Or this, using a Bash here-string:
python <<< 'print("Hi")'

Setting Environment Up with Python

Our environment has a shell script to setup the working area. setup.sh looks like this:
export BASE_DIR=$PWD
export PATH=$BASE_DIR/bin
export THIS_VARIABLE=THAT_VALUE
The user does the following:
% . setup.sh
Some of our users are looking for a csh version and that would mean having two setup files.
I'm wondering if there is a way to do this work with a common python file. In The Hitchhiker's Guide to Python Kenneth Reitz suggests using a setup.py file in projects, but I'm not sure if Python can set environment variables in the shell as I do above.
Can I replace this shell script with a python script that does the same thing? I don't see how.
(There are other questions that ask this more broadly with many many comments, but this one has a direct question and direct single answer.)
No, Python (or generally any process on Unix-like platforms) cannot change its parent's environment.
A common solution is to have your script print the output in a format suitable for the user's shell. E.g. ssh-agent will print out sh-compatible global assignments with -s or when it sees that it is being invoked from a Bourne-compatible shell; and csh syntax if invoked from csh or tcsh or when explicitly invoked with -c.
The usual invocation in sh-compatible shells is eval "$(ssh-agent -s)" -- so the text that the program prints is evaluated by the shell where the user invoked this command.
eval is a well-known security risk, so you want to make this code very easy to vet even for people who don't speak much Python (or shell, or anything much else).
If you are, eh cough, skeptical of directly supporting Csh users, perhaps you can convince them to run your sh-compatible script in a Bourne-compatible shell and then exec csh to get their preferred interactive environment. This also avoids the slippery slope of having an ever-growing pile of little maintenance challenges for supporting Csh, Fish, rc, Powershell etc users.
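Following that pattern, a single Python setup script serving both groups might look roughly like this sketch (the values mirror setup.sh above; the csh command-line flag is an assumption about how you would invoke it):
#!/usr/bin/env python3
# Sketch: print environment assignments in sh or csh syntax so the
# calling shell can eval them (same idea as ssh-agent -s / -c).
import os
import sys

variables = {
    "BASE_DIR": os.getcwd(),
    "PATH": os.path.join(os.getcwd(), "bin"),
    "THIS_VARIABLE": "THAT_VALUE",
}

shell = sys.argv[1] if len(sys.argv) > 1 else "sh"  # hypothetical flag
for name, value in variables.items():
    if shell == "csh":
        print(f'setenv {name} "{value}";')
    else:
        print(f'export {name}="{value}";')
sh users would then run eval "$(python3 setup_env.py)" and csh users eval `python3 setup_env.py csh` (setup_env.py being a placeholder name).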

To initiate a Python shell

You can use subprocess.call("/usr/bin/python") to open a Python shell from within a piece of Python code. Now my question is: is it possible to predefine some variables/functions before this shell is initialized? In other words, inside the Python code I can define a bunch of useful variables and functions, and I want them to be available in the Python shell opened later by the subprocess call. It is useful in the sense that sometimes you want a customized Python shell to test your environment.
You can do this using the -i switch. This will run a script, and then drop into the interpreter for interactive use.
python -i scriptname.py
Not directly, but I wouldn't do it this way anyway; I'd use the code module.
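For illustration, a minimal sketch of that approach using the standard code module (the predefined names are placeholders):
import code

# Whatever should be available in the customized shell
def greet(name):
    return f"hello, {name}"

predefined = {"greet": greet, "answer": 42}
code.interact(banner="Customized Python shell", local=predefined)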
Yes, that's something that's possible, and it's useful. In fact, it's something that Django provides with the python manage.py shell command.
Looking at the source code for this command should be helpful, not only as an example of opening a shell with some default configuration, but also of using any shell you like (ipython, bpython or the default one).
