Run bash script on `cd` command - python

I'm a Python developer, and most of the time I use buildout to manage my projects. In that case I never need to run any command to activate my dependencies environment.
However, I sometimes use virtualenv when buildout is too complicated for the particular case.
Recently I started playing with Ruby and noticed a very useful feature: the environment changes automatically when I cd into the project folder. It is somehow related to rvm and the .rvmrc file.
I'm just wondering whether there is a way to hook a script onto particular bash commands, so that workon environment_name runs automatically when I cd into the project folder.
The logic is as simple as this:
When you cd into a project with folder_name, the script should run workon folder_name.

One feature of Unix shells is that they let you create shell functions, which are much like functions in other languages; they are essentially named groups of commands. For example, you can write a function named mycd that first runs cd, and then runs other commands:
function mycd () {
    cd "$@"
    if ... ; then
        workon environment
    fi
}
(The "$#" expands to the arguments that you passed to mycd; so mycd /path/to/dir will call cd /path/to/dir.)
As a special case, a shell function actually supersedes a like-named builtin command; so if you name your function cd, it will be run instead of the cd builtin whenever you run cd. In that case, in order for the function to call the builtin cd to perform the actual directory-change (instead of calling itself, causing infinite recursion), it can use Bash's builtin builtin to call a specified builtin command. So:
function cd () {
    builtin cd "$@"   # perform the actual cd
    if ... ; then
        workon environment
    fi
}
(Note: I don't know what your logic is for recognizing a project directory, so I left that as ... for you to fill in. If you describe your logic in a comment, I'll edit accordingly.)

I think you're looking for one of two things.
autoenv is a relatively simple tool that creates the relevant bash functions for you. It's essentially doing what ruakh suggested, but you can use it without having to know how the shell works.
virtualenvwrapper is full of tools that make it easier to build smarter versions of the bash functions—e.g., switch to the venv even if you cd into one of its subdirectories instead of the base, or track venvs stored in git or hg, or … See the Tips and Tricks page.
The Cookbook for autoenv shows some nifty ways to use the two together.

I just found this topic in the virtualenvwrapper documentation.
It describes exactly what I need.

Related

Applying source on Python script to export environment variables?

Context:
I'm working with a few developers, some on OSX and some on Linux. We use a collection of bash scripts to load environment variables, build docker images etc. The OSX users are using zsh instead of bash, so some tricks (specifically how we're getting the running script's directory) aren't compatible. What works on Linux (but not OSX) is:
$ cat ./scripts/env_script.sh
THIS_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )"
# ... more variables ...
$ source ./scripts/env_script.sh
$ echo $THIS_DIR
# the absolute path to the scripts directory
Some cursory testing shows this doesn't work on OSX, presumably because BASH_SOURCE doesn't work the same with zsh. Fine.
Goal:
We're all working with Python, so we would like to replace our bash scripts with Python.
The same pattern is very easy in Python.
from pathlib import Path
THIS_DIR = str(Path(__file__).parent)
Not the cleanest, but it works anywhere. Ideally we could run this and share THIS_DIR with the shell that called the script. Something like
$ <answer to my question> ./scripts/env_script.py
$ echo $THIS_DIR
# absolute path to the scripts directory
Question:
Suppose we rewrote our ./scripts/env_script.sh into ./scripts/env_script.py and collected a list of variables we would like to share with the calling environment.
That is, we wrote the Python script above.
Is it possible to do something akin to source ./scripts/env_script.py so that we can export the variables from the Python script into the calling shell?
What I've tried:
I found a potential duplicate here, though the answer is basically "Python runs in a sub-process; source would only work if your shell were Python", which makes sense but doesn't answer the question.
If there isn't a nice way to exchange these variables, then another option would basically be to have the Python script spit the variables to stdout and then eval them within my shell, i.e. eval $(./scripts/env_script.py). I guess this could work, but it feels wrong, not least because I was always taught that eval is evil.
I spent a little more time on the second solution since I could imagine it working (although I'm not happy with it). Basically, I rewrote ./scripts/env_script.py as
#!/usr/bin/env python3
from pathlib import Path as _Path   # underscore-prefixed so the filter below skips it

FOO = 'whatever'
THIS_DIR = str(_Path(__file__).parent)
# and so on...

# prints all "normal" (non-underscore) variables to stdout
def_vars = list(locals().items())
for name, val in def_vars:
    if name.startswith("_"):
        continue
    print(f'{name}="{val}"')
So running it normally would output
$ ./scripts/env_script.py
FOO="whatever"
THIS_DIR="..."
...
$ echo $FOO
# nothing
Then, to jam those variables into the calling shell, we wrap that with eval and it does the job
$ eval $(./scripts/env_script.py)
$ echo $FOO
whatever
I guess with some further tweaking, the footer at the end of env_script.py could be made a little more robust and then moved outside of the script. Some sort of alias could be defined to make it syntactically nicer, so that calling it could look like pysource ./scripts/env_script.py. Still, I'm not satisfied with this answer.
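For illustration, here is a rough sketch of what a more robust footer might look like; the shlex.quote call, the export prefix, and the string-only filter are assumptions of mine, not part of the script above:
import shlex

def _emit_exports(namespace):
    # Print shell-safe export lines for plain string variables only.
    for name, val in namespace.items():
        if name.startswith("_") or not isinstance(val, str):
            continue
        print(f"export {name}={shlex.quote(val)}")

_emit_exports(dict(locals()))
Used the same way as above, eval "$(./scripts/env_script.py)" would then apply properly quoted, exported assignments to the calling shell.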
Edit: I did the tweaking and wrote a downloadable script here.

Setting Environment Up with Python

Our environment has a shell script to setup the working area. setup.sh looks like this:
export BASE_DIR=$PWD
export PATH=$BASE_DIR/bin
export THIS_VARIABLE=THAT_VALUE
The user does the following:
% . setup.sh
Some of our users are looking for a csh version and that would mean having two setup files.
I'm wondering if there is a way to do this work with a common python file. In The Hitchhiker's Guide to Python Kenneth Reitz suggests using a setup.py file in projects, but I'm not sure if Python can set environment variables in the shell as I do above.
Can I replace this shell script with a python script that does the same thing? I don't see how.
(There are other questions that ask this more broadly, with many comments, but this one has a direct question and a single direct answer.)
No, Python (or generally any process on Unix-like platforms) cannot change its parent's environment.
A common solution is to have your script print the output in a format suitable for the user's shell. E.g. ssh-agent will print out sh-compatible global assignments with -s or when it sees that it is being invoked from a Bourne-compatible shell; and csh syntax if invoked from csh or tcsh or when explicitly invoked with -c.
The usual invocation in sh-compatible shells is eval "$(ssh-agent -s)" -- so the text that the program prints is evaluated by the shell where the user invoked this command.
eval is a well-known security risk, so you want to make this code very easy to vet even for people who don't speak much Python (or shell, or anything much else).
If you are, eh cough, skeptical of directly supporting Csh users, perhaps you can convince them to run your sh-compatible script in a Bourne-compatible shell and then exec csh to get their preferred interactive environment. This also avoids the slippery slope of an ever-growing pile of little maintenance challenges for supporting Csh, Fish, rc, PowerShell, etc. users.
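A minimal sketch of what such a Python replacement for setup.sh could look like; the script name setup_env.py, the --csh flag, and the variable values are assumptions of mine:
#!/usr/bin/env python3
"""Print environment assignments for the calling shell to eval.

sh-family usage:   eval "$(python3 setup_env.py)"
csh-family usage:  eval `python3 setup_env.py --csh`
"""
import os
import shlex
import sys

variables = {
    "BASE_DIR": os.getcwd(),
    "PATH": os.getcwd() + "/bin",
    "THIS_VARIABLE": "THAT_VALUE",
}

use_csh = "--csh" in sys.argv[1:]
for name, value in variables.items():
    # shlex.quote is sh-oriented; it is fine for simple values like these
    if use_csh:
        print(f"setenv {name} {shlex.quote(value)};")
    else:
        print(f"export {name}={shlex.quote(value)}")
The single file then serves both sh and csh users, at the cost of the eval that the answer above warns about.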

Directory changing script in Python? [duplicate]

I read that the executables for the commands issued using exec() calls are supposed to be stored in directories that are part of the PATH variable.
Accordingly, I found the executables for ls, chmod, grep, cat in /bin.
However, I could not find the executable for cd.
Where is it located?
A process can only affect its own working directory. When an executable is executed by the shell, it runs as a child process, so a cd executable (if one existed) would change that child process's working directory without affecting the parent process (the shell). Hence the cd command must be implemented as a shell built-in that executes in the shell's own process.
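You can see the same thing from Python; a small demo (assuming a python3 interpreter on PATH and an existing /tmp directory):
import os
import subprocess

print("parent cwd before:", os.getcwd())

# The child process changes its own working directory...
subprocess.run(
    ["python3", "-c", "import os; os.chdir('/tmp'); print('child cwd:', os.getcwd())"],
    check=True,
)

# ...but the parent's working directory is unchanged.
print("parent cwd after: ", os.getcwd())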
cd is a shell built-in, unfortunately.
$ type cd
cd is a shell builtin
...from http://www.linuxquestions.org/questions/linux-newbie-8/whereis-cd-sudo-doesnt-find-cd-464767/
But you should be able to get it working with:
sh -c "cd /somedir; do something"
Not all utilities that you can execute at a shell prompt actually exist as executables in the filesystem. They can also be so-called shell built-ins, which means – you guessed it – that they are built into the shell.
The Single Unix Specification does not, in general, specify whether a utility has to be provided as an executable or as a built-in; that is left as a private internal implementation detail to the OS vendor.
The only exceptions are the so-called special built-ins, which must be provided as built-ins, because they affect the behavior of the shell itself in a manner that regular executables (or even regular built-ins) can't (for example set, which sets variables that persist even after set exits). Those special built-ins are:
break
:
continue
.
eval
exec
exit
export
readonly
return
set
shift
times
trap
unset
Note that cd is not on that list, which means that cd is not a special built-in. In fact, according to the specification, it would be perfectly legal to implement cd as a regular executable. It's just not possible, for the reasons given by the other answers.
And if you scroll down to the non-normative section of the specification, i.e. to the part that is not officially part of the specification but only purely informational, you will find that fact explicitly mentioned:
Since cd affects the current shell execution environment, it is always provided as a shell regular built-in.
So, the specification doesn't require cd to be a built-in, but it's simply impossible to do otherwise.
Note that sometimes utilities are provided both as something built into the shell and as an executable. A good example is the time utility, which on a typical GNU system is provided both as an executable by the GNU Time package and as a reserved word built into Bash. This can lead to confusion, because when you do man time you get the manpage of the time executable (the Bash version is documented in the bash manpage), but when you execute time you get the Bash version, which does not support the same options as the time executable whose manpage you just read. You have to explicitly run /usr/bin/time (or whatever path you installed GNU Time into) to get the executable.
According to this, cd is always a built-in command and never an executable:
Since cd affects the current shell execution environment, it is always provided as a shell regular built-in.
cd is part of the shell; an internal command. There is no binary for it.
The command cd is built-in in your command line shell. It could not affect the working directory of your shell otherwise.
I also searched for the executable of cd, and there is no such thing.
You can use chdir(pathname) in C; it has the same effect.
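The Python equivalent is os.chdir; like the C call, it only affects the process that calls it (a tiny sketch, with /tmp as an assumed target directory):
import os

print(os.getcwd())   # current working directory of this Python process
os.chdir("/tmp")     # affects only this process, not the shell that started it
print(os.getcwd())   # now the new directory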

Can't find mkvirtualenv, but can execute it

I'm trying to run mkvirtualenv from a bash script, and I keep being told that it can't be found - yet it seems that my system can't make up its mind about whether or not it can find it. Can anyone explain why I can execute it from the terminal, but not from a script?
jimbo@wavefront:~$ locate mkvirtualenv
jimbo@wavefront:~$ which mkvirtualenv
jimbo@wavefront:~$ mkvirtualenv --version
13.1.2
jimbo@wavefront:~$
It's because it's a function attached to the shell. Run this to see it:
$ type mkvirtualenv
Avoid using which to check for binaries/etc. It isn't standardized, isn't always available and is an external binary itself (and so is more expensive than the better choices).
The better choices are type and command.
They are both built-ins, both standardized (at least at their most basic levels) and, because they are built-in, can see shell functions and aliases too.

Wrap all commands entered within a Bash-Shell with a Python script

What I'd like to have is a mechanism by which all commands I enter in a Bash terminal are wrapped by a Python script. The Python script executes the entered command, but adds some additional magic (for example, setting "dynamic" environment variables).
Is that possible somehow?
I'm running Ubuntu and Debian Squeeze.
Additional explanation:
I have a property file which changes dynamically (some scripts may alter it at any time). I need the properties from that file as environment variables in all my shell scripts. Of course I could parse the property file from the shell somehow, but I prefer an object-oriented style for that (especially for writing), as it can be done with Python (and ConfigObject).
Therefore I want to wrap all my scripts with that Python script (without having to modify the scripts themselves) so that it hands these properties down to all the shell scripts.
This is my current use case, but I can imagine I'll find additional cases to extend my wrapper to later on.
The perfect way to wrap every command that is typed into a Bash shell is to change the variable PROMPT_COMMAND inside .bashrc. For example, if I want to do some Python stuff before every command, as asked in my question:
.bashrc:
# ...
PROMPT_COMMAND="python mycoolscript.py; $PROMPT_COMMAND;"
export PROMPT_COMMAND
# ...
Now the script mycoolscript.py is run each time Bash is about to display a prompt, i.e. before every command I type.
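Note that a Python process started from PROMPT_COMMAND cannot set variables in the interactive shell directly; for that, its output has to be eval'd, as in the DEBUG-trap answer below. A rough sketch of what mycoolscript.py might look like; the properties file location and its KEY=VALUE format are assumptions of mine:
#!/usr/bin/env python3
"""Read a simple KEY=VALUE properties file and print export statements
for the calling shell to eval (illustrative sketch)."""
import shlex
from pathlib import Path

PROPERTIES_FILE = Path.home() / ".myproperties"   # assumed location

if PROPERTIES_FILE.exists():
    for line in PROPERTIES_FILE.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        print(f"export {key.strip()}={shlex.quote(value.strip())}")
With something like PROMPT_COMMAND='eval "$(python mycoolscript.py)"' the printed assignments are applied to the interactive shell before each prompt, and exported to every command started from it.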
Use Bash's DEBUG trap. Let me know if you need me to elaborate.
Edit:
Here's a simple example of the kinds of things you might be able to do:
$ cat prefix.py
#!/usr/bin/env python
print "export prop1=foobar"
print "export prop2=bazinga"
$ cat propscript
#!/bin/bash
echo $prop1
echo $prop2
$ trap 'eval "$(./prefix.py)"' DEBUG
$ ./propscript
foobar
bazinga
You should be aware of the security risks of using eval.
I don't know of anything that does exactly this, but here are two things that might help:
http://sourceforge.net/projects/pyshint/
The IPython shell has some functionality to execute shell commands in the interpreter.
There is no direct way to do it.
But you can write a Python script that emulates a bash terminal, using the subprocess module to execute commands the way you like.
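A very rough sketch of that idea follows; the properties file, its format, and the prompt string are assumptions of mine, and a real wrapper would need far more care (quoting, job control, shell built-ins, and so on):
#!/usr/bin/env python3
"""Minimal 'wrapper shell': reload environment variables from a properties
file before running each command the user types (illustrative sketch)."""
import os
import subprocess
from pathlib import Path

PROPERTIES_FILE = Path.home() / ".myproperties"   # assumed location

def load_environment():
    env = os.environ.copy()
    if PROPERTIES_FILE.exists():
        for line in PROPERTIES_FILE.read_text().splitlines():
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                env[key.strip()] = value.strip()
    return env

while True:
    try:
        command = input("wrapped$ ")
    except EOFError:
        break
    if command.strip() in ("exit", "quit"):
        break
    # Run the command through bash with a freshly loaded environment.
    subprocess.run(["bash", "-c", command], env=load_environment())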
