How to create tcsh aliases using Python?

I am trying to create aliases for tcsh from a Python script (running Python 2.7.1).
Once the aliases are created I want to use them in the same shell I ran the python script in.
I tried:
os.system('alias test "echo test"')
but I get the following error:
sh: line 0: alias: test: not found
sh: line 0: alias: echo test: not found
I then tried:
os.system(r"""/bin/csh -i -c 'alias test "echo test"'""")
And then no errors occurred, but the alias did not register, and therefore I could not use it.
The result I'm looking for is this:
tcsh>python my_script.py
tcsh>test
test
Thanks!

os.system executes the command in a subshell (the Bourne shell, by the look of that error), so even if your syntax were correct for that shell (alias test="echo test"), the alias would not persist after the call, since the subshell exits.
But this seems like an XY question: you ask about Y, the solution you had in mind, rather than X, your actual problem.
If you simply want to create a bunch of aliases at once, why not use a C-shell script? (Why you are torturing yourself with C shell is another matter entirely.)

Your python script cannot execute anything in the context of your shell. While you could use subprocess.call(..., shell=True) this would use a new shell and thus not update your existing shell.
The only way to do what you want is to make your python script write valid shell commands to stdout and then, instead of just executing it, you need to make your shell evaluate the output of your python script.
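A minimal sketch of that pattern (the script name and the aliases are made up for illustration): the script only prints alias definitions, and the calling shell evaluates the output.

```python
# my_script.py (hypothetical name): print alias definitions on stdout;
# the calling shell, not this process, evaluates them.
aliases = {"test": "echo test", "ll": "ls -l"}
lines = ['alias %s "%s"' % (name, cmd) for name, cmd in sorted(aliases.items())]
print("\n".join(lines))
```

In tcsh you would then run eval `python my_script.py`; in bash the equivalent is eval "$(python my_script.py)".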

Related

How to send value from python script to console and exit

My Python script generates a command that the user needs to run in the same console. My scenario: the user runs the script and then, as a result, sees the command that must be run. Is there any way to exit the Python script and send that command to the console, so the user does not need to copy/paste?
A solution would be to have your python script (let's call it script.py) just print the command: print('ls -l') and use it in a terminal like so: $(python3 script.py). This makes bash run the output of your script as a command, and would basically run a ls -l in the terminal.
You can even go a step further and create an alias in ~/.bashrc so that you no longer need to type the whole line. At the end of the file, add something like alias printls='$(python3 /path/to/script.py)' (the quotes matter: without them the script runs once when .bashrc is sourced, not when you type the alias). After starting a new terminal, you can type printls and the script's output will run.
A drawback of this method is that you have no proper way of handling exceptions or errors in your code, since everything it prints will be run as a command. One way (though ugly) would be to print('echo "An error occurred!"') so that the user who runs the command can see that something malfunctioned.
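One way to soften that drawback (a sketch, not from the original answer; build_command is a made-up helper) is to write diagnostics to stderr, which $(...) does not capture, so error text is never executed as a command:

```python
import sys

def build_command(name):
    # hypothetical generator for the command the user should run
    if not name:
        return None
    return "ls -l %s" % name

command = build_command("/tmp")
if command is None:
    # stderr reaches the user's terminal but is NOT captured by $(...)
    sys.stderr.write("error: no command generated\n")
    sys.exit(1)
print(command)  # only stdout is evaluated by the shell
```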
However, I'd suggest going for the "traditional" way and running the command directly from python. Here's a link to how you can achieve this: Calling an external command in Python.
Python can run system commands in new subshells. The proper way of doing this is via the subprocess module, but for simple tasks it's easier to just use os.system. Example Python script (assuming a Unix-like system):
import os
os.system('ls')
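A sketch of the subprocess equivalent, useful when you want the command's output back in Python rather than printed straight to the terminal (using echo here just as a harmless example command):

```python
import subprocess

# run the command in a child process and capture its stdout as bytes
output = subprocess.check_output(['echo', 'hello'])
print(output)
```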

Unset or remove most recent line of .bash_history from within Python script

The Issue
I have a Python script that, when run from the command line, should not record anything in .bash_history.
The reason for this is that the script uses the Python argparse library which allows me to pass in arguments to the python code directly from the command line.
For example, I could write the script so that it uses "123456" as a value:
$ ./scriptname.py -n 123456
The issue is that I don't want the value 123456 stored in .bash_history. In fact, I'd rather the entire command was never stored into the .bash_history file in the first place.
What I've Tried
Subprocess & history -c
I've added the subprocess library to the top of my script and then included the following, in an attempt to proactively clear the history of the shell I am working in:
subprocess.call("history -c", shell=True)
Theoretically this should clear the history of the current shell. I see no errors from it, so I'm assuming it runs in some other shell. When I run the same command manually (directly after invoking the script) it works properly.
Subprocess & unset HISTFILE
I have also used subprocess with the following with no success:
subprocess.call("unset HISTFILE", shell=True)
os.system & history -c
I've also used the os library for Python and included the following in the script:
os.system("history -c")
os.system and unset HISTFILE
I've also tried unset HISTFILE with os.system to no avail.
os.system("unset HISTFILE")
Preferred Solution Characteristics
I realize that I could simply type unset HISTFILE or history -c after running the command, but I want this to be, as much as possible, a self-contained script.
Ideally the solution would prevent the ./scriptname.py command from ever being recorded in .bash_history.
I need this script to output text to the terminal based on the input so I can't close the terminal immediately afterwards either.
I imagine there must be a way to do this from within the python script itself - this is my preference.
This really isn't very feasible... Adding the entry to the history file is performed by the interactive shell, and it occurs after the command has completed and the parent shell exits. It is, strictly speaking, possible, if you were to make your Python program spawn a hacky background process that reads the history file in a loop and rewrites it. I really can't advocate anything like this, but you could append to your script something like:
import os, sys
os.system("nohup bash -ic 'while :; do read -d \"\" history < \"$HISTFILE\"; echo \"$history\" | sed -e\"s#^%s.*##\" -e\"/^$/d\" > \"$HISTFILE\"; sleep 1; done &' >/dev/null 2>&1" % sys.argv[0])
I think a much better way to accomplish your goal of not recording any arguments would be to use something like var = raw_input("") instead of passing sensitive argument on the command line.
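A sketch of that approach (read_secret is a made-up helper name): prompt at runtime, so the value never appears on the command line and therefore never lands in .bash_history.

```python
import getpass

def read_secret(prompt="Value: ", stream=None):
    # hypothetical helper: read from an injected stream (handy for testing),
    # or prompt without echoing via getpass when run interactively
    if stream is not None:
        return stream.readline().strip()
    return getpass.getpass(prompt)
```

Calling n = read_secret("n: ") would then replace passing -n 123456 on the command line.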
You could also perhaps create a shell function to wrap your script, something like my_script() { set +o history; python_script.py "$@"; set -o history; }?

Change parent shell's environment from a subprocess

If I have a program written in a language other than bash (say python), how can I change environment variables or the current working directory inside it such that it reflects in the calling shell?
I want to use this to write a 'command line helper' that simplifies common operations. For example, a smart cd: when I simply type the name of a directory into my prompt, it should cd into it.
[~/]$ Downloads
[~/Downloads]$
or even
[~/]$ project5
[~/projects/project5]$
I then found How to change current working directory inside command_not_found_handle (which is exactly one of the things I wanted to do), which introduced me to shopt -s autocd. However, this still doesn't handle the case where the supplied directory is not in ./.
In addition, if I want to do things like setting the http_proxy variable from a python script, or even update the PATH variable, what are my options?
P. S. I understand that there probably isn't an obvious way to write a magical command inside a python script that automatically updates environment variables in the calling shell. I'm looking for a working solution, not necessarily one that's elegant.
This can only be done with the parent shell's involvement and assistance. For a real-world example of a program that does this, you can look at how ssh-agent is supposed to be used:
eval "$(ssh-agent -s)"
...reads the output from ssh-agent and runs it in the current shell (-s specifies Bourne-compatible output, vs csh).
If you're using Python, be sure to use pipes.quote() (or, for Python 3.x, shlex.quote()) to process your output safely:
import pipes
dirname='/path/to/directory with spaces'
foo_val='value with * wildcards * that need escaping and \t\t tabs!'
print 'cd %s; export FOO=%s;' % (pipes.quote(dirname), pipes.quote(foo_val))
...as careless use can otherwise lead to shell injection attacks.
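On Python 3, the same sketch uses shlex.quote (same values as above, for comparison):

```python
import shlex

dirname = '/path/to/directory with spaces'
foo_val = 'value with * wildcards * that need escaping and \t\t tabs!'
# quoting makes the shell treat each value as a single, literal word
command = 'cd %s; export FOO=%s;' % (shlex.quote(dirname), shlex.quote(foo_val))
print(command)
```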
By contrast, if you're writing this as an external script in bash, be sure to use printf %q for safe escaping (though note that its output is targeted for other bash shells, not for POSIX sh compliance):
#!/bin/bash
dirname='/path/to/directory with spaces'
foo_val='value with * wildcards * that need escaping and \t\t tabs!'
printf 'cd %q; export FOO=%q;' "$dirname" "$foo_val"
If, as it appears from your question, you want your command to appear to be written as a native shell function, I would suggest wrapping it in one (this practice can also be used with command_not_found_handle). For instance, installation can involve putting something like the following in one's .bashrc:
my_command() {
    eval "$(command /path/to/my_command.py "$@")"
}
...that way users aren't required to type eval.
Essentially, Charles Duffy hit the nail on the head; I present here another spin on the issue.
What you're basically asking about is interprocess communication: You have a process, which may or may not be a subprocess of the shell (I don't think that matters too much), and you want that process to communicate information to the original shell (just another process, btw), and have it change its state.
One possibility is to use signals. For example, in your shell you could have:
trap 'cd /tmp; pwd;' SIGUSR2
Now:
Type echo $$ in your shell; this gives you a number, the shell's PID.
cd to a directory in your shell (any directory other than /tmp).
Go to another shell (in another window or what have you), and type: kill -s SIGUSR2 PID
You will find that you are in /tmp in your original shell.
So that's an example of the communication channel. The devil of course is in the details. There are two halves to your problem: How to get the shell to communicate to your program (the command_not_found_handle would do that nicely if that would work for you), and how to get your program to communicate to the shell. Below, I cover the latter issue:
You could, for example, have a trap statement in the original shell:
trap 'eval $(/path/to/my/fancy/command $(pwd) $$)' SIGUSR2
...your fancy command will be given the current working directory of the original shell as the first argument, and the process id of the shell (so it knows who to signal), and it can act upon it. If your command sends an executable shell command string to the eval command, it will be executed in the environment of the original shell.
For example:
trap 'eval $(/tmp/doit $$ $(pwd)); pwd;' SIGUSR2
/tmp/doit is the fancy command. It could be any type of executable (Python, C, Perl, etc.); the key is that it prints a string that the shell can evaluate. In /tmp/doit, I have provided a bash script:
#!/bin/bash
echo "echo PID: $1 original directory: $2; cd /tmp"
(I make sure the file is executable with: chmod 755 /tmp/doit). Now if I type:
cd; echo $$
Then, in another shell, take the number output ("NNNNN") by the above echo and do:
kill -s SIGUSR2 NNNNN
...then suddenly I will see something like this pop up in the original shell:
PID: NNNNN original directory: /home/myhomepath
/tmp
and if I type "pwd" in my original shell, I will see that I'm in /tmp.
The guy who wanted command_not_found_handle to do something in the current shell environment could have used signals to get the effect he wanted. Here I was running the kill manually but there's no reason why a shell function couldn't do it.
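From the Python side, sending that signal is a single call (a sketch; in the real setup the PID would be the one the shell passed in via $$, as above):

```python
import os
import signal

def notify_shell(pid, sig=signal.SIGUSR2):
    # poke the shell whose trap is waiting for this signal;
    # the shell's trap then runs eval on our printed commands
    os.kill(pid, sig)
```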
Doing fancy work on the frontend, whereby you re-interpret or pre-interpret the user's input to the shell, may require that the user runs a frontend program that could be pretty complicated, depending on what you want to do. The old school "expect" program is ideal for something like this, but not too many youngsters pick up TCL these days :-) .

run interpreter after execute bash python

Obviously it helps to have the interpreter for debugging, but I prefer to execute commands in a terminal. Is there any way to run the Python program and then start an interpreter with the variables and functions it created already available? My current command is this:
python main.py < tests/1.in
Does anyone know how to modify it to make the variables and functions accessible after runtime?
Use the -i flag:
python -i main.py < tests/1.in
How about -i:
-i : inspect interactively after running script; forces a prompt even
if stdin does not appear to be a terminal; also PYTHONINSPECT=x

Python 'source HOME/.bashrc' with os.system()

I am writing a python script (Linux) that is adding some shell aliases (writes them to HOME/.bash_aliases).
In order to make an alias available immediately after it is written, I should issue the following bash built-in:
source $HOME/.bashrc
source is a bash built-in, so I cannot just do:
os.system('source $HOME/.bashrc')
If I try something like:
os.system('/bin/bash -s source $HOME/.bashrc')
...the script freezes, as if it is waiting for something.
Any suggestions?
What you want is not possible. A program (your script) cannot modify the environment of the caller (the shell you run it from).
Another approach which would allow you to do something close is to write it in terms of a bash function, which is run in the same process and can modify the caller. Note that sourcing during runtime can have possible negative side-effects depending on what the user has in their bashrc.
What you are trying to do is impossible. Or better: how you are trying to do it is impossible.
Your bash command is wrong. bash -s command does not execute command; it just stores the string "command" in the positional parameter $1 and then waits for commands on stdin. That is why the Python script seems to freeze. What you meant was bash -c command.
Why do you source .bashrc? Would it not be enough to just source .bash_aliases?
Even if you got your bash command right, the changes would only take effect in the bash session started from Python. Once that session closes and your Python script finishes, you are back in your original bash session; all changes made in the child session are lost.
Every time you want to change something in the current bash session, you have to do it from inside that session. Most commands you run from bash (system commands, Python scripts, even bash scripts) spawn another process, and nothing you do in that other process affects your original bash session.
source is a bash builtin that executes commands inside the currently running bash session instead of spawning another process. Defining a bash function is another way to execute commands inside the current session.
See this answer for more information about sourcing and executing.
What you can do to achieve what you want:
Modify your Python script to just make the necessary changes to .bash_aliases.
Prepare a bash script that runs your Python script and then sources .bash_aliases:
# I am a bash script, but you have to source me, do not execute me.
modify_bash_aliases.py "$@"
source ~/.bash_aliases
Add an alias to your .bashrc to source that script:
alias add_alias='source modify_bash_aliases.sh'
Now when you type add_alias some_alias at your bash prompt, it is replaced with source modify_bash_aliases.sh and then executed. Since source is a bash builtin, the commands inside the script are executed inside the currently running bash session. The Python script still runs in another process, but the subsequent source command runs inside your current bash session.
Another way:
Modify your Python script to just make the necessary changes to .bash_aliases.
Prepare a bash function that runs your Python script and then sources .bash_aliases:
add_alias() {
    modify_bash_aliases.py "$@"
    source ~/.bash_aliases
}
Now you can call the function like this: add_alias some_alias
I had an interesting issue where I needed to source an RC file to get the correct output in my python script.
I eventually used this inside my function to bring over the same variables from the bash file I needed to source. Be sure to have os imported.
import os

with open('overcloudrc') as data:
    for line in data.readlines():
        # lines look like: export VAR=value
        var = line.split(' ')[1].split('=')[0].strip()
        val = line.split(' ')[1].split('=')[1].strip()
        os.environ[var] = val
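A slightly more defensive variant of the same idea (a sketch, assuming lines of the form export VAR=value, as overcloudrc-style files use) skips blanks, comments, and anything that is not an assignment:

```python
import os

def source_env(path):
    # copy "export VAR=value" assignments from a shell rc file into os.environ
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line.startswith("export ") or "=" not in line:
                continue  # skip blanks, comments, and non-assignments
            var, _, val = line[len("export "):].partition("=")
            os.environ[var.strip()] = val.strip().strip('"')
```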
Working solution from Can I use an alias to execute a program from a python script:
import subprocess
sp = subprocess.Popen(["/bin/bash", "-i", "-c", "nuke -x scriptpath"])
sp.communicate()
