Vim commands running in bash after I run a python file - python

I am having an issue after I run a python unit test file. Once the file exits I can only interact with my console after pressing "i" and using other vim keybindings. I also noticed that using the arrow keys to traverse what I typed will delete a random number of characters at the end of the line.
EX:
$ ./tests.py -v
<output>
$ <cannot type>
<press "i">
$ I can now type
<press <- >
$ I can no
I am using RHEL 7 and bash. I've tried googling this issue but I'm either formatting the question poorly or it is an uncommon issue.
Thank you for the help.
EDIT:
The actual test.py contains private code, but this example contains the same essential code.
test.py
#!/usr/bin/env python
import unittest
class TestUtil(unittest.TestCase):
    def test_hello_world(self):
        text = "Hello World!"
        self.assertEqual("Hello World!", text)
        print(text)
if __name__ == '__main__':
    unittest.main()

It sounds as if your shell is being placed into vi-mode. This is a readline mode where you can use vi editing keys instead of the more commonly used emacs keys.
There are two ways I know of that this can happen.
set -o vi
bindkey -v
Technically, to turn it off you use set +o vi. However, that will disable all inline editing. It is more likely that you wish to go back to emacs mode, which is usually the default. To do that, do this instead:
set -o emacs

Related

Printing .py file output in command line

I am trying to access a python function from the command line, and I would like to write such a command that will print the output in the terminal. The below doesn't work. What could I change?
python -c 'from laser import Laser; laser = Laser();l = laser.embed_sentences("hello", lang = "en").shape == (1, 1024); print(l)'
(base) ~ % python -c 'print("hello, world")'
hello, world
Printing works fine for me when running python through python -c. Are you sure your terminal isn't truncating your output by omitting the last (and in this case, only) line? You could try creating a single-line file (no newline at the end) and then running cat [filename], which is how I sometimes discover that my terminal is doing this.
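To see that effect for yourself, a minimal sketch: print something without a trailing newline and watch where the next prompt lands.
python -c 'import sys; sys.stdout.write("hello, world")'
Depending on your shell's prompt settings, the prompt is drawn right after (or even over) the unterminated line, which can make the output look like it was never printed.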
-c cmd : program passed in as string (terminates option list)
That is the correct flag to be used, so this must be a CLI config issue. Or the script is taking longer to run than you expect, so it appears that no output is generated.
Does python -c 'print("hello")' work?

"python -i print_content.py < file.txt" exits interactive mode

python -i <script.py> is theoretically meant to run script.py and proceed into interactive mode, however:
shell$ python -i print_content.py < file.txt
hello world
>>>
shell$
The script runs successfully and >>> is printed, implying that the interactive shell has been entered, yet Python drops straight back to the shell prompt instead.
I know I could bypass the issue by opening the file from within the script - I'm more curious to know the cause of the issue.
Note: Contents of print_content.py seem innocent enough:
import sys
for i in sys.stdin:
    print i
Note: I'm using Python 2.7.3
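The cause: with -i, the interactive prompt reads its commands from the same stdin as the script, and < file.txt makes that stdin the file rather than your terminal. Once the for loop has consumed the file, the REPL immediately hits EOF and exits. A quick way to see it (Python 2 syntax, matching the question):
python -c 'import sys; print sys.stdin.isatty()' < file.txt
This prints False; with no terminal on stdin, there is nothing left for the >>> prompt to read, so Python drops back to the shell.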

Error in check_call() subprocess, executing 'mv' unix command: "Syntax error: '(' unexpected"

I'm making a python script for Travis CI.
.travis.yml
...
script:
- support/travis-build.py
...
The python file travis-build.py is something like this:
#!/usr/bin/env python
from subprocess import check_call
...
check_call(r"mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder", shell=True)
...
When the Travis build reaches that line, I'm getting an error:
/bin/sh: 1: Syntax error: "(" unexpected
I just tried a lot of different forms to write it, but I get the same result. Any idea?
Thanks in advance!
Edit
My current directory layout:
- my_project/final_folder/
- cmake-3.0.2-Darwin64-universal/
- fileA
- fileB
- fileC
I'm trying, with this command, to move all the files fileA, fileB and fileC into ./my_project/final_folder, excluding the my_project and cmake-3.0.2-Darwin64-universal folders. If I execute this command in a Linux shell it does what I want, but not through check_call().
Note: I can't move the files one by one, because there are many others
I don't know which shell Travis is using by default because I don't specify it. I only know that if I write the command in my .travis.yml:
.travis.yml
...
script:
# Here is the previous Travis code
- mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder
...
It works. But if I use the script, it fails.
I found this command from the following issue:
How to use 'mv' command to move files except those in a specific directory?
You're using the bash feature extglob to exclude the entries you don't want to move, so it needs to be enabled for the pattern to be understood at all.
The python subprocess module explicitly uses /bin/sh when you use shell=True, and /bin/sh doesn't enable bash features like this by default (it's a compliance thing to make it behave more like the original sh).
If you want to get bash to interpret the command; you have to pass it to bash explicitly, for example using:
subprocess.check_call(["bash", "-O", "extglob", "-c", "mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder"])
I would not choose to do the job in this manner, though.
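If you'd rather skip the shell entirely, here's a minimal pure-Python sketch (directory names taken from the question; name collisions and error handling are deliberately left out):
import os
import shutil
keep = {"my_project", "cmake-3.0.2-Darwin64-universal"}
dest = os.path.join("my_project", "final_folder")
for name in os.listdir("."):
    if name not in keep:
        # move everything that is not excluded into the target folder
        shutil.move(name, os.path.join(dest, name))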
Let me try again: in which shell do you expect your syntax !(...) to work? Is it bash? Is it ksh? I have never used it, and a quick search for a corresponding bash feature led nowhere. I suspect your syntax is just wrong, which is what the error message is telling you. In that case, your problem is entirely independent from python and the subprocess module.
If a special shell you have on your system supports this syntax, you need to make sure that Python is using the same shell when invoking your command. It tells you which shell it has been using: /bin/sh. This is usually just a link to the real shell executable. Does it point to the same shell you have tested your command in?
Edit: the SO solution you referenced contains the solution in the comments:
Tip: Note however that using this pattern relies on extglob. You can
enable it using shopt -s extglob (If you want extended globs to be
turned on by default you can add shopt -s extglob to .bashrc)
Just to demonstrate that different shells might deal with your syntax in different ways, first using bash:
$ !(uname)
-bash: !: event not found
And then, using /bin/dash:
$ !(uname)
Linux
The argument to a subprocess function normally needs to be a list of command line arguments. Use e.g. shlex.split() to split the string into correct command line arguments:
import shlex, subprocess
subprocess.check_call(shlex.split("mv !(...)"))
EDIT:
So, the goal is to move files/directories, with the exception of some file(s)/directory(ies). By playing around with bash, I could get it to work like this:
mv `ls | grep -v -e '\(exclusion1\|exclusion2\)'` my_project
So in your situation that would be:
mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` my_project
This could go into the subprocess.check_call(..., shell=True) and it should do what you expect it to do.
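Concretely, that call could look something like this (a sketch; the usual caveats about filenames containing spaces or newlines apply):
import subprocess
cmd = "mv `ls | grep -v -e '\\(my_project\\|cmake-3.0.2-Darwin64-universal\\)'` my_project"
subprocess.check_call(cmd, shell=True)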

How to tell if python script is being run in a terminal or via GUI?

I'm working in Linux and am wondering how to have python tell whether it is being run directly from a terminal or via a GUI launcher (like Alt-F2), where output will need to be sent to a window rather than to stdout in a terminal.
In bash, this is done by:
if [ -t 0 ] ; then
    echo "I'm in a terminal"
else
    zenity --info --title "Hello" --text "I'm being run without a terminal"
fi
How can this be accomplished in python? In other words, what is the equivalent of [ -t 0 ]?
$ echo ciao | python -c 'import sys; print sys.stdin.isatty()'
False
Of course, your GUI-based IDE might choose to "fool" you by opening a pseudo-terminal instead (you can do it yourself to other programs with pexpect, and, what's sauce for the goose...!-), in which case isatty or any other within-Python approach cannot tell the difference. But the same trick would also "fool" your example bash program (in exactly the same way) so I guess you're aware of that. OTOH, this will make it impossible for the program to accept input via a normal Unix "pipe"!
A more reliable approach might therefore be to explicitly tell the program whether it must output to stdout or where else, e.g. with a command-line flag.
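For instance, a minimal sketch of that flag-based approach (the --gui flag name is made up for illustration):
import argparse
import sys
parser = argparse.ArgumentParser()
parser.add_argument("--gui", action="store_true",
                    help="send output to a window instead of stdout")
args = parser.parse_args()
if args.gui:
    pass  # hand the text to zenity, a toolkit dialog, etc.
else:
    sys.stdout.write("I'm in a terminal\n")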
I scoured SE for an answer to this, but every answer indicated the use of sys.stdout.isatty() or os.isatty(sys.stdout.fileno()). Neither of these dependably caught my GUI test cases.
Testing standard input was the only thing that worked for me:
sys.stdin.isatty()
I had the same issue, and I did as follows:
import sys
mode = 1
try:
    if sys.stdin.isatty():
        mode = 0
except AttributeError:  # stdin is NoneType if not in terminal mode
    pass
if mode == 0:
    pass  # code if terminal mode ...
else:
    pass  # code if gui mode ...
There are several examples of this on PLEAC, which also cover a third case: running at an interactive Python prompt.
In bash I use this script:
$ cat ~/bin/test-term.sh
#!/bin/bash
#See if $TERM has been set when called from Desktop shortcut
echo TERM environment variable: $TERM > ~/Downloads/test-term.txt
echo "Using env | grep TERM output below:" >> ~/Downloads/test-term.txt
env | grep TERM >> ~/Downloads/test-term.txt
exit 0
When you create a desktop shortcut to call the script the output is:
$ cat ~/Downloads/test-term.txt
TERM environment variable: dumb
Using env | grep TERM output below:
Notice that grepping the env output returns nothing?
Now call the script from the command line:
$ cat ~/Downloads/test-term.txt
TERM environment variable: xterm-256color
Using env | grep TERM output below:
TERM=xterm-256color
This time the TERM variable from env command returns xterm-256color
In Python you can use:
import os
result = os.popen("echo $TERM").read()
result2 = os.popen("env | grep TERM").read()
Then check the results. I haven't done this in python yet but will probably need to soon for my current project. I came here looking for a ready-made solution, but no one has posted one like this yet.
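For the record, a sketch of doing that check natively in Python instead of shelling out (treating an unset or "dumb" TERM as "no terminal" is a heuristic borrowed from the experiment above, not a guarantee):
import os
import sys
term = os.environ.get("TERM", "")
# desktop launchers tend to leave TERM unset or set it to "dumb"
if sys.stdin.isatty() and term not in ("", "dumb"):
    print("running in a terminal")
else:
    print("probably launched from a GUI")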

How to use export with Python on Linux

I need to make an export like this in Python:
# export MY_DATA="my_export"
I've tried to do:
# -*- python-mode -*-
# -*- coding: utf-8 -*-
import os
os.system('export MY_DATA="my_export"')
But when I list the exports, "MY_DATA" does not appear:
# export
How can I do an export with Python without saving "my_export" into a file?
export is a command that you give directly to the shell (e.g. bash), to tell it to add or modify one of its environment variables. You can't change your shell's environment from a child process (such as Python); it's just not possible.
Here's what's happening when you try os.system('export MY_DATA="my_export"')...
/bin/bash process, command `python yourscript.py` forks python subprocess
  |_ /usr/bin/python process, command `os.system()` forks /bin/sh subprocess
       |_ /bin/sh process, command `export ...` changes its local environment
When the bottom-most /bin/sh subprocess finishes running your export ... command, then it's discarded, along with the environment that you have just changed.
You actually want to do
import os
os.environ["MY_DATA"] = "my_export"
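That changes the environment of the Python process itself, so everything it spawns afterwards inherits MY_DATA; it still cannot reach back up into the parent shell. A quick sketch of that boundary:
import os
import subprocess
os.environ["MY_DATA"] = "my_export"
# children of this Python process inherit the modified environment...
subprocess.call('echo "child sees: $MY_DATA"', shell=True)
# ...but the bash that launched this script remains unchanged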
Another way to do this, if you're in a hurry and don't mind the hacky aftertaste, is to have the python script print the commands that set the environment, and then execute that output in your bash environment. Not ideal but it can get the job done in a pinch. It's not very portable across shells, so YMMV.
$(python -c 'print "export MY_DATA=my_export"')
(you can also enclose the statement in backticks in some shells ``)
Not that simple:
python -c "import os; os.putenv('MY_DATA','123')"
$ echo $MY_DATA # <- empty
But:
python -c "import os; os.putenv('MY_DATA','123'); os.system('bash')"
$ echo $MY_DATA #<- 123
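The asymmetry these two snippets demonstrate: os.putenv() only affects the environment handed to children of the Python process (like the bash spawned by os.system() above), never the shell that invoked Python. A condensed sketch:
import os
os.putenv('MY_DATA', '123')
# the child shell inherits the variable...
os.system('echo "child: $MY_DATA"')
# ...while the parent shell that ran this script still has no MY_DATA
Note that os.putenv() bypasses os.environ, so os.environ.get('MY_DATA') in the same script will still return None; assigning os.environ['MY_DATA'] = '123' updates both.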
I have an excellent answer.
#! /bin/bash
output=$(git diff origin/master..origin/develop | \
python -c '
# DO YOUR HACKING
variable1_to_be_exported="Yo Yo"
variable2_to_be_exported="Honey Singh"
# ... and so on
magic=""
magic+="export onShell_var1=\""+str(variable1_to_be_exported)+"\"\n"
magic+="export onShell_var2=\""+str(variable2_to_be_exported)+"\""
print magic
'
)
eval "$output"
echo "$onShell_var1" # Output will be Yo Yo
echo "$onShell_var2" # Output will be Honey Singh
Mr Alex Tingle is correct about those processes and sub-process issues.
It can be achieved as I have shown above.
The key concept is:
Whatever python prints is captured in the bash variable [output]
We can execute any command held in a string using eval
So, prepare your print output from python as meaningful bash commands
Use eval to execute it in bash
And you can see your results
NOTE
Always execute the eval using double quotes, or else bash will mess up your \n's and the output will be strange.
PS: I don't like bash, but you have to use it.
I've had to do something similar on a CI system recently. My options were to do it entirely in bash (yikes) or use a language like python which would have made programming the logic much simpler.
My workaround was to do the programming in python and write the results to a file.
Then use bash to export the results.
For example:
# do calculations in python
with open("./my_export", "w") as f:
    f.write(your_results)
# then in bash
export MY_DATA="$(cat ./my_export)"
rm ./my_export # if no longer needed
You could try os.environ["MY_DATA"] instead.
Kind of a hack because it's not really python doing anything special here, but if you run the export command in the same sub-shell, you will probably get the result you want.
import os
cmd = "export MY_DATA='1234'; echo $MY_DATA" # or whatever command
os.system(cmd)
In the hope of providing clarity over common confusion...
I have written many python <--> bash <--> elfbin toolchains, and the proper way to see it is this:
Each process (originator) has a state of the environment inherited from whatever invoked it. Any change remains local to that process. Transferring an environment state is a function by itself and runs in two directions, each with its own caveats. The most common thing is to modify the environment before running a sub-process. To go down to the metal, look at the exec() call in C. There is a variant that takes a pointer to environment data. This is the only actually supported transfer of environment in typical OSes.
Shell scripts will create a state to pass to children when you do an export. Otherwise they just use what they got in the first place.
In all other cases it will be some generic mechanism used to pass a set of data that allows the calling process itself to update its environment based on the child process's output.
Ex:
ENVUPDATE=$(CMD_THAT_OUTPUTS_KEYVAL_LISTS)
echo "$ENVUPDATE" > "$TMPFILE"
source "$TMPFILE"
The same can of course be done using json, xml or other things as long as you have the tools to interpret and apply.
The need for this may be (50% chance) a sign of misconstruing the basic primitives, meaning that you need a better config or parameter interchange in your solution...
Oh, in python I would do something like...
(needs improvement depending on your situation)
import os
import re
RE_KV = re.compile(r'([a-z][\w]*)\s*=\s*(.*)')
OUTPUT = RunSomething(...)  # assuming output like 'k1=v1 k2=v2'
for kv in OUTPUT.split(' '):
    try:
        k, v = RE_KV.match(kv).groups()
        os.environ[k] = str(v)
    except AttributeError:
        # not a key=value pair
        pass
One line solution:
eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
echo $python_include_path # prints /home/<usr>/anaconda3/include/python3.6m in my case
Breakdown:
Python call
python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'
It's launching a python script that
imports sysconfig
gets the python include path corresponding to this python binary (use "which python" to see which one is being used)
prints the string "python_include_path={0}" with {0} being the path from step 2
Eval call
eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
It's executing in the current bash instance the output from the python script. In my case, it's executing:
python_include_path=/home/<usr>/anaconda3/include/python3.6m
In other words, it's setting the environment variable "python_include_path" with that path for this shell instance.
Inspired by:
http://blog.tintoy.io/2017/06/exporting-environment-variables-from-python-to-bash/
import os
import shlex
from subprocess import Popen, PIPE
os.environ.update(key=value)  # placeholder: set whatever variables you need
res = Popen(shlex.split("cmd xxx -xxx"), stdin=PIPE, stdout=PIPE, stderr=PIPE,
            env=os.environ).communicate('y\ny\ny\n'.encode('utf8'))
stdout = res[0]
stderr = res[1]
os.system('/home/user1/exportPath.ksh')
exportPath.ksh:
export MY_DATA="my_export"
