Pipe Python script to ssh, but do other bash commands first

A very convenient way to execute a Python script on a remote server is to pipe it to ssh:
cat script.py | ssh user@address.com python -
where the - (read the program from stdin) seems to be optional, presumably because python falls back to reading stdin when it is given no script and stdin is not a terminal.
How can I execute other bash commands before running the Python script in this way?
This does not work:
cat script.py | ssh user@address.com "cd ..; python -" # WRONG!
Interestingly, this sends a non-deterministically corrupted version of the Python script, which gives a syntax error in a different place every time you run it!

You can create a sub-shell:
cat script.py | ssh user@address.com "(cd ..; python -)"
Or a temporary file:
cat script.py | ssh user@address.com "tee >/tmp/tmp.py; cd ..; python /tmp/tmp.py; rm /tmp/tmp.py"
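The same thing also works with input redirection instead of cat, still using the sub-shell trick (an equivalent sketch of the answer above, not a separately tested recipe):
ssh user@address.com "(cd ..; python -)" < script.py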

Related

Start a remote script from a Mac OS X machine via SSH command

I am trying to start a python script on my VM from my local Mac OS
I did
ssh -i /key/path/id_rsa root@111.11.1.0 "sleep 5s; cd /root/Server;pkill -f server.py;./server.py;"
Result
It SSHs in, quickly runs those commands, and immediately logs me out. I was expecting it to keep the SSH session open.
My script is NOT running ...
ps -aux | grep python
root 901 0.0 0.2 553164 18584 ? Ssl Jan19 20:37 /usr/bin/pytho -Es /usr/sbin/tuned -l -P
root 15444 0.0 0.0 112648 976 pts/0 S+ 19:16 0:00 grep --color=auto python
If I do this it works
ssh -i /key/path/id_rsa root@111.11.1.0 "sleep 5s; cd /root/Server"
Then
./server.py;
Then, it works.
Am I missing anything?
You might need to state the shell that starts your script, i.e. /bin/bash server.py:
ssh -i /key/path/id_rsa root@111.11.1.0 "sleep 5s; cd /root/Server; pkill -f server.py; /bin/bash ./server.py;"
If you would like to start the script and leave it running even after you end your ssh session, you could use nohup. Notice that you need to put the process in the background and redirect stdin, stdout and stderr to completely detach from the remote process:
ssh -i /key/path/id_rsa root@111.11.1.0 "sleep 5s; cd /root/Server; nohup /bin/bash ./server.py < /dev/null > std.out 2> std.err &"
It seems the reason your ssh command returns immediately is that the call to pkill -f server.py also terminates the ssh session itself, since its own command line also contains server.py.
I don't have my regular MacBook Pro here to test with, but I think that adding another semicolon and ending the command line with /bin/bash might do it.
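If pkill matching its own ssh command line is indeed the culprit, a common workaround (a hedged sketch, not verified against this exact setup) is to write the pattern so it cannot match itself, e.g. with a character class:
ssh -i /key/path/id_rsa root@111.11.1.0 "sleep 5s; cd /root/Server; pkill -f '[s]erver.py'; /bin/bash ./server.py;"
The regex [s]erver.py still matches a running server.py process, but not the command line carrying the bracketed pattern.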

How to pipe output from a linux command to a python script?

We have a python script that accepts the following arguments as example:
mypie.py --size=20 --pielist='{"apple":[5],"orange":[7]}' --quantity=10
We tried this in our test.sh bash shell script:
test.sh 20 5 7 10
test.sh
export SIZE=$1
export APPLE=$2
export ORANGE=$3
export QUANTITY=$4
echo "--size=$SIZE" --pipelist='{\"apple\":[$2],\"orange\":[$3]}\"' --quantity=$4" | tee output.txt
cat output.txt | sudo python mypie.py
We got this error:
ERROR: size is not specified.
but when we cat output.txt, we can see that the size value is there:
--size=20 --pielist='{"apple":[5],"orange":[7]}' --quantity=10
What are we doing wrong?
Thanks
This would do -
cat output.txt | sudo xargs python mypie.py
Using the pipe redirects the content to stdin of the program - not the command line.
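A quick way to see the difference for yourself (an illustration, using the same Python 2 print statement as elsewhere on this page):
echo --size=20 | python -c 'import sys; print sys.argv[1:]; print sys.stdin.read()'
# prints [] (nothing on the command line), then --size=20 (the piped stdin)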
You could consider building up an args variable, then passing that to both output.txt and your python script.
For example:
export SIZE=$1
export APPLE=$2
export ORANGE=$3
export QUANTITY=$4
export ARGS="--size=$SIZE" --pipelist='{\"apple\":[$2],\"orange\":[$3]}\"' --quantity=$4"
echo $ARGS | tee output.txt
sudo python mypie.py $ARGS
You can use command substitution to place the output of a command directly as the arguments to your python program.
python mypie.py `cat output.txt`
or
python mypie.py $(cat output.txt)
The simple and obvious solution is to change test.sh so that it passes the parameters correctly.
#!/bin/sh
set -x # if you want to see the script's parameters
sudo python mypie.py --size="$1" \
--pipelist="{'apple':[$2],'orange':[$3]}" --quantity="$4"
Passing the arguments as standard input (which is what a pipe does) is simply not how it is usually done, nor a particularly suitable model for this particular type of interaction.

running series of interactive shell commands in bash/python/perl script

Currently, I do the following steps:
a. Grep for pid of a process and kill it.
ps -aux | grep foo.bar # process of interest
kill -9 pid_of_foo.bar # kill the process
b. start virtualenv
cd {required_folder}
sudo virtualenv folder/
cd {folder2}
source bin/activate
c. Start the manage.py in shell mode
cd {required folder}
sudo python manage.py shell
d. In the interactive manage shell, execute the following commands:
from core import *
foo.bar.bz.clear.state()
exit
e. Execute a script
/baz/maz/foo
In bash we can write down a series of commands; however, is it possible to run the interactive Django shell from bash and execute commands in it? I was wondering if the above steps can be scripted.
Thanks
You need a script like this one:
#!/bin/bash
# kill all foo.bar's instances
for pid in $(ps -aux | grep foo.bar | grep -v grep | awk '{print $2;}'); do
kill $pid
done
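# (equivalently, a single pkill -f foo.bar would replace this loop;
#  the pattern-matches-itself caveat from the ssh answer above applies)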
# start virtualenv
cd {required_folder}
...
# Start the manage.py in shell mode
cd {required folder}
cat << EOF | sudo python manage.py shell
from core import *
foo.bar.bz.clear.state()
exit
EOF
# Execute a script
/baz/maz/foo
The key point of the script is the heredoc Python snippet. Take a look at the example I've just tried in a console:
[alex@galene ~]$ cat <<EOF_MARK | python -
> import sys
> print "Hello, world from python %s" % sys.version
> exit
> EOF_MARK
Hello, world from python 2.7.6 (default, Nov 22 2013, 22:57:56)
[GCC 4.7.2 20121109 (ALT Linux 4.7.2-alt7)]
[alex@galene ~]$ _
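One general shell caveat that applies to such heredocs: with an unquoted delimiter, the outer shell expands $variables and backticks inside the heredoc before Python ever sees them. If your Python snippet contains such characters, quote the delimiter:
cat << 'EOF' | sudo python manage.py shell
from core import *
foo.bar.bz.clear.state()
EOF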

How to execute remote pysftp commands under a specific shell

I use the python module pysftp to connect to a remote server. Below you can see the python code:
import pysftp
import sys
import sqr_common
srv = pysftp.Connection(host="xxxxxx", username="xxxx",
password="xxxxx")
command = "/usr/bin/bash"
command2="APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash is an infinite process, so my second command will never be executed. Can anyone help me choose a shell on the remote server, for example bash, and execute a command in bash on the remote server? Is there any pysftp function that allows choosing the shell?
try this
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more a question of how to execute commands under a specific shell.
If you need to run only a single command, you can use the bash -c switch:
bash -c "echo 123"
You can run multiple commands separated by semicolons:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, create a shell script file (a .bash file) on the remote machine and execute it:
bash myscript.bash
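Applied to the pysftp question above, the export and the command that needs it must share a single bash -c string, because each execute() call runs in a fresh shell. A sketch (echo stands in for the real command that uses APSHOME):
/usr/bin/bash -c 'APSHOME=/all/aps/msc_2012; export APSHOME; echo "$APSHOME"'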

Is there any way to run a python script on remote machine without sending it?

I can run a shell script on remote machine with ssh. For example:
ssh -l foo 192.168.0.1 "`cat my_script.sh`"
Now I want to run a python script without sending the .py file. Is there any way?
This will put the contents of my_script.py on your computer into an echo command that is performed on the remote computer and passed into python.
ssh -l foo 192.168.0.1 "echo '`cat my_script.py`' | python"
If you want to add command line args it should be as simple as putting them after the python command like so:
ssh -l foo 192.168.0.1 "echo '`cat my_script.py`' | python -testSwitch -arg 0"
Make sure that the command line args are inside the double quotes of the command you are sending to the remote host.
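A variation that sidesteps the quoting pitfalls of echo '...' (a single quote anywhere inside my_script.py would break that version): let ssh feed the file to the remote python on stdin and put the arguments after the -:
ssh -l foo 192.168.0.1 "python - -testSwitch -arg 0" < my_script.py
python then reads the program from stdin and passes -testSwitch -arg 0 through in sys.argv.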
