How to execute a Python script on a remote machine using PsExec? - python

I am trying to execute a Python script on a remote machine using PsExec. The Python script is already on the remote machine; I only want to execute it there. I am using the following command:
psexec -i -s -d \\123 -u xyz -p xyz C:/sample.py
But I get this error:
PsExec could not start C:\sample.py on 123:
The system cannot find the file specified
I also tried placing the path to python.exe in the PsExec command:
psexec -i -s -d \\123 -u xyz -p xyz C:\programs\python.exe C:/sample.py
Then it opens python.exe but does not execute sample.py. The paths are all correct, but I do not understand why the PsExec command cannot find the script. How can I execute the script on the remote machine using PsExec?

Remove the -d option from the command, put the path in quotes, and use backslashes in the path.

Try adding " " around the exe filename:
psexec -i -s -d \\123 -u xyz -p xyz "C:\programs\python.exe" C:/sample.py
If that doesn't work, try adding " " around the parameters as well:
psexec -i -s -d \\123 -u xyz -p xyz "C:\programs\python.exe" "C:/sample.py"
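If the command is being launched from a local Python script, passing PsExec, the interpreter path, and the script path as separate list items lets subprocess do the quoting for you. This is only a minimal sketch reusing the placeholder host, credentials, and paths from the question, and it assumes psexec.exe is on the local PATH:

import subprocess

# Placeholder host, credentials, and paths from the question; substitute your own.
cmd = [
    "psexec", r"\\123", "-u", "xyz", "-p", "xyz", "-i", "-s",
    r"C:\programs\python.exe",  # full path to the interpreter on the remote machine
    r"C:\sample.py",            # the script that already exists there
]
# Passing a list lets subprocess quote each argument, so backslashes and any
# spaces in the paths survive intact.
exit_code = subprocess.call(cmd)
print(exit_code)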

Related

Running mkdir -p remotely via ssh results in an "is not a valid local path or glob" error

I'm using the Fabric (http://www.fabfile.org) framework, which connects via ssh to a VPS (a Droplet on DigitalOcean), to push some bash commands.
Running a simple bash command, mkdir, fails with:
ValueError: 'mkdir -p /opt/create_this_dir' is not a valid local path or glob.
What could be the problem here? When I log into the VPS via ssh as root, I'm able to run
"mkdir -p /opt/create_this_dir"
and the directory gets created under /opt/ without the error I get when I run the command remotely with my Fabric script.
I need to use
run("sudo mkdir -p /opt/reimaginedworks")
instead of
put("sudo mkdir -p /opt/reimaginedworks")

ssh restrict commands using Python

I am using a Python script to restrict command usage via the command option in the authorized_keys file.
command:
ssh host-name bash --login -c 'exec $0 "$@"' mkdir -p hello
My script performs the required checks to restrict the commands. After filtering, the Python script calls sys.exit(1) on error and sys.exit(0) on success. But after it returns, the ssh command above is never executed. Is there something else I need to send from the Python script to the SSH daemon?
The command option in authorized_keys is not (only) used to validate the user's command; that command is run instead of the command provided by the user. This means that calling sys.exit(0) from it prevents the user-provided command from ever running.
In that script, after you validate the command, you need to run it too!
I think changing it to
ssh host-name bash --login -c 'exec $0 "$@" && mkdir -p hello'
should do the trick; otherwise bash assumes that only the part in the single quotes is the command to execute.
If the second part should be executed even if the first part fails, replace the && with ;
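Alternatively, to keep the filtering in Python, here is a minimal sketch of the wrapper described above. It assumes the script is referenced from authorized_keys as command="/path/to/filter.py", and the prefix whitelist is purely hypothetical; the important part is that it runs the validated command instead of just exiting 0:

#!/usr/bin/env python
import os
import subprocess
import sys

# Hypothetical whitelist: replace with your real filtering logic.
ALLOWED_PREFIXES = ("mkdir -p ", "ls ")

original = os.environ.get("SSH_ORIGINAL_COMMAND", "")
if not original.startswith(ALLOWED_PREFIXES):
    sys.stderr.write("command rejected\n")
    sys.exit(1)

# Run the validated command; exiting 0 without this executes nothing at all.
sys.exit(subprocess.call(original, shell=True))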

pssh freezes only inside shell script

I am trying to run a pssh command inside a shell script, but the script freezes and no connections are made, as verified with ps -ef (and by the fact that there is only one host in the hosts file I am using).
At this point Ctrl-C fails to kill the script, and it will not time out; only a kill command works.
If I run the same command on the command line, there is no issue. A pscp command in the same script also causes no problems, so it seems the required libraries are being loaded.
$ cat /home/myusername/tmp/hosts
mysinglehostname
Here is the script being run:
$ cat /home/myusername/bin/testpssh
#!/bin/bash
source ~/.bashrc
$HOME/path/to/python-virtualenv/bin/pscp -h "/home/myusername/tmp/hosts" "/tmp/garbage" "/tmp/garbage"
$HOME/path/to/python-virtualenv/bin/pssh -h "/home/myusername/tmp/hosts" -l myusername -p 512 -t 3 -o "out" -O GSSAPIAuthentication=no -i "whoami"
Here is what happens when I run the script:
$ /home/myusername/bin/testpssh &
[1] 18553
$ [1] 14:51:12 [SUCCESS] mysinglehostname 22
$ ps -ef | grep pssh
myusername 18580 18553 0 14:33 pts/16 00:00:00 /home/myusername/path/to/python-virtualenv/bin/python /home/myusername/path/to/python-virtualenv/bin/pssh -h /home/myusername/tmp/hosts -l myusername -p 512 -t 3 -o out -O GSSAPIAuthentication=no -i whoami
$ ## The script above is hanging after completing the pscp, before pssh completes.\
> But if I copy and paste the process line, it works fine as shown here:
$ /home/myusername/path/to/python-virtualenv/bin/python \
> /home/myusername/path/to/python-virtualenv/bin/pssh \
> -h /home/myusername/tmp/hosts -l myusername \
> -p 512 -t 3 -o out -O GSSAPIAuthentication=no -i whoami
[1] 14:59:03 [SUCCESS] mysinglehostname 22
myusername
$
The first [SUCCESS] above is for the pscp action, and no subsequent [SUCCESS] comes from the pssh command, unless it is performed explicitly on the command-line.
Why will the pssh command not work inside the bash shell script?
The script works fine if I use ksh instead of bash in the shebang line (and remove the line that sources ~/.bashrc).
I am on Red Hat 6.4, using Python 2.6.6.

Python - String gets deformed when running via SSH

I need to run some bash commands via Fabric API (ssh).
I have the following String in my Python module:
newCommand = command + "'`echo -ne '\\015'"
When I print this string directly in Python, the output is as expected:
command'`echo -ne '\015'
However, if I try to run this command via the Fabric API the command is somehow modified into this:
/bin/bash -l -c "command'\`echo -ne '\015'"
Notice the '\' before the backtick. Why is this happening? The '\' breaks my command and I can't successfully run it.
PS: the "/bin/bash -l -c" prefix is expected, since that is how Fabric runs commands over SSH.
This is not a valid shell command:
command'`echo -ne '\015'
Even if you add the missing backtick and single quote, it's nothing like writing "command" and pressing enter.
The context your command will be run in is basically what you'd get if you were to ssh in and paste a command:
clientprompt$ ssh host
Welcome to Host, User
hostprompt$ <COMMAND HERE>
You should focus your efforts on finding a single command that does what you want, rather than a series of keypresses you would type interactively to achieve it (that's not how ssh works).
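In Fabric terms, that usually means building one self-contained command string and passing it to run(), instead of simulating the Enter keypress with echo -ne '\015'. A rough sketch under that assumption, with a purely hypothetical command and task name:

from pipes import quote  # shlex.quote on Python 3
from fabric.api import run

def deploy(archive_path):
    # Quote any user-supplied pieces and send the whole thing as one
    # command, rather than a sequence of keypresses.
    run("tar -xzf %s -C /opt/app" % quote(archive_path))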

How to execute remote pysftp commands under a specific shell

I use the Python module pysftp to connect to a remote server. Below is the Python code:
import pysftp
import sys
import sqr_common
srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2="APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that /usr/bin/bash is an interactive process that never exits, so my second command is never executed. Can anyone tell me how to choose a shell (for example bash) on the remote server and execute a command in it? Is there a pysftp function that lets me choose the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you need to run only a single command, you can use the bash -c switch:
bash -c "echo 123"
You can run multiple commands separated by ;:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, create a shell script file (a .bash file) on the remote machine and execute it:
bash myscript.bash
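Applied to the pysftp code from the question, that means packing the environment setup and the real command into a single execute() call; starting /usr/bin/bash on its own just blocks waiting for input. A minimal sketch reusing the placeholder credentials, with echo $APSHOME standing in for whatever you actually want to run:

import pysftp

srv = pysftp.Connection(host="xxxxxx", username="xxxx", password="xxxxx")
# One self-contained bash -c invocation; the single quotes keep $APSHOME from
# being expanded by the remote login shell before bash runs.
output = srv.execute("/usr/bin/bash -c 'APSHOME=/all/aps/msc_2012; export APSHOME; echo $APSHOME'")
for line in output:
    print(line)
srv.close()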
