I have a Python script from which I am executing a process on a remote machine as follows:
sample = sp.Popen(['c:/psexec/PsExec.exe', '-i', '-s', '\\\\' + 'xyz', '-u', 'sample', '-p', 'xyz', 'C:/sample.bat'], stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.PIPE)
It executes well, but I do not want to provide the complete path to the exe, as follows:
sample = sp.Popen(['psexec', '-i', '-s', '\\\\' + 'xyz', '-u', 'sample', '-p', 'xyz', 'C:/sample.bat'], stdin=sp.PIPE, stdout=sp.PIPE, stderr=sp.PIPE)
It does not work when I remove the complete psexec exe path. What am I doing wrong, and how can I execute the script using only the psexec keyword?
I know this question is fairly old, but it appears that psexec is not installed in the standard location, so the location of the psexec binary is likely not in your system's PATH environment variable. Add c:\psexec to the PATH env var on the local machine, start a new command prompt, and this should work.
Related
Running rabbitmqctl from a Python package using subprocess returns "command not found".
proc = subprocess.Popen(['/path/to/rabbitmqctl', 'arguments'], stdout=subprocess.PIPE)
output = proc.communicate()[0]
rt = proc.returncode
The above code is part of a Python project that will be packaged as a wheel distribution. After installing the wheel through pip, the above code returns exit code 127, which is "command not found".
I tried with the full path to rabbitmqctl, used sudo with the command, and used preexec_fn in subprocess to set the uid to the rabbitmq user, but everything returns returncode 127.
The command executes fine in the Python interpreter. The issue occurs only when the code is installed as a package.
This code is part of a Flask app controlled by gunicorn. I've even tried starting gunicorn with sudo, but ended up getting the same error.
The issue was due to the python virtual environment.
I installed the package that provides the rabbitmqctl command in a Python virtual environment. So even though the module had root privileges, it could not find the rabbitmqctl command, because the path to that binary was not part of the virtual environment's PATH environment variable. I fixed it by passing the env parameter to subprocess:
rabbit_env = os.environ.copy()
rabbit_env['PATH'] = '/path/where/rabbitmqctl/is/located/:' + rabbit_env['PATH']
proc = subprocess.Popen(['/path/to/rabbitmqctl', 'arguments'], env=rabbit_env, stdout=subprocess.PIPE)
output = proc.communicate()[0]
rt = proc.returncode
The reason I got exit code 127 even when I specified the full path to rabbitmqctl is that rabbitmqctl is a script that runs other commands, and it could not find those dependent commands because their locations were not on the virtual environment's PATH. So make sure you add the locations of all of rabbitmqctl's dependent commands to rabbit_env['PATH'] above.
I am trying to invoke a shell script using python's subprocess module.
The shell script activates a virtualenv using virtualenvwrapper and in turn invokes a python script.
The last invoked python script needs libraries installed in virtualenv and it is crashing.
I tried activating the virtualenv again in the Python script, but to no avail.
Parent Python code-
command = "/home/arman/analysis_server/new_analysis/run"
output = subprocess.Popen([command], shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
/run script -
#!/bin/bash
export WORKON_HOME=~/Envs
source /usr/local/bin/virtualenvwrapper.sh
workon analytics
python /home/arman/analysis_server/new_analysis/AnalysisWrapper.py
AnalysisWrapper.py -
cmd = "python /home/arman/analysis_server/new_analysis/DataHandlerWrapper.py " + instrument + " &"
subprocess.Popen(cmd, shell=True, executable='/bin/bash', stdout=out, stderr=out)
The DataHandlerWrapper.py script needs the virtualenv, but it is crashing.
I think your issue is that Popen spawns a subshell, so activating the virtualenv in one subprocess and trying to use it in another is never going to work.
If there's nothing happening in between you could perhaps try chaining your commands into the same process:
command = "/home/arman/analysis_server/new_analysis/run && python /home/arman/analysis_server/new_analysis/DataHandlerWrapper.py " + instrument + " &"
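The reason chaining works is that both commands then execute in the same shell process, so state set up by the first is still visible to the second. A toy sketch of the mechanism, using a hypothetical exported variable in place of the virtualenv activation:

```python
import subprocess

# The 'export' in the first command is visible to the second because the
# '&&' chain runs inside one and the same bash process.
proc = subprocess.Popen('export GREETING=hello && echo "$GREETING world"',
                        shell=True, executable='/bin/bash',
                        stdout=subprocess.PIPE)
out, _ = proc.communicate()
print(out.decode().strip())  # prints: hello world
```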
Requirement - I want to trigger the command "service.bat install" from CMD as administrator in a Python script.
I am using the code below as of now, and it opens a new window asking for permission, where I have to say yes.
source_path = 'C:\\JBoss_Playground\\wildfly-10.1.0.Final\\bin\\service'
cmd_command = 'service.bat install'
os.chdir(source_path)
subprocess.call(cmd_command, shell=True)
Now, I want to run the above command in the background with admin privileges on a Windows server. This is for a module to automate JBoss/WildFly application service configuration.
I appreciate your help. Thank you!
I used:
os.system('your command')
To start it as admin, I think the Python script has to be executed as an administrator. For a background process...
In my project I used:
import subprocess
proc = subprocess.Popen('cmd.exe', stdin = subprocess.PIPE, stdout = subprocess.PIPE)
stdout, stderr = proc.communicate('dir c:\\')
stdout
I tried it in the Python IDLE and it returned:
'Microsoft Windows [Version 6.1.7600]...'
Have you tried the runas command? You can switch to admin mode with it. Please check the runas syntax.
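As a sketch of that suggestion (note that runas is interactive: it prompts for the account password and will not read it from a pipe, so this only suits attended runs; the path is taken from the question above):

```python
import subprocess

# Directory containing service.bat, taken from the question above.
bat_dir = r'C:\JBoss_Playground\wildfly-10.1.0.Final\bin\service'

# runas re-runs the command under another account; 'cmd /c' lets us
# cd into the right directory before invoking the batch file.
cmd = ['runas', '/user:Administrator',
       'cmd /c "cd /d {} && service.bat install"'.format(bat_dir)]
# subprocess.call(cmd)  # uncomment on the Windows server
```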
OS X 10.13.6 Python 3.6
I am trying to run the following command from a jupyter notebook:
from subprocess import Popen, PIPE, STDOUT

vpn_cmd = '''
sudo openvpn
--config ~/Downloads/configs/ipvanish-US-Chicago-chi-a49.ovpn
--ca ~/Downloads/configs/ca.ipvanish.com.crt'''
proc = Popen(vpn_cmd.split(), stdout=PIPE, stderr=STDOUT)
stdout, stderr = proc.communicate()
print(stdout.decode())
But get the error:
sudo: openvpn: command not found
What I've tried:
added export PATH="/usr/local/sbin:$PATH" to my ~/.bash_profile and can run the sudo openvpn command from my terminal
edited my sudoers file so sudo no longer prompts for a password
called sudo which openvpn and tried adding /usr/local/sbin/openvpn to my sys.path within python
not splitting vpn_cmd and setting shell=True
tried packaging it in a test.py script and executing from the terminal, but it just hangs at the proc.communicate() line
specified the full path for the --config and --ca flags
So far, nothing has fixed this. I can run openvpn from my terminal just fine. It seems like a simple path issue, but I can't figure out what I need to add to my Python path. Is there something particular about the jupyter notebook kernel?
Jupyter probably isn't picking up your personal .bashrc settings, depending also on how you are running it. Just hardcode the path or augment the PATH in your Python script instead.
With shell=False you don't get the tildes expanded; so you should change those to os.environ["HOME"], or make sure you know in which directory you run this, and use relative paths.
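If you'd rather keep calling openvpn by name, augmenting the PATH inside the notebook process is enough. A minimal sketch, assuming openvpn lives in /usr/local/sbin as sudo which openvpn reported:

```python
import os
import shutil

# Prepend the directory so name lookup finds openvpn in this process
# and in any child process started from it.
os.environ['PATH'] = '/usr/local/sbin' + os.pathsep + os.environ['PATH']

# shutil.which mirrors the lookup Popen performs: it prints the full
# path if openvpn is installed there, or None otherwise.
print(shutil.which('openvpn'))
```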
You should not be using Popen() if run can do what you require.
import os
import subprocess
from subprocess import PIPE

home = os.environ["HOME"]
r = subprocess.run(
    ['sudo', '/usr/local/sbin/openvpn',
     '--config', home + '/Downloads/configs/ipvanish-US-Chicago-chi-a49.ovpn',
     '--ca', home + '/Downloads/configs/ca.ipvanish.com.crt'],
    stdout=PIPE, stderr=PIPE, universal_newlines=True)
print(r.stdout)
I have the following code that works great to run the ls command. I have a bash alias, alias ll='ls -alFGh'. Is it possible to get Python to run the bash command without Python loading my bash_alias file, parsing it, and then actually running the full command?
import subprocess
command = "ls" # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None, shell=True)
#Launch the shell command:
output = process.communicate()
print (output[0])
Trying with command = "ll", the output I get is:
/bin/sh: ll: command not found
b''
You cannot. When you run a Python process, it has no knowledge of a shell alias. Other than IPC, there are only two simple ways of passing text from a parent to a child process: the command line and environment (i.e. exported) variables. Bash does not support exporting aliases.
From the man bash pages: "For almost every purpose, aliases are superseded by shell functions."
Bash does support exporting functions, so I suggest you make your alias a simple function instead. That way it is exported from the shell, through Python, to the subshell. For example:
In the shell:
ll() { ls -l; }
export -f ll
In python:
import subprocess
command = "ll" # the shell command
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=None, shell=True)
output = process.communicate()
print(output[0].decode()) # Required if using Python 3
Since you are using the print() function, I have assumed you are using Python 3, in which case you need the .decode(), since a bytes object is returned.
With a bit of hackery, it is possible to create and export shell functions from Python as well.
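For completeness, here is one way that hackery can look. Modern bash imports exported functions from environment variables named BASH_FUNC_name%% whose value starts with () { (the exact encoding is a bash implementation detail and has varied between builds), so a function can be defined from Python and used in a bash child. A sketch relying on that bash-specific convention:

```python
import os
import subprocess

# Define the function in the child's environment using bash's
# exported-function naming convention (BASH_FUNC_<name>%%).
env = os.environ.copy()
env['BASH_FUNC_ll%%'] = '() { ls -l; }'

# Invoke bash explicitly: /bin/sh may not import bash function exports.
proc = subprocess.Popen(['bash', '-c', 'll'], env=env,
                        stdout=subprocess.PIPE)
output = proc.communicate()
print(output[0].decode())
```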