I need to run a batch script on my Windows Server machine. I am launching it from my local system with a WMIC command like this:
wmic /node:ipaddress /user:\username /password:password process call create "E:\Hello\start.bat"
This gives me a process ID and a status.
That batch file starts a service on the server machine, and I can access that service from my local system. Once the operations on my local system are done, I have to press Q in that batch file's console to generate a report. So I need to send the key Q to that process on the server from my local machine, using its PID.
Is there a way to do that? I will have to execute these two commands from a Python script to automate the process.
Thanks in advance.
I tried providing input to the WMIC process by writing "Q" to its stdin via subprocess.PIPE, but it did not work.
The two commands are: 1) wmic /node:ipaddress /user:\username /password:password process call create "E:\Hello\start.bat", which gives me the process ID and status; 2) a command that sends the key "Q" to the same process ID returned by the first command. Another solution that achieves this would also be fine, but the problem is that the server is Windows Server 2016 and communicating with that server from Python code is a little difficult.
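For the first command, one possible approach (a rough sketch, with ipaddress, username and password as placeholders for the real values) is to run WMIC through subprocess and pull the ProcessId out of its output; sending the Q key to that remote PID is the part that still needs a separate answer:

    import re
    import subprocess

    # Placeholders: replace ipaddress, username and password with the real values.
    cmd = [
        "wmic", "/node:ipaddress", r"/user:\username", "/password:password",
        "process", "call", "create", r"E:\Hello\start.bat",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)

    # On success WMIC prints something like "ProcessId = 1234;" and "ReturnValue = 0;"
    match = re.search(r"ProcessId\s*=\s*(\d+)", result.stdout)
    pid = int(match.group(1)) if match else None
    print("remote PID:", pid)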
The only solution I found was this:
cat MyScript.py | ssh username@ip_addr python -
The problem with this is that it won't show the output of the script live; it waits until the program is finished and then displays the output.
I'm using scapy in this script to sniff packets, and I have to run it on that remote server (and no, I can't copy it there or write it there).
So what is the solution? How can I view the output of a script live in my command line?
I'm using Windows 10.
Important note: I also think SSH is doing some sort of buffering and only sends the output once the amount printed exceeds a buffer, since when the output is very large part of it does suddenly show up. I want the output of that script to reach my computer as soon as possible, not after it fills up a buffer or something.
You should first send the file to your remote machine using
scp MyScript.py username@ip_addr:/path/to/script
then SSH to your remote machine using
ssh username@ip_addr
And finally, you run your script normally:
python path/to/MyScript.py
EDIT
To execute your script directly without copying it to the remote machine, use this command:
ssh user@ip_addr 'python -s' < script.py
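A note on the buffering mentioned in the question: Python itself block-buffers stdout when it is not attached to a terminal, so adding -u (unbuffered) to the remote interpreter usually helps, e.g. ssh user@ip_addr 'python -u -' < script.py. If you prefer to drive this from Python, here is a rough Paramiko sketch (Paramiko is assumed to be installed locally, and the password is a placeholder) that feeds the script over stdin and prints each output line as it arrives:

    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("ip_addr", username="username", password="password")  # placeholders

    with open("MyScript.py") as f:
        script = f.read()

    # -u keeps the remote interpreter from buffering its stdout
    stdin, stdout, stderr = client.exec_command("python -u -")
    stdin.channel.sendall(script.encode())  # feed the script over stdin
    stdin.channel.shutdown_write()          # signal EOF so the interpreter starts executing

    for line in iter(stdout.readline, ""):
        print(line, end="")  # lines appear as the remote script prints them

    client.close()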
Looking for an answer to the following scenario:
I have a Python script which I have plugged into the Windows Task Scheduler. The task runs successfully but does not do the work. The script is supposed to download a file from an SFTP location, decrypt it, and upload the resulting CSV file into SQL Server.
The script runs fine when I start it from the command line; however, the same script does not work when run from the scheduler.
I have tried the following approaches to solve it:
The task is running with the highest privileges.
The task does not require the user to be logged in; it runs without a login as well.
Other than the above two options, I have no idea what else needs to be done.
PS: In the Task Scheduler's history I can see that no access-denied error is coming up.
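One thing worth checking (an assumption, since the question does not say how paths are handled): Task Scheduler starts the script in its own working directory unless the task's "Start in" field is set, so relative paths that work from your command line can silently point somewhere else. A small diagnostic block like the sketch below, placed at the top of the scheduled script, shows what the script actually sees when the scheduler runs it (the log path is just an example):

    import logging
    import os
    import sys

    # Example log location; use any absolute path the task's account can write to.
    logging.basicConfig(filename=r"C:\Temp\sftp_job.log", level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")
    logging.info("interpreter: %s", sys.executable)
    logging.info("working directory: %s", os.getcwd())

    # Resolve files relative to the script itself rather than the current directory.
    BASE_DIR = os.path.dirname(os.path.abspath(__file__))
    logging.info("script directory: %s", BASE_DIR)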
I am writing a Python script that creates and runs several VMs via virsh for the user. Some of the configuration has to be done by executing commands inside the VM which I would want to do automatically.
What would be the easiest way to get remote shell access in Python? I am considering the following approaches:
To use the virsh console command as a sub-process and do I/O to it.
To bring up an SSH session to the VM. I can configure the VM before it boots by editing its file system so I know its target IP address.
Any better API for doing this, such as RPC?
I need to get the return values for commands so I know if they executed correctly or not. For that matter I need to be able to detect when a program I invoke has finished. Options #1 and #2 rely on scraping the output and that gets complex.
Any suggestions much appreciated.
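For what it's worth, option #2 avoids most of the output scraping, because SSH reports the command's exit status directly. A minimal Paramiko sketch (the VM address and credentials below are placeholders):

    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("192.168.122.10", username="root", password="password")  # placeholders

    stdin, stdout, stderr = client.exec_command("some-setup-command --flag")
    status = stdout.channel.recv_exit_status()  # blocks until the remote command finishes
    print("exit status:", status)
    print(stdout.read().decode())
    print(stderr.read().decode())

    client.close()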
I'm writing a program to monitor a running Python script. There will be multiple instances of this script, which is a server, running from different locations on the computer. My program will monitor all the "watched" instances to make sure they are up and running, and will restart them as necessary.
The server script cannot be edited.
My problem is that the server process just shows up as the python executable, and I am unable to determine the location of the specific server script on the computer.
Is there any way to determine which script is actually running in a specific python/pythonw.exe process, and also its path?
ENV: Windows only
You can run the following command from the command line:
WMIC PROCESS get Caption,Commandline,Processid,ExecutablePath
This will show you which module is being executed.
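Since the monitor itself is a Python program, the same query can be run from code and filtered for Python processes, for example with a sketch along these lines (the filtering is a simple illustration, not part of WMIC itself):

    import subprocess

    out = subprocess.check_output(
        ["WMIC", "PROCESS", "get", "Caption,Commandline,Processid,ExecutablePath"],
        text=True,
    )
    for line in out.splitlines():
        if "python" in line.lower():
            # The CommandLine column includes the path of the script being run.
            print(line.strip())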
I want to execute a Python script on several (15+) remote machines using SSH. After invoking the script/command, I need to disconnect the SSH session and keep the processes running in the background for as long as they are required.
I have used Paramiko and PySSH in the past, so I have no problem using them again. The only thing I need to know is how to disconnect an SSH session in Python (since normally the local script would wait for each remote machine to finish processing before moving on).
This might work, or something similar:
ssh user@remote.host nohup python scriptname.py &
Basically, have a look at the nohup command.
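Since you already use Paramiko, roughly the same thing can be done from Python. The sketch below (host list and credentials are placeholders) launches the script with nohup, waits only for the shell to return, and then disconnects while the remote process keeps running:

    import paramiko

    hosts = ["10.0.0.1", "10.0.0.2"]  # example addresses; replace with your 15+ machines

    for host in hosts:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(host, username="username", password="password")  # placeholders
        # nohup + output redirection + & detaches the process from the SSH session
        stdin, stdout, stderr = client.exec_command(
            "nohup python scriptname.py > /dev/null 2>&1 &")
        stdout.channel.recv_exit_status()  # returns quickly; the backgrounded job lives on
        client.close()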
On Linux machines, you can run the script with 'at'.
echo "python scriptname.py" ¦ at now
If you are going to perform repetitive tasks on many hosts, such as deploying software and running setup scripts, you should consider using something like Fabric:
Fabric is a Python (2.5 or higher) library and command-line tool for
streamlining the use of SSH for application deployment or systems
administration tasks.
It provides a basic suite of operations for executing local or remote
shell commands (normally or via sudo) and uploading/downloading files,
as well as auxiliary functionality such as prompting the running user
for input, or aborting execution.
Typical use involves creating a Python module containing one or more
functions, then executing them via the fab command-line tool.
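A minimal fabfile sketch against the Fabric 1.x API described above (host names are placeholders; note that backgrounding under Fabric's run usually needs pty=False plus output redirection):

    # fabfile.py
    from fabric.api import env, run

    env.hosts = ["user@10.0.0.1", "user@10.0.0.2"]  # example hosts

    def start():
        # Detach the script so it survives after Fabric disconnects.
        run("nohup python scriptname.py > /dev/null 2>&1 &", pty=False)

Running fab start then executes the task on every host listed in env.hosts.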
You can even use tmux in this scenario.
As per the tmux documentation:
tmux is a terminal multiplexer. It lets you switch easily between several programs in one terminal, detach them (they keep running in the background) and reattach them to a different terminal. And do a lot more
From a tmux session, you can run a script, quit the terminal, log in again and check back, as it keeps the session alive until the server restarts.
How to configure tmux on a cloud server
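For example (standard tmux flags, and sniffer is just an example session name):
tmux new-session -d -s sniffer 'python path/to/MyScript.py'
starts the script in a detached session that you can later reattach with tmux attach -t sniffer.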