I'm trying to create a true interactive remote shell using Python. When I say true, I mean I don't want to just execute a single command and send back the results; I have that working already. I also don't want to abstract single-command execution by having the server interpret directory changes or the like.
I am trying to have a client start an interactive /bin/bash and have the server send commands which are then executed by the same persistent shell. For instance, if I run cd /foo/bar, then pwd would return /foo/bar, because I would be interacting with the same bash shell.
Here's some slimmed-down example code that currently only does single command execution...
# client.py
import socket
import subprocess
s = socket.socket()
s.connect(('localhost', 1337))
while True:
    cmd = s.recv(1024)
    # single command execution currently (not interactive shell)
    results = subprocess.Popen(cmd, shell=True,
                               stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                               stdin=subprocess.PIPE)
    results = results.stdout.read() + results.stderr.read()
    s.sendall(results)
# server.py
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', 1337))
s.listen(5)
conn, _ = s.accept()
while True:
    cmd = raw_input('> ').rstrip()
    conn.send(cmd)
    results = conn.recv(4096)
    print results
I've tried many solutions, none of which have worked. The subprocess module has a communicate() method, but it kills the shell after a single command. I'd really like to be able to accomplish this with the stdlib, but I've also looked at the pexpect module after reading this thread. However, I can't get that to work either, and it doesn't look like its primary use case is creating an interactive shell, but rather catching specific command-line output for interaction. I can't even get single command execution working with pexpect...
import pexpect, sys
proc = pexpect.spawn('/bin/bash')
proc.logfile = sys.stdout
proc.expect('$ ')
proc.sendline('pwd\n')
If anyone can help it would be appreciated. I feel like there could be a way to spawn off a /bin/bash -i with subprocess, use threads, and then somehow write to its stdin and read from its stdout. Thanks in advance, and sorry for the length.
Try this code:
# client.py
import socket
import subprocess
s = socket.socket()
s.connect(('localhost', 1337))
process = subprocess.Popen(['/bin/bash', '-i'],
                           stdout=s.makefile('wb'), stderr=subprocess.STDOUT,
                           stdin=s.makefile('rb'))
process.wait()
# server.py
import socket
import subprocess
import sys

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', 1337))
s.listen(5)
conn, _ = s.accept()
fp = conn.makefile('wb')
# 'cat' copies everything the remote shell writes to the socket onto our terminal
proc1 = subprocess.Popen('cat', stdin=conn.makefile('rb'))
while True:
    fp.write(sys.stdin.read(4096))
    fp.flush()  # push each chunk of typed input to the remote shell right away
proc1.wait()
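If you would rather keep ordinary subprocess pipes and copy the data yourself (closer to the threaded idea in the question), here is a rough, untested sketch of the client side; the host, port, and /bin/bash -i come straight from the question:
# client_threaded.py (sketch only)
import socket
import subprocess
import threading

s = socket.socket()
s.connect(('localhost', 1337))

# one persistent interactive bash; we talk to it through ordinary pipes
proc = subprocess.Popen(['/bin/bash', '-i'],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT)

def pump_output():
    # copy whatever bash prints back over the socket, byte by byte
    while True:
        byte = proc.stdout.read(1)
        if not byte:
            break
        s.sendall(byte)

t = threading.Thread(target=pump_output)
t.daemon = True
t.start()

while True:
    cmd = s.recv(1024)
    if not cmd:
        break
    proc.stdin.write(cmd.rstrip() + '\n')  # bash only runs the line once it sees a newline
    proc.stdin.flush()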
I am trying to use telnet to check service connections to a server. This is the code I used:
p = subprocess.run("telnet localhost 80", shell=True, universal_newlines=True, stdout=subprocess.PIPE)
print(p.stdout)
If I run this, there is a blank response, and it seems telnet waits in the background to time out until I press Ctrl+C.
If I run the following code:
p = subprocess.run("telnet localhost 80", shell=True, universal_newlines=True)
there is a response as below:
Trying ::1...
Connected to localhost.
Escape character is '^]'.
If I want to use this script to test connections to servers, how do I pass Ctrl+] and 'quit' to get out of the telnet prompt?
Also, if I want to test responses by sending "GET/" through telnet, how do I do it?
You'd need to use subprocess.Popen() and communicate with the telnet process using the resulting popen object's stdin/stdout streams.
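For what it's worth, a rough sketch of that approach (it assumes the telnet binary is on your PATH, and driving telnet this way is finicky, which is a big part of why I wouldn't bother):
import subprocess

p = subprocess.Popen(['telnet', 'localhost', '80'],
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.STDOUT,
                     universal_newlines=True)
try:
    # whatever you write here is fed to telnet's stdin, e.g. an HTTP request
    out, _ = p.communicate('GET / HTTP/1.0\r\n\r\n', timeout=10)
except subprocess.TimeoutExpired:
    p.kill()
    out, _ = p.communicate()
print(out)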
But – do you actually need to shell out to telnet? If you only need to know whether something is connectable,
import socket
s = socket.socket()
s.connect(('localhost', 80))
s.close()
will raise an exception if connecting fails; if you need to do a simple ping/pong,
import socket
s = socket.socket()
s.connect(('localhost', 80))
s.sendall(b'Hello? Are you there?\n')
print(s.recv(8192))
s.close()
might suffice.
(You may need to look into adding timeouts and such, though.)
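For the "GET/" part of the question, here is a minimal sketch combining that with a timeout; the 5-second value and the request line are just examples:
import socket

s = socket.socket()
s.settimeout(5)  # raise socket.timeout if connect/recv take too long
s.connect(('localhost', 80))
s.sendall(b'GET / HTTP/1.0\r\nHost: localhost\r\n\r\n')
print(s.recv(8192))  # first chunk of the HTTP response
s.close()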
Try:
import subprocess

proc = subprocess.Popen(['lsof', '-iTCP', '-sTCP:LISTEN', '-n', '-P'], stdout=subprocess.PIPE)
tmp = proc.stdout.read()
I have a (very) simple web server I wrote in C and I want to test it. I wrote it so it takes data on stdin and sends out on stdout. How would I connect the input/output of a socket (created with socket.accept()) to the input/output of a process created with subprocess.Popen?
Sounds simple, right? Here's the killer: I'm running Windows.
Can anyone help?
Here's what I've tried:
Passing the client object itself as stdin/out to subprocess.Popen. (It never hurts to try.)
Passing socket.makefile() results as stdin/out to subprocess.Popen.
Passing the socket's file number to os.fdopen().
Also, in case the question was unclear, here's a slimmed-down version of my code:
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('', PORT))
sock.listen(5)
cli, addr = sock.accept()
p = subprocess.Popen([PROG])
#I want to connect 'p' to the 'cli' socket so whatever it sends on stdout
#goes to the client and whatever the client sends goes to its stdin.
#I've tried:
p = subprocess.Popen([PROG], stdin = cli.makefile("r"), stdout = cli.makefile("w"))
p = subprocess.Popen([PROG], stdin = cli, stdout = cli)
p = subprocess.Popen([PROG], stdin = os.fdopen(cli.fileno(), "r"), stdout = os.fdopen(cli.fileno(), "w"))
#but all of them give me either "Bad file descriptor" or "The handle is invalid".
I had the same issue and tried the same ways to bind the socket, also on Windows. The solution I came up with was to share the socket with the child process and bind it to stdin and stdout there. My solution is entirely in Python, but I guess it is easily convertible.
import socket, subprocess
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('', PORT))
sock.listen(5)
cli, addr = sock.accept()
process = subprocess.Popen([PROG], stdin=subprocess.PIPE)
process.stdin.write(cli.share(process.pid))
process.stdin.flush()
# you can now use `cli` as client normally
And in the other process:
import sys, os, socket
sock = socket.fromshare(os.read(sys.stdin.fileno(), 372))
sys.stdin = sock.makefile("r")
sys.stdout = sock.makefile("w")
# stdin and stdout now write to `sock`
The 372 is the length of the data returned by the socket.share() call, as measured on my machine. I don't know whether it is constant, but it worked for me. This is only possible on Windows, as the share() function is only available on that OS.
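If the exact size worries you, a variation I have not tested would be to length-prefix the shared blob so the child knows exactly how much to read; this reuses the cli and process names from the snippets above:
# parent side -- length-prefix the shared socket data
import struct

shared = cli.share(process.pid)
process.stdin.write(struct.pack('<I', len(shared)) + shared)
process.stdin.flush()

# child side -- read the 4-byte length first, then exactly that many bytes
import os, struct, socket, sys

def read_exact(fd, n):
    buf = b''
    while len(buf) < n:
        chunk = os.read(fd, n - len(buf))
        if not chunk:
            raise EOFError('pipe closed before the socket data arrived')
        buf += chunk
    return buf

(length,) = struct.unpack('<I', read_exact(sys.stdin.fileno(), 4))
sock = socket.fromshare(read_exact(sys.stdin.fileno(), length))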
I want to know whether there is a way to send a multi-line command to Maya through a Python socket and Maya's own commandPort command.
I'm using the code below to send commands to Maya (the MyMessage value is the command):
import socket
#HOST = '192.168.1.122' # The remote host
HOST = '127.0.0.1' # the local host
PORT = 54321 # The same port as used by the server
ADDR=(HOST,PORT)
def SendCommand():
    client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    client.connect(ADDR)
    command = 'import maya.cmds as mc mc.polyCube()'  # the command from the external editor to Maya
    MyMessage = command
    client.send(MyMessage)
    data = client.recv(1024)  # receive the result info
    client.close()
    print 'The Result is %s' % data

if __name__ == '__main__':
    SendCommand()
When I send a single command like 'polyCube()' it works, but sending a multi-line Python command such as:
import maya.cmds as mc
mc.polyCube()
raises an "invalid syntax" error!
Try:
command = 'import maya.cmds as mc\nmc.polyCube()'
For sending small commands to Maya, @pajton's method works, or you can use ; as a separator:
command = "import maya.cmds as mc; mc.polyCube()"
If possible, the easiest way to send many lines at once is to create a separate .py file that Maya has access to.
command = "import sys; sys.append(r'c:\path to my_script');"
command += "import my_script; my_script.run()"
I'm having a bit of trouble. I want to create a simple program that connects to the server and executes a command using subprocess then returns the result to the client. It's simple but I can't get it to work. Right now this is what I have:
client:
import sys, socket, subprocess
conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
host = sys.argv[1]
port = int(sys.argv[2])
socksize = 1024
conn.connect((host, port))
while True:
    shell = raw_input("$ ")
    conn.send(shell)
    data = conn.recv(socksize)
    #msglen = len(data)
    output = data
    iotype = subprocess.PIPE
    cmd = ['/bin/sh', '-c', shell]
    proc = subprocess.Popen(cmd, stdout=iotype).wait()
    stdout,stderr = proc.communicate()
    conn.send(stdout)
    print(output)
    if proc.returncode != 0:
        print("Error")
server:
import sys, socket, subprocess
host = ''
port = 50106
socksize = 1024
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind((host, port))
print("Server started on port: %s" %port)
s.listen(1)
print("Now listening...\n")
conn, addr = s.accept()
while True:
    print 'New connection from %s:%d' % (addr[0], addr[1])
    data = conn.recv(socksize)
    cmd = ['/bin/sh', '-c', data]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE).wait()
    stdout,stderr = cmd.communicate()
    if not data:
        break
    elif data == 'killsrv':
        sys.exit()
Danger, Will Robinson!!!
Do you really want to send commands in clear text without authentication over the network? It is very, very dangerous.
Do it over SSH with paramiko.
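For example, roughly along these lines; the host name and credentials are placeholders, and this is a sketch only (it requires the third-party paramiko package):
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('example.com', username='user', password='secret')  # placeholders
stdin, stdout, stderr = client.exec_command('uname -a')
print stdout.read()
client.close()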
Alright, I've heard this answer too many times. I don't want to use SSH; I'm just building it to learn more about sockets. I'm not going to actually use this if I want to send commands to a system. – AustinM
There is no way I could infer this noble quest from your question. :-)
The socket module is a thin layer over the POSIX library; programming plain sockets is tedious and hard to get right. As of today (2014), asynchronous I/O and concurrency are not among Python's strongest traits - 3.4 is starting to change that, but libraries will lag behind for a while. My advice is to spend your time learning a higher-level API like Twisted (twistedmatrix.com/trac). If you are really interested in the low-level stuff, dive into the project source.
Alright. Any idea on how I could use twisted for this type of thing? – AustinM
Look at twistedmatrix.com/documents/current/core/examples/#auto2
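The shape of it is roughly this (an untested sketch; running a blocking subprocess inside dataReceived is something a real Twisted program would avoid):
from twisted.internet import protocol, reactor
import subprocess

class CommandProtocol(protocol.Protocol):
    def dataReceived(self, data):
        # run the received command and write its output back to the client
        proc = subprocess.Popen(data.strip(), shell=True,
                                stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        out, _ = proc.communicate()
        self.transport.write(out)

class CommandFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return CommandProtocol()

reactor.listenTCP(50106, CommandFactory())
reactor.run()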
Well, I can understand your frustration, Austin; I was in the same boat. However, trial and error eventually worked out. Hopefully this is what you were looking for:
print "Command is:",command
op = subprocess.Popen(command, shell=True, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
if op:
output=str(op.stdout.read())
print "Output:",output
conn.sendall(output)
else:
error=str(op.stderr.read())
print "Error:",error
conn.sendall(error)
It's unclear why you are using subprocess.Popen() for the same command in both the client and the server. Here's an outline of what I would try to do (pseudocode):
client
while True:
read command from user
send command to server
wait for and then read response from server
print response to user
server
while True:
wait for and then read command from client
if command is "killsrv", exit
execute command and capture output
send output to client
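A minimal sketch of that outline (untested; Python 2 to match your code, with the port from your server):
# client.py -- sketch
import socket

conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
conn.connect(('localhost', 50106))
while True:
    shell = raw_input('$ ')
    conn.send(shell)
    print conn.recv(4096)

# server.py -- sketch
import socket, subprocess

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('', 50106))
s.listen(1)
conn, addr = s.accept()
while True:
    data = conn.recv(1024)
    if not data or data == 'killsrv':
        break
    proc = subprocess.Popen(['/bin/sh', '-c', data],
                            stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    out, _ = proc.communicate()
    conn.send(out or '(no output)\n')  # always send something so the client's recv returns
conn.close()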
The problem with your code is these lines (in both the client and the server):
proc = subprocess.Popen(cmd, stdout=iotype).wait()
stdout,stderr = proc.communicate()
You are calling wait on the Popen object, which means that the variable proc is getting an int (returned by wait) instead of a Popen object. You can just get rid of the wait -- since communicate waits for the process to end before returning, and you aren't checking the exit code anyway, you don't need to call it.
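In code, that boils down to something like:
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = proc.communicate()  # waits for the process to finish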
Then, in your client, I don't think you even need the subprocess calls, unless you're running some command that the server is sending back.
I now have a small server setup working correctly, called by:
<?php
$handle = fsockopen("udp://78.129.148.16",12345);
fwrite($handle,"vzctlrestart110");
fclose($handle);
?>
On a remote server, the following Python server is running and executing the commands:
#!/usr/bin/python
import os
import socket
print " Loading Bindings..."
settings = {}
line = 0
for each in open('/root/actions.txt', 'r'):
    line = line + 1
    each = each.rstrip()
    if each != "":
        if each[0] != '#':
            a = each.partition(':')
            if a[2]:
                settings[a[0]] = a[2]
            else:
                print " Err # line", line, ":", each
print " Starting Server...",
port = 12345
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.bind(("", port))
print "OK."
print " Listening on port:", port
while True:
    datagram = s.recv(1024)
    if not datagram:
        break
    print "Rx Cmd:", datagram
    if settings.has_key(datagram):
        print "Launch:", settings[datagram]
        os.system(settings[datagram]+" &")
s.close()
Is it possible to easily send the output of the command back? When the server is running a command, the output is shown in the SSH window, but I want that output to be sent back to the browser of the original client, perhaps by having the browser wait for 15 seconds and then check for any data received via the socket.
I know I am asking quite a lot. I am writing the PHP side, which I know a lot about, but my Python knowledge is sorely lacking.
Thanks,
Ashley
Yes, you can read the output of the command. For this I would recommend the Python subprocess module. Then you can just send it back over the socket.
Naturally this has some implications: you would probably have to let your PHP script run for a while longer, since the command may be slow.
# The pipe behaves like a file object in Python.
from subprocess import Popen, PIPE

process = Popen(cmd, shell=True, stdout=PIPE)
process_output = ""
while process.poll() is None:  # poll() returns None while the process is still running
    process_output += process.stdout.read(256)
s.send(process_output)

# Better yet:
process = Popen(cmd, shell=True, stdout=PIPE)
stdout, stderr = process.communicate()  # reads all output and waits for the process to end
s.send(stdout)
Integrated into your code:
# ... snip ...
import subprocess
# note: this assumes the socket s was created as SOCK_STREAM (TCP); a UDP socket cannot accept()
con, addr = s.accept()
while True:
    datagram = con.recv(1024)
    if not datagram:
        break
    print "Rx Cmd:", datagram
    if settings.has_key(datagram):
        print "Launch:", settings[datagram]
        process = subprocess.Popen(settings[datagram]+" &", shell=True, stdout=subprocess.PIPE)
        stdout, stderr = process.communicate()
        con.send(stdout)
con.close()
s.close()
Here's an example of how to get the output of a command:
>>> import commands
>>> s = commands.getoutput("ls *")
>>> s
'client.py'
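(The commands module is Python 2 only and deprecated; with subprocess, the rough equivalent is:)
>>> import subprocess
>>> subprocess.check_output("ls *", shell=True)
'client.py\n'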