ussp-push from Python not working

I would like to write a Python script that sends a file to my Android phone. I will run the script from my phone via SSH.
I tried the following code snippets, but neither of them worked. I did not get any errors, but when I test it, nothing arrives on my phone via Bluetooth.
from subprocess import call
call(['ussp-push', '0C:D6:xx:xx:xx:xx#1x', '/home/pi/alfred.jpg', 'ana.jpg'])
***************************************************************
from subprocess import Popen, PIPE
process = Popen(['ussp-push', '0C:D6:xx:xx:xx:xx#1x', '/home/pi/alfred.jpg', 'ana.jpg'], stdout=PIPE, stderr=PIPE)
stdout, stderr = process.communicate()
Does anyone see what the problem could be? Thanks in advance.

from subprocess import check_output, STDOUT
output = check_output(['ussp-push', '0C:D6:xx:xx:xx:xx#1x', '/home/pi/alfred.jpg', 'ana.jpg'], stderr=STDOUT)
check_output returns the subprocess' stdout and also checks its exit code, so it raises CalledProcessError on any error in the subprocess.
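For example, a minimal sketch of how that exception could be used to see why the transfer fails (the device address and file names are the placeholders from the question):

from subprocess import check_output, CalledProcessError, STDOUT

try:
    # stderr=STDOUT folds ussp-push's error messages into the captured output
    output = check_output(
        ['ussp-push', '0C:D6:xx:xx:xx:xx#1x', '/home/pi/alfred.jpg', 'ana.jpg'],
        stderr=STDOUT)
    print(output)
except CalledProcessError as e:
    # e.returncode is the exit status, e.output is everything ussp-push printed
    print('ussp-push failed with exit code', e.returncode)
    print(e.output)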

How can I simulate a key press in a Python subprocess?

The scenario is this: I have a Python script, part of which executes an external program using the code below:
subprocess.run(["someExternalProgram", "some options"], shell=True)
When the external program finishes, it requires the user to "press any key to exit".
Since this is just one step in my script, it would be good to just exit on behalf of the user.
Is it possible to achieve this and if so, how?
from subprocess import Popen, PIPE
p = Popen(["someExternalProgram", "some options"], stdin=PIPE, shell=True)
p.communicate(input=b'\n')
If you want to capture the output and the error log:
from subprocess import Popen, PIPE
p = Popen(["someExternalProgram", "some options"], stdin=PIPE, stdout=PIPE, stderr=PIPE, shell=True)
output, error = p.communicate(input=b'\n')
Remember that the input has to be a bytes object.
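If you would rather pass a plain string, a sketch of the same call in text mode (assumes Python 3.7+, where the text argument was added; on older versions universal_newlines=True does the same thing):

from subprocess import Popen, PIPE

# text=True makes the pipes work with str instead of bytes
p = Popen(["someExternalProgram", "some options"], stdin=PIPE, stdout=PIPE,
          stderr=PIPE, shell=True, text=True)
output, error = p.communicate(input='\n')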

Python's subprocess module hangs in PyCharm on self.stdout.read() only when using Chainpoint cli tool commands

Python's subprocess module hangs when calling chp (Chainpoint CLI tool) commands, but only when I do this inside PyCharm. Doing the same in a Python shell directly in the terminal works perfectly. Using other processes also works fine in PyCharm. It seems to be the combination of chp and PyCharm that creates this failure.
This is what I try:
outputs_raw = subprocess.check_output(['chp', 'version'])
it eventually hangs at:
stdout = self.stdout.read() in subprocess.py
I looked around for a solution, but all the other "subprocess hangs pycharm" results don't help.
I also tried using readingproc as an alternative, advised here. This gave me an interesting result. It keeps looping in readingproc/core.py:
while self._proc.poll() is None:
    with _unblock_read(self._proc):
        result = self._yield_ready_read()
        self._check_timeouts(chunk_timeout, total_timeout)
        if result is not None:
            self._update_chunk_time()
            yield result
Here result is always None, as self._yield_ready_read() keeps returning None, so the if statement never passes.
This is what the _yield_ready_read function looks like (in core.py):
def _yield_ready_read(self):
    stdout = b''
    stderr = b''
    if self._poll_stdout.poll(0):
        stdout = self._read_while(self._proc.stdout)
    if self._poll_stderr.poll(0):
        stderr = self._read_while(self._proc.stderr)
    if len(stdout) > 0 or len(stderr) > 0:
        return ProcessData(stdout, stderr)
    else:
        return None
I am using Python 3.7.3.
PATH is the same in the working environment and the failing one.
Can someone help me fix this issue? Thanks!
This fixed it:
from subprocess import Popen, PIPE, STDOUT

# command is the argument list from the question, e.g. ['chp', 'version']
proc = Popen(command, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
outputs_raw, errs = proc.communicate()

How to bypass gpg's pinentry prompt when running a gpg command from a Python script

I tried the following code, which works, but it does not take my passphrase: every time I run the Python code in a new cmd window I get a popup asking me to enter the passphrase. I want to automate this, so please suggest a better way to supply the passphrase from the Python script itself.
from subprocess import PIPE, Popen
output_file_name = 'abc.zip'
input_file_name = 'abc.zip.pgp'
args = ['gpg', '-o', output_file_name, '--decrypt', input_file_name]
proc = Popen(args, stdin=PIPE, stdout=PIPE, stderr=PIPE,shell=True)
proc.stdin.write('passphrase\n')
proc.stdin.flush()
stdout, stderr = proc.communicate()
print(stdout)
print(stderr)
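A minimal sketch of one common approach, assuming GnuPG 2.1+: pass --batch and --pinentry-mode loopback so gpg does not open a pinentry dialog, and tell it with --passphrase-fd 0 to read the passphrase from stdin (depending on the gpg-agent configuration, allow-loopback-pinentry may also need to be enabled):

from subprocess import PIPE, Popen

output_file_name = 'abc.zip'
input_file_name = 'abc.zip.pgp'
args = ['gpg', '--batch', '--yes', '--pinentry-mode', 'loopback',
        '--passphrase-fd', '0', '-o', output_file_name,
        '--decrypt', input_file_name]

proc = Popen(args, stdin=PIPE, stdout=PIPE, stderr=PIPE)
# --passphrase-fd 0 makes gpg read the passphrase from stdin, which communicate() feeds
stdout, stderr = proc.communicate(input=b'passphrase\n')
print(stdout)
print(stderr)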

Controlling shell stdin from Python

I have a custom input method and a Python module to communicate with it. I am trying to control the shell with it, so that everything from local stdout is printed on the remote device and everything sent from the remote device goes into local stdin. That way the remote device can also control the input given to the program: if the program calls an input function, the remote device can answer that too (like in SSH).
I used Python's subprocess to control stdin and stdout:
#! /usr/bin/python
from subprocess import Popen, PIPE
import thread
from mymodule import remote_read, remote_write

def talk2proc(dap):
    while True:
        try:
            remote_write(dap.stdout.read())
            incmd = remote_read()
            dap.stdin.write(incmd)
        except Exception as e:
            print (e)
            break

while True:
    cmd = remote_read()
    if cmd != 'quit':
        p = Popen(['bash', '-c', '"%s"'%cmd], stdout=PIPE, stdin=PIPE, stderr=PIPE)
        thread.start_new_thread(talk2proc, (p,))
        p.wait()
    else:
        break
But it doesn't work. What should I do?
P.S. Is there a difference for Windows?
I had this problem; I used this to feed stdin:
from subprocess import call
call(['some_app', 'param'], stdin=open("a.txt", "rb"))
a.txt
:q
I used this for a git wrapper; it will feed the data line by line whenever some_app pauses and expects user input.
There is a difference for Windows. This line won't work in Windows:
p = Popen(['bash', '-c', '"%s"'%cmd], stdout=PIPE, stdin=PIPE, stderr=PIPE)
because the equivalent of 'bash' is 'cmd.exe'.
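A sketch of what the Windows equivalent of that line might look like (cmd.exe's /c option plays roughly the role of bash's -c):

from subprocess import Popen, PIPE

# cmd is the command string received from the remote device, as in the question
p = Popen(['cmd.exe', '/c', cmd], stdout=PIPE, stdin=PIPE, stderr=PIPE)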

Python subprocess module hangs for spark-submit command when writing STDOUT

I have a Python script that is used to submit Spark jobs using the spark-submit tool. I want to execute the command and write the output both to STDOUT and a logfile in real time. I'm using Python 2.7 on an Ubuntu server.
This is what I have so far in my SubmitJob.py script:
#!/usr/bin/python
import subprocess

# Submit the command
def submitJob(cmd, log_file):
    with open(log_file, 'w') as fh:
        process = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        while True:
            output = process.stdout.readline()
            if output == '' and process.poll() is not None:
                break
            if output:
                print output.strip()
                fh.write(output)
    rc = process.poll()
    return rc

if __name__ == "__main__":
    cmdList = ["dse", "spark-submit", "--spark-master", "spark://127.0.0.1:7077", "--class", "com.spark.myapp", "./myapp.jar"]
    log_file = "/tmp/out.log"
    exist_status = submitJob(cmdList, log_file)
    print "job finished with status ", exist_status
The strange thing is, when I execute the same command directly in the shell it works fine and produces output on screen as the program proceeds.
So it looks like something is wrong in the way I'm using subprocess.PIPE for stdout and writing the file.
What's the currently recommended way to use the subprocess module to write to stdout and a log file in real time, line by line? I see a bunch of options on the internet, but am not sure which is correct or most recent.
Thanks.
Figured out what the problem was.
I was trying to redirect both stdout and stderr to the pipe so they display on screen. This seems to block stdout when stderr is present. If I remove the stderr=subprocess.STDOUT argument from Popen, it works fine. So for spark-submit it looks like you don't need to redirect stderr explicitly, as it already does this implicitly.
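In other words, a sketch of the loop from the question with only stdout piped, which is the variant described as working (names are the ones from the question):

import subprocess

def submitJob(cmd, log_file):
    with open(log_file, 'w') as fh:
        # Only stdout is piped; stderr goes straight to the terminal as spark-submit writes it
        process = subprocess.Popen(cmd, stdout=subprocess.PIPE)
        while True:
            output = process.stdout.readline()
            if output == '' and process.poll() is not None:
                break
            if output:
                print output.strip()
                fh.write(output)
    return process.poll()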
To print the Spark log
One can call the commandList given by user330612
cmdList = ["spark-submit", "--spark-master", "spark://127.0.0.1:7077", "--class", "com.spark.myapp", "./myapp.jar"]
Then it can be printed using subprocess; remember to use communicate() to prevent deadlocks (https://docs.python.org/2/library/subprocess.html):
"Warning: Deadlock when using stdout=PIPE and/or stderr=PIPE and the child process generates enough output to a pipe such that it blocks waiting for the OS pipe buffer to accept more data. Use communicate() to avoid that."
Here below is the code to print the log.
import subprocess
p = subprocess.Popen(cmdList, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = p.communicate()
stderr = stderr.splitlines()
stdout = stdout.splitlines()
for line in stderr:
    print line  # now it can be printed line by line, to a file or something else, for the log
for line in stdout:
    print line  # for the output
More information about subprocess and printing lines can be found at:
https://pymotw.com/2/subprocess/
