I've been looking at multiple examples here and elsewhere, but nothing seems to work for me. I know nothing about Python. All I am trying to do is run a perl script located at
sdb1/media/process.pl
The example code that I've found runs all over the place, and mostly seems like it has extra stuff that I don't need. What I'm trying right now is
#! /usr/bin/python
pipe = subprocess.Popen(["perl", "/sdb1/media/process.pl"], stdout=subprocess.PIPE)
But that just gives me the error
NameError: name 'subprocess' is not defined
If I've missed anything important, let me know. Otherwise, thanks for your time.
You need to import the subprocess module before you can use it:
#! /usr/bin/python
import subprocess
pipe = subprocess.Popen(["perl", "/sdb1/media/process.pl"], stdout=subprocess.PIPE)
Alternatively, if you use the same names many times, you can import them directly:
from subprocess import Popen, PIPE
Then you can just call
pipe = Popen(["perl", "/sdb1/media/process.pl"], stdout=PIPE)
I would have commented but I need 50 rep.
I'm trying to access server data via a jar file. Doing this in MATLAB is quite simple:
javaaddpath('*PATH*\filename.jar')
WWS=gov.usgs.winston.server.WWSClient(ip,port);
Data = eval('WWS.getRawData(var1,var2,var3)');
WWS.close;
The problem is that I need to execute this in Python, and I can't figure out how to translate these few lines of code. I've tried using the subprocess module like this:
WWS=subprocess.call(['java', 'gov/usgs/winston/server/WWSClient.class'])
but the best I can get is the error "could not find or load main class gov.usgs.winston.server.WWSClient.class"
Thankful for all the help!
You can also use the following code:
import subprocess
command = "java -jar <*PATH*\filename.jar>"
result = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
Here result is a (stdout, stderr) tuple of bytes containing the jar file's output.
There are a few ways you can do this. One of the easiest is:
import subprocess
subprocess.run(["java", "-jar", "*PATH*\filename.jar"])
subprocess.run executes a system command. It takes a list as its argument: the command you want to run followed by its arguments.
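If you need the jar's output back in Python instead of just printed to the console, run() can capture it; a minimal sketch (capture_output needs Python 3.7+, and the path is still a placeholder):
import subprocess

result = subprocess.run(["java", "-jar", r"*PATH*\filename.jar"],
                        capture_output=True, text=True)
print(result.returncode)  # 0 on success
print(result.stdout)      # whatever the jar printed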
I'm having a tough time figuring out how to run NodeJS from Python. I have no problems running ShellScript from Python and NodeJS from ShellScript, but can't seem to get NodeJS from Python, I just get the following output:
b"
These are simplified versions of my scripts.
The NodeJS script I am trying to run from within Python:
#!/usr/bin/env node
console.log("Hello " + process.argv[2]);
And here is the Python script, using Python 3:
from datetime import datetime
import json
import os
import re
import sys
import subprocess
if __name__ == '__main__':
    p = subprocess.Popen(['/Users/Brett/scripts/hello.js', 'Brett'], stdout=subprocess.PIPE)
    out = p.stdout.read()
    print(out)
Thanks for the help! Much appreciated.
EDITS:
I have no issue executing the following from the commandline, as 'hello.js' is executable:
hello.js 'Brett'
shell=True does not fix it.
Additionally, I am on macOS Catalina 10.15.5 and therefore my shell is zsh.
If I add node to the front of the command, I get "no such file or directory" for node. I tried it as follows:
p = subprocess.Popen(['node', '/Users/Brett/scripts/hello.js', 'Brett'], stdout=subprocess.PIPE)
Thanks everyone for the responses. All were super helpful, especially @max-stanley and @jared-smith.
The following ended up working for me:
p = subprocess.Popen(['/usr/local/bin/node', '/Users/Brett/scripts/hello.js', 'Brett'], stdout=subprocess.PIPE)
out = p.stdout.read()
print(out)
Not sure why it doesn't work with the shebang in the executable js file but I am not committed to it, so I will just take the working solution and move on. ;-)
Cheers!
Okay after doing some testing based on the comments by Max Stanley:
There is an inconsistency between Linux and MacOS here about the population of the argv array. On Mac you will want the second index (1) and on Linux you will want the third (2).
I recommend using a command-line argument parser like command-line-args which should paper over the platform differences.
In the meantime you can specify node in the python subprocess call Popen(["node", "/Users/Brett/scripts/hello.js", "Brett"]) which has the same behavior on both.
Having tested this on my system, it looks as though you need to either make the hello.js file executable (chmod +x ./hello.js) or add 'node' to the beginning of the Popen argument list, as @Jared said.
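If a bare "node" in the argument list gives "no such file or directory", the process started by Python may not see the same PATH as your interactive zsh session; one way to sidestep that (a sketch, not something from the answers above) is to resolve the absolute path first with shutil.which:
import shutil
import subprocess

node = shutil.which("node") or "/usr/local/bin/node"  # fall back to the path that worked above

p = subprocess.Popen([node, "/Users/Brett/scripts/hello.js", "Brett"],
                     stdout=subprocess.PIPE)
out, _ = p.communicate()
print(out.decode())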
I am still a newbie to Python, so apologies in advance. I have read related topics on this but didn't find the best solution (Run a python script from another python script, passing in args).
Basically, I have a python script (scriptB.py) that takes in a config file as argument and does some stuff. I need to call this script from another python script (scriptA.py).
If I had no arguments to pass, I could have just done
import scriptB
However, things get a little complicated because we need to pass the config file (myconfig.yml) as an argument.
One of the suggestions was to use;
os.system("python scriptB.py myconfig.yml")
But this is often reported as not a recommended approach, and it frequently does not work.
Another suggestion was to use:
import subprocess
subprocess.Popen("scriptB.py myconfig.yaml", shell=True)
I am not very sure if this is a common practice.
Just want to point out that neither script has a main function inside it.
Please advise on the best way to handle this.
Thanks,
This should work just fine:
subprocess.Popen(['python', '/full_path/scriptB.py', 'myconfig.yaml'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
See https://docs.python.org/3/library/subprocess.html#replacing-os-popen-os-popen2-os-popen3
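If you also want to read what scriptB.py prints, you can follow that call with communicate(); a minimal sketch, assuming scriptB.py writes its output to stdout (the path is the same placeholder as above):
import subprocess

proc = subprocess.Popen(['python', '/full_path/scriptB.py', 'myconfig.yaml'],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()  # waits for scriptB.py to finish
print(out.decode())
if proc.returncode != 0:
    print('scriptB.py failed:', err.decode())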
If you really need to run a separate process, using the multiprocessing library is probably best. I would make an actual function inside scriptB.py that does the work. In the example below, I treat config_handler as a function inside scriptB.py that takes the config file path as its argument.
1.) Create a function that will handle calling your external Python script, and import from that script the function that takes the arguments.
scriptA.py: importing config_handler from scriptB
import multiprocessing
from scriptB import config_handler
def other_process(*args, **kwargs):
    p = multiprocessing.Process(*args, **kwargs)
    p.start()
2.) Then just call the process and feed your arguments to it:
scriptA.py: calling scriptB.py function, config_handler
other_process(name="config_process_name", target=config_handler, args=("myconfig.yml",))
Opinion:
From the information you have provided, I imagine you could manage to do this without separate processes: just do things in sequence and make scriptB.py a library with a function you call from scriptA.py.
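For example, a minimal sketch of that library approach, using the config_handler name from the multiprocessing answer above as a hypothetical function:
# scriptB.py
def config_handler(config_path):
    # do the real work with the config file here
    print('processing', config_path)

# scriptA.py
from scriptB import config_handler

config_handler('myconfig.yml')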
It seems you got all your answers in the old thread, but if you really want to run it as a separate OS process rather than importing it in Python, this is what I do:
from subprocess import run, PIPE, DEVNULL
your_command = './scriptB.py myconfig.yaml'
run(your_command.split(), stdout=PIPE, stderr=DEVNULL)
In case you need the output:
output = run(your_command.split(), stdout=PIPE, stderr=DEVNULL).stdout.decode('utf-8')
If scriptB.py has a shebang line telling the shell it's a Python script, it will run correctly.
The path can be either relative or absolute.
This is for Python 3.x.
I have a batch file, which I use to load some pre-built binaries to control my device.
Its command is:
cd build
java -classpath .;..\Library\mfz-rxtx-2.2-20081207-win-x86\RXTXcomm.jar -Djava.library.path=..\Library\mfz-rxtx-2.2-20081207-win-x86 tabotSample/Good1
pause
Now I want to run the batch file using Python. I tried os.system("batch.bat"), and I tried using Popen:
import os
from subprocess import Popen
os.popen("cd TAbot")
r=os.popen("hello.bat")
However, the Python console (Anaconda Python 2.7) seems to execute the code, but it returns nothing and nothing happens.
I want to run this batch file from Python. Please help me.
By the way, I tried popen with another batch file that just contains
echo Hello
but nothing happens.
Here is the simple solution.
from subprocess import Popen
import subprocess
def run_batch_file(file_path):
    Popen(file_path, creationflags=subprocess.CREATE_NEW_CONSOLE)
run_batch_file('file_name.bat')
file_name.bat
echo .bat file running from python
pause
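If you need your Python script to wait until the batch file has finished, a small variant of the helper above (this is an assumption, not part of the original answer) can keep the Popen object and wait on it:
from subprocess import Popen
import subprocess

def run_batch_file(file_path):
    proc = Popen(file_path, creationflags=subprocess.CREATE_NEW_CONSOLE)
    proc.wait()  # block until the console window closes
    return proc.returncode

print(run_batch_file('file_name.bat'))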
You can also use this
import subprocess
subprocess.call(["C:\\temp\\test.bat"], shell=False)
test.bat
copy "C:\temp\test.txt" "C:\temp\test2.txt"
I think this should work like this:
batch.py
from subprocess import Popen
p = Popen("test.bat", cwd=r"C:\path\to\batch\folder")
stdout, stderr = p.communicate()
test.bat
echo Hello World!
pause
Many useful solutions have been suggested here, but I want to point out the importance of where the program is located.
(A .bat file is usually written to automate a task and save time, and it very often relies on paths relative to its own location.)
import os
import subprocess

os.chdir("YOUR TARGET PATH")
exit_code = subprocess.call(FILEPATH)  # FILEPATH is interpreted relative to YOUR TARGET PATH
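Alternatively, instead of changing the working directory of the whole Python process with os.chdir, you can hand the directory to subprocess.call via its cwd argument; a small sketch with placeholder paths:
import subprocess

# run the batch file with its own folder as the working directory,
# without touching the Python process's current directory
exit_code = subprocess.call(r"C:\target\path\file_name.bat", cwd=r"C:\target\path")
print(exit_code)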
I'm apprenticing into system administration without schooling, so sometimes I'm missing what is elementary information to many others.
I'm attempting to give my stdout line another argument before printing, but I'm not sure which process I should use, and I'm a bit fuzzy on the commands for subprocess if that's what I should be using.
My current code is:
f = open('filelist', 'r')
searchterm = f.readline()
f.close()
#takes line from a separate file and gives it definition so that it may be callable.
import commands
commands.getoutput('print man searchterm')
This is running, but not giving me any output to the shell. My more important question, though, is: am I using the right command to get my preferred process? Should I be using one of the subprocess commands instead? I tried playing around with popen, but I don't understand it fully enough to use it correctly.
Ie, I was running
subprocess.Popen('print man searchterm')
but I know without a doubt that's not how you're supposed to run it. Popen requires more arguments than I have given it, like file location and where to run it (Stdout or stderr). But I was having trouble making these commands work. Would it be something like:
subprocess.Popen(pipe=stdout 'man' 'searchterm')
#am unsure how to give the program my arguments here.
I've been researching everywhere, but it is such a widely used process I seem to be suffering from a surplus of information rather than not enough. Any help would be appreciated, I'm quite new.
Preemptive thanks for any help.
The canonical way to get data from a separate process is to use subprocess (the commands module is deprecated):
import subprocess
p = subprocess.Popen(['man', 'searchterm'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdoutdata, stderrdata = p.communicate()
Note that some convenience functions exist for splitting strings into lists of arguments. Most notable is shlex.split, which will take a string and split it into a list the same way a shell does. (If nothing in the string is quoted, str.split() works just as well.)
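For instance, a quick illustration of the difference when the command string contains a quoted argument:
import shlex

cmd = 'man "bash builtins"'
print(cmd.split())       # ['man', '"bash', 'builtins"'] -- plain split breaks the quoted term
print(shlex.split(cmd))  # ['man', 'bash builtins']      -- shell-style split keeps it together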
commands is deprecated in Python 2.6 and later, and has been removed in Python 3. There's probably no situation where it's preferable in new code, even if you are stuck with Python 2.5 or earlier.
From the docs:
Deprecated since version 2.6: The commands module has been removed in
Python 3. Use the subprocess module instead.
To run man searchterm in a separate process and display the result in the terminal, you could do this:
import subprocess
proc = subprocess.Popen('man searchterm'.split())
proc.communicate()
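And if what you actually want is the command's output back as a string (the closest modern replacement for commands.getoutput), subprocess.check_output does that; a minimal sketch using the searchterm read from your file:
import subprocess

with open('filelist') as f:
    searchterm = f.readline().strip()

# returns the man page text as a string; raises CalledProcessError if man fails
text = subprocess.check_output(['man', searchterm], universal_newlines=True)
print(text)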