Python/Linux/Streamlit: executing subprocess for a Linux script

I'm a bit turned around on how to execute a shell script file in a Linux environment via Python's subprocess module in Streamlit. Any assistance on what I'm missing is appreciated.
I'm using a shell script called 0_texts.sh to run Pylanguagetool for a grammar check of one text file and return corrections in another text file, like so:
cd /home/user/dir/TXTs/
pylanguagetool text_0.txt > comments_0.txt
This script runs correctly in the Linux terminal, writing a comments_0.txt with appropriate grammar checks from text_0.txt.
I need to create a Python/Streamlit app that runs these shell scripts. In attempting to run this shell script, I've written script.py below:
import os
import subprocess
import sys
subprocess.run(['bash','/home/user/dir/Scripts/0_texts.sh'])
I then run script.py in Streamlit via the code below, in keeping with Streamlit's documentation on using subprocess.
import streamlit as st
import os
import subprocess
import sys
def app():
    button1 = st.button("Click me")
    if button1:
        p = subprocess.run([f"{sys.executable}", "/home/user/dir/pages/script.py"])
        st.write(p)
When I execute script.py via Streamlit, the 0_texts.sh script runs, writing comments_0.txt in the correct directory and returning CompletedProcess(args=['/usr/bin/python3', '/home/user/dir/pages/script.py'], returncode=0). However, the comments_0.txt output contains the error input file is required, as if it can't properly access or read text_0.txt. I've tinkered around trying to find the problem, but have hit a brick wall.
Any suggestions on what I'm missing, or paths forward? Any help greatly appreciated.
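One way to narrow this down (a sketch reusing the paths from the question, not a confirmed fix) is to drop the intermediate script.py, run the shell script straight from the app with an explicit working directory, and capture stderr so pylanguagetool's actual complaint is visible:
import subprocess

result = subprocess.run(
    ["bash", "/home/user/dir/Scripts/0_texts.sh"],
    cwd="/home/user/dir/TXTs/",   # make the input file's directory explicit
    capture_output=True,          # keep stdout/stderr instead of discarding them
    text=True,
)
print(result.returncode)
print(result.stderr)              # any error text from pylanguagetool shows up here instead of disappearing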

Related

Run .sh script from python from specific folder

I am trying to run a .sh script from python.
I saw that this can be done in various ways such as:
import subprocess
subprocess.call(["./test.sh"])
or
import os
os.system("sh test.sh")
However, this assumes that test.sh is in the same folder you are running the script from. What if I want to run a .sh that is in a specific folder?
I tried the following but with no luck:
import subprocess
subprocess.call(["cd ~/ros_ws", "./intera.sh"])
import subprocess
subprocess.call(["cd ~/ros_ws", "./intera.sh"], shell=True)
Thanks for the help.
subprocess.call has a cwd keyword argument (change working directory):
import subprocess
subprocess.call(["./intera.sh"], cwd="~/ros_ws")

How do I export environment variables using a script in the same shell in python?

My Django project fetches credentials from environment variables, and I now want to automate this process and store the credentials in Vault (HashiCorp).
I have a Python script and a shell script that fetch data from an API and export it as environment variables. When I run them with os.system, the shell script runs in a subprocess, so I can't access the variables in the main (parent) process/shell. The only way I've found is inserting the shell script into the settings.py file.
Is there any way to do this so that the variables are available in the main process?
P.S.: I did try sourcing; os.system didn't recognise it as a command.
Here's the code I'm running:
import os
os.environ['ENV'] = 'Demo'
os.system('python3 /home/rishabh/export.py')
print(os.environ.get('RDS_DB_NAME'))
output:
None
The Python file and shell script work just fine on their own.
One way to do it is to run export.py in the same process, as user1934428 suggested:
import os
import sys
os.environ['ENV'] = 'Demo'
sys.path.append('/home/rishabh/')
import export # runs export.py in the same process
print(os.environ.get('RDS_DB_NAME'))
This assumes there are no __name__ == '__main__' checks inside export.py.
You only need the sys.path line if export.py is in a different directory than your current script.
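For that to work, export.py itself has to assign the values to os.environ; running the shell's export in a subprocess only affects that subprocess. A hypothetical sketch of the shape export.py needs, with the API/Vault lookup stubbed out:
# export.py (hypothetical sketch; the real script fetches the credentials from the API/Vault)
import os

def fetch_credentials():
    # placeholder for the real API/Vault call
    return {"RDS_DB_NAME": "placeholder_value"}

for key, value in fetch_credentials().items():
    os.environ[key] = value  # visible to everything running in this same process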

Python RPi - File not found when running script from another script

I'm trying to run a Python script from another Python script on a Raspberry Pi 3 with Raspbian. I've spent hours looking for ways to do this and haven't found anything that works: every attempt either says it doesn't have permission to execute the file or that it can't find it, and I don't know what I'm doing wrong. I need the main script to launch multiple instances of the other script in new consoles (new processes) and keep them running (I don't expect them to return anything to the main script). Can anyone help? On Windows this was easy (I used os.startfile) and the program worked fine until I tried to run it on Linux.
In test.py:
print("test1")
input()
In main.py:
import os
import subprocess
print("main")
os.system("python test.py")
input()
In the console:
main
python: can't open file 'test.py': [Errno 2] No such file or directory
In main.py:
import os
import subprocess
print("main")
subprocess.Popen("python test.py",shell=True)
input()
In the console:
main
python: can't open file 'test.py': [Errno 2] No such file or directory
In main.py:
import os
import subprocess
print("main")
subprocess.call("python test.py",shell=True)
input()
In the console:
main
python: can't open file 'test.py': [Errno 2] No such file or directory
I tried more ways but I don't remember them. Maybe I'm doing something wrong?
EDIT: I can now run the scripts without any problems using os.chdir (thanks to J H). My problem now is that test.py prints its output in the same console window as main.py, and I need it to run as a separate process in its own console. Any solutions?
EDIT 2: I can finally start new processes of test.py from main.py! I used os.system('xdg-open "test.py"') to open test.py with the default application. Thanks again to J H; otherwise it would still be saying file not found.
Final main.py:
import os
print("main")
os.chdir('/home/pi/Desktop/')
os.system('xdg-open test.py')
input()
Thanks in advance!
Printing out os.getcwd() will help you to debug this.
Either supply a fully qualified pathname, /some/where/test.py, or use os.chdir('/some/where') before executing test.py.
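A short sketch of both options, reusing the /home/pi/Desktop location from the edits above (adjust to the real path) and using subprocess so each instance gets its own process:
import subprocess

# Option 1: fully qualified path, so the current working directory no longer matters
subprocess.Popen(["python", "/home/pi/Desktop/test.py"])

# Option 2: keep the relative name but set the child's working directory explicitly
subprocess.Popen(["python", "test.py"], cwd="/home/pi/Desktop")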

Calling python commands from another python script

I am using a Python package called Molbery, a tool for molecular biologists. The usage is like:
molbery -o output_file_path input_path
I am working with a Python CGI script and want to execute the above command from that CGI script, and then display the resultant contents of the output file on a web page.
One way is to do it as a system call:
from subprocess import call
call(["some_command ", "your_args"])
... or ...
import os
os.system("some_command your_args")
Usually, however, you can use the module directly by importing it and calling its functions. I can't find any documentation for this, so the first thing I'd do is look into the source code itself, especially the entry point (i.e., the main function/module).
If I understand your question properly, this should work for you:
import os
os.system("molbery -o output_file_path input_path")
or this
from subprocess import call
call('molbery -o output_file_path input_path', shell=True)  # shell=True is needed when the command is passed as a single string
You can also see Calling an external command in Python
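Since the end goal is showing the result on a web page, here is a hedged sketch of the CGI side: run the tool, then read the output file back and emit it as the response (the file names are the placeholders from the question):
import subprocess

# Run molbery and raise an exception if it exits with an error
subprocess.run(["molbery", "-o", "output_file_path", "input_path"], check=True)

# Read the generated output and return it as the CGI response
with open("output_file_path") as f:
    result = f.read()

print("Content-Type: text/plain")
print()
print(result)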

Issuing commands to the command line in Python

This is my code. I'm pretty new to this.
from subprocess import call
call(["cd", "/etc/apache2/"])
However, when this function is run, I get
Errno 2: No such file or directory
I am running Django within Apache. This is my views.py file. Ask for additional code, and you shall receive.
edit - It should be noted that /etc/apache2/ does indeed exist.
If you want to change the working directory of the Python process you can use chdir from the os module:
import os
os.chdir('/etc/apache2')
First of all, you will not get what you expect if you run this. Try
import os
os.chdir('/etc/apache2')
Second, note that cd is a shell builtin rather than an executable, so a process can't run it as a command; even if it could, it would only change the child process's directory, not your own.
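If the cd was only meant to set the directory for a follow-up command, subprocess can do that per call via its cwd argument instead of changing the whole process; a sketch with a placeholder command:
from subprocess import call

# "ls" stands in for whatever actually needs to run inside /etc/apache2
call(["ls", "-l"], cwd="/etc/apache2")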
