Bash script that is run from Python reaches sudo timeout

This is a long bash script (400+ lines) that is originally invoked from a Django app like so -
os.system('./bash_script.sh &> bash_log.log')
It stops on a random command in the script. If the order of commands is changed, it hangs on a different command at approximately the same point.
SSHing into the machine that runs the Django app and running sudo ./bash_script.sh asks for a password and then runs all the way through.
I can't see the message it prints when it hangs in the log file; I couldn't make it redirect there. I assume it's a sudo password prompt.
Tried -
sudo -v in the script - didn't help.
ssh to the machine and manually extending the sudo timeout in /etc/sudoers - didn't help; I think because the Django app is already running and uses the previous timeout.
splitting the script in two and running one part in a separate thread, like so -
import os
from subprocess import Popen
from threading import Thread

def basher(command, log_path):
    with open(log_path, 'w') as log:  # 'w' - the log is written to, not read
        Popen(command, stdout=log, stderr=log).wait()

script_thread = Thread(target=basher, args=('./bash_script_pt1.sh', 'bash_log_pt1.log'))
script_thread.start()
os.system('./bash_script_pt2.sh &> bash_log_pt2.log')  # I know os.system is discouraged in favor of subprocess, not sure if maybe it's better in this case
script_thread.join()
The logs showed that part 1 ended ok, but part 2 still hangs, albeit later in the code than when they were together.
I thought to edit /etc/sudoers from inside the Python code and then re-login via su - user. There are snippets showing how to pass the password using a pty, but I don't understand the mechanics and could not get it to work.
I also noticed that ps aux | grep bash_script.sh shows the script being run twice - once as
/bin/bash bash_script.sh
and once as
sh -c bash_script.sh
I assume os.system has an internal shell=True going on.
I don't understand the Linux entities/mechanics in play to figure out what's happening.
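That assumption is correct: os.system() hands its argument to /bin/sh -c, which is the second process in the ps output. A sketch of the direct equivalent without the wrapper shell, with the redirection done in Python:

import subprocess

# os.system('./bash_script.sh &> bash_log.log') really runs
# /bin/sh -c "./bash_script.sh &> bash_log.log" -- the "sh -c" entry in ps.
# Running the script directly skips that extra shell:
with open('bash_log.log', 'w') as log:
    subprocess.run(['./bash_script.sh'], stdout=log, stderr=subprocess.STDOUT)

A side note: &> is bash syntax. If /bin/sh on the server is dash (the Ubuntu default), cmd &> file is parsed as cmd & plus an empty redirection, silently backgrounding the script - one more reason to avoid the wrapper shell.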

My guess is that the Django app has different, more limited permissions than the script itself does, and the script inherits those restrictions because it is being executed by the app.
You need to find out what permissions and environment the script has when you run it from bash, what it has when you run it via Django, and then figure out the difference.
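A minimal way to run that comparison, as an editor's sketch using standard tools (nothing below is from the original answer): dump the identity and environment from inside the Django process, run the same commands from an interactive shell, and diff the outputs. A sudo -n probe also reveals whether sudo would block on a password prompt, since -n makes it fail with an error instead of waiting.

import subprocess

# Who are we, and what environment do we have, when invoked from Django?
# Run `id` and `env` from an interactive shell too, then diff the files.
with open('context_from_django.log', 'w') as log:
    subprocess.run(['id'], stdout=log, stderr=subprocess.STDOUT)
    subprocess.run(['env'], stdout=log, stderr=subprocess.STDOUT)
    # `sudo -n true` exits non-zero with "a password is required" whenever
    # sudo would otherwise hang waiting on an invisible password prompt.
    subprocess.run(['sudo', '-n', 'true'], stdout=log, stderr=subprocess.STDOUT)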

Related

How do I execute ssh via Robot Framework's Process library

I'm at a loss for what to think here. There must be something I'm missing, since I'm able to execute this perfectly fine in another, similar context, but not here. I am trying to run the following console command via Robot Framework to test my system's reaction to a network outage: ssh it@128.0.0.1 sudo ifconfig ens192 down (the user & host IP are modified for posting here).
To accomplish this, I'm using the following line of code:
Run Process    'ssh it@128.0.0.1 sudo ifconfig ens192 down'    shell=True    stdout=stdout    stderr=stderr
Now, I've executed this myself in the console and saw that the result was an inability to ping the IP, which is what I want. So I know for certain that the command itself isn't a problem. What is a problem is that I get the following result: /bin/sh: ssh it@128.0.0.1 sudo ifconfig ens192 down: command not found
That's fine, though, because I believe this is a simple issue of the ssh instance not having the it user's $PATH. For reasons described below, I'm going to guess its $PATH variable is empty. Thus, I modified the keyword call as such:
Run Process    'ssh it@128.0.0.1 PATH\=/usr/sbin sudo ifconfig ens192 down'    shell=True    stdout=stdout    stderr=stderr
The result of this command is as follows: /bin/sh: ssh it@128.0.0.1 PATH=/usr/sbin sudo ifconfig ens192 down: No such file or directory, which I believe refers to the items in the PATH variable. I've tried several different things to work out why it can't find these files when, executed manually in the console, the directories are indeed present.
Given that I have less than a year's worth of experience with Robot Framework, I'm certain that there's just something I'm not understanding about the Process library. Could I have an explanation as to what I am missing for this execution and, if possible, an explanation as to how I should be executing an ssh command via Robot Framework?
A few extra things worth pointing out:
The it user is brand new, having been created via useradd it and copying the home directory contents into its new space.
I have already added an entry into the device's sudoers file to allow the it user to execute ifconfig without requiring password entry.
On that note, I've sent the executing user's ssh keys over to the device such that it does not require a password to log into it as that user.
I have attempted to execute several basic commands. I have intermittently gotten ls to work but have been unable to get a greater sense of what $PATH the Robot Framework ssh instance has because echo doesn't seem to work.
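An editor's guess at the failure, consistent with the error text but not confirmed in the original post: Robot Framework passes quotes literally, so with shell=True the single-quoted command reaches /bin/sh as one word, quotes and all, and sh reports that entire string as "command not found". The same effect reproduced in Python:

import subprocess

# The surrounding quotes make the whole command ONE shell word, so
# /bin/sh looks for a program literally named "ssh it@... down":
subprocess.run("'ssh it@128.0.0.1 sudo ifconfig ens192 down'", shell=True)

# Without the extra quotes the shell splits the words normally (this one
# would genuinely try to ssh to the poster's placeholder host):
subprocess.run("ssh it@128.0.0.1 sudo ifconfig ens192 down", shell=True)

If that reading is right, dropping the surrounding single quotes from the Run Process argument should get past the "command not found" stage.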

How to run a python or bash script interactively on a webpage?

I am building a website and I would like to show a terminal on my webpage which runs a script (python or bash) interactively.
Something like trinket.io but I would like to use the python interpreter or the bash I have on my server, so I could install pip packages and in general control every aspect of the script.
I was thinking of something like an interactive frame which shows the terminal and what's executed in it, obviously with user interaction supported.
A good example is https://create.withcode.uk/ - it's exactly what I want, but I would like to host it on my own server with my own modules and ecosystem. It also seems to be pretty good on the security side.
Is there anything like that?
If I understand you correctly, you are looking for a mechanism that lets you display a terminal on a web server, and then run an interactive Python script in that terminal, right?
So in the end, the solution for sharing a terminal does not necessarily have to be written in Python, right? (Though I must admit that I prefer Python solutions when I find them, sometimes being pragmatic isn't a bad idea.)
You might google for http and terminal emulators.
Perhaps ttyd fits the bill. https://github.com/tsl0922/ttyd
Building on Linux could be done with:
sudo apt-get install build-essential cmake git libjson-c-dev libwebsockets-dev
git clone https://github.com/tsl0922/ttyd.git
cd ttyd && mkdir build && cd build
cmake ..
make && make install
Usage would be something like:
ttyd -p 8888 yourpythonscript.py
and then you could connect with a web browser with http://hostip:8888
You might of course 'hide' this URL behind a reverse proxy and add authentication to it,
or add options like --credential username:password to password-protect the URL.
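One practical footnote, an editor's assumption about the setup rather than part of the original answer: for ttyd -p 8888 yourpythonscript.py to work, the script needs a shebang and the executable bit, and has to be on a path ttyd can resolve; otherwise invoke it as ttyd -p 8888 python3 yourpythonscript.py. A trivial interactive script to test the round trip:

#!/usr/bin/env python3
# Round-trip test for the web terminal: input() and print() both travel
# through the browser session that ttyd serves.
name = input("What's your name? ")
print(f"Hello, {name}!")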
Addendum:
If you want to share multiple scripts with different people and the sharing is more of an on-the-fly thing, then you might look at tty-share (https://github.com/elisescu/tty-share) and tty-server (https://github.com/elisescu/tty-server).
tty-server can be run in a Docker container.
tty-share can be used to run a script on your machine in one of your terminals. It will output a URL that you can give to the person you want to share that specific session with.
If you think that's interesting I might elaborate on this one.
>> Insert security disclaimer here <<
The easiest, most hacktastic way to do it is to create a div element where you'll store your output and an input element to enter commands. Then you can AJAX POST the command to a back-end controller.
The controller takes the command, runs it while capturing its output, and sends the output back to the web page, which renders it in the div.
In Python I use this to capture command output:
from subprocess import Popen, STDOUT, PIPE

def run_command(args, cwd='/working/directory'):
    # Merge stderr into stdout so callers get everything the command printed
    proc = Popen(args, stdout=PIPE, stderr=STDOUT, cwd=cwd)
    proc.wait()
    return proc.stdout.read()

output = run_command(['ls', '-l'])
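Wired into a web framework, the controller could look like this minimal sketch (Flask is my choice here, the answer names no framework; and per the disclaimer above, running user-supplied commands like this is a gaping security hole unless it is locked down):

from subprocess import Popen, STDOUT, PIPE
from flask import Flask, request

app = Flask(__name__)

@app.route('/run', methods=['POST'])
def run():
    # The page AJAX-POSTs the command; run it and hand the captured
    # output back for the front-end to render in the div.
    proc = Popen(request.form['command'], shell=True, stdout=PIPE, stderr=STDOUT)
    proc.wait()
    return proc.stdout.read()

if __name__ == '__main__':
    app.run(port=5000)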

Ways to run python script in the background on ubuntu VPS and save the logs

So, I have a Python script which outputs some data to the terminal from time to time. I'm trying to keep it running on the Ubuntu VPS even after I close the SSH connection, and still keep the logs somewhere.
I'm saving the logs by using:
python3 my_script.py >>file.txt
and it works perfectly; however, when I run this process using
nohup python3 my_script.py >>file.txt &
so that it runs in the background, then after the SSH connection is closed it seems to save only the first log output by my_script.py. I've also tried running it from crontab, but the result is similar - only the first log is saved.
Any tips? What am I doing wrong?
I could not understand what you mean by "the first log". Maybe the first line of the logs?
To run something in the background after the SSH connection is closed, I prefer Linux screen, a terminal multiplexer that runs your command in a sub-process. With it, you can view your output in the foreground at any time, or leave your process running in the background.
Usage (short)
screen is not included by default in most Linux distributions. Install it (Ubuntu):
$ sudo apt-get install screen
Run your script in the foreground:
$ screen python3 my_script.py
You'll see it running. Now detach from this screen session by pressing Ctrl-A followed by D. You'll be back in the shell where you ran the previous screen command. If you need to switch back to the running context, use the screen -r command.
The tool supports multiple parallel sessions too.
Something weird
I've tried to redirect stdout or stderr to a file with the > or >> symbol, and it failed. I am not an expert on this either; maybe you need to check the manual page. However, I tend to write directly to a file from within my Python scripts, keeping only the essential output lines on the console.
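An editor's guess at the "only the first log" symptom, consistent with it though not confirmed by the poster: when stdout goes to a file instead of a terminal, Python switches from line buffering to block buffering, so output sits in memory until several kilobytes accumulate or the process exits. Running the script with python3 -u, or flushing explicitly, makes lines land in file.txt as they happen:

import time

# With stdout redirected to a file, plain print() is block-buffered;
# flush=True pushes each line out immediately, so
# `nohup python3 my_script.py >> file.txt &` shows progress live.
while True:
    print("still alive", flush=True)
    time.sleep(60)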

How to interact with a new shell I logged into from a script?

Not able to send commands to the shell I logged into.
Originally, I wrote a Python script. It was able to send commands like
subprocess.run(['kubectl', 'config', 'get-context'], shell=True)
but when it came time to get to the child shell - in this case bash - the command wouldn't run until I exited that shell, and it would say things like it couldn't find the command.
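A side note on that snippet, an editor's observation rather than part of the original question: on POSIX systems, combining a list with shell=True does not do what it appears to. Only the first list element is handed to /bin/sh -c; the remaining elements become the shell's positional parameters, which alone can produce "couldn't find the command" style surprises:

import subprocess

# Only 'kubectl' reaches `sh -c`; 'config' and 'get-context' become the
# shell's $0 and $1, so this effectively runs bare `kubectl`:
subprocess.run(['kubectl', 'config', 'get-context'], shell=True)

# Either give the shell one string...
subprocess.run('kubectl config get-context', shell=True)
# ...or, usually better, skip the shell entirely:
subprocess.run(['kubectl', 'config', 'get-context'])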
I then tried to do it with the sh module, but was also unsuccessful.
I thought maybe using Python was the problem, and I also realized my ultimate goal was to use a different shell (cypher-shell), so I skipped straight to that, with bash as the parent shell. In there I have a line that is sometimes successful, sometimes not:
kubectl run -it --rm cypher-shell --image=gcr.io/cloud-marketplace/neo4j-public/causal-cluster-k8s:3.4 --restart=Never --namespace=default --command -- ./bin/cypher-shell -u neo4j -p "password" -a "domain.name"
But even when it successfully logs in, it just hangs until I manually exit, and then it runs the next commands.
Note: I saw this and so, perhaps, it's not a child shell? Run shell command from child shell
I can't say I know exactly what you are doing, but if I understand your objective correctly, you want the Python program to continue to log while the script keeps running? The problem is that the logger keeps running and holds up your program. The way I would deal with that is to run the logger as a background process.
With bash, that would be ./script.sh &, which makes it run without holding back the rest of the program.
Hopefully that gives you an idea! Good luck.
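For actually driving an interactive child, which is what the question ultimately needs, here is a minimal sketch under the assumption that the child reads commands line by line from stdin; bash stands in for any line-oriented child shell. For prompt-driven, TTY-expecting programs like cypher-shell, a pty-aware library such as pexpect is usually more reliable than plain pipes.

import subprocess

# Feed commands to the child's stdin instead of waiting for it to exit.
proc = subprocess.Popen(['bash'], stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, text=True)
out, _ = proc.communicate('echo hello from the child shell\nexit\n')
print(out)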

Easiest way to run a Python script (that access the internet) once a day?

I wrote a simple script that parses some stuff off the web and emails it to me. Very simple. But I'm starting to realize that implementing this is going to be far more difficult than it really should be.
All I really want to do is run this script once a day.
I have explored using Google App Engine, but it doesn't like smtplib using SSL to log in to my Gmail account to send an email.
I am considering using Heroku, but that just seems like a lot of work for something so simple.
I tried using my Raspberry Pi, but I'm not sure the script keeps running after I exit SSH. I looked into running the script as a cron job, but I'm not sure that's an elegant solution.
I looked into running an AppleScript from my calendar, but I'm not sure what happens if my computer is closed and/or offline.
My question is: Is there a simple, elegant, easy solution here?
When you start the script from your session (./script.py or python script.py), it stops running when you disconnect. If you want to run the script this way for whatever reason, I would recommend using tmux.
If you're using Raspian or another Debian based distro for your Pi:
$ sudo apt-get install tmux
$ tmux
# detach from your tmux session by pressing CTRL+B followed by D
# to reattach to your session later, use
$ tmux attach
I would recommend using cron. If you want to run it at a specific time (e.g. every day at 1am), just add a file like this in /etc/cron.d/:
$ echo "0 1 * * * youruser python /path/to/your/script.py > /dev/null 2>&1" > /etc/cron.d/script-runner
# note: entries in /etc/cron.d/ need a user field ("youruser" above) between
# the time spec and the command; unlike /etc/cron.daily/ scripts, the file
# itself does not need to be executable
Wikipedia has a nice explanation of the format (and also of shortcuts like @hourly and @daily).
If you don't care when exactly it runs, you can just put your script into /etc/cron.daily/. Don't forget a chmod +x to make it executable.
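To make the /etc/cron.daily/ route concrete, here is an editor's sketch (the file name and body are invented). Give the file a shebang, no extension, and the executable bit, and cron runs it once a day with no crontab entry needed. One subtlety: run-parts, which processes that directory, skips file names containing dots, so a file named script.py would be silently ignored.

#!/usr/bin/env python3
# /etc/cron.daily/mail-report (hypothetical name, deliberately without a
# ".py" extension, since run-parts skips dotted names). chmod +x this
# file and cron runs it once a day.
print("fetch the page and send the mail here")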
If you don't want to run it on one of your machines, you can also get a shell on one of Uberspace's servers. You can pay whatever you want (minimum 1 EUR/month) and you get a shell on a Linux box with 10GB storage (the first month is free for testing, cancellation happens automatically if you don't pay, no strings attached). I'm sure there are a lot of other services like that; I just mention it because it's a cheap one with nice support. Also you get a domain (..uberspace.de) and can send mail from the server (e.g. with mail). So no need to use a gmail account.
Edit: I overlooked the "python" part and changed everything to .py. Either use #!/usr/bin/env python3 (or 2.7) in your script, or start the script via python scriptname.py.
