Calling a Python web server within a script

I would like to have a web server displaying the status of 2 of my python scripts.
These scripts listen for incoming data on a specific port. What I would like is for the web server to return an HTTP 200 while the script is running and a 500 when it is not. I have looked at CherryPy and other Python web servers, but I could not get them to start first and then let the rest of my code continue while the web server keeps running. I would also like the web server to crash if the script crashes. Alternatively, the web server could serve a blank page containing just a 1 in the HTML if the script is running, or a 0 if it is not.
Any suggestions?
Thanks in advance.

I was actually just answering a question moderately similar to this one. The idea is to run script A, have it spin off two threads running the scripts you intend, and then expose a web page that reports on them:
import threading

import cherrypy
from cherrypy import expose

class thread1(threading.Thread):
    def run(self):
        # code for script 1 goes here
        pass

class thread2(threading.Thread):
    def run(self):
        # code for script 2 goes here
        pass

t1 = thread1()
t2 = thread2()
t1.start()
t2.start()

class Root:
    @expose
    def check(self):
        # "1" if both worker threads are still alive, "0" otherwise
        if t1.is_alive() and t2.is_alive():
            return "1"
        return "0"

cherrypy.quickstart(Root())
I would advise you to put either nginx or Apache in front of this as a reverse proxy.
There are now two ways this will tell you that something stopped. The check page returns a 1 if both threads are running fine, and a 0 if one or both stopped while the rest of the script kept running. If the whole script has died, nginx/Apache will instead return a 500 error because the backend server (i.e. CherryPy) is no longer responding.
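On the monitoring side, a minimal sketch of how a checker could interpret those two signals (the URL is an assumption; use whatever the reverse proxy exposes for the check page):
import requests

try:
    res = requests.get("http://myserver/check", timeout=5)
    if res.status_code == 200 and res.text.strip() == "1":
        print("both scripts are running")
    else:
        # a "0" body or a 500 from the proxy means something stopped
        print("problem detected: %s %r" % (res.status_code, res.text))
except requests.RequestException:
    print("could not reach the server at all")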

I would break it apart further:
script A, on port a
script B, on port b
web script C, which checks on A and B (by making simple requests to them) and returns the results in a machine-friendly format, i.e. JSON or XML
web page D, which calls C and formats the results for people, i.e. an HTML table
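A minimal sketch of what web script C could look like (Flask and the ports 8001/8002 are assumptions here; A and B only need to be listening on their ports):
import socket

from flask import Flask, jsonify

app = Flask(__name__)

def is_up(port, host="127.0.0.1"):
    # a plain TCP connect is enough to tell whether a script is still listening
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:
        return False

@app.route('/status')
def status():
    # machine-friendly result for web page D to consume
    return jsonify({"scriptA": is_up(8001), "scriptB": is_up(8002)})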
There are existing programs which do this - Nagios springs to mind.

Any way to run an internal python script from a webpage?

I finally started a project I've wanted to make for a long time:
I'm using an Arduino Uno to replace my PC power button (with a simple relay), and that Arduino board is connected to a Raspberry Pi 3 for network connectivity.
What I want is a web page (or an API-like request) that powers the PC on at the touch of a button, preferably on a password-protected page.
I know how to code in Python, and my script to control the Arduino is already done, but I can't find a way to run a Python script, server-side only, from a button on a web page.
I found the CherryPy framework but I don't think it will suit my needs.
Can someone give me any ideas?
As already mentioned by @ForceBru, you need a Python web server.
If it is useful to you, here is a possible (insecure) implementation using Flask:
from flask import Flask
from flask import request

app = Flask(__name__)

@app.route('/turnOn')
def hello_world():
    k = request.args.get('key')
    if k == "superSecretKey":
        # Do something ..
        return 'Ok'
    else:
        return 'Nope'
If you put this in a file named app.py and, after installing Flask (pip install flask), you run flask run, you should see Ok when visiting the URL http://localhost:5000/turnOn?key=superSecretKey .
You could write a brief HTML GUI with a button and a key field in a form, but I leave that to you (you need to have fun too!).
To avoid potential security issues you could use a POST method and HTTPS.
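A rough sketch of the POST variant (the route name and key are just the placeholders from above):
from flask import Flask, request

app = Flask(__name__)

@app.route('/turnOn', methods=['POST'])
def turn_on():
    # the key now travels in the request body instead of the URL
    if request.form.get('key') == "superSecretKey":
        # Do something ..
        return 'Ok'
    return 'Nope'
You could then submit an HTML form with method="post", or test it with curl -X POST -d "key=superSecretKey" http://localhost:5000/turnOn .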
Look at the Flask documentation for more info.

How to have a Python Script run constantly in the background connected to a Node.JS server

Currently I have a Node.js server that needs to run a Python script. Part of this script imports some data from a server, which takes time; that is detrimental to the experience because the Python script needs to be run frequently.
What I want is for the Node.js server to run this Python script when the server starts, keep it running constantly in the background, and be able (from the Node.js server) to call a Python function that returns data.
So far I have this on the Node.js server; it runs the Python script and outputs the response, and it is repeated every time data needs to be retrieved:
const util = require('util');
const { PythonShell } = require('python-shell'); // from the python-shell package

var options = {
    mode: 'text',
    args: [sentenceToUse]
};

const readPy = util.promisify(PythonShell.run);
const test = await readPy("spacyTest.py", options);

var response = test.toString();
response = response.replace(/'/g, '"');
response = JSON.parse(response);
return response;
How can I keep this running in the background without restarting the python script every time data is needed?
It seems you need to change the Python script itself so that it keeps running and responds to requests from its parent. If the Python script currently runs and exits, there's nothing you can do from Node.js to change that; the Python script itself has to change.
Here are some options:
The Python script can regularly send data back to its parent on stdout, and you can read that data from Node.js as it arrives (see the sketch after this list).
You can put a little local HTTP server or socket.io server in the Python script on a known port and have your Node.js parent talk to it to make requests and get responses.
The python-shell module also shows how to communicate between the Node.js parent and the Python script in its documentation, so that is an option too.
Since options 1 and 3 are built-in already, I would probably start with one of those until you find some reason they aren't sufficient.
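As a sketch of option 1 (the function and field names are placeholders, not the asker's real spaCy code): the Python script stays alive, reads one sentence per line from stdin, and answers with one JSON line on stdout, which the Node.js parent can feed and read without restarting the process.
import json
import sys

# the expensive import / model load would happen once here, not per request

def process(sentence):
    # placeholder for the real spaCy work
    return {"input": sentence, "words": len(sentence.split())}

for line in sys.stdin:
    sentence = line.strip()
    if not sentence:
        continue
    sys.stdout.write(json.dumps(process(sentence)) + "\n")
    sys.stdout.flush()  # flush so the Node.js parent sees each reply immediately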

Ruby on Rails hangs during HTTP request from Python script

I'm writing a web application in Ruby on Rails where users can write Python code in a web editor and execute it in a Docker environment on the server. I've written a simple Python script that creates a Docker container:
import docker
import sys

if __name__ == "__main__":
    if len(sys.argv) == 2:
        token = sys.argv[1]
        client = docker.from_env()
        res = client.containers.run('openql', 'python3 /home/pythonwrapper.py ' + token)
    else:
        print("Requires one parameter")
As you can see, it creates a Docker container from the openql image and executes a simple Python script inside it. If a user presses the execute button in the web editor, Ruby on Rails runs this script with: system("python", "script.py", "<TOKEN>"). So far so good, this all works fine.
However, there is a problem when pythonwrapper.py executes inside the Docker container. I'm using Python's requests library to fetch the files written by the user so they can be executed inside the container. The code looks like this:
# Request all the available assets, it does not download the content of the files.
# Downloading the content of the files is done in a later request
url = rails_url+"allAssets/"+token
res = requests.get(url)
#Convert bytes into string
content = str(res.content, 'utf8')
Looks pretty simple, but the whole Ruby on Rails web server hangs during this request. The strange thing is that it all works fine if I first execute the script manually from the console after restarting the server.
The only thing I get from the Rails console is this:
Started GET "/allAssets/123" for 10.0.2.15 at 2017-08-02 10:24:59 +0200
When I quit the webserver for a restart, Ruby on Rails shows the following logs:
[screenshot of the console output omitted]
And then nothing. Does anyone know what could be the problem?
You should probably be running the container in the background:
res = client.containers.run('openql','python3 /home/pythonwrapper.py '+token, detach=True)
This will make sure your server is not stuck waiting for the container to complete the command it is executing.

Simultaneous requests with turbogears2

I'm very new to web dev, and I'm trying to build a simple web interface with Ajax calls to refresh data, using TurboGears2 as the backend.
My Ajax calls work fine and are made periodically to my TurboGears2 server; however, these calls take time to complete (some requests make the server perform remote SSH calls to other machines, which take up to 3-4 seconds).
My problem is that TurboGears waits for each request to complete before handling the next one, so all my concurrent Ajax calls are queued instead of all being processed in parallel.
Refreshing N values takes 3*N seconds where it could take just 3 seconds with concurrency.
Any idea how to fix this?
Here is my current server-side code (method get_load is the one called with Ajax):
import subprocess

from tg import TGController, expose

class RootController(TGController):
    @expose()
    def index(self):
        with open("index.html") as data:
            index = data.read()
        return index

    @expose()
    def get_load(self, ip):
        command = "bash get_cpu_load.sh"
        request = subprocess.Popen(["ssh", "-o ConnectTimeout=2", ip, command],
                                   stdout=subprocess.PIPE)
        load = str(request.communicate()[0])
        return load
Your problem is probably caused by the fact that you are serving requests with the Gearbox wsgiref server. By default the wsgiref server is single threaded and so can serve a single request at a time. That can be changed by providing the wsgiref.threaded = true configuration option in your development.ini server section (the same one where the IP address and port are specified). See https://github.com/TurboGears/gearbox#gearbox-http-servers and http://turbogears.readthedocs.io/en/latest/turbogears/gearbox.html#changing-http-server for additional details.
Note that wsgiref is the development server for TurboGears, and its use in production is usually discouraged. You should consider using something like waitress, chaussette or mod_wsgi when deploying your application; see http://turbogears.readthedocs.io/en/latest/cookbook/deploy/index.html?highlight=deploy
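For reference, a hedged sketch of what the server section of development.ini might look like with that option added (the exact section contents depend on how your project was quickstarted; host and port shown here are just the common defaults):
[server:main]
use = egg:gearbox#wsgiref
host = 127.0.0.1
port = 8080
wsgiref.threaded = true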

WebIOPi and Harmony Hub

My end goal here is to turn on my TV using my Pi. I've already set up and configured everything I can think of; I can access the Pi remotely via HTTP, but I constantly get a 404 when trying to call a macro via the REST API. The script runs fine on its own; it just can't seem to be called over HTTP.
At this point, I'd take any solution that can be executed via HTTP: PHP, CGI, etc., I don't care, I just need it to run alongside the current setup.
Added to the config file as follows:
myscript = /home/pi/harmony.py
harmony.py
import webiopi
import sys
import os

@webiopi.macro
def HarAll():
    os.system("/home/pi/Desktop/harmonycontrol/HarmonyHubControl em#i.l passwort start_activity 6463490")
When I attempt to access http://piaddress:8000/macros/HarAll I get a 404. I'm positive I'm missing a step here; for some reason, WebIOPi simply isn't adding the macro to the web server.
Got it figured out: this whole time I was testing it from a web browser instead of just adding it to the app I made, so I was sending an HTTP GET instead of an HTTP POST. It works perfectly.
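For anyone else hitting this, a quick sketch of the POST call that works where the browser GET returned 404 (the URL is the one from the question; add your WebIOPi credentials if authentication is enabled):
import requests

# WebIOPi macros are invoked with POST; a GET on the same URL returns 404
res = requests.post("http://piaddress:8000/macros/HarAll")
print(res.status_code, res.text)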
