Is there an equivalent to Python's yield behavior in Groovy?

Trying to learn Groovy, and it's been a fun and only slightly confusing adventure so far. What I'm currently trying to do is stand up a server, make a wget request to it and when that request is received, have a certain action be executed - in this case, just creating a new file:
import java.net.http.HttpResponse

class ServerLogic {
    static def holdupServer() {
        println('Standing up server..\n')
        def socketServer = new ServerSocket(5000)
        // server is up
        socketServer.accept { socket ->
            // the lines below only execute when a connection is made to the server
            socket.withStreams { input, output ->
                println("[${new Date()}] HELLO\n")
                def newFile = new File("/home/nick/IdeaProjects/groovy_learning/resources/new.txt")
                newFile.createNewFile()
                newFile.text = 'Hello!!!'
                println("NEW FILE SHOULD HAVE BEEN CREATED.")
                println ">> READ: ${input.newReader().readLine()}"
            }
        }
        return HttpResponse
    }
}
ServerLogic.holdupServer()
With the above code, when I execute a wget http://localhost:5000, it "works" in the sense that the file is created like I want it to be, but the wget output is unhappy:
--2021-07-17 15:42:32-- http://localhost:5000/
Resolving localhost (localhost)... 127.0.0.1
Connecting to localhost (localhost)|127.0.0.1|:5000... connected.
HTTP request sent, awaiting response... No data received.
Retrying.
--2021-07-17 15:42:33-- (try: 2) http://localhost:5000/
Connecting to localhost (localhost)|127.0.0.1|:5000... failed: Connection refused.
Resolving localhost (localhost)... 127.0.0.1
Connecting to localhost (localhost)|127.0.0.1|:5000... failed: Connection refused.
// these occur because the server has shut down after the last println call, when the `return HttpResponse` triggers
So from that, we can reason that no proper response is being returned, even though I have the return HttpResponse after the socketServer.accept ... logic. My thought on how to solve the problem (primarily because I come from a Python background) would be to somehow mimic yielding a response in Python (basically, return a response without breaking out of the holdupServer() logic and thus breaking the server connection). Is there a way to achieve this in Groovy, or is there a different approach I could use to essentially return a valid HttpResponse without exiting the holdupServer() block?
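For context, the Python behavior in question looks roughly like this (a simplified, illustrative snippet, not the actual server code): a generator hands a value back to its caller without exiting, so its loop and local state stay alive.
def holdup_server():
    n = 0
    while True:
        n += 1
        yield "response %d" % n   # hand a value back without leaving the function

gen = holdup_server()
print(next(gen))   # response 1
print(next(gen))   # response 2 -- the loop and its state are still alive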

Explanation
You can use a function callback, which in Groovy translates to a closure callback. Basically, you pass the value you want to return to another function/method, so the current method keeps running instead of unwinding its stack. This approach works in essentially every language. In Java (versions that don't support lambdas), for instance, you would pass an object and later call a method on it.
Example
import java.net.http.HttpResponse

class ServerLogic {
    static def holdupServer(Closure closure) {
        (0..2).each {
            closure.call(HttpResponse)
        }
    }
}

ServerLogic.holdupServer { httpResponse ->
    println httpResponse
}
Output
interface java.net.http.HttpResponse
interface java.net.http.HttpResponse
interface java.net.http.HttpResponse
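For comparison, here is a rough Python analogue of the same callback pattern (illustrative only, not part of the original answer):
def holdup_server(callback):
    for _ in range(3):
        callback("HttpResponse")   # hand each value to the caller-supplied function

holdup_server(lambda http_response: print(http_response))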
Addressing OP's comment
You have to provide some headers: at least Content-Type and Content-Length, along with the data and a properly formatted HTTP status line (HTTP/1.1 200, in this case). Also, the ServerSocket.accept call should be wrapped in a while loop so the server keeps accepting connections.
See the MDN Overview on HTTP.
Code
class ServerLogic {
    static HEADERS = [
        "HTTP/1.1 200",
        "Content-Type: text/html; charset=utf-8",
        "Connection: Keep-Alive",
        "Keep-Alive: timeout=5, max=1000",
        "Content-Length: %d\r\n",
        "%s"
    ].join("\r\n")

    static def holdupServer(Closure callback) {
        println('Standing up server..\n')
        def socketServer = new ServerSocket(5000)
        // server is up
        while (true) { // Continue to accept connections
            socketServer.accept { socket ->
                // the lines below only execute when a connection is made to the server
                callback.call(socket) // call function
            }
        }
    }
}

ServerLogic.holdupServer { Socket socket ->
    String data = "RESPONSE <--\n"
    def response = String.format(ServerLogic.HEADERS, data.size(), data)
    println response
    socket << response
}
Client output
RESPONSE <--

Related

how to retrieve Python array in javascript (Flask to React communication)

I dumped a Python array to a response with json.dumps and now I'm trying to retrieve the data as a JavaScript list.
@app.route('/get_scales')
@cross_origin()
def get_scales():
    classes = inspect.getmembers(sys.modules['mingus.core.scales'], inspect.isclass)
    scales = [class_[0] for class_ in classes if ('Error' not in class_[0] and class_[0] != '_Scale')]
    return json.dumps(scales)
getScales() {
    // create a new XMLHttpRequest
    var xhr = new XMLHttpRequest();
    // get a callback when the server responds
    xhr.addEventListener("load", () => {
        // update the state of the component with the result here
        console.log(xhr.responseText);
    });
    // open the request with the verb and the url
    xhr.open("GET", "http://127.0.0.1:5000/get_scales");
    // send the request
    xhr.send();
    var formatted_response = JSON.stringify(xhr.responseText);
    console.log(JSON.parse(xhr.responseText));
    return xhr.responseText;
}
When I log the type of xhr.responseText inside getScales, it shows String, but when I try to parse it with JSON.parse it throws an error. Trying to stringify it first, like above, doesn't help either.
I don't know what error it gave, but I think it was actually because xhr.responseText wasn't there yet when you tried to use it. This is because XMLHttpRequests are asynchronous, meaning that while the request is still waiting for a response, the rest of your code continues executing. Try this instead:
xhr.open('GET', url, false);
the "false" parameter basically says you want your XMLrequest to function synchronously. So the rest of your code will wait for it to finish.
Do keep in mind that your performance may suffer because of this in a lot of situations.
So if you have multiple XMLrequests at a time or Sequentially, you could consider using HTML5 Workers for this.
Or If you don't want your requests to function synchronously (if you can avoid having your XMLrequest function synchronously, you definitely should) you could also try something like this (something like this is defentintely the best option for performance, so if you can use it):
getScales() {
    // create a new XMLHttpRequest
    var xhr = new XMLHttpRequest();
    // get a callback when the server responds
    xhr.addEventListener("load", () => {
        // update the state of the component with the result here
        console.log(xhr.responseText);
        var formatted_response = JSON.stringify(xhr.responseText);
        console.log(JSON.parse(xhr.responseText));
        return xhr.responseText;
    });
    // open the request with the verb and the url
    xhr.open("GET", "http://127.0.0.1:5000/get_scales");
    // send the request
    xhr.send();
}

How does send work in flask-socketio?

I cannot understand how the send function works in flask-socketio.
For example, I am using flask-socketio as server and socket.io as client:
server:
@socketio.on("test")
def handle_test():
    send("Wow")
client:
socket.emit('test', "", (data) => {
    console.log(data);
});
I thought I could get the data from the server side, but I was wrong. I just get nothing.
I understand I can build a structure based on events, but I cannot understand how send works. Will it send a response to the client? If it will, how can I get that response? If it won't, what does it do?
First, I suggest you use emit() instead of send().
To send from client to server, use this:
socket.emit('test', {data: 'my data'});
On the server, you can receive the event and then emit back to the client:
@socketio.on('test')
def handle_test():
    emit('wow')
To receive this second emit on the client, do this:
socket.on('wow', function(data) {
    console.log(data);
});
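To tie the two sides together, here is a minimal, untested Flask-SocketIO server sketch (the app setup lines are assumptions, not from the question). Note that send() delivers an unnamed 'message' event on the client, while a handler's return value is passed to the client's acknowledgment callback (the third argument in the question's socket.emit call):
from flask import Flask
from flask_socketio import SocketIO, send, emit

app = Flask(__name__)
socketio = SocketIO(app)

@socketio.on('test')
def handle_test(data):
    send('Wow')          # client receives this via socket.on('message', ...)
    emit('wow', 'Wow')   # client receives this via socket.on('wow', ...)
    return 'Wow'         # delivered to the acknowledgment callback, if any

if __name__ == '__main__':
    socketio.run(app)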

How to send no response to client in Flask (Python)

I want to handle some malicious requests by not sending any kind of response in Flask.
In a route like:
@app.route("/something", methods=['POST'])
def thing():
    return
This still returns an INTERNAL SERVER ERROR with "View function did not return a response".
How do I formally send NO RESPONSE back to a client, e.g. from an ajax call like this?
$.ajax({
    url: "/something/",
    method: "POST",
    data: JSON.stringify({
        "foo": "abc",
        "bar": "123"
    }),
    success: function(resp) {
        console.log(resp);
    },
    error: function(error) {
        console.log(error); // this still gets called. I want it to hang.
    }
})
... I want it to hang.
What you probably want is to close the connection on the server side (since you need to free the resources) but without the client realizing that the connection is closed. You cannot do this from Flask or any other web application, since any close of the connection on the server will cause the OS kernel to send a FIN to the client, so the client learns about the close as well.
I suggest instead that you issue a redirect to a URL where the client just hangs: for example, have an iptables DROP rule on port 8080 and then redirect the client to http://your.host:8080/. Many clients will blindly follow such redirects and then hang (until they time out) while trying to connect to this dead-drop URL.
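A rough sketch of that idea in Flask (the host and port are placeholders from the example above, and the actual hang depends on the iptables DROP rule being in place):
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/something", methods=["POST"])
def thing():
    # Send the suspicious client off to a URL that never answers.
    return redirect("http://your.host:8080/", code=302)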

Asynchronous Django using django-websocket-redis

I'm trying to use django-websocket-redis and I don't understand how it works, even after reading the docs.
The client part (JavaScript/template) was easy to understand, but I want to send data messages from one client to another and I'm stuck here.
Connecting each client:
var ws = new WebSocket('ws://localhost:8000/ws/foobar?subscribe-group');
ws.onopen = function(e) {
    console.log("websocket connected");
};
ws.onclose = function(e) {
    console.log("connection closed");
};
How do I set up my views.py to create a link between them?
With Node.js I was using this code to link the clients together:
io.sockets.on('connection', function (socket) {
    var data = {"action": "connexion", "session_id": socket.id,};
    socket.emit('message', data);
    socket.on('message', function(socket){
        if (socket.action == "test")
        {
            io.sockets.socket(socket.code).emit('message', {"action": "move"});
            // the socket.code is the session_id of the client one transmitted by a form
        }
    });
});
Thank you.
The link between your Django views.py and the Websocket loop is the Redis message queue. Imagine having two separate main loops on the server: one handles HTTP requests using the normal Django request handler; the other handles the Websockets, with their long-lived connections. Since you can't mix both loops within the normal Django request handler, you need message queuing so they can communicate with each other.
Therefore, in your Django view.py, send the data to the websocket using something like:
import json
from ws4redis.publisher import RedisPublisher   # provided by django-websocket-redis
from ws4redis.redis_store import RedisMessage

# inside a Django class-based view:
def __init__(self):
    self.redis_publisher = RedisPublisher(facility='foo', broadcast=True)

def get(self, request):
    data_for_websocket = json.dumps({'some': 'data'})
    self.redis_publisher.publish_message(RedisMessage(data_for_websocket))
This will publish data_for_websocket on all Websockets subscribed (=listening) using the URL:
ws://example.com/ws/foo?subscribe-broadcast

Code in Python, communicate in Node.js and Socket.IO, present in HTML

You have a Python script diagnosis.py that generates realtime event-based data. Using Node.js, you can launch it as a child process, capture its output, and then use Socket.IO to emit it to the client and present it in HTML.
Server
var util = require('util'),
    spawn = require('child_process').spawn,
    ls = spawn('python', ['diagnosis.py']);

var app = require('http').createServer(handler)
  , io = require('socket.io').listen(app)
  , fs = require('fs')

app.listen(80);

function handler (req, res) {
    fs.readFile(__dirname + '/index.html',
        function (err, data) {
            if (err) {
                res.writeHead(500);
                return res.end('Error loading index.html');
            }
            res.writeHead(200);
            res.end(data);
        });
}

io.sockets.on('connection', function (socket) {
    ls.stdout.on('data', function (gdata) {
        socket.emit('news', gdata.toString());
    });
});
Client
<html>
<head>
    <script src="/socket.io/socket.io.js"></script>
    <script>
        var d = "";
        var socket = io.connect('http://localhost');
        socket.on('news', function (data) {
            d += data;
            document.getElementById('data').innerHTML = d;
            console.log(data);
        });
    </script>
</head>
<body>
    <div id="data"></div>
</body>
</html>
Question
This is great and all, but what if you're looking for the same HTML-Node.js communicative power that Socket.IO provides but instead between Node.js and Python? How would you do that? There's no web server there, so Socket.IO does not make a lot of sense and communicating over bare TCP does not provide the same power/elegance. How do I achieve full duplex communication between Node.js and Python?
Update: I answered my own question, but I'm open to an alternative approach. RPC doesn't quite do what I want, though.
Apache Thrift is a pretty awesome way to write RPC code between all of the major languages. You write a generic Thrift spec declaring the types and services, and the code generator creates bindings for your desired languages. You can have APIs for calling methods between your Node.js and Python code bases.
On a more generic low level approach, ZeroMQ is a message queue library that also has support for all of the major languages. With this, you can design your own solution for how you want to communicate (not just purely RPC). It has patterns for request/reply, push/pull, pub/sub, and pair. It gives you enough tools to put together a scalable system of any type.
I have used both and they are great solutions.
Just as a very rough example, the Thrift spec might look something like this. Let's say you want to communicate events from Python to Node.js, have them processed, and get back some response:
myspec.thrift
struct Event {
    1: string message;
}

service RpcService {
    string eventOccurred(1:Event e)
}
This defines a data structure called Event with a single string member to hold the message, and a service called RpcService that defines one function, eventOccurred, which expects an Event as an argument and returns a string.
When you generate code from this spec for Python and Node.js, you can then use the client-side code in Python and the server-side code in Node.js.
python
from myrpc import RpcService, ttypes

# create a connection somewhere in your code
CONN = connect_to_node(port=9000)

def someoneClickedSomething():
    event = ttypes.Event("Someone Clicked!")
    resp = CONN.eventOccurred(event)
    print("Got reply:", resp)
node.js
// I don't know node.js, so this is just pseudo-code
var thrift = require('thrift');
var myrpc = require('myrpc.js');

var server = thrift.createServer(myrpc.RpcService, {
    eventOccurred: function(event) {
        console.log("event occured:", event.message);
        success("Event Processed.");
    },
});
server.listen(9000);
You can look at messaging systems like ZeroMQ (0MQ): http://www.zeromq.org/
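For instance, a minimal request/reply sketch on the Python side using pyzmq (the port and message format here are arbitrary choices, not from the answer):
import zmq

context = zmq.Context()
socket = context.socket(zmq.REP)        # reply socket
socket.bind("tcp://*:5555")

while True:
    message = socket.recv_string()      # wait for a request (e.g. from the Node.js side)
    socket.send_string("processed: " + message)
The Node.js side would use a matching REQ socket, or one of the other patterns mentioned above.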
I used the library inspired by this question to turn diagnosis.py into a Socket.IO client. This way I can emit the realtime data to the Node.js Socket.IO server:
socketIO.emit('gaze', ...)
And then have it do a socket.broadcast.emit to send the data to all the Socket.IO clients (the browser and diagnosis.py).
RPC is probably the more standard approach for cross-language development, but I find it a bit of an overkill when the goal is simply to exchange data. It also does not support evented IO out of the box.
Update (Jan 2013): Since socket.broadcast.emit generates a lot of unnecessary traffic, I tried to find a better way of doing this. The solution I came up with is to use namespaces, which are supported by the basic Python Socket.IO client library I mentioned.
Python
self.mainSocket = SocketIO('localhost', 80)
self.gazeSocket = self.mainSocket.connect('/gaze')
self.gazeSocket.emit('gaze', ...)
To connect to the gaze namespace.
Node.js
var gaze = io.of('/gaze').on('connection', function (socket) {
    socket.on('gaze', function (gdata) {
        gaze.emit('gaze', gdata.toString());
    });
});
This emits the data received only to clients connected to the gaze namespace.
Browser
var socket = io.connect('http://localhost/gaze');
socket.on('gaze', function (data) {
    console.log(data);
});
If you're just looking for something that gives you a simple protocol on top of sockets, so you don't have to deal with buffering and delimiting messages and all that stuff, the two obvious choices (depending on what kind of data you're trying to send) are netstrings and JSON-RPC. There are multiple choices of libraries for both, in both languages, but you can end up with code as simple as this:
python
class MyService(Service):
    # … code to set up the newsdb
    def getnews(self, category, item):
        return self.newsdb.get(category, item)

myService = MyService(6000)
node.js
myService = Service(6000)
// … code to set up the newsdb
myService.getnews = function(category, item, callback) {
    callback(this.newsdb.get(category, item));
}
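For reference, netstrings are simple enough to frame by hand; here is a minimal Python sketch of the encoding (length prefix, colon, payload, trailing comma), independent of any particular library:
def encode_netstring(payload: bytes) -> bytes:
    # b"hello world!" becomes b"12:hello world!,"
    return str(len(payload)).encode() + b":" + payload + b","

def decode_netstring(data: bytes) -> bytes:
    length, _, rest = data.partition(b":")
    n = int(length)
    if rest[n:n + 1] != b",":
        raise ValueError("malformed netstring")
    return rest[:n]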
