I'm working on a web interface that currently runs on PHP and communicates locally with a Python script.
I'm moving the web side to App Engine, which so far is going well when used locally. At the moment the App Engine app communicates with the Python app via GET requests that are handled by the Python script.
The problem is that the machine running the Python script will obviously be behind a firewall. I've never needed to do this before and am not sure how best to implement it.
The only idea I have so far is for the Python script to send POST requests to the App Engine app with some data and receive some other data back in the response. The only problem with this is that the web interface needs to update the client quite quickly.
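Roughly, something like this on the script side, using the Requests library (the App Engine URL and the payload fields are made up for illustration):

    import time
    import requests

    POLL_URL = "https://your-app-id.appspot.com/worker/poll"  # hypothetical endpoint

    while True:
        # Send local results and ask App Engine for any pending work in the same round trip
        reply = requests.post(POLL_URL, json={"status": "idle"}, timeout=10)
        for job in reply.json().get("jobs", []):
            print("got job", job)
        time.sleep(2)  # this delay is what limits how fast updates propagate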
Any ideas?
Take a look at ProtoRPC Python API: https://developers.google.com/appengine/docs/python/tools/protorpc/overview
Though it is still marked as experimental, it seems to be a decent framework for what you are trying to do - send messages back and forth between the apps.
Since you said your local app runs behind a firewall, I'm assuming you cannot open up an endpoint and protect it with some form of authentication.
Once you have messages flowing, you can either use Channel API to keep the front-end updated: https://developers.google.com/appengine/docs/python/channel/overview
Or if you want to go more basic, just implement long/short polling through AJAX.
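For example, a short-polling endpoint on the App Engine side can be a handler that returns the latest state as JSON, which the page then requests every few seconds via AJAX. A minimal sketch using webapp2 (assuming the standard Python runtime; the route and the data are placeholders):

    import json
    import webapp2

    class LatestStateHandler(webapp2.RequestHandler):
        def get(self):
            # Look up whatever the front-end needs to render; this is a placeholder
            state = {"status": "ok", "value": 42}
            self.response.headers['Content-Type'] = 'application/json'
            self.response.write(json.dumps(state))

    app = webapp2.WSGIApplication([('/latest', LatestStateHandler)])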
Sorry, with the limited amount of info you have provided, that's all I can think of right now. Please feel free to post more details and I'll try to help further.
I have a running Python application that needs to receive some data and process it, and I also have a PHP server that can get this data. I want to send JSON data from PHP to my Python app.
Is there any way to do this other than running a Python web server and sending the data to it, or inserting into a DB and reading it back with Python?
Thanks.
I tried using the Python CherryPy web server.
@Niklas D It would be easier to answer your question if you could give some more context about the application or use case you want to solve.
Some further possibilities are:
Glue code (I have never done it with Python and PHP, only C++ with Python, but you should be able to find examples on the internet, e.g. https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages#PHP )
Messaging Systems like RabbitMQ, ActiveMQ, ZeroMQ, etc.
Redis (I know you said no database, but Redis provides publish/subscribe features, https://redis.io/commands/pubsub , which let you write to Redis on one side and receive the data on the other side without polling the DB all the time, which I guess is your issue with using a database; see the sketch just after this list). It's a bit easier to set up and use than a messaging system.
TCP connection between the python and php application. https://medium.com/swlh/lets-write-a-chat-app-in-python-f6783a9ac170
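To make the Redis option concrete, here is a minimal sketch of the Python side using the redis-py client; the channel name is arbitrary, and the PHP side would publish the JSON with its own Redis client:

    import json
    import redis

    r = redis.Redis(host='localhost', port=6379)
    p = r.pubsub()
    p.subscribe('php-to-python')  # arbitrary channel name

    for message in p.listen():
        if message['type'] != 'message':
            continue  # skip the subscribe confirmation
        data = json.loads(message['data'])
        print('received from PHP:', data)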
If you want to send data to a Python application using web protocols, i.e. POST and GET requests etc., then you need to create a Python web app to receive and handle those requests. That app in turn needs to run on a web server, or you could build serverless functions to handle this, see https://serverless.com/
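For instance, a minimal receiving endpoint could look like this; Flask is just one possible choice here, and the route name is a placeholder:

    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route('/data', methods=['POST'])
    def receive_data():
        payload = request.get_json()   # the JSON body sent by the PHP side
        # ... process the data here ...
        return jsonify({'received': True})

    if __name__ == '__main__':
        app.run(port=5000)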
If you want to get data using a Python application, i.e. the Python app sends POST and GET requests etc. to your PHP app to ask for the JSON payload, you can build it with the standard-library urllib.request module https://docs.python.org/3/library/urllib.request.html or, better still, use the Requests package http://docs.python-requests.org/en/master/
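For that pull direction, a sketch with Requests can be as short as this (the URL is a placeholder for wherever your PHP app serves the JSON):

    import requests

    resp = requests.get('https://example.com/export.php')  # placeholder URL
    resp.raise_for_status()
    data = resp.json()  # parsed JSON payload from the PHP app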
Or you could have PHP save the JSON to disk and then open the file with your Python app. You'd need to set up scheduling, or make your PHP app execute Python code on the server... This last suggestion is a bad idea; please don't do it unless your app is isolated and not publicly accessible, or you know how to lock down your security.
First of all, I apologise for asking such a general question, but it has been really hard to find any useful information on the web.
I would like to create a client side + server side application, which will communicate using a REST api or websockets.
However, my question is the following.
On the server side, after receiving the request from the client, is it possible to execute a Python script that produces the intended answer, and only then return that answer to the client?
Let's take an example.
Suppose my client app asks the user which city they are in right now.
Then, the server receives a GET request with the name of that city.
Then, the server executes a Python script to obtain some information about that city (let's say it looks up the weather).
Only after the script returns the info can the server return its answer to the client so it can be rendered.
I hope this is not marked as off topic, as I think this is a relevant general question.
I appreciate your help.
Most serious Python web frameworks have add-ons that will do most of the transport/serialisation job for you.
Django REST Framework and Flask-RESTful are the kings in the Python world for full-stack REST server-side applications.
Otherwise, for simple things with little configuration, you could consider Falcon: https://falcon.readthedocs.io/en/stable/
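To illustrate the pattern you are asking about, here is a sketch with Flask-RESTful: the resource runs an ordinary Python function (standing in for your "script") and only returns once that function has finished. The get_weather function and route are placeholders:

    from flask import Flask
    from flask_restful import Api, Resource

    def get_weather(city):
        # Placeholder for the "script": call an external API, run a subprocess, etc.
        return {'city': city, 'forecast': 'sunny'}

    app = Flask(__name__)
    api = Api(app)

    class Weather(Resource):
        def get(self, city):
            # The HTTP response is only sent after this returns
            return get_weather(city)

    api.add_resource(Weather, '/weather/<string:city>')

    if __name__ == '__main__':
        app.run()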
On the client side, use the Requests package: http://docs.python-requests.org/en/master/
I'm currently working on a University project that needs to be implemented with a Client - Server model.
I had experiences in the past where I was managing the communication at socket level and that really sucked.
I was wondering if someone could suggest an easy to use python framework that I can use for that purpose.
I don't know what kind of details you may need to answer so I'm just going to describe the project briefly.
Communication should happen over HTTP, possibly HTTPS.
The server does not need to send data back or invoke methods on the clients; it just collects data.
Many clients send data concurrently to server, who needs to distinguish the sender, process the data accordingly and put the result in a database.
You can use something like Flask or Django. Both frameworks are fairly easy to work with; Flask is much easier than Django IMO, although Django has a built-in authentication layer that you can use, albeit one that is more difficult to apply in a client/server scenario like yours.
I would personally use Flask and JWT (JSON Web Tokens), which will allow you to give a token to each client for authentication with the server and will also let you differentiate between clients, and you can use HTTPS for your SSL/TLS requirement. It is tons easier to implement this, and although I like Django better for what it brings to the table, it is probably overkill to have you learn it for a single assignment.
For Flask with SSL, here is a quick rundown of that.
For JWT with Flask, here is that.
You can use any database system you would like.
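A minimal sketch of that setup, assuming the flask-jwt-extended package for the JWT part (the answer only says "JWT", so the exact library, routes and payloads are my choice):

    from flask import Flask, request, jsonify
    from flask_jwt_extended import JWTManager, create_access_token, jwt_required, get_jwt_identity

    app = Flask(__name__)
    app.config['JWT_SECRET_KEY'] = 'change-me'  # keep this secret in production
    jwt = JWTManager(app)

    @app.route('/login', methods=['POST'])
    def login():
        client_id = request.json.get('client_id')
        # ... verify the client's credentials here ...
        return jsonify(token=create_access_token(identity=client_id))

    @app.route('/data', methods=['POST'])
    @jwt_required()
    def collect():
        sender = get_jwt_identity()  # tells you which client sent the data
        # ... validate request.json and write it to your database ...
        return jsonify(ok=True), 201

    if __name__ == '__main__':
        # For HTTPS during development you can pass ssl_context=('cert.pem', 'key.pem') to app.run()
        app.run()

Each client logs in once, stores the token, and sends it as an Authorization: Bearer header on every later request, which is how the server tells senders apart.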
If I understood you correctly you can use any web framework in python. For instance, you can use Flask (I use it and I like it). Django is also a popular choice among the python web frameworks. However, you shouldn't be limited to only these two. There are plenty of them out there. Just google for them.
The implementation of the client depends on what kind of communication there will be between the clients and the server - I don't have enough details here. I only know it's unidirectional.
The client can be a browser accessing your web application written in Flask, where users send only POST requests to the server. However, even here the communication will be bidirectional (the clients need to open the page, which means the server sends responses back to the client), and it violates your initial requirement.
Then it can be a specific client written in Python sending particular requests to your server over HTTP/HTTPS. For instance, your client can use the Requests package to send HTTP requests.
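For example (the URL, identifier and payload are placeholders):

    import requests

    SERVER_URL = 'https://example.com/collect'  # placeholder for your server's address

    resp = requests.post(SERVER_URL, json={'client_id': 'sensor-1', 'reading': 23.5})
    resp.raise_for_status()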
I would like to do a very simple thing but I kept having trouble to get it work.
I have a client and a server, and both of them have Python. The client needs, at a certain point in the Python code, to send a picture to the server; the server uses Python to receive the picture, make some modifications to it, and then save it to disk.
How can I achieve this the easiest way possible? Is Django a good idea?
My problem is that I keep getting an error from the Django server side and it seems it is because I am not managing the cookies.
Can someone give me a sample code for the client and for the server to authenticate then send the file to the server in https?
Also, if you think it is better to use something other than Django, your comments are welcome :). In fact, I managed to get it working very easily with a Python client and a PHP server, but because I have to handle everything in Python on the server, I would have preferred not to install Apache, PHP, etc. and to use only Python to receive the picture as well.
Many thanks for your help,
John.
You don't need Django - a web framework - for this unless you really want to have the features of Django. (Here's a good link. But to sum it up, it would be "a bunch of website stuff".)
You'd probably be best off just using something to transmit data over the network. There are a lot of ways to do this!
If your data is all local (same network) you can use something like ZeroMQ.
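A rough sketch of that route using pyzmq, with a simple PUSH/PULL pair (the hostname, port and filenames are placeholders):

    # receiver.py (the "server")
    import zmq

    ctx = zmq.Context()
    sock = ctx.socket(zmq.PULL)
    sock.bind('tcp://*:5555')
    image_bytes = sock.recv()          # blocks until the client pushes a picture
    with open('received.jpg', 'wb') as f:
        f.write(image_bytes)

    # sender.py (the "client")
    import zmq

    ctx = zmq.Context()
    sock = ctx.socket(zmq.PUSH)
    sock.connect('tcp://server-host:5555')  # placeholder hostname
    with open('picture.jpg', 'rb') as f:
        sock.send(f.read())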
If you are not sure if your data is local, or if you know it won't be, you can use HTTP without a server - the Requests library is awesome for this.
In both these scenarios, you'd need to have a "client" and a "server" which you already have a good handle on.
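For the HTTP route, the client-side upload is a few lines with Requests; the receiving end can be any small web app, and Flask is shown here only as one possible choice (the URL, field name, filenames and auth are placeholders):

    # client.py
    import requests

    with open('picture.jpg', 'rb') as f:
        requests.post('https://server.example.com/upload',   # placeholder URL
                      files={'picture': f},
                      auth=('user', 'password'))              # or whatever auth you settle on

    # server.py (Flask, as one possible receiver)
    from flask import Flask, request

    app = Flask(__name__)

    @app.route('/upload', methods=['POST'])
    def upload():
        img = request.files['picture']
        # ... modify the image here ...
        img.save('incoming.jpg')
        return 'ok'

    if __name__ == '__main__':
        app.run(ssl_context='adhoc')  # quick-and-dirty HTTPS for testing; needs pyOpenSSL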
I'm trying to figure out how to build a TCP proxy on GAE (Google App Engine). I would ordinarily do it using the Twisted networking engine, but GAE doesn't allow frameworks. I'm also pretty new to internet and networking technologies in general.
Basically I have a proxy server and I'd like to use GAE as a TCP proxy to relay everything to the primary proxy server. All the GAE front ends are connected to the back end by google fiber, so if I make the back end near the primary proxy server, it should make it super fast regardless of where I'm connecting from.
Unfortunately GAE doesn't allow me to control ports at all and everything that I'm reading either tells me how to configure a TCP proxy on a server that I'm in complete control of or how to configure a proxy where I type the url into a webpage in the browser. Something along the lines of making a personal http://www.hidemyass.com/proxy/ type of website.
I'd like to set it up so I can simply tell chrome to ignore certificate errors (it connects to a dynamic IP using HTTPS so there's no way to sign it but I trust myself) and put the proxy info into chrome.
Edit: I'd prefer to write it in python but I can do any language
Thanks in advance
P.S. Please don't give answers like just use GoAgent or tor or something. They don't fulfill my purpose.
If you're simply trying to proxy HTTP requests like GoAgent does then have a look at the URLFetch documentation for Google App Engine.
URL Fetch Python API Overview
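A minimal sketch of relaying an HTTP request through App Engine with that API, inside a webapp2 handler (the target URL and route are placeholders):

    from google.appengine.api import urlfetch
    import webapp2

    class RelayHandler(webapp2.RequestHandler):
        def get(self):
            # Fetch the target URL from inside App Engine and pass the body through
            result = urlfetch.fetch('http://example.com/')  # placeholder target
            self.response.set_status(result.status_code)
            self.response.write(result.content)

    app = webapp2.WSGIApplication([('/relay', RelayHandler)])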
If you're trying to proxy anything else, then Daniel is correct.
This isn't the sort of thing you can use GAE for.
I don't know where you got the idea that GAE "doesn't allow frameworks". Of course it does: anything that speaks WSGI (e.g. Django, Flask, Pylons) is fine. But GAE is a web platform: it's not an appropriate place to try to write any sort of bare-metal networking platform. Apart from anything else, bandwidth on GAE is fairly expensive.
And also I don't know where you think the GAE "front ends" are, as opposed to the "back ends". GAE is not split that way, AFAIK.
I don't really understand what exactly you are trying to do, but it sounds like a content delivery network (CDN) like Akamai might be more appropriate.