Does anyone know if I can fetch an internal AppEngine URL from within my AppEngine app?
The official URL Fetch Python API doesn't cover that.
http://code.google.com/intl/et-EE/appengine/docs/python/urlfetch/overview.html
I tried different possibilities, but it seems nothing works. Has anyone done this before?
urllib2.urlopen('http://127.0.0.1:8080/start_something/')
or
urllib2.urlopen('/start_something/')
...
Thanks in advance.
This will not work at all with the development server. It is single-threaded, so trying to load one of your app's URLs from inside one of its request handlers will hang until the URLFetch times out.
It should work with no problems at all in production.
To get around the development server limitation, you can run two dev server instances on different ports and fetch from one to the other.
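For example, with a second dev_appserver instance assumed to be listening on port 8081 (the port and path below are just illustrations), the fetch from inside a handler could look roughly like this:

import logging
from google.appengine.api import urlfetch

# Second dev server instance assumed to be running on port 8081.
result = urlfetch.fetch('http://127.0.0.1:8081/start_something/', deadline=10)
if result.status_code == 200:
    logging.info(result.content)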
I have a Node.js server set up on AWS with MongoDB. I want to access the database contents using the GET method. There is another application, in Python, which needs to access this database on AWS. I searched the internet and came across PycURL, but I don't understand how to use it exactly. How should I approach this with PycURL, or what could be an alternative solution?
You can build a RESTful API to handle those GET requests. There is a good tutorial (with the example you want at the bottom):
https://scotch.io/tutorials/build-a-restful-api-using-node-and-express-4
Edit: If you want Python code for GET requests, there is a good answer here: Simple URL GET/POST function in Python
Edit 2: Here is an example of how this would work. First you need to code your API to handle a GET request on some route (example: http://localhost:5000/api/getUsers). Then you make a GET request to that route using Python:
Example:
import requests

r = requests.get(url="http://localhost:5000/api/getUsers")
I had a similar problem a while ago; there is a tutorial here. It can lead you in your intended direction. The drawback may be that in the tutorial, to issue the HTTP request (if I remember correctly), they used Postman, but I'm sure you can still use PycURL.
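If you do want to stick with PycURL rather than Requests, a rough sketch of the same GET request would look like this (the URL is just the illustrative route from above):

from io import BytesIO
import pycurl

buffer = BytesIO()
curl = pycurl.Curl()
curl.setopt(curl.URL, 'http://localhost:5000/api/getUsers')
curl.setopt(curl.WRITEDATA, buffer)   # collect the response body
curl.perform()
curl.close()
print(buffer.getvalue().decode('utf-8'))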
Well, I'm developing an application in GAE with Python, and basically what I need is a way to emulate a chat between many pages and a single page. The many pages can send messages to the single page.
I looked at sockets, but I ran into problems when working with local ports (on every port I tried I got an access denied message); they only worked in a console application, and I need this working on the web. Then I looked for an AJAX solution, but I still can't find the final part in which, after a message has been submitted, the content of the page that receives all the messages gets updated.
Does anyone have an idea how to deal with this? I'm also open to suggestions for a different implementation.
Have you looked at Google App Engine Channels?
I think it is exactly what you are looking for: https://cloud.google.com/appengine/docs/python/channel/
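A rough sketch of the server side, assuming you give the single receiving page a client id of your own choosing (the id and message below are illustrative):

from google.appengine.api import channel

# When the receiving page loads, create a channel and hand the token
# to its JavaScript client, which opens the socket.
token = channel.create_channel('receiver-page')

# When one of the "many" pages submits a message, push it to the receiver:
channel.send_message('receiver-page', 'Hello from one of the many pages')

The receiving page then gets the message in its onmessage callback, so no polling is needed.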
I would like to do a very simple thing, but I keep having trouble getting it to work.
I have a client and a server. Both of them run Python. At a certain point in the code, the client needs to send a picture to the server, and the server uses Python to receive the picture, make some modifications to it, and then save it to disk.
How can I achieve this in the easiest way possible? Is Django a good idea?
My problem is that I keep getting an error from the Django server side and it seems it is because I am not managing the cookies.
Can someone give me a sample code for the client and for the server to authenticate then send the file to the server in https?
Also, if you think it is best to use something other than Django, your comments are welcome :). In fact, I managed to get this working very easily with a Python client and a PHP server, but because I have to process everything in Python on the server, I would have preferred not to install Apache, PHP, etc., and to use only Python to receive the picture as well.
Many thanks for your help,
John.
You don't need Django - a web framework - for this unless you really want Django's features. (Here's a good link, but to sum it up, it would be "a bunch of website stuff".)
You'd probably be best off just using something to transmit data over the network. There are a lot of ways to do this!
If your data is all local (same network) you can use something like ZeroMQ.
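For example, a rough PUSH/PULL sketch with pyzmq (the addresses, port, and filename are illustrative):

# Receiver side (run first):
#   import zmq
#   pull = zmq.Context().socket(zmq.PULL)
#   pull.bind('tcp://*:5555')
#   picture_bytes = pull.recv()

# Sender side:
import zmq

push = zmq.Context().socket(zmq.PUSH)
push.connect('tcp://192.168.1.10:5555')  # the receiver's LAN address (illustrative)
with open('picture.jpg', 'rb') as f:
    push.send(f.read())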
If you are not sure whether your data is local, or if you know it won't be, you can use plain HTTP without a full web framework - the Requests library is awesome for the client side of this.
In both these scenarios, you'd need to have a "client" and a "server" which you already have a good handle on.
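As a rough sketch of the sending side with Requests (the URL, endpoint, and credentials below are made up for illustration), the client could POST the picture over HTTPS like this:

import requests

with open('picture.jpg', 'rb') as f:
    response = requests.post(
        'https://example.com/upload',                        # illustrative endpoint
        files={'picture': ('picture.jpg', f, 'image/jpeg')},
        auth=('user', 'secret'),                             # basic auth avoids juggling cookies
        timeout=30,
    )
response.raise_for_status()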
I'm working on a web interface which currently runs on PHP and communicates locally with a Python script.
I'm moving the web side to App Engine, which so far is going well when used locally. I'm currently communicating from the App Engine app to the Python app via GET requests that are handled by the Python script.
The problem is that the machine running the Python script will obviously be behind a firewall. I've never needed to do this before and am not sure how best to implement it.
The only idea I have so far is for the Python script to send POST requests to the App Engine app with some data and then receive some other data back as the response. The only problem with this is that the web interface should update the client quite quickly.
Any ideas?
Take a look at ProtoRPC Python API: https://developers.google.com/appengine/docs/python/tools/protorpc/overview
Though it is still marked as experimental, it seems to be a decent framework for what you are trying to do - send messages back and forth between the apps.
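A rough sketch of what a ProtoRPC service on the App Engine side could look like (the message fields and the /worker path are illustrative, not taken from your setup):

from protorpc import messages, remote
from protorpc.wsgi import service


class WorkerUpdate(messages.Message):
    data = messages.StringField(1)


class WorkerReply(messages.Message):
    result = messages.StringField(1)


class WorkerService(remote.Service):
    @remote.method(WorkerUpdate, WorkerReply)
    def report(self, request):
        # Store or forward request.data here, then answer the worker.
        return WorkerReply(result='received')


app = service.service_mappings([('/worker', WorkerService)])

The script behind the firewall can then POST serialized messages to /worker whenever it has something to report.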
Since you said your local app runs behind a firewall, I'm assuming you cannot open up an endpoint and protect it with some form of authentication.
Once you have messages flowing, you can either use Channel API to keep the front-end updated: https://developers.google.com/appengine/docs/python/channel/overview
Or if you want to go more basic, just implement long/short polling through AJAX.
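For instance, a bare-bones polling endpoint on App Engine with webapp2 might look like this (the /poll route and the module-level variable are illustrative; real state would live in the datastore or memcache):

import json
import webapp2

latest_status = {'value': None}   # illustrative; use the datastore or memcache in practice


class PollHandler(webapp2.RequestHandler):
    def get(self):
        # The front end hits this URL via AJAX every few seconds.
        self.response.headers['Content-Type'] = 'application/json'
        self.response.write(json.dumps(latest_status))


app = webapp2.WSGIApplication([('/poll', PollHandler)])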
Sorry, with the limited amount of info you have provided that's all I can think of right now. Please feel free to post more details and I'll try to help further.
I'm writing a syndication client, with the aim being to have a client for devices and a web site with the same functionality. I shall develop the website using Django - this is already decided; the client shall be written in Python with both a CLI and a PyQt4 GUI. I have been writing the client first, and it's fairly database-heavy, as everything is cached so it can be read while offline.
It struck me today that it would make sense to use Django models for my application, to reduce the duplication of effort between the client and the website. My question is how easy it is to separate this out, and how much of Django I will need in my client to use Django's models. AFAIK I should not need to run the server, but what else is needed? I had the idea of generating the same HTML for my client as for the website, but showing it within Qt widgets rather than serving pages to a browser.
Has anyone tried this sort of thing before? I'm starting on this already, but it would be good to get a warning about potential dead ends or things that will create a maintenance nightmare...
Read up on standalone Django scripts and you'll be on your path to victory. Basically all you're really doing is referencing the Django settings.py (which Django expects) and then using models without web views or urls.
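A rough sketch of the standalone setup: either point DJANGO_SETTINGS_MODULE at the project's settings.py, or configure settings directly in code, roughly like this (the app, model, and database names here are only illustrative):

import django
from django.conf import settings

settings.configure(
    DATABASES={'default': {'ENGINE': 'django.db.backends.sqlite3',
                           'NAME': 'client_cache.db'}},
    INSTALLED_APPS=['feeds'],
)
django.setup()   # required on Django 1.7+; older versions skip this line

from feeds.models import Entry   # hypothetical model from the shared app

unread = Entry.objects.filter(read=False)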
If all you're really interested in is using Django's ORM to manage your models and database interaction, you might want to consider using SQLAlchemy instead.
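A rough sketch of the same idea with SQLAlchemy (the Entry model and the SQLite file are illustrative):

from sqlalchemy import create_engine, Column, Integer, String, Boolean
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()


class Entry(Base):
    __tablename__ = 'entries'
    id = Column(Integer, primary_key=True)
    title = Column(String)
    read = Column(Boolean, default=False)


engine = create_engine('sqlite:///client_cache.db')
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

session = Session()
unread = session.query(Entry).filter_by(read=False).all()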
You'll still have to run the Django app as a web server, but you can restrict it to serve to only localhost or something. And sure, you can use QtWebKit as the client.