I'd like to work with data saved in one GAE application from other GAE applications. Basically, I want to share the datastore between multiple web applications in Google App Engine (Python), in both development and production.
Also, if possible, I'd like to use
http://localhost:####/_ah/admin/datastore
to view data from other applications, whether running or not, on one screen.
Thanks for the help!
Nope, datastores are totally contained within the application. There is no direct sharing of data from one app to another.
You could however expose a web service to make data from one application available to another, using REST for example.
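As a rough sketch of that approach, the exposing app could serve a read-only JSON endpoint. GAE's Python runtime speaks WSGI natively, so a plain WSGI callable is enough to illustrate the idea; the `RECORDS` list and the `/records` path below are invented, and in a real app the data would come from a datastore query:

```python
import json

# Hypothetical data; in a real app this would come from a datastore query.
RECORDS = [{"id": 1, "name": "alpha"}, {"id": 2, "name": "beta"}]

def records_app(environ, start_response):
    """Minimal WSGI app: GET /records returns the records as JSON."""
    if environ.get("PATH_INFO") == "/records":
        body = json.dumps({"items": RECORDS}).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

The consuming application would then fetch this URL (e.g. with urlfetch on GAE) and parse the JSON. Add authentication before exposing real data this way.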
I guess the core problem here is that you would like to share the data between two applications hosted on GAE. There are two ways to do that.
You could use Google Cloud Datastore to store the information. This gives you more flexibility as you can have different services accessing datastore. You could even have something running on google compute engine and communicating with datastore.
Use Google App Engine modules. All modules share the same datastore; in your case, each module could be a different application.
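A sketch of the first option, reading from a shared Cloud Datastore with the google-cloud-datastore client. The kind name `Record` and the `created` property are assumptions about your schema, and the client is passed in as a parameter so the same logic can be exercised with a stub:

```python
def latest_record(client, kind="Record"):
    """Return the most recently created entity of `kind`, or None.

    In production `client` would be google.cloud.datastore.Client();
    any object with a compatible query interface works here.
    """
    query = client.query(kind=kind)
    query.order = ["-created"]  # newest first, by a `created` property
    results = list(query.fetch(limit=1))
    return results[0] if results else None
```

Because the client library only needs credentials for the shared project, the same function runs unchanged from a second GAE app, Compute Engine, or a local script.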
Hope this helps.
No, a datastore can only be accessed by one application (but that app can serve up multiple sites).
If you want Google to allow multiple applications to directly access the same datastore then you should star this issue:
http://code.google.com/p/googleappengine/issues/detail?id=1300
Unfortunately the way this issue is written is a bit ambiguous, but I take it to mean 'multiple applications' rather than 'multiple accounts'.
FWIW, you can deploy an application under another version, and even another language, but with the same ID, and access its datastore concurrently.
Related
I'm developing a financial app using PySimpleGUI.
This is a desktop app, and will be sold publicly on my web page. I need a place to store my future clients data.
Does Google Cloud Storage work for a desktop app, and is it safe? (There will be sensitive financial data stored.) Also, multiple people will be editing the files simultaneously; will this cause Google Cloud Storage to break?
Will you recommend me using something else for storing my data?
Thanks
I have tried connecting to SQL Server, but it only works for computers that are on the same network.
Your choice of components is out of sync with the problem. My suggestion would be to first pin down the actual requirements. A few examples: How many people will access the application at a time? What data access controls will be present, and how will you implement them? Can you use GCP features, or will you be developing your own? Do any of these scenarios involve data masking? What is the design for expanding the application in the long run? And so on. As a small disclaimer, these questions barely scratch the surface of the complexities involved in designing such data systems.
Once done, go through the list of tools available in GCP. See, what fits and how an efficient chain can be established.
Also, connecting to GCP via Python works from anywhere, depending on how you set up the environment.
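If GCS does end up in the design, a desktop client would talk to it through the google-cloud-storage library. Here is a rough sketch; the object layout (`clients/<id>.json`) is invented, and the bucket is passed in so the logic can be tested with a stub:

```python
import json

def save_client_record(bucket, client_id, payload):
    """Serialize one client's record to JSON and store it in a GCS bucket.

    In production `bucket` comes from
    google.cloud.storage.Client().bucket("your-bucket"); any object
    exposing blob(name).upload_from_string(...) works here.
    """
    blob = bucket.blob("clients/%s.json" % client_id)
    blob.upload_from_string(json.dumps(payload),
                            content_type="application/json")
    return blob
```

Note that GCS has no file locking: simultaneous writers simply overwrite each other's objects, which is one reason a proper database service tends to fit "multiple people editing at once" better than object storage.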
I am attempting to create a Python application on a Raspberry Pi that can access data stored in a db model on an App Engine application, specifically the latest entry in the datastore.
I have no experience doing this type of remote data access but have a fair bit of experience with App Engine and Python.
I have found very little that I understand on this subject of remote data access.
I would like to access the data store directly, not text on a web page like this.
ProtoRPC looks like it may work, but I'm not familiar with it, and it looks pretty involved when I just need to access a few strings.
What would make the most sense to accomplish this? If an example is easy to provide I would appreciate it.
What you're looking for is the App Engine Remote API.
https://cloud.google.com/appengine/docs/python/tools/remoteapi
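A minimal sketch of what the Remote API setup looks like in a local script. It assumes the App Engine Python SDK is on `sys.path`; `your-app-id` is a placeholder, and the import is kept inside the function so the module still loads on machines without the SDK:

```python
def connect_remote_datastore(app_id, path="/_ah/remote_api"):
    """Attach this local process to the live datastore of `app_id`.

    Requires the App Engine Python SDK. After this call, ordinary
    db/ndb queries in the script run against the remote datastore.
    """
    from google.appengine.ext.remote_api import remote_api_stub
    remote_api_stub.ConfigureRemoteApiForOAuth(
        "%s.appspot.com" % app_id, path)
```

On the Raspberry Pi you would call `connect_remote_datastore('your-app-id')` once at startup, then fetch the latest entry with a normal ordered query, e.g. `MyModel.query().order(-MyModel.created).get()` under ndb (assuming a `created` property on your model).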
Google App Engine doesn't allow direct access to its databases from your local Python script. Instead, only an application hosted on App Engine's servers can access that data for your application.
Essentially, you're looking for a Google App Engine-compatible, automatic, RESTful API. Several exist, and have been discussed here. YMMV with the various frameworks discussed there.
I'm curious, is there a way I could use the new Google Cloud Storage client library from outside App Engine? If so, how would I go about setting the credentials/API key? I looked through the sparse documentation, to no avail. Any help is much appreciated.
Thanks.
Google Cloud Storage and Google App Engine are separate products that can be used separately. App Engine provides an App Engine-specific client for Google Cloud Storage with several useful features for developing an App Engine app that uses Google Cloud Storage, which I believe is the library you're referring to.
You can absolutely use Google Cloud Storage from outside App Engine, but you cannot use App Engine's GCS library to do so. Instead, you'll have to use one of GCS's APIs or client libraries. There are two main APIs (XML- and JSON-based), as well as client libraries for many major languages, including Python and Java.
For getting started, check out https://developers.google.com/storage/docs/signup
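As an illustration of the JSON API (v1) route, an object download is a single authenticated GET against a URL of this shape; the bucket and object names below are placeholders:

```python
from urllib.parse import quote

def gcs_media_url(bucket, obj):
    """Build the JSON API (v1) media-download URL for a GCS object.

    The request itself still needs an OAuth2 Authorization header
    (or the object must be publicly readable). Object names are
    fully percent-encoded, including any slashes.
    """
    return ("https://storage.googleapis.com/storage/v1/b/%s/o/%s?alt=media"
            % (quote(bucket, safe=""), quote(obj, safe="")))
```

With a client library (e.g. the Python one), this URL construction and the OAuth2 handshake are handled for you, which is usually the easier path.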
It should be possible to use the GCS client from outside GAE; however, you will still need the GAE SDK so the imports can work.
Take a look at the method common.set_access_token; you would need to refresh the token yourself, however.
If you are willing to dig further, take a look at the constructor of the _RestApi class, which receives a token-maker function.
This is an open source project and changes are welcomed.
I'm new to GAE and though I've looked around a fair bit, I haven't seen anything that mimics the functionality of statsd for GAE. Basically it would be nice to have something that you could easily set stats on and see the results graphed.
http://codeascraft.etsy.com/2011/02/15/measure-anything-measure-everything/
One thing that seems to be difficult for statsd is handling an unlimited amount of data. If you are interested in aggregate application statistics (across the entire dataset), I would suggest using the App Engine Log API or the App Engine Datastore in conjunction with Google BigQuery.
If you are interested specifically in analyzing App Engine logs, there are two projects you can take a look at that help move App Engine log data into BigQuery:
log2bq, a Python app for GAE logs->BigQuery
Mache, a framework for pushing GAE log data into BigQuery (I know you are asking about Python, but this one is written in Java)
For general stats collection and analysis, it's also possible to move Datastore data into BigQuery for analysis. The GAE team has recently started testing a feature that imports data from the experimental Datastore backup tool directly into BigQuery. Check this link for more info.
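As an illustration, once the log data is in BigQuery, an aggregate like requests-per-status-code becomes a single query. The table and field names below (`app_logs`, `status`, `timestamp`) are assumptions about how the import was set up, written in BigQuery's legacy SQL dialect:

```sql
-- Requests per HTTP status code over the last day.
SELECT status, COUNT(*) AS requests
FROM [mydataset.app_logs]
WHERE timestamp > DATE_ADD(CURRENT_TIMESTAMP(), -1, 'DAY')
GROUP BY status
ORDER BY requests DESC
```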
BigQuery doesn't provide visualization tools on its own, but there are lots of ways to visualize BigQuery's query results; examples include:
Google Chart Tools API
Google Apps Script
Tableau
QlikView
There's a lot more on the BigQuery third party tools page.
I'm in the process of setting up a new web app and deciding whether to just do it with WSGI or go the full framework route with Django.
The app's foremost requirements:
1) The app has no UI whatsoever, and all of the data is exposed to clients via a REST API with JSON.
2) It will have data to persist, so MongoDB and probably Amazon's SimpleDB will be used for the database side.
Is there a reason to use Django, or can I get a marginal speed improvement with WSGI only?
Previous server-side apps I've built were either with Java/Struts and Groovy/Grails on the JVM. My understanding is that Django is an MVC framework similar to Rails and Grails.
I've also played around with Google App Engine which uses WSGI as thin layer above your code for managing and routing requests.
I suggest you consider something between those two extremes. Flask is lightweight, very easy to use, and connects to your web server via wsgi. You can use regular python database connectors with it, and a few databases even have Flask-specific extension modules.
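A minimal sketch of the Flask route style, matching the JSON-over-REST requirement; the `/api/items` path and the payload are invented:

```python
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/items")
def list_items():
    # In a real app this would query MongoDB/SimpleDB.
    return jsonify(items=[{"id": 1, "name": "alpha"}])
```

During development you run it with `app.run()`; in production, `app` is an ordinary WSGI callable you can mount under any WSGI server, so you keep the lightweight deployment story while getting routing and JSON helpers for free.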
I have worked with Django on a couple of projects and I like it a lot, but since you are going to use MongoDB and a lot of JSON, I suggest you use Node.js on the server side, with Express as the framework. You can see a brief tutorial here:
http://howtonode.org/express-mongodb
One of the advantages of this is that you will use only JavaScript throughout your project. I began working with this technology last month at a hackathon, and I can tell you that I'm very impressed by how fast and simple it is.
I've worked a bit with some Django "apps"; it's really easy, but setting up the apps can be a bit of a long process. Django has a lot of nice features that you won't be using, and I agree that you might be at one "extreme" here.