I'm from Brazil and study at FATEC (a college located in Brazil).
I'm trying to learn about AppEngine.
Now I'm trying to load a large database from MySQL into App Engine to perform some queries, but I don't know how I can do it. I did some testing with CSV files, but is there any way to perform a direct import from MySQL?
This database is from Pentaho BI Server (www.pentaho.com).
Thank you for your attention.
Regards,
Daniel Naito
It isn't clear from your tags, but the documented bulkloader is preferable to trying to hoist your CSV files directly to the app server.
Advanced Bulk Loading by Nick Johnson is what you are looking for.
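A minimal loader along the lines of that article might look like the sketch below. The Person kind and its two columns are only placeholders; map your own MySQL columns the same way.

```python
# loader.py -- a minimal bulkloader config sketch (kind and property names
# are placeholders; adapt them to your own schema).
from google.appengine.ext import db
from google.appengine.tools import bulkloader


class Person(db.Model):
    name = db.StringProperty()
    email = db.StringProperty()


class PersonLoader(bulkloader.Loader):
    def __init__(self):
        # Each tuple maps a CSV column, in order, to a datastore property
        # and a conversion function.
        bulkloader.Loader.__init__(self, 'Person',
                                   [('name', str),
                                    ('email', str)])


loaders = [PersonLoader]
```

You would export each MySQL table to CSV (SELECT ... INTO OUTFILE works well for this) and push it up with something like `appcfg.py upload_data --config_file=loader.py --filename=people.csv --kind=Person`, pointing --url at your app's remote_api endpoint.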
If you need live synchronization between App Engine and MySQL, you should look into AppRocket. AppRocket seems to require that you have your data in App Engine before the first synchronization. It will also require some minor changes to your model.
If you're using Pentaho BI Server as your data source, why don't you consider using Pentaho Data Integration (the ETL tool) to move the data over? At the very least, PDI can automate any movement of data between your data source and any App Engine bulk loader tool (it can easily trigger any app with a shell step).
We use DB2 at our company. I would like to find a way to query DB2 for data and display that data in Grafana, for example to get the number of completed transactions.
I see Grafana supports MySQL natively but not DB2. Is there a way to just add the DB2 driver/libraries?
Worst case, would writing queries in Python and then simply displaying that recorded data with Grafana be an effective solution?
Thanks
I don't know if you found what you need, but in case you didn't, you might consider using Db2 REST services and the Grafana plugin 'Simple JSON Datasource'.
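To make that concrete, the Simple JSON Datasource plugin just polls a handful of HTTP endpoints, so a small Flask shim in front of DB2 can be enough. This is only a rough sketch under assumptions: the connection string, the SQL, and the metric name are placeholders, and the requested time range from Grafana is ignored for brevity.

```python
# A tiny shim between DB2 and Grafana's "Simple JSON Datasource" plugin.
# Connection string, SQL and metric names are placeholders.
import time

import ibm_db                      # IBM's DB2 driver for Python
from flask import Flask, jsonify

app = Flask(__name__)
DSN = "DATABASE=SAMPLE;HOSTNAME=db2host;PORT=50000;PROTOCOL=TCPIP;UID=user;PWD=secret"


@app.route("/")
def health():
    # The plugin calls this to test the connection.
    return "OK"


@app.route("/search", methods=["POST"])
def search():
    # The list of metrics Grafana can pick from.
    return jsonify(["completed_transactions"])


@app.route("/query", methods=["POST"])
def query():
    conn = ibm_db.connect(DSN, "", "")
    stmt = ibm_db.exec_immediate(
        conn, "SELECT COUNT(*) FROM TRANSACTIONS WHERE STATUS = 'COMPLETED'")
    (count,) = ibm_db.fetch_tuple(stmt)
    # The plugin expects datapoints as [[value, timestamp_in_ms], ...].
    return jsonify([{
        "target": "completed_transactions",
        "datapoints": [[int(count), int(time.time() * 1000)]],
    }])


if __name__ == "__main__":
    app.run(port=8080)
```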
In the meantime, two suitable datasources have been developed:
grafana-odbc-datasource
grafana-db2-datasource
Unfortunately, both require an enterprise license.
We're currently evaluating other approaches, like copying the data into a PostgreSQL database with an ETL-like job.
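If the tables are modest, that ETL-like job can be quite small. A minimal sketch, assuming the ibm_db and psycopg2 drivers and with example connection strings, table and column names:

```python
# Copy one DB2 table into a PostgreSQL mirror that Grafana can query
# natively. All names and connection strings below are examples.
import ibm_db
import psycopg2

db2 = ibm_db.connect(
    "DATABASE=SAMPLE;HOSTNAME=db2host;PORT=50000;PROTOCOL=TCPIP;UID=user;PWD=secret",
    "", "")
pg = psycopg2.connect("dbname=grafana_mirror user=grafana password=secret host=pghost")

# Pull the rows out of DB2.
stmt = ibm_db.exec_immediate(db2, "SELECT ID, STATUS, CREATED_AT FROM TRANSACTIONS")
rows = []
row = ibm_db.fetch_tuple(stmt)
while row:
    rows.append(row)
    row = ibm_db.fetch_tuple(stmt)

# Insert them into PostgreSQL; the connection context manager commits.
with pg, pg.cursor() as cur:
    cur.executemany(
        "INSERT INTO transactions (id, status, created_at) VALUES (%s, %s, %s) "
        "ON CONFLICT (id) DO NOTHING",
        rows)
```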
A generic ODBC/JDBC plugin is really needed.
I'm mostly working on backend stuff, except now in a project I need to use Python to do some computation and visualize the results on Google Maps. Think of it as, for example, computing the geographical clusters of people tweeting in New York City.
The Python program runs for about 10 seconds and then outputs one iteration of data, which is a JSON object of coordinates. I'm wondering how I should connect this data to Google Maps.
What I thought was to let Python write the data into a file and have JS poll that file every few milliseconds. However, that sounds too hacky. Is there a better way to do it?
I'm really a newbie to JS, so please forgive my ignorance.
Thanks
The normal way an HTML page gets data from a backend service (like your coordinate generator that produces a result every 10 seconds) is to poll a web service (usually a JSON feed) for updates.
All of the dynamic Google Maps stuff happens within a browser, and that page polls a JSON endpoint, or uses something fancier like websockets to stream data into the browser window.
For the frontend, consider using jQuery, which makes polling JSON dead simple. Here are some examples.
Your Python program should dump its results into a simple database. While relational, traditional databases like MySQL or PostgreSQL would suffice, I'd encourage you to use a NoSQL database that handles capped collections. This saves you from having to clean old data out on a cron schedule, and it also allows storing data in ranged buckets for some cool playback-style histories.
You should then have a simple web server which can handle the JSON requests from the HTML frontend page and simply pulls data from MongoDB. This can be done quickly in any one of the Python web frameworks like Flask, Bottle or Pyramid. You could also play with something a little sexier like node.js; the only requirement is that a database driver exists for it. A rough sketch of these pieces follows.
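Here is a minimal sketch of that backend, assuming Flask and pymongo; the database, collection and field names are made up for the example. The compute job drops each iteration into a capped MongoDB collection, and a tiny endpoint exposes the latest result as JSON for the map page to poll.

```python
# Sketch: compute job writes into a capped collection, Flask serves the
# latest result as JSON. Names (tweetmap, clusters, ts, coords) are examples.
import time

from flask import Flask, jsonify
from pymongo import DESCENDING, MongoClient

client = MongoClient()
db = client.tweetmap
if "clusters" not in db.list_collection_names():
    # Capped collection: MongoDB discards the oldest documents automatically,
    # so no cron cleanup is needed.
    db.create_collection("clusters", capped=True, size=1024 * 1024)


def publish(coords):
    """Called by the compute loop every ~10 seconds with a list of [lat, lng]."""
    db.clusters.insert_one({"ts": time.time(), "coords": coords})


app = Flask(__name__)


@app.route("/clusters/latest")
def latest():
    # The map page polls this endpoint and redraws its markers.
    doc = db.clusters.find_one(sort=[("ts", DESCENDING)])
    return jsonify(doc["coords"] if doc else [])


if __name__ == "__main__":
    app.run(port=5000)
```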
Hope that gives a 10,000 foot view of what you need to do now.
I have some CSV files for cities, states and countries with their IDs, names, etc. I want to put all this data into the Google App Engine datastore.
Can someone please suggest an efficient way of doing this on development server as well as on the production server?
Thanks in advance.
You're in luck. The functionality you described is baked into appcfg.py:
http://code.google.com/appengine/docs/python/tools/uploadingdata.html
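As a sketch of what that looks like for one of your files, you write a small loader config and point appcfg.py at it. The City kind and its columns below are assumptions; repeat the pattern for states and countries.

```python
# city_loader.py -- loader sketch for a cities CSV with columns id, name,
# state id (these names are assumptions about your files).
from google.appengine.ext import db
from google.appengine.tools import bulkloader


class City(db.Model):
    city_id = db.IntegerProperty()
    name = db.StringProperty()
    state_id = db.IntegerProperty()


class CityLoader(bulkloader.Loader):
    def __init__(self):
        # CSV columns, in order: id, name, state id.
        bulkloader.Loader.__init__(self, 'City',
                                   [('city_id', int),
                                    ('name', str),
                                    ('state_id', int)])


loaders = [CityLoader]
```

The same `appcfg.py upload_data` invocation works for both environments: point --url at your local dev server's remote_api handler while testing, and at your appspot.com app for production.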
Hi, I want some help building a phone book application in Python and putting it on Google App Engine. I have a huge DB of 2 million users and their phone book contacts. I want to upload all that data from my servers directly onto the Google servers and then use a UI to retrieve the phone book contacts of each user based on his name.
I am using MS SQL Server 2005 as my DB.
Please help in putting together this application.
Your inputs are much appreciated.
For building your UI, App Engine has its own web framework called webapp that is pretty easy to get working. I've also had a good experience using the Jinja2 templating engine, which you can include in your source or package as a zip file (the example shows Django, but you can do the same type of thing for Jinja).
As for loading all of your data into the DataStore, you should take a look at the bulk uploader documentation.
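For a feel of what the lookup side could look like with webapp, here is a bare-bones sketch. The Contact model and its fields are assumptions about your phone book schema; in practice you would render the result with a Jinja2 template rather than plain text.

```python
# Minimal webapp handler sketch: look up a user's contacts by owner name.
# Model and property names are assumptions, not your real schema.
from google.appengine.ext import db, webapp
from google.appengine.ext.webapp.util import run_wsgi_app


class Contact(db.Model):
    owner_name = db.StringProperty()    # the user this entry belongs to
    contact_name = db.StringProperty()
    phone = db.StringProperty()


class LookupHandler(webapp.RequestHandler):
    def get(self):
        owner = self.request.get('user')
        contacts = Contact.all().filter('owner_name =', owner).fetch(100)
        self.response.headers['Content-Type'] = 'text/plain'
        for c in contacts:
            self.response.out.write('%s: %s\n' % (c.contact_name, c.phone))


application = webapp.WSGIApplication([('/lookup', LookupHandler)], debug=True)


def main():
    run_wsgi_app(application)


if __name__ == '__main__':
    main()
```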
I think you're going to need to be more specific as to what problem you're having. As far as bulk loading goes, there's lots of bulkloader documentation around; or are you asking about model design? If so, we need to know more about how you plan to search for users. Do you need partial string matches? Sorting? Fuzzy matching?
Does anyone have any good information, aside from the docs provided by Google, that gives a good overview for people with an MS SQL background on porting their knowledge and using the Google App Engine datastore API effectively?
For example, if you have a self-created Users table and a Message table,
where there is a relationship between Users and Message (connected by the UserID), how would this structure be represented in Google App Engine?
SELECT * FROM Users INNER JOIN Message ON Users.ID = Message.UserID
Here is a good link: One to Many Join using Google App Engine.
http://blog.arbingersys.com/2008/04/google-app-engine-one-to-many-join.html
Here is another good link: Many to Many Join using Google App Engine:
http://blog.arbingersys.com/2008/04/google-app-engine-many-to-many-join.html
Here is a good discussion regarding the above two links:
http://groups.google.com/group/google-appengine/browse_thread/thread/e9464ceb131c726f/6aeae1e390038592?pli=1
Personally I find this comment in the discussion very informative about the Google App Engine Data Store:
http://groups.google.com/group/google-appengine/msg/ee3bd373bd31e2c7
At scale you wind up doing a bunch of things that seem wrong, but that are required by the numbers we are running. Go watch the EBay talks. Or read the posts about how many database instances FaceBook is running.
The simple truth is, what we learned about in uni was great for the business automation apps of small to medium enterprise applications, where the load was predictable, and there was money enough to buy the server required to handle the load of 50 people doing data entry into an accounts or business planning and control app....
Searched around a bit more and came across this Google Doc Article:
http://code.google.com/appengine/articles/modeling.html
App Engine allows the creation of easy to use relationships between datastore entities which can represent real-world things and ideas. Use ReferenceProperty when you need to associate an arbitrary number of repeated types of information with a single entity. Use key-lists when you need to allow lots of different objects to share other instances between each other. You will find that these two approaches will provide you with most of what you need to create the model behind great applications.
Can I supplement the excellent answer further above with a link to a video:
http://sites.google.com/site/io/building-scalable-web-applications-with-google-app-engine
It's a great talk by Google's Brett Slatkin, who talks for an hour about the special way you need to think about your application before you can expect it to scale well. There are some genuine WTFs (such as no count() in db queries) that will cause you to struggle if you are coming from a relational background.
I think this is the basics: Keys and Entity Groups.
Look for it in the App Engine docs. (I'm new here, so I can't post a link.)
I have worked on it, though I'm not an expert. Google App Engine is a very good thing and it is the future, as it implements Platform as a Service and Software as a Service. Google App Engine provides a non-relational database, so you can't really write relationships here.
Regards,
Gaurav J
These links are great, but they are predominantly Python-biased. I am using GWT and therefore have to use the Java flavour of GAE. Does anyone have any examples of how to achieve these "join" equivalents in the Java version of GAE?
Cheers,
John
The standalone GAE SDK is pretty difficult to use for putting data into and retrieving data from the Google App Engine data store.
"Objectify" is a GAE extension that makes these operations much easier. The Objectify wiki and source code can be found here. I strongly recommend using Objectify in your GAE project.
http://code.google.com/p/objectify-appengine/
Here are a couple of tutorials on using Objectify with the app engine. Follow these tutorials and you will be storing and retrieving data in no time.
http://www.fishbonecloud.com/2010/11/use-objectify-to-store-data-in-google.html