Pulling data from MongoDB into a CSV file using Python

I am new to Python and I am trying to pull data from MongoDB and export it into a CSV file. Can someone tell me where I should start?
I am new to both technologies, but I am trying to learn.
If someone could help me, it would be really helpful.

Check out pymongo: http://api.mongodb.com/python/current/tutorial.html
There are also some great introductory videos on MongoDB on YouTube demonstrating basic CRUD operations (create, read, update, delete) and working in the mongo shell, which is the core interface to MongoDB.
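Once PyMongo is installed, the export itself is a short script. Below is a minimal sketch, assuming a local MongoDB instance and a hypothetical database/collection ("mydb"/"users") with known field names; adjust these to your own data.

    # Minimal sketch: export documents from a MongoDB collection to CSV.
    # Assumes a local mongod on the default port; the database, collection,
    # and field names below are illustrative placeholders.
    import csv
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017/")
    collection = client["mydb"]["users"]

    fields = ["name", "email", "age"]  # adjust to your documents' fields

    with open("users.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        for doc in collection.find({}, {"_id": 0}):  # skip the ObjectId column
            writer.writerow(doc)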

Related

How to get data and send data to mLab

I have made a Python Flask app that I can use to manipulate a scoreboard. The app is hosted on heroku.com. The scoreboard is stored in a JSON file. At first I just kept the JSON file in the Git repository that Heroku makes for you, but then I found out that every couple of hours Heroku resets your app to your last commit, so any changes I had made to scoreboard.json would be lost.
So I came to the conclusion that I needed to use an actual database hosting site to host my scoreboard.json. I have chosen mLab for this.
What command sends a complete copy of a file stored in mLab back to the server so I can make changes to it, and what command then replaces the old file with the new one?
You're looking for a python mongodb driver. According to https://docs.mongodb.com/ecosystem/drivers/python/:
PyMongo is the recommended way to work with MongoDB from Python.
Check out the tutorial on using PyMongo, specifically inserting and getting documents.
That being said, you may want to consider splitting up the scoreboard data into smaller units. For example, having one document per player/team might be easier to manage.
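For the read-and-replace round trip itself, a rough sketch with PyMongo might look like the following; the connection URI, database name, and document shape are placeholders for whatever mLab gives you.

    # Rough sketch of the read/modify/replace cycle with PyMongo.
    # The URI, database, collection, and document layout are assumptions.
    from pymongo import MongoClient

    client = MongoClient("mongodb://user:password@ds012345.mlab.com:12345/scoresdb")
    scores = client["scoresdb"]["scoreboard"]

    doc = scores.find_one({"_id": "main"})    # pull the current scoreboard
    doc["players"]["alice"] = 42              # modify it in memory
    scores.replace_one({"_id": "main"}, doc)  # write the new version back

If you do split the scoreboard into one document per player as suggested above, the same round trip becomes a find_one/update_one per player instead of replacing one big document.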
Good luck!

How is data pre-processing with Python possible for Klipfolio?

I am not sure this is exactly the right place to ask, but I need any information about this.
I am going to create a dashboard with Klipfolio and I want to do the data pre-processing with Python and integrate it into Klipfolio, but unfortunately Klipfolio does not have any specific place to do this.
Has anyone used Klipfolio with data pre-processing done in Python?
While Klipfolio does not have any Python integrations, it does connect to various types of SQL databases. One workaround is to dump your processed data from Python into a SQL database and then connect that database to Klipfolio to build data sources for the visualization.
You can either connect to the database directly, or, if you are running Python on a server, you can use the REST/URL method in Klipfolio to connect to your Python code and integrate its output into your dashboard.
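A minimal sketch of the first workaround, assuming pandas and SQLAlchemy are available; the file, connection string, table name, and the pre-processing step itself are all illustrative.

    # Minimal sketch: pre-process data in Python, then write it to a SQL
    # database that Klipfolio can connect to as a data source.
    # The connection string assumes a reachable Postgres with a driver installed.
    import pandas as pd
    from sqlalchemy import create_engine

    df = pd.read_csv("raw_metrics.csv")                   # load raw data
    df["revenue_per_user"] = df["revenue"] / df["users"]  # example pre-processing

    engine = create_engine("postgresql://user:password@dbhost:5432/analytics")
    df.to_sql("processed_metrics", engine, if_exists="replace", index=False)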

Python ORM - save or read SQL data from/to files

I'm completely new to managing data using databases, so I hope my question is not too stupid, but I did not find anything related using the title keywords...
I want to setup a SQL database to store computation results; these are performed using a python library. My idea was to use a python ORM like SQLAlchemy or peewee to store the results to a database.
However, the computations are done by several people on many different machines, including some that are not directly connected to the internet: it is therefore impossible to simply use one common database.
What would be useful to me would be a way of saving the data in the ORM's format to be able to read it again directly once I transfer the data to a machine where the main database can be accessed.
To summarize, I want to do:
On the 1st machine: Python data -> ORM object -> ORM.fileformat
After transfer on a connected machine: ORM.fileformat -> ORM object -> SQL database
Would anyone know if existing ORMs offer that kind of feature?
Is there a reason why some of the machines cannot be connected to the internet?
If you really can't connect them, what I would do is set up a database and the Python app on each machine where data is collected/generated. Have each machine use the app to store results into its own local database; later you can create a dump of each database and import those results into one central database.
Not the ideal solution, but it will work.
OK, thanks to MAhsan's and Padraic's answers I was able to find out how this can be done: the CSV format is indeed easy to use for import/export from a database.
Here are examples for SQLAlchemy (import 1, import 2, and export) and peewee.
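For reference, here is a sketch of that CSV round trip with SQLAlchemy. The Result model, file names, and SQLite URLs are illustrative assumptions, not the exact code from the linked examples.

    # Sketch: export ORM rows to CSV on the offline machine, then import
    # the CSV into the main database on a connected machine.
    import csv
    from sqlalchemy import create_engine, Column, Integer, Float
    from sqlalchemy.orm import declarative_base, Session

    Base = declarative_base()

    class Result(Base):
        __tablename__ = "results"
        id = Column(Integer, primary_key=True)
        value = Column(Float)

    # On the offline machine: dump the local table to CSV.
    offline = create_engine("sqlite:///local_results.db")
    Base.metadata.create_all(offline)
    with Session(offline) as session, open("results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "value"])
        for row in session.query(Result):
            writer.writerow([row.id, row.value])

    # After transferring results.csv to a connected machine: load it
    # into the main database, merging on the primary key.
    main = create_engine("sqlite:///main_results.db")
    Base.metadata.create_all(main)
    with Session(main) as session, open("results.csv", newline="") as f:
        for rec in csv.DictReader(f):
            session.merge(Result(id=int(rec["id"]), value=float(rec["value"])))
        session.commit()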

MySQL to AppEngine

I'm from Brazil and study at FATEC (college located in Brazil).
I'm trying to learn about AppEngine.
Now, I'm trying to load a large database from MySQL into App Engine to perform some queries, but I don't know how I can do it. I did some testing with CSV files, but is there any way to perform a direct import from MySQL?
This database is from Pentaho BI Server (www.pentaho.com).
Thank you for your attention.
Regards,
Daniel Naito
It isn't clear from your tags, but the documented bulkloader is preferable to trying to hoist your CSV files directly onto the app server.
Advanced Bulk Loading by Nick Johnson is what you are looking for.
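As a flavor of what that article covers, the bulkloader is driven by a small Python loader definition like the sketch below. The Album kind and its properties are illustrative assumptions, and the exact appcfg.py upload_data flags vary by SDK version.

    # Sketch of a bulkloader definition for the old App Engine Python SDK.
    # The 'Album' kind and its CSV columns are hypothetical examples.
    import datetime
    from google.appengine.tools import bulkloader

    class AlbumLoader(bulkloader.Loader):
        def __init__(self):
            # Map CSV columns, in order, to datastore properties/converters.
            bulkloader.Loader.__init__(self, 'Album', [
                ('title', str),
                ('artist', str),
                ('publication_date',
                 lambda x: datetime.datetime.strptime(x, '%m/%d/%Y').date()),
                ('length_in_minutes', int),
            ])

    # The bulkloader tool discovers loaders through this module-level list.
    loaders = [AlbumLoader]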
If you need live synchronization between App Engine and MySQL, you should look into AppRocket. AppRocket seems to require that you have your data in App Engine before the first synchronization. It will also require some minor changes to your model.
If you're using Pentaho BI Server as your data source, why don't you consider using Pentaho Data Integration (the ETL tool) to move the data over? At the very least, PDI can automate any movement of data between your data source and any App Engine bulk loader tool (it can easily trigger any app with a shell step).

Need help in designing a phone book application in Python running on Google App Engine

Hi, I want some help in building a phone book application in Python and putting it on Google App Engine. I have a huge DB of 2 million users and their phone book contacts. I want to upload all that data from my servers directly onto the Google servers and then use a UI to retrieve the phone book contacts of each user based on his name.
I am using MS SQL Server 2005 as my DB.
Please help in putting together this application.
Your inputs are much appreciated.
For building your UI, App Engine has its own web framework called webapp that is pretty easy to get working. I've also had a good experience using the Jinja2 templating engine, which you can include in your source or package as a zip file (the example shows Django, but you can do the same type of thing for Jinja).
As for loading all of your data into the DataStore, you should take a look at the bulk uploader documentation.
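To make that concrete, here is a minimal sketch of a webapp handler with the old Python SDK that queries a hypothetical Contact model by owner name; the model, route, and field names are assumptions.

    # Minimal sketch: a webapp handler that looks up a user's contacts.
    from google.appengine.ext import db, webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class Contact(db.Model):
        owner = db.StringProperty(required=True)  # the user who owns the entry
        name = db.StringProperty()
        phone = db.StringProperty()

    class PhonebookHandler(webapp.RequestHandler):
        def get(self):
            owner = self.request.get('owner')
            contacts = Contact.all().filter('owner =', owner).fetch(100)
            for c in contacts:
                self.response.out.write('%s: %s\n' % (c.name, c.phone))

    application = webapp.WSGIApplication([('/phonebook', PhonebookHandler)])

    if __name__ == '__main__':
        run_wsgi_app(application)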
I think you're going to need to be more specific as to what problem you're having. As far as bulk loading goes, there's lots of bulkloader documentation around; or are you asking about model design? If so, we need to know more about how you plan to search for users. Do you need partial string matches? Sorting? Fuzzy matching?
