I have a lot of data in an Excel sheet, which I read in Python with xlrd, and I am now outputting all of that data from Python. My question is: how do I take the data I am outputting through Python and upload it to MongoDB? I understand that pymongo must be used, but I am not quite sure how to do it. Any help is greatly appreciated.
Let's assume you've read the tutorials but still don't get it...
You'll need to convert your xlrd data into a list of dictionaries, one dictionary for each row in your spreadsheet. Here's a clue: Python Creating Dictionary from excel data
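For example, a minimal sketch of that conversion with xlrd, assuming the workbook is called data.xls and the first row holds the column names:
import xlrd

book = xlrd.open_workbook('data.xls')  # hypothetical file name
sheet = book.sheet_by_index(0)
headers = sheet.row_values(0)  # first row is assumed to hold the column names
list_of_rows = [dict(zip(headers, sheet.row_values(r))) for r in range(1, sheet.nrows)]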
Once you have the list of dictionaries/rows, make sure you have MongoDB running on your machine, then:
from pymongo import MongoClient
db = MongoClient().mydb  # connects to MongoDB on localhost; the 'mydb' database is created on first write
for row_dict in list_of_rows:
    db.rows.insert_one(row_dict)  # inserts each row into a collection called "rows"
Suppose I have a list of APIs like the following...
https://developer.genesys.cloud/devapps/api-explorer#get-api-v2-alerting-alerts-active
https://developer.genesys.cloud/devapps/api-explorer#get-api-v2-alerting-interactionstats-rules
https://developer.genesys.cloud/devapps/api-explorer#get-api-v2-analytics-conversations-details
I want to read these APIs one by one and store the data in Snowflake using pandas and SQLAlchemy.
Do you have any ideas for reading the APIs one by one in my Python script?
- Read the APIs one by one from a file.
- Load the data into a Snowflake table directly.
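A minimal sketch of that loop, assuming the API URLs sit one per line in apis.txt, an OAuth token is already available, and the snowflake-sqlalchemy package is installed; the credentials, target table name, and the 'entities' key in the responses are also assumptions:
import requests
import pandas as pd
from sqlalchemy import create_engine
from snowflake.sqlalchemy import URL

engine = create_engine(URL(account='my_account', user='my_user', password='my_password',
                           database='my_db', schema='my_schema', warehouse='my_wh'))  # hypothetical credentials

with open('apis.txt') as f:
    api_urls = [line.strip() for line in f if line.strip()]

headers = {'Authorization': 'Bearer MY_TOKEN'}  # assumed auth; Genesys Cloud normally requires OAuth
for url in api_urls:
    payload = requests.get(url, headers=headers).json()
    df = pd.json_normalize(payload.get('entities', []))  # 'entities' is an assumption about the response shape
    if not df.empty:
        df.to_sql('api_data', engine, if_exists='append', index=False)  # hypothetical target table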
After running the MongoDB Docker image, I created a database called MovieFlix and some collections with items inside the database.
What I want is a way to store all the data from the MovieFlix db in a JSON file, so it is saved for later use with docker-compose.
Should I do it with Python code using pymongo, or is there a simpler way?
MongoDB has command-line tools for exporting data: mongoexport writes a collection to JSON or CSV, and mongodump produces a binary dump of a whole database. Hope it helps!
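If you would rather do it from Python with pymongo, here is a minimal sketch, assuming the containerised MongoDB is reachable on localhost:27017 and the output file names are arbitrary:
from pymongo import MongoClient
from bson.json_util import dumps  # serialises ObjectId and other BSON types

client = MongoClient('localhost', 27017)
db = client.MovieFlix

for name in db.list_collection_names():
    with open(f'{name}.json', 'w') as f:
        f.write(dumps(list(db[name].find())))  # one JSON file per collection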
I need to import JSON data from an API into a MySQL database using Python. I can get the JSON data in my Python script, but I have no idea how to insert this data into a MySQL database.
Finally did it. I saved the JSON file locally first, then parsed it by key values in a for loop, and lastly ran a query to insert the values into the MySQL table.
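A minimal sketch of that flow with mysql-connector-python; the endpoint, credentials, table, and JSON keys are all hypothetical:
import requests
import mysql.connector

items = requests.get('https://api.example.com/items').json()  # hypothetical endpoint returning a list of objects

conn = mysql.connector.connect(host='localhost', user='me', password='secret', database='mydb')
cur = conn.cursor()
rows = [(item['id'], item['name']) for item in items]  # pick out the key values you need
cur.executemany('INSERT INTO items (id, name) VALUES (%s, %s)', rows)
conn.commit()
conn.close()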
Another option is to serialize the JSON to a string, insert the string into the DB, and deserialize it when you want to use it. I use this method, but I don't know if it is optimal.
Working With JSON Data in Python
I am new to the world of Python and have some problems loading data from a CSV file into PostgreSQL.
I have successfully connected to my database from Python and created my table. But when I load the data from the CSV file into the created table, I get nothing in the table, whether I use an INSERT statement or COPY and then commit.
cur.execute('''COPY my_schema.sheet(id, length, width, change_d, change_h,change_t, change_a, name)
FROM '/private/tmp/data.csv' DELIMITER ',' CSV HEADER;''')
dbase.commit()
I am not sure what I am missing. Can anyone please help with this, or advise a better way to load CSV data using a Python script?
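One common cause is that COPY ... FROM 'file' runs on the database server, so the file must exist there and the role needs extra privileges; psycopg2's copy_expert streams the local file from the client instead. A minimal sketch, assuming the same table and CSV path and a hypothetical connection string:
import psycopg2

conn = psycopg2.connect('dbname=mydb user=me password=secret host=localhost')  # hypothetical connection details
cur = conn.cursor()
with open('/private/tmp/data.csv') as f:
    cur.copy_expert(
        """COPY my_schema.sheet(id, length, width, change_d, change_h, change_t, change_a, name)
           FROM STDIN WITH (FORMAT csv, HEADER true, DELIMITER ',')""",
        f)
conn.commit()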
So I am building a database for a larger program and do not have much experience in this area of coding (mostly embedded-system programming). My task is to import a large Excel file into Python. It is large, so I'm assuming I must convert it to a CSV, then truncate it by parsing and partitioning before importing, to avoid my computer crashing. Once the file is imported, I must be able to extract/search specific information based on the column titles. There are other user-interactive aspects that are simply string based, so not very difficult. As for the rest, I am getting the picture but would like a more efficient and specific design. Can anyone offer me guidance on this?
An Excel or CSV file can be read into Python using pandas. The data is stored as rows and columns in a structure called a DataFrame. To import data into such a structure, import pandas first and then read the CSV or Excel file into a DataFrame.
import pandas as pd
df1 = pd.read_csv('excelfilename.csv')
This DataFrame structure is similar to a table: you can join different DataFrames, group data, and so on.
I am not sure if this is what you need, let me know if you need any further clarifications.
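Since the question mentions a file too large to load at once, pandas can also read it in chunks and keep only the columns and rows of interest; a minimal sketch with hypothetical column names and filter value:
import pandas as pd

matches = []
for chunk in pd.read_csv('excelfilename.csv', chunksize=100000, usecols=['col_a', 'col_b', 'col_c']):
    matches.append(chunk[chunk['col_a'] == 'some_value'])  # search by a column title
result = pd.concat(matches, ignore_index=True)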
I would recommend actually loading it into a proper database such as MariaDB or PostgreSQL. This will allow you to access the data from other applications, and it takes the load of writing a database off of you. You can then use an ORM if you would like to interact with the data, or simply use plain SQL via Python.
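For example, pandas and SQLAlchemy can write the DataFrame straight into PostgreSQL; a minimal sketch, assuming a local server, the psycopg2 driver, and hypothetical credentials and table name:
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('postgresql+psycopg2://me:secret@localhost:5432/mydb')  # hypothetical connection URL
df = pd.read_csv('excelfilename.csv')
df.to_sql('sheet_data', engine, if_exists='replace', index=False)  # queryable afterwards with plain SQL or an ORM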
import the libraries and read the CSV
import pandas as pd
import sqlite3
df = pd.read_csv('sample.csv')
connect to a database
conn = sqlite3.connect("Any_Database_Name.db")  # if the db does not exist, this creates an Any_Database_Name.db file in the current directory
store your table in the database:
df.to_sql('Some_Table_Name', conn)
read a SQL Query out of your database and into a pandas dataframe
sql_string = 'SELECT * FROM Some_Table_Name'
df = pd.read_sql(sql_string, conn)