This question already has answers here:
Python: use mysqldb to import a MySQL table as a dictionary?
(5 answers)
Closed 9 years ago.
After executing a query statement on a MySQL database connection, I perform:
rows = cursor.fetchall()
This gives me a list of tuples, one per row. I'd like a list of dictionaries instead, where each dictionary takes its keys from the requested column names of my table and its values from the corresponding row.
How do I do this?
Well, you forgot to mention which mysql library you're using.
If you're using oursql (which I recommend; it is certainly the best one), use oursql's DictCursor. Example:
import oursql

conn = oursql.connect(...)
curs = conn.cursor(oursql.DictCursor)
If you're using MySQLdb (why?), use MySQLdb's DictCursor. Example:
import MySQLdb
import MySQLdb.cursors

conn = MySQLdb.connect(..., cursorclass=MySQLdb.cursors.DictCursor)
curs = conn.cursor()
Either way you get a cursor that returns a dict for each row. Just make sure your query doesn't return duplicate column names, since the dictionary keys would collide.
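If you'd rather not rely on a library-specific cursor class, a minimal, library-agnostic sketch is below; it builds the dictionaries by hand from cursor.description, the standard DB-API attribute whose entries start with the column name.

# A minimal, library-agnostic sketch: build the dicts yourself from
# cursor.description (each entry's first element is the column name
# of the corresponding result column).
columns = [desc[0] for desc in cursor.description]
rows = [dict(zip(columns, row)) for row in cursor.fetchall()]
# rows is now a list of dicts, e.g. [{'id': 1, 'name': 'alice'}, ...]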
This question already has an answer here:
Python sqlite3 parameterized drop table
(1 answer)
Closed 10 months ago.
def refreshDatabase(table):
    c.execute("DROP TABLE ?", (table,))
    conn.commit()
    createNewTable(table)
Hey, how can I drop the table that is passed in as a parameter when calling the function? It doesn't seem to work with this syntax. Thanks in advance!
Take a look here
You can't substitute a table name on the SQL side (placeholders only work for values, not identifiers), so you have to build the statement in Python, which of course is not injection-safe:
def refreshDatabase(table):
    c.execute(f"DROP TABLE {table}")
    conn.commit()
    createNewTable(table)
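If you do go the string-formatting route, it helps to be defensive about it. Below is a hedged sketch (the table names are hypothetical) that checks the argument against a fixed allowlist before interpolating it, since placeholders can't stand in for identifiers.

ALLOWED_TABLES = {"scores", "players"}  # hypothetical allowlist of table names

def refreshDatabase(table):
    # Refuse anything we don't recognise before it reaches the SQL string.
    if table not in ALLOWED_TABLES:
        raise ValueError(f"unexpected table name: {table!r}")
    c.execute(f'DROP TABLE "{table}"')
    conn.commit()
    createNewTable(table)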
When using the SQLite command-line shell, I can print a list of all the tables in a database to the console with .tables. Is there an equivalent one-liner in Python using the sqlite3 module, without using Pandas?
Based on the answers at List of tables, db schema, dump etc using the Python sqlite3 API, it appears you have to create a cursor first and then run a SQL query that returns the list of tables (actually a list of one-element unicode tuples that has to be cleaned up for a pretty print to the console). That's five lines, versus SQLite's one.
Or maybe a related question is: why do I need to open a cursor to get this information in Python sqlite3?
import sqlite3

con = sqlite3.connect(db)
cursor = con.cursor()
cursor.execute("SELECT name FROM sqlite_master WHERE type='table'")
tables = cursor.fetchall()            # list of one-element tuples
tables = [str(t[0]) for t in tables]  # unpack to plain strings
print(tables)
It's not pretty, but since you're only using the variables once, you can simply chain all the calls.
print([str(t[0]) for t in sqlite3.connect(db).cursor().execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()])
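As for the "why a cursor" part: sqlite3's Connection.execute() is a shortcut that creates the cursor for you and returns it, so you never have to touch it explicitly. A slightly shorter variant using that shortcut (same query, same db variable) might look like this:

import sqlite3

con = sqlite3.connect(db)
# Connection.execute() makes the cursor behind the scenes and returns it,
# and iterating the cursor yields the result rows directly.
print([name for (name,) in con.execute("SELECT name FROM sqlite_master WHERE type='table'")])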
This question already has answers here:
Sqlite insert query not working with python?
(2 answers)
Closed 6 years ago.
I'm new to python and SQLite, so I apologize if this is a dumb question. I've written the code below to open up a database and delete the data in the STAGING_LIDs table. The script runs, but when I check the DB, the data is still there. Am I doing something wrong?
import sqlite3
import csv
conn = sqlite3.connect('C:\\SQLite\\Budget_Dev.db')
cur = conn.cursor()
#delete all table data
cur.execute("DELETE FROM STAGING_LIDs;")
I'm using bernie's answer in this question (the accepted answer) as a template.
I figured it out. I needed to add a line:
conn.commit()
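For what it's worth, a minimal sketch of the fixed script is below; it uses the connection as a context manager, which in sqlite3 commits automatically if the block succeeds and rolls back if it raises.

import sqlite3

conn = sqlite3.connect('C:\\SQLite\\Budget_Dev.db')
with conn:  # commits on success, rolls back on an exception
    conn.execute("DELETE FROM STAGING_LIDs;")
conn.close()  # the context manager does not close the connection itself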
This question already has an answer here:
How to get variable length placeholders in a Python call to SQLite3
(1 answer)
Closed 7 years ago.
When there are only a few parameters, I can do this:
cur.execute("INSERT INTO table (col1, col2) VALUES (?, ?)", (1, 2))
but what if I have like 50 parameters? I don't want to write 50 question marks in that case. Is there a way to do something like the following?
cur.execute(INSERT INTO table (listOfColumns) VALUES tuple(['?']*len(listOfColumns)), (listOfValues))
Yes, you just build the SQL statement dynamically:
sql = 'INSERT INTO table ({}) VALUES ({})'.format(
    ','.join(listOfColumns),
    ','.join('?' * len(listOfColumns)),
)
cur.execute(sql, listOfValues)
Note that this assumes the list of columns was generated locally and not tainted by user input. If the list of columns could be tainted you need to check it pretty carefully to ensure that it only contains valid column names before inserting it into SQL code.
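For illustration, here is a self-contained sketch with hypothetical column names against an in-memory SQLite database, showing the generated SQL and the placeholder count staying in sync with the data:

import sqlite3

listOfColumns = ["col1", "col2", "col3"]  # hypothetical columns
listOfValues = [1, 2, 3]                  # one value per column

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE demo (col1, col2, col3)")

sql = "INSERT INTO demo ({}) VALUES ({})".format(
    ",".join(listOfColumns),
    ",".join("?" * len(listOfColumns)),
)
conn.execute(sql, listOfValues)
conn.commit()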