How to retrieve table structure from a sqlite3 (.db) database using Python

I am trying to get the table structure from a sqlite3 (.db) database using Python. For that I am using the code below, but it gives a syntax error. Any help?
import sqlite3
conn = sqlite3.connect('Db-IMDB.db')
cursor = conn.cursor()
cursor.execute('DESCRIBE Movie')

DESCRIBE is MySQL syntax; it does not exist in sqlite3. With the Python sqlite3 API you can retrieve the list of tables with this statement:
table_info = cursor.execute("SELECT name FROM sqlite_master WHERE type='table';").fetchall()
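To get the actual structure (the columns) of a specific table, SQLite provides PRAGMA table_info. A minimal sketch using the connection from the question (Movie is the table name from the original code):
# each row is (cid, name, type, notnull, dflt_value, pk)
columns = cursor.execute("PRAGMA table_info('Movie')").fetchall()
for col in columns:
    print(col)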

Related

PostgreSQL Error while executing sql command in Python

I've been trying to get some data from my DB using the code below, but the code is not working. Is there a mistake I made in the code, and if so, how can I fix it?
NOTE: the code below is from a plain script, not a Django or Flask web app.
import psycopg2

def db():
    conn = psycopg2.connect(
        "dbname=mydb user=postgres password=****** host=*.*.*.*")
    cur = conn.cursor()
    cur.execute("""SELECT * FROM MddPublisher""")
    query_results = cur.fetchall()
    print(query_results)

db()
ERROR: psycopg2.errors.UndefinedTable: relation "mddpublisher" does not exist LINE 1: SELECT * FROM MddPublisher
Additionally, I want to show the code below to prove that the connection is OK. The problem is that I can't receive data from my DB whenever I try to execute a SELECT command through Python.
def print_tables():
    conn = psycopg2.connect(
        "dbname=mydb user=postgres password=***** host=*.*.*.*.*")
    cur = conn.cursor()
    cur.execute("""SELECT table_name FROM information_schema.tables
                   WHERE table_schema = 'public'""")
    for table in cur.fetchall():
        print(table)

print_tables()
OUTPUT:
('MddPublisher',)
This is probably an issue with case sensitivity. PostgreSQL folds unquoted identifiers to lower case; when an identifier is written inside double quotes, it keeps its case. So, to access a table named MddPublisher you must write it as "MddPublisher".
All the gory details are in Section 4.1.1, Identifiers and Key Words, in the PostgreSQL 14 docs.
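Applied to the script from the question, only the table name changes; a minimal sketch:
cur.execute('SELECT * FROM "MddPublisher"')
query_results = cur.fetchall()
print(query_results)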

How to Make SQL Table Show on PyCharm

I have a Postgres DB running on Docker. I am able to access this DB via my SQL client DBeaver, and when I run SELECT statements I see the expected results.
I would like to be able to query this DB via a Python script, and after some searching found the psycopg2 package.
When I run the code below it 'looks' like it's successful; the conn and cursor objects appear as variables.
import pandas as pd
import psycopg2

# connect to db
conn = psycopg2.connect(
    host="localhost",
    database="postgres",
    user="postgres",
    password="example")

# create a cursor
cur = conn.cursor()
However, when trying to query the DB using cur.execute(), the variable ex_data is None. This exact same query via my SQL client returns a table of data.
ex_data = cur.execute('select * from myschema.blah limit 10;')
How can I query my DB via Python using psycopg2? The desired result would be a data frame with the result set from the query string above.
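A minimal sketch of the usual approach, reusing the conn and cur from above: psycopg2's cursor.execute() always returns None, so the rows have to be fetched from the cursor afterwards and can then be wrapped in a DataFrame, taking the column names from cur.description:
cur.execute('select * from myschema.blah limit 10;')
rows = cur.fetchall()  # list of tuples; execute() itself returns None

# build the desired data frame, taking column names from the cursor metadata
df = pd.DataFrame(rows, columns=[col[0] for col in cur.description])
print(df)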

Python MySQLdb doesn't return all the data from the database

I'm using the Python package MySQLdb to fetch data from a MySQL database. However, I notice that I can't fetch the entirety of the data.
import MySQLdb
db = MySQLdb.connect(host=host, user=user, passwd=password)
cur = db.cursor()
query = "SELECT count(*) FROM table"
cur.execute(query)
This returns a number less than what I get if I execute the exact same query in MySQL Workbench. I've noticed that the data it doesn't return is the data that was inserted into the database most recently. Where am I going wrong?
You are not committing the inserted rows on the other connection.
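A rough sketch of what the inserting side needs to look like (some_table and some_col are placeholder names):
# on the connection that performs the INSERTs
write_db = MySQLdb.connect(host=host, user=user, passwd=password)
write_cur = write_db.cursor()
write_cur.execute("INSERT INTO some_table (some_col) VALUES (%s)", ("value",))
write_db.commit()  # without this, other connections keep seeing the old row count

# the reading connection may also need db.commit() (or a fresh connection)
# so that its REPEATABLE READ snapshot is refreshed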

pETL fails to load data to MySQL DB

I'm using the petl package to build an ETL pipeline from Python 2.7 to MySQL 5.6.
My db connector is MySQLdb (mysql-python).
The following code fails to execute:
import MySQLdb as mdb
import petl as etl

con = mdb.connect(host='127.0.0.1', user='<someuser>', passwd='<somepass>')
cur = con.cursor()  # get the cursor

cur.execute('DROP SCHEMA IF EXISTS petltest')
cur.execute('CREATE SCHEMA petltest')
cur.execute('USE petltest')

dat = [{'id': 1, 'name': 'One'},
       {'id': 2, 'name': 'Two'},
       {'id': 3, 'name': 'Three'}]
table = etl.fromdicts(dat)  # petl object

etl.todb(table, cur, 'table', schema='petltest', create=True)
The error code is:
ProgrammingError: (1064, 'You have an error in your SQL syntax; check
the manual that corresponds to your MySQL server version for the right
syntax to use near \'"table" (\n\tid INTEGER NOT NULL, \n\tname
VARCHAR(5) NOT NULL\n)\' at line 1')
This error also occurs when trying to create the table separately or when running petl.appenddb.
How can I fix it / overcome the issue?
Thanks
Apparently the problem was the quoting style that petl uses: it emits ANSI double-quoted identifiers (the "table" visible in the error message), which MySQL does not accept by default.
If you run:
cur.execute('SET SQL_MODE=ANSI_QUOTES')
before the petl SQL statements (petl.todb()), it executes fine.
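In the script from the question, that line goes right after the cursor is created; a minimal sketch (only the SET SQL_MODE statement is new):
cur = con.cursor()
cur.execute('SET SQL_MODE=ANSI_QUOTES')  # make MySQL accept "double-quoted" identifiers
# ... the rest of the script stays the same ...
etl.todb(table, cur, 'table', schema='petltest', create=True)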

How to determine fields in a table using SQLAlchemy?

Is it possible to determine the fields available in a table (MySQL DB) programmatically at runtime using SQLAlchemy or any other Python library? Any help on this would be great.
Thanks.
Reflection can do this.
Reflect the whole database at once using SQLAlchemy:
from sqlalchemy import MetaData

meta = MetaData()
meta.reflect(bind=someengine)  # someengine is an existing Engine, e.g. from create_engine()

users_table = meta.tables['users']
addresses_table = meta.tables['addresses']

# fields of addresses_table
fields = addresses_table.columns.keys()
See more information at http://docs.sqlalchemy.org/en/rel_0_7/core/schema.html#reflecting-database-objects
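As a follow-up example, once the metadata has been reflected you can walk every table and print its column names (a small sketch reusing the meta object from above):
for name, table in meta.tables.items():
    print("{}: {}".format(name, table.columns.keys()))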
You can run SHOW COLUMNS FROM tablename (or DESCRIBE tablename) in MySQL to get the columns of a table.
This is part of the standard DB API specification for Python (PEP 249), namely the description attribute on cursors, so you do not need SQLAlchemy.
For example, using PyMySQL (http://www.pymysql.org/), where user is your table:
import pymysql
conn = pymysql.connect(host='127.0.0.1', port=3306, user='root', passwd='', db='mysql')
cur = conn.cursor()
cur.execute("SELECT * FROM user")
print cur.description
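Per PEP 249, each entry in cur.description is a sequence whose first item is the column name, so the field names can be pulled out like this (a small sketch continuing the snippet above):
column_names = [col[0] for col in cur.description]
print column_names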
