I have 700 tables in a test.db file and was wondering how I can loop through all these tables and return the table name if the columnA value is "-"?
connection.execute('SELECT * FROM "all_tables" WHERE "columnA" = "-"')
How do I put all 700 tables in all_tables?
To continue on a theme:
import sqlite3

try:
    conn = sqlite3.connect('/home/rolf/my.db')
except sqlite3.Error as e:
    print('Db Not found', str(e))

db_list = []
mycursor = conn.cursor()
for db_name in mycursor.execute("SELECT name FROM sqlite_master WHERE type = 'table'"):
    db_list.append(db_name)

for x in db_list:
    print("Searching", x[0])
    try:
        # quote the table name and compare columnA against the literal "-"
        mycursor.execute('SELECT * FROM "' + x[0] + '" WHERE "columnA" = "-"')
        stats = mycursor.fetchall()
        for stat in stats:
            print(stat, "found in", x)
    except sqlite3.Error as e:
        continue
conn.close()
SQLite
Get all table names:
SELECT name FROM sqlite_master WHERE type='table' ORDER BY name;
Loop over them:
for table in tables:
    connection.execute('SELECT * FROM "' + table + '" WHERE "columnA" = "-"')
    ...
Or build one SQL request with UNION (note that every SELECT in a UNION must return the same number of columns):
sql = []
for table in tables:
    sql.append('SELECT * FROM "' + table + '" WHERE "columnA" = "-"')
query = ' UNION '.join(sql)
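To also keep track of which table each matching row came from (a sketch, again assuming every table has the same columns, which UNION requires), you can select the table name as a literal column:
sql = []
for table in tables:
    # embed the table name as a constant column so the origin survives the UNION
    sql.append('SELECT \'' + table + '\' AS source_table, * '
               'FROM "' + table + '" WHERE "columnA" = "-"')
query = ' UNION ALL '.join(sql)
for row in connection.execute(query):
    print(row[0])  # the table this row came from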
You could query sqlite_master to get all the table names within your database: SELECT name FROM sqlite_master WHERE type = 'table'
sqlite_master can be thought of as a table that contains information about your database (metadata).
A quick but most likely inefficient way (it will run 700 queries with 700 separate result sets) is to get the list of table names, loop through those tables, and return the data where columnA = "-":
for row in connection.execute("SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name").fetchall():
    for result in connection.execute('SELECT * FROM "' + row[0] + '" WHERE "columnA" = "-"').fetchall():
        # do something with result
Note: the above code is untested but gives you an idea of how to approach this.
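If all you actually need is the list of table names that contain at least one matching row, a variation on the above (also untested, and assuming every table really has a columnA column) avoids pulling back full result sets:
matching_tables = []
for (name,) in connection.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name").fetchall():
    # LIMIT 1 is enough to know whether the table has any matching row
    hit = connection.execute('SELECT 1 FROM "' + name + '" WHERE "columnA" = "-" LIMIT 1').fetchone()
    if hit is not None:
        matching_tables.append(name)
print(matching_tables)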
Related
I have a Python function in which I want to check whether a PostgreSQL table exists or not (True/False).
It does not return True, even though when I am logged into the same DB and run the same check in pgAdmin 4 I get True.
Am I missing a commit? I tried adding a commit(), to no effect.
def __exists_table(self, table_name):
    cursor = self.__get_a_cursor()
    try:
        string_to_execute = "SELECT EXISTS(SELECT 1 FROM pg_catalog.pg_tables WHERE schemaname = 'public' AND tablename = '" + table_name + "');"
        cursor.execute(string_to_execute)
        query_results = cursor.fetchall()
        if len(query_results) > 1:
            print("__exists_data got back multiple results, using the first")
        query_results = query_results[0][0]
        return query_results
    except Exception as err:
        print("Exception on __exists_table: " + str(err))
        raise err
    finally:
        cursor.close()
Your code appears to work as written.
I have a database that contains a single table, table1:
$ psql -h localhost
psql (11.6, server 12.1 (Debian 12.1-1.pgdg100+1))
Type "help" for help.
lars=> \d
List of relations
Schema | Name | Type | Owner
--------+--------+-------+-------
public | table1 | table | lars
(1 row)
If I wrap your code up in a runnable script, like this:
import psycopg2

class DBTest:
    def __init__(self):
        self.db = psycopg2.connect('host=localhost dbname=lars password=secret')

    def __get_a_cursor(self):
        return self.db.cursor()

    def __exists_table(self, table_name):
        cursor = self.__get_a_cursor()
        try:
            string_to_execute = "SELECT EXISTS(SELECT 1 FROM pg_catalog.pg_tables WHERE schemaname = 'public' AND tablename = '" + table_name + "');"
            cursor.execute(string_to_execute)
            query_results = cursor.fetchall()
            if len(query_results) > 1:
                print("__exists_data got back multiple results, using the first")
            query_results = query_results[0][0]
            return query_results
        except Exception as err:
            print("Exception on __exists_table: " + str(err))
            raise err
        finally:
            cursor.close()

    def test_exists_table(self, table_name):
        return self.__exists_table(table_name)

db = DBTest()
for table_name in ['table1', 'table2']:
    if db.test_exists_table(table_name):
        print(f'{table_name} exists')
    else:
        print(f'{table_name} does not exist')
Running it produces the output I would expect:
table1 exists
table2 does not exist
Having said that, I would make the following change to your code. Rather than building your query string like this:
string_to_execute = """SELECT EXISTS(
    SELECT 1 FROM pg_catalog.pg_tables
    WHERE schemaname = 'public'
    AND tablename = '""" + table_name + "');"
cursor.execute(string_to_execute)
I would let your database driver take care of parameter substitution for you:
string_to_execute = """SELECT EXISTS(
    SELECT 1 FROM pg_catalog.pg_tables
    WHERE schemaname = 'public'
    AND tablename = %s
)"""
cursor.execute(string_to_execute, (table_name,))
This is easier to read and safer, since the driver will properly quote any special characters in the parameter.
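As a further, optional simplification that is not part of your original code: PostgreSQL 9.4 and later expose to_regclass(), which returns NULL when the relation does not exist, so the existence check could be reduced to something like:
# returns True/False; assumes the table lives in the public schema
cursor.execute("SELECT to_regclass(%s) IS NOT NULL", ('public.' + table_name,))
exists = cursor.fetchone()[0]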
This code works, but it is very slow. I would also like to use the sqlalchemy module, because the rest of the script uses it instead of mysql.connector. Is there any advantage to using sqlalchemy, or should I continue with this approach?
for emp_id in mylist:
    try:
        connection = mysql.connector.connect(host='x.x.x.x', port='3306', database='xxx', user='root', password='xxx')
        cursor = connection.cursor(prepared=True)
        sql_fetch_blob_query = """SELECT col1, col2, Photo from tbl where ProfileID = %s"""
        cursor.execute(sql_fetch_blob_query, (emp_id, ))
        record = cursor.fetchall()
        for row in record:
            image = row[2]
            file_name = 'myimages4' + '/' + str(row[0]) + '_' + str(row[1]) + '/' + 'simage' + str(emp_id) + '.jpg'
            write_file(image, file_name)
    except mysql.connector.Error as error:
        connection.rollback()
        print("Failed to read BLOB data from MySQL table {}".format(error))
    finally:
        if connection.is_connected():
            cursor.close()
            connection.close()
Do you really need to set up a new MySQL connection and obtain a cursor on each iteration? If not, opening them once at the beginning will speed up your code considerably.
connection = mysql.connector.connect(host='x.x.x.x', port='3306', database='xxx', user='root', password='xxx', charset="utf8")
cursor = connection.cursor(prepared=True)

for emp_id in mylist:
    try:
        sql_fetch_blob_query = """SELECT col1, col2, Photo from tbl where ProfileID = %s"""
        cursor.execute(sql_fetch_blob_query, (emp_id, ))
        record = cursor.fetchall()
        for row in record:
            image = row[2]
            file_name = 'myimages4' + '/' + str(row[0]) + '_' + str(row[1]) + '/' + 'simage' + str(emp_id) + '.jpg'
            write_file(image, file_name)
    except mysql.connector.Error as error:
        connection.rollback()
        print("Failed to read BLOB data from MySQL table {}".format(error))

# close the cursor and connection once, after all ids have been processed
if connection.is_connected():
    cursor.close()
    connection.close()
UPD:
Actually, you don't even need to make N queries to the database, because all the data can be obtained in a single query with a WHERE ProfileID IN (.., ..) clause. Take a look at this small snippet, which solves a nearly identical task:
transaction_ids = [c['transaction_id'] for c in checkouts]
format_strings = ','.join(['%s'] * len(transaction_ids))
dm_cursor.execute("SELECT ac_transaction_id, status FROM transactions_mapping WHERE ac_transaction_id IN (%s)" % format_strings, tuple(transaction_ids))
payments = dm_cursor.fetchall()
Please use it to solve your problem.
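As a sketch of how that could look for your query (table and column names are taken from your question and are otherwise unverified):
placeholders = ','.join(['%s'] * len(mylist))
query = ("SELECT ProfileID, col1, col2, Photo FROM tbl "
         "WHERE ProfileID IN (%s)" % placeholders)
cursor.execute(query, tuple(mylist))
for emp_id, col1, col2, image in cursor.fetchall():
    # one row per matching profile; build the file path the same way as before
    file_name = 'myimages4/' + str(col1) + '_' + str(col2) + '/simage' + str(emp_id) + '.jpg'
    write_file(image, file_name)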
I am writing a function that will retrieve data from an sqlite table based on the parameters the user provides. This is the function so far:
import sqlite3

def database_retrieve(db_file, id):
    try:
        conn = sqlite3.connect(db_file)
        with conn:
            sql_command = "SELECT * FROM my_table WHERE id = " + id
            cur = conn.cursor()
            cur.execute(sql_command)
            result = cur.fetchall()
            return result
    except Exception as e:
        print(e)

db_file = 'testdb.db'
print(database_retrieve(db_file, 'subject1'))
This gives me the following error
no such column: subject1
None
When I add subject1, which is an entry under the id column in my_table, directly to the sql command like this
sql_command = "SELECT * FROM my_table WHERE id = 'subject1'"
it works fine and prints all the data.
I am new to sqlite3. Please help. Thanks in advance
These are the links I used to get this far:
Python sqlite3 string variable in execute
https://www.dummies.com/programming/databases/how-to-retrieve-data-from-specific-rows-in-mysql-databases/
When you do this
sql_command = "SELECT * FROM my_table WHERE id = "+id
The value of sql_command is
"SELECT * FROM my_table WHERE id = subject1"
As you can see, subject1 is not in quotes, so sqlite thinks it is a column name; that is why you see that error.
Instead, do this
sql_command = "SELECT * FROM my_table WHERE id = ?"
cur.execute(sql_command, [id])
? acts as a placeholder for the variable id.
The official sqlite3 documentation mentions a few other methods:
https://docs.python.org/2/library/sqlite3.html
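For example, the sqlite3 module also supports named placeholders, which can read more clearly when there are several parameters; a minimal sketch:
# named-placeholder style; the :id key in the dict matches :id in the SQL
sql_command = "SELECT * FROM my_table WHERE id = :id"
cur.execute(sql_command, {"id": id})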
The sql_command string being generated should look something like this (a formatted string):
sql_command = "SELECT * FROM my_table WHERE id = %s AND name = %s" % (212212, 'shashank')
I am querying a mysql database version 5.6.13 using python 2.7.
This works:
whichCustomer = str(1934)
qry = ("SELECT * FROM customers WHERE customerid = " + whichCustomer)
cursor.execute(qry)
The query also works:
qry = ("SELECT * FROM customers WHERE customerid = 1934")
cursor.execute(qry)
BUT, when I try to use string substitution the query fails:
whichCustomer = 1934
qry = ("SELECT * FROM customers WHERE customerid = %d")
cursor.execute(qry, (whichCustomer))
Is there something I am missing? The full try/execute code follows:
try:
    import mysql.connector
    print 'Module mysql initialized'
    print 'Attempting connection to cheer database'
    cnx = mysql.connector.connect(user='notsure',
                                  password='notsure',
                                  host='localhost',
                                  database='notreal')
    cursor = cnx.cursor()
    whichCustomer = str(1934)
    qry = ("SELECT * FROM customers WHERE customerid = " + whichCustomer)
    cursor.execute(qry)
    recx = cursor.fetchone()
    print recx[1]
    cnx.close()
    print 'Successful connection to notreal database'
except:
    print 'Error initialzing mysql databsasr'
You need to use %s for SQL parameters, and the second argument must be a sequence, like a tuple:
whichCustomer = 1934
qry = ("SELECT * FROM customers WHERE customerid = %s")
cursor.execute(qry, (whichCustomer,))
Note the comma in the second parameter; without a comma, that parameter is not a tuple and just the 1934 integer value is passed in instead.
Although both Python string interpolation placeholders and SQL parameters use closely related syntax, they are not the same thing. As such, SQL parameters for positional values are always expressed as %s regardless of the type.
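If you prefer named parameters, MySQL Connector/Python also accepts the %(name)s pyformat style together with a dictionary; a small sketch:
# the %(cust_id)s placeholder is filled from the dict key of the same name
qry = "SELECT * FROM customers WHERE customerid = %(cust_id)s"
cursor.execute(qry, {'cust_id': 1934})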
Can someone please explain how I can get the tables in the current database?
I am using PostgreSQL 8.4 with psycopg2.
This did the trick for me:
cursor.execute("""SELECT table_name FROM information_schema.tables
WHERE table_schema = 'public'""")
for table in cursor.fetchall():
print(table)
pg_class stores all the required information.
Executing the query below will return the user-defined tables as a list of tuples:
conn = psycopg2.connect(conn_string)
cursor = conn.cursor()
cursor.execute("select relname from pg_class where relkind='r' and relname !~ '^(pg_|sql_)';")
print(cursor.fetchall())
output:
[('table1',), ('table2',), ('table3',)]
The question is about using python's psycopg2 to do things with postgres. Here are two handy functions:
def table_exists(con, table_str):
    exists = False
    try:
        cur = con.cursor()
        cur.execute("select exists(select relname from pg_class where relname='" + table_str + "')")
        exists = cur.fetchone()[0]
        print(exists)
        cur.close()
    except psycopg2.Error as e:
        print(e)
    return exists

def get_table_col_names(con, table_str):
    col_names = []
    try:
        cur = con.cursor()
        cur.execute("select * from " + table_str + " LIMIT 0")
        for desc in cur.description:
            col_names.append(desc[0])
        cur.close()
    except psycopg2.Error as e:
        print(e)
    return col_names
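A possible usage sketch (it assumes psycopg2 is imported, and the connection string is a placeholder rather than something from the original answer):
con = psycopg2.connect("host=localhost dbname=mydb user=myuser password=secret")
if table_exists(con, 'table1'):
    print(get_table_col_names(con, 'table1'))
con.close()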
Here's a Python 3 snippet that includes the connect() parameters and generates a Python list for output:
conn = psycopg2.connect(host='localhost', dbname='mySchema',
user='myUserName', password='myPassword')
cursor = conn.cursor()
cursor.execute("""SELECT relname FROM pg_class WHERE relkind='r'
AND relname !~ '^(pg_|sql_)';""") # "rel" is short for relation.
tables = [i[0] for i in cursor.fetchall()] # A list() of tables.
Although this has already been answered by Kalu, the query mentioned there returns both tables and views from the postgres database. If you need only tables and not views, you can include table_type in your query, like this:
s = "SELECT"
s += " table_schema"
s += ", table_name"
s += " FROM information_schema.tables"
s += " WHERE"
s += " ("
s += " table_schema = '"+SCHEMA+"'"
s += " AND table_type = 'BASE TABLE'"
s += " )"
s += " ORDER BY table_schema, table_name;"
db_cursor.execute(s)
list_tables = db_cursor.fetchall()
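The same query can also be written with a parameter placeholder instead of string concatenation (a sketch that keeps the SCHEMA variable from the answer above):
# let psycopg2 substitute the schema name as a query parameter
db_cursor.execute(
    """SELECT table_schema, table_name
       FROM information_schema.tables
       WHERE table_schema = %s
         AND table_type = 'BASE TABLE'
       ORDER BY table_schema, table_name;""",
    (SCHEMA,))
list_tables = db_cursor.fetchall()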
You can use this code for Python 3:
import psycopg2

conn = psycopg2.connect(database="your_database", user="postgres", password="",
                        host="127.0.0.1", port="5432")
cur = conn.cursor()
cur.execute("select * from your_table")
rows = cur.fetchall()
conn.close()