I have a column called REQUIREDCOLUMNS in a SQL database which contains the names of the columns I need to select in my Python script below.
Excerpt of Current Code:
# json_normalize comes from pandas; mongo_client is an existing pymongo.MongoClient
from pandas import json_normalize

db = mongo_client.get_database(asqldb_row.SCHEMA_NAME)
coll = db.get_collection(asqldb_row.TABLE_NAME)
table = list(coll.find())
root = json_normalize(table)
The REQUIREDCOLUMNS column in SQL contains the values reportId, siteId, price, location
So instead of explicitly typing:
print(root[["reportId","siteId","price","location"]])
Is there a way to do print(root[REQUIREDCOLUMNS])?
Note: I'm already connected to the SQL database in my Python script.
You will have to use cursors whether you are using mysql.connector or pymysql; the syntax is almost identical. Below I show it for mysql.connector.
import mysql.connector
db = mysql.connector.connect(
host = "localhost",
user = "root",
passwd = " ",
database = " "
)
cursor = db.cursor()
sql="select REQUIREDCOLUMNS from table_name"
cursor.execute(sql)
# fetchall() returns tuples, so take the first field of each row,
# e.g. ["reportId", "siteId", "price", "location"]
required_cols = [row[0] for row in cursor.fetchall()]
cols_as_string = ','.join(required_cols)
new_sql = 'select ' + cols_as_string + ' from table_name'
cursor.execute(new_sql)
result = cursor.fetchall()
This should work; I intentionally split the logic over several lines for clarity.
The syntax may be slightly different for pymysql.
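To tie this back to the pandas DataFrame from the question: once required_cols holds the list of names, it can be passed straight to the DataFrame built by json_normalize. A minimal sketch, assuming root and required_cols exist as above:

# Passing a list of column names to a DataFrame selects exactly those columns.
print(root[required_cols])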
Related
I am trying to create tables out of JSON files containing the field names and types of each table of a database downloaded from BigQuery. The SQL request seemed fine to me, but no table was created according to the psql command-line interpreter when typing \d.
So, to begin with, I tried a simpler SQL request that doesn't work either.
Here is the code:
import pandas as pd
import psycopg2
# information used to create a database connection
sqluser = 'postgres'
dbname = 'testdb'
pwd = 'postgres'
# Connect to postgres database
con = psycopg2.connect(dbname=dbname, user=sqluser, password=pwd )
curs=con.cursor()
q="""set search_path to public,public ;
CREATE TABLE tab1(
i INTEGER
);
"""
curs.execute(q)
q = """
SELECT table_name
FROM information_schema.tables
WHERE table_schema='public'
AND table_type='BASE TABLE';
"""
df = pd.read_sql_query(q, con)
print(df.head())
print("End of test")
The code above displays this new table tab1, but the table doesn't actually appear when typing \d within the psql command-line interpreter. If I type the following in the psql interpreter:
SELECT table_name
FROM information_schema.tables
WHERE table_type='BASE TABLE';
it doesn't get listed either, so it seems the table is not actually created. Thanks in advance for your help.
A commit() call was missing; it must be added after the table-creation SQL statement.
This code works:
import pandas as pd
import psycopg2
# information used to create a database connection
sqluser = 'postgres'
dbname = 'testdb'
pwd = 'postgres'
# Connect to postgres database
con = psycopg2.connect(dbname=dbname, user=sqluser, password=pwd )
curs=con.cursor()
q="""set search_path to public,public ;
CREATE TABLE tab1(
i INTEGER
);
"""
curs.execute(q)
con.commit()
q = """
SELECT table_name
FROM information_schema.tables
WHERE table_schema='public'
AND table_type='BASE TABLE';
"""
df = pd.read_sql_query(q, con)
print(df.head())
print("End of test")
I am using the cx_Oracle module to connect to an Oracle database. In the script I use two variables, schema_name and table_name. The query below works fine:
cur1.execute("select owner,table_name from dba_tables where owner ='schema_name'")
But I need to query the number of rows of a table, where I need to qualify the table_name with the schema_name, so the query should be:
SELECT count(*) FROM "schema_name"."table_name"
This does not work when used in the code. I have tried putting it in triple quotes, single quotes, and other variations, but the query is not formatted as expected and errors out with "table does not exist".
Any guidance is appreciated.
The statement can be built with str.format() placeholders of the form {}.{}, filled in from the two variables:
sc='myschema'
tb='mytable'
cur1.execute("SELECT COUNT(*) FROM {}.{}".format(sc,tb))
print(cur1.fetchone()[0])
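One caveat: Oracle bind variables can only stand in for values, not for identifiers such as schema or table names, so string formatting (or concatenation) is the only option here. If the names come from outside the program, it is worth checking them against a known list first; the allow-list below is purely hypothetical:

# Identifiers cannot be passed as bind variables, so validate them before
# interpolating; "myschema" and "mytable" are hypothetical names.
allowed = {("myschema", "mytable"), ("myschema", "othertable")}
if (sc, tb) not in allowed:
    raise ValueError("unexpected schema/table: {}.{}".format(sc, tb))
cur1.execute("SELECT COUNT(*) FROM {}.{}".format(sc, tb))
print(cur1.fetchone()[0])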
In this particular case, you could also try setting Connection.current_schema; see the cx_Oracle API documentation.
For example, if you create table in your own schema:
SQL> show user
USER is "CJ"
SQL> create table ffff (mycol number);
Table created.
SQL> insert into ffff values (1);
1 row created.
SQL> commit;
Commit complete.
Then run Python code that connects as a different user:
import cx_Oracle
import os
import sys

if sys.platform.startswith("darwin"):
    cx_Oracle.init_oracle_client(lib_dir=os.environ.get("HOME")+"/Downloads/instantclient_19_8")

username = "system"
password = "oracle"
connect_string = "localhost/orclpdb1"

connection = cx_Oracle.connect(username, password, connect_string)
connection.current_schema = 'CJ'

with connection.cursor() as cursor:
    sql = """select * from ffff"""
    for r in cursor.execute(sql):
        print(r)

    sql = """select sys_context('USERENV','CURRENT_USER') from dual"""
    for r in cursor.execute(sql):
        print(r)
the output will be:
(1,)
('SYSTEM',)
The last query shows that it is not the user that changes; only the unqualified table name in the first query is automatically resolved as 'CJ.ffff' instead of 'ffff'.
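An equivalent approach, shown here only as a sketch and assuming the connected user has privileges on the CJ schema, is to change the current schema with a plain ALTER SESSION statement instead of the connection attribute:

# ALTER SESSION SET CURRENT_SCHEMA has the same effect as setting
# connection.current_schema: unqualified names now resolve to CJ's objects.
with connection.cursor() as cursor:
    cursor.execute("ALTER SESSION SET CURRENT_SCHEMA = CJ")
    for r in cursor.execute("select * from ffff"):
        print(r)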
I'm trying to figure out why I can't access a particular table in a PostgreSQL database using psycopg2. I am running PostgreSQL 11.5
If I do this, I can connect to the database in question and read all the tables in it:
import psycopg2

try:
    # psycopg2.connect() creates a connection to the PostgreSQL database instance
    connection = psycopg2.connect(user = "postgres",
                                  password = "battlebot",
                                  host = "127.0.0.1",
                                  port = "5432",
                                  database = "BRE_2019")
    # creates a cursor object which allows us to execute PostgreSQL commands from Python
    cursor = connection.cursor()
    cursor.execute("""SELECT table_name FROM information_schema.tables
                      WHERE table_schema = 'public'""")
    for table in cursor.fetchall():
        print(table)
except (Exception, psycopg2.Error) as error:
    print("Error while connecting to PostgreSQL: ", error)
The results look like this :
('geography_columns',)
('geometry_columns',)
('spatial_ref_sys',)
('raster_columns',)
('raster_overviews',)
('nc_avery_parcels_poly',)
('Zone5e',)
('AllResidential2019',)
#....etc....
The table I am interested in is the last one, 'AllResidential2019'
So I try to connect to it and print the contents by doing the following:
try:
    # psycopg2.connect() creates a connection to the PostgreSQL database instance
    connection = psycopg2.connect(user = "postgres",
                                  password = "battlebot",
                                  host = "127.0.0.1",
                                  port = "5432",
                                  database = "BRE_2019")
    # creates a cursor object which allows us to execute PostgreSQL commands from Python
    cursor = connection.cursor()
    # execute() takes an SQL query as its parameter and runs it
    cursor.execute("SELECT * FROM AllResidential2019;")
    record = cursor.fetchall()
    print(record)
except (Exception, psycopg2.Error) as error:
    print("Error while connecting to PostgreSQL: ", error)
And I get the following error:
Error while connecting to PostgreSQL: relation "allresidential2019" does not exist
LINE 1: SELECT * FROM AllResidential2019;
However, I can successfully connect and get results when querying another table in another database I have (this works, and the results are the data in that table):
try:
    # psycopg2.connect() creates a connection to the PostgreSQL database instance
    connection = psycopg2.connect(user = "postgres",
                                  password = "battlebot",
                                  host = "127.0.0.1",
                                  port = "5432",
                                  database = "ClimbingWeatherApp")  # different database name
    cursor = connection.cursor()
    cursor.execute("SELECT * FROM climbing_area_info;")
    record = cursor.fetchall()
    print(record)
except (Exception, psycopg2.Error) as error:
    print("Error while connecting to PostgreSQL: ", error)
I can't figure out why I can retrieve information from one table but not another, using exactly the same code (except that the names are changed). And I am also not sure how to troubleshoot this. Can anyone offer suggestions?
Your table name is case-sensitive and you have to enclose it in double quotes:
SELECT * FROM "AllResidential2019";
In a Python program it may look like this:
cursor.execute('SELECT * FROM "AllResidential2019"')
or you can use the specialized psycopg2.sql module for SQL string composition:
from psycopg2 import sql
# ...
cursor.execute(sql.SQL("SELECT * FROM {}").format(sql.Identifier('AllResidential2019')))
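sql.SQL composition also combines cleanly with ordinary query parameters, so identifiers and values are each handled correctly. A small sketch, where the price column and the threshold are hypothetical:

from psycopg2 import sql

# The table name is composed with sql.Identifier, while the value is passed
# as a regular query parameter; "price" is a hypothetical column here.
query = sql.SQL("SELECT * FROM {} WHERE price > %s").format(
    sql.Identifier("AllResidential2019"))
cursor.execute(query, (100000,))
print(cursor.fetchall())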
Note that case-sensitive Postgres identifiers (i.e. names of tables, columns, views, functions, etc.) unnecessarily complicate simple matters. I would advise you not to use them.
The likely reason for your issue is Postgres' quoting rules, which adhere to the ANSI SQL standard regarding double-quoted identifiers. When creating the table, you likely quoted its name:
CREATE TABLE "AllResidential2019" (
...
)
Because the name contains at least one capital letter, it is case-sensitive and you must always quote it when referencing the table. Remember: single and double quotes have different meanings in SQL, as opposed to being mostly interchangeable in Python.
SELECT * FROM "AllResidential2019"
DELETE FROM "AllResidential2019" ...
ALTER TABLE "AllResidential2019" ...
If your table, column, or other identifier does not contain special characters, spaces, or reserved words, it is often recommended to use all lower case or no quotes at all:
CREATE TABLE "allresidential2019" (
...
)
CREATE TABLE AllResidential2019 (
...
)
Doing so, any combination of upper and lower case will work:
SELECT * FROM ALLRESIDENTIAL2019
SELECT * FROM aLlrEsIdEnTiAl2019
SELECT * FROM "allresidential2019"
See further readings on the subject:
Omitting the double quote to do query on PostgreSQL
PostgreSQL naming conventions
Postgres Docs - 4.1.1. Identifiers and Key Words
Don’t use double quotes in PostgreSQL
What is the difference between single and double quotes in SQL?
I was facing the same error on Ubuntu. But in my case, I had accidentally added the tables to the wrong database, which was in turn owned by the root postgres user instead of the new postgres user that I had created for my Flask app.
I'm using a SQL file to create and populate the tables. This is the command I used to create them from a .sql file; it allows you to specify the owner of the tables as well as the database in which they should be created:
sudo -u postgres psql -U my_user -d my_database -f file.sql -h localhost
You will then be prompted for my_user's password.
sudo -u postgres is only necessary if you are running this from a terminal as the root user. It basically runs the psql ... command as the postgres user.
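If you are unsure which database the tables actually ended up in, listing the tables per database from the shell can confirm it; for example, assuming the names used above:

# \dt lists the tables in the given database; compare the output for the
# database your app uses against the default postgres database.
psql -U my_user -d my_database -h localhost -c '\dt'
psql -U postgres -d postgres -h localhost -c '\dt'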
I have a MySQL database named my_database, and in that database there are a lot of tables. I want to connect to MySQL from Python and work with a specific table named my_table from that database.
This is the code that I have for now:
import json
import pymysql
connection = pymysql.connect(user = "root", password = "", host = "127.0.0.1", port = "", database = "my_database")
cursor = connection.cursor()
print(cursor.execute("SELECT * FROM my_database.my_table"))
This code returns the number of rows, but I want to get all columns and rows (all values from that table).
I have also tried SELECT * FROM my_table but the result is the same.
Did you read the documentation? You need to fetch the results after executing, with fetchone(), fetchall(), or something like this:
import json
import pymysql
connection = pymysql.connect(user = "root", password = "", host = "127.0.0.1", port = "", database = "my_database")
with connection.cursor(pymysql.cursors.DictCursor) as cursor:
    cursor.execute("SELECT * FROM my_database.my_table")
    rows = cursor.fetchall()
    for row in rows:
        print(row)
You probably also want a DictCursor, since each row is then returned as a dict keyed by column name.
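Since the question already imports json, here is a short usage sketch that serializes the fetched rows, assuming the DictCursor variant above:

# With DictCursor each row is already a dict keyed by column name, so the
# result serializes directly; default=str covers dates and Decimal values.
print(json.dumps(rows, default=str, indent=2))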
I'm new to MySQL and Python.
I have code to insert data from Python into MySQL:
conn = MySQLdb.connect(host="localhost", user="root", passwd="kokoblack", db="mydb")

for i in range(0, len(allnames)):
    try:
        query = "INSERT INTO resumes (applicant, jobtitle, lastworkdate, lastupdate, url) values ("
        query = query + "'"+allnames[i]+"'," + "'"+alltitles[i]+"'," + "'"+alldates[i]+"'," + "'"+allupdates[i]+"'," + "'"+alllinks[i]+"')"
        x = conn.cursor()
        x.execute(query)
        row = x.fetchall()
    except:
        print "error"
It seems to be working fine, because "error" never appears. Instead, many rows of "1L" appear in my Python shell. However, when I go to MySQL, the "resumes" table in "mydb" remains completely empty.
I have no idea what could be wrong. Could it be that I am not connected to MySQL's server properly when I'm viewing the table in MySQL? Help please.
(I only use import MySQLdb, is that enough?)
Use commit() to persist the changes you have made.
MySQLdb has autocommit off by default, which may be confusing at first.
You can commit like this:
conn.commit()
or
conn.autocommit(True)
right after the connection to the DB is created.
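Separately from the missing commit, the string-built INSERT in the question breaks as soon as the data contains a quote. A hedged sketch of a parameterized version, reusing the lists from the question:

# Parameterized queries let the driver handle quoting and escaping, and
# executemany() batches all rows in one call; remember to commit afterwards.
query = ("INSERT INTO resumes (applicant, jobtitle, lastworkdate, lastupdate, url) "
         "VALUES (%s, %s, %s, %s, %s)")
rows = list(zip(allnames, alltitles, alldates, allupdates, alllinks))
cursor = conn.cursor()
cursor.executemany(query, rows)
conn.commit()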