I am using the cx_Oracle module to connect to an Oracle database. In the script I use two variables, schema_name and table_name. The query below works fine:
cur1.execute("select owner,table_name from dba_tables where owner ='schema_name'")
But I need to query the number of rows of a table, where I need to qualify the table_name with the schema_name, so the query should be:
SELECT count(*) FROM "schema_name"."table_name"
This does not work when used in the code. I have tried putting it in triple quotes, single quotes and other options, but it does not format the query as expected and hence errors out with "table does not exist".
Any guidance is appreciated.
Bind variables cannot be used for identifiers such as schema and table names, so a statement built with string formatting of the form ...{}.{}".format(sc, tb) might be used (be careful about SQL injection if the names come from untrusted input):
sc='myschema'
tb='mytable'
cur1.execute("SELECT COUNT(*) FROM {}.{}".format(sc,tb))
print(cur1.fetchone()[0])
In this particular case, you could also try setting Connection.current_schema; see the cx_Oracle API doc.
For example, if you create table in your own schema:
SQL> show user
USER is "CJ"
SQL> create table ffff (mycol number);
Table created.
SQL> insert into ffff values (1);
1 row created.
SQL> commit;
Commit complete.
Then run Python code that connects as a different user:
import cx_Oracle
import os
import sys

if sys.platform.startswith("darwin"):
    cx_Oracle.init_oracle_client(lib_dir=os.environ.get("HOME")+"/Downloads/instantclient_19_8")

username = "system"
password = "oracle"
connect_string = "localhost/orclpdb1"

connection = cx_Oracle.connect(username, password, connect_string)

connection.current_schema = 'CJ'

with connection.cursor() as cursor:
    sql = """select * from ffff"""
    for r in cursor.execute(sql):
        print(r)

    sql = """select sys_context('USERENV','CURRENT_USER') from dual"""
    for r in cursor.execute(sql):
        print(r)
the output will be:
(1,)
('SYSTEM',)
The last query shows that it is not the user that is being changed; rather, the first query is automatically resolved as 'CJ.ffff' instead of 'ffff'.
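Applied back to the original question, a minimal sketch (reusing the illustrative sc and tb variables from the first snippet) would set current_schema and leave the table name unqualified:

connection.current_schema = sc          # e.g. 'myschema'
with connection.cursor() as cursor:
    # identifiers still cannot be bind variables, so the table name is interpolated
    cursor.execute("SELECT COUNT(*) FROM {}".format(tb))   # e.g. tb = 'mytable'
    print(cursor.fetchone()[0])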
Related
Can the cursor.execute call below execute multiple SQL queries in one go?
cursor.execute("use testdb;CREATE USER MyLogin")
I don't have Python set up yet, but I want to know if the above form is supported by cursor.execute()?
import pyodbc
# Some other example server values are
# server = 'localhost\sqlexpress' # for a named instance
# server = 'myserver,port' # to specify an alternate port
server = 'tcp:myserver.database.windows.net'
database = 'mydb'
username = 'myusername'
password = 'mypassword'
cnxn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER='+server+';DATABASE='+database+';UID='+username+';PWD='+ password)
cursor = cnxn.cursor()
#Sample select query
cursor.execute("SELECT ##version;")
row = cursor.fetchone()
while row:
print(row[0])
row = cursor.fetchone()
A string containing multiple SQL statements is often referred to as an "anonymous code block".
There is nothing in pyodbc (or pypyodbc) to prevent you from passing a string containing an anonymous code block to the Cursor.execute() method. They simply pass the string to the ODBC Driver Manager (DM) which in turn passes it to the ODBC Driver.
However, not all ODBC drivers accept anonymous code blocks by default. Some databases default to allowing only a single SQL statement per .execute() to protect us from SQL injection issues.
For example, MySQL Connector/ODBC defaults MULTI_STATEMENTS to 0 (off), so if you want to run an anonymous code block you will have to include MULTI_STATEMENTS=1 in your connection string.
Note also that changing the current database by including a USE … statement in an anonymous code block can sometimes cause problems because the database context changes in the middle of a transaction. It is often better to execute a USE … statement by itself and then continue executing other SQL statements.
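A hedged sketch of both points, assuming the MySQL Connector/ODBC driver (the driver name, credentials and statements below are placeholders):

import pyodbc

# MULTI_STATEMENTS=1 lets the MySQL ODBC driver accept anonymous code blocks
cnxn = pyodbc.connect(
    "DRIVER={MySQL ODBC 8.0 ANSI Driver};"
    "SERVER=localhost;DATABASE=testdb;UID=myuser;PWD=mypassword;"
    "MULTI_STATEMENTS=1;"
)
cursor = cnxn.cursor()

# run the USE statement on its own, then continue with the remaining statements
cursor.execute("USE testdb")
cursor.execute("CREATE USER MyLogin")
cnxn.commit()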
Yes, it is possible with MySQL Connector/Python's multi=True:
operation = 'SELECT 1; INSERT INTO t1 VALUES (); SELECT 2'
for result in cursor.execute(operation, multi=True):
    if result.with_rows:            # this statement produced rows (e.g. a SELECT)
        print(result.fetchall())
But it is not a comprehensive solution. For example, when the string contains two SELECT statements, you still have to handle each result set separately, fetching each one from the cursor in turn.
So the best solution is to break the query into sub-queries and do your work step by step, for example:
s = "USE some_db; SELECT * FROM some_table;"
s = filter(None, s.split(';'))
for i in s:
cur.execute(i.strip() + ';')
The pyodbc documentation should give you the example you're looking for; see also the GitHub wiki: https://github.com/mkleehammer/pyodbc/wiki/Objects#cursors
You can see an example here:
cnxn = pyodbc.connect(...)
cursor = cnxn.cursor()

cursor.execute("""
    select user_id, last_logon
    from users
    where last_logon > ?
      and user_type <> 'admin'
    """, twoweeks)

rows = cursor.fetchall()
for row in rows:
    print('user %s logged on at %s' % (row.user_id, row.last_logon))
From this example and from exploring the code, I would say your next step is to test a multi-statement cursor.execute("<your_sql_query>").
If that test works, maybe try to create a class and then create instances of that class for each query you want to run, as sketched below.
This would be the basic evolution of a developer's effort at reproducing documentation... hope this helps you :)
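As an illustration of that suggestion, a hypothetical wrapper class might look like this (the class itself is invented for the example; cnxn and twoweeks are the names from the snippet above):

class Query:
    """Hypothetical wrapper: one instance per SQL statement."""
    def __init__(self, cnxn, sql):
        self.cnxn = cnxn
        self.sql = sql

    def run(self, *params):
        cursor = self.cnxn.cursor()
        cursor.execute(self.sql, *params)
        return cursor.fetchall()

# one instance per query you want to run
recent_logons = Query(cnxn, "select user_id, last_logon from users where last_logon > ?")
for row in recent_logons.run(twoweeks):
    print(row.user_id, row.last_logon)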
Yes, you can get results for multiple queries by using the nextset() method...
query = "select * from Table1; select * from Table2"
cursor = connection.cursor()
cursor.execute(query)
table1 = cursor.fetchall()
cursor.nextset()
table2 = cursor.fetchall()
The code explains it... cursors return result "sets", which you can move between using the nextset() method.
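If you do not know in advance how many statements the string contains, a small sketch like this walks every result set (in pyodbc, nextset() returns a falsy value once no sets remain):

cursor.execute("select * from Table1; select * from Table2; select * from Table3")
while True:
    print(cursor.fetchall())      # current result set
    if not cursor.nextset():      # advance; stops when no more sets remain
        break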
I've been trying to get some data from my database using the code below, but the code is not working. Is there any mistake I made in the code, and if so, how can I fix it?
NOTE: the code below is from a plain script, not a Django or Flask web app.
import psycopg2

def db():
    conn = psycopg2.connect(
        "dbname=mydb user=postgres password=****** host=*.*.*.*")
    cur = conn.cursor()
    cur.execute("""SELECT * FROM MddPublisher""")
    query_results = cur.fetchall()
    print(query_results)

db()
ERROR: psycopg2.errors.UndefinedTable: relation "mddpublisher" does not exist LINE 1: SELECT * FROM MddPublisher
Additionally, I want to show the code below to prove that the connection is OK. The problem is that I can't receive data from my database whenever I try to execute a SELECT command through Python.
def print_tables():
    conn = psycopg2.connect(
        "dbname=mydb user=postgres password=***** host=*.*.*.*.*")
    cur = conn.cursor()
    cur.execute("""SELECT table_name FROM information_schema.tables
                   WHERE table_schema = 'public'""")
    for table in cur.fetchall():
        print(table)

print_tables()
OUTPUT:
('MddPublisher',)
This is probably an issue with case sensitivity. Postgresql names are usually normalized to lower case. However, when used inside double quotes, they keep their case. So, to access a table named MddPublisher you must write it like "MddPublisher".
All the gory details are in Section 4.1.1, Identifiers and Key Words in the Postgresql 14 docs.
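Applied to your first function, a minimal sketch of the fix would be:

cur.execute('SELECT * FROM "MddPublisher"')
query_results = cur.fetchall()
print(query_results)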
I have a column called REQUIREDCOLUMNS in a SQL database which contains the columns which I need to select in my Python script below.
Excerpt of Current Code:
db = mongo_client.get_database(asqldb_row.SCHEMA_NAME)
coll = db.get_collection(asqldb_row.TABLE_NAME)
table = list(coll.find())
root = json_normalize(table)
The REQUIREDCOLUMNS column in SQL contains the values reportId, siteId, price, location.
So instead of explicitly typing:
print(root[["reportId","siteId","price","location"]])
Is there a way to do print(root[REQUIREDCOLUMNS])?
Note: (I'm already connected to the SQL database in my python script)
You will have to use cursors whether you are using mysql.connector or pymysql; the syntax for both is almost identical. Below I will show it for mysql.connector:
import mysql.connector

db = mysql.connector.connect(
    host = "localhost",
    user = "root",
    passwd = " ",
    database = " "
)
cursor = db.cursor()

sql = "select REQUIREDCOLUMNS from table_name"
cursor.execute(sql)
# fetchall() returns one tuple per row, so unpack the first element of each row
required_cols = [row[0] for row in cursor.fetchall()]  # ["reportId", "siteId", "price", "location"]

cols_as_string = ','.join(required_cols)
new_sql = 'select ' + cols_as_string + ' from table_name'
cursor.execute(new_sql)
result = cursor.fetchall()
This should probably work; I intentionally split the work into several lines for clarity.
The syntax could be slightly different for pymysql.
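Once the column names are in a Python list, they can be passed straight to the DataFrame subscript, so the print from the question becomes (a sketch, assuming required_cols holds the names fetched above):

required_cols = ["reportId", "siteId", "price", "location"]   # as read from REQUIREDCOLUMNS
print(root[required_cols])    # same as root[["reportId", "siteId", "price", "location"]]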
I'm trying to figure out why I can't access a particular table in a PostgreSQL database using psycopg2. I am running PostgreSQL 11.5.
If I do this, I can connect to the database in question and read all the tables in it:
import psycopg2

try:
    # psycopg2.connect() creates a connection to the PostgreSQL database instance
    connection = psycopg2.connect(user = "postgres",
                                  password = "battlebot",
                                  host = "127.0.0.1",
                                  port = "5432",
                                  database = "BRE_2019")

    # creates a cursor object which allows us to execute PostgreSQL commands through Python source
    cursor = connection.cursor()

    # print the names of the tables in the public schema
    cursor.execute("""SELECT table_name FROM information_schema.tables
                      WHERE table_schema = 'public'""")
    for table in cursor.fetchall():
        print(table)
except (Exception, psycopg2.Error) as error:
    print("Error while connecting to PostgreSQL: ", error)
The results look like this:
('geography_columns',)
('geometry_columns',)
('spatial_ref_sys',)
('raster_columns',)
('raster_overviews',)
('nc_avery_parcels_poly',)
('Zone5e',)
('AllResidential2019',)
#....etc....
The table I am interested in is the last one, 'AllResidential2019'
So I try to connect to it and print the contents by doing the following:
try:
    # psycopg2.connect() creates a connection to the PostgreSQL database instance
    connection = psycopg2.connect(user = "postgres",
                                  password = "battlebot",
                                  host = "127.0.0.1",
                                  port = "5432",
                                  database = "BRE_2019")

    # creates a cursor object which allows us to execute PostgreSQL commands through Python source
    cursor = connection.cursor()

    # execute() takes the SQL query as a parameter; fetchall() then returns the list of results
    cursor.execute("SELECT * FROM AllResidential2019;")
    record = cursor.fetchall()
    print(record)
except (Exception, psycopg2.Error) as error:
    print("Error while connecting to PostgreSQL: ", error)
And I get the following error:
Error while connecting to PostgreSQL: relation "allresidential2019" does not exist
LINE 1: SELECT * FROM AllResidential2019;
However, I can successfully connect and get results when querying another table in another database I have (this works, and the results are the data in that table):
try:
    # psycopg2.connect() creates a connection to the PostgreSQL database instance
    connection = psycopg2.connect(user = "postgres",
                                  password = "battlebot",
                                  host = "127.0.0.1",
                                  port = "5432",
                                  database = "ClimbingWeatherApp")  # different database name
    cursor = connection.cursor()

    cursor.execute("SELECT * FROM climbing_area_info ;")
    record = cursor.fetchall()
    print(record)
except (Exception, psycopg2.Error) as error:
    print("Error while connecting to PostgreSQL: ", error)
I can't figure out why I can retrieve information from one table but not another, using exactly the same code (except that the names are changed). I am also not sure how to troubleshoot this. Can anyone offer suggestions?
Your table name is case-sensitive and you have to enclose it in double quotes:
SELECT * FROM "AllResidential2019";
In Python program it may look like this:
cursor.execute('SELECT * FROM "AllResidential2019"')
or you can use the specialized module SQL string composition:
from psycopg2 import sql
# ...
cursor.execute(sql.SQL("SELECT * FROM {}").format(sql.Identifier('AllResidential2019')))
Note that case-sensitive Postgres identifiers (i.e. names of a table, column, view, function, etc) unnecessarily complicate simple matters. I would advise you not to use them.
Likely, the reason for your issue is Postgres's quoting rules, which adhere to the ANSI SQL standard regarding double-quoted identifiers. When you created the table, you likely quoted its name:
CREATE TABLE "AllResidential2019" (
...
)
Because the quoted name contains at least one capital letter, it is case-sensitive, and you must always quote it when referencing the table. Do remember: single and double quotes have different meanings in SQL, as opposed to being mostly interchangeable in Python.
SELECT * FROM "AllResidential2019"
DELETE FROM "AllResidential2019" ...
ALTER TABLE "AllResidential2019" ...
It is often recommended, if your table, column, or other identifier does not contain special characters, spaces, or reserved words, to always use lower case or no quotes:
CREATE TABLE "allresidential2019" (
...
)
CREATE TABLE AllResidential2019 (
...
)
Doing so, any combination of capital letters will work:
SELECT * FROM ALLRESIDENTIAL2019
SELECT * FROM aLlrEsIdEnTiAl2019
SELECT * FROM "allresidential2019"
See further reading on the subject:
Omitting the double quote to do query on PostgreSQL
PostgreSQL naming conventions
Postgres Docs - 4.1.1. Identifiers and Key Words
Don’t use double quotes in PostgreSQL
What is the difference between single and double quotes in SQL?
I was facing the same error in Ubuntu. But in my case, I had accidentally added the tables to the wrong database, which was in turn owned by the root postgres user instead of the new postgres user that I had created for my Flask app.
I'm using a SQL file to create and populate the tables. This is the command that I used to be able to create these tables using a .sql file. This allows you to specify the owner of the tables as well as the database in which they should be created:
sudo -u postgres psql -U my_user -d my_database -f file.sql -h localhost
You will then be prompted for my_user's password.
sudo -u postgres is only necessary if you are running this from a terminal as the root user. It basically runs the psql ... command as the postgres user.
I am trying to just create a temporary table in my SQL database, where I then want to insert data (from a Pandas DataFrame), and via this temporary table insert the data into a 'permanent' table within the database.
So far I have something like
""" Database specific... """
import sqlalchemy
from sqlalchemy.sql import text
dsn = 'dsn-sql-acc'
database = "MY_DATABASE"
connection_str = """
Driver={SQL Server Native Client 11.0};
Server=%s;
Database=%s;
Trusted_Connection=yes;
""" % (dsn,database)
connection_str_url = urllib.quote_plus(connection_str)
engine = sqlalchemy.create_engine("mssql+pyodbc:///?odbc_connect=%s" % connection_str_url, encoding='utf8', echo=True)
# Open connection
db_connection = engine.connect()
sql_create_table = text("""
IF OBJECT_ID('[MY_DATABASE].[SCHEMA_1].[TEMP_TABLE]', 'U') IS NOT NULL
DROP TABLE [MY_DATABASE].[SCHEMA_1].[TEMP_TABLE];
CREATE TABLE [MY_DATABASE].[SCHEMA_1].[TEMP_TABLE] (
[Date] Date,
[TYPE_ID] nvarchar(50),
[VALUE] nvarchar(50)
);
""")
db_connection.execute("commit")
db_connection.execute(sql_create_table)
db_connection.close()
The "raw" SQL-snippet within sql_create_table works fine when executed in SQL Server, but when running the above in Python, nothing happens in my database...
What seems to be the issue here?
Later on I would of course want to execute
BULK INSERT [MY_DATABASE].[SCHEMA_1].[TEMP_TABLE]
FROM '//temp_files/temp_file_data.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR='\n');
in Python as well...
Thanks
These statements are out of order:
db_connection.execute("commit")
db_connection.execute(sql_create_table)
Commit after creating your table and your table will persist.
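In other words, a minimal sketch of the fix is simply to swap the two calls so the commit runs after the CREATE TABLE:

db_connection.execute(sql_create_table)
db_connection.execute("commit")
db_connection.close()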