I am trying to select two specific columns from a table in Django and it is not working, but when I run select * it works.
I am using PostgreSQL.
Selecting all columns works fine; the problem only appears when I try to select particular columns. This is what I am running:
def bigdataDatabase(X):
    engine = sqlalchemy.create_engine('postgresql+psycopg2://postgres:password@localhost/db_name')
    con = engine.connect()
    result = con.execute(
        "Select Orign,Departure From 'table_name' WHERE index = '" + str(X) + "'")
This is not working.
I have also tried this:
result = con.execute("Select tablename.Orign,tablename.departure From 'table_name' WHERE index = '" + str(X) + "'")
Neither of the two queries above works; both raise:
ProgrammingError: column does not exist
But when I select all columns, it works:
result = con.execute("Select * From 'table_name' WHERE index = '" + str(X) + "'")
I have found the solution to the problem: the query should be executed like this, with the column and table names double-quoted:
result = con.execute('Select "Orign","Departure" From "Table_name" WHERE index = ' + str(X))
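The double quotes matter because PostgreSQL folds unquoted identifiers to lower case, so an unquoted Orign is looked up as orign and reported as missing. As a side note, the row filter can also be bound as a parameter rather than concatenated; here is a minimal sketch of the same function with the fixed quoting (the connection URL and names are placeholders taken from the question):
import sqlalchemy

def bigdataDatabase(X):
    # Placeholder credentials/host/database from the question.
    engine = sqlalchemy.create_engine('postgresql+psycopg2://postgres:password@localhost/db_name')
    con = engine.connect()
    # Double-quoted identifiers keep their mixed case; the value for the
    # WHERE clause is bound as a parameter instead of being concatenated.
    query = sqlalchemy.text(
        'Select "Orign","Departure" From "Table_name" WHERE index = :idx')
    result = con.execute(query, {"idx": X})
    return result.fetchall()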
Related
I have multiple tables that are updated after a value is changed in a grid. These tables don't always have the same keys or columns so I cannot explicitly name the columns or formats. The only thing that is ever the same, is the column where the keys reside. I know the way I am currently doing this is not correct and leaves me open to injection attacks.
I also ran into an issue where some of the values contain characters that throw an error in the SQL statement. For example, updating WHERE email = t'est@email.com.
I am not really sure of the proper way to write these statements. I did some research and see multiple methods for different purposes but am not sure which is proper. I am looking to do this as dynamically as possible. Can anyone point me in the right direction?
To connect:
import mysql.connector as sql
import MySQLdb
#Connect
self.db_name = 'database'
self.server = 'server'
self.user_id = 'user'
self.pw = 'password'
try:
    self.db_con = MySQLdb.connect(user=self.user_id, password=self.pw, database=self.db_name,
                                  host=self.server, charset='utf8', autocommit=True)
    self.cursor = self.db_con.cursor()
except:
    print("Error connecting")
SQL Statements:
key_id = str("'") + self.GetCellValue(event.GetRow(),1) + str("'")
target_col = self.GetColLabelValue(event.GetCol())
key_col = self.GetColLabelValue(1)
nVal = str("'") + self.GetCellValue(event.GetRow(),event.GetCol()) + str("'")
#SQL STATEMENTS
sql_update = "UPDATE " + tbl + " SET " + target_col + " = " + nVal + " WHERE " + key_col + " = " + key_id + ""
#INSERT
sql_update = ("INSERT INTO " + str(self.tbl) + "(" + self.key_col + ")" + "VALUES (" + str("'") + str(val) + str("'") + ")")
#DELETE
sql_update = "DELETE FROM " + tbl + " WHERE " + self.key_col + " = " + self.key_id + ""
#SELECT
sql_query = "SELECT * FROM " + self.tbl
#Execute
try:
    self.cursor.execute(sql_update)
except:
    print('Error')
    self.db_con.rollback()
Databases have different notations for "quoting" identifiers (table and column names etc) and values (data).
MySQL uses backticks to quote identifiers. For values, it's best to use the parameter substitution mechanism provided by the connector package: it's more likely to handle tricky cases like embedded quotes correctly, and will reduce the risk of SQL injection.
Here's an example for inserts; the same techniques can be used for the other types of query.
key_id = self.GetCellValue(event.GetRow(), 1)
target_col = self.GetColLabelValue(event.GetCol())
key_col = self.GetColLabelValue(1)
# No manual quote-wrapping here: the values are passed to execute() as
# parameters, so the driver handles quoting and escaping.
nVal = self.GetCellValue(event.GetRow(), event.GetCol())
#INSERT (using f-strings for brevity)
sql_update = (f"INSERT INTO `{self.tbl}` (`{self.key_col}`) VALUES (%s)")
# Pass the statement and values to cursor.execute.
# The values are assumed to be a sequence, so a single value should be
# placed in a tuple or list.
self.cursor.execute(sql_update, (nVal,))
If you have more than one column / value pair you could do something like this:
cols = ['A', 'B', 'C']
vals = ['a', 'b', 'c']
col_names = ','.join([f'`{c}`' for c in cols])
values_placeholder = ','.join(['%s'] * len(cols))
sql_update = f"INSERT INTO `{self.tbl}` ({col_names}) VALUES ({values_placeholder})"
self.cursor.execute(sql_update, vals)
Values are not only data for insertion, but also data that we are using for comparison, for example in WHERE clauses. So an update statement with a filter might be created like this:
sql_update = f"UPDATE `{tbl}` SET `{target_col}` = %s WHERE `{key_col}` = %s"
self.cursor.execute(sql_update, (nVal, key_id))
However, sometimes the right-hand side of a SET assignment or WHERE comparison is itself a column, for example when we want to do an update based on other values in the row. This statement will set target_col to the value of other_col for all rows where key_col is equal to other_key_col:
sql_update = f"UPDATE `{tbl}` SET `{target_col}` = `{other_col}` WHERE `{key_col}` = `{other_key_col}`"
self.cursor.execute(sql_update)
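Column and table names cannot be sent as parameters, so when they come from the grid it is worth validating them against a known set before interpolating them into the statement. A minimal sketch under that assumption (the whitelist and helper name are illustrative, not from the original code):
ALLOWED_COLUMNS = {'email', 'name', 'status'}  # hypothetical set of known columns

def quote_identifier(name):
    # Reject unexpected names, then backtick-quote the identifier,
    # doubling any embedded backticks as MySQL requires.
    if name not in ALLOWED_COLUMNS:
        raise ValueError(f"unexpected column name: {name!r}")
    return '`' + name.replace('`', '``') + '`'

sql_update = (f"UPDATE `{tbl}` SET {quote_identifier(target_col)} = %s "
              f"WHERE {quote_identifier(key_col)} = %s")
self.cursor.execute(sql_update, (nVal, key_id))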
Sorry if this is a noob question, but I am trying to dump a psycopg2 dictionary directly into a JSON string. I do get a return value in the browser, but it isn't formatted like most of the other JSON examples I see. The idea is to dump the result of a SELECT statement into a JSON string and unbundle it on the other end to add into a database on the client side. The code and a sample of the return value are below. Is there a better way to do this with json and psycopg2?
import json
import psycopg2
import psycopg2.extras

# initializing variables
location_connection = location_cursor = 0
sql_string = coordinate_return = data = ""
# opening connection and setting cursor
location_connection = psycopg2.connect("dbname='' user='' password=''")
location_cursor = location_connection.cursor(cursor_factory=psycopg2.extras.RealDictCursor)
# setting sql string and executing query
sql_string = "select * from " + tablename + " where left(alphacoordinate," + str(len(coordinate)) + ") = '" + coordinate + "' order by alphacoordinate;"
location_cursor.execute(sql_string)
data = json.dumps(location_cursor.fetchall())
# closing database connection
location_connection.close()
# returning coordinate string
return data
Sample return:
"[{\"alphacoordinate\": \"nmaan-001-01\", \"xcoordinate\": 3072951151886, \"planetarydiameter\": 288499, \"planetarymass\": 2.020936938e+27, \"planetarydescription\": \"PCCGQAAA\", \"planetarydescriptionsecondary\": 0, \"moons\": 1}]"
You could create the JSON string directly in Postgres using row_to_json:
# setting sql string and executing query
sql_string = "select row_to_json(" + tablename + ") from " + tablename + " where left(alphacoordinate," + str(len(coordinate)) + ") = '" + coordinate + "' order by alphacoordinate;"
location_cursor.execute(sql_string)
data = location_cursor.fetchall()
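The table name still has to be interpolated (identifiers cannot be bound), but the values used in the WHERE clause can be passed as parameters. A sketch of the same query with the filter values bound, using the variables from the question:
# The table name is interpolated as before; the filter values are passed
# to execute() so psycopg2 quotes and escapes them.
sql_string = ("select row_to_json(" + tablename + ") from " + tablename +
              " where left(alphacoordinate, %s) = %s order by alphacoordinate;")
location_cursor.execute(sql_string, (len(coordinate), coordinate))
data = location_cursor.fetchall()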
I have the following query to delete duplicates from a table in the database.
WITH x AS (SELECT "region_code" dup, min(ctid)
FROM public.test2 GROUP BY "region_code"
HAVING count(*) > 1)
DELETE FROM public.test2
USING x
WHERE (region_code) = (dup) AND public.test2.ctid <> x.min
RETURNING *;
Now I want to execute this query using Python. When I run it from Python, nothing happens. I am using SQLAlchemy with Python 3.6.
query = "WITH x AS (SELECT \"region_code\" dup, min(ctid) FROM " + schema + "." + table_name + " GROUP BY \"region_code\" HAVING count(*) > 1) DELETE FROM " + schema + "." + table_name +" USING x WHERE (region_code) = (dup) AND " + schema + "." + table_name +".ctid <> x.min RETURNING *;"
data = con.execute(query)
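One common reason for "nothing happens" with raw DML through SQLAlchemy is that the statement is never committed. Below is a minimal sketch, assuming SQLAlchemy 1.4+ where textual SQL goes through text() and the transaction is handled by engine.begin(); the connection URL is a placeholder, and the schema and table names are interpolated exactly as in the question because identifiers cannot be bound as parameters:
from sqlalchemy import create_engine, text

engine = create_engine('postgresql+psycopg2://user:password@localhost/db_name')  # placeholder URL

query = text(
    f'WITH x AS (SELECT "region_code" dup, min(ctid) '
    f'FROM {schema}.{table_name} GROUP BY "region_code" HAVING count(*) > 1) '
    f'DELETE FROM {schema}.{table_name} USING x '
    f'WHERE (region_code) = (dup) AND {schema}.{table_name}.ctid <> x.min '
    f'RETURNING *;')

# engine.begin() opens a transaction and commits it on exit,
# so the DELETE is actually persisted.
with engine.begin() as con:
    deleted = con.execute(query).fetchall()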
I am trying to get the MSSQL table column names using pyodbc, and I am getting an error saying:
ProgrammingError: No results. Previous SQL was not a query.
Here is my code:
class get_Fields:
    def GET(self, r):
        web.header('Access-Control-Allow-Origin', '*')
        web.header('Access-Control-Allow-Credentials', 'true')
        fields = []
        datasetname = web.input().datasetName
        tablename = web.input().tableName
        cnxn = pyodbc.connect(connection_string)
        cursor = cnxn.cursor()
        query = "USE" + "[" + datasetname + "]" + "SELECT COLUMN_NAME,* FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = " + "'" + tablename + "'"
        cursor.execute(query)
        DF = DataFrame(cursor.fetchall())
        columns = [column[0] for column in cursor.description]
        return json.dumps(columns)
How do I solve this?
You can avoid this by using some of pyodbc's built-in methods. For example, instead of:
query = "USE" + "[" +datasetname+ "]" + "SELECT COLUMN_NAME,* FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = " + "'"+ tablename + "'"
cursor.execute(query)
DF = DataFrame(cursor.fetchall())
Try:
column_data = cursor.columns(table=tablename, catalog=datasetname, schema='dbo').fetchall()
print(column_data)
That will return the column names (and other column metadata). I believe the column name is the fourth element per row. This also relieves the very valid concerns about SQL injection. You can then figure out how to build your DataFrame from the resulting data.
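For example, assuming the standard ODBC column-metadata layout (pyodbc exposes the name both as the fourth field of each row and as the column_name attribute), the names could be pulled out like this, continuing the snippet above:
column_data = cursor.columns(table=tablename, catalog=datasetname, schema='dbo').fetchall()
# Each metadata row describes one column of the table; column_name holds its name.
column_names = [row.column_name for row in column_data]
return json.dumps(column_names)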
Good luck!
Your line
query = "USE" + "[" +datasetname+ "]" + "SELECT COLUMN_NAME,*...
will produce something like
USE[databasename]SELECT ...
In SSMS this would work, but I'd suggest checking the spacing and separating the USE statement from the query with a semicolon:
query = "USE " + "[" +datasetname+ "]; " + "SELECT COLUMN_NAME,*...
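Applied to the full query from the question, the corrected string might look like the sketch below. Note that it still concatenates user input into the SQL, which the next answer addresses:
query = ("USE [" + datasetname + "]; "
         "SELECT COLUMN_NAME, * FROM INFORMATION_SCHEMA.COLUMNS "
         "WHERE TABLE_NAME = '" + tablename + "'")
cursor.execute(query)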
Set the database context using the Database attribute when building the connection string
Use parameters any time you are passing user input (especially from HTTP requests!) to a WHERE clause.
These changes eliminate the need for dynamic SQL, which can be insecure and difficult to maintain.
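A minimal sketch of both suggestions, assuming a SQL Server ODBC driver; the driver name, server, and credentials are placeholders:
import pyodbc

# The database is set via the DATABASE attribute in the connection string,
# so no USE statement is needed.
connection_string = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=server_name;"
    "DATABASE=" + datasetname + ";"
    "UID=user;PWD=password")
cnxn = pyodbc.connect(connection_string)
cursor = cnxn.cursor()

# The table name is passed as a query parameter rather than concatenated in.
cursor.execute(
    "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = ?",
    tablename)
columns = [row[0] for row in cursor.fetchall()]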