I have an ever-growing, ever-changing database that reflects permits issued by the State and the EPA.
As the database changes and updates, I need to transfer the relevant information.
The script does two things: first, it checks which fields the two tables share and builds a list of the fields and data that will be inserted into the new database; second, it inserts that data into the new database.
The problem is that I cannot get it to insert. I have matched everything the way various online sources describe, but I still get this error: ('42000', '[42000] [Microsoft][ODBC Microsoft Access Driver] Syntax error in INSERT INTO statement. (-3502) (SQLExecDirectW)').
I cannot figure out how to prevent it.
Code:
import pyodbc

importDatabase = r"J:\ENVIRO FIELD\AccessDatabases\MS4\MS4 Town Databases\~Template\MS4_Apocalypse Import DEV 1.accdb"

# Create the import database connection
connectionImport = pyodbc.connect(r'Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=%s;' % (importDatabase))
cursorImport = connectionImport.cursor()

# ####---Outfall Section---####
# Import the outfall names into the new database
tbl = "tbl_Outfall_1_Profile"
exportList = []
importList = []
for row in cursorImport.columns(table="tblExportMigration_Outfall_1_Profile"):
    exportList.append(row.column_name)
for row in cursorImport.columns(table="tbl_Outfall_1_Profile"):
    importList.append(row.column_name)

# Keep only the fields the two tables have in common (excluding the key)
matchingList = []
for field in exportList:
    if field != "outfallID" and field in importList:
        matchingList.append(field)

# Build the comma-separated, bracketed field list
sqlValue = ""
for field in matchingList:
    sqlValue += "[%s], " % (field)
sqlValue = sqlValue[:-2]

sql = "SELECT %s FROM %s" % (sqlValue, "tblExportMigration_Outfall_1_Profile")
for rowA in cursorImport.execute(sql):
    tupleList = list(rowA)
    tupleList = ["" if i is None else i for i in tupleList]
    tupleValues = tuple(tupleList)
    # Builds the INSERT by string formatting -- the line that raises the syntax error
    sqlUpdate = "INSERT INTO tbl_Outfall_1_Profile (%s) VALUES %s;" % (sqlValue, tupleValues)
    cursorImport.execute(sqlUpdate)
cursorImport.close()
This is the SQL string I create:
"INSERT INTO tbl_Outfall_1_Profile ([profile_OutfallName], [profile_HistoricalName1], [profile_HistoricalName2], [profile_HistoricalName3], [profile_HistoricalName4]) Values ('756', '', '', '', '');"
Taking what @Gord Thompson said, I was actually able to create a dynamic parameter flow.
First I created a function to build the string of ? placeholders:
def Defining_Paramters(length):
    parameterString = ""
    for x in range(1, length):
        parameterString += "?, "
    parameterString += "?"
    return parameterString
Then stuck it into the string for the SQL update:
sqlUpdate = "INSERT INTO %s (%s) VALUES (%s);" % (table, sqlFrameworkSubStr, parameters)
Then run the cursor and commit:
cursorTo.execute(sqlUpdate, dataTuple)
connectionTo.commit()
It would seem that you have to build the query string in its entirety, with ? placeholders, and then supply your data as a tuple when executing.
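Putting the pieces together, here is a minimal sketch of the whole flow (it assumes the connectionTo/cursorImport objects and the matchingList from above; the placeholder string can also be built with a one-line join):

# A minimal sketch of the parameterized flow, assuming an open target
# connection `connectionTo` and the matching-field list `matchingList`.
table = "tbl_Outfall_1_Profile"
fieldStr = ", ".join("[%s]" % f for f in matchingList)
parameters = ", ".join("?" * len(matchingList))  # e.g. "?, ?, ?"
sqlUpdate = "INSERT INTO %s (%s) VALUES (%s);" % (table, fieldStr, parameters)
cursorTo = connectionTo.cursor()
for rowA in cursorImport.execute("SELECT %s FROM tblExportMigration_Outfall_1_Profile" % fieldStr):
    # Pass the row values as a tuple; the driver handles quoting and NULLs
    cursorTo.execute(sqlUpdate, tuple(rowA))
connectionTo.commit()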
Regarding "This is the SQL string [I think] I create", try this:
sqlUpdate = """INSERT INTO tbl_Outfall_1_Profile (%s) Values (%s);""" %(sqlValue, tupleValues)
or perhaps:
sqlUpdate = "INSERT INTO tbl_Outfall_1_Profile (%s) Values (%s);" %(sqlValue, tupleValues)
Related
I am trying to create a method in Python that inserts records into a table, passing in a list of column names and an associated list of records.
I was able to set it up so that the column names populate dynamically via a for loop, but I can't figure out how to do the same thing with the values, because the psycopg2 executemany function relies on having %s placeholders.
Is it possible to have the number of %s placeholders in the string populate dynamically via a loop? Is there another way to do this?
import psycopg2
from psycopg2 import sql

def load_table(dbname, table_name, fields, records):
    try:
        # Variable-quantity column loop
        sql_fields = []
        for i in fields:
            sql_fields.append(sql.Identifier(i))

        # Need a similar loop to replace the (%s,%s,%s) placeholders
        # .....
        sql_values = []
        for i in fields:
            sql_values.append('%s')
        print(sql_values)

        flist = sql.SQL(',').join(sql_fields)
        connection, cursor = create_connection(dbname)
        insert_query = sql.SQL('INSERT INTO {table_name} ({fields}) VALUES (%s,%s,%s)').format(
            table_name=sql.Identifier(table_name),
            fields=flist)
        cursor.executemany(insert_query, records)
        print('Records Loaded Successfully')
    except (Exception, psycopg2.Error) as error:
        print("Failed to insert record into table {error}".format(error=error))
    finally:
        # closing database connection
        if connection:
            close_connection(connection, cursor)
You can use sql.Placeholder to populate the SQL statement with the number of %s placeholders you need:
def load_table(dbname, table_name, fields, records):
    con, cur = create_connection('foo')
    query = sql.SQL("insert into {} ({}) values ({})").format(
        sql.Identifier(table_name),
        sql.SQL(', ').join(map(sql.Identifier, fields)),
        sql.SQL(', ').join(sql.Placeholder() * len(fields)))
    print(query.as_string(con))

if __name__ == '__main__':
    dbname = '...'
    table_name = 'messages'
    fields = ['user_id', 'message_type', 'message_title']
    records = [['12345', 'json', 'my first message'], ]
    load_table(dbname, table_name, fields, records)
Output:
insert into "messages" ("user_id", "message_type", "message_title") values (%s, %s, %s)
I am pretty new to Python development. I have a long Python script that "clones" a database and adds additional stored functions and procedures. Clone means copying only the schema of the DB. These steps work fine.
My question is about pymysql insert execution:
I have to copy some table contents into the new DB. I don't get any SQL error. If I debug or print the created INSERT INTO command, it is correct (I've tested it in an SQL editor). The insert execution appears correct because the result contains the exact row count... but all rows are missing from the destination table in the destination DB...
(Of course the DB_* variables have been defined!)
import datetime
import pymysql

liveDbConn = pymysql.connect(DB_HOST, DB_USER, DB_PWD, LIVE_DB_NAME)
testDbConn = pymysql.connect(DB_HOST, DB_USER, DB_PWD, TEST_DB_NAME)
tablesForCopy = ['role', 'permission']
for table in tablesForCopy:
    with liveDbConn.cursor() as liveCursor:
        # Get name of columns
        liveCursor.execute("DESCRIBE `%s`;" % (table))
        columns = ''
        for column in liveCursor.fetchall():
            columns += '`' + column[0] + '`,'
        columns = columns.strip(',')

        # Get and convert values
        values = ''
        liveCursor.execute("SELECT * FROM `%s`;" % (table))
        for result in liveCursor.fetchall():
            data = []
            for item in result:
                if type(item) == type(None):
                    data.append('NULL')
                elif type(item) == type('str'):
                    data.append("'" + item + "'")
                elif type(item) == type(datetime.datetime.now()):
                    data.append("'" + str(item) + "'")
                else:  # for numeric values
                    data.append(str(item))
            v = '(' + ', '.join(data) + ')'
            values += v + ', '
        values = values.strip(', ')

    print("### table: %s" % (table))
    testDbCursor = testDbConn.cursor()
    testDbCursor.execute("INSERT INTO `" + TEST_DB_NAME + "`.`" + table + "` (" + columns + ") VALUES " + values + ";")
    print("Result: {}".format(testDbCursor._result.message))

liveDbConn.close()
testDbConn.close()
Result is:
### table: role
Result: b"'Records: 16 Duplicates: 0 Warnings: 0"
### table: permission
Result: b'(Records: 222 Duplicates: 0 Warnings: 0'
What am I doing wrong? Thanks!
You have two main issues here:
1. You don't call conn.commit() (here that means testDbConn.commit(), the connection doing the inserts). Changes to the database will not be reflected without committing them. Note that all changes need committing, but a SELECT, for example, does not.
2. Your query is open to SQL injection. This is a serious problem.
Table names cannot be parameterized, so there's not much we can do about that, but you'll want to parameterize your values. I've made multiple corrections to the code in relation to type checking as well as parameterization.
for table in tablesForCopy:
    with liveDbConn.cursor() as liveCursor:
        liveCursor.execute("SELECT * FROM `%s`;" % (table))
        name_of_columns = [item[0] for item in liveCursor.description]
        insert_list = []
        for result in liveCursor.fetchall():
            data = []
            for item in result:
                if item is None:  # test identity against the None singleton
                    data.append(None)  # the driver renders a parameterized None as NULL
                elif isinstance(item, str):  # use isinstance to check types
                    data.append(item)
                elif isinstance(item, datetime.datetime):
                    data.append(item.strftime('%Y-%m-%d %H:%M:%S'))
                else:  # for numeric values
                    data.append(str(item))
            insert_list.append(data)

    testDbCursor = testDbConn.cursor()
    columns = ', '.join('`{}`'.format(col) for col in name_of_columns)
    placeholders = ', '.join(['%s'] * len(insert_list[0]))
    testDbCursor.executemany(
        "INSERT INTO `{}`.`{}` ({}) VALUES ({})".format(
            TEST_DB_NAME, table, columns, placeholders),
        insert_list)
testDbConn.commit()
From this GitHub thread, I notice that executemany does not work as expected in psycopg2 (relevant if you run this against Postgres rather than MySQL); it sends each entry as a separate query. There you would need to use execute_batch:
from psycopg2.extras import execute_batch

execute_batch(testDbCursor,
              "INSERT INTO {}.{} ({}) VALUES ({})".format(TEST_DB_NAME,
                                                          table,
                                                          columns,
                                                          placeholders),
              insert_list)
testDbConn.commit()
How to insert data into a table using Python pymysql
Find my solution below:
import datetime
import pymysql

# Create a connection object
dbServerName = "127.0.0.1"
port = 8889
dbUser = "root"
dbPassword = ""
dbName = "blog_flask"
# charSet = "utf8mb4"
conn = pymysql.connect(host=dbServerName, user=dbUser, password=dbPassword,
                       db=dbName, port=port)
try:
    # Create a cursor object
    cursor = conn.cursor()

    # Insert a row into the MySQL table
    now = datetime.datetime.utcnow()
    my_datetime = now.strftime('%Y-%m-%d %H:%M:%S')
    cursor.execute('INSERT INTO posts (post_id, post_title, post_content, \
        filename, post_time) VALUES (%s,%s,%s,%s,%s)',
        (5, 'title2', 'description2', 'filename2', my_datetime))
    conn.commit()
except Exception as e:
    print("Exception occurred: {}".format(e))
finally:
    conn.close()
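If you need to insert several rows at once, the same parameterized pattern works with executemany (a sketch; it assumes an open connection and cursor as created above, and the example rows are made up):

rows = [
    (6, 'title3', 'description3', 'filename3', my_datetime),
    (7, 'title4', 'description4', 'filename4', my_datetime),
]
# One statement, many parameter tuples; the driver batches the insert
cursor.executemany('INSERT INTO posts (post_id, post_title, post_content, '
                   'filename, post_time) VALUES (%s,%s,%s,%s,%s)', rows)
conn.commit()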
I'm quite new to MySQL, as in manipulating the database itself. I succeeded in storing new lines in a table, but my next endeavor will be a little more complex.
I'd like to fetch the column names from an existing MySQL database and save them to an array in Python. I'm using the official MySQL connector.
I'm thinking I can achieve this through information_schema.columns, but I have no idea how to build the query and store the information in an array. There will be around 100-200 columns, so performance might become an issue; I don't think it's wise to just iterate my way through it for each column.
The base code to insert data into MySQL using the connector is:
from mysql.connector import MySQLConnection, Error

def insert(data):
    query = "INSERT INTO templog(data) " \
            "VALUES(%s,%s,%s,%s,%s)"
    args = (data)
    try:
        db_config = read_db_config()
        conn = MySQLConnection(**db_config)
        cursor = conn.cursor()
        cursor.execute(query, args)
        #if cursor.lastrowid:
        #    print('last insert id', cursor.lastrowid)
        #else:
        #    print('last insert id not found')
        conn.commit()
        cursor.close()
        conn.close()
    except Error as error:
        print(error)
As said, the above code needs to be modified in order to get data from the SQL server. Thanks in advance!
Thanks for the help!
Got this as working code:
from sqlalchemy import create_engine

def GetNames(web_data, counter):
    # Get all column names from the database
    # Connection URL format: dialect+driver://user:password@host:port/database
    connection = create_engine('mysql+pymysql://user:pwd@server:3306/db').connect()
    result = connection.execute('select * from price_usd')
    sql_matrix = [0 for x in range(counter + 1)]
    a = 0
    # Take the column names from the result's keys
    for column in result.keys():
        a = a + 1
        if a > 1:
            sql_matrix[a] = str(column)
    return sql_matrix
This will get all column names from the existing SQL database.
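Alternatively, since the question mentions information_schema.columns, the column names can be fetched in a single round trip without selecting any row data. A sketch using the official connector (host, credentials, schema, and table names here are placeholders):

from mysql.connector import connect

# Sketch: fetch column names via information_schema in one query
conn = connect(host='localhost', user='user', password='pwd', database='db')
cursor = conn.cursor()
cursor.execute(
    "SELECT column_name FROM information_schema.columns "
    "WHERE table_schema = %s AND table_name = %s "
    "ORDER BY ordinal_position",
    ('db', 'price_usd'))
column_names = [row[0] for row in cursor.fetchall()]  # list of all column names
cursor.close()
conn.close()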
I have a situation where I created a method that inserts rows into a database. I provide that method with the columns, the values, and the table name.
COLUMNS = [['NAME','SURNAME','AGE'],['SURNAME','NAME','AGE']]
VALUES = [['John','Doe',56],['Doe','John',56]]
TABLE = 'people'
This is how I would like to pass them, but it doesn't work:
db = DB_CONN.MSSQL() #method for connecting to MS SQL or ORACLE etc.
cursor = db.cursor()
sql = "insert into %s (?) VALUES(?)" % TABLE
cursor.executemany([sql,[COLUMNS[0],VALUES[0]],[COLUMNS[1],VALUES[1]]])
db.commit()
This is how the query does work, but the problem is that I must have predefined column names, and that's not good: what if another list has a different column order? Then the name would end up in surname and the surname in name.
db = DB_CONN.MSSQL() #method for connecting to MS SQL or ORACLE etc.
cursor = db.cursor()
sql = 'insert into %s (NAME,SURNAME,AGE) VALUES (?,?,?)'
cursor.executemany(sql,[['John','Doe',56],['Doe','John',56]])
db.commit()
I hope I explained it clearly enough.
P.S. COLUMNS and VALUES are extracted from a JSON dictionary:
[{'NAME':'John','SURNAME':'Doe','AGE':56...},{'SURNAME':'Doe','NAME':'John','AGE':77...}]
if that helps.
SOLUTION:
class INSERT(object):
    def __init__(self):
        self.BASE_COL = ''

    def call(self):
        GATHER_DATA = [{'NAME': 'John', 'SURNAME': 'Doe', 'AGE': 56},
                       {'SURNAME': 'Doe', 'NAME': 'John', 'AGE': 77}]
        self.BASE_COL = ''
        TABLE = 'person'
        # check dictionary keys
        for DATA_EVAL in GATHER_DATA:
            if self.BASE_COL == '':
                self.BASE_COL = DATA_EVAL.keys()
            else:
                if self.BASE_COL != DATA_EVAL.keys():
                    print("columns in DATA_EVAL.keys() have different columns")
                    # send mail or insert to log or remove dict from list
                    exit(403)
        # if everything goes well, make an insert
        columns = ','.join(self.BASE_COL)
        sql = 'insert into %s (%s) VALUES (?,?,?)' % (TABLE, columns)
        db = DB_CONN.MSSQL()
        cursor = db.cursor()
        cursor.executemany(sql, [list(DATA_EVAL.values()) for DATA_EVAL in GATHER_DATA])
        db.commit()

if __name__ == "__main__":
    ins = INSERT()
    ins.call()
You could take advantage of the fact that Python dictionaries preserve their key-value insertion order (guaranteed since Python 3.7), so keys and values stay aligned.
You should check that all items in the JSON array of records have the same fields, otherwise you'll run into an exception in your query.
columns = ','.join(records[0].keys())
placeholders = ','.join(['?'] * len(records[0]))
sql = 'insert into %s (%s) VALUES (%s)' % (TABLE, columns, placeholders)
cursor.executemany(sql, [list(record.values()) for record in records])
References:
https://stackoverflow.com/a/835430/5189811
I am working on a Python script (Python version 2.5.1 on Windows XP) that involves connecting to a Microsoft Access (.mdb) database to read values from a table. I'm getting some unexpected results with one record, whereby the precision of the field of interest is getting rounded.
I know the Access table field of interest is a Double data type, but the value that caused me to discover this is 1107901035.43948. When I read the value in the Python code and print it out, it shows 1107901035.44.
Is there a pyODBC connection parameter or other setting that must be changed? I couldn't find anything in the documentation.
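(One thing worth ruling out first: in Python 2, str() formats a float to only 12 significant digits, while repr() shows the full stored value, so the rounding may be happening only at print time. A quick check using the value above:)

val = 1107901035.43948
print str(val)   # 1107901035.44 -- str() keeps only 12 significant digits
print repr(val)  # shows the full stored precision (up to 17 significant digits)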
Here's what my code looks like (the intention is to resolve unneeded records by identifying the record that has the greatest value for my field of interest):
import pyodbc

conn = pyodbc.connect('DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=' + pGDB)
conn.autocommit = True
cursor = conn.cursor()

# Collect user tables, skipping system and index tables
tableList = []
for x in cursor.tables():
    val = str(x[2])
    if val[0:3] != "MSy":
        if val[0:3] != "GDB":
            if val[-5:] != "Index":
                tableList.append(val)

for x in tableList:
    try:
        SQL = "SELECT * FROM %s" % (x)
        cursor.execute(SQL)
        rows = cursor.fetchall()
        counter = 0
        for row in rows:
            counter += 1
        if counter > 1:
            print "Site %s is a multipart basin" % (x)
            SQL = "SELECT MAX(Shape_Area) AS AREA FROM %s" % (x)
            cursor.execute(SQL)
            row = cursor.fetchone()
            val = row.AREA
            print str(val)
            SQL = "DELETE * FROM %s WHERE Shape_Area < %s" % (x, val)
            cursor.execute(SQL)
    except pyodbc.Error:
        # skip tables that cannot be queried
        pass
thanks,
Tom
If you're displaying the value in a Django template, note that Django can use the Jinja2 template engine, so you can use its round filter. It works as follows:
template.html
<p>{{ VALUE| round(2, 'floor') }}</p>
Check out Jinja's documentation on the topic.
The SQL ROUND function can also do this job:
SQL = "SELECT ROUND(MAX(Shape_Area), 2) AS AREA FROM %s" % (x)