I am inserting a statistic, namely a temperature value. The info column should be "temp" so that I know the log entry is about the temperature.
I'm trying to execute the following query from a Python script:
info = "temp"
sql = 'insert into stat_bus (addr, grp, info, status) ' \
      'VALUES (' + address + ', ' + group + ', ' + info + ', ' + status + ')'
I get the following error: Unknown column temp in field list
The address, group and status don't give errors.
I have tried other ways of building the string (I guess that's where the error is?) but can't get it working. Thanks in advance.
The problem is that with your query, SQL isn't interpreting temp as the string "temp", but as a column named temp, hence the "Unknown column" error.
If you print sql, you will get the following query:
insert into stat_bus (addr, grp, info, status) VALUES (something, something, temp, something)
Whereas you would want this query:
insert into stat_bus (addr, grp, info, status) VALUES ("something", "something", "temp", "something")
(Note the quotes around temp)
Just change your Python code by adding the quotes in the right places:
info = "temp"
sql = 'insert into stat_bus (addr, grp, info, status) ' \
      'VALUES ("' + address + '", "' + group + '", "' + info + '", "' + status + '")'
I have this table structure:
date_sourced
sha1
vsdt
trendx
notes
And my CSV structure: sha1,vsdt,trendx,notes
How can I insert a variable value into my date_sourced column?
I tried this:
var = "2018-1-10"
query = "LOAD DATA INFILE %s INTO TABLE jeremy_table_test FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (%s,sha1, #var1, trendx,notes) SET vsdt = TRIM(TRAILING ')' FROM TRIM(LEADING '(' FROM #var1))"
cursor.execute(query, (path,var))
but it gives me this error:
ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near ''2018-1-10',sha1, #var1, trendx,notes) SET vsdt = TRIM(TRAILING ')' FROM TRIM(LE' at line 1
Does LOAD DATA INFILE accept external variables? For example, I have these two variables:
import csv
import mysql.connector
path = 'C:\\Users\\trendMICRO\\Desktop\\OJT\\updated_test.csv'
print "CSV importing to database"
mydb = mysql.connector.connect(user='root', password='',
host='localhost',
database='jeremy_db')
cursor = mydb.cursor()
var = "apple"
query = "LOAD DATA INFILE %s INTO TABLE jeremy_table_test FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (%s, #var1, person) SET vsdt = TRIM(TRAILING ')' FROM TRIM(LEADING '(' FROM #var1))"
cursor.execute(query, (path))
mydb.commit()
How can I apply it here in my query, replacing 'path/to/rb' with my variable path and setting the value of fruit from my variable var = "apple"?
LOAD DATA INFILE 'path/to/rb' INTO TABLE jeremy_table_test FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (fruit, #var1, person) SET vsdt = TRIM(TRAILING ')' FROM TRIM(LEADING '(' FROM #var1))
cursor.execute(query)
connection.commit()
This answer assumes you are using the MySQLdb module; if you are using a different driver, the answer may vary.
To add the values, we want to use a parameterized query as follows:
var = "apple"
path = "C:\\apple.txt"
query = "LOAD DATA INFILE %s INTO TABLE jeremy_table_test FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (%s, #var1, person) SET vsdt = TRIM(TRAILING ')' FROM TRIM(LEADING '(' FROM #var1))"
cursor.execute(query, (path, var))
connection.commit()
The parameters stored in the tuple given as the second argument to cursor.execute() will be substituted for the values of %s that occur in the query string.
Unfortunately this won't work because path is not going to be used as a column value, and according to the MySQLdb user's guide:
Parameter placeholders can only be used to insert column values. They
can not be used for other parts of SQL, such as table names,
statements, etc.
So we need to do a horrible thing and write our query string by hand using the file name. This is unsafe if you allow user input to be passed into a variable like path.
We can still use a parameter for the value of var as before.
var = "apple"
path = "C:\\apple.txt"
query = "LOAD DATA INFILE '" + path + "' "
query += "INTO TABLE jeremy_table_test FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (%s, #var1, person) SET vsdt = TRIM(TRAILING ')' FROM TRIM(LEADING '(' FROM #var1))"
cursor.execute(query, (var,))
connection.commit()
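Because the file name has to be spliced into the SQL text, it is worth guarding it before it goes in. The sketch below is just one possible sanity check, not part of the original answer; the whitelist directory and the safe_load_path helper are made up for illustration, and path, var and cursor are assumed to be defined as above.

import os

# Hypothetical directory the CSV files are expected to live in.
ALLOWED_DIR = "C:\\Users\\trendMICRO\\Desktop\\OJT"

def safe_load_path(path):
    """Reject paths outside the expected directory or containing quote characters."""
    full = os.path.abspath(path)
    if not full.startswith(os.path.abspath(ALLOWED_DIR)) or "'" in full:
        raise ValueError("refusing to interpolate suspicious path: %r" % path)
    return full

query = "LOAD DATA INFILE '" + safe_load_path(path) + "' "
query += "INTO TABLE jeremy_table_test FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' IGNORE 1 LINES (%s, #var1, person) SET vsdt = TRIM(TRAILING ')' FROM TRIM(LEADING '(' FROM #var1))"
cursor.execute(query, (var,))
connection.commit()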
I have the following function:
def get_id(entityName, text):
    """Retrieve an entity's unique ID from the database, given its associated text.
    If the row is not already present, it is inserted.
    The entity can either be a sentence or a word."""
    tableName = entityName + 's'
    columnName = entityName
    cursor.execute('SELECT rowid FROM ' + tableName + ' WHERE ' + columnName + ' = ?', (text,))
    row = cursor.fetchone()
    if row:
        return row[0]
    else:
        cursor.execute('INSERT INTO ' + tableName + ' (' + columnName + ') VALUES (?)', (text,))
        return cursor.lastrowid
Whenever this method gets called, it produces this error:
cursor.execute('SELECT rowid FROM ' + tableName + ' WHERE ' + columnName + ' = ?', (text,))
InterfaceError: Error binding parameter 0 - probably unsupported type.
Currently this error only occurs when I run this in Django; otherwise it works fine.
What can be the reason?
Here the type of my parameter 0 (text) was <type 'unicode'>, while the column's data type in the database was a text type, so the error
InterfaceError: Error binding parameter 0 - probably unsupported type.
makes sense: parameter 0 was not matching the type of the database column.
I didn't catch this earlier, as I was getting the value from another source. Once I found out it was not a text type, I had to convert it, with something like str(text).
It's working like a charm now.
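For illustration only, here is a minimal sketch of that kind of defensive conversion before binding; the as_text helper is made up, and on Python 2 a non-ASCII unicode value may need value.encode('utf-8') rather than str():

def as_text(value):
    """Coerce the incoming value to a plain string before binding it as a parameter."""
    if isinstance(value, str):
        return value
    return str(value)

cursor.execute('SELECT rowid FROM words WHERE word = ?', (as_text(text),))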
I am trying to insert a row into my PostgreSQL database, with a table created from:
CREATE TABLE public.coinbase_btc_usd
(
    id bigserial primary key,
    price integer NOT NULL,
    buy integer NOT NULL,
    sell integer NOT NULL,
    "timestamp" timestamp with time zone
)
However, when my Python 3.6 script runs and tries to add a row using psycopg2 like this, it returns an error saying "no results to fetch" and nothing is added to my db.
sql_query = "INSERT INTO coinbase_btc_usd(price, buy, sell, timestamp)" \
" VALUES (" + exchange_rate + ', ' + buy_rate + ', ' + sell_rate + ", \'2015-10-10 06:44:33.8672177\')"
print(sql_query)
cur.execute(sql_query)
I also printed the sql_query variable to see exactly what was being executed, and this is what was printed:
INSERT INTO coinbase_btc_usd(price, buy, sell, timestamp) VALUES (16392.10, 16563.40, 16235.42, '2015-10-10 06:44:33.8672177')
Make sure that you are committing the transaction:
cur.execute(sql_query)
conn.commit()
Or you can enable auto commit to commit each query immediately after execution:
conn.autocommit = True
Furthermore, it costs nothing to prevent SQL injection attacks: just use parameterized queries. In fact, your code will actually be cleaner as well as safer:
sql_query = "INSERT INTO coinbase_btc_usd(price, buy, sell, timestamp) VALUES (%s, %s, %s, %s)"
cur.execute(sql_query, (exchange_rate, buy_rate, sell_rate, timestamp))
conn.commit()
Change the
sql_query = "INSERT INTO coinbase_btc_usd(price, buy, sell, timestamp)" \
" VALUES (" + exchange_rate + ', ' + buy_rate + ', ' + sell_rate + ", \'2015-10-10 06:44:33.8672177\')"
to:
sql_query = "INSERT INTO coinbase_btc_usd(price, buy, sell, timestamp)" \
" VALUES (" + exchange_rate + ', ' + buy_rate + ', ' + sell_rate + ", \'2015-10-10 06:44:33.8672177\') returning *"
This should fix "no results to fetch", I assume.
If you see no row added, you most probably began a transaction and never committed it.
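If you do take the RETURNING route, the inserted row comes back through the cursor, so the calling side would look roughly like this sketch (assuming the cur cursor and conn connection from the question):

cur.execute(sql_query)         # the INSERT ... RETURNING * statement built above
inserted_row = cur.fetchone()  # now there is a result set to fetch
conn.commit()                  # still needed so the row is persisted
print(inserted_row)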
When trying to execute the following:
def postToMySQL(date,data,date_column_name,data_column_name,table):
    cursor = conn.cursor ()
    sql = "\"\"\"INSERT INTO " + table + " (" + date_column_name + ", " + data_column_name + ") VALUES(%s, %s)" + "\"\"\"" #+ ", " + "(" + date + ", " + data + ")"
    cursor.execute(sql,(date,data))
I get this error:
_mysql_exceptions.ProgrammingError: (1064, 'You have an error in your SQL syntax... near:
\'"""INSERT INTO natgas (Date, UK) VALUES(\'2012-05-01 13:00:34\', \'59.900\')"""\' at line 1')
I'm puzzled as to where the syntax is wrong, because the following hardcoded example works fine:
def postUKnatgastoMySQL(date, UKnatgas):
    cursor = conn.cursor ()
    cursor.execute("""INSERT INTO natgas (Date, UK)VALUES(%s, %s)""", (date, UKnatgas))
Can you spot the error?
Alternately, could you tell me how to pass parameters to the field list as well as the value list?
Thanks a lot!
Those triple quotes are a way of representing a string in Python. They aren't supposed to be part of the actual query.
On another note, be very sure you trust your input with this approach. Look up SQL Injection.
\'"""INSERT INTO natgas (Date, UK) VALUES(\'2012-05-01 13:00:34\',
\'59.900\')"""\' at line 1')
This is obviously not a valid SQL command. You need to get the backslashes out of there; you are probably escaping things you shouldn't.
The triple quotes, for example, are certainly unnecessary there.
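Putting that together, a corrected version of the function might look like the sketch below. This is only an illustration: table and column names cannot be passed to the driver as %s parameters, so they are interpolated into the string after a simple whitelist check (the ALLOWED_IDENTIFIERS set is made up for the example), while the values are still bound as parameters.

ALLOWED_IDENTIFIERS = {"natgas", "Date", "UK"}  # hypothetical whitelist of known tables/columns

def postToMySQL(date, data, date_column_name, data_column_name, table):
    # Identifiers (table and column names) cannot be bound as parameters,
    # so validate them before formatting them into the statement.
    for name in (table, date_column_name, data_column_name):
        if name not in ALLOWED_IDENTIFIERS:
            raise ValueError("unexpected identifier: %r" % name)
    cursor = conn.cursor()
    sql = ("INSERT INTO " + table + " (" + date_column_name + ", " + data_column_name + ") "
           "VALUES (%s, %s)")
    cursor.execute(sql, (date, data))  # values are quoted and escaped by the driver
    conn.commit()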
I am new to Python, and am using Python & PostgreSQL (9.03) (and psycopg2 to interface between the two) in a Windows XP environment.
I am working on a huge spatial road network dataset, separating the data per country through ArcGIS geoprocessing, and automatically storing it in a PostGIS (1.5) database.
When retrieving values from the database, everything works as planned:
...
try:
    conn = psycopg2.connect("host = '" + HostName + "' dbname='" + DBName + "' user='" + Username + "' password='" + Password + "'")
    curs = conn.cursor()
except:
    print "Unable to connect to the database"

SQLStatement = "SELECT data_partition FROM datasets WHERE map_partition='" + MapPartitions[0] + "'"
curs.execute(SQLStatement)
...
When I try to pass the following UNION statement to Postgres, there is no resulting table, whereas if I take the printed SQL statement and run it directly in PostgreSQL, it creates the desired table:
conn = psycopg2.connect("host = '" + HostName + "' dbname='" + DBName + "' user='" + Username + "' password='" + Password + "'")
cur = conn.cursor()
SQLStatement = (
    "CREATE TABLE " + Schema + "." + PartitionTableName + " AS \n"
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net0 UNION \n"
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net1 UNION \n"
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net2 UNION \n"
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net3 UNION \n"
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net4 UNION \n"
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net5;\n"
    "\n"
    "\n"
    "ALTER TABLE " + Schema + "." + partName + "_Lines_" + Rel + "\n"
    "DROP COLUMN gid;\n"
)
cur.execute(SQLStatement)
conn.commit()
cur.close()
If we print the SQL Statement, this is the resulting query:
print SQLStatement
CREATE TABLE compresseddata.FRA24_Lines_2011_03 AS
SELECT * FROM compresseddata.FRA24_Lines_2011_03_Net0 UNION
SELECT * FROM compresseddata.FRA24_Lines_2011_03_Net1 UNION
SELECT * FROM compresseddata.FRA24_Lines_2011_03_Net2 UNION
SELECT * FROM compresseddata.FRA24_Lines_2011_03_Net3 UNION
SELECT * FROM compresseddata.FRA24_Lines_2011_03_Net4 UNION
SELECT * FROM compresseddata.FRA24_Lines_2011_03_Net5;
ALTER TABLE compresseddata.FRA24_Lines_2011_03
DROP COLUMN gid;
I am using variables in the query to merge different road network classes, and due to the different partitions of my dataset I need to iterate through them, but for some reason that I still cannot understand, no table is being produced.
Any ideas?
Thanx in advance for the help
The SQL you are sending is actually multiple statements, not one.
I have never tried this, but I expect execute to complain about it.
Additionally, there is a semicolon missing in the ALTER TABLE statement.
I would recommend adding exception handling to your code and executing each SQL statement separately, so you get better error reporting on what might go wrong.
Indeed Peter, this seems to be the case.
More specifically, each SQL statement must be passed separately through:
curs.execute(SQLStatement)
and then committed via:
conn.commit()
All the changes will then be apparent in the database.
Thanx again
As already mentioned, individually executing each statement and checking the exception can provide good insight into what is occurring.
In particular, psycopg2 will raise psycopg2.ProgrammingError. If the error message is not useful, you may have better luck looking up the exception's pgcode and then investigating that.
PGCodes for 9.1: http://www.postgresql.org/docs/9.1/static/errcodes-appendix.html
try:
    cur.execute(SQLQUERY)
except psycopg2.ProgrammingError as e:
    # Err code lookup at http://www.postgresql.org/docs/9.1/static/errcodes-appendix.html
    print "psycopg2 error code %s" % e.pgcode
    raise e
NOTE: A cursor's execute call CAN take multiple SQL statements in a single string.
For example, cur.execute('create table ABBA (); create table BETA ();') is a perfectly legitimate call.
For this reason, do not expect cursor.execute to perform any sanity checks on a string-only input!
I'd suggest (except in special, rare circumstances) executing each statement individually, as in the sketch below.
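As a rough sketch of that approach, assuming the conn and cur objects and the Schema, PartitionTableName, partName and Rel variables from the question (and that PartitionTableName is also the table whose gid column should be dropped), the combined string can be split into one execute per statement:

# Only the first two UNION branches are shown; the remaining _NetN selects
# from the question would be appended in the same way.
create_stmt = (
    "CREATE TABLE " + Schema + "." + PartitionTableName + " AS "
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net0 UNION "
    "SELECT * FROM " + Schema + "." + partName + "_Lines_" + Rel + "_Net1"
)
alter_stmt = "ALTER TABLE " + Schema + "." + PartitionTableName + " DROP COLUMN gid"

for statement in (create_stmt, alter_stmt):
    try:
        cur.execute(statement)
    except psycopg2.ProgrammingError as e:
        print "failed statement (pgcode %s): %s" % (e.pgcode, statement)
        raise
conn.commit()
cur.close()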