How to update a Postgres array in Python

I selected the column "user_list" from the 'users' table and fetched it into a Python variable called "u_list". I appended 'item' to it and tried to update "user_list", but got a lot of errors. I tried searching on Stack Overflow, but nothing helped.
code:
cursor.execute(f'SELECT user_list FROM users WHERE id=442392434899681280')
u_list = cursor.fetchone()[0]
u_list.append('item')
cursor.execute('UPDATE users SET user_list = {} WHERE id = 442392434899681280'.format(u_list))
data_base.commit()
but got an error:
Traceback (most recent call last):
File "d:\workspace\sabo\test.py", line 30, in <module>
cursor.execute('UPDATE users SET user_list = {} WHERE id = 442392434899681280'.format(u_list))
psycopg2.errors.SyntaxError: syntax error at or near "["
LINE 1: UPDATE users SET user_list = ['item'] WHERE id = 4423924348996...
Another try and error
code:
cursor.execute(f'SELECT user_list FROM users WHERE id=442392434899681280')
u_list = cursor.fetchone()[0]
u_list.append('item')
cursor.execute("UPDATE users SET user_list= (%s) WHERE id = 442392434899681280", (u_list))
data_base.commit()
error:
File "d:\workspace\sabo\test.py", line 33, in <module>
cursor.execute("UPDATE users SET user_list= (%s) WHERE id = 442392434899681280", (u_list))
psycopg2.errors.InvalidTextRepresentation: malformed array literal: "item"
LINE 1: UPDATE users SET user_list= ('item') WHERE id = 4423924348996...
^
DETAIL: Array value must start with "{" or dimension information.

Think about it as if you were typing the query yourself. As the error message states, PostgreSQL arrays must start with '{', so the values have to be inside curly braces. If you wrote the query in SQL yourself, it would look like this:
UPDATE users SET user_list = '{"foo", "bar", "item"}' WHERE id = 442392434899681280;
Handcrafted way
So in Python, it would have to be done like this:
cursor.execute(f'SELECT user_list FROM users WHERE id=442392434899681280')
u_list = cursor.fetchone()[0]
u_list.append('item')
cursor.execute('UPDATE users SET user_list = \'{{{}}}\' WHERE id = 442392434899681280'.format(','.join(['"{}"'.format(v) for v in u_list])))
data_base.commit()
Notice the three curly braces in the format string: two of them escape so that a single '{' remains, and the third is the formatting placeholder. Also, if your list is not a list of strings, you will have to convert the values before joining.
Let psycopg2 handle it
psycopg2's docs also state that Python lists are adapted to PostgreSQL ARRAYs, so the above can be done like this:
cursor.execute("UPDATE users SET user_list= %s WHERE id = 442392434899681280", (u_list,))
In your code sample you are missing a comma after the list (so it is not passed as a one-element tuple), and there is an extra pair of parentheses around %s.
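Putting it together, here is a minimal sketch of the full round trip with parameterized queries, assuming the existing data_base connection and cursor from the question and that user_list is a text[] column:
# psycopg2 returns a text[] column as a Python list, and adapts a Python
# list back to a PostgreSQL array when passed as a bound parameter.
cursor.execute("SELECT user_list FROM users WHERE id = %s", (442392434899681280,))
u_list = cursor.fetchone()[0]
u_list.append('item')

# Note the parameter tuple (u_list, 442392434899681280) supplying both values.
cursor.execute("UPDATE users SET user_list = %s WHERE id = %s",
               (u_list, 442392434899681280))
data_base.commit()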

Related

Python - SQL Connector: UPDATE does not work

I want to update records in MySQL. However, I always get an error saying the syntax is wrong. I think it is a formatting error, but I can't manage to fix it.
Error message:
mysql.connector.errors.ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''ovd' = 1 WHERE id = '16923'' at line 1
My code looks like:
func = ['OffizierIn vom Dienst Bezirk', 'EinsatzoffizierIn']
dbFields = ["ovd", "offizier"]
x = 0
for i in func:
    el = chrome.find_element_by_id('RoleId')
    for option in el.find_elements_by_tag_name('option'):
        if option.text == i:
            option.click()
            chrome.find_element_by_id('SearchBtn').submit()
            time.sleep(2)
            tbody = chrome.find_element_by_id('SearchResults')
            for row in tbody.find_elements_by_xpath('./tr'):
                itemsEmployee = row.find_elements_by_xpath('./td')
                cursor.execute('UPDATE employees SET %s = 1 WHERE id = %s;', (dbFields[x], itemsEmployee[1].text))
    x = x + 1
In the first pass, the values are as in the error message: dbFields[x] = ovd and itemsEmployee[1] = 16923.
The table was created as follows:
cursor.execute('CREATE TABLE IF NOT EXISTS employees (id INT NOT NULL UNIQUE, ovd BOOLEAN);')
You've encountered one of the annoyances of writing dynamic database queries: values are quoted (when necessary) with quotation marks, and the connector package does that for you, but table and column names are identifiers and, if quoted at all, are quoted with backticks. Placeholders can therefore only be used for values, not for column names. See the MySQL identifier rules.
You need to add the column name using string formatting, then pass the value to a prepared statement:
stmt = f'UPDATE employees SET `{dbFields[x]}` = 1 WHERE id = %s;'
cursor.execute(stmt, (itemsEmployee[1].text,))
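In the context of the loop above, the inner part would then look roughly like this (same Selenium calls, only the statement construction changed):
# Sketch: the column name goes in via string formatting as a backtick-quoted
# identifier; the id value is bound as a parameter.
for row in tbody.find_elements_by_xpath('./tr'):
    itemsEmployee = row.find_elements_by_xpath('./td')
    stmt = f'UPDATE employees SET `{dbFields[x]}` = 1 WHERE id = %s;'
    cursor.execute(stmt, (itemsEmployee[1].text,))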

Python - SQLite For Loop OperationalError

I've been given a messy Excel file that I am trying to transfer into tidied SQL tables. I am building a Python program to do the transfer. There is a list of names, some of which are repeated. I've already gotten the column with all the names into a list of tuples like:
[('John Christopher',), ('Elizabeth Smith',), ...]
I've built an SQL table called Players and have an empty column named "id". I'm trying to iterate through this list and assign a unique id to each player in the Table and then also delete the duplicate names in my table.
However I keep getting this error:
cursor.execute("UPDATE Players SET id = "+str(id)+" WHERE name = "+str(item[0]))
sqlite3.OperationalError: near "Christopher": syntax error
What is my issue?
Here is the code:
import sqlite3

player_list_start = cursor.execute("SELECT name FROM Players")
saved_list = player_list_start.fetchall()

# number id should start on
id = 1
# list of players to keep track if they are already in the table
names = []

for item in saved_list:
    if item[0] not in names:
        cursor.execute("UPDATE Players SET id = "+str(id)+" WHERE name = "+str(item[0]))
        connection.commit()
        names.append(item[0])
        id += 1
    else:
        cursor.execute("DELETE FROM Players WHERE name = "+str(item[0])+" AND id = NULL")
        connection.commit()
It definitely says sqlite3.OperationalError: near "Christopher": syntax error.
The following line generates a syntactically invalid SQL statement because it does not quote strings (especially those containing spaces):
cursor.execute("DELETE FROM Players WHERE name = "+str(item[0])+" AND id = NULL")
Use
cursor.execute("DELETE FROM Players WHERE name = '"+str(item[0])+"' AND id = NULL")
or
cursor.execute("DELETE FROM Players WHERE name = ? AND id = NULL", (str(item[0]),))

Error with string formatting?

Having trouble here. It worked when I had multiple placeholders like (%s,%s) and data was (user, pass) or something like that.
However, with the following code I keep getting this error.
query = query % tuple([db.literal(item) for item in args])
TypeError: not all arguments converted during string formatting
Why does this keep happening? It only occurs when there is a single argument.
This code is from my Flask application:
username = request.values['username']
update_stmt = (
"UPDATE ACCOUNTS SET IN_USE = 1 WHERE USER = '(%s)'"
)
data = (username)
cursor.execute(update_stmt,data)
For a single-valued tuple to be recognized as a tuple, you need a trailing comma:
data = (username,)
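A quick illustration of why the comma matters (the value here is just for the example):
username = "alice"            # hypothetical value
print(type((username)))       # <class 'str'>   -- parentheses alone don't make a tuple
print(type((username,)))      # <class 'tuple'> -- the trailing comma does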
And, unrelated, you don't really need the quotes around the placeholder in your query:
"UPDATE ACCOUNTS SET IN_USE = 1 WHERE USER = (%s)"
Your full code should be
username = request.values['username']
update_stmt = (
"UPDATE ACCOUNTS SET IN_USE = 1 WHERE USER = (%s)"
)
data = (username,)
cursor.execute(update_stmt,data)

Inserting JSON into MySQL using Python

I have a JSON object in Python. I am using the Python DB-API and SimpleJson. I am trying to insert the JSON into a MySQL table.
At the moment I am getting errors, and I believe it is due to the single quotes ' in the JSON objects.
How can I insert my JSON object into MySQL using Python?
Here is the error message I get:
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7ff68f91d7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the
manual that corresponds to your MySQL server version for
the right syntax to use near ''favorited': '0',
'in_reply_to_user_id': '52063869', 'contributors':
'NULL', 'tr' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-py2.5.egg/
twitstream/twitasync.py|found_terminator|55] [twitter.py|callback|26]
[build/bdist.linux-x86_64/egg/MySQLdb/cursors.py|execute|166]
[build/bdist.linux-x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Another error for reference
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7feb9d52b7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the manual
that corresponds to your MySQL server version for the right
syntax to use near 'RT #tweetmeme The Best BlackBerry Pearl
Cell Phone Covers http://bit.ly/9WtwUO''' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-
py2.5.egg/twitstream/twitasync.py|found_terminator|55]
[twitter.py|callback|28] [build/bdist.linux-
x86_64/egg/MySQLdb/cursors.py|execute|166] [build/bdist.linux-
x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Here is a link to the code that I am using http://pastebin.com/q5QSfYLa
#!/usr/bin/env python
try:
    import json as simplejson
except ImportError:
    import simplejson
import twitstream
import MySQLdb

USER = ''
PASS = ''
USAGE = """%prog"""

conn = MySQLdb.connect(host = "",
                       user = "",
                       passwd = "",
                       db = "")

# Define a function/callable to be called on every status:
def callback(status):
    twitdb = conn.cursor()
    twitdb.execute("INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)", (status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status))
    # print status
    # print "%s:\t%s\n" % (status.get('user', {}).get('screen_name'), status.get('text'))

if __name__ == '__main__':
    # Call a specific API method from the twitstream module:
    # stream = twitstream.spritzer(USER, PASS, callback)
    twitstream.parser.usage = USAGE
    (options, args) = twitstream.parser.parse_args()
    if len(args) < 1:
        args = ['Blackberry']
    stream = twitstream.track(USER, PASS, callback, args, options.debug, engine=options.engine)
    # Loop forever on the streaming call:
    stream.run()
Use json.dumps(json_value) to convert your JSON object (Python object) into a JSON string that you can insert into a text field in MySQL.
http://docs.python.org/library/json.html
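For example, a minimal sketch; the table and column names here are made up for illustration:
import json
import MySQLdb

# Hypothetical table "tweets" with a TEXT column "payload".
conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="test")
cur = conn.cursor()

status = {"text": "hello 'world'", "favorited": "0"}   # example object
# json.dumps serializes the dict; the placeholder lets the driver handle quoting.
cur.execute("INSERT INTO tweets (payload) VALUES (%s)", (json.dumps(status),))
conn.commit()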
To expand on the other answers:
Basically you need to make sure of two things:
That the field you are inserting into has room for the full amount of data you want to place there. Different database field types can hold different amounts of data.
See: MySQL String Datatypes. You probably want the "TEXT" or "BLOB" types.
That you are safely passing the data to the database. Some ways of passing data can cause the database to "look" at the data, and it will get confused if the data looks like SQL. It's also a security risk. See: SQL Injection
The solution for #1 is to check that the database is designed with correct field type.
The solution for #2 is to use parameterized (bound) queries. For instance, instead of:
# Simple, but naive, method.
# Notice that you are passing in 1 large argument to db.execute()
db.execute("INSERT INTO json_col VALUES (" + json_value + ")")
Better, use:
# Correct method. Uses parameter/bind variables.
# Notice that you are passing in 2 arguments to db.execute()
db.execute("INSERT INTO json_col VALUES %s", json_value)
Hope this helps. If so, let me know. :-)
If you are still having a problem, then we will need to examine your syntax more closely.
The most straightforward way to insert a python map into a MySQL JSON field...
python_map = { "foo": "bar", "baz": [ "biz" ] }
sql = "INSERT INTO your_table (json_column_name) VALUES (%s)"
cursor.execute( sql, (json.dumps(python_map),) )
You should be able to insert into a text or blob column easily:
db.execute("INSERT INTO json_col VALUES (%s)", (json_value,))
You need to get a look at the actual SQL string; try something like this:
sqlstr = "INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)"
args = (status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status)
print "about to execute(%s, %s)" % (sqlstr, args)
twitdb.execute(sqlstr, args)
I imagine you are going to find some stray quotes, brackets or parenthesis in there.
@route('/shoes', method='POST')
def createorder():
    cursor = db.cursor()
    data = request.json
    p_id = request.json['product_id']
    p_desc = request.json['product_desc']
    color = request.json['color']
    price = request.json['price']
    p_name = request.json['product_name']
    q = request.json['quantity']
    createDate = datetime.now().isoformat()
    print(createDate)
    response.content_type = 'application/json'
    print(data)
    if not data:
        abort(400, 'No data received')
    sql = "insert into productshoes (product_id, product_desc, color, price, product_name, quantity, createDate) values ('%s', '%s', '%s', '%d', '%s', '%d', '%s')" % (p_id, p_desc, color, price, p_name, q, createDate)
    print(sql)
    try:
        # Execute dml and commit changes
        cursor.execute(sql)
        db.commit()
        cursor.close()
    except:
        # Rollback changes
        db.rollback()
    return dumps("OK", default=json_util.default)
One example of how to add a JSON file into MySQL using Python. The JSON file has to be converted into an SQL INSERT; if there are several JSON objects, it is better to issue a single INSERT than to call INSERT INTO once per object.
# import Python's JSON lib
import json

# use JSON loads to create a list of records
test_json = json.loads('''
[
  {
    "COL_ID": "id1",
    "COL_INT_VAULE": 7,
    "COL_BOOL_VALUE": true,
    "COL_FLOAT_VALUE": 3.14159,
    "COL_STRING_VAULE": "stackoverflow answer"
  },
  {
    "COL_ID": "id2",
    "COL_INT_VAULE": 10,
    "COL_BOOL_VALUE": false,
    "COL_FLOAT_VALUE": 2.71828,
    "COL_STRING_VAULE": "http://stackoverflow.com/"
  },
  {
    "COL_ID": "id3",
    "COL_INT_VAULE": 2020,
    "COL_BOOL_VALUE": true,
    "COL_FLOAT_VALUE": 1.41421,
    "COL_STRING_VAULE": "GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer"
  }
]
''')

# create a nested list of the records' values
values = [list(x.values()) for x in test_json]
# print(values)

# get the column names
columns = [list(x.keys()) for x in test_json][0]

# value string for the SQL string
values_str = ""

# enumerate over the records' values
for i, record in enumerate(values):
    # declare empty list for values
    val_list = []
    # append each value to a new list of values
    for v, val in enumerate(record):
        if type(val) == str:
            val = "'{}'".format(val.replace("'", "''"))
        val_list += [ str(val) ]
    # put parentheses around each record string
    values_str += "(" + ', '.join(val_list) + "),\n"

# remove the last comma and end SQL with a semicolon
values_str = values_str[:-2] + ";"

# concatenate the SQL string
table_name = "json_data"
sql_string = "INSERT INTO %s (%s)\nVALUES\n%s" % (
    table_name,
    ', '.join(columns),
    values_str
)

print("\nSQL string:\n\n")
print(sql_string)
output:
SQL string:
INSERT INTO json_data (COL_ID, COL_INT_VAULE, COL_BOOL_VALUE, COL_FLOAT_VALUE, COL_STRING_VAULE)
VALUES
('id1', 7, True, 3.14159, 'stackoverflow answer'),
('id2', 10, False, 2.71828, 'http://stackoverflow.com/'),
('id3', 2020, True, 1.41421, 'GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer.');
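As an alternative to building the literal string by hand, the same multi-row insert can be done with bound parameters; a short sketch, reusing the columns and values lists from the code above and assuming an open MySQL cursor named cur:
# Sketch: one INSERT with placeholders, so the driver handles quoting/escaping.
placeholders = ', '.join(['%s'] * len(columns))
sql = "INSERT INTO json_data (%s) VALUES (%s)" % (', '.join(columns), placeholders)
cur.executemany(sql, [tuple(record) for record in values])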
The error may be due to overflowing the size of the field in which you try to insert your JSON. Without any code, it is hard to help you.
Have you considered a NoSQL database system such as CouchDB, which is a document-oriented database relying on the JSON format?
Here's a quick tip if you want to write some inline code, say for a small JSON value, without importing json.
You can escape quotes in SQL by doubling them, i.e. use '' or "" to enter ' or ".
Sample Python code (not tested):
q = 'INSERT INTO `table`(`db_col`) VALUES ("{k:""some data"";}")'
db_connector.execute(q)

Getting an error on inserting tuple values into a PostgreSQL table using Python

I want to save a last.fm user's recent music tracks list to a PostgreSQL database table using the pylast interface. But when I try to insert values into the table it shows errors. Code example:
import pylast
import psycopg2
import re
from md5 import md5
import sys
import codecs
import psycopg2.extensions

psycopg2.extensions.register_type(psycopg2.extensions.UNICODE)

user_name = raw_input("Enter last.fm username: ")
user_password = raw_input("Enter last.fm password: ")
api_key = '*********'
api_secret = '********'

# Lastfm network authentication
md5_user_password = md5(user_password).hexdigest()
network = pylast.get_lastfm_network(api_key, api_secret, user_name, md5_user_password)
used = pylast.User(user_name, network)
recent_tracks = used.get_recent_tracks(10)

# Database connection
try:
    conn = psycopg2.connect("dbname='**' user='postgres' host='localhost' password='*'")
    conn.set_client_encoding('UNICODE')
except:
    print "I am unable to connect to the database, exiting."
    sys.exit()

cur = conn.cursor()
for i, artist in enumerate(recent_tracks):
    for key in sorted(artist):
        cur.execute("""
            INSERT INTO u_recent_track(Playback_date,Time_stamp,Track)
            VALUES (%s,%s,%s)""", (key, artist[key]))
        conn.commit()

cur.execute("SELECT * FROM u_recent_track;")
cur.fetchone()
for row in cur:
    print ' '.join(row[1:])
cur.close()
conn.close()
Here "recent_tracks" tuple have the values for example:
artist 0
- playback_date : 5 May 2010, 11:14
- timestamp : 1273058099
- track : Brian Eno - Web
I want to store these values under u_recent_track(Tid, Playback_date, Time_stamp, Track). Does anybody have an idea how to sort out this problem? When I try to run it, it shows this error:
Traceback (most recent call last):
File "F:\JavaWorkspace\Test\src\recent_track_database.py", line 50, in <module>
VALUES (%s,%s,%s)""", (key, artist[key]))
IndexError: tuple index out of range
sorted(artist) returns an ordered list of artist's elements; when you iterate over it, it still yields elements of artist. So when you try to access artist[key], you are actually indexing artist with something that is itself an element of artist. Tuples do not work this way.
It seems you're using Python 2.5 or lower, and in that case you could do:
cur.executemany("""
INSERT INTO u_recent_track(Playback_date,Time_stamp,Track)
VALUES (%(playback_date)s,%(timestamp)s,%(track)s)""", recent_tracks)
conn.commit()
This should work.
This error isn't anything to do with Postgres, but with the artist variable. You're first writing:
for key in sorted(artist):
implying that it's a list, but then you're accessing it as if it were a dictionary, which raises an error. Which is it? Can you show an example of the full contents?
(Playback_date,Time_stamp,Track) indicates you want to insert three values into a row.
VALUES (%s,%s) should therefore be VALUES (%s,%s,%s)
and (key, artist[key]) should be a tuple with 3 elements, not 2.
Try:
for track in recent_tracks:
    cur.execute("""
        INSERT INTO u_recent_track(Playback_date,Time_stamp,Track)
        VALUES (%s,%s,%s)""", (track.get_date(), track.get_timestamp(), track.get_track()))
conn.commit()
PS. This is where I'm getting my information about the pylast API.
PPS. If my reading of the documentation is correct, track.get_track() will return a Track object. It has methods like get_album, get_artist, get_id and get_title. Exactly what do you want stored in the Track column of the u_recent_track database table?
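If the goal is the "Brian Eno - Web" style string shown in the question, one possible variation of the code above, assuming the get_artist() and get_title() methods mentioned in the previous paragraph behave as documented:
# Sketch: store the track as "Artist - Title", matching the question's sample data.
# Assumes each item in recent_tracks exposes get_date(), get_timestamp() and
# get_track(), and that the returned Track object exposes get_artist()/get_title().
for played in recent_tracks:
    t = played.get_track()
    track_str = "%s - %s" % (t.get_artist(), t.get_title())
    cur.execute("""
        INSERT INTO u_recent_track(Playback_date, Time_stamp, Track)
        VALUES (%s, %s, %s)""",
        (played.get_date(), played.get_timestamp(), track_str))
conn.commit()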
