## COMMENT OUT below just for reference
""
cursor.execute ("""
CREATE TABLE yellowpages
(
business_id BIGINT(20) NOT NULL AUTO_INCREMENT,
categories_name VARCHAR(255),
business_name VARCHAR(500) NOT NULL,
business_address1 VARCHAR(500),
business_city VARCHAR(255),
business_state VARCHAR(255),
business_zipcode VARCHAR(255),
phone_number1 VARCHAR(255),
website1 VARCHAR(1000),
website2 VARCHAR(1000),
created_date datetime,
modified_date datetime,
PRIMARY KEY(business_id)
)
""")
""
## TOP COMMENT OUT (just for reference)
## code
website1g = "http://www.triman.com"
business_nameg = "Triman Sales Inc"
business_address1g = "510 E Airline Way"
business_cityg = "Gardena"
business_stateg = "CA"
business_zipcodeg = "90248"
phone_number1g = "(310) 323-5410"
phone_number2g = ""
website2g = ""
cursor.execute ("""
INSERT INTO yellowpages(categories_name, business_name, business_address1, business_city, business_state, business_zipcode, phone_number1, website1, website2)
VALUES ('%s','%s','%s','%s','%s','%s','%s','%s','%s')
""", (''gas-stations'', business_nameg, business_address1g, business_cityg, business_stateg, business_zipcodeg, phone_number1g, website1g, website2g))
cursor.close()
conn.close()
I keep getting this error
File "testdb.py", line 51
""", (''gas-stations'', business_nameg, business_address1g, business_cityg, business_stateg, business_zipcodeg, phone_number1g, website1g, website2g))
^
SyntaxError: invalid syntax
Any idea why?
Thanks in advance for the help.
Update #2: I have removed the doubled single quotes around "categories_name", but even with this simplified version:
import MySQLdb
conn = MySQLdb.connect(host="localhost",port=22008,user="cholestoff",passwd="whoami",db="mydatabase")
cursor = conn.cursor()
## Find mysql version
cursor.execute ("SELECT VERSION()")
row = cursor.fetchone()
print "server version:", row[0]
website1g = "http://www.triman.com"
business_nameg = "Triman Sales Inc"
business_address1g = "510 E Airline Way"
business_cityg = "Gardena"
business_stateg = "CA"
business_zipcodeg = "90248"
phone_number1g = "(310) 323-5410"
phone_number2g = ""
cursor.execute ("""
INSERT INTO yellowpages(categories_name, business_name)
VALUES ('%s','%s')
""", ('gas-stations', business_nameg))
cursor.close()
conn.close()
I still get this error:
server version: 5.1.33-community
Traceback (most recent call last):
File "testdb.py", line 23, in <module>
""", ('gas-stations', business_nameg))
File "C:\Python26\lib\site-packages\MySQLdb\cursors.py", line 173, in execute
self.errorhandler(self, exc, value)
File "C:\Python26\lib\site-packages\MySQLdb\connections.py", line 36, in defaulterrorhandler
raise errorclass, errorvalue
_mysql_exceptions.ProgrammingError: (1064, "You have an error in your SQL syntax;
check the manual that corresponds to your MySQL server version for the right syntax to use
near 'gas-stations'',''Triman Sales Inc'')' at line 2")
Thanks again for the help
I think your problem is here:
''gas-stations''
This gives a syntax error. You probably want to use one set of quotes:
'gas-stations'
If you want to insert the value 'gas-stations' into the database including the quotes then you can either escape the quotes or surround the string with double-quotes instead of single quotes:
"'gas-stations'"
The reason why the "up arrow" is pointing at the wrong place is because your lines are so long that it is wrapping on your console. Make your lines shorter, or widen your console window to see where the error really is.
For your second problem, you need to lose all those single-quote characters in your VALUES clause; it should look like VALUES (%s,%s), not VALUES ('%s','%s').
The general rules are very simple: for each parameter, have one place-marker (in the case of MySQLdb this is %s) in your SQL statement, and supply one Python expression in your tuple of parameters. Then lean back and let the interface software do the right thing for you. This includes quoting strings properly. Don't try to do it yourself. In particular, string expressions should be exactly what you expect to retrieve later.
Example: The business_name of a gas station is "O'Reilly's Pump'n'Go" as a Python string constant. This will end up in the constructed SQL as ...VALUES(...,'O''Reilly''s Pump''n''Go',... without you having to think about it.
You can't use doubled single-quotes (i.e. ''gas-stations'') - use either just single single-quotes ('gas-stations'), or actual double quotes ("gas-stations").
I also got this error, and I solved it by replacing '%s' with %s in the VALUES clause.
VALUES(%s,%s)
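Putting those answers together, a corrected version of the insert from the question might look like the sketch below (it assumes the same conn, cursor, and variables defined earlier in the question; note the bare %s placeholders and the single-quoted 'gas-stations' value):
cursor.execute("""
INSERT INTO yellowpages(categories_name, business_name, business_address1, business_city, business_state, business_zipcode, phone_number1, website1, website2)
VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)
""", ('gas-stations', business_nameg, business_address1g, business_cityg, business_stateg, business_zipcodeg, phone_number1g, website1g, website2g))
conn.commit()  # MySQLdb does not autocommit by default
MySQLdb quotes and escapes each value itself, so the strings end up in the table exactly as they appear in the Python variables.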
Related
I tried to insert each element from the JSON API into my Postgres table.
But I get the following error:
Traceback (most recent call last):
File "c:/Users/myname/Documents/repos/docker-playground/parse_json_to_postgres.py", line 20, in <module>
cursor.execute(f"INSERT into catfacts(data) VALUES ( {cat_fact} )")
psycopg2.errors.SyntaxError: syntax error at or near "{"
LINE 1: INSERT into catfacts(data) VALUES ( {'status': {'verified':...
^
My postgres table:
CREATE TABLE cat_facts (
id serial NOT NULL PRIMARY KEY,
data jsonb NOT NULL
);
My Python code to insert the data into the table:
import requests, json, psycopg2
cat_facts_json = requests.get('https://cat-fact.herokuapp.com/facts').json
conn = psycopg2.connect(user="postgres",
password="password",
host="localhost",
port="5432",
database="postgres")
cursor = conn.cursor()
for cat_fact in cat_facts_json():
    cursor.execute(f"INSERT into catfacts(data) VALUES ( \' {cat_fact} \' )")
API = https://cat-fact.herokuapp.com/facts
What I am trying to achieve:
INSERT INTO cat_facts(data) VALUES ('{"status":{"verified":true,"sentCount":1},"type":"cat","deleted":false,"_id":"58e008800aac31001185ed07","user":"58e007480aac31001185ecef","text":"Wikipedia has a recording of a cat meowing, because why not?","__v":0,"source":"user","updatedAt":"2020-08-23T20:20:01.611Z","createdAt":"2018-03-06T21:20:03.505Z","used":false}');
INSERT INTO cat_facts(data) VALUES ('{"status":{"verified":true,"sentCount":1},"type":"cat","deleted":false,"_id":"58e008630aac31001185ed01","user":"58e007480aac31001185ecef","text":"When cats grimace, they are usually \"taste-scenting.\" They have an extra organ that, with some breathing control, allows the cats to taste-sense the air.","__v":0,"source":"user","updatedAt":"2020-08-23T20:20:01.611Z","createdAt":"2018-02-07T21:20:02.903Z","used":false},{"status":{"verified":true,"sentCount":1},"type":"cat","deleted":false,"_id":"58e00a090aac31001185ed16","user":"58e007480aac31001185ecef","text":"Cats make more than 100 different sounds whereas dogs make around 10.","__v":0,"source":"user","updatedAt":"2020-08-23T20:20:01.611Z","createdAt":"2018-02-11T21:20:03.745Z","used":false}');
....
See JSON Adaptation in the psycopg2 documentation.
So something like:
from psycopg2.extras import Json
cursor.execute("INSERT into catfacts(data) VALUES (%s)", [Json(cat_fact)])
I got it working now:
for cat_fact in cat_facts_json:
    data = json.dumps(cat_fact)
    insert_query = "insert into cat_facts (data) values (%s) returning data"
    cursor.execute(insert_query, (data,))
conn.commit()
conn.close()
I took your comments into account, @Stefano Frazzetto and @Adrian Klaver.
json.dumps works!
I don't format the parameters directly into the query string anymore.
I still think this is pretty odd syntax, with the comma after data:
cursor.execute(insert_query, (data,))
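The trailing comma is what makes (data,) a one-element tuple; (data) on its own is just data in parentheses, and execute() expects a sequence with one entry per %s placeholder. A tiny illustration, independent of any database:
data = '{"type": "cat"}'
print(type((data)))   # <class 'str'>  - parentheses alone change nothing
print(type((data,)))  # <class 'tuple'> - the comma makes it a 1-tuple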
Though this is a repeated question, I want to know where my code is wrong, as I am facing a syntax error.
def update_block():
    table_name = input("Enter the name of the table: ")
    column_update = input("Enter the column name to be updated: ")
    column_name = input("Enter the column where the operation is to be performed: ")
    name = input("Enter the name has to get update: ")
    column_value = input("Enter the column value: ")
    try:
        sql_update_query = f"""Update {table_name} set {column_update} = %s where {column_name} = %s"""
        inputData = (f"{name},{column_value}")
        my_cursor.execute(sql_update_query, inputData)
        mydb.commit()
        print("Record Updated successfully ")
    except mysql.connector.Error as error:
        print("Failed to update record to database: {}".format(error))
    finally:
        if (mydb.is_connected()):
            my_cursor.close()
            mydb.close()
            print("MySQL connection is closed")
update_block()
The error I am getting is:
Failed to update record to database: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds
to your MySQL server version for the right syntax to use near '%s where prj_id = %s' at line 1
MySQL connection is closed
f"""Update {table_name} set {column_update} = %s where {column_name} = %s"""
You mixed f-string curly brackets and %s placeholder notation here. Use one or the other and it should work, e.g.:
f"""Update {table_name} set {column_update} = {name} where {column_name} = {column_value}"""
There are two problems with the code.
In this line,
sql_update_query = f"""Update {table_name} set {column_update} = %s where {column_name} = %s"""
the table and column names should be quoted with backticks ("`") to handle names containing spaces or hyphens (or some unicode characters). So the line would look like this.
sql_update_query = f"""Update `{table_name}` set `{column_update}` = %s where `{column_name}` = %s"""
Note that the placeholders for variables should remain as %s.
In this line
inputData = (f"{name},{column_value}" )
the variables' values are being converted to strings within a single string. But the statement expects two variables, not one. Also, it is better* to pass the raw variables to the database connection and let the connection manage formatting them correctly in the final query. So the line should be
inputData = (name, column_value)
And now the statement can be executed with the related variables
my_cursor.execute(sql_update_query, inputData)
* The driver knows how to correctly convert Python data types into those expected by the database, and how to escape and quote these values. This provides at least two benefits:
it helps prevent SQL injection attacks, where a malicious user provides an SQL statement as a variable value (such as "; DELETE FROM mytable;")
it ensures that values are processed as expected; consider this statement:
SELECT '2020-09-01' AS `Quoted Date`, 2020-09-01 AS `Unquoted Date`;
+-------------+---------------+
| Quoted Date | Unquoted Date |
+-------------+---------------+
| 2020-09-01 | 2010 |
+-------------+---------------+
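Putting both fixes together, the function from the question might end up looking roughly like this (a sketch only; it assumes the same mydb connection and my_cursor as in the question):
def update_block():
    table_name = input("Enter the name of the table: ")
    column_update = input("Enter the column name to be updated: ")
    column_name = input("Enter the column where the operation is to be performed: ")
    name = input("Enter the new value: ")
    column_value = input("Enter the column value: ")
    try:
        # backtick-quoted identifiers, %s placeholders for the values
        sql_update_query = f"UPDATE `{table_name}` SET `{column_update}` = %s WHERE `{column_name}` = %s"
        inputData = (name, column_value)  # a real 2-tuple, not one formatted string
        my_cursor.execute(sql_update_query, inputData)
        mydb.commit()
        print("Record updated successfully")
    except mysql.connector.Error as error:
        print("Failed to update record to database: {}".format(error))
    finally:
        if mydb.is_connected():
            my_cursor.close()
            mydb.close()
            print("MySQL connection is closed")
Keep in mind that the table and column names still come straight from user input; placeholders can only stand in for values, so the backticks reduce but do not remove the risk of running an unintended statement.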
I'd like to get back (via cx_Oracle in Python) the value of the identity that's created for a row that I'm inserting. I think I can figure out the Python bit on my own, if someone could please state how to modify my SQL statement to get the ID of the newly-created row.
I have a table that's created with something like the following:
CREATE TABLE hypervisor
(
id NUMBER GENERATED BY DEFAULT AS IDENTITY (
START WITH 1 NOCACHE ORDER ) NOT NULL ,
name VARCHAR2 (50)
)
LOGGING ;
ALTER TABLE hypervisor ADD CONSTRAINT hypervisor_PK PRIMARY KEY ( id ) ;
And I have SQL that's similar to the following:
insert into hypervisor ( name ) values ('my hypervisor')
Is there an easy way to obtain the id of the newly inserted row? I'm happy to modify my SQL statement to have it returned, if that's possible.
Most of the google hits on this issue were for version 11 and below, which don't support automatically-generated identity columns so hopefully someone here can help out.
Taking what user2502422 said about the returning clause and adding the Python bit:
newest_id_wrapper = cursor.var(cx_Oracle.STRING)
sql_params = { "newest_id_sql_param" : newest_id_wrapper }
sql = "insert into hypervisor ( name ) values ('my hypervisor') " + \
"returning id into :python_var"
cursor.execute(sql, sql_params)
newest_id=newest_id_wrapper.getvalue()
This example taken from learncodeshare.net has helped me grasp the correct syntax.
cur = con.cursor()
new_id = cur.var(cx_Oracle.NUMBER)
statement = 'insert into cx_people(name, age, notes) values (:1, :2, :3) returning id into :4'
cur.execute(statement, ('Sandy', 31, 'I like horses', new_id))
sandy_id = new_id.getvalue()
pet_statement = 'insert into cx_pets (name, owner, type) values (:1, :2, :3)'
cur.execute(pet_statement, ('Big Red', sandy_id, 'horse'))
con.commit()
It's only slightly different from ragerdl's answer, but different enough to be added here I believe!
Notice the absence of sql_params = { "newest_id_sql_param" : newest_id_wrapper }
Use the returning clause of the insert statement.
insert into hypervisor (name ) values ('my hypervisor')
returning id into :python_var
You said you could handle the Python bit ? You should be able to "bind" the return parameter in your program.
I liked the answer by Marco Polo, but it is incomplete.
The answer from FelDev is good too but does not address named parameters.
Here is a more complete example from code I wrote, with a simplified table (fewer fields). I have omitted the code for setting up a cursor since that is well documented elsewhere.
import cx_Oracle
INSERT_A_LOG = '''INSERT INTO A_LOG(A_KEY, REGION, DIR_NAME, FILENAME)
VALUES(A_KEY_Sequence.nextval, :REGION, :DIR_NAME, :FILENAME)
RETURNING A_KEY INTO :A_LOG_ID'''
CURSOR = None
class DataProcessor(Process):
    # Other code for setting up connection to DB and storing it in CURSOR

    def save_log_entry(self, row):
        global CURSOR
        # Oracle variable to hold value of last insert
        log_var = CURSOR.var(cx_Oracle.NUMBER)
        row['A_LOG_ID'] = log_var
        row['REGION'] = 'R7'  # Other entries set elsewhere
        try:
            # This will fail unless row.keys() =
            # ['REGION', 'DIR_NAME', 'FILE_NAME', 'A_LOG_ID']
            CURSOR.execute(INSERT_A_LOG, row)
        except Exception as e:
            row['REJCTN_CD'] = 'InsertFailed'
            raise
        # Get last inserted ID from Oracle for update
        self.last_log_id = log_var.getvalue()
        print('Insert id was {}'.format(self.last_log_id))
Agreeing with the older answers. However, depending on your version of cx_Oracle (7.0 and newer), var.getvalue() might return an array instead of a scalar.
This is to support multiple return values as stated in this comment.
Also note that cx_Oracle is deprecated and has been superseded by oracledb.
Example:
newId = cur.var(oracledb.NUMBER, outconverter=int)
sql = """insert into Locations(latitude, longitude) values (:latitude, :longitude) returning locationId into :newId"""
sqlParam = [latitude, longitude, newId]
cur.execute(sql, sqlParam)
newIdValue = newId.getvalue()
newIdValue would then be [1] instead of 1.
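If you want the plain scalar regardless of driver version, one defensive way to unwrap it might be (a sketch, reusing newId from the snippet above):
value = newId.getvalue()
# On cx_Oracle 7.0+ / oracledb the RETURNING bind comes back as a list
# (to support multi-row DML), so take the first element when it does.
newIdValue = value[0] if isinstance(value, list) else value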
I am getting a SQLite OperationalError (syntax error) for this code:
def checkIn(uname, title):
    bookid = findBookID(title)  # returns an int bookid given the title
    print bookid
    with libDB:
        checkCur = libDB.cursor()
        checkCur.execute(
            "IF NOT EXISTS(SELECT 1 FROM Checks WHERE Username =? AND bookID =?) INSERT INTO Checks VALUES(?,?)",
            (uname, bookid, uname, bookid))
        checkCur.close()
    mess = "OK::CHKIN::", uname, "::", title
    return mess
The error is:
sqlite3.OperationalError: near "IF": syntax error
This is how I defined the table:
with libDB:
    checkCur = libDB.cursor()
    checkCur.execute(
        "CREATE TABLE Checks(bookID INTEGER, Username TEXT, FOREIGN KEY(bookID) REFERENCES Books(bookID),FOREIGN KEY(Username) REFERENCES Users(Username))")
    checkCur.close()
My apologies if I am missing something simple. I looked over the code several times and searched online, and I don't see where the syntax error is. I compared my query to those I found online and it seems to match. The only thing I can think of that could be wrong is that my parameters are not correct, but I tried altering them and I still can't get it to work.
Thank you in advance for any help.
-CJ
IF NOT EXISTS is incompatible with sqlite. The insert statement you want is as follows:
INSERT INTO Checks (bookID, Username)
SELECT 7, 'Bob' /* for example */
WHERE NOT EXISTS (SELECT 1 FROM Checks WHERE bookID = 7 and Username = 'Bob');
Note that NOT EXISTS is in the WHERE clause. This sort of insert statement is compatible with sqlite. You can play with the sql fiddle here.
So in your Python function, try this instead:
insert_stmt = ("INSERT INTO Checks (bookID, Username) " # note the space at end of string
"SELECT ?, ? "
"WHERE NOT EXISTS (SELECT 1 FROM Checks WHERE bookID = ? and Username = ?)")
checkCur.execute(insert_stmt, (bookid, uname) * 2) # no need to repeat the bookid, uname combo twice; just multiply the tuple by 2
I have a JSON object in Python. I am using the Python DB-API and SimpleJson. I am trying to insert the JSON into a MySQL table.
At the moment I am getting errors, and I believe it is due to the single quotes (') in the JSON objects.
How can I insert my JSON object into MySQL using Python?
Here is the error message I get:
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7ff68f91d7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the
manual that corresponds to your MySQL server version for
the right syntax to use near ''favorited': '0',
'in_reply_to_user_id': '52063869', 'contributors':
'NULL', 'tr' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-py2.5.egg/
twitstream/twitasync.py|found_terminator|55] [twitter.py|callback|26]
[build/bdist.linux-x86_64/egg/MySQLdb/cursors.py|execute|166]
[build/bdist.linux-x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Another error for reference
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7feb9d52b7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the manual
that corresponds to your MySQL server version for the right
syntax to use near 'RT #tweetmeme The Best BlackBerry Pearl
Cell Phone Covers http://bit.ly/9WtwUO''' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-
py2.5.egg/twitstream/twitasync.py|found_terminator|55]
[twitter.py|callback|28] [build/bdist.linux-
x86_64/egg/MySQLdb/cursors.py|execute|166] [build/bdist.linux-
x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Here is a link to the code that I am using http://pastebin.com/q5QSfYLa
#!/usr/bin/env python
try:
    import json as simplejson
except ImportError:
    import simplejson

import twitstream
import MySQLdb

USER = ''
PASS = ''
USAGE = """%prog"""

conn = MySQLdb.connect(host = "",
                       user = "",
                       passwd = "",
                       db = "")

# Define a function/callable to be called on every status:
def callback(status):
    twitdb = conn.cursor ()
    twitdb.execute ("INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)",(status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status))
    # print status
    # print "%s:\t%s\n" % (status.get('user', {}).get('screen_name'), status.get('text'))

if __name__ == '__main__':
    # Call a specific API method from the twitstream module:
    # stream = twitstream.spritzer(USER, PASS, callback)
    twitstream.parser.usage = USAGE
    (options, args) = twitstream.parser.parse_args()
    if len(args) < 1:
        args = ['Blackberry']
    stream = twitstream.track(USER, PASS, callback, args, options.debug, engine=options.engine)
    # Loop forever on the streaming call:
    stream.run()
Use json.dumps(json_value) to convert your JSON object (a Python object) into a JSON string that you can insert into a text field in MySQL:
http://docs.python.org/library/json.html
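Applied to the callback in the question, that might look like this (a sketch; it reuses the question's column list and assumes status is the decoded tweet dict):
import json

twitdb.execute(
    "INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) "
    "VALUES (%s,%s,%s,%s,%s,%s)",
    (status.get('text'), status.get('created_at'), status.get('id'),
     status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'),
     json.dumps(status)))  # serialize the whole object for the json text column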
To expand on the other answers:
Basically you need to make sure of two things:
That the field you are trying to place the data in has room for the full amount of data you want to insert. Different database field types can hold different amounts of data.
See: MySQL String Datatypes. You probably want the "TEXT" or "BLOB" types.
That you are safely passing the data to the database. Some ways of passing data can cause the database to "look" at the data, and it will get confused if the data looks like SQL. It's also a security risk. See: SQL Injection.
The solution for #1 is to check that the database is designed with the correct field type.
The solution for #2 is to use parameterized (bound) queries. For instance, instead of:
# Simple, but naive, method.
# Notice that you are passing in 1 large argument to db.execute()
db.execute("INSERT INTO json_col VALUES (" + json_value + ")")
Better, use:
# Correct method. Uses parameter/bind variables.
# Notice that you are passing in 2 arguments to db.execute()
db.execute("INSERT INTO json_col VALUES (%s)", (json_value,))
Hope this helps. If so, let me know. :-)
If you are still having a problem, then we will need to examine your syntax more closely.
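A minimal end-to-end sketch of both points (the table and column names here are made up for illustration, and db is a MySQLdb-style cursor as in the snippets above):
# 1. A column type wide enough for the payload, e.g.
#    CREATE TABLE json_store (id INT AUTO_INCREMENT PRIMARY KEY, payload TEXT);

import json

payload = json.dumps({"favorited": 0, "text": "some tweet"})  # serialize first
db.execute("INSERT INTO json_store (payload) VALUES (%s)", (payload,))  # bind, don't concatenate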
The most straightforward way to insert a python map into a MySQL JSON field...
import json

python_map = { "foo": "bar", "baz": [ "biz" ] }
sql = "INSERT INTO your_table (json_column_name) VALUES (%s)"
cursor.execute( sql, (json.dumps(python_map),) )
You should be able to insert into a text or blob column easily:
db.execute("INSERT INTO json_col VALUES (%s)", (json_value,))
You need to get a look at the actual SQL string and parameters; try something like this:
sqlstr = "INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)"
params = (status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status)
print "about to execute %s with %s" % (sqlstr, params)
twitdb.execute(sqlstr, params)
I imagine you are going to find some stray quotes, brackets or parenthesis in there.
@route('/shoes', method='POST')
def createorder():
    cursor = db.cursor()
    data = request.json
    p_id = request.json['product_id']
    p_desc = request.json['product_desc']
    color = request.json['color']
    price = request.json['price']
    p_name = request.json['product_name']
    q = request.json['quantity']
    createDate = datetime.now().isoformat()
    print (createDate)
    response.content_type = 'application/json'
    print(data)
    if not data:
        abort(400, 'No data received')
    sql = "insert into productshoes (product_id, product_desc, color, price, product_name, quantity, createDate) values ('%s', '%s','%s','%d','%s','%d', '%s')" %(p_id, p_desc, color, price, p_name, q, createDate)
    print (sql)
    try:
        # Execute dml and commit changes
        cursor.execute(sql,data)
        db.commit()
        cursor.close()
    except:
        # Rollback changes
        db.rollback()
    return dumps(("OK"),default=json_util.default)
Here is one example of how to add a JSON file into MySQL using Python. The idea is to convert the JSON into an SQL INSERT statement; if there are several JSON objects, it is better to build a single INSERT call than to issue a separate INSERT INTO for each object.
# import Python's JSON lib
import json
# use JSON loads to create a list of records
test_json = json.loads('''
[
{
"COL_ID": "id1",
"COL_INT_VAULE": 7,
"COL_BOOL_VALUE": true,
"COL_FLOAT_VALUE": 3.14159,
"COL_STRING_VAULE": "stackoverflow answer"
},
{
"COL_ID": "id2",
"COL_INT_VAULE": 10,
"COL_BOOL_VALUE": false,
"COL_FLOAT_VALUE": 2.71828,
"COL_STRING_VAULE": "http://stackoverflow.com/"
},
{
"COL_ID": "id3",
"COL_INT_VAULE": 2020,
"COL_BOOL_VALUE": true,
"COL_FLOAT_VALUE": 1.41421,
"COL_STRING_VAULE": "GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer"
}
]
''')
# create a nested list of the records' values
values = [list(x.values()) for x in test_json]
# print(values)
# get the column names
columns = [list(x.keys()) for x in test_json][0]
# value string for the SQL string
values_str = ""
# enumerate over the records' values
for i, record in enumerate(values):
    # declare empty list for values
    val_list = []
    # append each value to a new list of values
    for v, val in enumerate(record):
        if type(val) == str:
            val = "'{}'".format(val.replace("'", "''"))
        val_list += [ str(val) ]
    # put parenthesis around each record string
    values_str += "(" + ', '.join( val_list ) + "),\n"
# remove the last comma and end SQL with a semicolon
values_str = values_str[:-2] + ";"
# concatenate the SQL string
table_name = "json_data"
sql_string = "INSERT INTO %s (%s)\nVALUES\n%s" % (
table_name,
', '.join(columns),
values_str
)
print("\nSQL string:\n\n")
print(sql_string)
output:
SQL string:
INSERT INTO json_data (COL_ID, COL_INT_VAULE, COL_BOOL_VALUE, COL_FLOAT_VALUE, COL_STRING_VAULE)
VALUES
('id1', 7, True, 3.14159, 'stackoverflow answer'),
('id2', 10, False, 2.71828, 'http://stackoverflow.com/'),
('id3', 2020, True, 1.41421, 'GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer.');
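The generated string can then be sent to MySQL in a single call, for example (a sketch assuming an open connection/cursor and an existing json_data table with matching columns):
cursor.execute(sql_string)
conn.commit()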
The error may be due to an overflow of the size of the field in which you try to insert your json. Without any code, it is hard to help you.
Have you considered a NoSQL database system such as CouchDB, which is a document-oriented database relying on the JSON format?
Here's a quick tip if you want to write some inline code, say for a small JSON value, without importing json.
You can escape quotes in SQL by a double quoting, i.e. use '' or "", to enter ' or ".
Sample Python code (not tested):
q = 'INSERT INTO `table`(`db_col`) VALUES ("{k:""some data"";}")'
db_connector.execute(q)