I want to insert Hindi sentences into a MySQL database.
The problem is that the Hindi sentences become garbled once they are inserted.
I have set the encoding format to UTF-8; my code is as follows.
Thanks a lot!
# coding: utf-8
import MySQLdb
import sys
reload(sys)
sys.setdefaultencoding('utf-8')
dbs = MySQLdb.connect(host='x.x.x.x', user='x', passwd='x', db='x', port=x)
cursor = dbs.cursor()
with open('hindi.wiki.set', 'r') as file:
    count = 1
    for line in file.readlines():
        if count == 5:
            break
        sql = """insert into `lab_ime_test_set_2` (id_, type_, lang_, text_, anno_) values(%s, %s, %s,'%s', %s)""" % ("null", "'wiki'", "'hindi'", MySQLdb.escape_string(line.strip()), "'not_anno'")
        try:
            cursor.execute(sql)
            dbs.commit()
        except Exception as eh:
            print("error")
print("total count", count)
cursor.close()
dbs.close()
The SQL can be run in Navicat for MySQL and the Hindi text is shown correctly there.
But when I run this code, the sentences are inserted into the MySQL database as well, yet they can't be shown correctly,
such as "संतरे के जायके वाले मूल टैंग को 1957 में जनरल फूडà¥à¤¸ कॉरपोरेशन के लिठविलियम à¤"
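What usually causes this kind of mojibake is that the client connection itself is not using UTF-8, regardless of the table's charset. Below is a minimal sketch of the usual fix, reusing the same table and placeholder connection details as above (they are not real values): set the connection charset and let the driver bind the value instead of splicing it into the SQL.
# -*- coding: utf-8 -*-
# Sketch only: host, credentials and table name are the placeholders from the question.
import MySQLdb

dbs = MySQLdb.connect(host='x.x.x.x', user='x', passwd='x', db='x',
                      charset='utf8',    # make the client connection speak UTF-8
                      use_unicode=True)
cursor = dbs.cursor()

with open('hindi.wiki.set', 'r') as f:
    for line in f:
        # Let the driver escape and encode the value.
        cursor.execute(
            "INSERT INTO `lab_ime_test_set_2` (id_, type_, lang_, text_, anno_) "
            "VALUES (NULL, %s, %s, %s, %s)",
            ('wiki', 'hindi', line.strip(), 'not_anno'))
    dbs.commit()

cursor.close()
dbs.close()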
My Python 3.6 code is supposed to create a database and a table inside it.
import sqlite3
db_filename = 'database.db'
connect = sqlite3.connect(db_filename)
c = connect.cursor()
c.execute('CREATE TABLE IF NOT EXISTS task (id number PRIMARY KEY, priority integer, details text, status text)')
connect.commit()
connect.close()
However, the output is not what I intended. I am getting weird characters in the .db file:
SQLite format 3 # .�
� b b� k�9tabletasktaskCREATE TABLE task (id number PRIMARY KEY, priority integer, details text, status text)'; indexsqlite_autoindex_task_1task
If anyone could tell me where I went wrong I would be grateful.
Thanks.
There is nothing wrong here. To view a .db file you need a DB viewer or reader tool. http://sqlitebrowser.org/ has DB Browser for SQLite, which can be used to view your database. You can install it and use it to read your .db file.
If you want to use the table, you can do so by inserting elements into it and viewing them as follows:
import sqlite3

db_filename = 'database.db'
connect = sqlite3.connect(db_filename)
c = connect.cursor()
c.execute('CREATE TABLE IF NOT EXISTS task (id number PRIMARY KEY, priority integer, details text, status text)')
c.execute("INSERT INTO task (id, priority, details, status) VALUES (1, 22, 'ABC', 'YES')")
cursor = c.execute("SELECT id, priority, details, status FROM task")
for row in cursor:
    print("ID = ", row[0])
    print("PRIORITY = ", row[1])
    print("DETAILS = ", row[2])
    print("STATUS = ", row[3], "\n")
connect.commit()
connect.close()
OUTPUT:
ID = 1
PRIORITY = 22
DETAILS = ABC
STATUS = YES
I'm trying to insert/update a very long SQL value. I tried formatting it into the query string, but it has no effect on the MySQL table:
import MySQLdb
db = MySQLdb.connect(host=host, user=user, passwd=passwd, db=db)
cur = db.cursor()
str1 = 'x'
str2 = 'y'
s = """a:150:{i:0;b:0;s:12:"social_icons";a:9:{s:8:"facebook";s:1:"1";s:7:"twitter";s:1:"1";s:5:"email";s:1:"1";s:9:"pinterest";s:1:"1";s:10:"googleplus";s:1:"1";s:8:"linkedin";s:1:"0";s:2:"vk";s:1:"0";s:6:"tumblr";s:1:"0";s:8:"whatsapp";s:1:"0";}s:7:"backups";N;s:9:"smof_init";s:31:"Tue, 16 Aug 2016 23:03:59
+0000";s:17:"minified_flatsome";i:0;s:16:"flatsome_builder";i:1;s:13:"flatsome_docs";i:1;s:16:"maintenance_mode";i:0;s:21:"maintenance_mode_text";s:24:"Please check back soon..";s:19:"html_scripts_header";s:0:"";s:19:"html_scripts_footer";s:0:"";s:15:"html_custom_css";s:6:"div {}";s:22:"html_custom_css_mobile";s:0:"";s:9:"site_logo";s:57:"[site_url]/wp-content/uploads/2016/08/Logomakr_8dvMrU.png";s:0:"";s:165:"Favicon upload has moved to: <br/> <a href=''[site_url]/wp-admin/customize.php?&autofocus%5Bpanel%5Dof-option-logoandicons''>Appearance
> Customize > Site Identity</a>";s:16:"custom_cart_icon";s:0:"";s:14:"site_logo_dark";s:0:"";s:16:"site_logo_sticky";s:0:"";s:11:"body_layout";s:10:"full-width";s:10:"box_shadow";i:0;s:7:"body_bg";s:0:"";s:13:"body_bg_image";s:0:"";s:12:"body_bg_type";s:12:"bg-full-size";s:13:"content_color";s:5:"light";s:10:"content_bg";s:4:"#FFF";s:13:"header_preset";s:0:"";s:13:"header_height";s:3:"120";s:10:"logo_width";s:3:"210";s:13:"logo_position";s:6:"center";s:10:"search_pos";s:4:"left";s:12:"nav_position";s:3:"top";s:8:"nav_size";s:3:"80%";s:18:"myaccount_dropdown";i:1;s:19:"account_login_style";s:4:"link";s:9:"show_cart";i:1;s:14:"top_right_text";s:0:"";s:13:"header_sticky";i:1;s:20:"header_height_sticky";s:2:"70";s:12:"header_color";s:5:"light";s:9:"header_bg";s:4:"#fff";s:13:"header_bg_img";s:0:"";s:17:"header_bg_img_pos";s:8:"repeat-x";s:11:"topbar_show";i:1;s:9:"topbar_bg";s:0:"";s:11:"topbar_left";s:0:"";s:12:"topbar_right";s:0:"";s:15:"nav_position_bg";s:4:"#eee";s:18:"nav_position_color";s:5:"light";s:17:"nav_position_text";s:0:"";s:21:"nav_position_text_top";s:0:"";s:17:"html_after_header";s:0:"";s:10:"html_intro";s:0:"";s:16:"footer_left_text";s:158:"Copyright [ux_current_year] © <strong>asd.com</strong>.<br> <img src=''http://www.aasd.com/wp-content/uploads/2015/12/creditcard-icons.png''>";s:17:"footer_right_text";s:137:"<img src="http://gninggel.com/wp-content/uploads/2016/08/comodo_secure_seal_100x85_transp.png"><br> %s <br>%s";s:14:"footer_1_color";s:5:"light";s:17:"footer_1_bg_color";s:7:"#ff3233";s:17:"footer_1_bg_image";s:0:"";s:16:"footer_1_columns";s:7:"large-3";s:14:"footer_2_color";s:4:"dark";s:17:"footer_2_bg_color";s:7:"#772222";s:17:"footer_2_bg_image";s:0:"";s:16:"footer_2_columns";s:7:"large-3";s:19:"footer_bottom_style";s:4:"dark";s:19:"footer_bottom_color";s:4:"#333";s:18:"html_before_footer";s:0:"";s:17:"html_after_footer";s:0:"";s:13:"disable_fonts";i:0;s:13:"type_headings";s:4:"Lato";s:10:"type_texts";s:4:"Lato";s:8:"type_nav";s:4:"Lato";s:8:"type_alt";s:14:"Dancing Script";s:11:"type_subset";a:7:{s:5:"latin";s:1:"1";s:12:"cyrillic-ext";s:1:"0";s:9:"greek-ext";s:1:"0";s:5:"greek";s:1:"0";s:10:"vietnamese";s:1:"0";s:9:"latin-ext";s:1:"0";s:8:"cyrillic";s:1:"0";}s:11:"custom_font";s:0:"";s:13:"color_primary";s:7:"#dd3333";s:15:"color_secondary";s:7:"#eeee22";s:13:"color_success";s:7:"#7a9c59";s:11:"color_links";s:0:"";s:13:"button_radius";s:3:"0px";s:15:"dropdown_border";s:0:"";s:11:"dropdown_bg";s:0:"";s:13:"dropdown_text";s:5:"light";s:11:"blog_layout";s:13:"right-sidebar";s:10:"blog_style";s:11:"blog-normal";s:18:"blog_archive_title";i:1;s:11:"blog_header";s:1:" ";s:15:"blog_after_post";s:1:" 
";s:16:"blog_post_layout";s:13:"right-sidebar";s:15:"blog_post_style";s:7:"default";s:15:"blog_author_box";i:1;s:10:"blog_share";i:0;s:13:"blog_parallax";i:0;s:19:"featured_items_page";i:0;s:22:"featured_items_pr_page";s:2:"12";s:22:"featured_items_related";s:7:"default";s:29:"featured_items_related_height";s:5:"250px";s:16:"wc_account_links";i:1;s:14:"facebook_login";i:0;s:17:"facebook_login_bg";s:0:"";s:14:"color_checkout";s:0:"";s:10:"color_sale";s:0:"";s:16:"color_new_bubble";s:7:"#7a9c59";s:12:"color_review";s:0:"";s:15:"product_sidebar";s:12:"left_sidebar";s:25:"product_offcanvas_sidebar";s:1:"0";s:15:"product_display";s:4:"tabs";s:18:"cart_dropdown_show";s:1:"1";s:16:"shop_aside_title";s:0:"";s:12:"product_zoom";s:1:"0";s:16:"related_products";s:6:"slider";s:23:"related_products_pr_row";s:1:"4";s:20:"max_related_products";s:2:"12";s:15:"disable_reviews";s:1:"0";s:9:"tab_title";s:0:"";s:11:"tab_content";s:0:"";s:23:"html_before_add_to_cart";s:1:" ";s:22:"html_after_add_to_cart";s:0:"";s:14:"html_shop_page";s:0:"";s:16:"category_sidebar";s:12:"left-sidebar";s:10:"grid_style";s:5:"grid1";s:10:"grid_frame";s:6:"normal";s:12:"masonry_grid";s:1:"0";s:16:"add_to_cart_icon";s:7:"disable";s:25:"short_description_in_grid";s:1:"0";s:9:"cat_style";s:10:"text-badge";s:15:"breadcrumb_size";s:17:"breadcrumb-normal";s:15:"breadcrumb_home";s:1:"1";s:18:"category_row_count";s:1:"3";s:25:"category_row_count_mobile";s:1:"2";s:16:"products_pr_page";s:2:"12";s:13:"search_result";s:1:"0";s:13:"product_hover";s:12:"fade_in_back";s:12:"bubble_style";s:6:"style1";s:22:"sale_bubble_percentage";s:1:"0";s:18:"disable_quick_view";s:1:"0";s:13:"wishlist_icon";s:5:"heart";s:15:"coupon_checkout";s:1:"0";s:17:"continue_shopping";s:1:"0";s:16:"html_cart_footer";s:0:"";s:21:"html_checkout_sidebar";s:0:"";s:14:"html_thank_you";s:0:"";s:12:"catalog_mode";s:1:"0";s:19:"catalog_mode_prices";s:1:"0";s:19:"catalog_mode_header";s:0:"";s:20:"catalog_mode_product";s:0:"";s:21:"catalog_mode_lightbox";s:0:"";s:19:"facebook_login_text";s:0:"";s:23:"facebook_login_checkout";s:1:"1";s:18:"custom_share_icons";s:0:"";s:18:"nav_menu_locations";a:3:{s:7:"primary";i:11;s:14:"primary_mobile";i:11;s:6:"footer";i:12;}s:18:"custom_css_post_id";i:-1;}
""" % (str1, str2)
cur.execute("INSERT INTO `wp_options`(`option_id`, `option_name`, `option_value`, `autoload`) VALUES ('450', 'theme_mods_flatsome', '%s', 'yes')" % s)
db.commit()
I also tried wrapping the string in ( ) instead of triple quotes, but nothing changed.
There is no error message and the query seems okay, but nothing changes in the table.
I made a new file to try loading the string from a file:
with open('config') as f:
    query = f.read()
cur.execute("UPDATE wp_options SET option_value = '%s' WHERE option_id = 450" % query)
The problem was with encoding. It cost me several hours of headache, but I'm back to share the resolution:
str = unicode(str, errors='ignore')
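For reference, a parameterized version of the same update (a sketch reusing the cur, db and wp_options names from above) lets the driver do the escaping and encoding, which avoids most of this trouble in the first place:
# Sketch: bind the value instead of splicing it into the SQL string.
with open('config') as f:
    query = f.read()

cur.execute(
    "UPDATE wp_options SET option_value = %s WHERE option_id = 450",
    (query,))   # no quotes around %s when it is a bound parameter
db.commit()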
I have a JSON object in Python. I am using the Python DB-API and SimpleJson. I am trying to insert the JSON into a MySQL table.
At the moment I am getting errors, and I believe it is due to the single quotes in the JSON objects.
How can I insert my JSON object into MySQL using Python?
Here is the error message I get:
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7ff68f91d7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the
manual that corresponds to your MySQL server version for
the right syntax to use near ''favorited': '0',
'in_reply_to_user_id': '52063869', 'contributors':
'NULL', 'tr' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-py2.5.egg/
twitstream/twitasync.py|found_terminator|55] [twitter.py|callback|26]
[build/bdist.linux-x86_64/egg/MySQLdb/cursors.py|execute|166]
[build/bdist.linux-x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Another error for reference
error: uncaptured python exception, closing channel
<twitstream.twitasync.TwitterStreamPOST connected at
0x7feb9d52b7e8> (<class '_mysql_exceptions.ProgrammingError'>:
(1064, "You have an error in your SQL syntax; check the manual
that corresponds to your MySQL server version for the right
syntax to use near 'RT #tweetmeme The Best BlackBerry Pearl
Cell Phone Covers http://bit.ly/9WtwUO''' at line 1")
[/usr/lib/python2.5/asyncore.py|read|68]
[/usr/lib/python2.5/asyncore.py|handle_read_event|390]
[/usr/lib/python2.5/asynchat.py|handle_read|137]
[/usr/lib/python2.5/site-packages/twitstream-0.1-
py2.5.egg/twitstream/twitasync.py|found_terminator|55]
[twitter.py|callback|28] [build/bdist.linux-
x86_64/egg/MySQLdb/cursors.py|execute|166] [build/bdist.linux-
x86_64/egg/MySQLdb/connections.py|defaulterrorhandler|35])
Here is a link to the code that I am using http://pastebin.com/q5QSfYLa
#!/usr/bin/env python
try:
    import json as simplejson
except ImportError:
    import simplejson
import twitstream
import MySQLdb

USER = ''
PASS = ''
USAGE = """%prog"""

conn = MySQLdb.connect(host = "",
                       user = "",
                       passwd = "",
                       db = "")

# Define a function/callable to be called on every status:
def callback(status):
    twitdb = conn.cursor()
    twitdb.execute("INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)", (status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status))
    # print status
    # print "%s:\t%s\n" % (status.get('user', {}).get('screen_name'), status.get('text'))

if __name__ == '__main__':
    # Call a specific API method from the twitstream module:
    # stream = twitstream.spritzer(USER, PASS, callback)
    twitstream.parser.usage = USAGE
    (options, args) = twitstream.parser.parse_args()
    if len(args) < 1:
        args = ['Blackberry']
    stream = twitstream.track(USER, PASS, callback, args, options.debug, engine=options.engine)
    # Loop forever on the streaming call:
    stream.run()
Use json.dumps(json_value) to convert your JSON object (Python object) into a JSON string that you can insert into a text field in MySQL.
http://docs.python.org/library/json.html
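A minimal sketch of that idea (the table and column names here are made up purely for illustration):
import json
import MySQLdb

# Hypothetical table `tweets` with a TEXT column `payload`.
status = {"text": "hello", "user": {"screen_name": "someone"}}

conn = MySQLdb.connect(host="localhost", user="u", passwd="p", db="d")
cur = conn.cursor()

# Serialize the Python object to a JSON string and bind it as a parameter.
cur.execute("INSERT INTO tweets (payload) VALUES (%s)", (json.dumps(status),))
conn.commit()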
To expand on the other answers:
Basically you need make sure of two things:
That the field you are inserting into has room for the full amount of data. Different database field types can hold different amounts of data.
See: MySQL String Datatypes. You probably want the "TEXT" or "BLOB" types.
That you are safely passing the data to the database. Some ways of passing data cause the database to "look" at the data, and it will get confused if the data looks like SQL. It's also a security risk. See: SQL injection
The solution for #1 is to check that the database is designed with correct field type.
The solution for #2 is to use parameterized (bound) queries. For instance, instead of:
# Simple, but naive, method.
# Notice that you are passing in 1 large argument to db.execute()
db.execute("INSERT INTO json_col VALUES (" + json_value + ")")
Better, use:
# Correct method. Uses parameter/bind variables.
# Notice that you are passing in 2 arguments to db.execute()
db.execute("INSERT INTO json_col VALUES %s", json_value)
Hope this helps. If so, let me know. :-)
If you are still having a problem, then we will need to examine your syntax more closely.
The most straightforward way to insert a Python map into a MySQL JSON field...
import json

python_map = { "foo": "bar", "baz": [ "biz" ] }
sql = "INSERT INTO your_table (json_column_name) VALUES (%s)"
cursor.execute( sql, (json.dumps(python_map),) )
You should be able to insert into a text or blob column easily:
db.execute("INSERT INTO json_col VALUES (%s)", (json_value,))
You need to get a look at the actual SQL string and its arguments; try something like this:
sql = "INSERT INTO tweets_unprocessed (text, created_at, twitter_id, user_id, user_screen_name, json) VALUES (%s,%s,%s,%s,%s,%s)"
args = (status.get('text'), status.get('created_at'), status.get('id'), status.get('user', {}).get('id'), status.get('user', {}).get('screen_name'), status)
print "about to execute %s with %s" % (sql, args)
twitdb.execute(sql, args)
I imagine you are going to find some stray quotes, brackets or parenthesis in there.
@route('/shoes', method='POST')
def createorder():
    cursor = db.cursor()
    data = request.json
    p_id = request.json['product_id']
    p_desc = request.json['product_desc']
    color = request.json['color']
    price = request.json['price']
    p_name = request.json['product_name']
    q = request.json['quantity']
    createDate = datetime.now().isoformat()
    print(createDate)
    response.content_type = 'application/json'
    print(data)
    if not data:
        abort(400, 'No data received')
    sql = "insert into productshoes (product_id, product_desc, color, price, product_name, quantity, createDate) values ('%s', '%s','%s','%d','%s','%d', '%s')" % (p_id, p_desc, color, price, p_name, q, createDate)
    print(sql)
    try:
        # Execute the DML and commit changes
        cursor.execute(sql)
        db.commit()
        cursor.close()
    except:
        # Rollback changes
        db.rollback()
    return dumps(("OK"), default=json_util.default)
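A parameterized variant of that insert (a sketch, keeping the table and column names from the snippet above) avoids both the quoting and the '%d' formatting issues:
# Sketch: bind the request values instead of formatting them into the SQL.
sql = ("INSERT INTO productshoes "
       "(product_id, product_desc, color, price, product_name, quantity, createDate) "
       "VALUES (%s, %s, %s, %s, %s, %s, %s)")
cursor.execute(sql, (p_id, p_desc, color, price, p_name, q, createDate))
db.commit()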
Here is one example of how to add a JSON file into MySQL using Python. The JSON file needs to be converted into an SQL INSERT; if there are several JSON objects, it is better to issue a single INSERT covering all of them than to call INSERT INTO once per object.
# import Python's JSON lib
import json

# use JSON loads to create a list of records
test_json = json.loads('''
[
  {
    "COL_ID": "id1",
    "COL_INT_VAULE": 7,
    "COL_BOOL_VALUE": true,
    "COL_FLOAT_VALUE": 3.14159,
    "COL_STRING_VAULE": "stackoverflow answer"
  },
  {
    "COL_ID": "id2",
    "COL_INT_VAULE": 10,
    "COL_BOOL_VALUE": false,
    "COL_FLOAT_VALUE": 2.71828,
    "COL_STRING_VAULE": "http://stackoverflow.com/"
  },
  {
    "COL_ID": "id3",
    "COL_INT_VAULE": 2020,
    "COL_BOOL_VALUE": true,
    "COL_FLOAT_VALUE": 1.41421,
    "COL_STRING_VAULE": "GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer"
  }
]
''')

# create a nested list of the records' values
values = [list(x.values()) for x in test_json]
# print(values)

# get the column names
columns = [list(x.keys()) for x in test_json][0]

# value string for the SQL string
values_str = ""

# enumerate over the records' values
for i, record in enumerate(values):
    # declare empty list for values
    val_list = []
    # append each value to a new list of values
    for v, val in enumerate(record):
        if type(val) == str:
            val = "'{}'".format(val.replace("'", "''"))
        val_list += [str(val)]
    # put parenthesis around each record string
    values_str += "(" + ', '.join(val_list) + "),\n"

# remove the last comma and end SQL with a semicolon
values_str = values_str[:-2] + ";"

# concatenate the SQL string
table_name = "json_data"
sql_string = "INSERT INTO %s (%s)\nVALUES\n%s" % (
    table_name,
    ', '.join(columns),
    values_str
)

print("\nSQL string:\n\n")
print(sql_string)
output:
SQL string:
INSERT INTO json_data (COL_ID, COL_INT_VAULE, COL_BOOL_VALUE, COL_FLOAT_VALUE, COL_STRING_VAULE)
VALUES
('id1', 7, True, 3.14159, 'stackoverflow answer'),
('id2', 10, False, 2.71828, 'http://stackoverflow.com/'),
('id3', 2020, True, 1.41421, 'GIRL: Do you drink? PROGRAMMER: No. GIRL: Have Girlfriend? PROGRAMMER: No. GIRL: Then how do you enjoy life? PROGRAMMER: I am Programmer');
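To actually run the generated statement, you would hand it to your driver; a minimal sketch, assuming an open MySQLdb connection named db and an existing json_data table:
# Sketch: execute the generated multi-row INSERT in a single call.
cur = db.cursor()
cur.execute(sql_string)
db.commit()
cur.close()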
The error may be due to overflowing the size of the field into which you are trying to insert your JSON. Without any code, it is hard to help you.
Have you considered a NoSQL database system such as CouchDB, which is a document-oriented database relying on the JSON format?
Here's a quick tip if you want to write some inline code, say for a small JSON value, without importing json.
You can escape quotes in SQL by doubling them, i.e. use '' or "" to enter ' or ".
Sample Python code (not tested):
q = 'INSERT INTO `table`(`db_col`) VALUES ("{k:""some data"";}")'
db_connector.execute(q)