I need to insert a list along with some other values into a table.
I have tried executemany but it did not work.
list1=['a','b','c','d','e',['f','g','h','i']]
query="insert into "+metadata_table_name+"(created_by,Created_on,File_Path,Category,File_name,Fields) values(%s,%s,%s,%s,%s,%s)" # inserting the new record
cursor.executemany(query,list1)
The list should go into the last (Fields) column.
Please help me.
Thanks in Advance.
You have to think about data types. Does MySQL have a suitable data type for Python's nested lists? I don't know of one.
A possible solution is to use JSON encoding and store the nested list as a string in the MySQL table. Encode the last element of your list as a JSON string:
import json
list1=['a','b','c','d','e',['f','g','h','i']]
query_params = list1[0:-1] + [json.dumps(list1[-1])]  # JSON-encode only the last (nested) element
query = "insert into "+metadata_table_name+"(created_by,Created_on,File_Path,Category,File_name,Fields) values(%s,%s,%s,%s,%s,%s)" # inserting the new record
cursor.execute(query, query_params)  # execute, not executemany: this is a single row
To use the stored data later, you have to convert the JSON string back into a list:
fields_list = json.loads(fields_str)
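For example, a minimal round trip could look like this (just a sketch, reusing the cursor and metadata_table_name from above):
import json
cursor.execute("select created_by,Created_on,File_Path,Category,File_name,Fields from " + metadata_table_name)
row = cursor.fetchone()
fields_list = json.loads(row[5])  # ['f', 'g', 'h', 'i'] again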
I want to insert and read CLOB data through Python.
I inserted the data with the code below.
cursor.execute("""insert into test values (:data)""", data=data.encode())
The data is like this.
b'{"metadata": {"dataVersion": "2", ...
The insert operation works fine.
However, when the data is read back with a SELECT statement, completely different data comes out.
cursor.execute("select columns from test")
a = cursor.fetchall()
a = a[1][0]
The value of a is:
'7B226D65746164617461223A207B226461746156657273696F6E223A202232222C20226D617463684964223A20224B525F35353934313639363034222C20227061727469636970616E7473223A205B2278544258423041787838786E425A41525A43704344554A734A6C674A4F6954636D4E5644704544356C71705058505A7A6D32695358674751452D6C314C516E356D5466384C51333636396F686377222C2022656634464E5971484A416C586C72346749445A61617950446B775A696A43544B59757964704642576B4F4B49724A3264472D737837637A524C73325274394A4A4C4D684C30506555793663726851222C20226B7A4B5A2D6134724A647952635949326A765A6E423854527367342D526B422D624838637373666C72754C474C3058614B337343356F6A68476252496D4C4D436D327063566E6170662D58326C51222C20225A4B50654A4E4F656F4766426162533464564177556A566E676133584F36483873622D46756273366F5832356C4B324B6579354C385232724F345F6E667672694D65685A6C7051514F74505A3977222C20226B3839526C4E5271596A677A34486158512D67695972376643574D2D4F415A785075476371557A58626B624...
Do I need to decode the data?
Or do I have to insert it in some other way?
Option #1:
use the read() function to convert the CLOB to a string:
a = a[1][0].read()
Option #2:
Convert cx_Oracle.LOB data to string in python
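If you do not want to call read() on every fetched row, cx_Oracle also supports an output type handler that fetches CLOB columns directly as strings. A rough sketch following the pattern from the cx_Oracle documentation; connection is assumed to be your existing connection object:
import cx_Oracle
def output_type_handler(cursor, name, default_type, size, precision, scale):
    # fetch CLOB columns as long strings instead of LOB objects
    if default_type == cx_Oracle.CLOB:
        return cursor.var(cx_Oracle.LONG_STRING, arraysize=cursor.arraysize)
connection.outputtypehandler = output_type_handler
cursor = connection.cursor()
cursor.execute("select columns from test")
rows = cursor.fetchall()  # every value is already a str, no .read() needed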
I am trying to insert an integer into a SQL table where the column's datatype is INTEGER.
df_asgn['Attempts'] = df_asgn['Attempts'].apply(lambda x: randint(0,6)).astype('int32')
But in the database it is still getting stored as binary: snip of SQL table
And on retrieving the value from the table using fetchone(), the output I'm getting is:
b'\x06\x00\x00\x00' instead of just 6
I want the data to be stored as an integer. Please help me out.
Thanks in Advance.
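One thing worth checking (only a guess, since the table schema and driver are not shown): after the apply/astype step the column holds numpy.int32 values, and some drivers serialize numpy scalars as raw bytes instead of SQL integers. Converting to a plain Python int right before the insert usually avoids that; a sketch with a placeholder table name and ? paramstyle:
attempts = int(df_asgn.loc[0, 'Attempts'])  # numpy.int32 -> plain int
cursor.execute("insert into my_table (Attempts) values (?)", (attempts,))
# and to interpret a value that was already stored as 4 little-endian bytes:
value = int.from_bytes(b'\x06\x00\x00\x00', byteorder='little')  # 6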
I am using Python to fetch data from an Oracle DB. All the rows have a column containing XML data. When I print the data fetched from the Oracle DB using Python, the column with the XML data is printed as cx_Oracle.OBJECT object at 0x7fffe373b960, etc. I even converted the data to a pandas DataFrame, and the data for this column is still printed as cx_Oracle.OBJECT object at 0x7fffe373b960. I want to access the key/value data stored in this column (XML files).
Please read inline comments.
cursor = connection.cursor()  # you know what this is for
# getClobVal() returns the whole XML as a CLOB; it does not work without the table alias (I don't know why)
query = """select a.columnName.getClobVal() from tablename a"""
cursor.execute(query)  # you know what this is for
result = cursor.fetchone()[0].read()  # for a single record
result = cursor.fetchall()  # for all records
for res in result:
    print(res[0].read())
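Once getClobVal() gives you the XML as a plain string, the key/value data can be pulled out with the standard library. A minimal sketch; the element names are placeholders, since the actual XML structure is not shown:
import xml.etree.ElementTree as ET
xml_str = cursor.fetchone()[0].read()  # single record, as above
root = ET.fromstring(xml_str)
values = {child.tag: child.text for child in root}  # top-level tag -> text; adapt to your schema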
I'm trying to store a compressed dictionary in my SQLite database. First, I convert the dict to a string using json.dumps, which seems to work fine. Storing this string in the DB also works.
In the next step, I compress the string using encode("zlib"). But storing the resulting string in my DB throws an error.
mydict = {"house":"Haus","cat":"Katze","red":u'W\xe4yn',"dict":{"1":"asdfhgjl ahsugoh ","2":"s dhgsuoadhu gohsuohgsduohg"}}
dbCommand("create table testTable (ch1 varchar);")
# convert dictionary to string
jch1 = json.dumps(mydict,ensure_ascii=True)
print(jch1)
# store uncompressed values
dbCommand("insert into testTable (ch1) values ('%s');"%(jch1))
# compress json strings
cjch1 = jch1.encode("zlib")
print(cjch1)
# store compressed values
dbCommand("insert into testTable (ch1) values ('%s');"%(cjch1))
The first print outputs:
{"house": "Haus", "dict": {"1": "asdfhgjl ahsugoh ", "2": "s dhgsuoadhu gohsuohgsduohg"}, "red": "W\u00e4yn", "cat": "Katze"}
The second print is not readable of course:
xワフ1テPCᆵyfᅠネノ õ
Do I need to do any additional conversion beforehand?
Looking forward to any helping hint!
Let's approach this from behind: why are you using gzip encoding in the first place? Do you think you need to save space in your database? Have you checked how long the dictionary strings will be in production? These strings will need to have a minimal length before compression will actually save storage space (for small input strings the output might even be larger than the input!). If that actually saves some disk space: did you think through whether the additional CPU load and processing time due to gzip encoding and decoding are worth the saved space?
Other than that: the result of gzip/zlib compression is a binary blob. In Python 2, this should be of type str. In Python 3, this should be type bytes. In any case, the database needs to know that whatever you are storing there is binary data! VARCHAR is not the right data type for this endeavor. What follows is a quote from MySQL docs:
Also, if you want to store binary values such as results from an encryption or compression function that might contain arbitrary byte values, use a BLOB column rather than a CHAR or VARCHAR column, to avoid potential problems with trailing space removal that would change data values.
The same consideration holds true for other databases. Also, in the case of SQLite, you must use the BLOB data type (see the docs) for storing binary data if you want to be sure to get back exactly the same data you put in :-).
Thanks a lot, Jan-Philip,
you pointed me to the right solution. My table needs a BLOB column to store the data. Here is the working code:
mydict = {"house":"Haus","cat":"Katze","red":u'W\xe4yn',"dict":{"1":"asdfhgjl ahsugoh ","2":"s dhgsuoadhu gohsuohgsduohg"}}
curs.execute("create table testTable (ch1 BLOB);")
# convert dictionary to string
jch1 = json.dumps(mydict,ensure_ascii=True)
cjch1 = jch1.encode("zlib")
# store compressed values
curs.execute('insert into testTable values (?);', [buffer(cjch1)])
db.commit()
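Reading the value back is just the reverse: fetch the BLOB, decompress it, then json.loads it. Roughly, in the same Python 2 style as the code above:
curs.execute('select ch1 from testTable;')
blob = curs.fetchone()[0]  # sqlite3 returns a buffer object for BLOBs
restored = json.loads(str(blob).decode("zlib"))  # back to the original dict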
I have a table in PostgreSQL with a column of type JSON. I'm trying to append data to the table.
cursor.execute("""INSERT INTO my_table VALUES (%s);""", (json.dumps(myobject),))
This has been working like a charm, but now I need to really increase the throughput.
Here is the code which doesn't work:
import StringIO,psycopg2,json
buffer = StringIO.StringIO(json.dumps(myobject))
cursor.copy_from(buffer,'my_table')
connection.commit()
The JSON written to the buffer is not compatible with copy_from. For example, backslash characters need to be escaped, so '\n' needs to become '\\n'.
How can I write a string to the buffer so that copy_from will put the correct json into my table?
Thanks
I found one solution which seems to work for now:
import StringIO,psycopg2,json
json_to_write = json.dumps(myobject).replace('\\','\\\\')
buffer = StringIO.StringIO(json_to_write)
cursor.copy_from(buffer,'my_table')
connection.commit()
I don't love this, because how do I know there aren't other issues?
Maybe I should file a feature request with the psycopg2 developers?
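An alternative that avoids hand-escaping the COPY text format altogether is psycopg2's execute_values helper (psycopg2 >= 2.7), which batches many parameterized inserts into one statement and is usually much faster than row-by-row execute. A sketch, where my_objects is a placeholder for your list of dicts:
import json
from psycopg2.extras import execute_values
rows = [(json.dumps(obj),) for obj in my_objects]
execute_values(cursor, "INSERT INTO my_table VALUES %s", rows)
connection.commit()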