I'm writing an insert function, and I have a nested dictionary that I want to insert into a column in Postgres. Is there a way to insert the whole JSON into the column? Let's say I have to insert the value of the key "val" into a column; how can I achieve that? I'm using the psycopg2 library in my Python code.
"val": {
"name": {
"mike": "2.3",
"roy": "4.2"
}
}
Yes, you can extract nested JSON in Postgres 9.4 and up by casting your string to JSON and using the "get JSON object field by key" operator (->):
YOUR_STRING::JSON -> 'key'
'{"val":1}'::JSON -> 'val'
Applied to an INSERT it looks like this (note that the string literal must be valid JSON, including the outer braces):
INSERT INTO my_table (my_json)
VALUES ('{"val":{"name":{"mike":"2.3"}}}'::JSON->'val');
Depending on your column type, you may choose to cast to JSONB instead of JSON (the cast above only works for TEXT and JSON columns).
INSERT INTO my_table (my_json)
VALUES ('{"val":{"name":{"mike":"2.3"}}}'::JSONB->'val');
See: https://www.postgresql.org/docs/9.5/static/functions-json.html
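If you would rather extract the nested value in Python and let the driver handle the quoting, a minimal psycopg2 sketch could look like this (the connection string, table, and column names are placeholders):
import json
import psycopg2

conn = psycopg2.connect("dbname=mydb user=postgres")  # hypothetical connection
cur = conn.cursor()

data = {"val": {"name": {"mike": "2.3", "roy": "4.2"}}}

# Serialize only the nested value and bind it as a string;
# the ::json cast happens server-side.
cur.execute("INSERT INTO my_table (my_json) VALUES (%s::json)",
            (json.dumps(data["val"]),))
conn.commit()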
I know, thanks to this website, that it is possible to bind a list to a custom query parameter; however, I have only seen it done with an IN clause.
What I am trying to do is pass a list to a MySQL JSON_ARRAY, with no success so far.
My code looks like this:
from sqlalchemy import text
query = text("INSERT INTO models (name, elements) VALUES (:model_name, :elements)")
session.execute(query, {"model_name": "foo", "elements": ["bar", "baz"]})
But it raises OperationalError: (pymysql.err.OperationalError) (1241, 'Operand should contain 1 column(s)'), as I assume each element of the list is treated as an individual column.
I tried to enclose the :elements parameter in a JSON_ARRAY(...) to no avail.
Is there a way to leverage SQLAlchemy's text here, or do I need to go back to building custom strings to pass, with something like
session.execute(query, {"model_name": "foo", "elements": '", "'.join(["bar", "baz"])})
?
I'd serialize your Python list to a JSON-formatted string; then you can bind it as a single scalar value.
import json
session.execute(query, {"model_name": "foo", "elements": json.dumps(["bar", "baz"])})
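On the way back out, the JSON column arrives as a string, so decode it again; a short sketch, assuming the session and models table from the question:
import json
from sqlalchemy import text

row = session.execute(text("SELECT elements FROM models WHERE name = :n"),
                      {"n": "foo"}).fetchone()
elements = json.loads(row[0])  # back to ['bar', 'baz']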
I am attempting to update a JSON column in Postgres (this is a bulk update using execute_values). I receive a JSON object via an API and insert the entire object into one column that is defined as a json column:
CREATE TABLE my_table(
id SERIAL PRIMARY KEY,
event_data json NOT NULL default '{}'::JSON,
createdt timestamp NOT NULL DEFAULT now()
);
My update script looks like this:
UPDATE my_table AS t
SET event_data = e.event_data::json
FROM (VALUES %s) AS e(id, event_data)
WHERE t.id = e.id
RETURNING *
I run json.dumps on all the JSON beforehand:
event_list.append([event['id'], json.dumps(event['data'])])
Once I get the returned rows, I handle the data like this:
return json.loads(json.dumps(update_data, default=date_converter))
This all works properly when doing a straight insert into the json column: I dump the values before the insert, then do the json.dumps/loads on the returned rows, and everything works fine. It is only the update method that misbehaves.
Here is how the data is returned via the API on the update:
[
{
"id": 170152,
"event_data": "{\"commenttxt\": \"Test comment\", \"descrtxt\": \"HELLO WORLD\", \"eventcmpltflg\": false, \"eventcmpltontmflg\": false, \"active\": true}",
"createdt": "2021-03-18T08:34:07Z"
}
]
And this is how I receive it when doing an insert:
[
{
"id": 170152,
"event_data": {
"commenttxt": "Test comment",
"descrtxt": "Test descr",
"eventcmpltflg": false,
"eventcmpltontmflg": false,
"active": true
},
"createdt": "2021-03-18T08:34:07Z"
}
]
If I remove the json.dumps in the event_list.append section I get the error "Can't adapt type of dict".
For some context, I am not replacing individual elements inside the JSON; I am updating the entire column with a new set of JSON. I use a different table to track changes for historical/audit trails of what has changed. I use a json column because different teams use different values as their needs differ, so rather than a table with a million columns to handle every team, JSON seemed the best way to manage it.
I appreciate any help.
OK, so I found the solution. It turns out that because I was just using RETURNING *, Postgres was returning the already-dumped values I was inserting (from the VALUES list) instead of the converted values from the table row itself. I had to modify the SQL accordingly:
UPDATE my_table AS t
SET event_data = e.event_data::json
FROM (VALUES %s) AS e(id, event_data)
WHERE t.id = e.id
RETURNING t.*
So basically, in my RETURNING clause I had to specify which table the data came from, and since I aliased my table as "t", it had to be t.* (or t.column_name if you want specific columns).
I had assumed it would automatically return the data from the table rather than from the pseudo-table created by the FROM clause.
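For reference, here is a sketch of how the whole corrected update could hang together with psycopg2's execute_values (conn, cur, and the events payload from the API are assumed):
import json
from psycopg2.extras import execute_values

event_list = [(event['id'], json.dumps(event['data'])) for event in events]

sql = """
    UPDATE my_table AS t
    SET event_data = e.event_data::json
    FROM (VALUES %s) AS e(id, event_data)
    WHERE t.id = e.id
    RETURNING t.*
"""
# fetch=True makes execute_values collect and return the RETURNING rows
updated_rows = execute_values(cur, sql, event_list, fetch=True)
conn.commit()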
I need to insert a list, along with some other values, into a table.
I have tried executemany but it didn't work.
list1=['a','b','c','d','e',['f','g','h','i']]
query="insert into "+metadata_table_name+"(created_by,Created_on,File_Path,Category,File_name,Fields) values(%s,%s,%s,%s,%s,%s)" # inserting the new record
cursor.executemany(query,list1)
The list should go into the last (Fields) column.
Please help me.
Thanks in advance.
You have to think about data types. Does MySQL have a suitable data type for Python's nested lists? I don't know of one.
A possible solution is to use JSON encoding and store the list as a string in the MySQL table. Encode the last element of your list as a JSON string:
import json

list1 = ['a', 'b', 'c', 'd', 'e', ['f', 'g', 'h', 'i']]
# replace the nested list with its JSON-encoded string form
query_params = list1[0:-1] + [json.dumps(list1[-1])]
query = "insert into " + metadata_table_name + " (created_by,Created_on,File_Path,Category,File_name,Fields) values(%s,%s,%s,%s,%s,%s)"  # inserting the new record
cursor.execute(query, query_params)  # one row, so execute rather than executemany
To use the stored data later, convert the JSON string back to a list:
fields_list = json.loads(fields_str)
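Putting it together, a sketch of the read-back (the literal table name stands in for metadata_table_name and is hypothetical):
import json

cursor.execute("select Fields from my_metadata_table where File_name = %s", ('e',))
(fields_str,) = cursor.fetchone()
fields_list = json.loads(fields_str)  # ['f', 'g', 'h', 'i']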
This is my table schema:
[column]    [type]
tablename   json
word        varchar
text        json
I implemented it using psycopg2 with Python:
cur.execute("INSERT INTO json (word,text) VALUES (%s,%s);",(word,text))
word contains a list, but its elements are strings:
['a','b','c']
text contains a list, but its elements are dicts (JSON):
[{'a':'b'},{'c':'d'}]
When I run the function, I get the error below:
" can't adapt type 'dict' "
The question is: how do I insert JSON into PostgreSQL? As you can see from the type of text, it looks like a dict, but how do I make the text variable JSON? Or am I missing something?
json.dumps() can be used to convert it to a string for the database.
import json
postgres_string = json.dumps(text)
# Save postgres_string into Postgres here
# When you want to retrieve the dictionary, do so as below:
text = json.loads(postgres_string)
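Alternatively, psycopg2 ships a Json adapter in psycopg2.extras that does the serialization for you; a sketch using the values from the question (the list going into the varchar column is still dumped to a plain string):
import json
from psycopg2.extras import Json

word = ['a', 'b', 'c']            # stored as a plain string in the varchar column
text = [{'a': 'b'}, {'c': 'd'}]   # Json() wraps anything json.dumps can encode

cur.execute("INSERT INTO json (word,text) VALUES (%s,%s);",
            (json.dumps(word), Json(text)))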
Well, you use the execute function to run SQL, so just construct the right SQL and it will succeed. You want to insert json-typed data, so use "::" to cast a string into the json type, like below; it works:
postgres=# insert into json_test(word,text) values('abcd_word','[{"a":"b"},{"c":"d"}]'::json);
INSERT 0 1
postgres=#
postgres=# select * from json_test ;
tablename | word | text
-----------+-----------+-----------------------
| abcd_word | [{"a":"b"},{"c":"d"}]
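The same server-side cast works from psycopg2 with a bound parameter, so you don't have to build the SQL string by hand; a sketch against the json_test table above:
import json

cur.execute("insert into json_test(word,text) values(%s, %s::json);",
            ('abcd_word', json.dumps([{'a': 'b'}, {'c': 'd'}])))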
Using Python 3, I want to download API data, which is returned as JSON, and then insert only specific fields (columns? or whatever they're called) into a SQLite database. So here's what I've got, and the issues I have:
Using Python's requests module:
##### import modules
import sqlite3
import requests
import json
headers = {
'Authorization' : 'ujbOsdlknfsodiflksdonosB4aA=',
'Accept' : 'application/json'
}
r = requests.get(
'https://api.lendingclub.com/api/investor/v1/accounts/94837758/detailednotes',
headers=headers
)
Okay, the first issue is how to get the requested JSON data into something (a dictionary?) that Python can use. Is that...
jason.loads(r.text)
Then I create the table into which I want to insert the specific data:
curs.execute('''CREATE TABLE data(
loanId INTEGER NOT NULL,
noteAmount REAL NOT NULL
)''')
No problem there...but now, even though the JSON data looks something like this (although there are hundreds of records)...
{
"myNotes": [
{
"loanId":11111,
"noteId":22222,
"orderId":33333,
"purpose":"Debt consolidation",
"canBeTraded":true,
"creditTrend":"DOWN",
"loanAmount":10800,
"noteAmount":25,
"paymentsReceived":5.88,
"accruedInterest":12.1,
"principalPending":20.94,
},
{
"loanId":11111,
"noteId":22222,
"orderId":33333,
"purpose":"Credit card refinancing",
"canBeTraded":true,
"creditTrend":"UP",
"loanAmount":3000,
"noteAmount":25,
"paymentsReceived":7.65,
"accruedInterest":11.92,
"principalPending":19.76,
}]
}
I only want to insert 2 data points into the sqlite database, the "loanId" and the "noteAmount". I believe inserting the data into the database will look something like this (but I know this is incorrect):
curs.execute('INSERT INTO data (loanId, noteAmount) VALUES (?,?)', (loanID, noteAmount))
But I am now at a total loss as to how to do that. So I guess I have 2 main issues: getting the downloaded data into something Python can use to insert specific data into the database, and then how exactly to insert that data into the database from the object holding the downloaded data. I'm guessing looping is part of the answer... but on what? Thanks in advance!
As the documentation says:
The sqlite3 module supports two kinds of placeholders: question marks
(qmark style) and named placeholders (named style).
Note that you can even insert all rows at once using executemany.
So in your case:
curs.executemany('INSERT INTO data (loanId, noteAmount) '
'VALUES (:loanId,:noteAmount)', json.loads(...)['myNotes'])
First off, it's js = json.loads(r.text), so you're very close.
Next, if you want to insert just the loanId and noteAmount fields of each record, then you'll need to loop and do something like
for record in js['myNotes']:
    curs.execute('INSERT INTO data (loanId, noteAmount) VALUES (?,?)', (record['loanId'], record['noteAmount']))
If you play with it a bit, you could coerce the JSON into one big INSERT call.
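For example, a sketch of the end-to-end flow (the database filename is made up; headers is the dict from the question):
import json
import sqlite3
import requests

r = requests.get(
    'https://api.lendingclub.com/api/investor/v1/accounts/94837758/detailednotes',
    headers=headers
)
js = json.loads(r.text)

conn = sqlite3.connect('notes.db')
curs = conn.cursor()
curs.execute('CREATE TABLE IF NOT EXISTS data (loanId INTEGER NOT NULL, noteAmount REAL NOT NULL)')
# named placeholders pick out just the two fields from each record;
# extra keys in the dicts are ignored
curs.executemany('INSERT INTO data (loanId, noteAmount) VALUES (:loanId, :noteAmount)',
                 js['myNotes'])
conn.commit()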