Serialize UUID objects in flask-restful - python

I have a flask-restful project that interfaces with some custom classes containing uuid.UUID values used as ids. A couple of API endpoints return the object associated with a given id, which flask parses as a UUID. The issue is that when I return them in a json payload, I get the following exception:
UUID('…') is not JSON serializable
I want to have those uuids represented as strings to the final user, making the process seamless (the user can take the returned uuid and use it in their next api request).
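The failure is easy to reproduce with the standard library alone; this is a minimal sketch of what happens when json.dumps meets a UUID:

```python
import json
import uuid

# json.dumps has no built-in handling for uuid.UUID, so it raises TypeError
try:
    json.dumps({"id": uuid.uuid4()})
except TypeError as exc:
    print(exc)  # e.g. "Object of type UUID is not JSON serializable"

# converting to str first works fine
payload = json.dumps({"id": str(uuid.uuid4())})
```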

In order to fix this problem, I had to put together suggestions from two different places:
First, I need to create a custom json encoder which, when dealing with uuids, returns their string representation (StackOverflow answer here):
import json
from uuid import UUID

class UUIDEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, UUID):
            # if the obj is a uuid, simply return its string value
            return str(obj)  # <- notice I'm not returning obj.hex as the original answer does
        return json.JSONEncoder.default(self, obj)
Second, I need to take this new encoder and set it as the encoder flask-restful uses for responses (GitHub answer here):
class MyConfig(object):
    RESTFUL_JSON = {'cls': MyCustomEncoder}

app = Flask(__name__)
app.config.from_object(MyConfig)
api = Api(app)
Putting it together:
# NOTE: custom json encoder to fix the "UUID('…') is not JSON serializable" error
class UUIDEncoder(json.JSONEncoder):
    def default(self, obj: Any) -> Any:  # pylint:disable=arguments-differ
        if isinstance(obj, UUID):
            return str(obj)  # <- notice I'm not returning obj.hex as the original answer does
        return json.JSONEncoder.default(self, obj)

# NOTE: api configuration to switch the json encoder
class MyConfig(object):
    RESTFUL_JSON = {"cls": UUIDEncoder}

app = Flask(__name__)
app.config.from_object(MyConfig)
api = Api(app)
On a side note, if you are using vanilla flask, the process is simpler: just set your app's json encoder directly (app.json_encoder = UUIDEncoder).
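As a quick sanity check, the encoder can be exercised directly with json.dumps, independently of any Flask wiring (a fixed example uuid is used here for a predictable result):

```python
import json
from uuid import UUID

class UUIDEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, UUID):
            return str(obj)
        return json.JSONEncoder.default(self, obj)

uid = UUID('12345678-1234-5678-1234-567812345678')
print(json.dumps({'id': uid}, cls=UUIDEncoder))
# → {"id": "12345678-1234-5678-1234-567812345678"}
```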
I hope it's useful to someone!

Related

How to send the cursor from a mongo db find query as a result in flask

I want to be able to query my mongo db and get the results for all entries in my stocksCollection collection. I am using allStocks = list(stocksCollection.find({})) which gives me a list of all the entries in that collection. However, when I try to return this as the response to a get request, I get the error:
TypeError: The view function did not return a valid response.
The return type must be a string, dict, tuple, Response instance, or WSGI callable, but it was a list.
Here is the simple code I used to get this:
@app.route("/stocks", methods=['GET'])
def getAllStocks():
    return list(stocksCollection.find({}))
I have also tried return stocksCollection.find({}) (which errors because it is a Cursor), and:
allStocks = list(stocksCollection.find({}))
return {'data': allStocks}
But that just gives me the error TypeError: Object of type ObjectId is not JSON serializable. Any ideas on what format I can change this cursor to so that I am able to return it from my api call? (This will not be served directly to a webpage, just called by the frontend to do some calculations.)
Create a JSONEncoder class to make ObjectId JSON serializable
import json
from bson import ObjectId

class JSONEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, ObjectId):
            return str(o)
        return json.JSONEncoder.default(self, o)
And then use the JSONEncoder class as below (in json.dumps) to make the _id value JSON serializable, append each document to a new list, and return it:
@app.route("/stocks", methods=['GET'])
def getAllStocks():
    output = []
    all_data = stocksCollection.find({})
    for data in all_data:
        data['_id'] = json.dumps(data['_id'], cls=JSONEncoder)
        output.append(data)
    return {"data": output}
I think this will solve your problem.
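One caveat with the loop above: calling json.dumps on each _id wraps the id in an extra pair of literal quotes. An alternative is to encode the whole payload once with cls=JSONEncoder. A minimal sketch (using a stand-in class, since bson may not be installed here):

```python
import json

class ObjectIdStandIn:
    """Stand-in for bson.ObjectId, used only so this sketch runs without pymongo."""
    def __init__(self, value):
        self.value = value
    def __str__(self):
        return self.value

class JSONEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, ObjectIdStandIn):  # would be bson.ObjectId in real code
            return str(o)
        return json.JSONEncoder.default(self, o)

docs = [{'_id': ObjectIdStandIn('5f1e...a1'), 'ticker': 'AAPL'}]  # hypothetical values
# encode the whole payload once instead of dumping each _id separately
body = json.dumps({'data': docs}, cls=JSONEncoder)
```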

Pyramid custom JSON renderer, translate TranslationString

I have a problem with returning a JSON response with translated strings in my application's endpoints.
The default pyramid renderer is my custom JSON renderer. Some of the objects in response are TranslationStrings. I would like to have them automatically translated.
Now I am using this: Pyramid TranslationString not working on json renderer, but it's not an ideal solution for me. I do not want to translate all of the responses manually.
For translations I am using the TranslationStringFactory:
_ = i18n.TranslationStringFactory('coma')
I already have some renderer adapters, so I added a new one for the TranslationString class:
def includeme(config):
    json_renderer = JSON()

    def date_adapter(obj, request):
        return obj.isoformat()

    def set_adapter(obj, request):
        return list(obj)

    def uuid_adapter(obj, request):
        return str(obj)

    def enum_adapter(obj, request):
        return obj.value

    def trans_string_adapter(obj, request):
        return request.localizer.translate(obj)

    json_renderer.add_adapter(TranslationString, trans_string_adapter)
    json_renderer.add_adapter(datetime.date, date_adapter)
    json_renderer.add_adapter(set, set_adapter)
    json_renderer.add_adapter(uuid.UUID, uuid_adapter)
    json_renderer.add_adapter(enum.Enum, enum_adapter)
    config.add_renderer('json', json_renderer)
Here is an example of the JSON object I want to return:
return {
    'label': _('Estimated net income'),
    'value': round(income_net, self.decimal_places),
    ...
}
Why can't my custom JSON renderer call the adapter for the TranslationString object?
The reason it's not being called is that json.dumps only invokes the default adapters when a type is not JSON-serializable. A TranslationString subclasses str, so it is already JSON-serializable and your adapter is never used.
I think, in general, this is an issue with TranslationString and how it works. It expects you to always pass the string through the localizer, and so you should do that as soon as possible instead of waiting for egress. Unfortunately that basically means passing the localizer all over the place, or making it available as a threadlocal.
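This behavior is easy to verify with plain json: default is only consulted for types the encoder cannot already handle, and a str subclass never reaches it. A minimal sketch with a stand-in class:

```python
import json

class TranslationLike(str):
    """Stand-in for pyramid.i18n.TranslationString, which subclasses str."""

calls = []

class Probe(json.JSONEncoder):
    def default(self, o):
        calls.append(o)
        return str(o)

out = json.dumps(TranslationLike("Estimated net income"), cls=Probe)
# the string is serialized as-is; Probe.default is never invoked
print(out, calls)  # → "Estimated net income" []
```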

JSON encoding NDB entities with keys

I'm trying to create a CRUD application with Google Cloud ndb and REST architecture.
Therefore I have different API calls for creating and retrieving.
Now to update the entities I need to display them in the front end but also give an identifier, so ndb knows which entity to update later.
I try to get the entities with model.query and then encode them to JSON with an extended encoder to serialize datetime and ndb.Key:
# JSONEncoder extension to handle datetime & ndb.Key
class CustomJsonEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return obj.strftime('%Y-%m-%d-%H-%M-%S')
        if isinstance(obj, ndb.Key):
            return obj.urlsafe()
        return json.JSONEncoder.default(self, obj)

# Get JSON data from ndb
query = Card.query(Card.category==category, ancestor=ancestor_key()).fetch()
cards = json.dumps([c.to_dict() for c in query], cls=CustomJsonEncoder)
self.response.headers['Content-Type'] = 'application/json'
self.response.out.write(cards)
My problem now is that the entity key just disappears: it isn't included in the json.dumps output. I don't get any errors; it's just not encoded and passed along.
The datetime object is shown correctly when printed. Any ideas on how I can send the URL-safe ndb key along with the json.dumps output?
You're dumping the Card entities' to_dict() representations, which do not include the respective object keys. To confirm this, just temporarily add a logging statement like this to your CustomJsonEncoder:
if isinstance(obj, ndb.Key):
    logging.error('key found: %s' % obj.urlsafe())
    return obj.urlsafe()
You need to explicitly add the entity keys to the data to be json-encoded if you want them.
Or you could use a ComputedProperty in that entity's model (but then you'd be wasting some storage):
key_str = ndb.ComputedProperty(lambda self: self.key.urlsafe())
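The "explicitly add the key" option boils down to merging each entity's url-safe key into its to_dict() output before dumping. Sketched here with stand-in objects (ndb is not importable outside App Engine; names and values are hypothetical):

```python
import json

class FakeKey:
    """Stand-in for ndb.Key."""
    def __init__(self, s):
        self._s = s
    def urlsafe(self):
        return self._s

class FakeCard:
    """Stand-in for an ndb.Model entity; to_dict() does not include the key."""
    def __init__(self, key, **props):
        self.key = key
        self._props = props
    def to_dict(self):
        return dict(self._props)

query = [FakeCard(FakeKey('agxz...'), category='math', text='2+2')]
# merge the url-safe key into each entity dict before encoding
cards = json.dumps([dict(c.to_dict(), key=c.key.urlsafe()) for c in query])
```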

Using a custom JSON encoder for SQLAlchemy's PostgreSQL JSONB implementation

I am using SQLAlchemy's core library to access a PostgreSQL database. Consider I have the following table:
create table foo (j jsonb);
And the following python code:
from decimal import *
from sqlalchemy import Table, Column, Integer, String, MetaData, ForeignKey, DateTime
from sqlalchemy.dialects.postgresql import JSONB

metadata = MetaData(schema="public")
foo = Table('foo', metadata, Column('j', JSONB))
d = Decimal(2)
ins = foo.insert().values(j={'d': d})
# assuming engine is a valid sqlalchemy connection
engine.execute(ins)
This last statement fails with the following error:
StatementError("(builtins.TypeError) Decimal('2') is not JSON serializable",)
Which is why I am asking this question: Is there a way to specify a custom encoder for SQLAchemy to use when encoding json data into PostgreSQL dialect?
This is supported via the json_serializer keyword argument to create_engine, as documented under sqlalchemy.dialects.postgresql.JSON:
def _default(val):
    if isinstance(val, Decimal):
        return str(val)
    raise TypeError()

def dumps(d):
    return json.dumps(d, default=_default)

engine = create_engine(..., json_serializer=dumps)
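The serializer pair above can be checked in isolation before wiring it into create_engine; the default hook turns any Decimal it meets into its string form:

```python
import json
from decimal import Decimal

def _default(val):
    if isinstance(val, Decimal):
        return str(val)
    raise TypeError()

def dumps(d):
    return json.dumps(d, default=_default)

print(dumps({'d': Decimal(2)}))  # → {"d": "2"}
```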
If you, like me, are looking for a nice way to get this running with Flask-SQLAlchemy, this is what I did. If you import and pass flask.json instead of the standard library json module, you'll get automatic serialization of dates, datetimes and uuid.UUID instances.
class HackSQLAlchemy(SQLAlchemy):
    """Ugly way to get the SQLAlchemy engine to pass the Flask JSON serializer
    to `create_engine`.

    See https://github.com/mitsuhiko/flask-sqlalchemy/pull/67/files
    """

    def apply_driver_hacks(self, app, info, options):
        options.update(json_serializer=json.dumps)
        super(HackSQLAlchemy, self).apply_driver_hacks(app, info, options)
If you're using Flask, you already have an extended JSONEncoder defined in flask.json which handles UUID, but not Decimal. It can be wired into the SQLAlchemy engine with the json_serializer param as in @univerio's answer:
from flask import json

engine = create_engine(
    app.config['SQLALCHEMY_DATABASE_URI'],
    convert_unicode=True,
    json_serializer=json.dumps,
)
You can further extend the Flask JSONEncoder to support decimal.Decimal with the following:
import decimal
from flask import json

class CustomJSONEncoder(json.JSONEncoder):
    """
    Override Flask's `JSONEncoder.default`, which is called
    when the encoder doesn't handle a type.
    """
    def default(self, o):
        if isinstance(o, decimal.Decimal):
            return str(o)
        else:
            # raises TypeError: o not JSON serializable
            return json.JSONEncoder.default(self, o)

def init_json(app):
    """
    Use the custom JSON encoder with Flask
    """
    app.json_encoder = CustomJSONEncoder
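Since flask.json.JSONEncoder is a thin subclass of the stdlib encoder, the Decimal handling can be sketched and verified with plain json (the stdlib module stands in for flask.json here):

```python
import decimal
import json  # stand-in for flask.json, which exposes the same JSONEncoder API

class CustomJSONEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, decimal.Decimal):
            return str(o)
        return json.JSONEncoder.default(self, o)

print(json.dumps({'price': decimal.Decimal('9.99')}, cls=CustomJSONEncoder))
# → {"price": "9.99"}
```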
I found the answer here: https://github.com/flask-restful/flask-restful/issues/116#issuecomment-128419699 Summing it up, to run it with Flask-SQLAlchemy:
from flask import Flask, json
from decimal import Decimal

# define encoder
class JSONEncoder(json.JSONEncoder):
    def default(self, value):
        if isinstance(value, Decimal):
            return str(value)
        return json.JSONEncoder.default(self, value)

class Config:
    RESTFUL_JSON = {}

    # make sure RESTful and Flask encoders stay synchronized
    @staticmethod
    def init_app(app):
        app.config['RESTFUL_JSON']['cls'] = app.json_encoder = JSONEncoder

app = Flask(__name__)
app.config.from_object(Config)
Config.init_app(app)

TypeError: Collection(Database(MongoClient("localhost", 27017), u'demo_database'), u'entries') is not JSON serializable

Good afternoon.
I'm trying to combine Python, MongoDB (via pymongo) and Flask to create a client-server application. I want one of the methods to return the entire collection, like here:
@app.route('/entries', methods=['GET'])
def get_entries():
    client = MongoClient(db_host, db_port)
    db_demo = client['demo_database']
    entries = db_demo.entries
    return JSONEncoder().encode(entries)
I also have an Encoder class, as advised here:
class JSONEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, ObjectId):
            return str(o)
        return json.JSONEncoder.default(self, o)
The data collection is very simple: actually only one item with a few fields. What am I doing wrong? Perhaps I should develop a more sophisticated encoder class?
Use bson.json_util.dumps, which already supports all the MongoDB extended JSON types:
>>> from bson.json_util import dumps
>>> c.test.test.insert_many([{} for _ in range(3)])
<pymongo.results.InsertManyResult object at 0x7f6ed3189550>
>>> dumps(c.test.test.find())
'[{"_id": {"$oid": "554faa99fa5bd8782e1698cf"}}, {"_id": {"$oid": "554faa99fa5bd8782e1698d0"}}, {"_id": {"$oid": "554faa99fa5bd8782e1698d1"}}]'
Using a combination of both approaches, I prefer the solution that I provided here:
from flask import Flask
from flask.json import JSONEncoder
from bson import json_util
from . import resources

# define a custom encoder that delegates to the json_util provided by pymongo (or its dependency bson)
class CustomJSONEncoder(JSONEncoder):
    def default(self, obj):
        return json_util.default(obj)

application = Flask(__name__)
application.json_encoder = CustomJSONEncoder

if __name__ == "__main__":
    application.run()