Cornice schema validation with colanderalchemy - python

Cornice's documentation mentions how to validate your schema using a subclass of colander's MappingSchema. How should we use a colanderalchemy schema for the same purpose? If we create a schema using colanderalchemy as described in its documentation, the schema object has already instantiated the colander class, and I think this is what causes the error.
To be more precise, here is my sample code:
from sqlalchemy.ext.declarative import declarative_base
from cornice.resource import resource, view
from colanderalchemy import SQLAlchemySchemaNode
from sqlalchemy import (
    Column,
    Integer,
    Unicode,
)
Base = declarative_base()
'''
SQLAlchemy part
'''
class DBTable(Base):
    __tablename__ = 'mytable'
    id = Column(Integer, primary_key=True,
                info={'colanderalchemy': {'exclude': True}})
    name = Column(Unicode(70), nullable=False)
    description = Column(Unicode(256))
'''
ColanderAlchemy part
'''
ClndrTable = SQLAlchemySchemaNode(DBTable)
'''
Cornice part
'''
PRF = 'api'

@resource(collection_path='%s/' % PRF, path='%s/{fid}' % PRF)
class TableApi(object):
    def __init__(self, request):
        self.request = request

    @view(schema=ClndrTable, renderer='json')
    def put(self):
        # do my stuff here
        pass
Where ClndrTable is my auto-generated schema. Now, when trying to deploy this code, I get the following error:
NotImplementedError: Schema node construction without a typ argument or a schema_type() callable present on the node class
As I've mentioned earlier, I suspect that the problem is that ClndrTable (given as an argument to the view decorator) is an instance of the schema automatically generated by colanderalchemy.
Anyone knowing how to resolve this?
Thanks all in advance!

This appears to be due to colander having both a typ property and a schema_type property. They're both supposed to tell you the schema's type, but they can actually hold different values. I filed an issue with colander, but if there's a fix it likely won't make it to PyPI any time soon.
So what's happening is: ColanderAlchemy ignores schema_type and uses typ, while Cornice ignores typ and uses schema_type.
You can hack a fix with the following: ClndrTable.schema_type = lambda: ClndrTable.typ
However, that just leads you to the next exception:
cornice.schemas.SchemaError: schema is not a MappingSchema: <class 'colanderalchemy.schema.SQLAlchemySchemaNode'>
This is due to Cornice not duck typing: it expects every schema to be a subclass of MappingSchema. However, MappingSchema is just a Schema whose typ/schema_type is Mapping (which is what ColanderAlchemy returns).
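To illustrate that equivalence, here's a minimal sketch in plain colander (separate from the Cornice code above):
import colander

# A MappingSchema subclass...
class ViaMappingSchema(colander.MappingSchema):
    name = colander.SchemaNode(colander.String())

# ...is essentially just a SchemaNode whose typ is Mapping:
via_node = colander.SchemaNode(colander.Mapping())
via_node.add(colander.SchemaNode(colander.String(), name='name'))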
I'll see if I can enact some changes to fix this.
Update
Despite the names, 'typ' and 'schema_type' have two different purposes. 'typ' always tells you the type of a schema instance. 'schema_type' is a method that's called to give a SchemaNode a default type when it's instantiated (so it's called in the __init__ if you don't pass a typ in, but other than that it's not supposed to be used).
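A minimal sketch of that distinction in plain colander:
import colander

class NameNode(colander.SchemaNode):
    # schema_type() is only consulted at construction time,
    # when no explicit typ is passed to __init__
    def schema_type(self):
        return colander.String()

default = NameNode()                  # typ becomes String() via schema_type()
explicit = NameNode(colander.Int())   # typ is Int(); schema_type() is ignored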
Cornice has been patched to properly use typ now (though, as of this message, it's not part of the latest release).

Related

Cannot update with FastAPI

When I run PUT from the FastAPI docs with the following code, I get 500 Error: Internal Server Error and the terminal shows AttributeError: 'Test' object has no attribute 'items'.
I can create, get, delete, etc. normally, but for some reason I just can't PUT.
Also, if I PUT a non-existent ID, I get a 404 error as expected.
I would appreciate it if you could tell me more about it.
router
@router.put('/{id}', status_code=status.HTTP_202_ACCEPTED)
def update(id, request: schemas.Test, db: Session = Depends(get_db)):
    test = db.query(models.Test).filter(models.Test.id == id)
    if not test.first():
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND,
                            detail=f'Test with id {id} is not found')
    test.update(request)
    db.commit()
    return 'updated'
models
from sqlalchemy import Column, Integer, String
from db import Base
class Test(Base):
    __tablename__ = 'tests'
    id = Column('id', Integer, primary_key=True, index=True)
    title = Column('title', String(256))
schemas
from pydantic import BaseModel
class Test(BaseModel):
    title: str
SQLAlchemy's Query.update() expects a dict of column/value pairs; you're giving it a Pydantic model.
Pydantic's BaseModel supports a dict() method that returns the object's fields as a dictionary. You can pass this dictionary to your update call instead:
test.update(request.dict())
Also be aware that the name request is used for other things in FastAPI and might not be a good name for a route parameter. The name id shadows a built-in Python function, so you'll usually want to name it test_id or something similar instead.
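Putting both suggestions together, the route might look like this (a sketch; test_id and body are suggested renames, not names from the original code):
@router.put('/{test_id}', status_code=status.HTTP_202_ACCEPTED)
def update(test_id: int, body: schemas.Test, db: Session = Depends(get_db)):
    test = db.query(models.Test).filter(models.Test.id == test_id)
    if not test.first():
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND,
                            detail=f'Test with id {test_id} is not found')
    test.update(body.dict())  # Query.update() takes a plain dict
    db.commit()
    return 'updated'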

Completely restart/reload declarative class with dynamic functionality in SQLAlchemy

I am using SQLAlchemy + SQLite3 for creating multiple databases based on user input. When initializing a new database, the user defines any number of arbitrary features and their types. I wrote a DBManager class to serve as an interface between user input and database creation/access.
Dynamically "injecting" these arbitrary features in the declarative model (the Features class) is working as expected. The problem I have is when the user wants to create a second/different database: I can't figure out how to completely "clear" or "refresh" the model or the declarative_base so that the user is able to create a new database (with possibly different features).
Below is a minimal reproducible example of my situation:
src.__init__.py:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
Session = sessionmaker()
Base = declarative_base()
src.features.py:
from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.orm import relationship
from src import Base
class Features(Base):
    __tablename__ = "features"
    features_id = Column(Integer, primary_key=True)

    @classmethod
    def add_feature(cls, feature_name, feature_type):
        setattr(cls, feature_name, Column(feature_type))
src.db_manager.py:
from typing import Optional, Dict
from sqlalchemy import create_engine
from src import Base, Session
from src.features import Features
class DBManager:
    def __init__(self, path: str, features: Optional[Dict] = None) -> None:
        self.engine = create_engine(f'sqlite:///{path}')
        Session.configure(bind=self.engine)
        self.session = Session()
        self.features = features
        if self.features:  # user passed in some arbitrary features
            self.bind_features_to_features_table()
        Base.metadata.create_all(bind=self.engine)

    def bind_features_to_features_table(self):
        for feature_name, feature_type in self.features.items():
            Features.add_feature(feature_name=feature_name, feature_type=feature_type)
I'd like to be able to do something like this:
from sqlalchemy import String, Float, Integer
from src.db_manager import DBManager
# User wants to create a database with these features
features = {
    'name': String,
    'height': Float,
}
db_manager = DBManager(path='my_database.db', features=features)
# ... User does some stuff with database here ...
# Now the user wants to create another database with these features
other_features = {
    'age': Integer,
    'weight': Float,
    'city_of_residence': String,
    'name': String,
}
db_manager = DBManager(path='another_database.db', features=other_features)
After executing the last line, I'm met with: InvalidRequestError: Implicitly combining column features.name with column features.name under attribute 'name'. Please configure one or more attributes for these same-named columns explicitly. The error wouldn't occur if the feature name did not appear on both databases, but then the feature height would be brought over to the second database, which is not desired.
Things I tried but didn't work:
calling Base.metadata.clear() between DBManager instances: same error
calling sqlalchemy.orm.clear_mappers() between DBManager instances: results in AttributeError: 'NoneType' object has no attribute 'instrument_attribute'
calling delattr(Features, feature_name): results in NotImplementedError: Can't un-map individual mapped attributes on a mapped class.
This program will be running inside a GUI, so I can't really afford to exit/restart the script in order to connect to the second database. The user should be able to load/create different databases without having to close the program.
I understand that the error stems from the fact that the underlying Base object has not been "refreshed" and is still keeping track of the features created in my first DBManager instance. However I do not know how to fix this. What's worse, any attempt to overwrite/reload a new Base object will need to be applied to all modules that imported that object from __init__.py, which sounds tricky. Does anyone have a solution for this?
My solution was to define the Features declarative class inside a function, get_features, that takes a Base (declarative base) instance as an argument. The function returns the Features class object, so that every call essentially creates a new Features class as a whole.
The DBManager class is then responsible for calling that function, and the returned Features class becomes an instance attribute of DBManager. Creating a new instance of DBManager means creating an entirely new Features class, to which I can then add any arbitrary features I'd like.
The code looks something like this:
def get_features(base):
    class Features(base):
        __tablename__ = "features"
        features_id = Column(Integer, primary_key=True)

        @classmethod
        def add_feature(cls, feature_name, feature_type):
            setattr(cls, feature_name, Column(feature_type))

    return Features

class DBManager:
    def __init__(self, path, features):
        self.engine = create_engine(f'sqlite:///{path}')
        Session.configure(bind=self.engine)
        self.session = Session()
        self.features = features
        base = declarative_base()  # a fresh declarative base per manager
        self.features_table = get_features(base=base)
        if self.features:  # user passed in some arbitrary features
            self.bind_features_to_features_table()
        base.metadata.create_all(bind=self.engine)

    def bind_features_to_features_table(self):
        for feature_name, feature_type in self.features.items():
            self.features_table.add_feature(feature_name=feature_name,
                                            feature_type=feature_type)
It definitely feels a bit convoluted, and I have no idea if there are any caveats I'm not aware of, but as far as I can tell this approach solved my problem.
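With that change, the usage from the question should work, since each DBManager builds its own declarative base (a quick sketch reusing the feature dicts from above):
from sqlalchemy import String, Float, Integer

# Each call creates a brand-new declarative base and Features class,
# so same-named features no longer collide across databases.
db_manager = DBManager(path='my_database.db',
                       features={'name': String, 'height': Float})
db_manager = DBManager(path='another_database.db',
                       features={'age': Integer, 'weight': Float,
                                 'city_of_residence': String, 'name': String})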

Flask + SQLAlchemy - custom metaclass to modify column setters (dynamic hybrid_property)

I have an existing, working Flask app that uses SQLAlchemy. Several of the models/tables in this app have columns that store raw HTML, and I'd like to inject a function on a column's setter so that the incoming raw html gets 'cleansed'. I want to do this in the model so I don't have to sprinkle "clean this data" all through the form or route code.
I can currently already do this like so:
from application import db, clean_the_data
from sqlalchemy.ext.hybrid import hybrid_property
class Example(db.Model):
    __tablename__ = 'example'
    normal_column = db.Column(db.Integer,
                              primary_key=True,
                              autoincrement=True)
    _html_column = db.Column('html_column', db.Text,
                             nullable=False)

    @hybrid_property
    def html_column(self):
        return self._html_column

    @html_column.setter
    def html_column(self, value):
        self._html_column = clean_the_data(value)
This works like a charm: apart from the model definition, the _html_column name is never seen, the cleaner function is called, and the cleaned data is used. Hooray.
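For instance (a sketch, assuming clean_the_data strips unwanted markup):
ex = Example()
ex.html_column = '<script>bad</script><p>ok</p>'  # assignment runs clean_the_data
print(ex.html_column)                             # the getter returns the cleaned text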
I could of course stop there and just eat the ugly handling of the columns, but why do that when you can mess with metaclasses?
Note: the following all assumes that 'application' is the main Flask module, and that it contains two children: 'db', the SQLAlchemy handle, and 'clean_the_data', the function that cleans up the incoming HTML.
So, I went about trying to make a new base Model class that spotted a column that needs cleaning when the class is being created, and juggled things around automatically, so that instead of the above code, you could do something like this:
from application import db
class Example(db.Model):
__tablename__ = 'example'
__html_columns__ = ['html_column'] # Our oh-so-subtle hint
normal_column = db.Column(db.Integer,
primary_key=True,
autoincrement=True)
html_column = db.Column(db.Text,
nullable=False)
Of course, the combination of trickery with metaclasses going on behind the scenes with SQLAlchemy and Flask made this less than straightforward (and is also why the nearly matching question "Custom metaclass to create hybrid properties in SQLAlchemy" doesn't quite help - Flask gets in the way too). I've almost gotten there with the following in application/models/__init__.py:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.ext.hybrid import hybrid_property

# Yes, I'm importing _X stuff...I tried other ways to avoid this
# but to no avail
from flask_sqlalchemy import (Model as BaseModel,
                              _BoundDeclarativeMeta,
                              _QueryProperty)

from application import db, clean_the_data

class _HTMLBoundDeclarativeMeta(_BoundDeclarativeMeta):
    def __new__(cls, name, bases, d):
        # Move any fields named in __html_columns__ to a
        # _field/field pair with a hybrid_property
        if '__html_columns__' in d:
            for field in d['__html_columns__']:
                if field not in d:
                    continue
                hidden = '_' + field
                fget = lambda self: getattr(self, hidden)
                fset = lambda self, value: setattr(self, hidden,
                                                   clean_the_data(value))
                d[hidden] = d[field]     # clobber...
                d[hidden].name = field   # So we don't have to explicitly
                                         # name the column. Should probably
                                         # force a quote on the name too
                d[field] = hybrid_property(fget, fset)
            del d['__html_columns__']  # Not needed any more
        return _BoundDeclarativeMeta.__new__(cls, name, bases, d)
# The following copied from how flask_sqlalchemy creates its Model
Model = declarative_base(cls=BaseModel, name='Model',
                         metaclass=_HTMLBoundDeclarativeMeta)
Model.query = _QueryProperty(db)

# Need to replace the original Model in flask_sqlalchemy, otherwise it
# uses the old one, while you use the new one, and tables aren't
# shared between them
db.Model = Model
Once that's set, your model class can look like:
from application import db
from application.models import Model
class Example(Model):  # Or db.Model really, since it's been replaced
    __tablename__ = 'example'
    __html_columns__ = ['html_column']  # Our oh-so-subtle hint
    normal_column = db.Column(db.Integer,
                              primary_key=True,
                              autoincrement=True)
    html_column = db.Column(db.Text,
                            nullable=False)
This almost works, in that there are no errors, data is read and saved correctly, etc. Except the setter for the hybrid_property is never called. The getter is (I've confirmed with print statements in both), but the setter is ignored totally and the cleaner function is thus never called. The data is set, though - changes are made quite happily with the uncleaned data.
Obviously I've not quite completely emulated the static version of the code in my dynamic version, but I honestly have no idea where the issue is. As far as I can see, the hybrid_property should be registering the setter just like it has the getter, but it's just not. In the static version, the setter is registered and used just fine.
Any ideas on how to get that final step working?
Maybe use a custom type?
from sqlalchemy import TypeDecorator, Text

class CleanedHtml(TypeDecorator):
    impl = Text

    def process_bind_param(self, value, dialect):
        return clean_the_data(value)
Then you can just write your models this way:
class Example(db.Model):
    __tablename__ = 'example'
    normal_column = db.Column(db.Integer, primary_key=True, autoincrement=True)
    html_column = db.Column(CleanedHtml)
More explanations are available in the documentation here: http://docs.sqlalchemy.org/en/latest/core/custom_types.html#augmenting-existing-types
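One caveat worth noting: process_bind_param is also called with None when the value is NULL, so a guard may be needed (a sketch):
def process_bind_param(self, value, dialect):
    # value is None when the column is NULL; skip cleaning in that case
    if value is None:
        return None
    return clean_the_data(value)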

How to use exclude_properties and include_properties to exclude/include SQLAlchemy model attributes from corresponding Spyne model?

I have a model declared as:
class SAProduct(Base):
    sku = Column(PRODUCT_SKU_TYPE, primary_key=True)
    i_want_to_hide = Column(String(20), nullable=False)
    name = Column(Unicode(255), nullable=True)

    @property
    def my_property(self):
        return i_calculate_property_here(self)
and Spyne model declared as:
db = create_engine('sqlite:///:memory:')
Session = sessionmaker(bind=db)

class TableModel(ComplexModelBase):
    __metaclass__ = ComplexModelMeta
    __metadata__ = MetaData(bind=db)

class SProduct(TableModel):
    __table__ = SAProduct.__table__
How can I make the attribute i_want_to_hide be excluded from the Spyne model, and the property my_property be included as a Spyne model attribute?
P.S.
Currently I monkey-patch Spyne to support this syntax:
class SProduct(GComplexModel):
    __model__ = Product

    class Attributes:
        exclude_attrs = ('i_want_to_hide',)
        add_attrs = {'my_property': Boolean}
But I want to get rid of it.
This doesn't directly answer your question, but please consider the following code:
from spyne import *

TableModel = TTableModel()

class SomeClass(TableModel):
    __tablename__ = 'some_table'

    id = Integer(pk=True)
    s = Unicode
    i = Integer(exc_table=True)
Here, pk stands for primary key (you can use the long form primary_key if you wish), and the i attribute will just be ignored by SQLAlchemy: it won't be created in the table, it won't be instrumented by SQLAlchemy's metaclass, etc.
As for an attribute that will be hidden from the RPC parts of Spyne, but not from SQLAlchemy, that's a new feature coming in 2.12.
You will be able to say e.g.:
i = Integer(exc_table=True, pa={JsonObject: dict(exc=True)})
where pa stands for protocol attributes (you can use the long form prot_attrs if you wish). Here, i is ignored by every protocol that inherits from JsonObject.
If you don't want it on the wsdl either, you'll have to do:
i = Integer(exc_table=True, exc_interface=True)
https://github.com/arskom/spyne/blob/fa4b1eef5815d3584287d1fef66b61846f82d2f8/spyne/interface/xml_schema/model.py#L197
Spyne offers a richer object model interface compared to SQLAlchemy. Trying to replicate this functionality without adding Spyne as a dependency means you'll have to duplicate all the work done in Spyne in your project. It's your choice!

PyMongo: Parsing/serializing query output of a collection

By default, the collection.find() and collection.find_one() functions return results as dictionaries, and if you pass the parameter as_class=SomeUserClass, PyMongo will try to parse the result into that class instead.
But it seems this class must also be derived from dict (it requires a __setitem__ function to be defined so that keys can be added).
Here I want to set the properties of the class. How can I achieve this?
Also, my collection class contains some child classes as properties, so how can I set the properties of the child classes as well?
It sounds like you want something like an object-relational mapper. I am the primary author of one, Ming, but several others exist for Python as well. In Ming, you might do the following to set up your mapping:
from ming import schema, collection, Field
from ming.orm import (mapper, Mapper, RelationProperty,
                      ForeignIdProperty)

WikiDoc = collection('wiki_page', session,
                     Field('_id', schema.ObjectId()),
                     Field('title', str, index=True),
                     Field('text', str))
CommentDoc = collection('comment', session,
                        Field('_id', schema.ObjectId()),
                        Field('page_id', schema.ObjectId(), index=True),
                        Field('text', str))

class WikiPage(object): pass
class Comment(object): pass

ormsession.mapper(WikiPage, WikiDoc, properties=dict(
    comments=RelationProperty('WikiComment')))
ormsession.mapper(Comment, CommentDoc, properties=dict(
    page_id=ForeignIdProperty('WikiPage'),
    page=RelationProperty('WikiPage')))
Mapper.compile_all()
Then you can query for some particular page via:
pg = WikiPage.query.get(title='MyPage')
pg.comments # loads comments via a second query from MongoDB
The various ODMs I know of for MongoDB in Python are listed below.
Ming
MongoKit
MongoEngine
I have solved this by adding __setitem__ to the class. Then I do:
result = as_class()
for key, value in dict_expr.items():
    result.__setitem__(key, value)
and in my class, __setitem__ looks like this:
def __setitem__(self, key, value):
    try:
        attr = getattr(self, key)
        if attr is not None:
            if isinstance(value, dict):
                # recurse into child objects that also define __setitem__
                for child_key, child_value in value.items():
                    attr.__setitem__(child_key, child_value)
                setattr(self, key, attr)
            else:
                setattr(self, key, value)
    except AttributeError:
        pass
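For reference, this is roughly how such a class plugs into a query (a sketch; PageWrapper is a hypothetical example, and as_class is the PyMongo 2.x-era parameter from the question - newer PyMongo versions use CodecOptions(document_class=...) instead):
# Hypothetical wrapper using the __setitem__ approach above
class PageWrapper(object):
    title = None
    text = None

    def __setitem__(self, key, value):
        setattr(self, key, value)

doc = db.wiki_page.find_one({'title': 'MyPage'}, as_class=PageWrapper)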
