From Nicola's SO answer here and my own testing it seems clear that Eve does not support conditional deletes at resource endpoints.
I know I could send a GET with where={...} to fetch the _id and _etag of each document I would like to delete, and then issue a series of DELETE requests against each item endpoint with the If-Match header set to that item's _etag:
for each item:
    DELETE: http://localhost:5000/items/<item._id>
...but I would like to avoid sending out multiple HTTP requests if possible.
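For illustration, each per-item request in that loop could be built with the standard library along these lines (the URL, _id, and _etag values are placeholders, not from a real API):

```python
import urllib.request

def build_delete(base_url, item_id, etag):
    # DELETE against the item endpoint, with If-Match set to the item's _etag
    return urllib.request.Request(
        f"{base_url}/items/{item_id}",
        method="DELETE",
        headers={"If-Match": etag},
    )

# placeholder values for illustration
req = build_delete("http://localhost:5000", "5a1054531d41c80e7c0b7acd", "a2b3c4")
# urllib.request.urlopen(req) would actually send it
```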
One solution may be predefined database filters, but those are static, whereas I'd like to filter deletions dynamically based on URL parameters. Pre-event hooks may be the solution I'm seeking.
Does Eve support bulk deletion? And if not, what is the recommended way to extend Eve's functionality to provide conditional and/or bulk deletes?
I added a pre-event hook to DELETE, and this seems to be working with the tests I've run so far:
import json

from eve import Eve

def add_delete_filters(resource, request, lookup):
    if 'where' in request.args:
        conditions = request.args.getlist('where')
        for cond_str in conditions:
            cond = json.loads(cond_str)
            for attrib in cond:
                lookup[attrib] = cond[attrib]

app = Eve()
app.on_pre_DELETE += add_delete_filters
I am now developing with FastAPI and SQLAlchemy Core 1.4.
I've got my API function.
async def do_something(
    ...
    current_user: dict = Depends(get_superuser),
    conn: AsyncConnection = Depends(engine_begin),
):
    ...
The do_something endpoint depends on get_superuser and on a SQLAlchemy connection provided by engine_begin.
I am planning on sub-dependencies like this:
do_something -> get_superuser -> get_current_user
The get_current_user dependency does not need database access; it only validates the JWT. However, get_superuser does need database access to check permissions and such.
async def get_superuser(
    current_user: dict = Depends(get_current_user),
    conn: AsyncConnection = Depends(engine_connect),
) -> dict:
    ...
    return {
        "superuser": row._mapping,
        "conn": conn,
    }
I was able to make get_superuser depend on get_current_user and engine_begin, returning the dict { "superuser": dict, "conn": connection } shown above.
Personally, it does not make sense to me for get_superuser to return the database connection as well.
I tried putting the dependency in the path decorator, dependencies=[Depends(engine_begin)], but then I could not get the connection handle.
Another option might be duplicate dependencies: do_something depends on engine_begin, and get_superuser also depends on engine_begin. I think that will cause another problem due to nested transaction begins.
I would like to know if there is a clear way to check authorization/permission after authentication.
Thank you.
FastAPI will not evaluate a dependency multiple times within a single request, so your database connection will only be initialized once.
There shouldn't be any issue with declaring the same dependency in multiple places, such as conn: AsyncConnection = Depends(engine_connect). It just says "I need the value returned by this function"; any subsequent reference in the same request receives that same value. From the FastAPI documentation:
If one of your dependencies is declared multiple times for the same path operation, for example, multiple dependencies have a common sub-dependency, FastAPI will know to call that sub-dependency only once per request.
And it will save the returned value in a "cache" and pass it to all the "dependants" that need it in that specific request, instead of calling the dependency multiple times for the same request.
In an advanced scenario where you know you need the dependency to be called at every step (possibly multiple times) in the same request instead of using the "cached" value, you can set the parameter use_cache=False when using Depends.
So your assumption "I think it will cause another problem due to nested transaction begin" is wrong: this is the suggested way of handling the same dependency being required in multiple locations.
This allows you to compose your dependencies in a very visible and helpful way, and they only get resolved if they're actually necessary somewhere in the hierarchy.
I have a 200 MB CSV file containing rows where a key term is matched against a list of strings in the second column.
term_x | ["term_1","term_2"]
term_y | ["term_1","term_2"]
term_z | ["term_1","term_2"]
My Django app is not configured to use any complex memory caching (Redis, Memcached). In practice, I want to query the database table with a term to retrieve the corresponding list value. Due to the table's size, however, retrieving the list from the correct row takes around half a second, on top of the other actions performed while loading the page.
Is it possible in Django to "pre-cache" this table upon server startup, i.e. add all of those values to the cache with the first column as the key? I have attempted something similar by overriding the ready method in my apps.py to load the database table into the cache on startup, but I get null values when I try to use a term I know is in the table:
import pandas as pd

from django.apps import AppConfig
from django.core.cache import cache

class MyAppConfig(AppConfig):
    name = 'test_display'

    def ready(self):
        print("Loading RS Lookup cache...")
        # set up the database connection (engine) here....
        cache_df = pd.read_sql_query("Select * from table_to_cache", engine)
        print("Table loaded")
        for index, row in cache_df.iterrows():
            cache.set(row['term'], row['list_of_terms'], None)
        print("RS Table loaded")
The __init__.py in the same Django app has only one line:
default_app_config = 'test_display.apps.MyAppConfig'
Check whether the following is correct:
In the project settings, you either did not configure caching at all or used local-memory caching as described in the documentation.
You only use the default cache (from django.core.cache import cache) or correctly handle cache names.
Make sure your code in .ready() actually stores the values you are trying to read later. You can use one of the following:
assert "my_term" in cache, "The term is not cached!"
or
from django.core.cache.backends import locmem
print(locmem._caches)
# now check what you have inside using your very own eyes and patience
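For reference, the local-memory cache mentioned in the first point is configured in settings.py roughly like this (a sketch of the default; the LOCATION string is an arbitrary name):

```python
# settings.py -- explicit local-memory cache (this backend is also what Django
# falls back to when CACHES is not configured at all)
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.locmem.LocMemCache",
        "LOCATION": "unique-snowflake",  # only needed to tell multiple locmem caches apart
    }
}
```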
As for the following:
Is it possible in Django to "pre-cache" ... ?
Your solution utilizes AppConfig.ready() which is generally a very good place for activities that your server should only perform once per instance. At least I am not aware of a better solution.
I am adding new data into my database by doing a POST request against my Eve API.
Since some data needs to be added on the Python side, I thought I could do this using a pre-request event hook.
So, is there a way to modify the data contained in the POST request using a pre-request hook, before the data is inserted into the database? I already understand how to implement such a hook, but I have no clue how to modify the data before it is inserted into the DB.
You probably want to look at database hooks, specifically at insert hooks:
When a POST request hits the API and new items are about to be stored in the database, these events are fired:
on_insert for every resource endpoint.
on_insert_<resource_name> for the specific resource endpoint.
Callback functions could hook into these events to arbitrarily add new fields or edit existing ones.
In the code below:
from eve import Eve

def before_insert(resource_name, documents):
    if resource_name == 'myresource':
        for document in documents:
            document['field'] = 'value'

app = Eve()
app.on_insert += before_insert
app.run()
Every time a POST hits the API, the before_insert function is invoked. The function updates field for every document. Since this callback is invoked before the payload is sent to the database, the changes will be persisted.
An interesting alternative would be:
from eve import Eve

def before_insert(resource_name, documents):
    for document in documents:
        document['field'] = 'value'

app = Eve()
app.on_insert_myresource += before_insert
app.run()
In this callback we are not testing the endpoint name anymore. That's because we hooked the callback to the on_insert_myresource event, so the function will only be called when POST requests are performed on the myresource endpoint. This gives better separation of concerns and simpler code, and it also improves performance, since the callback is not hit on every API insert. As a side note, you can eventually hook multiple callbacks to the same event (hence the use of the addition operator +=).
In my case, I wanted to duplicate documents if a given property was present in the data.
I had to use a pre_POST event hook to do that.
import json
from copy import copy

def pre_notifications(request):
    data = json.loads(request.get_data())
    if 'payload' in data and 'condition' in data:
        notification = data['payload']
        documents = []
        users = app.data.pymongo().db.users.find()
        for user in users:
            copy_notification = copy(notification)
            copy_notification['user_email'] = user['user_email']
            documents.append(copy_notification)
        request._cached_data = json.dumps(documents).encode('utf-8')
First, I tried replacing request.data, but that did not work. Digging through the source, I found the _cached_data property; with that, it works.
Just to complement @Gustavo's answer (I cannot leave a comment on it): you can update the request._cached_json property without serializing your data.
Using his example:
import json
from copy import copy

def pre_notifications(request):
    data = json.loads(request.get_data())
    if 'payload' in data and 'condition' in data:
        notification = data['payload']
        documents = []
        users = app.data.pymongo().db.users.find()
        for user in users:
            copy_notification = copy(notification)
            copy_notification['user_email'] = user['user_email']
            documents.append(copy_notification)
        request._cached_json = documents
Is there some way to return items where a field contains some value? E.g.
GET /people?contains="foo"
This should return all people that have the word 'foo' in their name.
Thanks in advance
You could use mongodb $regex operator, which is blacklisted by default in Eve (MONGO_QUERY_BLACKLIST = ['$where', '$regex']).
Add MONGO_QUERY_BLACKLIST = ['$where'] to your settings.py. Then you can query your API like this:
?where={"name": {"$regex": ".*foo.*"}}.
Be careful however. If you don't control the client, enabling regexes could potentially increase your API vulnerability.
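For instance, a client could build that where query with the standard library; the host and field names here are just for illustration:

```python
import json
from urllib.parse import urlencode

# the filter from the answer above, serialized and URL-encoded
where = json.dumps({"name": {"$regex": ".*foo.*"}})
url = "http://localhost:5000/people?" + urlencode({"where": where})
# url can now be fetched with any HTTP client
```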
I am not conversant with Eve myself, but looking at its webpage, it seems to expose all of Flask's functionality.
You need to be looking at this page in the documentation that talks about accessing the request data.
In your Flask app, define a method that accepts both POST and GET requests and then you can access foo by doing request.args.get('contains', '').
This is what I mean:
@app.route('/people', methods=['POST', 'GET'])
def get_people():
    search_key = request.args.get('contains', '')
    # search for people containing 'foo' in your DB
Hope this gives you a starting point as to how to go about things.
I have a resource in Eve, e.g. ABC, and I want to manipulate another resource, e.g. BCD, when some condition is met while posting a new item to ABC. I know I can hook the post/pre_POST_ABC events, but is there an 'internal' way to POST to BCD without going through HTTP again?
In your callback function you could either:
A) use the data driver to store data directly to the database
Something like this:
from eve import Eve

def update_ABC(request, payload):
    accounts = app.data.driver.db['abc_collection']
    account = accounts.insert(docs)  # docs: the document(s) you want to insert

app = Eve()
app.on_post_POST_ABC += update_ABC
app.run()
Would do the trick. You would be bypassing the framework this way, and storing directly on the database.
B) Use app.test_client().post() to POST directly through the application:
app.test_client().post('/bcd', data=json.dumps({"field": "value"}), content_type='application/json')
This is probably the better option, since the request goes through the framework (meta fields like _created are handled for you).
Update: with v0.5+ you can now use post_internal to achieve the same result. Equivalent internal methods are available for the other CRUD methods as well.