All within the Django admin, I'd like to enter form fields related to generating a report. Fields you'd expect in a report: report name, report type, start and end dates, report fields, etc.
How would one take these inputs, grab the inputs from the request, pass these inputs to an API (in the background), then process it (in queue-like fashion), finally create a CSV or PDF to download?
I'm fine with creating the admin model form, and I think I can grab the inputs when the form is submitted in the admin; then I think I simply pass those inputs to my other API code to process...
My questions are:
When the third-party API is processing the request, is there a special way to handle this lag time?
Where and how would I return the result - which is a CSV or PDF - in the admin interface? The /change/ page?
Is there best-practice for this? I haven't been able to find an example of this when dealing with the admin. I'm not new to Python but am somewhat new to Django.
Nobody will likely answer. Therefore, I'll share my high-level general solution.
You'll need to set up celery.py and modify your __init__.py as shown in Celery's documentation. The key "gotcha" is that wherever the docs say "proj", you should put the actual name of your project, not "proj".
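For reference, a minimal sketch along the lines of the Celery docs, with "myproject" standing in for your actual project package name:

# myproject/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()

# myproject/__init__.py
from .celery import app as celery_app
__all__ = ('celery_app',)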
In your admin.ModelAdmin class, override save_model(self, request, obj, form, change) to capture the form data submitted from the admin form.
Pass the form inputs to a Celery task function in tasks.py within the current app, e.g. get_report.delay(name, start, end, type). Note the .delay part: that's what lets the task run in the background so you don't have to wait at the screen after you submit the form.
In save_model(...), you still have to call super().save_model(request, obj, form, change) at the end.
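Putting the save_model override and the task together, a rough sketch (the model name, field names, and get_report signature are assumptions based on the example call above):

# tasks.py (in the same app)
from celery import shared_task

@shared_task
def get_report(name, start, end, type):
    # call the third-party API here, then build the CSV/PDF and store it somewhere
    ...

# admin.py
from django.contrib import admin
from .models import Report  # hypothetical model backing the admin form
from .tasks import get_report

@admin.register(Report)
class ReportAdmin(admin.ModelAdmin):
    def save_model(self, request, obj, form, change):
        data = form.cleaned_data
        # .delay() queues the work on the Celery worker and returns immediately;
        # if start/end are date objects, you may need to pass them as strings so
        # the default JSON serializer accepts them
        get_report.delay(data['name'], data['start'], data['end'], data['type'])
        super().save_model(request, obj, form, change)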
The most important step of all: you have to open two terminals! One for running python manage.py runserver (for example) and another for starting the Celery worker. I'm on Windows, and Windows has concurrency issues, so a non-concurrent way to start the worker is celery -A scorecard worker --pool=solo -l info (typed in the second terminal, where scorecard is my project name). Alternatively, you can install eventlet and pass --pool=eventlet instead of --pool=solo.
The use case is rather simple: a background transaction manager, which requires that this classic type of POST handler:
class MyCreate(CreateView):
    model = MyModel
    fields = '__all__'

    def post(self, request, *args, **kwargs):
        self.form = self.get_form()
        if self.form.is_valid():
            self.object = self.form.save(commit=True)
            return self.form_valid(self.form)
        else:
            return self.form_invalid(self.form)
doesn't save the form with form.save(), but rather saves the form object somewhere (to the session?) and starts a Celery task, which will then be responsible for running form.save().
The problem I'm facing is that the form object refuses to serialize at all, with JSON or pickle; it's just too rich an object, and starting a Celery task requires the arguments to be serialized. I can serialize just the POST data (request.POST), which is doable, and pass it to the Celery task as an argument, and that works, but from that I can't find any way to re-instantiate a form, let alone a formset.
If I want to start the Celery task from a view I redirect to (one that implements a progress bar, say), then I have the even bigger challenge of passing the form to a whole new Django view. The obvious candidate is to save it to the session:
self.request.session["my_form_data"] = self.request.POST
But then in the Celery task I can't even load that session data (I can possibly wrangle it using out-of-view session access); alternatively I can load it in the view and pass it to the task as an argument.
The aim, though, is for the Celery task to have a Form object, built from the posted data, that we can call save() on and generally manage and manipulate as we do a Form in a view. One possibility is to instantiate a ModelForm using the POST data as initial data and then save that, but that's not obviously a solution and may or may not work.
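Roughly, that possibility would look like the sketch below, with the POST data bound as data rather than initial (a bound form is what makes is_valid() and save() available again); MyModelForm and the task name are invented for illustration:

# tasks.py -- a sketch only, not an established solution
from celery import shared_task
from myapp.forms import MyModelForm  # hypothetical ModelForm

@shared_task
def save_form_later(post_data):
    # post_data is a plain dict (e.g. request.POST.dict()), which serializes fine
    form = MyModelForm(data=post_data)  # bound form: is_valid()/save() work as usual
    if form.is_valid():
        form.save()

On the view side, something like save_form_later.delay(request.POST.dict()) would pass the data along, though QueryDict.dict() keeps only one value per key, so formsets and multi-value fields would need more care.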
In short, I'm trying to recreate a small part of the view context inside a Celery task, and given how often I see Django and Celery mentioned on-line together (and they seem to work well together) I'm really wondering before I experiment this to death with a hacked up solution, if there are not some simple or canonical means of doing this that I am missing.
I see hints of solutions all over the place. There is django-remote-forms, which promises to serialize a form but offers no means of recreating a Form object or objects (given we might be using FormSets) from such serialized data. And there's the hint of a FormSerializer in Django REST framework, but that's a big package with a lot of learning and baggage attached, and it may not even do what I need by the time I'm done.
There's even good documentation of serializing Django objects, but not Forms.
It would be a dream if there was a simple, canonical method of passing a Django Form object to a Celery Task so it can work with it just as the Django view might.
I have this project that I’ve been working on for a while:
In the views.py file, I scrape a lot of information from IMDB, with each call taking around 0.3 seconds. Meanwhile, my page sits idle. I want it to load the page first and then finish up the calls.
For instance, I want it to load the recommended movies after already showing the actors and actresses that played in both. Or in an index, I want to allow the user to type and then show the options to click on.
I’ve tried Celery with Redis, but Django can’t display asynchronous tasks.
How could I do this?
As you said, Django can't do that. My advice would be to divide and conquer:
First, write a view that just displays some HTML to load a JS script that can load your data asynchronously, using the render shortcut. A front-end framework like Vue can help you to hide the parts of your layout while your data loads.
Next, write views that just return the data using the JsonResponse object in Django, for example: one view to load the recommendations, one view to load the actor list.
Use XHR requests to call your views and retrieve the information, using Promise methods to make everything appear in sync.
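A minimal sketch of one such data-only view (the recommendation lookup is just a stand-in here):

# views.py -- hypothetical JSON endpoint the front end can call with XHR/fetch
from django.http import JsonResponse

def get_recommendations(movie_id):
    # stand-in for the slow IMDB scraping logic
    return []

def recommendations(request, movie_id):
    return JsonResponse({'recommendations': get_recommendations(movie_id)})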
Bonus: If you already have Celery in place, you can define a task that grabs all the data you need on the server, and create a view that polls the status of your task, and then call it using XHR every few milliseconds until your data (or part of it, that really depends on how you define your task) is ready.
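If a Celery task is already in place, the polling endpoint from the bonus point could be as small as this (URL wiring and the task itself are left out):

# views.py -- status endpoint the page can poll with XHR
from celery.result import AsyncResult
from django.http import JsonResponse

def task_status(request, task_id):
    result = AsyncResult(task_id)
    payload = {'state': result.state}
    if result.ready():
        payload['data'] = result.result  # whatever the task returned
    return JsonResponse(payload)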
I have written some test scripts in Python which call some APIs for a particular application and output the results to a database so I can use Jasper to report on the results. Currently the scripts are run using a Python interpreter, i.e. double-clicking the Python file, with parameters and variables modified within the script before it initiates the tests. I would like to move towards a more user-friendly interface so other people can use these test scripts without having to modify the Python code. So I am thinking about using Django and creating a web page that has check boxes (if a check box is ticked, it will execute that particular test script) or a text box that will pass a value to a given variable, for example. I have a few questions around how this might be achieved.
1 - Would I need to implement the Python code in the Django source code, or can I call the Python script from the web page served up by Django?
2 - If I were to run this from a web page, how could I ensure that if the web page was closed, the test would continue in the background?
3 - Is there a way to output the status of the test case to a web page, and if the web page was closed, for the status to still be available when the web page is reopened?
Many thanks - oli
If you have a Python function, you can call it from a Django view, maybe with a form as parameter input. If you have long-running processes you might want to consider a tip from here: How to start a long-running process from a Django view?
from django.http import HttpResponseRedirect
from django.shortcuts import render_to_response
from django.core.urlresolvers import reverse  # django.urls.reverse on newer Django

from mytests import testfunc
from myapp.forms import ParameterForm  # wherever your ParameterForm lives

def test_parameter_view(request):
    if request.method == 'POST':  # If the form has been submitted...
        form = ParameterForm(request.POST)
        if form.is_valid():
            testfunc(form.cleaned_data['parameter'])  # <-- Here the actual testing happens
            return HttpResponseRedirect(reverse(test_result))  # Redirect after POST
    else:
        form = ParameterForm()
    return render_to_response('test.html', {
        'form': form,
    })
In your test_result view you can access the test result values from the database.
If the user closes the browser or not doesn't affect server processes that already have been started. And since you write your results to the database they are persistent and can be accessed any time after the test has finished.
If you don't want to port your scripts over into Django views, there is another way:
1 - Set up a form with all the options that you want passed to the script
2 - GET or POST the form params and save them to variables using var1 = request.POST['param1'], etc
3 - Use a module called subprocess to execute your script. http://docs.python.org/library/subprocess.html
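For step 3, a rough sketch (the script path, parameter name, and view name are placeholders):

# views.py -- hand the work off to the existing script via subprocess
import subprocess
from django.http import HttpResponse

def run_test(request):
    if request.method == 'POST':
        param1 = request.POST['param1']
        # Popen returns immediately; the script keeps running on the server
        # even if the user closes the browser
        subprocess.Popen(['python', '/path/to/test_script.py', param1])
    return HttpResponse('Test started')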
I have a Django admin interface that is used almost solely as a GUI form for making changes to a single PostgreSQL table. There's also a Python script that's currently run manually from the command line whenever a change is made to the database, & I'd like to hook that up so it runs whenever someone hits "save" after making a change to a row of the table via the admin interface. If this was an entry in views.py, it looks like I'd import the script as a module and run its main function from the view (ie, Can Django use "external" python scripts linked to other libraries (NumPy, RPy2...)). I'm not sure, however, how to do this in the admin interface.
How is admin.py similar/different to a regular entry in views.py?
Where do I put the import/call to the external script - somewhere in the model, somewhere in admin.py?
I'm familiar with Python, but am fairly new to (& somewhat mystified by) "web stuff" (ie, frameworks like Django), & I'm not even sure if I'm asking this question very clearly, because I'm still a little fuzzy on the view/model concept ...
Edit: Turns out I had, in fact, found the solution by reading the documentation/tutorial, but assumed there was a difference with admin stuff. As Keith mentioned in the comments, I'm now running into permissions issues, but I guess that's a separate problem. So thanks, & maybe I'll stop second guessing myself ...
Generally, things you want to happen at 'save' time are either:
1 - Part of the model. If so, you override the model's save method: http://docs.djangoproject.com/en/1.3/ref/models/instances/#saving-objects. You can do anything in that save method.
2 - Part of the view function. If so, you either extend the admin interface (not so easy), or you write your own.
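A bare-bones sketch of the first option (the model field and the imported script function are placeholders):

# models.py
from django.db import models
from myscripts import update_results  # hypothetical import of your script's main function

class MyModel(models.Model):
    name = models.CharField(max_length=100)

    def save(self, *args, **kwargs):
        super(MyModel, self).save(*args, **kwargs)  # write the row first
        update_results()                            # then run the external script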
One thing you might consider is defining the save_model method in your ModelAdmin. This will get executed when someone saves from the admin (but not when someone does a save outside of the admin). This approach might depend on what your requirements are, but should give you the necessary hook when doing the save from the admin.
In admin.py
class MyModelAdmin(admin.ModelAdmin):
    model = models.MyModel

    def save_model(self, request, obj, form, change):
        # you can put custom code in here
        obj.save()
In Python-Django, I've got a Model with a FileField member in it. This member stores video files.
I'd like to "interfere" with the standard "add model row/object/instance" proecdure of Django, and manipulate each video I'm adding, before actually committing or adding it to database.
The manipulation is to convert the video to a specific uniform format. Thus, all added videos will eventually be stored in the same format (WebM).
How can I do that? I've looked into Django's custom managers, but I don't think that's what I'm looking for.
Thanks. :)
You can either override save() or use signals.
However, converting the video will take a lot of time. It is probably not a good idea to do that synchronously in your web request. A common approach is to offload the work to a task queue. Have a look at Celery for that.
I am actually doing this very same thing. You don't want to process the video file on the same request as it comes in on for several reasons:
1) You will hang the user on a non-responsive page for a long time, possibly timing them out and leaving them wondering whether it worked.
2) If they go to check whether it uploaded and it still hasn't finished and saved to the DB (an inconsistent state), they'll think it is broken.
You want to initially save the record and file on your server and mark it as needs-to-be-worked-on. Then fire off a Celery task that does the work and updates that flag when it is complete. I am doing this very thing with Zencoder for a project I am working on right now, and it works wonderfully.
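A rough sketch of that flag-and-queue flow (the model, field, and task names are made up for illustration):

# models.py
from django.db import models

class Video(models.Model):
    file = models.FileField(upload_to='videos/')
    status = models.CharField(max_length=20, default='pending')  # the needs-to-be-worked-on flag

# tasks.py
from celery import shared_task
from myapp.models import Video

@shared_task
def convert_to_webm(video_id):
    video = Video.objects.get(pk=video_id)
    # ... run the actual conversion on video.file here (Zencoder, ffmpeg, etc.) ...
    video.status = 'ready'
    video.save()

The view (or save_model) saves the record with the default 'pending' status and calls convert_to_webm.delay(video.pk); the flag flips to 'ready' only after the task finishes.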
Celery: http://pypi.python.org/pypi/django-celery
Ghettoq (for local): http://pypi.python.org/pypi/ghettoq
Or you can use django signals to trigger events when items are about to be, or have been, saved to the database.
Specifically, you use the Signal.connect() method to attach the handler you want to run to a signal such as pre_save, post_save, pre_delete, post_delete, etc.
In order to set things up:
signals.py:
from django.db.models.signals import post_save
from myapp.models import Entry  # Entry is the models.Model subclass mentioned below

def entry_action_post_save(sender, instance, **kwargs):
    # what do we want to do here?
    pass

post_save.connect(entry_action_post_save, sender=Entry)
Where for me, Entry is a models.Model derived class.
This blog also covers an alternative way of setting it up using the dispatcher in models.py.
Note that since you are considering video encoding here, you might not want to actually re-encode the video inside these signal handlers; otherwise your request will take forever to complete. A better method is to check the encoding and give the model a status field indicating webm or not-webm, then pass the encoding task off elsewhere and not display the video (e.g. Videos.objects.filter(format='webm')) until it is complete.
You can just override the save() method on your model. See the documentation.