How to run rqworker for python-rq

I have a small Python Flask webserver on an Ubuntu box (nginx and uWSGI) that I just started using to receive and process webhooks. Part of the webhook processing can include sending an email, which I noticed causes a delay and subsequently blocks the response back to the server sending the webhook.
In researching a way to mitigate this, I discovered python-rq (aka rq), which lets me queue up a function call and then immediately respond to the webhook. In testing, this works great!
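The enqueue-and-return pattern described above looks roughly like the sketch below; the route, queue setup, and email function here are illustrative placeholders rather than the original code.

    # Illustrative only: enqueue the slow email step with rq and return from
    # the webhook handler right away; rqworker picks the job up later.
    from flask import Flask, request
    from redis import Redis
    from rq import Queue

    app = Flask(__name__)
    queue = Queue(connection=Redis())

    def send_notification_email(payload):
        # placeholder for the slow email-sending work
        pass

    @app.route("/webhook", methods=["POST"])
    def webhook():
        payload = request.get_json()
        queue.enqueue(send_notification_email, payload)  # runs later in the worker
        return "", 204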
I'm testing it on my server, and to start rq I have to run rqworker in the same directory as my website. This is fine for testing, but I don't want to have to log into the server and start rq by hand just to keep it running.
Some ideas I've come across:
The python-rq docs mention supervisor, http://python-rq.org/patterns/supervisor/, but I don't know if I need that much overhead.
Would a simple cron job do the trick, using @reboot?
This is a small internal-only server. I don't want to over-engineer it (I feel like I'm creeping in that direction already), but I also don't want to have to babysit it to make sure all of the pieces are working.
How can I set up rqworker to run in the web site application directory on its own?
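For reference, the supervisor pattern linked above boils down to a small program block along these lines; the program name, paths, and user below are placeholders that would need adjusting:

    ; minimal supervisor program block for an rq worker (illustrative)
    [program:rqworker]
    command = /usr/local/bin/rqworker
    directory = /path/to/your/webapp    ; run in the web application directory
    user = www-data
    autostart = true
    autorestart = true
    stopsignal = TERM

The cron alternative is a single @reboot entry in the crontab, for example:

    @reboot cd /path/to/your/webapp && /usr/local/bin/rqworker >> /var/log/rqworker.log 2>&1

The main trade-off is that supervisor will also restart the worker if it ever crashes, while @reboot only starts it once at boot.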

Related

Serving requests to always-on python application from Apache

I’ve been running an Apache + PHP webserver for years with no issues. I would like to add a python application to the server, but I can’t figure out how to connect Apache to a python application that is always running. In short, I don’t know the name of the concept or buzzword to research regarding using Apache to send requests back and forth to a python script that is listening on a specific port.
The python script is part of the Microsoft Teams bot framework. It typically runs on port 3978, which means I need Apache to listen for requests to https://bot.example.com and then communicate the requests to localhost:3978. The python application needs to always be running to perform proactive tasks, such as messaging reminders to users, but that’s not important.
The main idea is to use Apache as the SSL front end, presumably because it is battle tested, while a python application runs in a forever loop to handle the requests. I have searched for hours, but I can't figure out how to get started even though I am experienced with both Apache and python. Any help would be greatly appreciated.
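What is being described here (Apache terminating SSL and passing requests through to an application on a local port) is usually called a reverse proxy. With Apache's mod_proxy it might look roughly like the following, where the certificate paths are placeholders and mod_proxy, mod_proxy_http, and mod_ssl are assumed to be enabled:

    # illustrative reverse-proxy virtual host for the bot application
    <VirtualHost *:443>
        ServerName bot.example.com

        SSLEngine on
        SSLCertificateFile    /etc/ssl/certs/bot.example.com.crt
        SSLCertificateKeyFile /etc/ssl/private/bot.example.com.key

        # forward everything to the Python app listening on port 3978
        ProxyPreserveHost On
        ProxyPass        / http://127.0.0.1:3978/
        ProxyPassReverse / http://127.0.0.1:3978/
    </VirtualHost>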

Send scheduled email django

I have made a small Django app to fit all my needs. I will use it at my company to track simple tasks for a couple of mechanical engineers. Now, the only thing left is to send scheduled emails from my Django app (every day at noon; if someone is behind with work, he would get an email). Since I'm using Windows and I'll deploy this app on Windows, I can't use a cron job (that only works on Linux, as I've seen on forums), which would be simple and easy. The only way I've found so far is django-celery-beat. This is not so easy to set up, and I need to run a 'worker' each time I run my server. This is a bit above my level and I would need to learn a lot more (and it needs a message broker, like RabbitMQ, which I also need to run and learn to implement).
I was wondering: is there an easier way to send a simple email every day at noon? I don't want to install additional apps; I wish to keep it as simple as possible.
You can do it by Dockerizing Django with Redis and Celery.
Dockerizing is the process of packing, deploying, and running applications using Docker containers.
Please see the links below to read more about Dockerizing:
Dockerizing
Dockerizing Django with Postgres, Redis and Celery
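If you do go the Celery route, the noon schedule itself is only a few lines; a rough sketch, assuming a Redis broker and a hypothetical send_reminder_emails task:

    # illustrative Celery beat schedule: run a task every day at noon
    # (project name, broker URL, and task path are placeholders)
    from celery import Celery
    from celery.schedules import crontab

    app = Celery("myproject", broker="redis://localhost:6379/0")

    app.conf.beat_schedule = {
        "daily-reminder-at-noon": {
            "task": "myproject.tasks.send_reminder_emails",
            "schedule": crontab(hour=12, minute=0),
        },
    }

A beat scheduler and a worker then have to run alongside the Django app (roughly celery -A myproject beat and celery -A myproject worker), which is the kind of thing the linked Docker setup packages together for you.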

Running "tasks" periodically with Django without seperate server

I realize similar questions have been asked, however they have all been about a specific problem, whereas I don't even know how I would go about doing what I need to.
That is: from my Django webapp I need to scrape a website periodically while my webapp runs on a server. The first options that I found were "django-background-tasks" (which doesn't seem to work the way I want it to) and 'celery-beat', which recommends getting another server if I understood correctly.
I figured just running a separate thread would work, but I can't seem to make that work without it interrupting the server and vice-versa, and it's not the "correct" way of doing it.
Is there a way to run a task periodically without the need for a separate server and a request to be made to an app in Django?
'celery-beat', which recommends getting another server if I understood correctly.
You can host celery (and any other needed components) on the same server as your Django app. They would be separate processes entirely.
It's not an uncommon setup to have a Django app + celery worker(s) + message queue all bundled into the same server deployment. Deploying on separate servers may be ideal, just as it would be ideal to distribute your Django app across many servers, but is by no means necessary.
I'm not sure if this is the "correct" way, but it was a cheap and easy way for me to do it. I just created custom Django management commands and have them run via a scheduler such as cron; in my case I utilized the Heroku Scheduler for my app.
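A custom management command for this is quite small; a rough sketch, where the command name and the scraping helper are placeholders:

    # myapp/management/commands/scrape_site.py  (hypothetical path and name)
    from django.core.management.base import BaseCommand

    class Command(BaseCommand):
        help = "Scrape the target website and store the results"

        def handle(self, *args, **options):
            from myapp.scraping import run_scrape  # placeholder for your own code
            run_scrape()
            self.stdout.write(self.style.SUCCESS("Scrape finished"))

The scheduler entry is then just python manage.py scrape_site, run as often as you need, whether from cron or from Heroku Scheduler.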

Django asynchronous tasks locally

I have a web application that runs locally only (it does not run on a remote server). The web application is basically just a UI to adjust settings and see some information about the main application. A web UI was used as opposed to a native application due to portability and ease of development. Now, I want to be able to start and stop the main application through a button in the web application. However, I couldn't find a suitable way to start an asynchronous, managed task locally. I saw there is a library called celery, but that seems to be suited to a distributed environment, which mine is not.
My main need is to be able to start/stop the task, as well as to check if the task is running (so I can display that in the UI). Is there any way to achieve this?
celery can work just fine locally. Distributed is just someone else's computer after all :)
You will have to install all the same requirements and the like. You can kick off workers by hand, or as a service, just like in the celery docs.
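For the start/stop/status part of the question, the relevant Celery pieces might look roughly like this; the module, app, and task names are placeholders:

    # illustrative: start a task, check its state, and stop it with Celery
    from celery.result import AsyncResult
    from myproject.celery_app import app              # placeholder Celery app
    from myproject.tasks import run_main_application  # placeholder task

    result = run_main_application.delay()             # start asynchronously

    state = AsyncResult(result.id, app=app).state     # PENDING / STARTED / SUCCESS / ...
    # note: STARTED is only reported when task_track_started is enabled

    app.control.revoke(result.id, terminate=True)     # stop the running task

The worker itself is started with celery -A myproject worker, either by hand or as a service, as described above.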

What's a good module for writing an HTTP web service interface for a daemon?

To give a little background, I'm writing (or am going to write) a daemon in Python for scheduling tasks to run at user-specified dates. The scheduler daemon also needs to have a JSON-based HTTP web service interface (buzzword mania, I know) for adding tasks to the queue and monitoring the scheduler's status. The interface needs to receive requests while the daemon is running, so they either need to run in a separate thread or cooperatively multitask somehow. Ideally the web service interface should run in the same process as the daemon, too.
I could think of a few ways to do it, but I'm wondering if there's some obvious module out there that's specifically tailored for this kind of thing. Any suggestions about what to use, or about the project in general are quite welcome. Thanks! :)
Check out BaseHTTPServer -- the basic HTTP server module bundled with Python.
http://docs.python.org/library/basehttpserver.html
You can spin up a second thread and have it serve your requests for you very easily (probably < 30 lines of code). And it all runs in the same process and Python interpreter space, so it can access all your objects, etc.
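A rough sketch of that idea, using the Python 3 name for the module (http.server; in Python 2 it was BaseHTTPServer) and a made-up status dictionary:

    # illustrative: serve a JSON status endpoint from a background thread
    import json
    import threading
    from http.server import BaseHTTPRequestHandler, HTTPServer

    scheduler_status = {"tasks_queued": 0}   # shared state owned by the daemon

    class StatusHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(scheduler_status).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    server = HTTPServer(("127.0.0.1", 8080), StatusHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    # ... the daemon's own scheduling loop keeps running in the main thread ...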
I'm not sure I understand your question properly, but take a look at Twisted
I believe all kinds of Python web frameworks would work for this.
You can pick one like CherryPy, which is small enough to integrate into your system. CherryPy also includes a pure-Python WSGI server for production.
The performance may not be as good as Apache's, but it's already very stable.
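A minimal CherryPy version of the same idea might look like this, with a made-up class and endpoint:

    # illustrative: a tiny JSON status endpoint with CherryPy
    import cherrypy

    class SchedulerAPI:
        @cherrypy.expose
        @cherrypy.tools.json_out()
        def status(self):
            return {"running": True, "tasks_queued": 0}

    cherrypy.quickstart(SchedulerAPI(), "/")

Note that quickstart() blocks; to run the web interface inside the daemon's own process you would mount the app and call cherrypy.engine.start() instead.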
Don't reinvent the wheel!
Run jobs via a cron script, and create a separate web interface using, for example, Django or Tornado.
Connect them via a database. Even SQLite will do the job if you don't want to scale to more machines.
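The shared-database idea takes very little code; a rough sketch with SQLite, where the table layout is only an example:

    # illustrative: the web interface and the cron script share this table;
    # the web side inserts jobs, the cron script picks up pending ones.
    import sqlite3

    conn = sqlite3.connect("scheduler.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS tasks ("
        "  id INTEGER PRIMARY KEY,"
        "  run_at TEXT,"     # when the task should run
        "  command TEXT,"
        "  status TEXT)"     # e.g. pending / running / done
    )
    conn.execute(
        "INSERT INTO tasks (run_at, command, status) VALUES (?, ?, ?)",
        ("2030-01-01 12:00:00", "send_report", "pending"),
    )
    conn.commit()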
