I recently posted a question: How to dynamically create and close infinitely running Celery tasks using Django Channels. But Django Channels seems to be a fairly niche area of development, so I'd like to open this up as a question about general architecture patterns.
I am building a cryptocurrency application where users all access the same live price data, and I would ultimately like those users to tap into the same data streams dynamically. It seems ridiculous that every user should have to make a separate API request for the same information.
Scenario:
Multiple users are receiving BTC/USD data via API requests. How can I fetch this data once and share it between users?
This problem must have been solved countless times, but I'm finding it very difficult to find the right approach for setting up a scalable solution.
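Roughly what I'm imagining (a sketch only; the group name and fetch_btc_usd() are placeholders for my own code): a single background task fetches the price and pushes it to all connected users over a Django Channels group, instead of each user polling the API.

```python
import asyncio
from channels.layers import get_channel_layer

async def fetch_btc_usd():
    # Placeholder for the real exchange API call.
    ...

async def price_broadcaster():
    channel_layer = get_channel_layer()
    while True:
        price = await fetch_btc_usd()
        # One fetch, broadcast to every consumer that joined the "btc_usd" group.
        await channel_layer.group_send(
            "btc_usd",
            {"type": "price.update", "price": price},
        )
        await asyncio.sleep(1)
```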
So I have been developing an app using Django, and recently I have been thinking about scalability. I am currently developing this app in a civil engineering research lab, and we don't get a ton of traffic on our current server.
I was wondering whether the apps we develop can handle multiple requests at the same time. From my understanding they should, depending on how well the server we are using is configured (Nginx in our case, I believe). From what I understand (please correct me if I am wrong), our server has some workers (or threads; I think they are the same thing) that accept requests, run through the code I have written, and generate responses based on what each request is.
My main question is: if I am using third-party packages, how well will they scale if my app starts getting more traffic? For example, if I used pandas/numpy/numba/etc. in my controllers, would this be safe when multiple users send requests to multiple threads?
EDIT: I changed the wording of the question as it was worded a bit poorly initially.
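For reference, here is the kind of worker/thread configuration I am picturing, as a gunicorn.conf.py sketch. This assumes an application server such as Gunicorn sits behind Nginx; I am not certain that is our actual setup.

```python
import multiprocessing

bind = "127.0.0.1:8000"
workers = multiprocessing.cpu_count() * 2 + 1   # separate worker processes
threads = 2                                     # threads per worker process

# Each request is handled by one thread in one worker, so module-level state
# in the view code (e.g. a shared pandas DataFrame) is visible to concurrent
# requests within a process and should be read-only or protected by a lock.
```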
I have a Django chatbot application on a WebFaction shared host. The idea is that the chatbot application simulates customer service by chatting with customers.
Basically the conversation is exchanged through the API using GET and POST: it first POSTs the input, then a GET calls the Python file that SELECTs the input from the DB, processes it, and updates the database with the generated output. Finally, another GET is used to fetch the output and display it.
So far it works for one user at a time. What I am considering now is having it chat with multiple customers at the same time while isolating each user.
Do I have to use Redis just for the chatting part? If yes, how can I integrate it into my project? Or are there other solutions out there?
I have developed it using:
Python 3: for the chatbot code.
Django: for the website.
MySQL: for the database, which holds the knowledge base for the chatbot, such as a table mapping each input to its corresponding output.
Thank you,
There is a whole chatbot solution based on Python 3 + Django + MongoDB/SQLite. Its GitHub link is https://github.com/gunthercox/ChatterBot. Hope it can help you.
The repository also contains a Django application example: https://github.com/gunthercox/ChatterBot/tree/master/examples/django_app
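For a rough idea of what using ChatterBot looks like, here is a minimal sketch based on the ChatterBot 1.x API (check the docs for your version; the bot name and training phrases are just examples):

```python
from chatterbot import ChatBot
from chatterbot.trainers import ListTrainer

bot = ChatBot("SupportBot")          # stores learned statements in its own database
trainer = ListTrainer(bot)
trainer.train([
    "What are your opening hours?",
    "We are open 9am to 5pm, Monday to Friday.",
])

print(bot.get_response("What are your opening hours?"))
```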
You can use Redis, Celery, Python RQ, or RabbitMQ as a queue for distributed (chat) tasks in your Django app, but this will increase the complexity of your project. Alternatively, I would recommend developing a Python-based multi-client chat server.
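If you go the queue route, this is a sketch of what a chat task might look like with Celery, so several customers can be served concurrently. The Conversation model and lookup_reply() are hypothetical stand-ins for your existing MySQL lookup.

```python
from celery import shared_task

@shared_task
def answer_message(conversation_id, text):
    """Look up the reply for one customer's message and store it."""
    from chatbot.models import Conversation     # hypothetical model
    reply = lookup_reply(text)                   # hypothetical: your MySQL lookup
    Conversation.objects.filter(pk=conversation_id).update(reply=reply)
    return reply
```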
Say I have code written in Python that analyzes files on my computer and returns a result. It works great locally on my hard drive, but now I'd like to turn it into a mobile app. This means I'll require a server of some kind (cloud, for instance) where users can access it.
It is my understanding that all that would be required is a method to grant credentials and permissions to users so they can access the "run" command in my analysis program. But honestly, I have zero visibility in this area and don't really know where to begin.
I only have two questions:
Users and their credentials are endless, but they all have to share the same analysis program. I don't know much about servers, but wouldn't this approach cause long queue times? Generally speaking, what considerations would I have to make in my analysis code to avoid this?
Can someone just point me in the direction of what I'd need to learn in order to answer the above question? This topic is a bottomless pit of information and I don't wanna get trapped.
Thanks.
Django is an MVC-style web framework that provides all the features required for building web applications with Python. Simply go through the tutorial and you should be up and running on your local machine in no time.
For deployment there are various options, be it a cloud instance (a lot of providers here, including Rackspace and Amazon; Google "django web hosting") or "traditional" server machines (again, a lot of providers).
The "mobile" part is just the user interface. This affects decisions in the presentation part of your application, and you can restrict it to the view layer, in Django jargon (i.e. the HTML templates), of your web application. You can look for frameworks that produce aesthetically decent (or better) HTML user interfaces tailored for mobile/tablet devices, e.g. jQuery Mobile.
So the direction is: start with Django -> deploy on a server "somewhere" -> tailor your user interface for mobile devices.
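As a very rough illustration of the "run command" idea, here is a sketch of a Django view that takes an uploaded file, runs your existing analysis function on it, and returns the result as JSON. analyze(), the module name and the form field are hypothetical placeholders for your own code.

```python
from django.contrib.auth.decorators import login_required
from django.http import JsonResponse
from django.views.decorators.http import require_POST

from myanalysis import analyze            # hypothetical: your existing analysis code

@require_POST
@login_required
def run_analysis(request):
    uploaded = request.FILES["datafile"]   # file sent by the mobile client
    result = analyze(uploaded.read())      # may be slow; consider a task queue here
    return JsonResponse({"result": result})
```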
I am currently working on a complex web interface and backend, that will need to address several issues.
- Scalability
  - multiple deployments with varying load demands
- Very structured authorization groups
  - different views for different user groups
  - admin panel (user/content management)
- Large managed database
  - current data
  - long-term stored data (histories)
- Data updates
  - polling, e.g. search queries, static pages/files, report generation per request
  - pushing (likely WebSockets), e.g. real-time notifications
- Varying protocols, e.g. HTTP, SSL, WebSockets
I would like to use Python, because I have grown to really enjoy the language, and I am considering some combo of Django and Twisted.
I have some experience with Django, which I love for its MVT style of application programming, its authorization models, its admin panel, and its database API. However, it is not so strong in some of the data requirements that I need, in particular, the real-time aspects.
Now, I have not really used Twisted before, but I have seen many interesting things about it, in particular the async aspects and the ability to run many protocols.
The problems in getting the two to work together are obvious, in that Django is blocking and Twisted is designed to be non-blocking. I have seen some threads stating that using the two together is possible and that people have had success with it. It also seems possible to run both and proxy them behind different URLs, but getting authentication to work across the two may become tricky.
Having said all of that, I would like to ask if I am on the right track for implementing this system, as well as suggestions on how to use the two together, alternatives, or if I should just kick one out (at this point, I guess it'd have to be Django, because the real time stuff is necessary). I should mention that I have written some of the preliminary data models and views in Django already.
I am quite experienced on the client side of things (JS,CSS,HTML), but I am not so savvy in the server side of things. Any input would be helpful, thanks.
You can definitely use Twisted with Django. Several projects have used the two together to good effect. twistd web --wsgi provides a basic way to get it set up, and there's a great example with more bells and whistles, like static content handling, by Alex Clemesha on GitHub.
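For example, a minimal sketch of serving a Django WSGI application from the Twisted reactor in the same process (the module path mysite.wsgi is whatever your Django project exposes):

```python
from twisted.internet import reactor
from twisted.web.server import Site
from twisted.web.wsgi import WSGIResource

from mysite.wsgi import application   # standard Django WSGI entry point

# Django handles requests in Twisted's thread pool; other Twisted protocols
# can be added to the same reactor.
resource = WSGIResource(reactor, reactor.getThreadPool(), application)
reactor.listenTCP(8080, Site(resource))
reactor.run()
```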
I'm using Django/Postgres and Python for my web site and the background processes. I have hundreds of messages populating my database every minute, and I would like to securely allow customers to access their data.
My customers use either Linux or Windows, so I would like a solution that is platform/database agnostic.
So far I have looked at Piston, Twisted, Celery and RabbitMQ. All of these have some way to exchange data, but I'm not sure which to use or whether there are better options.
For example, I need customers to be able to access only their own data in my database. I also need to allow customers to send a short command back to my servers; my servers will execute the command and return the result to the customer in real time.
Any ideas?
You asked how your customers can securely transmit commands to your website and retrieve results in their response (near "real-time").
... have you considered hooking a reasonable API into your Django app? If you're concerned about security, you can use authentication and serve it over HTTPS (rough sketch below).
It's not as fancy as the messaging and queuing platforms that the kids are using these days but it'll get the job done.
Things to like about HTTP/HTTPS APIs:
They can be load balanced (highly available and scalable!)
They can be cached (mo' betta performance and the ability to still serve content while rate limiting how often a client can hit the DB)
Just about every programming language has a mature library that allows HTTP/HTTPS connections. Some have multiple, e.g. Python: urllib, urllib2, httplib.
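For instance, here is a sketch of such an endpoint using Django REST Framework, restricting each authenticated customer to their own rows. The Message model and its owner field are hypothetical placeholders for your own schema.

```python
from rest_framework import permissions, serializers, viewsets

from myapp.models import Message   # hypothetical model with an "owner" foreign key

class MessageSerializer(serializers.ModelSerializer):
    class Meta:
        model = Message
        fields = ["id", "body", "created_at"]

class MessageViewSet(viewsets.ReadOnlyModelViewSet):
    serializer_class = MessageSerializer
    permission_classes = [permissions.IsAuthenticated]

    def get_queryset(self):
        # Each customer sees only the rows they own.
        return Message.objects.filter(owner=self.request.user)
```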