I have a Django web application that displays some data on a website. The application consists of HTML pages and views to display this data, which I am storing in a SQLite DB.
At the end of the day a third party needs to connect to this web application and upload binary data to it. What is the best way to host this service: as an independent Python web server, as part of the Django application, or something else?
Any suggestions would be appreciated !
If the uploading doesn't occur too often, why not just create a Django POST/PUT view for the job that simply accepts the file over HTTP? With the information you've provided, I cannot see why this simple solution wouldn't be up to the task.
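For example, a minimal sketch of such a view might look like the following; the multipart field name "payload" and the /tmp/uploads destination are assumptions rather than anything from your setup, and you would still wire the view into urls.py and add some authentication before exposing it to the third party.
views.py
import os

from django.http import HttpResponse, HttpResponseBadRequest
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_POST

@csrf_exempt  # the third party has no way to obtain a CSRF token
@require_POST
def upload(request):
    uploaded = request.FILES.get("payload")  # assumed field name
    if uploaded is None:
        return HttpResponseBadRequest("missing file")
    destination = os.path.join("/tmp/uploads", os.path.basename(uploaded.name))
    with open(destination, "wb") as fh:
        for chunk in uploaded.chunks():  # stream to disk instead of loading it all into memory
            fh.write(chunk)
    return HttpResponse("ok")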
I'm having a hard time figuring out how to solve a problem with a little project.
Basically I have a Django application, and alongside it an external Python script running. I would like to create a system where, each time a form in my Django app is submitted, the submitted data is sent to the external Python application.
The external Python service should receive the data, read it, and, depending on who the user is and what they submitted, perform some tasks and then send a response.
Here is what I thought:
1) Connect the external Python app to the same database that Django is using, so that when the form is submitted it is saved to the database and the data is 'shared' with the second Python service. The problem with this solution is that the second app would have to poll the database every second and run a lot of queries, which would hurt performance.
2) Create an API endpoint, so that the external Python app would connect to the endpoint and fetch the data saved in the database from there. The problem is the same as with the first solution.
Would a service like Redis or RabbitMQ help in this case?
Importing the external Python process into my Django app is not an option; it needs to stay separate from the Django app. An important requirement here is speed: when new data is submitted, it needs to reach the second Python app as quickly as possible.
That said, I'm open to any advice or possible solution to this problem. Thanks in advance :)
You could use a microservices architecture to build this. Instead of sharing databases between two applications you have them communicate with each other through web requests. Django would shoot a request to your other app with the relevant data, and the other server would respond back with the results.
Usually one would use something like Flask (a synchronous server) or Sanic (an asynchronous server) to receive and reply, but you can also look into something like Nameko. I would also recommend looking into Docker: as you set up more of these microservices, you'll eventually need it.
The idea (using Flask, for example) is to create an endpoint that performs some computation on your data and returns the result to the Django server.
computation.py
from flask import Flask
from flask import request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def computation():
    # read the JSON body sent by the Django server
    data = request.get_json()
    print(data)
    return f"Hey! {data}"

app.run(host="0.0.0.0", port=8090)
The Django server is simply sending a request to your server application.
django_mock.py
import requests
req = requests.post('http://0.0.0.0:8090/', json={"data": "Hello"})
print(req.text)
The above will print out on the computation.py app:
{'data': 'Hello'}
and will print out on the django_mock.py example:
Hey! {'data': 'Hello'}
You should build an API. The 2nd app would act as an application server, and the 1st app, when it receives a form submission from the user, would persist the data to the DB and then make an API call to the 2nd app. You would include key information in the API request that identifies the record in the DB.
You can use Django (e.g. DRF) or Flask to implement a simple API server in Python.
Now, this requires your app server to be up and running all the time. What if it's down? What should the 1st app do? If you need that level of resilience, then you need to decouple these apps in some way: either the 1st app implements some kind of backoff/retry if it can't reach the 2nd app, or you use a reliable queueing mechanism (something like Amazon SQS).
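As a sketch of the backoff/retry option (the /submissions/ endpoint and the record_id payload below are made up for illustration):
notify_second_app.py
import time

import requests

def notify_second_app(record_id, max_attempts=5):
    for attempt in range(max_attempts):
        try:
            resp = requests.post(
                "http://localhost:8090/submissions/",  # hypothetical endpoint
                json={"record_id": record_id},
                timeout=5,
            )
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, ...
    raise RuntimeError(f"could not deliver record {record_id}")
A queue takes this retry logic out of the request path entirely: the 1st app just enqueues a message and the 2nd app consumes it whenever it is up.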
I'm currently doing a school project that tasks us to create a functioning web application (with database) for our university. Our application is an intranet-based activities logging system.
During our first term, we finished our frontend via Vue.js / Vuetify. It has complete routers (and multiple pages), functioning buttons and data-tables (and fake authentication).
Now we need to connect it to the backend. We chose the Django REST framework (Python), as our research found it would be faster to implement (our deadline is two weeks at most). My question is how (or whether it's possible) to connect our Vue.js application to Django so that it can handle login authentication and run database queries against our SQL database (PostgreSQL).
We were using the Vue CLI during the building of our frontend.
Thank you!
You don't have to worry much about integration. This is just a simple REST API talking to a frontend framework.
You can find many online tutorials for setting it up from scratch, but keeping your deadline in mind, I think you should start with this boilerplate: https://github.com/gtalarico/django-vue-template
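Whichever starting point you use, the Django side boils down to exposing REST endpoints that the Vue app calls with fetch or axios. A minimal Django REST framework sketch, where the Activity model is a hypothetical stand-in for whatever you log:
api.py
from rest_framework import routers, serializers, viewsets

from .models import Activity  # hypothetical model for logged activities

class ActivitySerializer(serializers.ModelSerializer):
    class Meta:
        model = Activity
        fields = "__all__"

class ActivityViewSet(viewsets.ModelViewSet):
    queryset = Activity.objects.all()
    serializer_class = ActivitySerializer

router = routers.DefaultRouter()
router.register(r"activities", ActivityViewSet)
# in urls.py: path("api/", include(router.urls))
For login, DRF's session or token authentication works fine with a Vue frontend, and if the Vue dev server runs on a different port you will also need CORS headers (for example via django-cors-headers).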
I have a Django chatbot application on a WebFaction shared host. The idea is that the chatbot application simulates customer service by chatting with the customers.
Basically the conversation is exchanged through the API using GET and POST: the input is first POSTed, then a GET calls the Python file that SELECTs the input from the DB, processes it, and updates the database with the resulting output. Finally, another GET fetches the output and displays it.
So far it works for one user at a time; what I want now is for it to chat with multiple customers at the same time while isolating each user.
Do I have to use Redis just for the chatting part, and if so, how can I integrate it into my project? Or are there other solutions out there?
I have developed it using:
Python 3: for the chatbot code.
Django: for the website.
MySQL: for the database, which holds the knowledge base for the chatbot, such as a table mapping each input to its corresponding output.
Thank you,
There is a whole chatbot solution based on Python 3 + Django + Mongo/sqlite. Its github link is https://github.com/gunthercox/ChatterBot. Hope it can help you.
This repository also contains Django application example: https://github.com/gunthercox/ChatterBot/tree/master/examples/django_app
You can use Redis, Celery, Python RQ, or RabbitMQ as a queue for distributed tasks (the chatting tasks) in your Django app, but this will increase the complexity of your project. I would recommend developing a Python-based multi-client chat server instead.
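Whatever you pick, the core of isolating users is to key every piece of conversation state by the user or session instead of keeping it in module-level globals. A rough sketch using Redis, where the key names and message fields are illustrative:
conversation_store.py
import json

import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def append_message(session_key, role, text):
    # one Redis list per Django session keeps each customer's chat separate
    r.rpush(f"chat:{session_key}", json.dumps({"role": role, "text": text}))

def get_history(session_key):
    return [json.loads(m) for m in r.lrange(f"chat:{session_key}", 0, -1)]
Your existing GET/POST views would pass request.session.session_key into these helpers, so concurrent customers never see each other's messages.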
I have a web application written in raw Python and hosted on Apache using mod_python. I am building another web application that is Django-based and will be hosted on the same server using mod_wsgi.
Now, the scenario is that the user will log in from the web page that uses mod_python, and a link will send them to my application, which uses mod_wsgi. My question is: how can I maintain the session? I need the same authentication to work for my application.
Thanks in advance.
If you're using Django with mod_wsgi and a raw Python page that only serves a link to the Django application, you don't need to maintain the session on both pages. When the user clicks the link and reaches the Django application, just check the session there.
Django has a session framework that can store sessions in the database or in a cache such as memcached. More information can be found here:
Django Sessions
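"Check the session there" can be as small as this; the logged_in flag is just an example of what you might store:
views.py
from django.http import HttpResponseForbidden
from django.shortcuts import render

def welcome(request):
    if not request.session.get("logged_in"):  # example session key
        return HttpResponseForbidden("Please log in first")
    return render(request, "welcome.html")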
SSO across web applications is poorly supported. One thing you can look at is:
http://www.openfusion.com.au/labs/mod_auth_tkt/
What you can do is really going to depend, though, on what authentication database you are currently using in the mod_python application and how you are remembering that someone is logged in. If you can provide that information, I may be able to suggest other things.
Conceptually: set a cookie from your raw Python web page, process it in a "welcome" view or custom middleware class in Django, and insert the corresponding session into the sessions DB. This is basically what hungnv suggests.
The most ridiculous way to do this would be to figure out how Django deals with sessions and session cookies, insert the correct row into Django's session database from your raw python app, and then custom-set the session cookie using Django's auth functions.
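A sketch of the middleware variant; the legacy_user cookie name and the verify_legacy_cookie() helper are hypothetical stand-ins for however your mod_python app marks a login:
middleware.py
from django.contrib.auth import login
from django.contrib.auth.models import User

def verify_legacy_cookie(value):
    # hypothetical: validate the signature the mod_python app put on the
    # cookie and return the username, or None if it is missing/invalid
    return None  # replace with your real check

class LegacyCookieLoginMiddleware:
    # list this after AuthenticationMiddleware in MIDDLEWARE
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        if not request.user.is_authenticated:
            username = verify_legacy_cookie(request.COOKIES.get("legacy_user"))
            if username:
                user, _ = User.objects.get_or_create(username=username)
                login(request, user,
                      backend="django.contrib.auth.backends.ModelBackend")
        return self.get_response(request)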
I'm writing a syndication client, with the aim of having a client for devices and a website with the same functionality. I shall develop the website using Django - this is already decided; the client will be written in Python with both a CLI and a PyQt4 GUI. I have been writing the client first, and it's fairly database-heavy, as everything is cached so it can be read while offline.
It struck me today that it would make sense to use Django models for my application, to reduce the duplication of effort between the client and the website. My question is how easy it is to separate this out, and how much of Django I will need in my client to use Django's models. AFAIK I should not need to run the server, but what else is needed? I had an idea of generating the same HTML for my client as for the website, but showing it within Qt widgets rather than serving pages to a browser.
Has anyone tried this sort of thing before? I'm starting on this already, but it would be good to get a warning about potential dead ends or things that will create a maintenance nightmare...
Read up on standalone Django scripts and you'll be on your path to victory. Basically all you're really doing is referencing the Django settings.py (which Django expects) and then using models without web views or urls.
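A minimal standalone script looks roughly like this; the project name mysite and the app name feeds are placeholders:
standalone.py
import os

import django

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
django.setup()  # loads the app registry so models work without the web server

from feeds.models import Feed  # import models only after django.setup()

for feed in Feed.objects.all():
    print(feed.pk, feed)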
If all you're really interested in is using Django's ORM to manage your models and database interaction, you might want to consider using SQLAlchemy instead.
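With SQLAlchemy the cached data might be modelled roughly like this (names are made up; assumes SQLAlchemy 1.4+):
models.py
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Feed(Base):
    __tablename__ = "feeds"
    id = Column(Integer, primary_key=True)
    url = Column(String, unique=True)

engine = create_engine("sqlite:///cache.db")  # local offline cache
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

with Session() as session:
    session.add(Feed(url="https://example.com/feed.xml"))
    session.commit()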
You'll still have to run the Django app as a web server, but you can restrict it to serve only to localhost or something. And sure, you can use QtWebKit as the client.