We have an open source app we built with Django, http://map.ninux.org/, which is used by our wireless community network and is hosted outside of our network with Hetzner in Nuremberg.
We'd like to have a mirror inside our network for internal use only.
I would like the mirror to send write queries to the database hosted outside the network.
Ideally, the mirror would write both to its local DB and to the one outside the network.
Any suggestion?
I'm also wondering if there are any articles about developing distributed / redundant / decentralized applications with Django.
Thanks!
The multiple database documentation shows you how to set up two databases and how to select a database for saving, which is what write operations do.
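As a minimal sketch of what that looks like (all names, hosts, and credentials below are placeholders, not taken from the question):

# settings.py -- one alias for the local mirror, one for the external DB
DATABASES = {
    'default': {  # local mirror inside the network
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'map_local',
        'HOST': 'localhost',
        'USER': 'map',
        'PASSWORD': 'secret',
    },
    'remote': {  # database hosted outside, at the hosting provider
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'map',
        'HOST': 'db.example.org',
        'USER': 'map',
        'PASSWORD': 'secret',
    },
}

A write can then be sent to both databases explicitly (Node here is a hypothetical model):

node = Node(name='antenna-42')
node.save(using='default')  # write to the local mirror
node.save(using='remote')   # write to the external database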
Related
I am working on scaling out a webapp and providing some database redundancy for protection against failures and to keep the servers up when updates are needed. The app is still in development, so I have chosen a simple multi-master redundancy with two separate database servers to try and achieve this. Each server will have the Django code and host its own database, and the databases should be as closely mirrored as possible (updated within a few seconds).
I am trying to figure out how to set up the multi-master (master-master) replication between databases with Django and MySQL. There is a lot of documentation about setting it up with MySQL only (using various configurations), but I cannot find any for making this work from the Django side of things.
From what I understand, I need to approach this by adding two database entries in the Django settings (one for each master) and then write a database router that will specify which database to read from and which to write from. In this scenario, both databases should accept both reads and writes, and writes/updates should be mirrored over to the other database. The logic in the router could simply use a round-robin technique to decide which database to use. From there on, further configuration to set up the actual replication should be done through MySQL configuration.
Does this approach sound correct, and does anyone have any experience with getting this to work?
Your idea of the router is great! I would add that you need to automatically detect whether a database is slow or down. You can detect that by the response time and by connection/read/write errors. If this happens, you exclude that database from your round-robin list for a while, trying to connect back to it every now and then to detect whether the database is alive again.
In other words, the round-robin list grows and shrinks dynamically depending on the health status of your database machines.
Another important point is that, luckily, you don't need to maintain a round-robin list common to all the web servers. Each web server can keep its own copy of the round-robin list and its own state of inclusion and exclusion of databases. This is because a database server may be reachable from one web server and unreachable from another due to local network problems.
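A minimal sketch of such a router (the alias names, the retry interval, and the module path are assumptions; both aliases must be defined in settings.DATABASES):

# routers.py -- round-robin over two masters, skipping unhealthy ones
import itertools
import time

DB_ALIASES = ['master1', 'master2']
RETRY_AFTER = 30  # seconds before a database marked down is probed again

class RoundRobinRouter:
    def __init__(self):
        self._cycle = itertools.cycle(DB_ALIASES)
        self._down = {}  # alias -> time it was marked down

    def _pick(self):
        for _ in DB_ALIASES:
            alias = next(self._cycle)
            marked = self._down.get(alias)
            if marked is None:
                return alias
            if time.time() - marked > RETRY_AFTER:
                del self._down[alias]  # give it another chance
                return alias
        return DB_ALIASES[0]  # last resort: everything looked down

    def mark_down(self, alias):
        # call this from your error handling on connection/read/write errors
        self._down[alias] = time.time()

    def db_for_read(self, model, **hints):
        return self._pick()

    def db_for_write(self, model, **hints):
        return self._pick()

    def allow_relation(self, obj1, obj2, **hints):
        return True  # both masters hold the same data

# settings.py
DATABASE_ROUTERS = ['myproject.routers.RoundRobinRouter']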
I am working on a Django app for investors, currently using one database and three installed apps configured in settings.py.
I am about to integrate a new feature in which every broker registers their IP in our app, so that we can manually replicate our Postgres database onto their server (everything is the same except the database 'HOST'). The broker will then send GET and POST requests to our server from their server.
I need to switch the database based on the incoming request. I think I could connect to their Postgres database dynamically by inspecting the request and running raw SQL queries, but my requirement is to use Django's Postgres support for processing without configuring the database in the settings file.
If configuring databases in settings is the only way, how can I switch databases efficiently on every request, and how many databases can be connected in a single Django app?
I believe that if you want to use Django's ORM methods (rather than writing raw SQL queries and parsing the results yourself), you will have to use the settings.py approach and define all your databases there.
https://docs.djangoproject.com/en/1.7/topics/db/multi-db/
In short, you define a database and can manually choose it in your code (as per the docs):
Author.objects.using(database_name_variable).filter(...)
An alternative would be to look at using a REST API (like Tastypie) to make calls to different Django instances, each connected to its own database.
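A rough sketch of what per-request switching could look like with using() (the model, the IP-to-alias map, and the alias names are all made up for illustration):

# views.py -- choose a database alias based on the requesting broker
from django.http import JsonResponse
from myapp.models import Holding  # hypothetical model

BROKER_DB_BY_IP = {  # each alias must exist in settings.DATABASES
    '203.0.113.10': 'broker_a',
    '203.0.113.11': 'broker_b',
}

def holdings(request):
    alias = BROKER_DB_BY_IP.get(request.META.get('REMOTE_ADDR'), 'default')
    rows = Holding.objects.using(alias).filter(active=True)
    return JsonResponse({'holdings': [h.symbol for h in rows]})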
I'm currently running a t2.micro instance on EC2 right now. I have the html/web interface side of it working, along with a MySQL database.
The site allows users to register and stores them in the DB via a PHP script.
I want there to be an actual Python application that queries the MySQL database and returns user data, to then be executed in a Python script.
What I cannot find is whether I should host this Python application as a totally separate instance, or whether it can exist on the same instance in a different directory. I ultimately just need to query the database, which makes me think it must exist on the same instance.
Could someone please provide some guidance?
Let me just be clear: this is not a Python web app. The Python backend is entirely separate except for making queries against the database.
Either approach is possible, but there are pros & cons to each.
Running separate Python app on the same server:
Pros:
Setting up local access to the database is fairly simple
Only need to handle backups, snapshots, etc. for a single instance
Cons:
Harder to scale up individual pieces if you need more memory, processing power, etc. in the future
Running the Python app on a separate server:
Pros:
Separate pieces means you can scale up & down the hardware each piece is running on, according to their individual needs
If you're using all micro instances, you get more resources to work with, without any extra costs (assuming you're still meeting all the other 'free tier eligible' criteria)
Cons:
In general, more pieces == more time spent on configuration, administration tasks, etc.
You have to open up the database to non-local access
Simplest: open up the database to access from anywhere (e.g. all remote IP addresses), and have the Python app log in via the internet
Somewhat safer, more complex: set the Python app server up with an elastic IP, open up the database to access only from that address
Much safer, more complex: set up your own virtual private cloud (VPC), and allow connections to the database only from within the VPC. You'd have to configure public access for each of the servers for whatever public traffic you'll have, presumably ports 80 and/or 443.
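To illustrate the standalone Python backend from the question, here is a minimal sketch using the PyMySQL driver (table, columns, and credentials are made up; the host would be localhost for the same-instance layout, or the database server's address for the separate-server layout):

# query_users.py -- standalone script, separate from the PHP web app
import pymysql

conn = pymysql.connect(
    host='localhost',  # or the DB server's (ideally private) address
    user='app',
    password='secret',
    database='site',
)
try:
    with conn.cursor() as cur:
        cur.execute('SELECT id, email FROM users WHERE id = %s', (42,))
        print(cur.fetchone())
finally:
    conn.close()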
I'm building a web application in Pyramid and it needs user logins. The database backend is a MySQL DB connected via SQLAlchemy.
Pyramid has an introduction on using Beaker for sessions, but it only shows how to configure it using files. I couldn't find out how to store session data in the database (I think it should be possible), since then I would have only one place where my varying data is stored.
I found it. Put something like this in your configuration file (development.ini/production.ini):
session.type=ext:database
session.secret=someThingReallyReallySecret
session.cookie_expires=true
session.key=WhatEver
session.url=mysql://user:password@host/database
session.timeout=3000
session.lock_dir=%(here)s/var/lock
I don't know if it is possible (or sensible) to put the locking in the DB too, but with this the sessions should live in the DB. You'll need to take care of deleting old sessions from the DB yourself (but I think that's also the case when using files).
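If you ever configure Pyramid from Python instead of the .ini file, the same keys can be passed programmatically; a sketch, assuming the pyramid_beaker package is installed:

# Sketch: wiring Beaker database-backed sessions in code
from pyramid.config import Configurator
from pyramid_beaker import session_factory_from_settings

settings = {
    'session.type': 'ext:database',
    'session.url': 'mysql://user:password@host/database',
    'session.secret': 'someThingReallyReallySecret',
    'session.cookie_expires': 'true',
    'session.key': 'WhatEver',
    'session.timeout': '3000',
    'session.lock_dir': './var/lock',
}

session_factory = session_factory_from_settings(settings)
config = Configurator(settings=settings)
config.set_session_factory(session_factory)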
I am using Django 0.97 with PostgreSQL 9.0. I have configured hot streaming replication on a master DB server and a slave DB server. My application has heavy bot-driven writes to the DB, and reads come only from users, so it would be much more efficient to direct the users' read access to the slave DB and the bot's write access to the master DB. Unfortunately only Django 1.2 has multiple database support, and it's a huge effort to upgrade my application. I got some leads through the following link: http://www.eflorenzano.com/blog/post/easy-multi-database-support-django/ However, this also requires me to change every instance of DB access in my application. Is there any simpler way to assign separate DB servers for read access and write access by fiddling with the Django core db module?
Your best bet is to upgrade to 1.2, as it will be significantly less work than hacking together features that already exist. If you stick with 0.97 much longer, your life will only become more difficult down the road.
I'm guessing you may have some misconceptions about how using multiple DBs works in Django 1.2. You do not have to "change all instances of db access in [your] application" if you use the Database Routers feature of Django.
With a router, you can specify which database to use for reads and writes. All of your existing Django models should work and begin sending requests to the proper database. It's pretty simple to set up a router; just check the docs. All that's required is to create the router class, put it somewhere, and add a line to your settings.
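For the read-from-slave, write-to-master split described in the question, a minimal sketch (the 'slave' and 'master' alias names are assumptions; they must match entries in settings.DATABASES):

# routers.py -- send user reads to the replica, bot writes to the primary
class ReadWriteSplitRouter:
    def db_for_read(self, model, **hints):
        return 'slave'   # streaming-replication replica

    def db_for_write(self, model, **hints):
        return 'master'  # primary that the bot writes to

    def allow_relation(self, obj1, obj2, **hints):
        return True      # both aliases hold the same replicated data

# settings.py (module path is an assumption)
DATABASE_ROUTERS = ['myproject.routers.ReadWriteSplitRouter']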
It works really nicely and is not as much work as you may expect. You may have other issues with upgrading that you aren't telling us, but as far as models go you shouldn't have many problems.