Moving database data from DEV to PRODUCTION, best practice? - python

I have a development Django project using MySQL, deployed at PythonAnywhere. I can push my code updates via Git, and the Django migrations take care of the database structure, but my question is about data.
During development I may add a new capability that relies on master data, which I enter in the DEV database as I develop and test. When deploying, I'd like to copy the master data over to the production database rather than re-enter it all.
Is exporting and importing files the best way, or is there a more professional way?

I think the simplest way to do this is with the dumpdata management command.
Its output can then be loaded into the production database with the loaddata management command.
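For example, a minimal sketch using Django's call_command API, run from a Django context such as manage.py shell; the 'catalog' app label is a hypothetical stand-in for wherever your master data lives:

    from django.core.management import call_command

    # On DEV: serialize the master-data app to a JSON fixture.
    with open('master_data.json', 'w') as fixture:
        call_command('dumpdata', 'catalog', indent=2, stdout=fixture)

    # On PRODUCTION: load the fixture after copying the file across.
    call_command('loaddata', 'master_data.json')

Since you already deploy via Git, the fixture file can be committed alongside the code, so the same push that carries a schema change also carries its master data.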

Related

Changing the database structure of a Django project on the server

When we work with Django in a local environment, we change the structure of the database from the command prompt by running migrations.
But when Django runs on a server, I don't know how to apply such changes. How can I run commands to change the database structure? Is uploading the site files every time I make a change a good approach?
The whole point of migrations is that you run them on both your local database and in production, to keep them in sync.
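For example, a minimal Fabric 1.x sketch (the host and project path are placeholders) that pulls the latest code and runs the same migrations on the server:

    # fabfile.py
    from fabric.api import cd, env, run

    env.hosts = ['deploy@example.com']

    def deploy():
        with cd('/srv/myproject'):
            run('git pull')                            # update the code
            run('python manage.py migrate --noinput')  # apply schema changes

Running fab deploy then applies exactly the migrations you already ran against your local database.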

Is it possible to deploy Django with SQLite?

I've built a Django app that uses SQLite (the default database), but I can't find anywhere that allows deployment with SQLite. Heroku only works with PostgreSQL, and I've spent two days trying to switch databases and can't figure it out, so I want to just deploy with SQLite. (This is just a small application.)
A few questions:
Is there anywhere I can deploy with sqlite?
If so, where/how?
SQLite is an on-disk database. It is very useful for development purposes; however, services like Heroku expect your server-side code to be stateless, which does not really allow for databases such as SQLite. You could probably make it work (provided you find a place on Heroku's disk to put your SQLite db), but you would lose your database's content every time you redeploy.
For Heroku specifically, I'll redirect you to this link, which explains how to use Django with PostgreSQL on Heroku.
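The usual Heroku setup boils down to reading the DATABASE_URL environment variable Heroku provides; a minimal sketch assuming the dj-database-url package (my suggestion, not something named in the linked guide):

    # settings.py
    import dj_database_url

    DATABASES = {
        'default': dj_database_url.config(default='postgres://localhost/dev_db'),
    }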
Don't use SQLite on Heroku. As stated in the docs, you will lose your data:
SQLite runs in memory, and backs up its data store in files on disk. While this strategy works well for development, Heroku’s Cedar stack has an ephemeral filesystem. You can write to it, and you can read from it, but the contents will be cleared periodically. If you were to use SQLite on Heroku, you would lose your entire database at least once every 24 hours.
Even if Heroku’s disks were persistent running SQLite would still not be a good fit. Since SQLite does not run as a service, each dyno would run a separate running copy. Each of these copies need their own disk backed store. This would mean that each dyno powering your app would have a different set of data since the disks are not synchronized.
Instead of using SQLite on Heroku you can configure your app to run on Postgres.
Sure, you can deploy with SQLite. It's not really recommended, but it should work OK if you have low traffic.
You set your database engine to sqlite3 in settings.py; just make sure you have write access to the path that you specify for your database.
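A minimal settings.py sketch (BASE_DIR follows the usual Django convention; the filename is arbitrary):

    import os

    BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            # The server process needs write access to this file and its directory.
            'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
        }
    }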

Rails app to work with a remote Heroku database

I have built an application in Python, hosted on Heroku, which basically uses a Python script to store some results in a database (it runs as a scheduled task on a daily basis). I would have done this with Ruby/Rails to avoid this confusion, but the application partner did not support Ruby.
I would like to know if it will be possible to build the front-end with Ruby on Rails and use the same database.
My Rails application will need to make use of MVC and have its own tables in the database, but it will also use the database that the Python script sends data to, just to retrieve some data from there.
Can I create the Rails app and reference the details of the database that my python application uses?
How could I test this on my local machine?
What would be the best approach to this?
I don't see any problem in doing this, as long as Rails manages the database structure and the Python script populates it with data.
My advice, just to make it simpler, is to define the database schema through migrations in your Rails app and build it as if the Python script didn't exist.
Once you have completed it, simply start the Python script so it can start populating the database (it might be necessary to rename some tables in the Python script, but no more than that).
If you want to test this on your local machine you can do one of these:
run the Python script on your local machine
configure the database.yml in your Rails app to point to the remote DB (this can be difficult if you don't have admin access to the host server, because of port forwarding etc.)
The only thing you should keep in mind is concurrent access.
Because you have two applications that both read and write to your DB, it would be better if the Python script did its job in a single, atomic transaction, to avoid your Rails app finding the DB in a half-updated state.
You can see the database as a shared box; it doesn't matter how many applications use it.
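As a sketch of that single-transaction approach in the Python script, using psycopg2 (connection details and the results table are placeholders):

    import psycopg2

    conn = psycopg2.connect(host='db.example.com', dbname='shared_db',
                            user='loader', password='...')
    try:
        # The "with conn:" block commits on success and rolls back on any
        # exception, so the Rails app never sees a half-updated state.
        with conn:
            with conn.cursor() as cur:
                cur.execute('DELETE FROM results WHERE run_date = CURRENT_DATE')
                cur.execute('INSERT INTO results (run_date, value) '
                            'VALUES (CURRENT_DATE, %s)', (42,))
    finally:
        conn.close()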

How do I run a Django 1.6 project with multiple instances running off the same server, using the same db backend?

I have a Django 1.6 project (stored in a Bitbucket Git repo) that I wish to host on a VPS.
The idea is that when someone purchases a copy of the software I have written, I can type in a few simple commands that will take a designated copy of the code from Git, create a new instance of the project with its own subdomain (e.g. <customer_name>.example.com), and create a new Postgres database (on the same server).
I should hopefully be able to create and remove these 'instances' easily.
What's the best way of doing this?
I've looked into writing scripts using some combination of Supervisor/Gunicorn/Nginx/Fabric etc. Other options could be something more serious like using Docker or Vagrant. I've also looked into various PaaS options.
Thanks in advance.
(EDIT: I have looked at the following services/things: Dokku (can't use Heroku due to data constraints), Vagrant (inc Puppet), Docker, Fabfile, Deis, Cherokee, Flynn (under dev))
If I were doing it (and I did a similar thing with a PHP application I inherited), I'd have a Fabric command that lets me provision a new instance.
This could be broken up into the requisite steps (check out code, create database, syncdb/migrate, create DNS entry, start web server).
I'd probably do something sane like use the DNS entry as the database name, or at least use a reversible function to derive one from the other.
You could then string these together to easily create a new instance.
You will also need a way to tell the newly created instance which database and domain name it needs to use. You could have the provisioning script write some data to a file in the checked-out repository that is then read by Django during its initialisation phase.
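As a rough illustration, here is how those steps might hang together in a single Fabric 1.x task; every host, path, and naming convention below is a placeholder:

    # fabfile.py
    from fabric.api import cd, env, run

    env.hosts = ['root@vps.example.com']

    def provision(customer):
        """Create customer.example.com with its own checkout and database."""
        db_name = 'app_%s' % customer                 # DNS entry doubles as DB name
        path = '/srv/%s' % customer
        run('createdb %s' % db_name)                  # new Postgres database
        run('git clone /opt/repos/project.git %s' % path)
        with cd(path):
            # Tell this instance which database and domain to use; settings.py
            # is assumed to read instance.cfg during initialisation.
            run('echo "DB_NAME=%s DOMAIN=%s.example.com" > instance.cfg'
                % (db_name, customer))
            run('python manage.py syncdb --noinput')  # Django 1.6: syncdb
        run('service nginx reload')                   # pick up the new subdomain

Invoked as fab provision:acme. A matching destroy task (drop the database, delete the checkout, reload nginx) would keep instances just as easy to remove.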

(Python/Django): How do I keep my production db in sync (schema and data) with my dev pc db?

I have a local Postgres database which is filled with data daily on my local development machine. What is a good solution to transfer/sync/mirror this data to a production Postgres database?
For what it's worth I'm developing in Python using Django.
Thanks!
This seems like a strange workflow to me. Wouldn't it be much better to import the data into the production database and then just sync it with your development db from time to time?
IMO, the development machine shouldn't be part of the production data workflow.
That's the way I do it, using Fabric. I've written a simple function which copies part of the production db onto the local development machine.
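A minimal Fabric 1.x sketch of such a function (host, database, and table names are placeholders):

    # fabfile.py
    from fabric.api import env, get, local, run

    env.hosts = ['deploy@prod.example.com']

    def pull_db():
        # Dump only the data of the tables you need from production...
        run('pg_dump --data-only --table=master_data prod_db > /tmp/master.sql')
        # ...fetch the dump, then load it into the local dev database.
        get('/tmp/master.sql', '/tmp/master.sql')
        local('psql dev_db -f /tmp/master.sql')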
South is a great tool for dealing with database migrations in Django projects. The latest release supports both schema and data migrations:
http://south.aeracode.org/docs/tutorial/part3.html#data-migrations
The app provides a number of management commands which let you generate migration files that, when run, alter the database schema or insert records. It's great for automating changes to a production environment or when working on a team. You can then use something like Fabric (or do it manually if you must) to push the migration files up and run the migrate command to populate your database.
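For instance, a data migration generated with python manage.py datamigration myapp populate_categories might look roughly like this (the Category model and its rows are hypothetical, and the frozen models dict that South generates is omitted for brevity):

    # myapp/migrations/0005_populate_categories.py
    from south.v2 import DataMigration

    class Migration(DataMigration):

        def forwards(self, orm):
            # Use orm['app.Model'] instead of importing the model directly,
            # so the migration runs against the schema as of this point.
            for name in ('Books', 'Music', 'Films'):
                orm['myapp.Category'].objects.get_or_create(name=name)

        def backwards(self, orm):
            orm['myapp.Category'].objects.filter(
                name__in=('Books', 'Music', 'Films')).delete()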
