Is it possible to deploy Django with Sqlite? - python

I've built a Django app that uses sqlite (the default database), but I can't find anywhere that allows deployment with sqlite. Heroku only works with postgresql, and I've spent two days trying to switch databases and can't figure it out, so I want to just deploy with sqlite. (This is just a small application.)
A few questions:
Is there anywhere I can deploy with sqlite?
If so, where/how?

SQLite is an on-disk database. It is very useful for development purposes, but services like Heroku expect your server-side code to be stateless, which effectively rules out file-based databases such as SQLite. You could probably make it work (provided you find somewhere on Heroku's disk to put your SQLite db), but you would lose your database's contents every time you redeploy.
For Heroku specifically, I'll redirect you to this link which explains how to use Django with PostgreSQL on Heroku.

Don't use SQLite on Heroku. As stated in the docs, you will lose your data:
SQLite runs in memory, and backs up its data store in files on disk.
While this strategy works well for development, Heroku’s Cedar stack
has an ephemeral filesystem. You can write to it, and you can read
from it, but the contents will be cleared periodically. If you were to
use SQLite on Heroku, you would lose your entire database at least
once every 24 hours.
Even if Heroku’s disks were persistent running SQLite would still not
be a good fit. Since SQLite does not run as a service, each dyno would
run a separate running copy. Each of these copies needs its own
disk-backed store. This would mean that each dyno powering your app would
have a different set of data since the disks are not synchronized.
Instead of using SQLite on Heroku you can configure your app to run on
Postgres.

Sure, you can deploy with SQLite. It's not really recommended, but it should work okay if you have low traffic.
Set your database engine to SQLite in settings.py, and just make sure you have write access to the path that you specify for your database.
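As a minimal sketch, the relevant settings.py fragment looks like this (the db.sqlite3 filename and BASE_DIR layout are the Django defaults; adjust to your project):

```python
from pathlib import Path

# Project root; adjust to your layout.
BASE_DIR = Path(__file__).resolve().parent.parent

# Point Django at an SQLite file. Make sure the process has write
# access both to this file and to its directory (SQLite creates
# journal files next to the database).
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "db.sqlite3",
    }
}
```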

Related

Django website with sqlite db deployment on heroku

I have been reading in many places that Heroku doesn't support the SQLite database. Is there no option to use SQLite? Is there any kind of wrapper or plug-in to be used with SQLite so that it can be deployed on Heroku?
Can anyone share resources or guides to do the same end to end?
No, there is no way to do this. An SQLite database is a file on disk, but the filesystem on Heroku is ephemeral and not shared between dynos, so the db would be lost every time you deploy.
But there is no reason to try. Django already abstracts away all the differences between databases for regular usage. Heroku supports a number of Postgres plans for different use cases, including a free hobby tier.

How can I update one (or a couple) files in a Google App Engine Flask App?

If I'm just updating, say, my main.py file, is there a better way to update the app than running gcloud app deploy, which takes several minutes? I wouldn't think I need to completely blow up and rebuild the environment if I'm just updating one file.
You must redeploy the service. App Engine isn't a standard hosting site where you FTP single files; rather, you upload a service that is containerized and can scale out to run on many instances. For a small site this might feel weird, but consider a site serving huge amounts of traffic that might have hundreds of automatically load-balanced instances of your code running. How would you replace that single file across all those instances? So instead you upload a new version of the service, and then you can migrate traffic to the new version, either immediately or ramped up gradually.
What you might consider an annoyance is part of the tradeoff that makes App Engine hugely powerful in not having to worry about how your app scales or is networked.

Django: Deploying an application on Heroku with sqlite3 as the database

I want to deploy an application with sqlite3 as the database on Heroku. However, it seems that Heroku doesn't support applications with sqlite3 as the database. Is that true? Is there no way to deploy my sqlite3-backed application on Heroku?
PS: I have successfully deployed my application using PythonAnywhere, but would now like to know whether there's any possible way to deploy it using Heroku.
As Heroku's dynos don't have a filesystem that persists across deploys, a file-based database like SQLite3 isn't going to be suitable. It's a great DB for development/quick prototypes, though.
Heroku does have a Postgres offering, however, that will suit, with a free tier and a basic $9/month tier that are good for hobby/small projects. The biggest benefit over SQLite is that you get backups you wouldn't get otherwise (plus all the other Postgres features).
There's a guide to updating your settings.py to use Postgres here: https://devcenter.heroku.com/articles/getting-started-with-django#django-settings
Heroku has a detailed article explaining "Why is SQLite a bad fit for running on Heroku" https://devcenter.heroku.com/articles/sqlite3
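Heroku exposes the attached Postgres database through a DATABASE_URL environment variable, and the guide above uses the dj-database-url package to read it. As a rough, simplified sketch of what that package does (real deployments should just use the package), the URL can be parsed with the standard library:

```python
import os
from urllib.parse import urlparse

def database_config_from_url(url):
    """Translate a postgres://user:pass@host:port/name URL into a
    Django DATABASES entry. A simplified sketch of what the
    dj-database-url package does for you."""
    parsed = urlparse(url)
    return {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": parsed.path.lstrip("/"),
        "USER": parsed.username or "",
        "PASSWORD": parsed.password or "",
        "HOST": parsed.hostname or "",
        "PORT": str(parsed.port or ""),
    }

# Heroku sets DATABASE_URL for attached Postgres add-ons;
# the fallback here is only for local development.
DATABASES = {
    "default": database_config_from_url(
        os.environ.get("DATABASE_URL", "postgres://localhost/mydb")
    )
}
```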

Rails app to work with a remote heroku database

I have built an application in Python that is hosted on Heroku; it basically uses a script written in Python to store some results in a database (it runs as a scheduled task on a daily basis). I would have done this with Ruby/Rails to avoid this confusion, but the application partner did not support Ruby.
I would like to know if it will be possible to build the front-end with Ruby on Rails and use the same database.
My Rails application will need to make use of MVC and have its own tables in the database, but it will also use the database that the Python script sends data to, just to retrieve some data from there.
Can I create the Rails app and reference the details of the database that my python application uses?
How could I test this on my local machine?
What would be the best approach to this?
I don't see any problem in doing this, as long as Rails manages the database structure and the Python script populates it with data.
My advice, just to make it simpler, is to define the database schema through migrations in your Rails app and build it as if the Python script didn't exist.
Once you have completed it, simply start the Python script so it can begin populating the database (it may be necessary to rename some tables in the Python script, but no more than that).
If you want to test on your local machine you can do one of these:
run the Python script on your local machine
configure the database.yml in your Rails app to point to the remote DB (this can be difficult if you don't have administrative access to the host server, because of port forwarding etc.)
The only thing you should keep in mind is concurrent access.
Because you have two applications that both read and write to your DB, it would be better if the Python script did its job in a single atomic transaction, to avoid your Rails app finding the DB in a half-updated state.
You can think of the database as a shared box; it doesn't matter how many applications use it.
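The single-transaction pattern for the Python script can be sketched like this (shown with the standard-library sqlite3 driver as a stand-in; psycopg2 follows the same DB-API interface, and the results table is hypothetical):

```python
import sqlite3

def load_daily_results(conn, rows):
    """Insert the day's results in one atomic transaction: either every
    row becomes visible at once, or (on any error) none do, so the
    Rails app never reads a half-updated table."""
    with conn:  # commits on success, rolls back on any exception
        conn.executemany(
            "INSERT INTO results (name, score) VALUES (?, ?)", rows
        )

# Demo with an in-memory database standing in for the shared DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (name TEXT, score INTEGER)")
load_daily_results(conn, [("alpha", 1), ("beta", 2)])
```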

(Python/Django): How do I keep my production db in sync (schema and data) with my dev pc db?

I have a local Postgres database which will be filled with data (daily) on my local development machine. What is a good solution to transfer/sync/mirror this data to a production Postgres database?
For what it's worth I'm developing in Python using Django.
Thanks!
This seems like a strange workflow to me. Wouldn't it be much better to import the data into the production database and then just sync it with your development db from time to time?
IMO, the development machine shouldn't be included in the production data workflow.
That's the way I do it using fabric. I've written a simple function which copies part of the production db onto the local development machine.
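In the same spirit as that fabric helper, copying part of one database into another can be sketched with plain DB-API connections (shown here with sqlite3 stand-ins for the production and development Postgres databases; the results table and row limit are illustrative):

```python
import sqlite3

def copy_table(src, dst, table, limit=1000):
    """Copy up to `limit` rows of `table` from one DB-API connection
    into another, inserting them in a single transaction on the
    destination side."""
    rows = src.execute(f"SELECT * FROM {table} LIMIT ?", (limit,)).fetchall()
    if rows:
        placeholders = ", ".join("?" * len(rows[0]))
        with dst:  # one transaction: all rows appear at once
            dst.executemany(
                f"INSERT INTO {table} VALUES ({placeholders})", rows
            )

# Demo: "production" and "development" as two in-memory databases.
prod = sqlite3.connect(":memory:")
dev = sqlite3.connect(":memory:")
for c in (prod, dev):
    c.execute("CREATE TABLE results (day TEXT, value REAL)")
prod.executemany("INSERT INTO results VALUES (?, ?)", [("mon", 1.0), ("tue", 2.0)])
copy_table(prod, dev, "results")
```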
South is a great tool for dealing with database migrations in Django projects. The latest release supports both schema and data migrations:
http://south.aeracode.org/docs/tutorial/part3.html#data-migrations
The app provides a number of management commands that let you dump executable migration files which, when run, alter the database schema or insert records. It's great for automating changes to a production environment or when working on a team. You could then use something like fabric (or do it manually if you must) to push the migration files up and run the migrate command to populate your database.
