I have built an application in Python that is hosted on Heroku; it basically uses a Python script to store some results in a database (it runs as a scheduled task on a daily basis). I would have done this with Ruby/Rails to avoid this confusion, but the application partner did not support Ruby.
I would like to know whether it is possible to build the front-end with Ruby on Rails and use the same database.
My Rails application will need to use MVC and have its own tables in the database, but it will also use the tables that the Python script populates, just to retrieve some data from them.
Can I create the Rails app and reference the details of the database that my Python application uses?
How could I test this on my local machine?
What would be the best approach to this?
I don't see any problem in doing this, as long as Rails manages the database structure and the Python script populates it with data.
My advice, just to make things simpler, is to define the database schema through migrations in your Rails app and build it as if the Python script didn't exist.
Once you have completed it, simply start the Python script so it can begin populating the database (it may be necessary to rename some tables in the Python script, but nothing more than that).
If you want to test on your local machine, you can do one of these:
run the Python script on your local machine
configure the database.yml in your Rails app to point to the remote DB (this can be difficult if you don't have administrative access to the host server, because of port forwarding etc.); see the sketch after this list
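For the second option, a rough sketch of what the database.yml entry could look like; the host, database name and credentials below are placeholders for whatever your Python application actually uses:

```yaml
# config/database.yml -- placeholders only; copy the real connection
# details from the database your Python script writes to.
development:
  adapter: postgresql
  encoding: unicode
  host: your-remote-db-host.example.com
  port: 5432
  database: shared_db
  username: db_user
  password: db_password
```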
The only thing you should keep in mind is concurrent access.
Because you have two applications that both read and write in your DB, it would be better if the Python script did its job in a single, atomic transaction, to avoid your Rails app finding the DB in a half-updated state.
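For example, if the Python script talks to Postgres via psycopg2, wrapping the daily update in one transaction is enough; a minimal sketch (the connection details and the results function are made up):

```python
# Daily update done as a single atomic transaction (psycopg2).
import psycopg2

conn = psycopg2.connect(dbname="shared_db", user="db_user", password="secret")
try:
    with conn:  # commits on success, rolls back on any exception
        with conn.cursor() as cur:
            for name, value in compute_daily_results():  # hypothetical function
                cur.execute(
                    "INSERT INTO results (name, value) VALUES (%s, %s)",
                    (name, value),
                )
finally:
    conn.close()
```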
You can think of the database as a shared box; it doesn't matter how many applications use it.
I am currently creating an application that will use Python Flask for the back-end and API, and PostgreSQL as the database to store my data in JSON format. My plan is to have a front-end in JS that interacts with the API, which will pull the relevant information from my database.
How do I package the database into the program so that if a fresh copy is pulled from GitHub, a user would have everything needed to host and use the service? I am still a new developer and am having difficulty taking my hobbyist code and presenting it in a clean, organized way.
Thank you in advance for any help.
Though your question leaves quite a few options open, here are two things you could do:
If you assume your users can install a PostgreSQL database themselves: you could ship a dump containing the minimum required to run your application (created using pg_dump). When your application starts on your user's server, it should detect that the database it's connecting to is empty, which should trigger an import of your data (see the sketch after this list). The only thing your users then have to do is fill out their database connection details.
If your users don't know anything about configuring servers: you could create a Docker image containing your Python code and PostgreSQL. This package contains all the dependencies of your application and runs anywhere. Admittedly, this is a bit more 'advanced' and could lead to other difficulties, both on your side and on your users'.
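A minimal sketch of the empty-database check from the first option; the connection parameters, dump filename, and the choice of psycopg2/psql here are my assumptions:

```python
# On startup: if the database has no tables yet, load the bundled seed dump.
import subprocess
import psycopg2

def ensure_seed_data(conn_params, dump_path="seed.sql"):
    conn = psycopg2.connect(**conn_params)
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT count(*) FROM information_schema.tables "
                "WHERE table_schema = 'public'"
            )
            (table_count,) = cur.fetchone()
    finally:
        conn.close()
    if table_count == 0:
        # Restore a plain-SQL dump created earlier with pg_dump
        subprocess.run(
            ["psql", "--dbname", conn_params["dbname"],
             "--username", conn_params["user"], "--file", dump_path],
            check=True,
        )
```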
I've built a Django app that uses SQLite (the default database), but I can't find anywhere that allows deployment with SQLite. Heroku only works with PostgreSQL, and I've spent two days trying to switch databases and can't figure it out, so I want to just deploy with SQLite. (This is just a small application.)
A few questions:
Is there anywhere I can deploy with SQLite?
If so, where/how?
SQLite is a database on disk; it is very useful for development purposes, but services like Heroku expect your server-side code to be stateless, which as a consequence does not really allow for databases such as SQLite. I guess you could make it work (provided you find a place on Heroku's disk to put your SQLite db), but you would lose your database's content every time you redeploy.
For Heroku specifically, I'll redirect you to this link which explains how to use Django with PostgreSQL on Heroku.
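In short, the usual pattern on Heroku is to read the DATABASE_URL environment variable that Heroku sets; a minimal sketch using the dj-database-url package (assuming you add it to your requirements):

```python
# settings.py -- read Heroku's DATABASE_URL via dj-database-url.
import dj_database_url

DATABASES = {
    # Falls back to a local Postgres database when DATABASE_URL is not set
    "default": dj_database_url.config(default="postgres://localhost/mydb"),
}
```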
Don't use SQLite on Heroku. As stated in the docs, you will lose your data:
SQLite runs in memory, and backs up its data store in files on disk. While this strategy works well for development, Heroku's Cedar stack has an ephemeral filesystem. You can write to it, and you can read from it, but the contents will be cleared periodically. If you were to use SQLite on Heroku, you would lose your entire database at least once every 24 hours.

Even if Heroku's disks were persistent, running SQLite would still not be a good fit. Since SQLite does not run as a service, each dyno would run a separate running copy. Each of these copies needs its own disk-backed store. This would mean that each dyno powering your app would have a different set of data since the disks are not synchronized.

Instead of using SQLite on Heroku you can configure your app to run on Postgres.
Sure, you can deploy with SQLite. It's not really recommended, but it should work OK if you have low traffic.
Set your database engine to SQLite in settings.py (a sketch follows below); just make sure you have write access to the path that you specify for your database file.
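A minimal sketch of that settings.py configuration:

```python
# settings.py -- a minimal SQLite configuration sketch.
import os

BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        # The process needs write access to this file *and* its directory
        # (SQLite creates journal files next to the database).
        "NAME": os.path.join(BASE_DIR, "db.sqlite3"),
    }
}
```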
I have a Django application that runs on an Apache server and uses an SQLite3 db. I want to access this database remotely using a Python script that first SSHes into the machine and then accesses the database.
After a lot of searching, I understand that SQLite databases cannot be accessed remotely. I don't want to download the db file over FTP and work on a copy; instead, I want to access it remotely.
What could be the other possible ways to do this? I don't want to change the database, but I am looking for alternative ways to achieve the connection.
Leaving aside the question of whether it is sensible to run a production Django installation against SQLite (it really isn't), you seem to have forgotten that, well, you are actually running Django. That means that Django can be the main interface to your data, and therefore you should write code in Django that enables this.
Luckily, there exists the Django REST Framework, which allows you to simply expose your data via HTTP methods like GET and POST. That would be a much better solution than accessing the database over SSH.
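A minimal sketch of what that could look like; the Measurement model, app name and route are made up for illustration:

```python
# Expose a (hypothetical) Measurement model over HTTP with Django REST Framework.
from rest_framework import routers, serializers, viewsets
from myapp.models import Measurement  # hypothetical model

class MeasurementSerializer(serializers.ModelSerializer):
    class Meta:
        model = Measurement
        fields = "__all__"

class MeasurementViewSet(viewsets.ModelViewSet):
    queryset = Measurement.objects.all()
    serializer_class = MeasurementSerializer

# urls.py: the router generates list/detail endpoints for GET, POST, etc.
router = routers.DefaultRouter()
router.register(r"measurements", MeasurementViewSet)
```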
SQLite needs to access the provided file, so this is more of a filesystem question than a Python one. You have to find a way for SQLite and Python to access the remote directory, be it SFTP, SSHFS, FTP or whatever. It entirely depends on your remote and local OS. Preferably, mount the remote subdirectory on your local filesystem.
You would not need to make a copy of it, although if the file is large you might want to consider that option too.
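For example, with SSHFS the Python side is just the standard sqlite3 module pointed at the mounted path (the mount point and filename below are made up). Be aware that SQLite's file locking is not reliable over network filesystems, so this is safest for read-mostly access:

```python
# Assumes the remote directory is already mounted locally, e.g. with:
#   sshfs user@remote-host:/srv/mysite /mnt/remote
import sqlite3

conn = sqlite3.connect("/mnt/remote/db.sqlite3")  # hypothetical mount point
try:
    # Quick sanity check: list the tables in the remote database
    for (name,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ):
        print(name)
finally:
    conn.close()
```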
I have a Django 1.6 project (stored in a Bitbucket Git repo) that I wish to host on a VPS.
The idea is that when someone purchases a copy of the software I have written, I can type in a few simple commands that will take a designated copy of the code from Git, create a new instance of the project with its own subdomain (e.g. <customer_name>.example.com), and create a new Postgres database (on the same server).
I should hopefully be able to create and remove these 'instances' easily.
What's the best way of doing this?
I've looked into writing scripts using some combination of Supervisor/Gunicorn/Nginx/Fabric etc. Other options could be something more serious like using Docker or Vagrant. I've also looked into various PaaS options.
Thanks in advance.
(EDIT: I have looked at the following services/things: Dokku (can't use Heroku due to data constraints), Vagrant (inc Puppet), Docker, Fabfile, Deis, Cherokee, Flynn (under dev))
If I were doing it (and I did a similar thing with a PHP application I inherited), I'd have a Fabric command that allows me to provision a new instance.
This could be broken up into the requisite steps (check-out code, create database, syncdb/migrate, create DNS entry, start web server).
I'd probably do something sane like using the DNS entry as the database name, or at least using a reversible function to derive it.
You could then string these together to easily create a new instance.
You will also need a way to tell the newly created instance which database and domain name it needs to use. You could have the provisioning script write some data to a file in the checked-out repository that is then used by Django in its initialisation phase.
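A rough sketch of what such a fabfile could look like; the repository URL, paths and commands are illustrative assumptions, not a drop-in script:

```python
# fabfile.py -- provision a new customer instance. Run as, e.g.:
#   fab -H my-vps.example.com provision:acme
from fabric.api import run, task

@task
def provision(customer):
    instance = "/srv/instances/%s" % customer
    # 1. check out a designated copy of the code
    run("git clone git@bitbucket.org:me/myproject.git %s" % instance)
    # 2. create a database, reusing the subdomain as the database name
    run("createdb %s" % customer)
    # 3. syncdb/migrate (Django 1.6 with South)
    run("cd %s && python manage.py syncdb --noinput --migrate" % instance)
    # 4. record the domain and database for the instance to read at start-up
    run("echo '%s.example.com %s' > %s/instance.conf"
        % (customer, customer, instance))
    # 5. the DNS entry and web server start depend on your provider and
    #    supervisor/nginx setup, so they are left out of this sketch
```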
I am looking for feasible solutions to back my application with MongoDB. I am looking to host MongoDB in the cloud, with a Python-based server in between to interact with the DB and my app (either mobile or web). I am trying to understand what the architecture should look like.
One option is to host MongoDB on the AWS cloud and have the server running there as well.
I also tried MongoLab, and accessing it using HTTP requests seemed simple, but I am not sure whether it exposes all the essential features of MongoDB (whatever I can do using the pymongo driver). Also, should I access the MongoLab service directly from my application, or should I still build a server in between?
I would prefer building a server in either case, as I want to do some processing before sending the data back to the application, but I am not sure in that case what my DB-server-app interaction design should look like.
Any suggestions?
One thing to consider is that you don't need to use MongoLab's REST API. You can connect directly via a driver as well.
So, if you need to implement business logic (which it sounds like you do), it makes sense to have a three-tier architecture with an app server connecting to your MongoLab database via one of the drivers. In your case it sounds like this would be pymongo.
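A minimal sketch of that middle tier; the connection URI, database and collection names are placeholders for your own MongoLab details:

```python
# App-server tier talking to a MongoLab-hosted database via pymongo.
from pymongo import MongoClient

client = MongoClient("mongodb://user:password@ds012345.mongolab.com:12345/mydb")
db = client["mydb"]

def top_scores(limit=10):
    # This is where your pre-processing/business logic lives,
    # between the app and the database.
    return list(db.scores.find().sort("score", -1).limit(limit))
```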
-will