I am currently building an application that uses Python Flask for the back-end API and PostgreSQL as the database, storing my data in JSON format. My plan is to have a JavaScript front-end that interacts with the API, which pulls the relevant information from my database.
How do I package the database into the program so that if a fresh copy is pulled from GitHub, a user has everything needed to host and use the service? I am still a new developer and am having difficulty taking my hobbyist code and presenting it in a clean, organized way.
Thanks in advance for any help.
Though your question leaves quite a few options open, here are two things you could do:
If you assume your users can install a PostgreSQL database themselves: you could ship a dump (created with pg_dump) of a database containing the minimum required to run your application. When your application starts on your user's server, it should detect that the database it connects to is empty, which should trigger an import of your data. The only thing your users then need to do is fill in their database connection details.
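A minimal sketch of that empty-database check, assuming psycopg2 and a plain-SQL dump; the function name, file name and the way the DSN is obtained are hypothetical:

    # Minimal sketch: seed an empty PostgreSQL database at startup.
    # Assumes a plain-SQL dump created with `pg_dump --no-owner mydb > seed.sql`
    # and the psql client tools available on the user's machine.
    # Function and file names here are illustrative, not part of Flask.
    import subprocess
    import psycopg2

    def ensure_seeded(dsn, dump_path="seed.sql"):
        conn = psycopg2.connect(dsn)
        with conn, conn.cursor() as cur:
            # A fresh database has no tables in the public schema.
            cur.execute(
                "SELECT count(*) FROM information_schema.tables "
                "WHERE table_schema = 'public'"
            )
            is_empty = cur.fetchone()[0] == 0
        conn.close()
        if is_empty:
            subprocess.run(["psql", dsn, "-f", dump_path], check=True)

You would call this once at startup, before the first request, using the connection details your user filled in.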
If your users don't know anything about configuring servers: you could create a Docker image containing your Python code and PostgreSQL. This package contains all of your application's dependencies and runs anywhere. Admittedly, this is a bit more 'advanced' and could lead to other difficulties, both on your side and on your users'.
Related
Hello, I am creating a small program in Python with the cx_Oracle module, which lets me connect to the Oracle database on my computer. However, I would like to send the program to a friend, so I would like him to be able to work with the same database as me. I thought of an embeddable database (a bit like an SQLite file), but I did not find such a possibility for Oracle. I would like to know if there is a way to do this with Oracle, or if I am forced to connect to a local database.
First of all, you can export and import Oracle databases, which will help a lot with the initial sharing. However, if you share your database with a friend and each of you works on your own copy, the two databases will gradually diverge, with ever greater impact. So you will need to consider your options carefully:
Using a central server
You could use a central server (which could be a remote server, or even your own computer if you apply port forwarding) and ensure that both you and your friend connect to that database; then both your changes and his will automatically be applied to the same database, with no copies involved.
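A minimal connection sketch with cx_Oracle; the host, port, service name and credentials are hypothetical placeholders for your central server:

    # Minimal sketch: both programs point at the same central database,
    # so there are no copies to keep in sync.
    # Host, port, service name and credentials are hypothetical.
    import cx_Oracle

    dsn = cx_Oracle.makedsn("central-host.example.com", 1521,
                            service_name="ORCLPDB1")
    conn = cx_Oracle.connect(user="app_user", password="secret", dsn=dsn)

You and your friend would use identical connection details, so every change lands in the same database.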
Versioned dumps
You could use a versioning tool like git to store versions of your database dump/structure/data. Both you and your friend could use this, storing the versions in a central repository, so you would not need to send and communicate your database changes again and again. This would keep schema and data synchronized, though you will likely run into frequent merge conflicts and other merge-related problems.
Versioned scripts
You and your friend could write versioned scripts. These would apply only structural changes, so your test data and your friend's would diverge, but the structure would not.
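A minimal sketch of applying such versioned scripts, assuming a schema_version table you create up front and one SQL statement per file; all names here are hypothetical:

    # Minimal sketch: apply numbered SQL scripts exactly once, recording
    # the highest applied version in a schema_version table created up
    # front. File names like migrations/003_add_index.sql are hypothetical.
    import glob
    import os
    import cx_Oracle

    conn = cx_Oracle.connect("app_user", "secret", "localhost/XEPDB1")
    cur = conn.cursor()
    cur.execute("SELECT NVL(MAX(version), 0) FROM schema_version")
    current = cur.fetchone()[0]

    for path in sorted(glob.glob("migrations/*.sql")):
        version = int(os.path.basename(path).split("_")[0])
        if version > current:
            with open(path) as f:
                # cx_Oracle executes one statement at a time and
                # rejects a trailing semicolon.
                cur.execute(f.read().strip().rstrip(";"))
            cur.execute("INSERT INTO schema_version (version) VALUES (:v)",
                        v=version)
            conn.commit()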
Migration scripts
Some ORMs can generate migration scripts automatically, and you can migrate forwards or backwards several revisions. I am not particularly fond of automatically generated change scripts, but they are certainly a possible solution.
I have a Django application that integrates data from a SQL database and displays that as a downloadable HTML table. Now I would also like to add analysis functionality, but instead of painstakingly adding only a few functionalities using JavaScript, I want to redirect the user to a Jupyter notebook, so the user has full access to the functionality of Python data analysis libraries or even other languages. Now I'm not sure how to approach this and I have several questions:
Am I right that Jupyter needs to be run on a different server than the Django app since it uses the Tornado server?
How would I transfer the data produced with Django to Jupyter? I store the data in a new MySQL table, but I would at least need to transfer the table name.
Given the table name was transferred to Jupyter I would still need to execute custom code to access the database. I found in this question that some defaults for code execution can be set in IPython configuration, but since the table from which the data should be loaded is never the same, this would have to be adapted dynamically.
I'm glad for any suggestions and comments on whether this idea makes sense at all.
Am I right that Jupyter needs to be run on a different server than the Django app since it uses the Tornado server?
No. They can run on the same server.
How would I transfer the data produced with Django to Jupyter? I store the data in a new MySQL table, but I would at least need to transfer the table name.
You can run Django and Jupyter together and execute any Django code from Jupyter. The easiest way to do so is with shell_plus from the django-extensions package. You can even connect to the production database.
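With django-extensions installed, you start it with python manage.py shell_plus --notebook. If you prefer to bootstrap Django by hand in a plain notebook, a minimal sketch (the project, app and model names are hypothetical):

    # Minimal sketch: initialise Django inside a Jupyter notebook cell.
    # "myproject", "myapp" and "Result" are hypothetical names.
    import os
    import django

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
    # Jupyter runs an asyncio event loop; without this flag, recent
    # Django versions refuse synchronous ORM calls from the notebook.
    os.environ["DJANGO_ALLOW_ASYNC_UNSAFE"] = "true"
    django.setup()

    from myapp.models import Result
    Result.objects.count()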
You should consider the possible security and data safety risks. This is a potential vector for remote code execution and data leaks.
It's probably safer to spin up a cloned Django instance with a cloned database, or to use a different Django configuration (settings.py) with read-only access to the DB.
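A minimal sketch of such a read-only configuration, assuming a database user that has only SELECT grants; the module, user and password names are hypothetical:

    # settings_notebook.py -- minimal sketch of a read-only configuration.
    # Reuses the regular settings but connects as a SELECT-only database
    # user; module, user and password names are hypothetical.
    from myproject.settings import *  # noqa: F401,F403

    DATABASES["default"]["USER"] = "readonly_user"
    DATABASES["default"]["PASSWORD"] = "readonly_password"

Starting Jupyter with DJANGO_SETTINGS_MODULE=myproject.settings_notebook then makes destructive ORM calls fail at the database level.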
It's quite easy to mess up production data from Jupyter, since you can do things such as User.objects.all().delete(), and that would be a problem...
Also consider running this as a database user with stricter read and write permissions than the one your regular Django app uses.
And of course, you should make sure that the Jupyter site is not exposed on a publicly accessible URL.
I have built an application in Python, hosted on Heroku, that basically uses a Python script to store some results in a database (it runs as a scheduled task on a daily basis). I would have done this with Ruby/Rails to avoid this confusion, but the application partner did not support Ruby.
I would like to know if it will be possible to build the front-end with Ruby on Rails and use the same database.
My Rails application will need to use MVC and have its own tables in the database, but it will also use the database the Python script sends data to, just to retrieve some data from there.
Can I create the Rails app and reference the details of the database that my Python application uses?
How could I test this on my local machine?
What would be the best approach to this?
I don't see any problem in doing this, as long as Rails manages the database structure and the Python script populates it with data.
My advice, just to keep things simple, is to define the database schema through migrations in your Rails app and build it as if the Python script didn't exist.
Once you have completed it, simply start the Python script so it can begin populating the database (it may be necessary to rename some tables in the Python script, but no more than that).
If you want to test on your local machine, you can do one of these:
run the Python script on your local machine
configure the database.yml in your Rails app to point to the remote DB (this can be difficult if you don't have administrative access to the host server, because of port forwarding etc.)
The only thing you should keep in mind is concurrent access.
Because you have two applications that both read and write in your DB, it would be better if the Python script did its job in a single, atomic transaction, to avoid your Rails app finding the DB in a half-updated state.
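If the database is PostgreSQL, for example, a minimal sketch of the script's write phase with psycopg2; the DSN, table and data source are hypothetical:

    # Minimal sketch: do all of the day's writes in one transaction.
    # psycopg2's connection context manager commits on success and
    # rolls back on error, so readers never see a half-updated state.
    # The DSN, table and fetch_daily_results() are hypothetical.
    import psycopg2

    conn = psycopg2.connect("postgresql://user:secret@host/dbname")
    with conn:
        with conn.cursor() as cur:
            for row in fetch_daily_results():
                cur.execute(
                    "INSERT INTO results (name, value) VALUES (%s, %s)",
                    (row["name"], row["value"]),
                )
    conn.close()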
You can think of the database as a shared box; it doesn't matter how many applications use it.
I have a Django 1.6 project (stored in a Bitbucket Git repo) that I wish to host on a VPS.
The idea is that when someone purchases a copy of the software I have written, I can type in a few simple commands that will take a designated copy of the code from Git, create a new instance of the project with its own subdomain (e.g. <customer_name>.example.com), and create a new Postgres database (on the same server).
I should hopefully be able to create and remove these 'instances' easily.
What's the best way of doing this?
I've looked into writing scripts using some combination of Supervisor/Gunicorn/Nginx/Fabric etc. Other options could be something more serious like using Docker or Vagrant. I've also looked into various PaaS options too.
Thanks in advance.
(EDIT: I have looked at the following services/things: Dokku (can't use Heroku due to data constraints), Vagrant (inc Puppet), Docker, Fabfile, Deis, Cherokee, Flynn (under dev))
If I were doing it (and I did a similar thing with a PHP application I inherited), I'd have a Fabric command that allows me to provision a new instance.
This could be broken up into the requisite steps (check out the code, create the database, run syncdb/migrate, create the DNS entry, start the web server).
I'd probably do something sane like using the DNS entry as the database name, or at least deriving the database name from it with a reversible function.
You could then string these together to easily create a new instance.
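A minimal sketch of such a fabfile in the Fabric 1.x style; the repository URL, paths and the remaining steps are hypothetical:

    # fabfile.py -- minimal sketch of a per-customer provisioning task.
    # Repository URL, paths and the DNS/web-server steps are hypothetical.
    from fabric.api import run, task

    @task
    def provision(customer):
        repo = "git@bitbucket.org:you/yourproject.git"
        instance_dir = "/srv/%s" % customer
        db_name = customer.replace("-", "_")  # reversible subdomain -> db name
        run("git clone %s %s" % (repo, instance_dir))
        run("createdb %s" % db_name)
        run("cd %s && python manage.py syncdb --noinput" % instance_dir)
        # ...create the DNS entry for <customer>.example.com and
        # start the web server here

Invoked as fab -H yourserver provision:acme, this would stand up an instance for the customer 'acme'.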
You will also need a way to tell the newly created instance which database and domain name it should use. You could have the provisioning script write some data to a file in the checked-out repository that is then used by Django in its initialisation phase.
I looked at the sqlite.org docs, but I am new to this, so bear with me. (I have a tiny bit of experience with MySQL, and I think using it would be overkill for what I am trying to do with my application.)
From what I understand, I can initially create an SQLite db file locally on my Mac and add entries to it using a Firefox extension. I could then store any number of tables and images (as binary). Once my site that uses this db is live, I could upload the db file to any web hosting service, to any directory. In my site I could have a form that collects data and sends a request to write that data to the db file. Then I could have an iOS app that connects to the db and reads the data. Did I get this right?
Would I be able to run a Python script that writes to SQLite? What questions should I ask a potential hosting service? (I want to leave MediaTemple, so I am looking around...)
I don't want to be limited to a Windows server; I am assuming SQLite would run on Unix? Or does it depend on the hosting service? Thanks!
I could upload the db file to any web hosting service to any directory
Yes, supposing the service has SQLite installed along with the libraries needed to use it.
Would I be able to run a Python script that writes to SQLite
Yes, well, maybe. As of Python 2.5, Python includes SQLite support (the sqlite3 module) as part of its standard library.
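A minimal sketch with the bundled sqlite3 module; the file and table names are hypothetical:

    # Minimal sketch using Python's built-in sqlite3 module.
    # Database file and table names are hypothetical.
    import sqlite3

    conn = sqlite3.connect("site.db")  # creates the file if it doesn't exist
    conn.execute("CREATE TABLE IF NOT EXISTS entries "
                 "(id INTEGER PRIMARY KEY, body TEXT)")
    conn.execute("INSERT INTO entries (body) VALUES (?)", ("hello",))
    conn.commit()
    conn.close()

The only server-side requirement is write permission on the directory holding the file, since SQLite also creates journal files next to it.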
What questions should I ask a potential hosting service
Usually their technical specs will list which databases/libraries/languages are supported. I have successfully run Python sites with SQLite databases on DreamHost.
SQLite would run on Unix
Most *nix flavors have pre-packaged SQLite binaries. The hosting provider should be able to confirm this as well.