Integrate a Jupyter notebook with data fetched from a Django application

I have a Django application that integrates data from a SQL database and displays it as a downloadable HTML table. Now I would also like to add analysis functionality, but instead of painstakingly reimplementing a handful of features in JavaScript, I want to redirect the user to a Jupyter notebook, so the user has full access to the Python data analysis libraries, or even other languages. I'm not sure how to approach this, and I have several questions:
Am I right that Jupyter needs to be run on a different server than the Django app since it uses the Tornado server?
How would I transfer the data produced with Django to Jupyter? I store the data in a new MySQL table, but I would at least need to transfer the table name.
Given the table name was transferred to Jupyter I would still need to execute custom code to access the database. I found in this question that some defaults for code execution can be set in IPython configuration, but since the table from which the data should be loaded is never the same, this would have to be adapted dynamically.
I'd be glad for any suggestions, and for comments on whether this idea makes sense at all.

Am I right that Jupyter needs to be run on a different server than the Django app since it uses the Tornado server?
No. They can run on the same server; Jupyter's Tornado server simply listens on its own port alongside whatever serves the Django app.
How would I transfer the data produced with Django to Jupyter? I store the data in a new MySQL table, but I would at least need to transfer the table name.
You can run Django and Jupyter together and execute any Django code from Jupyter. The easiest way to do so is with shell_plus from the django-extensions package. You can even connect to the production database.
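For a concrete picture: after pip install django-extensions (and adding 'django_extensions' to INSTALLED_APPS), python manage.py shell_plus --notebook starts Jupyter with the Django project loaded. Here is a minimal sketch of pulling one of the generated tables into pandas; the table name is hypothetical and would in practice be handed over by the Django view (URL parameter, environment variable, ...):

    # inside the notebook started by shell_plus
    import pandas as pd
    from django.db import connection  # the Django app's own DB connection

    table_name = "analysis_results_42"  # assumed: written earlier by the Django view
    with connection.cursor() as cursor:
        cursor.execute(f"SELECT * FROM {table_name}")  # only use trusted table names
        columns = [col[0] for col in cursor.description]
        df = pd.DataFrame(cursor.fetchall(), columns=columns)
    df.describe()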
You should consider the possible security and data safety risks. This is a potential vector for remote code execution and data leaks.
It's probably safer to spin up a cloned Django instance with a cloned database, or to use a different Django configuration (settings.py) with read-only access to the db.
It's quite easy to mess up production data from Jupyter, since you can run things such as User.objects.all().delete(). And that would be a problem...
Also consider running this as a database user with stricter read and write permissions than your regular Django app uses.
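A minimal sketch of such a read-only configuration; the module layout, user name and environment variable are assumptions, not something django-extensions prescribes:

    # settings_jupyter.py -- hypothetical settings module for the notebook process
    import os

    from .settings import *  # reuse the regular project settings

    # override only the credentials: this MySQL user is assumed to have been
    # granted SELECT (and nothing else) on the app's schema
    DATABASES["default"].update({
        "USER": "jupyter_readonly",
        "PASSWORD": os.environ["JUPYTER_DB_PASSWORD"],
    })

You could then start the notebook with DJANGO_SETTINGS_MODULE=myproject.settings_jupyter python manage.py shell_plus --notebook, leaving the regular app's settings untouched.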
And of course, you should make sure that the Jupyter site is not exposed on a publicly accessible URL.

Related

How to access one Django Project Database from another Django Project?

I am trying to access one Django project's database from another Django project. So far I have tried building a REST API on both applications, and it works reasonably well on the local server. However, I am concerned that once I deploy both projects, going through the API will add latency, and if one server/application is down for maintenance, the other will be inaccessible.
Is it possible to directly access the other app's database without using an API, as that would be much faster and more reliable?
Thanks.
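For what it's worth, Django can do this out of the box through its multiple-databases support, by declaring the second project's database as an extra entry in DATABASES and routing queries to it explicitly. A hedged sketch, with all names and credentials hypothetical:

    # settings.py of the first project
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "project_one",
            "USER": "one_user",
            "PASSWORD": "...",
            "HOST": "localhost",
        },
        "other_project": {
            "ENGINE": "django.db.backends.postgresql",
            "NAME": "project_two",      # the second project's database
            "USER": "readonly_user",    # ideally a SELECT-only account
            "PASSWORD": "...",
            "HOST": "db.example.com",
        },
    }

Queries then name the alias explicitly, e.g. SomeModel.objects.using("other_project").all().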

Packaging a database with a full-stack Python application

I am currently creating an application that will be using Python Flask for the back-end and API and PostgreSQL as the database to store my data in JSON format. My plan is to have a front-end in JS to interact with the API which will pull relevant information from my database.
How do I package the database into the program so that if a fresh copy is pulled from GitHub, a user has everything needed to host and use the service? I am still a new developer and am having difficulty taking my hobbyist code and presenting it in a clean, organized way.
Thank you for all help in advance.
Though your question leaves quite a few options open, here are two things you could do:
If you assume your users can install a PostgreSQL database themselves: you could dump a database which contains the minimum required to run your application (using pg_dump). When your application starts on your user's server, it should detect that the database it's connecting to is empty, and that should trigger an import of your data. The only thing your users would then need to do is fill out their database connection details.
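A sketch of that first option; the helper, dump file name and empty-database check are assumptions (it presumes psycopg2 and the psql client are installed):

    import subprocess

    import psycopg2

    def ensure_seeded(dsn: str, dump_path: str = "seed.sql") -> None:
        """Import the bundled pg_dump output if the database has no tables yet."""
        conn = psycopg2.connect(dsn)
        try:
            with conn.cursor() as cur:
                cur.execute(
                    "SELECT count(*) FROM information_schema.tables "
                    "WHERE table_schema = 'public'"
                )
                (table_count,) = cur.fetchone()
        finally:
            conn.close()
        if table_count == 0:  # fresh, empty database -> load the shipped dump
            subprocess.run(["psql", dsn, "-f", dump_path], check=True)

Call ensure_seeded() once during application start-up, before the first request is served.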
If your users don't know anything about configuring servers: you could create a Docker image containing your Python code and PostgreSQL. This package will contain all the dependencies of your application and run anywhere. Admittedly, this is a bit more 'advanced' and could lead to other difficulties, both on your side and on your users'.

Changing the database structure of Django on a server

When we work with Django in a local environment, we change the structure of the database from the command prompt, through migrations.
But when Django runs on a server, I don't know how to apply such changes. How can I type the commands that change the database structure? Is it a good approach to upload the site files again every time I make a change?
The whole point of migrations is that you run them on both your local database and in production, to keep them in sync.
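Concretely, assuming a typical git-based deployment (paths and the commit message are illustrative), the workflow looks something like this:

    # on your development machine
    python manage.py makemigrations    # generate migration files from model changes
    git add myapp/migrations/ && git commit -m "schema change"

    # on the server, after pulling the new code
    python manage.py migrate           # apply the same migrations to the live database

The migration files live in version control, so the local and production databases evolve through exactly the same steps; no re-uploading of the whole site is needed.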

Remotely accessing sqlite3 in Django using a Python script

I have a Django application that runs on an Apache server and uses an SQLite (sqlite3) database. I want to access this database remotely using a Python script that first connects to the machine over SSH and then accesses the database.
After a lot of searching, I understand that an SQLite database cannot be accessed remotely the way a client/server database can. I don't want to download the db folder over FTP and work on a copy; I want to access it remotely.
What are the other possible ways to do this? I don't want to change the database; I'm looking for alternative ways to make the connection.
Leaving aside the question of whether it is sensible to run a production Django installation against SQLite (it really isn't), you seem to have forgotten that you are actually running Django. That means Django can be the main interface to your data, and therefore you should write code in Django that enables this.
Luckily, the Django REST Framework exists, which lets you expose your data via plain HTTP methods such as GET and POST. That would be a much better solution than reaching in over SSH.
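A minimal read-only sketch with Django REST Framework; the Measurement model and route names are hypothetical:

    # api.py -- expose an existing model over HTTP (read-only)
    from rest_framework import routers, serializers, viewsets

    from myapp.models import Measurement  # hypothetical model

    class MeasurementSerializer(serializers.ModelSerializer):
        class Meta:
            model = Measurement
            fields = "__all__"

    class MeasurementViewSet(viewsets.ReadOnlyModelViewSet):
        queryset = Measurement.objects.all()
        serializer_class = MeasurementSerializer

    router = routers.DefaultRouter()
    router.register(r"measurements", MeasurementViewSet)
    # in urls.py: path("api/", include(router.urls))

The remote script then just issues GET requests against /api/measurements/ instead of touching the database file.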
SQLite needs to access the provided file, so this is more of a filesystem question than a Python one. You have to find a way for SQLite and Python to access the remote directory, be it SFTP, SSHFS, FTP or whatever; it depends entirely on your remote and local OS. Preferably, mount the remote directory on your local filesystem.
You would not need to make a copy of the file, although if it is large you might want to consider that option too.
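For illustration, once the remote directory is mounted (the paths here are hypothetical), the database file behaves like any local one, with the caveat that SQLite's file locking is unreliable over network filesystems:

    # beforehand, on the shell: sshfs user@server:/srv/myapp /mnt/myapp
    import sqlite3

    conn = sqlite3.connect("/mnt/myapp/db.sqlite3")  # the mounted remote file
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"
    ).fetchall()
    print(tables)
    conn.close()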

How do I run a Django 1.6 project with multiple instances running off the same server, using the same db backend?

I have a Django 1.6 project (stored in a Bitbucket Git repo) that I wish to host on a VPS.
The idea is that when someone purchases a copy of the software I have written, I can type in a few simple commands that will take a designated copy of the code from Git, create a new instance of the project with its own subdomain (e.g. <customer_name>.example.com), and create a new Postgres database (on the same server).
I should hopefully be able to create and remove these 'instances' easily.
What's the best way of doing this?
I've looked into writing scripts using some combination of Supervisor/Gunicorn/Nginx/Fabric etc. Other options could be something more serious, like Docker or Vagrant. I've also looked into various PaaS options too.
Thanks in advance.
(EDIT: I have looked at the following services/things: Dokku (can't use Heroku due to data constraints), Vagrant (inc Puppet), Docker, Fabfile, Deis, Cherokee, Flynn (under dev))
If I were doing it (and I did a similar thing with a PHP application I inherited), I'd have a Fabric command that allows me to provision a new instance.
This could be broken up into the requisite steps (check out the code, create the database, run syncdb/migrate, create the DNS entry, start the web server).
I'd probably do something sane like using the DNS entry as the database name, or at least deriving one from the other with a reversible function.
You could then string these together to easily create a new instance.
You will also need a way to tell the newly created instance which database and domain name it needs to use. You could have the provisioning script write some data to a file in the checked-out repository, which Django then reads during its initialisation phase, as in the sketch below.
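A hedged sketch of such a command using the modern Fabric (2.x) API; the repository URL, paths and service names are all hypothetical, and Django 1.6 predates the built-in migrate command, hence syncdb:

    # fabfile.py -- provision one customer instance on the VPS
    from fabric import task

    @task
    def provision(c, customer):
        # invoke as: fab -H vps.example.com provision --customer=acme
        c.run(f"git clone git@bitbucket.org:me/project.git /srv/{customer}")
        c.run(f"createdb {customer}")  # database name derived from the subdomain
        # record the instance-specific settings the app reads at start-up
        c.run(
            f"printf 'DB_NAME={customer}\\nDOMAIN={customer}.example.com\\n'"
            f" > /srv/{customer}/instance.cfg"
        )
        c.run(f"cd /srv/{customer} && python manage.py syncdb --noinput")
        c.run(f"supervisorctl start {customer}")

A matching teardown task (drop the database, remove the directory, stop the process) would make instances as easy to remove as to create.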
