How to efficiently save data to a remote server database (python)

Currently in my setup, there are a few client machines doing processing and one server machine with a database. After an individual client machine finishes processing some data, it saves the data to a .txt file and SFTPs it over to the server.
The server has a job that just waits for these .txt files and stores the data in a database.
I wanted to know of any other, more efficient processes for this; I'm kind of a Python beginner. Is there a way to remotely save data into the server's database? How do I do so securely, etc.?
To be more specific, this project is a web app hosted in Django. I know how to use Django's standalone scripts to save data into a DB; I just need to know how to do so remotely.
Thank you.

Django databases can be remote - there is no requirement that they be on the same host as the Django server. Just set an appropriate HOST and PORT. See: https://docs.djangoproject.com/en/dev/ref/databases/#id10
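For example, a remote database can be configured in settings.py like this; the PostgreSQL backend, host name, and credentials below are placeholders for illustration, not your actual setup:

# settings.py -- sketch of a DATABASES entry pointing at a remote host.
# Engine, host, and credentials are placeholders; use your own backend and details.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "db.example.com",  # remote database server instead of localhost
        "PORT": "5432",
    }
}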
Update: Based on your comment, I understand that you want to write Python/Django code that will run in the browser and connect to a remote database. There is no practical way of doing this. Have the data sent back to your server, and forward it on from there.
Update 2: If you are able to distribute software outside of the browser, you could have a small Django deployment on each client computer, which the user connects to through their browser and which could connect directly to the database. Obviously, security considerations apply.

Related

Python deployment on webserver with CGI

I have a Python script that I want to make accessible through a website with a user interface.
I was experimenting with Flask, but I'm not sure it's the right tool for what I want to do.
My script takes user data (.doc/.txt files), does something with it, and returns it to the user. I don't want to save anything, and I don't think I need a database for this (is that right?). The file will be temporarily saved on the server, and everything will be deleted once the user has downloaded the modified file.
My web hosting provider supports Python but only accepts CGI. I read that WSGI is the preferred method to use with Python and that CGI has scaling issues and can only process one request at a time. I'm not sure I understand this correctly. If several users upload files at the same time, would the server only accept one request, or overwrite previous requests? Or can it handle one request per unique IP address/user?
Would CGI be OK for the simple get/process/return task of my Python script, or should I look into a hosting service that uses WSGI?
I had a look at Heroku and Render for deploying a Flask app, but I think I could also do that through my web hosting provider.
For anyone interested in this topic,
I decided to deploy my app on render.com, which supports gunicorn (WSGI).
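For context, a minimal Flask app of this get/process/return kind can be served by any WSGI server such as gunicorn. The sketch below assumes Flask 2.x; the route, form field name, and the processing step are placeholders:

# app.py -- sketch: accept an uploaded file, transform it, send the result back.
# Nothing is stored permanently; the processing step is a stand-in.
import io
from flask import Flask, request, send_file

app = Flask(__name__)

@app.route("/process", methods=["POST"])
def process():
    uploaded = request.files["document"]              # the user's .doc/.txt upload
    text = uploaded.read().decode("utf-8", "ignore")
    result = text.upper()                             # placeholder for the real processing
    return send_file(io.BytesIO(result.encode("utf-8")),
                     as_attachment=True,
                     download_name="processed.txt")

# Served under WSGI with, for example:  gunicorn app:app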

Packaging a database with a full-stack Python application

I am currently creating an application that will use Python Flask for the back end and API, and PostgreSQL as the database to store my data in JSON format. My plan is to have a front end in JS that interacts with the API, which will pull relevant information from my database.
How do I package the database into the program so that if a fresh copy is pulled from GitHub, a user has everything needed to host and use the service? I am still a new developer and am having difficulty taking my hobbyist code and presenting it in a clean, organized way.
Thank you for all help in advance.
Though your question leaves quite a few options open, here are two things you could do:
If you assume your users can install a PostgreSQL database themselves: you could dump a database which contains the minimum required to run your application (using pg_dump). When your application starts on your user's server, it should detect that the database it's connecting to is empty, which should trigger an import of your data. The only thing your users would have to do is fill out their database connection details.
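A rough sketch of that start-up check, assuming psycopg2 for the connection and a seed.sql dump shipped with the repository (both are assumptions for illustration):

# init_db.py -- sketch: if the target database has no tables yet, load the
# SQL dump created earlier with pg_dump. DSN and file name are placeholders.
import subprocess
import psycopg2

def ensure_seeded(dsn, dump_path="seed.sql"):
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT count(*) FROM information_schema.tables "
            "WHERE table_schema = 'public'")
        table_count = cur.fetchone()[0]
    if table_count == 0:
        # Empty database: import the dump with psql.
        subprocess.run(["psql", dsn, "-f", dump_path], check=True)

# ensure_seeded("postgresql://user:password@localhost/myapp")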
If your users don't know anything about configuring servers: you could create a Docker image containing your Python code and PostgreSQL. This package will contain all dependencies of your application and run anywhere. Admittedly, this is a bit more 'advanced' and could lead to other difficulties, both on your side and on your users' side.

Script that can automatically download new data from the server to my local backup

I have an application running on a Linux server and I need to create a local backup of its data.
However, new data is added to the application every hour, and I want to keep my local backup in sync with the server's data.
I want to write a script (shell or Python) that can automatically download newly added data from the Linux server to my local backup. But I am a newbie to the Linux environment and don't know how to write a shell script to achieve this.
What is the best way of achieving this? And what would the script look like?
rsync -r fits your use case, and it's a single-line command:
rsync -r source destination
or with whatever options you need for your specific case.
So you don't need a Python script for that, but you can still write one and have it run the command above.
Moreover, if you want the Python script to do it automatically, you may check the event scheduler module.
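If you do wrap it in Python, a minimal sketch could combine subprocess with the standard-library sched module; the source and destination paths below are placeholders:

# backup.py -- sketch: run rsync once an hour using subprocess and sched.
import sched
import subprocess
import time

scheduler = sched.scheduler(time.time, time.sleep)

def sync():
    subprocess.run(
        ["rsync", "-r", "user@server:/var/app/data/", "/home/me/backup/"],
        check=True)
    scheduler.enter(3600, 1, sync)   # schedule the next run in one hour

scheduler.enter(0, 1, sync)
scheduler.run()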
This depends on where and how your data is stored on the Linux server, but you could write a network application which pushes the data to a client and the client saves the data on the local machine. You can use sockets for that.
If the data is available via an HTTP server and you know how to write RESTful APIs, you could use that as well and have a task run on your local machine every hour which calls the REST API and handles its (JSON) data. Keep in mind that you need to secure the API if the server is reachable online and not just in the same LAN.
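Such a client could be as small as the following sketch, run hourly (e.g. from cron); the endpoint URL, token, and backup path are hypothetical:

# pull_backup.py -- sketch: fetch newly added records from a hypothetical REST
# endpoint and append them to a local JSON-lines backup file.
import json
import requests

API_URL = "https://server.example.com/api/new-data"
TOKEN = "replace-with-a-real-token"

def pull():
    resp = requests.get(API_URL,
                        headers={"Authorization": f"Bearer {TOKEN}"},
                        timeout=30)
    resp.raise_for_status()
    with open("/home/me/backup/data.jsonl", "a") as fh:
        for record in resp.json():
            fh.write(json.dumps(record) + "\n")

if __name__ == "__main__":
    pull()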
You could also write a small application which downloads the files every hour from the server over FTP (if you want to back up files stored on the filesystem). You will need to know the exact path of the file(s) to do this, though.
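The FTP variant is also short with the standard-library ftplib; host, credentials, and paths below are placeholders:

# ftp_backup.py -- sketch: download one known file from the server over FTP.
from ftplib import FTP

def download():
    with FTP("server.example.com") as ftp:
        ftp.login("backup_user", "backup_password")
        with open("/home/me/backup/data.db", "wb") as fh:
            ftp.retrbinary("RETR /var/app/data.db", fh.write)

if __name__ == "__main__":
    download()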
All of the solutions above are meant for Python. Using a shell script is possible, but a little more complicated. I would use Python for this kind of task, as you have a lot of network-related libraries available (FTP, sockets, HTTP clients, simple HTTP servers, WSGI libraries, etc.).

Remotely accessing sqlite3 in Django using a python script

I have a Django application that runs on an Apache server and uses an SQLite3 database. I want to access this database remotely using a Python script that first SSHes to the machine and then accesses the database.
After a lot of searching I understand that an SQLite DB cannot be accessed remotely. I don't want to download the DB folder over FTP and work on it locally; instead I want to access it remotely.
What could be the other possible ways to do this? I don't want to change the database, but am looking for alternative ways to achieve the connection.
Leaving aside the question of whether it is sensible to run a production Django installation against SQLite (it really isn't), you seem to have forgotten that, well, you are actually running Django. That means that Django can be the main interface to your data, and therefore you should write code in Django that enables this.
Luckily, there exists the Django REST Framework, which allows you to simply expose your data via HTTP methods like GET and POST. That would be a much better solution than accessing it via SSH.
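As a rough illustration of what that looks like with Django REST Framework, the Measurement model and its fields below are made up; substitute your own models:

# api.py -- sketch: expose a model over HTTP with Django REST Framework.
from rest_framework import routers, serializers, viewsets
from myapp.models import Measurement   # hypothetical model

class MeasurementSerializer(serializers.ModelSerializer):
    class Meta:
        model = Measurement
        fields = ["id", "value", "recorded_at"]

class MeasurementViewSet(viewsets.ModelViewSet):
    queryset = Measurement.objects.all()
    serializer_class = MeasurementSerializer

# In urls.py, register the viewset and include router.urls:
router = routers.DefaultRouter()
router.register(r"measurements", MeasurementViewSet)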
SQLite needs to access the provided file, so this is more of a filesystem question than a Python one. You have to find a way for SQLite and Python to access the remote directory, be it SFTP, SSHFS, FTP or whatever. It entirely depends on your remote and local OS. Preferably, mount the remote subdirectory on your local filesystem.
You would not need to make a copy of it, although if the file is large you might want to consider that option too.
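As an illustration, once the remote directory is mounted (for example with sshfs) the script can open the file as if it were local. The mount point below is a placeholder, and writing to SQLite over a network mount risks corruption, so treat it as read-only:

# remote_sqlite.py -- sketch: read the Django SQLite file through a locally
# mounted remote directory. Path is a placeholder; read-only use is safest.
import sqlite3

DB_PATH = "/mnt/remote_django/db.sqlite3"

with sqlite3.connect(DB_PATH) as conn:
    cur = conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    for (name,) in cur.fetchall():
        print(name)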

Using MongoLab Database service vs Custom web service with MongoDB running on AWS

I am looking for feasible solutions for my application to be backed by MongoDB. I am looking to host the MongoDB in the cloud with a Python-based server to interact with the DB and my app (either mobile or web). I am trying to understand what the architecture should look like.
Either I can host a MongoDB instance on the AWS cloud and have the server running there as well.
I also tried using MongoLab, and accessing it using HTTP requests seemed simple, but I am not sure if it exposes all the essential features of MongoDB (whatever I can do using a pymongo driver). Also, should I access the MongoLab service directly from my application, or should I still build a server in between?
I would prefer building a server in either case, as I want to do some processing before sending the data back to the application, but in that case I am not sure how my DB-server-app interaction should be designed.
Any suggestions?
One thing to consider is that you don't need to use MongoLab's REST API. You can connect directly via a driver as well.
So, if you need to implement business logic (which it sounds like you do), it makes sense to have a three-tier architecture with an app server connecting to your MongoLab database via one of the drivers. In your case it sounds like this would be pymongo.
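A minimal sketch of that connection with pymongo; the connection URI, database, and collection names are placeholders for whatever your MongoDB host gives you:

# db.py -- sketch: connect to a hosted MongoDB instance with pymongo.
from pymongo import MongoClient

client = MongoClient("mongodb://user:password@ds012345.example.com:27017/mydb")
db = client["mydb"]

def save_event(event):
    # Application-level processing can happen here before the insert.
    return db["events"].insert_one(event).inserted_id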
-will
