I have a remote database server on GoDaddy. I want to use it for database connectivity in my Django project, because I want to access the data stored in a PythonAnywhere database, and it is said that it is not accessible unless you have SSH access, which is provided only with a paid account.
Is there a way to configure the database values in settings.py so the database can be accessed remotely, e.g. from a Java program?
Thank you!
First of all, are you trying to access the GoDaddy database from within PythonAnywhere? (If so, you will have to deal with GoDaddy's security restrictions and rules on their database.) Or are you trying to access a PythonAnywhere database from GoDaddy or somewhere else? (In that case, you will have to deal with PythonAnywhere's security rules, which means you will have to use SSH tunnelling.)
In any case, if the database you need to access is only reachable over an SSH tunnel, check out the sshtunnel Python package; a rough sketch of how it is used follows below.
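Here is what that can look like with sshtunnel plus the mysqlclient (MySQLdb) driver. All hostnames, usernames, passwords and database names below are placeholders, not values tied to any particular account:

    # Open an SSH tunnel to the remote host, then connect to MySQL through it.
    import MySQLdb
    import sshtunnel

    with sshtunnel.SSHTunnelForwarder(
        ("ssh.example-host.com", 22),            # SSH endpoint of the remote service
        ssh_username="your_ssh_user",
        ssh_password="your_ssh_password",
        remote_bind_address=("your-database-hostname", 3306),
    ) as tunnel:
        # The tunnel forwards a local port to the remote MySQL server,
        # so the client connects to 127.0.0.1 on that forwarded port.
        conn = MySQLdb.connect(
            host="127.0.0.1",
            port=tunnel.local_bind_port,
            user="your_db_user",
            passwd="your_db_password",
            db="your_database",
        )
        cursor = conn.cursor()
        cursor.execute("SELECT 1")
        print(cursor.fetchone())
        conn.close()

The same pattern works for Django itself if you keep the tunnel open and point HOST and PORT in settings.py at the tunnel's local end.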
Related
I'm trying to use Google Cloud to store a MySQL database online which I can make requests to from my device in Python. I've established the connection with the database, but I'm not sure where to go from here. Is it now 'saved' on the server, and how would I go about accessing it from, say, a normal Python module? I'm a beginner, in case it wasn't obvious.
While running MongoDB, the warnings are:
WARNING: Access control is not enabled for the database
Read and write access to data and configuration is unrestricted.
For development, this isn't an issue; you're simply using an insecure database. It means that anyone who can reach your database could connect to it and execute any query they'd like, without restriction.
If you'd like to use a secure database, look into the --auth option when running mongod.
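For example, once mongod has been started with --auth and a database user has been created, clients have to authenticate. A minimal pymongo sketch (all names and passwords are placeholders):

    from pymongo import MongoClient

    # Connect with credentials instead of anonymously; authSource is the
    # database in which the user was created.
    client = MongoClient(
        "localhost",
        27017,
        username="app_user",
        password="app_password",
        authSource="mydatabase",
    )

    db = client["mydatabase"]
    print(db.list_collection_names())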
I have a Django application that runs on an Apache server and uses an SQLite3 db. I want to access this database remotely using a Python script that first SSHes to the machine and then accesses the database.
After a lot of searching I understand that an SQLite db cannot be accessed remotely. I don't want to download the db folder over FTP and work on a copy; instead I want to access it remotely.
What other possible ways are there to do this? I don't want to change the database, but I am looking for alternative ways to achieve the connection.
Leaving aside the question of whether it is sensible to run a production Django installation against SQLite (it really isn't), you seem to have forgotten that, well, you are actually running Django. That means that Django can be the main interface to your data, and therefore you should write code in Django that enables this.
Luckily, the Django REST Framework exists, which lets you expose your data via HTTP methods like GET and POST. That would be a much better solution than accessing the database over SSH.
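As a minimal sketch of that approach, assuming a made-up model called Measurement in an app called myapp:

    # Expose the model over HTTP with Django REST Framework.
    from rest_framework import routers, serializers, viewsets
    from myapp.models import Measurement


    class MeasurementSerializer(serializers.ModelSerializer):
        class Meta:
            model = Measurement
            fields = "__all__"


    class MeasurementViewSet(viewsets.ModelViewSet):
        # Provides list/detail GET plus POST, PUT, PATCH and DELETE endpoints.
        queryset = Measurement.objects.all()
        serializer_class = MeasurementSerializer


    # urls.py
    router = routers.DefaultRouter()
    router.register(r"measurements", MeasurementViewSet)
    # Include router.urls in urlpatterns, then query the data remotely over
    # HTTP(S) instead of touching the SQLite file directly.

That keeps all database access on the machine where the SQLite file lives, and the remote script only ever speaks HTTP.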
SQLite needs to access the provided file, so this is more of a filesystem question than a Python one. You have to find a way for SQLite and Python to access the remote directory, be it SFTP, SSHFS, FTP or whatever; it depends entirely on your remote and local OS. Preferably, mount the remote directory on your local filesystem.
You would not need to make a copy of it, although if the file is large you might want to consider that option too.
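A sketch of that approach, assuming the remote directory has been mounted locally with something like sshfs (the paths are placeholders):

    # Shell, run once:  sshfs user@remote-host:/path/to/project /mnt/remote_project
    import sqlite3

    # Once the mount exists, sqlite3 opens the file as if it were local.
    conn = sqlite3.connect("/mnt/remote_project/db.sqlite3")
    cursor = conn.cursor()
    cursor.execute("SELECT name FROM sqlite_master WHERE type='table'")
    print(cursor.fetchall())
    conn.close()

Be aware that SQLite's file locking is unreliable over network filesystems, so this is best treated as read-mostly access.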
Currently in my code, there are a few client machines doing processing and one server machine with a database. After an individual client machine finishes processing some data, it saves the data to a .txt file and SFTPs it over to the server.
The server has a job that just waits for these txt files and stores the data into a database.
I wanted to know of any more efficient approaches for this; I'm kind of a Python beginner. Is there a way to remotely save data into the server's database? How would I do so securely, etc.?
To be more specific, this project is a web app built with Django. I know how to use Django's standalone scripts to save data into a db; I preferably just need to know how to do so remotely.
Thank you.
Django databases can be remote; there is no requirement that they be on the same host as the Django server. Just set an appropriate HOST and PORT. See: https://docs.djangoproject.com/en/dev/ref/databases/#id10
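For example, a settings.py pointing at a remote MySQL server might look like this (host, credentials and database name are placeholders):

    # settings.py
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "NAME": "mydatabase",
            "USER": "db_user",
            "PASSWORD": "db_password",
            "HOST": "db.example.com",   # remote database server
            "PORT": "3306",
        }
    }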
Update: Based on your comment, I understand that you want to write Python/Django code that will run in the browser and connect to a remote database. There is no practical way of doing this. Have the data sent back to your server, and forward it on from there.
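A rough sketch of the client side of that pattern, replacing the SFTP'd .txt file with an HTTPS POST to an endpoint on your Django server (the URL and token are made up):

    import requests

    # Send the processed results straight to the Django app, which then
    # saves them into its database inside an ordinary view.
    payload = {"machine_id": "client-01", "results": [1.2, 3.4, 5.6]}
    response = requests.post(
        "https://yourserver.example.com/api/results/",
        json=payload,
        headers={"Authorization": "Token YOUR_API_TOKEN"},
        timeout=30,
    )
    response.raise_for_status()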
Update 2: If you are able to distribute software outside of the browser, you could have a small Django deployment on each client computer, which the user connects to through their browser and which could connect directly to the database. Obviously, security considerations apply.
I am looking for feasible solutions for backing my application with MongoDB. I want to host MongoDB in the cloud, with a Python-based server to interact with the DB and my app (either mobile or web). I am trying to understand what the architecture should look like.
One option is to host MongoDB on AWS and have the server running there as well.
I also tried MongoLab, and accessing it using HTTP requests seemed simple, but I am not sure whether it exposes all the essential features of MongoDB (whatever I can do using the pymongo driver). Also, should I access the MongoLab service directly from my application, or should I still build a server in between?
I would prefer to build a server in either case, as I want to do some processing before sending the data back to the application, but I am not sure how my DB-server-app interaction should be designed in that case.
Any suggestions?
One thing to consider is that you don't need to use MongoLab's REST API. You can connect directly via a driver as well.
So, if you need to implement business logic (which it sounds like you do), it makes sense to have a three-tier architecture, with an app server connecting to your MongoLab database via one of the drivers. In your case it sounds like this would be pymongo.
-will
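A minimal sketch of that middle tier, connecting directly to a MongoLab-hosted database with pymongo (the connection URI is a placeholder; MongoLab shows the real one for your deployment):

    from pymongo import MongoClient

    client = MongoClient("mongodb://db_user:db_password@ds012345.mongolab.com:12345/myapp")
    db = client.get_default_database()   # the database named in the URI

    def save_reading(reading):
        # Example of doing some server-side processing before persisting,
        # then returning the new document's id.
        reading["value_squared"] = reading["value"] ** 2
        return db.readings.insert_one(reading).inserted_id

The mobile/web app then talks only to this app server, never to the database directly.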