Postgres in Azure Flask Web App - python

I've got Flask up and running as per this tutorial
https://azure.microsoft.com/en-us/documentation/articles/web-sites-python-create-deploy-flask-app/
I want Postgres to be my database system, but I'm in a Web App, so I can't just log into the VM and install it. What can I do here?
Thanks

It seems that we don't have permission to install a PostgreSQL database on an Azure Web App server. You need to install PostgreSQL on an Azure VM.
For example:
A. If you created a Windows Server VM, refer to the link https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-windows-tutorial/ and connect to your VM.
From the page http://www.enterprisedb.com/products-services-training/pgdownload#windows, you can download the PostgreSQL Windows installer and run it on your VM, installing with the default configuration step by step.
The PostgreSQL default port is 5432. Make sure the Windows Server firewall allows access on that port, test the connection from your local host using the VM DNS name, and then you can continue developing.
B. If you created a Linux VM such as Ubuntu, refer to the link https://azure.microsoft.com/en-us/documentation/articles/virtual-machines-linux-tutorial/ and connect to your VM.
To install PostgreSQL, you can use Linux Package Management Tool.
Refer to the link: https://wiki.postgresql.org/wiki/Detailed_installation_guides
Ubuntu/Debian:
$ sudo apt-get update
$ sudo apt-get install postgresql
RedHat/CentOS:
Refer to the link: http://wiki.postgresql.org/wiki/YUM_Installation
You can use the SQLAlchemy ORM framework with Flask for PostgreSQL; please refer to http://flask.pocoo.org/docs/0.10/patterns/sqlalchemy/ and http://docs.sqlalchemy.org/en/rel_1_0/dialects/postgresql.html.
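As a minimal sketch (the VM DNS name, database name, and credentials below are placeholders, and psycopg2 is assumed to be installed), connecting plain SQLAlchemy to the PostgreSQL instance on your VM could look like this:

from sqlalchemy import create_engine, text

# Placeholder connection URL: swap in your own VM DNS name, port,
# database name, user, and password.
engine = create_engine(
    "postgresql+psycopg2://myuser:mypassword@myvm.cloudapp.net:5432/mydb"
)

# Quick smoke test: fetch the server version to confirm the VM is reachable.
with engine.connect() as conn:
    print(conn.execute(text("SELECT version()")).scalar())

You can then follow the Flask SQLAlchemy pattern from the first link to declare your models against this engine.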

Related

Server is freezing while trying to get a backup of MongoDB in Docker Compose

I have a back-end API server created with Python & Flask. I use MongoDB as my database. I build and run docker-compose every time I update my source code. Because of this, I always take a backup of my database before stopping and restarting the Docker containers.
From the beginning I have been using this command to take a backup into my default folder:
sudo docker-compose exec -T db mongodump --archive --gzip --db SuperAdminDB > backup.gz
This command worked well previously. I then restore the database after restarting docker-compose to bring the back end up with the updated code, using this command:
sudo docker-compose exec -T db mongorestore --archive --gzip < backup.gz
But as of today, when I try to take a backup while the containers are still running (as usual), the server freezes.
I am using an Amazon EC2 server with Ubuntu 20.04.
First, stop redirecting the output of the command. If you don't know whether it is working, you should be looking at all available information, which includes the output.
Then verify you can connect to your deployment using the mongo shell and run commands (see the pymongo sketch below).
If that succeeds, look at the server log and verify there is a record of the connection from mongodump.
If that works, try dumping other collections.
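A rough pymongo equivalent of that connectivity check (the connection URI and timeout are assumptions about how your docker-compose "db" service is exposed):

from pymongo import MongoClient
from pymongo.errors import ServerSelectionTimeoutError

# Placeholder URI: point it at the MongoDB port your compose file publishes.
client = MongoClient("mongodb://localhost:27017/", serverSelectionTimeoutMS=5000)
try:
    # "ping" forces a round trip, so a hung or refused server shows up here.
    client.admin.command("ping")
    print("Reachable; collections:", client["SuperAdminDB"].list_collection_names())
except ServerSelectionTimeoutError as exc:
    print("Could not reach MongoDB:", exc)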
After digging for three days, I found that the real culprit is Apache.
I recently installed Apache to host my front end as well. While Apache is running, the server won't let me dump the MongoDB backup; somehow Apache was conflicting with Docker.
My solution:
1. Stop the Apache service
sudo service apache2 stop
2. Then take the MongoDB backup
sudo docker-compose exec -T db mongodump --archive --gzip --db SuperAdminDB > backup.gz

Not able to download pyodbc to Azure App Service

I'm using Azure App Service to host my website. The website worked fine on my localhost, using a SQLite database and SQLAlchemy. Now I am trying to switch to Azure SQL DB using this: https://gist.github.com/timmyreilly/f4a351eda5dd45aa9d56411d27573d7c
I followed the directions, but I'm getting this error. I looked up the error and found this: pyodbc - error while running application within a container, but it wasn't able to help, because the solution there said to run sudo apt install unixodbc-dev, and the Azure CLI doesn't let me use sudo, so I'm not sure how I can do this. Can you help me? What should I do?
2019-02-15T00:55:28.174067202Z File "/home/site/wwwroot/antenv/lib/python3.7/site-packages/sqlalchemy/connectors/pyodbc.py", line 38, in dbapi
2019-02-15T00:55:28.174070902Z return __import__('pyodbc')
2019-02-15T00:55:28.174195702Z ImportError: libodbc.so.2: cannot open shared object file: No such file or directory
According to your error information and @IvanYang's comments, you deployed your Python app on Azure App Service for Linux Containers, which is based on Docker.
So you can refer to the official document SSH support for Azure App Service on Linux to connect to the Linux system of your app and install the missing unixodbc-dev package via sudo apt install unixodbc-dev, which will get your app working until the app service is next restarted.
Installing unixodbc-dev this way is temporary for a Docker container; see the existing SO thread Install unixodbc-dev for a Flask web app on Azure App Service. The only way to make it stick is to add the content below to your Dockerfile, or to use a Docker image that already has the required packages installed when you create an App Service instance on Linux.
# Add unixodbc support
RUN apt-get update \
&& apt-get install -y --no-install-recommends unixodbc-dev
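Once the driver library is present, the SQLAlchemy side is just a connection URL on the mssql+pyodbc dialect. A rough sketch (the server, database, credentials, and ODBC driver name below are placeholders; use whatever driver is actually installed in your container):

import urllib.parse
from sqlalchemy import create_engine, text

# Placeholder ODBC settings: replace server, database, user, password,
# and the driver name with your own values.
params = urllib.parse.quote_plus(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;"
    "UID=myuser;"
    "PWD=mypassword"
)
engine = create_engine("mssql+pyodbc:///?odbc_connect=" + params)

# Smoke test: the ImportError about libodbc.so.2 would surface here if the
# system library is still missing.
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())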

psycopg2.connect failing with error 'connection refused' (Django app+db hosted on separate VMs)

I have a Django app where the app and database reside in two separate VMs. I use Azure. In trying to connect to a PostgreSQL backend hosted in a separate Azure VM, I'm writing: conn = psycopg2.connect("dbname='dbname' host='dnsname'"), where the value of host was taken from the Azure portal.
I have also tried the virtual IP address shown in the portal as the host. Both approaches fail and I get a connection error: could not connect to server: Connection refused. Is the server running on host "myapp.cloudapp.net" (23.132.341.192) and accepting TCP/IP connections on port 5432? Please note that I changed the IP address when pasting the error here.
The Azure VM hosting the PostgreSQL database has port 5432 added to its endpoints. Moreover, the app to which this PostgreSQL backend belongs is a Django/Python app. My VMs run Ubuntu 14.04. In setting up PostgreSQL, I ran the following installation commands on both the app and database VMs separately:
sudo apt-get update
sudo apt-get install postgresql postgresql-contrib
sudo apt-get install postgresql-server-dev-all
sudo apt-get install libpq-dev
And even though both the app and database VMs have all of the above installed, the database itself was only created on the latter VM. I didn't create a separate user; I instead did:
sudo su postgres
psql -d postgres -U postgres
And then
CREATE DATABASE dbname;
How do I fix the psycopg2.connect error? Ask for more information if needed.
To run PostgreSQL in a separate Azure VM, you need to configure several settings, both on the Azure VM side and in your PostgreSQL service.
1. Set listen_addresses = '*' in /etc/postgresql/9.4/main/postgresql.conf, so that clients other than localhost can reach your PostgreSQL service.
2. Add the following line as the first line of /etc/postgresql/9.4/main/pg_hba.conf. It allows access to all databases for all users with an encrypted password:
# TYPE DATABASE USER CIDR-ADDRESS METHOD
host all all 0.0.0.0/0 md5
3. Run sudo service postgresql restart to restart the service.
4. Log in to the Azure management portal and add a rule on the Endpoints page of your VM, mapping public port 5432 to private port 5432 (a quick psycopg2 test is sketched below).
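A minimal sketch of that test, run from the app VM (the host, database name, user, and password are placeholders; with the md5 rule above, the role you connect as must have a password set):

import psycopg2

try:
    conn = psycopg2.connect(
        host="myapp.cloudapp.net",  # the VM DNS name exposed via the 5432 endpoint
        port=5432,
        dbname="dbname",
        user="postgres",
        password="yourpassword",    # md5 auth requires a password on the role
    )
    print("Connected, server version:", conn.get_parameter_status("server_version"))
    conn.close()
except psycopg2.OperationalError as exc:
    print("Connection failed:", exc)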

Django not connecting to RDS MySQL

I am trying to deploy my Django app on EC2 and am using RDS (MySQL) as the backend.
However, when I try to run gunicorn with this, the server does not respond, and if I run python manage.py dbshell I get the error:
CommandError: You appear not to have the 'mysql' program installed or on your path.
It seems to be trying to connect to a local MySQL server when I have clearly changed the host setting in the DATABASES dict to my RDS deployment.
Things I have done to solve and troubleshoot this:
1. I have changed the database settings so that they point to the RDS database (a sketch of what this looks like is shown below).
2. Both the EC2 and RDS instances are in the same AWS zone (I read this might have been the issue).
3. Permissions could have been a problem, but I checked by logging in from my local machine, and I was able to access the MySQL command line on my server.
4. Checked the database settings which Django is picking up at runtime; they are correct.
5. Installed MySQL-python; it would not install without a local MySQL installation, so I googled that, and people suggested installing libmysqlclient-dev first, after which MySQL-python installs.
Has someone faced a similar problem? What am I doing wrong here?
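For reference, the settings change in point 1 typically looks like the following; the RDS endpoint, database name, and credentials are placeholders:

# settings.py (sketch): point Django's MySQL backend at the RDS endpoint.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "mydatabase",
        "USER": "myuser",
        "PASSWORD": "mypassword",
        "HOST": "mydbinstance.abcdefgh1234.us-east-1.rds.amazonaws.com",
        "PORT": "3306",
    }
}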

Remote query in MySQL using python

How can I query a remote MySQL database with a select query and insert the results into my local MySQL database using Python?
Some tips to keep in mind:
MySQL by default does not listen on a public IP address. This means that even if the server is running, you may not be able to access it remotely.
Even if the server has been reconfigured to listen on a public IP address, your user account needs to be granted permission to connect from remote clients.
Once you have those two taken care of, make sure you are able to connect to the server. Use the mysql client:
mysql -h remote.box.com -u yourusername -p
Next, you need to install the MySQL drivers for Python.
On Ubuntu/Kubuntu/Debian: sudo apt-get install python-mysqldb
On RedHat/Fedora/CentOS: sudo yum install MySQL-python
On Windows: http://www.lfd.uci.edu/~gohlke/pythonlibs/ (search for MySQLdb)
On Mac: sudo pip install mysql-python
Finally, read this tutorial, which will get you started.
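Putting the pieces together, a rough sketch of the remote-select, local-insert flow with MySQLdb (host names, credentials, database, and table/column names are all placeholders):

import MySQLdb

# Placeholder connections: one to the remote server, one to the local server.
remote = MySQLdb.connect(host="remote.box.com", user="yourusername",
                         passwd="yourpassword", db="remotedb")
local = MySQLdb.connect(host="localhost", user="localuser",
                        passwd="localpassword", db="localdb")

remote_cur = remote.cursor()
local_cur = local.cursor()

# Select from the remote table and insert each row into the local table.
remote_cur.execute("SELECT id, name FROM some_table")
for row in remote_cur.fetchall():
    local_cur.execute("INSERT INTO some_table (id, name) VALUES (%s, %s)", row)

local.commit()
remote.close()
local.close()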
