I have a Python app that needs to connect to a MongoDB StatefulSet in Kubernetes.
I made a Helm chart for my app, with MongoDB as a dependency.
When data needs to be written to the DB, it raises this error:
pymongo.errors.OperationFailure: Authentication failed., full error: {'operationTime': Timestamp(1629805756, 1), 'ok': 0.0, 'errmsg': 'Authentication failed.', 'code': 18, 'codeName': 'AuthenticationFailed', '$clusterTime': {'clusterTime': Timestamp(1629805756, 1), 'signature': {'hash': b'\xde\x1e\t\x93:H\xd1\x9b\x9b\x15\xacu#S\xe9\x02\xd6#`\xae', 'keyId': 6999961677823213573}}}
The python configs for mongo:
from flask import Flask, request, render_template, jsonify
from pymongo import MongoClient
import os
from dotenv import load_dotenv, find_dotenv

# Load the connection string from the environment
load_dotenv(find_dotenv())
MONGODB_URI = os.getenv('MONGODB_URI')

app = Flask(__name__)

# Connect and grab the database and collections
client = MongoClient(MONGODB_URI)
db = client.calocalcdb
collection = db.calocalc_collection
users = db.users
The Mongo URI is stored in a Secret, and the application should get it via an environment variable.
the program fails on this line:
user_id = users.insert_one(user)
When I run echo $MONGODB_URI inside the application pod, it returns nothing.
In your deployment YAML you need something that maps the secret to the environment variable, for example:
spec:
  template:
    spec:
      containers:
        - name: <container name>
          env:
            - name: MONGODB_URI
              valueFrom:
                secretKeyRef:
                  name: <secret name>
                  key: <key inside the secret>
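Until the secret is mapped, the app only fails when the first write happens. A minimal sketch of failing fast at startup instead, reusing the MONGODB_URI variable from the question (everything else here is just illustration):

import os
from pymongo import MongoClient

MONGODB_URI = os.getenv("MONGODB_URI")
if not MONGODB_URI:
    # Surfaces a missing or mis-mapped secret in `kubectl logs` at pod startup,
    # instead of failing later on the first database operation.
    raise RuntimeError("MONGODB_URI is not set; check the secretKeyRef in the deployment")

client = MongoClient(MONGODB_URI)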
Related
I have a web app which connects to Google Cloud SQL and fetches data. It works properly on localhost, i.e. it can fetch data from the Cloud SQL database. But when I deploy the app on Google App Engine, the home page loads, and when I click the function that fetches data from the database, it doesn't work.
Here is the Logs in the logs explorer (only relevant parts):
{
"textPayload": "[error] 33#33: *496 upstream prematurely closed connection while reading response header from upstream, server: , request: \"POST /display_all HTTP/1.1\",
"logName": "/logs/appengine.googleapis.com%2Fnginx.error",
}
Here is the code:
from flask import Flask, render_template, request, url_for
from flask_sqlalchemy import SQLAlchemy
import csv
import os
import urllib
import datetime
import time
app = Flask(__name__)
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False  # boolean, not the string 'False'
PASSWORD ="mypwd"
PUBLIC_IP_ADDRESS ="myip"
DBNAME ="mydb"
PROJECT_ID ="myprojectid"
INSTANCE_NAME ="myinstance"
# configuration
app.config["SECRET_KEY"] = "mysecretkey"
app.config["SQLALCHEMY_DATABASE_URI"]= f"mysql+mysqldb://root:{PASSWORD}#{PUBLIC_IP_ADDRESS}:3306/{DBNAME}?unix_socket=/cloudsql/connectionname"
db = SQLAlchemy(app)
class table(db.Model):
_id = db.Column('_id', db.Integer, primary_key = True)
time = db.Column(db.String(50))
def __init__(self, time):
self.time = time
@app.route('/')
def hello_world():
return render_template('index.html')
@app.route('/display_all', methods=['GET', 'POST'])
def search_all():
ans = db.session.execute("select * from table LIMIT 10;")
return render_template('display_table.html', ans = ans)
if __name__ == '__main__':
db.create_all()
app.run()
When I looked up the error, this is what I found (https://github.com/owncloud/client/issues/5706). Is this what's wrong? How do I turn off buffering if that's the case?
Am I missing something with the app.yaml file or requirements.txt file?
Here they are:
app.yaml file:
In the line with cloud_sql_instances what would be "MY-DATABASE" here? Would it be the database name or the instance name?
runtime: python
env: flex
entrypoint: gunicorn -b :$PORT app:app

runtime_config:
  python_version: 3

# This sample incurs costs to run on the App Engine flexible environment.
# The settings below are to reduce costs during testing and are not appropriate
# for production use. For more information, see:
# https://cloud.google.com/appengine/docs/flexible/python/configuring-your-app-with-app-yaml

# Enable a Unix domain socket
beta_settings:
  cloud_sql_instances: <MY-PROJECT>:<INSTANCE-REGION>:<MY-DATABASE>

manual_scaling:
  instances: 1

resources:
  cpu: 1
  memory_gb: 0.5
  disk_size_gb: 10
requirements.txt
Flask==1.1.2
Flask-SQLAlchemy==2.4.1
Jinja2==3.0.1
mysql-connector-python==8.0.23
mysqlclient==2.0.3
SQLAlchemy==1.3.23
gunicorn==20.1.0
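For reference, the cloud_sql_instances value is the instance connection name in the form <project>:<region>:<instance>, not the database name, and when the Unix socket is used the SQLAlchemy URI normally drops the host and port entirely. A sketch under those assumptions, reusing placeholder values from the question:

PASSWORD = "mypwd"
DBNAME = "mydb"
# Same <project>:<region>:<instance> string used for cloud_sql_instances in app.yaml
CONNECTION_NAME = "myprojectid:my-region:myinstance"

# Unix-socket form: no host/port; the socket path carries the instance connection name.
SQLALCHEMY_DATABASE_URI = (
    f"mysql+mysqldb://root:{PASSWORD}@/{DBNAME}"
    f"?unix_socket=/cloudsql/{CONNECTION_NAME}"
)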
I'm having trouble connecting to Airflow's metadata database from Python.
The connection is set, and I can query the metadata database, using the UI's Ad hoc query window.
If I try to use the same connection from Python, it doesn't work.
In detail, I have two connections set up, and both work from the UI:
Connection 1:
Conn type: MYSQL
Host: airflow-sqlproxy-service
Schema: composer-1-6-1-airflow-1-10-0-57315b5a
Login: root
Connection 2:
Conn type: MYSQL
Host: 127.0.0.1
Schema: composer-1-6-1-airflow-1-10-0-57315b5a
Login: root
As I said, both of them work from the UI (Data Profiling -> Ad Hoc Query).
But whenever I create a DAG and try to trigger it from a PythonOperator using various hooks, I'm always getting the same error message:
Sample code 1:
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.mysql_operator import MySqlOperator
from airflow.hooks.mysql_hook import MySqlHook

class ReturningMySqlOperator(MySqlOperator):
    def execute(self, context):
        self.log.info('Executing: %s', self.sql)
        hook = MySqlHook(mysql_conn_id=self.mysql_conn_id,
                         schema=self.database)
        return hook.get_records(
            self.sql,
            parameters=self.parameters)

with DAG(
    "a_JSON_db",
    start_date=datetime(2020, 11, 19),
    max_active_runs=1,
    schedule_interval=None,
    # catchup=False  # enable if you don't want historical dag runs to run
) as dag:

    t1 = ReturningMySqlOperator(
        task_id='basic_mysql',
        mysql_conn_id='airflow_db_local',
        sql="select * from xcom")

    def get_records(**kwargs):
        ti = kwargs['ti']
        xcom = ti.xcom_pull(task_ids='basic_mysql')
        string_to_print = 'Value in xcom is: {}'.format(xcom)
        # Get data in your logs
        logging.info(string_to_print)

    t2 = PythonOperator(
        task_id='records',
        provide_context=True,
        python_callable=get_records)
Sample code 2:
def get_dag_ids(**kwargs):
    mysql_hook = MySqlOperator(task_id='query_table_mysql', mysql_conn_id="airflow_db",
                               sql="SELECT MAX(execution_date) FROM task_instance WHERE dag_id = 'Test_Dag'")
    MySql_Hook = MySqlHook(mysql_conn_id="airflow_db_local")
    records = MySql_Hook.get_records(sql="SELECT MAX(execution_date) FROM task_instance")
    print(records)

t1 = PythonOperator(
    task_id="get_dag_nums",
    python_callable=get_dag_ids,
    provide_context=True)
The error message is this:
ERROR - (2003, "Can't connect to MySQL server on '127.0.0.1' (111)")
I looked up the config, and I found this env_variable:
core sql_alchemy_conn mysql+mysqldb://root:@127.0.0.1/composer-1-6-1-airflow-1-10-0-57315b5a (env var)
I tried to use a Postgres connection with this URI as well; same error message as above.
I'm starting to think that GCP Airflow's IAP is blocking me from accessing it from a Python DAG.
My Airflow composer version is the following:
composer-1.6.1-airflow-1.10.0
Can anyone help me?
Only the service account associated with Composer has read access to the tenant project that holds the metadata database. This includes connecting from the underlying Kubernetes (Airflow) cluster to the tenant project hosting Airflow's Cloud SQL instance.
The accepted connection methods are SQLAlchemy and using the Kubernetes cluster as a proxy. Connections from GKE will use the airflow-sqlproxy-service.default service discovery name for connecting.
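The SQLAlchemy route amounts to pointing the engine at that same proxy service instead of 127.0.0.1. A minimal sketch, assuming the root/no-password setup shown in the question's sql_alchemy_conn value:

from sqlalchemy import create_engine, text

# Hypothetical engine built against the proxy's service discovery name,
# with the database name taken from the question.
engine = create_engine(
    "mysql+mysqldb://root:@airflow-sqlproxy-service.default:3306/"
    "composer-1-6-1-airflow-1-10-0-57315b5a")

with engine.connect() as conn:
    result = conn.execute(text("SELECT MAX(execution_date) FROM task_instance"))
    print(result.fetchall())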
You also have a CLI option through GKE. Use the following command to run a temporary deployment and pod using the mysql:latest image, which comes with the mysql CLI tool preinstalled:
$ kubectl run mysql-cli-tmp-deployment --generator=run-pod/v1 --rm --stdin --tty --image mysql:latest -- bash
Once in the shell, we can use the mysql tool to open an interactive session with the Airflow database:
$ mysql --user root --host airflow-sqlproxy-service --database airflow-db
Once the session is open, standard queries can be executed against the database.
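Applied to the DAG from the question, the fix is to make the hook's connection resolve the proxy service rather than 127.0.0.1. A sketch, assuming a connection named airflow_db_proxy has been created with airflow-sqlproxy-service as its host (the connection name is hypothetical):

import logging
from datetime import datetime

from airflow import DAG
from airflow.hooks.mysql_hook import MySqlHook
from airflow.operators.python_operator import PythonOperator

def query_metadata_db(**kwargs):
    # 'airflow_db_proxy' is a hypothetical Airflow connection whose host is
    # airflow-sqlproxy-service (like Connection 1 above), not 127.0.0.1.
    hook = MySqlHook(mysql_conn_id='airflow_db_proxy')
    records = hook.get_records("SELECT MAX(execution_date) FROM task_instance")
    logging.info("Result: %s", records)

with DAG("query_metadata_example",
         start_date=datetime(2020, 11, 19),
         schedule_interval=None) as dag:
    t1 = PythonOperator(
        task_id='query_metadata_db',
        provide_context=True,
        python_callable=query_metadata_db)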
Problem Facing
I am trying to set up CRUD with Flask and PyMongo against a MongoDB collection. I can fetch the collections, but I am not able to perform any CRUD operation on them. I have already created a user with full admin privileges and added the credentials to my MongoDB URI.
Code
import os
import pymongo
from dotenv import load_dotenv

load_dotenv()
PASSWORD = os.getenv("MONGODB_PASSWORD")
DB_NAME = os.getenv("MONGODB_DB_NAME")
USERNAME = os.getenv('MONGODB_USERNAME')

client = pymongo.MongoClient(
    f"mongodb+srv://{USERNAME}:{PASSWORD}@cluster0-7hkix.mongodb.net/{DB_NAME}?retryWrites=true&w=majority")
db = client['cms_user']
user_info = db['UserInfo']
rate_data = db['rate_data']
print(user_info, rate_data)

send_dict = {
    "Name": 'a',
    "Mail": 'asda',
    "age": 2,
    "address": 'asdasd',
    "uname": 'asdasd',
    "pwd": 'a',
    "cpwd": 'a'
}

try:
    user_info.insert(send_dict)
except pymongo.errors.OperationFailure as e:
    print(e.code, e.details)
Output
The output I am getting after running this:
Collection(Database(MongoClient(host=['cluster0-shard-00-01-7hkix.mongodb.net:27017', 'cluster0-shard-00-02-7hkix.mongodb.net:27017', 'cluster0-shard-00-00-7hkix.mongodb.net:27017'], document_class=dict, tz_aware=False, connect=True, retrywrites=True, w='majority', authsource='admin', replicaset='Cluster0-shard-0', ssl=True), 'cms_user'), 'UserInfo') Collection(Database(MongoClient(host=['cluster0-shard-00-01-7hkix.mongodb.net:27017', 'cluster0-shard-00-02-7hkix.mongodb.net:27017', 'cluster0-shard-00-00-7hkix.mongodb.net:27017'], document_class=dict, tz_aware=False, connect=True, retrywrites=True, w='majority', authsource='admin', replicaset='Cluster0-shard-0', ssl=True), 'cms_user'), 'rate_data')
8000 {'ok': 0, 'errmsg': 'bad auth Authentication failed.', 'code': 8000, 'codeName': 'AtlasError'}
Mongo DB User details
I have created a user on MongoDB Atlas with full read/write access privileges, as shown here:
user access on db
System Configuration
Python version - Python 3.6.8
OS - Windows 10 Home
pymongo - version 3.10.1 with srv support
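One frequent cause of the AtlasError 8000 "bad auth" response, besides plainly wrong credentials, is a username or password containing reserved characters (@, :, /, %) that are not URL-escaped in the SRV URI. A sketch of escaping them with the standard library; this is a guess at the cause rather than a confirmed fix:

import os
from urllib.parse import quote_plus

import pymongo
from dotenv import load_dotenv

load_dotenv()

# Escape the raw credentials so reserved URI characters don't break the connection string.
username = quote_plus(os.getenv("MONGODB_USERNAME"))
password = quote_plus(os.getenv("MONGODB_PASSWORD"))
db_name = os.getenv("MONGODB_DB_NAME")

client = pymongo.MongoClient(
    f"mongodb+srv://{username}:{password}@cluster0-7hkix.mongodb.net/{db_name}"
    "?retryWrites=true&w=majority")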
I'm having trouble with the MongoHQ Heroku add-on. Locally my app works, and the environment variable is present and well-formed on Heroku. However, when I attempt to access the DB it throws an error: OperationFailure: database error: unauthorized db:my_database ns:my_database.cars lock type:0 client:128.62.187.133. If I hard-code the connection string from MongoHQ and run locally, I get the same error.
My app is below:
import os
import datetime
from flask import Flask
from flask import g
from flask import jsonify
from flask import json
from flask import request
from flask import url_for
from flask import redirect
from flask import render_template
from flask import make_response
import pymongo
from pymongo import Connection
from bson import BSON
from bson import json_util
app = Flask(__name__)
def mongo_conn():
    # Format: MONGOHQ_URL: mongodb://<user>:<pass>@<base_url>:<port>/<url_path>
    if os.environ.get('MONGOHQ_URL'):
        return Connection(os.environ['MONGOHQ_URL'])
    else:
        return Connection()

@app.route('/', methods=['GET', 'POST'])
def hello():
    # Get your DB
    connection = mongo_conn()
    db = connection.my_database
    # Create an object
    car = {"brand": "Ford",
           "model": "Mustang",
           "date": datetime.datetime.utcnow()}
    # Get your collection
    cars = db.cars  # crashes
    # Insert it
    cars.insert(car)
...
Edit: MongoHQ support helped me. The problem was that I was calling my database my_database instead of the actual DB name given to me by the MongoHQ add-on, e.g. db = connection.app52314314. That change fixed it.
You likely need to run the authenticate command against the DB directly after you connect.
Try something like this:
db.authenticate([USER], [PASSWORD])
If that doesn't work, feel free to email support@mongohq.com and we can help you out with your specific DB.
You don't need to do all that. You can simply:
import os
from pymongo import MongoClient

client = MongoClient(os.environ['MONGOHQ_URL'])
mongo_db = client.get_default_database()
It will automatically authenticate you, and connect to the provisioned database, the <url_path> part of your connection url.
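A quick usage sketch on top of that (the collection and document are made up for illustration):

# get_default_database() resolves the database from the <url_path> part of the URI,
# so collections can be used directly afterwards.
cars = mongo_db.cars
cars.insert_one({"brand": "Ford", "model": "Mustang"})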
I am attempting to deploy a Flask app to Heroku. I'm using Peewee as an ORM for a Postgres database. When I follow the standard Heroku steps to deploying Flask, the web process crashes after I enter heroku ps:scale web=1. Here's what the logs say:
Starting process with command `python app.py`
/app/.heroku/venv/lib/python2.7/site-packages/peewee.py:2434: UserWarning: Table for <class 'flask_peewee.auth.User'> ("user") is reserved, please override using Meta.db_table
cls, _meta.db_table,
Traceback (most recent call last):
  File "app.py", line 167, in <module>
    auth.User.create_table(fail_silently=True)
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 2518, in create_table
    if fail_silently and cls.table_exists():
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 2514, in table_exists
    return cls._meta.db_table in cls._meta.database.get_tables()
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 507, in get_tables
    ORDER BY c.relname""")
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 313, in execute
    cursor = self.get_cursor()
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 310, in get_cursor
    return self.get_conn().cursor()
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 306, in get_conn
    self.connect()
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 296, in connect
    self.__local.conn = self.adapter.connect(self.database, **self.connect_kwargs)
  File "/app/.heroku/venv/lib/python2.7/site-packages/peewee.py", line 199, in connect
    return psycopg2.connect(database=database, **kwargs)
  File "/app/.heroku/venv/lib/python2.7/site-packages/psycopg2/__init__.py", line 179, in connect
    connection_factory=connection_factory, async=async)
psycopg2.OperationalError: could not connect to server: No such file or directory
    Is the server running locally and accepting connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
Process exited with status 1
State changed from starting to crashed
I've tried a bunch of different things to get Heroku to allow my app to talk to a Postgres db, but haven't had any luck. Is there an easy way to do this? What do I need to do to configure Flask/Peewee so that I can use a db on Heroku?
According to the Peewee docs, you don't want to use Proxy() unless your local database driver is different than your remote one (i.e. locally, you're using SQLite and remotely you're using Postgres). If, however, you are using Postgres both locally and remotely it's a much simpler change. In this case, you'll want to only change the connection values (database name, username, password, host, port, etc.) at runtime and do not need to use Proxy().
Peewee has a built-in URL parser for database connections. Here's how to use it:
import os

from peewee import *
from playhouse.db_url import connect

db = connect(os.environ.get('DATABASE_URL'))

class BaseModel(Model):
    class Meta:
        database = db
In this example, peewee's db_url module reads the environment variable DATABASE_URL and parses it to extract the relevant connection variables. It then creates a PostgresqlDatabase object with those values.
Locally, you'll want to set DATABASE_URL as an environment variable. You can do this according to the instructions of whatever shell you're using. Or, if you want to use the Heroku toolchain (launch your local server using heroku local) you can add it to a file called .env at the top level of your project. For the remote setup, you'll want to add your database URL as a remote Heroku environment variable. You can do this with the following command:
heroku config:set DATABASE_URL=postgresql://myurl
You can find that URL by going into Heroku, navigating to your database, and clicking on "database credentials". It's listed under URI.
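To make that concrete, here is a brief usage sketch on top of the BaseModel above (the Person model and its fields are made up for illustration):

class Person(BaseModel):
    # Hypothetical model used only to show the pattern.
    name = CharField(unique=True)
    age = IntegerField()

# With DATABASE_URL set, peewee connects to whatever backend the URL describes.
db.connect()
db.create_tables([Person], safe=True)
Person.create(name="Ada", age=36)
db.close()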
Are you parsing the DATABASE_URL environment variable? It will look something like this:
postgres://username:password@host:port/database_name
So you will want to pull that in and parse it before you open a connection to your database. Depending on how you've declared your database (in your config or next to your wsgi app) it might look like this:
import os
import urlparse

urlparse.uses_netloc.append('postgres')
url = urlparse.urlparse(os.environ['DATABASE_URL'])

# for your config
DATABASE = {
    'engine': 'peewee.PostgresqlDatabase',
    'name': url.path[1:],
    'password': url.password,
    'host': url.hostname,
    'port': url.port,
}
See the notes here: https://devcenter.heroku.com/articles/django
First, set a flag on Heroku so the app knows which environment it is running in:

heroku config:set HEROKU=1
import os
import urlparse
import psycopg2
from flask import Flask
from flask_peewee.db import Database

if 'HEROKU' in os.environ:
    DEBUG = False
    urlparse.uses_netloc.append('postgres')
    url = urlparse.urlparse(os.environ['DATABASE_URL'])
    DATABASE = {
        'engine': 'peewee.PostgresqlDatabase',
        'name': url.path[1:],
        'user': url.username,
        'password': url.password,
        'host': url.hostname,
        'port': url.port,
    }
else:
    DEBUG = True
    DATABASE = {
        'engine': 'peewee.PostgresqlDatabase',
        'name': 'framingappdb',
        'user': 'postgres',
        'password': 'postgres',
        'host': 'localhost',
        'port': 5432,
        'threadlocals': True
    }

app = Flask(__name__)
app.config.from_object(__name__)
db = Database(app)
Modified coleifer's answer to answer hasenj's comment.
Please mark one of these as the accepted answer.
I have managed to get my Flask app which uses Peewee working on Heroku using the below code:
# persons.py
import os
from peewee import *

db_proxy = Proxy()

# Define your models here
class Person(Model):
    name = CharField(max_length=20, unique=True)
    age = IntegerField()

    class Meta:
        database = db_proxy

# Import modules based on the environment.
# The HEROKU value first needs to be set on Heroku
# either through the web front-end or through the command
# line (if you have Heroku Toolbelt installed, type the following:
# heroku config:set HEROKU=1).
if 'HEROKU' in os.environ:
    import urlparse, psycopg2
    urlparse.uses_netloc.append('postgres')
    url = urlparse.urlparse(os.environ["DATABASE_URL"])
    db = PostgresqlDatabase(database=url.path[1:], user=url.username,
                            password=url.password, host=url.hostname, port=url.port)
    db_proxy.initialize(db)
else:
    db = SqliteDatabase('persons.db')
    db_proxy.initialize(db)

if __name__ == '__main__':
    db_proxy.connect()
    db_proxy.create_tables([Person], safe=True)
You should already have a Postgres database add-on attached to your app; you can do this via the command line or through the web front-end. Assuming the database is attached and you have deployed your app with the above changes, log in to Heroku and create the table(s):
$ heroku login
$ heroku run bash
$ python persons.py
Check that the table was created:
$ heroku pg:psql
your_app_name::DATABASE=> \dt
You then import this file (persons.py in this example) in another Python script, e.g. a request handler. You need to manage the database connection explicitly:
# server.py
from flask import g
from persons import db_proxy

@app.before_request
def before_request():
    g.db = db_proxy
    g.db.connect()

@app.after_request
def after_request(response):
    g.db.close()
    return response
…
References:
https://devcenter.heroku.com/articles/heroku-postgresql
http://peewee.readthedocs.org/en/latest/peewee/database.html#dynamically-defining-a-database
http://peewee.readthedocs.org/en/latest/peewee/example.html#establishing-a-database-connection
https://stackoverflow.com/a/20131277/3104465
https://gist.github.com/fprieur/9561148