I'm trying to connect my DB for my Discord bot.
I decided to use the mysqlx module for Python, but when I run the code on Heroku it raises an error.
I'm using GitHub as the deployment method.
I checked the module version. It works locally. I also checked the get_session definition in my local installation.
get_session definition in my local installation:
def get_session(*args, **kwargs):
    """Creates a Session instance using the provided connection data.

    Args:
        *args: Variable length argument list with the connection data used
            to connect to a MySQL server. It can be a dictionary or a
            connection string.
        **kwargs: Arbitrary keyword arguments with connection data used to
            connect to the database.

    Returns:
        mysqlx.Session: Session object.
    """
    settings = _get_connection_settings(*args, **kwargs)
    return Session(settings)
I have another app hosted on Heroku that uses the same module and deployment method and works perfectly. I can't understand why it doesn't work in this Heroku app.
Heroku error:
File "/app/bot.py", line 25, in <module>
session = mysqlx.get_session({
AttributeError: module 'mysqlx' has no attribute 'get_session'
The code:
import mysqlx

session = mysqlx.get_session({
    "host": "",
    "port": 3306,
    "user": "",
    "password": ""
})
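For what it's worth, a quick diagnostic sketch (temporary lines, not a fix) can show which mysqlx module Heroku actually imports; if a different distribution than the local one shadows it, get_session would be missing:

import mysqlx

# Show which file backs the module and whether the expected symbols exist.
print(getattr(mysqlx, "__file__", "no __file__"))
print(hasattr(mysqlx, "get_session"))
print([name for name in dir(mysqlx) if "session" in name.lower()])

If the printed path points at an unexpected package, this app's requirements likely pull in a different distribution than the working app's.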
Related
I'm running a containerized Airflow project which loads API data to Azure Blob or Data Lake. I'm currently having trouble getting Airflow to identify my connections. I've tried several methods to resolve the issue, but I still haven't made progress in fixing this problem.
I've tried manually adding a connection in the Airflow UI, inputting

conn_id="azure_data_lake",
conn_type="Azure Blob Storage",
host="",
login=StorageAccountName,
password=StorageAccountKey,
port=""

however, once I run the DAG I get this error. I've tried running airflow db reset and airflow db init.
File "/opt/airflow/plugins/operators/coinmarketcap_toAzureDataLake.py", line 60, in upload_to_azureLake
wasb_hook = WasbHook(self.azure_conn_id)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/microsoft/azure/hooks/wasb.py", line 65, in __init__
self.connection = self.get_conn()
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/microsoft/azure/hooks/wasb.py", line 71, in get_conn
return BlockBlobService(account_name=conn.login, account_key=conn.password, **service_options)
File "/home/airflow/.local/lib/python3.8/site-packages/sqlalchemy/orm/attributes.py", line 365, in __get__
retval = self.descriptor.__get__(instance, owner)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/connection.py", line 213, in get_password
return fernet.decrypt(bytes(self._password, 'utf-8')).decode()
File "/home/airflow/.local/lib/python3.8/site-packages/cryptography/fernet.py", line 194, in decrypt
raise InvalidToken
cryptography.fernet.InvalidToken
If I add this programmatically via a Python script, running the Airflow DAG gives me a missing conn_id error. But surprisingly, when I run the airflow connections list command, I see the conn_id in the db.
from airflow import settings
from airflow.models import Connection

conn = Connection(
    conn_id="azure_data_lake",
    conn_type="Azure Blob Storage",
    host="",
    login=StorageAccountName,
    password=StorageAccountKey,
    port=""
)  # create a connection object

session = settings.Session()  # get the session
session.add(conn)
session.commit()
In your case, the problem is the token: cryptography.fernet.InvalidToken.
Airflow uses Fernet to encrypt all connection passwords in its backend database.
Your Airflow backend is still using the previous Fernet key, while the new connection was created with a newly generated key.
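A minimal sketch of this failure mode (hypothetical keys and password, purely illustrative): a token encrypted with one Fernet key cannot be decrypted with another, which is exactly what the backend hits when fernet_key changes after connections were stored.

from cryptography.fernet import Fernet, InvalidToken

old_key = Fernet.generate_key()  # key in use when the connection was saved
new_key = Fernet.generate_key()  # key the backend is configured with now

token = Fernet(old_key).encrypt(b"my-connection-password")

try:
    Fernet(new_key).decrypt(token)
except InvalidToken:
    print("InvalidToken: the stored password was encrypted with a different key")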
My recommendation is to do the following first:
airflow resetdb
This will delete all the existing records in your backend db (in Airflow 2.x the command is airflow db reset).
Then,
airflow initdb
This will initialize the backend fresh (airflow db init in Airflow 2.x).
If the error still persists, change your fernet_key (airflow.cfg > fernet_key):
$ python
>>> from cryptography.fernet import Fernet
>>> k = Fernet.generate_key()
>>> print(k)
b'Z6BkzaWcF7r5cC-VMAumjpBpudSyjGskQ0ObquGJhG0='
Then edit $AIRFLOW_HOME/airflow.cfg and set fernet_key to the newly generated value.
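As a quick sanity check (a sketch, reusing the example key above): constructing a Fernet with the key you pasted into airflow.cfg raises ValueError if it is malformed, for example truncated during copy-paste.

from cryptography.fernet import Fernet

# A valid key is 32 url-safe base64-encoded bytes; ValueError means it's malformed.
Fernet(b'Z6BkzaWcF7r5cC-VMAumjpBpudSyjGskQ0ObquGJhG0=')
print("fernet_key is well-formed")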
So I am trying to set up an S3Hook in my Airflow DAG by setting the connection programmatically in my script, like so:
from airflow.hooks.S3_hook import S3Hook
from airflow.models import Connection
from airflow import settings

def s3_test_hook():
    conn = Connection(
        conn_id='aws-s3',
        conn_type='s3',
        extra={"aws_access_key_id": aws_key,
               "aws_secret_access_key": aws_secret},
    )
I can run the conn line no problem, which tells me the connection can be made. aws_key and aws_secret are loaded in through dotenv with an .env file I have in my local directory.
However, when I run the next two lines in the function:

s3_hook = S3Hook(aws_conn_id='aws-s3')
find_bucket = s3_hook.check_for_bucket('nba-data')

to check for a bucket I know exists, I receive this error:
NoCredentialsError: Unable to locate credentials
Any thoughts on how to approach this?
Thanks!
In your code, you have created an Airflow Connection object, but this doesn't do anything by itself. When a hook is given a connection id, it will look up the given id in various locations (in this order):
Secrets backend (if configured)
Environment variable AIRFLOW_CONN_*
Airflow metastore
Your connection is currently only defined in code, but Airflow is unable to locate it in any of the three locations above.
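For example, one way to satisfy the metastore lookup is to persist the Connection before instantiating the hook. A sketch based on the question's own code (the AWS_ACCESS_KEY_ID/AWS_SECRET_ACCESS_KEY environment variable names are assumptions standing in for the dotenv-loaded values, and extra is serialized to JSON here):

import json
import os

from airflow import settings
from airflow.hooks.S3_hook import S3Hook
from airflow.models import Connection

aws_key = os.environ["AWS_ACCESS_KEY_ID"]          # the question loads these via dotenv
aws_secret = os.environ["AWS_SECRET_ACCESS_KEY"]   # variable names are hypothetical

conn = Connection(
    conn_id='aws-s3',
    conn_type='s3',
    extra=json.dumps({"aws_access_key_id": aws_key,
                      "aws_secret_access_key": aws_secret}),
)

session = settings.Session()
session.add(conn)      # write the connection to the Airflow metastore
session.commit()

s3_hook = S3Hook(aws_conn_id='aws-s3')  # the lookup can now succeed

Alternatively, an AIRFLOW_CONN_* environment variable holding a connection URI satisfies the second lookup location, though a hyphenated id like aws-s3 maps awkwardly to an environment variable name.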
The Airflow documentation provides some pointers for configuring an AWS connection: https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/connections/aws.html
I am developing a Flask/MongoDB application I'm deploying on Azure. Locally, I am in the process of creating my models and testing my database connection. I am using Flask-MongoEngine to manage my DB connection. This is a sample of code that works perfectly on localhost but fails when calling its deployed version on Azure.
# On models.py
from flask_mongoengine import MongoEngine

db = MongoEngine()

class User(db.Document):
    name = db.StringField(max_length=50)
    token = db.StringField(max_length=50)
    email = db.EmailField()
Later, from views.py I call my User class like this:
import models as mdl

@app.route('/test')
def test():
    """For testing purposes"""
    user = mdl.User(name='Matias')
    user.save()
    users = mdl.User.objects
    return jsonify(users)
which outputs as expected locally. On Azure, however, I get the following error (I will only show the last and relevant part of the traceback):
File ".\app\views.py", line 53, in test
user = mdl.User(name='Matias')
File "D:\home\python364x86\lib\site-packages\mongoengine\base\document.py",
line 43, in _init_
self._initialised = False
File "D:\home\python364x86\lib\site-packages\mongoengine\base\document.py",
line 168, in _setattr_
self._is_document and
AttributeError: 'User' object has no attribute '_is_document'
Through pip freeze I checked that I am using the same versions of mongoengine, pymongo, and flask_mongoengine in both environments. I can't seem to find anyone else with the same problem. The app is deployed as a web app on a Windows machine in the Azure cloud.
Any help is appreciated, thanks.
PS: Further info.
Reviewing the mongoengine code, I found out that the _is_document attribute is set inside a metaclass for the Document class (DocumentMetaclass and TopLevelDocumentMetaclass). I tried setting the attribute to True inside User, and the following error showed:
AttributeError: 'User' object has no attribute '_meta',
which is also an attribute defined inside those metaclasses. Somehow the metaclass code is not running? Maybe the Azure environment has something to do with it?
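A small diagnostic sketch that could be dropped into the deployed app (assuming mdl is the imported models module, as in views.py above) to confirm whether the metaclass machinery ran for User:

import mongoengine
import models as mdl

# If the metaclass ran, type(User) is TopLevelDocumentMetaclass rather than
# plain type, and _meta/_is_document exist on the class.
print(mongoengine.__version__)
print(type(mdl.User))
print(hasattr(mdl.User, '_meta'), hasattr(mdl.User, '_is_document'))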
The following lines from my Python app execute with no problems on my local machine.
import googleapiclient.discovery

project_id = 'some-project-id'

resource_manager = googleapiclient.discovery.build('cloudresourcemanager', 'v1')
iam_policy_request = resource_manager.projects().getIamPolicy(resource=project_id, body={})
iam_policy_response = iam_policy_request.execute(num_retries=3)

new_policy = dict()
new_policy['policy'] = iam_policy_response
del new_policy['policy']['version']

iam_policy_update_request = resource_manager.projects().setIamPolicy(resource=project_id, body=new_policy)
update_result = iam_policy_update_request.execute(num_retries=3)
When I run the app in a GCE instance, and more precisely from within a Docker container inside the GCE instance, I get the exception:
URL being requested: POST https://cloudresourcemanager.googleapis.com/v1/projects/some-project-id:setIamPolicy?alt=json
Traceback (most recent call last):
  File "/env/lib/python3.5/site-packages/google/api_core/grpc_helpers.py", line 54, in error_remapped_callable
    return callable_(*args, **kwargs)
  File "/env/lib/python3.5/site-packages/grpc/_channel.py", line 487, in __call__
    return _end_unary_response_blocking(state, call, False, deadline)
  File "/env/lib/python3.5/site-packages/grpc/_channel.py", line 437, in _end_unary_response_blocking
    raise _Rendezvous(state, None, None, deadline)
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with (StatusCode.PERMISSION_DENIED, User not authorized to perform this action.)>
i.e. an authorization error. Oddly, when I open a Python terminal session inside the GCE instance and run the Python code line by line, I do not get the exception. It only throws the exception when the code is running as part of the app.
I am using a service account inside of the GCE instance, as opposed to my regular account on my local machine. But I don't think that is the problem since I am able to run the lines of code one by one inside of the instance while still relying on the service account roles.
I would like to be able to run the app without the exception within the Docker container inside of GCE. I feel like I'm missing something but can't figure out what the missing piece is.
Looking at your issue, it seems to be an authentication problem: your application is not properly authenticated.
1. First, run this command; it will let your application temporarily use your own user credentials:
gcloud beta auth application-default login
The output should look like this:
Credentials saved to file: $SOME_PATH/application_default_credentials.json
2. Then set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the key file:
export GOOGLE_APPLICATION_CREDENTIALS=$SOME_PATH/application_default_credentials.json
Try to run your application after that.
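To see which credentials the code actually resolves inside the container, a small diagnostic sketch (assuming the google-auth package is installed, which the googleapiclient stack depends on):

import google.auth

# Resolves Application Default Credentials the same way the client libraries do:
# GOOGLE_APPLICATION_CREDENTIALS, then gcloud user credentials, then the GCE
# metadata server.
credentials, project = google.auth.default()
print(type(credentials).__name__, project)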
Hi, I am trying to test the connection to the database on my local drive for a personal project I am working on. I am trying to get better with Python and MongoDB. Every time I test my connection using Postman, I get a 500 internal error. I double-checked my URI (mongodb://127.0.0.1:27017), which matches most local setups. I even uninstalled and reinstalled MongoDB. Any advice or solutions would be highly helpful. In my PyCharm IDE I have my database file:
import pymongo

__author__ = 'jslvtr'


class Database(object):
    URI = "mongodb://127.0.0.1:27017"
    DATABASE = None

    @staticmethod
    def initialize():
        client = pymongo.MongoClient(Database.URI)
        Database.DATABASE = client['fullstack']

    @staticmethod
    def insert(collection, data):
        Database.DATABASE[collection].insert(data)

    @staticmethod
    def find(collection, query):
        return Database.DATABASE[collection].find(query)

    @staticmethod
    def find_one(collection, query):
        return Database.DATABASE[collection].find_one(query)
Here is my app file:
from flask import Flask
from src.common.database import Database

__author__ = 'jslvtr'

app = Flask(__name__)
app.config.from_object('config')
app.secret_key = '123'


@app.before_first_request
def init_db():
    Database.initialize()


from src.models.users.views import user_blueprint
app.register_blueprint(user_blueprint, url_prefix="/users")
Here is my config file:
__author__ = 'jslvtr'

DEBUG = True

ADMINS = frozenset([
    "christopher.jxxxx@gmail.com"
])
This is the error I keep receiving:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
500 Internal Server Error
Internal Server Error
The server encountered an internal error and was unable to complete your request. Either the server is overloaded or there is an error in the application.
You must check MongoDB from the command line first, like below:
C:\Program Files\MongoDB\Server\3.6\bin> mongo localhost:27017
If it works, then you can jump into the code; a Python version of the same check is sketched below.
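A minimal sketch of that check done from Python with pymongo directly, independent of Flask (the URI is the one from the question; the short timeout makes an unreachable server fail fast):

import pymongo

client = pymongo.MongoClient("mongodb://127.0.0.1:27017",
                             serverSelectionTimeoutMS=2000)
# Raises ServerSelectionTimeoutError if the local mongod is not reachable.
print(client.admin.command("ping"))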
As for the error you are getting: 500 is reserved for server-side errors, meaning it doesn't necessarily point to your database connectivity at all. The request you are sending likely doesn't match a REST endpoint you defined. Please verify the request against your code. If the request does match your code but you cannot connect to MongoDB from localhost, you will get a connection timeout error from MongoDB instead.
I think once the request is correct, you will get a success.
You can paste the request JSON for more help.