Databricks SQL Server connection using integrated authentication - python

I'm trying to connect my Databricks cluster to an existing SQL Server database using Python. I would like to use the integrated authentication method, but I'm getting the error: com.microsoft.sqlserver.jdbc.SQLServerException: This driver is not configured for integrated authentication.
jdbcHostname = "sampledb-dev.database.windows.net"
jdbcPort = 1433
jdbcDatabase = "sampledb-dev"
jdbcUrl = "jdbc:sqlserver://{0}:{1};database={2}".format(jdbcHostname, jdbcPort, jdbcDatabase)

connectionProperties = {
    "integratedSecurity": "true",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver"
}

print(jdbcUrl)

query = "(SELECT * FROM TABLE1.Domain) AS domains"
domains = spark.read.jdbc(url=jdbcUrl, table=query, properties=connectionProperties)
display(domains)

You can't use integratedSecurity=true with an Azure PaaS database; integrated security is an on-premises construct.
You need to use authentication=ActiveDirectoryIntegrated or authentication=ActiveDirectoryPassword; please see the JDBC docs here:
https://learn.microsoft.com/en-us/sql/connect/jdbc/connecting-using-azure-active-directory-authentication?view=sql-server-ver15
You will also need your account to be a user with appropriate permissions on that database, synced to Azure AD. Multi-factor authentication is not supported for JDBC, so if you use MFA your admin will need to provide you with a non-MFA-enabled account. You'll know this is the case if you get a WSTrust error when trying to connect.
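As a hedged sketch of the corrected setup (the host and database names are taken from the question; the user and password are placeholders, and ActiveDirectoryPassword is used here since a non-MFA AD account is the usual JDBC path):

```python
# Sketch: JDBC URL and properties for Azure AD password authentication,
# replacing integratedSecurity=true. User and password are placeholders.
jdbcHostname = "sampledb-dev.database.windows.net"
jdbcPort = 1433
jdbcDatabase = "sampledb-dev"

jdbcUrl = "jdbc:sqlserver://{0}:{1};databaseName={2}".format(
    jdbcHostname, jdbcPort, jdbcDatabase
)

connectionProperties = {
    # Azure AD authentication instead of integratedSecurity=true
    "authentication": "ActiveDirectoryPassword",
    "user": "myuser@mytenant.onmicrosoft.com",  # placeholder account
    "password": "<password>",                    # placeholder
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# On a Databricks cluster the read itself would then be:
# domains = spark.read.jdbc(url=jdbcUrl,
#                           table="(SELECT * FROM TABLE1.Domain) AS d",
#                           properties=connectionProperties)
print(jdbcUrl)
```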

Related

How to connect from GKE to Cloud SQL using Python and Private IP

I want to connect to my MySQL database from my GKE pods using Python over a private IP.
I've done all the configuration, and the connection works inside a test pod through bash using
mysql -u root -p --host X.X.X.X --port 3306
But it doesn't work inside my Python app; maybe I'm missing something.
Here is my current code
# initialize Connector object
connector = Connector(ip_type=IPTypes.PRIVATE)

# function to return the database connection object
def getconn():
    conn = connector.connect(
        INSTANCE_CONNECTION_NAME,
        "pymysql",
        user=DB_USER,
        password=DB_PASS,
        db=DB_NAME,
    )
    return conn

# create connection pool with 'creator' argument to our connection object function
pool = sqlalchemy.create_engine(
    "mysql+pymysql://",
    creator=getconn,
)
I'm still getting these errors
aiohttp.client_exceptions.ClientResponseError: 403, message="Forbidden: Authenticated IAM principal does not seem authorized to make API request. Verify 'Cloud SQL Admin API' is enabled within your GCP project and 'Cloud SQL Client' role has been granted to IAM principal.", url=URL('https://sqladmin.googleapis.com/sql/v1beta4/projects/manifest-altar-223913/instances/rapminerz-apps/connectSettings')
Check the workaround below:
Verify the Workload Identity setup.
If it's not OK, please follow the workload-identity troubleshooting guide to see what's wrong.
If the setup is OK, just follow the error message: verify the 'Cloud SQL Admin API' is enabled within your GCP project and the 'Cloud SQL Client' role has been granted to the IAM principal.
You can search for 'Cloud SQL Admin API' in the Cloud console; make sure it is enabled.
For the Google service account, grant it the 'Cloud SQL Client' role.
Please go through the Cloud SQL Python Connector documentation for more details.

Python: AWS Aurora Serverless Data API: password authentication failed for user

I am running out of ideas.
I have created an Aurora Serverless RDS cluster (version 1) with the Data API enabled. I now wish to execute SQL statements against it using the Data API (https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/data-api.html).
I have made a small test script using the provided guidelines (https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/data-api.html#data-api.calling:~:text=Calling%20the%20Data%20API%20from%20a%20Python%20application).
import boto3

session = boto3.Session(region_name="eu-central-1")
rds = session.client("rds-data")
secret = session.client("secretsmanager")

cluster_arn = "arn:aws:rds:eu-central-1:<accountID>:cluster:aurorapostgres"
secret_arn = "arn:aws:secretsmanager:eu-central-1:<accountID>:secret:dbsecret-xNMeQc"

secretvalue = secret.get_secret_value(SecretId=secret_arn)
print(secretvalue)

SQL = "SELECT * FROM pipelinedb.dataset"
res = rds.execute_statement(
    resourceArn=cluster_arn,
    secretArn=secret_arn,
    database="pipelinedb",
    sql=SQL,
)
print(res)
However I get the error message:
BadRequestException: An error occurred (BadRequestException) when calling the ExecuteStatement operation: FATAL: password authentication failed for user "bjarki"; SQLState: 28P01
I have verified the following:
Secret value is correct
Secret JSON structure is correctly following recommended structure (https://docs.aws.amazon.com/secretsmanager/latest/userguide/reference_secret_json_structure.html)
IAM user running the python script has Admin access to the account, and thus is privileged enough
Cluster is running in Public Subnets (internet gateways attached to route tables) and ACL and security groups are fully open.
The user "bjarki" is the master user and thus should have the required DB privileges to run the query
I am out of ideas on why this error is appearing - any good ideas?
Try this AWS tutorial located in the AWS Examples Code Library. It shows how to use the AWS SDK for Python (Boto3) to create a web application that tracks work items in an Amazon Aurora database and emails reports using Amazon Simple Email Service (Amazon SES). This example uses a front end built with React.js to interact with a Flask-RESTful Python backend.
Integrate a React.js web application with AWS services.
List, add, and update items in an Aurora table.
Send an email report of filtered work items by using Amazon SES.
Deploy and manage example resources with the included AWS CloudFormation script.
https://docs.aws.amazon.com/code-library/latest/ug/cross_RDSDataTracker_python_3_topic.html
Try running the CDK to properly set up the database too.
Once you have successfully implemented this example, you will get a working front end with a Python backend.
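Before reworking the stack, it can also help to confirm that the secret the Data API uses actually contains the credentials you expect, since a stale password or a username mismatch produces exactly this SQLState 28P01 error. A minimal stdlib sketch (the JSON shape follows the recommended Secrets Manager structure; all values here are placeholders, not real credentials):

```python
import json

# Placeholder payload mimicking secretvalue["SecretString"] as returned by
# secretsmanager.get_secret_value() for a secret that follows the
# recommended JSON structure. Real values come from the API call above.
secret_string = json.dumps({
    "username": "bjarki",
    "password": "<redacted>",
    "engine": "postgres",
    "host": "aurorapostgres.cluster-xyz.eu-central-1.rds.amazonaws.com",
    "port": 5432,
    "dbname": "pipelinedb",
})

creds = json.loads(secret_string)

# The Data API authenticates with exactly these fields; verify that
# "username" matches the cluster's master user and that "password" is
# current (rotations or manual resets can leave the secret stale).
print(creds["username"], creds["engine"], creds["dbname"])
```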

Testing managed identity locally with Python

I was trying to set up code in Python to test the Azure managed identity services; with C# I was able to test the code locally. Is there any way to test the Python code locally?
Enabled managed identity in azure appservice
Added the application user(appservice) in azure SQL server and gave permissions.
this is my sample python code to connect to azure sql with managed identity
conn = db.connect('Driver={ODBC Driver 17 for SQL Server};'
                  'Server=testdb.database.windows.net;'
                  'Database=studentdb;'
                  'Authentication=ActiveDirectoryIntegrated;'
                  )
query = pd.read_sql_query('SELECT * FROM STUDENT', conn)
frame = pd.DataFrame(query)
return func.HttpResponse(frame.to_json(orient="index"), status_code=200)
Can anyone help me test this code locally? I do not have permissions on Azure to deploy and test it.
You can use the Moto library, which allows you to mock services in tests. You can run Lambda functions the same way you would run a Python script:
if __name__ == "__main__":
    event = []
    context = []
    lambda_handler(event, context)
If you are in a virtual environment, this ensures that all the required dependencies are installed properly for the Lambda function with the correct Python version.
If you check this document from Microsoft, you will find that:
Managed Identity cannot be used to authenticate locally-running applications. Your application must be deployed to an Azure service that supports Managed Identity.
It's important to understand that the Managed Identity feature in Azure is only relevant once, in this case, the App Service is deployed.
As an alternative you can use DefaultAzureCredential() from the Azure.Identity library, which works both when running locally and for the deployed web app. You can read more about how Azure.Identity works in the official docs.
Most of the time we use Azure MSI to connect to Azure SQL in Azure Functions with Python: we can use Azure MSI to get an Azure AD access token, then use the token to connect to Azure SQL.
Once you have enabled system-assigned identity on your Azure Web App and granted SQL permissions, you can access the database directly from Python as shown in the snippet below.
import pyodbc
from logzero import logger

with pyodbc.connect(
    "Driver=" + driver + ";Server=" + server + ";PORT=1433;Database=" + database
    + ";Authentication=ActiveDirectoryMsi",
) as conn:
    logger.info("Successful connection to database")
    with conn.cursor() as cursor:
        rows = cursor.execute("select @@version").fetchall()
The following parameters are used above:
Driver: use "{ODBC Driver 17 for SQL Server}"
Server: the SQL server hosting your database
Database: the name of your database
Authentication: specifies the connection method, "ActiveDirectoryMsi"
Check the SQL database access with managed identity from Azure Web App document and the Configure your local Python dev environment for Azure document for more information.
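As a hedged sketch of how those parameters fit together (the server and database names are placeholders taken from the question, and the pyodbc call itself is left commented out since it needs a live database and a managed identity):

```python
# Hypothetical values; substitute your own server and database.
driver = "{ODBC Driver 17 for SQL Server}"
server = "testdb.database.windows.net"
database = "studentdb"

conn_str = (
    "Driver=" + driver
    + ";Server=" + server
    + ";PORT=1433;Database=" + database
    + ";Authentication=ActiveDirectoryMsi"
)

# With pyodbc installed and a managed identity available, the connection
# itself would be: conn = pyodbc.connect(conn_str)
print(conn_str)
```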

Not authorized to execute any command using Service Connector on MongoDB

I am using MongoDB in my app, and when I try to access the database directly using the service connector, I am able to connect but then I get:
Error: error: {
"ok" : 0,
"errmsg" : "not authorized on admin to execute command { *any command*}",
"code" : 13
}
on any query or command.
Is there a way to change authorization or access the data of my MongoDB?
P.S: My MongoDB was bind as in the tutorial: https://docs.developer.swisscom.com/tutorial-python/bind-service.html
It looks like you're trying to execute commands on the admin database on which your user is not authorized. You can find the correct database which your user is authorized on in the credentials (key mongodb.credentials.database) but ideally you connect using the provided URI (mongodb.credentials.uri) which will connect you to the correct database automatically.
You can have a look at the Python example in the tutorial you linked to find out how to access and use those credentials correctly.
The answer from Sandro Mathys is correct and helpful; I wish to clarify and simplify a little.
The service broker grants you the dbOwner role and creates a database with a random name for you. This is done during the cf create-service process.
The database owner can perform any administrative action on the
database. This role combines the privileges granted by the readWrite,
dbAdmin and userAdmin roles.
You have no privileges on the admin database; the admin database is only for Swisscom operators. When logging in with the mongo shell, use the --authenticationDatabase parameter with the random database name from cf env.
Specifies the database in which the user is created. See Authentication Database.
If you do not specify a value for --authenticationDatabase, mongo uses the database specified in the connection string.
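Since the provided URI already encodes the authentication database, here is a minimal stdlib sketch of how such a URI is assembled (all values are placeholders, not real Swisscom credentials; in practice you would read them from the bound service's mongodb.credentials):

```python
# Sketch: building a MongoDB connection URI that authenticates against the
# service-provided database rather than admin. All values are placeholders.
from urllib.parse import quote_plus

user = "service-user"
password = "s3cret/with?chars"   # special characters must be percent-encoded
host = "mongodb.example.com"
db_name = "d1b2c3"               # the random database name from cf env

uri = "mongodb://{}:{}@{}:27017/{}?authSource={}".format(
    quote_plus(user), quote_plus(password), host, db_name, db_name
)
print(uri)
```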

Mongodb authentication issue

I am new to MongoDB and I am trying to connect to it remotely (from my local system to the live DB), and the connection succeeds. I have admin users in the admin table and want no one to be able to access my database without authentication. But when I connect to MongoDB remotely via the code below, I can access any DB even without authentication:
from pymongo import MongoClient

c = MongoClient('myip', 27017)
a = c.mydb.testData.find()
In my config file, the auth parameter is set to True (auth = True), but still no authentication is needed to access my DB. Can anyone let me know what I am missing here?
Based on your description I would guess you haven't actually enabled authentication. In order to enable authentication you must start the Mongo server with certain settings. You can find more information below:
http://docs.mongodb.org/manual/tutorial/enable-authentication/
Basically you need to run with --auth in order to enable authentication.
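For reference, the equivalent setting in a mongod.conf file (assuming the YAML config format used by MongoDB 2.6 and later, rather than the legacy auth = true syntax from the question) is:

```yaml
# mongod.conf -- require clients to authenticate
security:
  authorization: enabled
```

After restarting mongod with this config, unauthenticated clients can still connect but cannot read or write data until they authenticate against their authentication database.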
