How can I connect to a SQL Server database using user login/password that is in another domain?
If I use my account to connect to DB, it works fine:
cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=server_name;DATABASE=testdb;UID=MY_Domain\\username;PWD=pass;Trusted_Connection=yes')
But I need to use another user's credentials like
cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=server_name;DATABASE=testdb;UID=Another_Domain\\username;PWD=pass;Trusted_Connection=yes')
When I try the latter, I get a login-failure error for "MY_Domain\username" rather than for "Another_Domain\username".
In both cases I can connect to the database from SQL Server Management Studio using Windows Authentication.
You cannot pass a UID and PWD and set Trusted_Connection=yes (your second connection string) to connect as an (impersonated) Windows user. You can either connect as a SQL Server user (username and password) or as a Windows-authenticated user (trusted connection).
Your code should impersonate the Windows user (as SSMS does) and then set Trusted_Connection=yes only.
The MSDN page for WindowsIdentity.Impersonate has an example.
Since this works from SSMS, it suggests the necessary trust between the domains is in place.
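For reference, a minimal sketch of the two connection styles that do work (server and credential names are placeholders):

import pyodbc

# SQL Server authentication: a SQL login with UID/PWD, no trusted connection.
cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=server_name;DATABASE=testdb;UID=sql_login;PWD=pass')

# Windows authentication: trusted connection only, no UID/PWD. The connection
# uses whatever Windows identity the calling process holds, so to connect as
# Another_Domain\username the process must already be running as (or
# impersonating) that account.
cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=server_name;DATABASE=testdb;Trusted_Connection=yes')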
I am trying to deploy a web service app that connects to a database on Azure with Python code. I've tried using SQLAlchemy and pyodbc, and I can successfully connect to the database from my machine: on localhost I can perform all the actions I need. I want to set this code up so that specific routes, hit via AJAX calls, perform certain actions on my database, like flipping a user's active flag to false. The problem is that when I upload the Python code to Azure, following this guide (https://learn.microsoft.com/en-us/azure/app-service/quickstart-python?tabs=bash&pivots=python-framework-flask), it just returns a 500 server error, and I can't find anything in the trace as to why it isn't working. I thought it might be that only my local machine is whitelisted in the database's allowed IP addresses, but even if I add the App Service's IP address to the allowed IPs it still returns a server error. Here is the setup of the code:
from flask import Flask
from sqlalchemy import create_engine
import pyodbc

app = Flask(__name__)

@app.route('/')
def connection():
    Driver = "{ODBC Driver 13 for SQL Server}"
    Server = "server string"
    Port = 1433
    Database = "dbname"
    Uid = "user"
    Pwd = "pass"
    try:
        cnxn = pyodbc.connect(f'DRIVER={Driver};SERVER={Server};DATABASE={Database};Uid={Uid};Pwd={Pwd};Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;')
        cursor = cnxn.cursor()
        cursor.execute("SELECT * FROM (Database Table) where id = 999999999;")
        for row in cursor:
            print('row = %r' % (row,))
        return "Connection to database successful"
    except Exception as e:
        # meant to surface the error to the client instead of a bare 500
        return str(e)
I omitted certain details, but the syntax should remain intact. Again, on my local machine I can connect to the database and return data, but once the code is running in Azure it no longer works.
Excuse the try statement. I was trying to catch the error and send it to the client in hopes of gathering more information about the 500 error, but it didn't work.
Edit: it's worth mentioning that if I remove the actual connection string and anything to do with connecting to the database, the code will return "connection to database successful". This leads me to believe that it isn't making a connection to the database at all, and that's why it's erroring out. But the question remains: why can I connect in my local environment but not in Azure?
Have you heard about remote debugging in Azure? It is the best tool for tracing errors, using Visual Studio Code or Visual Studio, when your web app works fine initially. You can use multiple methods to debug line by line or cluster by cluster. Kindly check it out here:
https://learn.microsoft.com/en-us/visualstudio/debugger/remote-debugging?view=vs-2019
So the short answer to this is that the App Service was missing the {ODBC Driver 13 for SQL Server} driver. It wasn't erroring out locally because I have that driver installed locally, but the driver isn't available in the App Service. To get it to work I would have had to install the app on a virtual machine so I could download the driver appropriately.
Fortunately, Azure web apps built on a Linux base already come with {ODBC Driver 17 for SQL Server}, so flipping the driver from 13 to 17 allowed the app to successfully connect to my database.
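If you are unsure which drivers a given environment actually provides, pyodbc can list them; a quick sketch:

import pyodbc

# Lists the ODBC drivers installed in the current environment; on a
# Linux-based Azure web app this should include "ODBC Driver 17 for SQL Server".
print(pyodbc.drivers())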
I am trying to connect to an Oracle database in Python using create_engine. This database does not have a username or password.
I see this is the format now:
oracle_db = sqlalchemy.create_engine('oracle://user:pass@server')
However, if this connection has NO username or password, how would the connection string look? I've tried DMIT_connection = create_engine('oracle+cx_oracle://@....') with no luck. When I go to write a pandas df to the database using to_sql, I get the error below because I cannot get the connection right, given that there is no username or password.
The error occurs because this database has no username (it is picked up from the localhost machine) and no password.
The error I get is this: DatabaseError: (cx_Oracle.DatabaseError) ORA-12545: Connect failed because target host or object does not exist (Background on this error at: http://sqlalche.me/e/14/4xp6)
Let me know the authentication type used. If it's external authentication, picking credentials up from a wallet, you can try the sample code mentioned here:
How to use SqlAlchemy to connect Database similar to cx_oracle when we use external authorization like wallets with TNS(net service name)
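For illustration, a sketch of an externally authenticated SQLAlchemy engine, assuming a configured wallet and a TNS alias (MYDB here is a placeholder net service name from tnsnames.ora); the credentials are simply omitted from the URL:

from sqlalchemy import create_engine, text

# No user/password in the URL: cx_Oracle falls back to external
# authentication (OS user or wallet) for the given DSN.
engine = create_engine("oracle+cx_oracle://@MYDB")
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1 FROM dual")).scalar())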
system: ubuntu
Cloud sql instance: db
Cloud sql user: admin
Cloud sql pass: pass
cloud sql db name: test
cloud_sql_proxy installed and executed by ./cloud_sql_proxy -dir=/cloudsql -instances=prj:asia-northeast1:db -credential_file=path/to/credential
The account in the credential file has all the needed roles, and I successfully connected to the db from a Node.js server (TypeORM).
But with sqlalchemy, I tried
sqlalchemy.create_engine("postgresql+psycopg2://admin:pass#/test?host=/cloudsql/prj:asia-northeast1:db")
and
sqlalchemy.create_engine("postgres+pg8000://admin:pass#/test?unix_sock=%2Fcloudsql%2Fprj%3Aasia-northeast1%3Adb%2F.s.PGSQL.5432")
but both complain about FATAL: password authentication failed for user "admin"
What have I done wrong?
This error states that the db user and db password you are using to connect to the Cloud SQL server are wrong.
I would recommend creating a new Cloud SQL user and password and trying again.
Creating and managing MySQL users
If you succeed with the new user, this will confirm my first hypothesis.
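If it helps, a sketch of the retry with a newly created user (new_user/new_pass are placeholders; note the @ before the socket-based host part):

import sqlalchemy

# Unix-socket connection through the Cloud SQL proxy, using the new user.
engine = sqlalchemy.create_engine(
    "postgresql+psycopg2://new_user:new_pass@/test"
    "?host=/cloudsql/prj:asia-northeast1:db"
)
with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())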
When I run this query to bulk-insert a file on a shared drive into SQL Server 2008 using a username and password (not Windows authentication), I get the errors below. The DBAs, system admins and network guys all deny the errors are related to their teams, and I am lost... Can anyone please help me identify where the issue is? When I run BULK INSERT with a database username and password, what authentication does SQL Server use to open the file?
Run this in SQL Server Management Studio:
BULK INSERT DatabaseName.dbo.TableName
FROM '\\shared_server\parent\child\file_name.txt'
WITH(FIRE_TRIGGERS, DATAFILETYPE='char', FIELDTERMINATOR='\t',ROWTERMINATOR='\n', FIRSTROW=2);
and I get
Cannot bulk load because the file "\\shared_server\parent\child\file_name.txt" could not be opened. Operating system error code 5(Access is denied.).
Run this in Python:
import pyodbc
database = 'DatabaseName'
username = 'username'
password = 'password'
server = 'server_name'
failover = 'failover_server_name'
cnxn_string = 'DRIVER={SQL Server Native Client 10.0};SERVER=%s;FAILOVER_PARTNER=%s;DATABASE=%s;UID=%s;PWD=%s;CHARSET=UTF8' % (server, failover, database, username, password)
cnxn = pyodbc.connect(cnxn_string)
cursor = cnxn.cursor()
query = r"""
BULK INSERT Estimates.dbo.FundamentalsIS
FROM '\\shared_server\parent\child\file_name.txt'
WITH(FIRE_TRIGGERS, DATAFILETYPE='char', FIELDTERMINATOR='\t',ROWTERMINATOR='\n', FIRSTROW=2);
"""
cursor.execute(query)
cursor.commit()
and I get
ProgrammingError: ('42000', '[42000] [Microsoft][SQL Server Native Client 10.0][SQL Server]Cannot bulk load because the file "\\shared_server\parent\child\file_name.txt" could not be opened. Operating system error code 1326(Logon failure: unknown user name or bad password.). (4861) (SQLExecDirectW)')
Could the SQL Server 2008 machine possibly be in a different security group (or have different settings) than the shared drive where the file is located?
Because the BULK INSERT operation runs on the SQL Server side (not in Management Studio or in your Python process), it is the DB server that must be able to open the file. The "access denied" leads me to believe the DB server cannot get to the shared file drive and possibly does not have permission to access it. Likewise, even if you use Python to execute the BULK INSERT statement, the DB server still needs access to wherever the file is located.
I had a similar issue in the past because the DB server could not get to a shared file located elsewhere. My workaround was to use the local computer to read in the file and run the insert queries from Python; the local environment has access to both and can be used as the central communication hub. You might have to do something similar to
https://stackoverflow.com/a/6482610/3761363
https://stackoverflow.com/a/11219626/3761363
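As a rough sketch of that workaround, reusing the cnxn_string built in the question (the INSERT's placeholders must be adjusted to the file's actual column count):

import csv
import pyodbc

# Read the shared file from a machine that can reach it, then push the rows
# through the database connection instead of asking the server to open the file.
cnxn = pyodbc.connect(cnxn_string)
cursor = cnxn.cursor()
with open(r'\\shared_server\parent\child\file_name.txt', newline='') as f:
    reader = csv.reader(f, delimiter='\t')
    next(reader)  # skip the header row, as FIRSTROW=2 did
    rows = [tuple(r) for r in reader]
cursor.executemany(
    "INSERT INTO Estimates.dbo.FundamentalsIS VALUES (?, ?, ?)",  # one ? per column
    rows,
)
cnxn.commit()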
I'm connecting to Hive using pyhs2, but the Hive server requires Kerberos authentication. Does anyone know how to convert the JDBC string to pyhs2 parameters? Like:
jdbc:hive2://biclient2.server.163.org:10000/default;principal=hive/app-20.photo.163.org@HADOOP.HZ.NETEASE.COM?mapred.job.queue.name=default
I think it will be something like this:
pyhs2.connect(host='biclient2.server.163.org',
              port=10000,
              authMechanism="KERBEROS",
              password="something",
              user='your_user@HADOOP.HZ.NETEASE.COM')
I'm also doing the same. I have not succeeded yet, but at least I'm getting a meaningful error:
(Server hive/xxx@yyy.COM not found in Kerberos database)
This connection string will work as long as the user running the script has a valid Kerberos ticket:
import pyhs2

with pyhs2.connect(host='biclient2.server.163.org',
                   port=10000,
                   authMechanism="KERBEROS") as conn:
    with conn.cursor() as cur:
        print(cur.getDatabases())
Username, password and any other configuration parameters are not passed through the KDC.
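As a usage note: the ticket is typically obtained beforehand with kinit (for example, kinit your_user@HADOOP.HZ.NETEASE.COM), and pyhs2 then picks it up from the ticket cache; only authMechanism needs to be set in the connect call.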