pyodbc + MS SQL Server connection with Encrypt=yes not connecting - Python

We have a Python Flask app running on an AWS CentOS ECS instance. We are trying to establish an encrypted connection to our database via pyodbc with ODBC Driver 17 on Linux. When running locally on Windows we just use the SQL Server driver. Currently we have the code:
params = urllib.parse.quote_plus("driver;server;user;pwd;...;Encrypt=yes")
SQLALCHEMY_DATABASE_URI = "mssql+pyodbc:///?odbc_connect=%s" % params
We have TLS enabled on the server. The connection works locally on Windows but not when deployed on Linux.
We are currently doing a deployment with 'yes' instead of 'true'. We are also about to try 'trustedserverconnection=yes'. Any insight into this process would be greatly appreciated!
Update: the latest error is: invalid connection string attribute 'trustservercertificate'

We ended up adding a second connection parameter:
TrustServerCertificate=YES
This is not ideal, obviously, because we want to follow good security practices. In a future state we will need to set this back to 'no' and put our SSL PEM file in the Linux certificate store.
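For reference, the working setup ended up looking roughly like the sketch below; the driver name, server, and credentials are placeholders, not our real values:

import urllib.parse

# Placeholder values; Encrypt=yes plus TrustServerCertificate=yes is the
# combination that finally connected from the Linux container.
params = urllib.parse.quote_plus(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=mydb;"
    "UID=myuser;PWD=mypassword;"
    "Encrypt=yes;TrustServerCertificate=yes;"
)
SQLALCHEMY_DATABASE_URI = "mssql+pyodbc:///?odbc_connect=%s" % params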
Hope this helps someone. We had some trouble finding documentation for pyodbc with MS SQL Server.

According to this documentation, pyodbc passes the connection string through to the underlying ODBC driver. Microsoft's
article Using Connection String Keywords with SQL Server Native Client
documents both the Encrypt and TrustServerCertificate attributes. The TrustServerCertificate setting should generally be avoided in production databases; however, it is very useful when testing encrypted connections to a development database that is using a self-signed certificate. For example, the default installation of SQL Server uses a self-signed certificate and will require this setting.
In my mssql+pyodbc connection strings I just append ?Encrypt=yes&TrustServerCertificate=yes as appropriate. Note that if you already have another setting after a question mark (?), then use & instead of ?, for example: ?Trusted_Connection=yes&Encrypt=yes&TrustServerCertificate=yes
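For example, a hostname-style SQLAlchemy URL with those flags appended; the host, database, and driver name here are illustrative:

from sqlalchemy import create_engine

# Query parameters after the first '?' are joined with '&'.
engine = create_engine(
    "mssql+pyodbc://user:pass@myserver/mydb"
    "?driver=ODBC+Driver+17+for+SQL+Server"
    "&Encrypt=yes&TrustServerCertificate=yes"
)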

Related

"OperationalError : no password supplied", when linking python and sql [duplicate]

This is probably a silly error but I cannot seem to find a satisfying solution.
When running db.create_all(), I got the following error.
sqlalchemy.exc.OperationalError: (OperationalError) fe_sendauth: no password supplied None None
My database link is set as
'postgresql://localhost/db_name'
This worked fine on my Mac and on Heroku, but is not OK on Ubuntu (DigitalOcean).
Any ideas what I might be doing wrong?
You probably just need to remove "localhost" from your connection string:
'postgresql:///db_name'
That tells psycopg2 to use Unix-domain sockets. Your default configuration will use "ident" so you'll be connecting as the user that runs the script. In the default configuration, "md5" only applies to TCP connections.
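A minimal sketch of the difference, assuming a local cluster and the db_name database:

import psycopg2

conn = psycopg2.connect("dbname=db_name")  # Unix socket: peer/ident auth, no password
# conn = psycopg2.connect("host=localhost dbname=db_name")  # TCP: md5 auth, password required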
The URL pattern should be:
postgresql://user:password@localhost:5432/database_name
pip install psycopg2
The user should be postgres or any other user you have created and intend to use.
Similarly, for MySQL it would be:
mysql://user:pass@localhost:3306/database_name
pip install mysql-python
On your Mac, PostgreSQL was set up for trust or peer authentication for connections from localhost.
On your Ubuntu box it's set up for md5 authentication for connections from localhost.
You'll want to configure a password, or change the authentication mode. See pg_hba.conf, and the Ubuntu guide for PostgreSQL (there's a section about this error).
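For illustration, a hypothetical pg_hba.conf fragment; the first line admits local socket connections via peer auth, while the second requires a password (md5) over TCP, which is exactly the combination that raises fe_sendauth when no password is supplied:

# TYPE  DATABASE  USER  ADDRESS       METHOD
local   all       all                 peer
host    all       all   127.0.0.1/32  md5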
The below worked for me. Your connection to your Postgres database requires a password; thus, this is what you should write:
pg_user = "magicmike"
pg_pwd = "test123"
pg_port = "5432"
app.config["SQLALCHEMY_DATABASE_URI"] = "postgresql://{username}:{password}@localhost:{port}/foodversity_db".format(username=pg_user, password=pg_pwd, port=pg_port)
First make sure that the database server is connected and then run the command again. Silly, but it worked for me.
For a remote server:
postgresql://<username>:<password>@<ipaddress>:<port>/<database>
For a local database:
postgresql:///<database>

Python IBM_DB using SSL connection

I'm using Python on CentOS 7 and I have installed GSKit 8 with the DB2 11.3 client.
So I set:
IBM_DB_HOME=/path/to/my/db2client/sqllib - ODBC and clidriver
Also I set:
LD_LIBRARY_PATH=$IBM_DB_HOME/lib:$LD_LIBRARY_PATH
Then I installed ibm_db:
pip install ibm_db
I added my db2servercert.arm into the mykeydb.kdb file located in /opt/IBM/db2/GSK8KitStore, and I'm using the same version of GSKit on client and server.
gsk8capicmd_64 -cert -add -db mykeydb.kdb -stashed -label "DB2 Server self-signed certificate" -file db2servercert.arm -format ascii -trust enable
According to these IBM docs: https://www.ibm.com/support/knowledgecenter/SSEPGG_11.1.0/com.ibm.db2.luw.admin.sec.doc/doc/t0053518.html
From Db2 V10.5 FP5 onwards, the SSLClientKeystoredb and SSLClientKeystash keywords are not needed in the connection string, db2cli.ini file, FileDSN, or db2dsdriver.cfg file. If you have not set or passed values for the SSLClientKeystoredb and SSLClientKeystash keywords, the CLI/ODBC client driver will create a default key database internally during the first SSL connection. The client driver will call GSKit APIs to create a key database populated with the default root certificates.
Now I'm trying to create the ibm_db connection string for a Db2 SSL connection using various scenarios:
1) Security=ssl and SSLServerCertificate=/path/to/my/db2servercert.arm
"Database=sampledb;Protocol=tcpip;Hostname=myhost;Servicename=50001;Security=ssl;SSLServerCertificate=/path/to/my/db2servercert.arm;"
2) Security=ssl and SSLClientKeystoredb=/opt/IBM/db2/GSK8KitStore/mykeydb.kdb and SSLClientKeystash=/opt/IBM/db2/GSK8KitStore/mystashfile.sth
"Database=sampledb;Protocol=tcpip;Hostname=myhost;Servicename=50001;Security=ssl;SSLClientKeystoredb=/opt/IBM/db2/GSK8KitStore/mykeydb.kdb;SSLClientKeystash=/opt/IBM/db2/GSK8KitStore/mystashfile.sth;"
3) Security=ssl only
"Database=sampledb;Protocol=tcpip;Hostname=myhost;Servicename=50001;Security=ssl;"
In 1) and 2) I was able to connect without any SSL connection errors, but in 3) I'm getting a socket 414 error:
[IBM][CLI Driver] SQL30081N A communication error has been detected. Communication protocol being used: "SSL".
Communication API being used: "SOCKETS". Location where the error was detected: "".
Communication function detecting the error: "sqlccSSLSocketSetup". Protocol specific error code(s): "414", "", "". SQLSTATE=08001
That means:
https://www.ibm.com/support/knowledgecenter/en/SSAL2T_7.1.0/com.ibm.cics.tx.doc/reference/r_gskit_error_codes.html,
414 error: GSK_ERROR_BAD_CERT - Incorrectly formatted certificate received from partner.
Note: on another machine with the same config and ibm_db installed, this connection string works (I'm sure I missed something):
"Database=sampledb;Protocol=tcpip;Hostname=myhost;Servicename=50001;Security=ssl;"
My questions are:
Which env variables or Db2 client parameters do I have to configure to connect with only the Security=ssl property?
How does ibm_db work under the hood when connecting to a remote Db2 server, and where can I find the root certificates from which it automatically generates its own keydb.kdb file, as mentioned in the IBM docs?
Thanks for any ideas ;)
If you're using a self-signed SSL certificate, you can't connect without using options 1 or 2.
In option 1 you're supplying the certificate's public key directly, to allow the Db2 client to validate the Db2 server. This is already using the "in memory keystore" that you're asking about in question #2.
In option 2, you would have imported the same public key into your keystore to allow the Db2 client to validate the server.
If you want to connect using only Security=SSL, your Db2 server's SSL certificate needs to come from one of the CAs already in the system keystore.
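As a concrete sketch of option 1 (credentials and paths here are placeholders): the certificate is passed directly in the connection string, and the driver validates the server against it:

import ibm_db

conn_str = (
    "Database=sampledb;Hostname=myhost;Servicename=50001;Protocol=tcpip;"
    "UID=db2user;PWD=secret;"
    "Security=ssl;SSLServerCertificate=/path/to/my/db2servercert.arm;"
)
conn = ibm_db.connect(conn_str, "", "")  # user/password already in the string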
I believe that when the Db2-documentation writes "The Client driver will call GSKit API's to create a key database populated with the default root certificates", it means that the dynamically created kdb will contain the certs for some common commercial CAs, and (if specified) will also contain the cert specified by SSLServerCertificate.
As you are using a self-signed certificate, the CA certs will be ignored in this case.
If you are connecting to a Db2-server that runs on Linux/Unix/Windows, using IBM's drivers, and want an encrypted connection that uses the target Db2-instance public-key as part of the encryption, then you must tell the Db2-client the location of that certificate (which contains the Db2-instance public key) in one way or another.
For a Linux client, that cert will either be in a statically created kdb (via GSKit commands) or in a dynamically created kdb as specified by using the SSLServerCertificate property. For a Db2 client running on Microsoft Windows, the certificate can additionally be fetched from the MS keystore if the Db2 client is configured to use that.
The source code for ibm_db module is available on github. However, the client-side SSL work happens not in ibm_db module but instead happens in the (closed source) Db2-driver along with (closed source) libraries for GSKit. To see some of what's happening under the covers you can trace the CLI driver. Refer to the Db2-documentation online for details of CLI tracing.
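For example, a minimal db2cli.ini fragment to switch the trace on (the trace directory is hypothetical and must already exist; see the Db2 CLI documentation for the full keyword list):

[COMMON]
Trace=1
TracePathName=/tmp/clitrace
TraceComm=1
TraceFlush=1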

How to enable detailed database connection logging?

Our DBA is trying to migrate a Django application's database to a new backend host running Oracle 12. When I put that host's info in the Django settings.py file, I get this error:
DatabaseError: ORA-28547: connection to server failed, probable Oracle Net admin error
The DBA has asked for my help in solving this problem. Is there a way to turn on detailed logging in Django while establishing a database connection? I've seen directions for enabling logging of database queries, but we're not getting that far -- our error is happening sometime during the connection.
Basically, you can use Wireshark or any sniffing tool to check what connection string is passed to the DB.
But this Oracle error occurs when there is an error after the initial handshake to the Oracle DB, while further establishing the connection.
Check your cx_Oracle for Python and the Oracle Instant Client, and verify that the Instant Client version is correct for the DB. Try a connection from the Instant Client directly to the DB using sqlplus.
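As a quick sanity check from Python that bypasses Django entirely (host, service name, and credentials below are placeholders):

import cx_Oracle

dsn = cx_Oracle.makedsn("dbhost.example.com", 1521, service_name="ORCLPDB1")
conn = cx_Oracle.connect(user="scott", password="tiger", dsn=dsn)
print(conn.version)  # prints the server version if the handshake succeeds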
Reference:
django docs
DBA forum
django community

sqlalchemy connection identical credentials refused when run on different machines

I have a problem which seems impossible to me, meaning I am fundamentally misunderstanding something. I've written a simple API using Flask (a Python library). This API, among other things, connects to a MySQL server running on a remote web server. I am using the SQLAlchemy library to perform this connection.
The connection string is quite simple. It looks like this:
db = create_engine('mysql+mysqlconnector://{user}:{password}@{host}:{port}/{database}'.format(user=Constants.Sql.USER, password=Constants.Sql.PASS, host=Constants.Sql.HOST, port=Constants.Sql.PORT, database=Constants.Sql.DATABASE))
connection = db.connect()
On my development machine this all works fine. However, when I deploy the api to a different remote machine, it doesn't work. I get the error:
sqlalchemy.exc.ProgrammingError: (ProgrammingError) 1045 (28000): Access denied for user 'user'@'domain' (using password: YES) None None
This doesn't make any sense to me because it is using exactly the same credentials (they are hard coded).
The working environment is a windows machine, the environment throwing the error is ubuntu 14.04. Both the windows and ubuntu machines are remote to the web server on which the database is running, so it can't be some weird localhost thing.
I am totally stumped with this. If anyone could give me some advice I'd really appreciate it!
Maybe the database only accepts connections from a particular IP address. That would explain why same username and password would succeed on one and fail on the other.
GRANT includes IP address information. Look at the MySQL documentation. Or this tutorial:
https://alvinalexander.com/blog/post/mysql/add-user-mysql/
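For example, a hypothetical grant that only admits one client address (pre-8.0 MySQL syntax); connecting from any other IP produces exactly this 1045 access-denied error:

GRANT ALL PRIVILEGES ON mydb.* TO 'user'@'203.0.113.5' IDENTIFIED BY 'secret';
FLUSH PRIVILEGES;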

Connecting to MS SQL Server using python on linux with 'Windows Credentials'

Is there any way to connect to an MS SQL Server database with python on linux using Windows Domain Credentials?
I can connect perfectly fine from my Windows machine using Windows Credentials, but attempting to do the same from Linux in Python with pyodbc + FreeTDS + unixODBC
>>> import pyodbc
>>> conn = pyodbc.connect("DRIVER={FreeTDS};SERVER=servername;UID=username;PWD=password;DATABASE=dbname")
results in this error:
<class 'pyodbc.Error'>: ('28000', '[28000] [unixODBC][FreeTDS][SQL Server]Login incorrect. (20014) (SQLDriverConnectW)')
I'm sure the password is written correctly, but I've tried many different combinations of username:
DOMAIN\username
DOMAIN\\username
or even
UID=username;DOMAIN=domain
to no avail. Any ideas?
As of at least March 2013, this seems to work out of the box with FreeTDS. I specified the TDS protocol version for good measure--not sure if that makes the difference:
connStr = "DRIVER={{FreeTDS}};SERVER={0};PORT=1433;TDS_Version=7.2;UID={1}\\{2};PWD={3}".format(hostname, active_directory_domain, username, password)
Integrated authentication also appears to be supported in Microsoft's official driver for linux: http://msdn.microsoft.com/en-us/library/hh568450.aspx . I'm not sure how many Linux distributions it actually works on or how much of the source is available. They explicitly mention RHEL 5 and 6 and some dependencies on the download page.
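With Microsoft's Linux driver, integrated authentication looks roughly like the sketch below; the driver name and host are illustrative, and it assumes you already hold a Kerberos ticket (e.g. via kinit):

import pyodbc

# No UID/PWD: Trusted_Connection=yes makes the driver use the Kerberos ticket.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost.example.com;DATABASE=mydb;"
    "Trusted_Connection=yes;"
)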
As pointed out in one of the comments, this answer is quite stale by now. I regularly and routinely use GSSAPI to authenticate from Linux to SQL Server 2008 R2 but mostly with the EasySoft ODBC manager and the (commercial) EasySoft ODBC SQL Server driver.
In early 2009, a colleague and I managed to connect to a SQL Server 2005 instance from Solaris 10 using GSSAPI (Kerberos credentials) via Perl DBI (DBD::Sybase) over a FreeTDS build linked against a particular version of the MIT Kerberos libraries. The trick was -- and this is a little bit difficult to believe, but I have verified it by looking through the FreeTDS source code -- to specify a zero-length user_name. If the length of the user_name string is 0, then the FreeTDS code will attempt to use GSSAPI (if that support has been compiled in). I have not been able to do this via Python and pyodbc, as I could not figure out a way of getting ODBC to pass down a zero-length user_name.
Here is the Perl code; there are multiple opportunities for breakage with respect to configuration files such as .freetds.conf etc. I seem to recall that the principal had to be in uppercase, but my notes seem to be in disagreement with that.
$serverprincipal = 'MSSQLSvc/foo.bar.yourdomain.com:1433@YOURDOMAIN.COM';
$dbh = DBI->connect("dbi:Sybase:server=THESERVERNAME;kerberos=$serverprincipal", '', '');
You will have to know how to use the setspn utility in order to get the SQL Server server to use the appropriate security principal name.
I do not have any knowledge of the kerberos side of things because our environment was set up by an out and out Kerberos guru and has fancy stuff like mutual trust set up between the AD domain that the SQL Server is running in and the Kerberos domain that my client was running in.
There is some code http://code.google.com/p/libsqljdbc-auth/ which does GSSAPI authentication from Linux to SQL Server but it is Java only. The author (who seems to know his stuff) also has contributed a similar patch to the jTDS project which works with more recent versions of Java that have GSSAPI built in.
So the pieces are all there; it is just a big tangled mess trying to get them all to work together. I found the pyodbc / unixODBC / FreeTDS ODBC / TDS chain pretty hard to trace/debug. The Perl stuff, because it was a pretty thin wrapper on top of CT-Lib, was much easier to get going.
Probably a bit too late to help you out, but I encountered the same issue. At the time of writing, the latest version of pyodbc allows me to log in with Windows credentials. Just leave the UID field blank in your connection string, like so:
cnxn = pyodbc.connect('DRIVER={SQL Server};SERVER=myserverinstance;DATABASE=mydatabase;UID=;PWD=mypassword')
Now this uses your existing Windows credentials when you're logged on... not sure how to specify arbitrary Windows domain credentials...
I haven't done it in a while, but I remember the whole unixodbc + FreeTDS + pyodbc thing being a little tricky. However, it can be done, and once setup it's not that hard.
This website provides very good instructions:
http://www.pauldeden.com/2008/12/how-to-setup-pyodbc-to-connect-to-mssql.html (archived copy on Web Archive)
Also, in my experience pyodbc had issues compiling/running on 64-bit Linux machines. Because of that we eventually used ceODBC. ceODBC isn't quite as stable as pyodbc (we encountered more unexpected bugs than with pyodbc when running a Python program), but it is very easy to get up and running on 64-bit Linux.
I don't believe you'll be able to log in with a Windows domain account this way. You need to set up a user directly in SQL Server to pass credentials in this manner.
