I am not sure how to connect to a MongoDB database that uses an authentication database with mongoengine.
On the command line I need to run mongo hostname:27017/myApp -u "test" -p "test" --authenticationDatabase admin, but I don't see how to pass the equivalent to mongoengine so that it authenticates against the admin database while connecting to the myApp database for my models.
I believe this is where it's explained in the PyMongo guide:
https://api.mongodb.com/python/current/examples/authentication.html
>>> from pymongo import MongoClient
>>> client = MongoClient('example.com')
>>> db = client.the_database
>>> db.authenticate('user', 'password', source='source_database')
and I found the pull request that added this to mongoengine:
https://github.com/MongoEngine/mongoengine/pull/590/files
It looks like you just add authentication_source as an argument to connect like connect(authentication_source='admin'). It'd be nice if it was better documented.
http://docs.mongoengine.org/apireference.html?highlight=authentication_source
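Putting that together, here is a minimal sketch using the values from the question (assuming a mongoengine version recent enough to accept these keyword arguments):
from mongoengine import connect

# authenticate against the admin database, but use myApp for the models
connect(
    'myApp',
    host='hostname',
    port=27017,
    username='test',
    password='test',
    authentication_source='admin',
)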
According to the mongoengine connecting guide, the connect() method supports URI-style connections, i.e.
connect(
    'project1',
    host='mongodb://username:password@host1:port1/databaseName'
)
In that sense, you can also specify the authentication source database as below:
"mongodb://username:password#host1:port1/database?authSource=source_database"
See also MongoDB connection string URI for more MongoDB URI examples.
Also Authentication options through connection string
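For the setup in the question, that URI form would look roughly like this (a sketch that reuses the hostname and credentials from the question):
from mongoengine import connect

# authSource=admin tells the driver which database to authenticate against
connect(
    'myApp',
    host='mongodb://test:test@hostname:27017/myApp?authSource=admin',
)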
The API has been updated, so this is the right way to do it now:
connect('mydb',
        host="localhost",
        username="admin",
        password="secret",
        authentication_source='your_auth_db')
The solution suggested above doesn't work for me. What does work:
just add an authSource argument to the connect method, as you would with pymongo's MongoClient. Example:
connect('database_name', host='host', username="username",
        password="password", authSource='authentication_database_name')
Here is an easy solution that worked for me.
connect(db="database_name", host="localhost", port=27017, username="username",
password="password", authentication_source="admin")
Related
I need to connect to Snowflake using SQLAlchemy but the trick is, I need to authenticate using OAuth2. Snowflake documentation only describes connecting using username and password and this cannot be used in the solution I'm building. I can authenticate using Snowflake's python connector but I see no simple path how to glue it with SQLAlchemy. I'd like to know if there is a ready solution before I write a custom interface for this.
Use snowflake.connector.connect to create a PEP-249 Connection to the database - see documentation. Then use the creator param of create_engine (docs) - it takes a callable that returns a PEP-249 Connection. If you use it, the URL param is ignored.
Example code:
import snowflake.connector
from sqlalchemy import create_engine

def get_connection():
    # build a PEP-249 connection using Snowflake's OAuth authenticator
    return snowflake.connector.connect(
        user="<username>",
        host="<hostname>",
        account="<account_identifier>",
        authenticator="oauth",
        token="<oauth_access_token>",
        warehouse="test_warehouse",
        database="test_db",
        schema="test_schema"
    )

# the URL before "creator" is ignored, so a placeholder is fine
engine = create_engine("snowflake://not@used/db", creator=get_connection)
I got this working but just adding more params in the connection URL:
from sqlalchemy.engine import create_engine
import urllib.parse
connection_url = f"snowflake://{user}:#{account}/{database}/{schema}?warehouse={warehouse}&authenticator=oauth&token={urllib.parse.quote(access_token)}"
engine = create_engine(connection_url)
with engine.begin() as connection:
    print(connection.execute('select count(*) from lineitem').fetchone())
If you don't want to be constructing the URL on your own, you can use snowflake.sqlalchemy.URL like this:
from snowflake.sqlalchemy import URL

connection_url = URL(
    user=user,
    authenticator="oauth",
    token=access_token,
    host=host,
    account=account,
    warehouse=warehouse,
    database=database,
    schema=schema
)
I have been having major trouble connecting my Python shell to my Postgres database. I am doing this on Windows. I have installed psycopg2 and everything needed for this to work, however it still is not working.
import psycopg2
conn=psycopg2.connect("dbname = 'test' user ='postgres' host ='localhost' password = 'mypassword'")
It gives me an error telling me that the database "test" does not exist, however it does! If you guys have any advice at all on what I should test out, that would be amazing. Thank you!
You can lay out the connection parameters as a string and pass it to the connect() function, like so:
conn = psycopg2.connect("dbname=test user=postgres password=postgres")
Or you can use keyword arguments like
conn = psycopg2.connect(host="localhost",database="test", user="postgres", password="postgres")
If it still fails, you should check on the PostgreSQL side. Try to connect to the database in question from the command line and see whether the error reappears. If it does, something is missing on the DB server side.
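If you want to check from Python which databases the server actually knows about, a quick sketch like this can help (it connects to the built-in postgres maintenance database and assumes the same credentials as in the question):
import psycopg2

# connect to the default "postgres" database just to inspect the server
conn = psycopg2.connect(host="localhost", database="postgres", user="postgres", password="mypassword")
cur = conn.cursor()
# list all non-template databases on this server
cur.execute("SELECT datname FROM pg_database WHERE datistemplate = false")
print(cur.fetchall())
cur.close()
conn.close()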
I've recently changed my project to use SQLAlchemy and my project runs fine, it used an external MySQL server.
Now I'm trying to work with a different MySQL server with SSL CA, and it doesn't connect.
(It did connect using MySQL Workbench, so the certificate should be fine)
I'm using the following code:
ssl_args = {'ssl': {'ca': ca_path}}
engine = create_engine("mysql+pymysql://<user>:<pass>#<addr>/<schema>",
connect_args=ssl_args)
and I get the following error:
Can't connect to MySQL server on '<addr>' ([WinError 10054] An existing connection was forcibly closed by the remote host)
Any suggestions?
I changed the DBAPI to MySQL-Connector, and used the following code:
ssl_args = {'ssl_ca': ca_path}
engine = create_engine("mysql+mysqlconnector://<user>:<pass>#<addr>/<schema>",
connect_args=ssl_args)
And now it works.
If you just connect from a client machine with an SSL connection (so you don't have access to the cert and key), you can simply add ssl=true to your URI.
Edit:
For example:
mysql_db = "mysql+mysqlconnector://<user>:<pass>#<addr>/<schema>?ssl=true"
This is well covered in the official docs:
engine = create_engine(
    db_url,
    connect_args={
        "ssl": {
            "ssl_ca": "ca.pem",
            "ssl_cert": "client-cert.pem",
            "ssl_key": "client-key.pem"
        }
    }
)
Another solution is to use sqlalchemy.engine.url.URL to define the URL and pass it to create_engine.
import sqlalchemy
from sqlalchemy import create_engine

sqlUrl = sqlalchemy.engine.url.URL(
    drivername="mysql+pymysql",
    username=db_user,
    password=db_pass,
    host=db_host,
    port=3306,
    database=db_name,
    query={"ssl_ca": "main_app/certs/BaltimoreCyberTrustRoot.crt.pem"},
)

create_engine(sqlUrl)
You can include SSL parameters as a dictionary in the query argument.
This approach is useful if you are using Flask to initialize the SqlAlchemy engine with a config parameter like SQLALCHEMY_DATABASE_URI rather than directly using create_engine.
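As an alternative sketch for the Flask case (assuming Flask-SQLAlchemy 2.4 or newer, which forwards the SQLALCHEMY_ENGINE_OPTIONS dict to create_engine; the URI and CA path below are placeholders), the SSL options can also be supplied without building the URL by hand, mirroring the mysqlconnector setup that worked above:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
# placeholder URI; swap in your own user, password, host and schema
app.config['SQLALCHEMY_DATABASE_URI'] = 'mysql+mysqlconnector://user:password@host/schema'
# extra keyword arguments forwarded to create_engine(), including the SSL options
app.config['SQLALCHEMY_ENGINE_OPTIONS'] = {
    'connect_args': {'ssl_ca': '/path/to/ca.pem'}
}
db = SQLAlchemy(app)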
Is it possible to make SQLAlchemy do cross server joins?
If I try to run something like
engine = create_engine('mssql+pyodbc://SERVER/Database')
query = sql.text('SELECT TOP 10 * FROM [dbo].[Table]')
with engine.begin() as connection:
data = connection.execute(query).fetchall()
It works as I'd expect. If I change the query to select from [OtherServer].[OtherDatabase].[dbo].[Table] I get the error message "Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'".
Looks like there's an issue with how you authenticate to SQL server.
I believe you can connect using the current Windows user, the URI syntax is then mssql+pyodbc://SERVER/Database?trusted_connection=yes (I have never tested this, but give it a try).
Another option is to create a SQL server login (ie. a username/password that is defined within SQL server, NOT a Windows user) and use the SQL server login when you connect.
The database URI then becomes: mssql+pyodbc://username:password@SERVER/Database.
mssql+pyodbc://SERVER/Database?trusted_connection=yes threw an error when I tried it. It did point me in the right direction though.
from sqlalchemy import create_engine, sql
import urllib.parse

string = "DRIVER={SQL SERVER};SERVER=server;DATABASE=db;TRUSTED_CONNECTION=YES"
params = urllib.parse.quote_plus(string)
engine = create_engine('mssql+pyodbc:///?odbc_connect={0}'.format(params))

query = sql.text('SELECT TOP 10 * FROM [CrossServer].[database].[dbo].[Table]')

with engine.begin() as connection:
    data = connection.execute(query).fetchall()
It's quite complicated if you intend to work with different servers through one connection.
But if you need to run a query against a different server under different credentials, you should first add a linked server with sp_addlinkedserver, and then add credentials for the linked server with sp_addlinkedsrvlogin. Have you tried this?
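For illustration, a rough sketch of running those two procedures through SQLAlchemy could look like this (assuming sysadmin rights on the local server; the linked server name, remote login and password are placeholders):
from sqlalchemy import create_engine, text

# reuse the same pyodbc engine setup shown above (params built with quote_plus)
engine = create_engine('mssql+pyodbc:///?odbc_connect={0}'.format(params))

with engine.begin() as connection:
    # register the remote SQL Server instance as a linked server
    connection.execute(text(
        "EXEC sp_addlinkedserver @server = N'OtherServer', @srvproduct = N'SQL Server'"
    ))
    # map local logins to a SQL Server login defined on the remote server
    connection.execute(text(
        "EXEC sp_addlinkedsrvlogin @rmtsrvname = N'OtherServer', @useself = N'False', "
        "@locallogin = NULL, @rmtuser = N'remote_user', @rmtpassword = N'remote_password'"
    ))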
On the command line, this works:
$ mongo
> show dbs
mydatabase 1.0GB
However, this does not:
$ python
>>> import pymongo
>>> connection = pymongo.MongoClient()
>>> connection.mydatabase.find()
I read through docs here:
http://api.mongodb.org/python/current/tutorial.html
But do not understand how to either...
connect to an existing database (using pymongo)
query what databases exist in the mongodb connection.
Why can't I access my database?
Connect to an existing database
import pymongo
from pymongo import MongoClient
connection = MongoClient()
db = connection.mydatabase
List existing databases
import pymongo
from pymongo import MongoClient
connection = MongoClient()
# connection.database_names()  # deprecated
connection.list_database_names()
The question implies the user has a local MongoDB. However, I found this question while trying to connect to a remote MongoDB. I think the tutorial is worth mentioning (no other answer here mentions how to specify the host and the port).
The above code will connect on the default host and port. We can also specify the host and port explicitly, as follows:
client = MongoClient('localhost', 27017)
Or use the MongoDB URI format:
client = MongoClient('mongodb://localhost:27017/')
show dbs and find() are totally different commands, so you cannot compare the two.
connection.mydatabase.find()
Will actually do nothing because you cannot find() documents on database level. You are probably looking for:
cursor = connection.mydatabase.mycol.find()
I am no Python programmer, but it is something like that, and then you loop over the cursor variable to get your data.
As an added note you will want to replace mycol with the collection name that contains your documents.
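In Python that loop might look like this (a small sketch; mycol is a placeholder for your collection name):
from pymongo import MongoClient

connection = MongoClient()
cursor = connection.mydatabase.mycol.find()
# iterate the cursor to pull back the documents
for document in cursor:
    print(document)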
As for querying for a list of databases you can do something like:
databases = connection.admin.command('listDatabases')  # listDatabases runs against the admin database
As shown here: http://docs.mongodb.org/manual/reference/command/listDatabases/#listDatabases
However again I am no Python programmer but this should get you started.
On the python command line:
import pymongo
from pymongo import MongoClient
connection = MongoClient() ## connects by default to db at localhost:27017
connection.database_names() ## python binding equivalent to show dbs.
Although there doesn't seem to be a wealth of examples, it appears that the bindings are pretty complete within the Python Driver API Documentation.
database_names() is deprecated. One can use list_database_names() instead.
mongo_db_url will be something like "mongodb://localhost:27017/". 27017 is the default port number; replace it as appropriate.
from pymongo import MongoClient
client = MongoClient(<mongo_db_url>)
#or client = MongoClient('localhost', 27017)
client.list_database_names()