I'm doing the Python Eve tutorials on authentication and authorization. In the settings.py file it specifies this block of code:
# Let's just use the local mongod instance. Edit as needed.
# Please note that MONGO_HOST and MONGO_PORT could very well be left
# out as they already default to a bare bones local 'mongod' instance.
MONGO_HOST = 'localhost'
MONGO_PORT = 27017
# Skip these if your db has no auth. But it really should.
MONGO_USERNAME = '<your username>'
MONGO_PASSWORD = '<your password>'
But in the tutorial it sets up a separate 'accounts' collection in Mongo to handle users and accounts. It looks like it wants you to create a separate authentication process just for the Python application, rather than use Mongo's built-in user access features. Is this correct, and is there a reason for it?
It seems to me that hardcoding a Mongo database username and password into a file on a server isn't very secure, and that it would make more sense to use Mongo's own authentication as much as possible. I know PyMongo has some tools for working with Mongo users, but I wanted to check whether there was a good reason for doing it this way before diving into that.
Is it possible to use username, password and db in Redis?
The reason for this question is that in the official pyramid_redis_sessions documentation ( http://pyramid-redis-sessions.readthedocs.io/en/latest/gettingstarted.html ) the parameter...
redis.sessions.url = redis://username:password@localhost:6379/0
... (to use inside a Python/Pyramid production.ini, for example) suggests the use of a username, password and db.
However, I have not found anything that explains how to create a user and password tied to a specific db in Redis. The answer at https://stackoverflow.com/a/34781633/3223785 has some information about using a db (Redis).
It is possible to set a password ( https://stackoverflow.com/a/7548743/3223785 ), but that seems to apply to a different scope than the redis.sessions.url parameter.
NOTE: pyramid_redis_sessions provides an implementation of Pyramid's ISession interface, using Redis as its backend.
@Jacky
In Redis, the AUTH command is used to authenticate to the Redis server. Once a client is authenticated against a server, it can switch to any of the DBs configured on that server. There is no built-in authentication against a specific database.
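To illustrate, here is a minimal sketch using the redis-py client; the host, port, and password values are placeholders, and the password must match the requirepass value configured on the server. Note that Redis only gained usernames with the ACL system in Redis 6; before that, only the password part of a redis:// URL matters.
import redis

# AUTH is server-wide: the password is checked once per connection, and
# any numbered database can then be selected -- there are no per-db credentials.
r = redis.Redis(host='localhost', port=6379, db=0, password='yourpassword')
r.set('greeting', 'hello')
print(r.get('greeting'))  # b'hello'

# "Switching" databases just means connecting with another db index,
# using the same server-wide password.
r2 = redis.Redis(host='localhost', port=6379, db=1, password='yourpassword')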
I have a Python script that I'm running locally which has a password for another application embedded in an os.system call. I obfuscated my password by storing it in a DB that only I have access to and then using Windows auth to connect to the DB (because the script needs to be automated, I can't have a prompt for the PW).
With the above said, it occurred to me: couldn't someone just modify my script and print the pw variable to obtain my password? I'm working in a shared cloud environment where other developers would have access to my script. Is there any way to abstract it further so someone couldn't just modify my script and get the pw?
import os
import urllib.parse

import pandas as pd
import sqlalchemy as sa

# Specify the database and server used for reading data.
read_server = '~~SERVER_HERE~~'
read_database = '~~DATABASE_HERE~~'

# Establish a DB connection using Windows (trusted) authentication.
read_params = urllib.parse.quote_plus(
    "DRIVER={SQL Server};SERVER=" + read_server +
    ";DATABASE=" + read_database + ";TRUSTED_CONNECTION=Yes")
read_engine = sa.create_engine("mssql+pyodbc:///?odbc_connect=%s" % read_params)

# Read the PW from the DB and pull the single value out of the result frame.
pw_query = """ SELECT DISTINCT PW FROM ~~TABLENAME_HERE~~ """
pw = str(pd.read_sql_query(pw_query, con=read_engine, index_col=None).iloc[0, 0])

# Establish a connection to the Tableau server and log in with the password.
os.chdir(r"C:\tabcmd\Command Line Utility")
os.system(r'tabcmd login -s https://~~myURL~~ -u tabadmin -p {mypw}'.format(mypw=pw))

# Make sure you update the below workbook, site names, and destination directory.
os.system(r'tabcmd get "~~FILE_Location~~" -f "~~Destination_DIR~~"')
I'm using standard Python (CPython) and MS SQL Server.
There's no real way to protect your password if someone can modify the script.
However, if the shared cloud environment has separate users (i.e. logging in via SSH where each person has their own user on the server), then you can change the file permissions to restrict access to your code. If not, then I don't think this is possible.
Given that you are also hardcoding your database address and access details, nothing prevents others from simply connecting to your database, for example.
There are ways of obfuscating your code, but in the end there is no secure way to store your password, only ways that require more effort to extract it.
Also see https://crypto.stackexchange.com/questions/19959/is-python-a-secure-programming-language-for-cryptography
TL;DR: As long as somebody has access to your program or even its source code, a hardcoded password can be extracted, so in your case it would make sense to restrict access to that program.
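On POSIX systems, restricting access can be as simple as tightening the file mode; a minimal sketch (the filename is a placeholder):
import os
import stat

# Make the script readable, writable and executable by its owner only
# (equivalent to chmod 700). On Windows you'd use NTFS ACLs instead.
os.chmod('my_script.py', stat.S_IRWXU)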
I'm attempting to create a REST API that selects the appropriate Mongo database to write to, along with the correct collection. How do I have Eve select the database whose name matches a parameter, as well as the collection?
With the upcoming v0.6, Eve will natively support multiple Mongo instances.
New: Support for multiple MongoDB databases and/or servers.
You can have individual API endpoints served by different Mongo instances:
mongo_prefix resource setting allows overriding of the default MONGO prefix used when retrieving MongoDB settings from configuration. For example, set a resource mongo_prefix to MONGO2 to read/write from the database configured with that prefix in your settings file (MONGO2_HOST, MONGO2_DBNAME, etc.)
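As a hedged sketch, a settings file wired up for two instances might look like this, assuming the v0.6 behavior quoted above (hosts, database names, and resource names are placeholders):
# Default Mongo instance (MONGO prefix).
MONGO_HOST = 'localhost'
MONGO_PORT = 27017
MONGO_DBNAME = 'primary_db'

# Second Mongo instance (MONGO2 prefix).
MONGO2_HOST = 'other.host.com'
MONGO2_PORT = 27017
MONGO2_DBNAME = 'secondary_db'

DOMAIN = {
    'works': {},                            # served from the default db
    'archive': {'mongo_prefix': 'MONGO2'},  # served from MONGO2_*
}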
And/or you can use a different Mongo instance depending on the user hitting the database:
set_mongo_prefix() and get_mongo_prefix() have been added to BasicAuth class and derivates. These can be used to arbitrarily set the target database depending on the token/client performing the request.
A (very) naive implementation of per-user instance selection, taken from the docs:
from eve import Eve
from eve.auth import BasicAuth

class MyBasicAuth(BasicAuth):
    def check_auth(self, username, password, allowed_roles, resource, method):
        if username == 'user1':
            self.set_mongo_prefix('MONGO1')
        elif username == 'user2':
            self.set_mongo_prefix('MONGO2')
        else:
            # serve all other users from the default db.
            self.set_mongo_prefix(None)
        return username is not None and password == 'secret'

app = Eve(auth=MyBasicAuth)
app.run()
Also:
Database connections are cached in order not to lose performance. Also, this change only affects the MongoDB engine, so extensions currently targeting other databases should not need updates (they will not inherit this feature, however).
Hope this will cover your needs. It's currently on the development branch so you can already experiment/play with it.
Say you have parameters "dbname" and "collectionname", and a global MongoClient instance named "client":
collection = client[dbname][collectionname]
PyMongo's client supports the "[]" syntax for getting a database with a given name, and PyMongo's database supports "[]" for getting a collection.
Here's a more complete example with Flask:
from flask import Flask
from pymongo import MongoClient

app = Flask(__name__)
client = MongoClient()

@app.route('/<dbname>/<collection_name>')
def find_something(dbname, collection_name):
    return client[dbname][collection_name].find_one()
The good thing about this example is that it reuses one MongoClient throughout, so you get optimal performance and connection pooling. The bad thing, of course, is that it lets your users access any database and any collection, so you'd want to secure that somehow.
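One hedged sketch of such securing, assuming a hard-coded whitelist of reachable databases and collections and a recent Flask that can return dicts as JSON (the names below are placeholders):
from flask import abort

# Hypothetical whitelist: which collections each database exposes.
ALLOWED = {'appdb': {'users', 'posts'}}

@app.route('/api/<dbname>/<collection_name>')
def find_something_safe(dbname, collection_name):
    if collection_name not in ALLOWED.get(dbname, set()):
        abort(404)  # unknown db/collection pairs are simply not found
    # Project _id out so the result is JSON-serializable as-is.
    return client[dbname][collection_name].find_one({}, {'_id': False}) or {}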
I'm building a Flask app and started using Flask-Login for authentication. What troubles me is that Flask-Login calls the load_user callback for every request that Flask handles. Here is the example from https://flask-login.readthedocs.org/en/latest/#how-it-works:
@login_manager.user_loader
def load_user(userid):
    return User.get(userid)
To retrieve the user, I need to pass a session token to a remote web service across a VPN, and the remote web service does a db query -- this results in noticeable latency on every web request. My load_user looks something like this:
@login_manager.user_loader
def load_user(userid):
    # notice that I don't even use the userid arg
    try:
        # have to get session_token from session; why not
        # just get the entire user object from session???
        token = session.get('session_token')
        user_profile = RestClient().get_user_profile(token)
        return User(user_profile['LDAP_ID'])
    except Exception:
        return None
It seems like maybe I'm subverting the framework. I could just store/retrieve the user from the session, so why bother getting it from the web service? That option also seems to subvert Flask-Login, but it eliminates the latency and makes good use of the session.
The best way to handle this is to cache session information using something like memcached or redis (look into Flask-Cache for help).
You should have a key-value cache store that structures the cache like so:
key: sessionID
value: user object
This is what most frameworks do -- Flask-Login is a generic tool -- so you have to implement this yourself.
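A hedged sketch of that idea, assuming Flask-Caching (the maintained successor to Flask-Cache) with a local Redis backend; RestClient, User, and login_manager come from the question above:
from flask import session
from flask_caching import Cache

# Assumes cache.init_app(app) is called during app setup.
cache = Cache(config={'CACHE_TYPE': 'RedisCache'})

@login_manager.user_loader
def load_user(userid):
    token = session.get('session_token')
    if token is None:
        return None
    # Key the cache on the session token; hit the remote
    # web service only on a cache miss.
    user = cache.get(token)
    if user is None:
        profile = RestClient().get_user_profile(token)
        user = User(profile['LDAP_ID'])
        cache.set(token, user, timeout=300)  # cache for five minutes
    return user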
Incidentally, if you're looking for a way to abstract away that nasty LDAP stuff on the backend, you might want to check out https://stormpath.com -- they sync with LDAP servers, and provide a REST API on top of it. There's also a Flask library for interacting with it: Flask-Stormpath.
I just started using Fabric to better control the specific settings for test and deployment environments, and I'm trying to get an idea of the best approach to swapping configurations.
Let's say I have a module in my application that defines a simple database connection and some constants for authentication by default:
host = 'db.host.com'
user = 'someuser'
passw = 'somepass'
db = 'somedb'

class DB:
    def __init__(self, host=host, user=user, passw=passw, db=db, cursor='DictCursor'):
        # make a database connection here and all that jazz
        pass
Before I found Fabric, I would use the getfqdn() function from the socket library to check the domain name of the host the system was being pushed to, and then conditionalize the authentication credentials.
from socket import getfqdn

if getfqdn() == 'test.somedomain.com':
    host = 'db.host.com'
    user = 'someuser'
    passw = 'somepass'
    db = 'somedb'
elif getfqdn() == 'test.someotherdomain.com':
    host = 'db.other.com'
    user = 'otherguy'
    passw = 'otherpass'
    db = 'somedb'
This, for obvious reasons, is really not that great. What I would like to know is the smartest way of adapting something like this to Fabric, so that when the project gets pushed to a certain test/deployment server, these values are changed post-push.
I can think of a few approaches just from looking through the docs. Should I have a file that just defines the constants, which Fabric could write to via shell commands based on the deployment target, and which the module defining the database handler could then import? Does it make sense to run open and write from within the fabfile like this? I assume I'd also have to .gitignore these kinds of files so they don't get committed to the repo, and just rely on Fabric to deploy them.
I plan on adapting whatever approach is suggested to all of the configuration settings that I currently swap using getfqdn or adjust manually. Thanks!
You can key all of that off of env.host and then use something like the contrib upload_template function to render the conf file and push it up. Templates are best in these instances (see Puppet and other config managers as well).
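A hedged sketch of that approach, assuming Fabric 1.x (fabric.api, fabric.contrib.files) and a Jinja2-style db_settings.py.template kept next to the fabfile; host names and remote paths are placeholders:
from fabric.api import env, task
from fabric.contrib.files import upload_template

# Per-host settings, keyed off the deployment target in env.host.
DB_SETTINGS = {
    'test.somedomain.com': {
        'host': 'db.host.com', 'user': 'someuser',
        'passw': 'somepass', 'db': 'somedb',
    },
    'test.someotherdomain.com': {
        'host': 'db.other.com', 'user': 'otherguy',
        'passw': 'otherpass', 'db': 'somedb',
    },
}

@task
def deploy_db_conf():
    # Render the template with the values for the current host and push
    # the result up; keep the rendered file out of git via .gitignore.
    upload_template('db_settings.py.template',
                    '/srv/app/db_settings.py',
                    context=DB_SETTINGS[env.host],
                    use_jinja=True)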