Can someone please explain how to set up dynamodb_mapper (together with boto?) to use ddbmock with the sqlite backend as an Amazon DynamoDB replacement for functional testing purposes?
Right now, I have tried out "plain" boto and managed to get it working with ddbmock (with sqlite) by starting the ddbmock server locally and connecting with boto like this:
db = connect_boto_network(host='127.0.0.1', port=6543)
...and then I use the db object for all operations against the database. However, dynamodb_mapper gets a db connection this way:
conn = ConnectionBorg()
As I understand it, this uses boto's default way of connecting to (the real) DynamoDB. So basically I'm wondering whether there is a (preferred?) way to get ConnectionBorg() to connect to my local ddbmock server, as I've done with boto above? Thanks for any suggestions.
Library Mode
In library mode rather than server mode:
import boto
from ddbmock import config
from ddbmock import connect_boto_patch
# Switch to the sqlite backend
config.STORAGE_ENGINE_NAME = 'sqlite'
# Define the database path; defaults to 'dynamo.db'
config.STORAGE_SQLITE_FILE = '/tmp/my_database.sqlite'
# Wire boto and ddbmock together
db = connect_boto_patch()
Any access to the DynamoDB service via boto will then use ddbmock under the hood.
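Once boto is patched this way, dynamodb_mapper's ConnectionBorg should work unchanged on top of it. A minimal sketch, untested, with a made-up model purely for illustration:
from dynamodb_mapper.model import DynamoDBModel, ConnectionBorg

# Hypothetical model, just for illustration
class Episode(DynamoDBModel):
    __table__ = u"episode"
    __hash_key__ = u"id"
    __schema__ = {u"id": int, u"name": unicode}

conn = ConnectionBorg()
# Goes through the patched boto, hence ddbmock's sqlite backend
conn.create_table(Episode, 10, 10, wait_for_active=True)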
Server Mode
If you still want to use ddbmock in server mode, I would try changing ConnectionBorg._shared_state['_region'] at the very beginning of the test setup code:
ConnectionBorg._shared_state['_region'] = RegionInfo(name='ddbmock', endpoint="localhost:6543")
As far as I understand, any access to DynamoDB via any ConnectionBorg instance after this line will use the ddbmock entry point.
That said, I've never tested it. I'll make sure the authors of ddbmock give an update on this.
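For reference, a hypothetical test setup along those lines might look like this (untested, as noted above):
from boto.regioninfo import RegionInfo
from dynamodb_mapper.model import ConnectionBorg

# Point ConnectionBorg at the local ddbmock server; Borg instances
# share state, so patching once is enough for all of them.
ConnectionBorg._shared_state['_region'] = RegionInfo(name='ddbmock',
                                                     endpoint='localhost:6543')
conn = ConnectionBorg()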
Related
I have a python application (dash and plotly) that I'm trying to run on AWS EBS. It ran successfully when I was pulling data from different APIs within the python app (tradingEconomics and fredapi).
I have since created an RDS database so I can pull from the APIs once, store the data, and access it there. I was able to successfully connect to the database (add to, and pull from) when running it locally via pymysql. How I am doing so is below:
import pandas as pd
from sqlalchemy import create_engine

# Note the '@' between the password and the host
engine = create_engine('mysql+pymysql://USERNAME:PASSWORD@WHAT_I_CALLED_DATABASE_INSTANCE_ON_RDS.ApPrOpRiAtEcHaRaCtErS.us-east-1.rds.amazonaws.com/NAME_OF_DATABASE_I_CREATED_ON_MYSQLWORKBENCH')
dbConnection = engine.connect()
returned_frame = pd.read_sql("SELECT * FROM nameOfTableICreatedOnMySQLWorkbench", dbConnection)
Like I said, this works when run locally. It does not work when trying to run on EBS. (I get an internal server error 500; based on the logs, it looks like this is the problem.)
I think I opened up the inbound and outbound permissions on the database instance on RDS as some have suggested, but perhaps I misunderstood. This did not work.
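For what it's worth, a bare socket test run from the EBS environment can show whether the endpoint is reachable at all; a timeout usually points at the security group. A minimal sketch (hypothetical endpoint, default MySQL port):
import socket

# Hypothetical RDS endpoint; replace with the real one
HOST = "WHAT_I_CALLED_DATABASE_INSTANCE_ON_RDS.us-east-1.rds.amazonaws.com"
try:
    sock = socket.create_connection((HOST, 3306), timeout=5)
    print("RDS endpoint reachable")
    sock.close()
except OSError as exc:
    print("Cannot reach RDS endpoint:", exc)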
If you have any further questions for clarification, feel free to ask.
Let's say that a partner and I are working on a Python application that needs to connect to a database, for example MySQL:
import mysql.connector
mydb = mysql.connector.connect(host="databaseURL", user="user", passwd="userPassword")
I am trying to hide my credentials from the code, but referencing them in any way (from a file, an environment variable, substitution, etc.) doesn't work, since my partner can simply print the value or read it from memory, and clearing memory isn't an option in my use case.
One idea I thought about is using a proxy that sits between the Python app and the database; this way I could connect to the db with proxy credentials instead. For example:
import mysql.connector
mydb = mysql.connector.connect(host="proxyURL", user="proxyUser", passwd="proxyPassword")
Basically, if the proxy credentials are valid, the proxy requests the actual credentials and uses them to connect to the database.
The difficulty I ran into is how to make a server listen for incoming JDBC connections, i.e. how to create a JDBC proxy. And is the approach even correct in the first place?
Thinking about it, I don't think that's even possible, since each database has its own JDBC driver that communicates with the database in its own way. This means I would need to write a different proxy for each database.
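That said, listening for incoming connections is the easy part: JDBC drivers just speak the database's wire protocol over TCP, so a plain byte-forwarding relay will accept them. A hypothetical sketch (it only relays bytes; substituting real credentials would require implementing the MySQL handshake, which is exactly the driver-specific part):
import socket
import threading

LISTEN_ADDR = ("0.0.0.0", 3307)        # where the app connects (hypothetical)
UPSTREAM_ADDR = ("databaseURL", 3306)  # the real MySQL server (hypothetical)

def pipe(src, dst):
    # Copy bytes one way until either side closes.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        src.close()
        dst.close()

def handle(client):
    upstream = socket.create_connection(UPSTREAM_ADDR)
    threading.Thread(target=pipe, args=(client, upstream)).start()
    threading.Thread(target=pipe, args=(upstream, client)).start()

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(LISTEN_ADDR)
server.listen(5)
while True:
    client, _ = server.accept()
    handle(client)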
I want to get the list of DBs on a remote server with a python script.
I know I can connect to a certain db with
import ibm_db
conn = ibm_db.connect("DATABASE=name;HOSTNAME=host;PORT=60000;PROTOCOL=TCPIP;"
                      "UID=username;PWD=password;", "", "")
However, I want to connect only to an instance and then run "db2 list db directory" to get the DB names.
Meaning, switch to the instance user and issue that command, or preferably use a Python module that can do just that. I only need the names, no real connection to a database.
The result should be an array with all database names in that instance.
Any ideas or help?
Thank you
Unfortunately, there is no such function in the python-ibmdb API, and actually not even in the full Db2 API. The only "workaround" I could think of would be a UDF deployed on the remote database that uses db2DbDirOpenScan to access the catalog and returns the info via the connection that is already established.
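If you can run commands on the server itself (e.g. over SSH as the instance user, with the Db2 environment sourced), a hedged workaround is to shell out to the CLP and parse its output; the exact output format can vary between Db2 versions:
import subprocess

def list_local_databases():
    # Requires the instance profile (db2profile) to be in effect.
    out = subprocess.check_output("db2 list db directory", shell=True)
    names = []
    for line in out.decode("utf-8", "replace").splitlines():
        line = line.strip()
        # Entries look like: "Database name                = SAMPLE"
        if line.startswith("Database name"):
            names.append(line.split("=", 1)[1].strip())
    return names

print(list_local_databases())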
I am not new to Python, NoSQL, or even AWS; however, I am 100% new to boto and DynamoDB. I completely understand everything I have read about DynamoDB, except how to set up a connection to the service from my local dev environment.
I tried a few things, but they fail, and when I search the web I keep getting v1 material.
Can somebody please give me a link or type up something that succinctly explains how to connect to the DynamoDB v2 interface?
Found this yesterday, a simple interface to DynamoDB for Python:
https://github.com/eykd/duo
import duo
# Connect
db = duo.DynamoDB(key='aws_key', secret='aws_secret')
table = db['my_hashkey_table']
item = table['new-item']
# Get
print item['foo']
# Set
item['bar'] = 'bar'
item.put()
I want to connect to the db available inside DynamoDB Local using the boto SDK. I followed the documentation at the link below:
http://boto.readthedocs.org/en/latest/dynamodb2_tut.html#dynamodb-local
This is the official documentation provided by Amazon, but when I execute the snippet from the document, I am unable to connect to the db and can't get the tables inside it. The db name is "dummy_us-east-1.db". My snippet is:
from boto.dynamodb2.layer1 import DynamoDBConnection
con = DynamoDBConnection(host='localhost', port=8000,
                         aws_access_key_id='dummy',
                         aws_secret_access_key='dummy',
                         is_secure=False)
print con.list_tables()
I have 8 tables inside the db, but I am getting an empty list after executing the list_tables() command.
output:
{u'TableNames':[]}
Instead of accessing the required database, it is creating and accessing a new database:
Old database : dummy_us-east-1.db
New database : dummy_localhost.db
How can I resolve this? Please give me some suggestions regarding DynamoDB Local access. Thanks in advance.
It sounds like you are connecting to DynamoDB Local with different credential/region combinations, and by default it keeps a separate database file for each (hence dummy_us-east-1.db vs dummy_localhost.db).
If so, you can start DynamoDB Local with the sharedDb flag to force it to use a single db file:
-sharedDb When specified, DynamoDB Local will use a
single database instead of separate databases
for each credential and region. As a result,
all clients will interact with the same set of
tables, regardless of their region and
credential configuration.
E.g.
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb
Here is the solution: this happens because DynamoDB Local wasn't started with its library path and the -sharedDb flag:
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb