Real-Time Firebase Query - Python

I have a Firebase DB with the below structure:
and the rules are:
I'm using a Python script to save and read DB data, and I would like to find the "d1" value.
For example, the code below returns null:
ref = db.reference('Test')
query = ref.order_by_child('f1').equal_to('alal-55')
snapshot = query.get()
for key, val in snapshot.items():
    print(val)
Any solution?
Regards,

The query ref.order_by_child('f1').equal_to('alal-55') will not work because the structure of your database is too deep: order_by_child matches against a direct child of each record. You need to flatten the structure to be able to perform such queries, for example:
Test
  random_id
    f1 : value
Using the above structure, you can use order_by_child.
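As a plain-Python illustration (no Firebase connection; the dict below is a hypothetical stand-in for the flattened `Test` node), this is the shape of match that order_by_child('f1').equal_to(...) performs, comparing a direct child of each record:

```python
# Stand-in for the flattened 'Test' node: each random_id maps to a flat
# record with 'f1' and 'd1' as direct children (sample values are made up).
test_node = {
    'id1': {'f1': 'alal-55', 'd1': 'value-a'},
    'id2': {'f1': 'other',   'd1': 'value-b'},
}

def equal_to(node, child, value):
    """Mimic order_by_child(child).equal_to(value): keep the records
    whose direct child `child` equals `value`."""
    return {k: v for k, v in node.items() if v.get(child) == value}

snapshot = equal_to(test_node, 'f1', 'alal-55')
for key, val in snapshot.items():
    print(key, val['d1'])  # → id1 value-a
```

With the original deeply nested layout, 'f1' is not a direct child of each record under 'Test', so the comparison never matches and the query returns null.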


DynamoDB FilterExpression with multiple condition python and boto3

I need help writing filter expressions for scanning data in DynamoDB tables using Python and boto3.
See my code below.
For some reason unknown to me, the search filter below is not giving me the right results.
Please advise.
import boto3
from boto3.dynamodb.conditions import Key

dynamo_db = boto3.resource('dynamodb')
table = dynamo_db.Table(TABLE_NAME)
my_kwargs = {
    'FilterExpression': Key('column_1').eq(val_type_1) and Key("column_2").eq("val_type_string")
}
response = table.scan(**my_kwargs)
items = response['Items']
table_item = items[0]
When you use Scan you do not filter on a Key. Your filter here is on an attribute, so you will need to change Key to Attr. Also note that Python's `and` does not combine two condition expressions; conditions are combined with the `&` operator.
Furthermore, you will need to implement pagination if you are scanning more than 1 MB of data:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Scan.html

How do I make a query in cloudant-python selecting a specific field?

I'm trying to make a query that selects only one of the fields/attributes. The query should result in something like an array of nameField or numberField values, or even a single variable; I want to separate the values so I can manipulate them in the app.
I already tried using "query = cloudant.query.Query(myDatabaseDemo, fields=['nameField'])", but the result when I print query is just "fields=['nameField']".
from cloudant.result import Result, ResultByKey
...
client = Cloudant(serviceUsername, servicePassword, url=serviceURL)
myDatabaseDemo = client.create_database(databaseName)
...
result_collection = Result(myDatabaseDemo.all_docs, include_docs=True)
print("document:\n{0}\n".format(result_collection[0]))
The actual code results in a set of values, like a JSON document.
Have you tried using Cloudant Query? It allows you to specify fields (a list of fields to be returned by the query).

GET data using requests than insert into DB2

Currently I am trying to retrieve JSON data from an API and store it in a database. I am able to retrieve the JSON data as a list, and I am able to connect to and query the DB2 database. My issue is that I cannot figure out how to generate an INSERT statement for the data retrieved from the API. The application is only for short-term personal use, so SQL injection attacks are not a concern. Overall, I need to generate an SQL INSERT statement from a list. My current code is below, with the API URL and info changed.
import ibm_db
import requests

ibm_db_conn = ibm_db.connect("DATABASE=node1;HOSTNAME=100.100.100.100;PORT=50000;PROTOCOL=TCPIP;UID=username;PWD=password;", "", "")
api_request = requests.get("http://api-url/resource?api_key=123456",
                           auth=('user#api.com', 'password'))
api_code = api_request.status_code
api_data = api_request.json()
print(api_code)
print(api_data)
This depends on the format of the JSON returned and on what your table looks like. Note that requests already parses the body for you, so api_request.json() gives you a Python object directly; the stdlib json module is only needed if you want to parse the raw text yourself:
import json
#...
api_data = api_request.json()            # already parsed into a Python object
# api_data = json.loads(api_request.text)  # equivalent, parsing by hand
Now you have a Python object you can access like normal:
api_data["key"][2]
for instance. You can iterate, slice, or do whatever else to extract the data you want. Say your JSON represented rows to be inserted:
query = "INSERT INTO <table> VALUES\n"
rows = []
for row in api_data:
    # build one "(v1, v2, ...)" tuple literal per JSON row
    rows.append("(%s)" % ", ".join(str(v) for v in row))
query += ",\n".join(rows)
Again, this will vary greatly depending on the format of your table and JSON, but that's the general idea I'd start with.
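A sketch of the safer pattern, using parameter markers instead of string-building the SQL. sqlite3 stands in for the DB2 connection here, and the table/column names and sample data are made up; with ibm_db the same idea applies (ibm_db.prepare with `?` markers, then ibm_db.execute per row):

```python
import sqlite3

# sqlite3 stands in for DB2; the pattern is one INSERT with `?`
# parameter markers, executed once per JSON row.
api_data = [[1, 'alpha'], [2, 'beta']]  # stand-in for the parsed JSON list

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE my_table (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO my_table (id, name) VALUES (?, ?)",
                 [tuple(row) for row in api_data])
print(conn.execute("SELECT * FROM my_table ORDER BY id").fetchall())
# → [(1, 'alpha'), (2, 'beta')]
```

Even when injection is not a concern, parameter markers also sidestep quoting and type-conversion bugs that hand-built VALUES strings tend to hit.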

django substitute dictionary with a faster JSON serializer

I'm using a Django raw queryset to select data from the database.
I need a translation (using ugettext) on a field before I return this JSON-serialized data to django rest_framework as an API.
However, I'm having an optimization issue: I found that it takes quite a while to manually append dictionaries to a list, especially if I have a lot of database rows.
After some searching I found the library ujson, which claims to serialize JSON faster. However, I'm struggling to use it, as I need this raw query to return the translated name of a field (fruits).
Does anyone have an idea how to replace this dictionary method with a faster way to serialize JSON data?
all_fruits = []
activate("en")
raw_query = MyObject.objects.raw("select id, fruits from my_table")
for each_name in raw_query:
    json_obj = dict(id=each_name.id,
                    fruits=ugettext(each_name.fruits))
    all_fruits.append(json_obj)
It's better to avoid raw SQL unless you really have no other solution; Django's QuerySet API is full-featured and can express this query. The following could fit your needs (note that .values() yields dictionaries, so the fields are accessed by key, not by attribute):
all_fruits = []
activate("en")
for obj in MyObject.objects.values("id", "fruits"):
    all_fruits.append({"id": obj["id"], "fruits": ugettext(obj["fruits"])})
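As for the serializer itself, ujson exposes the same dumps/loads surface as the stdlib json module, so once the list of dicts is built it can be swapped in directly. A sketch with placeholder data (falling back to stdlib json if ujson isn't installed):

```python
try:
    import ujson as json  # faster C serializer, same dumps/loads interface
except ImportError:
    import json  # stdlib fallback if ujson is not installed

# Placeholder for the list built from the queryset above.
all_fruits = [{"id": 1, "fruits": "apple"}, {"id": 2, "fruits": "pear"}]
payload = json.dumps(all_fruits)
print(payload)
```

Note that the per-row translation with ugettext has to happen in Python either way; the serializer swap only speeds up the final dumps step.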

Efficiently doing thousands of lookups in a single table via django's ORM?

I have two tables. Table A is in my local database. Table B is a series of records behind a REST API. I need to pull the data down from table B, find whether it has a corresponding entry in my local DB, and then modify that local entry with some additional information.
In pseudo python, it looks something like this:
all_records = my_rest_api.get_everything()
for record in all_records:
    try:
        u = User.objects.get(
            name=record.name,
            other_thing=record.thing,
            some_more_params=record.other
        )
    except User.DoesNotExist:
        pass
It takes an incredibly long time to do all of the lookups, since we're doing a unique lookup on each iteration of the loop. Is there a good way to deal with this kind of pattern? Should I try processing these in batches so I can have giant where-clause-y queries?
some_records = all_records[:100]
query = [Q(dynamically built query) for criteria in some_records]
Users.objects.filter(query)
# and so on..
Or is the first option my best bet?
You are close. The use of Q() for creating complex OR queries is the key. Try something like this:
import operator
from functools import reduce  # reduce is not a builtin in Python 3

qs = []
for r in all_records:
    qs.append(Q(name=r.name, thing=r.thing, more=r.more))

Users.objects.filter(reduce(operator.or_, qs))
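If a single OR over thousands of Q objects produces an unmanageably large query, the batching idea from the question can be combined with the same reduce: a small sketch of the chunking helper (the Django call is left commented, since it assumes the models above):

```python
import operator
from functools import reduce

def chunked(seq, size):
    """Yield successive size-length slices of seq, so one enormous OR
    query can be split into a few bounded ones."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

print(list(chunked([1, 2, 3, 4, 5], 2)))  # → [[1, 2], [3, 4], [5]]

# for batch in chunked(all_records, 500):  # 500 is an arbitrary batch size
#     qs = [Q(name=r.name, thing=r.thing, more=r.more) for r in batch]
#     users = Users.objects.filter(reduce(operator.or_, qs))
```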
