Google Sheets API - Python - Constructing Body for BatchUpdate

I need to create the body for multiple updates to a Google Spreadsheet using Python.
I tried using a Python dict(), but that doesn't work when the same keys repeat for multiple updates, since a dict cannot hold duplicate keys.
My code snippet is:
body = {}
for i in range(0, len(deltaListcolNames)):
    rangeItem = deltaListcolNames[i]
    batch_input_value = deltaListcolVals[i]
    body["range"] = rangeItem
    body["majorDimension"] = "ROWS"
    body["values"] = "[[" + str(batch_input_value) + "]]"

batch_update_values_request_body = {
    # How the input data should be interpreted.
    'value_input_option': 'USER_ENTERED',
    # The new values for the input sheet... to apply to the spreadsheet.
    'data': [
        dict(body)
    ]
}
print(batch_update_values_request_body)

request = service.spreadsheets().values().batchUpdate(
    spreadsheetId=spreadsheetId,
    body=batch_update_values_request_body)
response = request.execute()

Thanks for the answer, Graham.
I doubled back, moved away from the single-dict approach, and found that building a small grid for each value let me construct the data structure. Here is how I coded it; perhaps a bit quirky, but it works nicely:
range_value_data_list = []
width = 1
height = 1
for i in range(0, len(deltaListcolNames)):
    rangeItem = deltaListcolNames[i]
    # print(" the value for rangeItem is : ", rangeItem)
    batch_input_value = str(deltaListcolVals[i])
    print(" the value for batch_input_value is : ", batch_input_value)
    # construct the data structure for the value
    grid = [[None] * width for _ in range(height)]
    grid[0][0] = batch_input_value
    range_value_item_str = {'range': rangeItem, 'values': grid}
    range_value_data_list.append(range_value_item_str)
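For completeness, a minimal sketch of how range_value_data_list can then be plugged into the request body shown in the question (assuming service and spreadsheetId are set up as above):

# each item in range_value_data_list already carries its own 'range' and 'values'
batch_update_values_request_body = {
    'value_input_option': 'USER_ENTERED',
    'data': range_value_data_list,
}
response = service.spreadsheets().values().batchUpdate(
    spreadsheetId=spreadsheetId,
    body=batch_update_values_request_body).execute()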

Review the documentation for the Python client library methods: the data portion is a list of dict objects.
So your construct is close; you just need a loop that fills the data list:
data = []
for i in range(0, len(deltaListcolNames)):
    body = {}
    # fill out the body
    rangeItem = deltaListcolNames[i]
    ....
    # Add this update's body to the array with the other update bodies.
    data.append(body)

# build the rest of the request
...

# send the request
...
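Filling in that skeleton, one possible version looks like the following (a sketch assuming the same deltaListcolNames / deltaListcolVals lists as in the question; note that each values entry must be a list of rows, not a string):

data = []
for i in range(0, len(deltaListcolNames)):
    body = {
        'range': deltaListcolNames[i],
        'majorDimension': 'ROWS',
        # 'values' is a list of rows, each row a list of cell values
        'values': [[deltaListcolVals[i]]],
    }
    data.append(body)

batch_update_values_request_body = {
    'value_input_option': 'USER_ENTERED',
    'data': data,
}

The request is then sent with service.spreadsheets().values().batchUpdate(...) exactly as in the question.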

Related

Python: How to extract data for all indexes into a single function

I am new to Python and currently learning the language. I am trying to build a web scraper that exports data to a CSV. I have the data I want and have downloaded it to a CSV, but so far I have only managed to dump the data for one index, and I want to dump the data for all indexes into the same CSV to form a database.
The problem is that I can only request one company by indicating its index, for example n_empresa[0], which gives me the data for the first item in the list. What I want is to get the data for all indexes in the same function and then dump it with pandas into a CSV so I can build a database.
I'm stuck at this point and don't know how to proceed. Can you help me, please?
This is the function:
# imports and session setup assumed by this snippet (not shown in the original)
import requests
from lxml import html

session = requests.Session()

def datos_directorio(n_empresa):
    r = session.get(n_empresa[0])
    home = r.content.decode('UTF-8')
    tree = html.fromstring(home)

    descripcion_direccion_empresas = '//p[@class = "paragraph"][2]//text()[normalize-space()]'
    nombre_e = '//h1[@class ="mb3 h0 bold"][normalize-space()]/text()'
    email = '//div[@class = "inline-block mb1 mr1"][3]/a[@class = "mail button button-inverted h4"]/text()[normalize-space()]'
    teléfono = '//div[@class = "inline-block mb1 mr1"][2]/a[@class = "tel button button-inverted h4"]/text()[normalize-space()]'

    d_empresas = tree.xpath(descripcion_direccion_empresas)
    d_empresas = " ".join(d_empresas)

    empresas_n = tree.xpath(nombre_e)
    empresas_n = " ".join(empresas_n[0].split())

    email_e = tree.xpath(email)
    email_e = " ".join(email_e[0].split())

    teléfono_e = tree.xpath(teléfono)
    teléfono_e = " ".join(teléfono_e[0].split())

    contenido = {
        'EMPRESA': empresas_n,
        'EMAIL': email_e,
        'TELÉFONO': teléfono_e,
        'CONTACTO Y DIRECCIÓN': d_empresas
    }
    return contenido
Best regards.
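One way to proceed, sketched under the assumption that n_empresa is the full list of company URLs described above: loop over the URLs, reuse datos_directorio as written (it reads index 0, so each URL is wrapped in a one-element list), collect the dictionaries, and dump them with pandas.

import pandas as pd

resultados = []
for url in n_empresa:                      # n_empresa: the full list of company URLs
    resultados.append(datos_directorio([url]))   # the function reads element 0 of what it receives

df = pd.DataFrame(resultados)
df.to_csv('empresas.csv', index=False)     # hypothetical output filename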

How to convert a Hive schema to a BigQuery schema using Python?

What I get from the API:
"name":"reports"
"col_type":"array<struct<imageUrl:string,reportedBy:string>>"
So in the Hive schema I got:
reports array<struct<imageUrl:string,reportedBy:string>>
Note: I get the Hive array schema as a string from the API.
My target:
bigquery.SchemaField("reports", "RECORD", mode="NULLABLE",
fields=(
bigquery.SchemaField('imageUrl', 'STRING'),
bigquery.SchemaField('reportedBy', 'STRING')
)
)
Note: I would like to create universal code that can handle any number of fields inside the struct in the array.
Any tips are welcome.
I tried creating a script that parses your input, which is reports array<struct<imageUrl:string,reportedBy:string>>. It converts the input into a dictionary that can be used as the schema when creating a table. The main idea of the approach is that instead of using SchemaField(), you can create a dictionary, which is much easier than constructing SchemaField() objects with parameters from your example input.
NOTE: The script was only tested on your input, and it can parse more fields if they are added inside struct<.
import re
from google.cloud import bigquery

def is_even(number):
    return (number % 2) == 0

def clean_string(str_value):
    return re.sub(r'[\W_]+', '', str_value)

def convert_to_bqdict(api_string):
    """
    This only works for a struct with multiple fields.
    This could give you an idea on constructing a schema dict for BigQuery.
    """
    num_even = True
    main_dict = {}
    struct_dict = {}
    field_arr = []
    schema_arr = []

    # Hard coded the split since not sure what the string will look like if there are more inputs
    init_struct = api_string.split(' ')
    main_dict["name"] = init_struct[0]
    main_dict["type"] = "RECORD"
    main_dict["mode"] = "NULLABLE"

    cont_struct = init_struct[1].split('<')
    num_elem = len(cont_struct)

    # parse fields inside of struct<
    for i in range(0, num_elem):
        num_even = is_even(i)
        # fields are seen on even indices
        if num_even and i != 0:
            temp = list(filter(None, cont_struct[i].split(',')))  # remove blank elements
            for elem in temp:
                fields = list(filter(None, elem.split(':')))
                struct_dict["name"] = clean_string(fields[0])
                # "type" works for STRING as of the moment; refer to
                # https://cloud.google.com/bigquery/docs/schemas#standard_sql_data_types
                # for the accepted data types
                struct_dict["type"] = clean_string(fields[1]).upper()
                struct_dict["mode"] = "NULLABLE"
                field_arr.append(struct_dict)
                struct_dict = {}

    main_dict["fields"] = field_arr  # assign the list of fields
    schema_arr.append(main_dict)
    return schema_arr

sample = "reports array<struct<imageUrl:string,reportedBy:string,newfield:bool>>"
bq_dict = convert_to_bqdict(sample)

client = bigquery.Client()
project = client.project
dataset_ref = bigquery.DatasetReference(project, '20211228')
table_ref = dataset_ref.table("20220203")
table = bigquery.Table(table_ref, schema=bq_dict)
table = client.create_table(table)
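Tracing convert_to_bqdict by hand on the sample string should yield roughly the following schema list (a hand-derived illustration, not captured program output):

[{'name': 'reports', 'type': 'RECORD', 'mode': 'NULLABLE',
  'fields': [{'name': 'imageUrl', 'type': 'STRING', 'mode': 'NULLABLE'},
             {'name': 'reportedBy', 'type': 'STRING', 'mode': 'NULLABLE'},
             {'name': 'newfield', 'type': 'BOOL', 'mode': 'NULLABLE'}]}]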

Processing API data (JSON) into a single data frame (list of lists of dictionaries)?

This is somewhat of a continuation of a previous post of mine, except now I have API data to work with. I am trying to get the keys Type and Email as columns in a data frame to come up with a final number. My code:
jsp_full = []
for p in payloads:
    payload = {"payload": {"segmentId": p}}
    r = requests.post(url, headers=header, json=payload)
    # print(r, r.reason)
    time.sleep(r.elapsed.total_seconds())
    json_data = r.json() if r and r.status_code == 200 else None
    json_keys = json_data['payload']['supporters']
    json_package = []
    jsp_full.append(json_package)
    for row in json_keys:
        SID = row['supporterId']
        Handle = row['contacts']
        a_key = 'value'
        list_values = [a_list[a_key] for a_list in Handle]
        string = str(list_values).split(",")
        data = {
            'SupporterID': SID,
            'Email': strip_characters(string[-1]),
            'Type': labels(p)
        }
        json_package.append(data)
    t2 = round(time.perf_counter(), 2)
    b_key = "Email"
    e = len([b_list[b_key] for b_list in json_package])
    t = str(labels(p))
    # print(json_package)
    print(f'There are {e} emails in the {t} segment')
    print(f'Finished in {t2 - t1} seconds')
    excel = pd.DataFrame(json_package)
    excel.to_excel(r'C:\Users\am\Desktop\email parsing\{0} segment {1}.xlsx'.format(t, str(today)), sheet_name=t)
This part works all well and good. Each payload in the API represents a different segment of people so I split them out into different files. However, I am at a point where I need to combine all records into a single data frame hence why I append out to jsp_full. This is a list of a list of dictionaries.
Once I have that I would run the balance of my code which is like this:
S= pd.DataFrame(jsp_full[0], index = {0})
Advocacy_Supporters = S.sort_values("Type").groupby("Type", as_index=False)["Email"].first()
print(Advocacy_Supporters['Email'].count())
print("The number of Unique Advocacy Supporters is :")
Advocacy_Supporters_Group = Advocacy_Supporters.groupby("Type")["Email"].nunique()
print(Advocacy_Supporters_Group)
Some sample data:
[{'SupporterID': '565f6a2f-c7fd-4f1b-bac2-e33976ef4306', 'Email': 'somebody#somewhere.edu', 'Type': 'd_Student Ambassadors'}, {'SupporterID': '7508dc12-7647-4e95-a8b8-bcb067861faf', 'Email': 'someoneelse#email.somewhere.edu', 'Type': 'd_Student Ambassadors'}, ...
My desired output is a dataframe that looks like so:
SupporterID Email Type
565f6a2f-c7fd-4f1b-bac2-e33976ef4306 somebody#somewhere.edu d_Student Ambassadors
7508dc12-7647-4e95-a8b8-bcb067861faf someoneelse#email.somewhere.edu d_Student Ambassadors
Any help is greatly appreciated!!
Because this code creates an Excel file for each segment, all I did was read the Excel files back in via a for loop, like so:
filesnames = ['e_S Donors', 'b_Contributors', 'c_Activists', 'd_Student Ambassadors', 'a_Volunteers', 'f_Offline Action Takers']
S = pd.DataFrame()
for i in filesnames:
    data = pd.read_excel(r'C:\Users\am\Desktop\email parsing\{0} segment {1}.xlsx'.format(i, str(today)), sheet_name=i, engine='openpyxl')
    S = S.append(data)
This did the trick since it was in a format I already wanted.
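Alternatively, a sketch of building the combined frame straight from jsp_full without re-reading the Excel files (assuming jsp_full is the list of lists of dictionaries built above):

import itertools
import pandas as pd

# flatten the list of lists of dicts into one flat list of dicts
all_records = list(itertools.chain.from_iterable(jsp_full))
S = pd.DataFrame(all_records)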

Python: appending to data using json.dumps

I am trying to create a JSON file using json.dumps, and printing it works.
I have a question.
The format I wanted is built like this:
from collections import OrderedDict

channel_info = OrderedDict()
table = OrderedDict()
table2 = OrderedDict()
channel_info["KIND1"] = pkind[2].text
table[ptime[10].text] = pnk[11].text
table[ptime[11].text] = pnk[12].text
channel_info["TABLE1"] = table
channel_info["KIND2"] = pkind[2].text
table2[ptime[10].text] = pnk[11].text
table2[ptime[11].text] = pnk[12].text
channel_info["TABLE2"] = table2
result:
{
"KIND1": "xxxx",
"TABLE1": {
"09:10": "aaaa",
"10:10": "bbbb"
},
"KIND2": "yyyy",
"TABLE2": {
"09:10": "cccc",
"10:10": "dddd"
}
}
How can I output the same format using a while loop? The names of the JSON objects are KIND1, TABLE1, KIND2, TABLE2, and so on; I wonder how these names can be generated dynamically inside a while loop.
Thank you.
You could do something like this (assuming the table dictionary's contents are the same on each iteration, as they seem to be in the example you give):
channel_info = dict()
# n_tables is the number of iterations you need
for i in range(n_tables):
    table = dict()
    channel_info["KIND%s" % (i + 1)] = pkind[1].text
    table[ptime[10].text] = pnk[11].text
    table[ptime[11].text] = pnk[12].text
    channel_info["TABLE%s" % (i + 1)] = table
You don't need the table name dynamically since you assign it to a dictionary key.
Basically, if I understood your question right:
...
i = 0
no_of_tables = 4
while i < no_of_tables:
    table_counter = str(i + 1)
    kind_key = 'KIND' + table_counter
    table_key = 'TABLE' + table_counter
    table = dict()
    channel_info[kind_key] = pkind[2].text
    table[ptime[10].text] = pnk[11].text
    table[ptime[11].text] = pnk[12].text
    channel_info[table_key] = table
    i += 1
Note: I know it can be optimized, but for the sake of simplicity I left it as is.
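Since the goal is a JSON file, the resulting channel_info can then be serialized with json.dumps; a minimal sketch (the output filename is hypothetical):

import json

with open('channel_info.json', 'w', encoding='utf-8') as f:
    # ensure_ascii=False keeps any non-ASCII text readable in the file
    f.write(json.dumps(channel_info, indent=4, ensure_ascii=False))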

Manipulating CSV data stored as a string in Python

I have an API call that responds with an XML page and has my data stored as CSV inside the "data" tag (I can request it in JSON format, but I haven't been able to handle the data correctly in my Python script in that format).
<reports.getAccountsStatsResponse xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:com:gigya:api" xsi:schemaLocation="urn:com:gigya:api http://socialize-api.gigya.com/schema">
<statusCode>200</statusCode>
<errorCode>0</errorCode>
<statusReason>OK</statusReason>
<callId>ae1b3f13ba1c4e62ad3120afb1269c76</callId>
<time>2015-09-01T09:01:46.511Z</time>
<headers>
<header>date</header>
<header>initRegistrations</header>
<header>registrations</header>
<header>siteLogins</header>
<header>newUsers</header>
</headers>
<data xmlns:q1="http://www.w3.org/2001/XMLSchema" xsi:type="q1:string">
"date","initRegistrations","registrations","siteLogins","newUsers" "01/01/2015","0","0","0","0" "01/02/2015","0","0","0","0" "01/03/2015","0","0","0","0" "01/04/2015","0","0","0","0" "01/05/2015","0","0","0","0" "01/06/2015","0","0","0","0" "01/07/2015","0","0","0","0" "01/08/2015","0","0","0","0" "01/09/2015","0","0","0","0" "01/10/2015","0","0","0","0" "01/11/2015","0","0","0","0" "01/12/2015","0","0","0","0" "01/13/2015","0","0","0","0" "01/14/2015","0","0","0","0" "01/15/2015","0","0","0","0" "01/16/2015","0","0","0","0" "01/17/2015","0","0","0","0" "01/18/2015","0","0","0","0" "01/19/2015","0","0","0","0" "01/20/2015","34","34","72","34" "01/21/2015","33","23","58","23" "01/22/2015","19","19","49","19" "01/23/2015","21","21","50","21" "01/24/2015","1","1","2","1" "01/25/2015","0","0","0","0" "01/26/2015","8","4","49","4" "01/27/2015","8","8","35","8" "01/28/2015","4","2","16","2" "01/29/2015","7","7","27","7" "01/30/2015","69","58","516","58" "01/31/2015","9","6","76","6" "02/01/2015","0","0","2","0" "02/02/2015","304","203","2317","203" "02/03/2015","122","93","786","93" "02/04/2015","69","47","435","47" "02/05/2015","93","64","677","64" "02/06/2015","294","255","1327","255" "02/07/2015","0","0","0","0" "02/08/2015","0","0","0","0" "02/09/2015","0","0","3","0" "02/10/2015","1","0","1","0" "02/11/2015","3","3","7","3" "02/12/2015","0","0","0","0" "02/13/2015","2","2","4","2" "02/14/2015","0","0","1","0" "02/15/2015","0","0","0","0" "02/16/2015","0","0","0","0" "02/17/2015","3","3","7","3" "02/18/2015","0","0","0","0" "02/19/2015","1","1","3","1" "02/20/2015","3","3","10","3" "02/21/2015","0","0","0","0" "02/22/2015","0","0","1","0" "02/23/2015","1","1","4","1" "02/24/2015","0","0","1","0" "02/25/2015","0","0","0","0" "02/26/2015","0","0","0","0" "02/27/2015","0","0","1","0" "02/28/2015","1","1","2","1" "03/01/2015","1","1","3","1" "03/02/2015","19","9","348","9" "03/03/2015","14","9","132","9" "03/04/2015","4","4","41","4" "03/05/2015","8","5","101","5" "03/06/2015","6","5","71","5" "03/07/2015","8","4","42","4" "03/08/2015","7","4","45","4" "03/09/2015","5","4","30","4" "03/10/2015","7","7","39","7" "03/11/2015","9","9","41","9" "03/12/2015","1","1","20","1" "03/13/2015","3","3","26","3" "03/14/2015","2","0","21","0" "03/15/2015","3","3","28","3" "03/16/2015","3","3","38","3" "03/17/2015","4","4","43","4" "03/18/2015","5","3","45","3" "03/19/2015","19","16","108","16" "03/20/2015","11","8","96","8" "03/21/2015","276","261","807","261" "03/22/2015","197","192","604","192" "03/23/2015","0","0","3","0" "03/24/2015","1","1","4","1" "03/25/2015","181","166","401","166" "03/26/2015","124","109","265","109" "03/27/2015","53","47","124","47" "03/28/2015","41","39","99","39" "03/29/2015","75","65","173","65" "03/30/2015","249","239","536","239" "03/31/2015","222","212","487","212" "04/01/2015","40","29","394","29" "04/02/2015","16","10","132","10" "04/03/2015","13","10","125","10" "04/04/2015","6","4","49","4" "04/05/2015","2","1","46","1" "04/06/2015","4","3","38","3" "04/07/2015","1","0","32","0" "04/08/2015","4","2","16","2" "04/09/2015","9","8","30","8" "04/10/2015","31","29","96","29" "04/11/2015","17","14","90","14" "04/12/2015","10","7","46","7" "04/13/2015","19","13","69","13" "04/14/2015","63","58","199","58" "04/15/2015","17","16","58","16" "04/16/2015","13","12","41","12" "04/17/2015","7","5","51","5" "04/18/2015","51","46","165","46" "04/19/2015","51","45","179","45" "04/20/2015","28","21","110","21" "04/21/2015","32","24","290","24" "04/22/2015","47","31","329","31" 
"04/23/2015","30","27","183","27" "04/24/2015","71","65","284","65" "04/25/2015","25","17","268","17" "04/26/2015","26","24","268","24" "04/27/2015","72","67","172","67" "04/28/2015","28","25","96","25" "04/29/2015","72","48","159","48" "04/30/2015","50","22","136","22" "05/01/2015","33","23","126","23" "05/02/2015","22","17","112","17" "05/03/2015","31","21","169","21" "05/04/2015","29","21","182","21" "05/05/2015","12","10","24","10" "05/06/2015","369","354","790","354" "05/07/2015","409","401","839","401" "05/08/2015","258","253","539","253" "05/09/2015","227","221","469","221" "05/10/2015","138","134","297","134" "05/11/2015","14","13","32","13" "05/12/2015","57","24","452","24" "05/13/2015","23","12","300","12" "05/14/2015","7","5","70","5" "05/15/2015","7","6","15","6" "05/16/2015","3","3","7","3" "05/17/2015","3","3","8","3" "05/18/2015","2","4","4","2" "05/19/2015","10","16","24","8" "05/20/2015","4","8","10","4" "05/21/2015","7","12","14","6" "05/22/2015","9","14","33","7" "05/23/2015","9","14","19","7" "05/24/2015","16","32","39","16" "05/25/2015","11","9","21","7" "05/26/2015","23","16","87","16" "05/27/2015","30","24","87","24" "05/28/2015","12","12","39","12" "05/29/2015","14","12","37","12" "05/30/2015","8","7","19","7" "05/31/2015","5","4","17","4" "06/01/2015","10","10","31","10" "06/02/2015","23","20","95","20" "06/03/2015","11","9","31","9" "06/04/2015","14","13","36","13" "06/05/2015","12","11","27","11" "06/06/2015","8","6","20","6" "06/07/2015","9","9","21","9" "06/08/2015","16","16","37","16" "06/09/2015","24","17","40","17" "06/10/2015","8","8","34","8" "06/11/2015","46","27","464","27" "06/12/2015","45","23","383","23" "06/13/2015","12","9","143","9" "06/14/2015","22","15","112","15" "06/15/2015","14","13","74","13" "06/16/2015","63","56","197","56" "06/17/2015","28","25","114","25" "06/18/2015","17","15","85","15" "06/19/2015","143","135","460","135" "06/20/2015","54","46","217","46" "06/21/2015","60","55","211","55" "06/22/2015","91","78","249","78" "06/23/2015","99","87","295","87" "06/24/2015","115","103","315","103" "06/25/2015","455","380","964","380" "06/26/2015","585","489","1144","489" "06/27/2015","345","300","695","300" "06/28/2015","349","320","783","320" "06/29/2015","113","98","362","98" "06/30/2015","128","113","424","113" "07/01/2015","115","99","277","99" "07/02/2015","73","65","323","65" "07/03/2015","22","16","184","16" "07/04/2015","13","12","69","12" "07/05/2015","15","12","71","12" "07/06/2015","31","25","107","25" "07/07/2015","15","10","63","10" "07/08/2015","16","12","60","12" "07/09/2015","35","32","103","32" "07/10/2015","22","19","72","19" "07/11/2015","7","7","25","7" "07/12/2015","4","4","27","4" "07/13/2015","81","73","195","73" "07/14/2015","60","53","157","53" "07/15/2015","44","40","115","40" "07/16/2015","40","40","112","40" "07/17/2015","27","23","64","23" "07/18/2015","15","11","56","11" "07/19/2015","19","14","63","14" "07/20/2015","21","17","48","17" "07/21/2015","11","10","30","10" "07/22/2015","13","12","40","12" "07/23/2015","9","6","43","6" "07/24/2015","9","8","32","8" "07/25/2015","8","5","20","5" "07/26/2015","20","18","64","18" "07/27/2015","15","14","80","14" "07/28/2015","9","8","48","8" "07/29/2015","21","13","88","13" "07/30/2015","9","5","92","5" "07/31/2015","4","3","81","3" "08/01/2015","4","3","23","3" "08/02/2015","11","5","29","5" "08/03/2015","19","17","50","17" "08/04/2015","15","10","32","10" "08/05/2015","14","9","31","9" "08/06/2015","26","5","338","5" "08/07/2015","22","13","182","13" 
"08/08/2015","9","7","72","7" "08/09/2015","7","4","58","4" "08/10/2015","17","14","88","14" "08/11/2015","23","17","100","17" "08/12/2015","20","20","62","20" "08/13/2015","23","21","81","21" "08/14/2015","30","26","136","26" "08/15/2015","12","7","59","7" "08/16/2015","12","8","61","8" "08/17/2015","68","46","331","46" "08/18/2015","72","48","327","48" "08/19/2015","149","75","542","75" "08/20/2015","95","59","358","59" "08/21/2015","93","54","342","54" "08/22/2015","69","40","300","40" "08/23/2015","150","103","505","103" "08/24/2015","39","30","105","30"
</data>
</reports.getAccountsStatsResponse>
And in JSON format:
{
"statusCode": 200,
"errorCode": 0,
"statusReason": "OK",
"callId": "99949da72d034b04ba910c91704ba4c0",
"time": "2015-09-01T09:19:30.569Z",
"headers": [
"date",
"initRegistrations",
"registrations",
"siteLogins",
"newUsers"
],
"data": "\"date\",\"initRegistrations\",\"registrations\",\"siteLogins\",\"newUsers\"\r\n\"01/01/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/02/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/03/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/04/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/05/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/06/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/07/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/08/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/09/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/10/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/11/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/12/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/13/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/14/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/15/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/16/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/17/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/18/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/19/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/20/2015\",\"34\",\"34\",\"72\",\"34\"\r\n\"01/21/2015\",\"33\",\"23\",\"58\",\"23\"\r\n\"01/22/2015\",\"19\",\"19\",\"49\",\"19\"\r\n\"01/23/2015\",\"21\",\"21\",\"50\",\"21\"\r\n\"01/24/2015\",\"1\",\"1\",\"2\",\"1\"\r\n\"01/25/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/26/2015\",\"8\",\"4\",\"49\",\"4\"\r\n\"01/27/2015\",\"8\",\"8\",\"35\",\"8\"\r\n\"01/28/2015\",\"4\",\"2\",\"16\",\"2\"\r\n\"01/29/2015\",\"7\",\"7\",\"27\",\"7\"\r\n\"01/30/2015\",\"69\",\"58\",\"516\",\"58\"\r\n\"01/31/2015\",\"9\",\"6\",\"76\",\"6\"\r\n\"02/01/2015\",\"0\",\"0\",\"2\",\"0\"\r\n\"02/02/2015\",\"304\",\"203\",\"2317\",\"203\"\r\n\"02/03/2015\",\"122\",\"93\",\"786\",\"93\"\r\n\"02/04/2015\",\"69\",\"47\",\"435\",\"47\"\r\n\"02/05/2015\",\"93\",\"64\",\"677\",\"64\"\r\n\"02/06/2015\",\"294\",\"255\",\"1327\",\"255\"\r\n\"02/07/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/08/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/09/2015\",\"0\",\"0\",\"3\",\"0\"\r\n\"02/10/2015\",\"1\",\"0\",\"1\",\"0\"\r\n\"02/11/2015\",\"3\",\"3\",\"7\",\"3\"\r\n\"02/12/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/13/2015\",\"2\",\"2\",\"4\",\"2\"\r\n\"02/14/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/15/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/16/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/17/2015\",\"3\",\"3\",\"7\",\"3\"\r\n\"02/18/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/19/2015\",\"1\",\"1\",\"3\",\"1\"\r\n\"02/20/2015\",\"3\",\"3\",\"10\",\"3\"\r\n\"02/21/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/22/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/23/2015\",\"1\",\"1\",\"4\",\"1\"\r\n\"02/24/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/25/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/26/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/27/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/28/2015\",\"1\",\"1\",\"2\",\"1\"\r\n\"03/01/2015\",\"1\",\"1\",\"3\",\"1\"\r\n\"03/02/2015\",\"19\",\"9\",\"348\",\"9\"\r\n\"03/03/2015\",\"14\",\"9\",\"132\",\"9\"\r\n\"03/04/2015\",\"4\",\"4\",\"41\",\"4\"\r\n\"03/05/2015\",\"8\",\"5\",\"101\",\"5\"\r\n\"03/06/2015\",\"6\",\"5\",\"71\",\"5\"\r\n\"03/07/2015\",\"8\",\"4\",\"42\",\"4\"\r\n\"03/08/2015\",\"7\",\"4\",\"45\",\"4\"\r\n\"03/09/2015\",\"5\",\"4\",\"30\",\"4\"\r\n\"03/10/2015\",\"7\",\"7\",\"39\",\"7\"\r\n\"03/11/2015\",\"9\",\"9\",\"41\",\"9\"\r\n\"03/12/2015\",\"1\",\"1\",\"20\",\"1\"\r\n\"03/13/2015\",\"3\",\"3\",\"26\",\"3\"\r\n\"03/14/2015\",\"2\",\"0\",\"21\",\"0\"\r\n\"03/15/2015\",\"3\",\"3\",\"28\",\"3\"\r\n\"03/16/2015\",\"3\",\"3\",\"38\",\"3\"\r\n\"03/17/2015\",\"4\",\"4\",\"43\",\"4\"\r\n\"03/18/2015\",\"5\",\"3\",\"45\",\"3\"\r\n\"03/19/2015\",\"19\",\"16\",\"108\",\"16\"\r\n\"03/20/2015\",\"11\",\"8\",\"96\",\"8\"\r\n\"03/21/2015\",\"276\",\"261\",\"807\",\"261\"\r\n\"03/22/
2015\",\"197\",\"192\",\"604\",\"192\"\r\n\"03/23/2015\",\"0\",\"0\",\"3\",\"0\"\r\n\"03/24/2015\",\"1\",\"1\",\"4\",\"1\"\r\n\"03/25/2015\",\"181\",\"166\",\"401\",\"166\"\r\n\"03/26/2015\",\"124\",\"109\",\"265\",\"109\"\r\n\"03/27/2015\",\"53\",\"47\",\"124\",\"47\"\r\n\"03/28/2015\",\"41\",\"39\",\"99\",\"39\"\r\n\"03/29/2015\",\"75\",\"65\",\"173\",\"65\"\r\n\"03/30/2015\",\"249\",\"239\",\"536\",\"239\"\r\n\"03/31/2015\",\"222\",\"212\",\"487\",\"212\"\r\n\"04/01/2015\",\"40\",\"29\",\"394\",\"29\"\r\n\"04/02/2015\",\"16\",\"10\",\"132\",\"10\"\r\n\"04/03/2015\",\"13\",\"10\",\"125\",\"10\"\r\n\"04/04/2015\",\"6\",\"4\",\"49\",\"4\"\r\n\"04/05/2015\",\"2\",\"1\",\"46\",\"1\"\r\n\"04/06/2015\",\"4\",\"3\",\"38\",\"3\"\r\n\"04/07/2015\",\"1\",\"0\",\"32\",\"0\"\r\n\"04/08/2015\",\"4\",\"2\",\"16\",\"2\"\r\n\"04/09/2015\",\"9\",\"8\",\"30\",\"8\"\r\n\"04/10/2015\",\"31\",\"29\",\"96\",\"29\"\r\n\"04/11/2015\",\"17\",\"14\",\"90\",\"14\"\r\n\"04/12/2015\",\"10\",\"7\",\"46\",\"7\"\r\n\"04/13/2015\",\"19\",\"13\",\"69\",\"13\"\r\n\"04/14/2015\",\"63\",\"58\",\"199\",\"58\"\r\n\"04/15/2015\",\"17\",\"16\",\"58\",\"16\"\r\n\"04/16/2015\",\"13\",\"12\",\"41\",\"12\"\r\n\"04/17/2015\",\"7\",\"5\",\"51\",\"5\"\r\n\"04/18/2015\",\"51\",\"46\",\"165\",\"46\"\r\n\"04/19/2015\",\"51\",\"45\",\"179\",\"45\"\r\n\"04/20/2015\",\"28\",\"21\",\"110\",\"21\"\r\n\"04/21/2015\",\"32\",\"24\",\"290\",\"24\"\r\n\"04/22/2015\",\"47\",\"31\",\"329\",\"31\"\r\n\"04/23/2015\",\"30\",\"27\",\"183\",\"27\"\r\n\"04/24/2015\",\"71\",\"65\",\"284\",\"65\"\r\n\"04/25/2015\",\"25\",\"17\",\"268\",\"17\"\r\n\"04/26/2015\",\"26\",\"24\",\"268\",\"24\"\r\n\"04/27/2015\",\"72\",\"67\",\"172\",\"67\"\r\n\"04/28/2015\",\"28\",\"25\",\"96\",\"25\"\r\n\"04/29/2015\",\"72\",\"48\",\"159\",\"48\"\r\n\"04/30/2015\",\"50\",\"22\",\"136\",\"22\"\r\n\"05/01/2015\",\"33\",\"23\",\"126\",\"23\"\r\n\"05/02/2015\",\"22\",\"17\",\"112\",\"17\"\r\n\"05/03/2015\",\"31\",\"21\",\"169\",\"21\"\r\n\"05/04/2015\",\"29\",\"21\",\"182\",\"21\"\r\n\"05/05/2015\",\"12\",\"10\",\"24\",\"10\"\r\n\"05/06/2015\",\"369\",\"354\",\"790\",\"354\"\r\n\"05/07/2015\",\"409\",\"401\",\"839\",\"401\"\r\n\"05/08/2015\",\"258\",\"253\",\"539\",\"253\"\r\n\"05/09/2015\",\"227\",\"221\",\"469\",\"221\"\r\n\"05/10/2015\",\"138\",\"134\",\"297\",\"134\"\r\n\"05/11/2015\",\"14\",\"13\",\"32\",\"13\"\r\n\"05/12/2015\",\"57\",\"24\",\"452\",\"24\"\r\n\"05/13/2015\",\"23\",\"12\",\"300\",\"12\"\r\n\"05/14/2015\",\"7\",\"5\",\"70\",\"5\"\r\n\"05/15/2015\",\"7\",\"6\",\"15\",\"6\"\r\n\"05/16/2015\",\"3\",\"3\",\"7\",\"3\"\r\n\"05/17/2015\",\"3\",\"3\",\"8\",\"3\"\r\n\"05/18/2015\",\"2\",\"4\",\"4\",\"2\"\r\n\"05/19/2015\",\"10\",\"16\",\"24\",\"8\"\r\n\"05/20/2015\",\"4\",\"8\",\"10\",\"4\"\r\n\"05/21/2015\",\"7\",\"12\",\"14\",\"6\"\r\n\"05/22/2015\",\"9\",\"14\",\"33\",\"7\"\r\n\"05/23/2015\",\"9\",\"14\",\"19\",\"7\"\r\n\"05/24/2015\",\"16\",\"32\",\"39\",\"16\"\r\n\"05/25/2015\",\"11\",\"9\",\"21\",\"7\"\r\n\"05/26/2015\",\"23\",\"16\",\"87\",\"16\"\r\n\"05/27/2015\",\"30\",\"24\",\"87\",\"24\"\r\n\"05/28/2015\",\"12\",\"12\",\"39\",\"12\"\r\n\"05/29/2015\",\"14\",\"12\",\"37\",\"12\"\r\n\"05/30/2015\",\"8\",\"7\",\"19\",\"7\"\r\n\"05/31/2015\",\"5\",\"4\",\"17\",\"4\"\r\n\"06/01/2015\",\"10\",\"10\",\"31\",\"10\"\r\n\"06/02/2015\",\"23\",\"20\",\"95\",\"20\"\r\n\"06/03/2015\",\"11\",\"9\",\"31\",\"9\"\r\n\"06/04/2015\",\"14\",\"13\",\"36\",\"13\"\r\n\"06/05/2015\",\"12\",\"11\",\"27\",\"11\"\r\n\"06/06/2015\",\"8\",\"6\",\"20\",\"6\"\r\n\"06/07/2015\",\"9\",\"9\",\"
21\",\"9\"\r\n\"06/08/2015\",\"16\",\"16\",\"37\",\"16\"\r\n\"06/09/2015\",\"24\",\"17\",\"40\",\"17\"\r\n\"06/10/2015\",\"8\",\"8\",\"34\",\"8\"\r\n\"06/11/2015\",\"46\",\"27\",\"464\",\"27\"\r\n\"06/12/2015\",\"45\",\"23\",\"383\",\"23\"\r\n\"06/13/2015\",\"12\",\"9\",\"143\",\"9\"\r\n\"06/14/2015\",\"22\",\"15\",\"112\",\"15\"\r\n\"06/15/2015\",\"14\",\"13\",\"74\",\"13\"\r\n\"06/16/2015\",\"63\",\"56\",\"197\",\"56\"\r\n\"06/17/2015\",\"28\",\"25\",\"114\",\"25\"\r\n\"06/18/2015\",\"17\",\"15\",\"85\",\"15\"\r\n\"06/19/2015\",\"143\",\"135\",\"460\",\"135\"\r\n\"06/20/2015\",\"54\",\"46\",\"217\",\"46\"\r\n\"06/21/2015\",\"60\",\"55\",\"211\",\"55\"\r\n\"06/22/2015\",\"91\",\"78\",\"249\",\"78\"\r\n\"06/23/2015\",\"99\",\"87\",\"295\",\"87\"\r\n\"06/24/2015\",\"115\",\"103\",\"315\",\"103\"\r\n\"06/25/2015\",\"455\",\"380\",\"964\",\"380\"\r\n\"06/26/2015\",\"585\",\"489\",\"1144\",\"489\"\r\n\"06/27/2015\",\"345\",\"300\",\"695\",\"300\"\r\n\"06/28/2015\",\"349\",\"320\",\"783\",\"320\"\r\n\"06/29/2015\",\"113\",\"98\",\"362\",\"98\"\r\n\"06/30/2015\",\"128\",\"113\",\"424\",\"113\"\r\n\"07/01/2015\",\"115\",\"99\",\"277\",\"99\"\r\n\"07/02/2015\",\"73\",\"65\",\"323\",\"65\"\r\n\"07/03/2015\",\"22\",\"16\",\"184\",\"16\"\r\n\"07/04/2015\",\"13\",\"12\",\"69\",\"12\"\r\n\"07/05/2015\",\"15\",\"12\",\"71\",\"12\"\r\n\"07/06/2015\",\"31\",\"25\",\"107\",\"25\"\r\n\"07/07/2015\",\"15\",\"10\",\"63\",\"10\"\r\n\"07/08/2015\",\"16\",\"12\",\"60\",\"12\"\r\n\"07/09/2015\",\"35\",\"32\",\"103\",\"32\"\r\n\"07/10/2015\",\"22\",\"19\",\"72\",\"19\"\r\n\"07/11/2015\",\"7\",\"7\",\"25\",\"7\"\r\n\"07/12/2015\",\"4\",\"4\",\"27\",\"4\"\r\n\"07/13/2015\",\"81\",\"73\",\"195\",\"73\"\r\n\"07/14/2015\",\"60\",\"53\",\"157\",\"53\"\r\n\"07/15/2015\",\"44\",\"40\",\"115\",\"40\"\r\n\"07/16/2015\",\"40\",\"40\",\"112\",\"40\"\r\n\"07/17/2015\",\"27\",\"23\",\"64\",\"23\"\r\n\"07/18/2015\",\"15\",\"11\",\"56\",\"11\"\r\n\"07/19/2015\",\"19\",\"14\",\"63\",\"14\"\r\n\"07/20/2015\",\"21\",\"17\",\"48\",\"17\"\r\n\"07/21/2015\",\"11\",\"10\",\"30\",\"10\"\r\n\"07/22/2015\",\"13\",\"12\",\"40\",\"12\"\r\n\"07/23/2015\",\"9\",\"6\",\"43\",\"6\"\r\n\"07/24/2015\",\"9\",\"8\",\"32\",\"8\"\r\n\"07/25/2015\",\"8\",\"5\",\"20\",\"5\"\r\n\"07/26/2015\",\"20\",\"18\",\"64\",\"18\"\r\n\"07/27/2015\",\"15\",\"14\",\"80\",\"14\"\r\n\"07/28/2015\",\"9\",\"8\",\"48\",\"8\"\r\n\"07/29/2015\",\"21\",\"13\",\"88\",\"13\"\r\n\"07/30/2015\",\"9\",\"5\",\"92\",\"5\"\r\n\"07/31/2015\",\"4\",\"3\",\"81\",\"3\"\r\n\"08/01/2015\",\"4\",\"3\",\"23\",\"3\"\r\n\"08/02/2015\",\"11\",\"5\",\"29\",\"5\"\r\n\"08/03/2015\",\"19\",\"17\",\"50\",\"17\"\r\n\"08/04/2015\",\"15\",\"10\",\"32\",\"10\"\r\n\"08/05/2015\",\"14\",\"9\",\"31\",\"9\"\r\n\"08/06/2015\",\"26\",\"5\",\"338\",\"5\"\r\n\"08/07/2015\",\"22\",\"13\",\"182\",\"13\"\r\n\"08/08/2015\",\"9\",\"7\",\"72\",\"7\"\r\n\"08/09/2015\",\"7\",\"4\",\"58\",\"4\"\r\n\"08/10/2015\",\"17\",\"14\",\"88\",\"14\"\r\n\"08/11/2015\",\"23\",\"17\",\"100\",\"17\"\r\n\"08/12/2015\",\"20\",\"20\",\"62\",\"20\"\r\n\"08/13/2015\",\"23\",\"21\",\"81\",\"21\"\r\n\"08/14/2015\",\"30\",\"26\",\"136\",\"26\"\r\n\"08/15/2015\",\"12\",\"7\",\"59\",\"7\"\r\n\"08/16/2015\",\"12\",\"8\",\"61\",\"8\"\r\n\"08/17/2015\",\"68\",\"46\",\"331\",\"46\"\r\n\"08/18/2015\",\"72\",\"48\",\"327\",\"48\"\r\n\"08/19/2015\",\"149\",\"75\",\"542\",\"75\"\r\n\"08/20/2015\",\"95\",\"59\",\"358\",\"59\"\r\n\"08/21/2015\",\"93\",\"54\",\"342\",\"54\"\r\n\"08/22/2015\",\"69\",\"40\",\"300\",\"40\"\r\n\"08/23/2015\",\"150\",\"103\",
\"505\",\"103\"\r\n\"08/24/2015\",\"39\",\"30\",\"105\",\"30\"\r\n"
}
Firstly, I would like to store the text from the "data" tag by referencing its name, but I've currently only had success using the following:
# imports assumed by this snippet (not shown in the original)
import requests
from xml.etree import ElementTree

response = requests.get(url)
root = ElementTree.fromstring(response.content)
dataString = root[6].text
Is there a separate command to be able to specify the name of the tag?
Next, my goal is to loop through different URLs (which correspond to different accounts) and append the name of each account to the end of its data. Is this possible, given that the data is stored as a string and I would need to add it to the end of each row? As a follow-up, what is the best convention for saving multiple values in a variable so I can loop through them, i.e. the list of accounts?
Apologies if this is unclear, I'm happy to provide any more information if it means anybody can help.
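On the first question, a sketch of selecting the element by name rather than by index: the document uses the default namespace urn:com:gigya:api, so the tag has to be qualified when searching.

ns = {'gigya': 'urn:com:gigya:api'}
dataString = root.find('gigya:data', ns).text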
As far as I understand, you have a specific URL for each user and you want to collect data for all of the given users.
However, since you cannot get the username out of the response, you have to combine the response with the username corresponding to the URL the request was sent to. You could use a dictionary to store the data from your response, since the JSON format maps naturally onto a Python dictionary.
The code below iterates through a set of tuples containing the different usernames and the corresponding URLs. For each URL a request is sent, the data is extracted from the JSON-formatted response, and it is stored in a dictionary with the username as the key. Each such dictionary is then merged (.update()) into a main dictionary containing all of the collected datasets.
import requests

# replace names 'url_xyz' with corresponding names and url
users = {('Albert', 'url_albert'), ('Steven', 'url_steven'), ('Mike', 'url_mike')}
all_data = dict()
for name, url in users:
    response = requests.get(url)
    data = response.json()['data'].replace('\"', '')
    all_data.update({name: data})
Thank you, Albert.
Your JSON suggestion let me handle the data in a much better way. The code below is what I ended up with to get my desired output. Now I just need to work out how to convert the date from MM/DD/YYYY into DD/MM/YYYY.
startDate = '2015-01-01'   # Must be in format YYYY-MM-DD
endDate = '2015-12-31'     # Must be in format YYYY-MM-DD
dimensions = 'date'        # Available dimensions are 'date' and 'cid'
format = 'json'
dataFormat = 'json'
measures = 'initRegistrations,registrations,siteLogins,newUsers'
allData = []

# Construct API URL
for i in range(0, len(apiKey)):
    url = ('https://reports.eu1.gigya.com/reports.getAccountsStats?secret=' + secret + '&apiKey=' + apiKey[i] +
           '&uid=' + uid + '&startDate=' + startDate + '&endDate=' + endDate + '&dimensions=' + dimensions +
           '&measures=' + measures + '&format=' + format + '&dataFormat=' + dataFormat)
    response = requests.get(url)
    json = response.json()
    data = json['data']
    if i == 0:
        headers = json['headers']
        headers.append('brand')
        for x in range(0, len(data)):
            data[x].append(brand[i])
        brandData = [headers] + data
    else:
        for x in range(0, len(data)):
            data[x].append(brand[i])
        brandData = data
    allData += brandData

with open("testDataJSON.csv", "wb") as f:
    writer = csv.writer(f)
    writer.writerows(allData)
I don't know how well this follows best practice for Python but as I said, I am very new to it.
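For the remaining date question, a minimal sketch of converting MM/DD/YYYY strings to DD/MM/YYYY with the standard library (shown here on a single value; in the code above it would be applied to each row's date column):

from datetime import datetime

def us_to_uk_date(date_str):
    # parse MM/DD/YYYY and re-emit as DD/MM/YYYY
    return datetime.strptime(date_str, '%m/%d/%Y').strftime('%d/%m/%Y')

print(us_to_uk_date('08/24/2015'))  # -> 24/08/2015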
