Python complete newbie, JSON formatting

I have never used Python before, but am trying to use it (due to some restrictions in another, proprietary language) to retrieve some values from a web service and return them in JSON format to a home automation processor. The relevant section of code below prints:
[u'Name:London', u'Mode:Auto', u'Name:Ling', u'Mode:Away']
["Name:London", "Mode:Auto", "Name:Ling", "Mode:Away"]
…which isn't valid JSON. I am sure this is a really dumb question, but I have searched here and haven't found an answer that helps me. Apologies if I missed something obvious, but can anyone tell me what I need to do to ensure the json.dumps command outputs data in the correct format?
CresData = []
for i in range(0, j):
    r = requests.get('http://xxxxxx.com/WebAPI/emea/api/v1/location/installationInfo?userId=%s&includeTemperatureControlSystems=True' % UserID, headers=headers)
    CresData.append("Name:" + r.json()[i]['locationInfo']['name'])
    r = requests.get('http://xxxxxx.com/WebAPI/emea/api/v1/location/%s/status?includeTemperatureControlSystems=True' % r.json()[i]['locationInfo']['locationId'], headers=headers)
    CresData.append('Mode:' + r.json()['gateways'][0]['temperatureControlSystems'][0]['systemModeStatus']['mode'])
Cres_json = json.dumps(CresData)
print CresData
print Cres_json

It looks like you are looking for JSON output with key-value pairs. You need to pass a dict into json.dumps(), which will return a string in the required JSON format. I wasn't able to test the code, as the link you mentioned is not live, but your solution should be something like this:
CresData = dict()
key_str = "Location"
idx = 0
for i in range(0, j):
    data = dict()
    r = requests.get('http://xxxxxx.com/WebAPI/emea/api/v1/location/installationInfo?userId=%s&includeTemperatureControlSystems=True' % UserID, headers=headers)
    data["Name"] = r.json()[i]['locationInfo']['name']
    r = requests.get('http://xxxxxx.com/WebAPI/emea/api/v1/location/%s/status?includeTemperatureControlSystems=True' % r.json()[i]['locationInfo']['locationId'], headers=headers)
    data["mode"] = r.json()['gateways'][0]['temperatureControlSystems'][0]['systemModeStatus']['mode']
    CresData[key_str + str(idx)] = data
    idx += 1
Cres_json = json.dumps(CresData)
print CresData
print Cres_json
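For illustration, a minimal sketch (with invented values) of the JSON string this approach produces; under Python 2 the key order in the output may vary:

import json

# Invented sample mirroring the structure built above
CresData = {
    "Location0": {"Name": "London", "mode": "Auto"},
    "Location1": {"Name": "Ling", "mode": "Away"},
}
print json.dumps(CresData)
# e.g. {"Location0": {"Name": "London", "mode": "Auto"}, "Location1": {"Name": "Ling", "mode": "Away"}}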


Adding multiple values for a key in a dictionary for Python

Hello, I am trying to retrieve multiple indcode values using the code below. However, I get an error saying "cannot specify a field more than once". Can anyone please assist?
URL = "https://data.colorado.gov/resource/cjkq-q9ih.json"
D = dict()
D["area"] = 57
D["indcode"] = 10,23,81
document = requests.get(URL, D)
print(document.request.url)
Error message received:
{
  "error": true,
  "message": "cannot specify a field more than once"
}
Getting data for multiple indcode values in one query isn't possible at that URL until the web developers provide that feature, so you'll need another way. If you want to retrieve multiple indcode values, fetch all the data for the area in one request from https://data.colorado.gov/resource/cjkq-q9ih.json?area=57 and filter it yourself. Try this:
import requests
URL = "https://data.colorado.gov/resource/cjkq-q9ih.json"
D = dict()
D["area"] = 57
# D["indcode"] = 10,23,81
indcode = [10,23,81]
document = requests.get(URL, D)
data = document.json()
filteredData = list(filter(lambda p: int(p['indcode']) in indcode, data))
print(filteredData)
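For what it's worth, the error itself comes from how requests encodes the tuple: a tuple (or list) value is sent as a repeated query parameter, which this API rejects. A quick sketch to inspect the URL that gets built:

import requests

# A tuple value becomes a repeated query parameter: indcode=10&indcode=23&indcode=81
prepared = requests.Request(
    'GET',
    "https://data.colorado.gov/resource/cjkq-q9ih.json",
    params={"area": 57, "indcode": (10, 23, 81)},
).prepare()
print(prepared.url)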

Distinguish found from not found items

I am currently working on getting some track info from Last.fm using their API.
I am using the requests library to fetch track info based on artist and track name, and I specify the user so I know how many times they listened to it.
I call it like the following:
ric = 'http://ws.audioscrobbler.com/2.0/?method=track.getInfo&api_key=API_KEY&artist=Tomasz%20Stanko&track=Love%20Theme%20From%20Forewell%20To%20Maria&username=miner&autocorrect=1&format=json'
r = req.get(ric)
if (r.status_code == 200):
    d = r.json()
else:
    d = dict()
I made sure that the request got a response by checking the status code, but if the track is not found, it still responds with code 200 and the answer looks like this:
{"error":6,"message":"Track not found","links":[]}
Since I need to use the track info to export it to a JSON file, I would like to know if you can suggest a way to ensure that the track was actually found.
Thanks to anybody who bothered to read this far.
I guess it's enough to just check whether the key 'message' is in the dict and equal to 'Track not found'. Something like this:
ric = 'http://ws.audioscrobbler.com/2.0/?method=track.getInfo&api_key=API_KEY&artist=Tomasz%20Stanko&track=Love%20Theme%20From%20Forewell%20To%20Maria&username=miner&autocorrect=1&format=json'
r = req.get(ric)
if (r.status_code == 200):
    d = r.json()
    if d.get("message") and d.get("message") == "Track not found":
        d = dict()
else:
    d = dict()
Or you can check the 'error' key the same way:
ric = 'http://ws.audioscrobbler.com/2.0/?method=track.getInfo&api_key=API_KEY&artist=Tomasz%20Stanko&track=Love%20Theme%20From%20Forewell%20To%20Maria&username=miner&autocorrect=1&format=json'
r = req.get(ric)
if (r.status_code == 200):
    d = r.json()
    if d.get("error") and d.get("error") == 6:
        d = dict()
else:
    d = dict()
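If it helps, both checks can be folded into one small helper (the function name is invented; this is just a sketch assuming the same response shape as above):

import requests as req

def get_track_info(url):
    """Return the track info dict, or an empty dict when the track isn't found."""
    r = req.get(url)
    if r.status_code != 200:
        return dict()
    d = r.json()
    # Last.fm signals "not found" inside a 200 response via an "error" key
    return dict() if "error" in d else d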

Selecting values from a JSON file in Python

I am getting JIRA data using the following Python code.
How do I store the response for more than one key (my example shows only one key, but in general I get a lot of data) and print only the values corresponding to total, key, customfield_12830, and summary?
import requests
import json
import logging
import datetime
import base64
import urllib
serverURL = 'https://jira-stability-tools.company.com/jira'
user = 'username'
password = 'password'
query = 'project = PROJECTNAME AND "Build Info" ~ BUILDNAME AND assignee=ASSIGNEENAME'
jql = '/rest/api/2/search?jql=%s' % urllib.quote(query)
response = requests.get(serverURL + jql,verify=False,auth=(user, password))
print response.json()
response.json() OUTPUT:-
http://pastebin.com/h8R4QMgB
From the link you pasted to pastebin and the JSON that I saw, the response holds your issues as a list, each containing key, fields (which holds the custom fields), self, id, and expand.
You can simply iterate through this response and extract the values for the keys you want. You can go like:
data = response.json()
issues = data.get('issues', list())
x = list()
for issue in issues:
    temp = {
        'key': issue['key'],
        'customfield': issue['fields']['customfield_12830'],
        'total': issue['fields']['progress']['total']
    }
    x.append(temp)
print(x)
x is a list of dictionaries containing the data for the fields you mentioned. Let me know if I have been unclear somewhere or if what I have given is not what you are looking for.
PS: It is always advisable to use dict.get('keyname', None) to fetch values, as you can supply a default value if the key is not found. For this solution I didn't do it, as I just wanted to show the approach.
Update: In the comments you (OP) mentioned that it gives an AttributeError. Try this code:
data = response.json()
issues = data.get('issues', list())
x = list()
for issue in issues:
    temp = dict()
    key = issue.get('key', None)
    if key:
        temp['key'] = key
    fields = issue.get('fields', None)
    if fields:
        customfield = fields.get('customfield_12830', None)
        temp['customfield'] = customfield
        progress = fields.get('progress', None)
        if progress:
            total = progress.get('total', None)
            temp['total'] = total
    x.append(temp)
print(x)
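The question also asked for summary and total; a sketch extending the same approach (assuming, as in typical JIRA search responses, that the overall count sits at the top level under 'total' and that 'summary' is a standard field):

data = response.json()
total = data.get('total', None)  # top-level count of matching issues, if present
x = list()
for issue in data.get('issues', list()):
    fields = issue.get('fields', dict())
    x.append({
        'key': issue.get('key', None),
        'summary': fields.get('summary', None),
        'customfield': fields.get('customfield_12830', None),
    })
print(x)
print(total)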

ValueError: can only parse strings python

I am trying to gather a bunch of links using XPath, which need to be scraped from the next page; however, I keep getting an error that it can only parse strings. I looked at the type of lk and it was a string after I cast it. What seems to be wrong?
def unicode_to_string(types):
    try:
        types = unicodedata.normalize("NFKD", types).encode('ascii', 'ignore')
        return types
    except:
        return types

def getData():
    req = "http://analytical360.com/access-points"
    page = urllib2.urlopen(req)
    tree = etree.HTML(page.read())
    i = 0
    for lk in tree.xpath('//a[@class="sabai-file sabai-file-image sabai-file-type-jpg "]//@href'):
        print "Scraping Vendor #" + str(i)
        trees = etree.HTML(urllib2.urlopen(unicode_to_string(lk)))
        for ll in trees.xpath('//table[@id="archived"]//tr//td//a//@href'):
            final = etree.HTML(urllib2.urlopen(unicode_to_string(ll)))
You should pass in strings, not the file-like object returned by urllib2.urlopen; call .read() on the response first.
Perhaps change the code like so:
trees = etree.HTML(urllib2.urlopen(unicode_to_string(lk)).read())
for i, ll in enumerate(trees.xpath('//table[@id="archived"]//tr//td//a//@href')):
    final = etree.HTML(urllib2.urlopen(unicode_to_string(ll)).read())
Also, you don't seem to increment i.
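As an aside, lxml can also consume the file-like object directly if you hand it a parser, which sidesteps the strings-only restriction; a sketch (not tested against this site):

from lxml import etree
import urllib2

# etree.parse accepts file-like objects when an HTML parser is supplied
parser = etree.HTMLParser()
tree = etree.parse(urllib2.urlopen("http://analytical360.com/access-points"), parser)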

Manipulating CSV data stored as a string Python

I have an API call which responds with an XML page and has my data stored as CSV in the "data" tag. (I can request it in JSON format, but I haven't been able to handle the data correctly in my Python script that way.)
<reports.getAccountsStatsResponse xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="urn:com:gigya:api" xsi:schemaLocation="urn:com:gigya:api http://socialize-api.gigya.com/schema">
<statusCode>200</statusCode>
<errorCode>0</errorCode>
<statusReason>OK</statusReason>
<callId>ae1b3f13ba1c4e62ad3120afb1269c76</callId>
<time>2015-09-01T09:01:46.511Z</time>
<headers>
<header>date</header>
<header>initRegistrations</header>
<header>registrations</header>
<header>siteLogins</header>
<header>newUsers</header>
</headers>
<data xmlns:q1="http://www.w3.org/2001/XMLSchema" xsi:type="q1:string">
"date","initRegistrations","registrations","siteLogins","newUsers" "01/01/2015","0","0","0","0" "01/02/2015","0","0","0","0" "01/03/2015","0","0","0","0" "01/04/2015","0","0","0","0" "01/05/2015","0","0","0","0" "01/06/2015","0","0","0","0" "01/07/2015","0","0","0","0" "01/08/2015","0","0","0","0" "01/09/2015","0","0","0","0" "01/10/2015","0","0","0","0" "01/11/2015","0","0","0","0" "01/12/2015","0","0","0","0" "01/13/2015","0","0","0","0" "01/14/2015","0","0","0","0" "01/15/2015","0","0","0","0" "01/16/2015","0","0","0","0" "01/17/2015","0","0","0","0" "01/18/2015","0","0","0","0" "01/19/2015","0","0","0","0" "01/20/2015","34","34","72","34" "01/21/2015","33","23","58","23" "01/22/2015","19","19","49","19" "01/23/2015","21","21","50","21" "01/24/2015","1","1","2","1" "01/25/2015","0","0","0","0" "01/26/2015","8","4","49","4" "01/27/2015","8","8","35","8" "01/28/2015","4","2","16","2" "01/29/2015","7","7","27","7" "01/30/2015","69","58","516","58" "01/31/2015","9","6","76","6" "02/01/2015","0","0","2","0" "02/02/2015","304","203","2317","203" "02/03/2015","122","93","786","93" "02/04/2015","69","47","435","47" "02/05/2015","93","64","677","64" "02/06/2015","294","255","1327","255" "02/07/2015","0","0","0","0" "02/08/2015","0","0","0","0" "02/09/2015","0","0","3","0" "02/10/2015","1","0","1","0" "02/11/2015","3","3","7","3" "02/12/2015","0","0","0","0" "02/13/2015","2","2","4","2" "02/14/2015","0","0","1","0" "02/15/2015","0","0","0","0" "02/16/2015","0","0","0","0" "02/17/2015","3","3","7","3" "02/18/2015","0","0","0","0" "02/19/2015","1","1","3","1" "02/20/2015","3","3","10","3" "02/21/2015","0","0","0","0" "02/22/2015","0","0","1","0" "02/23/2015","1","1","4","1" "02/24/2015","0","0","1","0" "02/25/2015","0","0","0","0" "02/26/2015","0","0","0","0" "02/27/2015","0","0","1","0" "02/28/2015","1","1","2","1" "03/01/2015","1","1","3","1" "03/02/2015","19","9","348","9" "03/03/2015","14","9","132","9" "03/04/2015","4","4","41","4" "03/05/2015","8","5","101","5" "03/06/2015","6","5","71","5" "03/07/2015","8","4","42","4" "03/08/2015","7","4","45","4" "03/09/2015","5","4","30","4" "03/10/2015","7","7","39","7" "03/11/2015","9","9","41","9" "03/12/2015","1","1","20","1" "03/13/2015","3","3","26","3" "03/14/2015","2","0","21","0" "03/15/2015","3","3","28","3" "03/16/2015","3","3","38","3" "03/17/2015","4","4","43","4" "03/18/2015","5","3","45","3" "03/19/2015","19","16","108","16" "03/20/2015","11","8","96","8" "03/21/2015","276","261","807","261" "03/22/2015","197","192","604","192" "03/23/2015","0","0","3","0" "03/24/2015","1","1","4","1" "03/25/2015","181","166","401","166" "03/26/2015","124","109","265","109" "03/27/2015","53","47","124","47" "03/28/2015","41","39","99","39" "03/29/2015","75","65","173","65" "03/30/2015","249","239","536","239" "03/31/2015","222","212","487","212" "04/01/2015","40","29","394","29" "04/02/2015","16","10","132","10" "04/03/2015","13","10","125","10" "04/04/2015","6","4","49","4" "04/05/2015","2","1","46","1" "04/06/2015","4","3","38","3" "04/07/2015","1","0","32","0" "04/08/2015","4","2","16","2" "04/09/2015","9","8","30","8" "04/10/2015","31","29","96","29" "04/11/2015","17","14","90","14" "04/12/2015","10","7","46","7" "04/13/2015","19","13","69","13" "04/14/2015","63","58","199","58" "04/15/2015","17","16","58","16" "04/16/2015","13","12","41","12" "04/17/2015","7","5","51","5" "04/18/2015","51","46","165","46" "04/19/2015","51","45","179","45" "04/20/2015","28","21","110","21" "04/21/2015","32","24","290","24" "04/22/2015","47","31","329","31" 
"04/23/2015","30","27","183","27" "04/24/2015","71","65","284","65" "04/25/2015","25","17","268","17" "04/26/2015","26","24","268","24" "04/27/2015","72","67","172","67" "04/28/2015","28","25","96","25" "04/29/2015","72","48","159","48" "04/30/2015","50","22","136","22" "05/01/2015","33","23","126","23" "05/02/2015","22","17","112","17" "05/03/2015","31","21","169","21" "05/04/2015","29","21","182","21" "05/05/2015","12","10","24","10" "05/06/2015","369","354","790","354" "05/07/2015","409","401","839","401" "05/08/2015","258","253","539","253" "05/09/2015","227","221","469","221" "05/10/2015","138","134","297","134" "05/11/2015","14","13","32","13" "05/12/2015","57","24","452","24" "05/13/2015","23","12","300","12" "05/14/2015","7","5","70","5" "05/15/2015","7","6","15","6" "05/16/2015","3","3","7","3" "05/17/2015","3","3","8","3" "05/18/2015","2","4","4","2" "05/19/2015","10","16","24","8" "05/20/2015","4","8","10","4" "05/21/2015","7","12","14","6" "05/22/2015","9","14","33","7" "05/23/2015","9","14","19","7" "05/24/2015","16","32","39","16" "05/25/2015","11","9","21","7" "05/26/2015","23","16","87","16" "05/27/2015","30","24","87","24" "05/28/2015","12","12","39","12" "05/29/2015","14","12","37","12" "05/30/2015","8","7","19","7" "05/31/2015","5","4","17","4" "06/01/2015","10","10","31","10" "06/02/2015","23","20","95","20" "06/03/2015","11","9","31","9" "06/04/2015","14","13","36","13" "06/05/2015","12","11","27","11" "06/06/2015","8","6","20","6" "06/07/2015","9","9","21","9" "06/08/2015","16","16","37","16" "06/09/2015","24","17","40","17" "06/10/2015","8","8","34","8" "06/11/2015","46","27","464","27" "06/12/2015","45","23","383","23" "06/13/2015","12","9","143","9" "06/14/2015","22","15","112","15" "06/15/2015","14","13","74","13" "06/16/2015","63","56","197","56" "06/17/2015","28","25","114","25" "06/18/2015","17","15","85","15" "06/19/2015","143","135","460","135" "06/20/2015","54","46","217","46" "06/21/2015","60","55","211","55" "06/22/2015","91","78","249","78" "06/23/2015","99","87","295","87" "06/24/2015","115","103","315","103" "06/25/2015","455","380","964","380" "06/26/2015","585","489","1144","489" "06/27/2015","345","300","695","300" "06/28/2015","349","320","783","320" "06/29/2015","113","98","362","98" "06/30/2015","128","113","424","113" "07/01/2015","115","99","277","99" "07/02/2015","73","65","323","65" "07/03/2015","22","16","184","16" "07/04/2015","13","12","69","12" "07/05/2015","15","12","71","12" "07/06/2015","31","25","107","25" "07/07/2015","15","10","63","10" "07/08/2015","16","12","60","12" "07/09/2015","35","32","103","32" "07/10/2015","22","19","72","19" "07/11/2015","7","7","25","7" "07/12/2015","4","4","27","4" "07/13/2015","81","73","195","73" "07/14/2015","60","53","157","53" "07/15/2015","44","40","115","40" "07/16/2015","40","40","112","40" "07/17/2015","27","23","64","23" "07/18/2015","15","11","56","11" "07/19/2015","19","14","63","14" "07/20/2015","21","17","48","17" "07/21/2015","11","10","30","10" "07/22/2015","13","12","40","12" "07/23/2015","9","6","43","6" "07/24/2015","9","8","32","8" "07/25/2015","8","5","20","5" "07/26/2015","20","18","64","18" "07/27/2015","15","14","80","14" "07/28/2015","9","8","48","8" "07/29/2015","21","13","88","13" "07/30/2015","9","5","92","5" "07/31/2015","4","3","81","3" "08/01/2015","4","3","23","3" "08/02/2015","11","5","29","5" "08/03/2015","19","17","50","17" "08/04/2015","15","10","32","10" "08/05/2015","14","9","31","9" "08/06/2015","26","5","338","5" "08/07/2015","22","13","182","13" 
"08/08/2015","9","7","72","7" "08/09/2015","7","4","58","4" "08/10/2015","17","14","88","14" "08/11/2015","23","17","100","17" "08/12/2015","20","20","62","20" "08/13/2015","23","21","81","21" "08/14/2015","30","26","136","26" "08/15/2015","12","7","59","7" "08/16/2015","12","8","61","8" "08/17/2015","68","46","331","46" "08/18/2015","72","48","327","48" "08/19/2015","149","75","542","75" "08/20/2015","95","59","358","59" "08/21/2015","93","54","342","54" "08/22/2015","69","40","300","40" "08/23/2015","150","103","505","103" "08/24/2015","39","30","105","30"
</data>
</reports.getAccountsStatsResponse>
And in JSON format:
{
"statusCode": 200,
"errorCode": 0,
"statusReason": "OK",
"callId": "99949da72d034b04ba910c91704ba4c0",
"time": "2015-09-01T09:19:30.569Z",
"headers": [
"date",
"initRegistrations",
"registrations",
"siteLogins",
"newUsers"
],
"data": "\"date\",\"initRegistrations\",\"registrations\",\"siteLogins\",\"newUsers\"\r\n\"01/01/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/02/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/03/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/04/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/05/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/06/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/07/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/08/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/09/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/10/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/11/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/12/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/13/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/14/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/15/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/16/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/17/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/18/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/19/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/20/2015\",\"34\",\"34\",\"72\",\"34\"\r\n\"01/21/2015\",\"33\",\"23\",\"58\",\"23\"\r\n\"01/22/2015\",\"19\",\"19\",\"49\",\"19\"\r\n\"01/23/2015\",\"21\",\"21\",\"50\",\"21\"\r\n\"01/24/2015\",\"1\",\"1\",\"2\",\"1\"\r\n\"01/25/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"01/26/2015\",\"8\",\"4\",\"49\",\"4\"\r\n\"01/27/2015\",\"8\",\"8\",\"35\",\"8\"\r\n\"01/28/2015\",\"4\",\"2\",\"16\",\"2\"\r\n\"01/29/2015\",\"7\",\"7\",\"27\",\"7\"\r\n\"01/30/2015\",\"69\",\"58\",\"516\",\"58\"\r\n\"01/31/2015\",\"9\",\"6\",\"76\",\"6\"\r\n\"02/01/2015\",\"0\",\"0\",\"2\",\"0\"\r\n\"02/02/2015\",\"304\",\"203\",\"2317\",\"203\"\r\n\"02/03/2015\",\"122\",\"93\",\"786\",\"93\"\r\n\"02/04/2015\",\"69\",\"47\",\"435\",\"47\"\r\n\"02/05/2015\",\"93\",\"64\",\"677\",\"64\"\r\n\"02/06/2015\",\"294\",\"255\",\"1327\",\"255\"\r\n\"02/07/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/08/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/09/2015\",\"0\",\"0\",\"3\",\"0\"\r\n\"02/10/2015\",\"1\",\"0\",\"1\",\"0\"\r\n\"02/11/2015\",\"3\",\"3\",\"7\",\"3\"\r\n\"02/12/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/13/2015\",\"2\",\"2\",\"4\",\"2\"\r\n\"02/14/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/15/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/16/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/17/2015\",\"3\",\"3\",\"7\",\"3\"\r\n\"02/18/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/19/2015\",\"1\",\"1\",\"3\",\"1\"\r\n\"02/20/2015\",\"3\",\"3\",\"10\",\"3\"\r\n\"02/21/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/22/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/23/2015\",\"1\",\"1\",\"4\",\"1\"\r\n\"02/24/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/25/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/26/2015\",\"0\",\"0\",\"0\",\"0\"\r\n\"02/27/2015\",\"0\",\"0\",\"1\",\"0\"\r\n\"02/28/2015\",\"1\",\"1\",\"2\",\"1\"\r\n\"03/01/2015\",\"1\",\"1\",\"3\",\"1\"\r\n\"03/02/2015\",\"19\",\"9\",\"348\",\"9\"\r\n\"03/03/2015\",\"14\",\"9\",\"132\",\"9\"\r\n\"03/04/2015\",\"4\",\"4\",\"41\",\"4\"\r\n\"03/05/2015\",\"8\",\"5\",\"101\",\"5\"\r\n\"03/06/2015\",\"6\",\"5\",\"71\",\"5\"\r\n\"03/07/2015\",\"8\",\"4\",\"42\",\"4\"\r\n\"03/08/2015\",\"7\",\"4\",\"45\",\"4\"\r\n\"03/09/2015\",\"5\",\"4\",\"30\",\"4\"\r\n\"03/10/2015\",\"7\",\"7\",\"39\",\"7\"\r\n\"03/11/2015\",\"9\",\"9\",\"41\",\"9\"\r\n\"03/12/2015\",\"1\",\"1\",\"20\",\"1\"\r\n\"03/13/2015\",\"3\",\"3\",\"26\",\"3\"\r\n\"03/14/2015\",\"2\",\"0\",\"21\",\"0\"\r\n\"03/15/2015\",\"3\",\"3\",\"28\",\"3\"\r\n\"03/16/2015\",\"3\",\"3\",\"38\",\"3\"\r\n\"03/17/2015\",\"4\",\"4\",\"43\",\"4\"\r\n\"03/18/2015\",\"5\",\"3\",\"45\",\"3\"\r\n\"03/19/2015\",\"19\",\"16\",\"108\",\"16\"\r\n\"03/20/2015\",\"11\",\"8\",\"96\",\"8\"\r\n\"03/21/2015\",\"276\",\"261\",\"807\",\"261\"\r\n\"03/22/
2015\",\"197\",\"192\",\"604\",\"192\"\r\n\"03/23/2015\",\"0\",\"0\",\"3\",\"0\"\r\n\"03/24/2015\",\"1\",\"1\",\"4\",\"1\"\r\n\"03/25/2015\",\"181\",\"166\",\"401\",\"166\"\r\n\"03/26/2015\",\"124\",\"109\",\"265\",\"109\"\r\n\"03/27/2015\",\"53\",\"47\",\"124\",\"47\"\r\n\"03/28/2015\",\"41\",\"39\",\"99\",\"39\"\r\n\"03/29/2015\",\"75\",\"65\",\"173\",\"65\"\r\n\"03/30/2015\",\"249\",\"239\",\"536\",\"239\"\r\n\"03/31/2015\",\"222\",\"212\",\"487\",\"212\"\r\n\"04/01/2015\",\"40\",\"29\",\"394\",\"29\"\r\n\"04/02/2015\",\"16\",\"10\",\"132\",\"10\"\r\n\"04/03/2015\",\"13\",\"10\",\"125\",\"10\"\r\n\"04/04/2015\",\"6\",\"4\",\"49\",\"4\"\r\n\"04/05/2015\",\"2\",\"1\",\"46\",\"1\"\r\n\"04/06/2015\",\"4\",\"3\",\"38\",\"3\"\r\n\"04/07/2015\",\"1\",\"0\",\"32\",\"0\"\r\n\"04/08/2015\",\"4\",\"2\",\"16\",\"2\"\r\n\"04/09/2015\",\"9\",\"8\",\"30\",\"8\"\r\n\"04/10/2015\",\"31\",\"29\",\"96\",\"29\"\r\n\"04/11/2015\",\"17\",\"14\",\"90\",\"14\"\r\n\"04/12/2015\",\"10\",\"7\",\"46\",\"7\"\r\n\"04/13/2015\",\"19\",\"13\",\"69\",\"13\"\r\n\"04/14/2015\",\"63\",\"58\",\"199\",\"58\"\r\n\"04/15/2015\",\"17\",\"16\",\"58\",\"16\"\r\n\"04/16/2015\",\"13\",\"12\",\"41\",\"12\"\r\n\"04/17/2015\",\"7\",\"5\",\"51\",\"5\"\r\n\"04/18/2015\",\"51\",\"46\",\"165\",\"46\"\r\n\"04/19/2015\",\"51\",\"45\",\"179\",\"45\"\r\n\"04/20/2015\",\"28\",\"21\",\"110\",\"21\"\r\n\"04/21/2015\",\"32\",\"24\",\"290\",\"24\"\r\n\"04/22/2015\",\"47\",\"31\",\"329\",\"31\"\r\n\"04/23/2015\",\"30\",\"27\",\"183\",\"27\"\r\n\"04/24/2015\",\"71\",\"65\",\"284\",\"65\"\r\n\"04/25/2015\",\"25\",\"17\",\"268\",\"17\"\r\n\"04/26/2015\",\"26\",\"24\",\"268\",\"24\"\r\n\"04/27/2015\",\"72\",\"67\",\"172\",\"67\"\r\n\"04/28/2015\",\"28\",\"25\",\"96\",\"25\"\r\n\"04/29/2015\",\"72\",\"48\",\"159\",\"48\"\r\n\"04/30/2015\",\"50\",\"22\",\"136\",\"22\"\r\n\"05/01/2015\",\"33\",\"23\",\"126\",\"23\"\r\n\"05/02/2015\",\"22\",\"17\",\"112\",\"17\"\r\n\"05/03/2015\",\"31\",\"21\",\"169\",\"21\"\r\n\"05/04/2015\",\"29\",\"21\",\"182\",\"21\"\r\n\"05/05/2015\",\"12\",\"10\",\"24\",\"10\"\r\n\"05/06/2015\",\"369\",\"354\",\"790\",\"354\"\r\n\"05/07/2015\",\"409\",\"401\",\"839\",\"401\"\r\n\"05/08/2015\",\"258\",\"253\",\"539\",\"253\"\r\n\"05/09/2015\",\"227\",\"221\",\"469\",\"221\"\r\n\"05/10/2015\",\"138\",\"134\",\"297\",\"134\"\r\n\"05/11/2015\",\"14\",\"13\",\"32\",\"13\"\r\n\"05/12/2015\",\"57\",\"24\",\"452\",\"24\"\r\n\"05/13/2015\",\"23\",\"12\",\"300\",\"12\"\r\n\"05/14/2015\",\"7\",\"5\",\"70\",\"5\"\r\n\"05/15/2015\",\"7\",\"6\",\"15\",\"6\"\r\n\"05/16/2015\",\"3\",\"3\",\"7\",\"3\"\r\n\"05/17/2015\",\"3\",\"3\",\"8\",\"3\"\r\n\"05/18/2015\",\"2\",\"4\",\"4\",\"2\"\r\n\"05/19/2015\",\"10\",\"16\",\"24\",\"8\"\r\n\"05/20/2015\",\"4\",\"8\",\"10\",\"4\"\r\n\"05/21/2015\",\"7\",\"12\",\"14\",\"6\"\r\n\"05/22/2015\",\"9\",\"14\",\"33\",\"7\"\r\n\"05/23/2015\",\"9\",\"14\",\"19\",\"7\"\r\n\"05/24/2015\",\"16\",\"32\",\"39\",\"16\"\r\n\"05/25/2015\",\"11\",\"9\",\"21\",\"7\"\r\n\"05/26/2015\",\"23\",\"16\",\"87\",\"16\"\r\n\"05/27/2015\",\"30\",\"24\",\"87\",\"24\"\r\n\"05/28/2015\",\"12\",\"12\",\"39\",\"12\"\r\n\"05/29/2015\",\"14\",\"12\",\"37\",\"12\"\r\n\"05/30/2015\",\"8\",\"7\",\"19\",\"7\"\r\n\"05/31/2015\",\"5\",\"4\",\"17\",\"4\"\r\n\"06/01/2015\",\"10\",\"10\",\"31\",\"10\"\r\n\"06/02/2015\",\"23\",\"20\",\"95\",\"20\"\r\n\"06/03/2015\",\"11\",\"9\",\"31\",\"9\"\r\n\"06/04/2015\",\"14\",\"13\",\"36\",\"13\"\r\n\"06/05/2015\",\"12\",\"11\",\"27\",\"11\"\r\n\"06/06/2015\",\"8\",\"6\",\"20\",\"6\"\r\n\"06/07/2015\",\"9\",\"9\",\"
21\",\"9\"\r\n\"06/08/2015\",\"16\",\"16\",\"37\",\"16\"\r\n\"06/09/2015\",\"24\",\"17\",\"40\",\"17\"\r\n\"06/10/2015\",\"8\",\"8\",\"34\",\"8\"\r\n\"06/11/2015\",\"46\",\"27\",\"464\",\"27\"\r\n\"06/12/2015\",\"45\",\"23\",\"383\",\"23\"\r\n\"06/13/2015\",\"12\",\"9\",\"143\",\"9\"\r\n\"06/14/2015\",\"22\",\"15\",\"112\",\"15\"\r\n\"06/15/2015\",\"14\",\"13\",\"74\",\"13\"\r\n\"06/16/2015\",\"63\",\"56\",\"197\",\"56\"\r\n\"06/17/2015\",\"28\",\"25\",\"114\",\"25\"\r\n\"06/18/2015\",\"17\",\"15\",\"85\",\"15\"\r\n\"06/19/2015\",\"143\",\"135\",\"460\",\"135\"\r\n\"06/20/2015\",\"54\",\"46\",\"217\",\"46\"\r\n\"06/21/2015\",\"60\",\"55\",\"211\",\"55\"\r\n\"06/22/2015\",\"91\",\"78\",\"249\",\"78\"\r\n\"06/23/2015\",\"99\",\"87\",\"295\",\"87\"\r\n\"06/24/2015\",\"115\",\"103\",\"315\",\"103\"\r\n\"06/25/2015\",\"455\",\"380\",\"964\",\"380\"\r\n\"06/26/2015\",\"585\",\"489\",\"1144\",\"489\"\r\n\"06/27/2015\",\"345\",\"300\",\"695\",\"300\"\r\n\"06/28/2015\",\"349\",\"320\",\"783\",\"320\"\r\n\"06/29/2015\",\"113\",\"98\",\"362\",\"98\"\r\n\"06/30/2015\",\"128\",\"113\",\"424\",\"113\"\r\n\"07/01/2015\",\"115\",\"99\",\"277\",\"99\"\r\n\"07/02/2015\",\"73\",\"65\",\"323\",\"65\"\r\n\"07/03/2015\",\"22\",\"16\",\"184\",\"16\"\r\n\"07/04/2015\",\"13\",\"12\",\"69\",\"12\"\r\n\"07/05/2015\",\"15\",\"12\",\"71\",\"12\"\r\n\"07/06/2015\",\"31\",\"25\",\"107\",\"25\"\r\n\"07/07/2015\",\"15\",\"10\",\"63\",\"10\"\r\n\"07/08/2015\",\"16\",\"12\",\"60\",\"12\"\r\n\"07/09/2015\",\"35\",\"32\",\"103\",\"32\"\r\n\"07/10/2015\",\"22\",\"19\",\"72\",\"19\"\r\n\"07/11/2015\",\"7\",\"7\",\"25\",\"7\"\r\n\"07/12/2015\",\"4\",\"4\",\"27\",\"4\"\r\n\"07/13/2015\",\"81\",\"73\",\"195\",\"73\"\r\n\"07/14/2015\",\"60\",\"53\",\"157\",\"53\"\r\n\"07/15/2015\",\"44\",\"40\",\"115\",\"40\"\r\n\"07/16/2015\",\"40\",\"40\",\"112\",\"40\"\r\n\"07/17/2015\",\"27\",\"23\",\"64\",\"23\"\r\n\"07/18/2015\",\"15\",\"11\",\"56\",\"11\"\r\n\"07/19/2015\",\"19\",\"14\",\"63\",\"14\"\r\n\"07/20/2015\",\"21\",\"17\",\"48\",\"17\"\r\n\"07/21/2015\",\"11\",\"10\",\"30\",\"10\"\r\n\"07/22/2015\",\"13\",\"12\",\"40\",\"12\"\r\n\"07/23/2015\",\"9\",\"6\",\"43\",\"6\"\r\n\"07/24/2015\",\"9\",\"8\",\"32\",\"8\"\r\n\"07/25/2015\",\"8\",\"5\",\"20\",\"5\"\r\n\"07/26/2015\",\"20\",\"18\",\"64\",\"18\"\r\n\"07/27/2015\",\"15\",\"14\",\"80\",\"14\"\r\n\"07/28/2015\",\"9\",\"8\",\"48\",\"8\"\r\n\"07/29/2015\",\"21\",\"13\",\"88\",\"13\"\r\n\"07/30/2015\",\"9\",\"5\",\"92\",\"5\"\r\n\"07/31/2015\",\"4\",\"3\",\"81\",\"3\"\r\n\"08/01/2015\",\"4\",\"3\",\"23\",\"3\"\r\n\"08/02/2015\",\"11\",\"5\",\"29\",\"5\"\r\n\"08/03/2015\",\"19\",\"17\",\"50\",\"17\"\r\n\"08/04/2015\",\"15\",\"10\",\"32\",\"10\"\r\n\"08/05/2015\",\"14\",\"9\",\"31\",\"9\"\r\n\"08/06/2015\",\"26\",\"5\",\"338\",\"5\"\r\n\"08/07/2015\",\"22\",\"13\",\"182\",\"13\"\r\n\"08/08/2015\",\"9\",\"7\",\"72\",\"7\"\r\n\"08/09/2015\",\"7\",\"4\",\"58\",\"4\"\r\n\"08/10/2015\",\"17\",\"14\",\"88\",\"14\"\r\n\"08/11/2015\",\"23\",\"17\",\"100\",\"17\"\r\n\"08/12/2015\",\"20\",\"20\",\"62\",\"20\"\r\n\"08/13/2015\",\"23\",\"21\",\"81\",\"21\"\r\n\"08/14/2015\",\"30\",\"26\",\"136\",\"26\"\r\n\"08/15/2015\",\"12\",\"7\",\"59\",\"7\"\r\n\"08/16/2015\",\"12\",\"8\",\"61\",\"8\"\r\n\"08/17/2015\",\"68\",\"46\",\"331\",\"46\"\r\n\"08/18/2015\",\"72\",\"48\",\"327\",\"48\"\r\n\"08/19/2015\",\"149\",\"75\",\"542\",\"75\"\r\n\"08/20/2015\",\"95\",\"59\",\"358\",\"59\"\r\n\"08/21/2015\",\"93\",\"54\",\"342\",\"54\"\r\n\"08/22/2015\",\"69\",\"40\",\"300\",\"40\"\r\n\"08/23/2015\",\"150\",\"103\",
\"505\",\"103\"\r\n\"08/24/2015\",\"39\",\"30\",\"105\",\"30\"\r\n"
}
Firstly, I would like to store the text from the "data" tag by referencing its name, but I've currently only had success using the following:
response = requests.get(url)
root = ElementTree.fromstring(response.content)
dataString = root[6].text
Is there a separate command to be able to specify the name of the tag?
Next, my goal is to loop through different URLs (which correspond to different accounts) and append the name of each account to the end of its data. Is this possible, given that the data is stored as a string and I would need to add it to the end of each row? As a follow-up, what's the best convention for saving multiple values in a variable so I can loop through them, i.e. the list of accounts?
Apologies if this is unclear, I'm happy to provide any more information if it means anybody can help.
As far as I understood, you have a specific URL for each user and you want to collect data for all the users given.
However, since you are not able to get the username out of the response, you have to combine the response with the username corresponding to the URL the request was sent to. You could use a dictionary to store the data of each response, since the JSON format maps naturally onto Python's dictionaries.
The code below simply iterates through a set of tuples containing the different usernames and the corresponding URLs. For each URL a request is sent, the data is extracted from the JSON-formatted response and stored in a dictionary with the username as key; each entry is then added (.update()) to a main dictionary containing all your collected datasets.
# replace names 'url_xyz' with the corresponding names and URLs
users = {('Albert', 'url_albert'), ('Steven', 'url_steven'), ('Mike', 'url_mike')}
all_data = dict()
for name, url in users:
    response = requests.get(url)
    # extract the CSV string from the JSON response and strip the quotes
    data = response.json()['data'].replace('"', '')
    all_data.update({name: data})
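On the side question of referencing the tag by name instead of by index: ElementTree's find() works here, but the document declares a default namespace, so the tag has to be qualified with it. A sketch based on the XML shown above, reusing the response from the loop:

from xml.etree import ElementTree

root = ElementTree.fromstring(response.content)
# the default namespace urn:com:gigya:api must be part of the tag name
dataString = root.find('{urn:com:gigya:api}data').text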
Thank you, Albert.
Your JSON suggestion let me control the data in a much better way. The code below is what I ended up with to get my desired output. Now I just need to work out how to convert the dates from MM/DD/YYYY to DD/MM/YYYY.
startDate = '2015-01-01'   # Must be in format YYYY-MM-DD
endDate = '2015-12-31'     # Must be in format YYYY-MM-DD
dimensions = 'date'        # Available dimensions are 'date' and 'cid'
format = 'json'
dataFormat = 'json'
measures = 'initRegistrations,registrations,siteLogins,newUsers'
allData = []
# Construct the API URL for each account and collect the rows
for i in range(0, len(apiKey)):
    url = ('https://reports.eu1.gigya.com/reports.getAccountsStats?secret=' + secret + '&apiKey=' + apiKey[i] +
           '&uid=' + uid + '&startDate=' + startDate + '&endDate=' + endDate + '&dimensions=' + dimensions +
           '&measures=' + measures + '&format=' + format + '&dataFormat=' + dataFormat)
    response = requests.get(url)
    json = response.json()  # note: this shadows the json module
    data = json['data']
    if i == 0:
        # first account: build the header row once and add a 'brand' column
        headers = json['headers']
        headers.append('brand')
        for x in range(0, len(data)):
            data[x].append(brand[i])
        brandData = [headers] + data
    else:
        for x in range(0, len(data)):
            data[x].append(brand[i])
        brandData = data
    allData += brandData
with open("testDataJSON.csv", "wb") as f:
    writer = csv.writer(f)
    writer.writerows(allData)
I don't know how well this follows best practice for Python, but as I said, I am very new to it.
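As for the MM/DD/YYYY to DD/MM/YYYY conversion mentioned above, a minimal sketch using the standard library:

from datetime import datetime

def convert_date(us_date):
    """Convert 'MM/DD/YYYY' to 'DD/MM/YYYY'."""
    return datetime.strptime(us_date, '%m/%d/%Y').strftime('%d/%m/%Y')

print(convert_date('08/24/2015'))  # 24/08/2015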
