Python - save multiple responses from multiple requests

I am pulling JSON data from an API, and I am looking to pass a different parameter into each request and save each response.
My current code:
# create an empty list to store each account id
accounts = []
# store every id in the accounts list
for each in allAccounts['data']:
    accounts.append(each['id'])
# for each account, call a new account id for the url
for id in accounts:
    urlAccounts = 'https://example.somewebsite.ie:000/v12345/accounts/' + id + '/users'
Then I save the response and print out the values:
    accountReq = requests.get(urlAccounts, headers=headers)
    allUsers = accountReq.json()
    for each in allUsers['data']:
        print(each['username'], " " + each['first_name'])
This is fine and it works, but I only store the first ID's response.
How do I store the responses from all of the requests?
So I'm looking to send multiple requests where the ID changes every time and save each response.
I'm using Python version 3.10.4.

My code for this, in case anyone stumbles across this:
# list of each api url to use
link = []
# for every id in the accounts, create a new url link in the link list
for i in accounts:
    link.append('https://example.somewebsite.ie:000/v12345/accounts/' + i + '/users')
# create a list with all the different responses
accountReq = []
for i in link:
    accountReq.append(requests.get(i, headers=headers).json())
# write to a txt file
with open('masterSheet.txt', 'x') as f:
    # for every response
    for each in accountReq:
        # get each account's data
        account = each['data']
        # for each account's data,
        # get each user's username and names
        for data in account:
            sheet = (data['username'] + " " + " ", data['first_name'], data['last_name'])
            f.write(str(sheet) + "\n")
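A variation on the same idea that may be easier to work with later: keep each response keyed by its account id, so it is obvious which response came from which account, and dump everything to one file. This is only a sketch; it assumes the same accounts list, headers and placeholder URL as above.
import json
import requests

responses = {}
for account_id in accounts:
    url = 'https://example.somewebsite.ie:000/v12345/accounts/' + account_id + '/users'
    # one entry per account id, holding that account's full JSON response
    responses[account_id] = requests.get(url, headers=headers).json()

# write everything out once all the requests are done
with open('responses.json', 'w') as f:
    json.dump(responses, f, indent=2)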

Related

Issues dynamically changing Python 'Requests' header to iterate through API URL endpoints

My issue is as follows:
I am attempting to pull down a list of all email address entries from an API. The data set is so large that it spans multiple API 'pages' with unique URLs. The page number can be specified as a parameter in the API request URL. I wrote a loop to try and collect email information from an API page, add the email addresses to a list, add 1 to the page number, and repeat the process up to 30 pages. Unfortunately it seems like the loop is only querying the same page 30 times and producing duplicates. I feel like I'm missing something simple (beginner here) but please let me know if anyone can help. Code is below:
import requests
import json

number = 1
user_list = []
parameters = {'page': number, 'per_page': 50}
response = requests.get('https://api.com/profiles.json', headers=headers, params=parameters)
while number <= 30:
    formatted_data = response.json()
    profiles = formatted_data['profiles']
    for dict in profiles:
        user_list.append(dict['email'])
    number = number + 1
print(sorted(user_list))
The parameters dict captured the value number had when it was created; incrementing number afterwards does not change parameters['page'], so you need to update the dictionary after every iteration.
You also need to place requests.get() inside your loop, otherwise you keep re-reading the same response 30 times.
import requests
import json

number = 1
user_list = []
parameters = {'page': number, 'per_page': 50}
while number <= 30:
    response = requests.get('https://api.com/profiles.json', headers=headers, params=parameters)
    formatted_data = response.json()
    profiles = formatted_data['profiles']
    for dict_ in profiles:  # avoid shadowing the dict built-in with your variable names
        user_list.append(dict_['email'])
    number = number + 1
    parameters['page'] = number
print(sorted(user_list))
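An equivalent sketch with a for loop, which avoids the separate counter and the shared dict altogether; the URL and headers are the placeholders from the question.
import requests

user_list = []
for page in range(1, 31):
    # build params fresh for each request, so every call asks for a different page
    response = requests.get('https://api.com/profiles.json',
                            headers=headers,
                            params={'page': page, 'per_page': 50})
    for profile in response.json()['profiles']:
        user_list.append(profile['email'])
print(sorted(user_list))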

How to add multiple responses into a single json object

I am making some requests to another site from my Flask API; basically my Flask API is a proxy. Initially I substitute the parameters with the known company id and get all the worker ids. Given the worker ids, I try to make another request which gets me all their details. However, with the code below I am only getting the last response, which means only the details of the last worker. You can ignore the j==1 for now; I did it for testing purposes.
tempDict = {}
updateDic = {}
dictToSend = {}
j = 0
# i = companyid
# id = workerid
# I make several calls to url2 depending on the number of employee ids in number
for id in number:
    url2 = "someurl/" + str(i) + "/contractors/" + str(id)
    r = requests.get(url2, headers={'Content-type': 'application/json', "Authorization": authenticate, 'Accept': 'application/json'})
    print("id" + str(id))
    print(url2)
    loadJsonResponse2 = json.loads(r.text)
    print(loadJsonResponse2)
    key = i
    tempDict.update(loadJsonResponse2)
    # I want to have all of their details and add the company number before
    print(tempDict)
    if (j == 1):
        dictToSend[key] = tempDict
        return jsonify(dictToSend)
    j = j + 1
return jsonify(dictToSend)
So I have all the worker ids and I request the other URL to get all their details. The response is in JSON format, but I am only getting the last response with the above code. I did something like j==1 because I wanted to check the return.
dictToSend[key]=tempDict
return jsonify(dictToSend)
The key is the company id, so that I can identify which company the worker is from.
How can I concatenate all the JSON responses and, at the end, add a key like "5": {concatenation of all JSON responses}?
Thank you,
Your key for the JSON object is:
#i = companyid
.
.
.
key = i
.
.
.
# You are adding all your responses to companyid,
# better make a key with companyid and workerid
# key = str(companyid) + ":" + str(workerid)
dictToSend[key]=tempDict
And here
# you may not need this, since there is already a loop iterating on workerid
if (j == 1):
    dictToSend[key] = tempDict
    return jsonify(dictToSend)
j = j + 1

# then the only useful line would be
dictToSend[key] = tempDict
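Putting both points together, here is a minimal sketch of the loop as a helper for the Flask view, assuming requests, jsonify, authenticate, i (the company id) and number (the list of worker ids) exist as in the question; the function name collect_workers and the "someurl/" base are placeholders.
import requests
from flask import jsonify

def collect_workers(i, number, authenticate):
    dictToSend = {}
    for worker_id in number:
        url2 = "someurl/" + str(i) + "/contractors/" + str(worker_id)
        r = requests.get(url2, headers={'Content-type': 'application/json',
                                        'Authorization': authenticate,
                                        'Accept': 'application/json'})
        # one entry per worker, keyed by companyid:workerid, so nothing is overwritten
        dictToSend[str(i) + ":" + str(worker_id)] = r.json()
    return jsonify(dictToSend)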

How to get information about groups liked by a user using Facebook Graph API

I am very new to the Graph API and am basically trying to write a Python (v2.7) script which takes as input the user ID of a Facebook user and returns the names/IDs of all groups/pages that have been liked by the user.
I have managed to acquire an access token that uses the following permissions: user_likes and user_groups. Do I need anything else?
I have used the following code so far to get a JSON dump of all the output from this access token:
import urllib
import json
import sys
import os

accessToken = 'ACCESS_TOKEN_HERE'  # INSERT YOUR ACCESS TOKEN
userId = sys.argv[1]
limit = 100
# Read my likes as a json object
url = 'https://graph.facebook.com/' + userId + '/feed?access_token=' + accessToken + '&limit=' + str(limit)
data = json.load(urllib.urlopen(url))
id = 0
print str(data)
I did get some JSON data, but I couldn't find any page/group-related info in it, nor did it seem to be the most recently updated data. Why is this?
Also, what are the field names that must be tracked to detect a page or a group in the likes data? Please help!
You are using the wrong endpoint: /feed fetches the feed/posts of the user, not the pages/groups.
To get the groups he has joined:
API: /{user-id}/groups
Permissions req: user_groups
To get the pages he has liked:
API: /{user-id}/likes
Permissions req: user_likes
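For completeness, a minimal sketch of calling those two endpoints with the requests library (instead of the urllib code in the question); user_id and access_token are assumed to already be valid, and newer Graph API versions may require a version prefix in the URL.
import requests

def get_connection(user_id, connection, access_token):
    # connection is 'groups' or 'likes', matching the endpoints above
    url = 'https://graph.facebook.com/' + user_id + '/' + connection
    return requests.get(url, params={'access_token': access_token}).json()

groups = get_connection(user_id, 'groups', access_token)  # needs user_groups
likes = get_connection(user_id, 'likes', access_token)    # needs user_likes
for item in likes.get('data', []):
    print(item['id'], item['name'])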

Sorting Facebook data using python

I am retrieving data from Facebook; the retrieved data gives me information about my Facebook friends' likes.
I have written this:
import requests # pip install requests
import json
ACCESS_TOKEN='' #my Facebook access token here
base_url = 'https://graph.facebook.com/me'
# Get 10 likes for 10 friends
fields = 'id,name,friends.limit(10).fields(likes.limit(10))'
url = '%s?fields=%s&access_token=%s' %(base_url, fields, ACCESS_TOKEN,)
# This API is HTTP-based and could be requested in the browser,
# with a command line utility like curl, or using just about
# any programming language by making a request to the URL.
# Click the hyperlink that appears in your notebook output
# when you execute this code cell to see for yourself...
print url
# Interpret the response as JSON and convert back
# to Python data structures
content = requests.get(url).json()
# Pretty-print the JSON and display it
print json.dumps(content, indent=1)
###############################More Options########################################
# Get all likes for 10 friends
fields = 'id,name,friends.limit(10).fields(likes)'
# Get all likes for 10 more friends
fields = 'id,name,friends.offset(10).limit(10).fields(likes)'
# Get 10 likes for all friends
fields = 'id,name,friends.fields(likes.limit(10))'
I have used the execfile('Filename.py') command to run this code in the Python interactive shell.
Now what I want to do is store the retrieved data in a text file called "likes.txt".
Does anyone have an idea how to do this?
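One straightforward way, assuming the snippet above has already run and content holds the decoded JSON: write the same pretty-printed JSON that was printed to the screen into likes.txt.
import json

# write the pretty-printed JSON response into likes.txt
with open('likes.txt', 'w') as f:
    f.write(json.dumps(content, indent=1))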

get logged in user data using facebook api in python and store in mongo database

I have already developed a FB app. Now I am trying to harvest the user data and store it in the database. One thing I am not understanding is how to store the data of the logged-in user in MongoDB using the Graph API, and prior to that, how to fetch the data in Python.
I know this is repetitive, but I am not able to clear up my concept of how to use the API in Python.
I have tried this:
#!/usr/bin/env python
# encoding: utf-8
import json
import urllib2
import re

def getdata(id):
    '''Queries the Facebook API for the specific group ID, and populates the
    results dictionary with the Group ID, User Name, and User ID'''
    # An access token is now required for querying the group messages.
    a_token = 'access_token=<access token>'
    urlquery = 'https://graph.facebook.com/' + id + '/feed&limit=20?access_token=' + a_token
    print urlquery
    data = json.load(urllib2.urlopen(urlquery))
    harvest = []
    results = {}
    for item in data['data']:
        try:
            results = {}
            results['grpid'] = id
            user = item['from']
            results['uname'] = user['name']
            results['uid'] = user['id']
            harvest.append(results)
        except:
            pass
    print len(harvest)

def getgrpids():
    urlquery = 'https://graph.facebook.com/<any username>'  # can i put my app name?
    # not clear from examples given on facebook graph api page.
    data = json.load(urllib2.urlopen(urlquery))
    ids = []
    for item in data['data']:
        try:
            ids.append(item['id'])
        except:
            pass
    return ids

def main():
    idres = getgrpids()
    for id in idres:
        # Loops through all of the group ids returned by getgrpids()
        print 'Group ID:', id
        getdata(id)

if __name__ == '__main__':
    main()
Now the problem goes like this: when I change the username to some other user, the error says that the user should be logged in or the app cannot get the user details. I am not understanding this, since my friend was online at that time, yet I still get the error.
Am I missing something? Secondly, I am not able to put my APP NAME in the query (see the comment). Somebody please help.
Thanks,
Assuming that you have the access token of the user:
import facebook as fb
graph = fb.GraphAPI(access_token)
profile = graph.get_object("me")
# if you want to get the user's profile picture (you need to have the session UID)
profile.update({"picture": "http://graph.facebook.com/%s/picture?type=large" % uid})
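And a minimal sketch of pushing that profile dict into MongoDB with pymongo, assuming a local mongod instance is running; the database and collection names ("fb_app", "users") are only examples.
from pymongo import MongoClient

client = MongoClient('mongodb://localhost:27017/')
collection = client['fb_app']['users']
# upsert keyed on the Facebook user id, so re-running does not create duplicates
collection.update_one({'id': profile['id']}, {'$set': profile}, upsert=True)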
