Using grequests for many requests - Python

I'm using the grequests library for translation API requests. I need to send about 2000 requests quickly, but I get an error: response 429, Too Many Requests. How can I set a delay between requests with this library?
def translate(array):
    config.read("config.ini")
    url = "https://translo.p.rapidapi.com/api/v3/translate"
    headers = {
        "User-agent": "bot 0.1",
        "Retry-after": "1",
        "content-type": "application/x-www-form-urlencoded",
        "X-RapidAPI-Key": config['user']['api_key'],
        "X-RapidAPI-Host": "translo.p.rapidapi.com"
    }
    response = (grequests.request("POST", url, data=payload, headers=headers) for payload in array)
    resp = grequests.map(response)
    logger.info(f"{resp}")
    return resp
Here array is the list of notes to translate.
I tried setting the "Retry-After" header, but it doesn't work.
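grequests has no built-in rate limiting, and the Retry-After request header only expresses a preference the server is free to ignore. One workaround, sketched below, is to cap concurrency with the size argument of grequests.map and send the 2000 payloads in batches with a pause between them; batch_size, size and delay are placeholder values to tune against the API's limits, and the API key below is a placeholder for the value read from config.ini.
import time
import grequests

URL = "https://translo.p.rapidapi.com/api/v3/translate"
HEADERS = {
    "content-type": "application/x-www-form-urlencoded",
    "X-RapidAPI-Key": "YOUR_API_KEY",  # placeholder, e.g. config['user']['api_key']
    "X-RapidAPI-Host": "translo.p.rapidapi.com"
}

def translate_throttled(array, batch_size=50, delay=1.0):
    results = []
    for i in range(0, len(array), batch_size):
        batch = array[i:i + batch_size]
        reqs = (grequests.request("POST", URL, data=payload, headers=HEADERS)
                for payload in batch)
        # size= caps how many requests run concurrently within one batch
        results.extend(grequests.map(reqs, size=10))
        time.sleep(delay)  # back off between batches to stay under the rate limit
    return results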

Related

How does Postman generate this cookie?

import requests
url = "https://apiexample.com/load/v1/action/aaaaaaaaaaaaa"
payload = {}
headers = {
    'Authorization': 'OAuth oauth_consumer_key="aaaaaa",oauth_signature_method="HMAC-SHA1",oauth_timestamp="1664837015",oauth_nonce="mKyTFn7OtsV",oauth_version="1.0",oauth_signature="aaaaaaaaaaaa"',
    'Cookie': 'JSESSIONID=M7n-S-aGe8asRnTjNOUGowak5i5avsRBx6A4H8au.madsedepre'
}
response = requests.request("GET", url, headers=headers, data=payload)
print(response.text)
I am trying to retrieve information from an API, and I am using Postman...
How can I find out what Postman is doing to generate that cookie?
And how can I generate it using Python requests?
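A likely explanation, though it depends on the server: JSESSIONID is a standard Java servlet session cookie that the server sets via a Set-Cookie response header on an earlier request; Postman just stores and replays it. With requests you normally don't generate it yourself, because a requests.Session keeps whatever cookies the server sends. A minimal sketch, where the first URL hit is only an assumption about which endpoint sets the cookie:
import requests

session = requests.Session()

# Whichever request the server answers with "Set-Cookie: JSESSIONID=..." --
# often a login or landing endpoint; the exact URL here is an assumption.
session.get("https://apiexample.com/")
print(session.cookies.get_dict())  # inspect what the server set

# The session resends its stored cookies automatically on later calls.
headers = {
    'Authorization': 'OAuth oauth_consumer_key="aaaaaa",...'  # same OAuth header as above
}
response = session.get("https://apiexample.com/load/v1/action/aaaaaaaaaaaaa", headers=headers)
print(response.text)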

Python request headers: error authenticating with JSON Web Token

I'm learning Python by building a simple trading bot. I receive this error while trying to authenticate using a JWT:
{
    "error": {
        "status": 403,
        "message": "Authentication credentials were not provided."
    }
}
I am following the example here: https://docs.ledgerx.com/reference/tradedcontracts
import requests
url = "https://api.ledgerx.com/trading/contracts/traded"
headers = {
    "Accept": "application/json",
    "Authorization": "MY_API_KEY"
}
response = requests.get(url, headers=headers)
print(response.text)
For now I'm inserting the string literal; later I will store this value in a .env file.
Thanks for taking the time to read.
Can you try this, please?
import requests
url = "https://api.ledgerx.com/trading/contracts/traded"
headers = {
    "Accept": "application/json",
    "Authorization": "JWT MY_API_KEY"
}
response = requests.get(url, headers=headers)
print(response.text)
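Since the plan is to move the key into a .env file, here is a small sketch of that part using python-dotenv; the variable name LEDGERX_API_KEY is just an example.
import os
import requests
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads key=value pairs from a local .env file into the environment
api_key = os.getenv("LEDGERX_API_KEY")  # example variable name

url = "https://api.ledgerx.com/trading/contracts/traded"
headers = {
    "Accept": "application/json",
    "Authorization": f"JWT {api_key}"  # same scheme prefix as in the answer above
}
response = requests.get(url, headers=headers)
print(response.text)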

Shazam detect API throws an HTTP 406 Not Acceptable

I'm trying to test the Shazam API detect feature on a mono sample as provided by this tutorial.
My code reads the raw audio file and sends it as base64 plaintext in the body of a POST request to the Shazam API.
The audio file can be downloaded via this link:
import requests
import base64

def shazam(payload):
    url = "https://shazam.p.rapidapi.com/songs/detect"
    payload = open(payload, "rb").read()
    payload = base64.b64encode(payload)
    payload = str(payload)
    headers = {
        'x-rapidapi-host': "shazam.p.rapidapi.com",
        'x-rapidapi-key': str(open("./api.txt", "r").read().strip()),
        'content-type': "text/plain",
        'accept': "text/plain"
    }
    response = requests.request("POST", url, data=payload, headers=headers)
    print(response.text)

shazam("/home/samples/mono.raw")
Any ideas where I'm going wrong?
I'm having problems with this API as well; I've been searching up and down to find a solution :(
This works for me:
from pydub import AudioSegment
import base64
import requests
import json

file_path = "./test.raw"
url = "https://rapidapi.p.rapidapi.com/songs/detect"

encode_string = base64.b64encode(open(file_path, "rb").read())
payload = encode_string
print(type(payload))

headers = {
    'content-type': "text/plain",
    'x-rapidapi-key': "<<<your key>>>",
    'x-rapidapi-host': "shazam.p.rapidapi.com"
}

response = requests.request("POST", url, data=payload, headers=headers)
print(json.dumps(json.loads(response.text)))
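Two differences between the snippets are worth noting. In the question, str(payload) turns the base64 bytes into their Python repr (a string starting with b'), while the working answer posts the base64 bytes directly. The detect endpoint also expects raw PCM audio rather than MP3/WAV (the RapidAPI docs describe 44100 Hz, mono, signed 16-bit samples), so other formats usually need converting first. A rough sketch of that conversion with pydub (which the answer imports but never uses), assuming ffmpeg is available for decoding and using an example file path:
import base64
import requests
from pydub import AudioSegment  # pip install pydub; decoding needs ffmpeg

# Convert the input to the raw PCM layout the detect endpoint expects:
# 44100 Hz, mono, 16-bit (2-byte) samples.
audio = (AudioSegment.from_file("./song.mp3")  # example path
         .set_frame_rate(44100)
         .set_channels(1)
         .set_sample_width(2))

payload = base64.b64encode(audio.raw_data)  # base64-encoded raw PCM bytes, sent as-is

headers = {
    'content-type': "text/plain",
    'x-rapidapi-key': "<<<your key>>>",
    'x-rapidapi-host': "shazam.p.rapidapi.com"
}
response = requests.post("https://shazam.p.rapidapi.com/songs/detect", data=payload, headers=headers)
print(response.text)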

Error posting comments using Facebook Graph API

My aim is to post a comment to a particular post ID using the Facebook Graph API.
This is the code snippet for the same:
url = 'https://graph.facebook.com/v2.11/<post_id>/comments'
parameters = {'access_token': <FACEBOOK_ACCESS_TOKEN>, 'message': 'test comment'}
headers = {"content-type": "application/json"}
parameters = json.dumps(parameters)
response = requests.post(url, data=parameters, headers=headers, timeout=10)
I am calling this API inside my Django POST API.
For some reason, calling the Facebook API through this code doesn't work. The API call times out after 10 seconds.
If I call the Facebook API through Postman / YARC, the comment gets posted successfully.
Can anyone tell me where I am going wrong?
Python Requests example:
import requests
url = "https://graph.facebook.com/v2.11/yourPostId/comments"
querystring = {"access_token": "yourtoken"}
payload = "message=test%20comment"
headers = {
    'content-type': "application/x-www-form-urlencoded",
    'cache-control': "no-cache"
}
response = requests.request("POST", url, data=payload, headers=headers, params=querystring)
print(response.text)
Python http.client example:
import http.client
conn = http.client.HTTPSConnection("graph.facebook.com")
payload = "message=test%20comment"
headers = {
    'content-type': "application/x-www-form-urlencoded",
    'cache-control': "no-cache"
}
conn.request("POST", "/v2.11/yourPostId/comments?access_token=yourtoken", payload, headers)
res = conn.getresponse()
data = res.read()
print(data.decode("utf-8"))
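Starting from the original snippet, one more variant worth trying (a sketch, not verified against this Graph API version): drop json.dumps and the JSON content type and pass a plain dict to data=, which requests form-encodes the same way the examples above do by hand.
import requests

url = "https://graph.facebook.com/v2.11/yourPostId/comments"

# A plain dict passed to data= is sent as application/x-www-form-urlencoded,
# so no manual encoding or json.dumps is needed.
payload = {
    "access_token": "yourtoken",
    "message": "test comment"
}

response = requests.post(url, data=payload, timeout=10)
print(response.text)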

Exporting a GoodData report in Python

I'm looking to connect to the GoodData API and export a report via the API in Python. The documentation is a bit confusing to follow.
I've defined a login to my instance of GoodData:
from urllib2 import Request, urlopen
import json
import requests

def login_gooddata(my_email, my_password):
    url = 'https://secure.gooddata.com/gdc/account/login'
    values = {
        "postUserLogin": {
            "login": my_email,
            "password": my_password,
            "remember": 0,
            "verify_level": 0
        }
    }
    headers = {
        'Content-Type': 'application/json',
        'Accept': 'application/json'
    }
    encoded_values = json.dumps(values)
    #request = Request(url, data=encoded_values, headers=headers)
    r = requests.post(url, data=encoded_values)
    return r
That successfully logs me in, returning a 200 response.
Given the documentation from the GoodData website on connecting to the API, I'm trying to export a raw project file.
I set the project and object IDs:
project_id = 'asibfakuyebkbhdbfaisdf'
object_id = '87234760'

values = {
    "report_req": {
        "reportDefinition": "/gdc/md/" + project_id + "/obj/" + object_id
    }
}

headers = {
    'Accept': 'application/json',
    'Content-Type': 'application/json'
}

url = 'https://secure.gooddata.com/gdc/app/projects/' + project_id + '/execute/raw/'

r = requests.post(url, data=json.dumps(values), headers=headers)
request = Request(url, data=json.dumps(values), headers=headers)
response_body = urlopen(requests).read()
print response_body
I played around with using r = requests.post(url, data=encoded_values) and request = Request(url, data=encoded_values, headers=headers). I'm still receiving an error, and I'm not really sure how to tackle the next steps.
Following the directions as stated in the documentation for connecting to the API:
You need to perform all HTTP requests from a single "session" that remembers cookies from the login: perform s = requests.Session() once, then use s.post instead of requests.post.
See https://stackoverflow.com/a/31571805/3407728 for more.
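A rough sketch of what that looks like for the snippets above; the endpoints and payloads are taken from the question, and whether the raw export needs additional steps on the GoodData side is not verified here.
import json
import requests

session = requests.Session()  # keeps the GDC authentication cookies across calls

# 1. Log in once; the session stores the cookies this response sets.
login_payload = {
    "postUserLogin": {
        "login": "my_email",       # placeholders
        "password": "my_password",
        "remember": 0,
        "verify_level": 0
    }
}
session.post("https://secure.gooddata.com/gdc/account/login",
             data=json.dumps(login_payload),
             headers={"Content-Type": "application/json", "Accept": "application/json"})

# 2. Reuse the same session (and its cookies) for the export call.
project_id = "asibfakuyebkbhdbfaisdf"
object_id = "87234760"
values = {"report_req": {"reportDefinition": "/gdc/md/" + project_id + "/obj/" + object_id}}

r = session.post("https://secure.gooddata.com/gdc/app/projects/" + project_id + "/execute/raw/",
                 data=json.dumps(values),
                 headers={"Content-Type": "application/json", "Accept": "application/json"})
print(r.status_code, r.text)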
