I'm using the httpx library, which allows me to make HTTP/2 requests to target sites. However, when I use a proxy, it seems to automatically downgrade my request to HTTP/1. For example:
import httpx

async def main():
    async with httpx.AsyncClient(http2=True) as client:
        response = await client.get('someurl', headers=headers)
        print(response.http_version)
This prints HTTP/2.
But the same thing using a proxy, like so:

client = httpx.AsyncClient(http2=True, proxies=someproxydictionary)

prints HTTP/1. Why does this behavior happen only when routing through proxies?
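For reference, a minimal side-by-side sketch of the two cases; the proxy URL here is a placeholder, and the proxies mapping assumes an httpx version that accepts that argument, matching the question's usage:

import asyncio
import httpx

async def main():
    # Direct connection: negotiates HTTP/2 when the server supports it.
    async with httpx.AsyncClient(http2=True) as client:
        response = await client.get('https://example.com')
        print('direct:', response.http_version)

    # The same request routed through a proxy (placeholder URL).
    async with httpx.AsyncClient(http2=True,
                                 proxies={'all://': 'http://localhost:3128'}) as client:
        response = await client.get('https://example.com')
        print('proxied:', response.http_version)

asyncio.run(main())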
I'm using the aiohttp client to send a request to a URL and collect the redirected URLs.
In my case, the redirected URL contains Unicode text, and I want it unmodified.
For example, the actual redirected URL is example.com/förderprojekte-e-v, but the aiohttp client auto-encodes it and returns example.com/f\udcf6rderprojekte-e-v.
How can I make aiohttp disable this auto-encoding of redirected URLs?
For the requests module, this solution works, but I need help with aiohttp.
My code:
import aiohttp

async def fetch(url):
    # url = 'https://www.example.com/test/123'
    async with aiohttp.ClientSession() as client:
        async with client.get(url, allow_redirects=True) as resp:
            html = await resp.read()
            redir_url = str(resp.url)  # example.com/f\udcf6rderprojekte-e-v
Or at least, tell me how to convert \udcf6 to ö.
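The \udcf6 is a lone surrogate left by Python's surrogateescape error handler when a non-UTF-8 byte (0xf6, which is ö in Latin-1) was decoded. A minimal sketch of the conversion, assuming the original bytes were Latin-1 encoded:

s = 'example.com/f\udcf6rderprojekte-e-v'
# Re-encode the surrogates back into the raw bytes, then decode them as Latin-1.
fixed = s.encode('utf-8', 'surrogateescape').decode('latin-1')
print(fixed)  # example.com/förderprojekte-e-v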
from aiohttp import web

async def handle_redirected_url(request):
    scheme = request.scheme            # 'https'
    host = request.host                # 'example.com'
    unencoded_path = request.raw_path  # '/f\udcf6rderprojekte-e-v'
    unencoded_url = f"{scheme}://{host}{unencoded_path}"
    return web.json_response(status=200, data={'unencoded_url': unencoded_url})
See the request object attributes at https://docs.aiohttp.org/en/stable/web_reference.html
I am trying to make two services communicate. The first API is exposed to the user; the second is hidden and can process files, so the first can redirect requests to it.
I want to make the POST request asynchronous using aiohttp, but I am facing this error: "There was an error parsing the body"
To recreate the error, let's say this is the server code:
from fastapi import FastAPI
from fastapi import UploadFile, File

app = FastAPI()

@app.post("/upload")
async def transcript_file(file: UploadFile = File(...)):
    pass
And this is the client code:
from fastapi import FastAPI
import aiohttp

app = FastAPI()

@app.post("/upload_client")
async def async_call():
    async with aiohttp.ClientSession() as session:
        headers = {'accept': '*/*',
                   'Content-Type': 'multipart/form-data'}
        file_dict = {"file": open("any_file", "rb")}
        async with session.post("http://localhost:8000/upload", headers=headers, data=file_dict) as response:
            return await response.json()
Description:
Run the server on port 8000 and the client on any port you like.
Open the browser and open the docs on the client.
Execute the POST request and see the error.
Environment:
aiohttp = 3.7.4
fastapi = 0.63.0
uvicorn = 0.13.4
python-multipart = 0.0.2
Python version: 3.8.8
From this answer:
If you are using one of multipart/* content types, you are actually required to specify the boundary parameter in the Content-Type header, otherwise the server (in the case of an HTTP request) will not be able to parse the payload.
You need to remove the explicit setting of the Content-Type header; aiohttp will add it implicitly for you, including the boundary parameter.
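A minimal sketch of the corrected client call; the only change from the question's code is dropping the hand-rolled Content-Type header so aiohttp can generate the multipart/form-data header with its boundary:

import aiohttp

async def async_call():
    async with aiohttp.ClientSession() as session:
        file_dict = {"file": open("any_file", "rb")}
        # No explicit Content-Type here: aiohttp sets
        # multipart/form-data with a generated boundary itself.
        async with session.post("http://localhost:8000/upload", data=file_dict) as response:
            return await response.json()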
I am getting a 403 HTTP error while sending an HTTP POST with an SSL certificate. I enabled debugging on the local server, and it looks like Locust is not sending the certs with the request, which results in the 403 error. The Python requests library works fine. Also, how do I enable verbose mode in Locust to see whether it really attaches the certificate to the request?
Do you guys know how to debug this issue? The same request works fine with Postman.
from locust import task, tag, between
from locust.contrib.fasthttp import FastHttpUser

class ApiClient(FastHttpUser):
    wait_time = between(0, 100)

    def on_start(self):
        self.client.verify = "~/client.crt"
        self.client.cert = ('~/client.crt', '~/client.key')

    @task
    def get_profile(self):
        print(self.client.cert)
        resp = self.client.post("/android/callback", {"pcc": "123"})
        print("Response status code:", resp.status_code)
        print("Response text:", resp.text)
I suggest using HttpUser, which is based on the requests framework, and following the SSL cert instructions from the official requests documentation:
https://2.python-requests.org/en/latest/user/advanced/#ssl-cert-verification
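A minimal sketch of that suggestion, keeping the question's placeholder cert paths; HttpUser's client is a requests.Session, so verify and cert work exactly as in plain requests:

from locust import HttpUser, task, between

class ApiClient(HttpUser):
    wait_time = between(0, 100)

    def on_start(self):
        # requests.Session attributes, applied to every subsequent request.
        self.client.verify = "~/client.crt"
        self.client.cert = ("~/client.crt", "~/client.key")

    @task
    def get_profile(self):
        resp = self.client.post("/android/callback", data={"pcc": "123"})
        print("Response status code:", resp.status_code)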
I need to test POST and GET calls against an NGINX server.
I need to capture the error codes and verify the response. I was able to test the GET requests by hitting localhost:8080 (NGINX is running in Docker, exposing 8080), but I'm not sure how to test the POST calls.
Can we construct a dummy request to test a POST call? NGINX runs with the default page.
Below is one way to make a POST request to an endpoint in Python:
import requests

API_ENDPOINT = "http://pastebin.com/api/api_post.php"
data = {'param1': 'value1',
        'param2': 'value2'}

# Sending the POST request and saving the response as a response object
r = requests.post(url=API_ENDPOINT, data=data)

# Extracting the response text
pastebin_url = r.text
print("The pastebin URL is: %s" % pastebin_url)
We are using aiohttp to make multiple requests to various website vendors to grab their latest data.
Some of the content providers serve the data from a cache. Is it possible to request the data from the server directly? We have tried to pass in the headers parameter with no luck.
from aiohttp import ClientSession

async def fetch(url):
    global response
    headers = {'Cache-Control': 'no-cache'}
    async with ClientSession() as session:
        async with session.get(url, headers=headers, proxy="OUR-PROXY") as response:
            return await response.read()
The goal is to get the Last-Modified date header, which is not provided in the cached response.
Try adding an additional query parameter with a dynamic value (e.g. a timestamp) to the URL.
This will prevent caching on the server side even if it ignores Cache-Control.
Example:
from: https://example.com/test
to: https://example.com/test?timestamp=20180724181234
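A minimal sketch of that cache-busting approach layered onto the fetch function from the question; the parameter name "timestamp" is arbitrary, and "OUR-PROXY" stays a placeholder:

import time
from aiohttp import ClientSession

async def fetch(url):
    # A dynamic query parameter makes every request URL unique.
    cache_buster = {'timestamp': time.strftime('%Y%m%d%H%M%S')}
    headers = {'Cache-Control': 'no-cache'}
    async with ClientSession() as session:
        async with session.get(url, headers=headers, params=cache_buster,
                               proxy="OUR-PROXY") as response:
            print(response.headers.get('Last-Modified'))
            return await response.read()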