I have a simple Python script which should read a file from an HTTP source and make a PUT request to another HTTP endpoint.
block_size = 4096
file = urllib2.urlopen('http://path/to/someting.file').read(block_size)
headers = {'X-Auth-Token': token_id, 'content-type': 'application/octet-stream'}
response = requests.put(url='http://server/path', data=file, headers=headers)
How can I read the file and PUT it synchronously block by block (block_size bytes at a time), stopping once a block comes back empty?
What you want to do is called "streaming uploads". Try the following.
Get the file as a stream:
resp = requests.get(url, stream=True)
And then pass the resulting iterator as the body of the PUT:
requests.put(url, data=resp.iter_content(chunk_size=4096))
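Putting the two together with the URLs and token from the question (a minimal sketch, not tested against a real endpoint; when data is an iterator, requests sends the body with chunked transfer encoding, so the file never has to fit in memory):
import requests

headers = {'X-Auth-Token': token_id, 'content-type': 'application/octet-stream'}

# stream=True defers the download; iter_content then yields 4096-byte blocks
# until the source body is exhausted
with requests.get('http://path/to/someting.file', stream=True) as src:
    src.raise_for_status()
    response = requests.put('http://server/path',
                            data=src.iter_content(chunk_size=4096),
                            headers=headers)
print(response.status_code)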
I need to insert the contents of a JSON file into the payload = {} dictionary and have been unable to do this successfully. I tried writing the JSON file contents in as a string, which failed, and I tried inserting the JSON file into payload = {}, which also failed. Any thoughts?
import requests, meraki, json, os, sys

with open('networkid.txt') as file:
    array = file.readlines()

for line in array:
    line = line.rstrip("\n")
    url = 'https://api.meraki.com/api/v0/networks/%s/alertSettings' % line
    payload = {}
    headers = {'X-Cisco-Meraki-API-Key': 'API Key', 'Content-Type': 'application/json'}
    response = requests.request('PUT', url, headers=headers, data=payload, allow_redirects=True, timeout=10)
    print(response.text)
I am writing a script to deploy parameters to Meraki networks via the API. I have the JSON information formatted correctly in its own file, and what I need is to insert that JSON data into the payload in the script. I already have a for loop, which is necessary to run through a list of network IDs contained in a .txt file. Any ideas on how to do this?
The data parameter of requests.request takes an optional dictionary, list of tuples, bytes, or file-like object to send in the body of the request.
You can convert your properly formatted JSON file to a Python dictionary using json.load:
with open('json_information.json') as f:
    payload = json.load(f)
Then you can pass the payload into the call to requests.request, dumped back to a JSON string since the Content-Type is application/json:
with open('networkid.txt') as file:
    array = file.readlines()

for line in array:
    line = line.rstrip("\n")
    url = 'https://api.meraki.com/api/v0/networks/%s/alertSettings' % line
    headers = {'X-Cisco-Meraki-API-Key': 'API Key', 'Content-Type': 'application/json'}
    response = requests.request('PUT',
                                url,
                                headers=headers,
                                data=json.dumps(payload),
                                timeout=10)  # allow_redirects is True by default
    print(response.text)
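A slightly shorter equivalent (a sketch, assuming the same json_information.json file as above) is to pass json=payload and let requests serialize the dictionary and set the application/json Content-Type for you:
with open('json_information.json') as f:
    payload = json.load(f)

with open('networkid.txt') as file:
    for line in file:
        network_id = line.rstrip("\n")
        url = 'https://api.meraki.com/api/v0/networks/%s/alertSettings' % network_id
        headers = {'X-Cisco-Meraki-API-Key': 'API Key'}
        # json=payload serializes the dict and sets Content-Type: application/json
        response = requests.put(url, headers=headers, json=payload, timeout=10)
        print(response.text)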
I am currently able to send OpenCV image frames to my Flask server using the following code:
def sendtoserver(frame):
    imencoded = cv2.imencode(".jpg", frame)[1]
    headers = {"Content-type": "text/plain"}
    try:
        conn.request("POST", "/", imencoded.tostring(), headers)
        response = conn.getresponse()
    except conn.timeout as e:
        print("timeout")
    return response
But I want to send a unique_id along with the frame. I tried combining the frame and the id using JSON, but I get the following error: TypeError: Object of type 'bytes' is not JSON serializable. Does anybody have any idea how I can send some additional data along with the frame to the server?
UPDATED:
JSON format code:
def sendtoserver(frame):
    imencoded = cv2.imencode(".jpg", frame)[1]
    data = {"uid": "23", "frame": imencoded.tostring()}
    headers = {"Content-type": "application/json"}
    try:
        conn.request("POST", "/", json.dumps(data), headers)
        response = conn.getresponse()
    except conn.timeout as e:
        print("timeout")
    return response
I actually solved this by using the Python requests module instead of the http.client module, making the following changes to my code above.
import requests

def sendtoserver(frame):
    imencoded = cv2.imencode(".jpg", frame)[1]
    file = {'file': ('image.jpg', imencoded.tostring(), 'image/jpeg', {'Expires': '0'})}
    data = {"id": "2345AB"}
    response = requests.post("http://127.0.0.1/my-script/", files=file, data=data, timeout=5)
    return response
This works because I was trying to send multipart/form-data, and the requests module can send both files and data in a single request.
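For completeness, a rough sketch of the receiving end (the /my-script/ route and the field names simply mirror the snippet above; they are not from the original question):
from flask import Flask, request
import cv2
import numpy as np

app = Flask(__name__)

@app.route('/my-script/', methods=['POST'])
def receive_frame():
    uid = request.form['id']            # the extra form field
    raw = request.files['file'].read()  # the JPEG bytes
    frame = cv2.imdecode(np.frombuffer(raw, np.uint8), cv2.IMREAD_COLOR)
    return 'got frame %s with shape %s' % (uid, frame.shape)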
You can try encoding your image as a base64 string:
import base64

with open("image.jpg", "rb") as image_file:
    encoded_string = base64.b64encode(image_file.read())
And send it as a normal string.
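For example (a sketch; the endpoint and field names are illustrative), the base64 string is plain text, so it can go straight into a JSON body next to the id:
import base64
import requests

with open("image.jpg", "rb") as image_file:
    encoded_string = base64.b64encode(image_file.read()).decode('ascii')

payload = {"uid": "23", "frame": encoded_string}  # now JSON-serializable
response = requests.post("http://127.0.0.1/my-script/", json=payload, timeout=5)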
As others suggested, base64 encoding might be a good solution; however, if you can't or don't want to use it, you could add a custom header to the request, such as:
headers = {"X-my-custom-header": "uniquevalue"}
Then on the flask side:
unique_value = request.headers.get('X-my-custom-header')
or
unique_value = request.headers['X-my-custom-header']
That way you avoid the overhead of processing your image data again (if that matters), and you can generate a unique id for each frame with something like the Python uuid module.
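For instance (a sketch reusing the conn and imencoded names from the question's snippet):
import uuid

headers = {"Content-type": "text/plain",
           "X-my-custom-header": str(uuid.uuid4())}  # unique id per frame
conn.request("POST", "/", imencoded.tostring(), headers)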
Hope that helps
I've been looking around for ways to upload a large file with additional data, but there doesn't seem to be any solution. To upload a file, I've been using this code, and it's been working fine with small files:
with open("my_file.csv", "rb") as f:
files = {"documents": ("my_file.csv", f, "application/octet-stream")}
data = {"composite": "NONE"}
headers = {"Prefer": "respond-async"}
resp = session.post("my/url", headers=headers, data=data, files=files)
The problem is that the code loads the whole file before sending it, and I run into a MemoryError when uploading large files. I've looked around, and the way to stream data is to set
resp = session.post("my/url", headers=headers, data=f)
but I need to add {"composite": "NONE"} to the data. If not, the server wouldn't recognize the file.
You can use the requests-toolbelt to do this:
import requests
from requests_toolbelt.multipart import encoder

session = requests.Session()
with open('my_file.csv', 'rb') as f:
    form = encoder.MultipartEncoder({
        "documents": ("my_file.csv", f, "application/octet-stream"),
        "composite": "NONE",
    })
    headers = {"Prefer": "respond-async", "Content-Type": form.content_type}
    resp = session.post(url, headers=headers, data=form)
session.close()
This will cause requests to stream the multipart/form-data upload for you.
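If you also want to watch the upload progress, the toolbelt ships a MultipartEncoderMonitor that wraps the encoder; a sketch of the same request with a progress callback (session and url are the same as above):
from requests_toolbelt.multipart import encoder

def progress(monitor):
    # monitor.bytes_read grows as requests streams the multipart body
    print("%d of %d bytes sent" % (monitor.bytes_read, monitor.len))

with open('my_file.csv', 'rb') as f:
    form = encoder.MultipartEncoder({
        "documents": ("my_file.csv", f, "application/octet-stream"),
        "composite": "NONE",
    })
    monitor = encoder.MultipartEncoderMonitor(form, progress)
    headers = {"Prefer": "respond-async", "Content-Type": monitor.content_type}
    resp = session.post(url, headers=headers, data=monitor)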
I have a REST PUT request to upload a file using the Django REST framework. Whenever I upload a file using the Postman REST client, it works fine.
But when I try to do this with my code:
import requests

API_URL = "http://123.316.118.92:8888/api/"
API_TOKEN = "1682b28041de357d81ea81db6a228c823ad52967"
URL = API_URL + 'configuration/configlet/31'

files = {'file': open('configlet.txt', 'rb')}
print URL
print "Update Url ==-------------------"
headers = {'Content-Type': 'text/plain', 'Authorization': API_TOKEN}
resp = requests.put(URL, files=files, headers=headers)
print resp.text
print resp.status_code
I am getting an error on the server side:
MultiValueDictKeyError at /api/configuration/31/
"'file'"
I am passing 'file' as the key but still getting the above error. Please let me know what I might be doing wrong here.
This is how my Django server view looks:
def put(self, request, id, format=None):
    configlet = self.get_object(id)
    configlet.config_path.delete(save=False)
    file_obj = request.FILES['file']
    configlet.config_path = file_obj
    file_content = file_obj.read()
    params = parse_file(file_content)
    configlet.parameters = json.dumps(params)
    logger.debug("File content: " + str(file_content))
    configlet.save()
For this to work you need to send a multipart/form-data body. You should not set the Content-Type of the whole request to text/plain here; set only the MIME type of the one part:
files = {'file': ('configlet.txt', open('configlet.txt','rb'), 'text/plain')}
headers = {'Authorization': API_TOKEN}
resp = requests.put(URL, files=files, headers=headers)
This leaves setting the Content-Type header for the request as a whole to the library, and using files sets that to multipart/form-data for you.
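On the Django REST framework side, the view also needs a parser that understands multipart bodies; a minimal sketch (the class name is illustrative, only the parser_classes line matters):
from rest_framework.parsers import MultiPartParser
from rest_framework.views import APIView

class ConfigletUploadView(APIView):
    parser_classes = [MultiPartParser]

    def put(self, request, id, format=None):
        file_obj = request.FILES['file']  # matches the 'file' key sent above
        ...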
I'm using httplib.HTTPConnection to submit an HTTP POST request. I get a 200 response status but the response data looks obfuscated or something.
When I submit the request in Firefox, the response is displayed fine.
conn = httplib.HTTPConnection("www.foo.com")
conn.request('POST', '/foo', postdata, headers)
resp = conn.getresponse()
conn.close()
print resp.read()
If the response is unexpectedly 'binary', look at the Content-Encoding header. Most likely you are being served a compressed response; it can be either gzip or deflate.
If the encoding is gzip, decode it with:
import zlib
decompressor = zlib.decompressobj(16 + zlib.MAX_WBITS)
data = decompressor.decompress(response_body)
For deflate, you'd have to try both a default decompressor and one created with -zlib.MAX_WBITS:
try:
    decompressor = zlib.decompressobj()
    data = decompressor.decompress(response_body)
except zlib.error:
    decompressor = zlib.decompressobj(-zlib.MAX_WBITS)
    data = decompressor.decompress(response_body)
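Tying it together with the httplib response from the question (a sketch; note that the body has to be read before the connection is closed):
import zlib

response_body = resp.read()
encoding = resp.getheader('Content-Encoding', '')

if encoding == 'gzip':
    data = zlib.decompressobj(16 + zlib.MAX_WBITS).decompress(response_body)
elif encoding == 'deflate':
    try:
        data = zlib.decompressobj().decompress(response_body)
    except zlib.error:
        data = zlib.decompressobj(-zlib.MAX_WBITS).decompress(response_body)
else:
    data = response_body
print data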
In addition to the other answer, you could probably disable compressed responses altogether by setting the Accept-Encoding request header to identity:
headers = {
    # ...
    "Accept-Encoding": "identity",
    # ...
}