Download file, parse it and serve in Flask - python

I'm taking my first steps with Flask. I can successfully receive a file from a client and send it back with the code from here:
http://flask.pocoo.org/docs/patterns/fileuploads/
But how do I change it (e.g. line by line) and then serve it to the client?
I can get the string with read() after:
if file and allowed_file(file.filename):
and then process it. So the question really is: how do I serve output string as a file?
I don't want to save it to the server's disk at all (neither the original nor the changed version).

You can use make_response to create the response for your string and add Content-Disposition: attachment; filename=anyNameHere.txt to it before returning it:
from flask import Flask, make_response, request

app = Flask(__name__)

@app.route("/transform-file", methods=["POST"])
def transform():
    # Check for valid file and assign it to `inbound_file`
    inbound_file = request.files["file"]  # field name as in the upload pattern linked above
    data = inbound_file.read()
    data = data.replace(b"A", b"Z")  # read() returns bytes, so replace bytes with bytes
    response = make_response(data)
    response.headers["Content-Disposition"] = "attachment; filename=outbound.txt"
    return response
See also: The docs on streaming content
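To exercise the endpoint, here is a minimal client-side sketch (it assumes the server above runs on localhost:5000 and that the upload form field is named "file"; neither detail comes from the original answer):
import requests

with open("input.txt", "rb") as f:
    resp = requests.post("http://localhost:5000/transform-file", files={"file": f})

# The transformed content comes back as an attachment named outbound.txt.
with open("outbound.txt", "wb") as out:
    out.write(resp.content)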

Related

How to return a PDF file from in-memory buffer using FastAPI?

I want to get a PDF file from S3 and then return it to the frontend from the FastAPI backend.
This is my code:
@router.post("/pdf_document")
def get_pdf(document: PDFRequest):
    s3 = boto3.client('s3')
    file = document.name
    f = io.BytesIO()
    s3.download_fileobj('adm2yearsdatapdf', file, f)
    return StreamingResponse(f, media_type="application/pdf")
This API returns 200 status code, but it does not return the PDF file as a response.
As the entire file data is already loaded into memory, there is no actual reason for using StreamingResponse. You should instead use Response, passing it the file bytes (use BytesIO.getvalue() to get the bytes containing the entire contents of the buffer), defining the media_type, as well as setting the Content-Disposition header, so that the PDF file can be either viewed in the browser or downloaded to the user's device. For more details have a look at this answer, as well as this and this answer.

Additionally, as the buffer is discarded when the close() method is called, you could also use FastAPI/Starlette's BackgroundTasks to close the buffer after returning the response, in order to release the memory. Alternatively, you could get the bytes using pdf_bytes = buffer.getvalue(), then close the buffer using buffer.close() and finally, return Response(pdf_bytes, headers=...). Example:
import io

from fastapi import Response, BackgroundTasks

@app.get("/pdf")
def get_pdf(background_tasks: BackgroundTasks):
    buffer = io.BytesIO()
    # ...
    background_tasks.add_task(buffer.close)
    headers = {'Content-Disposition': 'inline; filename="out.pdf"'}
    return Response(buffer.getvalue(), headers=headers, media_type='application/pdf')
To have the PDF file downloaded rather than viewed in the browser, use:
headers = {'Content-Disposition': 'attachment; filename="out.pdf"'}
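For completeness, here is a minimal sketch of the alternative described above, where the bytes are taken out of the buffer and the buffer is closed explicitly before returning (how the buffer gets filled is left out, just as in the example above):
import io

from fastapi import FastAPI, Response

app = FastAPI()

@app.get("/pdf")
def get_pdf():
    buffer = io.BytesIO()
    # ... fill the buffer, e.g. with s3.download_fileobj(bucket, key, buffer)
    pdf_bytes = buffer.getvalue()
    buffer.close()  # release the memory held by the buffer
    headers = {'Content-Disposition': 'attachment; filename="out.pdf"'}
    return Response(pdf_bytes, headers=headers, media_type='application/pdf')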

Upload large video file as chunks and send some parameters along with that using python flask?

I was able to upload a large file to the server using the code below:
@app.route("/upload", methods=["POST"])
def upload():
    with open("/tmp/output_file", "bw") as f:
        chunk_size = 4096
        while True:
            chunk = request.stream.read(chunk_size)
            if len(chunk) == 0:
                return
            f.write(chunk)
But if I use request.form['userId'], or any other parameter sent as form data, in the above code, it fails.
As per one blog post: Flask's request has a stream that will have the file data you are uploading. You can read from it, treating it as a file-like object. The trick seems to be that you shouldn't use other request attributes like request.form or request.file, because this will materialize the stream into memory/file. Flask by default saves files to disk if they exceed 500Kb, so don't touch file.
Is there a way where we can send additional parameters like userId along with the file being uploaded in flask?
Use headers in the request. If you want to send the user name along with the data:
headers = {}
headers['username'] = 'name of the user'
r = requests.post(url, data=chunk, headers=headers)
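On the Flask side, the value can then be read from request.headers while the body is streamed, so request.form is never touched. A minimal sketch (the header name username mirrors the snippet above; the file path and return value are just illustrative):
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload():
    # Metadata travels in the headers, so the body stays a raw stream.
    username = request.headers.get("username")
    with open("/tmp/output_file", "wb") as f:
        while True:
            chunk = request.stream.read(4096)
            if not chunk:
                break
            f.write(chunk)
    return jsonify({"uploadedBy": username})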

ReSTful Flask file upload with request.stream

I am attempting to create a simple Flask endpoint for uploading files via POST or PUT. I want the filename in the URL, and then to (after the request headers) just stream the raw file data in the request.
I also need to be able to upload files slightly larger than 2GB, and I need to do this without storing the entire file in memory. At first, this seemed simple enough:
@application.route("/upload/<filename>", methods=['POST', 'PUT'])
def upload(filename):
    # Authorization and sanity checks skipped.
    filename = secure_filename(filename)
    fileFullPath = os.path.join(application.config['UPLOAD_FOLDER'], filename)
    with open(fileFullPath, 'wb') as f:
        copyfileobj(request.stream, f)
    return jsonify({'filename': filename})
With a multipart/form-data upload, I can simply call .save() on the file.
However, any file I upload seems to have a different checksum (well, sha256sum) on the server than on the source. When uploading a standard text file, newlines seem to be getting stripped. Binary files seem to be getting mangled in other strange ways.
I am sending Content-Type: application/octet-stream when uploading to try to make Flask treat all uploads as binary. Is request.stream (a proxy to wsgi.input) opened as non-binary? I can't seem to figure that out from the Flask code. How can I stream the request data, in raw binary format, to a file on disk?
I'm open to hacks; this is for a test project (so I'm also not interested in hearing how sending this as formdata would be better, or how this isn't a good way to upload files, etc.)
I am testing this via:
curl -H 'Content-Type: application/octet-stream' -H 'Authorization: ...' -X PUT --data @/path/to/test/file.name https://test.example.com/upload/file.name
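For reference, the same raw-body upload can also be exercised from Python with requests, which streams a file object verbatim as the request body (the URL, path, and Authorization value below simply mirror the curl command above):
import requests

with open("/path/to/test/file.name", "rb") as f:
    r = requests.put(
        "https://test.example.com/upload/file.name",
        data=f,  # requests streams the file object as the raw request body
        headers={"Content-Type": "application/octet-stream", "Authorization": "..."},
    )
print(r.json())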

Locally save and parse suds response?

I am new to SOAP and suds. I am calling a non-XML SOAP API using suds. A given result contains a ton of different sub-arrays. I thought I would just save the whole response locally for parsing later, but that is easier said than done. And I don't get this business with the built-in cache option where you cache for x days or whatever. Can I permanently save and parse a response locally?
You can write the response to a local file:
import os

from suds.client import Client

client = Client("http://someWsdl?wsdl")
# Getting response object
method_response = client.service.someMethod(methodParams)
# Open local file
fd = os.open("response_file.txt", os.O_RDWR | os.O_CREAT)
# Convert response object into string
response_str = str(method_response)
# Write response to the file (os.write expects bytes on Python 3)
ret = os.write(fd, response_str.encode("utf-8"))
# Close the file
os.close(fd)
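The same idea reads a little more idiomatically with a plain file object, which closes itself when the with block exits; a minimal sketch (the WSDL URL, method name, and file name are just the placeholders from above):
from suds.client import Client

client = Client("http://someWsdl?wsdl")
method_response = client.service.someMethod(methodParams)

# Write the stringified response to a local file for later parsing.
with open("response_file.txt", "w") as f:
    f.write(str(method_response))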

Webapp2 request application/octet-stream file upload

I am trying to upload a file to S3 from a form via AJAX. I am using Fine Uploader (http://fineuploader.com/) on the client side and webapp2 on the server side. It sends the parameter as qqfile in the request, and I can see the image data in the request headers, but I have no idea how to get it back in browsers that do not use multipart encoding.
This is how I was doing it in the standard html form post with multipart.
image = self.request.POST["image"]
This gives me the image name and the image file.
Currently I have only been able to get the image filename back, not the data, with
image = self.request.get_all('image')
[u'image_name.png']
When using the POST I get a warning about the content headers being application/octet-stream:
<NoVars: Not an HTML form submission (Content-Type: application/octet-stream)>
How do I implement a BlobStoreHandler in webapp2 outside of GAE?
The code in your question is not very clear to me, but you can work from an example.
Have a look at this article from Nick Johnson. He implements a dropbox service using the App Engine blobstore and Plupload on the client: http://blog.notdot.net/2010/04/Implementing-a-dropbox-service-with-the-Blobstore-API-part-2
I ended up using Fine Uploader (http://fineuploader.com/), which sends a multipart-encoded form to my handler endpoint.
Inside the handler I can simply reference the POST data and then read the FieldStorage object into a cStringIO object.
image = self.request.POST["qqfile"]
imgObj = cStringIO.StringIO(image.file.read())
# Connect to S3...
# create s3 Key
key = bucket.new_key("%s" % uuid.uuid4())
# guess mimetype for headers
file_mime_type = mimetypes.guess_type(image.filename)
if file_mime_type[0]:
    file_headers = {"Content-Type": "%s" % file_mime_type[0]}
else:
    file_headers = None
key.set_contents_from_string(imgObj.getvalue(), headers=file_headers)
key_str = key.key
# return JSON response with key and append to s3 url on front end.
Note: qqfile is the parameter Fine Uploader uses.
I am faking the progress, but that is OK for my use case; there is no need for a BlobStoreUploadHandler.
