How to upload a jpg file and save it in a Flask-RESTPlus API? - python

I use the Flask-RESTPlus API. I want to upload a jpg file, then rename it, save it to a files location, and save its URL. I searched and found this code at https://flask-restplus.readthedocs.io/en/stable/parsing.html#file-upload, but I don't understand the do_something_with_file statement in this code. Could you help me?
from werkzeug.datastructures import FileStorage

upload_parser = api.parser()
upload_parser.add_argument('file', location='files',
                           type=FileStorage, required=True)

@api.route('/upload/')
@api.expect(upload_parser)
class Upload(Resource):
    def post(self):
        args = upload_parser.parse_args()
        uploaded_file = args['file']  # This is a FileStorage instance
        url = do_something_with_file(uploaded_file)
        return {'url': url}, 201

You can refer to the original Flask documentation for uploading files: Uploading files.
Basically, all you need is the FileStorage.save() method to save the uploaded file.
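For example, do_something_with_file could rename the upload and write it out with FileStorage.save(), then return the URL to store. A minimal sketch, where the uploads directory and the /uploads/ URL prefix are hypothetical stand-ins for your own:

```python
import os
import tempfile
import uuid

UPLOAD_DIR = tempfile.mkdtemp()  # stand-in for your uploads folder

def do_something_with_file(uploaded_file):
    # Rename to a unique name, keeping the original extension.
    ext = os.path.splitext(uploaded_file.filename)[1].lower()  # e.g. '.jpg'
    new_name = uuid.uuid4().hex + ext
    # FileStorage.save() writes the uploaded stream to the given path.
    uploaded_file.save(os.path.join(UPLOAD_DIR, new_name))
    return '/uploads/' + new_name  # hypothetical URL prefix
```

The returned value is what the post() method above sends back as 'url'.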

Related

Django - FileField - PDF vs octet-stream - AWS S3

I have a model with a file field like so:
class Document(models.Model):
    file = models.FileField(...)
Elsewhere in my application, I am trying to download a pdf file from an external url and upload it to the file field:
import requests
from django.core.files.base import ContentFile
...
# get the external file:
response = requests.get('<external-url>')
# convert to ContentFile:
file = ContentFile(response.content, name='document.pdf')
# update document:
document.file.save('document.pdf', content=file, save=True)
However, I have noticed the following behavior:
files uploaded via the django-admin portal have the content_type "application/json"
files uploaded via the script above have the content_type "application/octet-stream"
How can I ensure that files uploaded via the script have the "application/json" content_type? Is it possible to set the content_type on the ContentFile object? This is important for the frontend.
Other notes:
I am using AWS S3 as my file storage system.
Uploading a file from my local file storage via the script (i.e. using with open(...) as file:) still uploads the file as "application/octet-stream"
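One approach worth sketching, under the assumption that the S3 storage backend reads a content_type attribute from the content object when one is present (django-storages' s3boto backend does): guess the type from the file name and attach it before saving. The helper below is hypothetical, not from the question; in Django you would set the same attribute on the ContentFile before calling document.file.save(...):

```python
import mimetypes
from io import BytesIO

def file_with_content_type(data, name):
    """Wrap raw bytes in a file-like object carrying an explicit
    content_type guessed from the file name (hypothetical helper)."""
    guessed, _ = mimetypes.guess_type(name)  # 'document.pdf' -> 'application/pdf'
    f = BytesIO(data)
    f.name = name
    f.content_type = guessed or 'application/octet-stream'
    return f
```

With the question's code this would look like: file = ContentFile(response.content, name='document.pdf'); file.content_type = 'application/pdf'.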

How do I save an image using a Flask API, then return it so my React App can use it?

I am trying to use my Flask API to save an image to the database OR just to the file system, but this is something I have never done, and I am getting nowhere with it.
I would like to be able to return the image when the route is called, and to use it in my ReactJS application with just an img tag.
All I have been able to find is how to save the image to the database and then download it using a route. I need to be able to return it directly. (What I have works, it's just not what I need.)
Here is what that was:
from io import BytesIO
from flask import request, jsonify, send_file

@app.route('/img-upload', methods=['POST'])
def img_upload():
    file = request.files['image']
    newFile = MealPlan(name=file.filename, data=file.read())
    db.session.add(newFile)
    db.session.commit()
    return jsonify({"Done!": "The file has been uploaded."})

@app.route('/get-mealplan-image/<given_mealplan_id>')
def download_img(given_mealplan_id):
    file_data = MealPlan.query.filter_by(id=given_mealplan_id).first()
    return send_file(BytesIO(file_data.data), attachment_filename=file_data.name, as_attachment=True)
Saving the files on the file system is the more appropriate method. Here is a minimal example:
import os
from flask import request, jsonify, send_from_directory

basedir = os.path.abspath(os.path.dirname(__file__))
uploads_path = os.path.join(basedir, 'uploads')  # assumes you have created an uploads folder

@app.route('/img-upload', methods=['POST'])
def upload_image():
    f = request.files['image']
    f.save(os.path.join(uploads_path, f.filename))  # save the file into the uploads folder
    newFile = MealPlan(name=f.filename)  # only save the filename to the database
    db.session.add(newFile)
    db.session.commit()
    return jsonify({"Done!": "The file has been uploaded."})

@app.route('/images/<path:filename>')
def serve_image(filename):
    return send_from_directory(uploads_path, filename)  # return the image
In your React app, you can use the filename to build the image URL, e.g. /images/hello.jpg.
Update:
If you can only get the id, the view function will be similar:
@app.route('/get-mealplan-image/<given_mealplan_id>')
def download_img(given_mealplan_id):
    file_data = MealPlan.query.filter_by(id=given_mealplan_id).first()
    return send_from_directory(uploads_path, file_data.name)
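One caveat with the snippets above: f.filename comes from the client, so it should be sanitized before being used as a path (werkzeug ships secure_filename for exactly this). A minimal stdlib stand-in, for illustration only:

```python
import os
import re

def safe_name(filename):
    # Keep only the base name so '../' can't escape the uploads folder,
    # then replace anything unusual with '_'. A minimal stand-in for
    # werkzeug.utils.secure_filename, not a replacement for it.
    base = os.path.basename(filename.replace('\\', '/'))
    return re.sub(r'[^A-Za-z0-9._-]', '_', base) or 'upload'
```

You would then call f.save(os.path.join(uploads_path, safe_name(f.filename))) instead of trusting the raw name.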

excel download with Flask-RestPlus?

How do I implement an API endpoint to download an Excel file using Flask-RESTPlus?
Previously I implemented a similar function using Pyramid, but that method didn't work here.
Here is the old code snippet:
workBook = openpyxl.Workbook()
fileName = 'Report.xls'
response = Response(content_type='application/vnd.ms-excel',
                    content_disposition='attachment; filename=%s' % fileName)
workBook.save(response)
return response
Thanks for the help.
send_from_directory provides a secure way to quickly expose static files from an upload folder or something similar when using Flask-RESTPlus:
import os
from flask import send_from_directory

@api.route('/download')
class Download(Resource):
    def get(self):
        fileName = 'Report.xls'
        return send_from_directory(os.getcwd(), fileName, as_attachment=True)
I have assumed the file is in the current working directory; the path to the download file can be adjusted accordingly.
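If the file should be built in memory rather than read from disk, the Pyramid snippet from the question translates roughly to the sketch below. The fixed bytes stand in for what openpyxl's workBook.save(buffer) would write; everything else uses plain Flask:

```python
from io import BytesIO
from flask import Flask, Response

app = Flask(__name__)

@app.route('/download')
def download():
    buf = BytesIO()
    # In the real endpoint: openpyxl.Workbook().save(buf)
    buf.write(b'fake xls bytes')
    return Response(
        buf.getvalue(),
        mimetype='application/vnd.ms-excel',
        headers={'Content-Disposition': 'attachment; filename=Report.xls'},
    )
```

Building the Response by hand like this avoids writing a temporary file and works the same inside a Flask-RESTPlus Resource method.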

How to download an object from google cloud storage bucket?

I uploaded a CSV file test.csv to a Google Cloud Storage bucket. The resulting public_url is like this:
https://storage.googleapis.com/mybucket/test-2017-04-11-025727.csv
Originally test.csv has some rows and columns containing numbers like this:
6,148,72,35,0,33.6,0.627,50,1
8,183,64,0,0,23.3,0.672,32,1
...
...
I uploaded the file by referring to the bookshelf tutorial --> https://github.com/GoogleCloudPlatform/getting-started-python, step 6-pubsub. The uploaded file is saved with a timestamp appended to its name.
Now I want to download the file that I uploaded to the bucket by using requests.
Here is what I've been working on. The original sample is at 6-pubsub/bookshelf/crud.py. Below is the script I edited, based on that sample.
from machinelearning import get_model, oauth2, storage, tasks
from flask import Blueprint, current_app, redirect, render_template, request, session, url_for
import requests
import os
...
...
crud = Blueprint('crud', __name__)
save_folder = 'temp/'

def upload_csv_file(file):
    ...
    ...
    return public_url
...
...

@crud.route('/add', methods=['GET', 'POST'])
def add():
    data = request.form.to_dict(flat=True)
    # This will upload the file that I pushed from my local directory to GCS
    if request.method == 'POST':
        csv_url = upload_csv_file(request.files.get('file'))
        if csv_url:
            data['csvUrl'] = csv_url
            # I think this is not working. This should download the file back
            # and save it to a temporary folder inside the current working directory
            response = requests.get(csv_url)
            if not os.path.exists(save_folder):
                os.makedirs(save_folder)
            with open(save_folder + 'testdata.csv', 'wb') as f:
                f.write(response.content)
...
...
I opened the temp folder and checked testdata.csv. It shows me an error like this inside the CSV file:
<?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>Anonymous users does not have storage.objects.get access to object mybucket/test-2017-04-11-025727.csv.</Details></Error>
I was hoping testdata.csv would have the same contents as test.csv, but it does not.
I already rechecked my OAuth client and secret and the bucket id in config.py, but the error is still there.
How do I solve this kind of error?
Thank you in advance.
I solved it. As @Brandon Yarbrough said, the bucket's object is not publicly readable.
To make the bucket public (taken from this link --> https://github.com/GoogleCloudPlatform/gsutil/issues/419):
gsutil defacl set public-read gs://<bucket_name>
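That gsutil command (and the google-cloud-storage client) takes a gs:// URI rather than the public URL, and the mapping between the two is mechanical. A small helper, not from the original answer, assuming public URLs of the storage.googleapis.com/<bucket>/<object> form shown above:

```python
from urllib.parse import urlparse

def gs_uri_from_public_url(public_url):
    """Turn a storage.googleapis.com public URL into a gs:// URI,
    e.g. for use with gsutil or the google-cloud-storage client."""
    path = urlparse(public_url).path.lstrip('/')  # 'mybucket/test-....csv'
    bucket, _, blob = path.partition('/')
    return 'gs://{}/{}'.format(bucket, blob)
```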

Django and S3 direct uploads

In my project I have S3 storage configured and working properly. Now I'm trying to configure direct uploads to S3 using s3direct. It is working almost fine: the user is able to upload the image and it gets stored in S3. The problems come when I save a reference to the image in the DB.
models.py
class FullResPicture(Audit):
    docfile = models.ImageField()
    picture = models.OneToOneField(Picture, primary_key=True)
settings.py
...
S3DIRECT_DESTINATIONS = {
    # Allow anybody to upload jpeg's and png's.
    'imgs': ('uploads/imgs', lambda u: u.is_authenticated(), ['image/jpeg', 'image/png'], 'public-read', 'bucket-name'),
}
...
views.py
# docfile is the url to the image that the user uploaded directly to S3:
# https://s3-eu-west-1.amazonaws.com/bucket/uploads/imgs/picture.jpeg
fullRes = FullResPicture(docfile=form_list[1].cleaned_data['docfile'])
So if I look at my DB, I've got some images that work fine (those I uploaded using only django-storages), with a docfile value like this:
images/2015/08/11/image.jpg
When the application tries to access those images, S3 boto is able to get the image properly.
But then I've got the images uploaded directly from the user's browser. For those, I am storing the full url, so they look like this in the DB:
https://s3-eu-west-1.amazonaws.com/bucket/uploads/imgs/Most-Famous-Felines-034.jpg
When the application tries to access them, I've got this exception:
File "/Users/mariopersonal/Documents/dev/venv/pictures/lib/python2.7/site-packages/django/db/models/fields/files.py", line 49, in _get_file
    self._file = self.storage.open(self.name, 'rb')
File "/Users/mariopersonal/Documents/dev/venv/pictures/lib/python2.7/site-packages/django/core/files/storage.py", line 35, in open
    return self._open(name, mode)
File "/Users/mariopersonal/Documents/dev/venv/pictures/lib/python2.7/site-packages/storages/backends/s3boto.py", line 363, in _open
    name = self._normalize_name(self._clean_name(name))
File "/Users/mariopersonal/Documents/dev/venv/pictures/lib/python2.7/site-packages/storages/backends/s3boto.py", line 341, in _normalize_name
    name)
SuspiciousOperation: Attempted access to 'https:/s3-eu-west-1.amazonaws.com/bucket/uploads/imgs/Most-Famous-Felines-034.jpg' denied.
So apparently, S3 boto doesn't like the file references as full url.
For troubleshooting purposes, I tried hardcoding the saved value, so instead of the full URL it saves only the last part, but then I've got this other exception when it tries to access the image:
IOError: File does not exist: uploads/imgs/Most-Famous-Felines-034.jpg
Does anybody know what is going wrong here? Does anybody have a working example of direct upload to S3 that stores the reference to the uploaded file in a model?
Thanks.
This is the way I fixed it, in case it helps somebody else. This solution applies if you already have django-storages working properly and django-s3direct uploading the images from the client side, but you cannot make them work together.
Use the same bucket
The first thing I did was make sure that both django-storages and django-s3direct were configured to use the same bucket. As you already have both working separately, just check that both use the same bucket. For most users, this is something like:
settings.py
...
S3DIRECT_DESTINATIONS = {
    # Allow anybody to upload jpeg's and png's.
    'imgs': ('uploads/imgs', lambda u: u.is_authenticated(), ['image/jpeg', 'image/png'], 'public-read', AWS_STORAGE_BUCKET_NAME),
}
...
Note that we are using AWS_STORAGE_BUCKET_NAME, which should be defined for the django-storages configuration.
In my case it was a little more complex, as I am using a different bucket for different models.
Store only the key
When using s3direct, once the user has uploaded the image and submitted the form, our view receives the URL where S3 has placed the image. If we store this URL, django-storages won't be able to access the image later, so what we have to do is store only the file's key.
The file's key is the path to the image inside the bucket. E.g. for the url https://s3-eu-west-1.amazonaws.com/bucket/uploads/imgs/Most-Famous-Felines-034.jpg the key is uploads/imgs/Most-Famous-Felines-034.jpg, so that is the value we need to store in our model. In my case I use this snippet to extract the key from the url:
def key_from_url(url, bucket):
    try:
        index_of = url.index(bucket)
    except ValueError:
        raise ValueError('The url provided does not match the bucket name')
    # skip past the bucket name and the slash that follows it
    return url[index_of + len(bucket) + 1:]
Once I made those changes, it worked seamlessly.
I hope this helps anybody in the same situation.
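An equivalent of that key extraction written with urlparse, a bit more defensive about the URL shape. A sketch only; it assumes path-style URLs like the example above, where the path is /<bucket>/<key>:

```python
from urllib.parse import urlparse

def key_from_url(url, bucket):
    """Extract the S3 object key from a path-style URL,
    e.g. https://s3-<region>.amazonaws.com/<bucket>/<key>."""
    path = urlparse(url).path.lstrip('/')
    if not path.startswith(bucket + '/'):
        raise ValueError('The url provided does not match the bucket name')
    return path[len(bucket) + 1:]
```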
