Issue uploading an image for prediction using a multipart POST request - Python

The call is currently made from a Flutter application, which sends a multipart POST request.
Flutter Code
var request = http.MultipartRequest(
  'POST',
  Uri.parse('https://techfarmtest.herokuapp.com/upload'),
);
Map<String, String> headers = {"Content-type": "multipart/form-data"};
request.files.add(
  http.MultipartFile(
    widget.selectedImage.toString(),
    widget.selectedImage.readAsBytes().asStream(),
    widget.selectedImage.lengthSync(),
    filename: widget.selectedImage.path.split('/').last,
  ),
);
request.headers.addAll(headers);
var res = await request.send();
http.Response response = await http.Response.fromStream(res);
var data = jsonDecode(response.body);
return data;
I intend to upload the image to the backend, perform the prediction there, and retrieve the result in JSON format. The backend is written in Flask.
Flask Code
@app.route('/upload', methods=["POST"])
def upload_image():
    if request.method == "POST":
        imageFile = request.files['image']
        fileName = werkzeug.utils.secure_filename(imageFile.filename)
        print('\nReceived file name: ' + imageFile.filename)
        imageFile.save('./uploadedImages/' + fileName)
        return pred('./uploadedImages/' + fileName)

def pred(sampleFile):
    model = load_model('./model.h5')
    # model.summary()
    sample_file = sampleFile
    sample_img = image.load_img(sample_file, target_size=(256, 256, 3))
    sample_img = image.img_to_array(sample_img)
    sample_img = np.expand_dims(sample_img, axis=0)
    prediction_arr = model.predict(sample_img)
    result = {
        'Sample': str(sampleFile),
        'Label': str(class_names[prediction_arr.argmax()]),
        'Confidence': str(prediction_arr.max())
    }
    return jsonify(result)
The current issue I am facing is that the request fails with a 400 (Bad Request). This is rough code (pseudocode) that I have pieced together from various resources. Is there any way to go about it?

So, I figured it out by myself.
I will attach the code below for future reference.
Flutter code:
var request = http.MultipartRequest(
  'POST',
  Uri.parse('https://techfarmtest.herokuapp.com/upload'),
);
request.files.add(
  await http.MultipartFile.fromPath('image', img.path),
);
var res = await request.send();
var res = await request.send();
You can verify the POST request using the logs below:
log('${res.statusCode}', name: 'POST-request-statusCode');
log('${res.reasonPhrase}', name: 'POST-request-status');
On the Flask side:
@app.route('/upload', methods=["POST", "PUT"])
def upload_image():
    if request.method == "POST":
        imageFile = request.files['image']
        # you can now perform any operation on the file received from the request
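For completeness, here is a minimal sketch of how the upload and the prediction can be combined on the Flask side. It assumes the class_names list and ./model.h5 from the question, an existing uploadedImages directory, and a tf.keras model; treat it as a starting point rather than the exact original implementation.

import os
import numpy as np
from flask import Flask, request, jsonify
from werkzeug.utils import secure_filename
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image

app = Flask(__name__)
model = load_model('./model.h5')   # load once at startup, not on every request
class_names = [...]                # assumed: the label list used when training

@app.route('/upload', methods=["POST"])
def upload_image():
    image_file = request.files['image']            # field name must match the Flutter side
    file_name = secure_filename(image_file.filename)
    path = os.path.join('./uploadedImages', file_name)
    image_file.save(path)

    sample_img = image.load_img(path, target_size=(256, 256))
    sample_img = image.img_to_array(sample_img)
    sample_img = np.expand_dims(sample_img, axis=0)
    prediction_arr = model.predict(sample_img)

    return jsonify({
        'Sample': file_name,
        'Label': str(class_names[prediction_arr.argmax()]),
        'Confidence': str(prediction_arr.max()),
    })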
Thank you!

Related

onSnapshot() Firestore sending changes multiple times with Flask-Sockets

I'm trying to develop a web chat with Flask and Firestore. I set up a flow that receives new messages from Firestore (when something changes in the database) and sends them through WebSockets to the UI. Something like this:
Python:
@sockets.route('/messages')
def chat_socket(ws):
    message = None

    def callback_snapshot(col_snapshot, changes, read_time):
        with app.app_context():
            Messages = []
            for change in changes:
                if change.type.name == 'ADDED':
                    Messages.append(change.document)
            conversation = render_template(
                'conversation.html',
                Messages=Messages,
            )
            numberID = None
            if len(col_snapshot) > 0:
                for i in col_snapshot:
                    a = i
                    numberID = a.reference.parent.parent.id
            response = json.dumps({
                'conversation': conversation,
                'numberID': numberID
            })
            ws.send(response)

    while not ws.closed:
        response = json.loads(ws.receive())
        newNumberID = response['newNumberID'].strip()
        query_snapshot = fn.GetMessages(newNumberID)
        doc_watch = query_snapshot.on_snapshot(callback_snapshot)
        if message is None:
            continue
Javascript:
function messages(numberID) {
    var scheme = window.location.protocol == "https:" ? 'wss://' : 'ws://';
    var webSocketUri = scheme
        + window.location.hostname
        + (location.port ? ':' + location.port : '')
        + '/messages';
    /* Get elements from the page */
    var form = $('#chat-form');
    var textarea = $('#chat-text');
    var output = $('.messages');
    var status = $('.messages');
    var websocket = new WebSocket(webSocketUri);
    websocket.onopen = function() {};
    websocket.onclose = function() {};
    websocket.onmessage = function(e) {
        numberID = JSON.parse(e.data).numberID;
        conversation = JSON.parse(e.data).conversation;
        output.append(conversation);
        if (numberID == null) {
            output.empty();
        }
    };
    websocket.onerror = function(e) { console.log(e); };
    websocket.onopen = () => websocket.send(numberID);
};
The problem is: when I use col_snapshot as Messages, everything is fine, except that the whole Firestore collection is sent to the user every time a message is sent, so it's totally inefficient. When I set the callback only for the changes, as described above, and I trigger the function more than once, I somehow set multiple listeners for the same collection, so I get multiple "change updates" in the UI. How can I keep track of those listeners so that I only set one listener per collection?
As you can see from the documentation, you should only call GetMessages and on_snapshot once per document.
@sockets.route('/messages')
def chat_socket(ws):
    message = None

    def callback_snapshot(col_snapshot, changes, read_time):
        with app.app_context():
            # Rest of the function ...
            ws.send(response)

    response = json.loads(ws.receive())
    numberID = response['newNumberID'].strip()
    query_snapshot = fn.GetMessages(numberID)
    doc_watch = query_snapshot.on_snapshot(callback_snapshot)

    while not ws.closed:
        response = json.loads(ws.receive())
        newNumberID = response['newNumberID'].strip()
        if newNumberID != numberID:
            numberID = newNumberID
            query_snapshot = fn.GetMessages(numberID)
            doc_watch = query_snapshot.on_snapshot(callback_snapshot)
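One further refinement (my addition, based on the google-cloud-firestore watch API): the previous listener can be detached before a new one is registered, since on_snapshot returns a watch object with an unsubscribe() method. A sketch of the loop above with that added (still assuming fn.GetMessages returns a Firestore query):

    numberID = None
    doc_watch = None

    while not ws.closed:
        response = json.loads(ws.receive())
        newNumberID = response['newNumberID'].strip()
        if newNumberID != numberID:
            if doc_watch is not None:
                doc_watch.unsubscribe()   # drop the old listener so only one stays active
            numberID = newNumberID
            doc_watch = fn.GetMessages(numberID).on_snapshot(callback_snapshot)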

Angular upload file function doesn't work as expected

I'm using Angular to build a file upload feature, with Flask as the backend server.
However, when I try to upload the file, I always get an error message.
Here is the code.
HTML Elements:
<input class="paper-trade__upload-button_hidden" type="file" id="csv-upload" name="file" accept=".csv" ngf-max-size="2MB" (change)="csvToArray($event)">
Angular:
csvToArray(fileInput: any) {
    const hotInstance = this.hotRegisterer.getInstance(this.instance);
    let fileReaded = fileInput.target.files[0];
    this.tradeCaptureService.uploadPaperTradeCSV('KGI', fileReaded).subscribe();
}
The service method that builds the request body:
uploadPaperTradeCSV(brokerName: string, file): Observable<Trade> {
    const httpOptions = {
        headers: new HttpHeaders({
            'Content-Type': 'multipart/form-data',
        }),
    };
    let formData: FormData = new FormData();
    formData.append('file', file);
    return this.http.post<Trade>(`${this.baseUrl}/trade/upload?broker_name=${brokerName}`, formData, httpOptions)
        .pipe(
            tap(data => console.log(data)),
        );
}
Backend code (Python); as request.files requires, the file data is sent as form-data:
@hello.route('/upload', methods=['POST'])
@jwt_required
def import_trades():
    if request.method == 'POST':
        broker_name = request.args.get('broker_name', '')
        data = request.form.get('file')
        if 'file' not in request.files:
            return jsonify({'fail': 'no file found'})
        file = request.files['file']
        if not file.filename:
            return jsonify({'fail': 'no file selected'})
        file_format = file.filename.split('.', 1)[1].lower()
        allowed_extension = set(['csv'])
        if file and file_format in allowed_extension:
            filename = secure_filename(file.filename)
            file_path = os.path.join(UPLOAD_FOLDER, filename)
            file.save(file_path)
            response = trade_orm.convert_file_data(broker_name, file_path)
            if os.path.exists(file_path):
                os.remove(file_path)
            return response
        else:
            return jsonify({'fail': 'please input valid info'})
However, when I try to upload the file, it always returns the 'no file found' message. But when I send the request using Postman, it works fine. I've been stuck here for quite a long time.
Any help would be greatly appreciated, thanks!
I finally found the solution for this issue: remove the 'const httpOptions = {...}' stuff in the uploadPaperTradeCSV() method, and then it works well. So the new code looks like this:
uploadPaperTradeCSV(brokerName: string, file): Observable<Trade> {
    let formData: FormData = new FormData();
    formData.append('file', file);
    return this.http.post<Trade>(`${this.baseUrl}/trade/upload?broker_name=${brokerName}`, formData)
        .pipe(
            tap(data => console.log(data)),
        );
}
But I still can't figure out why...
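The likely explanation (my addition, not part of the original answer): a multipart body needs a boundary parameter in its Content-Type header, and the HTTP client can only add that boundary when it generates the header itself. Hard-coding 'Content-Type: multipart/form-data' drops the boundary, so Flask cannot split the body and request.files stays empty. The same effect can be reproduced in Python with requests against an upload endpoint like the one above (the local URL is only an assumption for testing):

import requests

url = 'http://localhost:5000/upload?broker_name=KGI'   # assumed local test server

# Works: requests generates "multipart/form-data; boundary=..." by itself.
with open('trades.csv', 'rb') as f:
    print(requests.post(url, files={'file': f}).text)

# Fails with "no file found": the explicit header carries no boundary,
# so the server-side multipart parser leaves request.files empty.
with open('trades.csv', 'rb') as f:
    print(requests.post(url, files={'file': f},
                        headers={'Content-Type': 'multipart/form-data'}).text)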

How to get only the JSON value from an API call using jQuery?

The API returns both JSON and a rendered template, and when I call $.getJSON it only returns the rendered template, not the JSON value. I have tried this:
if request.args['type'] == 'json':
    return json.dumps(group)
else:
    return render_template("/c.., summary=json.dumps(group))
but it says
bad request
Is there any way I can get that JSON value whenever I need it?
This is my view
@cms.route('/add/asset/<client_id>', methods=["GET"])
@login_required
def asset_add(client_id):
    if int(current_user.id_) == int(client_id):
        group = {}
        group['id'] = []
        group['pid'] = []
        group['name'] = []
        for index in range(len([r.id_ for r in db.session.query(Assetgroup.id_)])):
            for asset in (Assetgroup.query.filter_by(parent_id=(index or ''))):
                group['id'].append(asset.id_)
                group['pid'].append(asset.parent_id)
                group['name'].append(asset.name)
        if request.args['type'] == 'json':
            return json.dumps(group)
        else:
            return render_template("/cms/asset_add.html", action="/add/asset", asset=None,
                                   client_id=client_id,
                                   types=Type.query.all())
    else:
        return 'permission denied'
and this is my ajax request
$(document).ready(function () {
    $('#group_id').click(function () {
        $.getJSON(
            '/add/asset/' + {{ client_id }},
            function (data) {
                $('#group_id').find('option').remove();
                var len = data.id.length;
                for (var i = 0; i < len; i++) {
                    var option_item = '<option value="' + data.id[i] + '">' + data.name[i] + "</option>";
                    $('#group_id').append(option_item);
                }
            }
        );
    });
});
You can add a parameter to the call to get the JSON result, i.e.:
const Endpoint = '/add/asset/?'
$.getJSON(Endpoint, {type: 'json'}).done(function(data...)
I believe this is what you are looking for
http://flask.pocoo.org/docs/0.12/api/#flask.Request.is_json
That is a Flask attribute that checks whether the request is JSON.
Then you can still use jsonify in Flask to return JSON (you need to import it, though):
from flask import jsonify
so your code becomes
if request.is_json:
    return jsonify(group)
Hope you find that useful and more elegant
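For reference, a combined version of the view could look like the sketch below (build_group is a hypothetical helper wrapping the query loop from the question). Note that request.args['type'] raises a 400 Bad Request when the parameter is missing, which matches the error you are seeing, so .get() with a fallback is safer:

from flask import jsonify, render_template, request

@cms.route('/add/asset/<client_id>', methods=["GET"])
@login_required
def asset_add(client_id):
    if int(current_user.id_) != int(client_id):
        return 'permission denied'
    group = build_group()              # hypothetical helper filling the id/pid/name lists
    if request.args.get('type') == 'json':
        return jsonify(group)          # used by $.getJSON('/add/asset/' + id, {type: 'json'}, ...)
    return render_template("/cms/asset_add.html", action="/add/asset", asset=None,
                           client_id=client_id, types=Type.query.all())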
One of the easier ways to debug is to just return JSON alone for a start, to see how the response looks in the browser. So you can remove login_required (assuming you are not yet in production), skip the is_json check, then call the API and see what it returns. So assuming your client id is 1:
@cms.route('/add/asset/<client_id>', methods=["GET"])
def asset_add(client_id):
    if int(current_user.id_) == int(client_id):
        group = {}
        group['id'] = []
        group['pid'] = []
        group['name'] = []
        for index in range(len([r.id_ for r in db.session.query(Assetgroup.id_)])):
            for asset in (Assetgroup.query.filter_by(parent_id=(index or ''))):
                group['id'].append(asset.id_)
                group['pid'].append(asset.parent_id)
                group['name'].append(asset.name)
        return jsonify(group)
Now you can visit http://yoursite.com/add/asset/1 to see your response
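If you prefer checking from a script instead of the browser, a quick test could be (the host name is hypothetical):

import requests

print(requests.get('http://yoursite.com/add/asset/1').json())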

Create Google Cloud Function using API in Python

I'm working on a project with Python (3.6) and Django (1.10) in which I need to create a Google Cloud Function via an API request.
How can I upload the code as a zip archive while creating that function?
Here's what I have tried:
From views.py:
def post(self, request, *args, **kwargs):
    if request.method == 'POST':
        post_data = request.POST.copy()
        post_data.update({'user': request.user.pk})
        form = forms.SlsForm(post_data, request.FILES)
        print('get post request')
        if form.is_valid():
            func_obj = form
            func_obj.user = request.user
            func_obj.project = form.cleaned_data['project']
            func_obj.fname = form.cleaned_data['fname']
            func_obj.fmemory = form.cleaned_data['fmemory']
            func_obj.entryPoint = form.cleaned_data['entryPoint']
            func_obj.sourceFile = form.cleaned_data['sourceFile']
            func_obj.sc_github = form.cleaned_data['sc_github']
            func_obj.sc_inline_index = form.cleaned_data['sc_inline_index']
            func_obj.sc_inline_package = form.cleaned_data['sc_inline_package']
            func_obj.bucket = form.cleaned_data['bucket']
            func_obj.save()
            service = discovery.build('cloudfunctions', 'v1', http=views.getauth(), cache_discovery=False)
            requ = service.projects().locations().functions().generateUploadUrl(
                parent='projects/' + func_obj.project + '/locations/us-central1', body={})
            resp = requ.execute()
            print(resp)
            try:
                auth = views.getauth()
                # Prepare Request Body
                req_body = {
                    "CloudFunction": {
                        "name": func_obj.fname,
                        "entryPoint": func_obj.entryPoint,
                        "timeout": '60s',
                        "availableMemoryMb": func_obj.fmemory,
                        "sourceArchiveUrl": func_obj.sc_github,
                    },
                    "sourceUploadUrl": func_obj.bucket,
                }
                service = discovery.build('cloudfunctions', 'v1beta2', http=auth, cache_discovery=False)
                func_req = service.projects().locations().functions().create(
                    location='projects/' + func_obj.project + '/locations/-',
                    body=req_body)
                func_res = func_req.execute()
                print(func_res)
                return HttpResponse('Submitted',)
            except:
                return HttpResponse(status=500)
    return HttpResponse('Sent!')
Updated Code below:
if form.is_valid():
    func_obj = form
    func_obj.user = request.user
    func_obj.project = form.cleaned_data['project']
    func_obj.fname = form.cleaned_data['fname']
    func_obj.fmemory = form.cleaned_data['fmemory']
    func_obj.entryPoint = form.cleaned_data['entryPoint']
    func_obj.sourceFile = form.cleaned_data['sourceFile']
    func_obj.sc_github = form.cleaned_data['sc_github']
    func_obj.sc_inline_index = form.cleaned_data['sc_inline_index']
    func_obj.sc_inline_package = form.cleaned_data['sc_inline_package']
    func_obj.bucket = form.cleaned_data['bucket']
    func_obj.save()
    #######################################################################
    # FIRST APPROACH FOR FUNCTION CREATION USING STORAGE BUCKET
    #######################################################################
    file_name = os.path.join(IGui.settings.BASE_DIR, 'media/archives/', func_obj.sourceFile.name)
    print(file_name)
    service = discovery.build('cloudfunctions', 'v1')
    func_api = service.projects().locations().functions()
    url_svc_req = func_api.generateUploadUrl(
        parent='projects/' + func_obj.project + '/locations/us-central1',
        body={})
    url_svc_res = url_svc_req.execute()
    print(url_svc_res)
    upload_url = url_svc_res['uploadUrl']
    print(upload_url)
    headers = {
        'content-type': 'application/zip',
        'x-goog-content-length-range': '0,104857600'
    }
    print(requests.put(upload_url, headers=headers, data=func_obj.sourceFile.name))
    auth = views.getauth()
    # Prepare Request Body
    name = "projects/{}/locations/us-central1/functions/{}".format(func_obj.project, func_obj.fname)
    print(name)
    req_body = {
        "name": name,
        "entryPoint": func_obj.entryPoint,
        "timeout": "3.5s",
        "availableMemoryMb": func_obj.fmemory,
        "sourceUploadUrl": upload_url,
        "httpsTrigger": {},
    }
    service = discovery.build('cloudfunctions', 'v1')
    func_api = service.projects().locations().functions()
    response = func_api.create(location='projects/' + func_obj.project + '/locations/us-central1',
                               body=req_body).execute()
    pprint.pprint(response)
Now the function gets created successfully, but it fails because the source code is not uploaded to the storage bucket; maybe something is wrong here:
upload_url = url_svc_res['uploadUrl']
print(upload_url)
headers = {
    'content-type': 'application/zip',
    'x-goog-content-length-range': '0,104857600'
}
print(requests.put(upload_url, headers=headers, data=func_obj.sourceFile.name))
In the request body you have a dictionary "CloudFunction" nested inside the request. The contents of "CloudFunction" should be directly in the request body:
request_body = {
    "name": parent + '/functions/' + name,
    "entryPoint": entry_point,
    "sourceUploadUrl": upload_url,
    "httpsTrigger": {}
}
I recommend using "Try this API" to discover the structure of projects.locations.functions.create.
"sourceArchiveUrl" and "sourceUploadUrl" can't appear together. This is explained in the CloudFunction resource documentation:
// Union field source_code can be only one of the following:
"sourceArchiveUrl": string,
"sourceRepository": { object(SourceRepository) },
"sourceUploadUrl": string,
// End of list of possible types for union field source_code.
In the rest of the answer I assume that you want to use "sourceUploadUrl". It requires you to pass it a URL returned to you by .generateUploadUrl(...).execute(). See documentation:
sourceUploadUrl -> string
The Google Cloud Storage signed URL used for source uploading,
generated by [google.cloud.functions.v1.GenerateUploadUrl][]
But before passing it you need to upload a zip file to this URL:
curl -X PUT "${URL}" -H 'content-type:application/zip' -H 'x-goog-content-length-range: 0,104857600' -T test.zip
or in python:
headers = {
    'content-type': 'application/zip',
    'x-goog-content-length-range': '0,104857600'
}
print(requests.put(upload_url, headers=headers, data=data))
This is the trickiest part:
- The case of the headers matters and they should be lowercase, because the signature is calculated from a hash (here).
- You need 'content-type': 'application/zip'. I deduced this one logically, because the documentation doesn't mention it (here).
- x-goog-content-length-range: min,max is obligatory for all PUT requests to Cloud Storage and is assumed implicitly in this case (more on it here).
- 104857600, the max in the previous entry, is a magic number which I didn't find mentioned anywhere.
- data must be a file-like object.
I also assume that you want to use httpsTrigger. For a Cloud Function you can only choose one trigger field; here it's said that the trigger is a union field. For httpsTrigger, however, you can just leave it as an empty dictionary, as its contents do not affect the outcome (as of now).
request_body = {
    "name": parent + '/functions/' + name,
    "entryPoint": entry_point,
    "sourceUploadUrl": upload_url,
    "httpsTrigger": {}
}
You can safely use 'v1' instead of 'v1beta2' for .create().
Here is a full working example. It would be too complicated to present it as part of your code, but you can easily integrate it.
import pprint
import zipfile
import requests
from tempfile import TemporaryFile
from googleapiclient import discovery

project_id = 'your_project_id'
region = 'us-central1'
parent = 'projects/{}/locations/{}'.format(project_id, region)
print(parent)
name = 'ExampleFunctionFibonacci'
entry_point = "fibonacci"

service = discovery.build('cloudfunctions', 'v1')
CloudFunctionsAPI = service.projects().locations().functions()
upload_url = CloudFunctionsAPI.generateUploadUrl(parent=parent, body={}).execute()['uploadUrl']
print(upload_url)

payload = """/**
 * Responds to any HTTP request that can provide a "message" field in the body.
 *
 * @param {Object} req Cloud Function request context.
 * @param {Object} res Cloud Function response context.
 */
exports.""" + entry_point + """= function """ + entry_point + """ (req, res) {
  if (req.body.message === undefined) {
    // This is an error case, as "message" is required
    res.status(400).send('No message defined!');
  } else {
    // Everything is ok
    console.log(req.body.message);
    res.status(200).end();
  }
};"""

with TemporaryFile() as data:
    with zipfile.ZipFile(data, 'w', zipfile.ZIP_DEFLATED) as archive:
        archive.writestr('function.js', payload)
    data.seek(0)
    headers = {
        'content-type': 'application/zip',
        'x-goog-content-length-range': '0,104857600'
    }
    print(requests.put(upload_url, headers=headers, data=data))

# Prepare Request Body
# https://cloud.google.com/functions/docs/reference/rest/v1/projects.locations.functions#resource-cloudfunction
request_body = {
    "name": parent + '/functions/' + name,
    "entryPoint": entry_point,
    "sourceUploadUrl": upload_url,
    "httpsTrigger": {},
    "runtime": 'nodejs8'
}
print('https://{}-{}.cloudfunctions.net/{}'.format(region, project_id, name))
response = CloudFunctionsAPI.create(location=parent, body=request_body).execute()
pprint.pprint(response)
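One more detail worth noting: create() returns a long-running operation, not the finished function. A short sketch for polling it until the deployment completes (using the operations resource of the same cloudfunctions v1 API):

import time

operation_name = response['name']
while True:
    op = service.operations().get(name=operation_name).execute()
    if op.get('done'):
        if 'error' in op:
            pprint.pprint(op['error'])      # deployment failed
        else:
            pprint.pprint(op['response'])   # the deployed CloudFunction resource
        break
    time.sleep(5)                           # check again in a few seconds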
Open and upload a zip file like the following:
file_name = os.path.join(IGui.settings.BASE_DIR, 'media/archives/', func_obj.sourceFile.name)
headers = {
    'content-type': 'application/zip',
    'x-goog-content-length-range': '0,104857600'
}
with open(file_name, 'rb') as data:
    print(requests.put(upload_url, headers=headers, data=data))

JSON data not reaching the view in Pyramid

I'm trying to build a page where, when the user presses a button, a variable that is initially 0 is incremented by 1. This number is then sent asynchronously to the server using jQuery AJAX.
What I have so far is:
In my __init__.py file:
def main(global_config, **settings):
    engine = engine_from_config(settings, 'sqlalchemy.')
    DBSession.configure(bind=engine)
    Base.metadata.bind = engine
    config = Configurator(settings=settings)
    config.include('pyramid_jinja2')
    config.add_static_view('static', 'static')
    config.add_static_view('scripts', 'scripts')
    # Removed the other views
    config.add_route("declare_usage", '/user/{user_id}/{address_id}/declare')
    config.add_route("declare_usage_json", '/user/{user_id}/{address_id}/declare.json')
    config.scan()
My HTML + Jinja2:
<!-- Removed code for simplicity -->
<div id="button_add">Add</div>
{{val}}
My JS:
$(document).ready(function(){
    var room = 0;
    jQuery.ajax({
        type: 'POST',
        url: '/user/1/5/declare', // I use a direct user ID and address ID as I'm not sure how to send these to JS from Pyramid ... yet :)
        data: JSON.stringify(room),
        contentType: 'application/json; charset=utf-8'
    });
    $('#button_add').click(function(){
        room = room + 1;
    });
});
My view code:
@view_config(route_name='declare_usage', renderer='declara.jinja2')
@view_config(route_name='declare_usage_json', renderer='json')
def declara_consum(request):
    # Removed code for simplicity
    val = request.POST.get('room')  # I get None in my HTML with this; if I change to request.json_body, I get an error that there is no JSON to be parsed.
    return {'val': val}
What happens is that when I open the debugger, the POST request is successful with no data, and on the page I get two outcomes for 'val':
None -> when I use val = request.POST.get('room')
ValueError: No JSON object could be decoded -> when I use val = request.json_body
Also, it still doesn't work if in my JS I change the url to /user/1/5/declare.json and/or the data to {'room': room}.
Can somebody please point out what I'm doing wrong?
You don't need another route declare_usage_json; you just need to separate it into two functions, like this:
@view_config(route_name='declare_usage', renderer='declara.jinja2')
def declara_consum(request):
    # this will respond to your jinja2 template
    return {'val': val}

@view_config(route_name='declare_usage', xhr=True, renderer='json')
def declara_consum_ajax(request):
    # this will respond to your asynchronous request
    val = request.POST.get('room')
    return {'val': val}
When you send a request using AJAX, it will go to the second function.
$.ajax({
    type: 'POST',
    url: '/user/1/5/declare',
    data: {'room': room},
    dataType: 'json'
}).done(function(response){
    // update your data in the html
});
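For reference, a single AJAX view can also accept both encodings, which makes the None vs. json_body confusion easier to see. This is a sketch reusing the declare_usage route; it assumes the client sends either form data {'room': room} or a JSON body like JSON.stringify({room: room}):

@view_config(route_name='declare_usage', xhr=True, renderer='json')
def declara_consum_ajax(request):
    if request.content_type == 'application/json':
        # contentType: 'application/json' puts the value in the JSON body, never in request.POST
        room = request.json_body.get('room')
    else:
        # a plain form post puts the value in request.POST
        room = request.POST.get('room')
    return {'val': room}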
