WSO2 API Manager not sending multiple uploaded files to backend server - Python

I have built a Python web services application using Flask.
After creating the Swagger document, I published the API in WSO2 API Manager.
Publishing the API and subscribing to it were both successful.
I am able to get an authentication token as well.
What I am doing
1. To consume the API, I upload two Excel files from Postman or an Angular application.
2. The files are sent to the WSO2 gateway URL, which forwards them to the Python server.
3. When the Python server receives the files, it parses them.
4. It then does some calculations and returns the response data object.
Now the problem is at step 3. The files received at the Python end are not in Excel format: the data of both files is combined into one FileStorage object.
(Snapshot: instead of two FileStorage objects, one combined FileStorage object is received in request.files.)
I have tried all of this in Postman as well as with an Angular application, and neither works. I even tried it via the Swagger page of the published app in WSO2 API Manager, but that doesn't work either.
(Snapshot: sending files via the WSO2 Swagger page.)
What does work
When I use the Python app's own Swagger page, or when I consume the services via Postman by making a request directly to the Python server, I get both files in the proper, required format.
(Snapshot: files received properly when tried with the Python app's Swagger page.)
Can you please help me understand what I might be doing wrong?
I am using the content type below:
content-type: multipart/form-data
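For context, this is roughly what the receiving end looks like; a minimal Flask sketch, assuming an /upload route and form field names file1 and file2 (both hypothetical). With two files in the request, request.files should contain two separate FileStorage entries:

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/upload', methods=['POST'])
def upload():
    # Expect one FileStorage object per form field, not a single
    # combined object as described above.
    f1 = request.files.get('file1')
    f2 = request.files.get('file2')
    if f1 is None or f2 is None:
        return jsonify(error='expected two files'), 400
    return jsonify(received=[f1.filename, f2.filename])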

After some digging, I found my resolution at "Multipart form data file upload using WSO2 API manager?".
I added
<messageBuilder contentType="multipart/form-data"
                class="org.wso2.carbon.relay.BinaryRelayBuilder"/>
<messageFormatter contentType="multipart/form-data"
                  class="org.wso2.carbon.relay.ExpandingMessageFormatter"/>
inside the Message Builders and Message Formatters sections of the axis2.xml file at
<API-M_HOME>\repository\conf\axis2\axis2.xml
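Once the builder and formatter are in place, posting the two files through the gateway works as usual. Here is a sketch of the client side with the requests library, where the gateway URL, token, and field names are hypothetical (requests generates the multipart/form-data boundary itself, so don't set the Content-Type header by hand):

import requests

url = 'https://gateway.example.com/myapi/1.0/upload'   # hypothetical gateway URL
headers = {'Authorization': 'Bearer <access-token>'}   # hypothetical token

with open('first.xlsx', 'rb') as f1, open('second.xlsx', 'rb') as f2:
    files = {
        'file1': ('first.xlsx', f1),
        'file2': ('second.xlsx', f2),
    }
    response = requests.post(url, headers=headers, files=files)
print(response.status_code, response.text)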

Related

How to download all files from google bucket directory to local directory using google oauth

Is there any way, using OAuth, to download all the content of a Google Cloud Storage bucket directory to a local directory?
I found two ways: using the get request object from the storage API, and gsutil. But since the API downloads objects directly by name, I would have to first list all of the bucket's contents and then send a GET request for each object to download it. I find gsutil more convenient, but for it I have to hard-code the credential details.
Basically, I am developing a client-facing application where I have to download BigQuery table data to the client's local server.
Can anyone help me with this?
Unless your application knows ahead of time the object names that you want to download, you'll need to perform a list followed by GETs for each object.
You can use the gcloud-python client library to do this. You can configure your client application with the OAuth2 credentials, and the library should handle the rest of the necessary authentication for you. See the documentation here for the basics of authentication, and [here](https://googlecloudplatform.github.io/google-cloud-python/stable/storage-blobs.html) for interacting with Google Cloud Storage objects.
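A minimal sketch of the list-then-GET pattern using the google-cloud-storage library (the current name for gcloud-python); the bucket name, prefix, and local directory are hypothetical, and credentials are picked up from the environment:

import os
from google.cloud import storage

client = storage.Client()                      # uses ambient OAuth2 credentials
bucket = client.bucket('my-bucket')            # hypothetical bucket name
for blob in bucket.list_blobs(prefix='dir/'):  # a "directory" is just a name prefix
    if blob.name.endswith('/'):                # skip directory placeholder objects
        continue
    local_path = os.path.join('local_dir', blob.name)
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    blob.download_to_filename(local_path)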

MobileApp Architecture: Allow process client request using server scripts

I am working on a capstone project and I am looking for ideas on how to implement this.
I have a working web application that was created using OpenCV Python and deployed on a Django Framework. Now I want to also create a hybrid mobile application for the project. The idea is, the mobile app will allow the user to upload an image to the server, and then the web server will process the image then finally return a response to the mobile app.
===============================
Client (Hybrid Mobile app):
Take image
Web server:
Receive user uploaded image
Call Image Processing class (views.py) and do something more in the backend
Save the result to the database
Return the result to the client as a response (could be a page redirect)
============================
I know that it is possible to save an image to the database through a REST API; however, I have no idea whether it is possible for the client side to call a class on the server side through a REST API. If not, is there any other way to implement this? Do you know of any references that could give me some ideas on how to implement it?
You can't call a class on the server from the mobile phone.
You can just post the file to your view (this will differ based on what architecture the app is running):
import requests

files = {'file': ('file_name.jpeg', file_data)}  # file_data: the image bytes
response = requests.post(url, files=files, auth=(USERNAME, PASSWORD))
response.raise_for_status()  # add error checking etc.
It sounds like you need to read up a bit more about creating an API. Most commonly, when you want data you do an HTTP GET and the data is returned as JSON. When sending data you do an HTTP POST.
Also, don't save the file data directly in the database.
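On the server side, the Django view that receives the upload could look roughly like this; a sketch, where the view name and the process_image stand-in are hypothetical, and only the computed result (not the raw file bytes) would be saved to the database:

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt

def process_image(data):
    # Stand-in for the real OpenCV pipeline in views.py.
    return {'size_bytes': len(data)}

@csrf_exempt
def upload(request):
    if request.method != 'POST' or 'file' not in request.FILES:
        return JsonResponse({'error': 'POST an image as "file"'}, status=400)
    result = process_image(request.FILES['file'].read())
    # Save `result` (not the raw bytes) to the database here.
    return JsonResponse(result)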

Google App Engine: Allow Admins to Upload Data via Python Script

I have an application hosted on Google App Engine and would like to make uploading data more secure. Currently, I upload data by using a python script to make POST requests with the relevant data to a URL on the /_ah/spi/... path (and the app takes care of uploading the data correctly). My goal is to make uploading data only available to admins of the application. I tried specifying an admin security-constraint for /_ah/spi/* in my web.xml but this seemed to block me from being able to upload data.
What is the best/easiest way only allow admins to upload data via my python script?
I didn't get exactly what I wanted (allowing access to my application's endpoints by admins only), but I did find a way to secure it. I followed the steps here. Basically, I had to:
1. generate a client ID via the Google Developer Console
2. add the client ID to the @Api annotation (under clientIds) for my endpoint
3. add a User parameter to all of the protected methods
4. authenticate myself using OAuth and my public/secret client ID (generated in step 1) in my Python script (sample code here)
This scheme at least requires the script accessing my application to have both the public and secret client IDs. Also, this question had some very useful information.
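For the script side (step 4), the OAuth handshake with the oauth2client library of that era looked roughly like this; a sketch, where the file names and the endpoint URL are hypothetical:

import httplib2
from oauth2client import tools
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage

# client_secrets.json holds the public/secret client ID from step 1.
flow = flow_from_clientsecrets(
    'client_secrets.json',
    scope='https://www.googleapis.com/auth/userinfo.email')
storage = Storage('credentials.dat')       # caches the token between runs
credentials = storage.get()
if credentials is None or credentials.invalid:
    credentials = tools.run_flow(flow, storage)

# Requests made through this http object carry the OAuth token that
# populates the User parameter on the protected endpoint methods.
http = credentials.authorize(httplib2.Http())
resp, content = http.request(
    'https://my-app.appspot.com/_ah/api/myapi/v1/upload',  # hypothetical
    'POST', body='...')  # payload elided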

python (flask) redirect request via client

I have been trying to do something which I think should be pretty simple. The situation is as follows. The client makes a request for a resource on my web server. My flask application processes the request and determines that this resource is located at a certain location on another web server and the client should make a request of that server instead.
I know I can use the redirect function to tell the client to send a request to the remote location, but my problem is that the remote location is the Amazon Glacier servers. These servers require a request to be made in a certain way, with a special signature (see http://docs.aws.amazon.com/amazonglacier/latest/dev/amazon-glacier-signing-requests.html). My flask application knows how to go about the business of making these requests in the required way. I essentially want to know if it's possible to send a response to my client saying, send this request (generated by my application, with all the required signing) to the Amazon server?
Any ideas?
If the request can be encoded with GET params, like
http://www.redirecturl.com/?param1=bla&param2=blub
then it should work with no problem. Just construct the full URL as a string and pass it to redirect().
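For instance, a minimal Flask sketch of that approach, reusing the example URL above (the route name is hypothetical):

from urllib.parse import urlencode
from flask import Flask, redirect

app = Flask(__name__)

@app.route('/resource')
def resource():
    # Build the full target URL, query params included, and hand it to redirect().
    params = urlencode({'param1': 'bla', 'param2': 'blub'})
    return redirect('http://www.redirecturl.com/?' + params)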
As far as I know, you can't tell a client to send specific headers to an HTTP redirect URL.
Hitting the Glacier URL server-side would be the easiest option. Using JavaScript on the client side would only work if Glacier implements CORS.
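A sketch of the server-side option: the Flask app makes the signed request itself and relays the body to the client. The build_signed_request helper is hypothetical; it stands in for the SigV4 signing the question links to:

import requests
from flask import Flask, Response

app = Flask(__name__)

def build_signed_request(archive_id):
    # Stand-in for the real request signing; it would return the Glacier
    # URL plus the required signature headers for this archive.
    return ('https://glacier.us-east-1.amazonaws.com/-/vaults/demo', {})

@app.route('/archive/<archive_id>')
def archive(archive_id):
    url, headers = build_signed_request(archive_id)
    upstream = requests.get(url, headers=headers, stream=True)
    # Stream the remote body back instead of redirecting the client.
    return Response(upstream.iter_content(8192), status=upstream.status_code)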

Uploading to Blobstore without using blobstore.create_upload_url

I would like to create an app, using Python on Google App Engine, to handle file uploads and store the files in the blobstore.
However, the blobstore currently requires the use of blobstore.create_upload_url to create the URL for the file-upload form to POST to. Since I am uploading from another server, my question is: is it possible to upload a file to the GAE blobstore without using the dynamic URL from blobstore.create_upload_url?
FYI, it is OK for me to request an upload URL from the Python script before I upload from the other server, but this adds extra latency, which is not what I want. Also, I read about the so-called "file-like API" at http://code.google.com/intl/en/appengine/docs/python/blobstore/overview.html#Writing_Files_to_the_Blobstore, but the documentation didn't seem to cover uploading.
Also, I previously tried to use the datastore for file uploads, but the maximum entity size is 1MB, which is not enough for my case. Please kindly advise, thanks.
There are exactly two ways to write files to the blobstore: 1) using create_upload_url, and posting a form with a file attachment to it, 2) writing to the blobstore directly using an API (with a solution here for large files).
If you want a remote server to be able to upload, you have the same two choices:
1) The remote server first requests a URL. You have a handler whose only job is to return such a URL. With this URL, the remote server crafts a properly encoded POST message and posts it to the URL.
2) You send the file data as an HTTP parameter to a handler on your App Engine server, which uses the blobstore write API to write the file directly. Request size is limited to 32MB, so the maximum file size you can write using this method will be slightly under that.
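A sketch of option 2 using the (since-deprecated) App Engine Files API of that era; the handler path and parameter name are hypothetical:

from google.appengine.api import files
import webapp2

class DirectUpload(webapp2.RequestHandler):
    def post(self):
        data = self.request.get('file_data')  # raw bytes, under the ~32MB request limit
        name = files.blobstore.create(mime_type='application/octet-stream')
        with files.open(name, 'a') as f:
            f.write(data)
        files.finalize(name)
        blob_key = files.blobstore.get_blob_key(name)
        self.response.write(str(blob_key))

app = webapp2.WSGIApplication([('/direct_upload', DirectUpload)])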
