Upload image file to datastore using Endpoints in Python?

I am trying to make a form for adding users that will store user information to my database.
I want to upload an image file using Cloud Endpoints (Python), but I have no idea how to do it.
What should the input class (request class) and output class (response class) be?
@endpoints.method(inputclass, outputclass,
                  path='anypath', http_method='GET',
                  name='apiname')
What URL do I provide in the form's action= field for uploading the image? And how can I then display the file?

You have two ways to store data files (including image files).
The first one is to convert your image to Base64 and store it in the Datastore (not the best).
The second one is to store your image file in Google Cloud Storage (that is the best) or in the Blobstore (according to Google itself, there is no good reason to use the Blobstore anymore).
So you should store your image file in Google Cloud Storage from your endpoint: https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/?hl=fr
Personally, I use a servlet (App Engine) to store images in GCS. My endpoint calls the servlet and passes the image as a parameter, and the servlet stores the image in GCS. It works very well =)
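If you prefer to write to GCS directly from the endpoint instead of going through a servlet, here is a minimal sketch using the App Engine GCS client library (cloudstorage). The API name, message fields, and bucket are placeholders of mine, not anything prescribed by Endpoints:

import cloudstorage as gcs
import endpoints
from protorpc import messages, remote

class ImageUploadRequest(messages.Message):
    filename = messages.StringField(1, required=True)
    data = messages.BytesField(2, required=True)  # sent Base64-encoded over JSON

class ImageUploadResponse(messages.Message):
    gcs_path = messages.StringField(1)

@endpoints.api(name='myapi', version='v1')
class MyApi(remote.Service):

    @endpoints.method(ImageUploadRequest, ImageUploadResponse,
                      path='upload_image', http_method='POST',
                      name='uploadImage')
    def upload_image(self, request):
        # '/my-bucket' is a placeholder bucket name.
        gcs_path = '/my-bucket/images/' + request.filename
        with gcs.open(gcs_path, 'w', content_type='image/jpeg') as f:
            f.write(request.data)
        return ImageUploadResponse(gcs_path=gcs_path)

Note that an Endpoints method receives the image as message bytes rather than as a classic multipart form post, so there is no action= URL to point a form at; the client calls the API method instead.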
Hope this helps.

Related

How to upload an image to MongoDB using an S3 bucket and Boto3 in Python

I'm working on a Python application where the desired functionality is that the webcam takes in a live video feed and, based on whether a condition is true, an image is captured and uploaded to a database.
The database I am using is MongoDB. As far as I can understand, uploading images straight into a database is not the correct method. So, what I want to do is the following:
an image is captured from the webcam
the image is uploaded to an S3 bucket (from the same Python script, so using boto3 perhaps)
a URL of the uploaded image is retrieved (this seems to be the tricky part)
and then this URL along with some other details is uploaded to the database. (this is the easy part)
My ideal workflow would be to take that image, upload it to an S3 bucket, retrieve the URL, and then upload this URL to the database, all in one .py script.
My question is: how do I upload an image to an S3 bucket and then retrieve its public URL all through boto3 in a Python script?
I also welcome any suggestions for a better approach/strategy for storing images with MongoDB. I saw on some pages that GridFS could be a good method, but that it is not recommended when image uploads happen frequently (and that using AWS is really the preferable way).
The URL of an S3 object can be constructed if you know the bucket and the object key:
https://{bucket}.s3.{region}.amazonaws.com/{key}
Using boto3 will be the easiest way to upload a file if you're using Python anyway.
See another answer of mine on different ways to upload files here: https://stackoverflow.com/a/67108609/13245310
You don't need to 'retrieve' the public URL: you specify the bucket and key of the S3 object when you upload it, so you already have the information you need to know what the public URL will be once it is uploaded. It's not as if S3 assigns a new unique name to your object on upload.
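A minimal sketch of that flow with boto3 (the bucket, region, key, and file names are placeholders; the bucket must allow public reads for the URL to be publicly accessible):

import boto3

bucket = 'my-bucket'        # placeholder
region = 'us-east-1'        # placeholder
key = 'images/example.jpg'  # placeholder object key

# Upload the local file to S3.
s3 = boto3.client('s3', region_name=region)
s3.upload_file('example.jpg', bucket, key)

# The public URL is deterministic, so just build it from the same values.
url = 'https://{0}.s3.{1}.amazonaws.com/{2}'.format(bucket, region, key)
print(url)

The resulting url string is what you would then store in MongoDB alongside the rest of the document.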

How to properly lazy load images from a blob container?

I want to implement a lazy loading approach to load images stored in an "images" folder inside an Azure storage account.
I have a container in my Flutter app where, whenever the user scrolls to the bottom, the next 10 images should be loaded from storage, most recent (by timestamp) first.
I looked into this sample retrieved from: https://azuresdkdocs.blob.core.windows.net/$web/python/azure-storage-blob/12.0.0b5/index.html#id20
import asyncio
from azure.storage.blob.aio import ContainerClient

async def list_blobs_async():
    container = ContainerClient.from_connection_string(
        conn_str="my_connection_string", container_name="my_container")
    blob_list = []
    async for blob in container.list_blobs():
        blob_list.append(blob)
    print(blob_list)

asyncio.run(list_blobs_async())
But it's not what I need. I am looking for a way to make a GET request that retrieves a new set of images each time the function is invoked.
Thankful for suggestions!
I was able to implement a lazy loading approach by using the marker continuation object
Example:
mark = req.params.get('NextMarker')
entit = table_service.query_entities(
    'UserRequests', "PartitionKey eq '" + emailAddress + "'",
    num_results=21, select='..', marker=mark)
Dict = {"NextMarker": entit.next_marker}
return json.dumps(Dict)
This way I am able to send the marker in the HTTP GET request every time to get the next batch.
I hope this helps someone one day!
If you want to list blobs by blob creation time, unfortunately it is not supported by the Azure List Blobs API (the SDKs are built on the APIs). Blob creation time belongs to blob properties, and as the official doc indicates, blob properties can't be set as a request parameter.
So if you want to fetch all new images on each request, you should get the full blob list first, sort it yourself, and cut out the items you need. That is some extra code you will have to write; if you use Azure PowerShell instead, you can implement the whole process more easily. You can refer to this similar requirement.
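A minimal sketch of that workaround with the synchronous v12 SDK (connection string and container name are placeholders; listing everything on each request is fine for small containers but gets expensive for large ones):

from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    conn_str="my_connection_string", container_name="my_container")

# List every blob, sort newest first by creation time, then slice one page.
blobs = sorted(container.list_blobs(),
               key=lambda b: b.creation_time, reverse=True)

page_size = 10
page_number = 0  # increment this on each request from the app
page = blobs[page_number * page_size:(page_number + 1) * page_size]
for blob in page:
    print(blob.name, blob.creation_time)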

Persisting File to App Engine Blobstore

The App Engine documentation for the Blobstore gives a pretty thorough explanation of how to upload a file using the BlobstoreUploadHandler provided by the webapp framework.
However, I have a cgi.FieldStorage instance that I would like to store directly into the Blobstore. In other words, I don't need to upload the file since this is taken care of by other means; I just need to store it.
I've been looking through the blobstore module source to try to understand how the upload handler creates/generates blobstore keys and ultimately writes files to the blobstore itself, but I'm getting lost. It seems like the CreateUploadURLResponse in blobstore_service_pb is where the actual write would occur, but I'm not seeing the component that actually implements that functionality.
Update
There is also an implementation for storing files directly into the filesystem, which I think is what the upload handler does in the end. I am not entirely sure about this, so an explanation as to whether or not using the FileBlobStorage is the correct way to go would be appreciated.
After the deprecation of the Files API you can no longer write directly to the Blobstore.
You should write to Google Cloud Storage instead; for that you can use the App Engine GCS client library.
Files written to Google Cloud Storage can then be served by the Blobstore API by creating a blob key for them.
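A minimal sketch of that approach, writing the cgi.FieldStorage payload to GCS with the App Engine GCS client and then deriving a blob key so the file can still be served through the Blobstore API (the bucket name is a placeholder):

import cloudstorage as gcs
from google.appengine.ext import blobstore

def store_field(field):
    # field is the cgi.FieldStorage instance you already have.
    gcs_path = '/my-bucket/uploads/' + field.filename  # placeholder bucket
    with gcs.open(gcs_path, 'w', content_type=field.type) as f:
        f.write(field.file.read())
    # Blob key that lets the Blobstore API serve the GCS object.
    return blobstore.create_gs_key('/gs' + gcs_path)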

Storing images (from URL) to GAE Datastore

I am building a web application on Google App Engine using Python; however, I seem to have hit a roadblock.
Currently, my site fetches remote image URLs from an external website. The URLs are placed in a list and sent back to my application. I want to store the respective images (not the URLs) in my datastore, both to avoid fetching the remote images every time and to avoid dealing with broken links.
The solutions I have found online all deal with a user uploading their own images. I tried implementing that, but I am not sure what happens to an uploaded image (or how it gets converted into a blob) once the user hits the submit button.
To my understanding, a blob is a collection of binary data stored as a single entity (from Wikipedia). Therefore, I tried using the following:
class rentalPropertyDB(ndb.Model):
    streetNAME = ndb.StringProperty(required=True)
    image = ndb.BlobProperty(default=None)

class MainPage(BaseHandler):
    def get(self):
        self.render("index.html")

    def post(self):
        rental = rentalPropertyDB()
        for image in img_urls:
            rental.image = urlfetch.fetch(image).content
            rental.put()
The solution to this question: Image which is stored as a blob in Datastore in a html page
is identical to mine, however the solution suggests to upload the image to the blobstore and uses:
upload_files = self.get_uploads('file')
blob_info = upload_files[0]
This confuses me because I am not sure what exactly 'file' refers to. Would I replace 'file' with the URL of each image? Or would I need to perform some operation on each image prior to replacing?
I have been stuck on this issue for at least two days now and would greatly appreciate any help. I think the main reason this is confusing me so much is the variety of methods used in each solution, i.e. Google Cloud Storage, URL Fetch, the Images API, the various blob-related NDB properties (BlobKeyProperty vs. BlobProperty), and so on.
Thank you.
Be careful with blobs inside models: an entity cannot be larger than 1 MB, including the BlobProperty.
If we listen to Google, there is no good reason to use the Blobstore. If you can, use Google Cloud Storage; it is made for storing files.
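A minimal sketch of that advice applied to the code above: fetch each remote image with URL Fetch, write it to GCS, and keep only the GCS path on the entity (the bucket, model, and field names are placeholders, not the asker's exact schema):

import cloudstorage as gcs
from google.appengine.api import urlfetch
from google.appengine.ext import ndb

class RentalProperty(ndb.Model):
    street_name = ndb.StringProperty(required=True)
    image_gcs_path = ndb.StringProperty()  # path to the image, not the bytes

def save_image(url, street_name):
    result = urlfetch.fetch(url)
    gcs_path = '/my-bucket/images/' + street_name + '.jpg'  # placeholder bucket
    content_type = result.headers.get('Content-Type', 'image/jpeg')
    with gcs.open(gcs_path, 'w', content_type=content_type) as f:
        f.write(result.content)
    RentalProperty(street_name=street_name, image_gcs_path=gcs_path).put()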

Serving images directly from GCS in GAE using Blobstore API and Images API

Many questions and answers on Blobstore and Google Cloud Storage(GCS) are two or three years old, while things change dramatically these years. GCS is no longer a standalone service. It is integrated into Google App Engine (GAE) now.
Google seems to push GCS so hard that Blobstore is deprecated, for example,
The Files API feature used here to write files to Blobstore has been deprecated and is going to be removed at some time in the future, in favor of writing files to Google Cloud Storage and using Blobstore to serve them.
I believe it is high time to switch to GCS.
For example, www.example.com is a site built on GAE, and example.jpg is an image stored on GCS. I want to serve the image using the URL http://www.example.com/images/example.jpg
This used to be impossible, but now it is possible thanks to the integration.
I found this:
https://developers.google.com/appengine/docs/python/googlecloudstorageclient/
says:
When the Blobstore API is used together with the Images API, you get a powerful way to serve images, because you can serve images directly from GCS, bypassing the App Engine app, which saves on instance hour costs.
I do not know how to 'bypass the App Engine app'. Is there any example of how to bypass GAE while serving images using the Blobstore API and Images API?
Instructions are here: https://developers.google.com/appengine/docs/python/images/functions#Image_get_serving_url
Start with an image hosted in Google Cloud Storage.
First, use the Blobstore API's create_gs_key() function to generate a blob key for your GCS image object. Then pass that blob key into the Images API's get_serving_url() function.
The Images API will give you a special URL that skips over your App Engine app and serves the image directly.
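Putting the two steps together, a minimal sketch (the bucket and object names are placeholders):

from google.appengine.api import images
from google.appengine.ext import blobstore

# Derive a blob key for the GCS object, then ask the Images API for the
# direct-serving URL.
blob_key = blobstore.create_gs_key('/gs/<bucket>/<object>')
url = images.get_serving_url(blob_key)

The returned URL is served by Google's image-serving infrastructure, so requests for it never touch your App Engine instances.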
You actually do not need to use the Blobstore at all now. The following will get the Images API serving URL for an image stored in GCS:
from google.appengine.api import images
images.get_serving_url(None, filename='/gs/<bucket>/<object>')
Serving images from 'www' is not a good idea if you are using www as your GAE CNAME; to serve images you can create a new subdomain. In our case we use cdn.example.com, and we serve our images like http://cdn.example.com/images/example.jpg
How do we do it:
Create a GCS bucket named cdn.example.com and place your images under the /images path.
Specify your index and 404 pages and it will be good to go.
More on this: https://cloud.google.com/storage/docs/website-configuration?hl=en
