Python Google Cloud Storage Image Store

OK... I have been trying to figure out how to do this for a long time now without much success.
I have a Python script running locally in the Google App Engine Launcher that receives an image file via POST. I have not launched the application yet; however, I am able to get to Google Cloud SQL, so I assume I can get to Google Cloud Storage.
import MySQLdb
import logging
import webapp2
import json

class PostTest(webapp2.RequestHandler):
    def post(self):
        image = self.request.POST.get('file')
        logging.info("Pic: %s" % self.request.POST.get('file'))

#################################
# Main Portion
#################################
application = webapp2.WSGIApplication([
    ('/', PostTest)
], debug=True)
The logging outputs this, so I know it is receiving the image:
INFO 2014-08-04 23:20:43,299 posttest.py:21] Pic Bytes: FieldStorage(u'file', u'tmp.jpg')
How do I connect to Google Cloud Storage?
How do I upload this image to my Google Cloud Storage bucket called 'app'?
How do I retrieve it once it is there?
This should be a simple thing to do; however, I haven't been able to find good, clear documentation on how to do it. The REST API is deprecated, and the GoogleAppEngineCloudStorageClient confuses me.
Can someone help me please with a code example? I will be really grateful!

I created a repository with a script that does this simply: https://github.com/itsdeka/python-google-cloud-storage
Example of integration with Django:
picture = request.FILES.get('picture', None)
file_name = 'test'
directory = 'myfolder'
format = '.jpg'
GoogleCloudStorageUtil.uploadMediaObject(file=picture,file_name=file_name,directory=directory,format=format)
Tip: it automatically creates a folder called 'myfolder' in your bucket if that folder doesn't exist.
The link of every picture uploaded to your bucket is the same except for the file name, so it is pretty easy to retrieve the picture that you want.
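For reference, here is a minimal sketch of an upload/read round-trip with the GoogleAppEngineCloudStorageClient library (imported as cloudstorage) that the question mentions; the bucket name 'app' comes from the question, while the helper names are my own:

```python
# Minimal sketch, assuming the GoogleAppEngineCloudStorageClient library
# ("cloudstorage"), which is only available inside the App Engine runtime.
try:
    import cloudstorage  # ships with the App Engine SDK, not via pip
except ImportError:
    cloudstorage = None  # allows importing this module outside App Engine

def gcs_object_path(bucket, filename):
    # The cloudstorage client addresses objects as '/bucket/object'.
    return '/%s/%s' % (bucket, filename)

def store_image(bucket, filename, data, content_type='image/jpeg'):
    # Write the raw image bytes into the bucket and return the object path.
    path = gcs_object_path(bucket, filename)
    with cloudstorage.open(path, 'w', content_type=content_type) as gcs_file:
        gcs_file.write(data)
    return path

def read_image(bucket, filename):
    # Read the stored bytes back, e.g. to serve them from a handler.
    with cloudstorage.open(gcs_object_path(bucket, filename)) as gcs_file:
        return gcs_file.read()
```

In the POST handler from the question, the upload could then be stored with something like store_image('app', 'tmp.jpg', image.file.read()).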

Related

Azure Functions - How to Connect to other services using API connections

I'm new to Azure Functions. I want to deploy my Python code in a Function App, where my code is linked with SharePoint, Outlook, and SQL Server. Could someone suggest the best way to connect all three of them in an Azure Functions app?
First, regarding access to SharePoint files from an Azure Function: we just need a few imports for it, and there is also Python documentation for the Office365-REST client.
Below is one of the examples to download a file from SharePoint:
import os
import tempfile

from office365.sharepoint.client_context import ClientContext
# sample site URL and credentials from the library's test suite
from tests import test_team_site_url, test_client_credentials

ctx = ClientContext(test_team_site_url).with_credentials(test_client_credentials)
# file_url = '/sites/team/Shared Documents/big_buck_bunny.mp4'
file_url = "/sites/team/Shared Documents/report #123.csv"
download_path = os.path.join(tempfile.mkdtemp(), os.path.basename(file_url))
with open(download_path, "wb") as local_file:
    file = ctx.web.get_file_by_server_relative_path(file_url).download(local_file).execute_query()
print("[Ok] file has been downloaded into: {0}".format(download_path))
To get all the details of file and folder operations, refer to this GitHub link.
For connecting to SQL, there is a blog which has all the insights, with Python code, thanks to lieben.
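As a hedged sketch of the SQL Server side (the driver name and all connection details below are placeholders of mine, not taken from the blog), the usual route is pyodbc with an ODBC connection string:

```python
# Sketch: assemble an ODBC connection string for SQL Server.
# Server, database, and credential values are placeholders, not from the post.
def build_conn_str(server, database, username, password):
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=%s;Database=%s;Uid=%s;Pwd=%s;" % (server, database, username, password)
    )

# Inside the function app one would then do, for example:
#     import pyodbc
#     conn = pyodbc.connect(build_conn_str("myserver.database.windows.net",
#                                          "mydb", "user", "password"))
```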

Returning an image that is not stored locally with a flask endpoint

I'm not sure if this is a duplicate; I've seen plenty of questions on returning an image from a Flask endpoint, but what happens when the photo you're trying to return resides remotely, e.g. in an S3 bucket?
Originally my code returns an image from a python flask endpoint when there is a GET request.
class ReturnImage(Resource):
    def get(self):
        # Some code here
        full_file_path = '/local/home/pic.gif'
        return send_file(full_file_path, mimetype='image/gif')
Except that only works when the image pic.gif is stored locally on my machine. If I now want to return an image that is not stored locally, how do I do it? I tried to use the urllib library to download the image and then return the output of that, but it doesn't seem to do the trick. Below is my attempt:
import urllib.request

class ReturnImage(Resource):
    def get(self):
        # Some code here
        full_file_path = '/local/home/pic.gif'
        image = urllib.request.urlretrieve("https://s3-us-west-2.amazonaws.com/pic.gif")
        return send_file(image, mimetype='image/gif')
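One likely culprit: urllib.request.urlretrieve returns a (filename, headers) tuple, not a file object, so send_file receives the wrong type. A minimal sketch of one working approach (the URL is the placeholder from the question) is to read the bytes into memory and hand send_file a file-like object:

```python
import io
import urllib.request

def fetch_image_bytes(url):
    # Download the remote image fully into memory and return the raw bytes.
    with urllib.request.urlopen(url) as resp:
        return resp.read()

def as_filelike(data):
    # send_file accepts a file-like object; BytesIO wraps raw bytes as one.
    return io.BytesIO(data)

# The endpoint would then look something like:
#     def get(self):
#         data = fetch_image_bytes("https://s3-us-west-2.amazonaws.com/pic.gif")
#         return send_file(as_filelike(data), mimetype='image/gif')
```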

FileUploadMiscError while persisting output file from Azure Batch

I'm facing the following error while trying to persist log files to Azure Blob storage from Azure Batch execution - "FileUploadMiscError - A miscellaneous error was encountered while uploading one of the output files". This error doesn't give a lot of information as to what might be going wrong. I tried checking the Microsoft Documentation for this error code, but it doesn't mention this particular error code.
Below is the relevant code for adding the task to Azure Batch that I have ported from C# to Python for persisting the log files.
Note: The container that I have configured gets created when the task is added, but there's no blob inside.
import datetime
import logging
import os

import azure.storage.blob.models as blob_model
import yaml
from azure.batch import models
from azure.storage.blob.baseblobservice import BaseBlobService
from azure.storage.common.cloudstorageaccount import CloudStorageAccount
from dotenv import load_dotenv

LOG = logging.getLogger(__name__)

def add_tasks(batch_client, job_id, task_id, io_details, blob_details):
    task_commands = "This is a placeholder. Actual code has an actual task. This gets completed successfully."
    LOG.info("Configuring the blob storage details")
    base_blob_service = BaseBlobService(
        account_name=blob_details['account_name'],
        account_key=blob_details['account_key'])
    LOG.info("Base blob service created")
    base_blob_service.create_container(
        container_name=blob_details['container_name'], fail_on_exist=False)
    LOG.info("Container present")
    container_sas = base_blob_service.generate_container_shared_access_signature(
        container_name=blob_details['container_name'],
        permission=blob_model.ContainerPermissions(write=True),
        expiry=datetime.datetime.now() + datetime.timedelta(days=1))
    LOG.info(f"Container SAS created: {container_sas}")
    container_url = base_blob_service.make_container_url(
        container_name=blob_details['container_name'], sas_token=container_sas)
    LOG.info(f"Container URL created: {container_url}")
    # fpath = task_id + '/output.txt'
    fpath = task_id
    LOG.info("Creating output file object:")
    out_files_list = list()
    out_files = models.OutputFile(
        file_pattern=r"../stderr.txt",
        destination=models.OutputFileDestination(
            container=models.OutputFileBlobContainerDestination(
                container_url=container_url, path=fpath)),
        upload_options=models.OutputFileUploadOptions(
            upload_condition=models.OutputFileUploadCondition.task_completion))
    out_files_list.append(out_files)
    LOG.info(f"Output files: {out_files_list}")
    LOG.info(f"Creating the task now: {task_id}")
    task = models.TaskAddParameter(
        id=task_id, command_line=task_commands, output_files=out_files_list)
    batch_client.task.add(job_id=job_id, task=task)
    LOG.info(f"Added task: {task_id}")
There is a bug in Batch's OutputFile handling which causes it to fail to upload to containers if the full container URL includes any query-string parameters other than the ones included in the SAS token. Unfortunately, the azure-storage-blob Python module includes an extra query string parameter when generating the URL via make_container_url.
This issue was just raised to us, and a fix will be released in the coming weeks, but an easy workaround is instead of using make_container_url to craft the URL, craft it yourself like so: container_url = 'https://{}/{}?{}'.format(blob_service.primary_endpoint, blob_details['container_name'], container_sas).
The resulting URL should look something like this: https://<account>.blob.core.windows.net/<container>?se=2019-01-12T01%3A34%3A05Z&sp=w&sv=2018-03-28&sr=c&sig=<sig>. Specifically, it shouldn't have restype=container in it (which is what the azure-storage-blob package is including).
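Put as code, the suggested workaround is just string assembly (the names here follow the answer's snippet; primary_endpoint is the blob service's account endpoint):

```python
# Build the container URL by hand so the only query string is the SAS token
# itself, avoiding the extra restype=container parameter that
# make_container_url appends.
def make_plain_container_url(primary_endpoint, container_name, sas_token):
    return 'https://{}/{}?{}'.format(primary_endpoint, container_name, sas_token)
```

In the task-adding code above, that would replace the make_container_url call: container_url = make_plain_container_url(base_blob_service.primary_endpoint, blob_details['container_name'], container_sas).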

I have been using pyimgur to upload pictures and I would like to know how to delete them

I have been using this code to upload pictures to imgur:
import pyimgur
CLIENT_ID = "Your_applications_client_id"
PATH = "A Filepath to an image on your computer"
im = pyimgur.Imgur(CLIENT_ID)
uploaded_image = im.upload_image(PATH, title="Uploaded with PyImgur")
print(uploaded_image.title)
print(uploaded_image.link)
print(uploaded_image.size)
print(uploaded_image.type)
I'm using the client ID from my Imgur account, but after I upload them they won't appear among my user photos.
Do you know where they are stored and how I can delete them?
Many thanks in advance.
The answer was:
print(uploaded_image.deletehash)
and once you get that code, open it at:
imgur.com/delete/<deletehash>
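For anonymous uploads, the deletion URL can be assembled from the deletehash; a small sketch (the helper name and hash value are placeholders of mine):

```python
def delete_url(deletehash):
    # Visiting this URL in a browser deletes an anonymously uploaded image.
    return "https://imgur.com/delete/{}".format(deletehash)
```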

Urllib error 500 works locally

I want to read an INI file from Python, and I came up with the following script:
import webapp2
import urllib
import ConfigParser

class MainPage(webapp2.RequestHandler):
    def get(self):
        getstreet = self.request.get('street')
        getlocation = self.request.get('location')
        self.response.headers['Content-Type'] = 'text/plain'
        map = ConfigParser.RawConfigParser()
        map.read('Map.ini')
        coords = map.get(getlocation, getstreet)
        self.response.out.write(coords)

app = webapp2.WSGIApplication([('/map', MainPage)],
                              debug=True)
From outside I just need to read:
xxx.appspot.com/map?location=mylocation&street=mystreet
And I double checked the INI file is uploaded and available:
xxx.appspot.com/Map.ini
It works just fine locally, but I get an error when deploying.
I've seen mentions of URL Fetch, but all the examples I could find were the other way around.
Any ideas?
If your file is available via the URL, it's almost certainly marked as static. One of the limitations of App Engine is that static files cannot be read from code. You should be able to fix it by removing the static declaration.
There's loads more information about this at Read a file on App Engine with Python?
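Alternatively, if the file should stay downloadable at /Map.ini, a static handler can be marked as readable by application code; a sketch of the relevant app.yaml fragment, assuming Map.ini sits at the root of the app:

```yaml
handlers:
- url: /Map.ini
  static_files: Map.ini
  upload: Map.ini
  application_readable: true  # lets application code read this static file
```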
