How to deliver content (video) by download with the Python SDK? - python

I'm a Python developer, inexperienced with Microsoft Azure services.
For a client I have to allow downloading of videos using Azure Media Services (video streaming). I did find information on the subject in the documentation (https://learn.microsoft.com/en-us/azure/media-services/previous/media-services-deliver-asset-download), but I want to get there using Python (so either the Azure REST API or the Python SDK).
I'm starting to believe it's impossible.
I need your help please.

Everything you need to do should be completely possible with the Python SDK.
I do not recommend using the REST API directly! It does not have any of the built-in retry policies that the Azure Resource Management API requires. You can run into issues with that in production, unless you know what you are doing and roll your own retry logic.
Use the official Python SDK client for Media Services only.
Also, the link above for the REST API is pointing to the legacy v2 API - do not use that now. Use the latest v3 SDK client only:
pip install azure-mgmt-media
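For orientation, here's a minimal sketch of creating the v3 client - assuming azure-identity is installed alongside it, and with placeholder subscription and resource names:

    # Minimal v3 client setup (azure-mgmt-media + azure-identity).
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.media import AzureMediaServices

    client = AzureMediaServices(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",  # placeholder
    )

    # For example, enumerate the assets in a Media Services account:
    for asset in client.assets.list("<resource-group>", "<media-account>"):
        print(asset.name)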
We have a limited number of Python samples up here that show how to use the client SDK for Python - https://github.com/Azure-Samples/media-services-v3-python
None of us on the team are Python experts, and we don't seem to get a lot of contributions to that repo - so it is not anywhere near as comprehensive as our .NET samples here - https://github.com/Azure-Samples/media-services-v3-dotnet
But keep in mind that all the Azure SDKs are just auto-generated off the REST API Swagger (OpenAPI) - so they all use the exact same entities, and use the same JSON structure on the wire. So if you know what the REST API is doing and what the entities are, you can easily port things around between languages. Helps to know Python first though!
You mentioned you want to download stuff - that will require the use of the Storage SDKs for Python. Media Services just uses Azure Storage accounts, meaning you can access the containers using SAS URLs to upload and download content. Look at the Storage samples for Python to see what to do there (and the sketch below). https://pypi.org/project/azure-storage-blob/
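Here's a sketch of the download side with azure-storage-blob, assuming you already have a read SAS URL for the asset's storage container (the v3 client can issue one via its assets.list_container_sas operation); the URL below is a placeholder:

    import os
    from azure.storage.blob import ContainerClient

    # Placeholder container SAS URL for the asset's storage container.
    sas_url = "https://<account>.blob.core.windows.net/<container>?<sas-token>"
    container = ContainerClient.from_container_url(sas_url)

    # List every blob in the asset's container and save it locally.
    for blob in container.list_blobs():
        with open(os.path.basename(blob.name), "wb") as f:
            f.write(container.download_blob(blob.name).readall())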

Uploaded videos are stored as asset files when the files are uploaded using the Azure Media Services SDK, which makes it easier to stream video to different devices.
To stream or download an asset, you first need to "publish" it by creating a locator. Locators provide access to files contained in the asset.
Media Services supports two types of locators:
OnDemandOrigin locators, used to stream media (for example, MPEG-DASH, HLS, or Smooth Streaming)
Shared Access Signature (SAS) locators, used to download media files.
Once you create the locators, you can build the URLs that are used to stream or download your files.
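As an illustration only (hypothetical names), the legacy v2 docs describe building a download URL by inserting the file name between the SAS locator's container path and its query string:

    from urllib.parse import urlparse

    def build_download_url(sas_locator_path: str, file_name: str) -> str:
        # Insert the file name between the container path and the SAS token.
        parsed = urlparse(sas_locator_path)
        return f"{parsed.scheme}://{parsed.netloc}{parsed.path}/{file_name}?{parsed.query}"

    # build_download_url("https://acct.blob.core.windows.net/asset-abc?sv=...&sig=...",
    #                    "movie.mp4")
    # -> "https://acct.blob.core.windows.net/asset-abc/movie.mp4?sv=...&sig=..."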
Here's a guide for doing that using the REST API: https://learn.microsoft.com/en-us/azure/media-services/previous/media-services-rest-get-started
Note: are you uploading your videos directly to Azure Storage? If that's the case, my suggestion would be to upload your videos using the Azure Media Services SDK instead.
Azure Media Services has pretty good documentation which might help with your other asks: http://azure.microsoft.com/en-us/develop/media-services/resources/

Related

Make the Google python client library for accessing Google cloud storage hit a stubbed API

I am writing an application that uses Google's python client for GCS.
https://cloud.google.com/storage/docs/reference/libraries#client-libraries-install-python
I've had no issues using this, until I needed to write my functional tests.
The way our organization tests integrations like this is to write a simple stub of the API endpoints I hit, and point the Google client library (in this case) to my stub, instead of needing to hit Google's live endpoints.
I'm using a service account for authentication and am able to point the client at my stub when fetching a token, because it gets that value from the service account's JSON key that you get when you create the service account.
What I don't seem able to do is point the client library at my stubbed API instead of making calls directly to Google.
Some workarounds that I've thought of, but don't like, are:
- Allow the tests to hit the live endpoints.
- Put in some configuration that toggles using the real Google client library, or a mocked version of the library. I'd rather mock the API versus having mock code deployed to production.
Any help with this is greatly appreciated.
I've done some research and it seems there's nothing supported specifically for Cloud Storage using Python. I found this GitHub issue entry with a related discussion, but for Go.
I think you can open a public issue tracker entry asking for this functionality. I'm afraid that for now it's easier to keep using your second workaround, sketched below.
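For what it's worth, a rough sketch of that second workaround - patching the client library in tests rather than stubbing Google's endpoints - where the module path myapp.gcs and the function upload_report are hypothetical:

    from unittest import mock

    @mock.patch("myapp.gcs.storage.Client")  # hypothetical module path
    def test_upload_report(mock_client_cls):
        from myapp.gcs import upload_report  # hypothetical code under test

        upload_report("reports/2024.csv", b"col1,col2")

        # Assert the code talked to the (mocked) client as expected.
        bucket = mock_client_cls.return_value.bucket.return_value
        bucket.blob.assert_called_once_with("reports/2024.csv")
        bucket.blob.return_value.upload_from_string.assert_called_once_with(b"col1,col2")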

PDF/TIFF Document Text Detection

I am currently trying to use Google's Cloud Vision API for my project. The problem is that the Cloud Vision API for document text detection accepts only a Google Cloud Storage URI as the input and output destination. But I have all my projects and data on an Amazon S3 server, which can't be used directly with this API.
Points to be noted:
- All data should be kept in S3 only.
- I can't change my cloud storage to GCS now.
- I can't download files from S3 and upload them to GCS manually. The number of files incoming per day is more than 1,000 and less than 100,000.
- Even if I could automate downloading and uploading of the PDFs, this would be a bottleneck for the entire project, since I would have to deal with concurrency issues and memory management.
Is there any workaround to make this API work with an S3 URI? I am in need of your help.
Thank You
Currently, the Vision API doesn't work with URLs apart from Google Cloud Storage ones. There's a feature request for image search related to using the API with specific URLs, where you could ask for this feature to be considered for PDF/TIFF documents too, or you can raise a new feature request for this scenario.
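For reference, a minimal sketch of the GCS-only call in question, using the google-cloud-vision client (names as in recent 2.x releases; the bucket paths are placeholders):

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    request = vision.AsyncAnnotateFileRequest(
        features=[vision.Feature(type_=vision.Feature.Type.DOCUMENT_TEXT_DETECTION)],
        input_config=vision.InputConfig(
            gcs_source=vision.GcsSource(uri="gs://my-bucket/input.pdf"),
            mime_type="application/pdf",
        ),
        output_config=vision.OutputConfig(
            gcs_destination=vision.GcsDestination(uri="gs://my-bucket/output/"),
            batch_size=20,
        ),
    )
    # Both source and destination must be gs:// URIs; s3:// is rejected.
    operation = client.async_batch_annotate_files(requests=[request])
    operation.result(timeout=300)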

How to download all files from a Google bucket directory to a local directory using Google OAuth

Is there any way, using OAuth, to download all the contents of a Google bucket directory to a local directory?
I found two ways: using the (get request object) from the Storage API, and gsutil. But since the API downloads objects directly by name, I have to first list all the bucket contents, then send a GET request for each object and download it. I find gsutil more convenient, but for that I have to hard-code the credential details.
Basically, I am developing a client-facing application where I have to download BigQuery table data to the client's local server.
Can anyone help me with this?
Unless your application knows ahead of time the object names that you want to download, you'll need to perform a list followed by GETs for each object.
You can use the gcloud-python client library to do this. You can configure your client application with the OAuth2 credentials, and the library should handle the rest of the necessary authentication for you. See the documentation here for the basics of authentication, and [here](https://googlecloudplatform.github.io/google-cloud-python/stable/storage-blobs.html) for interacting with Google Cloud Storage objects.
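A minimal sketch of that list-then-GET pattern with the current google-cloud-storage library (the successor to gcloud-python); the bucket name and prefix are placeholders, and credentials come from the usual application-default sources such as GOOGLE_APPLICATION_CREDENTIALS:

    import os
    from google.cloud import storage

    client = storage.Client()
    # List every object under the prefix, then GET each one.
    for blob in client.list_blobs("my-bucket", prefix="exports/"):
        if blob.name.endswith("/"):  # skip zero-byte "directory" placeholders
            continue
        local_path = os.path.join("downloads", *blob.name.split("/"))
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        blob.download_to_filename(local_path)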

Twisted Python Google Drive API for Tahoe-LAFS

I am planning on creating a Twisted Python Google Drive API using the open-source txboxdotnet API as a base, which is based on Box.com's API and is located here.
Doing this would allow me to create a plugin for Tahoe-LAFS, a Least Authority File System which gives the user a number of additional client-side security functions. For my purposes, I am using the open-source branch of Tahoe-LAFS that allows for cloud interaction located here and an open-source public-clouds modification made for this branch.
My main question is would the creation of a Twisted Python API for Google Drive be allowed under the Terms and Conditions for Google Developers? The API will (of course) not contain any of my confidential developer credentials.
Going on the assumption of best practices here, and given your description of the API usage and intent, you are within the bounds of their terms. You can read for yourself here - Sections 3 and 4, to be exact.

Google Storage Client Library from outside AppEngine

I'm curious, is there a way I could use the new Google Cloud Storage client library from outside AppEngine? If so, how would I go about setting the credentials/API key? I looked through the sparse documentation, to no avail. Any help is much appreciated.
Thanks.
Google Cloud Storage and Google AppEngine are separate products that can be used separately. AppEngine provides an AppEngine-specific client for Google Cloud Storage that offers several useful features for developing an AppEngine app that will use Google Cloud Storage, which I believe is the library you're referring to.
You can absolutely use Google Cloud Storage from outside AppEngine, but you cannot use AppEngine's GCS library to do so. Instead, you'll have to use one of GCS's APIs or client libraries. There are two main APIs (XML-based and JSON-based), and also client libraries for many major languages, including Python and Java.
For getting started, check out https://developers.google.com/storage/docs/signup
It should be possible to use the GCS client from outside GAE; however, you will still need to have the GAE SDK installed so the imports can work.
Take a look at the method common.set_access_token; you would need to refresh the token by yourself, however.
If you are willing to dig further, you can take a look at the constructor of the _RestApi class, which receives a token-maker function.
This is an open source project and changes are welcomed.
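A rough sketch of that approach, assuming the GAE SDK and the appengine-gcs-client library (imported as cloudstorage) are on your Python path; fetch_token below is a hypothetical stub, since refreshing the OAuth2 token is up to you:

    import cloudstorage
    from cloudstorage import common

    def fetch_token():
        # Hypothetical: mint an OAuth2 access token for the devstorage
        # scope (e.g. via oauth2client) and handle refresh yourself.
        raise NotImplementedError

    common.set_access_token(fetch_token())
    # Read an object; the path form is /<bucket>/<object> (placeholders).
    with cloudstorage.open("/my-bucket/my-object.txt") as gcs_file:
        data = gcs_file.read()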
