Serve streaming video in Google App Engine - python

I'm building a small educational web application. Along with other features like discussion forums, registered users will be able to view streaming videos. I'll be using Google App Engine's webapp2 framework for back-end development (with Python). Specifically, how can I integrate video streaming into my application? I'm fairly new to web development and have a basic working knowledge of App Engine. I'll be using Google's Datastore to store all the app's data, but where do I store the videos that the app serves to users? I don't want to make the video content publicly available (e.g. on YouTube), so what's the way to go?
I'm aware that GAE's Blobstore is dedicated to serving large files (e.g. videos), so would it be appropriate for this purpose? What are some other options?

Yes, Blobstore is fine. You can also use Google Cloud Storage, either directly or through the Blobstore API.
Plenty of related Q&As to study, many of which contain code snippets: https://stackoverflow.com/search?q=[google-app-engine]+video+streaming
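For serving from the Blobstore specifically, the first-generation (Python 2) runtime ships a download handler whose send_blob supports HTTP Range requests, which browsers use when scrubbing video. A minimal sketch, assuming the blob key is passed in the URL and that the route name is illustrative; in a real app you would check the user's login/session before serving, since the content shouldn't be public:

    import urllib
    import webapp2
    from google.appengine.ext import blobstore
    from google.appengine.ext.webapp import blobstore_handlers

    class ServeVideoHandler(blobstore_handlers.BlobstoreDownloadHandler):
        """Streams a stored video; send_blob handles byte-range requests."""
        def get(self, resource):
            blob_key = str(urllib.unquote(resource))
            # Gate this behind your own user/session check before serving.
            if not blobstore.get(blob_key):
                self.error(404)
            else:
                self.send_blob(blob_key)

    app = webapp2.WSGIApplication([(r'/video/([^/]+)', ServeVideoHandler)])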

Related

How to deliver content (video) by download with the Python SDK?

I'm a Python developer, inexperienced in Microsoft Azure services.
For a client I have to allow downloading of videos using Azure Media Services (video streaming). I did find information on the subject in the documentation (https://learn.microsoft.com/en-us/azure/media-services/previous/media-services-deliver-asset-download), but I want to get there using Python (so either the Azure REST API or the Python SDK).
I'm starting to believe it's impossible.
I need your help please.
Everything you need to do should be completely possible with the Python SDK.
I do not recommend using the REST API directly! It does not have the built-in retry policies that the Azure Resource Management API requires. You can get into issues with that in production, unless you know what you are doing and roll your own retry logic.
Use the official Python SDK client for Media Services only.
Also, the link above for the REST API is pointing to the legacy v2 API - do not use that now. Use the latest v3 SDK client instead:
pip install azure-mgmt-media
We have a limited number of Python samples up here that show how to use the client SDK for Python - https://github.com/Azure-Samples/media-services-v3-python
None of us on the team are Python experts, and we don't seem to get a lot of contributions to that repo - so it is not anywhere near as comprehensive as our .NET samples here - https://github.com/Azure-Samples/media-services-v3-dotnet
But keep in mind that all the Azure SDKs are auto-generated from the REST API Swagger (OpenAPI) definitions, so they all use the exact same entities and the same JSON structure on the wire. If you know what the REST API is doing and what the entities are, you can easily port things between languages. It helps to know Python first, though!
You mentioned you want to download content - that will require the Storage SDK for Python. Media Services just uses Azure Storage accounts, meaning you can access the containers using SAS URLs to upload and download content. Look at the Storage samples for Python to see what to do there: https://pypi.org/project/azure-storage-blob/
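As a rough illustration of that download path, here is a minimal sketch using azure-storage-blob, assuming you already have a SAS URL for a blob in the asset's storage container (the URL and filename below are placeholders):

    from azure.storage.blob import BlobClient

    # Placeholder SAS URL for a blob in the asset's container.
    sas_url = "https://<account>.blob.core.windows.net/<container>/video.mp4?<sas-token>"

    blob = BlobClient.from_blob_url(sas_url)

    # Stream the blob's bytes straight into a local file.
    with open("video.mp4", "wb") as f:
        blob.download_blob().readinto(f)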
Uploaded videos are stored as an Asset if the files are uploaded using the Azure Media Services SDK, which makes it easier to stream video to different devices.
To stream or download an asset, you first need to "publish" it by creating a locator. Locators provide access to files contained in the asset.
Media Services supports two types of locators:
OnDemandOrigin locators, used to stream media (for example, MPEG DASH, HLS, or Smooth Streaming)
Shared Access Signature (SAS) locators, used to download media files.
Once you create the locators, you can build the URLs that are used to stream or download your files.
Here's a guide for doing that using the REST API: https://learn.microsoft.com/en-us/azure/media-services/previous/media-services-rest-get-started
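That guide covers the legacy v2 REST flow; with the v3 Python SDK mentioned above (azure-mgmt-media), the same "publish" step looks roughly like the sketch below. The subscription, resource group, account, asset, and locator names are all placeholders:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.media import AzureMediaServices
    from azure.mgmt.media.models import StreamingLocator

    client = AzureMediaServices(DefaultAzureCredential(), "<subscription-id>")

    # "Publish" the asset by creating a streaming locator for it.
    client.streaming_locators.create(
        resource_group_name="myResourceGroup",
        account_name="myMediaAccount",
        streaming_locator_name="myLocator",
        parameters=StreamingLocator(
            asset_name="myAsset",
            streaming_policy_name="Predefined_ClearStreamingOnly",
        ),
    )

    # Build the playable URLs from the locator's paths.
    paths = client.streaming_locators.list_paths(
        "myResourceGroup", "myMediaAccount", "myLocator"
    )
    for streaming_path in paths.streaming_paths:
        for path in streaming_path.paths:
            print("https://<streaming-endpoint-hostname>" + path)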
Note: are you uploading your videos directly to Azure Storage? If so, my suggestion would be to upload them using the Azure Media Services SDK instead.
Azure Media Services has pretty good documentation which might help with your other asks: http://azure.microsoft.com/en-us/develop/media-services/resources/

Is it possible to upload content to Cloud Storage automatically from Google Drive?

I need to load CSV files from Google Drive into BigQuery automatically, and I was wondering if it's possible to do it this way:
Google Drive folder -> Pub/Sub, Cloud Functions, Drive API...? -> Cloud Storage bucket -> BigQuery
I have developed a Python script that automatically uploads the CSV files stored in Cloud Storage to BigQuery; now I need to create the workflow between Google Drive and Cloud Storage.
I've been researching but don't really know how to proceed.
Any hints?
You will need to develop an app to listen for changes; Google App Engine or Cloud Functions work well here.
The app will need to implement the Retrieve Changes logic that makes sense to your use case.
See these Google Drive API docs https://developers.google.com/drive/api/v3/manage-changes
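A minimal sketch of that change-polling loop with google-api-python-client, assuming application default credentials with a Drive read scope:

    from googleapiclient.discovery import build

    drive = build("drive", "v3")

    # Get a baseline token, then list everything that changed after it.
    token = drive.changes().getStartPageToken().execute()["startPageToken"]
    response = drive.changes().list(pageToken=token, spaces="drive").execute()

    for change in response.get("changes", []):
        # A new or updated CSV would show up here; this is where
        # you'd trigger the copy into Cloud Storage.
        print(change["fileId"])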
With Drive, I recommend asking whether the OAuth flow is worth it for any app. Asking your users to submit files to a lightweight frontend might be easier and faster to develop.
Try using the Google Drive API to pull data from Google Drive and load it into whichever location you want, i.e. GCS, a BQ table, and so on.
You can refer to the following example as a starting point.
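A sketch of that Drive-to-GCS copy, assuming the file ID is already known and using google-api-python-client plus google-cloud-storage (the IDs and names below are placeholders):

    import io
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaIoBaseDownload
    from google.cloud import storage

    DRIVE_FILE_ID = "your-drive-file-id"  # placeholder
    BUCKET_NAME = "your-bucket"           # placeholder

    # Assumes application default credentials with a Drive read scope.
    drive = build("drive", "v3")

    # Download the CSV from Drive into memory.
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, drive.files().get_media(fileId=DRIVE_FILE_ID))
    done = False
    while not done:
        _, done = downloader.next_chunk()

    # Upload the bytes to a Cloud Storage bucket.
    bucket = storage.Client().bucket(BUCKET_NAME)
    bucket.blob("data.csv").upload_from_string(buf.getvalue(), content_type="text/csv")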

How to access an App Engine data model from a desktop Python application?

I am attempting to create a Python application on a Raspberry Pi that can access data stored in a db model on an App Engine application - specifically, the latest entry in the Datastore.
I have no experience doing this type of remote data access but have a fair bit of experience with App Engine and Python.
I have found very little that I understand on this subject of remote data access.
I would like to access the data store directly, not text on a web page like this.
ProtoRPC kind of looks like it may work, but I'm not familiar with it, and it looks pretty involved when I just need to access a few strings.
What would make the most sense to accomplish this? If an example is easy to provide I would appreciate it.
What you're looking for is the App Engine Remote API.
https://cloud.google.com/appengine/docs/python/tools/remoteapi
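A minimal sketch of what that looks like from the Raspberry Pi, assuming the first-generation Python SDK is installed locally; the Reading model here is a hypothetical stand-in for whatever model the app actually defines:

    from google.appengine.ext.remote_api import remote_api_stub

    # Authenticates via OAuth and routes API calls to the live app.
    remote_api_stub.ConfigureRemoteApiForOAuth(
        '<your-app-id>.appspot.com', '/_ah/remote_api')

    from google.appengine.ext import ndb

    class Reading(ndb.Model):
        # Hypothetical model mirroring the one defined in the app.
        value = ndb.StringProperty()
        created = ndb.DateTimeProperty(auto_now_add=True)

    # Fetch the latest entry from the live Datastore.
    latest = Reading.query().order(-Reading.created).get()
    print(latest.value if latest else 'no data yet')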
Google App Engine doesn't allow direct access to its databases from your local Python script. Instead, only an application hosted on App Engine's servers can access that data for your application.
Essentially, you're looking for a Google App Engine-compatible, automatic, RESTful API. Several exist, and have been discussed here. YMMV with the various frameworks that are discussed there.

something like statsd + graphite for Google App Engine?

I'm new to GAE and though I've looked around a fair bit, I haven't seen anything that mimics the functionality of statsd for GAE. Basically it would be nice to have something that you could easily set stats on and see the results graphed.
http://codeascraft.etsy.com/2011/02/15/measure-anything-measure-everything/
One thing that seems to be difficult for statsd is handling an unlimited amount of data. If you are interested in aggregate application statistics (across the entire dataset), I would suggest using the App Engine Log API or the App Engine Datastore in conjunction with Google BigQuery.
If you are interested specifically in analyzing App Engine logs, there are two projects you can take a look at that help move App Engine log data into BigQuery:
log2bq, a Python app for moving GAE logs into BigQuery
Mache, a framework for pushing GAE log data into BigQuery (I know you are asking about Python, but this one is written in Java)
For general stats collection and analysis, it's also possible to move Datastore data into BigQuery for analysis. The GAE team has recently started testing a feature that imports data from the experimental Datastore backup tool directly into BigQuery. Check this link for more info.
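Once the log or Datastore data is in BigQuery, pulling aggregate stats is a single query. A sketch using the google-cloud-bigquery client against a hypothetical table of imported request logs (the project, dataset, and column names are placeholders):

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical table of request logs imported from App Engine.
    query = """
        SELECT status, COUNT(*) AS hits
        FROM `my_project.app_logs.requests`
        GROUP BY status
        ORDER BY hits DESC
    """
    for row in client.query(query).result():
        print(row.status, row.hits)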
BigQuery doesn't provide visualization tools on its own, but there are lots of ways to visualize BigQuery query results; examples include:
Google Chart Tools API
Google Apps Script
Tableau
QlikView
There's a lot more on the BigQuery third party tools page.

Twisted vs Google App Engine in serving mobile clients

So far I have been using Twisted to simultaneously serve a lot of mobile clients (Android, iPhone), exchanging JSON messages over HTTP.
For my next project I'd like to try out Google App Engine, but I'm wondering if it is capable of doing the same or if I should rather go with a custom built solution again.
Certainly. App Engine will automatically scale your application up as the load increases, spreading it over many machines. The web API they have is pretty nice too. You don't have to worry about deferreds either, because it scales by bringing up more instances instead of making things asynchronous.
BTW: I have web services hosted on app engine that are consumed by iPhone.
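For a sense of how little code is involved, a JSON endpoint on App Engine's first-generation Python runtime is just a webapp2 handler. A minimal sketch (the /api route and echo payload are illustrative):

    import json
    import webapp2

    class ApiHandler(webapp2.RequestHandler):
        def post(self):
            # Parse the client's JSON body and echo it back.
            payload = json.loads(self.request.body)
            self.response.headers['Content-Type'] = 'application/json'
            self.response.write(json.dumps({'echo': payload}))

    app = webapp2.WSGIApplication([('/api', ApiHandler)])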
