Google Cloud Storage - Python Client - Get Link URL of a blob - python

The Link URL shown on the Object details page in the Google Cloud Storage browser follows this template:
https://storage.cloud.google.com/<project_name>/<file_name>?authuser=0&organizationId=<org_id>
I'm trying to get the exact same link using the Python package google-cloud-storage. Diving into the blob properties, I've found the following (none of which is exactly what I need):
self_link: https://www.googleapis.com/storage/v1/b/<project_name>/o/<file_name>
media_link: https://storage.googleapis.com/download/storage/v1/b/<project_name>/o/<file_name>?generation=<gen_id>&alt=media
Note: If I replace storage.googleapis.com with storage.cloud.google.com in the media_link, I can download the file as expected (after being asked for a valid Google Account with the required permissions).
Is there any way to get the link directly from the blob object?

Here is the pattern:
https://storage.googleapis.com/<BUCKET_NAME>/path/to/file
For example, for a bucket my_bucket and a file stored at the path folder_save/2020-01-01/backup.zip, you get this URL: https://storage.googleapis.com/my_bucket/folder_save/2020-01-01/backup.zip
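For what it's worth, the client library can give you this form directly through the blob's public_url property; here is a minimal sketch, assuming application default credentials and the bucket/object names from the example above:

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my_bucket")
blob = bucket.blob("folder_save/2020-01-01/backup.zip")
# public_url returns the https://storage.googleapis.com/<bucket>/<object> form.
print(blob.public_url)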

I think the best approach is to generate the URL you need manually by replacing the domain of the URL.
In the client library source I couldn't find any reference to a method or property on the blob class that uses the domain "storage.cloud.google.com";
even the public_url property returns a URL that points to storage.googleapis.com.
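A minimal sketch of that domain swap, assuming the console-style link differs only in its hostname (the authuser and organizationId query parameters appear to be added by the browser session):

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my_bucket").blob("folder_save/2020-01-01/backup.zip")
# Point the library's public URL at the authenticated browser endpoint instead.
console_url = blob.public_url.replace("storage.googleapis.com", "storage.cloud.google.com")
print(console_url)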

Related

SharePoint list files from under a URL

I have a simple task to do: a script reads a free-text string which will contain SharePoint URLs. Those URLs are provided by the users, basically copy-pasted from their browsers. What my app has to do is go to those links and check whether there are any files under them.
So from what I can gather, there are many possible SharePoint URLs, for example:
<host>/sites/<site_name>/SitePages/something.aspx - for example a simple post
<host>/:w/r/sites/<site_name>/_layouts/15/something.aspx (like a shortcut URL) - for example a MS Office Word document
<host>/sites/<site_name>/<drive_name>/Forms/something.aspx?[...]&id=%2Fsites%2F<site-name>%2F<drive_name>%2F<path> - a URL to a file tree view of some files on a drive
<host>/:f:/r/sites/<site_name>/<drive_name>/<path_to_a_file>
The last one is perfect, because it contains the path to the directory in the URL path. The third one has it as well, but in the URL-encoded query parameters.
What I do in this scenario is I parse the URL, extracting:
site name
drive name (not ID)
path (from the URL path or from the encoded &id= parameter)
Then I can connect to SharePoint, get the site, list all the site drives (/drives), and check whether their "web_url" is a substring of my SharePoint URL (I could look up the drive by name, but the API returns the display name while my URL contains the actual drive name). Once I have the drive, I can get the item by path. All of this can be done via the regular MS Graph API (each step is needed to resolve the site/drive IDs) or via a Python wrapper (I use python-o365).
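For the parsing step, this is roughly what I do; it is only a best-effort sketch, assuming the third and fourth URL shapes above (the site name sits right after /sites/, and the path comes either from the remaining URL path or from the URL-encoded id= query parameter):

from urllib.parse import urlparse, parse_qs, unquote

def parse_sharepoint_url(url):
    # Best-effort extraction of site name, drive name and path from a SharePoint URL.
    parsed = urlparse(url)
    parts = [p for p in parsed.path.split("/") if p]
    # The site name follows the "sites" segment, e.g. /sites/<site_name>/...
    site = parts[parts.index("sites") + 1] if "sites" in parts else None
    # Prefer the URL-encoded id= query parameter when present (tree-view URLs).
    qs = parse_qs(parsed.query)
    if "id" in qs:
        id_parts = [p for p in unquote(qs["id"][0]).split("/") if p]
        drive, path = id_parts[2], "/".join(id_parts[3:])
    else:
        # Otherwise assume /sites/<site>/<drive>/<path...> after any /:f:/r prefix.
        tail = parts[parts.index("sites") + 2:] if "sites" in parts else []
        drive, path = (tail[0], "/".join(tail[1:])) if tail else (None, None)
    return site, drive, path

print(parse_sharepoint_url(
    "https://host/sites/my_site/Documents/Forms/x.aspx"
    "?id=%2Fsites%2Fmy_site%2FDocuments%2Ffolder%2Ffile.docx"))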
As you can see, this is a real pain. Is there a standard way to deal with this? I mean, if I had the site and drive IDs, I could do it in a single API call, but given that I only have a SharePoint link, I can't get those two, right? And what about the URL parsing?

Azure Function Set Blob Output Filename

I have a durable activity function which downloads a file from an api. The activity function receives the href for the file to download as an input.
I’d like to extract the filename part from the href and set this in the path parameter. E.g. container/directory/{filename}.txt
From the docs, I see that it is possible to access input bindings, but I cannot find an example for an activityTrigger.
I’m using the Python worker.
To achieve the above requirement, you can use os.path.basename as shown in the example below:
import os
print(os.path.basename("https://example.com/file.html"))  # prints "file.html"
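Note that if the href carries query parameters, os.path.basename keeps them in the result; here is a small sketch (the href value is a made-up example) that strips the query string first with urllib.parse:

import os
from urllib.parse import urlparse

href = "https://example.com/downloads/report-2023.pdf?token=abc123"  # hypothetical input href
filename = os.path.basename(urlparse(href).path)
print(filename)  # report-2023.pdf
blob_path = "container/directory/" + filename  # value you could use for the path parameter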
NOTE: There is no such output binding available yet, as mentioned in this open GitHub issue. To use an input binding, please refer to this MICROSOFT DOCUMENTATION.
For more information, please refer to this GitHub issue: How to set Storage Blob Filename, Content-Type, etc. on output binding?

How do I get the direct link for a Google Drive video (Google Drive API v3)

I'd like to make a function which converts Google Drive videos into VLC-streamable links (e.g. vlc://https://WEBSITE.com/FILE_ID.mkv).
I've tried methods which were shared on Stack Overflow, such as modifying the Google Drive link to:
https://drive.google.com/uc?export=download&id=FILE_ID
All the methods I've tried seem to not work anymore. Any ideas?
I've figured out the answer.
Google Drive's API has a download feature; you just need to make a request to https://www.googleapis.com/drive/v3/files/FILE_ID?alt=media&key=API_KEY
This doesn't generate a direct file path ending with .mp4 or .mkv, but VLC and PotPlayer are able to recognize the link in these forms:
potplayer://https://www.googleapis.com/drive/v3/files/FILE_ID?alt=media&key=API_KEY
vlc://https://www.googleapis.com/drive/v3/files/FILE_ID?alt=media&key=API_KEY
Edit: this doesn't work in development; Google prevents bots from making requests like that. To work around it, you need to set an authorization header on your request, e.g.:
import requests
url = "https://www.googleapis.com/drive/v3/files/FILE_ID?alt=media&key=API_KEY"
r = requests.get(url, headers={"Authorization": "Bearer " + accessToken})
You get the accessToken from the Google Drive API.
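As a small follow-up sketch (assuming the request above succeeded, and using an arbitrary example filename), the media bytes can simply be written to disk:

# Save the downloaded media; "video.mkv" is just an illustrative name.
if r.ok:
    with open("video.mkv", "wb") as f:
        f.write(r.content)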
Just make the file public and copy your ID.
You can find it here: /file/d/YOUR ID/view?usp=sharing.
Copy your ID and paste it in this:
drive.google.com/uc?export=view&id=YOUR ID

Client side s3 upload and returning the public url of the image

I am writing a client-side image uploader library for Python. I need to upload an image to Amazon S3 and return the public URL of the image. I can do this using boto; however, I would have to share my secret key, which is not the correct way of doing it. As an alternative, I can use a browser upload via Amazon's POST request, but that doesn't give me access to the image's public URL. How do I solve this conundrum?
I have the same issue. The only thing I have found is to make the image public on save and later use:
item.image.url.split('?')[0]
to get the URL.
How about using the urlparse module to get the URL without the query parameters?
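A small sketch of that urlparse approach, using a made-up signed URL in place of item.image.url from the answer above:

from urllib.parse import urlparse, urlunparse

signed_url = "https://my-bucket.s3.amazonaws.com/images/photo.jpg?AWSAccessKeyId=AKIA123&Signature=abc"  # example value
parsed = urlparse(signed_url)
# Keep scheme, host and path; drop params, query and fragment.
public_url = urlunparse((parsed.scheme, parsed.netloc, parsed.path, "", "", ""))
print(public_url)  # https://my-bucket.s3.amazonaws.com/images/photo.jpg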

How to retrieve a picasa photo using Google api in Python

I am trying to retrieve a photo (the .jpg file) from a Picasa album using the gdata Google API. However, I did not find any method that does this, even though it is possible to upload a photo using methods like InsertPhoto and InsertPhotoSimple.
I guess I must be missing something :-(. A simple example would help.
The client API provides methods only for obtaining information about photos: http://code.google.com/apis/picasaweb/docs/1.0/developers_guide_python.html#Photos
To retrieve an image, use an HTTP GET with information previously obtained. Read the documentation.
For this purpose you may use a Python library:
httplib
urllib2
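For instance, here is a rough sketch (Python 2 era, to match gdata and urllib2) that assumes each gdata PhotoEntry exposes its media URL via content.src; the album ID and credentials are placeholders:

import urllib2
import gdata.photos.service

client = gdata.photos.service.PhotosService()
client.ClientLogin("user@gmail.com", "password")  # placeholder credentials

photos = client.GetFeed("/data/feed/api/user/default/albumid/ALBUM_ID?kind=photo")
for photo in photos.entry:
    # Plain HTTP GET against the photo's media URL, then write the JPEG bytes to disk.
    data = urllib2.urlopen(photo.content.src).read()
    with open(photo.title.text, "wb") as f:
        f.write(data)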
