How do I access files hosted on the web via Python?

I am working on a system (a Python program) that runs on a local machine but needs to fetch data hosted somewhere on the web (images, in my case).
What it does is:
Send a SQL query to the web host (localhost currently).
The response sends back the names of the images (let's assume they are stored in an array called fetchedImages).
Once I have the names of all the required images, all I want to do is access the files directly from localhost and copy them to the local machine. But this is my problem:
I am trying to access it as:
source = "localhost/my-site/images"
localDir = "../images"
for image in fetchedImages:
copy(source+image,localDir)
but the problem is that localhost is served by XAMPP, and I cannot access it because Python doesn't accept it as a path. How can I access localhost if it is served by XAMPP rather than SimpleHTTPServer?

It can be solved using requests:
import requests as req
from io import BytesIO  # image content is binary, so use BytesIO (Python 3)
from PIL import Image

source = "http://localhost/my-site/images/"
localDir = "../images/"

for image in fetchedImages:
    remoteImage = req.get(source + image)                 # fetch the image over HTTP
    imgToCopy = Image.open(BytesIO(remoteImage.content))  # decode the downloaded bytes
    imgToCopy.save(localDir + image)                      # write a local copy
Using requests accesses the resource over HTTP, so the system works with dynamic paths (localhost/my-site or www.my-site.com) and copies those resources to the local machine for processing.
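If the images only need to be copied rather than re-encoded through PIL, a minimal sketch (assuming the same source, localDir, and fetchedImages as above) can stream the raw bytes straight to disk:
import os
import requests

source = "http://localhost/my-site/images/"
localDir = "../images"

for image in fetchedImages:
    response = requests.get(source + image, stream=True)
    response.raise_for_status()                      # fail early on HTTP errors
    with open(os.path.join(localDir, image), "wb") as f:
        for chunk in response.iter_content(chunk_size=8192):
            f.write(chunk)                           # copy bytes without decoding them
This avoids decoding and re-saving each image, so JPEGs are copied byte-for-byte instead of being recompressed.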

Related

How to download a file in Python or Linux from Google Cloud storage?

Given a public download URL belonging to a download button, such as
url = "https://storage.cloud.google.com/gresearch/maxim/ckpt/Enhancement/FiveK/checkpoint.npz"
is it possible to download and save the file using curl, Python, or wget?
I tried using:
!pip install google-cloud-storage
!pip install --upgrade google-cloud-storage

from google.cloud import storage
import os

# Instantiate a GCS client
storage_client = storage.Client()
bucket_name = 'gresearch'
folder = '/maxim/ckpt/Enhancement/FiveK/'
delimiter = '/'
file = 'checkpoint.npz'
# Retrieve all blobs with a prefix matching the file.
bucket = storage_client.get_bucket(bucket_name)
# List and iterate over blobs in the folder
blobs = bucket.list_blobs(prefix=file, delimiter=delimiter)  # Excluding folders inside the bucket
for blob in blobs:
    print(blob.name)
    destination_uri = '{}/{}'.format(folder, blob.name)
    blob.download_to_filename(destination_uri)
but I keep getting lots of errors, one after another. Is there another way?
Your code (as is) won't work outside a Google Cloud environment because you need some form of authentication to instantiate your Cloud Storage Client.
If you look at the documentation for Google Cloud Storage, credentials is an optional parameter that you pass when instantiating the client, and the documentation has this to say about 'credentials':
credentials (Credentials) – (Optional) The OAuth2 Credentials to use for this client. If not passed (and if no _http object is passed), falls back to the default inferred from the environment.
Going back to your original question: if you find a way to pass credentials to your curl or wget call, or to plain Python, then you should be able to do what you want. One possible way is to pass credentials via an environment variable (see documentation). Also, take a look at this link.
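For illustration, a minimal sketch that authenticates through the standard GOOGLE_APPLICATION_CREDENTIALS environment variable and downloads the object from the question (the key-file path is hypothetical):
import os
from google.cloud import storage

# Point the client at a service-account key before instantiating it.
# The path below is a placeholder; use wherever your key file actually lives.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"

client = storage.Client()
bucket = client.bucket("gresearch")
blob = bucket.blob("maxim/ckpt/Enhancement/FiveK/checkpoint.npz")
blob.download_to_filename("checkpoint.npz")   # saved next to the script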

Editing config file inside Docker image on client site

I have created and pushed a Docker image to Docker Hub, and I am pulling the image on the client machines. However, there are config files inside the image that are client-site specific (they change from site to site) - for example, the addresses of the RTSP cameras at each site. How would I edit these files on each client site? Do I need to manually vim each image on each client site, or is there a simpler way?
Or is the solution to extract these config files entirely from the image, copy them separately to the client site, and somehow change the code to reach these files outside the image?
Thanks
It is better to keep your image on Docker Hub as a base image without any dynamic config in it (or have it simply ignore the config).
On the client side, build a local image from that base image and replace the config via COPY, or mount the config as a volume (a sketch of the volume approach follows below).
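As a sketch of the volume approach with the Docker SDK for Python (the image name and config paths below are hypothetical):
import docker

client = docker.from_env()

# Keep the image generic and mount the site-specific config from the host,
# so each client site only edits its own local file.
client.containers.run(
    "myrepo/camera-app:latest",
    volumes={
        "/etc/camera-app/cameras.conf": {      # host path, edited per site
            "bind": "/app/config/cameras.conf",
            "mode": "ro",
        }
    },
    detach=True,
)
The same mount can be expressed with docker run -v on the command line; either way the image itself never has to change per site.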
Or, as #Klaus D. commented

Is there a way to obtain the date when the docker image was created using docker API for python

I would like to obtain the created at date for a docker image using docker API. Is this possible?
I don't see it in the docs https://docker-py.readthedocs.io/en/stable/images.html.
I see a way to obtain it using requests but was hoping to use docker API as my code uses docker API to grab other information such as Registry ID.
import docker

cli = docker.from_env()
cl_image = cli.images.get_registry_data(reg_url, auth_config=None)
image_hash = cl_image.id
The image creation timestamp is on the Image object, under the attrs property:
from dateutil.parser import isoparse
import docker
client = docker.from_env()
images = client.images.list()
img = images[0]
created_str = img.attrs["Created"]
created_datetime = isoparse(created_str)
print(created_datetime)
No, getting the creation date is not supported by the Docker SDK for Python. The create attribute simply doesn't exist, so you will not be able to get that value. You will have to use the requests module to fetch the data from the Docker API.
Note: the library you import should not be referred to as an API; it is simply a library supporting the Docker Engine API. The real API is here, which you'll use to make a GET request.
Edit:
I am not sure you are doing your authentication correctly. You need to provide the credential values as JSON, base64url-encode them, and pass them in the X-Registry-Auth header of your request call; see this. This example illustrates what you have to do, albeit in the context of a cURL POST request.
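As a rough sketch of that approach, assuming the Docker Engine API is reachable over TCP on localhost:2375 (many hosts expose only the unix socket, so adjust accordingly):
import requests

# Inspect an image through the Engine API; the "Created" field holds the
# creation timestamp as an ISO 8601 string.
resp = requests.get("http://localhost:2375/images/ubuntu:latest/json")
resp.raise_for_status()
print(resp.json()["Created"])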

Hosting a Flask-Angular app for multiple users

I have an Angular app that runs on a Python-Flask server on port 5000. Right now the app works well on localhost, but now I want the app to be accessible to multiple users. It seems like I will have to create sessions (each user will generate some temp data, and I'd like this data to be stored in a directory whose name is the session id). How do I proceed with this? Also, how can I test this multi-user functionality on my local machine, since Flask only listens on one port?
Use Gunicorn (gunicorn.org) to run your app; it runs multiple worker processes to handle requests from multiple users concurrently.
You can store the temp data on the client side in local or session storage and send the required data with each request (do not store sensitive data such as passwords on the client side).
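A minimal sketch of the per-session directory idea in Flask (the secret key and BASE_DIR are placeholders):
import os
import uuid
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "replace-with-a-random-secret"   # required for signed session cookies

BASE_DIR = "../tmp-data"                          # hypothetical location for per-user data

@app.before_request
def ensure_user_dir():
    # Give each browser session its own id and scratch directory.
    if "uid" not in session:
        session["uid"] = uuid.uuid4().hex
    os.makedirs(os.path.join(BASE_DIR, session["uid"]), exist_ok=True)
Running it as gunicorn -w 4 app:app then serves several users at once, and locally you can still hit the single port from multiple browser sessions to test it.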

Post form-data with remote url instead of local file

Scenario: Website x has a form in which a file can be uploaded from your local machine.
Question: Is it possible to pass in a remote url (e.g. http://blah.whatever/file.ext) instead of a local file, thus causing the remote server to download the file found at http://blah.whatever/file.ext instead of having to download the file to my local machine and then upload it?
I'm most familiar with Python. So, if this is possible through something like the requests module, that would be fantastic.
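Whether the remote server will fetch a URL for you depends entirely on that site's API; most plain HTML upload forms only accept file contents. What requests can do is stream the remote file straight into the multipart upload so it never has to be saved to your disk. A rough sketch (the upload endpoint and form field name are hypothetical):
import requests

remote_url = "http://blah.whatever/file.ext"          # file to forward
upload_url = "https://website-x.example/upload"       # hypothetical form endpoint

# Stream the remote file and hand the response body to requests as the
# multipart "file" field, so nothing is written to local disk.
remote = requests.get(remote_url, stream=True)
remote.raise_for_status()

files = {"file": ("file.ext", remote.raw, "application/octet-stream")}
response = requests.post(upload_url, files=files)
print(response.status_code)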
