I'm trying to upload a file to my Azure File storage service using the following C++ code (referenced from the official documentation here):
const utility::string_t storage_connection_string(U("DefaultEndpointsProtocol=https;AccountName=myaccount;"
    "AccountKey=mykey"));

// Retrieve the storage account from the connection string.
azure::storage::cloud_storage_account storage_account =
    azure::storage::cloud_storage_account::parse(storage_connection_string);

// Create the Azure Files client.
azure::storage::cloud_file_client file_client =
    storage_account.create_cloud_file_client();

// Get a reference to the share.
azure::storage::cloud_file_share share =
    file_client.get_share_reference(_XPLATSTR("myfileservice"));

// Get a reference to the root directory of the share.
azure::storage::cloud_file_directory root_dir = share.get_root_directory_reference();

// Upload a file from a local file.
azure::storage::cloud_file file =
    root_dir.get_file_reference(_XPLATSTR("my-sample-file"));
file.upload_from_file(_XPLATSTR("test.pdf"));
This produces the following error:
terminate called after throwing an instance of 'azure::storage::storage_exception'
what(): Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
I've searched online and found suggestions that this is related to authenticating with a shared access signature (SAS). I tried many example snippets like the one below, but still cannot get it to work.
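Roughly, the SAS-based snippets I tried look like the following sketch (using the legacy azure.storage.file SDK; the account name, key, share, and file names are placeholders):

from datetime import datetime, timedelta
from azure.storage.file import FileService, SharePermissions

# Placeholders: the real account name, key, and share name go here.
file_service = FileService(account_name='myaccount', account_key='mykey')

# Generate a SAS token with write access to the share, valid for one hour.
sas_token = file_service.generate_share_shared_access_signature(
    'myfileservice',
    permission=SharePermissions(write=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

# A client built from the SAS token can upload without the account key.
sas_service = FileService(account_name='myaccount', sas_token=sas_token)
sas_service.create_file_from_path('myfileservice', None, 'my-sample-file', 'test.pdf')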
However, I can perform a similar file upload with the following Python code, without SAS authentication and without any trouble:
import os
from azure.storage.file import FileService, ContentSettings

currentDir = os.path.dirname(os.path.abspath(__file__))

# credentials for the Azure account
file_service = FileService(account_name='myaccount', account_key='mykey')

# path for the Azure file share
global pathStr
pathStr = 'https://myaccount.file.core.windows.net/myfileservice/'

# function for uploading a file to the Azure share
def upload(myfile):
    file_service.create_file_from_path(
        'myfileservice',
        'files',
        test + '.pdf',
        myfile,
        content_settings=ContentSettings(content_type='text/pdf')
    )
    print('finished uploading file.')
What could be the solution for this?
Related
I am using Python and an Azure Function App to send a document to be translated with the Google Cloud Translation API.
I am trying to load the credentials from a temporary JSON file using the code below. The idea is to later download the JSON file from blob storage and store it in a temp file, but I am not worrying about blob storage for now.
import json
import os
import tempfile
from google.cloud import translate

key = {cred info}
f = tempfile.NamedTemporaryFile(suffix='.json', mode='a+')
json.dump(key, f)
f.flush()
f.seek(0)
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.name
client = translate.TranslationServiceClient()
But when I run this I get the following error:
Exception: PermissionError: [Errno 13] Permission denied:
How can I correctly load the creds from a temp file? Also, what is the relationship between translate.TranslationServiceClient() and os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.name? Does TranslationServiceClient() get the creds from the environment variable?
I have been looking at this problem for a while now and I cannot find a good solution. Any help would be amazing!
EDIT:
When I change it to
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.read()
I get a different error:
System.Private.CoreLib: Exception while executing function:
Functions.Trigger. System.Private.CoreLib: Result: Failure
Exception: DefaultCredentialsError:
EDIT 2:
It's really weird, but it works when I read the file just beforehand, like so:
contents = f.read()
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.name
client = translate.TranslationServiceClient()
Any ideas why?
Any application that connects to a GCP product requires credentials to authenticate, and there are several ways this authentication can work.
According to the Google documentation:
Additionally, we recommend you use Google Cloud Client Libraries for your application. Google Cloud Client Libraries use a library called Application Default Credentials (ADC) to automatically find your service account credentials. ADC looks for service account credentials in the following order:
If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set, ADC uses the service account key or configuration file that the variable points to.
If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set, ADC uses the service account that is attached to the resource that is running your code.
This service account might be a default service account provided by Compute Engine, Google Kubernetes Engine, App Engine, Cloud Run, or Cloud Functions. It might also be a user-managed service account that you created.
If ADC can't use any of the above credentials, an error occurs.
There are also helpers provided by Google that can be used to pass the credentials directly.
If you already have the JSON value as a dictionary, you can simply pass the dictionary to from_service_account_info(key).
Example:
key = json.load(open("JSON File Path"))  # load the JSON key file into a dictionary
client = translate.TranslationServiceClient.from_service_account_info(key)
In your case you already have the key as a dictionary.
As for the error you are getting, I believe it has to do with the temp file, because GOOGLE_APPLICATION_CREDENTIALS needs a JSON file path that can actually be opened and read.
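If you want to keep the temp-file approach, a minimal sketch that usually avoids the permission problem is to write the key with delete=False and close the file before pointing ADC at it (the key content below is a placeholder):

import json
import os
import tempfile

from google.cloud import translate

key = {"type": "service_account"}  # placeholder: put the real service-account dict here

# Write the key to a named temp file and close it so the credentials loader can
# reopen it; delete=False keeps it on disk until we remove it ourselves.
tmp = tempfile.NamedTemporaryFile(suffix='.json', mode='w', delete=False)
json.dump(key, tmp)
tmp.close()

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = tmp.name
client = translate.TranslationServiceClient()

os.remove(tmp.name)  # clean up when the credentials file is no longer needed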
Having a problem uploading a file to an Azure Blob Storage container, using azure.storage.blob for Python 2.7. (I know I should use a newer Python, but it's part of a big ROS application, hence not so easy to upgrade it all.)
from azure.storage.blob import BlobServiceClient
...
container_name = "operationinput"
self.back_up_root = "~/backup/sql/lp/"
self.back_up_root = os.path.expanduser(self.back_up_root)
file = 'test.sql'

try:
    client = BlobServiceClient.from_connection_string(conn_str=connection_string)
    blob = client.get_blob_client(container='container_name', blob='datafile')
except Exception as err:
    print(str(err))

with open(self.back_up_root + file, "rb") as data:
    blob.upload_blob(data)
I get the following error:
azure.core.exceptions.HttpResponseError: The specifed resource name contains invalid characters.
RequestId:3fcb6c26-101e-007e-596d-1c7d61000000
Time:2022-02-07T21:58:17.1308670Z
ErrorCode:InvalidResourceName
All the posts I have found refer to people using capital letters or similar, but I have:
operationinput
datafile
Both should be within specification.
Any ideas?
We tried the sample code below to upload files to an Azure Blob Storage container using a SAS token, and were able to do it successfully.
Code sample:
from azure.storage.blob import BlobClient

upload_file_path = "C:\\Users\\Desktop\\filename"
sas_url = "https://<account>.blob.core.windows.net/<container>/<blob>?<sas-token>"  # placeholder SAS URL

client = BlobClient.from_blob_url(sas_url)
with open(upload_file_path, 'rb') as data:
    client.upload_blob(data)
print("**file uploaded**")
To generate the SAS URL and connection string, we selected the required options in the Azure portal (screenshot omitted).
For more information please refer to this Microsoft documentation: Allow or disallow public read access for a storage account
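If you prefer to create the SAS in code rather than in the portal, a rough sketch with azure-storage-blob v12 looks like this (the account, key, container, and blob names are placeholders):

from datetime import datetime, timedelta
from azure.storage.blob import BlobClient, BlobSasPermissions, generate_blob_sas

# Placeholders for the storage account, key, container, and blob name.
sas_token = generate_blob_sas(
    account_name="xxx",
    container_name="mycontainer",
    blob_name="filename",
    account_key="myaccountkey",
    permission=BlobSasPermissions(read=True, create=True, write=True),
    expiry=datetime.utcnow() + timedelta(hours=1),
)

sas_url = "https://xxx.blob.core.windows.net/mycontainer/filename?" + sas_token
client = BlobClient.from_blob_url(sas_url)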
I'm trying to upload a file into Azure Blob Storage. My application is hosted on an Azure App Service Linux server.
When an upload is requested from a remote machine, I want the file to be uploaded from the given path.
I have three request parameters which take their values from the GET request:
https://testApp.azurewebsites.net/blobs/fileUpload/
containerName:test
fileName:testFile.txt
filePath:C:\Users\testUser\Documents
@app.route("/blobs/fileUpload/")
def fileUpload():
    container_name = request.form.get("containerName")
    print(container_name)
    local_file_name = request.form.get("fileName")
    print(local_file_name)
    local_path = request.form.get('filePath')
    ntpath.normpath(local_path)
    print(local_path)
    full_path_to_file = ntpath.join(local_path, local_file_name)
    print(full_path_to_file)
    # Upload the created file, use local_file_name for the blob name
    block_blob_service.create_blob_from_path(container_name,
                                             local_file_name, full_path_to_file)
    return jsonify({'status': 'fileUploaded'})
local_path = request.form.get('filePath') returns the value C:\Users\testUser\Documents\ from the request,
because of which I get this error:
OSError: [Errno 22] Invalid argument: 'C:\Users\testUser\Documents\testFile.txt'
All I want is to get the same path that I send in the request. Since the application is hosted on a Linux machine, it treats the path as a Unix-style path if I use os.path.
Please help me with this.
As the error message says, the local path 'C:\Users\testUser\Documents\testFile.txt' is invalid: there is no such file path on the machine where the app runs.
If you want to use the create_blob_from_path method, you should first download the file to the app's local file system, then use that method to upload it to blob storage.
Alternatively, you can get the stream / text of the file from the remote machine and use create_blob_from_stream / create_blob_from_text respectively, as in the sketch below.
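A rough sketch of the stream approach, assuming the legacy BlockBlobService client and that the remote machine POSTs the file content itself in a multipart form field named file (the account credentials are placeholders):

from flask import Flask, request, jsonify
from azure.storage.blob import BlockBlobService

app = Flask(__name__)
# Placeholders for the storage account credentials.
block_blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

@app.route("/blobs/fileUpload/", methods=["POST"])
def fileUpload():
    container_name = request.form.get("containerName")
    blob_name = request.form.get("fileName")
    # The client sends the file content itself instead of a local Windows path,
    # so the Linux-hosted app never has to read the remote file system.
    uploaded = request.files["file"]
    block_blob_service.create_blob_from_stream(container_name, blob_name, uploaded.stream)
    return jsonify({'status': 'fileUploaded'})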
blob.upload_from_filename(source) gives the error:
raise exceptions.from_http_status(response.status_code, message, response=response)
google.api_core.exceptions.Forbidden: 403 POST https://www.googleapis.com/upload/storage/v1/b/bucket1-newsdata-bluetechsoft/o?uploadType=multipart: ('Request failed with status code', 403, 'Expected one of', )
I am following the Google Cloud example written in Python here.
from google.cloud import storage

def upload_blob(bucket, source, des):
    client = storage.Client.from_service_account_json('/path')
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket)
    blob = bucket.blob(des)
    blob.upload_from_filename(source)
I used gsutil to upload files, which works fine.
I also tried listing the bucket names using a Python script, which works fine as well.
I have the necessary permissions and GOOGLE_APPLICATION_CREDENTIALS set.
The whole thing wasn't working because the service account I was using in GCP didn't have the Storage Admin permission.
Granting Storage Admin to my service account solved the problem.
As other answers have indicated, this is a permissions issue. I have found the following command to be a useful way to create default application credentials for the currently logged-in user.
Assuming you got this error while running the code on some machine, the following steps should be sufficient:
SSH to the VM where the code is running or will be running. Make sure you are logged in as a user who has permission to upload to Google Cloud Storage.
Run the following command:
gcloud auth application-default login
The command will ask you to create a token by clicking on a URL. Generate the token and paste it into the SSH console.
That's it. Every Python application started as that user will use this as the default credential for storage bucket interaction.
Happy GCP'ing :)
This question is more appropriate for a support case.
As you are getting a 403, most likely you are missing a permission in IAM; the Google Cloud Platform support team will be able to inspect your resources and configuration.
This is what worked for me when the Google documentation didn't. I was getting the same error even with the appropriate permissions.
import pathlib
import google.cloud.storage as gcs

client = gcs.Client()

# set target file to write to
target = pathlib.Path("local_file.txt")
# set file to download
FULL_FILE_PATH = "gs://bucket_name/folder_name/file_name.txt"

# open file stream with write permissions
with target.open(mode="wb") as downloaded_file:
    # download and write the file locally
    client.download_blob_to_file(FULL_FILE_PATH, downloaded_file)
I am trying to get a blob from Azure Blob Storage and return the file to the user for download.
What I am doing now is getting the file from Azure, saving it locally, and returning it using static_file:
def getDownload(filename):
    try:
        file = blob.get_blob('picture', filename)
        with open(filename, 'w') as f:
            f.write(file)
    except:
        abort(400, 'Download blob fail')
    return static_file(filename, root='.', download=filename)
What I want is to stream it to the user without first saving the file on the server.
How can I achieve this?
I don't have ready-made Python sample code, but here is what you can do for verification purposes (a rough sketch follows the example below):
1. Make sure the container is not publicly accessible.
2. The client sends a request for a given file inside the blob container to your web application.
3. Verify that this client is allowed to access that blob (return a 401 error if not).
4. Create a shared access signature for that blob for a short timeframe (approx. 5 minutes should do).
5. Return a 303 (See Other) HTTP status code to the client with the URL of the blob in the Location header.
Example:
The client requests http://myservice.cloudapp.net/blobs/somefile.ext and is verified to access the resource. They are then redirected to http://mystorage.blob.core.windows.net/container/somefile.ext?SHARED_ACCESS_SIGNATURE. This link is only valid for that client for a few minutes.
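Roughly, steps 4 and 5 could be sketched like this, assuming the legacy azure.storage.blob SDK (BlockBlobService) and the Bottle framework from the question; the account credentials and the 'picture' container are placeholders:

from datetime import datetime, timedelta
from azure.storage.blob import BlockBlobService, BlobPermissions
from bottle import redirect

# Placeholders for the storage account credentials.
blob_service = BlockBlobService(account_name='myaccount', account_key='mykey')

def getDownload(filename):
    # Short-lived, read-only SAS for this one blob (about 5 minutes).
    sas_token = blob_service.generate_blob_shared_access_signature(
        'picture', filename,
        permission=BlobPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(minutes=5),
    )
    url = blob_service.make_blob_url('picture', filename, sas_token=sas_token)
    # A 303 redirect sends the client straight to Blob Storage; nothing is written to the server.
    redirect(url, code=303)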