I've been using the Azure ML Python SDK to create pipelines for weeks, but all of a sudden I started getting this error when trying to get the default datastore:
ws = Workspace.from_config()
def_blob_store = ws.get_default_datastore()
Traceback (most recent call last):
  File "lstm_evaluate_pipeline.py", line 14, in <module>
    def_blob_store = ws.get_default_datastore()
  File "/opt/anaconda3/envs/azure_ml/lib/python3.8/site-packages/azureml/core/workspace.py", line 1154, in get_default_datastore
    return _DatastoreClient.get_default(self)
  File "/opt/anaconda3/envs/azure_ml/lib/python3.8/site-packages/azureml/data/datastore_client.py", line 699, in get_default
    return _DatastoreClient._get_default(workspace)
  File "/opt/anaconda3/envs/azure_ml/lib/python3.8/site-packages/azureml/data/_exception_handler.py", line 19, in decorated
    raise UserErrorException(str(e))
azureml.exceptions._azureml_exception.UserErrorException: UserErrorException:
    Message: (UserError) Unable to get MSI token using identity secret. The application associated with this managed identity
    InnerException None
    ErrorResponse
{
    "error": {
        "code": "UserError",
        "message": "(UserError) Unable to get MSI token using identity secret. The application associated with this managed identity"
    }
}
How can I fix this? I'm running this on macOS Monterey in a conda environment with Python 3.8. The SDK version is 1.42.0.
Since the error indicates an issue with the blob storage datastore, check the following documentation on how to configure the datastore.
Source Document: Link Link2
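The MSI-token error suggests the SDK is falling back to a managed-identity credential that doesn't exist on a local Mac. One hedged workaround (a sketch, not an official fix) is to force interactive browser authentication instead; the tenant ID is a placeholder you would replace with your own:

```python
def get_default_datastore(tenant_id=None):
    """Fetch the workspace's default datastore, forcing interactive
    browser login instead of the managed-identity (MSI) fallback."""
    from azureml.core import Workspace
    from azureml.core.authentication import InteractiveLoginAuthentication

    # tenant_id is a placeholder -- pass your Azure AD tenant ID if your
    # account spans several tenants; with None the SDK picks a default.
    auth = InteractiveLoginAuthentication(tenant_id=tenant_id) if tenant_id else None
    ws = Workspace.from_config(auth=auth)
    return ws.get_default_datastore()
```

Clearing any stale cached identity first (`az logout` followed by `az login`) is also worth trying before re-running the pipeline script.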
I'm trying to authenticate to SharePoint Online, using SharePy v2.0, PyCharm Community Edition, and Python 3.9.
When I run sharepy.connect('siteurl') from within PyCharm, SharePy freezes after I input my username in the run dialog box.
If I add the username parameter and run it, nothing happens; I'm never prompted for a password.
If I use the console and enter sharepy.connect('siteurl') followed by my username and password (the same happens when passing those as parameters), I get this error:
Traceback (most recent call last):
File "C:\Users\Andrew\AppData\Local\Programs\Python\Python39\lib\site-packages\sharepy\auth\adfs.py", line 75, in _get_token
token = root.find('.//wsse:BinarySecurityToken', ns).text
AttributeError: 'NoneType' object has no attribute 'text'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Users\Andrew\AppData\Local\Programs\Python\Python39\lib\code.py", line 90, in runcode
exec(code, self.locals)
File "<input>", line 1, in <module>
File "C:\Users\Andrew\AppData\Local\Programs\Python\Python39\lib\site-packages\sharepy\session.py", line 15, in connect
return SharePointSession(site, auth=autoauth)
File "C:\Users\Andrew\AppData\Local\Programs\Python\Python39\lib\site-packages\sharepy\session.py", line 61, in __init__
self.auth.login(self.site)
File "C:\Users\Andrew\AppData\Local\Programs\Python\Python39\lib\site-packages\sharepy\auth\adfs.py", line 27, in login
self._get_token()
File "C:\Users\Andrew\AppData\Local\Programs\Python\Python39\lib\site-packages\sharepy\auth\adfs.py", line 77, in _get_token
raise errors.AuthError('Token request failed. Invalid server response')
sharepy.errors.AuthError: Token request failed. Invalid server response
It should be noted that I'm getting O365 from GoDaddy and the login page is federated (I think that's the correct term). According to the new release of SharePy, this shouldn't matter.
Has anyone else had this freezing problem happen for them?
How would I authenticate with sharepoint using sharepy given my current situation?
The source of this problem ended up being GoDaddy: since we were federated with GoDaddy as the O365 provider, there was no way to authenticate correctly using SharePy.
The ultimate solution was to defederate away from GoDaddy (pretty easy to do thanks to this guide: Defederation Guide).
We were unable to authenticate because our provider redirects the login to their own login site, and unfortunately SharePy's built-in auth method doesn't work with GoDaddy's redirect.
I tested this theory on a fresh tenant before migrating away from GoDaddy. I also found that when MFA is enabled, the username/password method of authentication doesn't work.
NOTE: When new tenants are created, they use a blanket security policy that forces MFA, even though MFA is shown as disabled in the Azure AD > Users section. To turn this off you must disable "Security Defaults": portal.azure.com > Azure Active Directory > Properties > "Manage security defaults" (a small hyperlink at the bottom of the screen).
A note on MFA and authentication with SharePy: there are methods to leave MFA enabled that work with other SharePoint/Python tooling. I haven't tested them with SharePy yet, but I will be turning MFA back on and using one of the following:
App Password
Sharepoint API client secret
Azure App Registration (Azure App Reg)
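I haven't verified the App Registration route with SharePy itself, but as a hedged sketch of what it looks like outside SharePy, the MSAL library's client-credentials flow can fetch an app-only token (client_id, client_secret, and tenant_id are placeholders you get from the App Registration blade):

```python
def acquire_app_token(client_id, client_secret, tenant_id,
                      scope="https://graph.microsoft.com/.default"):
    """Acquire an app-only token via the OAuth2 client-credentials flow.

    All three IDs are placeholders from an Azure App Registration. The
    returned token can be sent as a Bearer header to SharePoint/Graph
    REST endpoints, bypassing the username/password flow that MFA and
    federated logins break.
    """
    import msal  # pip install msal

    authority = "https://login.microsoftonline.com/" + tenant_id
    app = msal.ConfidentialClientApplication(
        client_id, authority=authority, client_credential=client_secret)
    result = app.acquire_token_for_client(scopes=[scope])
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "token request failed"))
    return result["access_token"]
```

The design point is that client-credentials auth never touches the federated login page at all, which is exactly what made the GoDaddy setup fail.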
I have been trying to work with polyglot and build a simple Python processor. I followed the polyglot recipe but could not get the stream to deploy. I originally deployed the same processor used in the example and got the following errors:
Unknown command line arg requested: spring.cloud.stream.bindings.input.destination
Unknown environment variable requested: SPRING_CLOUD_STREAM_KAFKA_BINDER_BROKERS
Traceback (most recent call last):
  File "/processor/python_processor.py", line 10, in <module>
    consumer = KafkaConsumer(get_input_channel(), bootstrap_servers=[get_kafka_binder_brokers()])
  File "/usr/local/lib/python2.7/dist-packages/kafka/consumer/group.py", line 353, in __init__
    self._client = KafkaClient(metrics=self._metrics, **self.config)
  File "/usr/local/lib/python2.7/dist-packages/kafka/client_async.py", line 203, in __init__
    self.cluster = ClusterMetadata(**self.config)
  File "/usr/local/lib/python2.7/dist-packages/kafka/cluster.py", line 67, in __init__
    self._bootstrap_brokers = self._generate_bootstrap_brokers()
  File "/usr/local/lib/python2.7/dist-packages/kafka/cluster.py", line 71, in _generate_bootstrap_brokers
    bootstrap_hosts = collect_hosts(self.config['bootstrap_servers'])
  File "/usr/local/lib/python2.7/dist-packages/kafka/conn.py", line 1336, in collect_hosts
    host, port, afi = get_ip_port_afi(host_port)
  File "/usr/local/lib/python2.7/dist-packages/kafka/conn.py", line 1289, in get_ip_port_afi
    host_and_port_str = host_and_port_str.strip()
AttributeError: 'NoneType' object has no attribute 'strip'
Exception AttributeError: "'KafkaClient' object has no attribute '_closed'" in <bound method KafkaClient.__del__ of <kafka.client_async.KafkaClient object at 0x7f8b7024cf10>> ignored
I then attempted to pass the environment and binding arguments through the stream deployment, but that did not work. When I manually hard-coded the SPRING_CLOUD_STREAM_KAFKA_BINDER_BROKERS and spring.cloud.stream.bindings.input.destination values into Kafka's consumer, I was able to deploy the stream as a workaround. I'm not entirely sure what is causing the issue. Would deploying this on Kubernetes be any different, or is this an issue with Polyglot and Data Flow? Any help would be appreciated.
Steps to reproduce:
Attempt to deploy polyglot-processor stream from polyglot recipe on local dataflow server. I am also using the same stream definition as in the example: http --server.port=32123 | python-processor --reversestring=true | log.
Additional context:
I am attempting to deploy the stream on a local installation of Spring Cloud Data Flow and Kafka, since I had some issues deploying custom Python applications with Docker.
The recipe you posted expects the SPRING_CLOUD_STREAM_KAFKA_BINDER_BROKERS environment variable to be present as part of the server configuration (since streams are managed via the Skipper server, you need to set this environment variable in your Skipper server configuration).
You can check this documentation on how to set SPRING_CLOUD_STREAM_KAFKA_BINDER_BROKERS as an environment property in the Skipper server deployment.
You can also pass this property as a deployer property when deploying the python-processor stream app. You can refer to this documentation on how to pass a deployment property that sets Spring Cloud Stream properties (here the binder configuration property) at stream deployment time.
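The manual workaround from the question can also be made defensive inside the processor itself: read the broker list from the environment with an explicit fallback, so a missing Skipper-side variable falls back to a local broker instead of handing None to KafkaConsumer. A sketch, where the default addresses are assumptions for local testing:

```python
import os

def get_kafka_binder_brokers(default="localhost:9092"):
    """Return the Kafka broker list the binder should use.

    Prefers the SPRING_CLOUD_STREAM_KAFKA_BINDER_BROKERS variable that
    Data Flow / Skipper is expected to inject; falls back to a local
    broker so the consumer never receives None.
    """
    return os.environ.get("SPRING_CLOUD_STREAM_KAFKA_BINDER_BROKERS", default)

def get_input_channel(default="python-processor-input"):
    """Same pattern for the input destination binding property.

    The env name below is the Spring relaxed-binding form of
    spring.cloud.stream.bindings.input.destination.
    """
    return os.environ.get(
        "SPRING_CLOUD_STREAM_BINDINGS_INPUT_DESTINATION", default)
```

With this in place the python_processor.py from the recipe deploys even when the server configuration is missing, while still honoring the injected values when they are present.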
I'm using this guide:
https://learn.microsoft.com/en-us/azure/cloud-services/cloud-services-python-how-to-use-service-management.
I'm doing exactly what the guide says, and I keep getting an error message.
>>> from azure import *
>>> from azure.servicemanagement import *
>>> subscription_id = '************************'
>>> import os
>>> os.path.isfile(r'c:\key\mycert.pem')
True
>>> certificate_path = r'c:\key\mycert.pem'
>>> sms = ServiceManagementService(subscription_id, certificate_path)
>>> result = sms.list_locations()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\python27\lib\site-packages\azure\servicemanagement\servicemanagementservice.py", line 1131, in list_locations
Locations)
File "C:\python27\lib\site-packages\azure\servicemanagement\servicemanagementclient.py", line 365, in _perform_get
response = self.perform_get(path, x_ms_version)
File "C:\python27\lib\site-packages\azure\servicemanagement\servicemanagementclient.py", line 175, in perform_get
response = self._perform_request(request)
File "C:\python27\lib\site-packages\azure\servicemanagement\servicemanagementclient.py", line 339, in _perform_request
return _management_error_handler(ex)
File "C:\python27\lib\site-packages\azure\servicemanagement\servicemanagementclient.py", line 419, in _management_error_handler
return _general_error_handler(http_error)
File "C:\python27\lib\site-packages\azure\servicemanagement\_common_error.py", line 34, in _general_error_handler
raise AzureHttpError(message, http_error.status)
azure.common.AzureHttpError: Forbidden
<Error xmlns="http://schemas.microsoft.com/windowsazure" xmlns:i="http://www.w3.org/2001/XMLSchema-instance"><Code>ForbiddenError</Code><Message>The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.</Message></Error>
I've uploaded the mycert.cer to a cloud service in my azure portal.
Does someone have an idea what the problem is?
I'm sure that the .pem and .cer files are OK.
According to your error information and the official document Service Management Status and Error Codes, the server failed to authenticate the request.
In my experience, there are two reasons that can cause this issue.
1. The certificate you are currently using is invalid on Azure. Besides regenerating a new certificate per the official document, you can also follow the Azure SDK for Python document about using the Azure .PublishSettings certificate to create the client certificate:
Using the Azure .PublishSettings certificate
You can download your Azure publish settings file and use the certificate that is embedded in that file to create the client certificate. The server certificate already exists, so you won't need to upload one.
2. The server certificate file was not uploaded into the Azure Management portal settings.
For reference, there is a blog written by @GauravMantri which is very helpful; it introduces the steps in detail, although it is for Java.
Hope it helps.
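For the first cause, a management certificate pair can be regenerated with OpenSSL. A sketch of the usual two commands (the subject name is arbitrary): the .pem stays on the client and is the file you pass to ServiceManagementService, while the .cer is what gets uploaded to the portal's management certificates settings:

```shell
# Self-signed client certificate: key + cert together in one .pem, no passphrase
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
    -keyout mycert.pem -out mycert.pem -subj "/CN=azure-mgmt"

# DER-encoded .cer derived from it, for upload to the Azure portal
openssl x509 -inform pem -in mycert.pem -outform der -out mycert.cer
```

After uploading the .cer, make sure it lands under the subscription's management certificates, not under a cloud service's certificates, since the two stores are checked by different APIs.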
I'm calling the translation API to translate from English to French, but I'm getting the error below. I used this code, but it is not working properly.
code:
from __future__ import print_function

__author__ = 'jcgregorio@google.com (Joe Gregorio)'

from googleapiclient.discovery import build


def main():
    service = build('translate', 'v2',
                    developerKey='AIzaSyDRRpR3GS1F1_jKNNM9HCNd2wJQyPG3oN0')
    print(service.translations().list(
        source='en',
        target='fr',
        q='flower'
    ).execute())


if __name__ == '__main__':
    main()
Error
Traceback (most recent call last):
File "trail.py", line 19, in <module>
main()
File "trail.py", line 15, in main
q="flower"
File "build/bdist.linux-i686/egg/oauth2client/_helpers.py", line 133, in positional_wrapper
File "build/bdist.linux-i686/egg/googleapiclient/http.py", line 840, in execute
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://www.googleapis.com/language/translate/v2?q=flower&source=en&alt=json&target=fr&key=AIzaSyDRRpR3GS1F1_jKNNM9HCNd2wJQyPG3oN0 returned "Bad Request">
Use the new Google Cloud client library and authorize your API with a service account credentials JSON by calling export GOOGLE_APPLICATION_CREDENTIALS=your_service.json, instead of using the API key. You can also use application default credentials by running gcloud auth application-default login.
This is demonstrated in the Google Cloud Translate Python samples which should also help you get started more quickly.
Note the translate API requires billing, as described in the translate quickstart instructions, so be prepared to set that up for your project.
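A minimal sketch of what that looks like with the google-cloud-translate package (assuming GOOGLE_APPLICATION_CREDENTIALS is exported as described; the client import is deferred so the helper works with any object exposing the same translate() method):

```python
def translate_text(client, text, source="en", target="fr"):
    """Translate text with a google.cloud.translate_v2 Client (or any
    object exposing the same translate() signature)."""
    result = client.translate(
        text, source_language=source, target_language=target)
    return result["translatedText"]

def make_client():
    # Requires `pip install google-cloud-translate` and the
    # GOOGLE_APPLICATION_CREDENTIALS env var pointing at the
    # service-account JSON -- no API key needed.
    from google.cloud import translate_v2 as translate
    return translate.Client()
```

Calling translate_text(make_client(), "flower") should then return the French translation, with authentication handled entirely by the service account.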
Following the instructions here:
https://developers.google.com/accounts/docs/OAuth2ServiceAccount?hl=en_US
I try running this python script on my google compute engine instance:
import httplib2
from googleapiclient.discovery import build
from oauth2client.gce import AppAssertionCredentials
credentials = AppAssertionCredentials("https://www.googleapis.com/auth/datastore")
http = credentials.authorize(httplib2.Http())
service = build('datastore', 'v1beta2', http=http)
x = service.datasets().lookup(body='', datasetId='surferjeff-easybates').execute(http=http)
But I still get this error:
Traceback (most recent call last):
File "C:/Users/surferjeff-easybates/Desktop/test.py", line 8, in <module>
x = service.datasets().lookup(body='', datasetId='surferjeff-easybates').execute(http=http)
File "C:\Python27\lib\site-packages\oauth2client\util.py", line 135, in positional_wrapper
return wrapped(*args, **kwargs)
File "C:\Python27\lib\site-packages\googleapiclient\http.py", line 723, in execute
raise HttpError(resp, content, uri=self.uri)
HttpError: <HttpError 401 when requesting https://www.googleapis.com/datastore/v1beta2/datasets/surferjeff-easybates/lookup?alt=json returned "Invalid Credentials">
And I have verified that my account has datastore enabled. What am I doing wrong?
The problem is related to a simple Google bug.
When you create a new instance using the Cloud Console and check "Allow API access to all Google Cloud services in the same project", the instance is included in all of the possible cloud scopes.
When you view the REST request that generates this functionality (by pressing the "Equivalent REST or command line" button), you see that the instance is created using one global scope:
"scopes": [
"https://www.googleapis.com/auth/cloud-platform"
]
If you attempt to create this instance using the Google API client (apiclient) with that same single scope, you find that it cannot get valid credentials to use the Datastore service.
The solution resides in the fact that this single scope is apparently not what Google assigns to your instance when you create it through the console.
If you open the instance details after creating it, you find out that it has the following scopes:
"scopes": [
"https://www.googleapis.com/auth/cloud-platform",
"https://www.googleapis.com/auth/datastore",
"https://www.googleapis.com/auth/userinfo.email"
]
Once you mention datastore's scope explicitly, things will work as expected.
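Translated into the question's own snippet, that means passing the full scope list explicitly (a sketch; the v1beta2 service name is taken from the question, and the scope list is the one shown above from the instance details):

```python
# The three scopes the console actually assigns to the instance
DATASTORE_SCOPES = [
    "https://www.googleapis.com/auth/cloud-platform",
    "https://www.googleapis.com/auth/datastore",
    "https://www.googleapis.com/auth/userinfo.email",
]

def build_datastore_service(scopes=tuple(DATASTORE_SCOPES)):
    """Build the Datastore service with the explicit scope list
    instead of relying on the single cloud-platform scope."""
    import httplib2
    from googleapiclient.discovery import build
    from oauth2client.gce import AppAssertionCredentials

    credentials = AppAssertionCredentials(list(scopes))
    http = credentials.authorize(httplib2.Http())
    return build("datastore", "v1beta2", http=http)
```

The only change from the question's code is the credentials line: AppAssertionCredentials receives all three scopes rather than the single datastore URL.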