I am trying to connect to Office 365 from Python so that I can hit the Office 365 API with POST requests to create accounts. I have seen some solutions using Flask and Django, but I would like to implement this without those frameworks, as a pure Python script using only libraries built for this purpose. This is a link to the Flask solution: https://dev.office.com/code-samples-detail/5989
where the API connection is set up:
# Put your consumer key and consumer secret into a config file
# and don't check it into github!!
microsoft = oauth.remote_app(
    'microsoft',
    consumer_key=client_id,
    consumer_secret=client_secret,
    request_token_params={'scope': 'User.Read Mail.Send'},
    base_url='https://graph.microsoft.com/v1.0/',
    request_token_url=None,
    access_token_method='POST',
    access_token_url='https://login.microsoftonline.com/common/oauth2/v2.0/token',
    authorize_url='https://login.microsoftonline.com/common/oauth2/v2.0/authorize'
)
Also, unlike the Flask solution, I won't be using any views; this will run mainly as a backend task.
I want to replicate the API connection above using only Python OAuth libraries so it runs as a standalone script. So far I'm only getting a 403 error when I try curl or requests-oauthlib. Once I can connect to the API, I'll be able to perform CRUD operations on accounts.
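To make the question concrete, this is roughly the kind of standalone script I'm trying to end up with, using only the standard library. The client-credentials grant is my assumption about the right flow for a backend task, and the client ID/secret are placeholders:

```python
import json
import urllib.parse
import urllib.request

TENANT = "common"  # placeholder; client-credentials needs a real tenant ID here

TOKEN_URL = f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token"


def build_token_request(client_id, client_secret):
    """Form fields for the OAuth2 client-credentials grant."""
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
    }


def get_token(client_id, client_secret):
    """POST the form to the token endpoint and return the access token."""
    data = urllib.parse.urlencode(build_token_request(client_id, client_secret)).encode()
    with urllib.request.urlopen(TOKEN_URL, data=data) as resp:
        return json.load(resp)["access_token"]
```

The returned token would then go into an `Authorization: Bearer ...` header on the Graph requests that create the accounts.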
Any advice or guidance would be appreciated. Thanks.
I am trying to use the Healthcare API, specifically the Healthcare Natural Language API, for which there is this tutorial as well as this other one.
The tutorial outlines how to run the API on a string; I've been tasked with figuring out how to use the API on a dataset of medical text data. Python is the GCP option I'm most comfortable with, so I attempted to run the code through Colab.
I used a service account JSON key for authentication, but this isn't best practice, so I had to delete the key, since we are dealing with patient data and everyone on my team is new to GCP.
In order for me to continue exploring running the Healthcare NLP API on a dataset rather than one string, I need to figure out authentication through a different method. In this regard, I have the following questions:
What are the pros/cons of running this through Colab?
Should I look at shifting to running my code within the GCP interface?
Are my choices Colab (and being forced to use a JSON key) vs. working within the GCP shell/terminal (with a plethora of authentication options)? This is what I gather from my research, but I am quite new to using APIs, working with cloud computing, etc.
I've tried to look into related tutorials such as this one, but given their lack of direct relationship to what I am doing (i.e., I can't find one where the API being used is Healthcare) and my lack of familiarity with APIs and GCP, I don't fully understand what is going on. I also keep seeing service accounts mentioned at one point or another, and I am not allowed to use service account keys for the time being.
Instead of using a service account you can use your own credentials, and supply them to your code using the "application default credentials". To set this up, make sure the GOOGLE_APPLICATION_CREDENTIALS environment variable is unset, then run gcloud auth application-default login (docs). After going through the login flow you can either generate an auth token manually, using gcloud auth application-default print-access-token, or you can use the Google client libraries, i.e.
from googleapiclient import discovery
api_version = "v1"
service_name = "healthcare"
# Returns an authorized API client by discovering the Healthcare API
# and using application default credentials.
client = discovery.build(service_name, api_version)
Under the hood this uses the application default credentials; you can also do this with the google-auth Python package if you want.
You can find a summary of all the standard authentication methods here.
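For example, here is a minimal sketch of calling the Healthcare NLP API this way. The project and location are placeholders, and it assumes `gcloud auth application-default login` has been run and `google-api-python-client` is installed:

```python
def nlp_service_name(project, location):
    """Resource name of the NLP service, as the Healthcare API expects it."""
    return f"projects/{project}/locations/{location}/services/nlp"


def analyze_entities(project, location, text):
    """Run analyzeEntities on one document; loop over your dataset to process many."""
    from googleapiclient import discovery  # deferred import: pip install google-api-python-client

    # discovery.build picks up application default credentials automatically
    client = discovery.build("healthcare", "v1")
    request = (
        client.projects()
        .locations()
        .services()
        .nlp()
        .analyzeEntities(
            nlpService=nlp_service_name("my-project", location) if project is None
            else nlp_service_name(project, location),
            body={"documentContent": text},
        )
    )
    return request.execute()
```

For a dataset you would call `analyze_entities` once per record (watching the API's quota limits) and collect the responses.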
I'm looking for a way to automate authentication against AAD with Python.
In the past I used username and password to log in to OneDrive, and it worked properly. Here is the code:
pca = msal.PublicClientApplication(CLIENT_ID, authority=AUTHORITY_URL)
token = pca.acquire_token_by_username_password(USERNAME, PASSWORD, SCOPES)
Now that I have two-factor authentication, I cannot use the same code to access the OneDrive account.
I've searched the internet, but all the solutions I found require opening a browser, and I can't do that because the Python script is a cron job that runs late at night. I need a solution that works in the background without any user action.
Ideally the solution would use the MSAL library, because of the permissions I would have to re-request if I changed libraries.
Thanks for the help!
You might consider using the Client-Credentials Grant flow of OAuth2. You would have to modify your cron job to move away from fetching a token on behalf of a user and instead acquire a token as the application, using the application's identity (the app registration done in AAD). Switching to the client-credentials flow, which is designed for scenarios like yours, helps when you cannot afford user interaction and want the service to run in the background.
For more information on Client-Credentials flow, check here.
Also, you can refer to the following Python samples that implement the client-credentials flow:
Call Microsoft Graph API using App Client Secret
Call Microsoft Graph API using App Client Certificate
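Since you want to stay with MSAL, here is a minimal sketch of the client-credentials flow with `ConfidentialClientApplication`. The client ID/secret and tenant ID are placeholders, and it assumes the app registration has been granted the needed application permissions:

```python
GRAPH_SCOPE = ["https://graph.microsoft.com/.default"]


def build_authority(tenant_id):
    """Token authority for a specific AAD tenant (client-credentials can't use /common)."""
    return f"https://login.microsoftonline.com/{tenant_id}"


def acquire_app_token(client_id, client_secret, tenant_id):
    """Acquire a token as the application itself; no user or browser involved."""
    import msal  # deferred import: pip install msal

    app = msal.ConfidentialClientApplication(
        client_id,
        authority=build_authority(tenant_id),
        client_credential=client_secret,
    )
    result = app.acquire_token_for_client(scopes=GRAPH_SCOPE)
    if "access_token" not in result:
        raise RuntimeError(result.get("error_description", "token request failed"))
    return result["access_token"]
```

Because no user is involved, the cron job only needs the client secret (or a certificate) in its config, and MSAL caches and refreshes the token for you within one process.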
I am using the Google API client in Python3.
All the examples use discovery.build to create a service and then perform actions through it.
So... I don't think uploads, as described here, are possible using just the API client. Is that right?
https://developers.google.com/apis-explorer/#p/discovery/v1/discovery.apis.getRest?api=photoslibrary&version=v1&_h=2&
Is the way folks solve this to make the request themselves rather than relying on the Google API client?
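For context, this is the kind of raw-request approach I mean, with the header names taken from the uploads documentation; the access token and file name are placeholders:

```python
import urllib.request

UPLOAD_URL = "https://photoslibrary.googleapis.com/v1/uploads"


def upload_headers(access_token, file_name):
    """Headers for a raw byte upload to the Photos Library API."""
    return {
        "Authorization": f"Bearer {access_token}",
        "Content-type": "application/octet-stream",
        "X-Goog-Upload-File-Name": file_name,
        "X-Goog-Upload-Protocol": "raw",
    }


def upload_bytes(access_token, file_name, data):
    """POST the raw bytes; the response body is an upload token for mediaItems.batchCreate."""
    req = urllib.request.Request(
        UPLOAD_URL, data=data, headers=upload_headers(access_token, file_name)
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```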
I'm trying to submit a query to Google's BigQuery and retrieve the results, all from a Python script. While there's straightforward documentation on doing so, the only option I've found for querying private tables/collections is to use an authorization code. However, this Python script is driven by a webpage used by people who know nothing about code, so there is no room to get/submit authorization codes: the user simply uses the webpage and the Python script by clicking a few buttons. Is there any way to get the authorization code and submit it behind the scenes, or, better, to query a private table without an authorization code at all? Thanks so much!
You can use a service account:
Client libraries can use Application Default Credentials to easily authenticate with Google APIs and send requests to those APIs. With Application Default Credentials, you can test your application locally and deploy it without changing the underlying code.
https://cloud.google.com/bigquery/authentication#bigquery-authentication-python
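For instance, with a service account key pointed to by GOOGLE_APPLICATION_CREDENTIALS (or any other form of application default credentials), a sketch like the following needs no user authorization code. The project/dataset/table names are placeholders, and it assumes `google-cloud-bigquery` is installed:

```python
def table_ref(project, dataset, table):
    """Fully qualified table name for use in a query (placeholder names)."""
    return f"`{project}.{dataset}.{table}`"


def run_query(sql):
    """Run a query with whatever application default credentials are available."""
    from google.cloud import bigquery  # deferred import: pip install google-cloud-bigquery

    client = bigquery.Client()  # picks up GOOGLE_APPLICATION_CREDENTIALS automatically
    return [dict(row) for row in client.query(sql).result()]
```

Grant the service account read access to the private dataset, and the webpage's backend can call `run_query` on behalf of every user without any of them seeing an OAuth prompt.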
I have developed an FB app, and the front-end side is working great with my own DB.
Now I want to make a Python program, run from the command line, that posts to my FB account.
I know I need the access token for this; I came across it while accessing my app outside FB, where it does not function properly unless run inside the FB iframe.
So is there a Python SDK or similar that lets me get the access token (and/or store the access token in the DB so Python can pull it out) and then post with it, if one exists?
Thanks
PyFacebook is a thin wrapper for accessing Facebook's RESTful API through Python.
https://github.com/pythonforfacebook/facebook-sdk
facebook-sdk, a set of essential tools for working with Facebook in Python.
django-facebook, an extensible django plugin for building facebook integrated sites.
http://www.pythonforfacebook.com/
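A minimal sketch of the store-then-post idea with facebook-sdk; the DB row shape and column name are hypothetical, and it assumes a long-lived page/user token was previously saved:

```python
def load_token(row):
    """Pull a previously stored access token out of a DB row (hypothetical column name)."""
    token = row.get("fb_access_token")
    if not token:
        raise KeyError("no access token stored for this user")
    return token


def post_to_feed(row, message):
    """Post a message to the authenticated user's feed with facebook-sdk."""
    import facebook  # deferred import: pip install facebook-sdk

    graph = facebook.GraphAPI(access_token=load_token(row))
    return graph.put_object(parent_object="me", connection_name="feed", message=message)
```

The token itself is obtained once through Facebook's login flow in the web app, stored in the DB by the front end, and then reused by the command-line script.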