I'm working on a bot and would like to integrate it with a website for my project. My bot is fully functional, with Lambda fulfillment coded in the Python runtime. I see examples of how to integrate a bot implemented in JavaScript (Lambda Node runtime), but I am not able to find help for a Python implementation.
Does my bot need to be implemented in JavaScript (Lambda Node runtime) to be able to integrate with a website?
I'm not very familiar with UI stuff, and any help to get me started is appreciated.
Your bot can be written in any language and still be integrated with a website.
You can access the bot either through an AWS SDK or through direct HTTPS calls.
From the documentation for HTTPS calls:
POST /bot/botName/alias/botAlias/user/userId/text HTTP/1.1
Content-type: application/json

{
  "inputText": "string",
  "requestAttributes": {
    "string": "string"
  },
  "sessionAttributes": {
    "string": "string"
  }
}
Or you can use an SDK. This can be done through the UI layer, using the JavaScript SDK; or through the website back-end, using the SDK that matches your backend. Here's the list of SDKs.
You will be calling these methods from the Lex SDK to use your bot: PostContent is for sending voice, and PostText is for sending text. The response will contain the bot's response.
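Since your fulfillment is already in Python, here is a minimal sketch using boto3 (the AWS SDK for Python) from a Python back end; the bot name, alias, and user ID below are placeholder values, not anything from your setup:
import boto3

# Lex (V1) runtime client
lex = boto3.client("lex-runtime")

response = lex.post_text(
    botName="MyBot",              # placeholder bot name
    botAlias="prod",              # placeholder alias
    userId="some-unique-user-id", # placeholder user/session id
    inputText="I would like to book a hotel",
)

print(response["message"])        # the bot's reply text
The response also includes fields such as dialogState and intentName, which your UI can use to drive the conversation.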
Hope that helps
I created a Slack bot and it was working completely fine, but when I distributed the app to another workspace I started getting a "user not found" error on the users_profile_get API call. I have cross-checked the required scopes for this API, the user_id, and the access token, and they all look correct, but the call still returns user not found.
Am I missing something when distributing the app, or is there some other problem?
One strange thing: I can call Slack's chat.postMessage API and it runs successfully.
result = app.client.users_profile_get(user=slack_id)
While calling this API, I am getting this error:
{
  "ok": false,
  "error": "user_not_found"
}
Distributed apps cannot call API methods across workspaces, so you need to make sure that you're using an API token that belongs to the same workspace as the user. You can double-check by passing your token to the https://api.slack.com/methods/auth.test method. To retrieve tokens for other workspaces, make sure you have enabled OAuth 2.0 (https://api.slack.com/authentication/oauth-v2).
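For example, with the Python slack_sdk (which is what Bolt's app.client is built on), a quick sanity check could look like this; the token and user ID below are placeholders for the values your app stored for that workspace during the OAuth flow:
from slack_sdk import WebClient

# Placeholders: the token issued for the target workspace and a user from it.
workspace_token = "xoxb-your-workspace-token"
slack_id = "U0123456789"

client = WebClient(token=workspace_token)

# Confirm which workspace/team this token actually belongs to.
identity = client.auth_test()
print(identity["team_id"], identity["team"])

# users_profile_get only resolves users from that same workspace.
profile = client.users_profile_get(user=slack_id)
print(profile["profile"]["real_name"])
If auth_test reports a different team_id than the one the user belongs to, that mismatch is what produces the user_not_found error.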
I am working on developing a mobile application using Flutter, and I use Python as part of the project. I want to know how to send data between Flutter and Python, especially from Python to Flutter.
For example, if I have Python code like this:
thisdict = {
    "brand": "Ford",
    "model": "Mustang",
    "year": 1964
}
I want to send the value of the dictionary thisdict to Flutter. How do I do that?
I'm assuming the values stored in the dictionary are read from a database, which means you will have to implement a RESTful API in Python using Django REST Framework or Flask-RESTful.
Django Rest Framework
Flask-RESTful
This API will read the values from the database upon request and send them back to the caller as a JSON object containing the dictionary's data.
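As a rough illustration, here is a minimal Flask sketch that exposes the dictionary from the question as JSON; the endpoint path and port are arbitrary choices for the example, not anything your project requires:
from flask import Flask, jsonify

app = Flask(__name__)

thisdict = {
    "brand": "Ford",
    "model": "Mustang",
    "year": 1964
}

@app.route("/car", methods=["GET"])
def get_car():
    # Flask serializes the dictionary to a JSON response for the Flutter client.
    return jsonify(thisdict)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
Your Flutter code would then issue a GET request to http://<host>:5000/car and decode the JSON body.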
On the Flutter side, you can use Flutter BLoC to send the POST and GET requests to that API and then capture the response.
Flutter BLoC
Here's the documentation on Flutter BLoC.
I'm a newbie so please bear with me.
I'm trying to set up a Google Cloud Function that can access a single Gmail account on the same domain, download some emails, and push them to Cloud Storage. Honestly, I would like to just use an email and password in the script (using Google KMS with environment variables?), but I understand that isn't possible and OAuth2 is required.
I've set up an OAuth client in GCP and have run the Gmail API Python quickstart guide. Running it locally, I am prompted to allow access, and the token is saved so subsequent runs work without prompts.
I deployed the Cloud Function with the pickle file to test if the refresh token will still work, planning to figure out how to use KMS to make this more secure later on. But there's an issue loading the pickle:
UnpicklingError: invalid load key, '\xef'
Which makes it seem like the pickle gets compressed/corrupted on upload.
Is this even a sane approach? How could I do this? The email address is mine, so I was hoping I could just authenticate once and be done with it.
It's not possible for me to use a domain-delegated service account, by the way, nor to use IMAP.
Since you can't use domain-delegated service accounts, you can try following this guide on setting up server-side authorization. This sounds like what you want, since it requires the user to authorize the app once and then reuses that token. Here is a codelab that will take you through the auth part.
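As a rough sketch of the "authorize once, reuse the token" pattern in Python (assuming the google-auth and google-api-python-client libraries and a token.json file saved by the one-time consent flow; the file name and scope are assumptions, not from your setup):
from google.oauth2.credentials import Credentials
from google.auth.transport.requests import Request
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]

# Load the credentials saved after the one-time consent prompt.
creds = Credentials.from_authorized_user_file("token.json", SCOPES)
if creds.expired and creds.refresh_token:
    creds.refresh(Request())  # silently refreshes using the stored refresh token

gmail = build("gmail", "v1", credentials=creds)
messages = gmail.users().messages().list(userId="me", maxResults=10).execute()
Storing the token as JSON rather than a pickle also sidesteps the UnpicklingError you hit on deployment.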
Alternatively, you can use push notifications. It sounds like your current design is to poll the Gmail API to look for new emails periodically, which also involves authorizing access to the account in the Cloud Function. However, if you take advantage of Push Notifications, you can both get the data in real time, and avoid having to authorize the Cloud Function to read the Gmail API. See guide here.
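For illustration, registering the watch is a single authorized call that can be made once, outside the Cloud Function; the Pub/Sub topic name below is a placeholder, and token.json is the same assumed credentials file as in the previous sketch:
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/gmail.readonly"]
creds = Credentials.from_authorized_user_file("token.json", SCOPES)

gmail = build("gmail", "v1", credentials=creds)
watch_request = {
    "topicName": "projects/your-project/topics/gmail-push",  # placeholder topic
    "labelIds": ["INBOX"],
}
response = gmail.users().watch(userId="me", body=watch_request).execute()
print(response)  # contains a historyId and the watch expiration
Gmail then publishes a notification to that topic whenever the mailbox changes, and the topic can trigger your Cloud Function directly.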
However, probably the easiest solution is to use Apps Script. If you set up your Cloud Function to be triggered via an HTTP target, you can write an Apps Script that pings that URL with the messages you want to send to GCS. Docs here.
function getEmails() {
  let inbox = GmailApp.getInboxThreads(0, 50);
  // Inbox threads
  for (let i = 0; i < inbox.length; i++) {
    let threadMessages = inbox[i].getMessages();
    // Messages within each thread
    for (let j = 0; j < threadMessages.length; j++) {
      let message = threadMessages[j].getBody();
      let subject = threadMessages[j].getSubject();
      let options = {
        'method': 'post',
        'contentType': 'application/json',
        'payload': JSON.stringify({"message": message, "subject": subject})
      };
      UrlFetchApp.fetch('YOUR_FUNCTION_TRIGGER_URL', options);
    }
  }
}
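On the receiving end, a minimal sketch of an HTTP-triggered Cloud Function (Python runtime) that accepts that POST and writes it to Cloud Storage might look like this; the function name and bucket name are placeholders, not values from your project:
import json
import uuid

from google.cloud import storage

def save_email(request):
    # Parse the JSON payload sent by the Apps Script above.
    payload = request.get_json(silent=True) or {}

    client = storage.Client()
    bucket = client.bucket("your-email-archive-bucket")  # placeholder bucket
    blob = bucket.blob("emails/{}.json".format(uuid.uuid4()))
    blob.upload_from_string(json.dumps(payload), content_type="application/json")

    return "stored", 200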
This question refers to YouTube, specifically the YouTube Analytics API and the OAuth flow.
Using a Ruby server-side web app, I have created a token.
I now want to use that token in a client-side Python app.
I have the client_secrets.json that generated the token.
I have generated tokens with Python before, and the format does not match that of the Ruby-generated token.
Is there existing code or a simple way to convert the Ruby-formatted token for use in Python? Yes, they are both JSON, but the structures are different.
https://developers.google.com/youtube/reporting/guides/authorization/server-side-web-apps
Assuming that you use the same client ID and client secret for both Ruby and Python, the refresh token that you got from one will work in the other.
Raw OAuth response:
{
  "access_token": "ya29.1.AADtN_VSBMC2Ga2lhxsTKjVQ_ROco8VbD6h01aj4PcKHLm6qvHbNtn-_BIzXMw",
  "token_type": "Bearer",
  "expires_in": 3600,
  "refresh_token": "1/J-3zPA8XR1o_cXebV9sDKn_f5MTqaFhKFxH-3PUPiJ4"
}
Assuming you are using the client libraries, the issue you are going to have is how the different libraries store their credentials and read them back. You will probably have to write your own parser for that.
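For illustration, in Python you can also skip parsing the Ruby file entirely and build credentials directly from the refresh token using the google-auth libraries; every value below is a placeholder, not something taken from your existing token file:
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials(
    token=None,  # fetched automatically on first use via the refresh token
    refresh_token="your-refresh-token",
    token_uri="https://oauth2.googleapis.com/token",
    client_id="your-client-id.apps.googleusercontent.com",
    client_secret="your-client-secret",
    scopes=["https://www.googleapis.com/auth/yt-analytics.readonly"],
)

youtube_analytics = build("youtubeAnalytics", "v2", credentials=creds)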
I am trying to figure out how the OAuth flow will work for my Google Calendar app. I have a desktop app which I will be distributing and which will use Google Calendar. I know there's a client secrets file, but I'd like to know if there's a way to request a token rather than shipping the client secrets file with the app. My worry is that someone will just spam the calendar and my app won't work for anyone else. Is this a possibility? What solutions exist to mitigate this?
thanks,
Using OAuth2, the client secret is used to authenticate your app to Google, not vice versa. The OAuth2 server will only issue a token to an application that submits a correct client id and client secret (among other things). This arrangement is (in part) precisely about making sure that someone else cannot (e.g.) "spam your calendar". Otherwise, how is Google to know that the requesting application is actually legitimate, rather than malicious code crafted by a spammer to just look like your application?
Before you can use OAuth2, your application has to be registered with Google. As part of this process, Google issues you a client secret, which you then have to build into your application instance. This application instance is also tied to redirect URIs for when the authorisation handshake is complete.
The upshot of this is that you can't really distribute an OAuth2-using app without each deployment having to go through the registration process. If you try and distribute the secret with your application, then it's no longer a secret, and in any case, you can't know all of the URIs at which it may be deployed.
An approach I've taken with an application I'm working on is to have the application read a client secret file that has to be provided by the installer of the application, based on their own registration. This has a format based on the JSON download that Google provides when you register an application. It's something of a pain requiring every installer to go through this dance, but as things stand I don't believe there's an easier way that is also secure.
For example, I have an implementation of OpenID Connect authentication using Google that builds upon the oauth2client library, and uses a file format based on Google's client secrets file. When a service is registered with Google to use OAuth2, there's a "client secrets" file that Google can provide that looks something like this:
{
  "web": {
    "client_id": "9876543210.apps.googleusercontent.com",
    "client_secret": "secret-12345678901234567890",
    "client_email": "9876543210@developer.gserviceaccount.com",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/9876543210@developer.gserviceaccount.com",
    "redirect_uris": [
      "http://localhost:8000/annalist/login_done/",
      "http://annalist-demo.example.org:8000/annalist/login_done/"
    ],
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs"
  }
}
In my code, I use this file to initiate an OAuth2 flow using the oauth2client library, with code that looks like this:
from oauth2client.client import flow_from_clientsecrets

flow = flow_from_clientsecrets(
    clientsecrets_filename,
    scope=scope,
    redirect_uri=request.build_absolute_uri(login_done)
)
So you can see that there's much more information than just the client ID that is used when initiating the flow. The full code of my implementation is at https://github.com/gklyne/annalist/blob/master/src/annalist_root/oauth2/views.py, beginning about line 273, but there's a lot more logic in that module that's to do with passing details back to an application running in the Django framework.
I've also created documentation of the procedure I use to register the application with Google and deploy the client credentials. Note that these instructions are for the identity service not the calendar API, and that the details are specific to my application, but I'm hoping there's enough commonality to help you on your way.
Looking to the future, the IETF are working on a spec for allowing automated registration of an OAuth2-using application instance. I think this is the relevant specification: https://datatracker.ietf.org/doc/draft-ietf-oauth-dyn-reg/. It appears that it is currently (as of September 2014) being considered for standards track publication, but that doesn't say when it might be more widely available.