Is there any way I can unify the responses in Dialogflow across multiple platforms?
I'm building a chatbot that users can interact with using:
Facebook Messenger
Custom API Endpoints
In the Dialogflow console, I can specify a Facebook response that is different from the default response:
Default Response
Facebook Response
But I want buttons to appear in both responses. Is there any way to do that?
I understand that Facebook requires the response to be in a specific format in order to interpret the buttons, and that I'm free to set any custom payload in the default response that can be interpreted by the client (app, website). But has anyone managed to combine both responses?
If not, what's a good way to set the custom payload? Any examples would help.
Note: Webhook calls are enabled for all intents
You can send custom payloads in the JSON format described in each platform's documentation, and each will be rendered on its platform accordingly. Below is the format you need to follow to send a custom payload:
{
    "facebook": {},
    "kik": {},
    "line": {},
    "skype": {},
    "slack": {},
    "telegram": {},
    "viber": {}
}
You can also send a custom payload to self-developed integrations. It won’t be processed by Dialogflow, so you'll need to handle it in your own business logic.
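To combine both, one approach (a sketch, not an official Dialogflow response schema) is to have your webhook build a single payload containing both the Facebook-specific button template and a free-form section for your own clients. The "custom" section and its field names below are hypothetical and entirely up to your client; only the "facebook" section follows Messenger's button template format:

```python
# Sketch of webhook fulfillment logic that serves buttons to both Facebook
# and a custom client. The "custom" structure is an assumption: your app or
# website must be written to interpret it.
def build_fulfillment(buttons):
    return {
        "payload": {
            "facebook": {
                "attachment": {
                    "type": "template",
                    "payload": {
                        "template_type": "button",
                        "text": "Pick an option",
                        "buttons": [
                            {"type": "postback", "title": b, "payload": b}
                            for b in buttons
                        ],
                    },
                }
            },
            # Free-form payload for your custom API endpoint clients.
            "custom": {"type": "buttons", "options": buttons},
        }
    }
```

This keeps a single source of truth for the button labels, so both surfaces stay in sync.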
Hope it helps.
A little more detail on the question -
Scenario
The app I'm working on currently performs the following -
Logs in users via Google OAuth (added to the Auth0 login)
Consists of a list of Google Sheets with their links, which the user can open when logged in
When the user clicks on a sheet's link to open it, he is redirected to a page where the sheet is expected to be displayed in an iframe.
The gspread module in Python retrieves the list of users the sheet has been shared with (permission list) (gspread is authenticated using a service account which helps do this). If the authenticated user is a part of the permission list, the iframe is displayed, else, an error message is displayed.
Now, the next requirement we'd like to achieve is for specific users in the site to be able to share the Google Sheet with other users, using the share method in the gspread module. However, we would like to share it with users with regular Google accounts, and not those enabled with Google Workspace, owing to business requirements which I prefer not to disclose at this point.
Is there a way to do this? I've found something here - https://developers.google.com/admin-sdk/directory/v1/quickstart/python#configure_the_sample - but that only checks against users of the same workspace, and only if my service account is the workspace admin's. What I need to know, in general, is whether a given account is a regular one or is linked to the workspace of some organization.
The People API has a method called people.get. If I pass it people/me and check the person fields for memberships, I get the following.
Workspace domain account
{
    "resourceName": "people/106744391248434652261",
    "etag": "%EgMBLjcaBAECBQciDFpMNzJsdkk3SG80PQ==",
    "memberships": [
        {
            "metadata": {
                "source": {
                    "type": "DOMAIN_PROFILE",
                    "id": "106744391248434652261"
                }
            },
            "domainMembership": {
                "inViewerDomain": true
            }
        }
    ]
}
Standard Gmail user
{
    "resourceName": "people/117200475532672775346",
    "etag": "%EgMBLjcaBAECBQciDEdwc0JEdnJyNWRnPQ==",
    "memberships": [
        {
            "metadata": {
                "source": {
                    "type": "CONTACT",
                    "id": "3faa96eb08baa4be"
                }
            },
            "contactGroupMembership": {
                "contactGroupId": "myContacts",
                "contactGroupResourceName": "contactGroups/myContacts"
            }
        }
    ]
}
So the answer is yes, you need to go through the Google People API. I don't have any Python examples for the People API on hand, but let me know if you can't get it working.
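As a follow-up sketch: assuming you have already fetched the people.get response as a dict (for example with google-api-python-client via service.people().get(resourceName='people/me', personFields='memberships').execute()), a small helper can classify the account from the memberships field shown above:

```python
def is_workspace_account(person):
    """Return True if a people.get response contains a DOMAIN_PROFILE
    membership, which indicates a Google Workspace domain account.
    A standard Gmail account shows CONTACT-sourced memberships instead."""
    for membership in person.get("memberships", []):
        source = membership.get("metadata", {}).get("source", {})
        if source.get("type") == "DOMAIN_PROFILE":
            return True
    return False
```

This only inspects the response, so you can unit-test it against the two sample payloads above without hitting the API.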
See the example here. Why is a redirect used simply to connect to a service? Why bother with mocking a service and all that? Is there some valid reason for this, or is it just because someone made an assumption about how authentication would be used (i.e. that the author and the user are different people)? Is there a good way of avoiding this within the REPL?
https://github.com/SaxoBank/openapi-samples-python/blob/master/authentication/oauth/code-flow/bare-bones-code-flow-app.py
I don't fully understand your issue, but regarding the Saxo API and the OAuth token: you always need to define the RedirectUrls for generating the token. That's why the 5 keys listed in the provided sample are indeed mandatory:
params = {
    "grant_type": "refresh_token",
    "refresh_token": token_data["refresh_token"],
    "redirect_uri": app_config["RedirectUrls"][0],
    "client_id": app_config["AppKey"],
    "client_secret": app_config["AppSecret"]
}
FYI, you can find the full documentation on redirect URIs below:
https://www.oauth.com/oauth2-servers/redirect-uris/
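As a sketch of how those five keys are used in practice: the refresh call is a plain form-encoded POST to the token endpoint. The URL below is an assumption based on Saxo's SIM environment; substitute the endpoint from your own app configuration. Note that redirect_uri must match a registered RedirectUrl even though no browser redirect happens during a refresh.

```python
import json
import urllib.parse
import urllib.request

# Assumed SIM-environment token endpoint; take the real one from your app config.
TOKEN_URL = "https://sim.logonvalidation.net/token"

def build_refresh_params(app_config, token_data):
    """Assemble the five mandatory keys for the refresh-token grant."""
    return {
        "grant_type": "refresh_token",
        "refresh_token": token_data["refresh_token"],
        "redirect_uri": app_config["RedirectUrls"][0],
        "client_id": app_config["AppKey"],
        "client_secret": app_config["AppSecret"],
    }

def refresh_access_token(app_config, token_data):
    """POST the refresh request and return the parsed token response."""
    body = urllib.parse.urlencode(build_refresh_params(app_config, token_data)).encode()
    req = urllib.request.Request(TOKEN_URL, data=body, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```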
I have a problem with a job in Cloud Scheduler for my Cloud Function. I created the job with the following parameters:
Target: HTTP
URL: my trigger url for cloud function
HTTP method: POST
Body:
{
    "expertsender": {
        "apiKey": "ExpertSender API key",
        "apiAddress": "ExpertSender APIv2 address",
        "date": "YYYY-MM-DD",
        "entities": [
            {
                "entity": "Messages"
            },
            {
                "entity": "Activities",
                "types": [
                    "Subscriptions"
                ]
            }
        ]
    },
    "bq": {
        "project_id": "YOUR GCP PROJECT",
        "dataset_id": "YOUR DATASET NAME",
        "location": "US"
    }
}
The real values have been changed in this body.
When I run this job, I get an error caused by processing the body of the POST request.
However, when I take this same body and use it as the triggering event under Testing, I don't get any errors. So I think the problem is in how the body is represented for my job, but I have no idea how to fix it. I'd be very happy for any ideas.
Disclaimer:
I have tried to solve the same issue using NodeJS and I'm able to get a solution
I understand that this is an old question, but I felt it was worth answering as I spent almost 2 hours figuring out the answer to this issue.
Scenario - 1: Trigger the Cloud Function via Cloud Scheduler
Function fails to read the message in request body.
Scenario - 2: Trigger the Cloud Function via Test tab in Cloud Function interface
Function call always executes fine with no errors.
What did I find?
When the GCF routine is executed via Cloud Scheduler, the Content-Type header is sent as application/octet-stream. This makes Express unable to parse the data in the request body when Cloud Scheduler POSTs it.
But when the exact same request body is used to test the function via the Cloud Function interface, everything works fine, because the Testing feature on the interface sends the Content-Type header as application/json, so Express reads the request body and parses the data as a JSON object.
Solution
I had to manually parse the request body as JSON (explicitly using if condition based on the content-type header) to get hold of data in the request body.
/**
 * Responds to any HTTP request.
 *
 * @param {!express:Request} req HTTP request context.
 * @param {!express:Response} res HTTP response context.
 */
exports.helloWorld = (req, res) => {
  let message = req.query.message || req.body.message || 'Hello World!';
  console.log('Headers from request: ' + JSON.stringify(req.headers));
  let parsedBody;
  if (req.header('content-type') === 'application/json') {
    console.log('request header content-type is application/json and auto parsing the req body as json');
    parsedBody = req.body;
  } else {
    console.log('request header content-type is NOT application/json and MANUALLY parsing the req body as json');
    parsedBody = JSON.parse(req.body);
  }
  console.log('Message from parsed json body is: ' + parsedBody.message);
  res.status(200).send(message);
};
It is truly a feature issue which Google has to address and hopefully Google fixes it soon.
Cloud Scheduler - Content Type header issue
Another way to solve the problem is this:
request.get_json(force=True)
It forces the parser to treat the payload as JSON, ignoring the mimetype.
Reference to the flask documentation is here
I think this is a bit more concise than the other solutions proposed.
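For context, a minimal handler along these lines might look like the following. The payload field names mirror the scheduler body from the question; silent=True is optional and simply returns None instead of raising when the body isn't valid JSON.

```python
def handler(request):
    # request is the Flask request object that Cloud Functions passes in.
    # force=True parses the body as JSON even when Cloud Scheduler sends
    # Content-Type: application/octet-stream.
    payload = request.get_json(force=True, silent=True) or {}
    api_key = payload.get("expertsender", {}).get("apiKey")
    return "Got apiKey: {}".format(api_key)
```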
Thank you @Dinesh for pointing towards the request headers as a solution! For all those who still wander and are lost, the code in Python 3.7.4:
import json
raw_request_data = request.data
# Luckily it's at least UTF-8 encoded...
string_request_data = raw_request_data.decode("utf-8")
request_json: dict = json.loads(string_request_data)
Totally agree, this is sub-par from a usability perspective. Having the testing utility post JSON while Cloud Scheduler posts application/octet-stream is irresponsibly designed.
You should, however, create a request handler, if you want to invoke the function in a different way:
import json

def request_handler(request):
    # This works if the request comes in from
    # requests.post("cloud-function-etc", json={"key": "value"})
    # or if the Cloud Function test was used
    request_json = request.get_json()
    if request_json:
        return request_json
    # That's the hard way, i.e. Google Cloud Scheduler sending its JSON payload as octet-stream
    if not request_json and request.headers.get("Content-Type") == "application/octet-stream":
        raw_request_data = request.data
        string_request_data = raw_request_data.decode("utf-8")
        request_json: dict = json.loads(string_request_data)
    if request_json:
        return request_json
    else:
        # Error code is obviously up to you
        return "500"
One of the workarounds that you can use is to provide a header "Content-Type" set to "application/json". You can see a setup here.
I am going to use the Firebase Auth and Database modules to create my web app. However, not everything I want my app to do is possible to achieve on the front end alone. So I also want to use a backend with Python's Bottle framework to handle requests, and Pyrebase to access the Firebase Database.
Let's say that after logging in I need to go to mainpage and see personalized content, for example my notes. They are structured this way in DB:
{
    "notes": [
        {
            "id": "1",
            "title": "X",
            "author": "user1"
        },
        {
            "id": "2",
            "title": "Y",
            "author": "user2"
        }
        ... and so on
    ]
}
So how is it possible to implement showing only my notes on the main page?
I understand that I need to filter my notes based on the author value, but how do I let Bottle know who is currently logged in?
I've read that I should somehow send a unique token to the backend server to authenticate the current user, but how do I do that? Inserting the token into every link as a GET parameter seems silly, but I see no other way to implement it.
Start by organizing your database so that each note becomes a child object:
{
    "notes": {
        "id1": {
            "id": "id1",
            "title": "X",
            "author": "user1"
        },
        "id2": {
            ...
        }
    }
}
Then this particular interaction can be implemented entirely client-side. Just execute a query to filter the notes you want. For example, in a JS client:
var uid = firebase.auth().currentUser.uid;
var query = ref.orderByChild('author').equalTo(uid);
// Listen for query value events
If you want to run this on a backend server, and you want to ensure that only logged in users are allowed to execute it, then you must pass the ID token from the client app to the server on each request. Here's how to implement the server-side logic using the Python Admin SDK:
import firebase_admin
from firebase_admin import auth
from firebase_admin import db

token = '....' # Extract from the client request
try:
    decoded = auth.verify_id_token(token)
    uid = decoded['uid']  # verify_id_token returns a dict of token claims
    ref = db.reference('path/to/notes')
    notes = ref.order_by_child('author').equal_to(uid).get()
    # Process notes response
except ValueError as ex:
    print(ex)
    # Send error to client
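On the Bottle side, one common convention (an assumption here, not something Firebase mandates) is to have the client send the ID token in an Authorization: Bearer header, e.g. fetch(url, {headers: {Authorization: 'Bearer ' + idToken}}), rather than as a GET parameter. A small helper can then extract it in the route before calling verify_id_token:

```python
def extract_bearer_token(auth_header):
    """Pull the ID token out of an 'Authorization: Bearer <token>' header value.

    Returns None when the header is missing or malformed. In a Bottle route
    you would call this with request.get_header('Authorization').
    """
    prefix = "Bearer "
    if not auth_header or not auth_header.startswith(prefix):
        return None
    return auth_header[len(prefix):]
```

This keeps the token out of URLs (and therefore out of server logs and browser history), which is why the header approach is usually preferred over a GET parameter.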
I started writing a Slack bot in Python and came to a halt when I couldn't find a way to send richly-formatted messages using either of the two methods:
sc.rtm_send_message("channel_name", my_message)
sc.api_call("chat.postMessage", channel="channel_name", text=my_message, username="username", icon_url="icon_url")
where my_message = json.dumps({'attachments': [{...}]})
I now know that I can do this using the webhook approach but is it possible with the above method?
Both the API (method chat.postMessage) and incoming webhooks offer the same options for formatting your messages, including markup and attachments.
Hint: if you want to use markup in your attachments, make sure to add the field "mrkdwn_in" and name the fields you want to use it in, or it will be ignored by Slack.
Example:
{
    "attachments": [
        {
            "title": "Title",
            "pretext": "Pretext _supports_ mrkdwn",
            "text": "Testing *right now!*",
            "mrkdwn_in": ["text", "pretext"]
        }
    ]
}
See here for full documentation.
I found out where I was going wrong.
I was passing my message to the wrong argument in the sc.api_call method.
I should've been passing it to the attachments argument of sc.api_call, not the text argument.
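Putting it together, a sketch of the corrected call (the channel and username values are placeholders, and sc is the SlackClient instance from the question):

```python
import json

def make_attachments():
    # mrkdwn_in opts the named fields in to Slack markup rendering.
    return json.dumps([{
        "title": "Title",
        "pretext": "Pretext _supports_ mrkdwn",
        "text": "Testing *right now!*",
        "mrkdwn_in": ["text", "pretext"],
    }])

# Pass the serialized list to the attachments argument, not text:
# sc.api_call("chat.postMessage", channel="channel_name",
#             attachments=make_attachments(), username="username")
```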