How does Cloud Pub/Sub receive user activities on an e-commerce website? Is the endpoint going to be a URL, or an IP address and a port?
The workflow is: your app or service publishes a message to a Pub/Sub topic, and the topic then delivers it to subscribers via pull or push.
To publish a message you can call a URL like
"https://pubsub.googleapis.com/v1/projects/myproject/topics/mytopic:publish" as mentioned in the docs,
or with the help of a client library, which requires calling a method; for example, in Node.js it would be something like
const messageId = await pubSubClient.topic(topicName).publish(dataBuffer);
console.log(`Message ${messageId} published.`);
which is also described in the docs mentioned before.
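For a Python backend without the client library, a minimal sketch of the same REST call could look like this (the project and topic names are placeholders, and a real request additionally needs an OAuth 2.0 access token in the Authorization header; note that Pub/Sub requires the message data to be base64-encoded):

```python
import base64
import json


def build_publish_request(project, topic, payload: bytes, **attributes):
    """Build the URL and JSON body for the topics.publish REST method.

    Pub/Sub expects message data to be base64-encoded; attributes are
    optional string key/value pairs attached to the message.
    """
    url = (f"https://pubsub.googleapis.com/v1/projects/{project}"
           f"/topics/{topic}:publish")
    message = {"data": base64.b64encode(payload).decode("ascii")}
    if attributes:
        message["attributes"] = attributes
    return url, json.dumps({"messages": [message]})


url, body = build_publish_request("myproject", "mytopic",
                                  b'{"event": "add_to_cart"}')
# POST `body` to `url` with an `Authorization: Bearer <token>` header,
# e.g. via urllib.request or the requests library.
```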
I'm not really getting your intended use of Pub/Sub; it would be better if you explained your architecture and use case. But for example, here is a quick architecture model on GCP.
How would one implement a comprehensive chat system using sockets within FastAPI? Specifically, keeping the following in mind:
Multiple Chat rooms many-to-many between users
Storing messages with a SQL or NoSQL database for persistence
Security: Authentication or possibly encryption
I've looked at some libraries, but actually useful implementations are few and far between, regrettably.
Any advice or redirects towards places for more information will be of great help!
For chat rooms you could use FastAPI's built-in WebSocket support and add Redis pub/sub or PostgreSQL pg_notify on top for sending messages to all participants in a room.
Storing messages in PostgreSQL is a solid choice because of its long history and stability.
Authentication can be handled by an OAuth2 provider in FastAPI. Authorization can be handled by OAuth2 scopes, which are tucked away in the Advanced Security section of the FastAPI documentation. Encryption is provided by HTTPS and the reverse proxy that you put in front of your app.
There aren't any fully ready-made libraries that provide everything out of the box. But breaking the problem down into smaller pieces and then working on those will get you pretty far.
Write down what fields/data you want to store about your users, chat rooms, messages.
Implement those basic models in FastAPI probably using SQLAlchemy.
Wire up those models to api endpoints so that you can use those models in Swagger (list chatrooms, get and post messages into chatrooms).
Implement a websocket endpoint in FastAPI that will echo back everything sent to it. That should allow you to wire up some client side javascript for sending and receiving messages from the websocket.
Modify your existing message-storing endpoint to also publish the same message to a Redis channel, and change your websocket endpoint to subscribe to that channel.
Add authentication to your endpoints. At first basic username/password, later more advanced configurations.
Add a reverse proxy with HTTPS in front, and voilà.
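As a sketch of the room step above, the per-room bookkeeping can be kept independent of the transport. The class below is plain Python and all names are illustrative; in a real app each connection would be a fastapi.WebSocket and delivery would be `await ws.send_text(...)`, or a PUBLISH to a Redis channel named after the room:

```python
from collections import defaultdict


class RoomManager:
    """Minimal in-memory bookkeeping for many-to-many chat rooms."""

    def __init__(self):
        # room name -> set of connected clients
        self.rooms = defaultdict(set)

    def join(self, room, connection):
        self.rooms[room].add(connection)

    def leave(self, room, connection):
        self.rooms[room].discard(connection)
        if not self.rooms[room]:
            # drop empty rooms so the registry doesn't grow forever
            del self.rooms[room]

    def broadcast(self, room, message, sender=None):
        """Return the connections a message should be delivered to.

        With FastAPI you would instead `await ws.send_text(message)`
        for each returned connection; with Redis pub/sub you would
        PUBLISH `message` to a channel named after `room`.
        """
        return [c for c in self.rooms.get(room, ()) if c is not sender]
```

Keeping this logic in its own class makes it easy to unit-test the room membership rules before wiring in websockets or Redis.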
I'm able to push messages from the Python backend (which is on a VM instance) to the topic and see the messages on the Pub/Sub topic. But there is no code for pulling data from the topic using Angular. I want to pull that data and show it in the Angular UI. Could you please help me with this?
With Pub/Sub, there are two subscription modes, which imply two kinds of authentication:
Push subscription, where the sender (the Pub/Sub service) needs to be authenticated to push the message to a secured endpoint
Pull subscription, where the client needs to be authenticated to be able to get the messages.
So, in your case, you need to authenticate your Angular app against a Pub/Sub pull subscription to be able to read the messages. You have two solutions:
Either you generate a service account key file and put it in your static code. That's obviously a terrible idea, because you would be sharing a secret publicly, so it's as if there were no security at all!
Or, because the previous solution amounts to no security anyway, you can make the pull subscription public by granting allUsers the Pub/Sub Subscriber role.
It will work, but there is a design issue: anyone will be able to pull from your subscription, and because messages aren't duplicated between subscribers, you will potentially lose messages.
A better solution would be to serve a streaming endpoint, with Cloud Run for example: authenticate your users on the Cloud Run endpoint, and stream the messages from the Pub/Sub pull subscription through the Cloud Run streaming connection.
That way, you add a security layer, something like a proxy.
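A sketch of that proxy idea, assuming a Flask-style streaming response: the backend pulls from the subscription with its own service account and relays each message to the authenticated browser as a Server-Sent Event. The Pub/Sub pull itself is stubbed out here as `fetch_messages`; only the SSE framing is concrete:

```python
def to_sse(message: str, event: str = "message") -> str:
    """Format one message as a Server-Sent Events frame.

    Multi-line payloads become multiple `data:` lines, per the SSE spec;
    a blank line terminates the frame.
    """
    data_lines = "".join(f"data: {line}\n" for line in message.splitlines())
    return f"event: {event}\n{data_lines}\n"


def stream(fetch_messages):
    """Generator suitable for a streaming HTTP response body.

    `fetch_messages` stands in for a real Pub/Sub pull loop. In Flask:
    `return Response(stream(pull), mimetype="text/event-stream")`,
    after checking the user's own credentials on the endpoint.
    """
    for msg in fetch_messages():
        yield to_sse(msg)
```

On the Angular side, the browser's built-in EventSource can then consume this endpoint with the user's normal session credentials, and no Pub/Sub credentials ever reach the client.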
I am using MQTT to send and receive messages from/to Google IoT Core (telemetry messages).
Messages are then forwarded to different Pub/Sub topics.
I'd like to add custom attributes to my messages.
The goal is to use different subscriptions on my topic, then filter incoming messages by my custom attributes, and finally get my messages by 'theme' in my Dataflow pipeline.
I know we can do it when we use the Pub/Sub client directly, but I can't manage to do this from the MQTT client.
I am using the python client.
As requested, I'll add some details. This is a very classic situation.
I am using an MQTT client as in the Google example here: https://cloud.google.com/iot/docs/how-tos/mqtt-bridge#configuring_mqtt_clients. Then I use the publish code from the same documentation: https://cloud.google.com/iot/docs/how-tos/mqtt-bridge#publishing_telemetry_events (I just replaced

for i in range(0, 60):
    time.sleep(1)
    client.loop()

with time.sleep(1) because I do not want to wait a minute between each message).
I publish my messages calling the previous code like this:
publisher.publish(topic, payload)
where topic is my PubSub topic and payload is my data.
The documentation says:
The forwarded message data field contains a copy of the message published by the device, and the following message attributes are added to each message in the Cloud Pub/Sub topic
(link if you want to see the attributes: https://cloud.google.com/iot/docs/how-tos/mqtt-bridge#publishing_telemetry_events)
What I want to do is add custom attributes to this list.
If I call the Pub/Sub client directly I can do this (from the documentation):
# Add two attributes, origin and username, to the message
future = publisher.publish(
    topic_path, data, origin="python-sample", username="gcp"
)
where origin and username are custom attributes. Is it possible to do this using the MQTT client?
Thanks :)
So it is NOT possible.
What Google advises instead is to use topic subfolders to 'group' messages by theme.
You can associate topics with your registry, then set a subfolder for each topic. You can have as many subfolders as you want, as long as they follow this convention:
topic: topic-name
subfolders: topic-name/theme1, topic-name/theme2, ...
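From the Python MQTT client used in the IoT Core samples, the routing is driven purely by the MQTT topic path: publishing to /devices/<id>/events hits the default telemetry topic, while /devices/<id>/events/<subfolder> routes to the Pub/Sub topic bound to that subfolder in the registry. A small helper sketch (the device id and subfolder names are placeholders):

```python
def telemetry_topic(device_id: str, subfolder: str = "") -> str:
    """Build the IoT Core MQTT topic path for telemetry events.

    Without a subfolder, messages go to the registry's default topic;
    with one, they go to the Pub/Sub topic bound to that subfolder.
    """
    topic = f"/devices/{device_id}/events"
    if subfolder:
        topic += f"/{subfolder}"
    return topic


# With a connected paho-mqtt client from the samples, e.g.:
#   client.publish(telemetry_topic("my-device", "theme1"), payload, qos=1)
```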
I have been trying to figure out how to use one of the following Python packages to create a Python-based client that is capable of receiving XMPP-based messages via Google Cloud Messaging.
https://github.com/geeknam/python-gcm
https://github.com/daftshady/py-gcm
https://pypi.python.org/pypi/gcm-client/
https://github.com/pennersr/pulsus
From all I can see (e.g., the documentation for gcm-client), these packages can send messages to other clients that are identified by a registration_id. But how do I get a registration ID for each client in the first place? In other words, how do I register the client app that I am creating so that it can receive messages?
It is starting to seem to me that these are not clients per se, but just libraries that can be used to push messages to clients. I hope that I am wrong about that and just missing a key concept.
Each client application has to call getRegistrationId() once to get its registration ID. Then it can receive messages. A more detailed function call is here
I hope this gives you an idea about client devices. :)
I'm running a RESTful API with Python (Flask).
I want to be able to track:
which requests have been made
when did those requests happen
how long did it take to send the response
I want to use Google Analytics for this, because of its nice dashboard and extended functionality.
My question
How can I implement Google Analytics into a REST API?
OR does anyone know another tool/library that can be implemented?
This is what I found at the moment:
a tracking app that uses MongoDB
the Google Data API - but this is for reading GA data, not tracking API usage?
There are actually two ways to send server-side data to Google Analytics. The standard method is the GIF Image Request API, which is the same API that ga.js uses on the client-side. Google has started developing a new REST API known as the Measurement Protocol, but this is only in developer preview.
Server-Side GA
There are a few issues to work through when trying to send server-side data to GA.
Like #mehaase pointed out above, the gif API takes the IP address from the request, so all of your server-side requests will appear as users coming from the location of your servers. The Measurement Protocol doesn't let you change the request's IP either. I'll assume the publicly available gif API in this answer.
Another issue is that the gif endpoint requires a client-side cookie. You can fake this cookie on every request but this will cause each event to look like a new visitor. That's fine as long as you keep the server-side API and website in separate Google Analytics profiles.
Also beware that Google can take up to an hour to show your events once you've sent them. This can make debugging a bit painful, so be patient.
Here's the breakdown of what each variable in the GA cookie means, and a good node.js example of sending server-side data to GA.
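For completeness, here is a sketch of what a Measurement Protocol (v1) event hit could look like, built with only the standard library (the tracking ID is a placeholder; note that the random `cid` reproduces the "every hit is a new visitor" caveat described above, so use a stable client ID if you want sessions to stitch together):

```python
import urllib.parse
import uuid


def build_ga_event(tracking_id, category, action, label=None, value=None):
    """Build a Measurement Protocol v1 event hit as a form-encoded body.

    POST this body to https://www.google-analytics.com/collect.
    """
    params = {
        "v": "1",                  # protocol version
        "tid": tracking_id,        # UA-XXXXX-Y property ID (placeholder)
        "cid": str(uuid.uuid4()),  # anonymous client ID: random here,
                                   # so every hit looks like a new visitor
        "t": "event",
        "ec": category,            # event category
        "ea": action,              # event action
    }
    if label is not None:
        params["el"] = label
    if value is not None:
        params["ev"] = str(int(value))  # event value must be an integer
    return urllib.parse.urlencode(params)


# e.g. track a timed API response from Flask after-request middleware:
body = build_ga_event("UA-12345-1", "API Response", "POST /comments",
                      value=124)
```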
Other Event Tracking Options
Even though GA is excellent for tracking website metrics, it's not built for tracking server-side events. A category of analytics known as event tracking is the perfect application for restful API usage tracking.
The API generally looks like this:
analytics.track('API Response', {
  method  : 'POST',
  endpoint: '/comments',
  duration: 124,
  status  : 500
});
And it lets you see reports on the frequencies and distributions of each event and event property. You can answer questions like: how many /comments API calls happened today? How many were 200s? How many took longer than 200 ms? Etc.
Here are some event tracking tools that can help you do this:
Mixpanel
KissMetrics
Keen.IO
I'm the co-founder of Segment.io, a company that provides a simple API for client-side, server-side and mobile analytics. We let you send data from python, php, ruby, node, java, .net, javascript, and iOS, and we'll forward it to Google Analytics, Mixpanel, KissMetrics, Keen.IO, or any of our other supported services without you having to learn their API.
And finally, here's an article from our analytics academy that explains why event tracking is useful.
I know this is a very old post, but I came across Google Analytics support in Python:
https://developers.google.com/api-client-library/python/apis/analytics/v3
Thought this was the right place to document it as well.