Airflow to export BigQuery table data to a Google Sheet? - python

Does anyone know if it's possible to take the contents of a BigQuery table and append that data to a Google Sheet using Airflow? I have looked through the docs and can only find this: https://airflow.apache.org/docs/apache-airflow-providers-google/stable/operators/transfer/sql_to_sheets.html
Firstly, I'm not sure what a database_conn_id is. Secondly, I'm not sure whether this operator supports any Google Sheets features such as drop-downs via data validation, or appending rather than overwriting.
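Based on that docs page, I assume the usage would look something like the sketch below. The connection IDs, spreadsheet ID, and query are placeholders, and I'm not certain about the exact parameter names in my provider version, nor whether the operator appends or overwrites the range.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.suite.transfers.sql_to_sheets import (
    SQLToGoogleSheetsOperator,
)

with DAG(
    dag_id="bq_to_sheet",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    upload_to_sheet = SQLToGoogleSheetsOperator(
        task_id="upload_to_sheet",
        # The SQL connection is an Airflow connection pointing at the database
        # being queried (here, presumably a BigQuery connection); the GCP
        # connection is the Google account used to write to the sheet.
        sql="SELECT item, quantity FROM `my-project.my_dataset.my_table`",  # placeholder query
        sql_conn_id="bigquery_default",
        gcp_conn_id="google_cloud_default",
        spreadsheet_id="SPREADSHEET_ID",  # placeholder
        spreadsheet_range="Sheet1",
    )
```

If the operator only overwrites the target range, a PythonOperator that calls the Sheets API's spreadsheets.values.append endpoint (or gspread's append_rows) might be needed instead, but I haven't confirmed that.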

Related

How to call a list of APIs one by one in Python and store the data in Snowflake

Suppose I have a list of APIs like the following:
https://developer.genesys.cloud/devapps/api-explorer#get-api-v2-alerting-alerts-active
https://developer.genesys.cloud/devapps/api-explorer#get-api-v2-alerting-interactionstats-rules
https://developer.genesys.cloud/devapps/api-explorer#get-api-v2-analytics-conversations-details
I want to call these APIs one by one and store the data in Snowflake using pandas and SQLAlchemy.
Do you have any ideas for calling the APIs one by one in my Python script? What I have in mind (sketched below) is:
- Read the APIs one by one from a file.
- Load the data into a Snowflake table directly.
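This is only a rough, untested sketch; the Snowflake credentials, the Genesys OAuth token, the region host, and the file name are all placeholders.

```python
import pandas as pd
import requests
from snowflake.sqlalchemy import URL
from sqlalchemy import create_engine

# Placeholder Snowflake credentials.
engine = create_engine(URL(
    account="my_account",
    user="my_user",
    password="my_password",
    database="MY_DB",
    schema="PUBLIC",
    warehouse="MY_WH",
))

# Placeholder OAuth token for the Genesys Cloud API.
headers = {"Authorization": "Bearer <genesys-oauth-token>"}

# One API path per line in apis.txt, e.g. /api/v2/alerting/alerts/active
with open("apis.txt") as f:
    endpoints = [line.strip() for line in f if line.strip()]

for endpoint in endpoints:
    # Placeholder region host; use the one for your Genesys Cloud region.
    resp = requests.get(f"https://api.mypurecloud.com{endpoint}", headers=headers)
    resp.raise_for_status()

    # Flatten the JSON payload into a DataFrame. The "entities" key is common
    # for Genesys list responses, but the shape may differ per endpoint.
    payload = resp.json()
    df = pd.json_normalize(payload.get("entities", payload))

    # Derive a table name from the API path and append the rows.
    table_name = endpoint.strip("/").replace("/", "_")
    df.to_sql(table_name, engine, if_exists="append", index=False)
```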

Google BigQuery + Python

I need to do an exploratory analysis using Python over two tables that are in a Google BigQuery dataset.
The only thing I was provided is a JSON file containing some credentials.
How can I access the data using this JSON file?
This is the first time I try to do something like this, so I have no idea on how to do it.
I tried reading different tutorials and documentation pages, but nothing worked.
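If that JSON file is a service-account key, a minimal sketch with the google-cloud-bigquery package (plus pandas and pyarrow installed for to_dataframe) could look like the following; the project, dataset, and table names are placeholders.

```python
from google.cloud import bigquery

# Build a client directly from the key file you were given.
client = bigquery.Client.from_service_account_json("credentials.json")

# Pull one of the tables into a pandas DataFrame for exploratory analysis.
sql = "SELECT * FROM `my-project.my_dataset.my_table` LIMIT 1000"  # placeholder table
df = client.query(sql).to_dataframe()

print(df.shape)
print(df.describe())
```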

Unable to access Tableau data using the REST API via Python when we have multiple dashboards in a worksheet

I am basically trying to access Tableau's underlying data via the REST API using Python.
I am able to do so when we have one dashboard (chart) in the worksheet. However, when we have multiple dashboards in one worksheet, it only returns the data for the first dashboard in the worksheet.
Not entirely sure what your question is.
But each view/sheet in the dashboard has a different ID.
You need to specify which one you would like to see. You can get the different IDs using a GET request.
The first dashboard is the default one. This may be the source of your error and why you are only getting the first one returned.
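For example, with the tableauserverclient package (one way of issuing those GET requests), an untested sketch like the following could list the view IDs and pull a specific view's data; the server URL, site, and token details are placeholders.

```python
import tableauserverclient as TSC

# Placeholder authentication details.
auth = TSC.PersonalAccessTokenAuth("token-name", "token-value", site_id="my-site")
server = TSC.Server("https://my-tableau-server", use_server_version=True)

with server.auth.sign_in(auth):
    # List every view (sheet) with its ID, then pick the one you actually want.
    all_views, _ = server.views.get()
    for view in all_views:
        print(view.id, view.name)

    # Download the underlying data of a specific view as CSV,
    # instead of always taking the first/default one.
    target = all_views[1]
    server.views.populate_csv(target)
    with open("view_data.csv", "wb") as f:
        f.write(b"".join(target.csv))
```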

Is it possible to take data from SQL Server and update an existing Excel sheet while keeping the table?

I have been trying to write a Python program that runs a SQL query and then uses the data from that query to update an existing Excel sheet while keeping/updating a chart.
Is this possible? I haven't been able to find any existing questions regarding this so I am sorry if this is a repeat.
For example, I have an items table with three items of varying quantities. I can get the items to appear in my Excel sheet, but any graph/chart I make is discarded once new information is written.
If you look into the Python package
https://xlsxwriter.readthedocs.io/
you'll find ways to generate charts in your Excel file from Python.
Another possibility is to have your data go into a table in Excel (not a range but an actual table).
You can follow the logic in the post "Manipulate existing excel table using openpyxl" for more information on how to achieve this.
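As a rough, untested sketch of the xlsxwriter route, the chart can simply be recreated in code each time the query runs instead of trying to preserve a hand-made one; the connection string, query, and file name below are placeholders.

```python
import pandas as pd
import pyodbc

# Placeholder SQL Server connection string and query.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=my-server;"
    "DATABASE=my_db;Trusted_Connection=yes;"
)
df = pd.read_sql("SELECT item, quantity FROM items", conn)

with pd.ExcelWriter("items.xlsx", engine="xlsxwriter") as writer:
    df.to_excel(writer, sheet_name="Items", index=False)

    workbook = writer.book
    worksheet = writer.sheets["Items"]

    # Recreate the column chart over the freshly written rows
    # (header in row 0, data in rows 1..len(df)).
    chart = workbook.add_chart({"type": "column"})
    chart.add_series({
        "categories": ["Items", 1, 0, len(df), 0],  # item names
        "values": ["Items", 1, 1, len(df), 1],      # quantities
    })
    worksheet.insert_chart("D2", chart)
```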

Is there a way to serverlessly update a big aggregated table daily?

I am trying to build a big aggregated table with Google's tools, but I am a bit lost on how to do it.
Here is what I would like to create: I have a big table in BigQuery. It is updated daily with about 1.2M events for every user of the application. I would like to have an auto-updating aggregate table (updated once every day) built on top of it, with all user data broken down by userID. But how do I continuously update the data inside of it?
I read a bit about Firebase and BigQuery, but since they are very new to me I can't figure out whether this is possible to do serverlessly.
I know how to do it with a Jenkins process (in Python) that queries the big events table for the last day, gets all userIDs, joins them with the existing aggregate values for those userIDs, takes whatever has changed, and deletes from the aggregate in order to insert the updated data.
The issue is that I want to do this entirely within the Google ecosystem. Can Firebase do that? Can BigQuery? How, and with what tools? Could this be solved using the serverless functions available?
I am more familiar with Redshift, but you can use a fairly new BigQuery feature that lets you schedule queries.
I use it to create rollup tables. If you need something more custom, you can use Cloud Scheduler to call any Google product that can be triggered by an HTTP request, such as Cloud Functions, Cloud Run, or App Engine.
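Scheduled queries can also be created from Python via the BigQuery Data Transfer Service client; a hedged sketch (project, dataset, table, and schedule are placeholders, and it assumes the google-cloud-bigquery-datatransfer package is installed) could look like this:

```python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
parent = client.common_project_path("my-project")  # placeholder project

# Roll up yesterday's events per user once a day; placeholder table and columns.
query = """
SELECT userID, COUNT(*) AS events, MAX(event_ts) AS last_seen
FROM `my-project.my_dataset.events`
WHERE DATE(event_ts) = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
GROUP BY userID
"""

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="my_dataset",
    display_name="daily_user_rollup",
    data_source_id="scheduled_query",
    params={
        "query": query,
        "destination_table_name_template": "user_rollup",
        "write_disposition": "WRITE_APPEND",
    },
    schedule="every 24 hours",
)

transfer_config = client.create_transfer_config(
    parent=parent, transfer_config=transfer_config
)
print(f"Created scheduled query: {transfer_config.name}")
```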
