My company is trying out Google's Recommendations AI, using BigQuery exports of Merchant Center and Google Analytics data sources. However, we discovered a configuration error in the merchant feed which led to most of the events being unjoined.
I would like to do a new (clean) setup and am looking for the best way to delete the old data. It seems this is only possible via the API?
Secondly, while the UserEventService has a purge function, there doesn't seem to be a similar function for the ProductService.
Is deleting each product one by one the only way to go?
Any pointers and examples (Python) would be greatly appreciated as there seems to be very little documentation about this at this point in time.
As you mentioned, the only way to delete data is through the API; you can use the Google Cloud client libraries or plain REST requests. However, the library does not have a function to purge all of the product data.
In this case it will be necessary to delete one product at a time, using the delete_prod() function (example).
Nevertheless, as a workaround you can get the IDs of your products with the get_product() function (example) and add them to a collection, then iterate over that collection and pass each value to delete_prod(). That way you can delete all of the product data, but this needs to be reviewed on your side.
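Here is a minimal sketch of that loop, assuming the google-cloud-retail client library (retail_v2) and the default catalog/branch path; PROJECT_ID is a placeholder:

    from google.cloud import retail_v2

    # Placeholder resource path; substitute your own project and catalog IDs.
    parent = ("projects/PROJECT_ID/locations/global/catalogs/"
              "default_catalog/branches/default_branch")

    client = retail_v2.ProductServiceClient()

    # list_products pages through every product in the branch;
    # each product is then deleted individually by its resource name.
    for product in client.list_products(parent=parent):
        client.delete_product(name=product.name)

Since each product is a separate delete request, this can take a while for a large catalog.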
Additionally, I would like to share some resources provided by Google where you can find everything related to the Python library: the Retail API docs, the Python Retail library, and the Retail API GitHub repository.
Please keep in mind that Stack Overflow is intended for specific questions about code, such as errors.
I have recently started developing an application to analyse my all-time exercises on the Polar platform.
I'm using their Accesslink API to get new sessions and I have exported my old sessions through another service they offer.
The exported sessions come with fully detailed information (instant GPS location, speed, heart rate), but the JSON data provided by the API is just a summary. I am looking for a way to get the initial position (GPS location) of each session so that I can later find the city's name from another source. I think the only way to do this is by getting the GPS info of my sessions.
Although the sessions have a has-route field, I cannot find a way in their documentation to request this route. They have provided a working example, but it does not show how to get these data.
Does anyone know if this is possible and, if so, could you please give me some directions?
Thanks in advance.
It turns out that the GPS information is provided through GPX files, which are available via the API mentioned in the question. There is a method implemented in their GitHub repository (link also in the question) which already performs this task. I have added a call to this method and saved its output in this project.
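Once a GPX file has been downloaded, extracting the initial position is straightforward. Here is a small sketch using the third-party gpxpy package (my choice; any GPX parser would do):

    import gpxpy

    # Parse a downloaded session file and read the first recorded track point.
    with open("session.gpx") as f:
        gpx = gpxpy.parse(f)

    first_point = gpx.tracks[0].segments[0].points[0]
    print(first_point.latitude, first_point.longitude)

That latitude/longitude pair can then be passed to a reverse-geocoding service to resolve the city name.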
I'm using the Python library to interact with Google BigQuery and create a group of new views. However, those views need to be added to a different shared dataset as authorized views, and I'm not able to find out how to do this with scripting, given the large number of views. Does somebody have an idea?
Thanks!!
The short answer to this is, unfortunately, no. This cannot be done directly as you describe in your question.
As per the official documentation, "Currently, you cannot grant permissions on tables, views, or rows. You can set access controls at the dataset level, and you can restrict access to columns with BigQuery column-level security" (Controlling access to datasets). Controlling access to views requires you to grant a Cloud IAM role to an entity at the dataset level or higher.
There is, however, a possible workaround that would allow you to achieve your goal.
It is possible to share access to BigQuery views using project-level IAM roles or dataset-level access controls. The following is a very detailed walkthrough of how you could achieve this; it uses only two datasets, but the solution can be expanded to a larger number of datasets:
The subtle art of sharing “views” in BigQuery
Additionally, since you ask about using a Python script: there is no reason the steps described could not be implemented using the Python client library for BigQuery.
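For illustration, here is a sketch of the authorization step using the google-cloud-bigquery client library; the project, dataset, and view IDs are placeholders, and it assumes the views already exist:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Dataset that holds the source tables, and the views to authorize against it.
    source_dataset = client.get_dataset("my-project.source_data")
    view_ids = ["my-project.shared_views.view_a", "my-project.shared_views.view_b"]

    entries = list(source_dataset.access_entries)
    for view_id in view_ids:
        view = client.get_table(view_id)
        # An AccessEntry with entity_type "view" marks the view as authorized.
        entries.append(bigquery.AccessEntry(None, "view", view.reference.to_api_repr()))

    source_dataset.access_entries = entries
    client.update_dataset(source_dataset, ["access_entries"])

Because the access entries are updated in one call, this scales to a large number of views without granting each one individually in the UI.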
I hope this helps.
The closest I've gotten is via the REST API call:
https://{HOST}/rest/api/2/project/{Project Key}/statuses
But I need the same call via Python, and I'm unable to find an adequate way. The closest I've gotten in Python is jiraInstance.statuses(), but this returns all possible statuses for our Jira site.
I need to narrow it down to the workflows for a specific project.
Any help would be appreciated.
Background
This is for a reporting tool where we create a table with all the defects for the specific project in question. In Python I can currently retrieve all the statuses/priorities for the Bug/Bug-task issues, but that only returns statuses for the existing bugs. I need a way to retrieve all the statuses from the workflow of a specific project.
This will list all issue keys in a given project with their respective status:

    from jira import JIRA

    # Hypothetical server and credentials.
    jira = JIRA("https://yourcompany.atlassian.net", basic_auth=("user", "api_token"))

    issues = jira.search_issues('project=projectname')
    for issue in issues:
        print(issue.key, 'Status:', issue.fields.status)
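To get the workflow statuses themselves rather than the statuses of existing issues, the REST endpoint from the question can also be called directly from Python. Here is a sketch using requests; the host, project key, and credentials are placeholders:

    import requests

    HOST = "https://yourcompany.atlassian.net"  # placeholder host
    resp = requests.get(
        HOST + "/rest/api/2/project/PROJ/statuses",
        auth=("user@example.com", "api_token"),  # placeholder credentials
    )
    resp.raise_for_status()
    # The endpoint returns one entry per issue type, each listing its workflow statuses.
    for issue_type in resp.json():
        print(issue_type["name"], [s["name"] for s in issue_type["statuses"]])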
In the Azure portal, if one subscription is selected, the cost analysis can be viewed as in the following screenshot.
I want to programmatically fetch information like that displayed above, perhaps using some Python SDK API or REST API.
If anybody has any experience or ideas on this, please help.
After going through the replies, I have gone through the Azure Billing REST API and I am now able to call the Usage Aggregate and RateCard related REST APIs.
Following are the results of those REST calls:
Azure Billing Usage Aggregate Response
Azure Billing RateCard Response
But honestly speaking, I still have not figured out how these two would give me a detailed view like the cost analysis does, where the cost associated with each resource is displayed. Actually, I am very new to Azure; that is probably why I am missing the link somewhere. Can somebody give me a hint here?
If you already have the usage and the ratecard data, then you must combine them: take the meterId from the usage data and look up the related ratecard data. The ratecard data contains the MeterRates and the IncludedQuantity, which you must take into account. There are probably multiple meter rates and an included quantity because costs typically vary per usage tier (e.g. the first 10 calls for free, 3 GB for free, ...). The consumption starts/is reset on the 14th of the month. That is why you have to read the data for the whole billing period (beginning on the 14th of each month), because that is the only way to get the correct consumption.
So, if you are using e.g. Azure Functions with a usage of 100,000 units per day, and you want the costs from the 20th to the 30th, the calculation works as follows:
Read the data from the 14th to the 30th. That is 17 days, and therefore 1,700,000 units were used.
The first 400,000 units are free (the IncludedQuantity; in this sample, the first 4 days). From unit 400,001 on, you have to take the meter rate (€0.0000134928) and calculate the costs: 1,300,000 * 0.0000134928 = ~€17.54. Fortunately, Azure Functions have only one rate; if the rate changed, e.g. after 5,000,000 units, you would have to take that into account as well.
Once you have the costs for the whole period, you can filter on your date range, the 20th to the 30th, and you will get the result.
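As a quick illustration, the same arithmetic in a few lines of Python (the figures are the ones from this example):

    included_quantity = 400_000      # IncludedQuantity: units free in the billing period
    meter_rate = 0.0000134928        # € per unit beyond the included quantity
    units_per_day = 100_000

    total_units = 17 * units_per_day                         # 14th through 30th
    billable_units = max(0, total_units - included_quantity)
    print(round(billable_units * meter_rate, 2))             # ~17.54 (€)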
That's the short explanation of the calculation. I implemented this calculation in C# and published it as a NuGet package. The source code is on GitHub; it may help. It also contains a sample console application which you could use to export the data.
Source: https://github.com/codehollow/AzureBillingApi
Blog post: https://codehollow.com/2017/02/using-the-azure-billing-api-to-calculate-the-costs/
I have the same issue, but unfortunately the Python SDK is too hard to use. Moreover, you cannot find an available sample or example on Google. So I chose to use the REST API rather than the Python SDK.
With Python code, you can first do the imports:

    import requests
    from azure.common.credentials import ServicePrincipalCredentials

and then set the headers, payload, and URL:

    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + token,
    }

You can get the token through a credentials object, which is generated from your client_id, secret, and tenant; credentials.token["access_token"] returns the token you can use in the headers:

    credentials = ServicePrincipalCredentials(client_id, secret, tenant=tenant)
    token = credentials.token["access_token"]
You can find the REST APIs at https://learn.microsoft.com/en-us/rest/api/, or press F12 in Chrome while you access the Azure dashboard.
Below are the official documents for retrieving billing data using the Python SDK or the REST API in Python.
For using the Python SDK, please see http://azure-sdk-for-python.readthedocs.io/en/latest/resourcemanagementcommerce.html.
For using the Billing REST API, please see https://learn.microsoft.com/en-us/rest/api/billing/; you can use the Python package requests to call these endpoints.
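For example, here is a minimal sketch that calls the Usage Aggregates endpoint with requests; the subscription ID, the dates, and the bearer token are placeholders:

    import requests

    subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
    token = "..."  # AAD bearer token, e.g. obtained via ServicePrincipalCredentials

    url = ("https://management.azure.com/subscriptions/" + subscription_id +
           "/providers/Microsoft.Commerce/UsageAggregates")
    params = {
        "api-version": "2015-06-01-preview",
        "reportedStartTime": "2017-03-14T00:00:00+00:00",
        "reportedEndTime": "2017-03-30T00:00:00+00:00",
    }
    resp = requests.get(url, params=params,
                        headers={"Authorization": "Bearer " + token})
    resp.raise_for_status()
    # Each usage aggregate carries the meterId and consumed quantity,
    # which can then be joined against the RateCard response.
    for item in resp.json()["value"]:
        print(item["properties"]["meterId"], item["properties"]["quantity"])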
However, I think you may already know the above. The key is that you need to follow the tutorial Manage access to billing information for Azure using role-based access control to get the role permission via your account admin.
And then you may also need to register a client app to get the client ID for Resource Management authentication if you want to use a Service Principal/ADAL (not an AD user/password) in the Python SDK, or to perform the same authentication for the REST API as the Azure REST API Reference page describes. For the authentication topic, you can refer to the content map of Manage Apps to learn more if you run into trouble.
Hope it helps. Any concern, please feel free to let me know.
I'm trying to build a Python solution where a user can enter a credit card which will be submitted and saved to the Payflow Pro servers and can be billed on an on-demand basis. I know python-payflowpro supports recurring billing, but that occurs on a regular schedule, such as weekly or monthly. I'm looking for a solution that will bill a user's card at their request, without them having to re-enter their card information.
I've looked through the Payflow Pro API docs and it looks like there is a feature where you can bill a user's account multiple times if you have the transaction ID that Payflow Pro gives you. However, I'm not sure if this is only meant for merchants to make adjustments to an existing order (such as when the customer later wishes to add an additional item). And I don't think python-payflowpro supports this.
Has anyone used Payflow this way, to store credit cards online and make on-demand payments against them? Is there an existing Python API for this, whether python-payflowpro or something else? Or do I have to roll my own API for this?
I'm pretty new to payflow, so maybe I'm missing something obvious. Was wondering how other people approached this situation.
Thank you for reading and for your consideration.
Joe
This is the python-payflowpro package that I am currently using:
https://github.com/bkeating/python-payflowpro/blob/master/payflowpro/tests/client.py
Looking closer at the API source code, I found an undocumented function called reference_transaction. It seems to make use of PayPal's Reference Transactions feature, allowing you to store credit cards online and charge them on an ad-hoc basis.
https://github.com/bkeating/python-payflowpro/blob/master/payflowpro/client.py#L259
With a little digging, I found a way to utilize this API method, but I had to do a few tricks to pass in the proper arguments. I've documented them here:
https://github.com/bkeating/python-payflowpro/issues/5