I want to integrate a Slack workflow with ConnectWise Manage. Our employees can submit a workflow via Slack when they need help with a ticket. The workflow requires them to list the ticket details, including the ticket number, which is the value I want to match against ConnectWise; I mainly want to pull reports for training purposes. I have already connected to the Slack API using Python and pulled the data into a dataframe, but I need some guidance on how to build the app that integrates with ConnectWise Manage. Thank you!
Related
I am trying to get the cost of each and every resource in my Azure subscription every day.
I was thinking of using this link, Azure REST API - Where do I find resource costs?, but I don't have the resource group names for the subscriptions.
At the subscription level, Azure provides the ability to track your spending across your whole infrastructure with "Cost Management + Billing". You can preview your cost analysis and see your burn rate over time. There is also new functionality in preview that lets you see cost by resource, for example.
The picture above shows what the cost analysis looks like, to give you an idea.
If you would like to know more: https://learn.microsoft.com/en-us/azure/cost-management-billing/cost-management-billing-overview
EDIT
The above is accessible directly in the Azure portal. I have personally never done this via the REST API, but I was able to find the following, which I believe might help you:
https://learn.microsoft.com/en-us/rest/api/cost-management/
https://learn.microsoft.com/en-us/rest/api/consumption/
https://learn.microsoft.com/en-us/rest/api/consumption/usage-details
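As a rough, untested sketch based on those docs, a query to the Cost Management API (first link) for the daily cost per resource could look something like this in Python; the api-version, token handling, and response shape are assumptions to verify against the documentation:

import requests

subscription_id = "<subscription-id>"
token = "<bearer-token>"  # e.g. obtained via a service principal or azure-identity

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    "/providers/Microsoft.CostManagement/query"
)
body = {
    "type": "Usage",
    "timeframe": "MonthToDate",
    "dataset": {
        "granularity": "Daily",
        "aggregation": {"totalCost": {"name": "PreTaxCost", "function": "Sum"}},
        "grouping": [{"type": "Dimension", "name": "ResourceId"}],
    },
}
resp = requests.post(
    url,
    params={"api-version": "2019-11-01"},
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
resp.raise_for_status()
result = resp.json()["properties"]
print([c["name"] for c in result["columns"]])  # rows are positional; the columns list gives the order
for row in result["rows"]:
    print(row)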
I actually wrote a blog post about this:
You can use the Azure Cost Management connector in Power BI Desktop
You can export daily data in CSV format
I'm building a project using Python and Grafana where I'd like to generate a certain number of copies of certain Grafana dashboards based on certain criteria. I've downloaded the grafanalib library to help me out with that, and I've read through the Generating Dashboards From Code section of the grafanalib website, but I feel like I still need more context to understand how to use this library.
So my first question is: how do I convert a Grafana dashboard JSON model into a Python-friendly format? What method of organization do I use? I saw the dashboard generation function written in the grafanalib documentation, but it looked quite a bit different from how my JSON data is organized. I'd just like some further description of how to do the conversion.
My second question is: once I've converted my Grafana JSON into a Python format, how do I then get the proper information to send that generated dashboard to my Grafana server? I see the "upload_to_grafana" function used in the grafanalib documentation to send the information, and it takes three parameters (json, server, api_key). I understand where it's getting the json parameter from, but I don't get where the server information or the API key come from, or where that information can be found.
This is all being developed on a Raspberry Pi 4, just to put that out there. I'm working on a personal smart agriculture project as a way to develop my coding abilities further, as I'm self-taught. Any help that can be provided to further my understanding is most appreciated. Thank you.
Create an API key in the Grafana configuration. The secret key that you get while creating it is the API key. The server is localhost:3000 in the case of a locally installed Grafana.
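To illustrate, here is a minimal sketch (assuming a recent grafanalib and a local Grafana instance; the dashboard contents, data source name, and API key are placeholders) that defines a dashboard in Python and pushes it to the server over Grafana's HTTP API:

import json
import requests
from grafanalib.core import Dashboard, Graph, Row, Target
from grafanalib._gen import DashboardEncoder

dashboard = Dashboard(
    title="Smart agriculture - soil moisture",
    rows=[
        Row(panels=[
            Graph(
                title="Soil moisture",
                dataSource="prometheus",  # must match a data source configured in Grafana
                targets=[Target(expr="soil_moisture", legendFormat="moisture")],
            ),
        ]),
    ],
).auto_panel_ids()

payload = json.dumps(
    {"dashboard": dashboard.to_json_data(), "overwrite": True},
    cls=DashboardEncoder,
)

GRAFANA_URL = "http://localhost:3000"      # the "server" parameter
API_KEY = "<api-key-from-grafana-config>"  # created under Configuration > API keys

resp = requests.post(
    f"{GRAFANA_URL}/api/dashboards/db",
    data=payload,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
)
resp.raise_for_status()
print(resp.json())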
My company is trying out Google's Recommendations AI using BigQuery exports of Merchant Center and GA data sources. However, we discovered a configuration error in the merchant feed which led to most of the events being unjoined.
I would like to do a new (clean) setup and am looking for the best way to delete the old data. It seems to be possible only via the API?
Secondly, while the UserEventService has a purge function, there doesn't seem to be a similar function for the ProductService.
Is deleting each product one by one the only way to go?
Any pointers and examples (Python) would be greatly appreciated as there seems to be very little documentation about this at this point in time.
As you mentioned, the only way to delete data is through the API. You can use the Google Cloud client libraries or REST requests; however, the library does not have a function to purge all of the product data.
In this case it will be necessary to delete one product at a time by using the delete_prod() function (example).
Nevertheless, as a workaround you can get the IDs of your products with the get_product() function (example) and add them to a collection, then iterate over that collection and pass each value to delete_prod(). That way you can delete all of the product data, but this needs to be reviewed on your side.
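As a rough sketch of that workaround with the google-cloud-retail client (listing the products in a branch and deleting them one by one; the project/location/catalog path is a placeholder, and this permanently deletes data, so review it carefully first):

from google.cloud import retail_v2

client = retail_v2.ProductServiceClient()

# Default branch of the default catalog; adjust project and branch as needed.
parent = (
    "projects/<project-id>/locations/global"
    "/catalogs/default_catalog/branches/default_branch"
)

# list_products pages through every product under the branch.
product_names = [product.name for product in client.list_products(parent=parent)]

for name in product_names:
    client.delete_product(name=name)
    print(f"deleted {name}")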
Additionally, I would like to share further information provided by Google where you can find everything related to the Python library:
Retail Docs API,
Python Retail library, GitHub Repository Retail API
Please keep in mind that Stack Overflow is for specific questions about code, such as errors.
We have a requirement to show Zendesk tickets updated with data from a PostgreSQL database. We are using Python as the scripting language and are planning to use this API (http://docs.facetoe.com.au/zenpy.html) for this.
The idea is to help the service team gather and see all the information in Zendesk itself. There is additional data in the database which we want to show in the tickets, either as comments or as a table structure, with details from other tickets raised by the same user (we are using the user's email address for this).
There is no application in front of our DWH, so most Google references show integrations between Zendesk and other applications, and there is not much about updating tickets from a database via Python or other scripting languages.
So, is it possible to pass the data from our DWH so that it appears in the Zendesk tickets?
Can anyone help or suggest how to achieve this, or how to get started?
It is possible to update tickets from anywhere using Python and some coding.
Your problem can be solved in different ways.
The first one, a little simpler:
You make a simple Python app and launch it with cron. The app architecture will be like this:
The main process periodically tracks new tickets in Zendesk using a search request. If a ticket relevant to the database is found (you need some metric to decide whether a ticket is relevant), the main process posts information from the database via ticket.update and adds a special tag to the ticket so you know it has already been updated.
This is easy to write, but if the data in your database changes later, the ticket will not be updated.
The second option is to make a private app on the Zendesk side with a backend on your side.
In this case, when a staff member opens a ticket, the app will ask the backend for the current data from the database relevant to that ticket. You will see up-to-date information every time, but you will get database requests every time a ticket is opened.
To build the first script you will need:
zenpy, sqlalchemy, and 1-2 days of coding.
To build the second option you will need:
zenpy, sqlalchemy, flask, and a front-end interface.
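To give an idea of the first (cron-driven) option, here is a rough sketch with zenpy and sqlalchemy; the search criteria, the database query, and the db_synced tag are illustrative assumptions, not a finished implementation:

from zenpy import Zenpy
from zenpy.lib.api_objects import Comment
from sqlalchemy import create_engine, text

zenpy_client = Zenpy(subdomain="yourcompany", email="agent@example.com", token="<api-token>")
engine = create_engine("postgresql+psycopg2://user:password@dwh-host/dwh")

for ticket in zenpy_client.search(type="ticket", status="open"):
    if "db_synced" in ticket.tags:        # already updated earlier
        continue
    requester_email = ticket.requester.email
    with engine.connect() as conn:
        rows = conn.execute(
            text("SELECT order_id, status FROM orders WHERE customer_email = :email"),
            {"email": requester_email},
        ).fetchall()
    if not rows:
        continue
    body = "Data from DWH:\n" + "\n".join(f"{r.order_id}: {r.status}" for r in rows)
    ticket.comment = Comment(body=body, public=False)  # internal note
    ticket.tags.append("db_synced")                    # mark as updated
    zenpy_client.tickets.update(ticket)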
In the Azure portal, if a subscription is selected, the cost analysis can be viewed as in the following screenshot.
I want to programmatically fetch information like what is displayed above, perhaps using some Python SDK or REST API.
If anybody has any experience with or ideas about this, please help.
After going through the replies, I have gone through the Azure Billing REST API and I am now able to call the Usage Aggregate and RateCard related REST APIs.
Following are the results of those REST calls: Azure Billing Usage Aggregate Response
Azure Billing Ratecard Response
But honestly speaking, I still have not figured out how these two would give me a detailed view like cost analysis does, where the cost associated with each resource can be displayed. Actually, I am very new to Azure, which is probably why I am missing the link somewhere.
Can somebody give me a hint here?
If you already have the usage and the ratecard data, then you must combine them. Take the meterId of the usage data and get the related ratecard data. The ratecard data contains the MeterRates and the IncludedQuantity, which you must take into account. There are often multiple meter rates and an included quantity because there are different costs per usage tier (e.g. first 10 calls free, 3 GB free, ...). The consumption starts / is reset on the 14th of the month. That is why you have to read the data for the whole billing period (beginning on the 14th of each month), because that is the only way to get the correct consumption.
So, if you are using e.g. Azure Functions and you have a usage of 100,000 units per day and you want the costs from the 20th to the 30th, then the calculation works as follows:
Read the data from the 14th to the 30th. That is 17 days, so 1,700,000 units were used.
The first 400,000 are free = IncludedQuantity (so in this sample, the first 4 days). From the 400,001st unit on, you have to take the meter rate (€0.0000134928) and calculate the costs: 1,300,000 * 0.0000134928 ≈ €17.54. Fortunately, Azure Functions has only one rate. If the rate changes, e.g. after 5,000,000 units, then you also have to take that into account.
Once you have the whole costs, you can filter on your date range, which is the 20th to the 30th, and you will get the result.
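In Python, the arithmetic for the single-rate case above looks roughly like this (numbers taken from the example):

units_per_day = 100_000
days_in_period = 17                 # 14th - 30th
included_quantity = 400_000         # free units from the ratecard
meter_rate = 0.0000134928           # EUR per unit from the ratecard

total_units = units_per_day * days_in_period               # 1,700,000
billable_units = max(0, total_units - included_quantity)   # 1,300,000
cost = billable_units * meter_rate
print(f"{cost:.2f} EUR")  # ~17.54 EUR for the whole period; filter by day afterwards for the 20th-30th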
That's the short explanation of the calculation. I implemented this calculation in C# and published it as a NuGet package. The source code is on GitHub - it will probably help. It also contains a sample console application which you could use to export the data.
Source: https://github.com/codehollow/AzureBillingApi
Blog post: https://codehollow.com/2017/02/using-the-azure-billing-api-to-calculate-the-costs/
I have the same issue. But unfortunately, the Python SDK is too hard to use.
Moreover, you cannot find an available sample or example on Google.
So, I chose to use the REST API rather than the Python SDK.
With Python code, you can first do this:
import requests
from azure.common.credentials import ServicePrincipalCredentials
and then set the headers, payload, and URL:
headers = {
"Content-Type": "application/json",
"Authorization": <token> }
You can get the token through the credentials object, which is generated from client_id, secret, and tenant.
credentials.token will return the token you can use in the headers.
You can find the REST APIs at https://learn.microsoft.com/en-us/rest/api/, or press F12 in Chrome when you access the Azure dashboard to see which calls the portal makes.
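Putting those pieces together, here is a rough sketch (assuming azure-common is installed; I am treating credentials.token as a dict holding the access token, and the endpoint, api-version, and property names are assumptions to double-check against the REST docs):

import requests
from azure.common.credentials import ServicePrincipalCredentials

credentials = ServicePrincipalCredentials(
    client_id="<client-id>",
    secret="<client-secret>",
    tenant="<tenant-id>",
)
token = credentials.token["access_token"]  # verify the attribute name for your library version

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {token}",
}

subscription_id = "<subscription-id>"
url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    "/providers/Microsoft.Consumption/usageDetails"
)

resp = requests.get(url, headers=headers, params={"api-version": "2019-10-01"})
resp.raise_for_status()
for item in resp.json().get("value", []):
    props = item.get("properties", {})
    # Property names vary by api-version (e.g. cost vs pretaxCost), so inspect the raw response first.
    print(item.get("name"), props.get("cost", props.get("pretaxCost")))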
Below are the official documents for retrieving billing data using the Python SDK or the REST API in Python.
For using Python SDK, please see http://azure-sdk-for-python.readthedocs.io/en/latest/resourcemanagementcommerce.html.
For using the Billing REST API, please see https://learn.microsoft.com/en-us/rest/api/billing/, and you can try to use the Python package requests to get this data.
However, I think you may already know the above. The key is that you need to follow the tutorial Manage access to billing information for Azure using role-based access control to get the role permission via your account admin.
Then you may also need to register a client app to get the client ID for Resource Management authentication if you want to use a Service Principal/ADAL (not AD User/Password) in the Python SDK, or use the REST API to do the same authentication as the Azure REST API Reference page describes. For the authentication topic, you can refer to the content map of Manage Apps to learn more if you run into trouble.
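For the SDK route, a rough sketch using the older azure-mgmt-commerce package from the first link might look like the following; the offer ID, dates, locale, and attribute names are assumptions to check against the SDK docs:

from datetime import datetime
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.commerce import UsageManagementClient

credentials = ServicePrincipalCredentials(
    client_id="<client-id>",
    secret="<client-secret>",
    tenant="<tenant-id>",
)
client = UsageManagementClient(credentials, "<subscription-id>")

# Usage aggregates for a reporting window (UTC).
for usage in client.usage_aggregates.list(
    reported_start_time=datetime(2023, 5, 14),
    reported_end_time=datetime(2023, 5, 30),
):
    print(usage.meter_id, usage.quantity)

# Ratecard filtered to your offer, currency, locale, and region.
rate_card = client.rate_card.get(
    "OfferDurableId eq 'MS-AZR-0003P' and Currency eq 'USD' "
    "and Locale eq 'en-US' and RegionInfo eq 'US'"
)
for meter in rate_card.meters:
    print(meter.meter_id, meter.meter_rates, meter.included_quantity)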
Hope it helps. If you have any concerns, please feel free to let me know.