I'm building a tool in Python for which I need to read out error codes for specific devices using the Server-Eye API. Server-Eye is our monitoring solution, where all of our devices and the devices of our customers are registered. The documentation at https://api.server-eye.de/ wasn't very helpful, or I'm just not finding what I need. Does anybody have any experience with the Server-Eye API?
The Server-Eye support wasn't helpful either. My request apparently was so exotic that they have to discuss it in one of their weekly meetings. No answer from them yet.
What I am able to do is read out the customers in our tenant and the devices registered to a customer. I'm also able to format the data the API returns, which is a huge pain. I just can't seem to find out how to read which sensors are applied to a device or what errors those sensors found.
A typical request looks like this:
requests.get(f'https://api.server-eye.de/2/customer/{cId}/containers', params={'apiKey': apiKey})
cId would be the ID of the customer you want to read from.
apiKey is an authorization token which can be generated in the webconsole.
This request returns a response object, and you read its contents with something like .json() or .text.
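Put together, a minimal sketch of that flow looks like this (assuming the token is accepted as an apiKey query parameter, as in the request above; the key and customer ID are placeholders):

import requests

API_KEY = 'your-api-key'          # token generated in the Server-Eye webconsole
CUSTOMER_ID = 'your-customer-id'  # a cId read from the tenant

# List the containers (devices) registered to one customer.
response = requests.get(
    f'https://api.server-eye.de/2/customer/{CUSTOMER_ID}/containers',
    params={'apiKey': API_KEY},
)
response.raise_for_status()
containers = response.json()  # parse the JSON body of the response object
print(containers)             # check the exact shape against the API docs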
Any help is appreciated; I'm starting to get really frustrated here as my deadline is slowly approaching.
I have a JSON file full of event data that I need to send into Snowplow in Python using an Iglu webhook, but I'm having trouble finding any solid guidance on this. Most of the documentation I've been able to find relates to tracking specific events as they happen, but I need to backfill historical data in the same manner I'll send forward-looking data, hence having to send a large JSON with the activity history at the outset.
Is this possible using snowplow/python/iglu or am I approaching the problem incorrectly?
This question is getting old and OP may have moved on, but I'll leave an answer for anyone else who might stumble upon it.
A Snowplow collector (e.g. the stream-collector) receives data over HTTP. In theory, any method of sending an HTTP request should work; however, there are specific SDKs that address common use cases. For Python specifically, there is the snowplow-python-tracker. You can refer to the full documentation here: Snowplow Python Tracker Docs.
You do not need to be using an Iglu webhook. You can point your Python tracker instance directly at your collector via the existing request paths, which are documented here. Yes, one of these paths is for requests via the Iglu webhook adapter, but that is meant for specific situations where you don't control the environment in which the tracker is instantiated, e.g. third-party vendor systems.
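For illustration, a minimal sketch of backfilling a JSON file of historical events with the Python tracker. The collector host, schema URI, and file name are hypothetical, and constructor signatures differ between tracker versions (this follows the classic 0.x API), so check the docs for yours:

import json
from snowplow_tracker import Emitter, Tracker, SelfDescribingJson

emitter = Emitter("collector.example.com")  # hypothetical collector host
tracker = Tracker(emitter)

with open("activity_history.json") as f:    # the large JSON of historical activity
    events = json.load(f)

for event in events:
    # Wrap each event in a self-describing JSON with its Iglu schema URI.
    tracker.track_self_describing_event(
        SelfDescribingJson("iglu:com.acme/activity/jsonschema/1-0-0", event)
    )

tracker.flush()  # flush any events still buffered in the emitter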
I have recently started developing an application to analyse my all-time exercises in the Polar platform.
I'm using their Accesslink API to get new sessions and I have exported my old sessions through another service they offer.
The exported sessions come with fully detailed information (instant GPS location, speed, heart rate), but the JSON data provided by the API is just a summary. I am looking for a way to get the initial GPS position of a session so that I can later find the city's name from another source. I think the only way to do this is by getting the GPS data of my sessions.
Although the sessions have a has-route field, I cannot find a way in their documentation to request this route. They have provided a working example, but it does not show how to get this data.
Does anyone know if this is possible and, if so, could you please give me some directions?
Thanks in advance.
It turns out that the GPS information is provided through GPX files, which are available from the API mentioned in the question. There is a method on their GitHub (link also in the question) that already performs this task. I have added a call to this method and saved its output in my project.
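For anyone who wants to do it with plain requests instead of the example code, here is a sketch of the GPX call. The IDs and token are placeholders, and the endpoint path is the exercise-transaction GPX route from the AccessLink docs, so verify it against the current documentation:

import requests

ACCESS_TOKEN = "your-oauth2-token"  # from the AccessLink OAuth2 flow
USER_ID = "12345"                   # hypothetical IDs from the transaction listing calls
TRANSACTION_ID = "67890"
EXERCISE_ID = "111"

url = (f"https://www.polaraccesslink.com/v3/users/{USER_ID}"
       f"/exercise-transactions/{TRANSACTION_ID}/exercises/{EXERCISE_ID}/gpx")
response = requests.get(url, headers={
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Accept": "application/gpx+xml",
})
response.raise_for_status()
with open("exercise.gpx", "wb") as f:
    f.write(response.content)  # the first <trkpt> is the session's starting position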
I have never used an API before, but I am trying to learn how to use Scopus for a project I'm doing with a few colleagues. I have gotten about this far:
import requests

response = requests.get("https://api.elsevier.com/content/search/scopus/",
                        headers={'Accept': 'application/json',
                                 'X-ELS-APIKey': '[My_API_Key]'})
I keep getting a 400 error in response to this, even though my API key is valid and I've entered it correctly. I'm guessing I'm getting the error because the query is too broad, since I am searching all of Scopus instead of looking for a specific author ID or ISSN.
I want to run queries to get all of the author data for a handful of specific ISSNs. As someone who is very uncomfortable using Python (I'm a web developer, not a Python programmer) and also someone who has never used an API themselves, I have no idea how to proceed from here. I've read the guides provided by Elsevier, but as I don't understand this stuff, I haven't found them helpful at all. I've also watched and read some tutorials about APIs, but none of them have helped me figure out how to make an actual specific request.
If any of you have used Scopus before, can you please tell me how to make a request based on the parameters I need? Am I supposed to put the ISSN at the end of the URL? If so, how should I format it, and how do I specify what other data I want for that specific ISSN?
I apologize for the lack of specificity in this particular question; I am just completely lost here. I am currently using Jupyter notebooks to write and run my code.
As other responders have said, you are on the right track; however, your query is essentially blank, in that you are not actually asking the API for anything.
This request
"https://api.elsevier.com/content/serial/title/issn/[ISSN]?apiKey=[My_API_Key]"
will bring back general metadata about an ISSN, but to go deep with author and institution metadata, you should start with a Scopus Search API query:
"https://api.elsevier.com/content/search/scopus?query=issn([ISSN])?apiKey=[My_API_Key]"
The API docs page at https://dev.elsevier.com/api_docs.html has more technical details about the APIs.
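As a sketch, here is the same search in Python, with the key sent as a header instead of a query parameter. The ISSN is a placeholder, and the response fields follow the Scopus Search views, so adjust them to the data you need:

import requests

API_KEY = "[My_API_Key]"
ISSN = "1234-5678"  # hypothetical ISSN

response = requests.get(
    "https://api.elsevier.com/content/search/scopus",
    params={"query": f"ISSN({ISSN})"},
    headers={"Accept": "application/json", "X-ELS-APIKey": API_KEY},
)
response.raise_for_status()
results = response.json()
# Entries live under search-results/entry in the Scopus Search response.
for entry in results["search-results"]["entry"]:
    print(entry.get("dc:creator"), "-", entry.get("dc:title"))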
In the Azure portal, if one subscription is selected, the cost analysis can be viewed like the following screenshot:
I want to programmatically fetch the information like the one displayed above, maybe using some Python SDK or REST API.
If anybody has any experience with or ideas on this, please help.
After going through the replies, I have gone through the Azure Billing REST API and I am now able to call the Usage Aggregates and RateCard REST APIs.
Following are the results of those REST calls: Azure Billing Usage Aggregate Response
Azure Billing RateCard Response
But honestly speaking, I still have not figured out how these two would give me a detailed view like the cost analysis does, where the cost associated with each resource is displayed. I am very new to Azure; that is probably why I am missing the link somewhere.
Can somebody give me a hint here?
If you already have the usage and the ratecard data, then you must combine them. Take the meterId of the usage data and get the related ratecard data. The ratecard data contains the MeterRates and the IncludedQuantity, which you must apply. There are often multiple meter rates and an included quantity, because costs usually vary with usage (e.g. the first 10 calls free, the first 3 GB free, ...). The consumption starts, and is reset, on the 14th of the month. That is why you have to read the data for the whole billing period (which begins on the 14th of each month): it is the only way to get the correct consumption.
So, if you are using e.g. Azure Functions with a usage of 100,000 units per day and you want the costs for the 20th to the 30th, the calculation works as follows (a code sketch follows these steps):
Read the data from the 14th to the 30th. That is 17 days, so 1,700,000 units were used.
The first 400,000 units are free; that is the IncludedQuantity (so in this example, the first 4 days). From unit 400,001 on, you have to apply the meter rate (€0.0000134928) and calculate the costs: 1,300,000 × 0.0000134928 ≈ €17.54. Fortunately, Azure Functions has only one rate; if the rate changed, e.g. after 5,000,000 units, you would have to take that into account as well.
Once you have the costs for the whole period, filter down to your date range, the 20th to the 30th, and you will get the result.
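As a minimal sketch of that arithmetic (the unit counts and rate are the hypothetical numbers from above):

# Hypothetical numbers from the example above.
DAILY_USAGE = 100_000        # units consumed per day
INCLUDED_QUANTITY = 400_000  # free units per billing period (from the ratecard)
METER_RATE = 0.0000134928    # euros per unit beyond the included quantity

def cost_since_period_start(days: int) -> float:
    """Cost accumulated since the billing period started on the 14th."""
    total_units = DAILY_USAGE * days
    billable_units = max(0, total_units - INCLUDED_QUANTITY)
    return billable_units * METER_RATE

# cost_since_period_start(17) reproduces the ~17.54 euros from the text.
# Costs for the 20th-30th: whole period up to the 30th minus the part before the 20th.
days_through_30th = 17  # 14th..30th
days_through_19th = 6   # 14th..19th
print(round(cost_since_period_start(days_through_30th)
            - cost_since_period_start(days_through_19th), 2))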
That's the short explanation of the calculation. I implemented it in C# and published it as a NuGet package. The source code is on GitHub; it might help. It also contains a sample console application which you could use to export the data.
Source: https://github.com/codehollow/AzureBillingApi
Blog post: https://codehollow.com/2017/02/using-the-azure-billing-api-to-calculate-the-costs/
I have the same issue. Unluckily, the Python SDK is too hard to use, and you can hardly find a usable sample or example on Google, so I chose the REST API rather than the Python SDK.
In Python, you can start with these imports:

import requests
from azure.common.credentials import ServicePrincipalCredentials

and then set the headers, payload, and URL:

headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + token
}

You can get the token through ServicePrincipalCredentials, which is generated from a client_id, secret, and tenant; credentials.token['access_token'] returns the token you can use in the headers.
You can find the REST APIs at https://learn.microsoft.com/en-us/rest/api/, or press F12 in Chrome while you access the Azure portal to see the calls it makes.
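Putting those pieces together, here is a sketch of one call to the Usage Aggregates endpoint mentioned in this thread. The subscription and service-principal values are placeholders, and the api-version and timestamp format should be checked against the Billing REST docs:

import requests
from azure.common.credentials import ServicePrincipalCredentials

# Placeholder service-principal values.
credentials = ServicePrincipalCredentials(
    client_id="your-client-id",
    secret="your-client-secret",
    tenant="your-tenant-id",
)
token = credentials.token["access_token"]

headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + token,
}

subscription_id = "your-subscription-id"
url = (f"https://management.azure.com/subscriptions/{subscription_id}"
       "/providers/Microsoft.Commerce/UsageAggregates")
params = {
    "api-version": "2015-06-01-preview",
    "reportedStartTime": "2017-01-14T00:00:00Z",  # start of a billing period
    "reportedEndTime": "2017-01-30T00:00:00Z",
}
response = requests.get(url, headers=headers, params=params)
response.raise_for_status()
print(response.json())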
Below are the official documents for retrieving the billing data using the Python SDK or the REST API in Python.
For the Python SDK, please see http://azure-sdk-for-python.readthedocs.io/en/latest/resourcemanagementcommerce.html.
For the Billing REST API, please see https://learn.microsoft.com/en-us/rest/api/billing/; you can use the Python package requests to get these data.
However, you may already know the above. The key point is that you need to follow the tutorial Manage access to billing information for Azure using role-based access control to get the role permission via your account admin.
You may also need to register a client app to get a client id for Resource Management Authentication if you want to use a Service Principal/ADAL (not an AD user/password) in the Python SDK, or to do the same authentication for the REST API as the Azure REST API Reference page describes. For the authentication topic, you can refer to the content map of Manage Apps if you run into any trouble.
Hope it helps. Any concern, please feel free to let me know.
I was hoping to create my own in-house analytics so I can tell my customers how many visits their company page got on my site and which URL the visitors came from. I am coding this in Python (Flask), and I wondered if anyone could tell me the standard, or at least a sensible, approach to this problem.
I think it might be to have some sort of Redis queue which is triggered when a visitor arrives, with the information added to the database later so the site doesn't feel slow.
The standard, and sensible, approach is to use Google Analytics. If you must roll your own, you have two approaches: JavaScript executed on every page (like GA) that sends this kind of info to a DB, or parsing the log files on the server. AWStats is a good bet for the latter.
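If you do roll your own, here is a minimal sketch of the Redis-queue idea from the question. It assumes a local Redis instance and a separate worker that drains the visit_queue list into the database:

import json
import time

import redis
from flask import Flask, request

app = Flask(__name__)
queue = redis.Redis()  # assumes Redis running on localhost

@app.before_request
def record_visit():
    # Push the visit onto a Redis list; a worker persists it to the DB
    # later, so the request itself stays fast.
    visit = {
        "path": request.path,
        "referrer": request.referrer,  # the URL the visitor came from
        "timestamp": time.time(),
    }
    queue.rpush("visit_queue", json.dumps(visit))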