I'm on Google App Engine, trying to call a specific method of the Monitoring API via the Google API Client. When I call timeSeries.list with interval.startTime, the error is SyntaxError: keyword can't be an expression. When I replace interval.startTime and interval.endTime with interval=intervalobj, the error is:
File "/base/data/home/apps/e~bwm2-bgi/scaler:
scaling-readmon.412218217025616715/lib/googleapiclient/discovery.py",
line 716, in method raise TypeError
('Got an unexpected keyword argument "%s"' % name)
TypeError: Got an unexpected keyword argument "interval"
I used the Compute API in the same manner, with interval=intervalobj, and it worked. Any tip is appreciated.
CODE:
import webapp2
import logging
from google.appengine.ext import vendor
vendor.add('lib')
from google.appengine.api import app_identity
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

monitoring = discovery.build('monitoring', 'v3', credentials=GoogleCredentials.get_application_default())

class Scaler(webapp2.RequestHandler):
    def post(self):
        '''
        req = monitoring.projects().metricDescriptors().list(name='projects/PROJ')
        res = req.execute()
        logging.info(res)
        '''
        intervalobj = {
            'startTime': '2018-08-10T11:01:23.045123456Z',
            'endTime': '2018-08-10T11:01:23.045123456Z'
        }
        res = monitoring.projects().timeSeries().list(
            name='projects/bwm2-bgi',
            filter='metric.type="appengine.googleapis.com/http/server/response_style_count"',
            interval.startTime='2018-08-10T11:01:23.045123456Z',
            interval.endTime='2018-08-28T11:01:23.045123456Z').execute()
        logging.info(res)

app = webapp2.WSGIApplication([
    ('/scaler', Scaler)
], debug=True)
Using interval_startTime and interval_endTime in place of interval.startTime and interval.endTime worked for me:
request = monitor.projects().timeSeries().list(
    name=project_name,
    interval_startTime='2019-03-19T06:00:00.045123456Z',
    interval_endTime='2019-03-19T07:00:00.045123456Z',
    filter='metric.type="appengine.googleapis.com/http/server/response_style_count"')
I believe setting an interval with interval.endTime applies when specifying points with the monitoring_v3.MetricServiceClient TimeSeries client, but not with the discovery-based resource.
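For comparison, here is a rough sketch of that monitoring_v3 route (not from the original post). It assumes you can install the separate google-cloud-monitoring package and uses its pre-2.0 call style, which changed in later releases; the project ID and filter are copied from the question and the one-hour window is arbitrary:

# Sketch only: the standalone google-cloud-monitoring client, not the discovery resource.
# Pre-2.0 API surface; newer releases changed these call signatures.
import time
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = client.project_path('bwm2-bgi')

interval = monitoring_v3.types.TimeInterval()
now = int(time.time())
interval.end_time.seconds = now
interval.start_time.seconds = now - 3600  # arbitrary one-hour window

results = client.list_time_series(
    project_name,
    'metric.type="appengine.googleapis.com/http/server/response_style_count"',
    interval,
    monitoring_v3.enums.ListTimeSeriesRequest.TimeSeriesView.FULL)
for series in results:
    print(series)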
According to the Authlib documentation, there is a built-in approach that uses the introspection endpoint to validate a given token when the resource server has no access to the token database:
import requests
from authlib.oauth2.rfc7662 import IntrospectTokenValidator
from your_project import secrets

class MyIntrospectTokenValidator(IntrospectTokenValidator):
    def introspect_token(self, token_string):
        url = 'https://example.com/oauth/introspect'
        data = {'token': token_string, 'token_type_hint': 'access_token'}
        auth = (secrets.internal_client_id, secrets.internal_client_secret)
        resp = requests.post(url, data=data, auth=auth)
        resp.raise_for_status()
        return resp.json()
We can then register this token validator with the resource protector:
require_oauth = ResourceProtector()
require_oauth.register_token_validator(MyIntrospectTokenValidator())
When I use @require_oauth for my API routes, I get the following error:
TypeError: 'ResourceProtector' object is not callable
Can someone help please?
Source: https://docs.authlib.org/en/latest/specs/rfc7662.html#use-introspection-in-resource-server
UPDATE: The problem has been found: ResourceProtector was imported from the wrong module. The correct import source is:
authlib.integrations.flask_oauth2
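For completeness, a minimal sketch of the working setup with that import (the /api/profile route and the Flask app object are placeholders; require_oauth and MyIntrospectTokenValidator come from the snippets above):

from authlib.integrations.flask_oauth2 import ResourceProtector

require_oauth = ResourceProtector()
require_oauth.register_token_validator(MyIntrospectTokenValidator())

@app.route('/api/profile')   # placeholder route on an existing Flask app
@require_oauth()             # pass a scope string if needed, e.g. require_oauth('profile')
def profile():
    return 'profile data'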
I am trying to access a .NET WCF web service from Python and am getting the following error - would anyone be able to let me know what I am doing wrong:
File "C:\temp\anaconda3\lib\site-packages\suds\mx\literal.py", line 87, in start
raise TypeNotFound(content.tag)
suds.TypeNotFound: Type not found: 'authToken'
Below is the python code that I have:
import logging
import uuid
from suds.client import Client
from suds.xsd.doctor import Import, ImportDoctor
url = 'http://something/something?wsdl'
imp = Import('http://www.w3.org/2001/XMLSchema', location='http://www.w3.org/2001/XMLSchema.xsd')
imp = Import('http://schemas.xmlsoap.org/soap/encoding/')
imp = Import('http://schemas.xmlsoap.org/soap/encoding/')
imp.filter.add('http://tempuri.org/')
doctor = ImportDoctor(imp)
client = Client(url, doctor=doctor, headers={'Content-Type': 'application/soap+xml'})
logging.basicConfig(level=logging.INFO)
logging.getLogger('suds.client').setLevel(logging.DEBUG)
logging.getLogger('suds.transport').setLevel(logging.DEBUG)
logging.getLogger('suds.xsd.schema').setLevel(logging.DEBUG)
logging.getLogger('suds.wsdl').setLevel(logging.DEBUG)
client.set_options
myMethod = client.factory.create('myMethod')
myMethod.authToken = uuid.UUID('xxxxxxxx-35f4-4b7b-accf-yyyyyyyyyyyy')
print(f'CLIENT: {client}')
print(f'myMethod: {myMethod}')
ls_Token = client.service.myMethod(myMethod)
print(f'ACCESSTOKEN: {ls_Token}')
Create the ResponseData object; the type is defined in the WSDL. If there are multiple schemas, you need to add a prefix, such as ns0, ns1, etc.:
ResponseData = client.factory.create('ns1:ResponseData')
ResponseData.token = "Test"
Make sure that the properties you set on the object actually exist. You can view the properties of the object after successfully creating it:
ResponseData = client.factory.create('ns1:ResponseData')
ResponseData.token = "Test"
print(ResponseData)
The output of print(ResponseData) lists the properties of the ResponseData object.
If I use the following code I will get the same error as you:
ResponseData = client.factory.create('ns1:ResponseData')
ResponseData.authToken = "Test"
So you need to check whether the myMethod object has an authToken property.
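One way to check, assuming the client built in the question (printing a suds Client summarises the services, methods, and prefixed types it found in the WSDL, and printing a factory-created object lists the fields suds expects):

print(client)  # lists services, methods and (prefixed) types parsed from the WSDL
req = client.factory.create('ns0:myMethod')  # the ns0/ns1 prefix depends on your WSDL
print(req)     # shows the available fields, e.g. whether authToken is actually one of them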
We could not get it to work with the SOAP WCF service and had to switch to a REST-based service to make it work. Python seems to work better with REST services.
I am trying to generate Swagger documentation for the API I am building in Flask. However, when I execute the code below I get a very strange error. What am I missing here?
This is what I have tried:
try:
    from flask import Flask, g, request
    from flask_restful import Resource, Api
    from flask_limiter.util import get_remote_address
    from flask_limiter import Limiter
    from flasgger import Swagger
    from flasgger.utils import swag_from
    from flask_restful_swagger import swagger
    from flask_httpauth import HTTPTokenAuth
    import os
    import json
except Exception as e:
    print("some modules are missing {}".format(e))

app = Flask(__name__)
api = Api(app)
limiter = Limiter(app, key_func=get_remote_address)
limiter.init_app(app)
api = swagger.docs(Api(app), apiVersion='0.1', api_spec_url='/doc')
auth = HTTPTokenAuth(scheme='Token')

tokens = {
    'awserftggyjkk34)ghtyrjrhhye34nnmw': 'cadmin',
    'bwsosjhee(dhj345gtyuioplsertbsjkl': 'dadmin',
    'mnchthas(sjklertyusvfgmbshdls234h': 'eadmin'
}

@auth.verify_token
def verify_token(token):
    if token in tokens:
        g.current_user = tokens[token]
        return True
    return False

class defr(Resource):
    decorators = [limiter.limit("100/day")]

    @swagger.model
    @swagger.operation(notes='my notes ntes')
    @auth.login_required
    def currentuser(self):
        return "Hello, %s!" % g.current_user

api.add_resource(defr, '/user')
Error:
File "C:\Users\codamd\AppData\Local\Continuum\anaconda3\lib\site-packages\flask_restful_swagger\swagger.py", line 294, in extract_operations
for method in [m.lower() for m in resource.methods]:
TypeError: 'NoneType' object is not iterable
Any help would be great
The problem here is that you've used a custom-named method in your defr class. The Flask-RESTful documentation specifies defining HTTP verb methods (get, post, etc.) on your Resource class, e.g.:
class defr(Resource):
    decorators = [limiter.limit("100/day")]

    @swagger.model
    @swagger.operation(notes='my notes ntes')
    @auth.login_required
    def get(self):
        return "Hello, %s!" % g.current_user
I have a Python Flask application that uses the requests library. I am trying to implement OpenTracing in the application. A simplified version is:
from flask_opentracing import FlaskTracer
from flask import Flask
from jaeger_client import Config
from opentracing_instrumentation.client_hooks import install_all_patches
from os import getenv
import requests

JAEGER_HOST = getenv('JAEGER_HOST', 'localhost')

def initialise_tracer():
    config = Config(
        config={
            'local_agent': {
                'reporting_host': 'jaeger'
            }
        },
        service_name='opentracing_test',
        validate=True
    )
    return config.initialize_tracer()

app = Flask(__name__)
tracer = FlaskTracer(initialise_tracer, trace_all_requests=True, app=app)
install_all_patches()

@app.route('/foo')
def foo():
    r = requests.get('http://localhost/bar')
    return 'foo : ' + r.text

@app.route('/bar')
def bar():
    return 'bar'

if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0', port=80)
All endpoints are traced because trace_all_requests is passed to FlaskTracer, and all requests made with the requests library are traced because of install_all_patches() from opentracing_instrumentation.
My '/foo' endpoint makes a request to '/bar'. When I access the '/foo' endpoint and view the traces, I can see two separate traces: One containing just '/foo', and a second that contains both the requests GET and the call to '/bar'. I would have expected the spans to propagate through the entire process and therefore I should see a single trace with '/foo' -> GET -> '/bar'.
Do I need to manually set the headers on each of my requests to pass the span? It looks like that is exactly what the requests monkey patch should do...
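For reference, a rough sketch of what manual propagation would look like if the patch is not picking up the Flask-created span (this replaces the /foo handler above and assumes flask_opentracing's get_span() helper plus the standard opentracing inject API; it is not taken from the original post):

import opentracing
from opentracing.propagation import Format

@app.route('/foo')
def foo():
    span = tracer.get_span()  # FlaskTracer's span for the current request
    headers = {}
    # jaeger_client's Config.initialize_tracer sets the global opentracing.tracer
    opentracing.tracer.inject(span.context, Format.HTTP_HEADERS, headers)
    r = requests.get('http://localhost/bar', headers=headers)
    return 'foo : ' + r.text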
Here is the error that I'm getting while trying to make an authenticated call to BigQuery:
HttpError: <HttpError 400 when requesting https://www.googleapis.com/bigquery/v2/projects/ClientId/datasets/samples/tables/natality?alt=json returned "Invalid project ID 'ClientId'. Project IDs must contain 6-63 lowercase letters, digits, or dashes. IDs must start with a letter and may not end with a dash.">
Here is my main.py
import httplib2
import os

from google.appengine.api import memcache
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app
from oauth2client.appengine import oauth2decorator_from_clientsecrets
from bqclient import BigQueryClient

PROJECT_ID = "########"  # this is the Client Id
DATASET = "samples"
TABLE = "natality"

CLIENT_SECRETS = os.path.join(os.path.dirname(__file__),
                              'client_secrets.json')

http = httplib2.Http(memcache)
decorator = oauth2decorator_from_clientsecrets(CLIENT_SECRETS,
    'https://www.googleapis.com/auth/bigquery')
bq = BigQueryClient(http, decorator)

class MainHandler(webapp.RequestHandler):
    @decorator.oauth_required
    def get(self):
        self.response.out.write("Hello Dashboard!\n")
        modTime = bq.getLastModTime(PROJECT_ID, DATASET, TABLE)
        if modTime is not None:
            msg = 'Last mod time = ' + modTime
        else:
            msg = "Could not find last modification time.\n"
        self.response.out.write(msg)

application = webapp.WSGIApplication([
    ('/', MainHandler),
    (decorator.callback_path, decorator.callback_handler())
], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()
And here is the app.yaml
application: hellomydashboard
version: 1
runtime: python
api_version: 1

handlers:
- url: /favicon\.ico
  static_files: favicon.ico
  upload: favicon\.ico

- url: .*
  script: main.py
And here is the bqclient.py
import httplib2
from apiclient.discovery import build
from oauth2client.appengine import oauth2decorator_from_clientsecrets

class BigQueryClient(object):
    def __init__(self, http, decorator):
        """Creates the BigQuery client connection"""
        self.service = build('bigquery', 'v2', http=http)
        self.decorator = decorator

    def getTableData(self, project, dataset, table):
        decorated = self.decorator.http()
        return self.service.tables().get(projectId=project, datasetId=dataset,
                                         tableId=table).execute(decorated)

    def getLastModTime(self, project, dataset, table):
        data = self.getTableData(project, dataset, table)
        if data is not None and 'lastModifiedTime' in data:
            return data['lastModifiedTime']
        else:
            return None

    def Query(self, query, project, timeout_ms=10000):
        query_config = {
            'query': query,
            'timeoutMs': timeout_ms
        }
        decorated = self.decorator.http()
        result_json = (self.service.jobs()
                       .query(projectId=project, body=query_config)
                       .execute(decorated))
        return result_json
I also tried replacing the Client ID with the Project ID as the error suggests, but it gives another error:
HttpError: <HttpError 404 when requesting https://www.googleapis.com/bigquery/v2/projects/hellodashboard87/datasets/samples/tables/natality?alt=json returned "Not Found: Dataset hellodashboard87:samples">
I'm following the tutorial on this page
https://developers.google.com/bigquery/articles/dashboard#firstcall
In order to use the public data sets offered by Google's BigQuery, use the following parameters:
Project ID: publicdata
Dataset ID: samples
Table ID: natality (or whatever you want to use)
In order to use any data sets that you own, switch your Project ID to the one found in the API Console dashboard.
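Applied to the code in the question, that would look something like this (a sketch reusing the bq helper and the constants from main.py):

PROJECT_ID = "publicdata"   # project that hosts Google's public sample tables
DATASET = "samples"
TABLE = "natality"

modTime = bq.getLastModTime(PROJECT_ID, DATASET, TABLE)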
In order to use BigQuery, you must create a project in the APIs Console that has BigQuery enabled (I'm assuming you have done this). Once you've created the project, you'll be able to get the project number from the URL, e.g.
https://code.google.com/apis/console/#project:12345XXXXXXX
In the example, the project number is 12345XXXXXXX and this is the value you would use for PROJECT_ID.