Google BigQuery execute() method - python

I was looking at the Google BigQuery API reference sample Python code, and I came across the execute() call.
Can anyone provide me with documentation on what this call does?

You linked to code of the form:
bigquery.jobs().insert(...).execute()
The .jobs() call returns an object representing the BigQuery jobs collection.
The .insert(...) call creates a request object representing a (future) call to the BigQuery jobs.insert API method with the specified parameters. Note that this only constructs the request; it does not actually send it.
The .execute() call actually sends the API request to the BigQuery API and returns the response.
Note that this is an automatically generated Python client library for the BigQuery API. The structure of these client libraries is similar across all Google APIs.
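To make the build/execute split concrete, here is a minimal sketch using google-api-python-client; the project ID and query are placeholders, and credentials are assumed to come from Application Default Credentials:

import google.auth
from googleapiclient.discovery import build

# Pick up Application Default Credentials (gcloud auth, service
# account, etc.); "my-project" is a placeholder project ID.
credentials, _ = google.auth.default()
service = build("bigquery", "v2", credentials=credentials)

# .jobs() returns the jobs collection; .insert(...) only builds an
# HttpRequest object. Nothing goes over the wire yet.
request = service.jobs().insert(
    projectId="my-project",
    body={"configuration": {"query": {"query": "SELECT 1"}}},
)

# .execute() is where the HTTP call to the BigQuery API happens.
response = request.execute()
print(response["jobReference"])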

This page hints at what execute might do in certain scenarios...
https://cloud.google.com/bigquery/querying-data
But it really should be obvious... It executes your query, most likely over HTTP.
If you have all of the API source in front of you, you should be able to find where execute is defined on certain classes.
From what I can tell, execute() is a method inherited by many request types.

Related

Get data from Scroll in HTTP request API (Elasticsearch)

I'm trying to write code in Python to get all the data from an API through an HTTP request.
I am wondering if there is a way to use the _scroll_id and its contents in Python. If so, how do I implement it, or could you share some documentation regarding it?
All the documentation regarding Elasticsearch in Python uses a localhost...
Any leads would be highly appreciated.
Elasticsearch has an official Python client library (elasticsearch-py) for talking to the cluster.
You can use the scan() helper function. Internally, it calls the scroll API, so you don't have to worry about any of that.
For the last part of your question, you'd have to follow the documentation to see how to connect to a remote cluster instead of localhost.
https://elasticsearch-py.readthedocs.io/en/v8.3.3/helpers.html#scan
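As a rough sketch (the cluster URL, credentials, and index name below are all placeholders for your own deployment), scan() hides the scroll bookkeeping entirely:

from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan

# Connect to a remote cluster instead of localhost.
es = Elasticsearch(
    "https://my-cluster.example.com:9243",
    basic_auth=("user", "password"),
)

# scan() drives the scroll API internally and yields every hit,
# so you never touch _scroll_id yourself.
for hit in scan(es, index="my-index", query={"query": {"match_all": {}}}):
    print(hit["_source"])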

How to cache authentication in a python API

Our Python API has a structure where consumers call our main API, which in turn calls many APIs internally to get data. Since we call around 5-8 internal APIs depending on consumer needs, the response time is more than 1 second. We traced the flow and found that the authentication step in each internal API takes the longest and hence bumps up the response time. Are there any dependencies we can add to cache the authentication so that we don't need to authenticate for each internal API call?
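One common approach, assuming the internal APIs use token-based auth with a known lifetime, is to memoize the token rather than re-authenticating per call. A minimal sketch with cachetools (every name here is hypothetical):

import time
from cachetools import TTLCache, cached

def authenticate_against(service_name):
    # Placeholder for the real (slow) auth call to an internal API.
    time.sleep(1)
    return "token-for-" + service_name

# Keep tokens for 5 minutes; the TTL should stay below the token's
# actual expiry so a cached token is never stale.
_token_cache = TTLCache(maxsize=32, ttl=300)

@cached(_token_cache)
def get_token(service_name):
    # First call per service is slow; subsequent calls hit the cache.
    return authenticate_against(service_name)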

Django - repeatedly send API call result via websocket on events (REST Framework + Channels)

I ran into a problem while integrating Django REST Framework with Django Channels.
I have a viewset with a retrieve (GET) method that prepares information from several different models in a tricky way and sends this "complex" result to the frontend. So when a client sends a GET request with an entity primary key to this endpoint (like /complex_entity/1), it instantly receives everything it needs.
And now the frontend folks want another feature: the backend should be able to send the result of this complex request to the frontend each time any of the relevant underlying models changes. Like this: the browser subscribes to changes of ComplexEntity with primary key 1, and when ComplexEntity 1 changes (or its linked entities, which is not a problem), the server sends the result of this complex request via websocket. So the request can be executed many times during one websocket connection (on each model change signal).
I see two intuitive ways to provide this behaviour:
Good(?): somehow execute requests to this viewset's retrieve method from Django itself, either by calling the method internally or by executing a "loopback" HTTP request.
Bad/ugly: copy all the complex logic from the viewset's retrieve method into the websocket consumer.
Also I've found Django Channels REST Framework, which allows subscribing to a model entity, but the problem is I need to return not just a model instance but this "custom" result glued together from several models. As I understand it, DCRF lacks that feature.
For now I don't really know what the best way to solve my problem is; calling the method internally looks OK, but how do I do it?
A loopback HTTP request is OK too (I think), but it should be untied from the site hostname, and sanity says it's better to forward the "originator" cookies on such a request to prevent unauthorized access to entities. The question is, again, how to do this right.
So does anybody know the best way to execute the same complex request several times during one websocket connection?
The proper way would be to move the common logic into a reusable method and use it in both the DRF view and the Channels consumer.
That method will receive some arguments (I guess the ComplexEntity's ID) and return the result data in the format you need.
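A minimal sketch of that layout (the model, serializer, and event names are hypothetical):

# services.py -- shared logic used by both the view and the consumer.
def build_complex_entity_payload(entity_id):
    # ComplexEntity and ComplexEntitySerializer are placeholder names.
    entity = ComplexEntity.objects.get(pk=entity_id)
    payload = ComplexEntitySerializer(entity).data
    # ...glue in data from the other related models here...
    return payload

# views.py -- the DRF endpoint stays a thin wrapper.
from rest_framework import viewsets
from rest_framework.response import Response

class ComplexEntityViewSet(viewsets.ViewSet):
    def retrieve(self, request, pk=None):
        return Response(build_complex_entity_payload(pk))

# consumers.py -- the websocket side reuses the same function.
from channels.generic.websocket import JsonWebsocketConsumer

class ComplexEntityConsumer(JsonWebsocketConsumer):
    # Invoked via the channel layer whenever a model-change signal fires.
    def entity_changed(self, event):
        self.send_json(build_complex_entity_payload(event["entity_id"]))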

Robot Framework - AWS API Gateway secured by IAM

Background
I've been using Robot Framework and RequestsLibrary to write automated tests against RESTful endpoints I expose via AWS API Gateway. So I'm writing tests that look roughly like this:
*** Settings ***
Library           Collections
Library           RequestsLibrary

*** Test Cases ***
Get Requests
    Create Session    Sess    https://<api-gateway-url>
    ${resp}=    Get Request    Sess    /path/to/my/api?param=value
    Should Be Equal As Strings    ${resp.status_code}    200
    Dictionary Should Contain Value    ${resp.json()}    someValueIwantToVerify
Now, I'm getting around to securing those API Gateway endpoints with IAM. Therefore requests need to be signed with AWS Signature Version 4 (SigV4).
The application that consumes these services is written in JavaScript and uses aws-api-gateway-client to sign requests. Testing manually in Postman is also easy enough, using the AWS Signature authorization type. However, I'm struggling to figure out a strategy for Robot Framework.
Question(s)
In the broadest sense, I'm wondering if anyone else is using Robot Framework to test IAM secured API Gateway endpoints. If so, how did you pull it off?
More specifically:
Is there an existing Robot Framework library that addresses this use case?
If not, is writing my own library my only option?
If I am stuck writing a library (this looks promising), what sorts of keywords would I define, and how would I use them?
I don't know of an RF library that does what you want, but my first instinct would be to use Amazon's own AWS SDK for Python (boto3), and write a thin keyword wrapper library around it. I've done that for test cases that used AWS S3, but boto3 also supports API Gateway: http://boto3.readthedocs.io/en/latest/reference/services/apigateway.html
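For illustration, such a wrapper can be very small; a sketch (the class and method names are my own, and Robot Framework exposes each public method as a keyword, e.g. Get Rest Apis):

# AWSGatewayKeywords.py -- a thin RF keyword library wrapping boto3.
import boto3

class AWSGatewayKeywords:
    def __init__(self):
        # Region and credentials come from the usual boto3 config chain.
        self._client = boto3.client("apigateway")

    def get_rest_apis(self):
        # Returns the list of API Gateway REST APIs in the account.
        return self._client.get_rest_apis()["items"]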
The requests library accepts an auth parameter, which is expected to be a subclass of AuthBase.
It turns out that modifying RequestsLibrary to leverage this existing functionality was a grand total of two lines of code (not counting comments, tests, and whitespace for readability):
def create_custom_session(self, alias, url, auth, headers={}, cookies=None,
                          timeout=None, proxies=None, verify=False, debug=0,
                          max_retries=3, backoff_factor=0.10, disable_warnings=0):
    return self._create_session(alias, url, headers, cookies, auth, timeout,
                                max_retries, backoff_factor, proxies, verify,
                                debug, disable_warnings)
Sometimes the answer is just to issue your own pull request.
The new Create Custom Session keyword will accept any arbitrary auth object, including the one produced by the library I'm using (which I highly recommend).
The original issue now contains the details of how aws-requests-auth and RequestsLibrary work together with the new keyword.
@the_mero mentioned boto3, which is almost certainly what you want to use to actually marshal up the credentials. You don't want to hard-code them in your test like the simple examples in the issue/pull request do.
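Putting the two answers together, a sketch of building such an auth object with aws-requests-auth, letting boto3 resolve the credentials (the API Gateway host and region below are placeholders):

import boto3
from aws_requests_auth.aws_auth import AWSRequestsAuth

# Resolve credentials from the normal boto3 chain (env vars,
# shared profile, instance role) rather than hard-coding them.
creds = boto3.Session().get_credentials()

auth = AWSRequestsAuth(
    aws_access_key=creds.access_key,
    aws_secret_access_key=creds.secret_key,
    aws_token=creds.token,
    aws_host="abc123.execute-api.us-east-1.amazonaws.com",
    aws_region="us-east-1",
    aws_service="execute-api",
)
# This object can then be passed as the auth argument of the
# Create Custom Session keyword.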

Request XML from SUDS python

Is there any method to get the SOAP request XML before triggering a call to a SOAP method in the Suds library?
client.last_sent() returns the request XML after the call has been triggered, but I want to see it before triggering the call.
Yes, this is possible, and it seems to be used in various "fixer" implementations to work around buggy servers. Basically, you should write a MessagePlugin and implement the sending method.
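A minimal sketch of such a plugin (the WSDL URL is a placeholder):

from suds.client import Client
from suds.plugin import MessagePlugin

class LogRequestPlugin(MessagePlugin):
    # sending() runs just before the envelope goes on the wire,
    # so you can inspect (or modify) the raw XML here.
    def sending(self, context):
        print(context.envelope)

client = Client("http://example.com/service?wsdl",
                plugins=[LogRequestPlugin()])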
