Testing two methods in an Azure function in Python

I am writing tests for my Azure function, and for some reason I can't mock a function call. I should also mention this is the first time I'm writing a Python test case, so be nice :)
import logging

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    try:
        req_body = req.get_json()
    except ValueError as error:
        logging.info(error)
    download_excel(req_body)
    return func.HttpResponse(
        "This HTTP triggered function executed successfully.",
        status_code=200
    )
So that's the initial function: it calls download_excel and passes the request body. The next function receives the request body and writes that Excel file to blob storage.
import os
from typing import Any

from azure.storage.blob import BlobServiceClient

def download_excel(request_body: Any):
    excel_file = request_body["items_excel"]
    # initiate the blob storage client
    blob_service_client = BlobServiceClient.from_connection_string(os.environ["AzureWebJobsStorage"])
    container = blob_service_client.get_container_client(CONTAINER_NAME)
    blob_path = "excel-path/items.xlsx"
    blob_client = container.get_blob_client(blob_path)
    blob_client.upload_blob_from_url(excel_file)
Those are the two functions: receive a file and save it to blob storage. But I can't mock the download_excel call in the main function. I've tried using mock and patch, went through all sorts of links, and I just can't find a way to achieve this. Any help would be appreciated. Here is what I currently have in the test file.
class TestFunction(unittest.TestCase):
    # @patch('download_excel')
    def get_excel_files_main(self):
        """Test main function"""
        req = Mock()
        resp = main(req)
        # download_excel = MagicMock()
        self.assertEqual(resp.status_code, 200)
Commenting out the function call in the function and in the test makes the test pass, but I need to know how to mock the download_excel call. I'm still going to write a test case for the download_excel function itself, but I'll cross that bridge when I get to it.

Figured it out; I'm pretty silly. The main issue was that, since an Azure function has no class, I figured I could ignore every example in the docs that had to do with classes.
The trick is to treat the function folder name like a module. Say you have a function named http_trigger, with an __init__.py file inside that function folder. Within that __init__.py you have your main method and a second method that's called from main; you can then replace that second method with a MagicMock:
import function_name

def test_main_function(self):
    """Testing main function"""
    function_name.second_method_being_called = MagicMock()
That's it. That's how you mock it! *facepalm*
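For reference, here is a fuller sketch of that test, assuming the function folder is named function_name, is importable, and its __init__.py defines both main and download_excel (adjust the names to your project):

import unittest
from unittest.mock import MagicMock, Mock

import function_name  # the Azure Functions folder, imported as a module

class TestFunction(unittest.TestCase):
    def test_main(self):
        """main() should return 200 without touching blob storage."""
        function_name.download_excel = MagicMock()  # replace the real upload
        req = Mock()
        resp = function_name.main(req)
        self.assertEqual(resp.status_code, 200)
        function_name.download_excel.assert_called_once()

unittest.mock.patch.object(function_name, "download_excel") does the same thing but undoes the replacement automatically when the test ends.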

Related

Unit testing lambdas which call DynamoDB in Python

I have lambdas which use boto3.client() to connect to DynamoDB.
I tried to test it like this:
@mock.patch("boto3.client")
def test(self, mock_client, test):
    handler(event, context)
    print(mock_client.call_count)           # 1
    print(mock_client.put_item.call_count)  # 0
However, mock_client.call_count is 1, while mock_client.put_item.call_count stays at 0.
My handler looks like this:
def handler(event, context):
    dynamodb = boto3.client('dynamodb')
    response = dynamodb.put_item(...)  # same attributes
Any suggestion on how to test that the correct item gets put into the database, without using moto?
I believe you're very close; there's just one tiny problem.
When your mocked boto3.client is called, it returns another mock, and you want to evaluate that mock's call_count. By accessing the return_value of the original mock, you get access to the created magic mock.
@mock.patch("boto3.client")
def test(self, mock_client, test):
    handler(event, context)
    print(mock_client.call_count)
    # .return_value refers to the magic mock that's
    # created when boto3.client is called
    print(mock_client.return_value.put_item.call_count)
What you're currently evaluating is the call count of boto3.client.put_item and not boto3.client("dynamodb").put_item().
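With that fix in place you can also assert on the arguments; here is a small sketch (the table name and item are made up, substitute whatever your handler actually writes):

# check *what* was written, not just that something was written
mock_client.return_value.put_item.assert_called_once_with(
    TableName="my-table",          # illustrative
    Item={"id": {"S": "123"}},     # illustrative
)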

Trouble with async context and structuring my code

I am using a library that requires an async context (aioboto3).
My issue is that I can't call methods on my custom S3StreamingFile instance from outside the async with block. If I do so, Python raises an exception telling me that HttpClient is None.
I want to access the class methods of S3StreamingFile from an outer function, for example in an API route. I don't want to return anything more (from file_2.py) than the S3StreamingFile class instance to the caller (file_3.py). The aioboto3-related code can't be moved to file_3.py; file_1.py and file_2.py need to contain the aioboto3-related logic.
How can I solve this?
Example of non-working code:
# file_1.py
class S3StreamingFile():
    def __init__(self, s3_object):
        self.s3_object = s3_object

    async def size(self):
        return await self.s3_object.content_length  # raises exception, HttpClient is None
    ...

# file_2.py
async def get_file():
    async with s3.resource(...) as resource:
        s3_object = await resource.Object(...)
        s3_file = S3StreamingFile(s3_object)
        return s3_file

# file_3.py
async def main():
    s3_file = await get_file()
    size = await s3_file.size()  # raises exception, HttpClient is None
Example of working code:
# file_1.py
class S3StreamingFile():
def __init__(self, s3_object):
self.s3_object = s3_object
async def size(self):
return await self.s3_object.content_length
...
# file_2.py
async def get_file():
async with s3.resource(...) as resource:
s3_object = await resource.Object(...)
s3_file = S3StreamingFile(s3_object)
size = await s3_file.size() # works OK here, HttpClient is available
return s3_file
# file_3.py
async def main()
s3_file = await get_file()
I want to access the class methods from an outer function... how do I solve this?
Don't. This library is using async context managers to handle resource acquisition/release. The whole point of the context manager is that things like s3_file.size() only make sense while you hold the relevant resource (here, the S3 file instance).
But how do you use this data in the rest of your program? In general (since you haven't said what the rest of your program is, or why you want this data) there are two approaches:
1. acquire the resource somewhere else, and then make it available in much larger scopes, or
2. make your other functions resource-aware.
In the first case, you'd acquire the resource before all the logic runs, and then hold on to it. (This might look like RAII.) This might well make sense in smaller scripts, or when a resource is designed to be held by only one process at a time. It's a poor fit for code which will spend most of its time doing nothing, or has to coexist with other users of the resource. (An extension of this is writing your own code as a context manager, and effectively moving the problem up the calling stack. If each code path only handles one resource, this might well be the way to go.)
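For example, here is a minimal sketch of that context-manager extension, assuming aioboto3's Session API and the S3StreamingFile class from the question (the open_streaming_file name and the bucket/key arguments are made up):

from contextlib import asynccontextmanager

import aioboto3

@asynccontextmanager
async def open_streaming_file(bucket: str, key: str):
    session = aioboto3.Session()
    async with session.resource("s3") as resource:
        s3_object = await resource.Object(bucket, key)
        # the underlying HTTP client stays alive for as long as the caller
        # remains inside the "async with open_streaming_file(...)" block
        yield S3StreamingFile(s3_object)

async def main():
    async with open_streaming_file("my-bucket", "items.xlsx") as s3_file:
        size = await s3_file.size()  # HttpClient is still available here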
In the second, you'd write your higher-level functions to be aware that they're accessing a resource. You might do this by passing the resource itself around:
def get_file(resource: AcquiredResource) -> FileResource:
    ...

def get_size(thing: AcquirableResource) -> int:
    with thing as resource:
        s3_file = get_file(resource)
        return s3_file.size
(using made-up generic types here to illustrate the point).
Or you might want a static copy of (some) attrs of a particular resource, like a file here, and a step where you build that copy. Personally I would likely store those in a dict or local object to make it clear that I wasn't handling the resource itself.
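A short sketch of that snapshot idea, again assuming aioboto3 and made-up names; only the plain dict ever leaves the with block:

async def get_file_info(bucket: str, key: str) -> dict:
    session = aioboto3.Session()
    async with session.resource("s3") as resource:
        s3_object = await resource.Object(bucket, key)
        # copy what we need while the connection is still open;
        # the rest of the program only ever sees this plain dict
        return {
            "key": key,
            "size": await s3_object.content_length,
        }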
The basic idea here is that the with block guards access to a potentially difficult-to-acquire resource. That safety is built into the library, but it comes at the cost of having to think about the acquisition and structure it into the flow of your code.

Cannot set side_effect on a mocked method

I'm trying to write a test for a function that uses a class as a dependency and calls this class's method(s).
Let's assume the function is:
def store_username_and_password(**kwargs) -> Tuple[str, StorageResult]:
    storage = MyDependency(param1, param2)
    try:
        storage.read_data(mountpoint, path)
    except InvalidPathException:
        storage.write_data(data, mountpoint, path)
    return (f"Stored successfully {some_params}", StorageResult(some_params))
In the test I'm trying to patch MyDependency like this:
input = {....}
with patch("my.application.namespace.MyDependency") as mock_storage:
    mock_storage.read_data.side_effect = InvalidPathException("the data does not exist yet")
    with raises(InvalidPathException) as e:
        store_username_and_password(**input)
However, when I debug it, step inside the function call from the test above, and proceed to the storage.read_data(mountpoint, path) call, I see in the debugger that there is no side_effect set. So it never raises the exception I want on the read_data call.
Anthony's comment helped me but I thought I'd add an explanation...
mock_storage.read_data.side_effect will work, but only if you call MyDependency.read_data(). What happens is the call to MyDependency(param1, param2) creates a new MagicMock that doesn't have the side_effect set on it. The new mock can be accessed with mock_storage.return_value.
So, if you have:
storage = MyDependency(param1, param2)
storage.read_data(mountpoint, path)
Then to mock that method you need:
with patch("my.application.namespace.MyDependency") as mock_storage:
    mock_storage.return_value.read_data.side_effect = Exception
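A fuller sketch of the corrected test, assuming store_username_and_password, InvalidPathException, and the patch target from the question. Note that the function catches InvalidPathException itself, so a natural assertion is that write_data was called, rather than expecting the exception to propagate:

from unittest.mock import patch

def test_write_data_called_when_read_fails():
    kwargs = {}  # stand-in for the input dict elided in the question
    with patch("my.application.namespace.MyDependency") as mock_storage:
        instance = mock_storage.return_value  # what MyDependency(param1, param2) returns
        instance.read_data.side_effect = InvalidPathException("the data does not exist yet")
        store_username_and_password(**kwargs)
        instance.write_data.assert_called_once()  # the except branch ran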

How to deploy a script in AWS lambda

I have two scripts which I need to deploy to AWS Lambda. I have never done this before; from the documentation I put together a few steps that summarize the flow:
1. Create a Lambda function
2. Install boto3
3. Use the invoke function
Let's say I have a simple function:
def first_function():
    return print('First function')
When I go to AWS -> Lambda -> Functions -> Create function I get to the configuration part where in the editor I see this:
import json

def lambda_handler(event, context):
    # TODO implement
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
Is this how I should edit this to deploy my function:
import json

def lambda_handler(event, context):
    # TODO implement
    return {
        def first_function():
            return print('First function')
        first_function()
    }
The lambda_handler that shows up when you create a function in the console is simply boilerplate code.
You can name your handler anything, or simply place your function code under lambda_handler:
def lambda_handler(event, context):
    return print('First function')
The name lambda_handler is configurable, meaning you could use the code
def first_function(event, context):
    return print('First function')
But you'll need to ensure that the function is configured to use first_function as its handler.
I'd recommend reading through the docs specific to Python handlers.
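As a hedged illustration of that handler setting (the file name lambda_function.py and the function name my-function are made up), the handler string is "<module name>.<function name>" and can be changed, for example, with boto3; the same value can also be edited in the console's runtime settings:

import boto3

client = boto3.client("lambda")
client.update_function_configuration(
    FunctionName="my-function",                # illustrative function name
    Handler="lambda_function.first_function",  # "<module name>.<handler function>"
)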
Whatever functionality you need to implement in your Lambda, you should write within lambda_handler. If you want to use other, smaller functions, you can define them outside the handler and call them from it. So it might look like the below:
import x

def functiona():
    print('something')

def functionb():
    print('somethingelse')

def lambda_handler(event, context):
    print('lambda entry point')
    functiona()
    functionb()
Since the module is imported first, you can also write code outside of functions, although that is usually not good practice because such code cannot access the context and parameters sent to the Lambda.
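To illustrate that last point (the S3 listing is made up): anything at module level runs once when the execution environment starts and is typically used to create clients, but only the handler sees a specific invocation's event and context:

import json

import boto3

s3 = boto3.client("s3")  # created at import time, reused across invocations

def lambda_handler(event, context):
    # only the handler has access to event/context
    names = [b["Name"] for b in s3.list_buckets()["Buckets"]]
    return {"statusCode": 200, "body": json.dumps(names)}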

How to patch a function that a Flask view calls

My webapp sends a message to AWS SQS with boto, and I want to mock out sending the actual message and just check that send_message is called. However, I do not understand how to use Python's mock to patch a function that the function being tested calls.
How could I achieve mocking out the boto con.send_message call, as in the pseudo-like code below?
views.py:
@app.route('/test')
def send_msg():
    con = boto.sqs.connect_to_region("eu-west-1", aws_access_key_id="asd", aws_secret_access_key="asd")
    que = con.get_queue('my_queue')
    msg = json.dumps({'data': 'asd'})
    r = con.send_message(que, msg)
tests.py:
class MyTestCase(unittest.TestCase):
    def test_test(self):
        with patch('views.con.send_message') as sqs_send:
            self.test_client.get('/test')
            assert(sqs_send.called)
To do this kind of test you need to patch connect_to_region(). When this method is patched, it returns a MagicMock() object that you can use to test all of your function's behavior.
Your test case can be something like this one:
class MyTestCase(unittest.TestCase):
    @patch("boto.sqs.connect_to_region", autospec=True)
    def test_test(self, mock_connect_to_region):
        # grab the mocked connection returned by the patched connect_to_region
        mock_con = mock_connect_to_region.return_value
        # call the client
        self.test_client.get('/test')
        # test the connect_to_region call
        mock_connect_to_region.assert_called_with("eu-west-1", aws_access_key_id="asd", aws_secret_access_key="asd")
        # test get_queue()
        mock_con.get_queue.assert_called_with('my_queue')
        # finally, test send_message
        mock_con.send_message.assert_called_with(mock_con.get_queue.return_value, json.dumps({'data': 'asd'}))
Just some notes:
I wrote it in a white-box style and checked all the calls of your view; you can be looser and omit some checks. Use self.assertTrue(mock_con.send_message.called) if you just want to check the call, or use mock.ANY as an argument if you are not interested in some argument's content.
autospec=True is not mandatory but very useful: take a look at autospeccing.
I apologize if the code contains some errors... I cannot test it now, but I hope the idea is clear enough.
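For example, the looser variants mentioned in the notes (reusing the mocks from the test above) would look like:

# just verify that a message was sent at all
self.assertTrue(mock_con.send_message.called)
# or keep the call check but ignore the message body
mock_con.send_message.assert_called_with(mock_con.get_queue.return_value, mock.ANY)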
