Import Python files in an Azure Function - python

I imported a file in an Azure Function using a structure like this:
from folder1.folder.script import function
but it raised this error:
Exception: ModuleNotFoundError: No module named 'folder1'
When I tried the same import outside the Azure Function it worked, so any idea why it can't be resolved within the function?

Created a folder calcfunction and added the add_number function (note the print has to come before the return, otherwise it is unreachable):
def add_number(n1, n2):
    total = n1 + n2
    print("The sum of the two numbers is", total)
    return total
Calling this method from the Azure Function's Python entry point (the snippet above is assumed to be saved as calcfunction/calc.py at the function app root, so it imports as calcfunction.calc):
import logging
import azure.functions as func
from calcfunction import calc

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    id = req.params.get('id')
    if not id:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            id = req_body.get('id')
    if id:
        return func.HttpResponse(f"This is the user entered userId {id} and calc function value {calc.add_number(12, 24)}")
    else:
        return func.HttpResponse(
            "This HTTP triggered function executed successfully. Pass an ID in the query string or in the request body for a personalized response.",
            status_code=200
        )
My Azure Functions project folder structure and the result: the calcfunction package sits at the function app root, next to host.json and the HTTP trigger folder, which is why the absolute import resolves.
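The same rule explains the original folder1.folder.script failure: the top-level package must sit at the function app root (the folder containing host.json), because that root is what the Python worker puts on sys.path. A sketch of a layout under which the import should resolve, using the names from the question plus a hypothetical trigger folder and __init__.py files:

MyFunctionApp/                 # function app root (contains host.json)
├── host.json
├── HttpTrigger1/
│   ├── __init__.py            # def main(req): ... uses the import below
│   └── function.json
└── folder1/
    ├── __init__.py
    └── folder/
        ├── __init__.py
        └── script.py          # defines function

With this layout, from folder1.folder.script import function works from inside HttpTrigger1/__init__.py; if folder1 lives anywhere else (for example, one level above the app root), the worker cannot see it.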


Receiving TypeError in Azure Function with Python runtime

I'm having trouble understanding the problem here.
HTTP-triggered Azure Function
Python runtime
Testing on localhost with HTTPS (no problem here)
URL: https://localhost:5007/api/BARCODE_API
Goal: check and validate the type parameter of the URL, ensuring it is present, a string, etc.
This works:
import azure.functions as func
import logging

def main(req: func.HttpRequest) -> func.HttpResponse:
    if req.params.get('type'):
        return func.HttpResponse(
            "Test SUCCESS",
            status_code=200
        )
    else:
        return func.HttpResponse(
            "Test FAIL",
            status_code=400
        )
I can't see why this does NOT work...
import azure.functions as func
import logging

def main(req: func.HttpRequest) -> func.HttpResponse:
    def check_type():
        try:
            if req.params.get('type'):
                return func.HttpResponse(
                    "Test SUCCESS",
                    status_code=200
                )
        except:
            return func.HttpResponse(
                "Test FAIL",
                status_code=400
            )
    check_barcode = check_type()
I also tried passing req.params.get('type') into check_type() as an argument, but got the same error.
Error
Exception: TypeError: unable to encode outgoing TypedData: unsupported type "<class 'azure.functions.http.HttpResponseConverter'>" for Python type "NoneType"
I can't see why this is happening when I send https://localhost:5007/api/BARCODE_API?type=ean13
EDIT 1: Using @MohitC's recommended syntax still causes the error above.
Test1 shows it failing with a Status 500 (crux of this question)
Test2 shows it succeeding with a Status 200 then failing with a Status 400 (as it should)
The problem with your code is that when if req.params.get('type'): evaluates to false, no exception is raised, so check_type() returns None, and that None is what ultimately produces the mentioned error.
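In other words, a Python function whose if branch is not taken, and which has no else and raises nothing, simply falls off the end and implicitly returns None, e.g.:

def check_type():
    if False:  # stands in for req.params.get('type') being falsy
        return "response"
    # no else and no raise: execution falls off the end here

print(check_type())  # prints: None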
There are a couple of things you could do here:
Return the "Test FAIL" status code in an else branch of your code.
Raise an exception if the condition is not true, so the except branch returns what you want.
def check_type():
    try:
        if req.params.get('type'):
            return func.HttpResponse(
                "Test SUCCESS",
                status_code=200
            )
        else:
            return func.HttpResponse(
                "Test FAIL",
                status_code=400
            )
    except:
        return func.HttpResponse(
            "Test FAIL",
            status_code=400
        )
EDIT:
Based on the architecture of your Azure API, shown elegantly in the GIFs, your main function must return something. You are collecting the HTTP response in check_barcode but never returning it.
Try the code below:
import azure.functions as func
import logging

def main(req: func.HttpRequest) -> func.HttpResponse:
    def check_type():
        try:
            if req.params.get('type'):
                return True
            else:
                return False
        except:
            return False
    check_barcode = check_type()
    if check_barcode:
        return func.HttpResponse(
            "Test SUCCESS",
            status_code=200
        )
    else:
        return func.HttpResponse(
            "Test FAIL",
            status_code=400
        )
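An even shorter variant, a sketch keeping the original nested-function style, is to return the inner call's response directly, since the whole bug is that main never handed the HttpResponse back to the runtime:

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    def check_type():
        # Returns an HttpResponse on both paths, so main never returns None.
        if req.params.get('type'):
            return func.HttpResponse("Test SUCCESS", status_code=200)
        return func.HttpResponse("Test FAIL", status_code=400)
    return check_type()  # the missing return in the original code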

Azure Durable Functions Python DurableOrchestrationContext get_input returning null

I am testing the get_input() function from Azure Durable Functions in an orchestrator function.
What I'm facing now is that when I test with Postman and send a JSON body, e.g.
{
    "points": 222
}
while calling http://localhost:<portnumber>/api/orchestrators/DurableFunctionsOrchestrator1, it always returns null when I try to return get_input()'s value.
Below is a screenshot of what it returns. As you can see, everything appears to be working, since the status is Completed, but the output is always null.
When starting the instance from the Durable Orchestration Client, you can pass the request body as client_input so the orchestrator can read it; pass a JSON-serializable payload over to the orchestrator:
import json
import azure.functions as func
import azure.durable_functions as df

async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    client = df.DurableOrchestrationClient(starter)
    function_name = req.route_params["functionName"]
    requestBody = json.loads(req.get_body().decode())
    instance_id = await client.start_new(function_name, client_input=requestBody)
    return client.create_check_status_response(req, instance_id)
In the orchestrator code you can then use the same get_input():
requestBody: str = context.get_input()
I tried this with the blog and I am not getting any null value from get_input().
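For completeness, a minimal orchestrator sketch under the same assumptions; it simply echoes its input, which is what shows up in the output field Postman displays:

import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    request_body = context.get_input()  # the client_input passed by the starter, e.g. {"points": 222}
    return request_body                 # echoed back as the orchestration output

main = df.Orchestrator.create(orchestrator_function)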

Python Azure function with http trigger not seeing json body

I am building a Blazor Server app locally which calls an Azure Function written in Python. I am developing both on my local machine, using Visual Studio for the Blazor app and VS Code for the Python function. The Python version is 3.8.7.
The Blazor app sends data to the Azure Function at http://localhost:7071/api/xxxxx using PostAsJsonAsync, as JSON data in the body. I have tested that this works using webhook.site. The JSON data is (mainly) a base64-encoded .wav file.
The call to PostAsJsonAsync seems to be seen by the Python Azure Function and works "a bit": if I add a query parameter to the call I can read it. However, the Python function always reports the body as having zero length.
What am I doing wrong?
Check if you are sending the request like below:
var modelNew = new Model() { Description = "willekeurige klant", Name = "John Doe" };
response = await client.PostAsJsonAsync("api/ModelsApi/", modelNew);
if (response.IsSuccessStatusCode) // check if the response succeeded
{
    // Do something
}
And reading it like below:
import logging
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    name = req.params.get('name')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')
    if name:
        return func.HttpResponse(f"Hello {name}!")
    else:
        return func.HttpResponse(
            "Please pass a name on the query string or in the request body",
            status_code=400
        )
Check if you are using this code to encode it:
private static string Base64Encode(string plainText)
{
    var plainTextBytes = System.Text.Encoding.UTF8.GetBytes(plainText);
    return System.Convert.ToBase64String(plainTextBytes);
}
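On the Python side, a sketch of reading and decoding such a payload; the wav_base64 field name is an assumption, standing in for whatever property the Blazor model serializes to:

import base64
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    try:
        req_body = req.get_json()
    except ValueError:
        return func.HttpResponse("Body is not valid JSON", status_code=400)
    wav_b64 = req_body.get('wav_base64')  # hypothetical field name
    if not wav_b64:
        return func.HttpResponse("Missing wav_base64 field", status_code=400)
    wav_bytes = base64.b64decode(wav_b64)
    return func.HttpResponse(f"Received {len(wav_bytes)} bytes of audio")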
Further, please consider storing files like .wav in Azure Blob Storage and passing the blob's URL instead of the whole object, for better security and smaller request bodies.

Python Lambda giving botocore.errorfactory.InvalidLambdaResponseException when triggered on postconfirmation

I have set up an AWS Lambda function that is triggered by AWS Cognito on a successful email confirmation. The Lambda function is in Python 3.6.
I am referring to the AWS documentation for the Cognito PostConfirmation trigger:
https://docs.aws.amazon.com/cognito/latest/developerguide/user-pool-lambda-post-confirmation.html
which shows the expected response shape as:
"response": {}
So far I have tried returning None, {}, '{}' (an empty JSON string), and a valid dictionary like {'status': 200, 'message': 'the message string'}, but it keeps giving this error:
botocore.errorfactory.InvalidLambdaResponseException: An error occurred (InvalidLambdaResponseException) when calling the ConfirmSignUp operation: Unrecognizable lambda output
What should be a valid response for the PostConfirmation function?
Here is the relevant part of the code:
from DBConnect import user
import json

def lambda_handler(event, context):
    ua = event['request']['userAttributes']
    print("create user ua = ", ua)
    if 'name' in ua:
        name = ua['name']
    else:
        name = "guest"
    newUser = user.create(
        name=name,
        uid=ua['sub'],
        owner=ua['sub'],
        phoneNumber=ua['phone_number'],
        email=ua['email']
    )
    print(newUser)
    return '{}'  # <--- I am using literals here only.
You need to return the event object:
    return event
This is not obvious in the examples they provide in the documentation. You may want to check and ensure the event object does contain a response key (it should).
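Applied to the code above, a minimal sketch of the fix (DBConnect and user.create are the question's own modules):

from DBConnect import user

def lambda_handler(event, context):
    ua = event['request']['userAttributes']
    user.create(
        name=ua.get('name', 'guest'),
        uid=ua['sub'],
        owner=ua['sub'],
        phoneNumber=ua['phone_number'],
        email=ua['email']
    )
    # Cognito expects the (optionally modified) event back,
    # not an arbitrary payload such as '{}'.
    return event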

Notifications keep posting to slack channel

I am using Slack webhooks to send messages to my Slack channel.
The problem is that it keeps posting messages every few minutes.
Here is what I did...
I created a simple function under the util folder:
import json
import time
import requests

def send_to_slack(text):
    # the webhook URL must be assigned to the variable actually passed on
    url = "https://hooks.slack.com/services/your/slack/URL"
    task_slack_alert(text, url, is_error=False, args=None)

def task_slack_alert(msg, url, is_error=False, args=None):
    slack_msg = ":red_circle: Task Failed" if is_error else ":green_heart: Task Message"
    details = """*Task*: {task}
*Dag*: {dag}
*Execution Time*: {exec_ts}""".format(
        task=args["task"],
        dag=args["dag"],
        exec_ts=args["ts"],
    ) if args else ""
    message = {'text': slack_msg + "\n" + details + "\n" + msg}
    response = requests.post(url=url, data=json.dumps(message))
    time.sleep(1)
    print(f"Slack response {response}")
    if response.status_code != 200:
        print(f"Error sending chat message. Got: {response.status_code}")
In my DAG (which is under another folder) I call the function.
The DAG copies data from Oracle to a Snowflake DB, and that part works without the Slack piece.
Inside my DAG I do the following:
x = {'key1': ['value1', 'value 2', … 'value10']}

send_to_slack('My test message from python')

default_args = {
    ...
    'on_failure_callback': send_to_slack,
}

with DAG('my_dag',
         default_args=default_args,
         catchup=False) as dag:
    parallel = 4
    start = DummyOperator(task_id='start')
    tasks = []
    i = 0
    for s in x.keys():
        for t in x.get(s):
            task = OracleToSnowflakeOperator(
                task_id=s + '_' + t,
                source_oracle_conn_id=source_oracle_conn_id,
                source_schema=schema,
                source_table=table, …
            )
            if i <= parallel:
                task.set_upstream(start)
            else:
                task.set_upstream(tasks[i - (parallel + 1)])
            i = i + 1
            tasks.append(task)
I know that if I define the function inside the same DAG file, it will be called every time the DAG is parsed.
That's not my case, so what's wrong?
Thanks
You're calling the function send_to_slack at the top level of your DAG file, which means it will run every time the scheduler parses your DAG (every few minutes).
You should either:
Use the Slack operator that comes with Airflow, put it downstream from your OracleToSnowflakeOperator, and treat it like any other operator, or
Edit your OracleToSnowflakeOperator, which I assume is a custom one, and put the logic to call Slack in there (use the Slack hook).
Basically, you should encapsulate the call to Slack inside a custom operator or use the standard Slack operator provided; don't put it at the top level of your DAG definition. A sketch of the first option follows below.
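A minimal sketch of the first option, assuming the apache-airflow-providers-slack package is installed and a slack_default Airflow connection stores the webhook (both names are placeholders for your setup):

from airflow.providers.slack.operators.slack_webhook import SlackWebhookOperator

notify_slack = SlackWebhookOperator(
    task_id='notify_slack',
    slack_webhook_conn_id='slack_default',  # connection holding the webhook URL
    message=':green_heart: Oracle-to-Snowflake copy finished',
)

# Fire the notification only after every copy task has succeeded,
# instead of on every scheduler parse.
for task in tasks:
    task >> notify_slack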
