Python script on Azure WebJobs - "No module named request"

I need to run Python scripts on Azure WebJobs, but I am getting the error below. I have tried every approach I could find, such as packaging the scripts with virtualenv and appending to the path, but none of them works.
[10/08/2018 11:27:27 > ca6024: ERR ] ImportError: No module named request
Can you please help me fix this?
The script in the file is:
import urllib.request
print('success')

According to the documentation at
https://docs.python.org/2/library/urllib.html
the urllib API differs between Python 2 and Python 3, so check which Python version your WebJob is running. The urllib.request module only exists in Python 3, which is why Python 2 reports "No module named request".
In Python 2.7, use:
urllib.urlopen()
instead of:
urllib.request.urlopen()
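If the script must run under either interpreter, one common pattern is a try/except import; a minimal sketch, assuming only urlopen is needed:

```python
# Try the Python 3 location first, then fall back to Python 2.
try:
    from urllib.request import urlopen  # Python 3
except ImportError:
    from urllib2 import urlopen  # Python 2

print('success')
```

This way the same file works on both versions without detecting sys.version_info by hand.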

Please refer to the steps below, which I used previously to upload a Python script to WebJobs.
1: Use the virtualenv package to create an independent Python runtime environment on your system. If you don't have it, install it first with the command pip install virtualenv.
If it installed successfully, you will see it in your python/Scripts folder.
2: Run virtualenv <env-name> to create the independent Python runtime environment.
3: Go into the created directory's Scripts folder and activate it (this step is important, don't miss it).
Keep this command window open and use pip install <library-name> to download external libraries in it, for example pip install requests in your case.
4: Compress Sample.py into a zip file together with the dependency packages you rely on from the Libs/site-packages folder.
5: Create a WebJob in the Web App service and upload the zip file; then you can run your WebJob and check the log.
You could also refer to the SO thread: Options for running Python scripts in Azure.
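As a sketch of how the zipped WebJob can locate the bundled packages at runtime (assuming the Libs/site-packages layout from step 4 sits next to Sample.py):

```python
import os
import sys

# Resolve the Libs/site-packages folder relative to this script,
# matching the zip layout described in step 4.
here = os.path.dirname(os.path.abspath(__file__))
site_packages = os.path.join(here, "Libs", "site-packages")

# Prepend it so the bundled copies win over anything on the host.
if site_packages not in sys.path:
    sys.path.insert(0, site_packages)

# Imports of the bundled libraries resolve from here on.
```

The folder name Libs/site-packages is the one from the steps above; adjust it if your zip uses a different layout.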

Related

IBM Cloud Functions - "Invalid virtualenv. Zip file does not include activate_this.py"

I want to deploy a Python script with a virtual environment (I need a library which is not in the runtime provided by IBM Cloud Functions) to IBM Cloud Functions. I want to do this by simply zipping it, so I followed the documentation "Packaging Python code with a local virtual environment in a compressed file" at the link below:
https://cloud.ibm.com/docs/openwhisk?topic=openwhisk-prep
I have Python 3.7 installed and the virtualenv also uses it. The virtualenv is named "virtualenv" as required; however, I still receive this error when I try to invoke the action:
Results:
{
"error": "The action failed to generate or locate a binary. See logs for details."
}
Logs:
[
"2021-05-20T09:27:03.627094Z stderr: Invalid virtualenv. Zip file does not include activate_this.py",
"2021-05-20T09:27:03.627Z stderr: The action did not initialize or run as expected. Log data might be missing."
]
I checked the virtualenv directory and I have "activate_this.py" in the Scripts folder.
What am I missing? The only difference between the steps I take and the documentation is that I have a Windows computer, so I activated the environment with virtualenv\Scripts\activate rather than through bin, and I zipped the script and the virtualenv via the Windows GUI.
Is it possible that Cloud Functions tries to find the file in a "bin" folder instead of a "Scripts"? If so, what can I do?
Thanks
This is a bug in the handling of a Windows-packaged venv, as you noted. It is fixed in this fork: https://github.com/nimbella-corp/openwhisk-runtime-python/commit/2eb3422cb2dca291cff47ed3239de8512170a1be. As a workaround, you can apply that commit and build your own container image.

how to install python packages in kubernetes pods

I have a custom airflow image that has some python packages in it. I am running it in local kubernetes cluster. I have created a DAG in my airflow that uses one of the python packages and it works totally fine.
But if I use some other Python package that's not in my custom base image (imageio or any other), it gives me a module-not-found error.
So I added the line RUN pip3 install imageio==2.8.0 (or any other package) to my Dockerfile; first of all, it gave me a warning during the install (screenshot: python install warning).
Now the import error is gone, but if I run my DAG it fails without outputting any logs.
So next I added the line ENV PATH="/usr/local/airflow/.local/bin:${PATH}", but still the same thing.
I am guessing my DAG is not able to find the extra python packages that are being installed, or more clearly somehow the pods being created don't have those extra python packages with them.
However, if I do "docker run -it {image_name} bash" and type import imageio inside a python shell, it works fine.
Is there some config file in which I will need to mention the extra python packages that I want so that the pods running the DAGs will register those packages?
or is there a way to specify it in the values.yaml file?

Azure can not find python version

When I try to run func init for the Azure Functions Core Tools I get the error "Could not find a Python version", but when I run az --version, it shows that Python 3.6 is detected as required.
That Python is installed with the Azure CLI; however, it is not on the system path.
So one option is to download a new Python installer, which will set the path for you.
The other option is to use the CLI's Python: it already ships with a Python directory, so just add C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2 to the path, and then you will be able to use python from the command line.
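To sanity-check a PATH change like this, you can simulate the lookup from Python itself. A small sketch, using the current interpreter's own directory as a stand-in for the CLI2 folder:

```python
import os
import shutil
import sys

# Directory containing a known python executable (a stand-in here for
# C:\Program Files (x86)\Microsoft SDKs\Azure\CLI2).
exe_dir = os.path.dirname(sys.executable)

# Prepend it to a copy of PATH, mirroring what adding it system-wide does.
search_path = exe_dir + os.pathsep + os.environ.get("PATH", "")

# shutil.which accepts an explicit search path, so we can verify the
# executable would be found without touching the real environment.
exe_name = os.path.basename(sys.executable)
print(shutil.which(exe_name, path=search_path))
```

If which() returns a path, tools like func init that shell out to python should be able to find it once the same directory is on the real PATH.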
First, please make sure you have installed Python 3.6 or 3.7.
If Python is not installed, you will get the same result as in your screenshot:
Please download Python 3.6 or 3.7 from the official Python website and install it. What your screenshot shows does not prove that Python is installed.
Second, please add the Python path to the environment variables. After adding it, remember to restart your computer, otherwise the environment variable will not take effect.
This is my environment variable configuration for Python:
After that, you can create a Python function app successfully.

Azure functions: Installing Python modules and extensions on consumption plan

I am trying to run a python script with Azure functions.
I had success updating the Python version and installing modules on Azure Functions under the App Service plan, but I need to use the Consumption plan, as my script will only execute once every day and for only a few minutes, so I want to pay only for the execution time. See: https://azure.microsoft.com/en-au/services/functions/
Now I'm still new to this, but from my understanding the Consumption plan spins up the VM and terminates it after your script has executed, unlike the App Service plan, which is always on.
I am not sure why this would mean that I can't install anything on it. I thought it would just mean I have to install it every time it spins up.
I have tried installing modules through the python script itself and the kudu command line with no success.
While under the app service plan it was simple, following this tutorial: https://prmadi.com/running-python-code-on-azure-functions-app/
On the Functions Consumption plan, Kudu extensions are not available. However, you can update pip to be able to install all your dependencies correctly:
Create your Python script on Functions (let's say NameOfMyFunction/run.py)
Open a Kudu console
Go to the folder of your script (should be d:/home/site/wwwroot/NameOfMyFunction)
Create a virtualenv in this folder (python -m virtualenv myvenv)
Load this venv (cd myvenv/Scripts and call activate.bat)
Your shell should be now prefixed by (myvenv)
Update pip (python -m pip install -U pip)
Install what you need (python -m pip install flask)
Now in the Azure Portal, in your script, update the sys.path to add this venv:
import sys, os.path
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), 'myvenv/Lib/site-packages')))
You should be able to start what you want now.
(Reference: https://github.com/Azure/azure-sdk-for-python/issues/1044)
Edit: reading the previous comment, it seems you need numpy. I just tested right now and was able to install 1.12.1 with no issues.
You may upload the modules for the Python version of your choice in Consumption Plan. Kindly refer to the instructions at this link: https://github.com/Azure/azure-webjobs-sdk-script/wiki/Using-a-custom-version-of-Python
This is what worked for me:
Disclaimer: I use a C# Function that includes Python script execution via the command line, using the System.Diagnostics.Process class.
Add relevant Python extension for the Azure Function from Azure Portal:
Platform Features -> Development Tools -> Extensions
It installed python to D:\home\python364x86 (as seen from Kudu console)
Add an application setting called WEBSITE_USE_PLACEHOLDER and set its value to 0. This is necessary to work around an Azure Functions issue that causes the Python extension to stop working after the function app is unloaded.
See: Using Python 3 in Azure Functions question.
Install the packages from Kudu CMD line console using pip install ...
(in my case it was pip install pandas)

How to use python virtual env in azure webjob

I'm trying to set up a Python script as an Azure WebJob, and the script uses several external dependencies, but the documentation seems to make no reference to using a virtualenv for WebJobs.
How can I set up a virtualenv for the WebJob, preferably without assembling the environment locally and running the script through run.cmd?
If you are trying to activate an already existing virtualenv, you can call its activate script. For instance, if you want to activate the web app's virtualenv, you can run
/path/to/web-app/env/Scripts/activate.bat
to activate that particular virtualenv.
I had the same question, and found an answer in this post.
Short answer : put the directory of the module you want to include in the ZIP file you upload in the webjob. You can then reference it directly in your code.
Hope that helps!
This is kind of a workaround, but it works. Just add these lines to the WebJob script:
import sys

# Point at the site-packages folder of the virtualenv deployed with the web app.
site_packages = "D:\\home\\site\\wwwroot\\env\\Lib\\site-packages"
sys.path.append(site_packages)

import requests
