My Linux Python function app is refusing to run the most recent code deployed to it.
I've tried deploying via Visual Studio Code and via GitHub, and the deployment itself succeeds (I can see the new code in the "Code + Test" section of the function), but the function app still refuses to run it.
I've tried restarting the app and switching deployment methods, but I've run out of ideas as to why this is happening.
I deployed the Azure Functions Python (version 3.9.9) HTTP trigger to an Azure Function App:
Then I changed the HTTP trigger code to read an ID parameter from the function URL along with the response string, and published to Azure:
It took a minute or two for the new code to show up in the "Code + Test" section of the HTTP trigger in the Azure portal.
The first time a code change has to be picked up by Azure from GitHub, it can take 5 to 10 minutes (sometimes longer) for the change to be reflected. How long the sync takes depends on the external libraries used and the number of files in the project.
Please refer to the workarounds for similar issues:
Continuous deployment from GitHub takes several minutes
Increase deployment speed of Azure Functions
https://github.com/Azure/functions-action/issues/61
If this issue persists and deployment takes a long time even for simple function code, please raise a ticket with Microsoft Support.
Related
I'm working on a project that will need to load Python code files dynamically from GitHub on launch. Here's what it needs to look like:
User asks us to launch an instance for them and provides us with a GitHub url
We have an existing Docker image with our own Python code (a server) that will be using those files from GitHub
We need to launch the container with our own code, but substituting in the parts that we got from the user's GitHub repo, basically creating a server with half our code, half user code
In other words, we need to launch a container that has some pre-planned code from us, and some dynamic code from the user.
Any ideas how to do this? I've seen many examples of Dockerfiles that load code from GitHub, but I'm having a hard time figuring out how to make it half our code and half code pulled dynamically from GitHub at run time.
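One pattern worth considering (only a sketch, not a definitive answer) is to bake your own server code into the image and have the container's entrypoint clone the user's repository at startup, then import the user's modules dynamically. Everything below is an assumption for illustration: the USER_REPO_URL environment variable, the /app/user_code directory, the user_handlers module name, and the myserver package that stands in for your own server code.

# entrypoint.py -- hypothetical startup script baked into the image.
# Assumes the user's repo URL is passed in via the USER_REPO_URL env var
# and that the repo contains a module named user_handlers.py at its root.
import importlib
import os
import subprocess
import sys

USER_CODE_DIR = "/app/user_code"

def fetch_user_code():
    repo_url = os.environ["USER_REPO_URL"]   # e.g. https://github.com/some-user/handlers.git
    subprocess.run(
        ["git", "clone", "--depth", "1", repo_url, USER_CODE_DIR],
        check=True,
    )
    sys.path.insert(0, USER_CODE_DIR)        # make the cloned code importable

def main():
    fetch_user_code()
    user_handlers = importlib.import_module("user_handlers")  # the dynamic half from GitHub
    from myserver import Server              # hypothetical module shipped in the image (your half)
    Server(handlers=user_handlers).run()

if __name__ == "__main__":
    main()

The Dockerfile would then set this script as the entrypoint (ENTRYPOINT ["python", "entrypoint.py"]), and whoever launches the instance passes the user's URL with -e USER_REPO_URL=... when starting the container.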
This problem is showing up only on a specific project in Google Cloud (on multiple VMs). The exact same code works perfectly on other projects and on my Mac.
I have a Python module that runs the streaming Speech API with something like this:
# get_requests() is a generator that yields StreamingRecognizeRequest messages
responses = client.streaming_recognize(
    streaming_config,
    get_requests()
)
The code is based on the Google sample here
I can run this module directly and see that it is getting result callbacks. This is true even on the Google VM where later (see below) I have the issues.
Then I run the same module via multiprocessing.Process, and the call to streaming_recognize just does nothing.
I used logging to confirm that it is being called and that the code continues to run. I get no errors.
But get_requests() is NEVER called.
I have tried different service accounts in each of the environments, but the failure shows up only in the one project, regardless of which service account is used.
Any reason the behavior should be different in a subprocess?
UPDATE:
I found out that if I start the process instead with subprocess.Popen everything works correctly. And just as a reminder, this problem (difference in behavior based on how I started the process) happens only in one specific Google Cloud project, and not anywhere else.
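For anyone hitting the same symptom, here is a hedged sketch of the workaround described in the update: start the worker as a fresh interpreter with subprocess.Popen instead of forking it with multiprocessing.Process. One common explanation for this kind of stall is that gRPC channels and credentials are not fork-safe, so state inherited by a forked child can hang silently, though I cannot confirm that is what happens in this particular project. The worker.py module name is an assumption.

import subprocess
import sys

# Forked worker (the problematic path in the affected project):
# from multiprocessing import Process
# Process(target=run_streaming).start()

# Fresh interpreter (the workaround that worked): nothing is inherited
# from the parent process, so the Speech client is created from scratch.
proc = subprocess.Popen([sys.executable, "worker.py"])  # worker.py is a hypothetical module
proc.wait()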
I currently have a free Azure Notebook account on https://notebooks.azure.com/ and would like to execute a Python script (or Jupyter Notebook) hosted on Azure automatically once every 10 minutes.
Is there a way to do this within the free Azure notebook account?
I am aware of several approaches described on the web, such as using Azure WebJobs, Azure Functions, Azure IoT and so on. However, all these approaches require me to upgrade to the "Free" account, which is actually only free for the first 12 months, so I would like to avoid that if possible.
As far as I know, Microsoft Azure Notebooks does not offer any job feature like WebJobs for Azure Web Apps or Jobs for Azure Databricks. I tried to trigger a Python script via crontab on the Ubuntu host behind Azure Notebooks, but that failed because the cron service is not started by default and Azure does not provide the nbuser password needed to start it with sudo.
However, I also tried to write a Python script hello.py as below.
from datetime import datetime
import time
while True:
    print(f"{datetime.now()} => Hello, world!")
    time.sleep(10)  # 10 seconds
I ran it in the Terminal of Azure Notebooks with its output redirected to ~/hello.log, then I closed the terminal page and ran !tail -f ~/hello.log; the script does not seem to be terminated when the terminal page is closed.
You can try this approach. If it's not what you want, I think it's not possible on Azure Notebooks.
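If a simple polling loop in the Azure Notebooks terminal is acceptable, here is a minimal sketch of that idea extended to a 10-minute schedule. The script name run_job.py and the log path are assumptions for illustration, and it only keeps running for as long as the underlying session stays alive.

# scheduler.py -- runs a target script every 10 minutes, appending its output to a log file.
import os
import subprocess
import time
from datetime import datetime

LOG_PATH = os.path.expanduser("~/hello.log")
INTERVAL_SECONDS = 10 * 60            # run every 10 minutes
TARGET = ["python", "run_job.py"]     # run_job.py is a placeholder for your real script

while True:
    with open(LOG_PATH, "a") as log:
        log.write(f"{datetime.now()} => running job\n")
        log.flush()
        subprocess.run(TARGET, stdout=log, stderr=subprocess.STDOUT)
    time.sleep(INTERVAL_SECONDS)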
There are a number of 'always free' services that come with the free Azure account, including Azure Functions.
As long as you stay within the free functions limit, which is currently 1,000,000 requests per month, you wouldn't have to pay anything.
https://azure.microsoft.com/en-us/free/free-account-faq/
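If you do go the Azure Functions route, a timer-triggered Python function is the usual fit for a run-every-10-minutes job. Below is a minimal sketch using the classic Python programming model; the body is a placeholder, and the schedule itself lives in the function's function.json as "schedule": "0 */10 * * * *" (Azure's six-field CRON format).

import datetime
import logging

import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    # Runs on the schedule defined in function.json ("0 */10 * * * *" = every 10 minutes).
    if mytimer.past_due:
        logging.info("The timer is past due!")
    logging.info("Timer trigger ran at %s", datetime.datetime.utcnow().isoformat())
    # Put the work from your notebook/script here.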
I need advice on how to incorporate Python into an Azure ASP.NET web application environment. Please excuse the question, but I am new to Azure and not clear on how to proceed. Every option I look into seems promising, but each has its own issues. A more thorough explanation is below, but the gist is: I have an Azure account with all kinds of goodies, a full-fledged ASP.NET (C#) web app running via App Service, I am new to Azure (but not to Python), and I'm hoping to add Python functionality to this whole setup. In short:
I want to add Python to this setup mainly to run scheduled jobs and also to trigger Python code from ASP.NET web form submissions
ideally I want a solution that resembles a non-cloud setup. I know this sounds silly, but I'm finding the cloud/Azure functionality to be nuanced and not straightforward. I want a place to put a bunch of Python scripts, then run, edit, schedule, and trigger them from ASP.NET
for example: I created a WebJob that runs manually and from the documentation it wasn't clear how it should be called. I just figured out that you need to POST with Basic Auth (and the credentials provided).
Also, the Azure CMD console does not like file names with underscores in them: you cannot submit a WebJob with a .py file containing an underscore, nor write output to a file whose name contains an underscore
Also, I don't see an option for this WebJob to run Python 3.6.4 (which I installed via extension); right now it is using 2.7.15...
Also, CRON expressions in Azure have six fields, not the usual five plus a command. Again, more weird stuff to keep track of
I tried these instructions, but the updates to the web page's Web.config file break the ASP.NET web pages
ideally the most cost effective option
Any info is greatly appreciated
MORE DETAILED EXPLANATION
Currently I have an ASP.NET site running via Azure App Service and I would like to add Python scripts and possibly Flask/REST functionality. Note that I am not expecting to serve any content via Python; I will largely be running Python scripts either on a scheduled basis or calling them from ASP.NET. As a matter of fact, and this is an important point, I'm hoping to have ASP.NET trigger/run a Python script when a web form is submitted. I realize that I could get a similar effect by making a web call to a REST API that runs Python. In any event, I can't tell if I should:
add a Python extension to the current App Service running the web page (I tried this) OR
I did install Python 3.6.4 and some packages via pip
These instructions were useful; however, the updates to the web page's Web.config file break the ASP.NET web pages
set up a VM that will have all of the Python code (but how can I have the .NET web page(s) call the Python in the VM?) OR
use Azure Functions (I'm completely new to this and must admit that I'd prefer my old-school Python environment, although I see the benefit of using Functions. But how do you deal with logging and debugging?)
or what about a custom Windows container (Docker)?
This requires installing VS Code and that is OK but I'm looking for a solution that another user can get into with as few interruptions as possible
The idea is to ramp up the use of Python although, like I said, I don't expect Python to serve any of the web content. It will be used to run in the background and to run scheduled jobs. What is the most robust and hopefully easiest way to add Python functionality to Azure (most importantly, in a way that lets me trigger/use Python from an App Service running .NET)? I've searched online and Stack Overflow so far with interesting finds but nothing to my liking.
For example, the following link discusses how to schedule WebJobs: How to schedule python web jobs on azure. I just created a manual one, and when I called the webhook I got the message: "No route registered for '/api/triggeredwebjobs/TestPython/run'"
The Docker method looks very promising, however, I'm looking for a simple solution as there is another person who will be involved in all of this and he's busy with other projects
Thank you very much!
I found a solution, though I'm open to more info. Like I mentioned in my post, I used the 'add extension' tool to add Python 3.6.4 to my Azure (installed in D:\home\python364x64).
Then I installed a bunch of packages via pip, these installed into D:\home\python364x64\Lib\site-packages.
I created a Python folder in webpages\Python where I put my scripts.
Finally, in ASP.NET I used the Diagnostics.Process call to run my code in ~\webpages\Python\somecode_2.py
The main issue is that Azure came with Python 2.7.15 installed. And for some reason when my Python code got executed it was using 3.4 (where that version came from beats me). So for each script, I had to create an _2.py version where I simply did the following in order to call the original script via Python 3.6.4. Looks a little nasty but it works. So like I said, I would welcome more info for ways to do this better...
--
import os
os.system("D:\\home\\python364x64\\python.exe SomePython.py {0}".format(args))  # 'args' is a placeholder for the arguments to forward
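Since the answer invites better approaches: one possible refinement of the os.system call is subprocess.run, which surfaces the exit code and the script's output so failures are easier to see from the ASP.NET side. This is only a sketch; the interpreter path and script name are taken from the answer above, and the arguments are placeholders.

import subprocess

# Works on Python 3.6 as well (capture_output= only exists from 3.7 on).
result = subprocess.run(
    [r"D:\home\python364x64\python.exe", "SomePython.py", "arg1", "arg2"],  # arguments are placeholders
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True,
)
print(result.returncode)   # non-zero means SomePython.py failed
print(result.stdout)       # whatever the script printed
print(result.stderr)       # error output, handy when debugging from ASP.NET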
I am developing a Python Django app for my project. Due to the nature of one of my apps, I need to run a certain script for a long period of time (maybe several hours).
Everything is fine in my local environment. However, when I publish the app to Azure, it crashes after a period of time due to the maximum execution time (it does not give an error specifically about execution time; instead it throws an internal server error).
At this point I have 2 questions:
Is it possible to increase the maximum execution time for a Python web app in Azure? If yes, how can I do that?
Should I be using some other Azure service rather than a web app for such an operation?
Thank you.
You can try leveraging WebJobs to run your scripts or programs in the background on demand, continuously, or on a schedule.
At the same time, note that web apps are unloaded by default if they are idle for some period of time; this lets the system conserve resources. In Basic or Standard mode, you can enable Always On to keep the app loaded all the time. If you need to run a continuous or long-running job or task, you should enable Always On.
You can modify this setting in the management portal; refer to Configure web apps in Azure App Service for details.
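If the long-running work is moved out of the web request path and into a WebJob, a continuous WebJob is essentially just a script that never exits. A minimal sketch follows; the long_task module and its process_batch function are assumptions for illustration.

# run.py -- entry point of a continuous WebJob; App Service restarts it if it exits.
import logging
import time

from long_task import process_batch   # hypothetical module containing the long-running work

logging.basicConfig(level=logging.INFO)

while True:
    try:
        process_batch()                # do one unit of the multi-hour job
    except Exception:
        logging.exception("batch failed; retrying after a pause")
    time.sleep(60)                     # avoid a tight loop between batches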