How to connect Azure Python packages in a WebJob - python

I have code on my local machine, related to our business, that I am trying to deploy to Azure, but I am getting a few import errors and a few internal server errors.
The code interacts with services such as Storage, so I installed all the required packages with pip (pip is also the latest version).
I am new to working with the Azure SDKs. Any suggestions or steps are highly appreciated.

Locally, all the packages end up in site-packages. Whenever you install packages, install them with your virtual environment activated, so that they are accessible when you import them.
You can try something like the below in your code so that your WebJob loads all of your packages when it runs:
import sys
# Site-packages of the Web App's virtual environment on Azure
package = r"D:\home\site\wwwroot\env\Lib\site-packages"
sys.path.append(package)
You can also refer to this SO answer, where there is a clear explanation of a similar problem; thanks to Gary for covering it.
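A slightly more defensive variant of the same idea, still assuming the Web App's virtual environment lives under D:\home\site\wwwroot\env, only appends the path when it actually exists:
import os
import sys
# Assumed location of the Web App virtual environment's site-packages
package = r"D:\home\site\wwwroot\env\Lib\site-packages"
# Append only if the folder exists and is not already on sys.path
if os.path.isdir(package) and package not in sys.path:
    sys.path.append(package)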

Related

Is there any way to run and deploy Ubuntu packages on Azure Functions startup?

In my Azure Function app, I have some Ubuntu packages, like the Azure CLI and kubectl, that I need to install on the host whenever it starts a new container. I have already tried start-up commands and also going into Bash. The former doesn't work, and the latter tells me permission is denied and the resource is locked. Is there any way to install these packages on function start-up in Azure Functions?
Trying to install the packages via Bash is not possible and will not work at all. The reason is that when you write functions in Python and deploy them to Linux on Azure, the various packages are installed according to requirements.txt and finally merged into a single bundle; when you run the function on Azure, you are running against that bundle. So it is wrong to try to install packages after deployment: you should specify the packages to install in requirements.txt before deployment, and then deploy to Azure.
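For example, a minimal requirements.txt along those lines might look like the following; azure-cli is listed because the Azure CLI is also published as a pip package, while the other entries are just placeholders for whatever your function actually imports:
azure-functions
azure-cli
requests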

Azure Linux Python Webapp ImportError: libodbc.so.2 cannot open shared object file

I built a Flask app that works when I run it locally (Windows 10). Deployment through local Git is successful, but when I try to visit the URL, all I see is "Application Error". When I check the log stream, it shows that the app breaks down when it tries to import pyodbc with the error "ImportError: libodbc.so.2: cannot open shared object file."
So I SSH'ed into the container, activated the virtual environment, opened Python, ran import pyodbc, and it worked. Now I'm confused as to why it fails when I visit the URL but works when I SSH in and import pyodbc myself. I'm not sure if I've provided enough detail, so let me know if I should add anything else.
What is going on? How do I fix my container so that it can import pyodbc?
@user152836 - Can you please let us know the following information to debug further:
Version of python that you are using in your web app? Native or extension?
OS of your web app?
If you installed a Python extension, the problem could be that the Azure Web App uses its default Python, which does not have the pyodbc driver. To use the Python extension you can follow this thread - install odbc driver to azure app service
You also need to export the relevant LD_LIBRARY_PATH and add it to your bash profile file, as in this example:
Python executable not finding libpython shared library
If the above two suggestions don't work, you can also check this link: pyodbc - error while running application within a container
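If you want to check from inside the app itself whether the shared object is reachable, a small diagnostic sketch like the following can help (the library name comes straight from the error message; nothing else about your setup is assumed):
import ctypes
try:
    # dlopen the ODBC driver manager, which is what the pyodbc import needs
    ctypes.CDLL("libodbc.so.2")
    print("libodbc.so.2 loaded fine; pyodbc should import")
except OSError as exc:
    print("libodbc.so.2 is not on the loader's search path:", exc)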

How do I connect to an external Oracle database using the Python cx_Oracle package on Google App Engine Flex?

My Python App Engine Flex application needs to connect to an external Oracle database. Currently I'm using the cx_Oracle Python package which requires me to install the Oracle Instant Client.
I have successfully run this locally (on macOS) by following the Instant Client installation steps. The steps required me to do the following:
Make a directory called /opt/oracle
Create a symlink from /opt/oracle/instantclient_12_2/libclntsh.dylib.12.1 to ~/lib/
However, I am confused about how to do the same thing in App Engine Flex (instructions). Specifically, here's what I'm confused about:
The instructions say I should run sudo yum install libaio to install the libaio package. How do I do this on GAE Flex? Or is this package already available?
I think I can add the Instant Client files to GAE (a whopping ~100MB!), then set the LD_LIBRARY_PATH environment variable in app.yaml to export LD_LIBRARY_PATH=/opt/oracle/instantclient_12_2:$LD_LIBRARY_PATH. Will this work?
Is this even feasible without using custom Docker containers on App Engine Flex?
Overall I'm not sure if I'm on the right track. Would love to hear from someone who has managed this before :)
If any of your dependencies are not available in the base GAE flex images provided by Google and cannot be installed via pip (because they aren't Python packages, aren't available on PyPI, or for whatever other reason), then you can't use the requirements.txt file to get them installed in your GAE flex app.
The proper way to satisfy such dependencies would be to build your own custom runtime. From About Custom Runtimes:
Custom runtimes allow you to define new runtime environments, which might include additional components like language interpreters or application servers.
Yes, that means providing a custom Docker file. In your particular case you'd be installing the Instant Client and libaio inside this Dockerfile. See also Building Custom Runtimes.
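To make that concrete, a rough Dockerfile sketch for this case could look something like the one below; the base image, Instant Client version, file layout and start command are all assumptions, not an official recipe:
FROM python:3.7-slim
# libaio is required by the Oracle Instant Client
RUN apt-get update && apt-get install -y libaio1 unzip && rm -rf /var/lib/apt/lists/*
# Instant Client files vendored next to the app (assumed layout)
COPY instantclient_12_2/ /opt/oracle/instantclient_12_2/
ENV LD_LIBRARY_PATH=/opt/oracle/instantclient_12_2
WORKDIR /app
COPY . /app
RUN pip install -r requirements.txt
# App Engine flex routes traffic to port 8080 (assumes gunicorn and a main:app WSGI entry point)
CMD gunicorn -b :8080 main:app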
Answering your first question, I think that the instructions on the Oracle website just show that you have to install said library for your application to work.
In the case of App Engine flex, the way to ensure that libraries are present in the deployment is the requirements.txt file. There is a documentation page which explains how to do so.
On the other hand, I will assume that the "Instant Client files" are not libraries but data your app needs at runtime. You could use Google Cloud Storage to serve them, or any other storage alternative within Google Cloud.
I believe that, if this is all what you need for your App to work, pushing your own custom container should not be necessary.
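If you do serve the Instant Client files from Cloud Storage as suggested, a minimal download sketch with the google-cloud-storage client could look like this (bucket and object names are placeholders):
from google.cloud import storage
BUCKET_NAME = "my-instantclient-bucket"  # placeholder
BLOB_NAME = "instantclient_12_2.zip"     # placeholder
client = storage.Client()
blob = client.bucket(BUCKET_NAME).blob(BLOB_NAME)
blob.download_to_filename("/tmp/instantclient_12_2.zip")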

How to 'pip install packages' inside Azure WebJob to resolve package compatibility issues

I am deploying a WebJob inside Azure Web App that uses Google Maps API and Azure SQL Storage.
I am following the typical approach where I make a WebJob directory and copy my 'site-packages' folder inside the root folder of the WebJob. Then I also add my code folder inside 'site-packages' and make a run.py file inside the root that looks like this:
import sys, os
sys.path.append(os.path.join(os.getcwd(), "site-packages"))
import aero2.AzureRoutine as aero2
aero2.run()
Now the code runs correctly in Azure, but I am seeing warnings after a few commands, which slow down my code.
I have tried copying the 'pyOpenSSL' and 'requests' modules into my site-packages folder, but the error persists.
However, the code runs perfectly on my local machine.
How can I find this 'pyopenSSL' or 'requests' that is compatible with the python running on Azure?
Or
How can I modify my code so that it pip installs the relevant packages for the python running on Azure?
Or more importantly,
How can I resolve this error?
@Saad,
If your WebJob works fine on the Azure Web App but you are getting an insecure-platform warning, I suggest you try disabling the warning via this configuration (https://urllib3.readthedocs.org/en/latest/security.html#disabling-warnings).
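A minimal sketch of that approach, assuming the warning in question is urllib3's InsecurePlatformWarning:
import urllib3
# Silences the warning only; the underlying SSL limitation on Python < 2.7.9 remains
urllib3.disable_warnings(urllib3.exceptions.InsecurePlatformWarning)
If requests bundles its own copy of urllib3, the equivalent call is requests.packages.urllib3.disable_warnings(...).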
Meanwhile, the requests lib behaves somewhat differently across versions; I recommend you refer to this document:
http://fossies.org/diffs/requests/2.5.3_vs_2.6.0/requests/packages/urllib3/util/ssl_.py-diff.html
Azure Web Apps uses Python 2.7.8, which is lower than 2.7.9, so you can downgrade the requests lib to version 2.5.3.
According to the doc referred to in the warning message, https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning:
Certain Python platforms (specifically, versions of Python earlier than 2.7.9) have restrictions in their ssl module that limit the configuration that urllib3 can apply. In particular, this can cause HTTPS requests that would succeed on more featureful platforms to fail, and can cause certain security features to be unavailable.
So the easiest way to fix this warning is to upgrade the Python version of the Azure Web App. Log in to the Azure management portal and change the Python version to 3.4 in the Application settings section.
I tested a WebJob task that uses the requests module to request an "https://" URL, and since upgrading the Python version to 3.4 there are no more warnings.
I followed this article and kind of 'pip installed' the pymongo library for my script. Not sure if it works for you but here are the steps:
Make sure you include the library name and version in requirements.txt.
Deploy the web app using Git. The directory should include at least requirements.txt (this simply installs whatever is in requirements.txt into the virtual environment, which is shared with the Web App at D:\home\site\wwwroot\env\Lib\site-packages).
Add this block of code to the Python script you want to use in the WebJob zip file:
import sys
# Site-packages of the Web App's virtual environment, shared with the WebJob
sitepackage = r"D:\home\site\wwwroot\env\Lib\site-packages"
sys.path.append(sitepackage)

Easiest way to share work between backend and front end

Hey everyone,
I am expanding my team and have recently added an additional front-end engineer to my site. I am currently using Django to run the site, but it uses a lot of plugins, namely: django-celery, django-mailer, django-notification, and django-socialregistration.
Let me describe my situation:
He is using Mac OS X, and I have no experience installing or configuring anything on that platform.
I believe that getting my backend to run on his computer might be somewhat troublesome, i.e. I would have to install a bunch of plugins (which are not available via pip or easy_install, as they are the latest versions), and I have also made heavy modifications to django-socialregistration, which I currently use by symlinking from my Python path to the modified code in my repo.
I tried to look into solutions like pip and easy_install, but I have not been able to get them to install code from GitHub (see the note below).
I think the easiest way is to get my backend working on his computer and then have him just commit to the repo. Any ideas how I can make this easy?
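On the GitHub point, a minimal sketch, assuming the plugins live in public GitHub repositories (the <user> placeholder and egg name below are illustrative), is that pip can install directly from a Git URL:
pip install git+https://github.com/<user>/django-socialregistration.git#egg=django-socialregistration
Whether that helps here depends on how heavily the forked code has been modified.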
Another free option is to use VirtualBox. I would recommend installing the same OS on it as your production server; then he is developing in the same environment as the live site and can just check into the repo the same as you. You may want to do the same on your end, so that both of your environments match each other and the live site.
Get him to set up a virtual machine on his Mac, using VMware Fusion or Parallels, running the same operating system that you currently use for your backend. If he prefers developing with Mac tools, he can still do that by sharing his local changes to the virtual machine via a shared directory.
An alternative, if that's possible, would be to set up a testing/development environment on a machine with an OS you're familiar with, then install something like Dropbox on his local machine where he can develop the frontend code, and install Dropbox on that other environment with the backend components. Dropbox would sync his local changes to that testing environment for him to run the code on.
That way, he would be able to use that environment to test his code, you wouldn't need to set up a backend on his machine (or keep it up to date) and you'd still be getting the same functionality.
Again, if that's an option.
