python modules in gcloud deployment manager template - python

Is it possible to use modules installed via python pip in gcloud deployment manager templates (python templates, not jinja)?
I have only been able to find references for how to import .py files through a deployment manager schema file, e.g.:
app.py.schema
info:
title: app
author: me
description: this is a description
imports:
- path: helper.py
i.e. I can only import a single .py file at a time, which is not useful for importing pip modules.
This link explains that to use libraries that are not explicitly supported, we need to import the full library source. However, it does not mention whether this full library source can actually be a pip package, or whether it only refers to single .py files.
The module I'm trying to use inside my Python templates is netaddr, for manipulating IP addresses and subnets.
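For illustration, the kind of template logic in question might look like the following. This is a hypothetical sketch that assumes netaddr could be imported (it cannot be pip-installed here), and the resource type and the 'cidr', 'region' and 'network' property names are only indicative:
import netaddr  # sketch only: assumes netaddr were importable in the template environment

def GenerateConfig(context):
    # Carve the CIDR block supplied as a deployment property into /24 subnets.
    block = netaddr.IPNetwork(context.properties['cidr'])
    resources = []
    for i, subnet in enumerate(block.subnet(24)):
        resources.append({
            'name': 'subnet-%d' % i,
            'type': 'compute.v1.subnetwork',  # illustrative resource type
            'properties': {
                'ipCidrRange': str(subnet),
                'region': context.properties['region'],
                'network': context.properties['network'],
            },
        })
    return {'resources': resources}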
Any help is appreciated.

What you are looking for is not possible: you cannot install a module using pip while interacting with the API. The alternative is to import the whole netaddr module as source code in your *.yaml config file (by adding the path for every file belonging to the module) and then import the functions you need in your *.py file. As Google mentions in the documentation, only some libraries are supported, and even then certain sys and network calls will be rejected. You may also want to consider using template_module.
Original Answer:
Yes, you can check the link here for importing multiple Python files and using multiple templates.
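As a rough illustration of that multi-file approach (the file names match the schema example above; the helper's contents are hypothetical):
# helper.py -- listed under 'imports' in app.py.schema, so it can be
# imported by name from the main template.
def instance_name(prefix, index):
    return '%s-%d' % (prefix, index)

# app.py -- the main template.
import helper

def GenerateConfig(context):
    return {'resources': [{
        'name': helper.instance_name(context.env['deployment'], 0),
        'type': 'compute.v1.instance',
        'properties': {},  # illustrative only
    }]}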

Related

Creating package from boost::python modules

I am trying to follow the tutorial for creating python packages from shared objects compiled from C++ via boost::python, but I am running into some problems I need clarification about.
Assume I have a local $installdir into which I install the compiled shared objects in the form of a Python package via CMake. Parallel to the tutorial linked above, my structure is:
$installdir/
    my_package/
        __init__.py
        module/
            __init__.py
            _module.so
I have added $installdir to my $PYTHONPATH.
$installdir/my_package/__init__.py is empty.
$installdir/my_package/module/__init__.py contains:
from _module import *
When I then import my_package.module I get ModuleNotFoundError: No module named '_module' raised from $installdir/my_package/module/__init__.py.
The issue seems to be that _module.so is not found from $installdir/my_package/module/__init__.py.
Why is the approach from the tutorial not working?
If I add $installdir/my_package/module to $PYTHONPATH directly, everything works fine, but it feels like that should not be necessary, as $installdir/my_package/module/__init__.py should find _module.so locally.
I implemented the following portable workaround for now within $installdir/my_package/module/__init__.py:
import sys, pathlib
sys.path.insert(0, str(pathlib.Path(__file__).parent.absolute()))
from _module import *
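For comparison, the package-relative form of the same import would be the following; implicit relative imports were removed in Python 3, which is the likely reason the tutorial's plain form fails when the package is imported from $installdir:
# Explicit relative import from within my_package/module/__init__.py.
from ._module import *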
Bonus Question:
Changing the file name extension from .so to .pyd breaks the import (ModuleNotFoundError), even without any packaging and with the .pyd being accessible directly via $PYTHONPATH. I define the extension via CMake's SUFFIX target property. This is obviously mostly cosmetic, but I would still like to understand the reason and how to fix it.
Edit:
This is Ubuntu 20.04 with Python 3.8 and Boost 1.71.
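For the bonus question, the suffixes the import system accepts for extension modules can be inspected directly; on Linux the list normally contains only .so variants, which would explain why renaming to .pyd breaks the import:
import importlib.machinery

# On CPython 3.8 under Linux this typically prints something like
# ['.cpython-38-x86_64-linux-gnu.so', '.abi3.so', '.so'] -- no '.pyd'.
print(importlib.machinery.EXTENSION_SUFFIXES)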

Unable to 'relative import' a local Python library using a symbolic link

My project has two main folders, sourceCode and lib (highlighted file tree here).
I'm working in \sourceCode\mainFile.ipynb and would like to import a library residing in lib called modifiedLibrary, which has an __init__.py file.
Currently, I'm using a symbolic link for relative-importing the library. The symbolic link is located in \sourceCode and called sym_link with the following content:
../lib/modifiedLibrary/modifiedLibrary
In the project, the library and the symbolic link have the same name.
But when I import it in Python using
import modifiedLibrary
I receive ModuleNotFoundError: No module named 'modifiedLibrary'
The same code works on another device (which I do not have access to right now), and I cannot figure out what the issue is.
I successfully included the needed library by:
changing the working directory temporarily to where the library's __init__.py is located,
importing the library
then reverting back to my original directory
but I would like to know what the issue is with the current symbolic link.
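A rough sketch of that temporary-chdir workaround (the relative path is assumed from the file tree described above):
import os

original_dir = os.getcwd()
# Move to the directory that contains the modifiedLibrary package folder.
os.chdir(os.path.join('..', 'lib', 'modifiedLibrary'))
try:
    import modifiedLibrary
finally:
    # Always revert to the original working directory.
    os.chdir(original_dir)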
Windows 10 / Python 3.7.3 / Jupyter
Relevant Question: Interactive Python - solutions for relative imports
The other solution I found, rather than changing the working directory temporarily to include a local module, was to add the location of the module on my device to sys.path before importing it:
import sys
sys.path.append('C:/Users/user/myProject/../modifiedLibrary/')
import modifiedLibrary
It doesn't make use of the symbolic link, but it seems to do the trick for now. It would be an issue when the code is shared and run on another device.
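A somewhat more portable variant of the same sys.path idea, resolving the path relative to the notebook's working directory rather than hard-coding it (folder layout assumed from the description above), might be:
import sys
from pathlib import Path

# sourceCode and lib are assumed to be sibling folders; the modifiedLibrary
# package folder is assumed to live inside lib/modifiedLibrary.
lib_dir = Path.cwd().parent / 'lib' / 'modifiedLibrary'
sys.path.append(str(lib_dir))

import modifiedLibrary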

What is the cleanest way to add a directory of third-party packages to the beginning of the Python path?

My context is appengine_config.py, but this is really a general Python question.
Given that we've cloned a repo of an app that has an empty directory lib in it, and that we populate lib with packages by using the command pip install -r requirements.txt --target lib, then:
# (os, site and sys are assumed imported for the snippets in this question)
import os
import site
import sys

dirname = 'lib'
dirpath = os.path.join(os.path.dirname(__file__), dirname)
For importing purposes, we can add such a filesystem path to the beginning of the Python path in the following way (we use index 1 because the first position should remain '.', the current directory):
sys.path.insert(1, dirpath)
However, that won't work if any of the packages in that directory are namespace packages.
To support namespace packages we can instead use:
site.addsitedir(dirpath)
But that appends the new directory to the end of the path, which we don't want in case we need to override a platform-supplied package (such as WebOb) with a newer version.
The solution I have so far is this bit of code which I'd really like to simplify:
sys.path, remainder = sys.path[:1], sys.path[1:]
site.addsitedir(dirpath)
sys.path.extend(remainder)
Is there a cleaner or more Pythonic way of accomplishing this?
For this answer I assume you know how to use setuptools and setup.py.
Assuming you would like to use the standard setuptools workflow for development, I recommend using this code snippet in your appengine_config.py:
import os
import sys
if os.environ.get('CURRENT_VERSION_ID') == 'testbed-version':
    # If we are unittesting, fake the non-existence of appengine_config.
    # The error message of the import error is handled by gae and must
    # exactly match the proper string.
    raise ImportError('No module named appengine_config')
# Imports are done relative because Google app engine prohibits
# absolute imports.
lib_dir = os.path.join(os.path.dirname(os.path.realpath(__file__)), 'libs')
# Add every library to sys.path.
if os.path.isdir(lib_dir):
    for lib in os.listdir(lib_dir):
        if lib.endswith('.egg'):
            lib = os.path.join(lib_dir, lib)
            # Insert to override default libraries such as webob 1.1.1.
            sys.path.insert(0, lib)
And this piece of code in setup.cfg:
[develop]
install-dir = libs
always-copy = true
If you type python setup.py develop, the libraries are downloaded as eggs into the libs directory, and appengine_config.py inserts them into your path.
We use this at work to include webob==1.3.1 and internal packages which are all namespaced using our company namespace.
You may want to have a look at the answers in the Stack Overflow thread, "How do I manage third-party Python libraries with Google App Engine? (virtualenv? pip?)," but for your particular predicament with namespace packages, you're running up against a long-standing issue I filed against site.addsitedir's behavior of appending to sys.path instead of inserting after the first element. Please feel free to add to that discussion with a link to this use case.
I do want to address something else that you said that I think is misleading:
My context is appengine_config.py, but this is really a general Python
question.
The question actually arises from the limitations of Google App Engine, namely the inability to install third-party packages, and hence the need for workarounds such as manually adjusting sys.path or using site.addsitedir. In general Python development, if your code uses these, you're Doing It Wrong.
The Python Packaging Authority (PyPA) describes the best practices for putting third-party libraries on your path, which I outline below:
Create a virtualenv
Mark out your dependencies in your setup.py and/or requirements files (see PyPA's "Concepts and Analyses"; a minimal setup.py sketch follows this list)
Install your dependencies into the virtualenv with pip
Install your project, itself, into the virtualenv with pip and the -e/--editable flag.
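As a concrete illustration of the second step, a minimal setup.py declaring a dependency might look like this (the project name and pinned version are purely illustrative):
from setuptools import setup, find_packages

setup(
    name='myapp',  # hypothetical project name
    version='0.1.0',
    packages=find_packages(),
    install_requires=[
        'WebOb>=1.3.1',  # example third-party dependency
    ],
)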
Unfortunately, Google App Engine is incompatible with virtualenv and with pip. GAE chose to block this toolset in an attempt to sandbox the environment. Hence, one must use hacks to work around the limitations of GAE to use additional or newer third-party libraries.
If you dislike this limitation and want to use standard Python tooling for managing third-party package dependencies, other Platform as a Service providers out there eagerly await your business.

Re-opening a package in Python

Maybe it's not possible (I'm more used to Ruby, where this sort of thing is fine). I'm writing a library that provides additional functionality to docker-py, which provides the docker package, so you just import docker and then you get access to docker.Client etc.
Because it seemed a logical naming scheme, I wanted users to pull in my project with import docker.mymodule, so I've created a directory called docker with an __init__.py, and mymodule.py inside it.
When I try to access docker.Client, Python can't see it, as if my docker package has hidden it:
import docker
import docker.mymodule
docker.Client() # AttributeError: 'module' object has no attribute 'Client'
Is this possible, or do all top-level package names have to differ between source trees?
This would only be possible if docker was set up as a namespace package (which it isn't).
See zope.schema, zope.interface, etc. for an example of a namespace package (zope is the namespace package here). Because zope is declared as a namespace package in setup.py, it means that zope doesn't refer to a particular module or directory on the file system, but is a namespace shared by several packages. This also means that the result of import zope is pretty much undefined - it will simply import the top-level module of the first zope.* package found in the import path.
Therefore, when dealing with namespace packages, you need to explicitly import a specific one with import zope.schema or from zope import schema.
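For reference, the classic setuptools-era way of declaring such a namespace looked roughly like this (shown only as a sketch; docker-py does not do this):
# zope/__init__.py in each distribution that shares the 'zope' namespace
# contains only this namespace declaration:
__import__('pkg_resources').declare_namespace(__name__)

# ...and each distribution's setup.py lists the namespace, e.g.:
# setup(..., namespace_packages=['zope'], ...)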
Unfortunately, namespace packages aren't that well documented. As noted by @Bakuriu in the comments, these are some resources that contain helpful information:
Stackoverflow: How do I create a namespace package in Python?
Built-in support for namespace packages in Python 3.3
Namespace packages in the setuptools documentation
Post about namespace packages at sourceweaver.com

Python modules' paths

In some Python scripts I see the following imports:
import fileA
import someDir.fileB
from fileC import functionA
There exist corresponding files fileA.py, someDir/fileB.py and fileC.py. However, while looking in the Requests source code, I found this in the __init__.py file:
from requests.packages.urllib3.contrib import pyopenssl
In this case, requests is the CWD and packages.urllib3.contrib.pyopenssl.py is the file. Why does this defy convention? I do see that the packages.urllib3.contrib directory also has an __init__.py file, which seems to be related.
Furthermore, I'm not sure if this is related, but I think it is, so I'll post it here. In my project I have the folder kennethreitz/requests, since the application depends on the Requests module but I'm deploying it to environments which might not have Requests installed. However, simply adding import kennethreitz.requests to the file does not make the Requests module available. I have tried importing kennethreitz.requests.__init__ and a few other obvious permutations, but I cannot get the module to import. How can I package Requests with my code? The obvious Google searches are not helping.
requests is using an absolute import. You cannot arbitrarily nest packages into other directories and still expect things to work.
Instead, add the kennethreitz directory (which should not have an __init__.py file) to your sys.path module search path. That way the requests module will still be importable as a top-level package.
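A minimal sketch of that adjustment, assuming the kennethreitz directory sits next to the importing script:
import os
import sys

# Make the vendored copy importable as the top-level 'requests' package.
vendor_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'kennethreitz')
sys.path.insert(0, vendor_dir)

import requests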
Next, you may want to look into Python packaging, dependencies and using a tool like pip or zc.buildout to deploy your code for you. Those tools handle dependencies for you and will install requests as required. See the Python Packaging User Guide for an introduction.
