how to add precompiled binaries to a python project (cloud-functions python)?

I have to deploy a Python function to GCP. I want to use a library (USD by Pixar, specifically) which I need to build myself. To make the library accessible I need to make changes to $PATH and $PYTHONPATH.
So the problem is that I have to include everything in one project to deploy the function, and I don't know where to start.
I have tried appending to PYTHONPATH at run-time, but it gives a "no module" error. I also have no idea how I can change the PATH variable to be able to use the executables inside the usd/bin folder:
import sys
sys.path.append('lib/python') # relative path.
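A minimal sketch of what the runtime setup could look like, assuming the built USD distribution is bundled alongside the function source in a folder named usd (the folder name and layout here are assumptions):

import os
import sys

# Resolve paths relative to this file, not the current working directory,
# which is not guaranteed when the function runs in the cloud. A bare
# relative path like 'lib/python' is resolved against the CWD, which is
# why the append above fails with a "no module" error.
BASE = os.path.dirname(os.path.abspath(__file__))
USD_ROOT = os.path.join(BASE, 'usd')  # hypothetical location of the bundled build

# Make the Python bindings importable.
sys.path.insert(0, os.path.join(USD_ROOT, 'lib', 'python'))

# Prepend the bundled executables so subprocess calls can find them.
os.environ['PATH'] = os.path.join(USD_ROOT, 'bin') + os.pathsep + os.environ.get('PATH', '')

from pxr import Usd  # USD's Python bindings live in the pxr package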

Related

How does python venv manage C++ dependencies

I'm using a library which offers a Python wrapper to a C++ executable.
I installed it (https://github.com/bulletphysics/bullet3) using venv (https://docs.python.org/3/library/venv.html) - and all is working great.
I'm considering trying to build https://github.com/bulletphysics/bullet3
From the root of the venv folder I found gym/lib/python3.7/site-packages/pybullet.cpython-37m-x86_64-linux-gnu.so. I'm guessing this is the executable that is eventually invoked from Python.
What steps are involved in calling from Python to the correct external binary executable? How does import pybullet as p resolve to gym/lib/python3.7/site-packages/pybullet.cpython-37m-x86_64-linux-gnu.so?
This seems to be close to the boundary of the C++ world, but I can't find the right keyword searches to see exactly how that allows Python usage.
Thanks
In short: CPython just looks for a correctly named dynamic library in the directories on sys.path (which PYTHONPATH extends), loads that library, and uses a predefined interface to work out what from the library shall be visible as the contents of the module inside Python.
The details of how to prepare such a shared object, and what its required contents are, are described in https://docs.python.org/3/extending/index.html
So venv just puts the dynamic library in a directory that is part of the virtual environment's sys.path.
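You can watch this resolution happen without importing anything, e.g.:

import importlib.util

# Ask the import machinery where "pybullet" would be loaded from
# (find_spec returns None if the module cannot be found at all).
spec = importlib.util.find_spec("pybullet")
print(spec.origin)  # e.g. .../site-packages/pybullet.cpython-37m-x86_64-linux-gnu.so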

How will other systems recognize my PYTHONPATH since it isn't provided?

I'm coding a bot.
In this bot, deep in the program's directory structure, I have to make an import that needs the absolute path of a package far away in the directory structure, in a way that I can't reach with relative imports.
I've managed to import it successfully by exporting a PYTHONPATH variable containing the absolute path to my package in my local ~/.bashrc file.
Then I can import things in my program like:
import absolute_path.module
The thing is, when someone else downloads this program's files to use, or when I upload it to a server, how is this other party going to manage the absolute import I made? (Provided the package to be imported goes along with the program files, in the same path where I do the importing.)
They didn't set the PYTHONPATH variable, so are they going to have trouble?
It depends. Is the other module something standard (i.e. installable via pip etc.)? Then you just add it to your project's requirements.txt and the users should be able to figure it out from there.
If it's something you've written, then you can use something like PyInstaller to package all the dependencies of your module (including imports and even the Python interpreter) so users don't need to download anything extra.
Another option is to put the other module with your bot module and distribute them together, and use relative paths.
Make your bot into an installable package
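For the installable-package route, a minimal setup.py sketch (the distribution and package names here are hypothetical):

from setuptools import setup, find_packages

setup(
    name="mybot",              # hypothetical distribution name
    version="0.1.0",
    packages=find_packages(),  # discovers the bot package and its subpackages
    install_requires=[],       # pip-installable dependencies go here
)

After pip install -e . locally (or a regular pip install on the server), the package is importable from anywhere without anyone touching PYTHONPATH.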

Setting relative PYTHONPATH for project in Windows (Visual Studio Code)

I'm trying to set up an environment for a Python project to help out a friend (we're developing on different OSes).
I want to set environment variables per project in the .env file, but ultimately it fails to recognize PYTHONPATH.
The goal is to set the /api subfolder of my currently opened folder, but it doesn't recognize the relative path. Writing an absolute path there (as in the screenshot below) sort of works - it recognizes the custom library and all the others, but on debugging it fails on other relative paths in the code (or so it appears).
Any advice? Thanks.
One approach is to avoid the whole environment setup to begin with and define your dependencies in source. This works when your repo has all you need.
Consider a project structure like this:

proj
  /api
    api_source.py
  /utils
    utils_source.py
  /tests

Then in api_source.py:
import os
import sys

# Append the project root (the parent of /api) to the module search path,
# resolved relative to this file so it works from any working directory.
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))

import utils.utils_source as utils_source
It's a bit messy, but I like this approach in that I do not have to set up any environment wherever I clone my repo, at least not for the self-contained dependencies within the repo itself.
Usually I'll do this path appending in the main script and structure all the imports in the dependencies to rely on a fixed path prefix; that way I do the operation once and all the dependent modules run with it.
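For example, a hypothetical main.py at the project root could do the setup once:

import os
import sys

# Put the project root first on the search path; done once, here.
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

# All dependencies can now import with the same root-relative prefix.
import api.api_source
import utils.utils_source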

Python: Sharing model code between 2 modules

I'm climbing my learning curve in Python and trying to understand where to put everything.
I originally have a Python module in a folder with a subfolder src. In this src folder I have my main source file, say main.py, and a models folder storing my model code.
/myproject/src/main.py
/myproject/src/models/a_model.py
/myproject/src/models/b_model.py
So my main will import the model like this:
from models.a_model import a
Then when I package the zip file, I just zip the myproject folder with that folder structure, deploy, and everything is fine.
Now I have another new module doing something different but need to use the same models.
I can easily duplicate them all, code separately, and deploy. But I would like to share the model code, so that when one model changes, I only need to update one place instead of two.
My new module is like
/mynew/src/main-b.py
/mynew/src/models/a_model.py
/mynew/src/models/b_model.py
What is the best practice to do this?
Do I put like this?
/myproject/src/main.py
/mynew/src/main-b.py
/models/a_model.py
/models/b_model.py
And then update the import?
But I have doubts about how to deploy. Do I also have to set up the same folder structure?
One option would be adding /myproject/src/models to the environment variable PYTHONPATH. Python adds the directories listed in the PYTHONPATH environment variable to sys.path, the list of directories where Python searches when you try to import something. This is bad, because modifying PYTHONPATH has its own side effects; fortunately, virtual environments provide a way to get around those side effects.
Alternatively, and much better, you could add your modules to the site-packages directory. site-packages is added to sys.path by default, which obviates the need to modify PYTHONPATH. To locate the site-packages directory, refer to this page from the Python documentation: Installing Python Modules (Legacy version).
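A clean way to get the shared models into site-packages is to make them their own installable package; a minimal sketch, with hypothetical names, assuming the models move into a standalone shared-models project:

shared-models/
  setup.py
  models/
    __init__.py
    a_model.py
    b_model.py

with a setup.py along the lines of:

from setuptools import setup, find_packages

setup(
    name="shared-models",      # hypothetical distribution name
    version="0.1.0",
    packages=find_packages(),  # picks up the models package
)

After pip install -e ../shared-models in each project's environment, from models.a_model import a works in both projects, and a model only ever needs updating in one place.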
You could also use the LiClipse IDE, which comes with PyDev already installed. Create a source folder from the IDE and link your previous project with your newer project. When you link your projects, the IDE adds the source folders of your older project to the PYTHONPATH of your newer project, and thus Python will be able to locate your modules.

How can I make a Python extension module packaged as an egg loadable without installing it?

I'm in the middle of reworking our build scripts to be based upon the wonderful Waf tool (I did use SCons for ages, but it's just way too slow).
Anyway, I've hit the following situation and I cannot find a resolution to it:
I have a product that depends on a number of previously built egg files.
I'm trying to package the product using PyInstaller as part of the build process.
I build the dependencies first.
Next I want to run PyInstaller to package the product that depends on the eggs I built. I need PyInstaller to be able to load those egg files as part of its packaging process.
This sounds easy: you work out what PYTHONPATH should be, construct a copy of os.environ with the variable set up correctly, and then invoke the PyInstaller script using subprocess.Popen, passing the previously configured environment as the env argument.
The problem is that setting PYTHONPATH alone does not seem to be enough if the eggs you are adding are extension modules packaged as zip-safe.
If I unzip the eggs (renaming the directories to .egg), I can import them with no further settings but this is not what I want in this case.
I can also get the eggs to import from a subshell by doing the following:
1. Setting PYTHONPATH to the directory that contains the egg you want to import (not the path of the egg itself).
2. Loading a Python shell and using pkg_resources.require to locate the egg.
Once this has been done, the egg loads as normal. Again, this is not practical, because I need to be able to run my Python shell in a manner where it is ready to import these eggs from the off.
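For reference, the recipe above amounts to something like this inside the shell (the egg's project name is hypothetical):

import pkg_resources

# sys.path (via PYTHONPATH) must already contain the directory holding
# the .egg file, not the egg file itself.
pkg_resources.require("my_extension")  # project name from the egg's setup.py

import my_extension  # the packaged extension module now imports normally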
The dirty option would be to output a wrapper script that takes the above actions before calling the real target script, but this seems like the wrong thing to do: there must be a better way.
Heh, I think this was my bad. The issue appears to have been that the zip_safe flag in setup.py for the extension package was set to False, which appears to prevent treating the egg as zip-safe at all.
Now that I've set it to True, I can import the egg files simply by adding each one to PYTHONPATH.
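In setup.py terms, the fix described above looks something like this (the package and source names are hypothetical):

from setuptools import setup, Extension

setup(
    name="my_extension",  # hypothetical name
    version="1.0",
    ext_modules=[Extension("my_extension", ["my_extension.c"])],
    zip_safe=True,  # the flag in question: allow the egg to be imported without unzipping
)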
I hope someone else finds this answer useful one day!
Although you have a solution, you could always try "virtualenv", which creates a virtual environment of Python where you can install and test Python packages without messing with the core system Python:
http://pypi.python.org/pypi/virtualenv
