Google Cloud Functions does not support private dependencies.
Following their recommendation (https://cloud.google.com/functions/docs/writing/specifying-dependencies-python#using_private_dependencies) I downloaded the package via:
pip install -t vendor foo
touch vendor/__init__.py
Which results in a directory:
vendor
|- foo-0.0.1.dist-info
|  |- INSTALLER
|  |- METADATA
|  |- RECORD
|  |- REQUESTED
|  |- top_level.txt
|  |- WHEEL
|- __init__.py
Now trying to import vendor.foo results in an error:
ModuleNotFoundError: No module named 'vendor.foo'
Is there an import subtlety I am missing, or how is this supposed to work?
There are multiple ways to do this.
Relative imports: see here; the guide you linked also uses relative imports. If you import it as import vendor.foo, it will only work when you run the Python script from the folder that also contains the vendor folder.
Adding the package location to the PYTHONPATH environment variable: export PYTHONPATH="$PYTHONPATH:/path/to/vendor". This can also be done inside a Python script using sys.path (see the sketch below); I'm not sure how this is done in the case of Gcloud.
Installing the package into a virtual environment. As shown here, you can set up a virtual environment; once you have activated it, you should be able to pip install packages into it.
The first approach should definitely work for you.
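A minimal sketch of the sys.path variant mentioned in the second approach, assuming the script sits in the same directory as the vendor folder:
import os
import sys

# put the vendored packages on the module search path
vendor_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'vendor')
sys.path.insert(0, vendor_dir)

import foo  # now resolves to vendor/foo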
The problem was in foo itself: I had forgotten a top-level __init__.py, so package auto-detection failed and no sources were added to the wheel.
Thus, when the package was downloaded later, no package directory was created.
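For reference, a sketch of why the wheel came out empty, assuming foo relied on setuptools package auto-detection:
# foo's setup.py (illustrative): find_packages() only detects directories
# that contain an __init__.py, so without foo/__init__.py it returns an
# empty list and the wheel is built with no sources
from setuptools import setup, find_packages

setup(
    name='foo',
    version='0.0.1',
    packages=find_packages(),
)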
Related
I have a python package with the following standard directory structure:
package_name/
    setup.py
    package_name/
        module_1.py
        module_2.py
        ...
    tests/
    docs/
I've installed this package with pip3 install -e .. I've noticed an inconsistent importing issue. (Please read to the end!) If I restart terminal and run the following (1) within the interpreter:
>>> from package_name import module_1
I get an import error. If I instead run this (2):
>>> from package_name.package_name import module_1
it imports fine. If I then navigate to the directory and rerun pip3 install -e ., I can import in the standard way (following (1)). What on earth is causing this? To make things stranger, I can import in the standard way (1) in Jupyter and my IDE without reinstalling the package. This issue only comes up when I open/restart terminal.
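A quick diagnostic (not a fix) to see which copy the interpreter actually picks up on any given run:
import package_name

# shows whether the installed (editable) copy or a directory that merely
# shadows it from the current working directory is being imported
print(package_name.__file__)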
This should be fixed by adding the main project folder package_name/ to your PYTHONPATH (the import path, not PATH; see the sketch below for doing the same from inside Python).
Also, try renaming your project folders so that the outer and inner folders have different names, to avoid confusion for yourself and anyone working with you, and to help Python find the right module location.
You should also create an __init__.py file in each module folder, even if it is an empty file; this also helps Python locate the modules.
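If you prefer to do the path fix from inside Python rather than via the environment variable, a minimal sketch (the path is a placeholder):
import sys

# hypothetical absolute path to the outer project folder, i.e. the one
# containing setup.py and the inner package_name/ directory
sys.path.insert(0, '/path/to/package_name')

from package_name import module_1  # now resolves to the inner package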
I have a Git repository cloned into myproject, with an __init__.py at the root of the repository, making the whole thing an importable Python package.
I'm trying to write a setuptools setup.py for the package, which will also sit in the root of the repository, next to the __init__.py file. I want setup.py to install the directory it resides in as a package. It's fine if setup.py itself comes along as part of the installation, but it would be better if it didn't. Ideally this should work also in editable mode (pip install -e .)
Is this configuration at all supported? I can kind of make it work by having a package_dir= {"": ".."}, argument to setup(), telling it to look for myproject in the directory above the current one. However, this requires the package to always be installed from a directory named myproject, which does not appear to be the case if, say, it's being installed through pip, or if someone is working out of a Git clone named myproject-dev, or in any number of other cases.
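For illustration, the hack described above amounts to roughly the following (the version number is made up), with exactly the naming constraint mentioned:
# setup.py at the repository root, next to __init__.py
from setuptools import setup

setup(
    name='myproject',
    version='0.1.0',  # hypothetical
    packages=['myproject'],
    # look one directory up for the package; this only resolves correctly
    # when the containing directory is literally named "myproject"
    package_dir={'': '..'},
)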
Another hack I'm contemplating is a symlink to . named mypackage inside of the repository. That ought to work, but I wanted to check if there was a better way first.
See also Create editable package setup.py in the same root folder as __init__.py
As far as I know this should work:
myproject-dev/
├── __init__.py
├── setup.py
└── submodule
    └── __init__.py
#!/usr/bin/env python3

import setuptools

setuptools.setup(
    name='MyProject',
    version='0.0.0.dev0',
    packages=['myproject', 'myproject.submodule'],
    package_dir={
        'myproject': '.',
    },
)
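A quick sanity check after installing from the repository root (a sketch, assuming the layout above):
# both the top-level package and the submodule should now resolve
import myproject
import myproject.submodule

print(myproject.__file__)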
One way to make this work for editable or develop installations is to manually modify the easy-install.pth file.
Assuming:
the project lives in: /home/user/workspace/empty/project;
a virtual environment .venv is used;
the project is installed with python3 -m pip install -e . or python3 setup.py develop;
the Python version is 3.6.
Then:
the file is found at a location such as /home/user/workspace/empty/project/.venv/lib/python3.6/site-packages/easy-install.pth;
its content is: /home/user/workspace/empty/project.
In order to let the imports work as expected one can edit this line to read the following:
/home/user/workspace/empty
Note:
Everything in /home/user/workspace/empty that looks like a Python package can then be imported; that is why it is a good idea to place the project in its own directory, where in this case the directory empty contains nothing else but the directory project.
The module project.setup is also importable.
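With that edit in place, the expected behaviour is roughly:
# the project now imports under its directory name
import project
print(project.__file__)  # e.g. /home/user/workspace/empty/project/__init__.py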
I am trying to generate Python bindings for a C++ shared library with SWIG and distribute the project with conda. The build process seems to work as I can execute
import mymodule as m
ret = m.myfunctioninmymodule()
in my build directory. Now I want to install files that are created (namely, mymodule.py and _mymodule.pyd) in my conda environment on Windows so that I can access them from everywhere. But where do I have to place the files?
What I have tried so far is to put both files in a package together with a __init__.py (which is empty, however) and write a setup.py as suggested here. The package has the form
- mypackage
|- __init__.py
|- mymodule.py
|- _mymodule.pyd
and is installed under C:\mypathtoconda\conda\envs\myenvironmentname\Lib\site-packages\mypackage-1.0-py3.6-win-amd64.egg. However, the python import (see above) fails with
ImportError: cannot import name '_mymodule'
It should be noted that under Linux this approach with the package works perfectly fine.
Edit: The __init__.py is empty because this is sufficient to build a package. I am not sure, however, what belongs in there. Do I have to give a path to certain components?
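For reference, a minimal setup.py sketch for the layout above; package_data is one way to have the compiled extension copied alongside the generated wrapper (the version number is made up, and whether this alone cures the Windows import error is not guaranteed):
from setuptools import setup

setup(
    name='mypackage',
    version='1.0',  # hypothetical
    packages=['mypackage'],
    # ship the SWIG-generated extension next to mymodule.py
    package_data={'mypackage': ['_mymodule.pyd']},
)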
Initial directory structure: (assume __init__.py files in directories)
project/
  ▾ lib/
    ▸ dir/
    ▸ package
I need to reuse lib.package in other projects, so I created a Python package for it and removed the directory. But now that it is installed as lib.package, I can't import it from the root of the project, because the local lib directory there causes a namespace collision.
Final structure:
▾ project/
  ▾ lib/
    ▸ dir/
And a package named lib.package installed in the virtualenv.
▾ lib/
  ▸ package/
  __init__.py
I looked into pkgutil.extend_path, but adding it to the __init__.py of the lib.package Python package didn't help. Is there any way I can make both the local and the virtualenv-installed packages share the same lib namespace?
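For reference, a sketch of the standard pkgutil recipe, placed in the lib/__init__.py of both copies of the namespace:
# lib/__init__.py: pkgutil-style namespace package; merges every lib/
# directory found on sys.path into one logical package
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)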
In Python 2.7 it's better to avoid this problem by choosing a name for the dependency that's unique, or at least importable in your application's codebase without manipulating paths et cetera. I think the situation has improved with implicit namespace packages in Python 3.3 (PEP 420), but I'm not entirely sure here.
Furthermore, I had the same problem and decided to use unique names for another reason: when everything is namespaced under lib, it's quite hard to figure out where a module comes from, since it could be a dependency as well as the local application. Unique names mean I can look at an import line and know, at all times, whether it comes from my own application codebase, one of my own dependencies, or an external dependency.
How can I 'embed' a Python library in my own Python package?
Take the Requests library, for instance.
How could I integrate it into my own package, the objective being to allow me to run my application on different machines without actually installing Requests on every one, but having it in the same folder as my package?
Is this even possible?
If it's a pure Python library (no compiled modules) you can simply place the library in a folder in your project and add that folder to your module search path. Here's an example project:
|- application.py
|- lib
|  `- ...
|- docs
|  `- ...
`- vendor
   |- requests
   |  |- __init__.py
   |  `- ...
   `- other libraries...
The vendor folder in this example contains all third party modules. The file application.py would contain this:
import os
import sys
# Add vendor directory to module search path
parent_dir = os.path.abspath(os.path.dirname(__file__))
vendor_dir = os.path.join(parent_dir, 'vendor')
sys.path.append(vendor_dir)
# Now you can import any library located in the "vendor" folder!
import requests
Bonus fact
As noted by seeafish in the comments, you can install packages directly into the vendor directory:
pip install <pkg_name> -t /path/to/vendor_dir
If you only need to run your application, PyInstaller packaging may be a better option.
It will create a single bundle with everything that is needed, including Python itself, avoiding dependencies on the system you're running on.
While not a direct answer to your question, you may want to look at setuptools. By leveraging this package distribution mechanism you can describe your dependencies, and when your package is "installed" all the dependent packages will be installed too. You would create a setup.py file at the top of your package structure similar to:
from setuptools import setup, find_packages

setup(
    name='MyPackage',
    version='1.0',
    packages=find_packages(),
    ...
    install_requires=['requests'],
    ...
)
This would then be installed by the user with:
python setup.py install
Requests would be automatically installed too.
All of the above answers are correct, but the best solution is to create a standard package.
You can refer to this link:
https://packaging.python.org/tutorials/packaging-projects/