Trouble loading local modules only with AWS Lambda - python

app structure:
.
├── Makefile
├── Pipfile
├── Pipfile.lock
├── README.md
├── template.yaml
├── tests
│   ├── __init__.py
│   └── unit
│       └── lambda_application
│           ├── test_handler.py
│           └── test_parent_child_class.py
└── lambda_application
    ├── __init__.py
    ├── first_child_class.py
    ├── lambda_function.py
    ├── second_child_class.py
    ├── requirements.txt
    └── parent_class.py
4 directories, 14 files
Code sample from lambda_function.py:
import os
import json
from hashlib import sha256
import boto3
from requests import Session
from .first_child_class import FirstChildClass
def lambda_handler(event, context):
    # Do some stuff.
As is, I get the error message
Unable to import module 'lambda_function'
but if I comment out the last import, from .first_child_class import FirstChildClass, it gets past that point and instead fails because the module for that class hasn't been loaded.
I only seem to get this error when I run it in the lambci/lambda:python3.7 docker image and when I deploy on AWS. All my tests pass and it is able to import the module with no problems.
Is there something I should load/setup in the __init__.py file?
EDIT: I changed the names of some of the files to post it here.

You are using a relative import here, which works when the code being executed is part of a package. However, since AWS Lambda executes lambda_function.py as a top-level module rather than as part of a package, the relative import fails.
https://stackoverflow.com/a/73149/6391078
A quick run locally gave the following error:
PYTHON 3.6
Traceback (most recent call last):
  File "lambda_function.py", line 4, in <module>
    from .first_child_class import FirstChildClass
ModuleNotFoundError: No module named '__main__.first_child_class'; '__main__' is not a package
Your tests pass because your test suite imports the file as a module from the lambda_application folder, which gets treated as a package by the test runner.
This got me going in the right direction and led me to the answer, so I thought I would update what I found here.
I didn't try it, but from what I found, I believe that:
from first_child_class import FirstChildClass
would be the simplest resolution.
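In other words, a minimal sketch of lambda_function.py with the plain absolute import (the handler body and return value below are only placeholders):
import os
import json
from hashlib import sha256

import boto3
from requests import Session

# Absolute import: Lambda puts the handler's directory on sys.path, so a file that
# sits next to lambda_function.py is importable as a top-level module.
from first_child_class import FirstChildClass


def lambda_handler(event, context):
    # Do some stuff.
    return {"statusCode": 200, "body": json.dumps({"ok": True})}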
What I ended up doing was moving the classes into a subdirectory and essentially doing the same as above, but with the package name prepended.
So, the file structure changed to:
.
├── Makefile
├── Pipfile
├── Pipfile.lock
├── README.md
├── template.yaml
├── tests
│   ├── __init__.py
│   └── unit
│       └── lambda_application
│           ├── test_handler.py
│           └── test_parent_child_class.py
└── lambda_application
    ├── __init__.py
    ├── lib
    │   ├── first_child_class.py
    │   ├── second_child_class.py
    │   └── parent_class.py
    ├── lambda_function.py
    └── requirements.txt
and my import became from lib.first_child_class import FirstChildClass

Related

Import project's subpackages in Python

I'm experimenting with DDD in Python, so I've decided to implement a toy project.
I've created different directories in order to separate shared concepts from the concepts of specific bounded contexts.
As I try to import these files, I'm facing No module named error exceptions.
For example, with this project structure:
.
└── src/
    ├── Book/
    │   ├── application
    │   ├── domain/
    │   │   ├── Book.py
    │   │   └── __init__.py
    │   ├── infrastructure
    │   └── __init__.py
    └── Shared/
        ├── application
        ├── domain/
        │   ├── Properties/
        │   │   ├── __init__.py
        │   │   └── UuidProperty.py
        │   ├── ValueObjects/
        │   │   ├── __init__.py
        │   │   └── BookId.py
        │   └── __init__.py
        └── infrastructure
On src/Book/domain/Book.py I have:
from Shared.domain.ValueObjects.BookId import BookId

class Book:
    bookId: BookId
    pages: int
As I've seen in other answers (pretty old ones), it can be fixed by adding these folders to PYTHONPATH or PATH, like sys.path.insert(*path to file*), but I'm wondering if there is a more Pythonic way to achieve that.
I've also tried adding an __init__.py file to src and importing as from src.Shared.domain.ValueObjects.BookId import BookId, but none of the previous attempts worked for me.
On other repos I've seen that they use setuptools to install the src package in order to import it in unit tests (I can't import them in tests either), but I don't know if that is recommended or would work for imports inside the package.
In case someone is facing the same issue as me: I managed to import the subpackages, and the full package in the tests directory.
Just include an __init__.py file in each subpackage, and inside the package/subpackages use relative imports (this loses some of the semantics of imports, where an absolute path from the root directory tells us where each import comes from, but it works):
from ..Properties import UuidProperty
# inside __init__.py of Properties directory
from .UuidProperty import UuidProperty
And, by including an __init__.py inside src/, we can import them in the tests directory like
from src.Book.domain import Book
Hope this helps someone!
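To make that concrete, here is a sketch of Book.py with the relative-import approach described above (it assumes src/ and every subdirectory carry an __init__.py):
# src/Book/domain/Book.py
from ...Shared.domain.ValueObjects.BookId import BookId   # up to src, then down into Shared


class Book:
    bookId: BookId
    pages: int
In the tests, from src.Book.domain.Book import Book works directly; the shorter from src.Book.domain import Book shown above additionally needs domain/__init__.py to re-export the class with from .Book import Book.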

ImportError: No module named unable to import toplevel directory of module in python

My Directory structure:
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│       ├── example_import.py
│       └── __init__.py
└── project2
    ├── __init__.py
    └── scripts
        └── __init__.py
I need to import the common/common.py module in the project1/scripts/example_import.py file.
example_import.py:
import sys
sys.path.append("../common")
from common import Test
print("Module Not import error")
Error:
Traceback (most recent call last):
  File "project1/scripts/example_import.py", line 3, in <module>
    from common import Test
ImportError: No module named common
How do I fix this issue?
Understanding how Python imports work is tricky in the beginning, but it makes sense once you understand the mechanism.
There are different ways to fix your import issue. I would not recommend messing with sys.path. Depending on where you are calling your script from, you have multiple choices at hand.
Your directory structure:
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│       ├── example_import.py
│       └── __init__.py
└── project2
    ├── __init__.py
    └── scripts
        └── __init__.py
From the root of the directory,
python -m project1.scripts.example_import
will work, assuming that the import in example_import.py looks like
from common.common import Test
(running the file by path, e.g. python project1/scripts/example_import.py, puts project1/scripts rather than the project root on sys.path, so the package import would not resolve).
If you want to use from common import Test, you need to add from common.common import Test to the __init__.py file in the common folder.
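For reference, common/__init__.py would then contain just the re-export:
# common/__init__.py
from common.common import Test   # or, equivalently, the relative form: from .common import Test
With that in place, from common import Test works whenever the project root is on sys.path, for example when running python -m project1.scripts.example_import from the root.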
You can also use the PYTHONPATH environment variable to tell Python where to look for the base module.
PYTHONPATH=/pathto/basefolder python folder/filesx.py
Another way is to create a setup.py at the base and do a development installation, python -m pip install --editable ., in the environment you are working in.
Example of setup.py
#!/usr/bin/env python
from distutils.core import setup

setup(name='projectX',
      version='1.0',
      description='The World is not Enough',
      author='James Bond',
      author_email='agent#007.net',
      url='https://www.python.org/sigs/distutils-sig/',
      packages=['common', 'project1', 'project2'],
      )
See Python Documentation for more setup.py options.
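Note that distutils is deprecated in newer Python versions; a roughly equivalent setuptools-based sketch (same package names as above) would be:
#!/usr/bin/env python
from setuptools import setup, find_packages

setup(name='projectX',
      version='1.0',
      packages=find_packages(),   # discovers common, project1 and project2 via their __init__.py files
      )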

ImportError from tests directory in Python

I have the following project structure:
python_project
├── module
│   ├── constants.py
│   └── __init__.py
├── scripts
│   ├── __init__.py
│   └── script.py
└── tests
    ├── constants.py
    └── __init__.py
python_project is in PYTHONPATH.
module/constants.py:
VAR_MODULE = 25
tests/constants.py:
VAR = 17
I'm facing the following issue in the script.py file:
from module.constants import VAR_MODULE
works, but
from tests.constants import VAR
throws the following exception:
from tests.constants import VAR
ImportError: cannot import name 'VAR'
I know that there is no point in importing stuff from the tests directory; I'm just wondering why this does not work. Is the tests directory excluded somehow?
Thanks!
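One thing worth checking (a diagnostic sketch, not from the original post) is which tests package actually gets resolved, since another tests package or module earlier on sys.path would shadow this one:
import tests
import tests.constants

print(tests.__file__)            # expected: python_project/tests/__init__.py
print(tests.constants.__file__)  # expected: the constants.py that defines VAR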

How can I import my own package in python?

I have a project
testci/
├── __init__.py
├── README.md
├── requirements.txt
├── src
│   ├── __init__.py
│   └── mylib.py
└── test
    ├── __init__.py
    └── pow_test.py
When I run python3.6 test/pow_test.py I see an error:
File "test/pow_test.py", line 3, in
import testci.src.mylib as mylib
ModuleNotFoundError: No module named 'testci'
pow_test.py
from testci.src.mylib import get_abs

def test_abs():
    assert get_abs(-10) == 10
How can I fix this error?
System details: Ubuntu 16.04 LTS, Python 3.6.10
Try this:
from .src import mylib
from mylib import get_abs
If that doesn't work, then import one by one. But don't import through the root folder: since the file you are importing into is in the same folder as the one you are trying to import, it will always raise an error.
Run Python with the -m argument from within the base testci package to execute it as a submodule.
I made a similar mock folder structure:
├───abc_blah
│   │   abc_blah.py
│   │   __init__.py
│
└───def
    │   def.py
    │   __init__.py
abc_blah.py
print('abc')
def.py
import abc_blah.abc_blah
Execute it like so:
python -m def.def
Correctly prints out 'abc' as expected here.
Simply add __package__ = "testci". It is also good practice to add a try/except block.
Your final code should look something like:
try:
    from testci.src.mylib import get_abs
except ModuleNotFoundError:
    from ..testci.src.mylib import get_abs
To run it, type python -m test.pow_test
I think your issue is how the package is installed. The import looks fine to me. Since it says CI, I'm guessing you're having the package installed remotely with only the test folder somehow.
Try adding a setup.py file where you define that both the test and the src packages are part of your testci package.
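A minimal sketch of such a setup.py, assuming it is placed in the directory that contains testci/:
from setuptools import setup

setup(name='testci',
      version='0.1',
      packages=['testci', 'testci.src', 'testci.test'],
      )
After python -m pip install --editable ., from testci.src.mylib import get_abs resolves no matter which directory the tests are run from.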
There are many ways to organize a project. Keep a few things in mind: the structure should be simple and scalable, and it should let you differentiate the parts of the codebase easily.
One good possible way to structure a project is below:
project/
├── app.py
├── dockerfile
├── pipfile
├── Readme.md
├── requirements.txt
├── src_code
│   ├── code
│   │   ├── __init__.py
│   │   └── mylib.py
│   └── test
│       ├── __init__.py
│       └── test_func.py
└── travisfile
Here app.py is the main file, which is responsible for running your entire project.
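As a sketch, app.py would then use absolute imports from the src_code package (run() is a hypothetical entry point inside mylib.py):
# app.py
from src_code.code import mylib

if __name__ == '__main__':
    mylib.run()   # hypothetical entry point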

Can't run server and tests without modifying imports

I'm building a website with the bottle framework. I am using nosetests for my unit testing. However, there is a problem I can't solve on my own.
I can't seem to find a way that allows me to run the tests without breaking my server. When I run my tests, I have to do relative imports for them to work, but they don't work when I start the server.
This is my folder structure:
├── service
│   ├── __init__.py
│   ├── application.py
│   ├── main.py
│   ├── response_header.py
│   ├── global_data_loader.py
│   ├── renderer
│   │   ├── __init__.py
│   │   ├── header.py
│   │   ├── ....
│   ├── tests
│   │   ├── test_application.py
│   │   ├── ....
Here is how I import so my server works:
application.py --
from response_header import ResponseHeader
from global_data_loader import GlobalDataLoader
from renderer import Application
However, when I run nosetests, I get this message:
ModuleNotFoundError: No module named 'response_header'
So when I want nosetests to work, I have to change the imports to look like this:
application.py --
from .response_header import ResponseHeader
from .global_data_loader import GlobalDataLoader
from .renderer import Application
Then my nosetests work, but when I want to start my server I get this message:
from .response_header import ResponseHeader
ImportError: attempted relative import with no known parent package
I have tried using sys.path.append in my tests, and it works, but I need a solution that doesn't involve PYTHONPATH or using os/sys
EDIT: I fixed it by moving the main.py I use to start the server to the root folder, above my service and tests modules. Now it works fine.
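For completeness, a sketch of what that root-level main.py could look like (the run() call is a hypothetical entry point; whatever actually starts the bottle server goes there):
# main.py, at the project root next to the service/ package
from service import application   # service is imported as a package, so the relative
                                  # imports inside application.py now resolve

if __name__ == '__main__':
    application.run()   # hypothetical entry point that starts the bottle server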
