I'm struggling to figure out how to properly set up a Python Lambda in a subdirectory and have imports work correctly. For example, suppose you don't put your Python code in the root folder, as AWS recommends, and instead put it in a src folder, with a lambda_handler.py in there as the main handler and packages/folders inside that (you might have src/api, as an example). I am using the new SAM Accelerate, and they acknowledge a bug where it doesn't ignore the .aws-sam folder, so it will loop infinitely with the project in root; they recommend a subfolder, but that apparently greatly complicates things with Python.
I think I have properly figured out how to get it to read my own packages and modules in subfolders using __init__.py files, but I can't get my requirements.txt to install, so the dependencies don't show up in the local build or the cloud build. I have found quite a few Stack Overflow posts that are seemingly on the subject, but none of them seem to work for me or give an example I can follow. The following is my structure:
/
├── .aws-sam
└── src
    ├── app_folder_1
    │   ├── __init__.py
    │   └── example1.py
    ├── lambda_handler.py
    ├── app_folder_2
    │   ├── __init__.py
    │   └── example2.py
    ├── requirements.txt
    └── template.yml
I import pymysql as an example, and my dependencies from requirements.txt are never installed, so pymysql is never found. I feel like things shouldn't be this difficult. Can anyone assist?
UPDATE: I may have figured out the issue; this post gave me a clue: https://github.com/aws/serverless-application-model/issues/1927 It appears that sam local invoke has the same issue with custom templates -- something I was utilizing -- and that isn't very intuitive, even though they claim it is working as intended.
UPDATE 2: That definitely helped my progress, but it still isn't working as intended.
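For context on why the dependencies never install: sam build looks for the Python dependency manifest (requirements.txt) in the directory the function's CodeUri points at, so the template has to reference the folder that actually contains it. A minimal, hypothetical template fragment for the layout above (the resource name and runtime are illustrative, and CodeUri is resolved relative to template.yml itself, which sits inside src/ here):

```yaml
Resources:
  MyFunction:                 # hypothetical resource name
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: .              # relative to template.yml, i.e. the src/ folder above
      Handler: lambda_handler.lambda_handler
      Runtime: python3.9
```

If template.yml lived at the project root instead, CodeUri would be src/ so that sam build still finds requirements.txt next to the handler.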
I forgot to return to this, so I figured I would post this here to assist others who may be struggling, as Python is really finicky.
The following is the structure of a functional project in Lambda:
├── README.md
├── buildspec.yml
├── lambda_handler.py
├── locales
│   ├── en
│   │   └── LC_MESSAGES
│   │       └── base.po
│   ├── es
│   │   └── LC_MESSAGES
│   │       └── base.po
│   └── fr
│       └── LC_MESSAGES
│           └── base.po
├── my_project_name
│   ├── __init__.py
│   ├── python_file_1.py
│   ├── python_file_2.py
│   ├── python_file_3.py
│   └── repositories
│       ├── __init__.py
│       ├── resolvers_1.py
│       └── resolvers_2.py
├── requirements-dev.txt
├── requirements.txt
├── template.yml
└── tests
    ├── __init__.py
    ├── tests_file_1.py
    ├── tests_file_2.py
    ├── tests_file_3.py
    └── tests_file_4.py
Note that the base.mo files are generated during the build in the same location as the base.po files, and that requirements-dev.txt holds dev-only libraries, such as pep8 for formatting, and is only installed in my local conda environments (I set up a virtual environment for each project using miniconda). The purpose of the resolvers in the nested repositories folder is to separate the controller-like Python files from the calls to the repositories that retrieve data.
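As a side note on the locales folder: once the build compiles each base.po into base.mo, the standard-library gettext module can load them. A minimal sketch assuming the locales/<lang>/LC_MESSAGES/base.mo layout above (fallback=True returns strings unchanged when a .mo file is missing):

```python
import gettext

def get_translator(lang, localedir="locales"):
    # Loads locales/<lang>/LC_MESSAGES/base.mo; with fallback=True a missing
    # .mo yields a NullTranslations that returns every string unchanged.
    return gettext.translation("base", localedir=localedir,
                               languages=[lang], fallback=True)

_ = get_translator("fr").gettext
print(_("Hello"))  # translated if fr/LC_MESSAGES/base.mo exists, else "Hello"
```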
Related
I currently have the following project structure in python:
project
├── module_1
│   ├── __init__.py
│   └── class_1.py
├── module_2
│   ├── __init__.py
│   └── class_2.py
├── main_1.py
├── main_2.py
├── ..
├── main_n.py
└── set_up.py
Because of the large number of main files we use to run our scripts, I would like to organise them into different modules within the project (a different module for each usage of the endpoints). I have tried something like the below:
project
├── module_1
│   ├── __init__.py
│   └── class_1.py
├── module_2
│   ├── __init__.py
│   └── class_2.py
├── endpoints_1
│   ├── __init__.py
│   ├── main_1.py
│   └── set_up.py
└── endpoints_2
    ├── __init__.py
    ├── main_2.py
    └── set_up.py
The problem that arises, however, is that I would have to make a different set_up.py file for each of the endpoint modules. In this set_up file I add the project path to sys.path so that all imports work as expected.
So, to conclude, I have the following two questions:
Is there any way I can keep this project structure without having to mess too much with sys.path and without a custom set_up.py for each module that contains endpoints?
Should I even use this kind of project structure to begin with, or should I just keep the original structure and not bundle the endpoints into modules?
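For reference, the set_up.py described above is typically just a couple of lines like the following sketch (the double dirname assumes set_up.py sits one folder below the project root, as in the endpoints_* layout; names are illustrative):

```python
import os
import sys

# Compute the project root relative to this file, so the same snippet can be
# copied verbatim into every endpoints_* package without hard-coded paths.
PROJECT_ROOT = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
if PROJECT_ROOT not in sys.path:
    sys.path.insert(0, PROJECT_ROOT)
```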
I have a project with a Python (.py) file inside a directory, say A. Under folder A, I have a file containing the code:
from utils.utils import csv_file_transform
where utils is a sibling directory to A:
/A
    file.py
/utils
    __init__.py
    utils.py
but I am getting the error "No module named utils.utils; utils is not a package".
I tried adding the path using sys, but it still gives the same error:
import sys
sys.path.append('C:/Users/Sri/Documents/newmod/folder/utils')
The code works fine on Ubuntu but not on Windows; I am using a Windows system.
As much as I don't support link-only / reference-only answers, I reckon that your question is best answered by referencing the existing discussions and providing broader context on Python's import mechanism.
Your Requirements
Are you actually working on creating a Python module? If so, it would be good to establish a rigorous structure for your module and follow those guidelines.
Consider the following article by Jason C. McDonald on structuring a hypothetical module, omission, in Python. Given the following structure:
omission-git
├── LICENSE.md
├── omission
│   ├── app.py
│   ├── common
│   │   ├── classproperty.py
│   │   ├── constants.py
│   │   ├── game_enums.py
│   │   └── __init__.py
│   ├── data
│   │   ├── data_loader.py
│   │   ├── game_round_settings.py
│   │   ├── __init__.py
│   │   ├── scoreboard.py
│   │   └── settings.py
│   ├── game
│   │   ├── content_loader.py
│   │   ├── game_item.py
│   │   ├── game_round.py
│   │   ├── __init__.py
│   │   └── timer.py
│   ├── __init__.py
│   ├── __main__.py
│   ├── resources
│   └── tests
│       ├── __init__.py
│       ├── test_game_item.py
│       ├── test_game_round_settings.py
│       ├── test_scoreboard.py
│       ├── test_settings.py
│       ├── test_test.py
│       └── test_timer.py
├── pylintrc
├── README.md
└── .gitignore
Jason would import the modules in the following manner: from omission.common.game_enums import GameMode.
Relative Imports
The user am5 suggests adding
import os, sys; sys.path.append(os.path.dirname(os.path.realpath(__file__)))
to your __init__.py files. If you study the related discussion, you will observe that views on modifying sys.path are diverse. Some may argue that failed imports and errors like:
ImportError: attempted relative import with no known parent package
are actually an indication of code smell. As a solution, which I hope you won't find patronising, I would suggest that you solve your importing challenge by:
a) Deciding on your requirements on how "deep" you want to go into modular architecture.
b) Adopting a corresponding recommended architecture with rigorous approach to module structure
c) Refraining from hard-coding any path via sys.path. Appending relative paths, as outlined above, is not a flawless solution but has some merits and may be worth considering.
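To make point c) concrete with the earlier utils example: what must be appended to sys.path is the parent of the utils folder, not the utils folder itself. A self-contained sketch (building a throwaway copy of the layout in a temporary directory; the csv_file_transform body is a stand-in):

```python
import os
import sys
import tempfile

# Build a throwaway copy of the layout: <root>/utils/{__init__.py, utils.py}
root = tempfile.mkdtemp()
pkg = os.path.join(root, "utils")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "utils.py"), "w") as f:
    f.write("def csv_file_transform():\n    return 'ok'\n")

# Appending the PARENT of utils/ is what makes "utils" importable as a
# package; appending utils/ itself (as in the question's snippet) leaves
# utils.utils unresolvable.
sys.path.insert(0, root)
from utils.utils import csv_file_transform

print(csv_file_transform())  # prints: ok
```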
Other worthy discussions/artefacts
Best structure for the Python project
Sample project by PyPa (GitHub)
Importing files from different folders
(In the interest of transparency, this is a follow up to a question asked here)
I'm dealing with related files for which a namespace package seems a good fit. I'm following the guide from the Python Packaging Authority, which places a setup.py in each namespace package:
mynamespace-subpackage-a/
    setup.py
    mynamespace/
        subpackage_a/
            __init__.py

mynamespace-subpackage-b/
    setup.py
    mynamespace/
        subpackage_b/
            __init__.py
            module_b.py
In my tests, I created a similar project. Apart from setup.py, I placed my unit tests, docs, and other materials per namespace (I left out some of the directories for compactness). I used PyScaffold to generate the namespaces.
├── namespace-package-test.package1
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg1
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg1_cli.py
│   │       └── __init__.py
│   └── tests
├── namespace-package-test.package2
│   ├── AUTHORS.rst
However, I then noticed that PyScaffold has an option to create namespace packages via the putup command:
(venv) steve@PRVL10SJACKSON:~/Temp$ putup --force my-package -p pkg1 --namespace namespace1
(venv) steve@PRVL10SJACKSON:~/Temp$ putup --force my-package -p pkg1 --namespace namespace2
This creates a folder structure like this:
├── AUTHORS.rst
├── CHANGELOG.rst
├── LICENSE.txt
├── README.rst
├── requirements.txt
├── setup.cfg
├── setup.py
├── src
│   ├── namespace1
│   │   ├── __init__.py
│   │   └── pkg1
│   │       ├── __init__.py
│   │       └── skeleton.py
│   └── namespace2
│       ├── __init__.py
│       └── pkg1
│           ├── __init__.py
│           └── skeleton.py
└── tests
    ├── conftest.py
    └── test_skeleton.py
So I'm conflicted; I trust the team at PyScaffold, but it goes against the example from the Python Packaging Authority.
Are both approaches valid?
Is there a reason to choose one approach over the other?
The idea behind the namespace option in PyScaffold is to share/reuse namespaces across projects (as opposed to having more than one namespace inside a single project), or, in other words, to split a larger project into independently maintained/developed projects.
To the best of my understanding, a structure like the one you showed in the fourth code block will not work. Using putup --force twice with two different namespaces on the same root folder is not the intended/supported usage.
The approach of PyScaffold is the same as the Packaging Authority's; the only difference is that PyScaffold assumes you have only one package contained in a single project and git repository (PyScaffold also uses a src directory, for the reasons explained in Ionel's blog post).
The reason for adopting one setup.py per namespace+package is that it is required for building separate distribution files (i.e. you need one setup.py per *.whl).
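As an illustration of that one-setup.py-per-package arrangement, here is a sketch of the arguments such a setup.py would pass to setuptools.setup() for mynamespace-subpackage-a (metadata values are hypothetical):

```python
from setuptools import find_namespace_packages

# In the real setup.py these kwargs are passed to setuptools.setup().
SETUP_KWARGS = dict(
    name="mynamespace-subpackage-a",
    version="0.1.0",
    # find_namespace_packages discovers PEP 420 native namespace packages:
    # mynamespace/ has NO __init__.py, subpackage_a/ does, so each subpackage
    # project can be built into its own wheel while sharing the namespace.
    packages=find_namespace_packages(include=["mynamespace.*"]),
)
```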
I am writing a program which is mainly in Python, but some interactive features are done through a web app that talks to Flask. It would be nice to have the web app inside the Python program, so I am looking at using PyQtWebEngine.
This works surprisingly well, except that I cannot get spell checking to work. I have run
self.page().profile().setSpellCheckEnabled(True)
self.page().profile().setSpellCheckLanguages({"en-GB"})
from inside my child class of QWebEngineView, and I have checked that isSpellCheckEnabled() is True.
I wonder if it cannot find the languages. No qWarning is emitted, which I would expect if it could not find the dictionary, judging by the non-Python example.
I have an en-GB.bdic file, which I copied from the Chromium hunspell git repository. I have tried putting the file at:
<directory_my_py_file_is_in>/qtwebengine_dictionaries/en-GB.bdic
When I run
app = QApplication(sys.argv)
print(app.applicationDirPath())
the result is
/usr/bin
so I tried
/usr/bin/qtwebengine_dictionaries/en-GB.bdic
This wouldn't have been OK anyway, because I cannot edit that location when the program is pip-installed, but it was worth a try.
With the .bdic file in either place, I never see any spell check feature.
Has anyone got spellchecking working in PyQtWebEngine? I have not been able to find much in the way of documentation.
Assuming the .bdic files are valid, I have established the path of the dictionaries through the environment variable QTWEBENGINE_DICTIONARIES_PATH. For example, I have translated the official example into Python with the following structure:
├── data
│   ├── icon.svg
│   ├── index.html
│   ├── spellchecker.qrc
│   └── style.css
├── dict
│   ├── de
│   │   ├── de-DE.aff
│   │   ├── de-DE.dic
│   │   └── README.txt
│   └── en
│       ├── en-US.aff
│       ├── en-US.dic
│       └── README.txt
├── main.py
├── spellchecker_rc.py
├── qtwebengine_dictionaries
│   ├── de-DE.bdic
│   └── en-US.bdic
└── README.md
main.py
# ...
import os

CURRENT_DIR = os.path.dirname(os.path.realpath(__file__))
# This must be set before QtWebEngine is initialized (i.e. before the
# QApplication is constructed) so the engine picks up the dictionary path.
os.environ["QTWEBENGINE_DICTIONARIES_PATH"] = os.path.join(
    CURRENT_DIR, "qtwebengine_dictionaries"
)
# ...
Note: To generate the .bdic files, I used the qwebengine_convert_dict tool, executing:
qwebengine_convert_dict dict/en/en-US.dic qtwebengine_dictionaries/en-US.bdic
qwebengine_convert_dict dict/de/de-DE.dic qtwebengine_dictionaries/de-DE.bdic
The complete code is here.
I'm trying to run a .py file through the command prompt using the command python filename.py. I've already set the environment variables for Python after installing it, so I don't get any error when I type python. The file I'm running imports a few directories, all of which exist in the same directory as the file I'm running, apart from the file web.py, which I can't seem to locate. I assumed it was somewhere inside the Python package I downloaded, but I couldn't find it there either. Would I need to install an extension for Python for web.py to be successfully imported, or is there another way around this?
I've downloaded Python 3.4, I'm using Windows 7 as my operating system, and the exact error I receive when I try to run the file is
ImportError: No module named 'utils'
Can someone please explain or direct me to a page which shows in detail how to install extensions for python?
This specific error happens when the Python interpreter can't find a particular .py file; in your case, the file utils.py.
First you need to find which file is trying to import utils.py. Starting with your main file, look up all the files you are importing. (I am guessing the issue is coming from one of the non-library files, but I could be wrong.)
Once you have the "top level" import list, check each of those files to see what THEY are importing, and repeat the process for them. Eventually, you will find the .py file which is trying to import "utils". There might be a directory specification forcing Python to look in the wrong place.
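The manual walk described above can also be automated with a short script that scans a source tree for the offending import lines (a sketch; the regex pattern and the default module name are assumptions you may need to adjust):

```python
import os
import re

def find_imports(root, name="utils"):
    # Match lines like "import utils" or "from utils.utils import x".
    pattern = re.compile(r"^\s*(from\s+%s[.\s]|import\s+%s\b)" % (name, name))
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for fname in files:
            if fname.endswith(".py"):
                path = os.path.join(dirpath, fname)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    for lineno, line in enumerate(f, 1):
                        if pattern.match(line):
                            hits.append((path, lineno, line.strip()))
    return hits
```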
Finally, using Windows' file manager, perform a search for utils.py. As a temporary fix, you can copy it from its current location into your working directory. That will at least allow you to get your project up and running until you sort out the real cause.
This error occurs when file(s)/folder(s) are not in their expected locations.
I had a very similar error with a Python Flask app; it turned out that my manage.py and config.py files were inside the app folder with the other folders (they were supposed to be outside the app directory), and that caused the error in my situation.
Once I placed the files in their proper locations, the error was gone.
So check your application framework and make sure things are located where they're supposed to be.
Good luck
I installed via apt (I use Debian Linux) and had this same error in one project. For me, the solution was to install via pip:
$ pip install utils
It should work for both python 2 and python 3.
So in my case, I ran the tree command in my Pipenv environment; it should look like the below. I hope this helps.
.
├── README.md
├── __init__.py
├── core.yaml
├── core_blueprints
│   ├── __init__.py
│   ├── ami_lookup.py
│   ├── chef_buckets.py
│   ├── custom_resources
│   │   ├── __init__.py
│   │   └── cfn_custom_classes.py
│   ├── cw_alarm.py
│   ├── roles.py
│   ├── security_groups.py
│   ├── shared_iam
│   │   ├── __init__.py
│   │   └── iam_policies.py
│   ├── sns_subscription.py
│   ├── sns_topic.py
│   ├── ssm_chefrun_documents.py
│   ├── tf_state.py
│   ├── utils          #### This is not the correct location.
│   │   ├── __init__.py
│   │   ├── standalone_output.py
│   │   ├── version.py
│   │   └── version_check.py
│   ├── vpc.py
│   ├── vpn_eip.py
│   └── vpn_server.py
├── core_hooks
│   ├── __init__.py
│   ├── cookbook_archive.py
│   ├── core_lambda.py
│   ├── keypair.py
│   ├── s3.py
│   ├── s3_cache.py
│   └── ssm.py
├── platform_version.py
├── prd1-ca-central-1.env
├── setup.py
└── utils          ###### This is the correct location.
    ├── __init__.py
    ├── standalone_output.py
    ├── version.py
    └── version_check.py